CN117994174A - Focal length splicing method and device, electronic equipment and storage medium - Google Patents

Focal length splicing method and device, electronic equipment and storage medium

Info

Publication number
CN117994174A
Authority
CN
China
Prior art keywords
image
lens
object distance
distortion
calibration
Prior art date
Legal status
Pending
Application number
CN202211320247.2A
Other languages
Chinese (zh)
Inventor
史飞
Current Assignee
Zhejiang Uniview Technologies Co Ltd
Original Assignee
Zhejiang Uniview Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Uniview Technologies Co Ltd
Priority to CN202211320247.2A
Publication of CN117994174A
Legal status: Pending

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a focal length splicing method and apparatus, an electronic device and a storage medium. The method comprises: performing distortion correction on an image to be processed captured by a first lens according to a distortion correction parameter obtained by pre-calibration; acquiring a current object distance, and obtaining a center deviation parameter according to the current object distance and a pre-calibrated mapping relation between object distance and center deviation; translating the distortion-corrected image to be processed according to the center deviation parameter, and rotating the translated image according to a pre-calibrated rotation parameter; and, if the current digital zoom magnification is determined to be smaller than a digital zoom threshold, cropping and reconstructing the rotated image to be processed according to the current digital zoom magnification to realize focal length splicing. With the technical scheme of the invention, image frame jump during focal length splicing can be reduced.

Description

Focal length splicing method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a focal length splicing method, a focal length splicing device, an electronic device, and a storage medium.
Background
In the field of video surveillance, zoom cameras are often required so that a single surveillance device can cover more areas and scenes. A zoom camera generally uses a plurality of lenses to image separately, each lens being responsible for a different focal-segment range, and the plurality of lenses are joined by focal length splicing to form a camera with a larger zoom range.
During zooming, because the optical axes of different lenses cannot coincide exactly when the lenses are mounted, there are optical-axis distance and rotation deviations; the image centers of different lenses are therefore inconsistent, and the image center jumps when lenses are switched. Meanwhile, because the in-plane rotation angles of the image sensors of different lenses are inconsistent, the image rotation also jumps when lenses are switched. In addition, a short-focus lens is severely distorted while the distortion of medium-focus and long-focus lenses gradually decreases, so lenses with different focal lengths have different degrees of distortion and the image distortion jumps when lenses are switched. Therefore, in the focal length splicing process, image frame jump during lens switching is a problem that urgently needs to be solved.
Disclosure of Invention
The invention provides a focal length splicing method, a focal length splicing device, electronic equipment and a storage medium, so as to reduce image frame jump during focal length splicing.
In a first aspect, an embodiment of the present invention provides a focal length splicing method, where the method includes:
According to distortion correction parameters obtained through pre-calibration, performing distortion correction on an image to be processed obtained through shooting by a first lens;
Acquiring a current object distance, and acquiring a center deviation parameter according to the current object distance and a mapping relation between the object distance and the center deviation, which are obtained by pre-calibration;
translating the image to be processed after distortion correction according to the center deviation parameter, and rotating the translated image according to the rotation parameter obtained by calibration in advance;
If the current digital zoom magnification is determined to be smaller than the digital zoom threshold, cutting and reconstructing the rotated image to be processed according to the current digital zoom magnification so as to realize focal length splicing.
In a second aspect, an embodiment of the present invention further provides a focal length splicing apparatus, where the apparatus includes:
the distortion correction module is used for performing distortion correction on the image to be processed captured by the first lens according to the distortion correction parameter obtained by pre-calibration;
the current object distance acquisition module is used for acquiring the current object distance and acquiring a center deviation parameter according to the current object distance and a mapping relation between the object distance and the center deviation, which are obtained through pre-calibration;
the translation rotation correction module is used for translating the image to be processed after distortion correction according to the center deviation parameter and rotating the translated image according to the rotation parameter obtained by calibration in advance;
And the focal length splicing module is used for cutting and reconstructing the rotated image to be processed according to the current digital zoom magnification if the current digital zoom magnification is determined to be smaller than the digital zoom threshold, so as to realize focal length splicing.
In a third aspect, an embodiment of the present invention further provides an electronic device, including a memory, a processor, and a computer program stored in the memory and capable of running on the processor, where the processor implements the focal length stitching method according to any one of the embodiments of the present invention when executing the program.
In a fourth aspect, embodiments of the present invention also provide a storage medium storing computer-executable instructions that, when executed by a computer processor, are configured to perform a focus splicing method according to any of the embodiments of the present invention.
According to the technical scheme of the embodiment, distortion correction is performed on the image to be processed captured by the first lens with the pre-calibrated distortion correction parameter; a center deviation parameter is obtained according to the current object distance and the pre-calibrated mapping relation between object distance and center deviation; center translation correction is performed with the center deviation parameter and rotation correction with the pre-calibrated rotation parameter; and when the current digital zoom magnification is smaller than the digital zoom threshold, digital zoom and focal length splicing are performed on the corrected image. This solves the problem in the prior art that the image frame jumps when lenses are switched during focal length splicing, and reduces image frame jump in the focal length splicing process.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a focal length splicing method according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram of a first distortion factor curve and a second distortion factor curve according to a first embodiment of the present invention;
Fig. 3 is a flowchart of a focal length splicing method according to a second embodiment of the present invention;
FIG. 4 is a schematic diagram of a relationship between focal position and object distance according to a second embodiment of the present invention;
fig. 5 is a schematic diagram of a lens center shift according to a second embodiment of the present invention;
fig. 6 is a schematic structural diagram of a focal length splicing device according to a third embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a focal length splicing method provided in the first embodiment of the present invention. This embodiment is applicable to the case where a zoom photographing device performs focal length splicing when the focal length is changed. The method may be performed by a focal length splicing apparatus, which may be implemented in hardware and/or software, and which may be configured in an electronic device used in cooperation with the zoom photographing device, where the zoom photographing device includes at least two lenses.
As shown in fig. 1, the method includes:
s110, carrying out distortion correction on the image to be processed obtained by shooting the first lens according to the distortion correction parameters obtained by calibration in advance.
The distortion correction parameter is used to correct lens distortion of the image, where lens distortion represents the degree of distortion of the lens imaging at different angles of view. Lens distortion generally includes radial distortion and tangential distortion. Radial distortion is generally caused by curvature defects of the lens itself and has a large influence on the distortion degree; tangential distortion is generally caused by non-parallelism between the lens and the imaging surface, and since the image sensor can be kept parallel to the lens when mounted, tangential distortion is negligible. Therefore, the lens distortion in this embodiment refers to radial distortion.
In this embodiment, the distortion degrees of lenses covering different focal-segment ranges differ at their respective angles of view; in general, a long-focus lens is less distorted than a short-focus lens. To make the image distortion of the different lenses more consistent, the distortion of each lens can be corrected to an ideal distortion-free state during zooming, for example with the checkerboard distortion correction method, which is an existing correction method whose specific process is not repeated in this embodiment. Alternatively, the lens with the higher distortion degree can be corrected to the same state as the lens with the lower distortion degree, so that image frame jump caused by lens distortion is eliminated when the lenses are switched. In this embodiment, the distortion correction process is described by taking the latter case, correcting the lens with higher distortion to the same state as the lens with lower distortion, as an example.
It should be noted that, in this embodiment, the zoom photographing device includes two lenses, a first lens and a second lens, where the first lens is a short-focus lens with focal length f1, the second lens is a long-focus zoom lens with focal length range f2 to f3, and f1 < f2 < f3. When the target focal length during zooming is between f1 and f2, the image to be processed captured by the first lens is processed so that it exhibits an image effect equivalent to imaging with the second lens. However, the number of lenses is not limited in this embodiment; when there are three or more lenses, focal length splicing is performed on the corresponding n-th lens and (n+1)-th lens during zooming in the same manner as in this embodiment, so as to solve the problem of image frame jump when switching between the n-th lens and the (n+1)-th lens.
Optionally, the calibration process of the distortion correction parameter may include the following A1-A5:
A1, respectively acquiring a first checkerboard calibration image of a first lens when the object distance is infinity and a second checkerboard calibration image of a second lens when the object distance is infinity, and respectively carrying out angular point identification on the first checkerboard calibration image and the second checkerboard calibration image;
Specifically, the imaging effect at an object distance of infinity can be simulated by adjusting a distance-increasing mirror. In the embodiment of the invention, the first checkerboard calibration image is obtained by photographing a checkerboard image, placed perpendicular to the lens, with the first lens at an object distance of infinity, and the second checkerboard calibration image is obtained by photographing the checkerboard image with the second lens at an object distance of infinity.
Corner recognition is used to identify the corner positions in the first checkerboard calibration image and the second checkerboard calibration image; a corner, i.e., an extreme point, can be the intersection of two lines. Corner recognition may adopt the Kitchen-Rosenfeld corner detection algorithm, the Harris corner detection algorithm, the SUSAN corner detection algorithm, or the like; the specific algorithm adopted for corner recognition is not limited in this embodiment.
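As an illustrative sketch only, the corner identification of step A1 could be implemented with OpenCV's checkerboard detector; the function name and the pattern size below are assumptions, not values taken from the patent.

```python
import cv2

def find_checkerboard_corners(gray_image, pattern_size=(11, 8)):
    """Return sub-pixel corner positions of the inner-corner grid, or None."""
    found, corners = cv2.findChessboardCorners(gray_image, pattern_size)
    if not found:
        return None
    # Refine to sub-pixel accuracy for a more reliable distortion-rate estimate.
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    corners = cv2.cornerSubPix(gray_image, corners, (11, 11), (-1, -1), criteria)
    return corners.reshape(-1, 2)  # (N, 2) pixel coordinates
```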
A2, determining the distortion rate of each corner point in the second checkerboard calibration image to obtain a second distortion rate curve;
For each corner point identified in the second checkerboard calibration image, the actual distance and the ideal distance from the corner point to the image center of the second checkerboard calibration image are calculated respectively.
The actual distance can be expressed by the following formula: r_d = √(x² + y²), where r_d is the actual distance from the corner point to the image center, and (x, y) are the pixel coordinates of the corner point with the image center as the origin.
The ideal distance can be expressed by the following formula: r_i = a·√(n² + m²), where r_i represents the ideal distance from the corner point to the image center, a is the pixel size of a checkerboard square at the image center, n is the number of corner points between the corner point and the image center in the width direction of the second checkerboard calibration image, and m is the number of corner points between the corner point and the image center in the height direction of the second checkerboard calibration image.
After the actual distance and the ideal distance are obtained for each corner point, the distortion rate of each corner point is calculated; the distortion rate can be the difference between the actual distance and the ideal distance divided by the ideal distance, i.e., (r_d - r_i)/r_i.
Since the distance between each corner point and the image center is different, the angle of view of the position of each corner point is also different; the angle of view of each corner point can be expressed by the following formula: α = arctan(r_i/f), where α is the angle of view corresponding to the corner point and f is the focal length of the second lens.
Therefore, the angle of view and the distortion rate are obtained for each corner point, so that distortion rate curves at different angles of view are obtained. Fig. 2 provides a schematic diagram of a first distortion rate curve and a second distortion rate curve. As shown in fig. 2, the short-focus curve corresponds to the first lens and the long-focus curve to the second lens; the first distortion rate curve and the second distortion rate curve are obtained from the first checkerboard calibration image and the second checkerboard calibration image respectively, and according to fig. 2, the distortion degree of the first lens is higher than that of the second lens.
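The distortion-rate curve of step A2 can be sketched as follows, assuming the corner coordinates (with the image center as origin), the corner grid indices (n, m), the square size a, and the focal length f (expressed in pixels so that it is comparable with r_i) are available; all identifiers are illustrative.

```python
import numpy as np

def distortion_rate_curve(corners_centered, grid_index, a, f):
    """corners_centered: (N, 2) corner pixel coords, image center as origin.
    grid_index: (N, 2) integer corner counts (n, m) from the image center.
    a: pixel size of one checkerboard square at the image center.
    f: lens focal length, expressed in pixels.
    Returns (field_angle_deg, distortion_rate), sorted by field angle."""
    r_d = np.hypot(corners_centered[:, 0], corners_centered[:, 1])  # actual distance
    r_i = a * np.hypot(grid_index[:, 0], grid_index[:, 1])          # ideal distance
    valid = r_i > 0                                                 # skip the center corner itself
    rate = (r_d[valid] - r_i[valid]) / r_i[valid]                   # (actual - ideal) / ideal
    alpha = np.degrees(np.arctan2(r_i[valid], f))                   # field angle per corner
    order = np.argsort(alpha)
    return alpha[order], rate[order]
```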
A3, fitting according to the actual distance of the corner points and the ideal distance of the corner points in the first checkerboard calibration image to obtain a first distortion coefficient;
For each corner point in the first checkerboard calibration image, the actual distance and the ideal distance are obtained respectively, and the lens distortion can be expressed by the following formula: r_d = r_i·(1 + k·r_i²), where k is the distortion parameter; fitting this formula with the actual distance and the ideal distance of each corner point yields the first distortion coefficient matched with the first lens.
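Because the model r_d = r_i·(1 + k·r_i²) is linear in k, the fit of step A3 reduces to a one-parameter least-squares problem. A minimal sketch, assuming the per-corner distances come from the previous steps:

```python
import numpy as np

def fit_first_distortion_coefficient(r_d, r_i):
    """Least-squares fit of r_d = r_i * (1 + k * r_i**2) for the single unknown k."""
    r_d, r_i = np.asarray(r_d, float), np.asarray(r_i, float)
    x = r_i ** 3              # regressor multiplying k
    y = r_d - r_i             # residual explained by k * r_i^3
    return float(np.dot(x, y) / np.dot(x, x))
```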
A4, carrying out distortion correction on the first checkerboard calibration image according to the target distortion coefficient, and obtaining a target distortion rate curve according to the distortion rate of each corner point in the first checkerboard calibration image after the distortion correction;
Wherein the target distortion coefficient takes a value between 0 and the first distortion coefficient;
The first distortion coefficient represents the distortion coefficient at which an ideal image reaches the actual imaging effect of the first lens, and the distortion correction parameter represents the distortion coefficient at which the first checkerboard calibration image reaches the actual imaging effect of the second lens. As can be seen from fig. 2, the distortion correction parameter should lie between 0 and the first distortion coefficient; therefore, the range from 0 to the first distortion coefficient is traversed with a preset search step to obtain the distortion correction parameter.
In the embodiment of the invention, the target distortion coefficient is a distortion coefficient value traversed from 0 to the first distortion coefficient according to a preset searching step length. And carrying out distortion correction on the first checkerboard calibration image according to the target distortion coefficient, carrying out corner recognition on the first checkerboard calibration image after the distortion correction in the same mode, and respectively calculating each corner to obtain a field angle and a distortion rate to obtain a target distortion rate curve after the distortion correction on the first checkerboard calibration image according to the target distortion coefficient.
Specifically, the distortion correction of the first checkerboard calibration image with the target distortion coefficient may be achieved by the following formulas: x_d = x·(1 + k_i·r²); y_d = y·(1 + k_i·r²); r² = x² + y², where (x, y) represents the pixel coordinates in the first checkerboard calibration image, (x_d, y_d) represents the pixel coordinates after distortion correction of the first checkerboard calibration image with the target distortion coefficient, and k_i represents the target distortion coefficient. Optionally, if the calculated x_d, y_d are floating-point numbers, bilinear interpolation may be adopted to obtain integer values matched with x_d, y_d; bilinear interpolation is an interpolation method in the prior art, and its specific process is not described in detail in this embodiment.
A5, taking the corresponding target distortion coefficient when the error of the target distortion rate curve and the second distortion rate curve meets the preset error condition as a distortion correction parameter.
Traversing from 0 to the first distortion coefficient, and calculating the value of each target distortion coefficient to obtain a corresponding target distortion rate curve.
Optionally, the preset error condition may be that the least-squares error between the target distortion rate curve and the second distortion rate curve is minimized; alternatively, the Euclidean distance between each target distortion rate curve and the second distortion rate curve may be calculated, with the preset error condition being that this Euclidean distance is minimized. The preset error condition is not limited in this embodiment.
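Steps A4-A5 can then be sketched as a simple search over candidate coefficients. The helper functions `correct_image` (applies a candidate coefficient to the calibration image) and `distortion_curve_of` (recomputes the distortion-rate curve sampled at the same field angles as the second curve), as well as the number of search steps, are assumptions of this sketch.

```python
import numpy as np

def calibrate_distortion_correction(first_cb_image, second_curve, k1,
                                    correct_image, distortion_curve_of,
                                    num_steps=1000):
    """Search [0, k1] for the coefficient whose corrected curve best matches
    the second (long-focus) distortion-rate curve."""
    best_k, best_err = 0.0, np.inf
    for k in np.linspace(0.0, k1, num_steps):
        corrected = correct_image(first_cb_image, k)      # warp with candidate coefficient
        curve = distortion_curve_of(corrected)            # sampled at the same field angles
        err = float(np.sum((curve - second_curve) ** 2))  # least-squares error criterion
        if err < best_err:
            best_k, best_err = k, err
    return best_k                                         # distortion correction parameter k_x
```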
Specifically, distortion correction may be performed on the image to be processed captured by the first lens according to the following formulas: x_d = x·(1 + k_x·r²); y_d = y·(1 + k_x·r²); r² = x² + y², where (x, y) are the pixel coordinates in the image to be processed, (x_d, y_d) are the pixel coordinates in the distortion-corrected image to be processed, and k_x is the distortion correction parameter.
In the embodiment of the invention, for a target focal length equal to f1 or f2 or lying between f1 and f2, whether the focal length is changed from large to small or from small to large, the image to be processed captured by the first lens is obtained and distortion-corrected with the distortion correction parameter.
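A possible runtime implementation treats the formula as an inverse mapping: for every output pixel, the fractional source position is computed and sampled with bilinear interpolation. This remap-based reading is an assumption of the sketch below, not a statement of the patented implementation.

```python
import cv2
import numpy as np

def radial_correct(image, k_x):
    """Apply x_d = x*(1 + k_x*r^2), y_d = y*(1 + k_x*r^2) around the image center."""
    h, w = image.shape[:2]
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    x, y = xs - cx, ys - cy                 # coordinates with the image center as origin
    r2 = x * x + y * y
    map_x = (x * (1.0 + k_x * r2) + cx).astype(np.float32)
    map_y = (y * (1.0 + k_x * r2) + cy).astype(np.float32)
    # Fractional sampling positions are resolved by bilinear interpolation.
    return cv2.remap(image, map_x, map_y, interpolation=cv2.INTER_LINEAR)
```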
S120, acquiring a current object distance, and obtaining a center deviation parameter according to the current object distance and a mapping relation between the object distance and the center deviation, which are obtained through pre-calibration.
The current object distance is matched with the current scene of the second lens, and the center deviation parameter is used to correct the deviation between the imaging centers of different lenses. Because the center deviation is related to the object distance and differs at different object distances, the mapping relation between different object distances and the center deviation can be obtained by pre-calibration; in the embodiment of the invention, the center deviation parameter is obtained after the current object distance is acquired.
With this technical scheme, the problem of image frame jump can be well solved at different object distances of the zoom photographing device.
S130, translating the image to be processed after distortion correction according to the center deviation parameter, and rotating the translated image according to the rotation parameter obtained by calibration in advance.
The rotation parameter is used to represent the difference in rotation angle within the image sensor plane between different lenses. In the embodiment of the invention, distortion correction of the image to be processed with the distortion correction parameter removes the influence of lens distortion on image frame jump, translation correction with the center deviation parameter removes the influence of inconsistent lens imaging centers, and rotation correction with the rotation parameter removes the influence of the difference in the in-plane rotation angles of the image sensors between different lenses.
In the embodiment of the invention, when the distortion-corrected image to be processed is translated and rotated, a center correction scaling rate and a rotation correction scaling rate can be set respectively. The center correction scaling rate means that, when the distortion-corrected image to be processed is translated according to the center deviation parameter, the translation region in the image to be processed is determined according to the center correction scaling rate. The rotation correction scaling rate means that, when the translated image is rotated according to the rotation parameter, the rotation region in the translation region is determined according to the rotation correction scaling rate.
For example, the center correction scaling rate may be set to 96%, i.e., a 96% region is cropped from the image to be processed for center translation correction, and the rotation correction scaling rate may be set to 94%, i.e., a 94% region is cropped again from the region after center translation correction for rotation correction. The center correction scaling rate and the rotation correction scaling rate may be set according to the hardware error of the zoom photographing device; their specific values are not limited in this embodiment.
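Under these example margins, the translation and rotation of S130 might be sketched as follows; the crop-based handling of the scaling rates and the sign convention of the shift are assumptions of this sketch.

```python
import cv2
import numpy as np

def center_crop(image, scale):
    h, w = image.shape[:2]
    ch, cw = int(round(h * scale)), int(round(w * scale))
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    return image[y0:y0 + ch, x0:x0 + cw]

def translate_and_rotate(image, offset_xy, theta_deg,
                         center_scale=0.96, rotation_scale=0.94):
    h, w = image.shape[:2]
    # Shift by the center deviation parameter (sign convention assumed).
    m_shift = np.float32([[1, 0, -offset_xy[0]], [0, 1, -offset_xy[1]]])
    shifted = cv2.warpAffine(image, m_shift, (w, h))
    shifted = center_crop(shifted, center_scale)           # e.g. keep the 96% region
    # Rotate by the calibrated in-plane angle about the new image center.
    sh, sw = shifted.shape[:2]
    m_rot = cv2.getRotationMatrix2D((sw / 2.0, sh / 2.0), theta_deg, 1.0)
    rotated = cv2.warpAffine(shifted, m_rot, (sw, sh))
    return center_crop(rotated, rotation_scale)            # e.g. keep the 94% region
```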
And S140, if the current digital zoom magnification is determined to be smaller than the digital zoom threshold, cutting and reconstructing the rotated image to be processed according to the current digital zoom magnification so as to realize focal length splicing.
The current digital zoom magnification can be calculated by dividing the digital zoom equivalent focal length by the focal length of the first lens, where the digital zoom equivalent focal length is the target focal length during zooming. The digital zoom threshold may be determined from the scaling factor, the center correction scaling rate and the rotation correction scaling rate; in particular, the digital zoom threshold may be the product of the scaling factor, the center correction scaling rate and the rotation correction scaling rate.
In the embodiment of the invention, in the process of zooming from f1 to f2, if the current digital zoom magnification corresponding to the image to be processed captured by the first lens is smaller than the digital zoom threshold, the rotated image to be processed is cropped according to the current digital zoom magnification; specifically, the length of the cropped image is the length of the image before cropping divided by the current digital zoom magnification, and the width of the cropped image is the width of the image before cropping divided by the current digital zoom magnification.
Super-resolution reconstruction is then performed on the cropped image to enlarge it to the resolution of the image to be processed. Super-resolution reconstruction refers to obtaining a high-resolution image from one or more low-resolution images with an image processing algorithm; the specific super-resolution reconstruction algorithm adopted is not limited in this embodiment.
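A minimal sketch of this crop-and-reconstruct step follows; the patent leaves the super-resolution algorithm open, so plain bicubic upscaling is used here purely as a placeholder.

```python
import cv2

def digital_zoom(image, zoom):
    """Crop a central window of size (W/zoom, H/zoom) and enlarge it back."""
    h, w = image.shape[:2]
    ch, cw = int(round(h / zoom)), int(round(w / zoom))
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = image[y0:y0 + ch, x0:x0 + cw]
    # Placeholder reconstruction: bicubic upscaling back to the original resolution.
    return cv2.resize(crop, (w, h), interpolation=cv2.INTER_CUBIC)
```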
In the embodiment of the invention, when the current digital zoom magnification corresponding to the image to be processed captured by the first lens reaches the digital zoom threshold, the image output by the zoom photographing device is switched to the image captured by the second lens at the focal length f2. At this point the angles of view of the images captured by the first lens and the second lens are highly consistent, and the jump in the image frame is minimal.
In the embodiment of the present invention, zooming from f2 to f3 is implemented through the second lens; the zooming process of the second lens is not described herein.
Correspondingly, when the focal length is changed from large to small, zooming from f3 to f2 is implemented through the second lens. When the focal length reaches f2, the image output by the zoom photographing device is switched to the image obtained after the first lens's captured image has undergone distortion correction, center translation correction, rotation correction, digital zoom and focal length splicing with the current digital zoom magnification equal to the digital zoom threshold. When the focal length is changed from f2 to f1, the image captured by the first lens is subjected to distortion correction, center translation correction, rotation correction, digital zoom and focal length splicing at the current digital zoom magnification corresponding to the target focal length, and the resulting image is output.
With the above technical scheme, the focal length can be changed arbitrarily between f1 and f3; when the focal length changes, only the angle of view of the digitally zoomed image changes while the resolution remains unchanged, so that a zoom effect equivalent to optical zoom is achieved and the focal length splicing process is realized.
According to the technical scheme of the embodiment, distortion correction is performed on the image to be processed captured by the first lens with the pre-calibrated distortion correction parameter; a center deviation parameter is obtained according to the current object distance and the pre-calibrated mapping relation between object distance and center deviation; center translation correction is performed with the center deviation parameter and rotation correction with the pre-calibrated rotation parameter; and when the current digital zoom magnification is smaller than the digital zoom threshold, digital zoom and focal length splicing are performed on the corrected image. This solves the problem in the prior art that the image frame jumps when lenses are switched during focal length splicing, and reduces image frame jump in the focal length splicing process.
Example two
Fig. 3 is a flowchart of a focal length splicing method according to the second embodiment of the present invention. On the basis of the foregoing embodiment, this embodiment further details the calibration process of the mapping relation between object distance and center deviation, the image distortion correction process, and the acquisition of the current object distance.
As shown in fig. 3, the method includes:
S210, performing distortion correction on the image to be processed captured by the first lens according to the distortion correction parameter obtained by pre-calibration.
In the above embodiment, the calibration process of the distortion correction parameter and the process of performing distortion correction on the image to be processed according to the distortion correction parameter have been specifically described, and this embodiment is not described herein again.
S220, judging whether the scene of the shooting device is fixed, if so, executing S230, otherwise, executing S240.
For a photographing device with a fixed scene, the current object distance only needs to be determined once after the scene is fixed, and the center deviation parameter is then calculated from it.
S230, acquiring a predetermined current object distance.
The current object distance is determined according to the zoom position and the focusing position of the second lens of the shooting device and a relation curve between the focusing position of the second lens and the object distance.
Fig. 4 provides a schematic diagram of a relationship between a focusing position and an object distance, and as shown in fig. 4, the relationship between a zoom position and a focusing position is different at different object distances. The relationship curve between the focusing position and the object distance may be obtained through calibration in advance or may be obtained through calculation of optical parameters of the zoom camera, which is not limited in this embodiment.
In the embodiment of the invention, the current object distance can be calculated from the zoom position and the focusing position of the second lens together with the relation curve between focusing position and object distance. If the values of the zoom position and the focusing position do not lie on a preset relation curve, linear interpolation can be performed between the preset relation curves to obtain the current object distance. For example, if a point (ZPos, FPos) lies between the curve with an object distance of 5 m and the curve with an object distance of 10 m, where ZPos represents the zoom position and FPos the focusing position, and at ZPos the focusing position corresponding to the 5 m curve is FPos_5m and that corresponding to the 10 m curve is FPos_10m, then the current object distance D is calculated by the following formula: D = 5 + (10 - 5)·(FPos - FPos_5m)/(FPos_10m - FPos_5m).
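The interpolation can be sketched as below, assuming the pre-calibrated relation curves are available as focus-position functions of the zoom position, one per calibrated object distance; the table format and the names are illustrative.

```python
def current_object_distance(zoom_pos, focus_pos, curves):
    """curves: {object_distance_m: focus_position_of(zoom_pos) callable},
    e.g. {5.0: f5, 10.0: f10, 20.0: f20} built from the calibrated curves."""
    distances = sorted(curves)
    # Focus position of each calibrated curve at the current zoom position.
    fpos = {d: curves[d](zoom_pos) for d in distances}
    for d_lo, d_hi in zip(distances, distances[1:]):
        lo, hi = fpos[d_lo], fpos[d_hi]
        if min(lo, hi) <= focus_pos <= max(lo, hi):
            if hi == lo:
                return d_lo
            t = (focus_pos - lo) / (hi - lo)     # linear interpolation weight
            return d_lo + t * (d_hi - d_lo)
    # Outside the calibrated range: fall back to the nearest calibrated curve.
    return min(distances, key=lambda d: abs(fpos[d] - focus_pos))
```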
S240, acquiring the current pan-tilt position, and determining the current object distance matched with the current pan-tilt position.
The current object distance is determined according to the zoom position and the focusing position of the second lens of the photographing device at the current pan-tilt position and the relation curve between the focusing position of the second lens and the object distance.
When the scene is not fixed, the zoom photographing device may be a dome camera or a pan-tilt integrated camera. In this case, for each pan-tilt position, the object distance corresponding to that pan-tilt position is determined in advance according to the zoom position and the focusing position of the second lens and the relation curve between focusing position and object distance; this object distance determination process has been described in the above steps and is not repeated here. In the non-fixed scene, the object distance corresponding to the current pan-tilt position is determined according to the current pan-tilt position.
S250, obtaining a center deviation parameter according to the current object distance and the mapping relation between the object distance and the center deviation obtained through pre-calibration.
Optionally, the calibration process of the mapping relationship between the object distance and the center deviation may be further divided into the following steps B1-B6:
B1, respectively acquiring a first marker calibration image and a second marker calibration image of the first lens and the second lens under the condition of calibrating object distances;
Specifically, at the calibration object distance, the first marker calibration image is obtained by photographing the marker scene with the first lens, and the second marker calibration image is obtained by photographing the marker scene with the second lens.
Two markers separated by a distance L are arranged in the calibration scene; meanwhile, to facilitate subsequent feature point matching, the calibration scene should contain as many details as possible.
B2, performing feature point matching on the first marker calibration image and the second marker calibration image to obtain a first feature point set and a second feature point set;
Before feature point matching is carried out, distortion correction is carried out on the first marker calibration image through distortion correction parameters obtained through pre-calibration, and the first marker calibration image and the second marker calibration image after the distortion correction are converted into gray level images.
And carrying out feature point matching on the two images converted into the gray level images, wherein the matched feature point sets are respectively marked as a first feature point set corresponding to the first marker calibration image after distortion correction and a second feature point set corresponding to the second marker calibration image.
For example, the SIFT (Scale-Invariant Feature Transform) algorithm or the SURF (Speeded-Up Robust Features) algorithm may be used for feature point matching; the specific algorithm used for feature point matching is not limited in this embodiment.
B3, obtaining a scaling factor, a rotation parameter, a central deviation abscissa value and a central deviation ordinate value according to the first characteristic point set, the second characteristic point set and the similar transformation formula;
Wherein the similarity transformation formula is Q = s·R·P + T, where P represents a feature point in the grayscale image corresponding to the distortion-corrected first marker calibration image, Q represents the feature point matched with P in the grayscale image corresponding to the second marker calibration image, s is the scaling factor, R is the rotation matrix, and θ represents the in-plane rotation angle of the image sensor, that is, the rotation parameter:
R = [cos θ, -sin θ; sin θ, cos θ];
T is the translation matrix, T = (T_x, T_y), where T_x is the center deviation abscissa value and T_y is the center deviation ordinate value. The relationship between P and Q in a homogeneous coordinate system can then be expressed by the following formulas:
x_Q = s·(x_P·cos θ - y_P·sin θ) + T_x;
y_Q = s·(x_P·sin θ + y_P·cos θ) + T_y;
where (x_Q, y_Q) denotes the pixel coordinates of Q and (x_P, y_P) denotes the pixel coordinates of P.
Taking the image center as the reference point for translation and rotation, the above formulas become:
x_Q - W/2 = s·((x_P - W/2)·cos θ - (y_P - H/2)·sin θ) + T_x;
y_Q - H/2 = s·((x_P - W/2)·sin θ + (y_P - H/2)·cos θ) + T_y;
where W represents the image width and H represents the image height.
Substituting the matched feature point pairs from the first feature point set and the second feature point set into the above formulas yields the values of the four parameters, namely the scaling factor, the rotation parameter, the center deviation abscissa value and the center deviation ordinate value.
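With a = s·cos θ and b = s·sin θ, the centered equations are linear in (a, b, T_x, T_y), so two or more matched pairs determine the four parameters by least squares. A sketch, assuming both calibration images share the same resolution; names are illustrative.

```python
import numpy as np

def solve_similarity(p_pts, q_pts, w, h):
    """p_pts, q_pts: (N, 2) matched pixel coordinates (N >= 2) from the first
    and second marker calibration images; w, h: image width and height."""
    p = np.asarray(p_pts, float) - [w / 2.0, h / 2.0]   # center-referenced P
    q = np.asarray(q_pts, float) - [w / 2.0, h / 2.0]   # center-referenced Q
    n = len(p)
    A = np.zeros((2 * n, 4))
    rhs = np.empty(2 * n)
    A[0::2] = np.column_stack([p[:, 0], -p[:, 1], np.ones(n), np.zeros(n)])
    A[1::2] = np.column_stack([p[:, 1],  p[:, 0], np.zeros(n), np.ones(n)])
    rhs[0::2], rhs[1::2] = q[:, 0], q[:, 1]
    a, b, t_x, t_y = np.linalg.lstsq(A, rhs, rcond=None)[0]
    s = float(np.hypot(a, b))                            # scaling factor
    theta = float(np.degrees(np.arctan2(b, a)))          # rotation parameter, in degrees
    return s, theta, float(t_x), float(t_y)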
Further, in order to reduce the influence of the error of the feature point matching on the parameter calculation result, the scaling factor, the rotation parameter, the center deviation abscissa value, and the center deviation ordinate value may be calculated by the following steps C1 to C6:
And C1, determining at least two pairs of first characteristic points in the first characteristic point set, wherein the first characteristic point pairs comprise two first characteristic points, and the distance between the two first characteristic points is larger than or equal to a preset distance threshold value.
Two first feature points whose distance is greater than or equal to the preset distance threshold are selected from the first feature point set to calculate the scaling factor, the rotation parameter, the center deviation abscissa value and the center deviation ordinate value; this reduces the parameter calculation error that would be caused by first feature points that are too close together.
And C2, calculating candidate transformation parameters according to each first characteristic point in the first characteristic point pair and the second characteristic point matched with each first characteristic point.
The transformation parameters are scaling factor, rotation parameter, and the abscissa value of the center deviation and the ordinate value of the center deviation.
And C3, for the remaining first feature points in the first feature point set other than the first feature point pair, calculating simulated second feature points matched with the remaining first feature points according to the candidate transformation parameters.
And C4, calculating the errors between the simulated second feature points and the second feature points in the second feature point set matched with the remaining first feature points.
And C5, sequencing each candidate transformation parameter according to the number of the simulated second characteristic points with the errors smaller than the preset error threshold value from large to small.
And C6, determining target transformation parameters according to the ordered candidate transformation parameters.
Specifically, the candidate transformation parameters of the N groups ranked in the front may be selected, the average value of the N groups of candidate transformation parameters may be calculated as the target transformation parameter, or the candidate transformation parameter ranked first may be used as the target transformation parameter, which is not limited in this embodiment.
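Steps C1-C6 can be sketched as the following consensus procedure, reusing the `solve_similarity` helper from the previous sketch; the distance threshold, error threshold and N are illustrative assumptions.

```python
import numpy as np

def robust_similarity(p_pts, q_pts, w, h,
                      min_pair_dist=100.0, err_thresh=2.0, top_n=5):
    p, q = np.asarray(p_pts, float), np.asarray(q_pts, float)
    scored = []
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if np.linalg.norm(p[i] - p[j]) < min_pair_dist:   # C1: only far-apart pairs
                continue
            s, theta, t_x, t_y = solve_similarity(p[[i, j]], q[[i, j]], w, h)  # C2
            rest = [k for k in range(len(p)) if k not in (i, j)]
            # C3: simulate second feature points for the remaining first points.
            c, si = np.cos(np.radians(theta)), np.sin(np.radians(theta))
            pc = p[rest] - [w / 2.0, h / 2.0]
            sim = s * (pc @ np.array([[c, si], [-si, c]])) + [t_x, t_y] + [w / 2.0, h / 2.0]
            # C4: error against the actually matched second feature points.
            err = np.linalg.norm(sim - q[rest], axis=1)
            # C5: score by the number of simulated points within the error threshold.
            scored.append((int(np.sum(err < err_thresh)), (s, theta, t_x, t_y)))
    scored.sort(key=lambda item: item[0], reverse=True)
    # C6: average the top-N candidate transformation parameters.
    best = np.array([params for _, params in scored[:top_n]])
    return tuple(best.mean(axis=0))
```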
B4, determining pixel deviation according to the optical axis distance between the first lens and the second lens, the number of image pixels and the marker distance;
B5, determining a translation vector according to the central deviation abscissa value and the central deviation ordinate value;
and B6, determining the mapping relation between the object distance and the center deviation according to the pixel deviation, the translation vector and the calibration object distance.
Specifically, the mapping relation between the object distance and the center deviation can be expressed by the following formulas: Offset_D = C_1·(D_c/D - 1) + C_2; C_1 = B·n/L; C_2 = (T_x, T_y); wherein Offset_D represents the center deviation of the first lens with respect to the second lens at the object distance D, D_c represents the calibration object distance, C_1 represents the pixel deviation, C_2 represents the translation vector, B represents the optical axis distance between the first lens and the second lens, n represents the number of image pixels between the two markers, L represents the marker distance, T_x represents the center deviation abscissa value, and T_y represents the center deviation ordinate value.
The derivation process of the mapping relation formula of the object distance and the center deviation is as follows:
Taking as an example two lenses arranged side by side in the vertical direction, the horizontal distance between the lens optical axes is zero. Fig. 5 shows a schematic diagram of the lens center shift. When the optical axis of the first lens module (the first lens) is parallel to that of the second lens module (the second lens), with a vertical optical axis distance B, an object distance D and a focal length f, the center deviation between the images formed by the two lenses caused by B is b = B·f/D. When there is an included angle β between the optical axes of the first lens and the second lens, the offset caused by the included angle is C = D·tan β, and the image deviation caused by C is c = C·f/D = f·tan β.
The image center deviation is composed of the deviation b caused by B and the deviation c caused by C, so the total deviation is Offset = b + c = B·f/D + f·tan β. At the calibration object distance D_c, Offset_Dc = B·f/D_c + f·tan β. Thus, for any object distance D, the center deviation can be expressed as Offset_D = B·f/D + Offset_Dc - B·f/D_c.
Since Offset_Dc is the center deviation at the object distance D_c, namely (T_x, T_y), and B and D_c are obtained during calibration, only f remains to be determined. Because the two markers separated by the distance L span n pixels in the calibration image taken at the object distance D_c, the focal length satisfies f = n·p·D_c/L, where p is the pixel size. Substituting f into Offset_D = B·f/D + Offset_Dc - B·f/D_c and converting Offset_D into pixels gives Offset_D = B·n·D_c/(L·D) + (T_x, T_y) - B·n/L, from which the mapping relation formula between the object distance and the center deviation is obtained.
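At run time (S250), the calibrated mapping can be evaluated as in the sketch below, which follows the reconstructed relation Offset_D = C_1·(D_c/D - 1) + C_2 with C_1 = B·n/L and C_2 = (T_x, T_y); applying the parallax term along the vertical direction only (the direction of the optical-axis offset in the example of Fig. 5) is an assumption of this sketch.

```python
def center_deviation(d_current, d_calib, b_axis, n_pixels, l_marker, t_x, t_y):
    """Return the (x, y) center deviation parameter at the current object distance."""
    c1 = b_axis * n_pixels / l_marker          # pixel deviation C1 = B*n/L
    extra = c1 * (d_calib / d_current - 1.0)   # parallax change relative to D_c
    return t_x, t_y + extra                    # offset applied along the vertical baseline
```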
And S260, translating the image to be processed after distortion correction according to the center deviation parameter, and rotating the translated image according to the rotation parameter obtained by calibration in advance.
The process of performing the center shift correction according to the center deviation parameter and the process of performing the rotation correction according to the rotation parameter have been specifically described in the above embodiments, and the present embodiment is not described herein.
And S270, if the current digital zoom magnification is determined to be smaller than the digital zoom threshold, cutting and reconstructing the rotated image to be processed according to the current digital zoom magnification so as to realize focal length splicing.
Likewise, the above embodiments of the digital zoom and focal length splicing process have been specifically described, and this embodiment is not described herein again.
Example III
Fig. 6 is a schematic structural diagram of a focal length splicing device according to a third embodiment of the present invention. As shown in fig. 6, the apparatus includes: a distortion correction module 310, a current object distance acquisition module 320, a translational rotation correction module 330, and a focal length stitching module 340. Wherein:
The distortion correction module 310 is configured to perform distortion correction on an image to be processed obtained by shooting with the first lens according to a distortion correction parameter obtained by calibration in advance;
The current object distance obtaining module 320 is configured to obtain a current object distance, and obtain a center deviation parameter according to the current object distance and a mapping relationship between the object distance and a center deviation obtained by calibration in advance;
The translation rotation correction module 330 is configured to translate the distortion corrected image to be processed according to the center deviation parameter, and rotate the translated image according to the rotation parameter obtained by calibration in advance;
And the focal length splicing module 340 is configured to cut and reconstruct the rotated image to be processed according to the current digital zoom magnification if the current digital zoom magnification is determined to be smaller than the digital zoom threshold, so as to implement focal length splicing.
According to the technical scheme of the embodiment, distortion correction is performed on the image to be processed captured by the first lens with the pre-calibrated distortion correction parameter; a center deviation parameter is obtained according to the current object distance and the pre-calibrated mapping relation between object distance and center deviation; center translation correction is performed with the center deviation parameter and rotation correction with the pre-calibrated rotation parameter; and when the current digital zoom magnification is smaller than the digital zoom threshold, digital zoom and focal length splicing are performed on the corrected image. This solves the problem in the prior art that the image frame jumps when lenses are switched during focal length splicing, and reduces image frame jump in the focal length splicing process.
On the basis of the above embodiment, the apparatus further includes a distortion correction parameter calibration module, specifically configured to:
Respectively acquiring a first checkerboard calibration image of the first lens when the object distance is infinity and a second checkerboard calibration image of the second lens when the object distance is infinity, and respectively carrying out angular point identification on the first checkerboard calibration image and the second checkerboard calibration image;
determining the distortion rate of each corner in the second checkerboard calibration image to obtain a second distortion rate curve;
Fitting according to the actual distance and the ideal distance of each corner point in the first checkerboard calibration image to obtain a first distortion coefficient;
carrying out distortion correction on the first checkerboard calibration image according to the target distortion coefficient, and obtaining a target distortion rate curve according to the distortion rate of each corner point in the first checkerboard calibration image after the distortion correction;
Wherein the target distortion coefficient has a value of 0 to a first distortion coefficient;
And taking the corresponding target distortion coefficient when the error of the target distortion rate curve and the second distortion rate curve meets the preset error condition as a distortion correction parameter.
Based on the above embodiment, the distortion correction module 310 includes:
The distortion correction unit is used for carrying out distortion correction on the image to be processed obtained by shooting by the first lens according to the following formula:
x_d = x·(1 + k_x·r²);
y_d = y·(1 + k_x·r²);
r² = x² + y²;
Wherein (x, y) are the pixel coordinates in the image to be processed, (x_d, y_d) are the pixel coordinates in the distortion-corrected image to be processed, and k_x is the distortion correction parameter.
Based on the above embodiment, the current object distance acquiring module 320 includes:
The fixed scene object distance acquisition unit is used for acquiring a predetermined current object distance if the scene of the shooting device is fixed, wherein the current object distance is determined according to the zoom position and the focusing position of the second lens of the shooting device and the relation curve of the focusing position of the second lens and the object distance;
and a non-fixed scene object distance acquisition unit, configured to, if the scene of the photographing device is not fixed, acquire the current pan-tilt position and determine the current object distance matched with the current pan-tilt position, wherein the current object distance is determined according to the zoom position and the focusing position of the second lens of the photographing device at the current pan-tilt position and the relation curve between the focusing position of the second lens and the object distance.
On the basis of the above embodiment, the device further includes a center deviation calibration module, specifically configured to:
respectively acquiring a first marker calibration image and a second marker calibration image of the first lens and the second lens under the condition of calibrating object distances;
Performing feature point matching on the first marker calibration image and the second marker calibration image to obtain a first feature point set and a second feature point set;
Obtaining a scaling factor, a rotation parameter, a central deviation abscissa value and a central deviation ordinate value according to the first characteristic point set, the second characteristic point set and the similar transformation formula;
Determining pixel deviation according to the optical axis distance between the first lens and the second lens, the number of image pixels and the marker distance;
determining a translation vector according to the center deviation abscissa value and the center deviation ordinate value;
And determining the mapping relation between the object distance and the center deviation according to the pixel deviation, the translation vector and the calibrated object distance.
On the basis of the above embodiment, the mapping relationship between the object distance and the center deviation is expressed by the following formula:
Offset_D = C_1·(D_c/D - 1) + C_2;
C_1 = B·n/L;
C_2 = (T_x, T_y);
Wherein Offset_D represents the center deviation of the first lens with respect to the second lens at the object distance D, D_c represents the calibration object distance, C_1 represents the pixel deviation, C_2 represents the translation vector, B represents the optical axis distance between the first lens and the second lens, n represents the number of image pixels between the two markers, L represents the marker distance, T_x represents the center deviation abscissa value, and T_y represents the center deviation ordinate value.
On the basis of the embodiment, the digital zoom threshold is determined according to a zoom factor, a center correction zoom rate and a rotation correction zoom rate.
The focal length splicing device provided by the embodiment of the invention can execute the focal length splicing method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example IV
Fig. 7 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention, and as shown in fig. 7, the electronic device includes a processor 70, a memory 71, an input device 72 and an output device 73; the number of processors 70 in the electronic device may be one or more, one processor 70 being taken as an example in fig. 7; the processor 70, the memory 71, the input means 72 and the output means 73 in the electronic device may be connected by a bus or other means, in fig. 7 by way of example.
The memory 71 is used as a computer readable storage medium for storing software programs, computer executable programs, and modules, such as modules corresponding to the focal length splicing method in the embodiment of the present invention (for example, the distortion correction module 310, the current object distance acquisition module 320, the translational rotation correction module 330, and the focal length splicing module 340 in the focal length splicing device). The processor 70 executes various functional applications of the electronic device and data processing, i.e., implements the above-described focus splicing method, by running software programs, instructions, and modules stored in the memory 71. The method comprises the following steps:
According to distortion correction parameters obtained through pre-calibration, performing distortion correction on an image to be processed obtained through shooting by a first lens;
Acquiring a current object distance, and acquiring a center deviation parameter according to the current object distance and a mapping relation between the object distance and the center deviation, which are obtained by pre-calibration;
translating the image to be processed after distortion correction according to the center deviation parameter, and rotating the translated image according to the rotation parameter obtained by calibration in advance;
If the current digital zoom magnification is determined to be smaller than the digital zoom threshold, cutting and reconstructing the rotated image to be processed according to the current digital zoom magnification so as to realize focal length splicing.
The memory 71 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, at least one application program required for functions; the storage data area may store data created according to the use of the terminal, etc. In addition, memory 71 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, memory 71 may further include memory remotely located relative to processor 70, which may be connected to the electronic device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 72 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. The output device 73 may include a display device such as a display screen.
Example five
A fifth embodiment of the present invention further provides a storage medium containing computer-executable instructions which, when executed by a computer processor, are used to perform a focal length splicing method, the method comprising:
performing distortion correction on an image to be processed captured by a first lens according to distortion correction parameters obtained through pre-calibration;
acquiring a current object distance, and obtaining a center deviation parameter according to the current object distance and a pre-calibrated mapping relation between the object distance and the center deviation;
translating the distortion-corrected image to be processed according to the center deviation parameter, and rotating the translated image according to a pre-calibrated rotation parameter; and
if the current digital zoom magnification is determined to be smaller than a digital zoom threshold, cropping and reconstructing the rotated image to be processed according to the current digital zoom magnification, so as to realize focal length splicing.
Of course, the storage medium containing the computer executable instructions provided in the embodiments of the present invention is not limited to the above-described method operations, and may also perform the related operations in the focal length splicing method provided in any embodiment of the present invention.
From the above description of the embodiments, it will be clear to those skilled in the art that the present invention may be implemented by means of software and necessary general-purpose hardware, and of course may also be implemented by hardware, although in many cases the former is the preferred embodiment. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a flash memory (FLASH), a hard disk or an optical disk of a computer, and includes several instructions for causing an electronic device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments of the present invention.
It should be noted that, in the embodiment of the focal length splicing apparatus described above, the units and modules included are only divided according to functional logic, but the division is not limited thereto, as long as the corresponding functions can be implemented; in addition, the specific names of the functional units are only used to distinguish them from one another and are not intended to limit the protection scope of the present invention.
It should be noted that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will understand that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements and substitutions can be made without departing from the scope of the invention. Therefore, although the invention has been described in detail through the above embodiments, the invention is not limited to these embodiments and may include many other equivalent embodiments without departing from the concept of the invention, the scope of which is defined by the appended claims.

Claims (10)

1. A focal length splicing method, comprising:
performing distortion correction on an image to be processed captured by a first lens according to distortion correction parameters obtained through pre-calibration;
acquiring a current object distance, and obtaining a center deviation parameter according to the current object distance and a pre-calibrated mapping relation between the object distance and the center deviation;
translating the distortion-corrected image to be processed according to the center deviation parameter, and rotating the translated image according to a pre-calibrated rotation parameter; and
if the current digital zoom magnification is determined to be smaller than a digital zoom threshold, cropping and reconstructing the rotated image to be processed according to the current digital zoom magnification, so as to realize focal length splicing.
2. The method according to claim 1, wherein the calibration process of the distortion correction parameter is:
respectively acquiring a first checkerboard calibration image of the first lens when the object distance is infinity and a second checkerboard calibration image of the second lens when the object distance is infinity, and respectively performing corner identification on the first checkerboard calibration image and the second checkerboard calibration image;
determining the distortion rate of each corner in the second checkerboard calibration image to obtain a second distortion rate curve;
fitting a first distortion coefficient according to the actual corner distance and the ideal corner distance of each corner in the first checkerboard calibration image;
performing distortion correction on the first checkerboard calibration image according to a target distortion coefficient, and obtaining a target distortion rate curve according to the distortion rate of each corner in the distortion-corrected first checkerboard calibration image, wherein the target distortion coefficient takes a value between 0 and the first distortion coefficient; and
taking, as the distortion correction parameter, the target distortion coefficient for which the error between the target distortion rate curve and the second distortion rate curve meets a preset error condition.
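A compact sketch of the coefficient search described in claim 2, under the assumption that the corner radii of the first lens's checkerboard (actual and ideal) and the second lens's distortion-rate curve are already available as NumPy arrays sampled at corresponding corners; all function and variable names are illustrative, not from the patent.

```python
import numpy as np

def distortion_rate(actual_r, ideal_r):
    """Per-corner distortion rate: relative deviation of the actual corner
    radius from the ideal (distortion-free) corner radius."""
    return (actual_r - ideal_r) / ideal_r

def fit_first_coefficient(actual_r, ideal_r):
    """Least-squares fit of k in  actual_r = ideal_r * (1 + k * ideal_r**2)."""
    a = ideal_r ** 3
    b = actual_r - ideal_r
    return float(np.dot(a, b) / np.dot(a, a))

def search_correction_parameter(actual_r1, ideal_r1, rate_curve_lens2, steps=200):
    """Scan candidate coefficients in [0, k1] and keep the one whose corrected
    distortion-rate curve of the first lens is closest to that of the second lens."""
    k1 = fit_first_coefficient(actual_r1, ideal_r1)
    best_k, best_err = 0.0, np.inf
    for k in np.linspace(0.0, k1, steps):
        corrected_r = actual_r1 * (1.0 + k * actual_r1 ** 2)  # candidate correction
        err = np.mean(np.abs(distortion_rate(corrected_r, ideal_r1) - rate_curve_lens2))
        if err < best_err:
            best_k, best_err = k, err
    return best_k, best_err
```

Whether the returned error satisfies the preset error condition of the claim would be checked by the caller against whatever tolerance the device uses.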
3. The method according to claim 2, wherein performing distortion correction on the image to be processed captured by the first lens according to the distortion correction parameters obtained through pre-calibration comprises:
performing distortion correction on the image to be processed captured by the first lens according to the following formulas:
x_d = x(1 + k_x r^2);
y_d = y(1 + k_x r^2);
r^2 = x^2 + y^2;
wherein (x, y) are the pixel coordinates in the image to be processed, (x_d, y_d) are the pixel coordinates in the distortion-corrected image to be processed, and k_x is the distortion correction parameter.
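As an illustration of these formulas (assuming the coordinates are expressed relative to the image center and normalized, which the claim does not state explicitly), a direct per-coordinate implementation might look like the following; the function name and the sample value of k_x are illustrative.

```python
import numpy as np

def correct_coordinates(x, y, k_x):
    """Apply x_d = x*(1 + k_x*r^2), y_d = y*(1 + k_x*r^2) with r^2 = x^2 + y^2.
    x and y are pixel coordinates expressed relative to the image center."""
    r2 = x ** 2 + y ** 2
    factor = 1.0 + k_x * r2
    return x * factor, y * factor

# Usage on a small grid of normalized coordinates; k_x = -0.05 is illustrative.
xs, ys = np.meshgrid(np.linspace(-1.0, 1.0, 5), np.linspace(-1.0, 1.0, 5))
xd, yd = correct_coordinates(xs, ys, k_x=-0.05)
```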
4. The method of claim 1, wherein obtaining the current object distance comprises:
if the scene of the shooting device is fixed, acquiring a predetermined current object distance, wherein the current object distance is determined according to the zoom position and the focusing position of a second lens of the shooting device and a relation curve between the focusing position of the second lens and the object distance; and
if the scene of the shooting device is not fixed, acquiring a current pan-tilt position and determining the current object distance matched with the current pan-tilt position, wherein the current object distance is determined according to the zoom position and the focusing position of the second lens of the shooting device at the current pan-tilt position and the relation curve between the focusing position of the second lens and the object distance.
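One way the focus-position-to-object-distance relation curve of claim 4 could be stored and queried, assuming it is sampled as (focus position, object distance) pairs for a given zoom position of the second lens; the table values and names below are purely illustrative.

```python
import numpy as np

# Illustrative relation curve for one zoom position of the second lens:
# focus motor positions (monotonically increasing) and the object distances
# (in meters) at which each position brings the scene into focus.
FOCUS_POSITIONS = np.array([100.0, 300.0, 600.0, 900.0, 1200.0])
OBJECT_DISTANCES = np.array([0.5, 1.0, 3.0, 10.0, 50.0])

def current_object_distance(focus_position):
    """Interpolate the object distance from the second lens's focus position."""
    return float(np.interp(focus_position, FOCUS_POSITIONS, OBJECT_DISTANCES))

# Example: a focus position of 750 maps to an object distance between 3 m and 10 m.
print(current_object_distance(750.0))
```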
5. The method according to claim 1, wherein the calibration process of the mapping relation between the object distance and the center deviation is:
respectively acquiring a first marker calibration image of the first lens and a second marker calibration image of the second lens at the calibration object distance;
performing feature point matching on the first marker calibration image and the second marker calibration image to obtain a first feature point set and a second feature point set;
obtaining a scaling factor, a rotation parameter, a center deviation abscissa value and a center deviation ordinate value according to the first feature point set, the second feature point set and a similarity transformation formula;
determining a pixel deviation according to the optical-axis distance between the first lens and the second lens, the number of image pixels and the marker spacing;
determining a translation vector according to the center deviation abscissa value and the center deviation ordinate value; and
determining the mapping relation between the object distance and the center deviation according to the pixel deviation, the translation vector and the calibration object distance.
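A sketch of the similarity-transform step described in claim 5, assuming the matched feature point sets from the two marker calibration images are already available (for example from ORB descriptors with ratio-test matching). cv2.estimateAffinePartial2D fits a four-parameter (scale, rotation, translation) transform and is used here as a stand-in for the similarity transformation formula; all names are illustrative.

```python
import cv2
import numpy as np

def fit_similarity(points_first, points_second):
    """Estimate scale s, rotation angle theta (degrees) and translation (tx, ty)
    such that points_second ~ s * R(theta) @ points_first + (tx, ty)."""
    src = np.asarray(points_first, dtype=np.float32).reshape(-1, 1, 2)
    dst = np.asarray(points_second, dtype=np.float32).reshape(-1, 1, 2)
    M, _ = cv2.estimateAffinePartial2D(src, dst)
    if M is None:
        raise ValueError("not enough matched points to fit a similarity transform")
    # M has the form [[s*cos(t), -s*sin(t), tx], [s*sin(t), s*cos(t), ty]].
    scale = float(np.hypot(M[0, 0], M[1, 0]))
    theta = float(np.degrees(np.arctan2(M[1, 0], M[0, 0])))
    tx, ty = float(M[0, 2]), float(M[1, 2])
    return scale, theta, (tx, ty)
```

The returned translation corresponds to the center deviation abscissa and ordinate values, and the scale and angle correspond to the scaling factor and rotation parameter of the claim.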
6. The method of claim 5, wherein the mapping between object distance and center deviation is represented by the following formula:
C_2 = (T_x, T_y);
B = (B_x, B_y);
wherein Offset_D represents the center deviation of the first lens with respect to the second lens at object distance D, D_c represents the calibration object distance, C_1 represents the pixel deviation, C_2 represents the translation vector, B_x and B_y represent the horizontal and vertical spacing of the optical axes of the first lens and the second lens (if the two lenses are mounted vertically, they can be adjusted to be aligned in the horizontal direction, i.e. the horizontal spacing is zero; if the two lenses are mounted horizontally, they can be adjusted to be aligned in the vertical direction, i.e. the vertical spacing is zero), n represents the number of image pixels, L represents the marker spacing, T_x represents the abscissa of the center deviation, and T_y represents the ordinate of the center deviation.
7. The method of claim 5, wherein the digital zoom threshold is determined based on a scaling factor, a center correction scaling factor, and a rotation correction scaling factor.
8. A focal length splicing apparatus, comprising:
the distortion correction module is used for performing distortion correction on an image to be processed captured by a first lens according to distortion correction parameters obtained through pre-calibration;
the current object distance acquisition module is used for acquiring a current object distance and obtaining a center deviation parameter according to the current object distance and a pre-calibrated mapping relation between the object distance and the center deviation;
the translation and rotation correction module is used for translating the distortion-corrected image to be processed according to the center deviation parameter and rotating the translated image according to a pre-calibrated rotation parameter; and
the focal length splicing module is used for, if the current digital zoom magnification is determined to be smaller than a digital zoom threshold, cropping and reconstructing the rotated image to be processed according to the current digital zoom magnification, so as to realize focal length splicing.
9. An electronic device, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the focal length splicing method according to any one of claims 1-7 when executing the program.
10. A storage medium storing computer-executable instructions which, when executed by a computer processor, are used to perform the focal length splicing method according to any one of claims 1-7.