CN106875341B - Distorted image correction method and positioning method thereof - Google Patents

Distorted image correction method and positioning method thereof

Info

Publication number
CN106875341B
Authority
CN
China
Prior art keywords
image
distorted
distorted image
steps
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510919593.6A
Other languages
Chinese (zh)
Other versions
CN106875341A (en)
Inventor
罗运岑
杜亚凤
周炳
王冬梅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ningbo Sunny Opotech Co Ltd
Original Assignee
Ningbo Sunny Opotech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo Sunny Opotech Co Ltd filed Critical Ningbo Sunny Opotech Co Ltd
Priority to CN202110870949.7A priority Critical patent/CN114331860A/en
Priority to CN201510919593.6A priority patent/CN106875341B/en
Priority to PCT/CN2016/107427 priority patent/WO2017092631A1/en
Publication of CN106875341A publication Critical patent/CN106875341A/en
Application granted granted Critical
Publication of CN106875341B publication Critical patent/CN106875341B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T5/80

Abstract

The invention provides a distorted image correction method and a distorted image positioning method that can be used for real-time correction of fisheye images. The distorted image correction method includes the steps of: (1) determining correction parameters; and (2) determining a correction algorithm based on the correction parameters; wherein step (1) further comprises the steps of: (11) determining the positions and contours of multiple paths of distorted images; and (12) determining a correction factor for the distorted image. The distorted image correction method adopts a four-point positioning method to accurately position the position and contour of the distorted image, so as to ensure the accuracy and effectiveness of the distorted image correction method.

Description

Distorted image correction method and positioning method thereof
Technical Field
The present invention relates to image correction technology in the field of photography, and more particularly, to a method for correcting distorted images and a method for positioning the same.
Background
Photography and video recording play a very important role in the daily life and work of modern people and have become an indispensable part of both.
People have grown used to recording the small moments of life with electronic devices that have photographing and video-capture functions. People want a tool with which to record moments worth remembering: children growing up, gatherings of friends and relatives, and beautiful scenery encountered in life.
With the increasing diversification of people's demands on photographic technology, various photographic lenses have come into use and favor. For example, in order to give photographic imaging equipment a wider field of view, the "distorted image" is produced. Such imaging has the characteristics of a short focal length and a large field of view, and there is wide market demand for it in omnidirectional vision systems.
A distorted image can reach an ultra-large viewing angle close to or exceeding 180 degrees, so a much larger scene can be captured in a single distorted image, which gives it great potential application value. For example, a video surveillance system that applies distorted images to public places can, with a single ceiling-mounted camera, record the scene of an entire area; people therefore no longer need to install multiple monitoring cameras in different areas, which saves space, resources and cost. For another example, people often encounter scenes in daily life that look beautiful to the eye yet cannot be captured with the camera at hand, mainly because the viewing angle of the camera cannot match the range visible to the human eye.
Although the distorted image has the advantage of a large field of view, which can reach or even exceed the range visible to the human eye, such an oversized viewing angle is achieved by sacrificing the original appearance of the subject. That is, an image captured in this way is distorted, and the fisheye image contour appears as a circular structure. When shooting close to a subject, the distorted image produces a very strong perspective effect, emphasizing the near-large, far-small contrast and giving the picture a striking appeal, so it is popular with photography enthusiasts. Beyond enhancing artistic appeal, however, such distortion is often undesirable. For example, surveillance cameras are now installed in many necessary locations and help keep a record of daily activities; some surveillance recordings may even serve as valid evidence of facts, but distorted pictures tend to hinder the recognition of details.
Even though distorted images can be aesthetically appealing, many consumers want them restored to their original appearance. Whether for keepsakes or for comparison with the distorted image, this has significant application value. Therefore, techniques for correcting distorted images are receiving much attention from developers.
The prerequisite of fisheye image correction is extracting the contour of the fisheye image. Commonly used fisheye image contour extraction methods include the area statistics method, the scan-line approximation method and the region growing method. Each has its advantages and disadvantages, but none can completely and accurately locate the center coordinates and radius of the fisheye image, and their application ranges are limited.
For distortion correction of fisheye images, current methods can be broadly grouped into 3D correction and 2D correction. The main methods in the field include correction based on a spherical perspective projection model, correction based on a quadric surface perspective model, fisheye image distortion correction based on circle segmentation, fisheye image plane correction based on geometric properties, and so on. Each has its advantages and disadvantages, but in terms of computational complexity and correction effect none fully meets the real-time correction requirements of high-definition video, and they remain some distance from practical application.
Real-time correction of fisheye images is of great significance for consumers to obtain corrected images in time, and it is especially important for correcting fisheye video images. A real-time and efficient correction method for high-definition fisheye video is therefore urgently needed.
Disclosure of Invention
The main object of the present invention is to provide a distorted image correction method and a positioning method thereof, wherein the positioning method can accurately locate the circle center and radius of a fisheye image.
Another object of the present invention is to provide a distorted image correction method that has a fast correction speed and a good correction effect.
Another object of the present invention is to provide a distorted image correction method that can be used to correct a fisheye image.
Another object of the present invention is to provide a distorted image correction method that is suitable for real-time correction of fisheye images.
Another object of the present invention is to provide a distorted image correction method that can be used to correct multiple paths of fisheye images.
Another object of the present invention is to provide a distorted image correction method that fully considers the circular structure of the fisheye image contour, makes full use of the correlation among the multiple paths of fisheye video images, and is suitable for realizing real-time correction of multiple paths of high-definition fisheye images on an embedded platform.
Other advantages and features of the invention will become apparent from the following description and may be realized by means of the instrumentalities and combinations particularly pointed out in the appended claims.
According to an aspect of the present invention, there is provided a distorted image correction method for correction of a fisheye image, wherein the distorted image correction method comprises the steps of:
(1) determining a correction parameter; and
(2) determining a correction algorithm according to the correction parameters;
wherein the step (1) comprises the following steps:
(11) determining the positions and contours of multiple paths of distorted images; and
(12) determining a correction factor for the distorted image;
wherein in the step (11), a four-point positioning method is adopted to accurately position the position and contour of the distorted image, so as to ensure the accuracy and effectiveness of the distorted image correction method.
According to one embodiment, said step (11) comprises the steps of:
(113) superposing the distorted image to obtain a superposed image;
(114) linearly compressing the superposed image to obtain a normalized image; and
(115) determining the position and contour of the distorted image according to the pixel value of the normalized image at each position.
According to one embodiment, before step (113), said step (11) comprises the steps of:
(112) filtering the distorted image to filter out noise from the distorted image.
According to one embodiment, step (114) comprises the steps of:
(1141) obtaining a maximum pixel value P_max and a minimum pixel value P_min of the superimposed image; and
(1142) performing linear compression on the superimposed image I_S according to the maximum pixel value P_max and the minimum pixel value P_min of the superimposed image;
wherein the linear compression is performed using the following formula:
p_{x,y} = 255 · (P_{x,y} - P_min) / (P_max - P_min)
wherein p_{x,y} is the pixel value of the normalized image after linear compression at coordinate point (x, y), and P_{x,y} is the pixel value at coordinate point (x, y) on the superimposed image.
According to one embodiment, step (115) comprises the steps of:
(1151) setting a threshold T_h;
(1152) recording the points on the normalized image whose pixel values are greater than or equal to the threshold T_h; and
(1153) determining the position and contour of the distorted image from the points on the normalized image, determined in step (1152), whose pixel values are greater than or equal to the threshold.
According to one embodiment, the threshold T_h can be obtained by the following formula:
T_h = (1 / (W·H)) · Σ_{x=1..W} Σ_{y=1..H} p_{x,y}
wherein p_{x,y} is the pixel value of the normalized image at coordinate point (x, y), W is the image width of the normalized image, and H is the image height of the normalized image.
According to one embodiment, step (1152) comprises the steps of:
(11521) Scanning the normalized image from four directions; and
(11522) respectively recording the first point encountered in the scanning process in each of the four directions whose pixel value is greater than or equal to the threshold T_h.
According to one embodiment, step (1) further comprises a step of:
(13) establishing a plane rectangular coordinate system;
wherein step (1152) further comprises the steps of:
(11523) accurately positioning the circle center position and the imaging radius of the distorted image according to the coordinate values, in the plane rectangular coordinate system, of the first points greater than or equal to the threshold T_h encountered during the scanning in the four directions in step (11522);
wherein the four directions include row-by-row from top to bottom, row-by-row from bottom to top, column-by-column from left to right, and column-by-column from right to left, and the first points greater than or equal to the threshold T_h encountered in the scanning from top to bottom row by row, from bottom to top row by row, from left to right column by column, and from right to left column by column are respectively denoted as
p(x1, y1), p(x2, y2), p(x3, y3), p(x4, y4).
Wherein step (1152) further comprises the steps of:
(11524) Respectively calculating the vertical distance and the horizontal distance of the two groups of corresponding coordinates, wherein the calculation mode is as follows:
d1=|y1-y2|
d2=|x3-x4|
(11525) determining the imaging diameter d3 of the distorted image as the larger of d1 and d2, so that the imaging radius R of the distorted image is d3/2; and
(11536) determining the circle center position of the distorted image, the circle center coordinate being (x_c, y_c), wherein
x_c = (x3 + x4) / 2
y_c = (y1 + y2) / 2.
according to one embodiment, step (1) further comprises the steps of:
(14) determining the coordinate points (x_il, y_i) of the contour points of the distorted image in the plane rectangular coordinate system; and
(15) determining the horizontal distance l_ik of the contour points of the distorted image from the center of the image;
wherein
x_il = x_c ± √(R² - (y_i - y_c)²),  l_ik = √(R² - (y_k - y_c)²)
wherein x_il is the horizontal coordinate and y_i is the vertical coordinate of a contour point of the ith distorted image, and l_ik is the horizontal distance from the contour point with vertical coordinate y_k of the ith distorted image to the center of the image.
According to one embodiment, step (12) comprises the steps of:
(121) detecting angular points of a plurality of paths of the distorted images;
(122) detecting the corner points of the superposed images; and
(123) determining the correction factor α_i of each path of image according to the corner points of each path of distorted image and the corner points of the superimposed image;
wherein the coordinates of the corner points of the multiple paths of distorted images detected in step (121) are denoted in the plane rectangular coordinate system as (x_ik, y_ik), where i indicates which path of video and k is the corner point number in the video.
According to one embodiment, step (123) comprises the steps of:
(1231) accumulating the abscissas of the corner points of each path of distorted image in the plane rectangular coordinate system, to obtain the accumulated value X_i of the corner abscissas of each path of distorted image;
(1232) accumulating the abscissas of the corner points of the superimposed image in the plane rectangular coordinate system, to obtain the accumulated value X_M of the corner abscissas of the superimposed image;
(1233) setting a correction factor α_M of the superimposed image; and
(1234) calculating the correction factor α_i of each path of distorted image, where α_i = α_M · X_i / X_M.
According to one embodiment, the value of the correction factor α_M of the superimposed image ranges between 0.7 and 1.3.
According to one embodiment, step (2) further comprises the steps of:
(21) determining a distortion correction formula according to the correction parameters obtained in the step (1) as follows:
[Distortion correction formula, given only as an image in the original document]
according to one embodiment, before step (112), said step (11) further comprises the steps of:
(111) acquiring checkerboard distortion images of the multiple lenses.
According to one embodiment, the distorted image correction method further includes the steps of:
(3) correcting the multiple paths of distorted images according to the correction algorithm.
According to one embodiment, step (3) further comprises the steps of:
(31) generating a distortion correction table according to the distortion correction formula in step (21).
According to one embodiment, step (3) further comprises the steps of:
(32) applying the correction table to multiple paths of high-definition distorted images under an embedded system to realize real-time correction of the distorted images.
According to another aspect of the present invention, the present invention further provides a distorted image positioning method for positioning a fish-eye image, wherein the distorted image positioning method comprises the following steps:
(113) superposing the multi-path distorted images to obtain a superposed image;
(114) linearly compressing the superposed image to obtain a normalized image; and
(115) determining the position and contour of the distorted image according to the pixel value of the normalized image at each position.
According to one embodiment, before step (113), the distorted image positioning method further comprises the steps of:
(112) filtering the distorted image to filter out noise from the distorted image.
According to one embodiment, step (114) comprises the steps of:
(1141) obtaining a maximum pixel value P_max and a minimum pixel value P_min of the superimposed image; and
(1142) performing linear compression on the superimposed image I_S according to the maximum pixel value P_max and the minimum pixel value P_min of the superimposed image;
wherein the linear compression is performed using the following formula:
p_{x,y} = 255 · (P_{x,y} - P_min) / (P_max - P_min)
wherein p_{x,y} is the pixel value of the normalized image after linear compression at coordinate point (x, y), and P_{x,y} is the pixel value at coordinate point (x, y) on the superimposed image.
According to one embodiment, step (115) comprises the steps of:
(1151) setting a threshold T_h;
(1152) recording the points on the normalized image whose pixel values are greater than or equal to the threshold T_h; and
(1153) determining the position and contour of the distorted image from the points on the normalized image, determined in step (1152), whose pixel values are greater than or equal to the threshold.
According to one embodiment, the threshold T_h can be obtained by the following formula:
T_h = (1 / (W·H)) · Σ_{x=1..W} Σ_{y=1..H} p_{x,y}
wherein p_{x,y} is the pixel value of the normalized image at coordinate point (x, y), W is the image width of the normalized image, and H is the image height of the normalized image.
According to one embodiment, step (1152) comprises the steps of:
(11521) Scanning the normalized image from four directions; and
(11522) respectively recording the first point encountered in the scanning process in each of the four directions whose pixel value is greater than or equal to the threshold T_h.
According to one embodiment, the distorted image positioning method further comprises the steps of:
(13) establishing a plane rectangular coordinate system;
wherein step (1152) further comprises the steps of:
(11523) accurately positioning the circle center position and the imaging radius of the distorted image according to the coordinate values, in the plane rectangular coordinate system, of the first points greater than or equal to the threshold T_h encountered during the scanning in the four directions in step (11522);
wherein the four directions include row-by-row from top to bottom, row-by-row from bottom to top, column-by-column from left to right, and column-by-column from right to left, and the first points greater than or equal to the threshold T_h encountered in the scanning from top to bottom row by row, from bottom to top row by row, from left to right column by column, and from right to left column by column are respectively denoted as
p(x1, y1), p(x2, y2), p(x3, y3), p(x4, y4).
Wherein step (1152) further comprises the steps of:
(11524) Respectively calculating the vertical distance and the horizontal distance of the two groups of corresponding coordinates, wherein the calculation mode is as follows:
d1=|y1-y2|
d2=|x3-x4|
(11525) determining the imaging diameter d3 of the distorted image as the larger of d1 and d2, so that the imaging radius R of the distorted image is d3/2; and
(11536) determining the circle center position of the distorted image, the circle center coordinate being (x_c, y_c), wherein
x_c = (x3 + x4) / 2
y_c = (y1 + y2) / 2.
according to one embodiment, the distorted image positioning method further comprises the steps of:
(14) determining the coordinate points (x_il, y_i) of the contour points of the distorted image in the plane rectangular coordinate system; and
(15) determining the horizontal distance l_ik of the contour points of the distorted image from the center of the image;
wherein
x_il = x_c ± √(R² - (y_i - y_c)²),  l_ik = √(R² - (y_k - y_c)²)
wherein x_il is the horizontal coordinate and y_i is the vertical coordinate of a contour point of the ith distorted image, and l_ik is the horizontal distance from the contour point with vertical coordinate y_k of the ith distorted image to the center of the image.
According to one embodiment, before step (112), the distorted image positioning method further comprises the steps of:
(111) acquiring checkerboard distortion images of the multiple lenses.
Further objects and advantages of the invention will be fully apparent from the ensuing description and drawings.
These and other objects, features and advantages of the present invention will become more fully apparent from the following detailed description, the accompanying drawings and the claims.
Drawings
Fig. 1 is a schematic diagram of a filter template used in a distorted image correction method according to a preferred embodiment of the present invention.
Fig. 2 is a schematic diagram of the distorted image correction method according to the above preferred embodiment of the present invention.
Fig. 3 illustrates a step of determining a distorted image contour of the distorted image correction method according to the above preferred embodiment of the present invention.
Fig. 4 illustrates a step of determining a correction factor of the distorted image correction method according to the above preferred embodiment of the present invention.
Fig. 5 illustrates a schematic diagram of a distorted image correction method according to the above preferred embodiment of the present invention.
Detailed Description
The following description is presented to disclose the invention so as to enable any person skilled in the art to practice the invention. The preferred embodiments in the following description are given by way of example only, and other obvious variations will occur to those skilled in the art. The basic principles of the invention, as defined in the following description, may be applied to other embodiments, variations, modifications, equivalents, and other technical solutions without departing from the spirit and scope of the invention.
Fig. 1 to 4 of the drawings illustrate a distorted image correction method according to a preferred embodiment of the present invention. The distorted image correction method can be applied to distortion correction of a fish-eye image, but is not limited to distortion correction of a fish-eye image. It should be understood by those skilled in the art that the distorted image correction method is applicable to any distorted image that is circularly distorted in accordance with the contour of the fisheye image. The present preferred embodiment describes the distorted image correction method of the present invention in detail, taking the distorted image correction of multiple paths of distorted images as an example.
As shown in fig. 2 of the drawings, the distorted image correction method includes the steps of:
(1) determining a correction parameter;
(2) determining a correction algorithm according to the correction parameters; and
(3) correcting the multiple paths of distorted images in real time according to the correction algorithm.
Wherein the correction parameters in step (1) are determined from the distorted image. Specifically, the step (1) includes the steps of:
(11) determining the outline of each path of the distorted image; and
(12) determining a correction factor α_i for each path of the distorted images.
More specifically, the step (11) includes the steps of:
(111) acquiring checkerboard distortion images of multiple paths of lenses;
(112) filtering the distorted images to filter noise on each distorted image, thereby preventing the noise from influencing the correction of the distorted images;
(113) superimposing the distorted images filtered in step (112) to obtain a superimposed image I_S;
(114) linearly compressing the superimposed image I_S to obtain a normalized image I_M; and
(115) accurately positioning the position and contour of the distorted image according to the pixel value p_{x,y} of the normalized image I_M at each position.
Wherein the distorted image acquired in step (111) has some noise due to some factors, which may interfere with the determination of the distorted image profile, so that the distorted image needs to be filtered to reduce the influence of the noise on the determination of the distorted image profile. The filtering template used therein is shown in fig. 1.
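The patent does not reproduce the numeric values of the filtering template of Fig. 1 in this text, so the following Python/NumPy sketch of step (112) assumes a plain 3×3 averaging kernel as a stand-in; the function name `low_pass_filter` and the SciPy-based convolution are illustrative choices, not the patented implementation.

```python
import numpy as np
from scipy.ndimage import convolve

def low_pass_filter(image, kernel=None):
    """Step (112): suppress high-frequency noise before locating the fisheye contour."""
    if kernel is None:
        # Placeholder for the template of Fig. 1 (not reproduced in this text):
        # a 3x3 averaging kernel; any small low-pass kernel could be substituted.
        kernel = np.full((3, 3), 1.0 / 9.0)
    return convolve(image.astype(np.float64), kernel, mode="nearest")
```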
Those skilled in the art will appreciate that, for a distorted image with no noise, or with noise too small to affect the correction of the distorted image, filtering is not necessary. That is, step (112) is not needed if the distorted image acquired in step (111) has no noise, or its noise is small enough to have no effect on the correction of the distorted image. In that case, the distorted images can be directly superimposed in step (113); that is, step (113) becomes superimposing the distorted images to obtain a superimposed image I_S.
In order to locate the distorted images more accurately, the distorted images are superimposed in step (113) to counteract errors in determining their positions. Otherwise, if the distorted images were located separately, errors caused by various environmental or human factors would inevitably arise; such errors may not only make the positioning of the distorted images inaccurate, but also leave the paths misaligned with one another because their errors differ, so that the quality of the corrected images cannot be guaranteed. Uniformly positioning all paths of distorted images by image superposition therefore helps to guarantee the quality of the corrected images.
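As a minimal illustration of this superposition step, the sketch below simply accumulates the (optionally filtered) checkerboard frames of all lens paths into one floating-point array I_S. Summation is assumed here; an average would serve equally well, since the subsequent linear compression renormalizes the value range anyway.

```python
import numpy as np

def superimpose(images):
    """Step (113): superimpose the checkerboard images of all lens paths into I_S.

    `images` is assumed to be a sequence of equally sized grayscale arrays, one per path.
    Accumulating in float avoids overflow when many 8-bit frames are added together.
    """
    superimposed = np.zeros(images[0].shape, dtype=np.float64)
    for img in images:
        superimposed += img.astype(np.float64)
    return superimposed
```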
Wherein step (114) comprises the steps of:
(1141) acquiring the maximum pixel value P_max and the minimum pixel value P_min of the superimposed image I_S; and
(1142) performing linear compression on the superimposed image I_S according to its maximum pixel value P_max and minimum pixel value P_min;
wherein the linear compression is performed using Equation 1:
p_{x,y} = 255 · (P_{x,y} - P_min) / (P_max - P_min)   (Equation 1)
wherein p_{x,y} is the pixel value of the normalized image I_M after linear compression at coordinate point (x, y), and P_{x,y} is the pixel value at coordinate point (x, y) on the superimposed image I_S.
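A minimal sketch of the linear compression of Equation 1 as reconstructed above; it assumes I_S is a NumPy array and maps it onto the 0-255 range of the normalized image I_M.

```python
import numpy as np

def normalize(superimposed):
    """Step (114): linearly compress I_S to the normalized image I_M (Equation 1)."""
    p_min = float(superimposed.min())
    p_max = float(superimposed.max())
    if p_max == p_min:
        # Degenerate case: a constant image carries no contour information.
        return np.zeros_like(superimposed, dtype=np.float64)
    return 255.0 * (superimposed - p_min) / (p_max - p_min)
```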
In addition, it is worth mentioning that the preferred embodiment exploits the fact that the middle part of the captured distorted image is not deformed while the surrounding contour shows circular bending distortion: it first locates the center of the circle on which the surrounding contour of the circular distorted image lies, and then accurately locates the fisheye image contour. This circle-center positioning method is convenient and accurate, giving the distorted image correction method simplicity and high efficiency.
Specifically, the step (115) includes the steps of:
(1151) setting a threshold T_h;
(1152) recording the points on the normalized image I_M whose pixel values are greater than or equal to the threshold T_h; and
(1153) determining the position and contour of the distorted image from the points on the normalized image I_M, determined in step (1152), whose pixel values are greater than or equal to the threshold T_h.
Further, the method of this step (115) for determining the distorted image position and contour is a four-point localization method. Specifically, the step (1152) includes the steps of:
(11521) scanning the normalized image I_M from four directions; and
(11522) respectively recording the first point encountered in the scanning process in each of the four directions whose pixel value is greater than or equal to the threshold T_h.
More specifically, the four directions are row-by-row from top to bottom, row-by-row from bottom to top, column-by-column from left to right, and column-by-column from right to left, respectively.
Wherein the threshold T_h is obtained by the following Equation 2:
T_h = (1 / (W·H)) · Σ_{x=1..W} Σ_{y=1..H} p_{x,y}   (Equation 2)
wherein p_{x,y} is the pixel value of the normalized image I_M at coordinate point (x, y), W is the image width of the normalized image, and H is the image height of the normalized image.
In order to make the distorted image correction method faster, more accurate and more effective, the step (1) of the distorted image correction method further comprises a step of:
(13) establishing a plane rectangular coordinate system.
It should be noted that there is no difference in the sequence among the step (11), the step (12) and the step (13), and the sequence among the three steps can be interchanged without limitation.
The rectangular plane coordinate system established in step (13) enables each point on the image in the distorted image correction method to be determined by specific coordinate values, thereby helping to determine the relative position relationship related to the distorted image correction method in the coordinate system.
On the other hand, each point in the plane rectangular coordinate system can be calibrated through a specific numerical value, so that the geometric figure can be accurately determined by utilizing the specific mathematical relationship of the geometric figure. In the preferred embodiment of the present invention, since the fisheye image contour is in a circular structure, the preferred embodiment uses the mathematical relationship of circles to precisely locate the fisheye image contour, so that the distorted image correction is more accurate. On the other hand, mathematical calculations are facilitated.
As shown in fig. 2 of the drawings, the plane rectangular coordinate system is established in the plane of the superimposed image I_S, wherein the coordinate system consists of an X axis and a Y axis perpendicular to each other, the X axis and the Y axis intersect at an origin O, and the coordinates of a coordinate point in the coordinate system are denoted as (x, y).
It should be noted that the plane rectangular coordinate system is established for convenience of calculation and calibration and places no substantial limitation on the present invention. That is, wherever the coordinate system is established in the plane of the superimposed image I_S, its position does not influence the correction effect of the distorted image correction method on the distorted image. Each coordinate point (x, y) merely serves as a relative index; the specific values of x and y have no absolute meaning.
Accordingly, this step (1152) includes the steps of:
(11523) accurately positioning the circle center position and the imaging radius of the distorted image according to the coordinate values, in the plane rectangular coordinate system, of the first points greater than or equal to the threshold T_h encountered during the scanning in the four directions in step (11522).
Specifically, the qualifying pixel points found in the scanning processes from top to bottom row by row, from bottom to top row by row, from left to right column by column, and from right to left column by column are respectively denoted as p(x1, y1), p(x2, y2), p(x3, y3), p(x4, y4).
Respectively calculating the vertical distance and the horizontal distance of two groups of corresponding coordinates, wherein the calculation mode is shown as formula 3 and formula 4:
d1 = |y1 - y2|   (Equation 3)
d2 = |x3 - x4|   (Equation 4)
The larger of d1 and d2 is taken as the imaging diameter d3 of the distorted image, from which the imaging radius R of the distorted image is obtained; the circle center coordinate is (x_c, y_c). R, x_c and y_c are calculated by Equation 5, Equation 6 and Equation 7, respectively:
R = d3 / 2   (Equation 5)
x_c = (x3 + x4) / 2   (Equation 6)
y_c = (y1 + y2) / 2   (Equation 7)
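The four-point localization just described can be sketched as follows. The code assumes a grayscale NumPy array indexed as image[y, x]; the threshold is taken as the image mean (Equation 2), and the circle-center expressions follow the reconstructions of Equations 6 and 7 given above. All names are illustrative.

```python
import numpy as np

def four_point_localization(normalized):
    """Steps (1151)-(11525): locate the circle center (x_c, y_c) and imaging radius R."""
    t_h = normalized.mean()                      # threshold T_h (Equation 2)
    mask = normalized >= t_h
    rows = np.where(mask.any(axis=1))[0]         # rows containing a qualifying pixel
    cols = np.where(mask.any(axis=0))[0]         # columns containing a qualifying pixel
    y1, y2 = int(rows[0]), int(rows[-1])         # top-to-bottom / bottom-to-top scans
    x3, x4 = int(cols[0]), int(cols[-1])         # left-to-right / right-to-left scans
    d1 = abs(y1 - y2)                            # Equation 3
    d2 = abs(x3 - x4)                            # Equation 4
    d3 = max(d1, d2)                             # imaging diameter
    radius = d3 / 2.0                            # Equation 5
    x_c = (x3 + x4) / 2.0                        # Equation 6 (reconstruction)
    y_c = (y1 + y2) / 2.0                        # Equation 7 (reconstruction)
    return (x_c, y_c), radius
```

Scanning each direction until the first qualifying pixel is equivalent to taking the first and last rows (and columns) in which the thresholded mask is non-empty, which is what the vectorized version above does.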
The step (1) further comprises the steps of:
(14) determining the coordinate points (x_il, y_i) of the contour points of the distorted image in the plane rectangular coordinate system; and
(15) determining the horizontal distance l_ik of the contour points of the distorted image from the center of the image.
Wherein the contour of the distorted image refers to a surrounding contour of the distorted image. Wherein the coordinate value of the coordinate point of the contour point of the distorted image in the rectangular plane coordinate system in the step (14) is determined by the following formula:
x_il = x_c ± √(R² - (y_i - y_c)²)   (Equation 8)
wherein x_il is the horizontal coordinate and y_i is the vertical coordinate of a contour point of the ith distorted image.
The horizontal distance l_ik from a contour point of the distorted image to the center of the image in step (15) is determined by the following formula:
l_ik = √(R² - (y_k - y_c)²)   (Equation 9)
wherein l_ik is the horizontal distance from the contour point with vertical coordinate y_k of the ith distorted image to the center of the image.
It is worth mentioning that, in step (15) of the distorted image correction method according to this preferred embodiment, the horizontal distance l_ik from a contour point of the distorted image to the center of the image is calculated by means of the plane rectangular coordinate system and the known mathematical relationship of the geometric contour, so that the value of l_ik is accurate, thereby guaranteeing the accuracy and precision of the distorted image correction method. It will be understood by those skilled in the art that this is merely an example and not a limitation of the invention.
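Under the circle-based reading of Equations 8 and 9 used above (an assumption, since the original formula images are not legible in this text), the contour points and their horizontal distances can be enumerated row by row; the function below is only a sketch of that relationship, with illustrative names.

```python
import numpy as np

def contour_points(center, radius):
    """Steps (14)-(15): contour points and horizontal distances for each row y."""
    x_c, y_c = center
    points = []
    for y in range(int(np.ceil(y_c - radius)), int(np.floor(y_c + radius)) + 1):
        # Equation 9 (assumed form): distance from the contour to the vertical centre line
        l = float(np.sqrt(max(radius ** 2 - (y - y_c) ** 2, 0.0)))
        points.append((x_c - l, y, l))   # left contour point of this row (Equation 8, '-')
        points.append((x_c + l, y, l))   # right contour point of this row (Equation 8, '+')
    return points
```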
The step (12) comprises the following steps:
(121) detecting angular points of a plurality of paths of the distorted images;
(122) detecting the corner points of the superimposed image I_S; and
(123) determining the correction factor α_i of each path of image according to the corner points of each path of distorted image and the corner points of the superimposed image I_S.
Specifically, the coordinates of the corner points of the multiple paths of distorted images detected in step (121) are denoted in the plane rectangular coordinate system as (x_ik, y_ik), where i indicates which path of video and k is the corner point index within that video.
The step (123) includes the steps of:
(1231) accumulating the abscissas of the corner points of each path of distorted image in the plane rectangular coordinate system, to obtain the accumulated value X_i of the corner abscissas of each path of distorted image;
(1232) accumulating the abscissas of the corner points of the superimposed image I_S in the plane rectangular coordinate system, to obtain the accumulated value X_M of the corner abscissas of the superimposed image I_S;
(1233) setting the correction factor α_M of the superimposed image I_S (ranging between 0.7 and 1.3); and
(1234) calculating the correction factor α_i of each path of distorted image from X_i, X_M and α_M, where α_i = α_M · X_i / X_M.
The formula for accumulating the abscissa values of the corner points of each distorted image in the rectangular plane coordinate system in step (1231) is shown in formula 10:
X_i = Σ_{k=1..K} x_ik   (Equation 10)
wherein x_ik represents the abscissa of the kth corner point in the ith path of checkerboard distorted image, and K represents the total number of corner points of each path of distorted image.
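The correction-factor computation of steps (1231) to (1234) reduces to accumulating corner abscissas and rescaling. The sketch below assumes the corner points have already been detected (the patent does not fix a particular corner detector) and that α_M is chosen within the 0.7-1.3 range; all names are illustrative.

```python
def correction_factors(corners_per_path, corners_superimposed, alpha_m=1.0):
    """Steps (1231)-(1234): per-path correction factors alpha_i = alpha_M * X_i / X_M.

    `corners_per_path[i]` is assumed to be a list of (x, y) checkerboard corners detected
    in the i-th distorted image, and `corners_superimposed` the corners detected on I_S.
    """
    x_m = float(sum(x for x, _ in corners_superimposed))   # accumulated abscissas for I_S
    factors = []
    for corners in corners_per_path:
        x_i = float(sum(x for x, _ in corners))            # Equation 10: X_i = sum of x_ik
        factors.append(alpha_m * x_i / x_m)
    return factors
```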
It should be noted that there is no distinction between the sequences of steps (121) and (122), and the sequences of the two steps may be interchanged. It will be understood by those skilled in the art that steps (121) and (122) may also be performed simultaneously. That is, according to the preferred embodiment of the present invention, the steps (121) and (122) do not differ in any order.
The step (2) further comprises the following steps:
(21) determining a distortion correction formula according to the correction parameters obtained in the step (1):
[Equation 11 - distortion correction formula, given only as an image in the original document]
wherein a_i is the major-axis radius of the ith fisheye video image; b_i = 1, 2, 3, …, Z, where Z is the radius of the distorted image width; x_il is the horizontal coordinate of the contour of the ith distorted image; x_c is the horizontal coordinate of the center of the distorted image; l_i is the distance from the horizontal coordinate of the contour of the ith distorted image to the horizontal coordinate of the center of the distorted image; and α_i is the correction factor of the ith distorted image, which reflects the magnitude of the correction amplitude.
The step (3) further comprises the following steps:
(31) generating a distortion correction table according to formula 11; and
(32) applying the correction table to multiple paths of high-definition distorted images under an embedded system to realize real-time correction of the distorted images.
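The distortion correction formula itself (Equation 11) appears only as an image in this text, so the sketch below does not reproduce it; it only illustrates the table-driven idea behind steps (31) and (32): once a per-pixel table mapping every output pixel to a source coordinate has been generated offline, correcting each frame reduces to a single gather, which is what makes the approach viable in real time on embedded hardware. The nearest-neighbour lookup and all names are illustrative assumptions.

```python
import numpy as np

def apply_correction_table(distorted, map_x, map_y):
    """Step (32): remap one distorted frame through a precomputed correction table.

    map_x[v, u] and map_y[v, u] are assumed to hold the source coordinates in the
    distorted frame for output pixel (u, v), as produced offline from Equation 11.
    """
    src_x = np.clip(np.rint(map_x).astype(np.int32), 0, distorted.shape[1] - 1)
    src_y = np.clip(np.rint(map_y).astype(np.int32), 0, distorted.shape[0] - 1)
    return distorted[src_y, src_x]
```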
It should be noted that the arabic numbers 1, 2, 3, 4, 5, etc. used in the numbers used in the steps of the distorted image correction method in the present invention only play a role of labeling, and do not distinguish the order. It will be understood by those skilled in the art that the steps in the distorted image correction method are not distinguished by a sequential order without violating the logical order of the steps themselves. Of course, it will be understood by those skilled in the art that where some subsequent steps require a prerequisite of a previous step, these steps are distinguished by order of precedence. The order of steps that are not mutually exclusive may be interchanged as long as the objects of the invention are achieved.
To describe the present invention in more detail, the distorted image correction method will be described in further detail below, taking correction of a fisheye video image as an example.
The distorted image correction method collects checkerboard images of multiple paths of distorted images, performs low-pass filtering operation on the checkerboard images, filters high-frequency noise on the images and eliminates related influences. The filtering template used is shown in fig. 1.
The preprocessed multi-path fisheye images are superimposed to obtain a superimposed image I_S. The superimposed image I_S is traversed to obtain its maximum value P_max and minimum value P_min. Using P_max and P_min, the superimposed image I_S is linearly compressed to obtain a normalized image I_M whose pixel values range between 0 and 255. The linear compression is performed using Equation 1.
A threshold T_h is set as the image mean; its calculation is shown in Equation 2.
The normalized image I_M is scanned pixel by pixel in four directions, and for each direction the coordinate position of the first pixel greater than or equal to the threshold T_h is recorded. The four scanning directions are row-by-row from top to bottom, row-by-row from bottom to top, column-by-column from left to right, and column-by-column from right to left. The qualifying pixel points found in the scanning processes are respectively denoted as p(x1, y1), p(x2, y2), p(x3, y3), p(x4, y4).
The vertical distance and the horizontal distance of the two groups of corresponding coordinates are calculated as shown in Equation 3 and Equation 4.
The larger of d1 and d2 is taken as the imaging diameter d3 of the fisheye image, from which the imaging radius R of the fisheye image is obtained; the circle center coordinate is (x_c, y_c). R, x_c and y_c are calculated by Equation 5, Equation 6 and Equation 7, respectively.
The coordinates (x_il, y_i) of the contour points of each fisheye video image are obtained through Equation 8.
The horizontal distance l_ik between the contour points of each path of fisheye video image and the center of the image is obtained through Equation 9.
The corner points (x_ik, y_ik) in each checkerboard image are detected by a corner detection algorithm, where i indicates which path of video and k is the corner point index within that video.
The abscissas of the corner points of each path of fisheye video are accumulated to obtain the accumulated value X_i of the corner abscissas of that path.
The corner points of the superimposed image I_S are detected, and the abscissas of all of these corner points are accumulated to obtain X_M.
A correction factor α_M is set for the superimposed image (ranging between 0.7 and 1.3), and the correction factor of each of the other paths of video can then be obtained as α_i = α_M · X_i / X_M.
Combining the parameters obtained in the preceding steps yields the distortion correction formula (Equation 11) for each path of fisheye video.
Using Equation 11, a pixel-by-pixel distortion correction table can be generated for each channel of fisheye video, meeting the real-time correction requirement of multiple channels of high-definition fisheye video under an embedded system.
It will be appreciated by persons skilled in the art that the embodiments of the invention described above and shown in the drawings are given by way of example only and are not limiting of the invention. The objects of the invention have been fully and effectively accomplished. The functional and structural principles of the present invention have been shown and described in the examples, and any variations or modifications of the embodiments of the present invention may be made without departing from the principles.

Claims (15)

1. A distorted image correction method for correcting a fisheye image, comprising the steps of:
(1) determining a correction parameter; and
(2) determining a correction algorithm according to the correction parameters;
wherein the step (1) comprises the following steps:
(11) determining the positions and contours of multiple paths of distorted images; and
(12) determining a correction factor for the distorted image;
the position and the outline of the distorted image are accurately positioned by adopting a four-point positioning method in the step (11) so as to ensure the accuracy and the effectiveness of the distorted image correction method;
wherein the step (11) comprises the steps of:
(112) filtering the distorted image to filter out noise of the distorted image;
(113) superposing the distorted image to obtain a superposed image;
(114) linearly compressing the superposed image to obtain a normalized image; and
(115) determining the position and the contour of the distorted image according to the pixel value of the normalized image at each position;
wherein the step (12) comprises the following steps:
(121) detecting angular points of a plurality of paths of the distorted images;
(122) detecting the corner points of the superposed images; and
(123) determining the correction factor α_i of each path of image according to the corner points of each path of distorted image and the corner points of the superimposed image;
wherein the coordinates of the corner points of the multiple paths of distorted images detected in step (121) are denoted in a plane rectangular coordinate system as (x_ik, y_ik), where i indicates which path of video and k is the corner point number in the video;
wherein step (114) comprises the steps of:
(1141) obtaining a maximum pixel value P_max and a minimum pixel value P_min of the superimposed image; and
(1142) performing linear compression on the superimposed image I_S according to the maximum pixel value P_max and the minimum pixel value P_min of the superimposed image;
wherein the linear compression is performed using the following formula:
p_{x,y} = 255 · (P_{x,y} - P_min) / (P_max - P_min)
wherein p_{x,y} is the pixel value of the normalized image after linear compression at coordinate point (x, y), and P_{x,y} is the pixel value at coordinate point (x, y) on the superimposed image;
wherein step (115) comprises the steps of:
(1151) setting a threshold T_h;
(1152) recording the points on the normalized image whose pixel values are greater than or equal to the threshold T_h; and
(1153) determining the position and contour of the distorted image from the points on the normalized image, determined in step (1152), whose pixel values are greater than or equal to the threshold;
wherein the threshold T_h can be obtained by the following formula:
T_h = (1 / (W·H)) · Σ_{x=1..W} Σ_{y=1..H} p_{x,y}
wherein p_{x,y} is the pixel value of the normalized image at coordinate point (x, y), W is the image width of the normalized image, and H is the image height of the normalized image;
wherein step (1152) comprises the steps of:
(11521) Scanning the normalized image from four directions; and
(11522) respectively recording the first point encountered in the scanning process in each of the four directions whose pixel value is greater than or equal to the threshold T_h;
wherein step (1152) further comprises the steps of:
(11523) accurately positioning the circle center position and the imaging radius of the distorted image according to the coordinate values, in the plane rectangular coordinate system, of the first points greater than or equal to the threshold T_h encountered during the scanning in the four directions in step (11522);
wherein the four directions include row-by-row from top to bottom, row-by-row from bottom to top, column-by-column from left to right, and column-by-column from right to left, and the first points greater than or equal to the threshold T_h encountered in the scanning from top to bottom row by row, from bottom to top row by row, from left to right column by column, and from right to left column by column are respectively denoted as
p(x1, y1), p(x2, y2), p(x3, y3), p(x4, y4);
Wherein step (1152) further comprises the steps of:
(11524) Respectively calculating the vertical distance and the horizontal distance of the two groups of corresponding coordinates, wherein the calculation mode is as follows:
d1=|y1-y2|
d2=|x3-x4|
(11525) determining the imaging diameter d3 of the distorted image as the larger of d1 and d2, so that the imaging radius R of the distorted image is d3/2; and
(11536) determining the circle center position of the distorted image, the circle center coordinate being (x_c, y_c), wherein
x_c = (x3 + x4) / 2
y_c = (y1 + y2) / 2.
2. the distorted image correcting method according to claim 1, wherein the step (1) further comprises the steps of:
(14) determining the coordinate points (x_il, y_i) of the contour points of the distorted image in the plane rectangular coordinate system; and
(15) determining the horizontal distance l_ik of the contour points of the distorted image from the center of the image;
wherein
x_il = x_c ± √(R² - (y_i - y_c)²),  l_ik = √(R² - (y_k - y_c)²)
wherein x_il is the horizontal coordinate and y_i is the vertical coordinate of a contour point of the ith distorted image, and l_ik is the horizontal distance from the contour point with vertical coordinate y_k of the ith distorted image to the center of the image.
3. The distorted image correcting method according to claim 1 or 2, wherein the step (123) comprises the steps of:
(1231) accumulating the abscissas of the corner points of each path of distorted image in the plane rectangular coordinate system, to obtain the accumulated value X_i of the corner abscissas of each path of distorted image;
(1232) accumulating the abscissas of the corner points of the superimposed image in the plane rectangular coordinate system, to obtain the accumulated value X_M of the corner abscissas of the superimposed image;
(1233) setting a correction factor α_M of the superimposed image; and
(1234) calculating the correction factor α_i of each path of distorted image, where α_i = α_M · X_i / X_M.
4. The distorted image correcting method according to claim 3, wherein the value of the correction factor α_M of the superimposed image ranges between 0.7 and 1.3.
5. The distorted image correcting method according to claim 4, wherein the step (2) further comprises the steps of:
(21) determining a distortion correction formula according to the correction parameters obtained in the step (1) as follows:
[Distortion correction formula, given only as an image in the original document]
wherein a_i is the major-axis radius of the ith fisheye video image; b_i = 1, 2, 3, …, Z, where Z is the radius of the distorted image width; x_il is the horizontal coordinate of the contour of the ith distorted image; x_c is the horizontal coordinate of the center of the distorted image; l_i is the distance from the horizontal coordinate of the contour of the ith distorted image to the horizontal coordinate of the center of the distorted image; and α_i is the correction factor of the ith distorted image, which reflects the magnitude of the correction amplitude.
6. The distorted image correcting method according to claim 5, wherein the step (11) further comprises, before the step (112), the steps of:
(111) acquiring checkerboard distortion images of the multiple lenses.
7. The distorted image correcting method according to claim 6, wherein the distorted image correcting method further comprises the steps of:
(3) correcting the multiple paths of distorted images according to the correction algorithm.
8. The distorted image correcting method according to claim 7, wherein the step (3) further comprises the steps of:
(31) generating a distortion correction table according to the distortion correction formula in step (21).
9. The distorted image correcting method according to claim 8, wherein the step (3) further comprises the steps of:
(32) applying the correction table to multiple paths of high-definition distorted images under an embedded system to realize real-time correction of the distorted images.
10. A distorted image positioning method is used for positioning a fisheye image, and is characterized by comprising the following steps:
(113) superposing the multi-path distorted images to obtain a superposed image;
(114) linearly compressing the superposed image to obtain a normalized image; and
(115) determining the position and the contour of the distorted image according to the pixel value of the normalized image at each position;
wherein step (115) comprises the steps of:
(1151) setting a threshold T_h;
(1152) recording the points on the normalized image whose pixel values are greater than or equal to the threshold T_h; and
(1153) determining the position and contour of the distorted image from the points on the normalized image, determined in step (1152), whose pixel values are greater than or equal to the threshold;
wherein the threshold T_h can be obtained by the following formula:
T_h = (1 / (W·H)) · Σ_{x=1..W} Σ_{y=1..H} p_{x,y}
wherein p_{x,y} is the pixel value of the normalized image at coordinate point (x, y), W is the image width of the normalized image, and H is the image height of the normalized image;
wherein step (1152) comprises the steps of:
(11521) Scanning the normalized image from four directions; and
(11522) respectively recording the first point encountered in the scanning process in each of the four directions whose pixel value is greater than or equal to the threshold T_h;
wherein the distorted image positioning method further comprises a step of:
(13) establishing a plane rectangular coordinate system;
wherein step (1152) further comprises the steps of:
(11523) The first one encountered during the four direction scanning according to step (11522) is greater than or equal to the threshold ThThe coordinate value of the point in the plane rectangular coordinate system accurately positions the circle center position and the imaging radius of the distorted image;
wherein the four directions include row by row from top to bottom, row by row from bottom to top, column by column from left to right, column by column from rightTo the left, wherein the first encountered during scanning from top to bottom row by row, from bottom to top row by row, from left to right column by column, from right to left column by column, respectively, is greater than or equal to said threshold ThAre respectively marked as
(x_1, y_1), (x_2, y_2), (x_3, y_3), (x_4, y_4);
wherein step (1152) further comprises the steps of:
(11524) respectively calculating the vertical distance and the horizontal distance between the two groups of corresponding coordinates, in the following manner:
d_1 = |y_1 - y_2|
d_2 = |x_3 - x_4|
(11525) determining the imaging diameter d_3 of the distorted image as the larger of d_1 and d_2, so that the imaging radius of the distorted image is R = d_3 / 2; and
(11526) determining the circle center position of the distorted image, wherein the circle center coordinate is (x_c, y_c), wherein
x_c = (x_3 + x_4) / 2
y_c = (y_1 + y_2) / 2
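The four-point positioning of claim 10 can be pictured with the following Python sketch (illustrative only; the mean-value threshold and the NumPy conventions are assumptions, not quoted from the patent):

```python
import numpy as np

def locate_fisheye_circle(normalized):
    """Four-point positioning sketch: threshold, four-direction scan,
    then circle centre and radius from the four extreme points."""
    th = normalized.mean()                      # step (1151): threshold (assumed to be the mean)
    mask = normalized >= th                     # step (1152): points at or above the threshold

    rows = np.flatnonzero(mask.any(axis=1))     # rows containing at least one bright point
    cols = np.flatnonzero(mask.any(axis=0))     # columns containing at least one bright point
    y1, y2 = rows[0], rows[-1]                  # first hits scanning top->bottom / bottom->top
    x3, x4 = cols[0], cols[-1]                  # first hits scanning left->right / right->left

    d1, d2 = abs(int(y1) - int(y2)), abs(int(x3) - int(x4))  # step (11524)
    radius = max(d1, d2) / 2.0                  # step (11525): R = d_3 / 2
    xc, yc = (x3 + x4) / 2.0, (y1 + y2) / 2.0   # step (11526): circle centre
    return (xc, yc), radius
```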
11. The distorted image positioning method according to claim 10, wherein before the step (113), the distorted image positioning method further comprises the step of:
(112) filtering the distorted image to filter out noise in the distorted image.
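Claim 11 does not name a specific filter for step (112); as one common choice (an assumption, not the claimed method), a small median filter removes isolated noise pixels before the images are superposed:

```python
from scipy.ndimage import median_filter

def denoise_distorted(image, size=3):
    """Sketch of step (112): median filtering suppresses salt-and-pepper noise
    while keeping the edge of the bright imaging circle reasonably sharp."""
    return median_filter(image, size=size)
```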
12. The distorted image positioning method as set forth in claim 10, wherein the step (114) comprises the steps of:
(1141) obtaining the maximum pixel value P_max and the minimum pixel value P_min of the superimposed image; and
(1142) performing linear compression on the superimposed image I_S according to the maximum pixel value P_max and the minimum pixel value P_min of the superimposed image;
wherein the linear compression is performed by adopting the following formula:
p_{x,y} = (P_{x,y} - P_min) / (P_max - P_min) × 255
wherein p_{x,y} is the pixel value of the normalized image at the coordinate point (x, y) after linear compression, and P_{x,y} is the pixel value at the coordinate point (x, y) on the superimposed image.
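A minimal sketch of the superposition and min-max linear compression described in steps (113)-(114) and (1141)-(1142), assuming 8-bit output and NumPy arrays (both are assumptions not stated in the claims):

```python
import numpy as np

def superpose_and_normalize(frames):
    """Steps (113)-(114) sketch: sum the multi-channel distorted images,
    then linearly compress the sum into an 8-bit normalized image."""
    superposed = np.sum(np.asarray(frames, dtype=np.float64), axis=0)  # step (113)
    p_min, p_max = superposed.min(), superposed.max()                  # step (1141)
    scale = 255.0 / (p_max - p_min) if p_max > p_min else 0.0
    normalized = (superposed - p_min) * scale                          # step (1142)
    return normalized.astype(np.uint8)
```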
13. The distorted image positioning method as set forth in claim 11, wherein the step (114) comprises the steps of:
(1141) obtaining the maximum pixel value P_max and the minimum pixel value P_min of the superimposed image; and
(1142) performing linear compression on the superimposed image I_S according to the maximum pixel value P_max and the minimum pixel value P_min of the superimposed image;
wherein the linear compression is performed by adopting the following formula:
p_{x,y} = (P_{x,y} - P_min) / (P_max - P_min) × 255
wherein p_{x,y} is the pixel value of the normalized image at the coordinate point (x, y) after linear compression, and P_{x,y} is the pixel value at the coordinate point (x, y) on the superimposed image.
14. The distorted image positioning method according to claim 10, wherein the distorted image positioning method further comprises the steps of:
(14) determining the coordinate points (x_il, y_i) of the contour points of the distorted image in the plane rectangular coordinate system; and
(15) determining the horizontal distance l_ik of the contour points of the distorted image from the center of the image;
wherein
l_ik = |x_il - x_c|
wherein x_il is the horizontal coordinate of the contour point of the i-th distorted image, y_i is the vertical coordinate of the contour point of the i-th distorted image, and l_ik is the horizontal distance from the contour point of the i-th distorted image with vertical coordinate y_k to the center of the image.
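One illustrative reading of steps (14)-(15), assuming the contour points are taken per row from the threshold mask and the distance is measured to the centre column x_c (these assumptions are not spelled out in the claim):

```python
import numpy as np

def contour_distances(mask, xc):
    """Steps (14)-(15) sketch: for each row y_k that intersects the imaging
    circle, take the contour point (leftmost pixel at/above threshold) and
    its horizontal distance l_ik from the centre column x_c."""
    distances = {}
    for yk in range(mask.shape[0]):
        xs = np.flatnonzero(mask[yk])
        if xs.size:                      # this row crosses the bright imaging circle
            x_il = xs[0]                 # left contour point of the row
            distances[yk] = abs(float(x_il) - float(xc))
    return distances
```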
15. The distorted image positioning method according to claim 14, wherein before the step (112), the distorted image positioning method further comprises the step of:
(111) acquiring checkerboard distortion images of the multiple lenses.
CN201510919593.6A 2015-11-30 2015-12-11 Distorted image correction method and positioning method thereof Active CN106875341B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202110870949.7A CN114331860A (en) 2015-12-11 2015-12-11 Distorted image correction method and positioning method thereof
CN201510919593.6A CN106875341B (en) 2015-12-11 2015-12-11 Distorted image correction method and positioning method thereof
PCT/CN2016/107427 WO2017092631A1 (en) 2015-11-30 2016-11-28 Image distortion correction method for fisheye image, and calibration method for fisheye camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510919593.6A CN106875341B (en) 2015-12-11 2015-12-11 Distorted image correction method and positioning method thereof

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202110870949.7A Division CN114331860A (en) 2015-12-11 2015-12-11 Distorted image correction method and positioning method thereof

Publications (2)

Publication Number Publication Date
CN106875341A CN106875341A (en) 2017-06-20
CN106875341B true CN106875341B (en) 2021-08-06

Family

ID=59177267

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110870949.7A Pending CN114331860A (en) 2015-12-11 2015-12-11 Distorted image correction method and positioning method thereof
CN201510919593.6A Active CN106875341B (en) 2015-11-30 2015-12-11 Distorted image correction method and positioning method thereof

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202110870949.7A Pending CN114331860A (en) 2015-12-11 2015-12-11 Distorted image correction method and positioning method thereof

Country Status (1)

Country Link
CN (2) CN114331860A (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107689033B (en) * 2017-07-21 2021-03-30 哈尔滨工程大学 Fisheye image distortion correction method based on ellipse segmentation
CN108053373A (en) * 2017-12-05 2018-05-18 长沙全度影像科技有限公司 One kind is based on deep learning model fisheye image correcting method
WO2020014881A1 (en) * 2018-07-17 2020-01-23 华为技术有限公司 Image correction method and terminal
CN109472760B (en) * 2019-02-01 2019-05-21 深兰人工智能芯片研究院(江苏)有限公司 A kind of method, apparatus of correcting distorted image
CN110443847B (en) * 2019-07-31 2022-08-05 浪潮金融信息技术有限公司 Automatic vending machine holder positioning detection method based on camera
CN110807816B (en) * 2019-10-31 2022-08-09 浪潮金融信息技术有限公司 Automatic vending machine holder positioning detection method
CN111340937A (en) * 2020-02-17 2020-06-26 四川大学华西医院 Brain tumor medical image three-dimensional reconstruction display interaction method and system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102194223A (en) * 2010-03-09 2011-09-21 新奥特(北京)视频技术有限公司 Method and system for calibrating distortion coefficient of zoom lens
CN102522058A (en) * 2011-12-22 2012-06-27 广州视睿电子科技有限公司 Four-point positioning deformation correction algorithm based on display
CN102809880A (en) * 2012-08-15 2012-12-05 无锡羿飞科技有限公司 System and method for superimposing multiple projectors on basis of spherical display
CN103124334A (en) * 2012-12-19 2013-05-29 四川九洲电器集团有限责任公司 Lens distortion correction method
CN103268592A (en) * 2013-04-24 2013-08-28 南京邮电大学 Method for correcting fisheye images
CN103369192A (en) * 2012-03-31 2013-10-23 深圳市振华微电子有限公司 Method and device for Full-hardware splicing of multichannel video images

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7079251B2 (en) * 2003-10-16 2006-07-18 4D Technology Corporation Calibration and error correction in multi-channel imaging
JP5593060B2 (en) * 2009-11-26 2014-09-17 株式会社メガチップス Image processing apparatus and method of operating image processing apparatus
CN101739707B (en) * 2009-12-16 2012-06-13 合肥工业大学 Elliptic fisheye image-based distortion correction method
CN103996172B (en) * 2014-05-08 2016-08-31 东北大学 A kind of fisheye image correcting method based on more corrective


Also Published As

Publication number Publication date
CN106875341A (en) 2017-06-20
CN114331860A (en) 2022-04-12

Similar Documents

Publication Publication Date Title
CN106875341B (en) Distorted image correction method and positioning method thereof
WO2017092631A1 (en) Image distortion correction method for fisheye image, and calibration method for fisheye camera
US10339386B2 (en) Unusual event detection in wide-angle video (based on moving object trajectories)
EP3403403B1 (en) Calibration method and apparatus for panoramic stereo video system
CN103517041B (en) Based on real time panoramic method for supervising and the device of polyphaser rotation sweep
US9544498B2 (en) Method for forming images
JP5739409B2 (en) Method for determining the relative position of a first image device and a second image device and these devices
WO2017054314A1 (en) Building height calculation method and apparatus, and storage medium
WO2014023231A1 (en) Wide-view-field ultrahigh-resolution optical imaging system and method
WO2016025328A1 (en) Systems and methods for depth enhanced and content aware video stabilization
CN105551050B (en) A kind of image depth estimation method based on light field
CN109886995B (en) Multi-target tracking method in complex environment
US10404912B2 (en) Image capturing apparatus, image processing apparatus, image capturing system, image processing method, and storage medium
Liu et al. Robust autocalibration for a surveillance camera network
Pulli et al. Mobile panoramic imaging system
JP2003179800A (en) Device for generating multi-viewpoint image, image processor, method and computer program
KR101745493B1 (en) Apparatus and method for depth map generation
CN114693760A (en) Image correction method, device and system and electronic equipment
KR101670328B1 (en) The appratus and method of immersive media display and image control recognition using real-time image acquisition cameras
CN113724335A (en) Monocular camera-based three-dimensional target positioning method and system
CN108076365B (en) Human body posture recognition device
TWI516120B (en) Method for generating panoramic image and image capturing device thereof
WO2018099128A1 (en) Method and device used in projection apparatus
CN114331835A (en) Panoramic image splicing method and device based on optimal mapping matrix
CN108174054B (en) Panoramic motion detection method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant