CN108122191A - Method and device for stitching fisheye images into a panoramic image and a panoramic video - Google Patents

Method and device for stitching fisheye images into a panoramic image and a panoramic video

Info

Publication number
CN108122191A
Authority
CN
China
Prior art keywords
pixel
fisheye
panoramic image
cylindrical projection
Prior art date
Legal status
Granted
Application number
CN201611075609.0A
Other languages
Chinese (zh)
Other versions
CN108122191B (en)
Inventor
余晖
曾智
陈卓
Current Assignee
Chengdu Meiruo Mengjing Technology Co., Ltd.
Original Assignee
Chengdu Guan Chong Chong Yu Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Guan Chong Chong Yu Technology Co Ltd
Priority to CN201611075609.0A
Publication of CN108122191A
Application granted
Publication of CN108122191B
Legal status: Active


Classifications

    • G06T3/047
    • G06T3/14

Abstract

The invention discloses a method and device for stitching fisheye images into a panoramic image and a panoramic video. N fisheye images are captured at the same moment using N fisheye lenses. According to the mapping relations among latitude-longitude coordinates, spherical coordinates and plane coordinates, N cylindrical projection images are obtained, and feature point matching is performed on the overlapping region of each group of cylindrical projection images among the N cylindrical projection images to obtain the matched feature point pairs of the overlapping region of each group of cylindrical projection images. According to the mapping relation between spherical coordinates and latitude-longitude coordinates, the attitude matrix of the matched feature point pairs is obtained. Taking the coordinate system of one fisheye lens among the N fisheye lenses as the world coordinate system, the attitude matrix of each fisheye lens in the world coordinate system is calculated using the attitude matrices of the matched feature point pairs. According to the attitude matrix of each fisheye lens in the world coordinate system, the pixel value of each pixel in the panoramic image is determined, and the panoramic image is obtained from the pixel values of the pixels in the panoramic image.

Description

Method and device for stitching fisheye images into a panoramic image and a panoramic video
Technical field
The present invention relates to the technical field of image processing, and in particular to a method and device for stitching fisheye images into a panoramic image and a panoramic video.
Background art
A fisheye lens is a special short-focal-length, wide-field imaging lens whose viewing angle approaches 180°, beyond the angular field of view visible to the human eye; to achieve this effect, the front element of a fisheye lens bulges toward the subject much like a fish's eye. A fisheye lens has a nonlinear structure, and the physical characteristics of the lens mean that, over the same area, the captured viewing angle is far larger than that of an ordinary lens, but at the same time distortion is produced at the edges of the image. Near the center of the fisheye image captured by a fisheye lens the information content is greatest and the distortion smallest; as the radius increases, the information content decreases and the distortion gradually grows. Because of these physical characteristics, fisheye lenses are widely used in photography, medicine, engineering inspection, security monitoring, fire detection, virtual reality and other fields. Given how widely fisheye lenses are applied, stitching panoramic images from fisheye images, based on the characteristics of the images captured by fisheye lenses, has excellent prospects for practical application and is well worth technical research.
In the prior art, when multiple fisheye images are stitched into a panoramic image, an image sequence with overlapping parts is first obtained from the fisheye images, image registration is then performed, and the registered images are smoothed and stitched or fused to obtain the panoramic image. To obtain a high-quality panoramic image in this way, the image registration and smoothing algorithms require a great deal of time, so the real-time performance of the obtained panoramic image deteriorates; conversely, when real-time performance of the panoramic image is to be guaranteed, the computation time of the registration and smoothing algorithms must be shortened, which lowers the accuracy of the obtained panoramic image. Real-time performance and accuracy therefore cannot be reconciled when obtaining a panoramic image.
Summary of the invention
The present invention provides a method and device for stitching fisheye images into a panoramic image and a panoramic video, which solve the prior-art problem that real-time performance and accuracy cannot both be achieved when obtaining a panoramic image, and achieve the effect of improving accuracy while guaranteeing real-time performance during panoramic image acquisition.
A first aspect of the present application provides a method for stitching fisheye images into a panoramic image, including:
capturing N fisheye images at the same moment using N fisheye lenses, where N is an integer not less than 2;
obtaining, according to the mapping relations among latitude-longitude coordinates, spherical coordinates and plane coordinates, the N cylindrical projection images produced by unfolding the N fisheye images along longitude and latitude;
performing feature point matching on the overlapping region of each group of cylindrical projection images among the N cylindrical projection images to obtain the matched feature point pairs of the overlapping region of each group of cylindrical projection images, where every two cylindrical projection images having an overlapping region form one group of cylindrical projection images;
obtaining, according to the mapping relation between spherical coordinates and latitude-longitude coordinates, the three-dimensional coordinate set on the sphere of the matched feature point pairs of the overlapping region of each group of cylindrical projection images, and performing attitude calculation using this three-dimensional coordinate set to obtain the attitude matrix of the matched feature point pairs of the overlapping region of each group of cylindrical projection images;
taking the coordinate system of one fisheye lens among the N fisheye lenses as the world coordinate system, and calculating the attitude matrix of each fisheye lens in the world coordinate system using the attitude matrices of the matched feature point pairs of the overlapping regions of the groups of cylindrical projection images;
determining, according to the attitude matrix of each fisheye lens in the world coordinate system, the pixel value of each pixel in the panoramic image stitched from the N fisheye images;
obtaining the panoramic image according to the pixel values of the pixels in the panoramic image.
Optionally, obtaining the three-dimensional coordinate set of the matched feature point pairs of the overlapping region of each group of cylindrical projection images specifically includes:
obtaining, using latitude-longitude coordinates, the two-dimensional coordinate set in the cylindrical projection images of the matched feature points of the overlapping region of each group of cylindrical projection images;
obtaining the three-dimensional coordinate set of the matched feature point pairs of the overlapping region of each group of cylindrical projection images according to the mapping relation between latitude-longitude coordinates and spherical coordinates and the two-dimensional coordinate set.
Optionally, when the resolution of the panoramic image is not greater than the resolution of each fisheye image, determining, according to the attitude matrix of each fisheye lens in the world coordinate system, the pixel value of each pixel in the panoramic image stitched from the N fisheye images specifically includes:
obtaining, according to the attitude matrix of each fisheye lens in the world coordinate system, the corresponding pixel of each pixel of the panoramic image in the corresponding fisheye image;
determining the pixel value of each pixel in the panoramic image according to the pixel values of the corresponding pixels, in the corresponding fisheye images, of the pixels in the panoramic image.
Optionally, obtaining, according to the attitude matrix of each fisheye lens in the world coordinate system, the corresponding pixel of each pixel of the panoramic image in the corresponding fisheye image specifically includes:
converting the latitude-longitude coordinates of each pixel in the panoramic image into spherical coordinates;
obtaining the new spherical coordinate of each pixel in the panoramic image according to the spherical coordinate of each pixel in the panoramic image and the attitude matrix of each fisheye lens in the world coordinate system;
obtaining the corresponding pixel of each pixel of the panoramic image in the corresponding fisheye image according to the new spherical coordinate of each pixel in the panoramic image and the mapping relation between the plane coordinates and the spherical coordinates of the pixels in the N fisheye images.
Optionally, determining the pixel value of each pixel in the panoramic image according to the pixel values of the corresponding pixels, in the corresponding fisheye images, of the pixels in the panoramic image specifically includes:
if a pixel in the panoramic image corresponds to K pixels in K fisheye images respectively, determining the pixel value of that pixel in the panoramic image according to the pixel values and weights of the K pixels, where the K pixels correspond one-to-one to the K fisheye images and K is an integer not less than 2 and not greater than N;
if a pixel in the panoramic image corresponds to only one specific pixel in one fisheye image, determining the pixel value of that pixel in the panoramic image to be the pixel value of the specific pixel.
Optionally, determining the pixel value of a pixel in the panoramic image according to the K pixels, in the K fisheye images, corresponding to that pixel specifically includes:
obtaining the weight of each of the K pixels;
determining the pixel value of the pixel in the panoramic image according to the pixel values of the K pixels and the weight of each pixel.
Optionally, when the resolution of the panoramic image is greater than the resolution of each fisheye image, determining, according to the attitude matrix of each fisheye lens in the world coordinate system, the pixel value of each pixel in the panoramic image stitched from the N fisheye images specifically includes:
obtaining, according to the attitude matrix of each fisheye lens in the world coordinate system, the corresponding pixel in the corresponding fisheye image of each pixel in a first portion of pixels of the panoramic image, the first portion of pixels being all pixels in the panoramic image that have corresponding pixels in the N fisheye images;
determining the pixel value of each pixel in the first portion of pixels according to the pixel values of the corresponding pixels, in the corresponding fisheye images, of the pixels in the first portion;
determining, according to the pixel values of the pixels in the first portion, the pixel values of the pixels in a second portion of pixels of the panoramic image, where the first portion of pixels and the second portion of pixels together make up all pixels in the panoramic image.
Optionally, obtaining, according to the attitude matrix of each fisheye lens in the world coordinate system, the corresponding pixel in the corresponding fisheye image of each pixel in the first portion of pixels of the panoramic image specifically includes:
converting the latitude-longitude coordinates of each pixel in the panoramic image into spherical coordinates;
obtaining the new spherical coordinate of each pixel in the panoramic image according to the spherical coordinate of each pixel in the panoramic image and the attitude matrix of each fisheye lens in the world coordinate system;
determining the first portion of pixels from the pixels of the panoramic image according to the new spherical coordinate of each pixel in the panoramic image and the mapping relation between the plane coordinates and the spherical coordinates of the pixels in the N fisheye images, and obtaining the corresponding pixel in the corresponding fisheye image of each pixel in the first portion of pixels.
Optionally, obtaining the overlapping region of each group of cylindrical projection images among the N cylindrical projection images specifically includes:
obtaining the overlapping region of each group of cylindrical projection images among the N cylindrical projection images according to the distribution of the N fisheye lenses and the field of view of each fisheye lens.
A second aspect of the present application further provides a method for stitching fisheye images into a panoramic video, in which the N fisheye images captured by N fisheye lenses at the same moment form one group of fisheye images, the method including:
obtaining K groups of fisheye images in succession using the N fisheye lenses, where N and K are integers not less than 2;
performing the following steps for each group of fisheye images obtained:
Step A: obtaining, according to the mapping relations among latitude-longitude coordinates, spherical coordinates and plane coordinates, the N cylindrical projection images produced by unfolding the group of fisheye images along longitude and latitude;
Step B: performing feature point matching on the overlapping region of each group of cylindrical projection images among the N cylindrical projection images to obtain the matched feature point pairs of the overlapping region of each group of cylindrical projection images, where every two cylindrical projection images having an overlapping region form one group of cylindrical projection images;
Step C: obtaining, according to the mapping relation between spherical coordinates and latitude-longitude coordinates, the three-dimensional coordinate set on the sphere of the matched feature point pairs of the overlapping region of each group of cylindrical projection images, and performing attitude calculation using this three-dimensional coordinate set to obtain the attitude matrix of the matched feature point pairs of the overlapping region of each group of cylindrical projection images;
Step D: taking the coordinate system of one fisheye lens among the N fisheye lenses as the world coordinate system, and calculating the attitude matrix of each fisheye lens in the world coordinate system using the attitude matrices of the matched feature point pairs of the overlapping regions of the groups of cylindrical projection images;
Step E: determining, according to the attitude matrix of each fisheye lens in the world coordinate system, the pixel value of each pixel in the panoramic image stitched from the N fisheye images;
Step F: obtaining the panoramic image according to the pixel values of the pixels in the panoramic image;
after the above steps have been performed for every group of fisheye images, K panoramic images are obtained;
composing the K panoramic images into a panoramic video according to the time order of the K groups of fisheye images.
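As a minimal illustration of this composing step, the K panoramic frames could be written out in time order with OpenCV's VideoWriter; the file name, frame rate and codec below are assumed values for the sketch only, not requirements of the patent.

```python
import cv2

def panoramas_to_video(panoramas, path="panorama.mp4", fps=30.0):
    """Compose K panoramic images (already in time order) into a panoramic video."""
    h, w = panoramas[0].shape[:2]
    writer = cv2.VideoWriter(path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    for frame in panoramas:
        writer.write(frame)   # all frames must share the same size and BGR format
    writer.release()
```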
Based on the same technical concept as the first aspect of the present application, a third aspect of the present application further provides an imaging device, including:
a fisheye image acquisition unit, configured to capture N fisheye images at the same moment using N fisheye lenses, where N is an integer not less than 2;
a cylindrical projection image acquisition unit, configured to obtain, according to the mapping relations among latitude-longitude coordinates, spherical coordinates and plane coordinates, the N cylindrical projection images produced by unfolding the N fisheye images along longitude and latitude;
a matched feature point pair acquisition unit, configured to perform feature point matching on the overlapping region of each group of cylindrical projection images among the N cylindrical projection images to obtain the matched feature point pairs of the overlapping region of each group of cylindrical projection images, where every two cylindrical projection images having an overlapping region form one group of cylindrical projection images;
a first attitude matrix acquisition unit, configured to obtain, according to the mapping relation between spherical coordinates and latitude-longitude coordinates, the three-dimensional coordinate set on the sphere of the matched feature point pairs of the overlapping region of each group of cylindrical projection images, and to perform attitude calculation using this three-dimensional coordinate set to obtain the attitude matrix of the matched feature point pairs of the overlapping region of each group of cylindrical projection images;
a second attitude matrix acquisition unit, configured to take the coordinate system of one fisheye lens among the N fisheye lenses as the world coordinate system and to calculate the attitude matrix of each fisheye lens in the world coordinate system using the attitude matrices of the matched feature point pairs of the overlapping regions of the groups of cylindrical projection images;
a pixel value acquisition unit, configured to determine, according to the attitude matrix of each fisheye lens in the world coordinate system, the pixel value of each pixel in the panoramic image stitched from the N fisheye images;
a panoramic image acquisition unit, configured to obtain the panoramic image according to the pixel values of the pixels in the panoramic image.
Based on the same technical concept as the second aspect of the present application, a fourth aspect of the present application further provides an imaging device, including:
a fisheye image acquisition unit, configured to obtain K groups of fisheye images in succession using N fisheye lenses, where the N fisheye images captured by the N fisheye lenses at the same moment form one group of fisheye images, and N and K are integers not less than 2;
a panoramic image acquisition unit, configured to perform the following steps for each group of fisheye images obtained: Step A: obtaining, according to the mapping relations among latitude-longitude coordinates, spherical coordinates and plane coordinates, the N cylindrical projection images produced by unfolding the group of fisheye images along longitude and latitude; Step B: performing feature point matching on the overlapping region of each group of cylindrical projection images among the N cylindrical projection images to obtain the matched feature point pairs of the overlapping region of each group of cylindrical projection images, where every two cylindrical projection images having an overlapping region form one group of cylindrical projection images; Step C: obtaining, according to the mapping relation between spherical coordinates and latitude-longitude coordinates, the three-dimensional coordinate set on the sphere of the matched feature point pairs of the overlapping region of each group of cylindrical projection images, and performing attitude calculation using this three-dimensional coordinate set to obtain the attitude matrix of the matched feature point pairs of the overlapping region of each group of cylindrical projection images; Step D: taking the coordinate system of one fisheye lens among the N fisheye lenses as the world coordinate system, and calculating the attitude matrix of each fisheye lens in the world coordinate system using the attitude matrices of the matched feature point pairs; Step E: determining, according to the attitude matrix of each fisheye lens in the world coordinate system, the pixel value of each pixel in the panoramic image stitched from the N fisheye images; Step F: obtaining the panoramic image according to the pixel values of the pixels in the panoramic image; after the above steps have been performed for every group of fisheye images, K panoramic images are obtained;
a panoramic video acquisition unit, configured to compose the K panoramic images into a panoramic video according to the time order of the K groups of fisheye images.
Beneficial effects of the present invention are as follows:
Based on the above technical solution, after the attitude matrix of each fisheye lens in the world coordinate system has been obtained, the embodiments of the present application determine, according to these attitude matrices, the pixel value of each pixel in the panoramic image stitched from the N fisheye images, and then obtain the panoramic image from the pixel values of its pixels. The embodiments of the present application thus obtain the panoramic image directly from the pixel values of its pixels; compared with the prior art, the computation involved in image stitching or fusion is reduced, which effectively cuts the amount of calculation, and obtaining the panoramic image through the pixel values of its pixels also improves its accuracy. In this way the prior-art problem that real-time performance and accuracy cannot both be achieved when obtaining a panoramic image is solved, and the effect of improving accuracy while guaranteeing real-time performance during panoramic image acquisition is achieved.
Description of the drawings
Fig. 1 is a flow chart of the method for stitching fisheye images into a panoramic image in an embodiment of the present invention;
Fig. 2a is a structural diagram of a fisheye image in the image coordinate system in an embodiment of the present invention;
Fig. 2b is a diagram of the mapping relation from a fisheye image to the viewing sphere in an embodiment of the present invention;
Fig. 3 is a flow chart of a first method for obtaining the corresponding pixel, in the corresponding fisheye image, of each pixel in the panoramic image in an embodiment of the present invention;
Fig. 4 is a flow chart of a second method for obtaining the corresponding pixel, in the corresponding fisheye image, of each pixel in the panoramic image in an embodiment of the present invention;
Fig. 5 is a flow chart of the method for stitching fisheye images into a panoramic video in an embodiment of the present invention;
Fig. 6 is a first module diagram of the imaging device in an embodiment of the present invention;
Fig. 7 is a second module diagram of the imaging device in an embodiment of the present invention.
Specific embodiment
The present invention provides a method and device for stitching fisheye images into a panoramic image and a panoramic video, which solve the prior-art problem that real-time performance and accuracy cannot both be achieved when obtaining a panoramic image, and achieve the effect of improving accuracy while guaranteeing real-time performance during panoramic image acquisition.
The preferred embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Embodiment one:
As shown in Fig. 1, an embodiment of the present invention provides a method for stitching fisheye images into a panoramic image, including the following steps:
S101: capture N fisheye images at the same moment using N fisheye lenses, where N is an integer not less than 2;
S102: obtain, according to the mapping relations among latitude-longitude coordinates, spherical coordinates and plane coordinates, the N cylindrical projection images produced by unfolding the N fisheye images along longitude and latitude;
S103: perform feature point matching on the overlapping region of each group of cylindrical projection images among the N cylindrical projection images to obtain the matched feature point pairs of the overlapping region of each group of cylindrical projection images, where every two cylindrical projection images having an overlapping region form one group of cylindrical projection images;
S104: obtain, according to the mapping relation between spherical coordinates and latitude-longitude coordinates, the three-dimensional coordinate set on the sphere of the matched feature point pairs of the overlapping region of each group of cylindrical projection images, and perform attitude calculation using this three-dimensional coordinate set to obtain the attitude matrix of the matched feature point pairs of the overlapping region of each group of cylindrical projection images;
S105: take the coordinate system of one fisheye lens among the N fisheye lenses as the world coordinate system, and calculate the attitude matrix of each fisheye lens in the world coordinate system using the attitude matrices of the matched feature point pairs of the overlapping regions of the groups of cylindrical projection images;
S106: determine, according to the attitude matrix of each fisheye lens in the world coordinate system, the pixel value of each pixel in the panoramic image stitched from the N fisheye images;
S107: obtain the panoramic image according to the pixel values of the pixels in the panoramic image.
In step S101, the N fisheye lenses may be mounted on the same imaging device, so that the N fisheye lenses can be precisely controlled to capture the N fisheye images at the same moment; each fisheye lens captures one fisheye image, i.e. the N fisheye images correspond one-to-one to the N fisheye lenses.
Step S102 is performed next. In this step, the N cylindrical projection images produced by unfolding the N fisheye images along longitude and latitude are obtained according to the mapping relations among latitude-longitude coordinates, spherical coordinates and plane coordinates.
In a specific implementation, after the N fisheye images have been obtained in step S101, each captured fisheye image may first be converted from plane coordinates to spherical coordinates, and the spherical coordinates of each fisheye image may then be unfolded by a longitude-latitude projection to obtain the N cylindrical projection images.
Specifically, when converting each captured fisheye image from plane coordinates to spherical coordinates, each fisheye image may first be converted from image coordinates to plane rectangular coordinates and then from plane rectangular coordinates to spherical coordinates; alternatively, each fisheye image may be converted directly from plane coordinates to spherical coordinates, and the present application imposes no specific limitation here, where the plane coordinates are the image coordinates of the fisheye image.
Alternatively, step S102 may first determine the pixel value of each pixel in the N cylindrical projection images. When determining the pixel value of each pixel in the N cylindrical projection images, the three-dimensional coordinate set on the sphere of each pixel in each cylindrical projection image is obtained first according to the mapping relation between spherical coordinates and latitude-longitude coordinates; then, according to the mapping relation between spherical coordinates and plane coordinates and using the three-dimensional coordinate of each pixel on the sphere, the correspondence between each pixel in each cylindrical projection image and a pixel in the corresponding fisheye image is obtained, so that the pixel value of each pixel in each cylindrical projection image is obtained, and the N cylindrical projection images are obtained from these pixel values.
Specifically, referring to Fig. 2a and Fig. 2b, a fisheye lens can be regarded as a hemisphere 10; each pixel on the hemisphere 10 can be represented by a three-dimensional coordinate P(x, y, z) and can likewise be represented by latitude-longitude coordinates, so each pixel on the hemisphere 10 can be mapped onto a cylindrical projection image 20.
Specifically, after the latitude-longitude coordinate of each pixel in the cylindrical projection image 20 has been determined, Fig. 2b is mapped back to Fig. 2a: according to the mapping relation between spherical coordinates and latitude-longitude coordinates, the three-dimensional coordinate on the sphere of each pixel in the cylindrical projection image 20 is obtained and denoted P(x, y, z). From the pinhole imaging model it is known that
r² = (x/z)² + (y/z)²    formula (3)
θ = atan(r)    formula (4)
Further, referring to Fig. 2b, θd and θ satisfy the following relation:
θd = θ(1 + k1·θ² + k2·θ⁴ + k3·θ⁶ + k4·θ⁸)    formula (5)
In formula (5), k1, k2, k3 and k4 denote the distortion parameters of the fisheye image, which can be obtained by a calibration method, for example using the fisheye calibration functions in OpenCV; θd denotes the deflection angle of a fisheye image pixel in the image coordinate system.
Further, from the formula for projecting a three-dimensional point onto a two-dimensional image coordinate, the distorted coordinate corresponding to each pixel in the cylindrical projection image 20 can be determined as
x' = (θd/r)·(x/z),  y' = (θd/r)·(y/z)    formula (6)
where (x', y') in formula (6) denotes the distorted coordinate of each pixel.
Continuing with the same projection formula, the distorted coordinate of each pixel is transformed into a coordinate in the image coordinate system:
u = fx·x' + cx,  v = fy·y' + cy    formula (7)
where fx, fy, cx and cy in formula (7) are the intrinsic parameters of the fisheye lens, which can be obtained by calibration. In this way, formulas (3) to (7) give the mapping relation between each pixel in the cylindrical projection image 20 and a pixel in the corresponding fisheye image; using this mapping relation, the pixel value of the pixel in the fisheye image is filled into the corresponding pixel of the cylindrical projection image 20, so that a cylindrical projection image 20 filled with pixel values is obtained. By this method all N cylindrical projection images, filled with pixel values, are obtained.
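For illustration only, the following is a minimal NumPy sketch of the forward mapping described by formulas (3) to (7): a longitude-latitude (cylindrical projection) pixel is lifted to the viewing sphere and then projected into the fisheye image. It assumes OpenCV-style distortion parameters k1 to k4 and intrinsics fx, fy, cx, cy obtained from calibration; the numeric values in the example are hypothetical placeholders, not values from the patent.

```python
import numpy as np

def lonlat_to_sphere(lon, lat):
    # Longitude/latitude (radians) -> unit direction P(x, y, z) on the viewing sphere.
    return np.array([np.cos(lat) * np.sin(lon),
                     np.sin(lat),
                     np.cos(lat) * np.cos(lon)])

def sphere_to_fisheye(p, k, fx, fy, cx, cy):
    # Formulas (3)-(7): project a sphere point P(x, y, z) into the fisheye image.
    x, y, z = p
    a, b = x / z, y / z                                   # normalized plane coordinates
    r = np.sqrt(a * a + b * b)                            # formula (3)
    theta = np.arctan(r)                                  # formula (4)
    k1, k2, k3, k4 = k
    theta_d = theta * (1 + k1 * theta**2 + k2 * theta**4
                       + k3 * theta**6 + k4 * theta**8)   # formula (5)
    scale = theta_d / r if r > 1e-8 else 1.0
    xp, yp = a * scale, b * scale                         # formula (6): distorted coordinates
    return fx * xp + cx, fy * yp + cy                     # formula (7): image coordinates (u, v)

# Example: look up the fisheye pixel that fills one cylindrical-projection pixel
# (all calibration values below are hypothetical placeholders).
if __name__ == "__main__":
    k = (-0.01, 0.002, 0.0, 0.0)
    u, v = sphere_to_fisheye(lonlat_to_sphere(0.3, 0.1), k,
                             fx=600.0, fy=600.0, cx=640.0, cy=480.0)
    print(u, v)
```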
In implementations of the present application, converting each captured fisheye image from plane coordinates to spherical coordinates can be done by executing formulas (3) to (7) in reverse; after the spherical coordinates of each fisheye image have been obtained, they are unfolded by a longitude-latitude projection to obtain the N cylindrical projection images.
Since the embodiments of the present application obtain the mapping relation between the plane coordinates of the cylindrical projection images and the fisheye images through formulas (3) to (7), and thereby obtain the cylindrical projection image of each fisheye image, no repeated solving of polynomial equations is needed when executing formulas (3) to (7). In the prior art, by contrast, the cylindrical projection image of each fisheye image is typically obtained by executing formulas (3) to (7) in reverse, which requires solving an eighth-order polynomial; compared with the embodiments of the present application, the amount of calculation is large and the accuracy of the computed values is relatively low. Thus, compared with the prior art, the embodiments of the present application greatly reduce the amount of calculation needed to obtain the cylindrical projection image of each fisheye image through formulas (3) to (7) and also improve the accuracy of the calculation; with the amount of calculation greatly reduced, the calculation time can be greatly shortened, thereby achieving the effect of improving accuracy while guaranteeing real-time performance during panoramic image acquisition.
Step S103 is performed next. In this step, feature point matching is performed on the overlapping region of each group of cylindrical projection images among the N cylindrical projection images, and the matched feature point pairs of the overlapping region of each group of cylindrical projection images are obtained, where every two cylindrical projection images having an overlapping region form one group of cylindrical projection images.
In a specific implementation, when step S103 is performed, the overlapping region of each group of cylindrical projection images among the N cylindrical projection images can be obtained according to the distribution of the N fisheye lenses and the field of view of each fisheye lens.
Specifically, each fisheye lens can be labeled. Each fisheye lens has a large field of view, for example 90° to 180°, so the two fisheye images captured by adjacent fisheye lenses necessarily have an overlapping region; using the correspondence between fisheye images and cylindrical projection images, the overlapping region of each group of cylindrical projection images can thus be obtained from the distribution of the N fisheye lenses and the field of view of each fisheye lens. Feature point matching is then performed on the overlapping region of each group of cylindrical projection images to obtain the matched feature point pairs of the overlapping region of each group.
Specifically, take the group formed by cylindrical projection image 20 and cylindrical projection image 21 as an example. The part of cylindrical projection image 20 that overlaps cylindrical projection image 21 is denoted C20, and the part of cylindrical projection image 21 that overlaps cylindrical projection image 20 is denoted C21. A feature point set D20 is extracted from C20 and a feature point set D21 from C21 by a feature point extraction algorithm; feature point matching is then performed on D20 and D21 by a feature point matching algorithm, yielding the matched feature point pairs of the overlapping region of cylindrical projection image 20 and cylindrical projection image 21.
The feature point extraction algorithm may be, for example, ORB, SIFT or SURF, and the feature point matching algorithm may be, for example, the normalized cross-correlation (NCC) matching algorithm, the sequential similarity detection algorithm (SSDA), or a matching algorithm whose similarity measure is the sum of absolute differences (SAD) of pixel gray values.
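As one possible illustration of this step (not a prescribed implementation of the patent), the sketch below extracts ORB feature points from the two overlap crops C20 and C21 with OpenCV and matches them with a brute-force Hamming matcher; the function name and the ratio-test threshold are assumptions.

```python
import cv2

def match_overlap(c20, c21, max_features=1000):
    """Extract and match feature points between two overlap regions (grayscale images)."""
    orb = cv2.ORB_create(nfeatures=max_features)
    kp20, des20 = orb.detectAndCompute(c20, None)   # feature point set D20
    kp21, des21 = orb.detectAndCompute(c21, None)   # feature point set D21
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    pairs = []
    # knnMatch plus a ratio test keeps only reliable matched feature point pairs
    for m, n in matcher.knnMatch(des20, des21, k=2):
        if m.distance < 0.75 * n.distance:
            pairs.append((kp20[m.queryIdx].pt, kp21[m.trainIdx].pt))
    return pairs  # list of ((u20, v20), (u21, v21)) matched feature point pairs
```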
Step S104 is performed next. In this step, according to the mapping relation between spherical coordinates and latitude-longitude coordinates, the three-dimensional coordinate set on the sphere of the matched feature point pairs of the overlapping region of each group of cylindrical projection images is obtained, and attitude calculation is performed using this three-dimensional coordinate set to obtain the attitude matrix of the matched feature point pairs of the overlapping region of each group of cylindrical projection images.
In a specific implementation, after the matched feature point pairs are obtained in step S103, the two-dimensional coordinate set, in the cylindrical projection images, of the matched feature points of the overlapping region of each group is obtained using latitude-longitude coordinates; then, according to the mapping relation between latitude-longitude coordinates and spherical coordinates and this two-dimensional coordinate set, the three-dimensional coordinate set of the matched feature point pairs of the overlapping region of each group of cylindrical projection images is obtained.
Specifically, the latitude-longitude coordinate, on the corresponding cylindrical projection image, of each feature point of the matched feature point pairs of the overlapping region of each group can be obtained first; then, according to the mapping relation between spherical coordinates and latitude-longitude coordinates, the three-dimensional coordinates on the sphere of these feature points are obtained and together form the three-dimensional coordinate set.
Specifically, when attitude calculation is performed with the three-dimensional coordinate set of the matched feature point pairs of the overlapping region of each group of cylindrical projection images, take the first group of cylindrical projection images as an example. This group includes a first cylindrical projection image and a second cylindrical projection image which share a first overlapping region, and in the first overlapping region the two images have first matched feature point pairs. From the first matched feature point pairs, all feature points located in the first cylindrical projection image are taken as a first group of matched feature points, and the feature points located in the second cylindrical projection image are taken as a second group of matched feature points. A first three-dimensional coordinate set on the sphere of the first group of matched feature points and a second three-dimensional coordinate set on the sphere of the second group of matched feature points are then obtained, and an attitude estimation algorithm is applied to the first and second three-dimensional coordinate sets to obtain the attitude matrix of the matched feature point pairs of the overlapping region of the first group of cylindrical projection images, where the attitude matrix includes a rotation matrix and a translation matrix. By the above method, the attitude matrix of the matched feature point pairs of the overlapping region of each group of cylindrical projection images can be obtained.
Specifically, the attitude estimation algorithm may use the following formula:
P2 = R·P1 + t    formula (8)
where R in formula (8) denotes the rotation matrix and t denotes the translation matrix. The first three-dimensional coordinate set and the second three-dimensional coordinate set can thus be calculated through formula (8): for each matched feature point pair, the coordinate from the first three-dimensional coordinate set is substituted for P1 in formula (8) and the corresponding coordinate from the second three-dimensional coordinate set is substituted for P2, so that the rotation matrix and translation matrix of the matched feature point pairs of the overlapping region of the first group of cylindrical projection images are obtained. Since the attitude matrix includes the rotation matrix and the translation matrix, the attitude matrix of the matched feature point pairs of the overlapping region of the first group of cylindrical projection images can be determined from the rotation matrix and translation matrix obtained.
For example, again take the group formed by cylindrical projection image 20 and cylindrical projection image 21. The part of cylindrical projection image 20 that overlaps cylindrical projection image 21 is denoted C20 and the part of cylindrical projection image 21 that overlaps cylindrical projection image 20 is denoted C21; feature point set D20 is extracted from C20 and feature point set D21 from C21 by the feature point extraction algorithm, and feature point matching is performed on D20 and D21 by the feature point matching algorithm, so that the feature point set D200 obtained from D20 is matched with the feature point set D210 obtained from D21, where D200 and D210 contain the same number of feature points and the feature points in D200 and D210 correspond one-to-one. The three-dimensional coordinates of the feature points in D200 and D210 are then obtained and substituted into formula (8) for calculation, yielding the rotation matrix and translation matrix between D200 and D210, which are taken as the attitude matrix of the matched feature point pairs of the overlapping region of cylindrical projection image 20 and cylindrical projection image 21.
Taking one matched feature point pair of D200 and D210 as an example, if feature point A1 in D200 and feature point B1 in D210 form a matched feature point pair, the three-dimensional coordinate of A1 is substituted for P1 in formula (8) and the three-dimensional coordinate of B1 is substituted for P2 in formula (8). In this way the three-dimensional coordinates of every matched feature point pair of D200 and D210 are substituted into formula (8), a joint solution is then carried out to obtain the rotation matrix R and the translation matrix t, and the R and t obtained are taken as the attitude matrix of the matched feature point pairs of the overlapping region of cylindrical projection image 20 and cylindrical projection image 21.
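One common way to solve formula (8) jointly over all matched pairs is a least-squares rigid alignment via singular value decomposition (a Kabsch/Umeyama-style fit). The NumPy sketch below is an illustrative assumption, not the solver prescribed by the patent.

```python
import numpy as np

def estimate_pose(points_a, points_b):
    """Least-squares fit of R and t such that points_b ≈ R @ points_a + t.

    points_a, points_b: (M, 3) arrays of matched three-dimensional coordinates
    on the sphere (for example the sets D200 and D210). Returns (R, t).
    """
    a = np.asarray(points_a, dtype=float)
    b = np.asarray(points_b, dtype=float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)                       # centroids
    H = (a - ca).T @ (b - cb)                                     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # guard against reflections
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t
```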
Step S105 is performed next. In this step, the coordinate system of one fisheye lens among the N fisheye lenses is taken as the world coordinate system, and the attitude matrix of each fisheye lens in the world coordinate system is calculated using the attitude matrices of the matched feature point pairs of the overlapping regions of the groups of cylindrical projection images.
In a specific implementation, every two adjacent fisheye lenses among the N fisheye lenses necessarily have an overlapping region, and the attitude matrix of the matched feature point pairs of the overlapping region of each group of cylindrical projection images has been obtained in step S104; this attitude matrix is the relative attitude matrix between the two adjacent fisheye lenses. Therefore, taking the coordinate system of any one of the N fisheye lenses as the world coordinate system, the attitude matrix of each fisheye lens in the world coordinate system can be calculated from the attitude matrices of the matched feature point pairs of the overlapping regions of the groups of cylindrical projection images.
For example, the N fisheye lenses include fisheye lens 0, fisheye lens 1, fisheye lens 2, fisheye lens 3, fisheye lens 4, fisheye lens 5 and fisheye lens 6, and the N fisheye images they capture are, respectively, fisheye image B0 for fisheye lens 0, B1 for fisheye lens 1, B2 for fisheye lens 2, B3 for fisheye lens 3, B4 for fisheye lens 4, B5 for fisheye lens 5 and B6 for fisheye lens 6. The pairs of adjacent fisheye lenses with overlapping regions are {(0,1), (1,2), (2,3), (3,4), (4,5), (5,6), (6,0)}; the corresponding set of fisheye image pairs is {(B0,B1), (B1,B2), (B2,B3), (B3,B4), (B4,B5), (B5,B6), (B6,B0)}, and the corresponding set of cylindrical projection image pairs with overlapping regions is {(G0,G1), (G1,G2), (G2,G3), (G3,G4), (G4,G5), (G5,G6), (G6,G0)}. The pair (G0,G1) corresponds to rotation matrix R0 and translation matrix t0; similarly (G1,G2) corresponds to R1 and t1, (G2,G3) to R2 and t2, (G3,G4) to R3 and t3, (G4,G5) to R4 and t4, (G5,G6) to R5 and t5, and (G6,G0) to R6 and t6. Taking the coordinate system of the fisheye lens corresponding to G0 as the world coordinate system, the attitude matrices of fisheye lens 1, fisheye lens 2, fisheye lens 3, fisheye lens 4, fisheye lens 5 and fisheye lens 6 relative to fisheye lens 0 can be obtained from (R0,t0), (R1,t1), (R2,t2), (R3,t3), (R4,t4), (R5,t5) and (R6,t6); that is, the attitude matrices of fisheye lens 1 to fisheye lens 6 in the world coordinate system are obtained.
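The chaining of adjacent relative attitudes into world-frame attitudes could look like the sketch below. It assumes each relative attitude (Ri, ti) maps coordinates expressed in the frame of lens i+1 into the frame of lens i, with lens 0 as the world frame; this composition convention is an assumption made for illustration.

```python
import numpy as np

def chain_poses(rel_poses):
    """Compose relative attitudes into world-frame attitudes.

    rel_poses: list of (R, t) where the i-th entry is assumed to map coordinates
    in the frame of lens i+1 into the frame of lens i.
    Returns [(I, 0), (R_w1, t_w1), ...] with lens 0 taken as the world frame.
    """
    R_w, t_w = np.eye(3), np.zeros(3)
    world = [(R_w.copy(), t_w.copy())]
    for R, t in rel_poses:
        t_w = R_w @ t + t_w        # T_world<-k = T_world<-(k-1) * T_(k-1)<-k
        R_w = R_w @ R
        world.append((R_w.copy(), t_w.copy()))
    return world
```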
Step S106 is performed next. In this step, the pixel value of each pixel in the panoramic image stitched from the N fisheye images is determined according to the attitude matrix of each fisheye lens in the world coordinate system.
In a specific implementation, when the resolution of the panoramic image is not greater than the resolution of each fisheye image, the corresponding pixel, in the corresponding fisheye image, of each pixel in the panoramic image is obtained according to the attitude matrix of each fisheye lens in the world coordinate system; the pixel value of each pixel in the panoramic image is then determined according to the pixel values of these corresponding pixels in the corresponding fisheye images.
Since the resolution of the panoramic image is not greater than the resolution of each fisheye image, each pixel in the panoramic image corresponds to some pixel in one or more corresponding fisheye images, so the corresponding pixel of each pixel of the panoramic image in the corresponding fisheye image can be obtained according to the attitude matrix of each fisheye lens in the world coordinate system.
The pixel values in the embodiments of the present application may be expressed in RGB, sRGB, CMYK or another color representation; the present application does not impose a specific limitation.
Specifically, the steps for obtaining, according to the attitude matrix of each fisheye lens in the world coordinate system, the corresponding pixel of each pixel of the panoramic image in the corresponding fisheye image are as follows, as shown in Fig. 3:
S301: convert the latitude-longitude coordinates of each pixel in the panoramic image into spherical coordinates;
S302: obtain the new spherical coordinate of each pixel in the panoramic image according to the spherical coordinate of each pixel in the panoramic image and the attitude matrix of each fisheye lens in the world coordinate system;
S303: obtain the corresponding pixel of each pixel of the panoramic image in the corresponding fisheye image according to the new spherical coordinate of each pixel in the panoramic image and the mapping relation between the plane coordinates and the spherical coordinates of the pixels in the N fisheye images.
In step S301, the latitude-longitude coordinate of each pixel in the panoramic image is obtained first; then, according to the mapping relation between latitude-longitude coordinates and spherical coordinates, the three-dimensional coordinate on the sphere, i.e. the spherical coordinate, of each pixel in the panoramic image is obtained, so that the latitude-longitude coordinates of the pixels in the panoramic image are converted into spherical coordinates. The specific process may refer to the description of Fig. 2a and Fig. 2b and, for brevity of the description, is not repeated here.
Step S302 is then performed. In this step, after the spherical coordinate of each pixel in the panoramic image has been obtained in step S301, the fisheye lens or lenses corresponding to each pixel in the panoramic image can be determined from its spherical coordinate, and the new spherical coordinate of each pixel in the panoramic image is then obtained according to the attitude matrix of the corresponding fisheye lens or lenses in the world coordinate system. Each pixel of the panoramic image may thus have one or more new spherical coordinates, and when there are several their number does not exceed N.
Specifically, according to the distribution and fields of view of the N fisheye lenses, the mapping range of each fisheye lens on the sphere can be determined precisely; if the spherical coordinate of a pixel in the panoramic image falls within the mapping range of a fisheye lens, that pixel is determined to correspond to that fisheye lens.
For example, the N fisheye lenses include fisheye lens 0, fisheye lens 1, fisheye lens 2, fisheye lens 3, fisheye lens 4, fisheye lens 5 and fisheye lens 6, and a pixel H in the panoramic image is obtained. If, according to the distribution and fields of view of the N fisheye lenses, H is determined to lie within the mapping range of fisheye lens 1, then H is determined to correspond to fisheye lens 1, and the new spherical coordinate of H on the sphere is obtained from the three-dimensional coordinate of H and the attitude matrix of fisheye lens 1 in the world coordinate system.
Alternatively, after the spherical coordinate of each pixel in the panoramic image has been obtained in step S301, the N new spherical coordinates of each pixel in the panoramic image may be obtained according to the attitude matrix of each of the fisheye lenses in the world coordinate system.
For example, the N fisheye lenses include fisheye lens 0, fisheye lens 1, fisheye lens 2, fisheye lens 3, fisheye lens 4, fisheye lens 5 and fisheye lens 6, and a pixel H in the panoramic image is obtained. The first new spherical coordinate of H on the sphere is obtained according to the attitude matrix of fisheye lens 0 in the world coordinate system; similarly, the second new spherical coordinate of H is obtained according to the attitude matrix of fisheye lens 1, the third according to fisheye lens 2, the fourth according to fisheye lens 3, the fifth according to fisheye lens 4, the sixth according to fisheye lens 5, and the seventh according to fisheye lens 6.
Then step S303 is performed, in this step, according to the new spherical coordinate of each pixel in the panoramic picture, According to the plane coordinates of each pixel in the new spherical coordinate and fish-eye correspondence and N number of fish eye images With the mapping relations of spherical coordinate, each respective pixel of the pixel in corresponding fish eye images in the panoramic picture is obtained Point.
In specific implementation process, in the new spherical coordinate of each pixel in obtaining the panoramic picture, it is necessary to root It is obtained according to attitude matrix of each fish eye lens in world coordinate system, in this way, the new ball of each pixel can be determined Areal coordinate and fish-eye correspondence are determining the new spherical coordinate of each pixel and fish-eye correspondence Afterwards, and then can be obtained according to each plane coordinates of pixel and the mapping relations of spherical coordinate in N number of fish eye images Take each corresponding pixel points of the pixel in corresponding fish eye images in the panoramic picture.
Specifically, according to each plane coordinates of pixel and the mapping of spherical coordinate in N number of fish eye images Relation is obtained in the panoramic picture during each corresponding pixel points of the pixel in corresponding fish eye images, specifically can be with By performing formula (3) each correspondence of the pixel in corresponding fish eye images in the panoramic picture is got to formula (7) Pixel.
Since the embodiment of the present application is to obtain in panoramic picture each pixel in phase by formula (3) to formula (7) The corresponding pixel points in fish eye images are answered, and formula (3) need not carry out repeatedly multinomial equation into the implementation procedure of formula (7) Formula solves, and compared with prior art, the embodiment of the present application gets in panoramic picture each pixel in corresponding fish eye images The calculation amounts of corresponding pixel points be greatly lowered, and the accuracy of calculating can also be improved, and be able to significantly in calculation amount Degree can enable its calculating time be greatly shortened in the case of reducing, and then can realize during panoramic picture is obtained really The effect that its precision is also improved on the premise of guarantor's real-time.
For example, the N fisheye lenses are fisheye lenses 0 to 6, and a pixel H in the panoramic image is taken. If, according to the distribution and fields of view of the N fisheye lenses, H falls within the mapping range of fisheye lens 1, H is determined to correspond to fisheye lens 1. The new spherical coordinate of H on the sphere is obtained from the three-dimensional coordinate of H and the attitude matrix of fisheye lens 1 in the world coordinate system; the pixel H1 corresponding to H is then obtained from fisheye image B1 of fisheye lens 1 according to that new spherical coordinate and the mapping relation between the pixels of B1 and the spherical coordinates, and H1 is the corresponding pixel of H in fisheye image B1. If H also falls within the mapping range of fisheye lens 3, the corresponding pixel H3 can likewise be obtained from fisheye image B3 of fisheye lens 3.
As another example, with fisheye lenses 0 to 6, after the N new spherical coordinates of each pixel in the panoramic image have been obtained, the first new spherical coordinate of H on the sphere corresponds to fisheye lens 0; according to that first new spherical coordinate and the mapping relation between the pixels of fisheye image B0 of fisheye lens 0 and the spherical coordinates, a pixel corresponding to H is searched for in B0. If it is found, the corresponding pixel H0 of H in B0 is obtained; if not, it is determined that B0 contains no pixel corresponding to H. In the same way, pixels corresponding to H are searched for in fisheye images B1, B2, B3, B4, B5 and B6, yielding the corresponding pixels of H in B1, B2, B3, B4, B5 and B6.
Specifically, a pixel in the panoramic image may have corresponding pixels in several fisheye images or in only one fisheye image. To keep the pixel value of each panoramic-image pixel accurate, the pixel values may be determined as follows: if a pixel in the panoramic image corresponds to K pixels in K fisheye images respectively, its pixel value is determined from the pixel values and weights of those K pixels, the K pixels corresponding one to one with the K fisheye images and K being an integer not less than 2 and not greater than N; if a pixel in the panoramic image corresponds to only one specific pixel in a single fisheye image, its pixel value is determined to be the pixel value of that specific pixel.
Specifically, when a pixel in the panoramic image corresponds to K pixels in K fisheye images respectively, the weight of each of the K pixels is obtained first, and the pixel value of the panoramic-image pixel is then determined from the pixel values and weights of the K pixels.
Specifically, the weight of each of the K pixels may be preset, for example to 1/K; the weights may also be set according to the distance between each of the K pixels and the central pixel of its fisheye image: the closer a pixel is to the centre of its fisheye image, the larger its weight, and conversely the smaller.
For example, if a pixel H in the panoramic image corresponds only to the pixel H1 in fisheye image B1, the pixel value of H is determined to be the pixel value of H1.
In another example by taking K=2 as an example, in the panoramic picture pixel H respectively with the pixel in fish eye images B1 Pixel H3 in H1 and fish eye images B3 is corresponded to, and distance is represented with d1 between obtaining the central pixel point of H1 to fish eye images B1 And distance is represented with d3 between obtaining the central pixel point of H3 to fish eye images B3, if the weight of H1 represents the weight with H3 with w1 It being identified with w3, then the specifically used formula of w1, w3 (9) calculates, wherein, formula (9) is:
After w1 and w3 have been obtained from formula (9), let PX1 and PX3 denote the pixel values of H1 and H3 and let PX denote the pixel value of H; PX is then calculated with formula (10), where formula (10) is:
PX = PX1 × w1 + PX3 × w3    formula (10).
In this way, the pixel value of H obtained through formulas (9) and (10) is more accurate, so that the pixel value of each pixel in the panoramic image matches the N fisheye images more closely.
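As a minimal sketch of this blending rule, generalised from two to K corresponding pixels, the following assumes normalised inverse-distance weights, which is consistent with the statement above that pixels nearer a lens centre get larger weights; the exact weighting of formula (9) is not reproduced.

```python
import numpy as np

def blend_pixel(samples):
    """samples: list of (pixel_value, distance_to_lens_centre) pairs for one panorama pixel.
    pixel_value may be a scalar or an RGB triple."""
    if len(samples) == 1:                          # only one lens sees this point
        return np.asarray(samples[0][0], dtype=float)
    inv = np.array([1.0 / max(d, 1e-6) for _, d in samples])
    weights = inv / inv.sum()                      # normalised; larger when the distance is smaller
    values = np.array([v for v, _ in samples], dtype=float)
    return np.average(values, axis=0, weights=weights)   # PX = sum_k PXk * wk

# Two-lens case of formula (10): PX = PX1 * w1 + PX3 * w3 (values and distances are made up)
px = blend_pixel([((120, 80, 60), 35.0), ((118, 82, 58), 90.0)])
```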
In a specific implementation, when the resolution of the panoramic image is greater than the resolution of each fisheye image, some pixels in the panoramic image will not correspond to any pixel in the N fisheye images. In that case, obtaining the corresponding pixel of each panoramic-image pixel in the corresponding fisheye image according to the attitude matrix of each fisheye lens in the world coordinate system proceeds as follows, as shown in Fig. 4:
S401: according to the attitude matrix of each fisheye lens in the world coordinate system, obtain the corresponding pixel, in the corresponding fisheye image, of each pixel in a first portion of pixels of the panoramic image, the first portion being all pixels of the panoramic image that correspond to pixels in the N fisheye images;
S402: determine the pixel value of each pixel in the first portion according to the pixel value of its corresponding pixel in the corresponding fisheye image;
S403: determine the pixel value of each pixel in a second portion of pixels of the panoramic image according to the pixel values of the pixels in the first portion, the first portion and the second portion together being all pixels of the panoramic image.
In step S401, the longitude-latitude coordinates of each pixel in the panoramic image are first converted to spherical coordinates; the new spherical coordinate of each pixel is then obtained from its spherical coordinate and the attitude matrix of each fisheye lens in the world coordinate system; finally, according to the new spherical coordinates and the mapping relation between the plane coordinates of the pixels in the N fisheye images and the spherical coordinates, the first portion of pixels is determined from the pixels of the panoramic image and the corresponding pixel of each first-portion pixel in the corresponding fisheye image is obtained. The detailed process may refer to steps S301 to S303 and, for brevity of the specification, is not repeated here.
Specifically, the corresponding pixels of the panoramic-image pixels in the corresponding fisheye images are searched for according to the attitude matrix of each fisheye lens in the world coordinate system; all panoramic-image pixels for which a corresponding pixel is found in the N fisheye images are taken as the first portion of pixels, and the corresponding pixel of each first-portion pixel in the corresponding fisheye image is thereby obtained.
Step S402 is performed next: if a pixel in the first portion corresponds to K pixels in K fisheye images respectively, its pixel value is determined from the pixel values and weights of those K pixels, the K pixels corresponding one to one with the K fisheye images and K being an integer not less than 2 and not greater than N; if a pixel in the first portion corresponds to only one specific pixel in a single fisheye image, its pixel value in the panoramic image is determined to be the pixel value of that specific pixel.
Step S403 is performed next: the pixels of the second portion have no corresponding pixel in any fisheye image, so after the pixel values of the first portion have been obtained in step S402, the pixel value of each second-portion pixel can be determined by interpolation.
In the embodiment of the present application, the interpolation method may be Inverse Distance to a Power, Kriging, Minimum Curvature, Modified Shepard's Method, Natural Neighbor, Nearest Neighbor, Polynomial Regression, Radial Basis Function, Triangulation with Linear Interpolation, Moving Average, Local Polynomial, or another interpolation method; the present application does not specifically limit it.
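The following is a minimal sketch of step S403, assuming SciPy is available and using a single-channel float panorama for brevity; the "linear" method corresponds to triangulation with linear interpolation and "nearest" to nearest-neighbour interpolation, which are only two of the options listed above.

```python
import numpy as np
from scipy.interpolate import griddata

def fill_second_portion(pano, known_mask):
    """pano: HxW float panorama; known_mask: True where a first-portion pixel value exists."""
    ys, xs = np.nonzero(known_mask)                 # first-portion pixel coordinates
    gy, gx = np.nonzero(~known_mask)                # second-portion pixel coordinates
    out = pano.copy()
    out[gy, gx] = griddata((ys, xs), pano[ys, xs], (gy, gx), method="linear")
    holes = np.isnan(out[gy, gx])                   # outside the convex hull of known points
    if holes.any():
        out[gy[holes], gx[holes]] = griddata((ys, xs), pano[ys, xs],
                                             (gy[holes], gx[holes]), method="nearest")
    return out
```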
After step S106 is completed, step S107 is performed: the panoramic image is obtained according to the pixel value of each pixel in the panoramic image.
In a specific implementation, after the pixel value of each pixel in the panoramic image has been obtained in step S106, each pixel of the panoramic image is filled with its pixel value, thereby obtaining the panoramic image.
In this way, when the resolution of the panoramic image is not greater than that of each fisheye image, the pixel value of each panoramic-image pixel is determined from the corresponding pixel value in the corresponding fisheye image, so the pixel values of the panoramic image match the pixel values of the fisheye images more closely, which improves the display precision of the panoramic image.
Moreover, when the resolution of the panoramic image is greater than that of each fisheye image, the pixel values of the first portion of pixels are obtained in the same way as when the panoramic resolution is not greater than the fisheye resolution, so they match the fisheye pixel values closely; the pixel values of the second portion are then obtained from the first portion by interpolation. As the accuracy of the first-portion pixel values improves, the accuracy of the second-portion pixel values improves accordingly, and so does the display precision of the panoramic image.
Beneficial effects of the present invention are as follows:
First, after obtaining the attitude matrix of each fisheye lens in the world coordinate system, the embodiment of the present application determines, according to those attitude matrices, the pixel value of each pixel in the panoramic image stitched from the N fisheye images, and then obtains the panoramic image from those pixel values. Compared with the prior art, this removes the computation of image stitching or fusion and thus effectively reduces the amount of calculation, while obtaining the panoramic image from per-pixel values also improves its precision. This solves the prior-art problem that real-time performance and precision cannot both be achieved when obtaining a panoramic image, and achieves the effect of improving precision while guaranteeing real-time performance.
Second, the embodiment of the present application first obtains the spherical coordinate of each pixel of the panoramic image and then, according to that spherical coordinate and the mapping relation between the plane coordinates of the pixels in the fisheye images and the spherical coordinates, obtains the corresponding pixel of each panoramic-image pixel in the corresponding fisheye image; the pixel value of each panoramic-image pixel is obtained from the pixel value of its corresponding pixel, and the panoramic image is obtained accordingly. The formulas used in this process require no repeated polynomial solving, whereas the prior art does, with a larger amount of calculation and some loss of precision. Compared with the prior art, the amount of calculation needed to obtain the corresponding pixels is therefore greatly reduced, the accuracy of the calculation can be improved, and the computing time is greatly shortened, so that precision is improved while real-time performance is guaranteed when obtaining the panoramic image.
Embodiment two:
The embodiment two of the present application further provides a method for stitching fisheye images into a panoramic video, in which the N fisheye images obtained by N fisheye lenses at the same moment form one group of fisheye images. Referring to Fig. 5, the method includes:
S501: obtain K groups of fisheye images in succession with the N fisheye lenses, where N and K are integers not less than 2;
the following steps are performed for each group of fisheye images obtained:
Step A: according to the mapping relations among longitude-latitude coordinates, spherical coordinates and plane coordinates, obtain the N cylindrical projection images into which the group of fisheye images is expanded by longitude-latitude projection;
Step B: perform feature-point matching on the intersecting area of every group of cylindrical projection images among the N cylindrical projection images to obtain the matched feature-point pairs of each intersecting area, where every two cylindrical projection images having an intersecting area form one group of cylindrical projection images;
Step C: according to the mapping relation between spherical coordinates and longitude-latitude coordinates, obtain the three-dimensional coordinate set, on the sphere, of the matched feature-point pairs of each intersecting area, and perform attitude calculation with that three-dimensional coordinate set to obtain the attitude matrix of the matched feature-point pairs of each intersecting area;
Step D: taking the coordinate system of one of the N fisheye lenses as the world coordinate system, calculate the attitude matrix of each fisheye lens in the world coordinate system from the attitude matrices of the matched feature-point pairs of the intersecting areas;
Step E: according to the attitude matrix of each fisheye lens in the world coordinate system, determine the pixel value of each pixel in the panoramic image stitched from the N fisheye images;
Step F: obtain the panoramic image according to the pixel value of each pixel in the panoramic image;
S502: after the above steps have been performed for every group of fisheye images, K panoramic images are obtained;
S503: compose the K panoramic images into a panoramic video according to the time order of the K groups of fisheye images.
When a panoramic video is captured with the N fisheye lenses, the N fisheye images obtained at the same moment form one group of fisheye images, each group is stitched into one panoramic image, and the K panoramic images obtained are composed, in time order, into the panoramic video, each panoramic image being one frame.
In step S501, when capturing the panoramic video, the N fisheye lenses capture the K groups of fisheye images in succession; the value of K can be changed according to the length of the panoramic video: the longer the video, the larger K, and the shorter the video, the smaller K.
If the panoramic video needs to be obtained in real time, steps A to F are performed for each group of fisheye images as soon as it is acquired; if real-time output is not required, steps A to F may be performed for every group only after all K groups have been acquired.
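A schematic sketch of S501 to S503 is given below, assuming OpenCV is used for video writing and that a stitch_group() function implementing steps A to F exists (it is not shown here); the fourcc code and frame rate are illustrative assumptions.

```python
import cv2

def build_panoramic_video(groups, stitch_group, out_path, fps=30.0):
    """groups: iterable of (time_tag, [fisheye_0, ..., fisheye_N-1]) for the K groups."""
    writer = None
    for time_tag, fisheye_images in sorted(groups, key=lambda g: g[0]):  # time order of the K groups
        pano = stitch_group(fisheye_images)           # steps A to F for one group
        if writer is None:
            h, w = pano.shape[:2]
            writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
        writer.write(pano)                            # one panoramic image = one frame
    if writer is not None:
        writer.release()
```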
In step A, each acquired fisheye image may first be converted from two-dimensional coordinates to spherical coordinates, and the spherical coordinates of each fisheye image are then expanded by longitude-latitude projection to obtain the N cylindrical projection images.
In step A, when converting each acquired fisheye image from two-dimensional coordinates to spherical coordinates, the image may first be converted from image coordinates to plane rectangular coordinates and then from plane rectangular coordinates to spherical coordinates; each fisheye image may also be converted directly from image coordinates to spherical coordinates, which the present application does not specifically limit.
Alternatively, in step A, the N cylindrical projection images may be obtained first with the pixel value of each of their pixels still undetermined. When determining these pixel values, the three-dimensional coordinate of each pixel of each cylindrical projection image on the sphere is first obtained according to the mapping relation between spherical coordinates and longitude-latitude coordinates; then, according to the mapping relation between spherical coordinates and plane coordinates, the correspondence between each pixel of each cylindrical projection image and a pixel of the corresponding fisheye image is obtained from its three-dimensional coordinate on the sphere; the pixel value of each pixel of each cylindrical projection image is obtained accordingly, and the N cylindrical projection images are obtained from those pixel values.
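As a minimal sketch of the longitude-latitude expansion used here, the following maps each pixel of a projection image to a longitude/latitude pair and then to a point on the unit sphere. The conventions chosen (columns as longitude, rows as latitude, and the particular axis order) are assumptions for illustration; the patent's exact mapping formulas are not reproduced.

```python
import numpy as np

def lonlat_grid(width, height):
    """Longitude in [-pi, pi) per column and latitude in [-pi/2, pi/2] per row."""
    lon = (np.arange(width) + 0.5) / width * 2.0 * np.pi - np.pi
    lat = np.pi / 2.0 - (np.arange(height) + 0.5) / height * np.pi
    return np.meshgrid(lon, lat)

def lonlat_to_sphere(lon, lat):
    """Map longitude/latitude arrays to 3-D points on the unit sphere."""
    return np.stack([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)], axis=-1)
```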
In the embodiment of the present application, the implementation of step A may refer to the implementation of step S102 and, for brevity of the specification, is not repeated here.
After step A has been performed, step B is performed. In step B, the intersecting area of every group of cylindrical projection images among the N cylindrical projection images may be obtained according to the distribution of the N fisheye lenses and the field of view of each fisheye lens.
In the embodiment of the present application, the implementation of step B may refer to the implementation of step S103 and, for brevity of the specification, is not repeated here.
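The following is a hedged sketch of the feature-point matching in step B using OpenCV: features are detected and matched only inside the intersecting region of a pair of cylindrical projection images. ORB with Hamming-distance brute-force matching is one workable choice made here for illustration; the patent does not prescribe a particular detector or matcher.

```python
import cv2

def match_overlap(img_a, img_b, overlap_a, overlap_b, max_matches=200):
    """overlap_a / overlap_b: 8-bit masks of the intersecting region in each projection image."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, des_a = orb.detectAndCompute(img_a, overlap_a)
    kp_b, des_b = orb.detectAndCompute(img_b, overlap_b)
    if des_a is None or des_b is None:
        return []
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)[:max_matches]
    # Each entry is one matched feature-point pair: 2-D coordinates in each projection image.
    return [(kp_a[m.queryIdx].pt, kp_b[m.trainIdx].pt) for m in matches]
```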
Step C is performed next. In step C, after the matched feature-point pairs have been obtained through step B, the two-dimensional coordinate set, in the cylindrical projection images, of the matched feature-point pairs of each intersecting area is obtained using longitude-latitude coordinates; the three-dimensional coordinate set of the matched feature-point pairs of each intersecting area is then obtained from that two-dimensional coordinate set and the mapping relation between longitude-latitude coordinates and spherical coordinates.
In the embodiment of the present application, the implementation of step C may refer to the implementation of step S104 and, for brevity of the specification, is not repeated here.
Step D is performed next. Every two adjacent fisheye lenses among the N fisheye lenses necessarily share an intersecting area, and the attitude matrix of the matched feature-point pairs of each intersecting area has been obtained in step C; that attitude matrix is the attitude matrix between the two adjacent fisheye lenses. Taking the coordinate system of any one of the N fisheye lenses as the world coordinate system, the attitude matrix of each fisheye lens in the world coordinate system is calculated from the attitude matrices of the matched feature-point pairs of the intersecting areas.
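A schematic sketch of step D follows: lens 0's coordinate system is taken as the world coordinate system and the pairwise attitude matrices of adjacent lenses are chained to place every lens in it. The assumption that the relative poses form a simple chain 0-1, 1-2, ... is made only for illustration.

```python
import numpy as np

def lens_attitudes_in_world(pairwise):
    """pairwise[i]: 3x3 attitude matrix of lens i+1 relative to lens i (from its matched pairs)."""
    attitudes = [np.eye(3)]                          # lens 0 defines the world coordinate system
    for relative in pairwise:
        attitudes.append(attitudes[-1] @ relative)   # R_world,i+1 = R_world,i @ R_i,i+1
    return attitudes
```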
In the embodiment of the present application, the implementation of step D may refer to the implementation of step S105 and, for brevity of the specification, is not repeated here.
Step E is performed next. When the resolution of the panoramic image is not greater than that of each fisheye image, the corresponding pixel of each panoramic-image pixel in the corresponding fisheye image is obtained according to the attitude matrix of each fisheye lens in the world coordinate system, and the pixel value of each panoramic-image pixel is then determined from the pixel value of its corresponding pixel.
Specifically, obtaining the corresponding pixel of each panoramic-image pixel in the corresponding fisheye image according to the attitude matrix of each fisheye lens in the world coordinate system may be done through steps S301 to S303.
In step E, when the resolution of the panoramic image is greater than that of each fisheye image, some pixels of the panoramic image will not correspond to any pixel in the N fisheye images; in that case, obtaining the corresponding pixel of each panoramic-image pixel in the corresponding fisheye image according to the attitude matrix of each fisheye lens in the world coordinate system may be done through steps S401 to S403.
In the embodiment of the present application, the implementation of step E may refer to the implementation of step S106 and, for brevity of the specification, is not repeated here.
Step F is performed next: after the pixel value of each pixel in the panoramic image has been obtained in step E, each pixel of the panoramic image is filled with its pixel value, thereby obtaining the panoramic image.
In this way, when the resolution of the panoramic image is not greater than that of each fisheye image, the pixel value of each panoramic-image pixel is determined from the corresponding pixel value in the corresponding fisheye image, so the pixel values of the panoramic image match the pixel values of the fisheye images more closely, which improves the display precision of the panoramic image.
Moreover, when the resolution of the panoramic image is greater than that of each fisheye image, the pixel values of the first portion of pixels are obtained in the same way as when the panoramic resolution is not greater than the fisheye resolution, so they match the fisheye pixel values closely; the pixel values of the second portion are then obtained from the first portion by interpolation. As the accuracy of the first-portion pixel values improves, the accuracy of the second-portion pixel values improves accordingly, and so does the display precision of the panoramic image.
After the above steps have been performed for every group of fisheye images, step S502 is performed and K panoramic images are obtained.
In a specific implementation, the K panoramic images correspond one to one with the K groups of fisheye images; after steps A to F have been performed for each group, the K panoramic images are obtained.
Step S503 is performed next: according to the time order of the K groups of fisheye images, the K panoramic images are composed into a panoramic video.
In a specific implementation, since the panoramic video is made up of panoramic images one after another, the K panoramic images can be composed into the panoramic video according to the time order of the K groups of fisheye images.
Specifically, the time order of the K groups of fisheye images is determined by the time at which the N fisheye lenses captured each group. While the K groups are being captured, a time tag can be attached to each fisheye image in each group; the time order of the K groups is obtained from the time tags of the fisheye images in each group, and the corresponding K panoramic images are then sorted according to that order.
For example, the K groups of fisheye images are groups K1, K2, K3 and K4, where group K1 corresponds to panoramic image K11, group K2 to K21, group K3 to K31 and group K4 to K41. If the time tags of the fisheye images in each group give the time order K1, K2, K3, K4, the time order of the panoramic images in the panoramic video is K11, K21, K31, K41.
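A tiny sketch of this ordering step follows; the group and panorama names come from the example above, while the numeric time tags are made up for illustration.

```python
# Each group carries the time tag of its fisheye images; the panoramas inherit that order.
groups = [("K3", 3.0), ("K1", 1.0), ("K4", 4.0), ("K2", 2.0)]          # (group, time tag)
panorama_of = {"K1": "K11", "K2": "K21", "K3": "K31", "K4": "K41"}
frame_order = [panorama_of[g] for g, _ in sorted(groups, key=lambda item: item[1])]
# frame_order == ["K11", "K21", "K31", "K41"]
```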
Embodiment three:
Based on the same technical concept as the method provided in embodiment one, an embodiment of the present invention further provides a picture pick-up device, as shown in Fig. 6, including:
A fisheye image acquiring unit 601, configured to obtain N fisheye images at the same moment with N fisheye lenses, where N is an integer not less than 2;
a cylindrical projection image acquiring unit 602, configured to obtain, according to the mapping relations among longitude-latitude coordinates, spherical coordinates and plane coordinates, the N cylindrical projection images into which the N fisheye images are expanded by longitude-latitude projection;
a matched feature-point pair acquiring unit 603, configured to perform feature-point matching on the intersecting area of every group of cylindrical projection images among the N cylindrical projection images and obtain the matched feature-point pairs of each intersecting area, where every two cylindrical projection images having an intersecting area form one group of cylindrical projection images;
a first attitude matrix acquiring unit 604, configured to obtain, according to the mapping relation between spherical coordinates and longitude-latitude coordinates, the three-dimensional coordinate set on the sphere of the matched feature-point pairs of each intersecting area, perform attitude calculation with that three-dimensional coordinate set, and obtain the attitude matrix of the matched feature-point pairs of each intersecting area;
a second attitude matrix acquiring unit 605, configured to take the coordinate system of one of the N fisheye lenses as the world coordinate system and calculate the attitude matrix of each fisheye lens in the world coordinate system from the attitude matrices of the matched feature-point pairs of the intersecting areas;
a pixel value acquiring unit 606, configured to determine, according to the attitude matrix of each fisheye lens in the world coordinate system, the pixel value of each pixel in the panoramic image stitched from the N fisheye images;
a panoramic image acquiring unit 607, configured to obtain the panoramic image according to the pixel value of each pixel in the panoramic image.
Preferably, the first attitude matrix acquiring unit 604 further includes:
a three-dimensional coordinate set acquiring subunit, configured to obtain, using longitude-latitude coordinates, the two-dimensional coordinate set in the cylindrical projection images of the matched feature-point pairs of each intersecting area, and to obtain the three-dimensional coordinate set of the matched feature-point pairs of each intersecting area according to that two-dimensional coordinate set and the mapping relation between longitude-latitude coordinates and spherical coordinates.
Preferably, the pixel value acquiring unit 606 is configured to, when the resolution of the panoramic image is not greater than that of each fisheye image, obtain the corresponding pixel of each panoramic-image pixel in the corresponding fisheye image according to the attitude matrix of each fisheye lens in the world coordinate system, and determine the pixel value of each panoramic-image pixel according to the pixel value of its corresponding pixel.
Preferably, the second attitude matrix acquiring unit 605 further includes:
a first corresponding pixel acquiring subunit, configured to convert the longitude-latitude coordinates of each pixel in the panoramic image to spherical coordinates; obtain the new spherical coordinate of each pixel from its spherical coordinate and the attitude matrix of each fisheye lens in the world coordinate system; and obtain the corresponding pixel of each panoramic-image pixel in the corresponding fisheye image according to the new spherical coordinates and the mapping relation between the plane coordinates of the pixels in the N fisheye images and the spherical coordinates.
Preferably, the pixel value acquiring unit 606 is configured to, when a pixel in the panoramic image corresponds to K pixels in K fisheye images respectively, determine the pixel value of that pixel according to the pixel values and weights of the K pixels, the K pixels corresponding one to one with the K fisheye images and K being an integer not less than 2 and not greater than N; and, when a pixel in the panoramic image corresponds to only one specific pixel in a single fisheye image, determine the pixel value of that pixel to be the pixel value of the specific pixel.
Preferably, the pixel value acquiring unit 606 is further configured to obtain the weight of each of the K pixels, and to determine the pixel value of the panoramic-image pixel according to the pixel values and weights of the K pixels.
Preferably, the pixel value acquiring unit 606 is further configured to, when the resolution of the panoramic image is greater than that of each fisheye image, obtain, according to the attitude matrix of each fisheye lens in the world coordinate system, the corresponding pixel in the corresponding fisheye image of each pixel in a first portion of pixels of the panoramic image, the first portion being all pixels of the panoramic image that correspond to pixels in the N fisheye images; determine the pixel value of each first-portion pixel according to the pixel value of its corresponding pixel; and determine the pixel value of each pixel in a second portion of pixels of the panoramic image according to the pixel values of the first-portion pixels, the first portion and the second portion together being all pixels of the panoramic image.
Preferably, the pixel value acquiring unit 606 further includes:
a second corresponding pixel acquiring subunit, configured to convert the longitude-latitude coordinates of each pixel in the panoramic image to spherical coordinates; obtain the new spherical coordinate of each pixel from its spherical coordinate and the attitude matrix of each fisheye lens in the world coordinate system; determine the first portion of pixels from the pixels of the panoramic image according to the new spherical coordinates and the mapping relation between the plane coordinates of the pixels in the N fisheye images and the spherical coordinates; and obtain the corresponding pixel of each first-portion pixel in the corresponding fisheye image.
Preferably, the picture pick-up device further includes an intersecting area acquiring unit, configured to obtain the intersecting area of every group of cylindrical projection images among the N cylindrical projection images according to the distribution of the N fisheye lenses and the field of view of each fisheye lens.
Embodiment four:
Based on the identical technical concept of the method provided with embodiment two, the embodiment of the present application four additionally provides a kind of camera shooting Equipment, referring to Fig. 7, including:
Fish eye images acquiring unit 701, for obtaining K group fish eye images successively using N number of fish eye lens, wherein, the N A fish eye lens forms one group of fish eye images in N number of fish eye images that synchronization obtains, and N and K are the integer not less than 2;
Panoramic picture acquiring unit 702 performs following steps for being directed to each group of fish eye images of acquisition:Step A:Root According to the mapping relations of latitude and longitude coordinates, spherical coordinate and plane coordinates, get one group of fish eye images and be unfolded by longitude and latitude N number of cylindrical surface projecting image afterwards;Step B:By the intersecting area of every group of cylindrical surface projecting image in N number of cylindrical surface projecting image into Row Feature Points Matching obtains the matching characteristic point pair of every group of cylindrical surface projecting image intersecting area, wherein, each two has intersection The cylindrical surface projecting image in domain is one group of cylindrical surface projecting image;Step C:According to spherical coordinate and the mapping relations of latitude and longitude coordinates, The matching characteristic point of every group of cylindrical surface projecting image intersecting area is obtained to the three-dimensional coordinate set on spherical surface, and utilizes every group of cylinder The three-dimensional coordinate set of the matching characteristic point pair of projected image intersecting area carries out Attitude Calculation, obtains every group of cylindrical surface projecting image phase Hand over the attitude matrix of the matching characteristic point pair in region;Step D:With a certain fish-eye coordinate in N number of fish eye lens It is for world coordinate system, using the attitude matrix of the matching characteristic point pair of every group of cylindrical surface projecting image intersecting area, is calculated Each attitude matrix of the fish eye lens in world coordinate system;Step E:According to appearance of each fish eye lens in world coordinate system State matrix determines the pixel value of each pixel in the panoramic picture being spliced by N number of fish eye images;Step F:According to The pixel value of each pixel, obtains the panoramic picture in the panoramic picture;Above-mentioned step has been performed in every group of fish eye images After rapid, K panoramic picture is got;
Panoramic video acquiring unit 703, for the time sequencing according to the K groups fish eye images, by the K panorama sketch As composition panoramic video.
Preferably, panoramic picture acquiring unit 702 further includes:
Three-dimensional coordinate set obtains subelement, for utilizing latitude and longitude coordinates every group of cylindrical surface projecting image intersecting area of acquisition Matching characteristic point is to the two-dimensional coordinate collection in cylindrical surface projecting image;According to the mapping relations of latitude and longitude coordinates and spherical coordinate and The two-dimensional coordinate collection obtains the three-dimensional coordinate set of the matching characteristic point pair of every group of cylindrical surface projecting image intersecting area.
Preferably, panoramic picture acquiring unit 702 further includes:
Pixel value acquiring unit, for being not more than the resolution ratio of each fish eye images in the resolution ratio of the panoramic picture When, according to attitude matrix of each fish eye lens in world coordinate system, each pixel is obtained in the panoramic picture in phase Answer the corresponding pixel points in fish eye images;According to correspondence picture of each pixel in corresponding fish eye images in the panoramic picture The pixel value of vegetarian refreshments determines the pixel value of each pixel in the panoramic picture.
Preferably, panoramic picture acquiring unit 702 further includes:
First corresponding pixel points obtain subelement, for the latitude and longitude coordinates of each pixel in the panoramic picture to be turned It is changed to spherical coordinate;According to the spherical coordinate of each pixel and each fish eye lens in the panoramic picture in world coordinate system In attitude matrix, obtain the new spherical coordinate of each pixel in the panoramic picture;According to each in the panoramic picture The mapping of the plane coordinates of each pixel and spherical coordinate is closed in the new spherical coordinate of pixel and N number of fish eye images System, obtains each corresponding pixel points of the pixel in corresponding fish eye images in the panoramic picture.
Preferably, panoramic picture acquiring unit 702 further includes:
Pixel value acquiring unit, in the panoramic picture some pixel respectively with the K in K fish eye images Pixel to it is corresponding when according to the pixel value and weight of the K pixel, determine the pixel of the pixel in the panoramic picture Value, the K pixel are corresponded with the K fish eye images, and K is the integer not less than 2 and no more than N;And institute State in panoramic picture some pixel only with a specific pixel point in a fish eye images to it is corresponding when determine the panorama sketch The pixel value of the pixel as in is the pixel value of the specific pixel point.
Preferably, the pixel value acquiring unit, it is additionally operable to obtain the weight of each pixel in the K pixel; According to the weight of the pixel value of the K pixel and each pixel, the pixel of the pixel in the panoramic picture is determined Value.
Preferably, the pixel value acquiring unit, is additionally operable to be more than each fish-eye image in the resolution ratio of the panoramic picture During the resolution ratio of picture, according to attitude matrix of each fish eye lens in world coordinate system, in the panoramic picture is obtained Each corresponding pixel points of the pixel in corresponding fish eye images in one part of pixel point, first portion's pixel is described All pixels point corresponding with the pixel in N number of fish eye images in panoramic picture;According to first portion's pixel In each corresponding pixel points of the pixel in corresponding fish eye images pixel value, determine each in first portion's pixel The pixel value of pixel;According to the pixel value of each pixel in first portion's pixel, determine in the panoramic picture The pixel value of each pixel in second portion pixel, wherein, first portion's pixel and the second portion pixel Point is all pixels point in the panoramic picture.
Preferably, the pixel value acquiring unit further includes:
Second corresponding pixel points obtain subelement, for the latitude and longitude coordinates of each pixel in the panoramic picture to be turned It is changed to spherical coordinate;According to the spherical coordinate of each pixel and each fish eye lens in the panoramic picture in world coordinate system In attitude matrix, obtain the new spherical coordinate of each pixel in the panoramic picture;According to each in the panoramic picture The mapping of the plane coordinates of each pixel and spherical coordinate is closed in the new spherical coordinate of pixel and N number of fish eye images System determines first portion's pixel from the pixel of the panoramic picture, and obtains in first portion's pixel Each corresponding pixel points of the pixel in corresponding fish eye images.
Preferably, panoramic picture acquiring unit 702 further includes:
Intersecting area acquiring unit, for according to N number of fish-eye distribution and each fish-eye visual field, obtaining The intersecting area of every group of cylindrical surface projecting image into N number of cylindrical surface projecting image.
Beneficial effects of the present invention are as follows:
First, after obtaining the attitude matrix of each fisheye lens in the world coordinate system, the embodiment of the present application determines, according to those attitude matrices, the pixel value of each pixel in the panoramic image stitched from the N fisheye images, and then obtains the panoramic image from those pixel values. Compared with the prior art, this removes the computation of image stitching or fusion and thus effectively reduces the amount of calculation, while obtaining the panoramic image from per-pixel values also improves its precision. This solves the prior-art problem that real-time performance and precision cannot both be achieved when obtaining a panoramic image, and achieves the effect of improving precision while guaranteeing real-time performance.
Second, the embodiment of the present application first obtains the spherical coordinate of each pixel of the panoramic image and then, according to that spherical coordinate and the mapping relation between the plane coordinates of the pixels in the fisheye images and the spherical coordinates, obtains the corresponding pixel of each panoramic-image pixel in the corresponding fisheye image; the pixel value of each panoramic-image pixel is obtained from the pixel value of its corresponding pixel, and the panoramic image is obtained accordingly. The formulas used in this process require no repeated polynomial solving, whereas the prior art does, with a larger amount of calculation and some loss of precision. Compared with the prior art, the amount of calculation needed to obtain the corresponding pixels is therefore greatly reduced, the accuracy of the calculation can be improved, and the computing time is greatly shortened, so that precision is improved while real-time performance is guaranteed when obtaining the panoramic image.
The modules or units described in the embodiments of the present invention may be implemented with a general-purpose integrated circuit, such as a CPU (Central Processing Unit), or with an ASIC (Application Specific Integrated Circuit).
One of ordinary skill in the art will appreciate that all or part of the flows in the methods of the above embodiments may be implemented by instructing the relevant hardware through a computer program; the program may be stored in a computer-readable storage medium and, when executed, may include the flows of the above method embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM) or the like.
The above disclosure is only a preferred embodiment of the present invention and certainly cannot limit the scope of the claims of the present invention; one of ordinary skill in the art will appreciate that implementations of all or part of the flows of the above embodiments, and equivalent variations made according to the claims of the present invention, still fall within the scope covered by the invention.

Claims (12)

1. a kind of method that fish eye images are spliced into panoramic picture, which is characterized in that including:
N number of fish eye images are got in synchronization using N number of fish eye lens, wherein, N is the integer not less than 2;
According to the mapping relations of latitude and longitude coordinates, spherical coordinate and plane coordinates, N number of fish eye images are got by longitude and latitude N number of cylindrical surface projecting image after expansion;
The intersecting area of every group of cylindrical surface projecting image in N number of cylindrical surface projecting image is subjected to Feature Points Matching, obtains every group The matching characteristic point pair of cylindrical surface projecting image intersecting area, wherein, it is one that each two, which has the cylindrical surface projecting image of intersecting area, Group cylindrical surface projecting image;
According to spherical coordinate and the mapping relations of latitude and longitude coordinates, the matching characteristic of every group of cylindrical surface projecting image intersecting area is obtained Point is to the three-dimensional coordinate set on spherical surface, and the three-dimensional seat of the matching characteristic point pair using every group of cylindrical surface projecting image intersecting area Mark collection carries out Attitude Calculation, obtains the attitude matrix of the matching characteristic point pair of every group of cylindrical surface projecting image intersecting area;
Using a certain fish-eye coordinate system in N number of fish eye lens as world coordinate system, every group of cylindrical surface projecting figure is utilized As the attitude matrix of the matching characteristic point pair of intersecting area, posture square of each fish eye lens in world coordinate system is calculated Battle array;
According to attitude matrix of each fish eye lens in world coordinate system, determine what is be spliced by N number of fish eye images The pixel value of each pixel in panoramic picture;
According to the pixel value of each pixel in the panoramic picture, the panoramic picture is obtained.
2. the method as described in claim 1, which is characterized in that the matching for obtaining every group of cylindrical surface projecting image intersecting area The three-dimensional coordinate set of characteristic point pair, specifically includes:
The matching characteristic point of every group of cylindrical surface projecting image intersecting area is obtained in cylindrical surface projecting image using latitude and longitude coordinates Two-dimensional coordinate collection;
According to the mapping relations and the two-dimensional coordinate collection of latitude and longitude coordinates and spherical coordinate, every group of cylindrical surface projecting image phase is obtained Hand over the three-dimensional coordinate set of the matching characteristic point pair in region.
3. the method as described in claim 1, which is characterized in that be not more than each fish-eye image in the resolution ratio of the panoramic picture During the resolution ratio of picture, attitude matrix of each fish eye lens of basis in world coordinate system is determined by N number of fish-eye image The pixel value of each pixel, specifically includes in the panoramic picture that picture is spliced:
According to attitude matrix of each fish eye lens in world coordinate system, each pixel is obtained in the panoramic picture in phase Answer the corresponding pixel points in fish eye images;
According to the pixel value of each corresponding pixel points of the pixel in corresponding fish eye images in the panoramic picture, determine described The pixel value of each pixel in panoramic picture.
4. method as claimed in claim 3, which is characterized in that appearance of each fish eye lens of basis in world coordinate system State matrix obtains each corresponding pixel points of the pixel in corresponding fish eye images in the panoramic picture, specifically includes:
The latitude and longitude coordinates of each pixel in the panoramic picture are converted into spherical coordinate;
According to the posture of the spherical coordinate and each fish eye lens of each pixel in the panoramic picture in world coordinate system Matrix obtains the new spherical coordinate of each pixel in the panoramic picture;
According in the new spherical coordinate of each pixel in the panoramic picture and N number of fish eye images each pixel it is flat The mapping relations of areal coordinate and spherical coordinate obtain each correspondence of the pixel in corresponding fish eye images in the panoramic picture Pixel.
5. method as claimed in claim 3, which is characterized in that it is described according to each pixel in the panoramic picture corresponding The pixel value of corresponding pixel points in fish eye images determines the pixel value of each pixel in the panoramic picture, specifically includes:
If some pixel is corresponding with K pixel in K fish eye images respectively in the panoramic picture, according to the K The pixel value and weight of a pixel, determine the pixel value of the pixel in the panoramic picture, the K pixel with it is described K fish eye images correspond, and K is the integer not less than 2 and no more than N;
If some pixel is only corresponding with a specific pixel point in a fish eye images in the panoramic picture, it is determined that institute The pixel value for stating the pixel in panoramic picture is the pixel value of the specific pixel point.
6. method as claimed in claim 5, which is characterized in that it is described according in the K fish eye images with the panorama sketch The pixel value of the corresponding K pixel of the pixel as in determines the pixel value of the pixel in the panoramic picture, specific to wrap It includes:
Obtain the weight of each pixel in the K pixel;
According to the weight of the pixel value of the K pixel and each pixel, the pixel in the panoramic picture is determined Pixel value.
7. the method as described in claim 1, which is characterized in that be more than each fish eye images in the resolution ratio of the panoramic picture Resolution ratio when, attitude matrix of each fish eye lens of basis in world coordinate system, determine by N number of fish eye images The pixel value of each pixel, specifically includes in the panoramic picture being spliced:
According to attitude matrix of each fish eye lens in world coordinate system, first portion's pixel in the panoramic picture is obtained Each corresponding pixel points of the pixel in corresponding fish eye images in point, first portion's pixel are in the panoramic picture All pixels point corresponding with the pixel in N number of fish eye images;
According to the pixel value of each corresponding pixel points of the pixel in corresponding fish eye images in first portion's pixel, really The pixel value of each pixel in fixed first portion's pixel;
According to the pixel value of each pixel in first portion's pixel, second portion pixel in the panoramic picture is determined The pixel value of each pixel in point, wherein, first portion's pixel and the second portion pixel are the panorama All pixels point in image.
8. the method for claim 7, which is characterized in that appearance of each fish eye lens of basis in world coordinate system State matrix obtains each correspondence picture of the pixel in corresponding fish eye images in first portion's pixel in the panoramic picture Vegetarian refreshments specifically includes:
The latitude and longitude coordinates of each pixel in the panoramic picture are converted into spherical coordinate;
According to the posture of the spherical coordinate and each fish eye lens of each pixel in the panoramic picture in world coordinate system Matrix obtains the new spherical coordinate of each pixel in the panoramic picture;
According in the new spherical coordinate of each pixel in the panoramic picture and N number of fish eye images each pixel it is flat The mapping relations of areal coordinate and spherical coordinate determine first portion's pixel from the pixel of the panoramic picture, and Obtain each corresponding pixel points of the pixel in corresponding fish eye images in first portion's pixel.
9. the method as described in claim 1, which is characterized in that obtain every group of cylindrical surface projecting in N number of cylindrical surface projecting image The intersecting area of image, specifically includes:
According to N number of fish-eye distribution and each fish-eye visual field, obtain every in N number of cylindrical surface projecting image The intersecting area of group cylindrical surface projecting image.
10. A method for stitching fisheye images into a panoramic video, wherein the N fisheye images obtained by N fisheye lenses at the same moment form one group of fisheye images, the method comprising:
obtaining K groups of fisheye images in sequence using the N fisheye lenses, wherein N and K are integers not less than 2;
performing the following steps for each group of fisheye images obtained:
Step A: obtaining, according to the mapping relations among longitude-latitude coordinates, spherical coordinates and plane coordinates, the N cylindrical projection images of the group of fisheye images after longitude-latitude expansion;
Step B: performing feature point matching on the intersection area of each group of cylindrical projection images among the N cylindrical projection images to obtain the matched feature point pairs of the intersection area of each group of cylindrical projection images, wherein every two cylindrical projection images having an intersection area form one group of cylindrical projection images;
Step C: obtaining, according to the mapping relation between spherical coordinates and longitude-latitude coordinates, the three-dimensional coordinate sets on the sphere of the matched feature point pairs of the intersection area of each group of cylindrical projection images, and performing attitude calculation using these three-dimensional coordinate sets to obtain the attitude matrix of the matched feature point pairs of the intersection area of each group of cylindrical projection images;
Step D: taking the coordinate system of one of the N fisheye lenses as the world coordinate system, and calculating the attitude matrix of each fisheye lens in the world coordinate system using the attitude matrices of the matched feature point pairs of the intersection areas of the groups of cylindrical projection images;
Step E: determining, according to the attitude matrix of each fisheye lens in the world coordinate system, the pixel value of each pixel in the panoramic image stitched from the N fisheye images;
Step F: obtaining the panoramic image according to the pixel values of the pixels in the panoramic image;
after the above steps have been performed for every group of fisheye images, K panoramic images are obtained;
composing the K panoramic images into a panoramic video according to the time order of the K groups of fisheye images.
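Steps A–F of claim 10 amount to running the single-frame stitching once per synchronized group and writing the K panoramas out in time order. A rough sketch, where stitch_group is a hypothetical callable implementing steps A–F for one group and OpenCV is assumed only for video encoding:

```python
import cv2  # assumes OpenCV is available for video encoding

def fisheye_groups_to_video(groups, stitch_group, out_path, fps=30.0):
    """Stitch each synchronized group of N fisheye frames into a panorama
    (steps A-F) and write the K panoramas out in time order (claim 10).

    groups       : iterable of K groups, each a list of N fisheye frames (BGR arrays)
    stitch_group : hypothetical callable implementing steps A-F for one group
    """
    writer = None
    for group in groups:
        pano = stitch_group(group)                       # steps A-F for this group
        if writer is None:
            h, w = pano.shape[:2]
            fourcc = cv2.VideoWriter_fourcc(*"mp4v")
            writer = cv2.VideoWriter(out_path, fourcc, fps, (w, h))
        writer.write(pano)                               # panoramas kept in time order
    if writer is not None:
        writer.release()
```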
11. A camera device, comprising:
a fisheye image acquiring unit, configured to obtain N fisheye images at the same moment using N fisheye lenses, wherein N is an integer not less than 2;
a cylindrical projection image acquiring unit, configured to obtain, according to the mapping relations among longitude-latitude coordinates, spherical coordinates and plane coordinates, the N cylindrical projection images of the N fisheye images after longitude-latitude expansion;
a matched feature point pair acquiring unit, configured to perform feature point matching on the intersection area of each group of cylindrical projection images among the N cylindrical projection images to obtain the matched feature point pairs of the intersection area of each group of cylindrical projection images, wherein every two cylindrical projection images having an intersection area form one group of cylindrical projection images;
a first attitude matrix acquiring unit, configured to obtain, according to the mapping relation between spherical coordinates and longitude-latitude coordinates, the three-dimensional coordinate sets on the sphere of the matched feature point pairs of the intersection area of each group of cylindrical projection images, and to perform attitude calculation using these three-dimensional coordinate sets to obtain the attitude matrix of the matched feature point pairs of the intersection area of each group of cylindrical projection images;
a second attitude matrix acquiring unit, configured to take the coordinate system of one of the N fisheye lenses as the world coordinate system and to calculate the attitude matrix of each fisheye lens in the world coordinate system using the attitude matrices of the matched feature point pairs of the intersection areas of the groups of cylindrical projection images;
a pixel value acquiring unit, configured to determine, according to the attitude matrix of each fisheye lens in the world coordinate system, the pixel value of each pixel in the panoramic image stitched from the N fisheye images;
a panoramic image acquiring unit, configured to obtain the panoramic image according to the pixel values of the pixels in the panoramic image.
12. A camera device, comprising:
a fisheye image acquiring unit, configured to obtain K groups of fisheye images in sequence using N fisheye lenses, wherein the N fisheye images obtained by the N fisheye lenses at the same moment form one group of fisheye images, and N and K are integers not less than 2;
a panoramic image acquiring unit, configured to perform the following steps for each group of fisheye images obtained: Step A: obtaining, according to the mapping relations among longitude-latitude coordinates, spherical coordinates and plane coordinates, the N cylindrical projection images of the group of fisheye images after longitude-latitude expansion; Step B: performing feature point matching on the intersection area of each group of cylindrical projection images among the N cylindrical projection images to obtain the matched feature point pairs of the intersection area of each group of cylindrical projection images, wherein every two cylindrical projection images having an intersection area form one group of cylindrical projection images; Step C: obtaining, according to the mapping relation between spherical coordinates and longitude-latitude coordinates, the three-dimensional coordinate sets on the sphere of the matched feature point pairs of the intersection area of each group of cylindrical projection images, and performing attitude calculation using these three-dimensional coordinate sets to obtain the attitude matrix of the matched feature point pairs of the intersection area of each group of cylindrical projection images; Step D: taking the coordinate system of one of the N fisheye lenses as the world coordinate system, and calculating the attitude matrix of each fisheye lens in the world coordinate system using the attitude matrices of the matched feature point pairs of the intersection areas of the groups of cylindrical projection images; Step E: determining, according to the attitude matrix of each fisheye lens in the world coordinate system, the pixel value of each pixel in the panoramic image stitched from the N fisheye images; Step F: obtaining the panoramic image according to the pixel values of the pixels in the panoramic image; after the above steps have been performed for every group of fisheye images, K panoramic images are obtained;
a panoramic video acquiring unit, configured to compose the K panoramic images into a panoramic video according to the time order of the K groups of fisheye images.
CN201611075609.0A 2016-11-29 2016-11-29 Method and device for splicing fisheye images into panoramic image and panoramic video Active CN108122191B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611075609.0A CN108122191B (en) 2016-11-29 2016-11-29 Method and device for splicing fisheye images into panoramic image and panoramic video

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611075609.0A CN108122191B (en) 2016-11-29 2016-11-29 Method and device for splicing fisheye images into panoramic image and panoramic video

Publications (2)

Publication Number Publication Date
CN108122191A true CN108122191A (en) 2018-06-05
CN108122191B CN108122191B (en) 2021-07-06

Family

ID=62226901

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611075609.0A Active CN108122191B (en) 2016-11-29 2016-11-29 Method and device for splicing fisheye images into panoramic image and panoramic video

Country Status (1)

Country Link
CN (1) CN108122191B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070159527A1 (en) * 2006-01-09 2007-07-12 Samsung Electronics Co., Ltd. Method and apparatus for providing panoramic view with geometric correction
CN103226840A (en) * 2013-04-24 2013-07-31 武汉大学 Panoramic image splicing and measuring system and method
US20160050349A1 (en) * 2014-08-15 2016-02-18 Sony Corporation Panoramic video
CN105550984A (en) * 2015-12-30 2016-05-04 北京奇艺世纪科技有限公司 Fisheye image correction and wandering display method and apparatus
CN105741233A (en) * 2016-01-27 2016-07-06 桂林长海发展有限责任公司 Spherical mosaic method and system of video image
CN105678729A (en) * 2016-02-24 2016-06-15 段梦凡 Splicing method for panoramic images of fish-eye lenses
CN106127680A (en) * 2016-06-29 2016-11-16 深圳市优象计算技术有限公司 A kind of 720 degree of panoramic video fast browsing methods

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ÖMER COGAL et al.: "Spherical Panorama Construction using Multi Sensor Registration Priors and Its Real-Time Hardware", 2013 IEEE International Symposium on Multimedia *
姚路 et al.: "Stitching distortion elimination algorithm based on projection model and image fusion", Computer Applications and Software *
王依桌: "Fisheye image correction and cylindrical panorama stitching method", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108876712A (en) * 2018-06-21 2018-11-23 东南大学 Spherical closing cockpit panorama display methods based on dual-projection transformation
CN109304741A (en) * 2018-12-07 2019-02-05 宁波宝尼尔厨具电器有限公司 Interior cutter head incompleteness degree indicates system
CN109685721A (en) * 2018-12-29 2019-04-26 深圳看到科技有限公司 Panorama joining method, device, terminal and corresponding storage medium
CN109685721B (en) * 2018-12-29 2021-03-16 深圳看到科技有限公司 Panoramic picture splicing method, device, terminal and corresponding storage medium
CN109660732A (en) * 2019-01-04 2019-04-19 Oppo广东移动通信有限公司 Electronic equipment and mobile platform
CN109660731A (en) * 2019-01-04 2019-04-19 Oppo广东移动通信有限公司 Electronic equipment and mobile platform
CN109660733A (en) * 2019-01-04 2019-04-19 Oppo广东移动通信有限公司 Electronic equipment and mobile platform
CN109587304A (en) * 2019-01-04 2019-04-05 Oppo广东移动通信有限公司 Electronic equipment and mobile platform
CN109587304B (en) * 2019-01-04 2021-04-16 Oppo广东移动通信有限公司 Electronic equipment and mobile platform
CN109618108A (en) * 2019-01-07 2019-04-12 Oppo广东移动通信有限公司 Electronic equipment and mobile platform
CN111723830A (en) * 2019-03-20 2020-09-29 杭州海康威视数字技术股份有限公司 Image mapping method, device and equipment and storage medium
CN111723830B (en) * 2019-03-20 2023-08-29 杭州海康威视数字技术股份有限公司 Image mapping method, device and equipment and storage medium
CN110717936B (en) * 2019-10-15 2023-04-28 哈尔滨工业大学 Image stitching method based on camera attitude estimation
CN110717936A (en) * 2019-10-15 2020-01-21 哈尔滨工业大学 Image stitching method based on camera attitude estimation
CN111429336A (en) * 2019-12-27 2020-07-17 上海庄生晓梦信息科技有限公司 Processing method and processing device for fisheye video data
CN111429336B (en) * 2019-12-27 2023-09-08 上海庄生晓梦信息科技有限公司 Processing method and processing device for fish-eye video data
CN111507894A (en) * 2020-04-17 2020-08-07 浙江大华技术股份有限公司 Image splicing processing method and device
CN111507894B (en) * 2020-04-17 2023-06-13 浙江大华技术股份有限公司 Image stitching processing method and device
CN112995646A (en) * 2021-02-09 2021-06-18 聚好看科技股份有限公司 Display method and display device of fisheye video
CN113450254A (en) * 2021-05-20 2021-09-28 北京城市网邻信息技术有限公司 Image processing method, image processing device, electronic equipment and computer readable storage medium
CN114462622A (en) * 2022-02-07 2022-05-10 舵敏智能科技(苏州)有限公司 Deep learning model deployment and training method for crowdsourcing data
CN115174805A (en) * 2022-06-27 2022-10-11 影石创新科技股份有限公司 Panoramic stereo image generation method and device and electronic equipment

Also Published As

Publication number Publication date
CN108122191B (en) 2021-07-06

Similar Documents

Publication Publication Date Title
CN108122191A (en) Method and device for splicing fisheye images into panoramic image and panoramic video
WO2021120407A1 (en) Parallax image stitching and visualization method based on multiple pairs of binocular cameras
CN105957007B (en) Image stitching method based on feature point plane similarity
CN110211043B (en) Registration method based on grid optimization for panoramic image stitching
CN100485720C (en) 360-degree surround panorama generation method based on serial static images
CN110782394A (en) Panoramic video rapid stitching method and system
CN106683071B (en) Image stitching method and device
CN106056620B (en) Calibration method for a line-laser camera measurement system
CN104778656B (en) Fisheye image correction method based on spherical perspective projection
CN106683045A (en) Binocular camera-based panoramic image stitching method
JP2013038775A (en) Ray image modeling for fast catadioptric light field rendering
CN108846796B (en) Image stitching method and electronic device
CN106534670B (en) Panoramic video generation method based on a rigidly connected fisheye camera group
CN109559349A (en) Method and apparatus for calibration
CN106464780B (en) XSLIT camera
CN109615664A (en) Calibration method and device for an optical see-through augmented reality display
CN111009030A (en) Multi-view high-resolution texture image and binocular three-dimensional point cloud mapping method
CN102903101A (en) Method for water-surface data acquisition and reconstruction using multiple cameras
CN106997617A (en) Mixed reality virtual rendering method and device
CN108269234A (en) Panoramic camera lens attitude estimation method and panoramic camera
JP4554231B2 (en) Distortion parameter generation method, video generation method, distortion parameter generation apparatus, and video generation apparatus
CN107240149A (en) Object three-dimensional model building method based on image processing
CN105488764B (en) Fisheye image correction method and device
CN111583117A (en) Rapid panoramic stitching method and device suitable for complex space environments
CN107680035A (en) Parameter calibration method and device, server and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
Effective date of registration: 20210112
Address after: No. 102, 1st floor, building 1, No. 1129, shijicheng Road, high tech Zone, Chengdu, Sichuan 610000
Applicant after: Chengdu meiruo Mengjing Technology Co.,Ltd.
Address before: No.3, 1st floor, unit 1, building 2, 219 Tianhua 2nd Road, high tech Zone, Chengdu, Sichuan 610041
Applicant before: CHENGDU GUANJIE CHUANGYU TECHNOLOGY Co.,Ltd.
GR01 Patent grant