CN115239724B - 360-degree panoramic stitching image analysis and evaluation method - Google Patents

360-degree panoramic stitching image analysis and evaluation method Download PDF

Info

Publication number
CN115239724B
CN115239724B CN202211156487.3A CN202211156487A CN115239724B CN 115239724 B CN115239724 B CN 115239724B CN 202211156487 A CN202211156487 A CN 202211156487A CN 115239724 B CN115239724 B CN 115239724B
Authority
CN
China
Prior art keywords
splicing
image
shooting angle
camera
quality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211156487.3A
Other languages
Chinese (zh)
Other versions
CN115239724A (en
Inventor
郑智宇
邓志颖
李浩然
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Eagle Drive Technology Shenzhen Co Ltd
Original Assignee
Eagle Drive Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Eagle Drive Technology Shenzhen Co Ltd filed Critical Eagle Drive Technology Shenzhen Co Ltd
Priority to CN202211156487.3A priority Critical patent/CN115239724B/en
Publication of CN115239724A publication Critical patent/CN115239724A/en
Application granted granted Critical
Publication of CN115239724B publication Critical patent/CN115239724B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4038Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/16Image acquisition using multiple overlapping images; Image stitching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30168Image quality inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle

Abstract

The invention belongs to the technical field of spliced image evaluation, and particularly discloses a 360-degree panoramic stitching image analysis and evaluation method, which is characterized in that an effective viewing range of a target automobile is set before vehicle-mounted 360-degree panoramic image stitching operation is carried out, the focal lengths of all cameras on the body of the target automobile are uniformly regulated and controlled, the consistency of the corresponding viewing ranges of all the cameras is improved to a certain extent, meanwhile, when overlapping area images in stitching subimages are taken, scientific and objective taking basis is provided for the overlapping area images, the taking efficiency is improved, the high quality degree of the taken overlapping area images is improved to the maximum extent, finally, the influence of stitching smoothness and stitching uniformity on stitching effect is fully considered when the stitching effect of the vehicle-mounted 360-degree panoramic image is evaluated, so that comprehensive evaluation is carried out, and the defect that the evaluation dimension of the stitching effect of the vehicle-mounted 360-degree panoramic image is too single in the prior art is effectively overcome.

Description

360-degree panoramic stitching image analysis and evaluation method
Technical Field
The invention belongs to the technical field of spliced image evaluation, and particularly relates to a 360-degree panoramic stitched image analysis and evaluation method.
Background
In recent years, with the rapid development of road transportation and transportation industries, the number of automobiles is rapidly increased, and the problems of driving safety, traffic disputes and the like are increasingly severe. Under the condition, the automobile data recorder is more and more concerned by drivers, and the automobile data recorder is arranged on the automobile, so that the automobile data recorder not only can clearly record the accident occurrence process when an accident occurs, but also has the function of assisting driving.
The auxiliary driving function is realized through a 360-degree panoramic image system, the specific operation mode is that a plurality of cameras arranged on an automobile body are utilized to collect environment images in a shooting angle range, the collected environment images are subjected to all-round view splicing to obtain 360-degree panoramic images, vehicle-mounted display is carried out, a driver can eliminate a visual field blind area through the 360-degree panoramic images of the vehicle-mounted display in the driving process, auxiliary driving is realized, accordingly, the splicing processing mode and the splicing effect of the vehicle-mounted 360-degree panoramic images directly determine the auxiliary utility of the auxiliary driving, and the analysis and evaluation of the splicing processing mode and the splicing effect of the vehicle-mounted 360-degree panoramic images are particularly important.
In the process of implementing the present application, the inventor finds that the following problems exist in the prior art: 1. at present, when vehicle-mounted 360-degree panoramic images are spliced, on one hand, unified regulation and control of focal lengths of all cameras before splicing are lacked, the view finding range corresponding to each camera cannot be consistent, huge obstacles are brought to subsequent splicing, the splicing difficulty is greatly increased, on the other hand, the cameras are limited by the arrangement positions on a vehicle body, overlapped shooting angle intervals inevitably exist, and then environment images shot by the cameras have overlapped area images, the situation relates to taking of the overlapped area images, at present, the overlapped area images are taken and processed due to the lack of scientific objective taking basis, taking efficiency is reduced, the taken overlapped area images are often not high enough, and the integral splicing effect of the vehicle-mounted 360-degree panoramic images is influenced.
2. The current splicing effect evaluation to vehicle-mounted 360-degree panoramic images mostly centers on the splicing smoothness of splicing positions, for example, whether the splicing has dislocation or not is neglected, the influence of the splicing uniformity on the splicing effect is ignored, the splicing effect evaluation dimension is too single, the splicing effect cannot be comprehensively and accurately reflected by the evaluation result, and the available value of the evaluation result is not high.
Disclosure of Invention
In order to overcome the defects, the invention builds a method for analyzing and evaluating the 360-degree panoramic stitching image, and the specific technical scheme is as follows: a360-degree panoramic stitching image analysis and evaluation method comprises the following steps: s1: and counting the number of the cameras on the target automobile, numbering the cameras according to a preset sequence, and acquiring the arrangement directions of the cameras.
S2: and setting an effective view-finding range corresponding to the target automobile.
S3: and acquiring the visual angle range corresponding to each camera, and combining the visual angle range with the arrangement direction of each camera to acquire the shooting angle interval of each camera in one circle.
S4: and comparing the shooting angle intervals corresponding to the cameras, screening the overlapped shooting angle intervals, recording the cameras contained in the overlapped shooting angle intervals as the characteristic cameras, and recording the serial numbers of the characteristic cameras contained in the overlapped shooting angle intervals.
S5: and adjusting and controlling the focal length of each camera according to the effective view-finding range corresponding to the target automobile, and acquiring the environment image in the corresponding shooting angle interval by each camera after the adjustment and control are completed.
S6: and analyzing the optimal camera corresponding to the overlapped shooting angle interval.
S7: and performing primary processing on the environment image in the shooting angle interval corresponding to each camera by using the optimal camera corresponding to the overlapped shooting angle interval to obtain an effective environment image in the shooting angle interval corresponding to each camera.
S8: and carrying out 360-degree all-round view splicing on the effective environment images in the shooting angle interval corresponding to each camera to obtain the panoramic image in the effective view-finding range corresponding to the target automobile.
S9: and analyzing the splicing smoothness and the splicing uniformity of the panoramic image in the effective viewing range corresponding to the target automobile so as to comprehensively evaluate the splicing effect coefficient of the panoramic image in the effective viewing range corresponding to the target automobile.
In a further technical solution, the specific implementation manner corresponding to S2 is: and taking the center point of the body of the target automobile as the center of a circle and the set distance as the radius to make a circle, wherein the area in the circle is the effective view finding range corresponding to the target automobile.
In a further technical solution, the specific screening method for screening out the overlapping shooting angle sections in S4 is to compare the shooting angle sections corresponding to the cameras, determine whether there are some shooting angle sections that are partially consistent, and if so, mark the partially consistent shooting angle sections as the overlapping shooting angle sections, thereby screening out the overlapping shooting angle sections from the shooting angle sections corresponding to the cameras.
In a further technical scheme, in the S5, the specific operation mode corresponding to the focal length of each camera is adjusted and controlled according to the effective view range corresponding to the target vehicle is to use the set distance as an effective shooting distance, and then match the effective shooting distance with the adaptive focal lengths corresponding to the preset various shooting distances, and match the adaptive focal lengths corresponding to the effective shooting distances from the effective shooting distances, so as to adjust and control the focal lengths of the cameras accordingly.
In a further technical solution, the specific implementation manner corresponding to S6 refers to the following steps: s61: and extracting the environment image in the corresponding shooting angle interval by each characteristic camera based on the serial number of each characteristic camera included in the overlapped shooting angle interval.
S62: and separating the environment images in the overlapped shooting angle interval from the environment images in the shooting angle interval corresponding to each characteristic camera.
S63: and respectively extracting quality indexes of the environment images in the corresponding overlapped shooting angle intervals of the characteristic cameras, wherein the quality indexes comprise resolution, color depth and signal-to-noise ratio.
S64: quality indexes corresponding to the environmental images in the overlapped shooting angle intervals corresponding to the characteristic cameras form an environmental image quality index set
Figure 910241DEST_PATH_IMAGE001
Figure 714249DEST_PATH_IMAGE002
The representation is the quality index of the environment image in the overlapping shooting angle interval corresponding to the jth characteristic camera, j is the number of the characteristic camera, j =1, 2.. The m, w is the quality index, w = r1 or r2 or r3, wherein r1, r2 and r3 are respectively represented as resolution, color depth and signal-to-noise ratio.
S65: comparing the set of the environmental image quality indexes with the set standard image quality indexes, calculating the quality index of the environmental image in the corresponding overlapped shooting angle interval of each characteristic camera, and recording the quality index as the quality index
Figure 364542DEST_PATH_IMAGE003
Figure 912198DEST_PATH_IMAGE004
In which
Figure 98591DEST_PATH_IMAGE005
Figure 807921DEST_PATH_IMAGE006
Figure 211090DEST_PATH_IMAGE007
Are respectively expressed as the j-th featureThe camera correspondingly overlaps the resolution, color depth and signal-to-noise ratio of the environment image in the shooting angle interval,
Figure 296857DEST_PATH_IMAGE008
Figure 852604DEST_PATH_IMAGE009
Figure 220918DEST_PATH_IMAGE010
respectively expressed as standard resolution, standard color depth and standard signal-to-noise ratio, respectively expressed as ratio factors corresponding to preset resolution, color depth and signal-to-noise ratio, and e expressed as a natural constant.
S66: and comparing the quality indexes of the environment images in the overlapped shooting angle intervals corresponding to the characteristic cameras, and screening the characteristic camera with the maximum quality index as the optimal camera corresponding to the overlapped shooting angle interval.
In a further technical solution, in the step S7, the environment image in the shooting angle interval corresponding to each camera is preliminarily processed by using the preferred camera corresponding to the overlapping shooting angle interval, and a corresponding specific processing method is as follows: (1) And acquiring the number of the preferred camera, extracting the characteristic cameras except the preferred camera from the characteristic cameras included in the overlapped shooting angle interval according to the number, and recording the extracted characteristic cameras as the designated cameras.
(2) And performing segmentation and elimination processing on the environment images in the overlapped shooting angle sections from the environment images in the shooting angle sections corresponding to the appointed cameras.
In a further technical solution, the specific operation method corresponding to S9 is as follows: s91: and marking the splicing positions of the panoramic images corresponding to the target automobile in the effective viewing range, and sequentially numbering the marked splicing positions as 1,2, i, n according to a set sequence.
S92: analyzing the splicing fluency corresponding to each splicing position and recording the splicing fluency as
Figure 862115DEST_PATH_IMAGE011
S93: and analyzing the splicing uniformity corresponding to each splicing position and recording the uniformity as
Figure 841DEST_PATH_IMAGE012
S94: splicing quality coefficient corresponding to each splicing position is evaluated based on splicing smoothness and splicing uniformity corresponding to each splicing position
Figure 145514DEST_PATH_IMAGE013
Wherein
Figure 212959DEST_PATH_IMAGE014
And A and B respectively represent weight factors corresponding to preset splicing fluency and splicing uniformity.
S95: carrying out mean value processing on the splicing quality coefficients corresponding to the splicing positions to obtain an average splicing quality coefficient of the panoramic image in the effective viewing range corresponding to the target automobile, and recording the average splicing quality coefficient as the average splicing quality coefficient
Figure 75872DEST_PATH_IMAGE015
S96: extracting the maximum splicing quality coefficient from the splicing quality coefficients corresponding to all the splicing positions
Figure 487131DEST_PATH_IMAGE016
And minimum splice quality factor
Figure 17470DEST_PATH_IMAGE017
And substituting it into the splicing quality fluctuation index
Figure 505083DEST_PATH_IMAGE018
Obtaining the splicing quality fluctuation index of the panoramic image in the corresponding effective viewing range of the target automobile
Figure 337516DEST_PATH_IMAGE019
S97: efficient fetching based on target car correspondenceAverage splicing quality coefficient and splicing quality fluctuation index of panoramic images in the scene range comprehensively evaluate splicing effect coefficient of panoramic images in the effective viewing range corresponding to the target automobile
Figure 568777DEST_PATH_IMAGE020
Figure 202890DEST_PATH_IMAGE021
Wherein
Figure 595825DEST_PATH_IMAGE022
And expressing the occupation factor corresponding to the set average splicing quality coefficient.
In a further technical solution, the S92 specifically includes the following steps: s921: and sequentially focusing the panoramic image in the effective viewing range corresponding to the target automobile at each splicing position, and extracting the width of the splicing gap corresponding to each splicing position.
S922: comparing the width of the splicing gap corresponding to each splicing position with a preset allowable splicing gap width threshold value, and calculating the splicing tightness corresponding to each splicing position according to the calculation formula
Figure 698910DEST_PATH_IMAGE023
Figure 219016DEST_PATH_IMAGE024
Expressed as the closeness of the splice corresponding to the ith splice location,
Figure 192788DEST_PATH_IMAGE025
indicated as the splice gap width corresponding to the ith splice location,
Figure 271471DEST_PATH_IMAGE026
indicated as a set allowed splice gap width threshold.
S923: and sequentially focusing the panoramic image in the effective viewing range corresponding to the target automobile at each splicing position, thereby identifying and forming two environment images corresponding to each splicing position, and extracting a splicing boundary line corresponding to each splicing position from the two environment images.
S924: and respectively extracting the outline of the appearance of the shot from the two environment images corresponding to the splicing positions, and marking the intersection points of the outline of the appearance of the shot and the splicing boundary line in the two environment images corresponding to the splicing positions, so as to obtain two intersection points on the splicing boundary line corresponding to the splicing positions.
S925: obtaining the distance between two intersection points on the splicing boundary line corresponding to each splicing position, and substituting the distance into the splicing dislocation
Figure 330694DEST_PATH_IMAGE027
Calculating the splicing dislocation corresponding to each splicing position
Figure 481639DEST_PATH_IMAGE028
Wherein
Figure 575497DEST_PATH_IMAGE029
Expressed as the distance between two intersection points on the splicing boundary line corresponding to the ith splicing position,
Figure 575814DEST_PATH_IMAGE030
denoted as a predefined reference intersection distance.
S926: the splicing compactness and the splicing dislocation corresponding to each splicing position are analyzed through a splicing fluency analysis formula
Figure 371600DEST_PATH_IMAGE031
And analyzing the splicing fluency corresponding to each splicing position.
In a further technical solution, the S93 specifically includes the following steps:
s931: whether the same shot object exists on the two sides of the splicing boundary line corresponding to each splicing position or not is distinguished, if the same shot object exists on the two sides of the splicing boundary line corresponding to a certain splicing position, the splicing position is recorded as a same-object splicing position, the analysis process of the splicing uniformity degree corresponding to the same-object splicing position is executed in S932-S933, otherwise, the splicing position is recorded as a foreign-object splicing position, and the analysis process of the splicing uniformity degree corresponding to the foreign-object splicing position is executed in S934.
S932: and marking areas where the same shot object exists in the environment images on the two sides of the splicing boundary line corresponding to the splicing position of the same object to obtain associated areas on the two sides of the splicing boundary line corresponding to the splicing position of the same object, and further extracting the image definition and the image color chromaticity.
S933: respectively and correspondingly comparing the image definition and the image color chroma of the related areas at the two sides of the splicing boundary line corresponding to the splicing position of the same object to obtain an image definition difference value and an image color chroma difference value, and passing the image definition difference value and the image color chroma difference value through a splicing uniformity analysis formula of the splicing position of the same object
Figure 217196DEST_PATH_IMAGE032
To obtain the splicing uniformity corresponding to the splicing position of the same object
Figure 181873DEST_PATH_IMAGE033
Wherein
Figure 353091DEST_PATH_IMAGE034
Figure 636174DEST_PATH_IMAGE035
Respectively representing the image definition difference and the image color chroma difference of the related areas at the two sides of the splicing boundary line corresponding to the splicing position of the same object,
Figure 551040DEST_PATH_IMAGE036
Figure 367293DEST_PATH_IMAGE037
the values are expressed as a set allowable image sharpness difference value and an allowable image color chromaticity difference value, respectively.
S934: extracting image definition from regions on two sides of a splicing boundary line corresponding to the foreign matter splicing position, carrying out difference value comparison, and further enabling a comparison result to pass through a foreign matter splicing position splicing uniformity analysis formula
Figure 443833DEST_PATH_IMAGE038
To obtain a foreign body splicing positionCorresponding splicing uniformity
Figure 964944DEST_PATH_IMAGE039
Wherein
Figure 932769DEST_PATH_IMAGE040
And the difference value of the image definition of the areas at two sides of the splicing boundary line corresponding to the splicing position of the foreign matter is expressed.
In a further technical solution, the step S9 further includes identifying a low-quality splicing position, and displaying a serial number of the low-quality splicing position in the background, where the specific identification is to compare a splicing quality coefficient corresponding to each splicing position with a set standard splicing quality coefficient, and if the splicing quality coefficient corresponding to a certain splicing position is smaller than the set standard splicing quality coefficient, marking the splicing position as the low-quality splicing position.
By combining all the technical schemes, the invention has the advantages and positive effects that: (1) According to the method, before the vehicle-mounted 360-degree panoramic image splicing operation is carried out, the effective viewing range of the target automobile is set, and the focal lengths of the cameras on the body of the target automobile are uniformly regulated and controlled, so that the consistency of the corresponding viewing ranges of the cameras is improved to a certain extent, the subsequent splicing difficulty is greatly reduced, and the method has the characteristic of high practicability.
(2) When the overlapped area images are taken, scientific and objective taking basis is provided, the taking efficiency is improved, the high quality degree of the taken overlapped area images is improved to the maximum extent, and basic guarantee is provided for the integral splicing effect of the subsequent vehicle-mounted 360-degree panoramic images.
(3) According to the method and the device, when the splicing effect of the vehicle-mounted 360-degree panoramic image is evaluated, the influence of the splicing smoothness and the splicing uniformity on the splicing effect is fully considered, so that comprehensive evaluation is performed, the defect that the evaluation dimension of the splicing effect of the vehicle-mounted 360-degree panoramic image is too single in the prior art is effectively overcome, the splicing effect can be comprehensively and accurately reflected by the evaluation result, and the available value of the evaluation result is improved.
Drawings
The invention is further illustrated by means of the attached drawings, but the embodiments in the drawings do not constitute any limitation to the invention, and for a person skilled in the art, other drawings can be obtained on the basis of the following drawings without inventive effort.
FIG. 1 is a flow chart of the method steps of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, the invention provides a method for analyzing and evaluating a 360-degree panoramic stitching image, which comprises the following steps: s1: and counting the number of the cameras on the target automobile, numbering the cameras according to a preset sequence, and acquiring the arrangement directions of the cameras.
S2: the method comprises the following specific implementation modes of setting an effective view-finding range corresponding to a target automobile: and taking the central point of the body of the target automobile as the center of a circle and the set distance as the radius to make a circle, wherein the area in the circle is the effective viewing range corresponding to the target automobile.
S3: and acquiring the visual angle range corresponding to each camera, and combining the visual angle range with the arrangement direction of each camera to acquire the shooting angle interval of each camera in a circle.
For example, the shooting angle interval of the camera within one circle can be
Figure 855726DEST_PATH_IMAGE041
S4: the method comprises the steps of comparing shooting angle intervals corresponding to all cameras, screening overlapping shooting angle intervals from the shooting angle intervals, and judging whether the shooting angle intervals are partially consistent or not, if so, marking the partially consistent shooting angle intervals as overlapping shooting angle intervals, screening the overlapping shooting angle intervals from the shooting angle intervals corresponding to all cameras, marking the cameras contained in the overlapping shooting angle intervals as characteristic cameras, and recording the serial numbers of all characteristic cameras contained in the overlapping shooting angle intervals.
Illustratively, a certain camera corresponds to a shooting angle interval of
Figure 119479DEST_PATH_IMAGE042
The shooting angle interval corresponding to a certain camera is
Figure 127887DEST_PATH_IMAGE043
The shooting angle intervals corresponding to the two cameras have overlapped shooting angle intervals, namely the shooting angle intervals are
Figure 368244DEST_PATH_IMAGE044
The two cameras are
Figure 411286DEST_PATH_IMAGE045
And a corresponding feature camera.
S5: the focal length of each camera is regulated and controlled according to the effective view finding range corresponding to the target automobile, the specific operation mode is that the set distance is used as the effective shooting distance, the effective shooting distance is matched with the adaptive focal length corresponding to the preset various shooting distances, the adaptive focal length corresponding to the effective shooting distance is matched from the effective shooting distance, therefore, the focal length regulation and control of each camera are carried out, and the environment image in the corresponding shooting angle interval is collected by each camera after the regulation and control are completed.
According to the embodiment of the invention, the effective viewing range of the target automobile is set before the vehicle-mounted 360-degree panoramic image splicing operation is carried out, and the focal lengths of the cameras on the body of the target automobile are uniformly regulated and controlled, so that the consistency of the corresponding viewing ranges of the cameras is improved to a certain extent, the subsequent splicing difficulty is greatly reduced, and the method has the characteristic of strong practicability.
S6: the method for analyzing the optimal camera corresponding to the overlapped shooting angle interval specifically comprises the following steps: s61: and extracting the environment image in the corresponding shooting angle interval by each characteristic camera based on the serial number of each characteristic camera included in the overlapped shooting angle interval.
S62: separating the environmental image in the overlapping shooting angle interval from the environmental image in the shooting angle interval corresponding to each characteristic camera, wherein the specific separation operation mode is to extract angle values at two ends from the overlapping shooting angle interval, for example, a certain overlapping shooting angle interval is
Figure 829629DEST_PATH_IMAGE046
If the two-end angle value in the overlapping shooting angle interval is 40 degrees and 60 degrees, marking the two-end angle value in the environment image in the corresponding shooting angle interval of each feature camera included in the overlapping shooting angle interval, and separating the environment image between the two-end angle value as the environment image in the corresponding overlapping shooting angle interval of each feature camera.
S63: and respectively extracting quality indexes of the environment images in the corresponding overlapped shooting angle intervals of the characteristic cameras, wherein the quality indexes comprise resolution, color depth and signal-to-noise ratio.
S64: forming an environment image quality index set by quality indexes corresponding to the environment images in the corresponding overlapped shooting angle intervals of the characteristic cameras
Figure 78995DEST_PATH_IMAGE047
Figure 873775DEST_PATH_IMAGE048
The representation is the quality index of the environment image in the overlapping shooting angle interval corresponding to the jth characteristic camera, j is the number of the characteristic camera, j =1, 2.. The m, w is the quality index, w = r1 or r2 or r3, wherein r1, r2 and r3 are respectively represented as resolution, color depth and signal-to-noise ratio.
S65: comparing the environmental image quality index set with the set standard image quality index, and calculatingThe high-quality index of the environment image in the corresponding overlapped shooting angle interval of each characteristic camera is recorded as
Figure 20592DEST_PATH_IMAGE049
Figure 609836DEST_PATH_IMAGE050
Wherein
Figure 77989DEST_PATH_IMAGE051
Figure 410881DEST_PATH_IMAGE052
Figure 677783DEST_PATH_IMAGE053
Respectively expressed as the resolution, color depth and signal-to-noise ratio of the environment image in the corresponding overlapped shooting angle interval of the jth characteristic camera,
Figure 703508DEST_PATH_IMAGE054
Figure 656027DEST_PATH_IMAGE055
Figure 58190DEST_PATH_IMAGE056
the resolution, the color depth and the signal-to-noise ratio of the environment image in the overlapped shooting angle interval are larger, and the quality index of the environment image in the overlapped shooting angle interval corresponding to the characteristic camera is larger.
The method is applied to the embodiment, the analysis of the quality index of the environment image in the overlapped shooting angle interval integrates the influence of three parameters including resolution, color depth and signal-to-noise ratio on the quality index, the analysis result realizes the quantification of the quality condition of multiple parameters of the environment image in the overlapped shooting angle interval, and compared with the method for analyzing the quality index by using a single parameter, the analysis method has great advantages and can provide reliable reference basis for screening of the optimal camera.
S66: and comparing the quality indexes of the environment images in the overlapped shooting angle intervals corresponding to the characteristic cameras, and screening out the characteristic camera with the maximum quality index as the optimal camera corresponding to the overlapped shooting angle interval.
On the basis of the scheme, the environment image in the overlapped shooting angle interval corresponding to the optimal camera can be determined to be used as the taking object of the overlapped area image by screening the optimal camera.
When the embodiment of the invention carries out taking processing on the images of the overlapping area in the environmental image, a scientific and objective taking basis is provided for the images of the overlapping area, the taking efficiency is improved, the high quality degree of the taken images of the overlapping area is improved to the maximum extent, and a basic guarantee is provided for the integral splicing effect of the subsequent vehicle-mounted 360-degree panoramic image.
S7: and performing primary processing on the environment image in the shooting angle interval corresponding to each camera by using the optimal camera corresponding to the overlapped shooting angle interval to obtain an effective environment image in the shooting angle interval corresponding to each camera.
The specific processing method corresponding to the preliminary treatment is as follows: (1) And acquiring the number of the preferred camera, extracting the characteristic cameras except the preferred camera from the characteristic cameras included in the overlapped shooting angle interval according to the number, and recording the extracted characteristic cameras as the designated cameras.
(2) And performing segmentation and elimination processing on the environment images in the overlapped shooting angle interval from the environment images in the shooting angle interval corresponding to each appointed camera.
S8: and sequencing the effective environment images in the shooting angle interval corresponding to each camera according to the sequence of the shooting angle interval from small to large to obtain the sequencing result of the effective environment images in the shooting angle interval corresponding to each camera, and performing 360-degree all-round stitching based on the sequencing result to obtain the panoramic image in the effective viewing range corresponding to the target automobile.
S9: the method comprises the following steps of analyzing the splicing smoothness and the splicing uniformity of a panoramic image in an effective viewing range corresponding to a target automobile so as to comprehensively evaluate the splicing effect coefficient of the panoramic image in the effective viewing range corresponding to the target automobile, wherein the specific operation method comprises the following steps: s91: and marking the splicing positions of the panoramic images corresponding to the target automobile in the effective viewing range, and sequentially numbering the marked splicing positions as 1,2, i, n according to a set sequence.
S92: analyzing the splicing fluency corresponding to each splicing position and recording the splicing fluency as
Figure 664752DEST_PATH_IMAGE057
The method specifically comprises the following steps: s921: and sequentially focusing the panoramic image in the effective viewing range corresponding to the target automobile at each splicing position, and extracting the width of the splicing gap corresponding to each splicing position.
S922: comparing the width of the splicing gap corresponding to each splicing position with a preset allowable splicing gap width threshold value, and calculating the splicing tightness corresponding to each splicing position according to the calculation formula
Figure 110645DEST_PATH_IMAGE023
Figure 802658DEST_PATH_IMAGE058
Expressed as the closeness of the splice corresponding to the ith splice location,
Figure 24823DEST_PATH_IMAGE059
indicated as the splice gap width corresponding to the ith splice location,
Figure 485891DEST_PATH_IMAGE060
expressed as a set allowable splice gap width threshold, where the larger the splice gap width, the less tight the splice.
S923: and sequentially focusing the panoramic image in the effective viewing range corresponding to the target automobile at each splicing position, thereby identifying and forming two environment images corresponding to each splicing position, and extracting a splicing boundary line corresponding to each splicing position from the two environment images.
S924: and respectively extracting the outline of the appearance of the shot from the two environment images corresponding to the splicing positions, and marking the intersection points of the outline of the appearance of the shot and the splicing boundary line in the two environment images corresponding to the splicing positions, so as to obtain two intersection points on the splicing boundary line corresponding to the splicing positions.
S925: obtaining the distance between two intersection points on the splicing boundary line corresponding to each splicing position, and substituting the distance into the splicing dislocation
Figure 837107DEST_PATH_IMAGE061
Calculating the splicing dislocation corresponding to each splicing position
Figure 281995DEST_PATH_IMAGE062
In which
Figure 760381DEST_PATH_IMAGE063
Expressed as the distance between two intersection points on the splicing boundary line corresponding to the ith splicing position,
Figure 83478DEST_PATH_IMAGE064
expressed as a predefined reference intersection distance, wherein the larger the distance between two intersection points on the stitching boundary line, the larger the stitching dislocation.
S926: the splicing compactness and the splicing dislocation corresponding to each splicing position are analyzed through a splicing fluency analysis formula
Figure 887486DEST_PATH_IMAGE031
And analyzing the splicing fluency corresponding to each splicing position.
The influence of the splicing compactness on the splicing fluency in the splicing fluency analysis formula is positive influence, and the influence of the splicing dislocation on the splicing fluency is negative influence.
S93: and analyzing the splicing uniformity corresponding to each splicing position and recording the uniformity as
Figure 537779DEST_PATH_IMAGE065
The method specifically comprises the following steps: s931: identifying splice positionsWhether the same shot object exists on two sides of the corresponding splicing boundary line or not is set, if the same shot object exists on two sides of the splicing boundary line corresponding to a certain splicing position or not, the splicing position is recorded as a same-object splicing position, the analysis process of the splicing uniformity degree corresponding to the same-object splicing position is executed in S932-S933, otherwise, the splicing position is recorded as a foreign-object splicing position, and the analysis process of the splicing uniformity degree corresponding to the foreign-object splicing position is executed in S934.
S932: and marking areas where the same shot object exists in the environment images on the two sides of the splicing boundary line corresponding to the splicing position of the same object to obtain associated areas on the two sides of the splicing boundary line corresponding to the splicing position of the same object, and further extracting the image definition and the image color chromaticity.
S933: respectively and correspondingly comparing the image definition and the image color chroma of the related areas at the two sides of the splicing boundary line corresponding to the splicing position of the same object to obtain an image definition difference value and an image color chroma difference value, and passing the image definition difference value and the image color chroma difference value through a splicing uniformity analysis formula of the splicing position of the same object
Figure 85435DEST_PATH_IMAGE066
To obtain the splicing uniformity corresponding to the splicing position of the same object
Figure 6249DEST_PATH_IMAGE067
Wherein
Figure 981158DEST_PATH_IMAGE068
Figure 869480DEST_PATH_IMAGE069
Respectively representing the image definition difference and the image color chroma difference of the related areas at the two sides of the splicing boundary line corresponding to the splicing position of the same object,
Figure 204515DEST_PATH_IMAGE070
Figure 494682DEST_PATH_IMAGE071
respectively expressed as the set allowable image definition difference value and the allowable image color chroma difference value, wherein the imageThe smaller the definition difference is, the smaller the image color chroma difference is, and the larger the splicing uniformity is.
S934: extracting image definition from regions on two sides of a splicing boundary line corresponding to the foreign matter splicing position, carrying out difference value comparison, and further enabling a comparison result to pass through a foreign matter splicing position splicing uniformity analysis formula
Figure 122716DEST_PATH_IMAGE072
To obtain the splicing uniformity corresponding to the splicing position of the foreign matters
Figure 763913DEST_PATH_IMAGE073
In which
Figure 637060DEST_PATH_IMAGE074
And the image definition difference of the areas on two sides of the splicing boundary line corresponding to the splicing position of the foreign matter is expressed.
It should be noted that the analysis of the stitching uniformity is based on the stitched images, and the characteristics of the stitched images are that the images are stitched together, and each of the images is collected by a different camera, so that there is a difference of image viewing parameters, such as definition, and the larger the difference of the image viewing parameters is, the visual perception is reflected on the visual perception, so that visual contrast is caused, and the viewing perception is affected, and meanwhile, in the process of stitching uniformity analysis, it is considered that there are the same shots on both sides of the stitching boundary line corresponding to some stitching positions, so that the image viewing parameters include not only definition but also color chromaticity, and in general, the color chromaticity presented by the same shots in the image should be consistent, when there is a difference of color chromaticity presented by the same shots on both sides of the stitching boundary line, the color discontinuity appears, and the larger the difference of color chromaticity is, the higher the degree of color discontinuity is, so as to give people a distorted visual perception.
S94: splicing quality coefficient corresponding to each splicing position is evaluated based on splicing smoothness and splicing uniformity corresponding to each splicing position
Figure 47313DEST_PATH_IMAGE075
Therein is disclosedIn (1)
Figure 364025DEST_PATH_IMAGE076
A and B are respectively expressed as weight factors corresponding to preset splicing fluency and splicing uniformity, wherein
Figure 977671DEST_PATH_IMAGE077
Is taken as
Figure 405241DEST_PATH_IMAGE078
Or
Figure 919268DEST_PATH_IMAGE079
According to the embodiment of the invention, when the splicing effect of the vehicle-mounted 360-degree panoramic image is evaluated, the influence of the splicing smoothness and the splicing uniformity on the splicing effect is fully considered, so that comprehensive evaluation is carried out, the defect that the evaluation dimension of the splicing effect of the vehicle-mounted 360-degree panoramic image in the prior art is too single is effectively overcome, the splicing effect can be comprehensively and accurately reflected by the evaluation result, and the available value of the evaluation result is improved.
S95: carrying out mean value processing on the splicing quality coefficients corresponding to the splicing positions to obtain an average splicing quality coefficient of the panoramic image in the effective viewing range corresponding to the target automobile, and recording the average splicing quality coefficient as the average splicing quality coefficient
Figure 141302DEST_PATH_IMAGE080
Wherein
Figure 245174DEST_PATH_IMAGE081
And n represents the number of splice locations.
S96: extracting the maximum splicing quality coefficient from the splicing quality coefficients corresponding to all the splicing positions
Figure 476435DEST_PATH_IMAGE082
And minimum splice quality factor
Figure 861280DEST_PATH_IMAGE083
And substituting it into the splicing quality fluctuation index
Figure 503483DEST_PATH_IMAGE084
Obtaining the splicing quality fluctuation index of the panoramic image in the corresponding effective viewing range of the target automobile
Figure 340989DEST_PATH_IMAGE085
And if the difference between the maximum splicing quality coefficient and the minimum splicing quality coefficient is larger, the splicing quality fluctuation index is larger, which indicates that the splicing quality distribution corresponding to each splicing position is more unbalanced.
S97: comprehensively evaluating the splicing effect coefficient of the panoramic image in the corresponding effective viewing range of the target automobile based on the average splicing quality coefficient and the splicing quality fluctuation index of the panoramic image in the corresponding effective viewing range of the target automobile
Figure 861094DEST_PATH_IMAGE086
Figure 100445DEST_PATH_IMAGE087
In which
Figure 179129DEST_PATH_IMAGE088
And expressing the occupation factor corresponding to the set average splicing quality coefficient.
In the specific embodiment of the invention, the evaluation of the splicing effect of the panoramic image in the corresponding effective viewing range of the target automobile not only considers the overall splicing quality condition of the panoramic image, but also considers the splicing quality distribution equilibrium condition of the panoramic image, thereby realizing the dual evaluation of the overall and individual differences, avoiding the defect of one-sided evaluation existing in the process of only performing overall evaluation, and ensuring that the evaluation result can meet the use requirement of a driver to the maximum extent.
In the basis of the above scheme, the step S9 further includes identifying a low-quality splicing position, and displaying the number of the low-quality splicing position in the background, where the specific identification manner is to compare the splicing quality coefficient corresponding to each splicing position with a set standard splicing quality coefficient, and if the splicing quality coefficient corresponding to a certain splicing position is smaller than the set standard splicing quality coefficient, then the splicing position is recorded as the low-quality splicing position.
According to the embodiment of the invention, through identifying the low-quality splicing position, a targeted processing target can be provided for the improper splicing processing of the panoramic image in the corresponding effective viewing range of the target automobile, so that the blind processing is avoided to a certain extent, and the processing efficiency is greatly improved.
The foregoing is merely exemplary and illustrative of the present invention and various modifications, additions and substitutions may be made by those skilled in the art to the specific embodiments described without departing from the scope of the invention as defined in the following claims.

Claims (7)

1. A360-degree panoramic stitching image analysis and evaluation method is characterized by comprising the following steps:
s1: counting the number of cameras on a target automobile, numbering the cameras according to a preset sequence, and acquiring the arrangement directions of the cameras;
s2: setting an effective view finding range corresponding to a target automobile;
s3: acquiring a visual angle range corresponding to each camera, and combining the visual angle range with the arrangement direction of each camera to acquire a shooting angle interval of each camera in a circle;
s4: comparing shooting angle intervals corresponding to all cameras, screening overlapping shooting angle intervals, recording the cameras contained in the overlapping shooting angle intervals as characteristic cameras, and recording the serial numbers of all the characteristic cameras contained in the overlapping shooting angle intervals;
s5: adjusting and controlling the focal length of each camera according to the effective view finding range corresponding to the target automobile, and acquiring an environment image in a corresponding shooting angle interval by each camera after the adjustment and control are completed;
s6: analyzing the optimal camera corresponding to the overlapped shooting angle interval;
the specific implementation manner of S6 is as follows:
s61: extracting an environment image in the corresponding shooting angle interval by each characteristic camera based on the serial number of each characteristic camera contained in the overlapping shooting angle interval;
s62: separating the environment images in the overlapped shooting angle interval from the environment images in the shooting angle interval corresponding to each characteristic camera;
s63: respectively extracting quality indexes of the environment images in the corresponding overlapped shooting angle intervals of the characteristic cameras, wherein the quality indexes comprise resolution, color depth and signal-to-noise ratio;
s64: forming an environment image quality index set by quality indexes corresponding to the environment images in the corresponding overlapped shooting angle intervals of the characteristic cameras
Figure DEST_PATH_IMAGE002
Figure DEST_PATH_IMAGE004
Expressing the quality index of the environment image in the overlapping shooting angle interval corresponding to the jth characteristic camera, wherein j is the number of the characteristic camera, j =1, 2.. The m, w is the quality index, and w = r1 or r2 or r3, wherein r1, r2 and r3 are respectively expressed as resolution, color depth and signal-to-noise ratio;
s65: comparing the environmental image quality index set with the set standard image quality index, calculating the quality index of the environmental image in the corresponding overlapped shooting angle interval of each characteristic camera, and recording the quality index as the quality index
Figure DEST_PATH_IMAGE006
Figure DEST_PATH_IMAGE008
Wherein
Figure DEST_PATH_IMAGE010
Figure DEST_PATH_IMAGE012
Figure DEST_PATH_IMAGE014
Respectively expressed as the resolution, color depth and signal-to-noise ratio of the environment image in the corresponding overlapped shooting angle interval of the jth characteristic camera,
Figure DEST_PATH_IMAGE016
Figure DEST_PATH_IMAGE018
Figure DEST_PATH_IMAGE020
respectively representing the resolution, the color depth and the signal-to-noise ratio of the standard, respectively representing a, b and c as proportion factors corresponding to preset resolution, color depth and signal-to-noise ratio, and representing e as a natural constant;
s66: comparing the quality indexes of the environment images in the overlapped shooting angle intervals corresponding to the characteristic cameras, and screening out the characteristic camera with the maximum quality index as a preferred camera corresponding to the overlapped shooting angle interval;
s7: performing primary processing on the environment image in the shooting angle interval corresponding to each camera by using the optimal camera corresponding to the overlapping shooting angle interval to obtain an effective environment image in the shooting angle interval corresponding to each camera;
s8: performing 360-degree all-round stitching on the effective environment images in the shooting angle interval corresponding to each camera to obtain panoramic images in the effective view-finding range corresponding to the target automobile;
s9: analyzing the splicing smoothness and the splicing uniformity of the panoramic image in the effective viewing range corresponding to the target automobile so as to comprehensively evaluate the splicing effect coefficient of the panoramic image in the effective viewing range corresponding to the target automobile;
the specific operation method corresponding to the step S9 is as follows:
s91: splicing position marks are carried out on the panoramic image corresponding to the target automobile in the effective viewing range, and the spliced positions of the marks are numbered as 1,2,. Multidot.i,. Multidot.n in sequence according to a set sequence;
s92: and analyzing the splicing fluency corresponding to each splicing position and recording the splicing fluency as
Figure DEST_PATH_IMAGE022
S93: and analyzing the splicing uniformity corresponding to each splicing position and recording the uniformity as
Figure DEST_PATH_IMAGE024
S94: evaluating the splicing quality coefficient corresponding to each splicing position based on the splicing fluency and splicing uniformity corresponding to each splicing position
Figure DEST_PATH_IMAGE026
Wherein
Figure DEST_PATH_IMAGE028
A and B respectively represent weight factors corresponding to preset splicing fluency and splicing uniformity;
s95: carrying out mean value processing on the splicing quality coefficients corresponding to the splicing positions to obtain an average splicing quality coefficient of the panoramic image in the effective view finding range corresponding to the target automobile, and recording the average splicing quality coefficient as
Figure DEST_PATH_IMAGE030
S96: extracting the maximum splicing quality coefficient from the splicing quality coefficients corresponding to all the splicing positions
Figure DEST_PATH_IMAGE032
And minimum splice quality factor
Figure DEST_PATH_IMAGE034
And substituting it into the splicing quality fluctuation index
Figure DEST_PATH_IMAGE036
Obtaining the splicing quality fluctuation index of the panoramic image in the corresponding effective viewing range of the target automobile
Figure DEST_PATH_IMAGE038
S97: comprehensively evaluating the splicing effect coefficient of the panoramic image in the corresponding effective viewing range of the target automobile based on the average splicing quality coefficient and the splicing quality fluctuation index of the panoramic image in the corresponding effective viewing range of the target automobile
Figure DEST_PATH_IMAGE040
Figure DEST_PATH_IMAGE042
Wherein
Figure DEST_PATH_IMAGE044
Expressing the ratio factor corresponding to the set average splicing quality coefficient;
the S93 specifically includes the following steps:
s931: judging whether the same shot object exists on two sides of the splicing boundary line corresponding to each splicing position, if the same shot object exists on two sides of the splicing boundary line corresponding to a certain splicing position, recording the splicing position as a same-object splicing position, executing S932-S933 on the analysis process of the splicing uniformity degree corresponding to the same-object splicing position, otherwise, recording the splicing position as a foreign-object splicing position, and executing S934 on the analysis process of the splicing uniformity degree corresponding to the foreign-object splicing position;
s932: marking areas where the same shot object exists in the environment images on the two sides of the splicing boundary line corresponding to the splicing position of the same object to obtain associated areas on the two sides of the splicing boundary line corresponding to the splicing position of the same object, and further extracting image definition and image color chroma from the associated areas;
s933: respectively and correspondingly comparing the image definition and the image color chroma of the related areas at the two sides of the splicing boundary line corresponding to the splicing position of the same object to obtain an image definition difference value and an image color chroma difference value, and passing the image definition difference value and the image color chroma difference value through a splicing uniformity analysis formula of the splicing position of the same object
Figure DEST_PATH_IMAGE046
To obtain the splicing uniformity corresponding to the splicing position of the same object
Figure DEST_PATH_IMAGE048
Wherein
Figure DEST_PATH_IMAGE050
Figure DEST_PATH_IMAGE052
Respectively representing the image definition difference and the image color chroma difference of the related areas at the two sides of the splicing boundary line corresponding to the splicing position of the same object,
Figure DEST_PATH_IMAGE054
Figure DEST_PATH_IMAGE056
respectively representing the set allowable image definition difference value and the set allowable image color chroma difference value;
s934: extracting image definition from regions on two sides of a splicing boundary line corresponding to the foreign matter splicing position, carrying out difference value comparison, and further enabling a comparison result to pass through a foreign matter splicing position splicing uniformity analysis formula
Figure DEST_PATH_IMAGE058
To obtain the splicing uniformity corresponding to the splicing position of the foreign matters
Figure DEST_PATH_IMAGE060
Wherein
Figure DEST_PATH_IMAGE062
And the difference value of the image definition of the areas at two sides of the splicing boundary line corresponding to the splicing position of the foreign matter is expressed.
2. The method for analyzing and evaluating the 360-degree panoramic stitching image according to claim 1, characterized in that: the specific implementation manner corresponding to the S2 is as follows: and taking the center point of the body of the target automobile as the center of a circle and the set distance as the radius to make a circle, wherein the area in the circle is the effective view finding range corresponding to the target automobile.
3. The method for analyzing and evaluating the 360-degree panoramic stitching image according to claim 1, characterized in that: the specific screening method for screening the overlapping shooting angle intervals in S4 is to compare the shooting angle intervals corresponding to the cameras, determine whether there are some shooting angle intervals that are partially consistent, and if so, mark the partially consistent shooting angle intervals as the overlapping shooting angle intervals, thereby screening the overlapping shooting angle intervals from the shooting angle intervals corresponding to the cameras.
4. The analysis and evaluation method for the 360-degree panoramic stitching image according to claim 2, characterized in that: and in the S5, the specific operation mode corresponding to the focal length of each camera is regulated according to the effective view finding range corresponding to the target automobile, namely the set distance is used as the effective shooting distance, the effective shooting distance is matched with the adaptive focal length corresponding to the preset various shooting distances, the adaptive focal length corresponding to the effective shooting distance is matched, and therefore the focal length of each camera is regulated and controlled accordingly.
5. The method for analyzing and evaluating the 360-degree panoramic stitching image according to claim 1, characterized in that: in S7, the specific processing method for performing preliminary processing on the environment image in the shooting angle interval corresponding to each camera by using the preferred camera corresponding to the overlapping shooting angle interval is as follows:
(1) Acquiring the number of the preferred camera, extracting the characteristic cameras except the preferred camera from the characteristic cameras contained in the overlapped shooting angle interval according to the number of the preferred camera, and recording the extracted characteristic cameras as the appointed cameras;
(2) And performing segmentation and elimination processing on the environment images in the overlapped shooting angle interval from the environment images in the shooting angle interval corresponding to each appointed camera.
6. The method for analyzing and evaluating the 360-degree panoramic stitching image according to claim 1, characterized in that: the S92 specifically includes the following steps:
s921: sequentially focusing the panoramic image in the effective viewing range corresponding to the target automobile at each splicing position, and extracting the width of a splicing gap corresponding to each splicing position;
s922: comparing the width of the splicing gap corresponding to each splicing position with a preset allowable splicing gap width threshold value, and calculating the splicing tightness corresponding to each splicing position according to the calculation formula
Figure DEST_PATH_IMAGE064
Figure DEST_PATH_IMAGE066
Expressed as the splice tightness corresponding to the ith splice location,
Figure DEST_PATH_IMAGE068
indicated as the splice gap width corresponding to the ith splice location,
Figure DEST_PATH_IMAGE070
expressed as a set allowable splice gap width threshold;
s923: sequentially focusing the panoramic image in the effective viewing range corresponding to the target automobile at each splicing position, identifying and forming two environment images corresponding to each splicing position, and extracting splicing boundary lines corresponding to each splicing position from the two environment images;
s924: extracting outline contours of the appearance of the shot from the two environment images corresponding to the splicing positions respectively, and marking intersection points of the outline contours of the appearance of the shot and the splicing boundary lines in the two environment images corresponding to the splicing positions respectively, so as to obtain two intersection points on the splicing boundary lines corresponding to the splicing positions;
s925: obtaining the distance between two intersection points on the splicing boundary line corresponding to each splicing position, and substituting the distance into the splicing dislocation
Figure DEST_PATH_IMAGE072
Calculating the splicing dislocation corresponding to each splicing position
Figure DEST_PATH_IMAGE074
Wherein
Figure DEST_PATH_IMAGE076
Expressed as the distance between two intersection points on the splicing boundary line corresponding to the ith splicing position,
Figure DEST_PATH_IMAGE078
expressed as a predefined reference cross-distance;
s926: the splicing compactness and the splicing dislocation corresponding to each splicing position are analyzed through a splicing fluency analysis formula
Figure DEST_PATH_IMAGE080
And analyzing the splicing fluency corresponding to each splicing position.
7. The method for analyzing and evaluating the 360-degree panoramic stitching image according to claim 1, characterized in that: and S9, identifying the low-quality splicing positions, and displaying the numbers of the low-quality splicing positions in a background manner, wherein the specific identification manner is to compare the splicing quality coefficients corresponding to the splicing positions with the set standard splicing quality coefficient, and if the splicing quality coefficient corresponding to a certain splicing position is smaller than the set standard splicing quality coefficient, recording the splicing position as the low-quality splicing position.
CN202211156487.3A 2022-09-22 2022-09-22 360-degree panoramic stitching image analysis and evaluation method Active CN115239724B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211156487.3A CN115239724B (en) 2022-09-22 2022-09-22 360-degree panoramic stitching image analysis and evaluation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211156487.3A CN115239724B (en) 2022-09-22 2022-09-22 360-degree panoramic stitching image analysis and evaluation method

Publications (2)

Publication Number Publication Date
CN115239724A CN115239724A (en) 2022-10-25
CN115239724B true CN115239724B (en) 2022-11-22

Family

ID=83667303

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211156487.3A Active CN115239724B (en) 2022-09-22 2022-09-22 360-degree panoramic stitching image analysis and evaluation method

Country Status (1)

Country Link
CN (1) CN115239724B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116229765B (en) * 2023-05-06 2023-07-21 贵州鹰驾交通科技有限公司 Vehicle-road cooperation method based on digital data processing

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105447865A (en) * 2015-11-23 2016-03-30 深圳进化动力数码科技有限公司 Method and device for evaluating static splicing quality of panoramic splicing algorithm
CN107154022A (en) * 2017-05-10 2017-09-12 北京理工大学 A kind of dynamic panorama mosaic method suitable for trailer
CN108198135A (en) * 2018-01-02 2018-06-22 佛山科学技术学院 A kind of optimal suture line search method of Panorama Mosaic
CN113192003A (en) * 2021-03-26 2021-07-30 宁波大学 Spliced image quality evaluation method
CN113194309A (en) * 2021-06-02 2021-07-30 重庆渝微电子技术研究院有限公司 Imaging quality evaluation system of 360-degree panoramic looking-around equipment
CN113628160A (en) * 2021-06-30 2021-11-09 中汽研汽车检验中心(天津)有限公司 Method for evaluating splicing quality of plane splicing views of automobile panoramic image monitoring system
CN113691721A (en) * 2021-07-28 2021-11-23 浙江大华技术股份有限公司 Synthesis method and device of time-lapse video, computer equipment and medium
CN114066831A (en) * 2021-11-04 2022-02-18 北京航空航天大学 Remote sensing image mosaic quality non-reference evaluation method based on two-stage training
CN114372919A (en) * 2022-03-22 2022-04-19 鹰驾科技(深圳)有限公司 Method and system for splicing panoramic all-around images of double-trailer train
CN114785960A (en) * 2022-06-16 2022-07-22 鹰驾科技(深圳)有限公司 360 degree panorama vehicle event data recorder system based on wireless transmission technology

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130238285A1 (en) * 2010-08-27 2013-09-12 Clonnequin Pty Ltd Mannequin, method and system for purchase, making and alteration of clothing
JP6415094B2 (en) * 2014-04-25 2018-10-31 キヤノン株式会社 Image processing apparatus, imaging apparatus, image processing method, and program

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105447865A (en) * 2015-11-23 2016-03-30 深圳进化动力数码科技有限公司 Method and device for evaluating static splicing quality of panoramic splicing algorithm
CN107154022A (en) * 2017-05-10 2017-09-12 北京理工大学 A kind of dynamic panorama mosaic method suitable for trailer
CN108198135A (en) * 2018-01-02 2018-06-22 佛山科学技术学院 A kind of optimal suture line search method of Panorama Mosaic
CN113192003A (en) * 2021-03-26 2021-07-30 宁波大学 Spliced image quality evaluation method
CN113194309A (en) * 2021-06-02 2021-07-30 重庆渝微电子技术研究院有限公司 Imaging quality evaluation system of 360-degree panoramic looking-around equipment
CN113628160A (en) * 2021-06-30 2021-11-09 中汽研汽车检验中心(天津)有限公司 Method for evaluating splicing quality of plane splicing views of automobile panoramic image monitoring system
CN113691721A (en) * 2021-07-28 2021-11-23 浙江大华技术股份有限公司 Synthesis method and device of time-lapse video, computer equipment and medium
CN114066831A (en) * 2021-11-04 2022-02-18 北京航空航天大学 Remote sensing image mosaic quality non-reference evaluation method based on two-stage training
CN114372919A (en) * 2022-03-22 2022-04-19 鹰驾科技(深圳)有限公司 Method and system for splicing panoramic all-around images of double-trailer train
CN114785960A (en) * 2022-06-16 2022-07-22 鹰驾科技(深圳)有限公司 360 degree panorama vehicle event data recorder system based on wireless transmission technology

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
《A 360-degree floating 3D display based on light field regeneration》;Xia Xinxing等;《Opt Express》;20130531;第21卷(第9期);第11237-47页 *
《基于局部相位一致性特征的图像拼接质量评价算法研究》;孔玲玲;《中国优秀硕士学位论文全文数据库 信息科技辑》;20200215(第2期);第I138-1437页 *
《汽车覆盖件的三维重构与工艺分析的研究与实践》;黄焕江 等;《中国优秀博硕士学位论文全文数据库 (硕士) 工程科技Ⅱ辑》;20040215(第2期);第C035-20页 *
《注意力分布机制下的全景图像质量评价》;安平 等;《中国传媒大学学报(自然科学版)》;20211020;第28卷(第05期);第36-42页 *

Also Published As

Publication number Publication date
CN115239724A (en) 2022-10-25

Similar Documents

Publication Publication Date Title
CN107622229B (en) Video vehicle re-identification method and system based on fusion features
CN102314600B (en) Shadow removal in image captured by vehicle-based camera for clear path detection
CN101828201B (en) Image processing device and method, and learning device, method
WO2018040756A1 (en) Vehicle body colour identification method and device
EP2575077A2 (en) Road sign detecting method and road sign detecting apparatus
CN115239724B (en) 360-degree panoramic stitching image analysis and evaluation method
CN102034080B (en) Vehicle color identification method and device
JP2014178328A (en) Steel pipe internal corrosion analyzer and steel pipe internal corrosion analysis method
KR20170056474A (en) Method, device and storage medium for calculating building height
JP6700373B2 (en) Apparatus and method for learning object image packaging for artificial intelligence of video animation
CN107689157B (en) Traffic intersection passable road planning method based on deep learning
CN109866684A (en) Lane departure warning method, system, readable storage medium storing program for executing and computer equipment
CN112070733A (en) Defect rough positioning method and device based on weak supervision mode
CN107644538B (en) Traffic signal lamp identification method and device
CN112598066A (en) Lightweight road pavement detection method and system based on machine vision
CN111832388B (en) Method and system for detecting and identifying traffic sign in vehicle running
CN114372919B (en) Method and system for splicing panoramic all-around images of double-trailer train
Wang et al. Automatic dissection position selection for cleavage-stage embryo biopsy
CN111275634A (en) Molten pool shape detection and arc welding robot control method
CN113390882A (en) Tire inner side defect detector based on machine vision and deep learning algorithm
CN108230248A (en) A kind of assessment of viewing system splicing effect and automatic fine tuning method based on self-adaptive features point registration
CN112183427A (en) Rapid extraction method for arrow-shaped traffic signal lamp candidate image area
CN110688979A (en) Illegal vehicle tracking method and device
CN116883868A (en) Unmanned aerial vehicle intelligent cruising detection method based on adaptive image defogging
CN110969135A (en) Vehicle logo recognition method in natural scene

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant