CN114581389A - Point cloud quality analysis method based on three-dimensional edge similarity characteristics - Google Patents
- Publication number
- CN114581389A (application CN202210176395.5A)
- Authority
- CN
- China
- Prior art keywords
- point cloud
- scale
- edge
- small
- pcle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Quality & Reliability (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
The invention provides a point cloud quality analysis method based on three-dimensional edge similarity features. Considering that the human visual system is highly sensitive to the edge contours of a point cloud image, and exploiting the three-dimensional nature of the point cloud, the method normalizes the scales of the reference and distorted point clouds and extracts edge and structural features with a multi-scale 3D-DOG filter. The multi-scale filter reveals point cloud detail to different degrees, so the degradation of the point cloud is effectively reflected from different angles. The method fully exploits the sensitivity of human vision to edge information and simulates the subjective evaluation of a point cloud image by the human eye; compared with other methods, it achieves better point cloud quality analysis performance, with higher identification accuracy, sensitivity and robustness.
Description
Technical Field
The invention relates to the field of image processing, in particular to a point cloud quality analysis method based on three-dimensional edge similarity characteristics.
Background
Recent trends in multimedia technology indicate that the 3D point cloud, as an advanced content representation, holds a significant position in immersive applications because it can represent richer content, such as more realistic scenes, in modern communication systems. A point cloud is a storage format that retains the original geometric information of a scene in 3D space (e.g., position, color, normal and intensity attributes); it is typically acquired with a three-dimensional scanner, a lidar or an RGB-D camera. 3D point clouds are widely used across fields, in application scenarios such as augmented/virtual reality, 3D printing, autonomous driving, robotics and three-dimensional monitoring.
However, a point cloud usually contains millions of points and rich attribute information, and distortion and noise of various levels are introduced during acquisition, processing, compression, transmission, reconstruction and display, all of which reduce point cloud quality and thus the end user's satisfaction with the visual experience. How to fully combine the characteristics of the human visual system with those of the point cloud, and to design a point cloud quality analysis method consistent with human visual characteristics, applicable to real-time dynamic monitoring and adjustment of point cloud visual quality and to comparing or optimizing point cloud processing algorithms, therefore has important theoretical research significance and practical application value.
Disclosure of Invention
The invention mainly aims to overcome the defects in the prior art and provides a point cloud quality analysis method based on three-dimensional edge similarity features. The method effectively extracts the edge structure features of the point cloud, accords with the subjective perception of the distorted point cloud by the human eye, and has better point cloud quality analysis performance.
The invention adopts the following technical scheme:
a point cloud quality analysis method based on three-dimensional edge similarity features comprises the following steps:
inputting a reference point cloud PC_r and a distorted point cloud PC_d, and carrying out feature extraction after scale normalization;

applying a dual-scale 3D-DOG operator to the reference and distorted point clouds, and respectively extracting the small-scale 3D edge features PCSE_r and large-scale 3D edge features PCLE_r of the reference point cloud, and the small-scale 3D edge features PCSE_d and large-scale 3D edge features PCLE_d of the distorted point cloud;

calculating, from the small-scale 3D edge features PCSE_r and large-scale 3D edge features PCLE_r of the reference point cloud and the small-scale 3D edge features PCSE_d and large-scale 3D edge features PCLE_d of the distorted point cloud, the small-scale 3D edge similarity Spces and the large-scale 3D edge similarity Lpces between the reference point cloud PC_r and the distorted point cloud PC_d;

and calculating the point cloud objective quality Score by a three-dimensional edge-intensity weighted pooling method, based on the small-scale 3D edge similarity Spces and the large-scale 3D edge similarity Lpces.
Specifically, inputting the reference point cloud PC_r and the distorted point cloud PC_d and carrying out feature extraction after scale normalization comprises the following steps:

inputting the reference point cloud PC_r and the distorted point cloud PC_d;

constructing two all-zero three-dimensional matrices according to the maximum coordinate values of the pair of reference and distorted point clouds, and filling the luminance information at each coordinate position into the two matrices, taking the coordinates of the pair of point clouds as reference, so that the two matrices have the same dimensions.
Specifically, applying the dual-scale 3D-DOG operator to the reference and distorted point clouds, and respectively extracting the small-scale 3D edge features PCSE_r and large-scale 3D edge features PCLE_r of the reference point cloud and the small-scale 3D edge features PCSE_d and large-scale 3D edge features PCLE_d of the distorted point cloud, comprises the following steps:

extracting the dual-scale three-dimensional edge features of the reference point cloud PC_r and the distorted point cloud PC_d with the 3D-DOG operator:

PCSE_r(x, y, z) = PC_r(x, y, z) * DOG_s(x, y, z)
PCLE_r(x, y, z) = PC_r(x, y, z) * DOG_l(x, y, z)
PCSE_d(x, y, z) = PC_d(x, y, z) * DOG_s(x, y, z)
PCLE_d(x, y, z) = PC_d(x, y, z) * DOG_l(x, y, z)

where * denotes three-dimensional convolution, DOG_s is the 3D-DOG filter with the small-scale kernel and DOG_l the 3D-DOG filter with the large-scale kernel; the resulting PCSE_r and PCLE_r are the small-scale and large-scale features of the reference point cloud, and PCSE_d and PCLE_d the small-scale and large-scale features of the distorted point cloud; (x, y, z) represents the 3D coordinates of each point in the point cloud; σ_1 and σ_2 are the standard deviations at the small scale, and σ_3 and σ_4 the standard deviations at the large scale.
Specifically, the filters DOG_s and DOG_l are defined as follows:

DOG_s(x, y, z) = G(x, y, z, σ_1) − G(x, y, z, σ_2)
DOG_l(x, y, z) = G(x, y, z, σ_3) − G(x, y, z, σ_4)

where G(x, y, z, σ) is a 3D Gaussian filter, with the following formula:

G(x, y, z, σ) = (1 / ((2π)^(3/2) · σ^3)) · exp(−(x^2 + y^2 + z^2) / (2σ^2))
specifically, the method comprises the following steps: small-scale 3D edge features PCSE from reference point cloudsrWith large scale 3D edge features PCLErAnd small scale 3D edge features PCSE of the distorted point clouddWith large scale 3D edge features PCLEdCalculating to obtain a reference point cloud PCrAnd distorted point cloud PCdThe small-scale 3D edge similarity Spces and the large-scale 3D edge similarity Lpces are as follows:
respectively by calculating small-scale 3D edge features PCSE of the reference point cloudrWith large scale 3D edge features PCLErAnd small scale 3D edge features PCSE of the distorted point clouddWith large scale 3D edge features PCLEdAnd obtaining the three-dimensional point cloud edge similarity Spces and Lpces under two scales:
wherein, T1And T2Is a constant for ensuring numerical stability.
Specifically, calculating the point cloud objective quality Score by the three-dimensional edge-intensity weighted pooling method, based on the small-scale 3D edge similarity Spces and the large-scale 3D edge similarity Lpces, is as follows:

PCEW(x, y, z) = max(PCLE_r(x, y, z), PCLE_d(x, y, z))
PCES(x, y, z) = [Spces(x, y, z)]^α · [Lpces(x, y, z)]^β
Score = Σ_(x,y,z) PCES(x, y, z) · PCEW(x, y, z) / Σ_(x,y,z) PCEW(x, y, z)

where PCEW is the weight in the three-dimensional edge-intensity weighted pooling strategy, PCES is obtained by combining Spces and Lpces in a set proportion, and α and β are set coefficients with α + β = 1.
As can be seen from the above description of the present invention, compared with the prior art, the present invention has the following advantages:
the invention provides a point cloud quality analysis method based on three-dimensional edge similarity characteristics, which considers that the characteristics of a human visual system have higher sensitivity to the edge contour characteristics of a point cloud image and the three-dimensional characteristics of the point cloud, normalizes the scales of reference and distorted point clouds and extracts the edge and structural characteristics by adopting a multi-scale 3D-DOG filter, wherein the multi-scale filter can show the details of the point cloud from different degrees, namely the degradation degree of the point cloud can be effectively reflected from different angles; the method fully utilizes the sensitivity of human vision to edge information, simulates the process of subjective evaluation of the point cloud image by human eyes, has better point cloud quality analysis performance compared with other methods, and has higher identification accuracy, sensitivity and robustness.
Drawings
Fig. 1 is a schematic flow chart provided by an embodiment of the present invention.
Fig. 2 shows a pair of reference and distorted point cloud images provided by an embodiment of the present invention, where (a) is an example of the reference point cloud and (b) is an example of the distorted point cloud.
The invention is described in further detail below with reference to the following figures and specific examples.
Detailed Description
The invention provides a point cloud quality analysis method based on three-dimensional edge similarity features, which exploits the sensitivity of human vision to edge information to simulate the subjective evaluation of a point cloud image by the human eye; compared with other methods, it achieves better point cloud quality analysis performance, with higher identification accuracy, sensitivity and robustness.
Referring to fig. 1, a point cloud quality analysis method based on three-dimensional edge similarity features includes the following specific steps:
S101: inputting a reference point cloud PC_r and a distorted point cloud PC_d, and carrying out feature extraction after scale normalization, specifically as follows:

the reference point cloud PC_r and the distorted point cloud PC_d are input. Some distorted point clouds exhibit coordinate position shifts and rotations relative to the reference point cloud, so the scale ranges of the two are not uniform. Two all-zero three-dimensional matrices are therefore first constructed according to the maximum coordinate values of the pair of reference and distorted point clouds, and the luminance information at each coordinate position is filled into the two matrices, taking each coordinate of the pair of point clouds as reference, so that the two matrices have the same dimensions. Fig. 2 shows a pair of reference and distorted point cloud images provided by the embodiment of the present invention, where (a) is an example of the reference point cloud and (b) is an example of the distorted point cloud.
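The normalization step above can be sketched as follows. This is a minimal illustration, assuming quantized integer point coordinates and a per-point luminance attribute; the function name and toy data are illustrative, not taken from the patent:

```python
import numpy as np

def fill_volume(coords, lum, shape):
    """Fill an all-zero 3D matrix with the luminance of each point.

    coords: (N, 3) integer voxel coordinates of a point cloud
    lum:    (N,) luminance value of each point
    shape:  common matrix dimensions shared by the reference/distorted pair
    """
    vol = np.zeros(shape, dtype=np.float64)
    vol[coords[:, 0], coords[:, 1], coords[:, 2]] = lum
    return vol

# Toy reference/distorted pair; the shared shape is taken from the
# maximum coordinate over BOTH clouds so the two matrices match.
ref_xyz = np.array([[0, 0, 0], [1, 2, 3]])
dis_xyz = np.array([[0, 1, 0], [2, 2, 2]])
shape = tuple(np.maximum(ref_xyz.max(axis=0), dis_xyz.max(axis=0)) + 1)
ref_vol = fill_volume(ref_xyz, np.array([0.5, 1.0]), shape)
dis_vol = fill_volume(dis_xyz, np.array([0.4, 0.9]), shape)
```

Because both volumes are allocated with the same shape before filling, the later per-voxel comparisons are well defined regardless of how the two clouds' extents differ.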
S102: applying a dual-scale 3D-DOG operator to the reference and distorted point clouds, and respectively extracting the small-scale 3D edge features PCSE_r and large-scale 3D edge features PCLE_r of the reference point cloud, and the small-scale 3D edge features PCSE_d and large-scale 3D edge features PCLE_d of the distorted point cloud, as follows:

the dual-scale three-dimensional edge features of the reference point cloud PC_r and the distorted point cloud PC_d are extracted with the 3D-DOG operator, where the small-scale filtering kernel extracts the sharper, more detailed edge content of the point cloud, while the large-scale filtering kernel extracts its outline and shape information:
PCSE_r(x, y, z) = PC_r(x, y, z) * DOG_s(x, y, z)
PCLE_r(x, y, z) = PC_r(x, y, z) * DOG_l(x, y, z)
PCSE_d(x, y, z) = PC_d(x, y, z) * DOG_s(x, y, z)
PCLE_d(x, y, z) = PC_d(x, y, z) * DOG_l(x, y, z)

where * denotes three-dimensional convolution, DOG_s is the 3D-DOG filter with the small-scale kernel and DOG_l the 3D-DOG filter with the large-scale kernel; PCSE_r and PCLE_r are the small-scale and large-scale features of the reference point cloud, and PCSE_d and PCLE_d those of the distorted point cloud. (x, y, z) represents the 3D coordinates of each point in the point cloud. G(x, y, z, σ) is a 3D Gaussian filter, realized with a 9 × 9 × 9 Gaussian kernel; σ_1 and σ_2 are the standard deviations at the small scale, with σ_1 = 0.9 and σ_2 = 1; σ_3 and σ_4 are the standard deviations at the large scale, with σ_3 = 2.1 and σ_4 = 2.2. The filters are defined by:

DOG_s(x, y, z) = G(x, y, z, σ_1) − G(x, y, z, σ_2)
DOG_l(x, y, z) = G(x, y, z, σ_3) − G(x, y, z, σ_4)

and the formula of the 3D Gaussian filter is as follows:

G(x, y, z, σ) = (1 / ((2π)^(3/2) · σ^3)) · exp(−(x^2 + y^2 + z^2) / (2σ^2))
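The dual-scale 3D-DOG filtering can be approximated with off-the-shelf separable Gaussian filtering, as sketched below. This assumes `scipy.ndimage.gaussian_filter` and the σ values 0.9/1.0 (small scale) and 2.1/2.2 (large scale); since convolution is linear, filtering with the two Gaussians and subtracting is equivalent to convolving with the DOG kernel:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog3d(volume, sigma_a, sigma_b):
    # Response of a 3D difference-of-Gaussians filter, computed as the
    # difference of two 3D Gaussian responses.
    return gaussian_filter(volume, sigma_a) - gaussian_filter(volume, sigma_b)

vol = np.zeros((16, 16, 16))
vol[8, 8, 8] = 1.0                 # a single bright voxel as test input

pcse = dog3d(vol, 0.9, 1.0)        # small-scale edge map (sigma1, sigma2)
pcle = dog3d(vol, 2.1, 2.2)        # large-scale edge map (sigma3, sigma4)
```

Note that `gaussian_filter` sizes its kernel from σ (via its `truncate` parameter) rather than using a fixed 9 × 9 × 9 window; for a faithful reproduction of the described filter, the kernel support would be clamped accordingly.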
S103: calculating, from the small-scale 3D edge features PCSE_r and large-scale 3D edge features PCLE_r of the reference point cloud and the small-scale 3D edge features PCSE_d and large-scale 3D edge features PCLE_d of the distorted point cloud, the small-scale 3D edge similarity Spces and the large-scale 3D edge similarity Lpces between the reference point cloud PC_r and the distorted point cloud PC_d, as follows:

the three-dimensional point cloud edge similarities Spces and Lpces at the two scales are obtained by comparing the small-scale features PCSE_r and PCSE_d and the large-scale features PCLE_r and PCLE_d respectively:

Spces(x, y, z) = (2 · PCSE_r(x, y, z) · PCSE_d(x, y, z) + T_1) / (PCSE_r(x, y, z)^2 + PCSE_d(x, y, z)^2 + T_1)
Lpces(x, y, z) = (2 · PCLE_r(x, y, z) · PCLE_d(x, y, z) + T_2) / (PCLE_r(x, y, z)^2 + PCLE_d(x, y, z)^2 + T_2)

where T_1 and T_2 are constants for ensuring numerical stability, with T_1 = 0.04 and T_2 = 0.01.
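Given the two pairs of edge-feature maps, the per-voxel similarity computation can be sketched as below. The SSIM-style ratio form with T1 and T2 as stability constants is an assumption here, since the similarity equations are not legible in this text; the toy inputs are illustrative:

```python
import numpy as np

def edge_similarity(f_ref, f_dis, t):
    # SSIM-style per-voxel similarity between two edge-feature maps;
    # t keeps the ratio numerically stable when both responses are near 0.
    return (2.0 * f_ref * f_dis + t) / (f_ref ** 2 + f_dis ** 2 + t)

rng = np.random.default_rng(0)
pcse_r = rng.random((4, 4, 4))
pcse_d = pcse_r + 0.05 * rng.random((4, 4, 4))   # mildly "distorted"
pcle_r = rng.random((4, 4, 4))
pcle_d = pcle_r + 0.05 * rng.random((4, 4, 4))

spces = edge_similarity(pcse_r, pcse_d, t=0.04)  # small scale, T1
lpces = edge_similarity(pcle_r, pcle_d, t=0.01)  # large scale, T2
```

With this form, identical feature maps yield a similarity of exactly 1 at every voxel, and the value decreases as the edge responses diverge.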
S104: based on the small-scale 3D edge similarity Spces and the large-scale 3D edge similarity Lpces, calculating by using a three-dimensional edge intensity weighting pooling strategy to obtain a point cloud objective quality Score, which is specifically as follows:
PCEW(x,y,z)=max(PCLEr,PCLEd)
PCES(x,y,z)=[Spces(x,y,z)]α·[Lpces(x,y,z)]β
wherein, PCEW is the weight in the three-dimensional edge intensity weighting pooling strategy, and the selected PCLE is the reference and distortion characteristics under large scalerAnd PCLEdThe larger of these. PCES is obtained by multiplying Spces and Lpces according to a certain proportion, wherein alpha is 0.7 and beta is2=0.3。
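The pooling step can be sketched as a PCEW-weighted average of PCES over all voxels. The final Score equation is not shown in the extracted text, so the weighted-mean form below is an assumed, standard edge-strength pooling; the toy inputs are illustrative:

```python
import numpy as np

def pooled_score(spces, lpces, pcle_r, pcle_d, alpha=0.7, beta=0.3):
    pces = (spces ** alpha) * (lpces ** beta)      # combined similarity map
    pcew = np.maximum(pcle_r, pcle_d)              # edge-intensity weights
    # Weighted mean; the small epsilon guards against an all-zero weight map.
    return float((pces * pcew).sum() / (pcew.sum() + 1e-12))

rng = np.random.default_rng(1)
spces = rng.random((4, 4, 4))
lpces = rng.random((4, 4, 4))
pcle_r = rng.random((4, 4, 4))
pcle_d = rng.random((4, 4, 4))
score = pooled_score(spces, lpces, pcle_r, pcle_d)
```

Weighting by the stronger of the two large-scale edge responses concentrates the score on regions where either point cloud has salient structure, mirroring the attention of the human eye to strong edges.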
The above description is only an embodiment of the present invention, but the design concept of the present invention is not limited thereto; any insubstantial modification made using this design concept also falls within the protection scope of the present invention.
Claims (6)
1. A point cloud quality analysis method based on three-dimensional edge similarity features is characterized by comprising the following steps:
inputting a reference point cloud PC_r and a distorted point cloud PC_d, and carrying out feature extraction after scale normalization;

applying a dual-scale 3D-DOG operator to the reference and distorted point clouds, and respectively extracting the small-scale 3D edge features PCSE_r and large-scale 3D edge features PCLE_r of the reference point cloud, and the small-scale 3D edge features PCSE_d and large-scale 3D edge features PCLE_d of the distorted point cloud;

calculating, from the small-scale 3D edge features PCSE_r and large-scale 3D edge features PCLE_r of the reference point cloud and the small-scale 3D edge features PCSE_d and large-scale 3D edge features PCLE_d of the distorted point cloud, the small-scale 3D edge similarity Spces and the large-scale 3D edge similarity Lpces between the reference point cloud PC_r and the distorted point cloud PC_d;

and calculating the point cloud objective quality Score by a three-dimensional edge-intensity weighted pooling method, based on the small-scale 3D edge similarity Spces and the large-scale 3D edge similarity Lpces.
2. The method of claim 1, wherein inputting the reference point cloud PC_r and the distorted point cloud PC_d and carrying out feature extraction after scale normalization comprises:

inputting the reference point cloud PC_r and the distorted point cloud PC_d;

constructing two all-zero three-dimensional matrices according to the maximum coordinate values of the pair of reference and distorted point clouds, and filling the luminance information at each coordinate position into the two matrices, taking the coordinates of the pair of point clouds as reference, so that the two matrices have the same dimensions.
3. The method of claim 1, wherein applying the dual-scale 3D-DOG operator to the reference and distorted point clouds and respectively extracting the small-scale 3D edge features PCSE_r and large-scale 3D edge features PCLE_r of the reference point cloud and the small-scale 3D edge features PCSE_d and large-scale 3D edge features PCLE_d of the distorted point cloud comprises:

extracting the dual-scale three-dimensional edge features of the reference point cloud PC_r and the distorted point cloud PC_d with the 3D-DOG operator:

PCSE_r(x, y, z) = PC_r(x, y, z) * DOG_s(x, y, z)
PCLE_r(x, y, z) = PC_r(x, y, z) * DOG_l(x, y, z)
PCSE_d(x, y, z) = PC_d(x, y, z) * DOG_s(x, y, z)
PCLE_d(x, y, z) = PC_d(x, y, z) * DOG_l(x, y, z)

where * denotes three-dimensional convolution, DOG_s is the 3D-DOG filter with the small-scale kernel and DOG_l the 3D-DOG filter with the large-scale kernel; the resulting PCSE_r and PCLE_r are the small-scale and large-scale features of the reference point cloud, and PCSE_d and PCLE_d the small-scale and large-scale features of the distorted point cloud; (x, y, z) represents the 3D coordinates of each point in the point cloud; σ_1 and σ_2 are the standard deviations at the small scale, and σ_3 and σ_4 the standard deviations at the large scale.
5. The method of claim 3, wherein calculating, from the small-scale 3D edge features PCSE_r and large-scale 3D edge features PCLE_r of the reference point cloud and the small-scale 3D edge features PCSE_d and large-scale 3D edge features PCLE_d of the distorted point cloud, the small-scale 3D edge similarity Spces and the large-scale 3D edge similarity Lpces between the reference point cloud PC_r and the distorted point cloud PC_d is as follows:

the three-dimensional point cloud edge similarities Spces and Lpces at the two scales are obtained by comparing the small-scale features PCSE_r and PCSE_d and the large-scale features PCLE_r and PCLE_d respectively:

Spces(x, y, z) = (2 · PCSE_r(x, y, z) · PCSE_d(x, y, z) + T_1) / (PCSE_r(x, y, z)^2 + PCSE_d(x, y, z)^2 + T_1)
Lpces(x, y, z) = (2 · PCLE_r(x, y, z) · PCLE_d(x, y, z) + T_2) / (PCLE_r(x, y, z)^2 + PCLE_d(x, y, z)^2 + T_2)

where T_1 and T_2 are constants for ensuring numerical stability.
6. The method of claim 5, wherein calculating the point cloud objective quality Score by the three-dimensional edge-intensity weighted pooling method, based on the small-scale 3D edge similarity Spces and the large-scale 3D edge similarity Lpces, is as follows:

PCEW(x, y, z) = max(PCLE_r(x, y, z), PCLE_d(x, y, z))
PCES(x, y, z) = [Spces(x, y, z)]^α · [Lpces(x, y, z)]^β
Score = Σ_(x,y,z) PCES(x, y, z) · PCEW(x, y, z) / Σ_(x,y,z) PCEW(x, y, z)

where PCEW is the weight in the three-dimensional edge-intensity weighted pooling strategy, PCES is obtained by combining Spces and Lpces in a set proportion, and α and β are set coefficients with α + β = 1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210176395.5A CN114581389A (en) | 2022-02-24 | 2022-02-24 | Point cloud quality analysis method based on three-dimensional edge similarity characteristics |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210176395.5A CN114581389A (en) | 2022-02-24 | 2022-02-24 | Point cloud quality analysis method based on three-dimensional edge similarity characteristics |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114581389A true CN114581389A (en) | 2022-06-03 |
Family
ID=81775182
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210176395.5A Withdrawn CN114581389A (en) | 2022-02-24 | 2022-02-24 | Point cloud quality analysis method based on three-dimensional edge similarity characteristics |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114581389A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117789198A (en) * | 2024-02-28 | 2024-03-29 | 上海几何伙伴智能驾驶有限公司 | Method for realizing point cloud degradation detection based on 4D millimeter wave imaging radar |
CN117789198B (en) * | 2024-02-28 | 2024-05-14 | 上海几何伙伴智能驾驶有限公司 | Method for realizing point cloud degradation detection based on 4D millimeter wave imaging radar |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WW01 | Invention patent application withdrawn after publication | Application publication date: 20220603 |