WO2020087434A1 - Method and device for evaluating the sharpness of a face image - Google Patents

Method and device for evaluating the sharpness of a face image

Info

Publication number
WO2020087434A1
Authority
WO
WIPO (PCT)
Prior art keywords
face
face area
pixels
value
picture
Prior art date
Application number
PCT/CN2018/113361
Other languages
English (en)
Chinese (zh)
Inventor
汪香君
刘会芬
陈巧
孙瑞泽
Original Assignee
深圳技术大学(筹)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳技术大学(筹)
Priority to PCT/CN2018/113361, published as WO2020087434A1
Publication of WO2020087434A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition

Definitions

  • the invention relates to the technical field of image processing, and in particular to a method and device for evaluating the sharpness of a human face image.
  • the sharpness of a picture is the most important standard for measuring picture quality.
  • this is especially true for image technologies such as face recognition and three-dimensional face reconstruction.
  • existing methods for evaluating the sharpness of a face image generally normalize the local gradient value of each pixel of the face image and assign different weights to the pixels of the face and of the background.
  • the concept of gradient density is then introduced to calculate the blur of the face image.
  • the technical problem to be solved by the present invention is to provide a method and a device for evaluating the sharpness of a face image, aiming to address the fact that in the prior art the weight function used for face image evaluation is a Gaussian function, the calculation is complicated, and the background of the image must participate in the calculation, which causes the evaluation error of face image sharpness to be too large.
  • a first aspect of an embodiment of the present invention provides a method for evaluating the clarity of a face image.
  • the method includes:
  • a second aspect of an embodiment of the present invention provides a face image clarity evaluation device, which includes:
  • a pre-processing module used to obtain a picture sequence, and pre-process all pictures in the picture sequence
  • the face detection module is used to select a pre-processed picture for face detection to obtain a face detection frame, and to perform face feature point detection in the face detection frame to identify a preset number of face feature points; a closed face area range is obtained by connecting the face feature points at the preset labels;
  • the calculation module is used to accumulate the sum of the absolute values of the horizontal and vertical gradient values of all pixels in the face area range to obtain an accumulation result, and to divide the accumulation result by the number of pixels in the face area range to obtain the average value of the gradient values of all pixels within the face area range, taking the average value as the sharpness factor of the picture;
  • the loop module is used to select the next pre-processed picture for face detection until all pictures in the picture sequence have been taken;
  • the sorting module is used to sort the sharpness factor values of all pictures in the picture sequence to obtain the sharpness order of all pictures in the picture sequence.
  • An embodiment of the present invention provides a method and device for evaluating the clarity of a human face image.
  • the face area range is obtained, and the sum of the absolute values of the horizontal and vertical gradient values of all pixels within the face area range is then accumulated to obtain an accumulation result; the accumulation result is divided by the number of pixels in the face area range to obtain the average value of the gradient values of all pixels in the face area range, and the average value is taken as the sharpness factor of the picture; finally, the sharpness factor values of all pictures are sorted to obtain the sharpness order of all pictures. Since the face sharpness is evaluated by calculating the average value of the gradient values of all pixels in the face area range, on the one hand the calculation is simplified and the calculation efficiency is improved; on the other hand, only the face area range is used as the calculation range, which makes the evaluation of the sharpness of the face image more accurate.
  • FIG. 1 is a schematic flowchart of a method for evaluating sharpness of a face image according to an embodiment of the present invention
  • FIG. 2 is a schematic flowchart of obtaining a picture sequence in a method for evaluating the clarity of a face image according to an embodiment of the present invention
  • FIG. 3 is a distribution diagram of face feature points in a method for evaluating sharpness of a face image according to an embodiment of the present invention
  • FIG. 4 is a schematic flowchart of image preprocessing in a method for evaluating sharpness of a face image according to an embodiment of the present invention
  • FIG. 5 is a schematic flowchart of a method for calculating an average value of gradient values in a method for evaluating sharpness of a face image according to an embodiment of the present invention
  • FIG. 6 is a schematic flowchart of another image preprocessing in a method for evaluating sharpness of a face image according to an embodiment of the present invention
  • FIG. 7 is a schematic flowchart of another method for calculating an average value of gradient values in a method for evaluating sharpness of a face image according to an embodiment of the present invention
  • FIG. 8 is a schematic diagram of a device for evaluating a sharpness of a face image according to an embodiment of the present invention
  • FIG. 9 is another schematic diagram of a face image sharpness evaluation device according to an embodiment of the present invention.
  • FIG. 1 is a schematic flowchart of a method for evaluating sharpness of a face image according to an embodiment of the present invention. The method includes:
  • the picture sequence refers to a collection containing several pictures, and the number of pictures contained in each picture sequence can be manually specified.
  • FIG. 2 is a schematic diagram of a process of acquiring a picture sequence in a method for evaluating the clarity of a face image according to an embodiment of the present invention.
  • the specific steps of the sequence include:
  • S201 Determine whether the signal input source is a video or a picture sequence. Since the types of signal input source differ across application scenarios such as face recognition and face texture mapping, it is necessary to first determine whether the signal input source is a video or a picture sequence in order to facilitate subsequent processing.
  • if the signal input source is a video, the video is decoded into a picture sequence and the picture sequence is then obtained.
  • the number of pictures included in the decoded picture sequence can be adjusted according to actual needs, with the criterion that the decoded picture sequence retains all of the video information.
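  • As an illustration only, the following sketch shows how a video input could be decoded into a picture sequence with OpenCV; the frame_step parameter is a hypothetical stand-in for the "adjusted according to actual needs" picture count mentioned above and is not specified by the patent.
```python
import cv2  # OpenCV is assumed here; the patent does not name a library


def video_to_picture_sequence(video_path, frame_step=1):
    """Decode a video file into a list of frames (the picture sequence).

    frame_step controls how many frames are skipped between kept pictures,
    mirroring the idea that the number of pictures can be adjusted to needs.
    """
    capture = cv2.VideoCapture(video_path)
    pictures = []
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:  # end of video or read error
            break
        if index % frame_step == 0:
            pictures.append(frame)
        index += 1
    capture.release()
    return pictures
```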
  • face detection refers to: detecting the face existing in the image, and accurately framing its position to form a face detection frame.
  • multiple face detection frames may be obtained, in which case the face detection frame with the largest size is selected.
  • the face feature points are pixels that characterize the features of the face, including feature information of specific parts and overall feature information. When face feature point detection is performed within the face detection frame to identify a preset number of face feature points, the number of face feature points can affect the evaluation of the picture sharpness.
  • the embodiment of the present invention adopts the classic 68 face feature points.
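  • A minimal sketch of this step, assuming dlib's frontal face detector and its 68-point shape predictor (the model file name below is the conventional one and is an assumption, as is the use of the convex hull of the landmarks to close the face area range; the patent itself connects the points at preset labels).
```python
import cv2
import dlib
import numpy as np

# Assumptions: dlib's standard detector/predictor stand in for the unspecified
# face detector; the .dat model file must be downloaded separately.
detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")


def face_area_mask(gray_image):
    """Detect the largest face, locate 68 feature points, and return a binary
    mask of the closed face area range (1 inside the face polygon, 0 outside)."""
    rects = detector(gray_image, 1)
    if not rects:
        return None
    # If several face detection frames are found, keep the largest one.
    rect = max(rects, key=lambda r: r.width() * r.height())
    shape = predictor(gray_image, rect)
    points = np.array([[shape.part(i).x, shape.part(i).y] for i in range(68)],
                      dtype=np.int32)
    mask = np.zeros(gray_image.shape[:2], dtype=np.uint8)
    # Convex hull used here only as a simple way to obtain a closed face region.
    cv2.fillConvexPoly(mask, cv2.convexHull(points), 1)
    return mask
```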
  • FIG. 3 is a distribution diagram of face feature points in a method for evaluating sharpness of a face image according to an embodiment of the present invention.
  • S103 Accumulate the sum of the absolute values of the horizontal and vertical gradient values of all the pixels in the face area range to obtain an accumulation result, divide the accumulation result by the number of pixels in the face area range to obtain the average value of the gradient values of all pixels in the face area range, and take the average value as the sharpness factor of the picture.
  • the clarity factor is an abstract expression of the average value of the gradient value.
  • the method used in this patent is to judge the degree of sharpness of a picture by its sharpness factor, i.e. the average value of the gradient values.
  • S104 Select another pre-processed picture and perform face detection, until all pictures in the picture sequence have been taken.
  • the pre-processed pictures need to be repeatedly selected until all pictures in the picture sequence are taken out before proceeding to the next step.
  • the sharpness factor represents the average value of the gradient values of all pixels in the face area range of the picture, and the larger the sharpness factor, the clearer the picture. By sorting the sharpness factor values of all pictures in the picture sequence, the sharpness order of all pictures in the picture sequence is obtained, from which the clearest picture can be selected.
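  • For example, a sketch of the sorting step; sharpness_factor stands for the gradient-average computation described in S103 (see the calculation sketches further below), and is passed in here rather than being part of the patent's wording.
```python
def sort_by_sharpness(pictures, sharpness_factor):
    """Return (factor, picture) pairs, clearest picture first."""
    scored = [(sharpness_factor(picture), picture) for picture in pictures]
    # A larger sharpness factor means a clearer picture, so sort descending.
    scored.sort(key=lambda item: item[0], reverse=True)
    return scored
```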
  • FIG. 4 is a schematic flowchart of image preprocessing in a method for evaluating sharpness of a face image according to an embodiment of the present invention.
  • the method includes:
  • the noise in the picture refers to unnecessary or redundant interference information existing in the image data. Since the presence of noise seriously affects the quality of the picture, the noise in the picture is removed in the preprocessing stage.
  • a grayscale image is an image expressed in gray levels; converting the pictures in the picture sequence into grayscale images is beneficial for calculating the gradient values of the pixels.
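  • A possible implementation of this pre-processing, assuming OpenCV; the patent does not name a particular denoising filter, so the Gaussian blur and kernel size below are purely illustrative assumptions.
```python
import cv2


def preprocess_to_gray(picture_bgr):
    """Remove noise from a BGR picture and convert it to a grayscale image."""
    # Denoising method and kernel size are assumptions, not specified by the patent.
    denoised = cv2.GaussianBlur(picture_bgr, (3, 3), 0)
    gray = cv2.cvtColor(denoised, cv2.COLOR_BGR2GRAY)
    return gray
```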
  • FIG. 5 is a schematic flowchart of a method for calculating an average value of gradient values in a method for evaluating sharpness of a face image according to an embodiment of the present invention.
  • the method includes:
  • let P(i, j) be the gray value of the pixel in the i-th row and j-th column within the face area range of the picture;
  • P(i+1, j) is the gray value of the pixel in the (i+1)-th row and j-th column;
  • P(i, j+1) is the gray value of the pixel in the i-th row and (j+1)-th column within the face area range.
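  • The following sketch computes the sharpness factor from these definitions using NumPy; mask is the binary face area mask (for instance the one from the face detection sketch above), and pixels on the last row or column simply omit the forward difference there, which is an implementation choice rather than something prescribed by the patent.
```python
import numpy as np


def gray_sharpness_factor(gray, mask):
    """Average of |P(i+1,j)-P(i,j)| + |P(i,j+1)-P(i,j)| over the face area range.

    gray: 2-D grayscale image array.
    mask: 2-D array of the same shape, 1 inside the face area range, 0 outside.
    """
    gray = gray.astype(np.float64)
    grad = np.zeros_like(gray)
    # Difference between adjacent rows: |P(i+1, j) - P(i, j)|.
    grad[:-1, :] += np.abs(gray[1:, :] - gray[:-1, :])
    # Difference between adjacent columns: |P(i, j+1) - P(i, j)|.
    grad[:, :-1] += np.abs(gray[:, 1:] - gray[:, :-1])
    face = mask.astype(bool)
    total = np.count_nonzero(face)   # number of pixels in the face area range
    if total == 0:
        return 0.0
    sum_all = grad[face].sum()       # accumulation result
    return sum_all / total           # sharpness factor = average gradient value
```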
  • the gradient values can be calculated from the gray values of the picture, that is, the pictures in the picture sequence are converted into grayscale images according to the above scheme and the gray values are then used; alternatively, the pictures in the picture sequence can be converted into the YCbCr space for the calculation, in which case the gradient value of each pixel is the sum of the Cb and Cr gradient values of that pixel in the horizontal and vertical directions.
  • FIG. 6 is a schematic flowchart of another image preprocessing in a method for evaluating sharpness of a face image according to an embodiment of the present invention.
  • the method includes:
  • the noise in the picture refers to unnecessary or redundant interference information existing in the image data. Since the presence of noise seriously affects the quality of the picture, the noise in the picture is removed in the pre-processing stage.
  • the YCbCr space map is a kind of color space
  • the Y value is the luma, i.e. the non-linear light intensity;
  • the Cb value and the Cr value are the blue-difference and red-difference chroma (density offset) components.
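  • A sketch of this alternative pre-processing with OpenCV; note that OpenCV's conversion code is COLOR_BGR2YCrCb, i.e. it returns the channels in the order Y, Cr, Cb, so they are reordered below. The denoising filter is again only an assumed choice.
```python
import cv2


def preprocess_to_ycbcr(picture_bgr):
    """Remove noise and convert the picture to the YCbCr space.

    Returns the Y, Cb and Cr planes as separate 2-D arrays.
    """
    denoised = cv2.GaussianBlur(picture_bgr, (3, 3), 0)
    ycrcb = cv2.cvtColor(denoised, cv2.COLOR_BGR2YCrCb)
    y, cr, cb = cv2.split(ycrcb)   # OpenCV orders the channels as Y, Cr, Cb
    return y, cb, cr
```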
  • FIG. 7 is a schematic flowchart of another method for calculating an average value of gradient values in a method for evaluating sharpness of a face image according to an embodiment of the present invention.
  • the method includes:
  • S601 Calculate the absolute values of the horizontal and vertical gradient values of the Cb value and the Cr value of each pixel within the face area range.
  • S603 Accumulate the absolute value sum of the gradient values of the Cb value and the Cr value of the horizontal and vertical directions of all the pixels in the face area to obtain an accumulation result.
  • S604 Divide the accumulation result by the number of all pixels in the face area to obtain the average value of the gradient values of all pixels in the face area.
  • let Cb(i, j) be the Cb value of the pixel in the i-th row and j-th column within the face area range of the picture;
  • Cb(i+1, j) is the Cb value of the pixel in the (i+1)-th row and j-th column;
  • Cb(i, j+1) is the Cb value of the pixel in the i-th row and (j+1)-th column within the face area range;
  • Cr(i, j) is the Cr value of the pixel in the i-th row and j-th column within the face area range of the picture;
  • Cr(i+1, j) is the Cr value of the pixel in the (i+1)-th row and j-th column;
  • Cr(i, j+1) is the Cr value of the pixel in the i-th row and (j+1)-th column within the face area range;
  • for each pixel, the sum of the absolute values of its horizontal and vertical Cb and Cr gradient values is calculated as sumCbCr(i, j); the values of sumCbCr(i, j) of all pixels within the face area range are then accumulated to obtain sumALL, and Total denotes the number of pixels in the face area range;
  • the sharpness factor of the picture is: sumALL / Total.
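  • A sketch of this calculation, reusing the face area mask and the Cb/Cr planes from the earlier sketches; edge pixels again simply omit the forward difference, which is an implementation choice.
```python
import numpy as np


def forward_difference_abs_sum(channel):
    """|C(i+1,j) - C(i,j)| + |C(i,j+1) - C(i,j)| for every pixel of one channel."""
    channel = channel.astype(np.float64)
    grad = np.zeros_like(channel)
    grad[:-1, :] += np.abs(channel[1:, :] - channel[:-1, :])
    grad[:, :-1] += np.abs(channel[:, 1:] - channel[:, :-1])
    return grad


def ycbcr_sharpness_factor(cb, cr, mask):
    """sumALL / Total, where sumCbCr(i, j) adds the Cb and Cr gradient terms."""
    sum_cbcr = forward_difference_abs_sum(cb) + forward_difference_abs_sum(cr)
    face = mask.astype(bool)
    total = np.count_nonzero(face)    # Total: number of pixels in the face area range
    if total == 0:
        return 0.0
    sum_all = sum_cbcr[face].sum()    # sumALL: accumulation over the face area range
    return sum_all / total            # sharpness factor of the picture
```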
  • the gradient values can also be calculated from the gray values of the picture, that is, the pictures in the picture sequence are converted into grayscale images and the gray values are calculated.
  • the pictures in the picture sequence can also be converted into YCbCr space for calculation.
  • the gradient value of each pixel is the sum of the horizontal and vertical Cb and Cr gradient values of the pixel.
  • FIG. 8 is a schematic diagram of a device for evaluating a sharpness of a face image according to an embodiment of the present invention.
  • the device includes:
  • the pre-processing module 10 is used to obtain a picture sequence and pre-process all pictures in the picture sequence.
  • the face detection module 20 is used to select a pre-processed picture for face detection to obtain a face detection frame, and to perform face feature point detection in the face detection frame to identify a preset number of face feature points; a closed face area range is obtained by connecting the face feature points at the preset labels.
  • the calculation module 30 is used to accumulate the sum of the absolute values of the horizontal and vertical gradient values of all pixels in the face area range to obtain an accumulation result, and to divide the accumulation result by the number of pixels in the face area range to obtain the average value of the gradient values of all pixels in the face area range, the average value being taken as the sharpness factor of the picture.
  • the loop module 40 is configured to select the next pre-processed picture for face detection until all pictures in the picture sequence have been taken.
  • the sorting module 50 is used to sort the sharpness factor values of all pictures in the picture sequence to obtain the sharpness order of all pictures in the picture sequence.
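  • Purely as an illustration of how these modules could be composed, the class and method names below are hypothetical, and the helper functions are the ones sketched earlier in this description.
```python
class FaceImageSharpnessEvaluator:
    """Hypothetical composition of modules 10-50 described above."""

    def __init__(self, preprocess, detect_face_area, sharpness_factor):
        self.preprocess = preprocess              # pre-processing module 10
        self.detect_face_area = detect_face_area  # face detection module 20
        self.sharpness_factor = sharpness_factor  # calculation module 30

    def evaluate(self, pictures):
        """Loop module 40 plus sorting module 50: score every picture and sort."""
        results = []
        for picture in pictures:                  # loop over the picture sequence
            processed = self.preprocess(picture)
            mask = self.detect_face_area(processed)
            if mask is None:                      # no face found; skip this picture
                continue
            results.append((self.sharpness_factor(processed, mask), picture))
        results.sort(key=lambda item: item[0], reverse=True)
        return results                            # clearest picture first
```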
  • the pre-processing module 10 includes:
  • the judgment module 101 is used to judge whether the signal input source is a video or a picture sequence.
  • the decoding module 102 is configured to obtain the picture sequence after decoding the video into a picture sequence if the signal input source is video.
  • the obtaining module 103 is configured to directly obtain a picture sequence if the signal input source is a picture sequence.
  • the denoising module 104 is used to sequentially select each picture in the picture sequence to remove noise in each picture.
  • the grayscale image module 105 is used to convert the noise-removed picture into a grayscale image.
  • the calculation module 30 includes:
  • the first absolute value calculation module 301 is configured to calculate the absolute value of the gradient value of the gray value of the horizontal direction and the vertical direction of each pixel in the face area range.
  • the first absolute value sum module 302 is used to calculate the sum of the absolute values of the gradient values of the gray value of the horizontal direction and the vertical direction of each pixel in the face area range.
  • the first accumulation module 303 is used to accumulate the absolute value sum of the gradient values of the horizontal and vertical gray values of all the pixels in the face area to obtain an accumulation result.
  • the first average calculation module 304 is used to divide the accumulation result by the number of all pixels in the face area to obtain the average value of the gradient values of all pixels in the face area.
  • the gradient values can be calculated from the gray values of the picture, that is, the gray values are calculated after converting the pictures in the picture sequence into grayscale images according to the above device; alternatively, the pictures in the picture sequence can be converted into the YCbCr space for the calculation.
  • the gradient value of each pixel is the sum of the Cb and Cr gradient values of the pixel in the horizontal and vertical directions. Please refer to FIG. 9.
  • FIG. 9 is another schematic diagram of the face image sharpness evaluation device, in which case the structure of the pre-processing module 10 may also include:
  • the judgment module 101 is used to judge whether the signal input source is a video or a picture sequence.
  • the decoding module 102 is configured to obtain the picture sequence after decoding the video into a picture sequence if the signal input source is video.
  • the obtaining module 103 is configured to directly obtain a picture sequence if the signal input source is a picture sequence.
  • the denoising module 104 is used to sequentially select each picture in the picture sequence to remove noise in each picture.
  • the YCbCr space map module 106 is used to convert the noise-removed picture into a YCbCr space map.
  • the structure of the calculation module 30 may further include:
  • the second absolute value calculation module 305 is configured to calculate the absolute values of the gradient values of the Cb value and the Cr value of the horizontal direction and the vertical direction of each pixel in the face area range, respectively.
  • the second absolute value sum module 306 is used to calculate the sum of the absolute values of the gradient values of the Cb value and the Cr value of each pixel in the horizontal and vertical directions within the face area range.
  • the second accumulation module 307 is used to accumulate the absolute value sum of the gradient values of the Cb value and the Cr value of the horizontal and vertical directions of all the pixels in the face area to obtain the accumulation result.
  • the second average calculation module 308 is used to divide the accumulated result by the number of all pixels in the face area to obtain the average value of the gradient values of all pixels in the face area.
  • An embodiment of the present invention provides a face image sharpness evaluation device.
  • a face area range is obtained, and the sum of the absolute values of the horizontal and vertical gradient values of all pixels within the face area range is then accumulated to obtain an accumulation result;
  • the accumulation result is divided by the number of pixels in the face area range to obtain the average value of the gradient values of all pixels in the face area range, and the average value is taken as the sharpness factor of the picture;
  • the sharpness factor values of all pictures are sorted to obtain the sharpness order of all pictures. Since the face sharpness is evaluated by calculating the average value of the gradient values of all pixels in the face area range, on the one hand the calculation is simplified and the calculation efficiency is improved; on the other hand, only the face area range is used as the calculation range, which makes the evaluation of the sharpness of the face image more accurate.
  • the disclosed device and method may be implemented in other ways.
  • the device embodiments described above are only schematic.
  • the division of modules is only a division of logical functions.
  • in practice there may be other divisions; for example, multiple modules or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or modules, and may be in electrical, mechanical, or other forms.
  • modules described as separate components may or may not be physically separated, and the components displayed as modules may or may not be physical modules, that is, they may be located in one place, or may be distributed on multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional module in each embodiment of the present invention may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
  • the above integrated modules may be implemented in the form of hardware or software function modules.
  • if the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may be stored in a computer-readable storage medium.
  • the technical solution of the present invention, in essence, or the part that contributes to the existing technology, or all or part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods of the various embodiments of the present invention.
  • the aforementioned storage media include: USB flash drives, removable hard disks, read-only memory (ROM), random access memory (RAM), magnetic disks, optical disks, and other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to the technical field of image processing, and provides a method for evaluating the sharpness of a face image, comprising: obtaining a face area range by performing face detection on a pre-processed picture, then calculating the average value of the gradient values of all pixels within the face area range, taking the average value as the sharpness factor of the pictures, and sorting the sharpness factor values of all pictures to obtain the sharpness order of all pictures. The present invention also provides a device for evaluating the sharpness of a face image. The sharpness of the face image is evaluated by calculating the average value of the gradient values of all pixels within the face area range. Consequently, on the one hand, the calculation can be simplified and the calculation efficiency improved; on the other hand, only the face area range is used as the calculation range, so that the evaluation of the sharpness of the face image is more accurate.
PCT/CN2018/113361 2018-11-01 2018-11-01 Procédé et dispositif d'évaluation de la résolution d'une image de visage WO2020087434A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/113361 WO2020087434A1 (fr) 2018-11-01 2018-11-01 Procédé et dispositif d'évaluation de la résolution d'une image de visage

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/113361 WO2020087434A1 (fr) 2018-11-01 2018-11-01 Procédé et dispositif d'évaluation de la résolution d'une image de visage

Publications (1)

Publication Number Publication Date
WO2020087434A1 (fr) 2020-05-07

Family

ID=70463524

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/113361 WO2020087434A1 (fr) 2018-11-01 2018-11-01 Procédé et dispositif d'évaluation de la résolution d'une image de visage

Country Status (1)

Country Link
WO (1) WO2020087434A1 (fr)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100202697A1 (en) * 2009-02-10 2010-08-12 Seiko Epson Corporation Specifying position of characteristic portion of face image
CN103942525A (zh) * 2013-12-27 2014-07-23 高新兴科技集团股份有限公司 一种基于视频序列的实时人脸优选方法
CN106803067A (zh) * 2016-12-28 2017-06-06 浙江大华技术股份有限公司 一种人脸图像质量评估方法及装置
CN107665361A (zh) * 2017-09-30 2018-02-06 珠海芯桥科技有限公司 一种基于人脸识别的客流计数方法

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113177881A (zh) * 2021-04-28 2021-07-27 广州光锥元信息科技有限公司 提升图片清晰度的处理方法及装置
CN113177881B (zh) * 2021-04-28 2023-10-27 广州光锥元信息科技有限公司 提升图片清晰度的处理方法及装置
CN116630220A (zh) * 2023-07-25 2023-08-22 江苏美克医学技术有限公司 一种荧光图像景深融合成像方法、装置及存储介质
CN116630220B (zh) * 2023-07-25 2023-11-21 江苏美克医学技术有限公司 一种荧光图像景深融合成像方法、装置及存储介质
CN117472677A (zh) * 2023-12-26 2024-01-30 深圳市魔力信息技术有限公司 人脸识别设备的测试方法和测试系统
CN117472677B (zh) * 2023-12-26 2024-03-08 深圳市魔力信息技术有限公司 人脸识别设备的测试方法和测试系统

Similar Documents

Publication Publication Date Title
CN110705583B (zh) 细胞检测模型训练方法、装置、计算机设备及存储介质
Hu et al. An overview of underwater vision enhancement: From traditional methods to recent deep learning
CN111524106B (zh) 颅骨骨折检测和模型训练方法、装置、设备和存储介质
US9740967B2 (en) Method and apparatus of determining air quality
WO2022001509A1 (fr) Procédé et appareil d'optimisation d'image, support de stockage informatique, et dispositif électronique
US11600008B2 (en) Human-tracking methods, systems, and storage media
CN110189336B (zh) 图像生成方法、系统、服务器及存储介质
CN112308095A (zh) 图片预处理及模型训练方法、装置、服务器及存储介质
CN110781885A (zh) 基于图像处理的文本检测方法、装置、介质及电子设备
TW202014984A (zh) 一種圖像處理方法、電子設備及存儲介質
CN109389569B (zh) 基于改进DehazeNet的监控视频实时去雾方法
WO2020087434A1 (fr) Procédé et dispositif d'évaluation de la résolution d'une image de visage
CN111462120A (zh) 一种基于语义分割模型缺陷检测方法、装置、介质及设备
CN109711268B (zh) 一种人脸图像筛选方法及设备
Hu et al. A multi-stage underwater image aesthetic enhancement algorithm based on a generative adversarial network
WO2022116104A1 (fr) Procédé et appareil de traitement d'image, dispositif, et support de stockage
CN111784658B (zh) 一种用于人脸图像的质量分析方法和系统
CN114298985B (zh) 缺陷检测方法、装置、设备及存储介质
CN116071315A (zh) 一种基于机器视觉的产品可视缺陷检测方法及系统
CN115601820A (zh) 一种人脸伪造图像检测方法、装置、终端及存储介质
JP2022133378A (ja) 顔生体検出方法、装置、電子機器、及び記憶媒体
Babu et al. An efficient image dahazing using Googlenet based convolution neural networks
CN114581318A (zh) 一种低照明度图像增强方法及系统
Zheng et al. Overwater image dehazing via cycle-consistent generative adversarial network
CN109409305A (zh) 一种人脸图像清晰度评价方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18938494

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18938494

Country of ref document: EP

Kind code of ref document: A1