CN114295561B - Color difference measuring method and system based on device-independent color space - Google Patents

Color difference measuring method and system based on device-independent color space

Publication number: CN114295561B (other version: CN114295561A); authority: CN (China)
Application number: CN202210201793.8A; original language: Chinese (zh)
Legal status: Active
Prior art keywords: sequence, image, color, detection area, positioning
Inventors: 孟然, 柴华, 林锐斌, 贾勇, 王哲
Assignee: Beijing Smarter Eye Technology Co Ltd

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a color difference measurement method and system based on a device-independent color space. The method comprises the following steps: creating a reference image and a training image; using the training image, each detection area and each positioning kernel to obtain and store a standard multi-dimensional component mean sequence for each detection area; based on a real-time image, calculating the offset of each positioning kernel's position relative to its pre-stored position on the reference image; obtaining, from the position offset of each positioning kernel, a sub-image sequence of each detection area based on the real-time image; calculating a multi-dimensional component mean sequence of each detection area based on the sub-image sequence; and calculating the color difference value sequence and the color direction sequence of each detection area according to the multi-dimensional component mean sequence and the standard multi-dimensional component mean sequence. The method solves the technical problems of poor color difference measurement accuracy and difficult color difference measurement in the prior art.

Description

Color difference measuring method and system based on device-independent color space
Technical Field
The invention relates to the technical field of color difference measurement, and in particular to a color difference measurement method and system based on a device-independent color space.
Background
With social and economic development, people's requirements on the quality of printed matter are becoming higher and higher. For most high-end printed products, for example, the consistency between the product ink color and the standard sample ink color receives more and more attention, and methods for measuring and controlling the ink color difference (hereinafter referred to as "color difference") of printed products have become important.
Before the color difference measurement, an image to be measured needs to be acquired by a camera, but color data output by the camera is RGB color space data. In applications involving color representation and color difference measurement, the RGB color space has two fatal disadvantages: 1) the RGB color space is a device-dependent color space; 2) the RGB color space is a non-uniform color space. As a device-dependent color space, RGB color data obtained by imaging the same object under different imaging devices and different light sources may be different, which may cause ambiguity in color representation of the object. As a non-uniform color space, the color difference in the RGB color space is not only related to the euclidean distance between the color coordinates, but also related to the specific position of the color coordinates, which brings much inconvenience to the measurement of the color difference, resulting in poor accuracy of the color difference measurement and difficulty in the color difference measurement.
Disclosure of Invention
Therefore, the invention provides a color difference measurement method and system based on a device-independent color space, aiming to solve the technical problems that in the prior art the color difference measurement accuracy is poor and the color difference measurement is difficult.
In order to achieve the above object, the embodiments of the present invention provide the following technical solutions:
a method of color difference measurement based on a device-independent color space, the method comprising:
acquiring a normal printed matter image, creating a reference image based on the normal printed matter image, and delineating at least one detection area and at least one positioning kernel on the reference image, wherein each positioning kernel has a positioning association with at least one detection area;
acquiring a standard color sample image, and taking the standard color sample image as a training image;
acquiring and storing a standard multi-dimensional component mean sequence of each detection area by using the training image, each detection area and each positioning kernel;
acquiring a real-time image of a to-be-detected printed matter;
based on the real-time image, performing positioning calculation with each positioning kernel to obtain the position of each positioning kernel on the real-time image, and calculating the offset of that position relative to the pre-stored position on the reference image;
performing integer-pixel shift and sub-pixel interpolation calculation on the position of each detection area associated with each positioning kernel, based on the position offset of that positioning kernel, so as to obtain a sub-image sequence of each detection area based on the real-time image;
calculating a multi-dimensional component mean sequence of each detection area based on the sub-image sequence;
and calculating the color difference value sequence and the color direction sequence of each detection area according to the multi-dimensional component mean sequence and the standard multi-dimensional component mean sequence.
Further, the positioning calculation with each positioning kernel specifically includes:
performing the positioning calculation of each positioning kernel by surface matching, edge matching or geometric matching, and recording the obtained positioning result as
{(x_i, y_i)}, i = 1, 2, ..., n
where (x_i, y_i) is the coordinate of positioning kernel i on the real-time image, n is the number of positioning kernels, and n is a positive integer.
Further, the offset of the position relative to the pre-stored position on the reference image is calculated using the following formula:
(Δx_i, Δy_i) = (x_i - x_i^0, y_i - y_i^0)
where (x_i, y_i) is the coordinate of the positioning kernel on the real-time image and (x_i^0, y_i^0) is the coordinate of the positioning kernel on the reference image.
Further, performing integer-pixel shift and sub-pixel interpolation calculation on the position of each detection area associated with each positioning kernel, based on the position offset of that positioning kernel, to obtain a sub-image sequence of each detection area based on the real-time image, specifically includes:
cycling through each detection area, performing integer-pixel shift and sub-pixel interpolation calculation on the position of the detection area according to the association between the detection area and its positioning kernel, to obtain the sub-image sequence of each detection area based on the real-time image, recorded as
{P_j}, j = 1, 2, ..., m
where m is the number of detection areas and each element corresponds to the portion of the real-time image covered by one detection area.
Further, calculating the multi-dimensional component mean sequence of each detection area based on the sub-image sequence specifically includes:
cycling through the sub-image sequence;
obtaining, by using the color calibration matrix of the off-line detection device, the L component image sequence {L_j}, the a component image sequence {a_j} and the b component image sequence {b_j} of each detection area in the device-independent color space, where j = 1, 2, ..., m;
obtaining the saturation S component image sequence {S_j} and the brightness I component image sequence {I_j} of each detection area by using a pre-stored conversion formula, where j = 1, 2, ..., m;
filtering the invalid points in the detection areas to obtain all valid points;
cycling through all the detection-area L component images and averaging to obtain the L component mean sequence {L̄_j} based on the real-time image;
cycling through all the detection-area a component images and averaging to obtain the a component mean sequence {ā_j} based on the real-time image;
cycling through all the detection-area b component images and averaging to obtain the b component mean sequence {b̄_j} based on the real-time image;
cycling through all the detection-area I component images and averaging to obtain the brightness I component mean sequence {Ī_j} based on the real-time image, where j = 1, 2, ..., m.
further, calculating a color difference value sequence and a color direction sequence of each detection area according to the multi-dimensional component mean sequence and the standard multi-dimensional component mean sequence, specifically including:
based on the color difference training database, obtaining an L component mean value sequence based on a training image
Figure 493855DEST_PATH_IMAGE020
A component mean sequence
Figure 631575DEST_PATH_IMAGE021
B component mean sequence
Figure 890518DEST_PATH_IMAGE022
And luminance I component mean sequence
Figure 379268DEST_PATH_IMAGE023
Wherein
Figure 585122DEST_PATH_IMAGE024
calculating the color difference value sequence of each detection area by the following formula
Figure 108507DEST_PATH_IMAGE025
And color direction sequence
Figure 272772DEST_PATH_IMAGE026
Figure 514398DEST_PATH_IMAGE027
Wherein,
Figure 523942DEST_PATH_IMAGE028
Figure 901834DEST_PATH_IMAGE029
is a color difference value sequence;
Figure 502579DEST_PATH_IMAGE030
is a color squareAnd (4) direction sequence.
Further, using the training image, each of the detection regions, and each of the localization kernels, obtaining and storing a standard multidimensional component mean sequence of each of the detection regions, specifically including:
on the training image, performing positioning calculation by using each positioning core to obtain the position of each positioning core on the training image, and calculating the offset of the position of each positioning core relative to the position of each positioning core on a reference image;
performing integer-pixel shift and sub-pixel interpolation calculation on the position of each detection area associated with the positioning kernel, based on the position offset of the positioning kernel, to obtain a sub-image sequence of each detection area based on the training image;
and calculating the multi-dimensional component mean sequence of each detection area based on the sub-graph sequence, and storing the multi-dimensional component mean sequence as a standard multi-dimensional component mean sequence.
The present invention also provides a color difference measurement system based on an apparatus-independent color space, the system comprising:
the template creating unit is used for acquiring a normal printed matter image, creating a reference image based on the normal printed matter image, and dividing at least one detection area and at least one positioning core on the reference image, wherein each positioning core and at least one detection area have a positioning association relationship;
the parameter training unit is used for acquiring a standard color sample image, taking the standard color sample image as a training image, and acquiring and storing a standard multi-dimensional component mean sequence of each detection area by using the training image, each detection area and each positioning kernel;
the real-time image acquisition unit is used for acquiring a real-time image of the to-be-detected printed matter;
the offset calculation unit is used for performing positioning calculation by using each positioning core based on the real-time image so as to obtain the position of each positioning core on the real-time image, and calculating the offset of the position relative to the pre-stored position on the reference image;
a sub-image sequence calculation unit, configured to perform integer-pixel shift and sub-pixel interpolation calculation on the position of each detection area associated with each positioning kernel, based on the position offset of that positioning kernel, so as to obtain a sub-image sequence of each detection area based on the real-time image;
a mean sequence calculation unit, configured to calculate a multi-dimensional component mean sequence of each of the detection regions based on the sub-image sequence;
and the result output unit is used for calculating the color difference value sequence and the color direction sequence of each detection area according to the multi-dimensional component mean value sequence and the standard multi-dimensional component mean value sequence.
The present invention also provides an intelligent terminal, including: the device comprises a data acquisition device, a processor and a memory;
the data acquisition device is used for acquiring data; the memory is to store one or more program instructions; the processor is configured to execute one or more program instructions to perform the method as described above.
The present invention also provides a computer readable storage medium having embodied therein one or more program instructions for executing the method as described above.
According to the color difference measurement method and system based on the device-independent color space, a normal printed matter image is acquired, a reference image is created based on the normal printed matter image, and at least one detection area and at least one positioning kernel are delineated on the reference image, each positioning kernel having a positioning association with at least one detection area; a standard color sample image is acquired and taken as a training image; a standard multi-dimensional component mean sequence of each detection area is acquired and stored by using the training image, each detection area and each positioning kernel; a real-time image of the printed matter to be measured is acquired; based on the real-time image, positioning calculation is performed with each positioning kernel to obtain the position of each positioning kernel on the real-time image, and the offset of that position relative to the pre-stored position on the reference image is calculated; integer-pixel shift and sub-pixel interpolation calculation are performed on the position of each detection area associated with each positioning kernel, based on the position offset of that positioning kernel, to obtain the sub-image sequence of all detection areas based on the real-time image; a multi-dimensional component mean sequence of each detection area is calculated based on the sub-image sequence; and the color difference value sequence and the color direction sequence of each detection area are calculated according to the multi-dimensional component mean sequence and the standard multi-dimensional component mean sequence.
Thus, the invention obtains the color difference value sequence and the color direction sequence of each detection area from the multi-dimensional component mean sequence and the pre-created color difference training database, thereby providing accurate data support for subsequent ink consistency control and achieving high color difference measurement accuracy. At the same time, the pre-created color difference training database effectively reduces the amount of calculation and the difficulty of color difference measurement, solving the technical problems of poor color difference measurement accuracy and difficult color difference measurement in the prior art.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It should be apparent that the drawings in the following description are merely exemplary, and that other embodiments can be derived from the drawings provided by those of ordinary skill in the art without inventive effort.
The structures, ratios and sizes shown in this specification are used only to match the contents disclosed in the specification for the understanding of those skilled in the art; they are not intended to limit the conditions under which the invention can be implemented and have no technical significance in themselves. Any structural modification, change of ratio or adjustment of size that does not affect the effects and objectives achievable by the invention still falls within the scope covered by the technical contents disclosed herein.
FIG. 1 is a flowchart of an embodiment of a color difference measurement method based on a device-independent color space according to the present invention;
FIG. 2 is a flow chart of the calculation of the multi-dimensional component mean sequence in the method of FIG. 1;
FIG. 3 is a partial schematic view of a reference image;
FIG. 4 is a schematic diagram of a modeling process;
FIG. 5 is a schematic diagram of a reference image with a mask;
FIG. 6 is a flow chart of a training process;
FIG. 7 is a flow chart of a training process in a particular scenario;
FIG. 8 is a graph of the effect of threshold segmentation on saturation components of detection regions;
fig. 9 is a block diagram of a color difference measurement system based on a device-independent color space according to an embodiment of the present invention.
Detailed Description
The present invention is described in terms of particular embodiments, and other advantages and features of the invention will become apparent to those skilled in the art from the following disclosure. It is to be understood that the described embodiments are merely exemplary of the invention and are not intended to limit it to the particular embodiments disclosed. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
In one embodiment, as shown in fig. 1, the method for measuring color difference based on device-independent color space provided by the present invention includes the following steps:
S101: acquiring a normal printed matter image, creating a reference image based on the normal printed matter image, and delineating at least one detection area and at least one positioning kernel on the reference image, wherein each positioning kernel has a positioning association with at least one detection area;
S102: acquiring a standard color sample image, and taking the standard color sample image as a training image;
S103: acquiring and storing a standard multi-dimensional component mean sequence of each detection area by using the training image, each detection area and each positioning kernel;
S104: acquiring a real-time image of the printed matter to be measured. In the measurement process, the template information stored in the modeling process and the training result stored in the training process need to be read; after the image of the printed matter to be measured (i.e. the real-time image) is obtained, the subsequent calculation is performed.
S105: and based on the real-time image, performing positioning calculation by using each positioning core to obtain the position of each positioning core on the real-time image, and calculating the offset of the position relative to the pre-stored position on the reference image.
In actual use, it cannot be guaranteed that the printed matter to be measured is placed at exactly the same position on the detection table of the off-line detection device each time, so its absolute position in the image varies. Therefore, on the real-time image, the positioning calculation of all positioning kernels is performed first, and the position offsets of all positioning kernels relative to the reference image are calculated.
Specifically, the positioning calculation with each positioning kernel includes the following steps:
performing the positioning calculation of each positioning kernel by surface matching, edge matching or geometric matching, and recording the obtained positioning result as
{(x_i, y_i)}, i = 1, 2, ..., n
where (x_i, y_i) is the coordinate of positioning kernel i on the real-time image, n is the number of positioning kernels, and n is a positive integer.
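As an illustration of the surface-matching variant, the sketch below locates a positioning kernel by exhaustive normalized cross-correlation. The function name, the brute-force search and the NCC score are assumptions for illustration only; the patent does not disclose its matcher, and a production system would restrict the search to the configured search range and use a faster (pyramid or FFT-based) scheme.

```python
import numpy as np

def locate_kernel(image, kernel):
    """Locate a positioning kernel in a grayscale image by exhaustive
    normalized cross-correlation (a simple form of surface matching).
    Returns the (x, y) of the best-matching top-left corner."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    k = kernel - kernel.mean()
    k_energy = (k * k).sum()
    best_score, best_pos = -np.inf, (0, 0)
    for y in range(ih - kh + 1):
        for x in range(iw - kw + 1):
            patch = image[y:y + kh, x:x + kw]
            p = patch - patch.mean()
            denom = np.sqrt((p * p).sum() * k_energy)
            score = (p * k).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (x, y)
    return best_pos
```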
Further, the offset of the position relative to the pre-stored position on the reference image is calculated using the following formula:
(Δx_i, Δy_i) = (x_i - x_i^0, y_i - y_i^0)
where (x_i, y_i) is the coordinate of the positioning kernel on the real-time image and (x_i^0, y_i^0) is the coordinate of the positioning kernel on the reference image.
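The offset formula amounts to a per-kernel coordinate subtraction; a minimal sketch (function and argument names are illustrative):

```python
def kernel_offsets(live_positions, reference_positions):
    """Offset of each positioning kernel on the live image relative to its
    stored reference position: (dx_i, dy_i) = (x_i - x_i^0, y_i - y_i^0)."""
    return [(x - x0, y - y0)
            for (x, y), (x0, y0) in zip(live_positions, reference_positions)]
```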
S106: performing integer-pixel shift and sub-pixel interpolation calculation on the position of each detection area associated with each positioning kernel, based on the position offset of that positioning kernel, so as to obtain a sub-image sequence of each detection area based on the real-time image. Specifically, when the positioning calculation of all positioning kernels on the real-time image is completed, each detection area is cycled through, and integer-pixel shift and sub-pixel interpolation are performed on its position according to the association between the detection area and its positioning kernel, to obtain a small-image sequence of each detection area based on the real-time image, namely the sub-image sequence.
In some embodiments, obtaining the sub-image sequence of each detection area based on the real-time image specifically includes:
cycling through each detection area, performing integer-pixel shift and sub-pixel interpolation calculation on the position of the detection area according to the association between the detection area and its positioning kernel, to obtain the sub-image sequences of all detection areas based on the real-time image, recorded as
{P_j}, j = 1, 2, ..., m
where m is the number of detection areas and each element corresponds to the portion of the real-time image covered by one detection area.
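A minimal sketch of the integer-pixel shift plus sub-pixel interpolation used to cut one detection-area sub-image out of the real-time image. The patent does not fix the interpolation method; bilinear interpolation is a common choice and is assumed here, as are the function and parameter names.

```python
import numpy as np

def extract_subimage(image, x0, y0, w, h, dx, dy):
    """Cut the w-by-h detection area anchored at (x0, y0) on the reference
    image out of a live image, applying the kernel offset (dx, dy): the
    integer part of the offset is a whole-pixel shift, the fractional part
    is resolved by bilinear (sub-pixel) interpolation."""
    ix, iy = int(np.floor(x0 + dx)), int(np.floor(y0 + dy))
    fx, fy = (x0 + dx) - ix, (y0 + dy) - iy
    out = np.empty((h, w), dtype=float)
    for r in range(h):
        for c in range(w):
            x, y = ix + c, iy + r
            # Bilinear blend of the four neighbouring pixels.
            out[r, c] = ((1 - fx) * (1 - fy) * image[y, x]
                         + fx * (1 - fy) * image[y, x + 1]
                         + (1 - fx) * fy * image[y + 1, x]
                         + fx * fy * image[y + 1, x + 1])
    return out
```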
S107: calculating a multi-dimensional component mean sequence of each detection area based on the sub-image sequence.
As shown in fig. 2, step S107 specifically includes the following steps:
S1071: cycling through the sub-image sequence.
After the sub-image sequences of all detection areas based on the real-time image are obtained, the L component image sequence, the a component image sequence, the b component image sequence, the saturation S component image sequence and the brightness I component image sequence of the detection areas are further obtained by calculation:
S1072: obtaining, by using the color calibration matrix of the off-line detection device, the L component image sequence {L_j}, the a component image sequence {a_j} and the b component image sequence {b_j} of each detection area in the device-independent color space, where j = 1, 2, ..., m.
S1073: obtaining the saturation S component image sequence {S_j} and the brightness I component image sequence {I_j} of each detection area by using a pre-stored conversion formula, where j = 1, 2, ..., m.
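The patent's per-device color calibration matrix and its S/I conversion formula are not reproduced in this text, so the sketch below substitutes well-known stand-ins: the standard sRGB-to-XYZ matrix (D65) followed by the CIE XYZ-to-Lab transform for L, a, b, and the HSI model for saturation S and intensity I. All names and constants here are assumptions for illustration, not the patent's actual calibration.

```python
import numpy as np

# Assumed stand-in: standard linear-sRGB to XYZ matrix (D65 white).
M_RGB2XYZ = np.array([[0.4124, 0.3576, 0.1805],
                      [0.2126, 0.7152, 0.0722],
                      [0.0193, 0.1192, 0.9505]])
WHITE = M_RGB2XYZ @ np.ones(3)  # XYZ of the reference white

def rgb_to_lab(rgb):
    """Map one linear-RGB pixel (values in [0, 1]) to CIELAB."""
    xyz = M_RGB2XYZ @ np.asarray(rgb, dtype=float)
    t = xyz / WHITE
    # Standard CIELAB companding function.
    f = np.where(t > (6 / 29) ** 3, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    return 116 * f[1] - 16, 500 * (f[0] - f[1]), 200 * (f[1] - f[2])

def rgb_to_si(rgb):
    """Saturation S and intensity I of the HSI model, a plausible choice
    for the 'pre-stored conversion formula' the patent leaves unspecified."""
    r, g, b = (float(v) for v in rgb)
    i = (r + g + b) / 3
    s = 0.0 if i == 0 else 1 - min(r, g, b) / i
    return s, i
```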
S1074: filtering the invalid points in the detection areas to obtain all valid points. Specifically, all the detection-area saturation S component image sequences are cycled through, each sub-image is segmented with a percentage threshold, low-saturation pixels are marked as invalid points, and high-saturation pixels are marked as valid points.
S1075: cycling through all the detection-area L component images and averaging to obtain the L component mean sequence {L̄_j} based on the real-time image.
S1076: cycling through all the detection-area a component images and averaging to obtain the a component mean sequence {ā_j} based on the real-time image.
S1077: cycling through all the detection-area b component images and averaging to obtain the b component mean sequence {b̄_j} based on the real-time image.
S1078: cycling through all the detection-area I component images and averaging to obtain the brightness I component mean sequence {Ī_j} based on the real-time image, where j = 1, 2, ..., m.
In the process of calculating the mean values, all the detection-area L component images are cycled through, and the mean value is calculated for each sub-image to obtain the L component mean sequence, recorded as {L̄_j}. The pixels participating in the mean calculation must meet two conditions: first, the pixel must lie within the area covered by the color-separation mask of the detection area; second, the pixel must be a valid point segmented from the corresponding sub-image of the saturation S component image sequence. In the same way, the a component mean sequence {ā_j}, the b component mean sequence {b̄_j} and the brightness I component mean sequence {Ī_j} based on the real-time image are obtained, where j = 1, 2, ..., m.
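Steps S1074 to S1078 can be sketched as a percentage-threshold segmentation on the saturation sub-image followed by a mean restricted to masked, valid pixels. The percentage kept and all names are illustrative assumptions; the patent only states that low-saturation pixels are invalid and that means run over mask-covered valid points.

```python
import numpy as np

def valid_mask(saturation, keep_percent=50.0):
    """Segment one saturation sub-image with a percentage threshold:
    the `keep_percent` most saturated pixels become valid points,
    low-saturation (near-white) pixels become invalid points."""
    thresh = np.percentile(saturation, 100.0 - keep_percent)
    return saturation >= thresh

def masked_mean(component, color_mask, valid):
    """Mean of one component sub-image over pixels that are both inside
    the color-separation mask and segmented as valid points."""
    sel = color_mask & valid
    return float(component[sel].mean())
```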
S108: and calculating the color difference value sequence and the color direction sequence of each detection area according to the multi-dimensional component mean value sequence and the standard multi-dimensional component mean value sequence.
Through the above calculation, the real-time-image-based mean sequences {L̄_j}, {ā_j}, {b̄_j} and {Ī_j} are obtained. Then the pre-stored color difference training database is read to obtain the training-image-based mean sequences {L̄_j^0}, {ā_j^0}, {b̄_j^0} and {Ī_j^0}, where j = 1, 2, ..., m. The color difference value sequence {ΔE_j} and the color direction sequence {D_j} of each detection area can then be obtained by the following calculation:
ΔE_j = sqrt((L̄_j - L̄_j^0)^2 + (ā_j - ā_j^0)^2 + (b̄_j - b̄_j^0)^2)
D_j = 1 if Ī_j > Ī_j^0, and 0 otherwise
The color difference sequence {ΔE_j} represents the color difference of the printed matter under test relative to the standard color sample at each detection area. Its greatest advantage is that the numerical value is consistent with human visual perception; however, the color difference is a distance value and has no direction. The direction of the color can be read from the color direction sequence {D_j}: when D_j is 1, the color of the printed matter at the detection area is lighter than that of the standard color sample; otherwise it is darker. After the color direction value sequence and the color difference value sequence are obtained, they can be used to guide the adjustment of the printing equipment so as to maintain the ink color consistency of the printed matter.
In the above embodiment, before the color difference measurement, a template for color difference measurement is established through pre-modeling and training: the modeling process establishes the template and sets the relevant detection parameters, and the training process uses a standard sample to obtain the standard color values of the detection areas. The measurement process then measures the color difference of the printed matter under test using the detection template, detection parameters, standard color values and other information.
Specifically, as shown in fig. 3 and 5, the apparatus for color representation or color difference measurement is generally an off-line inspection apparatus independent of the printing equipment, because the illumination of the off-line inspection apparatus is more constant, its spectral composition is more stable, and its imaging environment is superior. By contrast, on-line inspection equipment mounted on the printing press suffers from a contaminated light source, large fluctuations, and constraints on how flat the printed product can be held.
In this embodiment, referring to fig. 4, the modeling process specifically includes the following steps:
Collecting a reference image: a completely normally printed product is placed on the off-line detection device, and an image of it is collected as the reference for the modeling process; this image is called the reference image. In practice, printed matter is generally printed as a large sheet; for convenience of description, only a part of the printed matter is shown.
Drawing the detection areas and positioning kernels: on the reference image, the detection areas and positioning kernels are drawn (i.e. their positions are specified). A detection area is an image area that participates in the calculation during training and measurement.
Generally, a corresponding detection area is drawn for each ink key and each color separation, to calculate the color difference of that color separation within the range of the ink key and thereby guide how the ink of the ink key is adjusted for the corresponding color separation. The ink key is the minimum width unit of ink color adjustment; one printed product is usually composed of several tens of ink key widths. A color separation refers to the pattern printed by the ink in the same ink fountain.
In fig. 3 and 5, the solid-line rectangles of different sizes numbered 1, 2, and 3 are the detection areas corresponding to three different color ranks. The patterns covered by these detection areas are calculated to obtain the color difference of the corresponding color rank within each area. The dashed rectangle numbered A is a positioning kernel, which is used to perform positioning on the training image (training process) and the real-time image (measurement process) to obtain a positioning offset, which then guides the associated detection areas to shift by the same amount. One detection area may use the positioning results of one or more positioning kernels, and several detection areas may use the positioning result of one positioning kernel. The association between detection areas and positioning kernels is specified during modeling; for example, the model may specify that detection areas 1, 2, and 3 use the positioning result of positioning kernel A.
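The one-to-many association between positioning kernels and detection areas can be sketched as a simple lookup table. All names here (kernel "A", the region identifiers, the function name) are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch of the kernel-to-region association set up during modeling.
# One positioning kernel's offset may drive several detection areas.
kernel_to_regions = {
    "A": ["region_1", "region_2", "region_3"],  # kernel A positions areas 1-3
}

def regions_for_kernel(kernel_id):
    """Detection areas that are shifted by this kernel's positioning offset."""
    return kernel_to_regions.get(kernel_id, [])

print(regions_for_kernel("A"))  # ['region_1', 'region_2', 'region_3']
```

In a real template database this mapping would be stored alongside the drawn rectangles and masks, but the lookup itself is this simple.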
Drawing a detection area mask; the detection area covers a pattern region, most of which usually belongs to the color rank to be measured, but some patterns may belong to other color ranks. To prevent those patterns from influencing the color measurement result, they are usually excluded by drawing an irregularly shaped region called a "mask". For differentiation, the color-rank mask is shown semi-transparently in fig. 5; during training and measurement, each detection area is calculated only over the pattern area covered by its color-rank mask.
Setting detection parameters and positioning parameters, and storing them in the template database; after all detection areas, detection area masks, and positioning kernels have been drawn, the association between positioning kernels and detection areas is established. Detection parameters are then set, such as the search range of each positioning kernel, the sub-pixel positioning precision, the white-point segmentation threshold, and the color difference alarm threshold. This information is then stored in the template database, to be read and called by the training and measurement processes. At this point, the modeling process ends.
Further, as shown in fig. 6, the training process of the color difference is a process of training by using the training image to obtain and store a standard multidimensional component mean sequence of each detection area, and specifically includes:
s601: on the training image, performing positioning calculation by using each positioning core to obtain the position of each positioning core on the training image, and calculating the offset of the position of each positioning core relative to the position of each positioning core on a reference image;
s602: based on the position offset of the positioning kernel, performing integer-pixel shift and sub-pixel interpolation on the position of each detection area associated with the positioning kernel to obtain a sub-image sequence of each detection area based on the training image;
s603: and calculating the multi-dimensional component mean sequence of each detection area based on the sub-graph sequence, and storing the multi-dimensional component mean sequence as a standard multi-dimensional component mean sequence.
Specifically, similar to the calculation used in the color difference measurement method described above, in some embodiments, as shown in fig. 7, the color difference training process is as follows:
Reading template data and a standard sample image; in the training process, an image of a standard color sample is acquired, and the template information saved during modeling is read at the same time.
Positioning kernel calculation; since it cannot be guaranteed that the standard color sample is placed at exactly the same position on the inspection stage of the off-line inspection apparatus each time, the absolute position of the standard color sample in the image changes. That is, the relative position of the standard color sample and the imaging system fluctuates within a small range; described in pixels, the lateral fluctuation range may be written as ±Δx_max and the longitudinal fluctuation range as ±Δy_max. The search range of the positioning kernels, set at modeling time, should not be smaller than the fluctuation range of the standard color sample.
On the standard color sample, positioning calculation of all positioning kernels is performed; a surface matching mode, an edge matching mode, or a geometric matching mode can be selected according to the actual situation. The positioning result is recorded as Pos = {(x_i, y_i) | i = 1, …, n}, wherein (x_i, y_i) are the abscissa and ordinate of positioning kernel i on the standard color sample and n is the number of positioning kernels. The offsets of all kernel positions relative to the reference image are recorded as ΔPos = {(Δx_i, Δy_i) | i = 1, …, n}, with Δx_i = x_i - x'_i and Δy_i = y_i - y'_i, wherein (x'_i, y'_i) are the abscissa and ordinate of positioning kernel i on the reference image.
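The per-kernel offset computation described above is simple enough to sketch directly. The function name and the list-of-tuples representation are illustrative assumptions:

```python
def position_offsets(sample_positions, reference_positions):
    """Offsets of each kernel's position on the sample image relative to its
    pre-stored position on the reference image, index-aligned by kernel."""
    return [(x - xr, y - yr)
            for (x, y), (xr, yr) in zip(sample_positions, reference_positions)]

# A kernel found at (102.5, 51.5) that sat at (100.0, 50.0) on the reference
# image has drifted 2.5 px right and 1.5 px down.
print(position_offsets([(102.5, 51.5)], [(100.0, 50.0)]))  # [(2.5, 1.5)]
```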
Integer-pixel translation and sub-pixel interpolation of the detection areas; when the positioning calculation of all kernels on the standard color sample image is complete, each detection area is iterated over, and integer-pixel shift and sub-pixel interpolation are applied to its position according to its association with a positioning kernel, yielding the sub-image sequences of all detection areas based on the standard color sample image, recorded as SubImg = {SubImg_j | j = 1, …, m}, where m is the number of detection areas and each element corresponds to the standard-color-sample image data covered by one detection area. For convenience of explanation, suppose that for kernel A, ΔPos = (2.5, 1.5), meaning that relative to the reference image it has shifted 2.5 pixels to the right and 1.5 pixels downward on the standard color sample image. If detection area 1 is associated with kernel A, the original coordinates of detection area 1 must be shifted 2.5 pixels to the right and 1.5 pixels downward; the standard-color-sample image data covered by the shifted coordinates is the actual image data to be calculated. The integer part of the shift is applied directly to the coordinates of detection area 1; the remaining 0.5-pixel sub-pixel part is handled on the standard color sample image by bilinear interpolation in the coordinate neighborhood after the integer-pixel shift, which yields the image data covered by the shifted detection area. Bilinear interpolation is a conventional algorithm and is not described further here.
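The integer-plus-sub-pixel shift can be sketched with a plain bilinear sampler. The grayscale list-of-rows image and function names are illustrative; a real implementation would work per RGB channel and handle image borders:

```python
import math

def bilinear_sample(img, x, y):
    """Bilinearly interpolate a single-channel image (list of rows) at (x, y)."""
    x0, y0 = int(math.floor(x)), int(math.floor(y))
    fx, fy = x - x0, y - y0
    top = img[y0][x0] * (1 - fx) + img[y0][x0 + 1] * fx
    bot = img[y0 + 1][x0] * (1 - fx) + img[y0 + 1][x0 + 1] * fx
    return top * (1 - fy) + bot * fy

def shifted_patch(img, x0, y0, w, h, dx, dy):
    """Pixel data covered by a w x h detection area whose original origin
    (x0, y0) has been shifted by the kernel offset (dx, dy)."""
    return [[bilinear_sample(img, x0 + dx + i, y0 + dy + j)
             for i in range(w)] for j in range(h)]

img = [[0, 10, 20, 30],
       [40, 50, 60, 70],
       [80, 90, 100, 110],
       [120, 130, 140, 150]]
# A 2x2 area at (0, 0) shifted by a half-pixel remainder in both directions:
print(shifted_patch(img, 0, 0, 2, 2, 0.5, 0.5))  # [[25.0, 35.0], [65.0, 75.0]]
```

In practice the integer part of the offset would be applied as a cheap index shift and only the fractional remainder interpolated, as the text describes.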
Color conversion. After the sub-image sequence (RGB color images) of all detection areas on the standard color sample image, SubImg, has been obtained, the images are iterated over, and the color calibration matrix of the off-line inspection apparatus is used to obtain, in the device-independent color space, the L-component, a-component, and b-component image sequences of the detection areas, recorded respectively as ImgL = {ImgL_j}, ImgA = {ImgA_j}, and ImgB = {ImgB_j}, j = 1, …, m. Using the conversion formula from the RGB color space to saturation S and brightness I, the saturation S-component and brightness I-component image sequences of the detection areas are obtained, recorded respectively as ImgS = {ImgS_j} and ImgI = {ImgI_j}. The algorithms converting RGB to the L, a, b, saturation S, and brightness I components are general algorithms and are not described in detail here.
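The patent leaves the RGB-to-S/I conversion as a "general algorithm". One common HSI-style choice, shown here as an assumption with channels normalized to [0, 1], is I = (R + G + B) / 3 and S = 1 − min(R, G, B) / I:

```python
def rgb_to_si(r, g, b):
    """Illustrative HSI-style saturation/intensity conversion (one of several
    common formulas; the patent does not fix the exact one)."""
    i = (r + g + b) / 3.0
    s = 0.0 if i == 0 else 1.0 - min(r, g, b) / i
    return s, i

# A neutral gray pixel has zero saturation; a reddish pixel does not.
print(rgb_to_si(0.5, 0.5, 0.5))   # (0.0, 0.5)
print(rgb_to_si(0.9, 0.3, 0.3))   # roughly (0.4, 0.5)
```

Whatever formula is chosen, it only needs to rank pixels by saturation consistently, since the S image is used for segmentation rather than reported directly.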
Filtering white points; all detection-area saturation S-component image sequences ImgS are iterated over, and a percentile segmentation algorithm is applied, the percentile threshold being the white-point segmentation threshold set at modeling time. Each sub-image is segmented: low-saturation pixels are set as invalid points (black) and high-saturation pixels as valid points (white). As shown in fig. 8, for example, if the white-point segmentation threshold is 0.6, the 60% of pixels with the lowest saturation values are set to black, and the remaining 40% with the highest saturation values are set to white.
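The percentile segmentation just described can be sketched as follows; the function name and the flat list of saturation values are illustrative simplifications of a per-sub-image operation:

```python
def segment_by_percentile(saturations, threshold):
    """Mark the `threshold` fraction of lowest-saturation pixels invalid
    (black) and the rest valid (white), as in the 0.6 example above."""
    order = sorted(range(len(saturations)), key=lambda k: saturations[k])
    cutoff = int(len(saturations) * threshold)
    valid = [True] * len(saturations)
    for k in order[:cutoff]:
        valid[k] = False  # low-saturation pixel -> invalid point
    return valid

mask = segment_by_percentile([0.1, 0.9, 0.2, 0.8, 0.05], threshold=0.6)
print(mask)  # [False, True, False, True, False]
```

Ranking by percentile rather than a fixed saturation cutoff makes the segmentation robust to overall brightness changes between captures.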
Calculating mean values; all detection-area L-component image sequences ImgL are iterated over and the mean of each sub-image is calculated, yielding the L-component mean sequence, recorded as MeanL = {MeanL_j | j = 1, …, m}. In each sub-image, a pixel participates in the mean calculation only if it meets two conditions: first, it must lie within the area covered by the detection area's color-rank mask; second, it must be a valid point segmented in the corresponding sub-image of the saturation S-component image sequence. By the same method, the a-component mean sequence MeanA, the b-component mean sequence MeanB, and the brightness I-component mean sequence MeanI are obtained. One or more standard color samples may participate in training; if there are several, the L-, a-, b-, and brightness I-component mean sequences are the averages of the results calculated over all samples.
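The two membership conditions for the mean calculation translate directly into a filtered average. Flat per-pixel lists stand in for the sub-images here, and all names are illustrative:

```python
def masked_mean(component, in_color_mask, is_valid_point):
    """Mean over pixels that are (1) inside the color-rank mask and (2) valid
    points from the saturation segmentation -- the two conditions above."""
    vals = [c for c, m, v in zip(component, in_color_mask, is_valid_point)
            if m and v]
    return sum(vals) / len(vals) if vals else 0.0

mean_l = masked_mean([50, 60, 70, 80],
                     [True, True, True, False],   # inside color-rank mask?
                     [True, False, True, True])   # valid after segmentation?
print(mean_l)  # 60.0 -- only the 50 and 70 pixels satisfy both conditions
```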
Storing training data; finally, the L-component mean sequence MeanL, the a-component mean sequence MeanA, the b-component mean sequence MeanB, and the brightness I-component mean sequence MeanI are stored to create the color-difference training database, to be read and used by the measurement process.
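Once the standard means are stored, the measurement stage compares each region's real-time Lab means against them. A minimal sketch of the Euclidean color difference and its direction vector, with a tuple representation assumed for the per-region means, is:

```python
import math

def color_difference(measured_lab, standard_lab):
    """Euclidean color difference between a region's measured (L, a, b) means
    and the stored standard means, plus the signed direction of the drift."""
    dl, da, db = (m - s for m, s in zip(measured_lab, standard_lab))
    return math.sqrt(dl * dl + da * da + db * db), (dl, da, db)

# Region drifted 3 units lighter, 4 toward red, 1 toward blue:
de, direction = color_difference((53.0, 4.0, -1.0), (50.0, 0.0, 0.0))
print(round(de, 3), direction)  # 5.099 (3.0, 4.0, -1.0)
```

The signed direction components are what make the result actionable for ink-key control: they say not just how far the color has drifted, but along which Lab axes.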
In the foregoing specific embodiment, according to the color difference measurement method based on the device-independent color space provided by the present invention, a normal printed matter image is obtained, a reference image is created based on it, and at least one detection area and at least one positioning kernel are defined on the reference image, each positioning kernel having a positioning association with at least one detection area; a standard color sample image is acquired and used as the training image; the standard multi-dimensional component mean sequence of each detection area is acquired and stored using the training image, the detection areas, and the positioning kernels; a real-time image of the printed matter to be measured is acquired; based on the real-time image, positioning calculation is performed with each positioning kernel to obtain its position on the real-time image, and the offset of that position relative to the pre-stored position on the reference image is calculated; based on the position offset of each positioning kernel, integer-pixel shift and sub-pixel interpolation are performed on the positions of the detection areas associated with that kernel, to obtain the sub-image sequence of all detection areas based on the real-time image; the multi-dimensional component mean sequence of each detection area is calculated based on the sub-image sequences; and the color difference value sequence and color direction sequence of each detection area are calculated from the multi-dimensional component mean sequence and the standard multi-dimensional component mean sequence.
Therefore, the invention obtains the color difference value sequence and the color direction sequence of the detection areas from the multi-dimensional component mean sequence and the created color difference training database, providing accurate data support for subsequent ink-color consistency control with high color difference measurement accuracy. At the same time, using the pre-created color difference training database effectively reduces the amount of calculation and the difficulty of color difference measurement, solving the technical problems of poor accuracy and difficulty of color difference measurement in the prior art.
In addition to the above method, the present invention further provides a color difference measuring system based on a device-independent color space, as shown in fig. 9, the system comprising:
the template creating unit 100 is configured to acquire a normal printed matter image, create a reference image based on the normal printed matter image, and define at least one detection area and at least one positioning core on the reference image, where each positioning core has a positioning association relationship with at least one detection area;
a parameter training unit 200, configured to obtain a standard color sample image, use the standard color sample image as a training image, and obtain and store a standard multidimensional component mean sequence of each detection region by using the training image, each detection region, and each positioning kernel;
a real-time image obtaining unit 300, configured to obtain a real-time image of a to-be-detected printed product;
an offset calculation unit 400, configured to perform positioning calculation using each positioning kernel based on the real-time image to obtain a position of each positioning kernel on the real-time image, and calculate an offset of the position with respect to a pre-stored position on the reference image;
a sub-image sequence calculation unit 500, configured to perform integer-pixel shift and sub-pixel interpolation on the position of each detection area associated with each positioning kernel, respectively, based on the position offset of that kernel, so as to obtain a sub-image sequence of each detection area based on the real-time image;
a mean sequence calculation unit 600, configured to calculate a multi-dimensional component mean sequence of each of the detection regions based on the sub-graph sequence;
a result output unit 700, configured to calculate a color difference value sequence and a color direction sequence of each detection region according to the multi-dimensional component mean sequence and the standard multi-dimensional component mean sequence.
In the foregoing specific embodiment, the color difference measurement system based on the device-independent color space provided by the present invention obtains a normal printed matter image, creates a reference image based on it, and defines at least one detection area and at least one positioning kernel on the reference image, each positioning kernel having a positioning association with at least one detection area; acquires a standard color sample image and uses it as the training image; acquires and stores the standard multi-dimensional component mean sequence of each detection area using the training image, the detection areas, and the positioning kernels; acquires a real-time image of the printed matter to be measured; performs, based on the real-time image, positioning calculation with each positioning kernel to obtain its position on the real-time image, and calculates the offset of that position relative to the pre-stored position on the reference image; performs, based on the position offset of each positioning kernel, integer-pixel shift and sub-pixel interpolation on the positions of the detection areas associated with that kernel, to obtain the sub-image sequence of all detection areas based on the real-time image; calculates the multi-dimensional component mean sequence of each detection area based on the sub-image sequences; and calculates the color difference value sequence and color direction sequence of each detection area from the multi-dimensional component mean sequence and the standard multi-dimensional component mean sequence.
Therefore, the invention obtains the color difference value sequence and the color direction sequence of the detection areas from the multi-dimensional component mean sequence and the created color difference training database, providing accurate data support for subsequent ink-color consistency control with high color difference measurement accuracy. At the same time, using the pre-created color difference training database effectively reduces the amount of calculation and the difficulty of color difference measurement, solving the technical problems of poor accuracy and difficulty of color difference measurement in the prior art.
The present invention also provides an intelligent terminal, including: the device comprises a data acquisition device, a processor and a memory;
the data acquisition device is used for acquiring data; the memory is to store one or more program instructions; the processor is configured to execute one or more program instructions to perform the method as described above.
In correspondence with the above embodiments, embodiments of the present invention also provide a computer storage medium containing one or more program instructions, the one or more program instructions being for executing the method described above by the color difference measurement system.
In an embodiment of the invention, the processor may be an integrated circuit chip having signal processing capability. The processor may be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The various methods, steps, and logic blocks disclosed in the embodiments of the present invention may be implemented or performed. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be implemented directly by a hardware decoding processor, or by a combination of hardware and software modules in the decoding processor. The software module may be located in RAM, flash memory, ROM, PROM, EPROM, registers, or other storage media well known in the art. The processor reads the information in the storage medium and completes the steps of the method in combination with its hardware.
The storage medium may be a memory, for example, which may be volatile memory or nonvolatile memory, or which may include both volatile and nonvolatile memory.
The nonvolatile Memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash Memory.
The volatile memory may be a Random Access Memory (RAM), which serves as an external cache. By way of example and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DRRAM).
The storage media described in connection with the embodiments of the invention are intended to comprise, without being limited to, these and any other suitable types of memory.
Those skilled in the art will recognize that the functionality described in this disclosure may be implemented in a combination of hardware and software in one or more of the examples described above. When implemented in software, the corresponding functionality may be stored on, or transmitted as, one or more instructions or code on a computer-readable medium. Computer-readable media include both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general purpose or special purpose computer.
The above embodiments are only for illustrating the embodiments of the present invention and are not to be construed as limiting the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made on the basis of the embodiments of the present invention shall be included in the scope of the present invention.

Claims (10)

1. A method for measuring chromatic aberration based on a device-independent color space, the method comprising:
acquiring a normal printed matter image, creating a reference image based on the normal printed matter image, and dividing at least one detection area and at least one positioning core on the reference image, wherein each positioning core and at least one detection area have a positioning association relation;
acquiring a standard color sample image, and taking the standard color sample image as a training image;
acquiring and storing a standard multi-dimensional component mean sequence of each detection area by using the training image, each detection area, and each positioning kernel; when the standard value is calculated, the multi-dimensional color components of a plurality of standard printed products are calculated in the same detection area, and the mean of the corresponding results over the plurality of standard printed products is taken; this mean is the standard multi-dimensional component mean; since the printed matter to be measured has more than one detection area, a plurality of standard multi-dimensional component means is obtained, and the sequence they form is the standard multi-dimensional component mean sequence;
acquiring a real-time image of a to-be-detected printed matter;
based on the real-time image, performing positioning calculation by using each positioning core to obtain the position of each positioning core on the real-time image, and calculating the offset of the position relative to the pre-stored position on the reference image;
respectively performing integer-pixel shift and sub-pixel interpolation on the positions of the detection areas associated with each positioning kernel, based on the position offset of that kernel, so as to obtain a sub-image sequence of each detection area based on the real-time image; calculating a multi-dimensional component mean sequence of each detection area based on the sub-image sequences; the color space has a plurality of color components, and when the color difference is measured the Euclidean distance is calculated over three color dimensions rather than one; the multi-dimensional components refer to these three-dimensional color components, and the multi-dimensional component mean is the average value of each color component dimension; since there are a plurality of detection areas, a plurality of multi-dimensional component means is obtained, and the sequence they form is called the multi-dimensional component mean sequence;
calculating a color difference value sequence and a color direction sequence of each detection area according to the multi-dimensional component mean value sequence and the standard multi-dimensional component mean value sequence;
the method comprises the following steps that a to-be-detected printed matter is provided with a plurality of detection areas, a plurality of color difference values are obtained through calculation, and an array formed by the color difference values is called a color difference value sequence; the Lab color mode expresses the change trend of chromatic aberration, saturation and chromaticity by directional axes, each directional axis represents a color direction, each detection area corresponds to a group of color directions, because the to-be-detected printed matter has a plurality of detection areas, a plurality of color directions are obtained through calculation, and an array formed by the color directions is called as a color direction sequence.
2. The chromatic aberration measurement method according to claim 1, wherein the performing the positioning calculation using each positioning kernel specifically includes:
positioning calculation of the positioning kernels is carried out by adopting a surface matching mode, an edge matching mode, or a geometric matching mode, and the obtained positioning calculation result is recorded as Pos = {(x_i, y_i) | i = 1, …, n}, wherein (x_i, y_i) are the coordinates of any positioning kernel i on the real-time image, n is the number of positioning kernels, and n is a positive integer.
3. The color difference measuring method according to claim 2, wherein the offset of the position relative to the pre-stored position on the reference image is calculated using the following formulas:
Δx_i = x_i - x'_i, Δy_i = y_i - y'_i,
wherein (x_i, y_i) are the coordinates of the positioning kernel on the real-time image, and (x'_i, y'_i) are the coordinates of the positioning kernel on the reference image.
4. The color difference measuring method according to claim 1, wherein performing, based on the position offset of each positioning kernel, integer-pixel shift and sub-pixel interpolation on the position of each detection area associated with that kernel to obtain a sub-image sequence of each detection area based on the real-time image specifically comprises:
iterating over each detection area, and performing integer-pixel shift and sub-pixel interpolation on the position of the detection area according to its association with the positioning kernel, to obtain the sub-image sequence of each detection area based on the real-time image, recorded as SubImg = {SubImg_j | j = 1, …, m}, wherein m is the number of detection areas and each element corresponds to the real-time image covered by one detection area.
5. The color difference measurement method according to claim 4, wherein calculating a multi-dimensional component mean sequence of each detection region based on the sub-graph sequences specifically comprises:
cycling through the sub-image sequence;
obtaining, by using the color calibration matrix of the off-line inspection apparatus, the L-component image sequence ImgL = {ImgL_j}, the a-component image sequence ImgA = {ImgA_j}, and the b-component image sequence ImgB = {ImgB_j} of each detection area in the device-independent color space, wherein j = 1, …, m;
obtaining, by using a pre-stored conversion formula, the saturation S-component image sequence ImgS = {ImgS_j} and the brightness I-component image sequence ImgI = {ImgI_j} of each detection area, wherein j = 1, …, m;
filtering the invalid points in the detection areas to obtain all valid points;
cycling through all detection-area L-component image sequences and calculating means to obtain the L-component mean sequence based on the real-time image, MeanL = {MeanL_j};
cycling through all detection-area a-component image sequences and calculating means to obtain the a-component mean sequence based on the real-time image, MeanA = {MeanA_j};
cycling through all detection-area b-component image sequences and calculating means to obtain the b-component mean sequence based on the real-time image, MeanB = {MeanB_j};
cycling through all detection-area I-component image sequences and calculating means to obtain the brightness I-component mean sequence based on the real-time image, MeanI = {MeanI_j}, wherein j = 1, …, m.
6. the color difference measurement method according to claim 5, wherein calculating the color difference value sequence and the color direction sequence of each detection area according to the multi-dimensional component mean sequence and the standard multi-dimensional component mean sequence specifically comprises:
obtaining, based on the training image, the L-component mean sequence MeanL' = {MeanL'_j}, the a-component mean sequence MeanA' = {MeanA'_j}, the b-component mean sequence MeanB' = {MeanB'_j}, and the brightness I-component mean sequence MeanI' = {MeanI'_j}, wherein j = 1, …, m;
calculating the color difference value sequence ΔE and the color direction sequence Dir of each detection area by the following formulas:
ΔE_j = sqrt((MeanL_j - MeanL'_j)² + (MeanA_j - MeanA'_j)² + (MeanB_j - MeanB'_j)²),
Dir_j = (MeanL_j - MeanL'_j, MeanA_j - MeanA'_j, MeanB_j - MeanB'_j),
wherein ΔE = {ΔE_j | j = 1, …, m} is the color difference value sequence, and Dir = {Dir_j | j = 1, …, m} is the color direction sequence.
7. The chromatic aberration measurement method according to claim 1, wherein the obtaining and storing of the standard multidimensional component mean sequence of the detection regions using the training image, the detection regions, and the localization kernels specifically comprises:
on the training image, performing positioning calculation by using each positioning core to obtain the position of each positioning core on the training image, and calculating the offset of the position of each positioning core relative to the position of each positioning core on a reference image;
based on the position offset of the positioning kernel, performing integer-pixel shift and sub-pixel interpolation on the position of each detection area associated with the positioning kernel to obtain a sub-image sequence of each detection area based on the training image;
and calculating the multi-dimensional component mean sequence of each detection area based on the sub-graph sequence, and storing the multi-dimensional component mean sequence as a standard multi-dimensional component mean sequence.
8. A device independent color space based color difference measurement system, the system comprising:
the template creating unit is used for acquiring a normal printed matter image, creating a reference image based on the normal printed matter image, and dividing at least one detection area and at least one positioning core on the reference image, wherein each positioning core has a positioning association relation with at least one detection area;
the parameter training unit is used for acquiring a standard color sample image, taking the standard color sample image as a training image, and acquiring and storing a standard multi-dimensional component mean sequence of each detection area by using the training image, each detection area, and each positioning kernel; when the standard value is calculated, the multi-dimensional color components of a plurality of standard printed products are calculated in the same detection area, and the mean of the corresponding results over the plurality of standard printed products is taken; this mean is the standard multi-dimensional component mean; since the printed matter to be measured has more than one detection area, a plurality of standard multi-dimensional component means is obtained, and the sequence they form is the standard multi-dimensional component mean sequence;
the real-time image acquisition unit is used for acquiring a real-time image of the printed matter to be detected;
the offset calculation unit is used for performing positioning calculation by using each positioning core based on the real-time image so as to obtain the position of each positioning core on the real-time image, and calculating the offset of the position relative to the pre-stored position on the reference image;
a sub-image sequence calculation unit, configured to perform integer pixel shift and sub-pixel interpolation on the position of each detection area associated with each positioning core, based on the position offset of that positioning core, so as to obtain a sub-image sequence of each detection area based on the real-time image;
a mean sequence calculation unit, configured to calculate a multi-dimensional component mean sequence of each detection area based on the sub-image sequence; the color space has a plurality of color components, and the Euclidean distance used in color difference measurement is computed over three color dimensions rather than one; the multi-dimensional components refer to these three-dimensional color components, and the multi-dimensional component mean is the average of each color component over a detection area; since there are multiple detection areas, multiple multi-dimensional component means are obtained, and the sequence they form is called the multi-dimensional component mean sequence;
a result output unit, configured to calculate a color difference value sequence and a color direction sequence of each detection region according to the multi-dimensional component mean sequence and the standard multi-dimensional component mean sequence;
wherein, since the printed matter to be detected has a plurality of detection areas, a plurality of color difference values are obtained by calculation, and the array formed by these color difference values is called the color difference value sequence; the Lab color model expresses trends of change in color difference, saturation and chromaticity along directional axes, each directional axis representing a color direction; each detection area corresponds to a group of color directions, and since the printed matter to be detected has a plurality of detection areas, a plurality of color directions are obtained by calculation, and the array formed by these color directions is called the color direction sequence.
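The result output unit's two sequences can be sketched as below. This is an illustrative sketch under assumptions not stated in the claim: the color difference is taken as the plain Euclidean distance between three-component (Lab-style) means, and the color direction is represented as the sign of each component deviation; the function name is hypothetical.

```python
import numpy as np

def color_difference_sequences(measured_means, standard_means):
    """Given the multi-dimensional component mean sequence of a real-time
    image and the stored standard sequence (both shaped (n_regions, 3)),
    return the color difference value sequence (Euclidean distance per
    detection area) and the color direction sequence (sign of the
    deviation along each Lab-style axis)."""
    measured = np.asarray(measured_means, dtype=np.float64)
    standard = np.asarray(standard_means, dtype=np.float64)
    delta = measured - standard
    delta_e = np.sqrt((delta ** 2).sum(axis=1))   # one value per detection area
    direction = np.sign(delta)                    # one 3-vector per detection area
    return delta_e, direction
```

For example, a region measured at (52, 1, -1) against a standard of (50, 0, 0) deviates toward the positive L and a axes and the negative b axis, with a Euclidean difference of sqrt(6).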
9. An intelligent terminal, characterized in that the intelligent terminal comprises: a data acquisition device, a processor and a memory;
the data acquisition device is configured to acquire data; the memory is configured to store one or more program instructions; and the processor is configured to execute the one or more program instructions to perform the method of any one of claims 1-7.
10. A computer-readable storage medium having one or more program instructions embodied therein for performing the method of any of claims 1-7.
CN202210201793.8A 2022-03-03 2022-03-03 Color difference measuring method and system based on device-independent color space Active CN114295561B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210201793.8A CN114295561B (en) 2022-03-03 2022-03-03 Color difference measuring method and system based on device-independent color space

Publications (2)

Publication Number Publication Date
CN114295561A CN114295561A (en) 2022-04-08
CN114295561B true CN114295561B (en) 2022-09-20

Family

ID=80978468

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210201793.8A Active CN114295561B (en) 2022-03-03 2022-03-03 Color difference measuring method and system based on device-independent color space

Country Status (1)

Country Link
CN (1) CN114295561B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103674962A (en) * 2013-09-27 2014-03-26 北京中钞钞券设计制版有限公司 Printing plate quality detection system and method
CN106340012A (en) * 2016-08-23 2017-01-18 凌云光技术集团有限责任公司 Print color detection method and print color detection device
CN109507209A (en) * 2019-01-22 2019-03-22 中科院金华信息技术有限公司 A kind of film printing defect detecting system and method
CN110632094A (en) * 2019-07-24 2019-12-31 北京中科慧眼科技有限公司 Pattern quality detection method, device and system based on point-by-point comparison analysis
CN112085707A (en) * 2020-08-18 2020-12-15 北京华夏视科技术股份有限公司 Printed matter color detection method, device, equipment and medium
WO2021094496A1 (en) * 2019-11-14 2021-05-20 Basf Coatings Gmbh Method and device for identification of effect pigments in a target coating

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4739017B2 (en) * 2005-12-28 2011-08-03 キヤノン株式会社 Color evaluation processing method, color evaluation processing device, computer program, and recording medium
CN104568154B (en) * 2014-12-25 2016-08-24 北京凌云光技术有限责任公司 Printed matter quality of colour detection method and system
CN109155071B (en) * 2017-06-30 2021-01-29 华为技术有限公司 Color detection method and terminal

Also Published As

Publication number Publication date
CN114295561A (en) 2022-04-08

Similar Documents

Publication Publication Date Title
CN111028213A (en) Image defect detection method and device, electronic equipment and storage medium
CN109510948B (en) Exposure adjusting method, exposure adjusting device, computer equipment and storage medium
CN108960062A (en) Correct method, apparatus, computer equipment and the storage medium of invoice image
CN110858899B (en) Method and system for measuring optical axis center and field angle of camera movement
CN110717942A (en) Image processing method and device, electronic equipment and computer readable storage medium
CN110519585B (en) Imaging calibration method and device applied to image acquisition equipment
CN107911682A (en) Image white balancing treatment method, device, storage medium and electronic equipment
CN107911683B (en) Image white balancing treatment method, device, storage medium and electronic equipment
JP2021531571A (en) Certificate image extraction method and terminal equipment
CN114878595B (en) Book printing quality detection method
CN111105365A (en) Color correction method, medium, terminal and device for texture image
CN115665393A (en) Color and shadow fusion algorithm applied to network virtual scene
CN115187612A (en) Plane area measuring method, device and system based on machine vision
CN118014832A (en) Image stitching method and related device based on linear feature invariance
CN114295561B (en) Color difference measuring method and system based on device-independent color space
CN110751690A (en) Visual positioning method for milling machine tool bit
CN108765379B (en) Calculation method, device, Medical Devices and the storage medium of eyeground pathological changes region area
CN114399506B (en) Image detection method and system for rainbow printed matter
CN110689586B (en) Tongue image identification method in traditional Chinese medicine intelligent tongue diagnosis and portable correction color card used for same
CN112345548A (en) Method and device for detecting surface forming smoothness of graphite plate of fuel cell
US3719806A (en) Apparatus for calculating halftone screen exposures
CN111260625B (en) Automatic extraction method for offset printing large image detection area
US10958899B2 (en) Evaluation of dynamic ranges of imaging devices
CN115830431B (en) Neural network image preprocessing method based on light intensity analysis
CN115942124B (en) Method, system, medium and equipment for digitizing industrial ray film with large blackness range

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant