WO2019211484A1 - Non-contact multispectral measurement device with improved multispectral sensor - Google Patents


Info

Publication number
WO2019211484A1
Authority
WO
WIPO (PCT)
Prior art keywords
multispectral
measurement
measurement system
data
calibration
Prior art date
Application number
PCT/EP2019/061520
Other languages
French (fr)
Inventor
Peter Ehbets
Vitaly DMITRIEV
Heiko Gross
Johannes RILK
Thomas HOEPPLER
David Gamperl
Original Assignee
Peter Ehbets
Dmitriev Vitaly
Heiko Gross
Rilk Johannes
Hoeppler Thomas
David Gamperl
Priority date
Filing date
Publication date
Application filed by Peter Ehbets, Dmitriev Vitaly, Heiko Gross, Rilk Johannes, Hoeppler Thomas, David Gamperl filed Critical Peter Ehbets
Publication of WO2019211484A1 publication Critical patent/WO2019211484A1/en


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02 Details
    • G01J3/0202 Mechanical elements; Supports for optical elements
    • G01J3/021 Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows, using plane or convex mirrors, parallel phase plates, or particular reflectors
    • G01J3/0256 Compact construction
    • G01J3/0272 Handheld
    • G01J3/0289 Field-of-view determination; Aiming or pointing of a spectrometer; Adjusting alignment; Encoding angular position; Size of measurement area; Position tracking
    • G01J3/10 Arrangements of light sources specially adapted for spectrometry or colorimetry
    • G01J2003/102 Plural sources
    • G01J3/28 Investigating the spectrum
    • G01J3/30 Measuring the intensity of spectral lines directly on the spectrum itself
    • G01J3/36 Investigating two or more bands of a spectrum by separate detectors
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25 Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/251 Colorimeters; Construction thereof
    • G01N21/255 Details, e.g. use of specially adapted sources, lighting or optical systems
    • G01N21/274 Calibration, base line adjustment, drift correction
    • G01N21/4738 Diffuse reflection, e.g. also for testing fluids, fibrous materials
    • G01N21/474 Details of optical heads therefor, e.g. using optical fibres
    • G01N2021/4752 Geometry
    • G01N2201/0221 Portable; cableless; compact; hand-held

Definitions

  • Color measurement as part of a workflow has traditionally been performed with contact mode color measurement instruments.
  • Examples are the "Capsure" instrument from X-Rite Inc. The Capsure instrument includes an integrated display and is capable of standalone operation.
  • Another example is the color picker instrument "ColorMuse" from Variable Inc. This color picker instrument incorporates basic color measurement functionalities.
  • The color picker instrument communicates over Bluetooth with a smartphone or a tablet that runs a companion software application. These systems are operated in contact mode (i.e., the sample being measured and the instrument are in physical contact).
  • The optical systems of these devices cannot be used for non-contact measurement over a large range of positions, ambient illuminations, and illumination and pickup spot sizes.
  • The instruments are based on a 45/0 measurement geometry or a different standard reference measurement geometry such as d/8.
  • The measurement distance and the measurement spot size impose restrictions on the minimum mechanical dimensions of the optics.
  • The diameter of the optics scales with increasing measurement distance and/or measurement spot size.
  • A schematic illustration of a known measurement system 10 with illumination at 45° and pick-up at 0° (45/0 geometry) is depicted in Figure 1.
  • Light source 12 provides illumination, and detector 14 detects light reflected from target surface 16a.
  • The lateral displacement between the illumination and pick-up fields as a function of distance variation is shown.
  • The valid range of distance for measurement may be as small as ±2 mm from a target distance d.
  • The mechanical size of the system is impacted by the distance from the measurement system to the sample surface.
  • Changing the distance produces a relative spatial displacement of the observation and illumination fields (shown by the arrows in Figure 1).
  • The distance range is limited by the requirement of sufficient overlap between the two fields. If the distance is too great (illustrated) or too short, the illumination field and the measurement field will not be aligned. For example, target surface 16b, which is spaced from the target distance d by an additional variance v, results in the illumination field being misaligned with the measurement field.
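The overlap constraint described above can be sketched numerically. The following is a minimal model (not from the patent): both fields are treated as equal circles, and the 45° illumination spot is assumed to shift sideways by tan(45°) times the distance error while the 0° pick-up field stays fixed. The spot diameter and angle are illustrative parameters.

```python
import math

def field_overlap_fraction(delta_d_mm, spot_diameter_mm=6.0, illum_angle_deg=45.0):
    """Fraction of the observation field still covered by the illumination
    field when the measurement distance is off by delta_d_mm.

    In 45/0 geometry the illumination spot walks sideways by
    tan(illum_angle) * distance error, while the 0-degree pick-up
    field stays put; both fields are modeled as equal circles.
    """
    shift = math.tan(math.radians(illum_angle_deg)) * abs(delta_d_mm)
    r = spot_diameter_mm / 2.0
    if shift >= 2.0 * r:
        return 0.0  # fields no longer overlap at all
    # lens-area formula for two equal circles at center distance `shift`
    lens = 2.0 * r**2 * math.acos(shift / (2.0 * r)) \
        - (shift / 2.0) * math.sqrt(4.0 * r**2 - shift**2)
    return lens / (math.pi * r**2)
```

With a 6 mm spot, a distance error of only ±2 mm already drops the overlap below 60%, which is consistent with the narrow ±2 mm valid range stated above.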
  • Non-contact color measurements may also be made.
  • Cameras typically embedded in mobile devices have limited achievable accuracy and are not appropriate for precise color measurement.
  • The typical RGB color filters are not optimized for color measurement performance. Additionally, the environmental measurement conditions are difficult to control.
  • An improved accuracy may be achieved by using a color reference card. See, for example, U.S. Pat. Pub. 2016/0224861.
  • A small calibration target is positioned on the sample material to be measured.
  • The calibration target contains known color patches for color calibration of the camera as well as means to characterize the illumination conditions.
  • The achievable accuracy is limited due to the availability of only red, green, and blue filter functions in the camera.
  • Color sensors for mobile devices also exist, such as the TCS3430 tristimulus sensor from AMS AG.
  • The application for this device is color management of the camera in a mobile device.
  • Such a sensor may be used to assist the smartphone camera sensor with color sensing of ambient light to enhance and improve picture white balance.
  • This sensor has no active illumination.
  • These sensors typically require no specific optical system.
  • An optical diffuser in front of the active detector area is sufficient.
  • U.S. Patent 8,423,080 describes a mobile communication system including a color sensor for color measurement. The usability of this system is limited since it requires manual positioning of the sensor at a pre-defined distance to initiate the automatic execution of a measurement. Controlling and holding the mobile device at such a pre-defined distance is difficult in practice.
  • The system is limited to color data and does not support spectral reflectance information of the sample. This limits the flexibility to use the data in an open system architecture with different data libraries requiring different colorimetric calibration settings and search parameters.
  • A multispectral measurement system for measuring reflectance properties of a surface of interest in some embodiments comprises a multispectral detector configured to measure spectral information in a plurality of bands of optical radiation, each band of optical radiation corresponding to a filter function, the multispectral detector comprising a plurality of photodiodes, each photodiode having a filter corresponding to one of the filter functions, there being at least two photodiodes corresponding to each of the filter functions located in a point-symmetric arrangement in a two-dimensional array; and observation optics having an aperture and being configured to observe the surface of interest; wherein each of the plurality of photodiodes is located in a different location with respect to the aperture, and wherein differences in a field of view of the surface of interest for each photodiode are compensated for by combining the signals of the photodiodes that share the same filter function.
  • The plurality of bands of optical radiation comprise at least six bands of optical radiation.
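A minimal sketch of the compensation idea, under the assumption that "combining" means averaging the raw signals of the photodiodes sharing a filter function; the `readings` layout and the band keys are hypothetical, not from the patent.

```python
from collections import defaultdict

def combine_point_symmetric(readings):
    """Combine raw signals from photodiodes that share a filter function.

    `readings` maps (band, diode_index) -> raw signal.  Because the
    diodes of each band sit point-symmetrically about the array center,
    their opposing field-of-view offsets cancel when averaged.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for (band, _diode), value in readings.items():
        sums[band] += value
        counts[band] += 1
    return {band: sums[band] / counts[band] for band in sums}
```

For example, two hypothetical 450 nm diodes reading 1.02 and 0.98 (one seeing slightly more of one side of the spot, its partner the opposite side) combine to 1.00.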
  • The multispectral measurement system further comprises at least one illumination source illuminating at least the field of view.
  • The at least one illumination source may be a non-collimated illumination source.
  • The illumination source may emit optical radiation over a range of visible light.
  • The illumination source emits visible light radiation, near-ultraviolet light radiation, and/or infra-red radiation, and the multispectral detector has filter functions covering the corresponding spectral ranges.
  • The multispectral measurement system further comprises a non-contact multispectral measurement device comprising: a position measurement system for measuring position values of the multispectral measurement system relative to the surface of interest; and means to correct multispectral values from the multispectral measurement system based on detected position values from the position measurement system.
  • The non-contact measurement device may comprise a mobile communications device.
  • The position measurement system may be selected from the group consisting of: a pattern projector and a camera, a camera autofocus system, a stereo vision system, a laser range finder, and a time-of-flight distance sensor.
  • A non-contact spectral measurement system comprises a retro-reflection multispectral sensing system and a position correction system.
  • The combination of these systems allows for non-contact measurement over a relatively wide range of measurement distances and for correction of sensor distance and angle with respect to a surface being measured. This enables accurate spectral measurement in the visible spectral region over the 400 to 700 nm wavelength range, which may be extended to cover the UV (below 400 nm) and near-infrared (NIR) (over 700 nm) spectral regions.
  • The non-contact spectral measurement system may be advantageously included in handheld measurement devices.
  • The system may also be included in mobile communications devices to improve their spectral measurement capabilities.
  • "Non-contact multispectral measurement device," as used herein, includes but is not limited to dedicated handheld devices and other mobile devices, such as smart phones and tablets.
  • A non-contact multispectral measurement device for measuring reflectance properties of a surface of interest may include a multispectral measurement system, a position measurement system for measuring position values of the multispectral measurement system relative to the surface of interest, and means to correct multispectral values from the multispectral measurement system based on detected position values from the position measurement system.
  • Figure 1 is a schematic illustration of an example of known 45/0 measurement geometry.
  • Figure 2 is a block diagram of an example of a non-contact multispectral measurement device according to the present invention.
  • Figure 3 is a schematic illustration of an example of a retro-reflection measurement system according to an aspect of the present invention.
  • Figure 4 is a schematic illustration of an example of a retro-reflection measurement system including a camera according to another aspect of the present invention.
  • Figure 5 is a schematic illustration including fields of illumination and view of an example of a retro-reflection measurement system including a camera according to another aspect of the present invention.
  • Figure 6 is a graph of response values provided by a retro-reflection measurement system at the target distance to a measured surface and at the target distance ±5 mm to a measured surface.
  • Figure 7 is a graph showing ratios of response values provided by a retro-reflection measurement system at distances of the target distance +5 mm and the target distance -5 mm to a measured surface, relative to response values at the target distance to a measured surface.
  • Figure 8 is a graph of filter functions for a multispectral detector which may be used in implementing the present invention.
  • Figure 9A is an example of a photodiode layout for a multispectral detector which may be used in implementing the present invention.
  • Figure 9B is another example of a photodiode layout for a multispectral detector which may be used in implementing the present invention.
  • Figure 9C is a simplified illustration of light paths to a photodiode layout for a multispectral detector which may be used in implementing the present invention.
  • Figures 10 and 11 illustrate examples of optical designs which may be used in implementing the present invention.
  • Figure 12 illustrates an example of a folded light path optical design which may be used in implementing the present invention.
  • Figures 13 and 14 illustrate examples of spatial relationships between illumination sources and a multispectral detector according to examples of a multispectral measurement system according to another aspect of the present invention.
  • Figure 15 illustrates an example of spatial relationship between an illumination path and an observation path with respect to a surface being measured according to another aspect of the present invention.
  • Figure 16 is a flow chart illustrating steps for calibrating and using a position detection system which may be used in implementing a retro-reflection measurement system according to another aspect of the present invention.
  • Figure 17 illustrates an example of dot pattern projection which may be implemented to determine position and angle information.
  • Figure 18 illustrates an example of dot pattern detection which may be implemented to determine position and angle information.
  • Figure 19 illustrates calibration procedures for generating use case calibration parameters and multispectral device calibration parameters.
  • Figures 20a and 20b illustrate examples of measurement processes relative to time using a retro-reflection measurement system according to another aspect of the present invention.
  • Figure 21 illustrates an example of a logical flow chart for a measurement process using a retro-reflection measurement system according to another aspect of the present invention.
  • Figure 22 illustrates an example of a data flow chart for a measurement process using a retro-reflection measurement system according to another aspect of the present invention.
  • Figure 23 illustrates an example of a setup for determining distance and angular position correction parameters for a retro-reflection measurement system according to another aspect of the present invention.
  • Figure 24 is a graph illustrating a response curve for a multispectral detector as distance to a surface being measured varies.
  • Figure 25 is an example of using live-view feedback from a position sensor to assist a user in targeting a sample surface, which may be used with a retro-reflection measurement system according to another aspect of the present invention.
  • Figures 26 and 27 are examples of using visual feedback to assist a user in targeting a sample surface which may be used with a retro-reflection measurement system according to another aspect of the present invention.
  • The non-contact multispectral measurement device 100 may be implemented on mobile communications systems or dedicated handheld measurement devices.
  • Mobile communications systems, such as mobile smartphones or tablets, typically include mobile phone electronics, an operating system (such as iOS or Android), a camera, a display, data input capabilities, memory for data storage, an interface to external systems (cloud, PC, networks), and wireless communications systems, such as cellular voice and data systems, LTE systems, and other wireless communications systems.
  • Such mobile communications systems are referred to herein as "mobile devices."
  • A non-contact multispectral measurement device 100 may comprise a retro-reflection multispectral measurement system 110, a position correction system 120, and application software and calibration data processed by processor 124.
  • The application software and calibration data may be stored in a non-volatile memory.
  • An RGB camera 122 and a display 128 may also be provided.
  • The measurement geometry of the retro-reflection measurement optics, and the distance and angular orientation guidance and correction provided by the present invention, improve ease of use and spectral accuracy by enabling accurate measurements over a wider range of distances and orientations than was previously known.
  • The application software runs on the non-contact multispectral measurement device 100 and controls the data acquisition workflow, the user interaction, and the processing of the data for the application.
  • The additional sensor components can be realized in the non-contact multispectral measurement device 100 or attached to the outside of the housing of the non-contact multispectral measurement device 100.
  • The non-contact multispectral measurement systems described herein are not limited in use to mobile systems, and may be included in any device where non-contact spectral measurement with distance and angle correction is desired, including being embedded in industrial systems.
  • A non-contact multispectral measurement device 100 with the spectral sensing capabilities of the present invention is particularly well suited to measure an "inspiration color" found on the surface of an object, due to its ease of use, targeting assistance, and use of correction parameters. Having accurately captured the inspiration color, the software application may identify a matching color in a database 130 of digital reference data corresponding to a set of color shades.
  • A color database may include relational databases, flat files of structured data (e.g., CxF and AxF files), and other libraries of structured data comprising spectral or other color data (RGB, CIE tristimulus colors, etc.) and/or associated metadata pertinent to a given use case (scattering parameters, effect finishes, translucency, printing conditions, etc.).
  • Each color database may cover different classes of materials and measurement requirements and therefore require different calibration parameters. Databases having different calibration parameters would be considered different use cases.
  • Color databases may include, for example, PANTONE Color Matching System colors, printable colors, architectural paint color databases, plastics colors databases, skin tone databases, and the like.
  • The database 130 may be stored on the non-contact multispectral measurement device 100 or be cloud-based and accessed using a mobile device's communications capabilities, either directly or through a computer network 132, as shown in Figure 2.
  • The multispectral measurement system 110 includes a retro-reflection measurement path comprising an illumination light path and an observation light path, one or more illumination sources 112, and a multispectral detector 114.
  • The illumination sources 112 and multispectral detector 114 may be separately mounted or positioned on a single circuit board or substrate, along with electronics and embedded firmware to operate the sensor according to a measurement sequence and to interface the measurement data to the application software.
  • The multispectral measurement system 110 is miniaturized and has a form factor suitable for integration into a mobile device or dedicated handheld measurement device. Additionally, the multispectral measurement system 110 is preferably adapted to extend valid non-contact measurement distance ranges to ±10 mm, ±20 mm, or more, relative to a target measurement distance.
  • The geometry of the retro-reflection measurement path may be configured to provide standardized aspecular measurement angles.
  • One standard measurement geometry for color measurement (CIE publication 15, 2004) is based on an illumination angle of 45° and a detection angle of 0°, and is commonly referred to as 45/0 measurement geometry.
  • This measurement geometry provides an aspecular angle of measurement, which is defined as the angular difference between the direction of specular reflection of the illumination in the center of the observation field and the corresponding observation angle at the center field position.
  • The aspecular angle is a relevant parameter for the amplitude of the surface reflection radiation.
  • The 45/0 measurement geometry has an aspecular angle of 45°.
  • An exemplary implementation of a multispectral measurement system 110 has a measurement path with a 22.5° illumination angle and a back-reflected 22.5° detection angle in the plane of incidence with respect to a surface normal.
  • This provides an aspecular measurement angle of 45° in the plane of incidence with respect to the specular surface reflection, which corresponds to the standardized 45/0 measurement geometry. Selecting the same aspecular angle is helpful because the surface reflections are of comparable magnitude to those in the standard 45/0 measurement geometry. This is useful if, at a later stage, the measurement data of the current system needs to be compared to the measurement data of an instrument with a 45/0 geometry. The conversion can be achieved with an algorithmic correction of the measurement results.
  • The illumination sources 112 and the multispectral detector 114 components of the multispectral measurement system 110 project light onto the sample and receive back-reflected light from the sample at substantially the same angle with respect to a surface normal of the surface being measured.
  • The illumination and detection angles are not necessarily precisely equal, because locating an illumination source 112 adjacent to a multispectral pick-up detector 114 may result in some minor difference between the illumination and detection angles. Accordingly, receiving back-reflected optical radiation at the same or at least substantially the same angle (e.g., typically within about ±5° in the plane of incidence) (Fig. 15) as the illumination optical radiation is referred to herein as retro-reflection measurement geometry. Differences in angles outside of the plane of incidence have less effect on measurement accuracy, and need not be within ±5° to be considered retro-reflection measurement geometry (Fig. 14).
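The angle definitions above can be captured in two small helpers. The ±5° in-plane tolerance follows the description; the function names are illustrative, not from the patent.

```python
def aspecular_angle_deg(illum_deg, obs_deg):
    """In-plane aspecular angle: the specular reflection of light arriving
    at illum_deg leaves at illum_deg on the opposite side of the normal,
    while the observation direction sits at obs_deg on the illumination
    side, so the two directions differ by the sum of the angles."""
    return illum_deg + obs_deg

def is_retro_reflection_geometry(illum_deg, obs_deg, tol_deg=5.0):
    """Per the description above, in-plane illumination and observation
    angles agreeing to within about +/-5 degrees count as
    retro-reflection measurement geometry."""
    return abs(illum_deg - obs_deg) <= tol_deg
```

This reproduces the examples in the text: a 22.5°/22.5° retro-reflection path gives a 45° aspecular angle (matching standard 45/0 geometry), and a 30°/30° path gives 60°; 45/0 itself is not retro-reflective since its angles differ by 45°.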
  • The mechanical size of the multispectral measurement system 110 may be very compact, since the illumination and detector components may be arranged at the same location on a small area, such as on a common socket or support part. Additionally, the illumination and observation light paths stay centered with respect to each other when the measurement distance varies.
  • The illumination and observation light rays are substantially collinear in the plane of incidence, as shown in Figure 3. This allows the field of illumination and field of observation to maintain alignment over a large measurement distance and permits accurate operation over a relatively large range of distance variations.
  • If the sample surface has some roughness or structure, the radiation reflected from the surface will affect the measurement results.
  • The color of the sample may be characterized by the material properties inside the material (e.g., sub-surface scattering).
  • The radiation from the surface is superimposed on the sub-surface radiation and perturbs the measurement results. Larger aspecular angles relative to the measurement surface reduce the corresponding surface effects from rough surfaces. Accordingly, the present invention is not limited to the specific 22.5° retro-reflection angles illustrated in the figures. Any aspecular angle greater than or equal to 30°, or more preferably 40°, may be appropriate, depending on the surfaces to be measured.
  • Such aspecular angles result in retro-reflection angles greater than or equal to 15°, or more preferably 20°.
  • An alternative example would be a 30°/30° optical system, which corresponds to an aspecular angle of 60°.
  • The retro-reflection angles may be in the range of 15° to 30°.
  • The design of the multispectral measurement system 110 may be defined for a pre-defined target measurement distance, as schematically shown in Figure 4.
  • In Figure 4, the position of the multispectral measurement system 110 with respect to the camera 122 in the non-contact multispectral measurement device 100 is shown.
  • The center of the measurement field at the pre-defined target measurement distance d is positioned at the intersection point with the optical axis of the camera 122.
  • The target measurement distance to the measurement plane should be selected so that the camera can achieve a sharp image of the sample at this distance and over the desired distance variation range.
  • A reasonable pre-defined target measurement distance d is in the range of 30 mm to 150 mm. Any other distance could equivalently be supported by an adapted design.
  • The multispectral measurement system 110 should generate accurate measurement results for a broad range of materials. Many materials are not homogeneous.
  • The size of the observation field of the detector pick-up optics may be selected to be sufficiently large with respect to the surface inhomogeneities in order to provide a representative average measurement result.
  • A typical observation field size o with a diameter in the range of 6 to 12 mm at the target measurement distance has been found to be appropriate.
  • The present invention is not limited to any particular size of observation field.
  • the illumination field is selected to over-illuminate the observation field of the multispectral pick-up detector. Over-illumination, in this context, refers to the size i of the illumination field exceeding the size o of the observation field at the measurement plane.
  • an over-illumination radius of 2 mm at the target measurement distance may be appropriate.
  • the over-illumination radius may need to be increased.
  • the over-illumination radius should be in the range of 4 mm to 10 mm or higher. The present invention is not limited to any particular range of over-illumination radius.
  • the desired field size m in the measurement plane (6-12 mm) is bigger than the desired package size of the full multispectral measurement system.
  • a measurement system with a miniature multispectral detector in the range of few millimeters will require diverging optical beams for the illumination and detector observation optical systems.
  • the solid light bundle is the observation beam; the dashed light bundle corresponds to the illumination beam.
  • the present invention includes a position correction system that provides information on the effective distance and angular orientation of the sample with respect to the multispectral measurement system 110.
  • This position/orientation information is used by correction algorithms that correct the measurement results as a function of distance and angle with respect to the target reference measurement geometry.
  • the position correction system may also be used in combination with the display of the non-contact multispectral measurement device 100 to provide guidance to the user to hold the non- contact multispectral measurement device 100 at an appropriate distance and angle with respect to the measured surface.
  • the retro-reflection measurement geometry of the multispectral measurement system 110 is well suited for such algorithmic position correction.
  • the optical design has the property that over a large distance variation range the resulting relative signal variation is the same for each spectral observation channel of the multispectral detector 114.
  • the distance correction can be described by a global relation between the detected signal level and the measurement distance that is common to all spectral channels.
  • Figure 6 shows the response of the multispectral detector at the target distance, at the target distance +5 mm, and at the target distance -5 mm.
  • Figure 7 shows the ratio of the response at the target distance to the responses at the +5 mm and -5 mm distances.
  • the figure represents the relative signal variation of the illumination over a distance range of +/- 5 mm with respect to the target distance. It can be seen that the relative ratio curves are constant over the full illumination wavelength range (400 to 700 nm).
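  • Because the relative ratio curves are constant across the full wavelength range, a single channel-independent scale factor can correct a measurement taken off the target distance. The following Python sketch illustrates this property (function names and numeric values are illustrative assumptions, not part of the disclosure):

```python
def distance_scale_factor(signal_at_d, signal_at_target):
    """Per-channel ratios of a reference (e.g. white tile) signal at
    distance d to the signal at the target distance; since the ratio
    curves are flat across wavelength, their average is a global factor."""
    ratios = [s / t for s, t in zip(signal_at_d, signal_at_target)]
    return sum(ratios) / len(ratios)

def correct_for_distance(measurement, scale):
    """Rescale every spectral channel by the same global factor."""
    return [m / scale for m in measurement]

# Illustrative: at d = target + 5 mm every channel reads 20% low.
white_at_target = [1.00] * 8
white_at_d = [0.80] * 8
scale = distance_scale_factor(white_at_d, white_at_target)     # 0.8
sample_at_d = [0.40, 0.48, 0.56, 0.64, 0.64, 0.56, 0.48, 0.40]
corrected = correct_for_distance(sample_at_d, scale)
```

The same scalar rescales all eight channels, which is what makes the algorithmic distance correction tractable.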
  • the multispectral detector 114 in the multispectral measurement system 110 may comprise a CMOS detector with a photodiode array.
  • Multispectral in this context means six or more different spectral channels, where each channel corresponds to a bandwidth of optical radiation.
  • Optical radiation includes visible, ultraviolet and infrared radiation.
  • An example of a commercially available six channel array is the AS7262 multispectral sensor, available from AMS AG. It is possible to apply more channels in the range of 16 or more. In this case the capabilities would correspond to a hyper-spectral or full spectral measurement system.
  • In a multispectral CMOS photodiode array, a precision spectral bandpass filter is deposited on each photodiode.
  • the filters may be realized by thin film coating technology in combination with photolithographic techniques to achieve the spatial microscopic filter pattern.
  • the present invention is not limited to CMOS photo-diode arrays. Alternative photosensitive detector array technology may be applied.
  • the spectral filters pass selected wavelengths of light to enable the spectral analysis. Color measurement requires spectral analysis over the visible wavelength region.
  • the number of spectral filters, the center wavelength of each filter, the bandpass (the full width at half maximum of the filter function) as well as the shape of the filter function have an impact on the achievable performance.
  • Figure 8 shows an example of a filter set composed of eight filter bandpass functions 801-808, respectively, selected to cover the visible range of light. More or fewer channels may be used. Spectral filters in the UV range below 400 nm and in the NIR range over 700 nm may also be included. The filters should continuously sample the specified spectral measurement range without gaps. The spectral measurement range for color application is typically the visible spectral range from 400 to 700 nm.
  • the spectral filters may cover the full sensitivity region of the underlying photodiode.
  • An example of a CMOS detector array is provided in Figure 9A.
  • Figure 9A illustrates a conventional eight channel array.
  • the numerals 1-8 in Figure 9A correspond to filter functions 801-808, as illustrated in Figure 8.
  • a sensor having multiple sets of photodiodes arranged in a periodic array may also be suitable.
  • A simplified viewing diagram for two photodiodes at two different locations with respect to a single aperture is provided in Figure 9C. Because each photodiode is located at a different location with respect to the aperture, each photodiode measures a slightly different portion of the field of observation of the surface being measured. Conventional photodiode arrays do not correct for such different viewing angles.
  • a photodiode array with complementary symmetrically-located detector photodiodes is illustrated in Figure 9B.
  • the CMOS detector in Figure 9B has more photodiodes than filter channels.
  • the detector array provides sixteen photodiodes in a 4X4 matrix.
  • Two photodiodes are provided with spectral filters corresponding to one filter channel, with each spectral filter function being realized at two complementary locations. The locations are symmetrically arranged about a center of the photodiode matrix.
  • This complementary, symmetrical arrangement of photodiodes allows for compensation of variations of the measurement results due to the different observation angles of the individual photodiodes in the matrix, with compensation in both axes of the two-dimensional photodiode array. Adding the signals of the corresponding detector pixels with the same spectral filter in the detector readout electronics cancels the measurement effect due to the different observation (viewing) angles. This provides consistent spectral measurement results for all different channels for the same angular measurement conditions.
  • the difference signal may be divided by the distance between the two filter locations on the photodiode array. This provides information about the angular variation of the measurement signal for each filter wavelength. The result of this operation is a spectral vector with additional information about the angular behavior of the material. This additional information may be used to refine the search in the color libraries or databases.
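  • The sum/difference processing for one complementary photodiode pair can be sketched as follows (the readout is done in detector electronics in the disclosure; the function name, pair separation, and signal values below are illustrative assumptions):

```python
def process_complementary_pair(sig_a, sig_b, pair_distance_mm):
    """Combine the two symmetrically located photodiodes sharing one
    spectral filter: the sum cancels the viewing-angle effect, and the
    difference divided by the pair separation yields the angular
    variation of the signal at that filter wavelength."""
    angle_compensated = sig_a + sig_b
    angular_gradient = (sig_a - sig_b) / pair_distance_mm
    return angle_compensated, angular_gradient

# Illustrative readings for two filter channels (values are made up).
pairs = {"channel_1": (0.51, 0.49), "channel_2": (0.62, 0.58)}
results = {ch: process_complementary_pair(a, b, 1.5)
           for ch, (a, b) in pairs.items()}
```

Collecting the gradients over all filter channels yields the spectral vector of angular behavior mentioned above.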
  • the multispectral detector chip may be placed in a compact chip package.
  • the observation field of the detector is defined by additional optical means like a lens or lenses, apertures or lamellar structures. These optical elements may be integrated with the detector package for a miniaturized solution or externally arranged in the mechanical housing of the multispectral measurement system.
  • An additional diffuser may be included between the detector pixels and the optical components for shaping of the observation field. The additional diffuser helps to average out measurement effects due to different viewing angles and inhomogeneities of the sample.
  • Figure 10 shows one example of an implementation of the multispectral color detector pick-up optical design with external lens 140 and an aperture 142 to define the illumination light bundle.
  • the lens is at a 22.5° angle with respect to the multispectral detector to create the desired retro-reflective measurement geometry. While only three paths of light corresponding to different photodiodes on a multispectral detector are illustrated for purposes of clarity, the invention is not so limited.
  • the optics for the illumination system for the shaping of the illumination beams may comprise mechanical apertures, lenses or lamellar structures to define the illumination cone angle.
  • Figure 11 provides an example of an optical detector pickup system with an external lens 140 and mechanical aperture 144 to shape the light bundle.
  • the multispectral measurement system 110 may be integrated inside a mobile device or may be attached outside to the casing of the mobile device.
  • Figure 12 presents another example of an optical design 150 comprising a folded optical path for the illumination and detection optical systems suitable for use in the present invention.
  • the folding of the optical path can be achieved by one, two or more surface reflections.
  • While a multispectral measurement system 110 is illustrated in Figure 12, a discrete multispectral detector 114 or illumination source(s) 112 may be substituted therefor.
  • an optical diffuser can be arranged between the exit surface of the multispectral measurement system 110 and the optical system.
  • the light path is folded by two surface reflections 152, 154 followed by a lens 156.
  • Alternative embodiments may implement a different number of surface reflections.
  • the lens function may be realized by one or multiple surfaces of spherical or aspherical shape.
  • the lens function may be realized with the reflecting surfaces, that is, the reflecting surfaces may be surfaces with curvature instead of being planar.
  • the lens function may be realized by a Fresnel lens.
  • the surface reflections and the lens are formed in a single component.
  • the optical component may be composed of a transparent material such as glass or polymer.
  • Figure 12 shows an aperture 158 after the lens surface. The optical ray path and ray pattern after the aperture towards the sample plane corresponds to the geometries shown in Figs. 10 and 11.
  • the illumination sources 112 for the multispectral measurement system 110 may comprise, but are not limited to, light emitting diode (LED) emitters. Any suitable lamp or emitter may be used.
  • the illumination sources 112 are placed in close proximity to the multispectral detector 114 optics.
  • the illumination sources 112 may be integrated in the same package as the multispectral detector 114.
  • White LEDs may be used for making measurements in the visible spectrum. For an extension of the spectral range,
  • UV LEDs and NIR LEDs may be added.
  • the white LEDs may be supplemented with additional LEDs having a narrower spectrum in the visible region.
  • Examples of placement of the illumination sources 112 with respect to the multispectral detector 114 are shown in Figures 13 and 14. Two illumination sources 112 are symmetrically arranged about the multispectral detector 114 in Figure 13. In Figure 14, four illumination sources 112 are illustrated with respect to the multispectral detector in the center.
  • the plane of incidence is defined by the central observation path of the multispectral detector (vector u) and the normal on the surface of the sample (vector v).
  • the observation path is inclined from the surface normal by 15° to 30° (angle θ).
  • the observation direction is inclined from the surface normal by 22.5°
  • the illumination path (vector i), when projected onto the plane of incidence, is inclined from the surface normal by the same, or close to the same, angle as the observation path (typically within 5°).
  • Another optical design goal is that the aspecular angle of the illumination channel of each illumination source 112 be constant with respect to the central observation direction.
  • the LEDs on both sides of the multispectral detector are symmetrically arranged in a line perpendicular to the plane of incidence going through the center of the active detector area (dashed line in Figs. 13 and 14). Due to the small lateral displacement of the LEDs and the detector active area, the observation path and illumination path are not fully co-linear but instead have a small angular deviation. Since the angular deviation is in the out-of-plane direction (perpendicular to the plane of incidence) it has little to no effect on the spectral measurements.
  • the electronics of the non-contact multispectral measurement device 100 may store information which is useful for the calibration and for the data processing, including spectral data of the filter functions for the detector and the illumination system LEDs, a white reference vector to transfer raw measurement data into calibrated reflectance factor values at the target measurement distance, distance correction polynomial coefficients, and linearity correction.
  • the optical diffuser may be mechanically placed over the measurement window of the detector pick-up channel. This may be accomplished, for example, with a mechanical slider positioned at the outside of the housing of the non-contact multispectral measurement device 100.
  • a position correction system comprises an optical pattern projector and a camera, such as the camera of a mobile device.
  • the optical pattern projector may project a set of position markers.
  • the position markers comprise individual dots. Different position markers and patterns are also contemplated, including continuous lines, rectangles, or circular patterns.
  • the optical pattern projector may project visible light in applications where visual guidance for a user is desired.
  • the optical pattern generator may also project non-visible light, such as NIR and UV, in applications where visible light may be a distraction or annoyance.
  • position markers may directly convey the position information itself in cases where the position sensor uses time of flight or stereo vision, so that searching for projected features on the sample is not needed.
  • Figure 17 illustrates an example of using epipolar geometry to identify the distance and orientation of three exemplary surface orientations.
  • an optical pattern projector is located at a known, fixed distance from a camera.
  • the projector beam projects multiple dots onto a surface of interest.
  • An image of the dots is acquired by the camera. Determining a location of one of the dots along its epipolar line provides 3-D location information for that dot relative to the camera and projector.
  • the determined 3D location of multiple dots may be fitted to a plane, and information concerning the distance and orientation of the plane relative to the camera and projector may be determined.
  • the 3D location of at least three dots is required to define a plane.
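  • The plane fit described above can be sketched as follows for exactly three triangulated dots (a minimal sketch; the disclosure allows fitting more dots, and all names and coordinates below are illustrative):

```python
import math

def plane_distance_and_tilt(p1, p2, p3):
    """Fit the plane through three triangulated dot locations (camera
    coordinates, z along the optical axis) and return the distance at
    which the optical axis meets the plane plus the surface tilt angle."""
    u = [b - a for a, b in zip(p1, p2)]
    v = [b - a for a, b in zip(p1, p3)]
    n = [u[1] * v[2] - u[2] * v[1],          # normal = u x v
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = math.sqrt(sum(c * c for c in n))
    n = [c / length for c in n]
    # Plane equation n . x = n . p1; intersect with the axis (0, 0, z).
    distance = sum(a * b for a, b in zip(n, p1)) / n[2]
    tilt_deg = math.degrees(math.acos(abs(n[2])))
    return distance, tilt_deg

# Three dots on a surface 100 mm away, tilted about the x axis.
d, tilt = plane_distance_and_tilt((0, 0, 100), (10, 0, 100), (0, 10, 110))
```

With more than three dots, a least-squares plane fit would be used instead; the outputs feed the distance and angle correction described above.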
  • Figure 18 is an illustration of dots 1802, 1804 found on epipolar lines 1806. For purposes of clarity, not all epipolar lines are illustrated.
  • the pattern projector and the camera are in the same non-contact multispectral measurement device 100 and in a fixed position relative to each other.
  • the geometry of the system is determined once during production as illustrated in Figure 16.
  • First the camera geometry is determined in step 1602.
  • the camera may be treated as a pinhole camera. Fixed focus may be used, and a projection matrix called the camera matrix is determined. Distortion parameters may also be modeled if necessary.
  • the geometry of the pattern projector is determined in step 1604.
  • the light beams from the projector travel in a straight line, so all points where the beam hits a target are also on this straight line.
  • the image of a beam is a straight line which may be referred to as an epipolar line, a common concept of stereoscopy.
  • This, along with the known, fixed distance from the camera to the pattern projector, provides the necessary calibration information for determining the 3-D locations of the projected position markers.
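  • The triangulation of a single dot can be sketched as follows, simplified to the horizontal axis of an ideal pinhole camera at the origin (a minimal sketch under those assumptions; the function name and numbers are illustrative, not from the disclosure):

```python
def locate_dot_on_ray(u_obs, focal_px, ray_origin, ray_dir):
    """Triangulate a projected dot: it lies on the calibrated projector
    ray (origin + t * dir, in camera coordinates), and its observed
    image column u_obs = focal_px * x / z fixes the parameter t."""
    x0, _, z0 = ray_origin
    dx, _, dz = ray_dir
    # Solve focal_px * (x0 + t*dx) = u_obs * (z0 + t*dz) for t.
    t = (u_obs * z0 - focal_px * x0) / (focal_px * dx - u_obs * dz)
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))

# Projector 30 mm beside the camera, beam parallel to the optical axis;
# the dot imaged at column 150 px (focal length 500 px) lies 100 mm away.
dot = locate_dot_on_ray(150.0, 500.0, (30.0, 0.0, 0.0), (0.0, 0.0, 1.0))
```

In the full system the search for the dot is restricted to its epipolar line, and the vertical image coordinate is handled the same way.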
  • the calibration data is stored in a calibration database 1608.
  • position markers are projected on to a surface in steps 1610 and 1612.
  • An image of the position marker is captured in step 1614.
  • Position markers are detected in the image in step 1616 and position information is calculated in step 1618.
  • the optical pattern generator module should be located near the multispectral measurement system 110 in Figure 2. It should illuminate the sample from substantially the same direction with a similar incidence angle as the multispectral measurement system 110. Both systems are aligned to be centered on the axis of the camera field at the reference measurement position. This ensures that the light field of the optical pattern generator and the observation field of the detector pick-up channel are overlapping for a large distance variation range. Thus, the distance and orientation angle information are sensed at or near to the same position as the spectral measurement.
  • the distance measurement range of the position correction system is determined by the viewing field of the camera. If the field of view of the camera does not limit either the detector pick-up field or the optical pattern generator field, an analysis of the acquired position and orientation information is followed by a correction of the measurement data. Other restrictions come from the signal level which decreases with increasing distance and from the focusing capability of the camera in the short distance area to provide a sufficiently sharp image for the image analysis.
  • Calibration data is needed to define algorithms and data parameters for correcting multispectral data produced by the multispectral measurement system. Calibration operates, among others, with the hardware of the multispectral measurement system.
  • the use case may be represented as a reference tile collection or as abstract information about their physical properties (e.g. gloss of printed samples).
  • a calibration workflow as presented here assumes that the position correction system has already been calibrated. Output of the calibration includes, but is not limited to:
  • the non-contact multispectral measurement device 100 may comprise a multispectral measurement system to obtain multispectral data of one or more patches, for example one or more color calibration patches.
  • the one or more color calibration patches may be selected from one or more materials or types of materials.
  • the multispectral measurement system calibration method 1900 may comprise step 1918 of forming a set of characteristic calibration data 1918CC.
  • the step 1918 may comprise acquiring multispectral data of one or more color calibration patches at one or more reference positions of a reference multispectral measurement system with respect to the one or more color calibration patches.
  • the one or more calibration patches may have one or more colors and be of one or more materials, for example having one or more appearance characteristics.
  • the set of characteristic calibration data 1918CC may be stored on a non-volatile computer-readable memory device.
  • the set of characteristic calibration data 1918CC may be used, for example, on a production line to calibrate and correct the color measurements of one or more multispectral measurement systems.
  • the multispectral measurement system calibration method 1900 may comprise a step 1918 of forming a set of position-related color correcting parameters 1918a.
  • the position-related color correcting parameters 1918a may be used for correcting the multispectral data acquired by, for example, a production handheld device comprising a production multispectral measurement system.
  • the data may be acquired at one or more positions, recorded as position data 1918PD, for example positions at which the characteristic calibration data 1918CC may have been acquired.
  • Position data 1918PD may comprise position and orientation measurements of the characterization device 1912, for example position and orientation of one or more of its sensors and
  • the step 1918 may comprise acquiring measurements for a plurality of patches or targets, for example a subset of the patches used for forming the set of characteristic calibration data 1918CC.
  • the position-related color correcting parameters 1918a may be production device- and position-specific.
  • the position-related color correcting parameters 1918a may be stored on a non-volatile computer-readable memory device, for example comprised on the handheld device.
  • the multispectral measurement system calibration method 1900 may comprise a process 1930 of forming multispectral data calibration parameters 1938 for each spectral device and use case.
  • the multispectral data calibration parameters 1938 may be material- or material type-specific.
  • the multispectral data calibration parameters 1938 may, for example, be represented as one or more matrices, for example a matrix for each material type.
  • material types may be: paper coated with matte ink; paper coated with glossy ink; skin; metallized paint, for example comprising effect pigments; fabric; marble; or a type of polymer.
  • the multispectral data calibration parameters 1938 may be stored on a non-volatile computer-readable memory device, for example comprised on the handheld device.
  • the step 1930 may comprise using data, for example color data, for example color space data, from one or more measured materials databases 1910.
  • the color space data may, for example be XYZ or L*a*b* data.
  • the step 1930 may comprise acquiring multispectral data of one or more color calibration patches referenced in the materials database 1910, at one or more positions, for example one or more reference positions, using the production device's multispectral measurement system 110 with respect to the one or more color calibration patches.
  • the step 1930 may comprise, for one or more materials or materials type of the materials database 1910, computing a material type-specific minimization 1936 of a sum of colorimetric distances taken over one or more patches of the material database 1910.
  • the colorimetric distance may comprise one or more of: a colorimetric term, for example expressed as a vector of 3 values in an XYZ or an L*a*b* color space; a multispectral data calibration parameters term, for example expressed as a matrix of dimensions 3x8; and a multispectral data term, for example expressed as an array of 8 values corresponding to 8 filter channels.
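  • The material type-specific minimization 1936 can be sketched as an ordinary least-squares fit of the calibration matrix. This is a simplification: the disclosure permits general colorimetric distances such as delta E; the 3-channel toy data below stands in for the 8 filter channels, and all names are illustrative:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_calibration_matrix(filter_vectors, xyz_targets):
    """Least-squares fit of the matrix mapping filter responses to XYZ:
    minimizes, summed over the calibration patches, the squared distance
    between the reference XYZ and M @ filter_vector (normal equations,
    each XYZ row fitted independently)."""
    n = len(filter_vectors[0])
    AtA = [[sum(fv[i] * fv[j] for fv in filter_vectors) for j in range(n)]
           for i in range(n)]
    rows = []
    for c in range(3):                       # X, Y, Z fitted independently
        Atb = [sum(fv[i] * xyz[c] for fv, xyz in zip(filter_vectors, xyz_targets))
               for i in range(n)]
        rows.append(solve(AtA, Atb))
    return rows

# Toy data: 3 channels, 4 patches, generated from a known matrix
# with rows [2, 0, 0], [0, 3, 0], [1, 1, 1].
filter_vectors = [[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]]
xyz_targets = [[2, 0, 1], [0, 3, 1], [0, 0, 1], [2, 3, 3]]
M = fit_calibration_matrix(filter_vectors, xyz_targets)
```

With noise-free data generated from a known matrix, the fit recovers that matrix exactly, which makes the technique easy to verify.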
  • the multispectral data calibration parameters 1938 enable the conversion of a multispectral acquisition using the multispectral measurement system 2200SS of the handheld device into, for example, colorimetric space values.
  • a user may select on the handheld device a type of material the color of which is to be measured, acquire one or more multispectral measurements of a sample or a patch of the material, and obtain, by way of the conversion of the multispectral measurement by the multispectral data calibration parameters 1938, colorimetric space values 2030 for the material.
  • the colorimetric space values 2030 for the material may be searched in the material database 1910 to retrieve a correct color, for example a nearest color match, of the material being measured.
  • the multispectral data calibration parameters 1938 may enable correct color measurement irrespective of the position and the orientation of the multispectral measurement system 2200SS with respect to the sample being measured.
  • the multispectral measurement system calibration method 1900 may enable a user to acquire one or more measurements, for example corrected measurements, of the sample from one or more orientations and positions, for example by continuously moving the non-contact multispectral measurement device 100 with respect to a sample or patch to be measured.
  • the device uses a multispectral sensor to obtain multispectral data.
  • The amount of data comprised in the multispectral data 2026 collected by the multispectral measurement system 2200SS may be greater than the amount of data that may be collected by, for example, an RGB or other color sensor, for example a trichromatic color sensor.
  • the mobile or handheld device comprising a multispectral measurement system 2200SS may comprise one or more methods for processing multispectral data, for example computer-readable instructions stored on a non-volatile memory device for processing multispectral data.
  • the multispectral data 2026 may be corrected using data from the positioning sensor 2200PS, for example one or more of orientation and position data relative to a sample or target being measured.
  • the calibration method 1900 may be used to determine parameters, for example multispectral data calibration parameters, which allow the use of multispectral data for all use cases. These detected values are corrected using position correction steps 2208, 2024, 2122.
  • the parameters may comprise sensor information described earlier allowing for increased flexibility when positioning the device with respect to the target.
  • Calibration determines parameters of the position correction algorithm, which are dependent on the sample or target type, for example, the type of material, dependent on one or more of sample texture, translucency, gloss, sparkle, color, and appearance. From a user viewpoint, the parameters may be dependent on a use case, for example to measure fabrics, skin, paint, paint comprising effect pigments, minerals, and polymers.
  • position-corrected multispectral data is used as an input to the colorimetric calibration step.
  • the colorimetric calibration step may be used for transforming the position-corrected multispectral data into colorimetric coordinates, for example one or more of XYZ, L*a*b*, CIE tristimulus coordinates, and further quantities that may be used to search a match, for example a color match, for the measured sample, patch, or target in the reference database (1910 in Fig. 19, 2034 in Fig. 20, 2130 in Fig. 21, 2220 in Fig. 22), for example in one or more of a Pantone Color Library, a commercial paint database, and a products database.
  • This part of the calibration is dependent on each device's properties and on the use case, for example the type of material.
  • the target-related values may be used to search in the reference database (1910 in Fig. 19, 2034 in Fig. 20, 2130 in Fig. 21, 2220 in Fig. 22) which may contain measurements made using the reference device and may be use case dependent.
  • the method for color calibration comprises improvements that may accelerate the calibration process.
  • Calibration data needed to obtain multispectral data from raw filter values (steps 1924, 1926 Fig. 22) is determined for each calibrated device and is largely independent from the use case.
  • Position correction parameters are largely dependent on the use case nature.
  • Calibration data 1938 needed to obtain reference data (e.g., color coordinates) from multispectral data is determined employing a virtual model 1934 of the device based on its sensing and illumination properties (in particular, 1922) in order to speed-up the calibration.
  • Device model 1936 and information about the use case e.g. as a collection of reference targets and their measurements by a reference device 1910
  • This simulation is used in an optimization 1936 that searches for proper parameters of the calibration algorithm.
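  • A minimal sketch of such a forward simulation: a simulated filter response is the wavelength-wise product of sample reflectance, illuminant spectrum, and filter transmission, summed over the sampling grid. The 4-band spectra, illuminant, and idealized filters below are illustrative stand-ins, not data from the disclosure:

```python
def simulate_channel(reflectance, illuminant, filter_fn):
    """One simulated filter response: wavelength-wise product of sample
    reflectance, illuminant spectrum, and filter transmission, summed."""
    return sum(r * s * f for r, s, f in zip(reflectance, illuminant, filter_fn))

def simulate_device(reflectance, illuminant, filters):
    """Predict all filter responses of the modeled device (1934)."""
    return [simulate_channel(reflectance, illuminant, f) for f in filters]

# Illustrative 4-band stand-ins for the real spectral data (1922).
illuminant = [0.9, 1.0, 1.0, 0.8]
filters = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
white = [1.0, 1.0, 1.0, 1.0]
responses = simulate_device(white, illuminant, filters)
```

Running such a model over the reference patch collection supplies the synthetic filter responses that the optimization 1936 fits calibration parameters against, avoiding physical measurements per device.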
  • calibration may be divided into three parts; use case- related calibration 1902, multispectral device calibration 1904, and combining use case- related calibration parameters and device calibration parameters 1930. Determining calibration parameters in stages allows for greater flexibility and reduced duplication of calibration efforts. For example, use case related calibration 1902 may be combined with different types of multispectral devices, and device calibration 1904 may be used with different use case parameters on an as-needed basis.
  • Use case calibration describes a general workflow to obtain the reference data for a given particular color database (print, skin tone, architectural paint, textiles, etc.) and is performed for each use case or when reference measurement procedure is changed.
  • Multispectral device calibration is done for each non-contact spectral measurement system.
  • positioning correction parameters determination 1918 is done once for larger batches of devices to be calibrated.
  • Use case calibration may be done only occasionally and is not necessarily synchronized with the device calibration workflow.
  • Part of the use case related calibration may be performed with a reference device 1906.
  • Different sample types often require different reference devices. For example, measurements of skin samples may require non-contact measurement with spherical spectrophotometers. Additionally, to allow for database creation to be distributed amongst different users and/or institutions, the reference device should have good inter-instrument agreement, a tight population and good reproducibility. Measurements do not necessarily have to be made in a laboratory with a high-grade instrument. The measurement may be performed by users using a multispectral device as described herein or other mobile multispectral devices, especially if the datasets for a use case are small, e.g. if the data collection holds skin measurements of one user (steps 2128 in Fig. 21 and 2226 illustrate the data collection update).
  • the reference device preferably has the same or a similar geometry as the multispectral measurement system to be used on a non-contact multispectral measurement device 100, but it does not need to be the same if there are advantages to using a reference device that has a geometry that is better suited to a particular set of samples for a use case or if the use case requires more precise measurement than the multispectral measurement system can provide.
  • Reference device 1906 measures samples 1908 that are representative of a given use case. Color data, metrics and metadata may be stored in a reference database 1910.
  • the characterization device 1912 should have optical geometry and
  • the characterization device is used to perform calibration steps which are representative of a type or class of multispectral measurement systems, and may be used to derive calibration parameters for that type of multispectral measurement system as applied to a given use case. Pre-characterizing parameters common to a class of multispectral measurement systems reduces the per-device calibration effort.
  • (Step 1904, Fig. 19.)
  • In step 1914, the transfer is done to account for the different optics, measuring geometry and/or measuring procedure of the reference device.
  • This step may be done by combining the reference optics with the multispectral detector to define how the optics of the reference device influence the transfer parameters 1914a.
  • Recommended measurement parameters may be defined in step 1916, including integration time, gain, signal to noise ratio (SNR), etc. 1916a, and the parameter of averaging procedure. They are later stored to define measurement parameters and statistical measurement correction (SMC), c.f. 2214, Fig. 22.
  • the positioning correction parameters may be defined in step 1918, including positioning correction data and positioning bounds 1918a.
  • A correspondence transformation is determined between the characterization device and reference device in order to bring their measurements close to each other.
  • the transformation can be defined as follows. If the reference device signal is a vector R_ref of dimension n_ref, while the characterization device delivers filter responses F_test, a vector of dimension N_test, the correspondence operator denoted by B can be thought of as a series of vector-matrix operations.
  • the weighting parameters w_cor and the (generally nonlinear) distance function E_cor can be a vector norm or a nonlinear metric such as colorimetric delta E. Additional matrix-vector (in)equality constraints can be set to ensure that, for a white tile, the difference in signals or XYZ coordinates does not drift far apart after the nonlinear transform.
  • step 1918 calibration to obtain correction parameters to correct for variations in distance and orientation of the multispectral measurement system relative to the surface being measured is performed (step 2024, 2122, 2208).
  • measurements by a multispectral measurement system 110 under test are recorded with a target sample at different distances, which vary from a target distance d by an amount v, and at various angles θ to the sample surface.
  • the sample measurements may include targets relevant for the use case as well as colors with different media properties (e.g. sample sets with different gloss properties or color reference samples like BCRA tiles).
  • the series of measurements allows one to design a correction scheme for using the positioning system to guide the multispectral measurement system into the desired position(s) for the non-contact measurement, and to set the positioning bounds within which the desired trade-off between usability and precision may be attained (used in steps 2016, 2110, 2112).
  • the boundaries take the precision of the positioning sensor into account; this precision can be transferred into errors of the pseudospectra to derive acceptable bounds.
  • these measurements may be performed using the characterization device and can be assumed quasi-static, i.e. constant for large numbers of devices under calibration.
  • correction of the distance h to the required position h_0 may take the form of a polynomial
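A minimal sketch of such a polynomial distance correction, assuming a multiplicative gain polynomial in h − h_0 whose coefficients would come from the step 1918 characterization (the coefficient values used below are placeholders, not characterized values):

```python
def distance_correct(filter_values, h, h_0, coeffs):
    """Rescale filter responses measured at distance h back to the
    calibrated distance h_0.  coeffs: polynomial coefficients c_0..c_m of
    the gain, lowest order first; c_0 is typically 1 so that no correction
    is applied at h == h_0."""
    dh = h - h_0
    gain = sum(c * dh**k for k, c in enumerate(coeffs))
    return [f * gain for f in filter_values]
```

For example, with placeholder coefficients [1.0, 0.01] a measurement 10 mm beyond the calibrated distance is scaled by a factor of 1.1.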
  • Angle of orientation correction may include bidirectional reflectance distribution function (BRDF) parameters, such as a model of the surface effects (e.g., amount of gloss, texture), sample characteristics and substrate nature.
  • BRDF bidirectional reflectance distribution function
  • An example of applying a BRDF model is a correction based on the Oren-Nayar model with dynamically determined parameters. See, for example, Michael Oren and Shree K. Nayar.
  • filter responses F_1, ..., F_N_test for a series of use-case-dependent patches with known material properties are measured during step 1918 at the calibrated distance h_0 and varying orientation angles (for example, varying in-plane orientation angle θ).
  • Measured filter responses and known material properties are used to derive parameters of the orientation correction step.
  • parameters d_1, d_2 are derived such that material properties (e.g. σ) during the measurement process (Fig. 20) of an unknown sample may be put into correspondence with the distance-corrected filter responses F_1, ..., F_N_test, e.g. σ as a function of the F_i.
  • Parameters, e.g. d_1, d_2, are part of the position correction algorithm parameters 1918a which are later stored in the memory of the multispectral sensing device. This procedure may include different forms of data processing instead of summing the multispectral data, such as taking the mean, maximum, etc.
  • surface properties are measured during step 1918 using the plurality of calibration patches. For example, changes of the filter responses F_1, ..., F_N_test at the calibrated distance are measured on dark calibration patches to define the orientation-dependent surface component Δ_s(σ, θ) as the difference between the filter response at the calibrated orientation and at a varying orientation value.
  • the component may be defined as a dependence, using a spline approximation of measured values as a function with arguments σ, θ.
  • the system optics may include diverging light beams. This property causes the signal to change with distance.
  • the correction may be made as follows. Assume that the multispectral device delivers multispectral data F_1, ..., F_N for some N, the calibrated distance and orientation are h_0, θ_0, and the distance and orientation measured by the position correcting system are h, θ. The correction algorithm has the following inputs:
  • Outputs of the algorithm are the distance- and orientation-corrected multispectral values
  • the algorithm may be implemented as follows.
  • the distance correction is made based on the distance shift h − h_0.
  • the corrected multispectral data is summed up.
  • the Oren-Nayar model is used to find the relation of the filter value at the desired orientation to the filter value at the current orientation; denote it k(σ, θ). Then, distance-corrected filter values are multiplied by k(σ, θ) to arrive at the orientation- and distance-corrected filter values F̃_1, ..., F̃_N.
  • an orientation-dependent surface component Δ_s(σ, θ) depending on the orientation angle θ and defined during step 1918 is subtracted from the distance-corrected multispectral data before it is multiplied by k(σ, θ).
  • Inputs of this step include the distance-corrected multispectral data F_1, ..., F_N.
  • Outputs of this step include the distance- and orientation-corrected multispectral data F̃_1, ..., F̃_N.
  • This procedure may be iterated: distance correction is defined again followed by the orientation correction.
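The correction steps above can be sketched as follows, under simplifying assumptions: the Oren-Nayar geometry is collapsed to a single angle (θ_i = θ_r = θ, zero azimuth difference), Δ_s is passed as a per-channel vector, and all names are illustrative rather than the patent's implementation. The ratio k(σ, θ) is the model radiance at the calibrated orientation θ_0 divided by that at the measured orientation θ.

```python
import math

def oren_nayar_radiance(theta, sigma):
    """Simplified Oren-Nayar radiance for surface roughness sigma
    (facet-angle standard deviation, radians) at view/illumination
    angle theta."""
    a = 1.0 - 0.5 * sigma**2 / (sigma**2 + 0.33)
    b = 0.45 * sigma**2 / (sigma**2 + 0.09)
    return math.cos(theta) * (a + b * math.sin(theta) * math.tan(theta))

def orientation_correct(filter_values, sigma, theta, theta_0, delta_s=None):
    """Apply the orientation correction to distance-corrected filter
    values: subtract the orientation-dependent surface component, then
    rescale by the Oren-Nayar ratio k(sigma, theta)."""
    if delta_s is None:
        delta_s = [0.0] * len(filter_values)
    k = oren_nayar_radiance(theta_0, sigma) / oren_nayar_radiance(theta, sigma)
    return [(f - d) * k for f, d in zip(filter_values, delta_s)]
```

At θ = θ_0 the ratio is 1 and the data passes through unchanged; correcting a tilted measurement back toward normal incidence increases the filter values for a weakly rough surface, as the model predicts.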
  • the material properties (e.g. cavity angle standard deviation σ) as well as the material correction parameters may be set or corrected by the user in the settings 2036.
  • material parameters may be defined automatically using the device illumination and a picture of the sample taken by the camera 2010. Knowing the geometry of the illumination relative to the camera pickup, one can relate reflectance or color information detected by the camera at a specific position of the sample to the angle of the incoming illumination beam at this position.
  • the BRDF may be derived from the correction model parameters as set forth above and/or the BRDF may be detected automatically using the camera 2010. Automatic BRDF detection requires no presetting of the position correction parameters in the nonvolatile device memory. Corrections may be made to multispectral data based on such BRDF information.
  • Another part of the calibration process in characterizing the device 1920 includes the steps to define multispectral calibration parameters relevant for each multispectral measurement system, but not for the use case, so it can be done once for each device (steps 1922, 1924, 1926). This involves determining saturation, optimal gain and other parameters to optimally measure the samples of the use case. Data like the signal-to-noise ratio (SNR) and the noise model are used to model the test device and the device under calibration.
  • SNR signal-to-noise ratio
  • Calibration for each multispectral measurement system 110 may be divided into two parts. The first part aims at defining parameters which help transform raw data from the multispectral detector into multispectral information during the early stages of data processing (cf. step 2022, Fig. 20, step 2206, Fig. 22).
  • These are calibration steps such as measuring neutral targets in step 1924 to determine white transfer and linearity information 1924a and the blacktrap measurement in step 1926 to determine black offset information 1926a. They are needed to compute multispectral information before position correction in step 2208, Fig. 22.
  • the second part of the calibration is performed for each multispectral measurement system 110 and for each use case. It delivers parameters defining the computation of colorimetric coordinates or other data computed in step 2216, Fig. 22, which are needed to perform the search in the reference database in step 2220, Fig. 22.
  • the filter curves correspond to the filter functions for the photodiodes of the color detector, as illustrated in Figure 8. Using a quadrature rule Q over the wavelength domain of the multispectral sensor, a mathematical approximation of the filter responses is created
  • The term ε includes additional parameters, such as noise defined for the characterization device.
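A minimal sketch of the quadrature model of a filter response, with the trapezoidal rule standing in for the unspecified quadrature rule Q and the noise term ε omitted from this deterministic sketch; the argument names (filter transmission curve, illuminant power, sample reflectance) are assumptions:

```python
import numpy as np

def simulate_filter_response(wavelengths_nm, filter_curve, illuminant, reflectance):
    """Approximate one filter response as a quadrature over wavelength of
    filter transmission * illuminant power * sample reflectance."""
    integrand = (np.asarray(filter_curve)
                 * np.asarray(illuminant)
                 * np.asarray(reflectance))
    w = np.asarray(wavelengths_nm, dtype=float)
    # Trapezoidal rule as a concrete stand-in for the quadrature Q
    return float(np.sum((integrand[:-1] + integrand[1:]) / 2.0 * np.diff(w)))
```

Simulated responses of this form, compared against measured filter values, are what the model-trimming measurements of step 1928 would adjust.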
  • additional measurements may be done in step 1928 to further trim the model by comparing simulated filter values with the real filter values for a limited number of filters. They may be used to adjust the model in step 1936 or serve as support vectors during optimization in step 1938. They may also be used to adjust the operator B in step 1914. For some devices, the number of measured targets should be increased if the use case is not yet studied. This part is not limited to measurements during the production. They may also be performed by the user to adjust the model or account for the instability and drift in the non-contact multispectral measurement device 100 hardware.
  • the samples may be unrelated to the use case in question, e.g. they may be part of the Pantone color library.
  • Physical targets may be produced by a third party company. Their measurement and subsequent calibration parameter correction is then a separate part of the device workflow (Fig. 20) with the option of doing the correction on the server.
  • Using the calibrated device model, device parameters such as SNR, samples measured with the calibrated device, and the reference device database, optimized parameters are derived and stored in step 1938.
  • the optimized parameters are used to derive filter or colorimetric data from values from the multispectral color detector.
  • a formula similar to the one for the operator B may be used to adjust the signals before calibration.
  • the calibration parameters are determined during the calibration step. The underlying optimization is similar to the optimization problem for the operator B.
  • the optimization delivers a linear operator to transform the multispectral coordinates to XYZ colorimetric coordinates, followed by transformation into L*a*b* coordinates, followed by another linear correction.
  • the correction model may be as follows.
  • LAB is a function implementing the standard L*a*b* calculation from XYZ
  • Q_calib^XYZ is a matrix with 3 rows and N_calib columns
  • the vector d_calib^XYZ has 3 rows.
  • These elements are part of the multispectral data calibration parameters 1938 and may be defined in step 1936 by obtaining simulated raw data F^calib,sim, transforming it to multispectral data and running an optimization.
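The correction model above may be sketched as follows, assuming Q_calib^XYZ is a 3 × N_calib matrix, d_calib^XYZ a 3-vector offset, and the final linear correction a 3 × 3 matrix M acting on L*a*b* values; the LAB helper writes out the standard CIE XYZ → L*a*b* conversion for a given white point. This is an illustrative sketch, not the patent's exact pipeline.

```python
import numpy as np

def lab_from_xyz(xyz, white):
    """Standard CIE XYZ -> L*a*b* conversion for white point `white`."""
    def f(t):
        return t ** (1/3) if t > (6/29)**3 else t / (3 * (6/29)**2) + 4/29
    fx, fy, fz = (f(c / w) for c, w in zip(xyz, white))
    return np.array([116*fy - 16, 500*(fx - fy), 200*(fy - fz)])

def corrected_lab(multispectral, Q_xyz, d_xyz, M,
                  white=(95.047, 100.0, 108.883)):
    xyz = Q_xyz @ multispectral + d_xyz   # linear multispectral -> XYZ
    lab = lab_from_xyz(xyz, white)        # standard L*a*b* computation
    return M @ lab                        # final linear correction
```

With identity matrices and a zero offset, a "multispectral" vector equal to the white point maps to L*a*b* = (100, 0, 0), which is a convenient sanity check on the conversion.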
  • Various merit functions are possible, including hit-rate.
  • the optimization step may be done on a non-contact multispectral measurement device 100 or on a cloud server.
  • the calibration and correction algorithms are designed to work with the spectral measurement data. This concept provides higher accuracy and increases the flexibility to use the data sets in an open system architecture supporting different applications with different calibration requirements. Calibration profits from the multispectral nature of the data. It is also possible that the number of channels and their positioning would allow for direct computation of the colorimetric coordinates in case of a hyperspectral or full spectral measurement system (e.g., when the channels deliver an approximation to the signal over the full wavelength range with 10 nm or smaller step size). In a hyperspectral or full spectral measurement system the calibration matrix can also be avoided.
  • Device calibration may also involve measuring samples from a sample library 1928. This allows each calibrated device to include corrections to the calibration data. This step is presented as a separate block in Figure 19. This step may partly be delegated to the end user, if sample data with controlled standard properties is distributed to the user or purchased by the user.
  • Figures 20a, 20b, 21, and 22 illustrate steps involved in making a measurement with a non-contact multispectral measurement device 100 according to the present invention from three perspectives: time, sequence and dataflow, respectively.
  • Figures 20a and 20b show examples of how a work flow progresses over time. Referring to Figure 20a, the initial step is to start the software application 2002.
  • device position i.e., distance and orientation with respect to the surface of interest
  • Position guidance is provided to the user 2008 to position the non-contact multispectral measurement device 100 at an appropriate distance and orientation to the surface to be measured.
  • the appropriate use case calibration data is retrieved in 2010. Position continues to be measured and computed, 2014, and if position permits measurement 2016, one or more
  • SMC statistical measurement correction procedure
  • Measurements may be made with and without activating LED illumination sources 2018 to allow for correction for ambient light 2020 with input parameters such as white transfer and black offset 2022.
  • Parameters for step 2022 are found during the device specific calibration steps 1922, 1924, 1926 of the device specific calibration process 1904 as described earlier. Additionally, ambient light may be measured by an ambient light sensor. Time modulated light and demultiplexing may also be used for ambient light correction (lock-in techniques). Corrections to multispectral data from the multispectral detector may be made to compensate for device distance and orientation (parameters 1918a are found during step 1918 of the optimization) to the sampled surface 2024 and for ambient light 2020 to produce corrected multispectral data 2026. The multispectral data may then be corrected for the selected use case 2028.
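The paired on/off ambient correction described above can be sketched as a simple per-channel subtraction, assuming the ambient contribution is constant between the two frames; the black offset and white gain parameter names are placeholders for the 1926a and 1924a calibration data:

```python
def ambient_corrected(signal_led_on, signal_led_off, black_offset, white_gain):
    """Subtract the LED-off (ambient-only) frame from the LED-on frame per
    channel; the constant ambient term and the black offset cancel, and the
    white-transfer gain scales the result."""
    corrected = []
    for on, off, b, g in zip(signal_led_on, signal_led_off,
                             black_offset, white_gain):
        corrected.append(((on - b) - (off - b)) * g)  # ambient cancels out
    return corrected
```

Time-modulated illumination with lock-in demultiplexing, also mentioned above, generalizes this two-frame subtraction to a modulated sequence of frames.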
  • Parameters of the correction 1938 are found during device and use case specific calibration as described earlier. Reference data is accessed in 2030. The corrected spectral data may then be used to search 2032 a color database 2034, results may be displayed 2036.
  • the multispectral measurement system 110 is activated in step 2050 to obtain multispectral filter responses 2052.
  • the multispectral filter responses are corrected for white transfer, linearity, and black offset parameters 2054 in step 2056.
  • Correction for ambient light is performed in step 2058 to provide multispectral data 2060.
  • device position i.e., distance 2068 and angle of orientation 2082 with respect to the surface of interest
  • Corrections to multispectral data from the multispectral detector may be made to compensate for device distance 2068 to the sampled surface 2074 to produce distance corrected multispectral data 2072.
  • Processed multispectral data 2076 is stored in memory.
  • Orientation data 2082 and orientation calibration information 2078 are accessed.
  • material properties 2084 of the surface being sampled are accessed depending on use case.
  • the multispectral data may then be corrected 2086 for orientation of the multispectral system toward the surface being sampled 2082, a model of the surface being sampled 2088, and other surface properties 2090, to produce distance and orientation corrected multispectral data 2092.
  • Figure 21 illustrates a logical sequence for obtaining color measurements with a device according to the present invention, including several loops that may be interrupted by an event triggered by user action or by a timeout.
  • the workflow in this section starts when the user opens the application 2102.
  • the positioning system, optionally together with the camera, starts observing the scene to bind position markers 2104.
  • a decision is made whether to proceed with a measurement.
  • Device position may continue to be monitored in either case, at 2104, 2108.
  • the display may present audio-visual information to guide the user closer to the desired position.
  • the camera positioning markers of the positioning system may be displayed so that the approximate position of the measurement is visible.
  • the camera can further capture the whole scene of which the sample is a part, so that the user gets additional information about the sample together with its position in the scene.
  • the application waits for user input to start a measurement or a series of measurements.
  • the positioning markers are used to identify device distance and angle relative to the surface being sampled.
  • the application provides a visual indication of whether position is in bounds 2112 so that the measurement can be performed within bounds that are capable of being corrected.
  • the bounds are decided during the step 1918 of the calibration.
  • the application decides whether the position is close enough 2112 so that the sensor correction may be applied while guidance is shown to the user on the screen. If the decision is negative, a decision is made by the user or by timeout whether a measurement should be made or whether further positioning will be attempted.
  • a measurement is performed 2116 as described below with and without additional illumination to compensate for the ambient light.
  • the measurement may be stopped in 2118 or additional measurements may be made.
  • the measurement is stored in device memory and a loop termination condition is applied. If the loop continues, the series of measurements is made as was described earlier.
  • the data collected during the loop is processed by applying steps including correcting ambient light and computing multispectral data 2120, correcting for distance and angle to the surface 2122, and computing reference data 2124. Aggregated and processed information from positioning sensor, camera and multispectral sensor is passed for search 2126, the result is visualized 2128, and additionally the sample data collection 2130 is updated. After the workflow is finished, the application is put into stand-by mode for the user to start another series of measurements.
  • When the application is started, but before the sample acquisition command is issued, the system is active with the positioning sensor observing the scene. It identifies the positioning markers. Several measurements are used to perform averaging and position correction aiming at colorimetric information.
  • the system has separate settings that may be accessed by the user through the application.
  • Before starting the measurement or before performing the measurement in the sample data collection (usually including, but not limited to, a color library with corresponding metrics), the user can change the use case identity and supply metadata related to a particular measurement such as user age, geographic position, time of measurement, etc. In some applications, e.g. when the measured sample is human skin, this metadata influences calibration data and can be used for search metrics; thus the sample data collection is not necessarily reduced to color.
  • Advanced settings include measurement parameters such as averaging procedure, gain and integration time.
  • a single measurement in this scheme may include a number of illumination and measurement pairs to compensate for the ambient light.
  • the triggering of the measurement would be by the device and associated programming, not a manual input. In this case, the display and user interface serves to input the settings.
  • the hardware is activated.
  • Filter responses are measured 2204 by the multispectral detector.
  • the signal measured by the multispectral detector passes the initial processing step where dark current, linearity etc. are corrected 2206.
  • White correction and filter crosstalk correction are applied leading to multispectral information 2208. These parameters are found during steps 1922, 1924 and 1926 of the calibration as described earlier.
  • device camera acquires images of the sample to define the distance and orientation of the device relative to the sample 2210.
  • the positioning system may be designed to function with the device camera, have its own camera or be used without the device camera to define the position (e.g. use time of flight or employing stereovision).
  • sample metadata 2214 may be used in the correction process 2208 (e.g. when correcting pseudospectra from the skin) or during the search in the sample data collection (e.g. searching for the last skin data related to the user).
  • the multispectral data is corrected for the ambient light 2208.
  • the multispectral data are further corrected using the positioning information and an algorithm for position correction determined for each use case during the calibration (step 1918 of the calibration process described earlier).
  • Positioning information includes, but is not limited to, distance and orientation (e.g., sample curvature can be supplied by the positioning system). As described above, a single measurement may be a part of a series of measurements. These measurements may be used to further interpolate the values to the desired position and perform averaging to minimize positioning sensor error and other random effects as described in the next paragraph. These parameters are part of the SMC parameters found in the step 1920 of the calibration.
  • Corrected filter responses and position information are supplied to the next processing step (also in 2208) where the filter responses are averaged in case multiple measurements were performed. They are further corrected using the calibration parameters 2218 supplied by the calibration of the multispectral measurement system 2216.
  • Calibration includes, but is not limited to, calibration to account for geometry or other differences between the reference device and the multispectral measurement system. During device-specific and use-case-specific calibration these parameters are found and stored in 1938. The calibration data also corrects for individual device features, so that unified data may be used with the sample database.
  • the calibration is used to compute the reference data (e.g. colorimetric data) from the multispectral data. For example, XYZ or L*a*b* values are computed.
  • the data is specific to the use case identity and can possibly be altered using the metadata.
  • the calibration parameters are stored in the processing unit. They may additionally be accessed from a cloud-based server or on the server by supplying the unit identity.
  • the spectral data may be transferred to the cloud server for further processing.
  • the reference database 2220 may be stored on the server, although this step may also or alternatively happen in the processing unit.
  • the database may comprise sample color data for different illuminations measured by a third-party supplier as well as data related to the user (such as skin data for the user taken over time) and thus created by the user or by a group of users.
  • Part of the calibration data for the colorimetric system may be stored on the server/cloud and accessed using the unit ID for post processing.
  • the database may also include products related to the sample measured and identifiable by the color and appearance information (such as foundation products for skin or paint).
  • the sample data collection also contains metrics and a set of rules to find the closest match to a measured sample; thus it may include some functionality of an expert system. These procedures are used to search the color library to find the closest match 2222. For example, the match may be decided using the metadata, such as looking for the last measurement of the user's skin. Alternatively, the closest match may be sought by using colorimetric data, e.g. by calculating a metric comprised of a weighted sum of delta E to the measured sample under different illuminants.
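The weighted delta E search can be sketched as follows, using CIE76 delta E and illustrative weights; the metadata-based rules the passage also allows are omitted from this sketch, and the data-structure layout is an assumption:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference between two L*a*b* triples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

def best_match(sample_labs, library, weights):
    """sample_labs: {illuminant: Lab triple} for the measured sample;
    library: {entry_name: {illuminant: Lab triple}};
    weights: {illuminant: weight}.  Returns the entry minimizing the
    weighted sum of delta E over the shared illuminants."""
    def score(entry):
        return sum(weights[i] * delta_e_76(sample_labs[i], entry[i])
                   for i in sample_labs)
    return min(library, key=lambda name: score(library[name]))
```

Scoring across several illuminants penalizes metameric library entries that match under one illuminant but diverge under another.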
  • the database may be updated 2226, e.g. the database of skin measurements of the user is augmented or the color information of a new inspiration is added together with the data supported by the user.
  • the workflow is then passed back to the device 2228, more specifically, to the application which may display the color of the inspiration item 2230 or sample information 2232. If the measurement and/or search in the database have failed, the workflow is also passed back to the application 2234.
  • the camera of the device is used for targeting and positioning of the sample with respect to the device.
  • the acquired images of the camera may be visualized on the device display.
  • the individual images may be overlaid by positioning marks to guide the user to an optimal position. This provides direct user interaction for the operator and helps to support and control the measurement workflow.
  • the positioning marks may be placed on the images based on information from the positioning sensor. If a dot pattern projector is used, the dots themselves may be used to guide the user, as shown in Figure 25, where circles indicate the range where the dots shall be. Another option is to use the position information to place marks on the image.
  • the SW application has stored information available to calculate the location of the detector pick-up area on the sample for different distance and angular orientations. This can be done by trigonometric calculations. This helps the SW application to provide on the display a position marker for the virtual position of the measurement field of the color sensor. The user may use this virtual marker to position it at the desired location in the sample image.
  • multispectral measurement device 100 with the spectral sensor.
  • the marker is red when the alignment is out of the angular tolerances.
  • the marker becomes green as soon as the angular alignment is within the pre-determined tolerance limits. This is shown in Figure 26.
  • a similar concept can be applied for the distance control. This is shown in Figure 27.
  • Such alignment features may be superimposed to the real camera image of the scene.
  • a first method to initiate a measurement cycle and create a valid result is to align the distance and the angular orientation of the non-contact multispectral measurement device 100 until the display indicates that the position is within the tolerance bands around the reference position. Then, start a single measurement by a control interaction.
  • the control interaction may comprise pressing a button or voice command.
  • a second method is to execute a series of multiple measurements around the reference position. All measurements are stored. For each measurement the distance is corrected. The angular information for each measurement is used for an interpolation of the measurement results at the reference angle.
  • Exemplary embodiments of the present invention include, but are not limited to, the following.
  • a multispectral measurement system for measuring reflectance properties of a surface of interest comprising a multispectral detector configured to measure spectral information in a plurality of bands of optical radiation, each band of optical radiation corresponding to a filter function, the multispectral detector comprising a plurality of photodiodes, each photodiode having a filter corresponding to one of the filter functions, there being at least two photodiodes corresponding to each of the filter functions located in a point-symmetric arrangement in a two-dimensional array; and observation optics having an aperture and being configured to observe the surface of interest; wherein each of the plurality of photodiodes is located in a different location with respect to the aperture, and wherein differences in a field of view of the surface of interest for each photodiode are compensated for by combining measurements of point-symmetric photodiodes.
  • the multispectral measurement system as above further comprising at least one illumination source illuminating at least the field of view.
  • the at least one illumination source is a non-collimated illumination source.
  • the multispectral measurement system further comprises at least one illumination source illuminating at least the field of view, wherein the at least one illumination source emits optical radiation over a range of visible light.
  • the multispectral measurement system comprises at least one illumination source, wherein the at least one illumination source emits visible light radiation and infra-red radiation, and wherein the multispectral detector has filter functions corresponding to visible light and near infra-red optical radiation.
  • the multispectral measurement system comprises at least one illumination source, wherein the at least one illumination source emits visible light radiation and ultraviolet radiation, and wherein the multispectral detector has filter functions corresponding to visible light and near ultraviolet optical radiation.
  • a non-contact multispectral measurement device comprising the multispectral measurement system as above, and further comprising: a position measurement system for measuring position values of the multispectral measurement system relative to the surface of interest; and means to correct multispectral values from the multispectral measurement system based on detected position values from the position measurement system.
  • the non-contact multispectral measurement device as above, wherein the measurement device comprises a mobile communications device.
  • the position measurement system is selected from the group consisting of: a pattern projector and a camera, a camera autofocus system, a stereo vision system, a laser range finder, and a time of flight distance sensor.

Abstract

A multispectral measurement system for measuring reflectance properties of a surface of interest, comprising a multispectral detector configured to measure spectral information in a plurality of bands of optical radiation, each band of optical radiation corresponding to a filter function, the multispectral detector comprising a plurality of photodiodes, each photodiode having a filter corresponding to one of the filter functions, there being at least two photodiodes corresponding to each of the filter functions located in a point-symmetric arrangement in a two-dimensional array; and observation optics having an aperture and being configured to observe the surface of interest; wherein each of the plurality of photodiodes is located in a different location with respect to the aperture, and wherein differences in a field of view of the surface of interest for each photodiode are compensated for by combining measurements of point-symmetric photodiodes.

Description

Non-contact Multispectral Measurement Device with Improved Multispectral Sensor
Background
Color measurement as part of a workflow has traditionally been performed with contact mode color measurement instruments. One example is the "Capsure" instrument from X-Rite Inc. The Capsure instrument includes an integrated display and is capable of standalone operation. Another example is the color picker instrument "ColorMuse" from Variable Inc. This color picker instrument incorporates basic color measurement functionalities. The color picker instrument communicates over Bluetooth with a smart phone or a tablet which runs a companion software application. These systems are operated in contact mode (i.e., the sample being measured and the instrument are in physical contact). The optical systems of these devices cannot be used for non-contact measurement operation over a large range of positions, ambient illuminations, illumination and pickup spot sizes.
The instruments are based on a 45/0 measurement geometry or a different standard reference measurement geometry such as d/8. In the case of standardized measurement geometries, the measurement distance and the measurement spot size imply restrictions for the minimum mechanical dimensions of the optics. The diameter of the optics scales with increasing measurement distance and/or measurement spot size. A schematic illustration of a known measurement system 10 with illumination in 45° and pick-up in 0° geometry (45/0 geometry) is depicted in Figure 1. Light source 12 provides illumination, and detector 14 detects light reflected from target surface 16a. The lateral displacement between illumination and pick-up fields as a function of distance variation is shown. With such geometry, the valid range of distance for measurement may be as small as ± 2mm from a target distance d. Also, the mechanical size of the system is impacted by the distance from the measurement system to the sample surface.
Furthermore, changing the distance produces a relative spatial displacement of the observation and illumination fields (shown by the arrows in Figure 1). The distance range is limited by the condition that a sufficient overlap between the two fields is required. If the distance is too great (illustrated), or too short, the illumination field and the measurement field will not be aligned. For example, target surface 16b, which is spaced from the target distance d by an additional variance v, results in the illumination field being misaligned with the measurement field.
Non-contact color measurements may also be made. For example, it is known to use the camera embedded in a mobile device to attempt to sample or measure a color of an object. However, cameras typically embedded in mobile devices have limited achievable accuracy and are not appropriate for precise color measurement. The typical RGB color filters are not optimized for color measurement performance. Additionally, the environmental measurement conditions are difficult to control.
An improved accuracy may be achieved by using a color reference card. See, for example, U.S. Pat. Pub. 2016/0224861. In this example, a small calibration target is positioned on the sample material to be measured. The calibration target contains known color patches for the color calibration of the camera as well as means to characterize the illumination conditions. However, the achievable accuracy is limited due to availability of only red, green and blue filter functions in the camera.
Furthermore, the usability is compromised because it requires the user to carry and place the calibration card on the surface being sampled.
Color sensors for mobile devices also exist, such as the TCS3430 tristimulus sensor from AMS AG. The application for this device is color management of the camera in a mobile device. Such a sensor may be used to assist the smartphone camera sensor with color sensing of ambient light to enhance and improve picture white balance. This sensor has no active illumination. These sensors typically require no specific optical system. An optical diffuser in front of the active detector area is sufficient. However, even with correction for ambient lighting conditions, limitations still exist for the RGB camera. U.S. Patent 8,423,080 describes a mobile communication system including a color sensor for color measurement. The usability of this system is limited since it requires manual positioning of the sensor at a pre-defined distance to initiate the automatic execution of a measurement. Controlling and holding the mobile
communication device with the sensor at a pre-defined distance can be difficult.
Additionally, the system is limited to color data and does not support spectral reflectance information of the sample. This limits the flexibility to use the data in an open system architecture with different data libraries requiring different colorimetric calibration settings and search parameters.
An additional attempt at measuring color with a mobile phone is described in U.S. Pat. No. 9,316,539. This patent describes a compact Fourier transform spectrometer that can operate with fringe generating optics and the camera sensor of the mobile phone. Such an arrangement does not offer the possibility to use the camera for targeting and positioning with respect to the spectral color measurement.
Summary
A multispectral measurement system for measuring reflectance properties of a surface of interest in some embodiments comprises a multispectral detector configured to measure spectral information in a plurality of bands of optical radiation, each band of optical radiation corresponding to a filter function, the multispectral detector comprising a plurality of photodiodes, each photodiode having a filter corresponding to one of the filter functions, there being at least two photodiodes corresponding to each of the filter functions located in a point-symmetric arrangement in a two dimensional array; and observation optics having an aperture and being configured to observe the surface of interest; wherein each of the plurality of photodiodes is located in a different location with respect to the aperture, and wherein differences in a field of view of the surface of interest for each photodiode are compensated for by combining
measurements of point-symmetric photodiodes. In some embodiments, the plurality of bands of optical radiation comprise at least six bands of optical radiation.
In some embodiments, the multispectral measurement system further comprises at least one illumination source illuminating at least the field of view. The at least one illumination source may be a non-collimated illumination source. The illumination source may emit optical radiation over a range of visible light. In some embodiments the illumination source emits visible light radiation, near ultraviolet light radiation, and/or infra-red radiation, and the multispectral detector has filter functions
corresponding to the radiation.
In some embodiments, the multispectral measurement system further comprises a non-contact multispectral measurement device comprising: a position measurement system for measuring position values of the multispectral measurement system relative to the surface of interest; and means to correct multispectral values from the multispectral measurement system based on detected position values from the position measurement system. The non-contact measurement device may comprise a mobile communications device. The position measurement system may be selected from the group consisting of: a pattern projector and a camera, a camera autofocus system, a stereo vision system, a laser range finder, and a time of flight distance sensor.
In some embodiments, a non-contact spectral measurement system comprises a retro-reflection multispectral sensing system and a position correction system. The combination of these systems, in addition to calibration information, allows for non-contact measurement over a relatively wide range of measurement distances and for correction of sensor distance and angle with respect to a surface being measured. This enables accurate spectral measurement in the visible spectral region from 400 to 700 nm wavelength range, which may be extended to cover the UV (below 400 nm) and Near Infra-red (NIR) (over 700 nm) spectral regions. The non-contact spectral measurement system may be advantageously included on handheld measurement devices. The system may also be included on mobile communications devices to improve their spectral measurement capabilities. "Non-contact multispectral measurement device," as used herein, includes but is not limited to dedicated hand held devices and other mobile devices, such as smart phones and tablets.
A non-contact multispectral measurement device for measuring reflectance properties of a surface of interest may include a multispectral measurement system, a position measurement system for measuring position values of the multispectral measurement system relative to the surface of interest, and means to correct multispectral values from the multispectral measurement system based on detected position values from the position measurement system.
Figure 1 is a schematic illustration of an example of known 45/0 measurement geometry.
Figure 2 is a block diagram of an example of a non-contact multispectral measurement device according to the present invention.
Figure 3 is a schematic illustration of an example of a retro-reflection measurement system according to an aspect of the present invention.
Figure 4 is a schematic illustration of an example of a retro-reflection measurement system including a camera according to another aspect of the present invention.
Figure 5 is a schematic illustration including fields of illumination and view of an example of a retro-reflection measurement system including a camera according to another aspect of the present invention.
Figure 6 is a graph of response values provided by a retro-reflection measurement system at a target distance to a measured surface and at the target distance ± 5mm to a measured surface.
Figure 7 is a graph showing ratios of response values provided by a retro-reflection measurement system at distances of target distance +5mm and target distance -5mm to a measured surface relative to response values at the target distance to a measured surface.
Figure 8 is a graph of filter functions for a multispectral detector which may be used in implementing the present invention.
Figure 9A is an example of a photodiode layout for a multispectral detector which may be used in implementing the present invention.
Figure 9B is another example of a photodiode layout for a multispectral detector which may be used in implementing the present invention.
Figure 9C is a simplified illustration of light paths to a photodiode layout for a
multispectral detector which may be used in implementing the present invention.
Figures 10 and 11 illustrate examples of optical designs which may be used in
implementing the present invention.
Figure 12 illustrates an example of a folded light path optical design which may be used in implementing the present invention.
Figures 13 and 14 illustrate examples of spatial relationships between illumination sources and a multispectral detector according to examples of a multispectral measurement system according to another aspect of the present invention.
Figure 15 illustrates an example of spatial relationship between an illumination path and an observation path with respect to a surface being measured according to another aspect of the present invention.
Figure 16 is a flow chart illustrating steps for calibrating and using a position detection system which may be used in implementing a retro-reflection measurement system according to another aspect of the present invention.
Figure 17 illustrates an example of dot pattern projection which may be implemented to determine position and angle information.
Figure 18 illustrates an example of dot pattern detection which may be implemented to determine position and angle information.
Figure 19 illustrates calibration procedures for generating use case calibration parameters and multispectral device calibration parameters.
Figures 20a and 20b illustrate examples of measurement processes relative to time using a retro-reflection measurement system according to another aspect of the present invention.
Figure 21 illustrates an example of a logical flow chart for a measurement process using a retro-reflection measurement system according to another aspect of the present invention.
Figure 22 illustrates an example of a data flow chart for a measurement process using a retro-reflection measurement system according to another aspect of the present invention.
Figure 23 illustrates an example of a setup for determining distance and angular position correction parameters for a retro-reflection measurement system according to another aspect of the present invention.
Figure 24 is a graph illustrating a response curve for a multispectral detector as distance to a surface being measured varies.
Figure 25 is an example of using live-view feedback from a position sensor to assist a user in targeting a sample surface which may be used with a retro-reflection measurement system according to another aspect of the present invention.
Figures 26 and 27 are examples of using visual feedback to assist a user in targeting a sample surface which may be used with a retro-reflection measurement system according to another aspect of the present invention.
Detailed Description
With reference to Figure 2, a block diagram of a non-contact multispectral measurement device 100 is provided. The non-contact multispectral measurement device 100 may be implemented on mobile communications systems or dedicated handheld measurement devices. Mobile communications systems, such as mobile smartphones or tablets, typically include mobile phone electronics, an operating system (such as iOS or Android), a camera, a display, data input capabilities, memory for data storage, an interface to external systems (cloud, PC, networks), and wireless communications systems, such as cellular voice and data systems, LTE systems, and other wireless communications systems. Such mobile communications systems are referred to herein as "mobile devices."
A non-contact multispectral measurement device 100 may comprise a retro-reflection multispectral measurement system 110, a position correction system 120, and application software and calibration data processed by processor 124. The application software and calibration data may be stored in a non-volatile memory. An RGB camera 122 and a display 128 may also be provided. As described more fully herein, the measurement geometry of the retro-reflection measurement optics, and the distance and angular orientation guidance and correction provided by the present invention, improve ease of use and spectral accuracy by making accurate measurements over a wider range of distances and orientations than was previously known. The application software runs on the non-contact multispectral measurement device 100 and controls the data acquisition workflow, the user interaction, and the processing of the data for the application. The additional sensor components can be realized in the non-contact multispectral measurement device 100 or attached to the outside of the housing of the non-contact multispectral measurement device 100. The non-contact multispectral measurement systems described herein are not limited in use to mobile systems, and may be included in any device where non-contact spectral measurement with distance and angle correction is desired, including being embedded in industrial systems.
A non-contact multispectral measurement device 100 with spectral sensing capabilities of the present invention is particularly well suited to measure an "inspiration color" found on a surface of an object due to its ease of use, targeting assistance and use of correction parameters. Having accurately captured the inspiration color, the software application may identify a matching color in a database 130 of digital reference data corresponding to a set of color shades. A color database, as that term is used herein, may include relational databases, flat files of structured data (e.g., CxF and AxF files) and other libraries of structured data comprising spectral or other color data (RGB, CIE tristimulus colors, etc.) and/or associated metadata pertinent to a given use case (scattering parameters, effect finishes, translucency, printing conditions, etc.). Each different color database may have different classes of materials and measurement requirements and therefore requires different calibration parameters. Such databases having different calibration parameters would be considered different use cases. Color databases may include, for example, PANTONE Color Matching System colors, printable colors, architectural paint color databases, plastics colors databases, skin tone databases, and the like. The database 130 may be stored on the non-contact
multispectral measurement device 100 or be cloud based and accessed using a mobile device's communications capabilities, either directly or through a computer network 132 as shown in Figure 2.
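The database lookup described above can be sketched in a minimal form. The example below is purely illustrative (the function names, the toy library, and the use of a root-mean-square spectral difference as the match metric are assumptions; a real use case would apply the calibration parameters and metadata discussed above):

```python
import math

def spectral_rms(a, b):
    """Root-mean-square difference between two reflectance spectra
    sampled at the same wavelengths."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def find_closest_color(measured, database):
    """Return the name of the database entry whose spectrum best
    matches the measured spectrum. `database` maps names to spectra."""
    return min(database, key=lambda name: spectral_rms(measured, database[name]))

# Toy database with 4-band spectra (illustrative only; a real library
# would hold finer-sampled spectra over 400-700 nm).
library = {
    "warm red":   [0.10, 0.15, 0.40, 0.80],
    "leaf green": [0.10, 0.60, 0.30, 0.10],
    "sky blue":   [0.70, 0.40, 0.15, 0.10],
}
sample = [0.12, 0.58, 0.28, 0.11]
print(find_closest_color(sample, library))  # leaf green
```

Different use cases would substitute different metrics (e.g., a colorimetric delta-E under a chosen illuminant) and different search parameters, as noted above.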
Referring to Figure 3, the multispectral measurement system 110 includes a retro-reflection measurement path comprising an illumination light path and an observation light path, one or more sources of illumination 112 and a multispectral detector 114. The illumination sources 112 and multispectral sensor 114 may be separately mounted or positioned on a single circuit board or substrate, along with electronics and embedded firmware to operate the sensor according to a measurement sequence and to interface the measurement data to the application software.
Preferably, the multispectral measurement system 110 is miniaturized and has a form factor suitable for integration into a mobile device or dedicated handheld measurement device. Additionally, the multispectral measurement system 110 is preferably adapted to extend valid non-contact measurement distance ranges to ± 10mm, ± 20mm, or more, relative to a target measurement distance.
According to one aspect of the present invention, the geometry of the retro- reflection measurement path may be configured to provide standardized aspecular measurement angles. One standard measurement geometry for color measurement (CIE publication 15, 2004) is based on an illumination angle of 45° and a detection angle of 0°, and is commonly referred to as 45/0 measurement geometry. This measurement geometry provides an aspecular angle of measurement, which is defined as the angular difference between the direction of specular reflection of the illumination in the center of the observation field and the corresponding observation angle at the center field position. The aspecular angle is a relevant parameter for the amplitude of the surface reflection radiation. The 45/0 measurement geometry has an aspecular angle of 45°.
As shown in Figure 3, an exemplary implementation of a multispectral measurement system 110 according to the present invention has a measurement path with a 22.5° illumination angle and a back-reflected 22.5° detection angle in the plane of incidence with respect to a surface normal. This provides an aspecular measurement angle of 45° in the plane of incidence with respect to the specular surface reflection, which corresponds to the standardized 45/0 measurement geometry. Selecting the same aspecular angles is helpful because the surface reflections are of comparable magnitude to those of the standard 45/0 measurement geometry. This selection can be useful if at a later stage the measurement data of the current system needs to be compared to the measurement data of an instrument with a 45/0 geometry. This conversion can be achieved with an algorithmic correction of the measurement results. In this retro-reflection measurement geometry, the illumination sources 112 and the multispectral detector 114 components of the multispectral measurement system 110 project light onto the sample and receive back-reflected light from the sample at substantially the same angle with respect to a surface normal of the surface being measured. The illumination and detection angles are not necessarily precisely equal, because locating an illumination source 112 adjacent to a multispectral pick-up detector 114 may result in some minor difference between the illumination and detection angles. Accordingly, receiving back-reflected optical radiation at the same or at least
substantially the same angle (e.g., typically within about ± 5° in the plane of incidence) (Fig. 15) as the illumination optical radiation is referred to herein as retro-reflection measurement geometry. Differences in angles outside of the plane of incidence have less effect on measurement accuracy, and need not be within ± 5° to be considered retro-reflection measurement geometry (Fig. 14).
With a retro-reflection measurement geometry, the mechanical size of the multispectral measurement system 110 may be very compact, since the illumination and detector components may be arranged at the same location on a small area such as on a common socket or support part. Additionally, the illumination and observation light paths stay centered with respect to each other when the measurement distance varies. The illumination and observation light rays are substantially collinear in the plane of incidence as shown in Figure 3. This allows the field of illumination and field of observation to maintain alignment over a large measurement distance and accurate operation over a relatively large range of distance variations. For example, in Figure 3, the field of illumination and field of observation remain aligned on target surface 116a, spaced distance d from the multispectral measurement system 110, and on target surface 116b, spaced distance d + v from the multispectral measurement system 110.
Certain applications may require care in choosing the central incidence angle of the illumination and observation optical paths. If the sample surface has some roughness or structure, the reflected radiation from the surface will impact the measurement results. The color of the sample may be characterized by the material properties inside the material (e.g., sub-surface scattering). The radiation from the surface is superimposed on the sub-surface radiation and perturbs the measurement results. Larger aspecular angles relative to the measurement surface reduce the corresponding surface effects from rough surfaces. Accordingly, the present invention is not limited to the specific 22.5° retro-reflection angles illustrated in the figures. Any aspecular angle greater than or equal to 30°, or more preferably 40°, may be appropriate, depending on the surfaces to be measured. These aspecular angles result in retro-reflection angles greater than or equal to 15°, or more preferably 20°. An alternative example would be a 30°/30° optical system, which corresponds to an aspecular angle of 60°. The retro-reflection angles may be in the range of 15° to 30°.
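The relation between the retro-reflection angles and the aspecular angle can be sketched as follows (a minimal illustration; the sign convention, where the specular reflection lies on the opposite side of the surface normal from the illumination, is an assumption consistent with the geometry described above):

```python
def aspecular_angle(illum_deg, obs_deg):
    """Angular difference, in the plane of incidence, between the
    specular reflection of the illumination (at -illum_deg from the
    surface normal) and the observation direction (at +obs_deg, on
    the illumination side in retro-reflection geometry)."""
    return illum_deg + obs_deg

# The 22.5/22.5 retro-reflection geometry reproduces the 45 deg
# aspecular angle of the standard 45/0 geometry:
print(aspecular_angle(22.5, 22.5))  # 45.0
print(aspecular_angle(45, 0))       # 45
# A 30/30 system corresponds to a 60 deg aspecular angle:
print(aspecular_angle(30, 30))      # 60
```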
When it is desired to acquire images of the target surface during spectral measurement operations, the design of the multispectral measurement system 110 may be defined for a pre-defined target measurement distance as schematically shown in Figure 4. In Figure 4, the position of the multispectral measurement system 110 with respect to the camera 122 in the non-contact multispectral measurement device 100 is shown. In this example, the center of the measurement field at the pre-defined target measurement distance d is positioned at the intersection point with the optical axis of the camera 122.
The target measurement distance to the measurement plane should be selected so that the camera can achieve a sharp image of the sample at this distance and over the desired distance variation range. A reasonable pre-defined target measurement distance d is in the range of 30 mm to 150 mm. Any other distance could equivalently be supported by an adapted design.
The multispectral measurement system 110 should generate accurate measurement results for a broad range of materials. Many materials are not
homogeneous. Accordingly, the size of the observation field of the detector pick-up optics may be selected to be sufficiently large with respect to the surface inhomogeneities in order to provide a representative average measurement result. Referring to Figure 5, a typical observation field size o with a diameter in the range of 6 to 12 mm at the target measurement distance has been found to be appropriate. The present invention is not limited to any particular size of observation field. The illumination field is selected to over-illuminate the observation field of the multispectral pick-up detector. Over-illumination, in this context, refers to the size i of the
illumination field relative to the observation field, where i is greater than o, and not to the intensity of the illumination. In some applications, an over-illumination radius of 2 mm at the target measurement distance may be appropriate. Depending on the translucency properties of the material the over-illumination radius may need to be increased. For translucent media like human skin the over-illumination radius should be in the range of 4 mm to 10 mm or higher. The present invention is not limited to any particular range of over-illumination radius.
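The over-illumination radius defined above follows directly from the two field diameters, assuming concentric circular fields (the function name and the example values are illustrative):

```python
def over_illumination_radius(illum_diam_mm, obs_diam_mm):
    """Radial margin by which the illumination field exceeds the
    observation field, for concentric circular fields at the target
    measurement distance."""
    if illum_diam_mm < obs_diam_mm:
        raise ValueError("illumination field must over-illuminate the observation field")
    return (illum_diam_mm - obs_diam_mm) / 2.0

# A 12 mm illumination field around an 8 mm observation field yields
# the 2 mm over-illumination radius mentioned above:
print(over_illumination_radius(12.0, 8.0))  # 2.0
# A translucent medium such as skin would call for a larger margin,
# e.g. a 16 mm illumination field for a 4 mm over-illumination radius:
print(over_illumination_radius(16.0, 8.0))  # 4.0
```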
The desired field size m in the measurement plane (6-12 mm) is larger than the desired package size of the full multispectral measurement system. A measurement system with a miniature multispectral detector in the range of a few millimeters will require diverging optical beams for the illumination and detector observation optical systems. The solid light bundle is the observation beam; the dashed light bundle corresponds to the illumination beam.
A consequence of the optical design with a diverging illumination beam is that the detected optical signal varies as the distance and angular orientation with respect to the measured surface varies. For example, the area of the illumination field and observation field will increase with increasing measurement distances. The present invention includes a position correction system that provides information on the effective distance and angular orientation of the sample with respect to the
multispectral measurement system. This position/orientation information is used by correction algorithms that correct the measurement results as a function of distance and angle with respect to the target reference measurement geometry. The position correction system may also be used in combination with the display of the non-contact multispectral measurement device 100 to provide guidance to the user to hold the non- contact multispectral measurement device 100 at an appropriate distance and angle with respect to the measured surface.
The retro-reflection measurement geometry of the multispectral measurement system 110 is well suited for such algorithmic position correction. The optical design has the property that over a large distance variation range the resulting relative signal variation is the same for each spectral observation channel of the multispectral detector 114. The distance correction can be described by a global relation between
measurement signal and distance variation valid for all spectral filter channels.
This behavior can be characterized in the laboratory on a set of representative prototypes. A typical measurement result is shown in Figure 6, which shows the response of the multispectral detector at a target distance, at the target distance +5 mm, and at the target distance -5 mm. Figure 7 shows the ratios of the responses at target distance +5 mm and target distance -5 mm to the response at the target distance. The figure represents the relative signal variation of the illumination over a distance range of +/- 5mm with respect to the target distance. It can be seen that the relative ratio curves are constant over the full illumination wavelength range (400 to 700 nm).
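Because the relative signal variation is channel-independent, the distance correction reduces to a single scalar gain applied to every spectral channel. The sketch below assumes a calibration table of (distance offset, gain) pairs measured in the laboratory, with linear interpolation between calibration points; the table values and function names are illustrative:

```python
def distance_correction_factor(offset_mm, calibration):
    """Interpolate a single scalar gain from a table of
    (distance offset in mm, measured relative gain) pairs. The same
    factor applies to every spectral channel because the relative
    signal variation with distance is channel-independent."""
    pts = sorted(calibration)
    if offset_mm <= pts[0][0]:
        return pts[0][1]
    if offset_mm >= pts[-1][0]:
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= offset_mm <= x1:
            t = (offset_mm - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)

def correct_spectrum(raw_counts, offset_mm, calibration):
    """Divide every channel by the same distance-dependent gain."""
    g = distance_correction_factor(offset_mm, calibration)
    return [c / g for c in raw_counts]

# Illustrative calibration: signal falls to 0.8x at +5 mm and rises to
# 1.25x at -5 mm relative to the target distance.
cal = [(-5.0, 1.25), (0.0, 1.0), (5.0, 0.8)]
print(correct_spectrum([80.0, 40.0, 16.0], 5.0, cal))  # [100.0, 50.0, 20.0]
```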
The multispectral detector 114 in the multispectral measurement system 110 may comprise a CMOS detector with a photodiode array. Multispectral in this context means six or more different spectral channels, where each channel corresponds to a bandwidth of optical radiation. Optical radiation includes visible, ultraviolet and infrared radiation. An example of a commercially available six channel array is the AS7262 multispectral sensor, available from AMS AG. More channels, for example 16 or more, may also be applied. In this case the capabilities would correspond to a hyper-spectral or full spectral measurement system. In a multispectral CMOS photodiode array, a precision spectral bandpass filter is deposited on each photodiode. The filters may be realized by thin film coating technology in combination with photolithographic techniques to achieve the spatial microscopic filter pattern. The present invention is not limited to CMOS photodiode arrays. Alternative photosensitive detector array technology may be applied.
The spectral filters pass selected wavelengths of light to enable the spectral analysis. Color measurement requires spectral analysis over the visible wavelength region. The number of spectral filters, the center wavelength of each filter, the bandpass (the full width at half maximum of the filter function) as well as the shape of the filter function have an impact on the achievable performance.
Figure 8 shows an example of a filter set composed of eight filter bandpass functions 801-808, respectively, selected to cover the visible range of light. Greater or fewer channels may be used. Spectral filters in the UV range below 400 nm and in the NIR range over 700 nm may also be included. The filters should continuously sample the specified spectral measurement range without gaps. The spectral measurement range for color application is typically the visible spectral range from 400 to 700 nm.
The spectral filters may cover the full sensitivity region of the underlying photodiode.
An example of a CMOS detector array is provided in Figure 9A. Figure 9A illustrates a conventional eight channel array. The numerals 1-8 in Figure 9A correspond to filter functions 801-808, as illustrated in Figure 8. A sensor having multiple sets of photodiodes arranged in a periodic array may also be suitable.
One consequence of optical designs where the photodetector array occupies an area which is smaller than the field of view of the surface being measured is that the diverging light ray paths result in different photodiodes each observing a slightly different portion of the surface being measured. A simplified viewing diagram for two photodiodes at two different locations with respect to a single aperture is provided in Figure 9C. Because each photodiode is located at a different location with respect to the aperture, each photodiode measures a slightly different portion of the field of observation of the surface being measured. Conventional photodiode arrays do not correct for such different viewing angles.
A photodiode array with complementary symmetrically-located detector photodiodes is illustrated in Figure 9B. The CMOS detector in Figure 9B has more photodiodes than filter channels. In this example, for the eight filter channels, the detector array provides sixteen photodiodes in a 4X4 matrix. Two photodiodes are provided with spectral filters corresponding to one filter channel, with each spectral filter function being realized at two complementary locations. The locations are symmetrically arranged about a center of the photodiode matrix.
This complementary, symmetrical arrangement of photodiodes allows for compensation of variations of the measurement results due to different observation angles of the individual photodiodes in the matrix, with compensation in both axes of the two-dimensional photodiode array. Adding the signals of the corresponding detector pixels with the same spectral filter in the detector readout electronics will cancel the measurement effect due to the different observation (viewing) angles. This provides consistent spectral measurement results for all different channels for the same angular measurement conditions.
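The pairing logic can be sketched as follows. The 4x4 channel map below is illustrative (not the actual layout of Figure 9B); it places each of the eight filter channels at two positions that are point-symmetric about the array center, so position (r, c) pairs with position (3-r, 3-c):

```python
# Illustrative point-symmetric channel map: entry (r, c) equals
# entry (3 - r, 3 - c) for every position.
CHANNEL_MAP = [
    [1, 2, 3, 4],
    [5, 6, 7, 8],
    [8, 7, 6, 5],
    [4, 3, 2, 1],
]

def combine_symmetric_pairs(counts):
    """Sum the two point-symmetric photodiodes of each filter channel.
    `counts` is a 4x4 array of raw photodiode readings; the result is
    one combined value per channel, ordered by channel number."""
    totals = {}
    for r in range(4):
        for c in range(4):
            ch = CHANNEL_MAP[r][c]
            totals[ch] = totals.get(ch, 0.0) + counts[r][c]
    return [totals[ch] for ch in sorted(totals)]

# A linear signal gradient across the array affects the two pixels of
# each pair equally and oppositely about the center, so the pair sums
# stay balanced across all channels:
counts = [[10 + r + c for c in range(4)] for r in range(4)]
print(combine_symmetric_pairs(counts))  # every channel sums to 26.0
```

This illustrates why the summation cancels the first-order viewing-angle effect: any signal variation that is linear across the array contributes equal and opposite deviations to the two members of a pair.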
It may also be advantageous to calculate the difference between the two photodiode locations having the same filter function. The difference signal may be divided by the distance between the two filter locations on the photodiode array. This provides information about the angular variation of the measurement signal for each filter wavelength. The result of this operation is a spectral vector with additional information about the angular behavior of the material. This additional information may be used to refine the search in the color libraries or databases.
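The difference operation described above can be sketched as follows (a minimal illustration; the function name, the two-channel example values, and the 2 mm separation are assumptions):

```python
def angular_gradient(counts_a, counts_b, separation_mm):
    """Per-channel difference between the two photodiodes of each
    point-symmetric pair, normalized by their separation on the
    array. The resulting spectral vector characterizes the angular
    variation of the measurement signal at each filter wavelength."""
    return [(a - b) / separation_mm for a, b in zip(counts_a, counts_b)]

# A matte sample yields a near-zero gradient vector; a glossy or
# effect-finish sample yields larger values that can be used to
# refine a database search (two channels shown, values illustrative):
print(angular_gradient([101.0, 52.0], [99.0, 48.0], 2.0))  # [1.0, 2.0]
```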
The multispectral detector chip may be placed in a compact chip package. The observation field of the detector is defined by additional optical means like a lens or lenses, apertures or lamellar structures. These optical elements may be integrated with the detector package for a miniaturized solution or externally arranged in the mechanical housing of the multispectral measurement system. An additional diffuser may be included between the detector pixels and the optical components for shaping of the observation field. The additional diffuser helps to average out measurement effects due to different viewing angles and inhomogeneities of the sample.
Figure 10 shows one example of an implementation of the multispectral color detector pick-up optical design with external lens 140 and an aperture 142 to define the illumination light bundle. The lens is at a 22.5° angle with respect to the multispectral detector to create the desired retro-reflective measurement geometry. While only three paths of light corresponding to different photodiodes on a multispectral detector are illustrated for purposes of clarity, the invention is not so limited. The optics for the illumination system for the shaping of the illumination beams may comprise mechanical apertures, lenses or lamellar structures to define the illumination cone angle. Figure 11 provides an example of an optical detector pickup system with an external lens 140 and mechanical aperture 144 to shape the light bundle. The multispectral measurement system 110 may be integrated inside a mobile device or may be attached outside to the casing of the mobile device.
Figure 12 presents another example of an optical design 150 comprising a folded optical path for the illumination and detection optical systems suitable for use in the present invention. The folding of the optical path can be achieved by one, two or more surface reflections. Although a multispectral measurement system 110 is illustrated in Figure 12, a discrete multispectral detector 114 or illumination source(s) 112 may be substituted therefor. In some embodiments an optical diffuser can be arranged between the exit surface of the multispectral measurement system 110 and the optical system. In the particular optical system shown in Figure 12 the light path is folded by two surface reflections 152, 154 followed by a lens 156. Alternative embodiments may implement a different number of surface reflections. The lens function may be realized by one or multiple surfaces of spherical or aspherical shape. In alternative embodiments, the lens function may be realized with the reflecting surfaces, that is, the reflecting surfaces may be surfaces with curvature instead of being planar. In an optical design, the lens function may be realized by a Fresnel lens. In a specific embodiment the surface reflections and the lens are formed in a single component. The optical component may be composed of a transparent material such as glass or polymer. Figure 12 shows an aperture 158 after the lens surface. The optical ray path and ray pattern after the aperture towards the sample plane corresponds to the geometries shown in Figs. 10 and 11.
The illumination sources 112 for the multispectral measurement system 110 may comprise, but are not limited to, light emitting diode (LED) emitters. Any suitable lamp or emitter may be used. The illumination sources 112 are placed in close proximity to the multispectral detector 114 optics. The illumination sources 112 may be integrated in the same package as the multispectral detector 114. White LEDs may be used for making measurements in the visible spectrum. For an extension of the spectral range,
UV LEDs and NIR LEDs may be added. In addition, the white LEDs may be supplemented with additional LEDs having a narrower spectrum in the visible region.
Examples of placement of the illumination sources 112 with respect to the multispectral detector 114 are shown in Figure 13 and 14. Two illumination sources 112 are symmetrically arranged about the multispectral detector 114 in Figure 13. In Figure 14, four illumination sources 112 are illustrated with respect to the multispectral detector in the center.
Referring to Figure 15, the plane of incidence is defined by the central observation path of the multispectral detector (vector u) and the normal on the surface of the sample (vector v). As described above, the observation path is inclined from the surface normal by 15° to 30° (angle θ). In a preferred example, the observation direction is inclined from the surface normal by 22.5°. Also, to achieve a retro-reflection measurement path geometry, the illumination path (vector i), when projected onto the plane of incidence, is inclined from the surface normal by the same, or close to the same, angle as the observation path (typically within 5°). Another optical design goal is that the aspecular angle of the illumination channel of each illumination source 112 be constant with respect to the central observation direction.
Referring to Figs. 13 and 14, the LEDs on both sides of the multispectral detector are symmetrically arranged in a line perpendicular to the plane of incidence going through the center of the active detector area (dashed line in Figs. 13 and 14). Due to the small lateral displacement of the LEDs and the detector active area, the observation path and illumination path are not fully co-linear but instead have a small angular deviation. Since the angular deviation is in the out-of-plane direction (perpendicular to the plane of incidence) it has little to no effect on the spectral measurements.
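A minimal geometric sketch, with assumed dimensions, illustrates how small the out-of-plane deviation remains for typical LED offsets and working distances:

```python
import math

# Sketch with assumed dimensions (3 mm lateral LED offset, 40 mm working
# distance): the out-of-plane angular deviation between illumination and
# observation paths caused by the lateral displacement of the LEDs.
def out_of_plane_deviation_deg(led_offset_mm, working_distance_mm):
    return math.degrees(math.atan2(led_offset_mm, working_distance_mm))

dev = out_of_plane_deviation_deg(3.0, 40.0)  # a few degrees at most
```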
The electronics of the non-contact multispectral measurement device 100 may store information which is useful for the calibration and for the data processing, including spectral data of the filter functions for the detector and the illumination system LEDs, a white reference vector to transfer raw measurement data into calibrated reflectance factor values at the target measurement distance, distance correction polynomial coefficients, and linearity correction.
To support ambient light measurement, an additional Lambertian optical diffuser may be included. The optical diffuser may be mechanically placed over the measurement window of the detector pick-up channel. This may be accomplished, for example, with a mechanical slider positioned on the outside of the housing of the non-contact multispectral measurement device 100.
Various means may be employed to provide the non-contact multispectral measurement device 100 with information concerning the distance and angular orientation of a surface to be measured with respect to the multispectral measurement system. For example, "Time of Flight", "Stereo Vision," laser measurement, distance sensors, camera auto-focus information, and other means may be implemented. In one advantageous example, a position correction system comprises an optical pattern projector and a camera, such as the camera of a mobile device.
The optical pattern projector may project a set of position markers. In the illustrated examples, the position markers comprise individual dots. Different position markers and patterns are also contemplated, including continuous lines such as rectangles or circular patterns. The optical pattern projector may project visible light in applications where visual guidance for a user is desired. The optical pattern generator may also project non-visible light, such as NIR and UV, in applications where visible light may be a distraction or annoyance. Moreover, the position markers may themselves carry position information in cases where the position sensor uses time of flight or stereo vision and searching for projected features on the sample is not needed.
Fitting a set of points with a certain type of surface allows a characterization of its position in the given coordinate system. For example, the points might be fit by a plane or a quadric using the least-squares method. Figure 17 illustrates an example of using epipolar geometry to identify the distance and orientation of three exemplary surface orientations. In Figure 17, an optical pattern projector is located at a known, fixed distance from a camera. The projector projects multiple dots onto a surface of interest. An image of the dots is acquired by the camera. Determining the location of one of the dots along its epipolar line provides 3-D location information for that dot relative to the camera and projector. The determined 3-D locations of multiple dots may be fitted to a plane, and information concerning the distance and orientation of the plane relative to the camera and projector may be determined. The 3-D locations of at least three dots are required to define a plane. Figure 18 is an illustration of dots 1802, 1804 found on epipolar lines 1806. For purposes of clarity, not all epipolar lines are illustrated.
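The plane-fitting step can be sketched as a least-squares problem (the function name and the example dot coordinates are illustrative, not taken from the patent):

```python
import numpy as np

# Sketch: least-squares plane fit to the 3-D dot locations recovered from
# the epipolar analysis, yielding surface distance and tilt.
def fit_plane(points):
    """points: (N, 3) array of 3-D dot positions, N >= 3.
    Fits z = c0 + c1*x + c2*y; returns (distance on optical axis, tilt in deg)."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1]])
    c0, c1, c2 = np.linalg.lstsq(A, pts[:, 2], rcond=None)[0]
    normal = np.array([-c1, -c2, 1.0])
    normal /= np.linalg.norm(normal)
    tilt_deg = np.degrees(np.arccos(abs(normal[2])))  # angle to camera axis
    return c0, tilt_deg

# Four dots on a plane tilted about the y-axis (z = 50 + 0.1*x, in mm)
dots = [(0, 0, 50.0), (10, 0, 51.0), (0, 10, 50.0), (10, 10, 51.0)]
dist, tilt = fit_plane(dots)
```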
In one example of the present invention, the pattern projector and the camera are in the same non-contact multispectral measurement device 100 and in a fixed position relative to each other. The geometry of the system is determined once during production as illustrated in Figure 16. First the camera geometry is determined in step 1602. The camera may be treated as a pinhole camera. Fixed focus may be used, and a projection matrix called the camera matrix is determined. Distortion parameters may also be modeled if necessary. Then the geometry of the pattern projector is determined in step 1604. The light beams from the projector travel in straight lines, so any point where a beam hits a target also lies on that straight line. Also, the image of a beam is a straight line, which may be referred to as an epipolar line, a common concept in stereoscopy. This, along with the known, fixed distance from the camera to the pattern projector, provides the necessary calibration information for determining the 3-D locations of the projected position markers. In step 1606 the calibration data is stored in a calibration database 1608.
To perform position analysis, position markers are projected on to a surface in steps 1610 and 1612. An image of the position marker is captured in step 1614. Position markers are detected in the image in step 1616 and position information is calculated in step 1618.
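Under a simple pinhole/stereo model with assumed parameters, the depth recovery for a detected position marker along its epipolar line can be sketched as:

```python
# Sketch (assumed pinhole parameters): with projector and camera at a fixed
# baseline, a dot's displacement along its epipolar line (disparity) is
# inversely proportional to its depth, as in standard stereoscopy.
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    return focal_px * baseline_mm / disparity_px

# f = 1000 px, 20 mm camera-projector baseline, 400 px disparity
z = depth_from_disparity(1000.0, 20.0, 400.0)  # depth in mm
```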
The optical pattern generator module should be located near the multispectral measurement system 110 in Figure 2. It should illuminate the sample from substantially the same direction with a similar incidence angle as the multispectral measurement system 110. Both systems are aligned to be centered on the axis of the camera field at the reference measurement position. This ensures that the light field of the optical pattern generator and the observation field of the detector pick-up channel are overlapping for a large distance variation range. Thus, the distance and orientation angle information are sensed at or near to the same position as the spectral
measurements are made.
The distance measurement range of the position correction system is determined by the viewing field of the camera. If the field of view of the camera does not limit either the detector pick-up field or the optical pattern generator field, an analysis of the acquired position and orientation information is followed by a correction of the measurement data. Other restrictions come from the signal level which decreases with increasing distance and from the focusing capability of the camera in the short distance area to provide a sufficiently sharp image for the image analysis.
Calibration data is needed to define algorithms and data parameters for correcting multispectral data produced by the multispectral measurement system. Calibration operates, among other things, with the hardware of the multispectral measurement system 110 and with the description of the use case. The use case may be represented as a reference tile collection or as abstract information about the tiles' physical properties (e.g. gloss of printed samples). The calibration workflow as presented here assumes that the position correction system has already been calibrated. Output of the calibration includes, but is not limited to:
• Calibration data which may be needed to obtain multispectral data from the multispectral detector readings.
• Calibration data related to position correction, which may be needed to correct multispectral information based on position correction system.
• Calibration data which may be needed to obtain colorimetric or other case dependent data from the multispectral data.
A calibration workflow or method 1900 for the multispectral measurement system 110 is presented in Figure 19. The non-contact multispectral measurement device 100 may comprise a multispectral measurement system to obtain multispectral data of one or more patches, for example one or more color calibration patches. The one or more color calibration patches may be selected from one or more materials or types of materials.
The multispectral measurement system calibration method 1900 may comprise step 1918 of forming a set of characteristic calibration data 1918CC. The step 1918 may comprise acquiring multispectral data of one or more color calibration patches at one or more reference positions of a reference multispectral measurement system with respect to the one or more color calibration patches. The one or more calibration patches may have one or more colors and be of one or more materials, for example having one or more appearance characteristics. The set of characteristic calibration data 1918CC may be stored on a non-volatile computer-readable memory device. The set of characteristic calibration data 1918CC may be used, for example, on a production line to calibrate and correct the color measurements of one or more multispectral
measurement systems of one or more handheld or mobile devices.
The multispectral measurement system calibration method 1900 may comprise a step 1918 of forming a set of position-related color correcting parameters 1918a. The position-related color correcting parameters 1918a may be used for correcting the multispectral data acquired by, for example, a production handheld device comprising a production multispectral measurement system. The data may be acquired at one or more positions, recorded as position data 1918PD, for example positions at which the characteristic calibration data 1918CC may have been acquired. Position data 1918PD may comprise position and orientation measurements of the characterization device 1912, for example position and orientation of one or more of its sensors and
illumination sources, with respect to a target, sample, or color calibration patch. The step 1918 may comprise acquiring measurements for a plurality of patches or targets, for example a subset of the patches used for forming the set of characteristic calibration data 1918CC. The position-related color correcting parameters 1918a may be production device- and position-specific. The position-related color correcting parameters 1918a may be stored on a non-volatile computer-readable memory device, for example comprised on the handheld device.
The multispectral measurement system calibration method 1900 may comprise a process 1930 of forming multispectral data calibration parameters 1938 for each spectral device and use case. The multispectral data calibration parameters 1938 may be material- or material type-specific. The multispectral data calibration parameters 1938 may, for example, be represented as one or more matrices, for example a matrix for each material type. For example, material types may be: paper coated with matte ink; paper coated with glossy ink; skin; metallized paint, for example comprising effect pigments; fabric; marble; or a type of polymer. The multispectral data calibration parameters 1938 may be stored on a non-volatile computer-readable memory device, for example comprised on the handheld device.
The step 1930 may comprise using data, for example color data, for example color space data, from one or more measured materials databases 1910. The color space data may, for example be XYZ or L*a*b* data. The step 1930 may comprise acquiring multispectral data of one or more color calibration patches referenced in the materials database 1910, at one or more positions, for example one or more reference positions, using the production device's multispectral measurement system 110 with respect to the one or more color calibration patches. The step 1930 may comprise, for one or more materials or materials type of the materials database 1910, computing a material type-specific minimization 1936 of a sum of colorimetric distances taken over one or more patches of the material database 1910. The colorimetric distance may comprise one or more of: a colorimetric term, for example expressed as a vector of 3 values in an XYZ or an L*a*b* color space; a multispectral data calibration parameters term, for example expressed as a matrix of dimensions 3x8; and a multispectral data term, for example expressed as an array of 8 values corresponding to 8 filter
wavelengths of the multispectral detector.
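The minimization can be sketched as follows, assuming a plain Euclidean distance in XYZ so that the 3x8 calibration matrix has a closed-form least-squares solution (a perceptual metric such as delta E would require a nonlinear optimizer instead; the data here is synthetic):

```python
import numpy as np

# Sketch: solve for the 3x8 multispectral data calibration matrix M by
# minimizing the summed Euclidean colorimetric distance ||XYZ_ref - M @ F||
# over the calibration patches of one material type.
rng = np.random.default_rng(0)
F = rng.uniform(0.0, 1.0, size=(40, 8))      # 8 filter values per patch
M_true = rng.uniform(0.0, 1.0, size=(3, 8))  # stand-in ground truth
XYZ = F @ M_true.T                           # reference colorimetry per patch

M = np.linalg.lstsq(F, XYZ, rcond=None)[0].T  # fitted 3x8 matrix
xyz_pred = M @ F[0]                           # convert one measurement to XYZ
```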
The multispectral data calibration parameters 1938 enable the conversion of a multispectral acquisition using the multispectral measurement system 2200SS of the handheld device into, for example, colorimetric space values. For example, a user may select on the handheld device a type of material the color of which is to be measured, acquire one or more multispectral measurements of a sample or a patch of the material, and obtain, by way of the conversion of the multispectral measurement by the multispectral data calibration parameters 1938, colorimetric space values 2030 for the material. The colorimetric space values 2030 for the material may be searched in the material database 1910 to retrieve a correct color, for example a nearest color match, of the material being measured. The multispectral data calibration parameters 1938, for example stored in a multispectral data calibration parameters matrix, may enable correct color measurement irrespective of the position and the orientation of the multispectral measurement system 2200SS with respect to the sample being measured. The multispectral measurement system calibration method 1900 may enable a user to acquire one or more measurements, for example corrected measurements, of the sample from one or more orientations and positions, for example by continuously moving the non-contact multispectral measurement device 100 with respect to a sample or patch to be measured.
In greater detail, the device uses a multispectral sensor to obtain multispectral data. The amount of data comprised in the multispectral data 2026 collected by the multispectral measurement system 2200SS is greater than the amount of data that may be collected by, for example, an RGB or other color sensor, for example a trichromatic color sensor. The mobile or handheld device comprising a multispectral measurement system 2200SS may comprise one or more methods for processing multispectral data, for example computer-readable instructions stored on a non-volatile memory device for processing multispectral data. The multispectral data 2026 may be corrected using data from the positioning sensor 2200PS, for example one or more of orientation and position data relative to a sample or target being measured. The calibration method 1900 may be used to determine parameters, for example multispectral data calibration parameters, which allow the use of multispectral data for all use cases. The detected values are corrected in the position correction steps 2208, 2024, 2122. The parameters may comprise the sensor information described earlier, allowing for increased flexibility when positioning the device with respect to the target. Calibration determines the parameters of the position correction algorithm, which depend on the sample or target type, for example the type of material, and on one or more of sample texture, translucency, gloss, sparkle, color, and appearance. From a user viewpoint, the parameters may be dependent on a use case, for example measuring fabrics, skin, paint, paint comprising effect pigments, minerals, or polymers.
As mentioned earlier, position-corrected multispectral data is used as an input to the colorimetric calibration step. The colorimetric calibration step may be used for transforming the position-corrected multispectral data into colorimetric coordinates, for example one or more of XYZ, L*a*b*, CIE tristimulus coordinates, and further quantities that may be used to search for a match, for example a color match, for the measured sample, patch, or target in the reference database (1910 in Fig. 19, 2034 in Fig. 20, 2130 in Fig. 21, 2220 in Fig. 22), for example in one or more of a Pantone Color Library, a commercial paint database, and a products database. This part of the calibration is dependent on the properties of each device and on the use case, for example the type of material. The target-related values may be used to search in the reference database (1910 in Fig. 19, 2034 in Fig. 20, 2130 in Fig. 21, 2220 in Fig. 22), which may contain measurements made using the reference device and may be use case dependent.
The method for color calibration comprises improvements that may accelerate the calibration process. The calibration data needed to obtain multispectral data from raw filter values (steps 1924, 1926, Fig. 22) is determined for each calibrated device and is largely independent of the use case. The position correction parameters, on the other hand, are largely dependent on the nature of the use case. The calibration process (1918, Fig. 22) aiming at determining the parameters of the position correction algorithm (1918a, Fig. 22) is considered quasi-static when compared to the variation of multispectral sensor properties (steps 1922, 1924, 1926 and 1928, Fig. 22) and is determined for a large batch of calibrated devices. The calibration data 1938 needed to obtain reference data (e.g., color coordinates) from multispectral data is determined employing a virtual model 1934 of the device based on its sensing and illumination properties (in particular, 1922) in order to speed up the calibration. The device model 1936 and information about the use case (e.g. as a collection of reference targets and their measurements by a reference device 1910) are used as inputs to the virtual model, which delivers simulated multispectral data with potentially less time and resource investment than a measurement of the underlying reference targets. This simulation is used in an optimization 1936 that searches for proper parameters of the calibration algorithm.
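The simulation step of the virtual model can be sketched as follows (all array shapes and the spectral sampling are assumptions for illustration):

```python
import numpy as np

# Sketch of the virtual device model: simulated filter responses are the
# inner products of the stored filter/illuminant spectral functions with
# the reference reflectance spectra, avoiding physical re-measurement of
# the reference targets during calibration.
wavelengths = np.arange(400, 701, 10)  # nm, assumed 10 nm sampling (31 pts)
n_filters, n_patches = 8, 25
rng = np.random.default_rng(1)
filters = rng.uniform(0, 1, (n_filters, wavelengths.size))       # device model
reflectance = rng.uniform(0, 1, (n_patches, wavelengths.size))   # reference data

simulated_F = reflectance @ filters.T  # (n_patches, 8) simulated readings
```

The simulated readings can then be fed to the optimization that determines the calibration parameters, in place of physical measurements.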
In view of the foregoing, calibration may be divided into three parts: use case-related calibration 1902, multispectral device calibration 1904, and combining use case-related calibration parameters and device calibration parameters 1930. Determining calibration parameters in stages allows for greater flexibility and reduced duplication of calibration efforts. For example, use case-related calibration 1902 may be combined with different types of multispectral devices, and device calibration 1904 may be used with different use case parameters on an as-needed basis.
Use case calibration describes a general workflow to obtain the reference data for a given particular color database (print, skin tone, architectural paint, textiles, etc.) and is performed for each use case or when the reference measurement procedure is changed. Multispectral device calibration is done for each non-contact spectral measurement system. As was mentioned earlier, the determination of positioning correction parameters 1918 is done once for larger batches of devices to be calibrated. It depends on the medium and is assigned in the use case calibration. Use case calibration may be done only occasionally and is not necessarily synchronized with the device calibration workflow.
Part of the use case related calibration may be performed with a reference device 1906. Different sample types often require different reference devices. For example, measurements of skin samples may require non-contact measurement with spherical spectrophotometers. Additionally, to allow database creation to be distributed amongst different users and/or institutions, the reference device should have good inter-instrument agreement, a tight population and good reproducibility. Measurements do not necessarily have to be made in a laboratory with a high-grade instrument. The measurement may be performed by users using a multispectral device as described herein or other mobile multispectral devices, especially if the datasets for a use case are small, e.g. if the data collection holds skin measurements of one user (steps 2128 in Fig. 21 and 2226 illustrate the data collection update). In view of the foregoing, the reference device preferably has the same or a similar geometry as the multispectral measurement system to be used on a non-contact multispectral measurement device 100, but it does not need to be the same if there are advantages to using a reference device with a geometry better suited to a particular set of samples for a use case, or if the use case requires more precise measurement than the multispectral measurement system can provide. The reference device 1906 measures samples 1908 that are representative of a given use case. Color data, metrics and metadata may be stored in a reference database 1910.
The characterization device 1912 should have optical geometry and characteristics similar to or the same as a multispectral measurement system as described herein. The characterization device is used to perform calibration steps which are representative of a type or class of multispectral measurement systems, and may be used to derive calibration parameters for that type of multispectral measurement system as applied to a given use case. Pre-characterizing parameters common to a class of multispectral measurement systems reduces calibration efforts during manufacturing and calibration of individual units (step 1904, Fig. 19). The characterization device is further employed to define a reduced model to transfer measurements from the reference device to the characterization device in step 1914, as described in more detail below. The transfer is done to account for the different optics, measuring geometry and/or measuring procedure of the reference device. This step may be done by combining the reference optics with the multispectral detector to define how the optics of the reference device influence the transfer parameters 1914a. Recommended measurement parameters may be defined in step 1916, including integration time, gain, signal to noise ratio (SNR), etc. 1916a, and the parameters of the averaging procedure. They are later stored to define measurement parameters and statistical measurement correction (SMC), cf. 2214, Fig. 22. The positioning correction parameters may be defined in step 1918, including positioning correction data and positioning bounds 1918a.
With respect to the step of measuring correspondence to the reference device 1914, an operation is performed to provide a reduced model between the characterization device and the reference device in order to bring their measurements close to each other. In its most simple form the transformation can be defined as follows. If the reference device signal is a vector R_ref of dimension n_ref while the characterization device delivers filter responses F_test, a vector of dimension N_test, the correspondence operator denoted by B can be thought of as a series of vector-matrix operations:

R_ref,i = f_cor,i(B_i · F_test + d_cor,i)

for each index i = 1, ..., n_ref, with linear operators B_i (which may act on polynomial terms of F_test of rising dimension), constant shifts d_cor, and nonlinear functions f_cor that account for, e.g., the different linearity properties of the devices; f_cor is often a piecewise polynomial. It can also take the form of an activation function, so that the model incorporates a neural network. The parameters are found by first measuring nonlinearity properties on, e.g., neutral targets with Lambertian surface properties to obtain d_cor and f_cor. Afterwards, a series of samples relevant to the use case and samples having clearly defined properties (e.g. BCRA tiles) are measured, m_cor in total, and an optimization problem is solved, for example a generally nonlinear problem of the form

min over B, d_cor of: Σ_{j=1}^{m_cor} w_cor,j · E_cor(R_ref^(j), f_cor(B · F_test^(j) + d_cor))

with weighting parameters w_cor,j and a (generally nonlinear) distance function E_cor. The latter can be a vector norm or a nonlinear metric such as colorimetric delta E. Additional matrix-vector (in)equality constraints can be set to assure that, for a white tile, the difference in signals or XYZ coordinates does not drift far apart after the nonlinear transform. These parameters later become part of the multispectral data calibration parameters 1938 needed to form reference data from the position corrected multispectral data (steps 2216, 2030, 2124).
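In its simplest linear form (omitting the nonlinear f_cor), fitting the correspondence operator B and the shift d_cor reduces to a least-squares problem, sketched here with synthetic data standing in for the measured sample pairs:

```python
import numpy as np

# Sketch of the reduced correspondence model in its linear form:
# R_ref,i ≈ B_i · F_test + d_cor,i, fitted jointly over m_cor sample pairs.
rng = np.random.default_rng(2)
n_ref, n_test, m_cor = 16, 8, 30
F = rng.uniform(0, 1, (m_cor, n_test))       # characterization device readings
B_true = rng.uniform(0, 1, (n_ref, n_test))  # stand-in ground truth
d_true = rng.uniform(0, 0.1, n_ref)
R = F @ B_true.T + d_true                    # reference device signals

A = np.column_stack([F, np.ones(m_cor)])     # augment with ones for the offset
sol = np.linalg.lstsq(A, R, rcond=None)[0]   # shape (n_test + 1, n_ref)
B, d = sol[:n_test].T, sol[n_test]           # recovered operator and shift
```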
With respect to step 1918, calibration to obtain correction parameters to correct for variations in distance and orientation of the multispectral measurement system relative to the surface being measured is performed (steps 2024, 2122, 2208). Referring to Figure 23, measurements made by a multispectral measurement system 110 under test are recorded from measurements with a target sample at different distances which vary from a target distance d by an amount v, and at various angles θ to the sample surface. The sample measurements may include targets relevant for the use case as well as colors with different media properties (e.g. sample sets with different gloss properties or color reference samples like BCRA tiles).
The series of measurements allows one to design a correction scheme for using the positioning system to guide the multispectral measurement system into the desired position(s) for the non-contact measurement, and the bounds of positioning within which the desired trade-off between usability and precision may be attained (used in steps 2016, 2110, 2112). The boundaries take the precision of the positioning sensor into account; this precision can be transferred into the errors of the pseudospectra to derive acceptable bounds. As mentioned previously, these measurements may be performed using the characterization device and can be assumed quasi-static, i.e. constant for large numbers of devices under calibration.
For example, correction of the distance h to the required position h_0 may take the form of a polynomial

corr(h) = a_0 + a_1 (h − h_0) + a_2 (h − h_0)^2 + ...

As shown in Figure 24, in case the position sensor has imperfect precision, the stability of the correction curve defines the error propagated from the positioning sensor. In Figure 23, the device is positioned v = 3 mm from the target distance d. The position measurement error is transformed into an uncertainty in the value corr(h). The desired uncertainty defines the bounds of the positioning. The parameters a_0, a_1, a_2, ... may be scalar or may themselves include functional dependences on the multispectral data.
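The correction polynomial and the propagation of the position sensor's uncertainty into the corrected value can be sketched as follows (the coefficient values are illustrative placeholders, not calibrated parameters):

```python
# Sketch: quadratic distance-correction factor and first-order propagation
# of the position sensor's uncertainty into the correction value.
def corr(h, h0, a0, a1, a2):
    return a0 + a1 * (h - h0) + a2 * (h - h0) ** 2

def corr_uncertainty(h, h0, a1, a2, dh):
    # |d corr / dh| * dh, first-order error propagation
    return abs(a1 + 2 * a2 * (h - h0)) * dh

a0, a1, a2, h0 = 1.0, 0.02, 0.001, 30.0   # placeholder coefficients, h0 in mm
c = corr(33.0, h0, a0, a1, a2)            # device 3 mm past target distance
u = corr_uncertainty(33.0, h0, a1, a2, dh=0.5)  # sensor precision 0.5 mm
```

Comparing u against the desired uncertainty yields the acceptable positioning bounds described above.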
Additional correction for orientation may be applied together with the distance correction. Distance and orientation corrections may be applied together, in parallel, sequentially or iteratively. Angle of orientation correction may include bidirectional reflectance distribution function (BRDF) parameters, such as a model of the surface effects (e.g., amount of gloss, texture), sample characteristics and substrate nature. An example of applying a BRDF model is a correction based on the Oren-Nayar model with dynamically determined parameters. See, for example, Michael Oren and Shree K.
Nayar, Generalization of Lambert's reflectance model, In Proceedings of the 21st annual conference on Computer graphics and interactive techniques (SIGGRAPH '94). ACM,
New York, NY, USA, 239-246.
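An Oren-Nayar based orientation gain for the retro-reflective geometry can be sketched as follows, assuming equal illumination and observation inclination and zero azimuth difference (this uses the qualitative Oren-Nayar reflectance model; the gain construction is an illustrative assumption, not the patent's exact formula):

```python
import math

# Sketch: Oren-Nayar relative reflectance for the retro-reflective geometry
# (incidence angle = observation angle = theta), after Oren & Nayar (1994).
def oren_nayar_factor(sigma, theta):
    """Relative reflectance at inclination theta for roughness sigma (rad)."""
    s2 = sigma ** 2
    A = 1.0 - 0.5 * s2 / (s2 + 0.33)
    B = 0.45 * s2 / (s2 + 0.09)
    return math.cos(theta) * (A + B * math.sin(theta) * math.tan(theta))

def orientation_gain(sigma, theta, theta0):
    """Factor rescaling a filter value from measured angle to calibrated angle."""
    return oren_nayar_factor(sigma, theta0) / oren_nayar_factor(sigma, theta)

g = orientation_gain(0.3, math.radians(30.0), math.radians(22.5))
```

For sigma = 0 the model reduces to the Lambertian case, where the gain is simply the ratio of cosines of the two inclination angles.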
As such, a case dependent model of the dependence of the cavity angle standard deviation σ on the filter signals may be derived. For example, filter responses F_1, ..., F_Ntest for a series of use case dependent patches with known material properties are measured during step 1918 at the calibrated distance h_0 and varying orientation angles (for example, varying the in-plane orientation angle θ). The measured filter responses and known material properties (e.g. the cavity angle standard deviation σ) are used to derive the parameters of the orientation correction step. For example, parameters d_1, d_2 are derived such that material properties (e.g. σ) during the measurement process (Fig. 20) of an unknown sample may be put into correspondence with the distance corrected filter responses F_1, ..., F_Ntest, e.g. σ ≈ d_1 + d_2 Σ_i F_i. The parameters d_1, d_2 are part of the position correction algorithm parameters 1918a, which are later stored in the memory of the multispectral sensing device. This procedure may include different forms of data processing instead of taking the sum of the multispectral data, such as taking the mean, maximum, etc.
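The linear material-property relation described above can be sketched directly (the parameter and filter values are placeholders):

```python
# Sketch: surface roughness sigma estimated from the sum of the
# distance-corrected filter responses, with d1, d2 taken from the stored
# position correction parameters 1918a (values here are placeholders).
def estimate_sigma(filter_values, d1, d2):
    return d1 + d2 * sum(filter_values)

F_corrected = [0.8, 0.7, 0.6, 0.5, 0.5, 0.4, 0.35, 0.3]
sigma = estimate_sigma(F_corrected, d1=0.05, d2=0.04)
```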
Moreover, surface properties are measured during step 1918 using the plurality of calibration patches. For example, the change of the filter responses F_1, ..., F_Ntest at the calibrated distance is measured on dark calibration patches to define the orientation-dependent surface component D_s(σ, θ) as the difference between the filter response at the calibrated orientation and at a varying orientation. The component may be defined as a functional dependence, using a spline approximation of the measured values as a function of the arguments σ, θ.
The system optics may include diverging light beams. This property leads to signal changing with distance. The correction may be made as follows. Assume that the multispectral device delivers multispectral data Flt ... , FN for some N, the calibrated distance and orientation are hQ, q0 and the distance and orientation measured by the position correcting system are h, Q. Correction algorithm has following inputs:
• multispectral data F_1, ..., F_N,
• the parameters d_1, d_2 and the parameters of the operator A of the distance and orientation correction algorithm, and additionally the functional dependence D_S(σ, θ) of the surface component,
• position information delivered by the position correcting system (e.g. distance and orientation angles h, θ) together with the calibration position h_0, θ_0.
Outputs of the algorithm are the distance and orientation corrected multispectral values F̃_1, ..., F̃_N.
The algorithm may be implemented as follows.
1. The distance correction is made based on the distance shift h − h_0. The step has the multispectral values F_1, ..., F_N as an input and the distance corrected multispectral values F̂_1, ..., F̂_N as an output. For example,

F̂_i = F_i · A(F_i, h, h_0) for i = 1, ..., N.

The parameters of the operator A(F_i, h, h_0) are found during step 1918 as described previously.
2. The distance corrected multispectral data is summed up. The step has the distance corrected multispectral data F̂_1, ..., F̂_N as an input and the sum of the distance corrected multispectral data Σ_{i=1}^{N} F̂_i as an output.
3. This sum Σ_{i=1}^{N} F̂_i is used as the argument of a linear relation with parameters defined by the calibration step 1918 to define the material property, the standard deviation σ of the orientation of the cavities used in the Oren-Nayar model. Inputs of this step include the parameters d_1, d_2 and the sum of the distance corrected multispectral data Σ_{i=1}^{N} F̂_i. An output of the step is the material property of the surface, e.g. σ. As mentioned previously, this step may have the form σ = d_1 + d_2 · Σ_{i=1}^{N} F̂_i, with the parameters d_1, d_2 being part of the parameters 1918a defined during step 1918 and stored in device memory.
4. The Oren-Nayar model is used to find the ratio of the filter value at the desired orientation to the filter value at the current orientation; denote it Õ(σ, θ). Then, the distance corrected filter values are multiplied by Õ(σ, θ) to arrive at the orientation and distance corrected filter values F̃_1, ..., F̃_N. For targets with strong surface properties, an orientation-dependent surface component D_S(σ, θ), depending on the orientation angle θ and defined during step 1918, is subtracted from the distance corrected multispectral data before it is multiplied by Õ(σ, θ). Inputs of this step include the distance corrected multispectral data F̂_1, ..., F̂_N. Outputs of this step include the distance and orientation corrected multispectral data F̃_1, ..., F̃_N. For example,

F̃_i = (F̂_i − D_S(σ, θ)) · Õ(σ, θ) for i = 1, ..., N.
This procedure may be iterated: the distance correction is determined again, followed by the orientation correction.
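The four steps above can be sketched as a single pipeline. In the sketch below the calibrated operator A is stood in for by a simple inverse-square distance scaling, and the Oren-Nayar-derived ratio Õ(σ, θ) is passed in as a caller-supplied function; both stand-ins are assumptions for illustration, not the patented calibrated operators.

```python
# Sketch of correction steps 1-4. The distance operator A and the
# Oren-Nayar ratio are simplified placeholders for the calibrated
# quantities determined in step 1918.

def correct_measurement(F, h, h0, theta, d1, d2, oren_nayar_ratio,
                        surface_component=None):
    """Return (sigma, distance- and orientation-corrected filter values)."""
    # Step 1: distance correction (placeholder operator A: inverse-square law).
    F_dist = [f * (h / h0) ** 2 for f in F]
    # Step 2: sum the distance-corrected data.
    total = sum(F_dist)
    # Step 3: linear relation yields the material property sigma.
    sigma = d1 + d2 * total
    # Step 4: orientation correction via the Oren-Nayar-derived ratio;
    # for strongly structured surfaces subtract the surface component first.
    ratio = oren_nayar_ratio(sigma, theta)
    if surface_component is not None:
        ds = surface_component(sigma, theta)
        F_corr = [(f - ds) * ratio for f in F_dist]
    else:
        F_corr = [f * ratio for f in F_dist]
    return sigma, F_corr
```

Iterating the procedure, as the text describes, would mean feeding `F_corr` back through the distance correction with updated position data.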
The material properties (e.g. cavity angle standard deviation σ) as well as the material correction parameters may be set or corrected by the user in the settings 2036. Alternatively, material parameters may be defined automatically using the device illumination and a picture of the sample taken by the camera 2010. Knowing the geometry of the illumination relative to the camera pickup, one can put the reflectance or color information detected by the camera at a specific position of the sample in relation to the angle of the incoming illumination beam at this position. The BRDF may be derived from the correction model parameters as set forth above and/or the BRDF may be detected automatically using the camera 2010. Automatic BRDF detection requires no presetting of the position correction parameters in the nonvolatile device memory. Corrections may be made to the multispectral data based on such BRDF information.
Another part of the calibration process in characterizing the device 1920 includes the steps to define multispectral calibration parameters relevant for each multispectral measurement system, but not for the use case, so it can be done once for each device (steps 1922, 1924, 1926). This involves determining saturation, optimal gain and other parameters to optimally measure the samples of the use case. Data like the signal-to-noise ratio (SNR) and the noise model are used to model the test device and the device under calibration. Calibration for each multispectral measurement system 110 may be divided into two parts. The first part aims at defining parameters which help transform raw data from the multispectral detector into multispectral information during the early stages of data processing (cf. step 2022, Fig. 20, and step 2206, Fig. 22). These are calibration steps such as measuring neutral targets in step 1924 to determine white transfer and linearity information 1924a and the blacktrap measurement in step 1926 to determine black offset information 1926a. They are needed to compute multispectral information before the position correction in step 2208, Fig. 22.
The second part of the calibration is performed for each multispectral measurement system 110 and for each use case. It delivers the parameters defining the computation of colorimetric coordinates or other data computed in step 2216, Fig. 22, which are needed to perform the search in the reference database in step 2220, Fig. 22. For faster calibration, in step 1922 the filter curves F_i^calib(λ) for all filters i = 1, ..., N_calib and the LED spectra S(λ) are measured at a sufficient number of points. The filter curves correspond to the filter functions for the photodiodes of the color detector, as illustrated in Figure 8. Using a quadrature rule Q over the wavelength domain of the multispectral sensor, a mathematical approximation of the filter responses is created,

F_i^calib,sim = g(Q(F_i^calib(λ) · S(λ) · R(λ))) + e,

with R(λ) denoting the reflectance of the measured sample,
where the function g(F^calib) includes white normalization and linearization. The term e includes additional parameters, such as the noise defined for the characterization device. As shown in Figure 19, additional measurements may be done in step 1928 to further trim the model by comparing simulated filter values with the real filter values for a limited number of filters. They may be used to adjust the model in step 1936 or serve as support vectors during optimization in step 1938. They may also be used to adjust the operator B in step 1914. For some devices, the number of measured targets should be increased if the use case is not yet studied. This part is not limited to measurements during production. They may also be performed by the user to adjust the model or account for instability and drift in the hardware of the non-contact multispectral measurement device 100. The samples may be unrelated to the use case in question, e.g. they may be part of the Pantone color library. Physical targets may be produced by a third-party company. Their measurement and the subsequent calibration parameter correction is then a separate part of the device workflow (Fig. 20), with the option of doing the correction on the server.
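The quadrature approximation can be illustrated with a trapezoidal rule over a tabulated wavelength grid. In the sketch below the filter curve, LED spectrum and reflectances are hypothetical sampled arrays, g is reduced to a plain white normalization, and the noise term e is omitted; these simplifications are assumptions for illustration.

```python
# Sketch of simulating one filter response by a quadrature rule Q over the
# sensor's wavelength domain. All spectral curves are sampled on the same
# wavelength grid; g is reduced here to white normalization only.

def simulate_filter_response(wavelengths, filter_curve, led_spectrum,
                             reflectance, white_reflectance):
    def trapz(values):
        # Trapezoidal quadrature over the sampled wavelength grid.
        return sum((values[i] + values[i + 1]) / 2.0 *
                   (wavelengths[i + 1] - wavelengths[i])
                   for i in range(len(wavelengths) - 1))

    raw = trapz([f * s * r for f, s, r in
                 zip(filter_curve, led_spectrum, reflectance)])
    white = trapz([f * s * w for f, s, w in
                   zip(filter_curve, led_spectrum, white_reflectance)])
    return raw / white  # white-normalized filter response
```

Running this simulation over a large reference sample collection is what produces the simulated raw data used to optimize the calibration parameters in step 1936.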
Using the calibrated device model, device parameters such as the SNR, samples measured with the calibrated device and the reference device database, optimized parameters are derived and stored in step 1938. The optimized parameters are used to derive filter or colorimetric data from the values from the multispectral color detector. A formula similar to the one for the operator B may be used to adjust the signals before calibration. For values chosen as reference data, the calibration parameters are determined during the calibration step. It is similar to the optimization problem for the operator B. For example, the optimization delivers a linear operator to transform the multispectral coordinates to XYZ colorimetric coordinates, followed by a transformation into L*a*b* coordinates, followed by another linear correction. In this case, the correction model may be as follows. Assume that the (position corrected) multispectral information is denoted by F^calib, being a vector of length N_calib (in some instances, N_calib = N_test if the characterization and calibration devices are similar). For illustration we assume that the operator B has the simple form of a matrix multiplication and shift acting only on the XYZ coordinates. The step 2216 in Figure 22 then takes the form:
(L*, a*, b*)^T = LAB(B_XYZ^calib · F^calib + d_XYZ^calib),

where LAB is a function implementing the standard L*a*b* calculation from XYZ, B_XYZ^calib is a matrix with 3 rows and N_calib columns, and the vector d_XYZ^calib has 3 rows. These elements are part of the multispectral data calibration parameters 1938 and may be defined in step 1936 by obtaining simulated raw data F^calib,sim, transforming it to multispectral data and running an optimization. Various merit functions are possible, including the hit rate. One simple illustration would be the quadratic difference to the reference device. Assume that, as previously, there are i = 1, ..., m_cor reference targets and that the corresponding reference L*a*b* coordinates are L*(i), a*(i), b*(i). The optimization problem may be stated as follows:
min over B_XYZ^calib, d_XYZ^calib of Σ_{i=1}^{m_cor} w_i^calib · E^calib(LAB(B_XYZ^calib · F^calib(i) + d_XYZ^calib), (L*(i), a*(i), b*(i)))   (eq. 1)
or, for example, for a multispectral detector comprising eight filters,

(L*, a*, b*)^T = LAB((B_1,1 ... B_1,8; B_2,1 ... B_2,8; B_3,1 ... B_3,8) · (F_1, ..., F_8)^T + d_XYZ^calib),
with the parameters B_1,1, ..., B_3,8 defined as a solution of the problem of eq. 1, with weighting parameters w_i^calib and a (generally nonlinear) distance function E^calib. The optimization step may be done on a non-contact multispectral measurement device 100 or on a cloud server.
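Under the quadratic-difference illustration, the objective of eq. 1 can be evaluated as below. The sketch treats the mapped values directly as L*a*b* coordinates instead of applying the full CIE LAB conversion, and the sample and weight structures are hypothetical; it shows only the merit function an optimizer would minimize over B and d, not the optimizer itself.

```python
# Sketch of the weighted quadratic merit function of eq. 1. The LAB
# conversion is omitted (mapped values stand in for Lab directly), so this
# is an illustrative stand-in, not the full colorimetric pipeline.

def merit(B, d, samples, weights):
    """samples: list of (multispectral_vector, reference_lab) pairs;
    B: 3 x N matrix (list of rows); d: length-3 shift vector."""
    def apply(B, d, F):
        # Matrix multiplication and shift: B * F + d.
        return [sum(b * f for b, f in zip(row, F)) + di
                for row, di in zip(B, d)]
    total = 0.0
    for w, (F, lab_ref) in zip(weights, samples):
        lab = apply(B, d, F)
        total += w * sum((a - b) ** 2 for a, b in zip(lab, lab_ref))
    return total
```

An optimization routine would then search for the B and d (the parameters B_1,1, ..., B_3,8 and the shift) minimizing this value over the reference targets.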
As noted before, the calibration and correction algorithms are designed to work with the spectral measurement data. This concept provides higher accuracy and increases the flexibility to use the data sets in an open system architecture supporting different applications with different calibration requirements. Calibration profits from the multispectral nature of the data. It is also possible that the number of channels and their positioning would allow for direct computation of the colorimetric coordinates in the case of a hyperspectral or full spectral measurement system (e.g., when the channels deliver an approximation to the signal over the full wavelength range with a 10 nm or smaller step size). In a hyperspectral or full spectral measurement system the calibration matrix can also be avoided.
Device characterization (1914, 1922, 1924, 1926, 1928) is used to simulate the multispectral measurement system. This step allows for the calculation of the multispectral data that the multispectral measurement system 110 would deliver for the sample collection held in the sample data collection. This largely virtual simulation of the filter responses to a large number of measurements taken by the reference device may be performed on a computer to define the parameters needed to obtain the filter and colorimetric data described in step 2216 in Figure 22.
Device calibration may also involve measuring samples from a sample library 1928. This allows each calibrated device to include corrections to the calibration data. This step is presented as a separate block in Figure 19. This step may partly be delegated to the end user, if sample data with controlled standard properties is distributed to the user or purchased by the user.
Figures 20a, 20b, 21, and 22 illustrate steps involved in making a measurement with a non-contact multispectral measurement device 100 according to the present invention from three perspectives: time, sequence and dataflow, respectively. Figures 20a and 20b show examples of how a workflow progresses over time. Referring to Figure 20a, the initial step is to start the software application 2002. Position measurement is activated 2004, the device position (i.e., distance and orientation with respect to the surface of interest) is computed 2006, and guidance is provided to the user 2008 to position the non-contact multispectral measurement device 100 at an appropriate distance and orientation to the surface to be measured. The appropriate use case calibration data is retrieved in 2010. Position continues to be measured and computed 2014, and if the position permits measurement 2016, one or more measurements are made 2018. Several measurements may be averaged or statistically combined using the statistic measurement correction procedure (SMC), which may include, but is not limited to, the averaging procedure defined in step 1920 of the calibration.
Measurements may be made with and without activating LED illumination sources 2018 to allow for correction for ambient light 2020 with input parameters such as white transfer and black offset 2022. Parameters for step 2022 are found during the device specific calibration steps 1922, 1924, 1926 of the device specific calibration process 1904 as described earlier. Additionally, ambient light may be measured by an ambient light sensor. Time modulated light and demultiplexing may also be used for ambient light correction (lock-in techniques). Corrections to multispectral data from the multispectral detector may be made to compensate for device distance and orientation (parameters 1918a are found during step 1918 of the optimization) to the sampled surface 2024 and for ambient light 2020 to produce corrected multispectral data 2026. The multispectral data may then be corrected for the selected use case 2028.
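The paired LED-on/LED-off measurement for ambient-light correction can be sketched as a channel-wise subtraction. The sketch below folds the black offset and white transfer of step 2022 into a simple per-channel offset and scale; the exact form of those corrections is calibration-dependent and the one here is an assumed illustration.

```python
# Sketch of ambient-light compensation: one frame with LED illumination on
# and one with it off, subtracted channel-wise, then black offset and white
# transfer applied as a simple per-channel offset/scale.

def ambient_corrected(leds_on, leds_off, black_offset, white_scale):
    return [(on - off - black) * scale
            for on, off, black, scale in
            zip(leds_on, leds_off, black_offset, white_scale)]
```

Lock-in style correction with time-modulated light, which the text also mentions, would replace the single subtraction with demultiplexing over many modulation periods.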
Parameters of the correction 1938 are found during the device and use case specific calibration as described earlier. Reference data is accessed in 2030. The corrected spectral data may then be used to search 2032 a color database 2034, and results may be displayed 2036.
Referring to Figure 20b, another example of a workflow is provided. The multispectral measurement system 110 is activated in step 2050 to obtain multispectral filter responses 2052. The multispectral filter responses are corrected for the white transfer, linearity and black offset parameters 2054 in step 2056. Correction for ambient light is performed in step 2058 to provide multispectral data 2060. Position measurement is activated and the device position (i.e., distance 2068 and angle of orientation 2082 with respect to the surface of interest) is obtained 2070. Corrections to the multispectral data from the multispectral detector may be made to compensate for the device distance 2068 to the sampled surface 2074 to produce distance corrected multispectral data 2072. Processed multispectral data 2076 is stored in memory. Orientation data 2082 and orientation calibration information 2078 are accessed. In step 2080, material properties 2084 of the surface being sampled are accessed depending on the use case. The multispectral data may then be corrected 2086 for the orientation of the multispectral system toward the surface being sampled 2082, a model of the surface being sampled 2088, and other surface properties 2090, to produce distance and orientation corrected multispectral data 2092.
Figure 21 illustrates a logical sequence for obtaining color measurements with a device according to the present invention, including several loops that may be interrupted by an event triggered by user action or by a timeout. The workflow in this section starts when the user opens the application 2102. After the application is started, the positioning system, optionally together with the camera, starts observing the scene to bind position markers 2104. In step 2106, a decision is made whether to proceed with a measurement. The device position may continue to be monitored in either case, at 2104, 2108. In 2110, the display may show audio-visual information to guide the user closer to the desired position. The camera positioning markers of the positioning system may be visible so that the approximate position of the measurement is visible. The camera can further capture the whole scene of which the sample is a part, so that the user gets additional information about the sample together with its position in the scene. The application waits for user input to start a measurement or a series of measurements.
After measurement(s) are started by the user, the positioning markers, already calculated and identified by the position correction system, are used to identify the device distance and angle relative to the surface being sampled. The application provides a visual indication of whether the position is in bounds 2112 so that the measurement can be performed within bounds that are capable of being corrected. The bounds are decided during step 1918 of the calibration. Depending on the tradeoff between position and measurement precision defined during the calibration or additionally changed by the user, the application decides whether the position is close enough 2112 so that the sensor correction may be applied, while guidance is shown to the user on the screen. If the decision is negative, a decision is taken by the user or by timeout whether a measurement should be made or whether further positioning will be attempted.
If the position is appropriate, a measurement is performed 2116 as described below with and without additional illumination to compensate for the ambient light. The measurement may be stopped in 2118 or additional measurements may be made. The measurement is stored in device memory and a loop termination condition is applied. If the loop continues, the series of measurements is made as was described earlier.
If the measurements are finished, the data collected during the loop is processed by applying steps including correcting for ambient light and computing multispectral data 2120, correcting for distance and angle to the surface 2122, and computing reference data 2124. Aggregated and processed information from the positioning sensor, camera and multispectral sensor is passed for the search 2126, the result is visualized 2128, and additionally the sample data collection 2130 is updated. After the workflow is finished, the application is put into stand-by mode for the user to start another series of measurements or otherwise interact with the application, e.g. by changing the measurement parameters.
When the application is started, but before the sample acquisition command is issued, the system is active with the positioning sensor observing the scene. It identifies the positioning markers. Several measurements are used to perform averaging and position correction aiming at colorimetric information.
The system has separate settings that may be accessed by the user through the application. Before starting the measurement or before performing the measurement in the sample data collection, usually including but not limited to a color library with corresponding metrics, the user can change the use case identity and supply metadata related to a particular measurement such as user age, geographic position, time of measurement, etc. In some applications, e.g. when the measured sample is human skin, this metadata influences the calibration data and can be used for search metrics; thus the sample data collection is not necessarily reduced to color. Advanced settings include measurement parameters such as the averaging procedure, gain and integration time.
These are preset for each use case during calibration with the characterization device, but may also be changed by the user on demand. The user may also set the targeting bounds relative to the tradeoff between usability and positioning precision.
The dataflow of a single measurement is shown in Figure 22, from starting the measurement to displaying the results. For illustration purposes, it is shown here that the measurement is triggered by the user, such as manually or by voice. As depicted in Figure 21, this is not the only option; measurements may also be performed in an automatic mode while the user changes and adjusts the position until the process is interrupted. In any case, a single measurement in this scheme may include a number of illumination and measurement pairs to compensate for the ambient light. For some applications, e.g. when a multispectral measurement system is embedded into a device having limited or no human user interfaces, the triggering of the measurement would be by the device and associated programming, not a manual input. In this case, the display and user interface serve to input the settings.
After the sample identification command is issued by the user 2202, the hardware is activated. Filter responses are measured 2204 by the multispectral detector. The signal measured by the multispectral detector passes the initial processing step where dark current, linearity etc. are corrected 2206. White correction and filter crosstalk correction are applied leading to multispectral information 2208. These parameters are found during steps 1922, 1924 and 1926 of the calibration as described earlier.
At the same time, the device camera acquires images of the sample to define the distance and orientation of the device relative to the sample 2210. The positioning system may be designed to function with the device camera, have its own camera, or be used without the device camera to define the position (e.g. using time of flight or employing stereovision).
When other sensors 2212 (e.g. clock, GPS) are present, their data may be appended with the user settings (age etc.) to form the sample metadata 2214 that may be used in the correction process 2208 (e.g. when correcting pseudospectra from the skin) or during the search in the sample data collection (e.g. searching for the last skin data related to the user).
The multispectral data is corrected for the ambient light 2208. The multispectral data are further corrected using the positioning information and an algorithm for position correction determined for each use case during the calibration (step 1918 of the calibration process described earlier). Positioning information includes, but is not limited to, distance and orientation (e.g., sample curvature can be supplied by the positioning system). As described above, a single measurement may be a part of a series of measurements. These measurements may be used to further interpolate the values to the desired position and perform averaging to minimize positioning sensor error and other random effects as described in the next paragraph. These parameters are part of the SMC parameters found in the step 1920 of the calibration.
Corrected filter responses and position information are supplied to the next processing step (also in 2208), where the filter responses are averaged in case multiple measurements were performed. They are further corrected using the calibration parameters 2218 supplied by the calibration of the multispectral measurement system 2216. Calibration includes, but is not limited to, calibration to account for geometry or other differences between the reference device and the multispectral measurement system. During the device specific and use case specific calibration these parameters are found and stored in 1938. The calibration data also corrects for the individual device features, so that unified data may be used with the sample database. Furthermore, the calibration is used to compute the reference data (e.g. colorimetric data) from the multispectral data. For example, the XYZ or the L*a*b* coordinates are computed. The data is specific to the use case identity and can possibly be altered using the metadata. In Figure 22, the calibration parameters are stored in the processing unit. They may additionally be accessed from a cloud-based server or on the server by supplying the unit identity.
The spectral data may be transferred to the cloud server for further processing. The reference database 2220 may be stored on the server, although this step may also or alternatively happen in the processing unit. The database may comprise sample color data for different illuminations measured by a third-party supplier as well as data related to the user (such as skin data for the user taken over time) and thus created by the user or by a group of users.
Part of the calibration data for the colorimetric system may be stored on the server/cloud and accessed using the unit ID for post processing. The database may also include products related to the sample measured and identifiable by the color and appearance information (such as foundation products for skin or paint).
The sample data collection also contains metrics and a set of rules to find the closest match to a measured sample, thus it may include some functionality of an expert system. These procedures are used to search in the color library to find the closest match 2222. For example, the match may be decided using the metadata such as looking for the last measurement of user skin. Alternatively, the closest match may be sought for by using colorimetric data, e.g. by calculating a metric comprised of a weighted sum of delta E to the measured sample under different illuminants.
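The weighted-delta-E matching rule described above can be sketched directly. The sketch uses the simple CIE76 Euclidean delta E form and hypothetical dictionary structures for the library and illuminant weights; a production system might use delta E 2000 or metadata-driven rules instead.

```python
# Sketch of the closest-match search: a metric formed as a weighted sum of
# delta E (CIE76 form here) to the measured sample under several illuminants.

def delta_e76(lab1, lab2):
    """Euclidean distance between two L*a*b* triples (CIE76 delta E)."""
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

def closest_match(sample_labs, library, weights):
    """sample_labs: dict illuminant -> Lab of the measured sample.
    library: dict entry_name -> dict illuminant -> Lab.
    weights: dict illuminant -> weight."""
    def score(entry):
        return sum(weights[ill] * delta_e76(sample_labs[ill], entry[ill])
                   for ill in weights)
    return min(library, key=lambda name: score(library[name]))
```

The metadata-based rules mentioned in the text (e.g. "return the last skin measurement of this user") would filter or re-rank the library before this color-distance step.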
If the search and measurement were successful 2224, the database may be updated 2226, e.g. the database of skin measurements of the user is augmented or the color information of a new inspiration is added together with the data supplied by the user. The workflow is then passed back to the device 2228, more specifically to the application, which may display the color of the inspiration item 2230 or sample information 2232. If the measurement and/or the search in the database have failed, the workflow is also passed back to the application 2234. The camera of the device is used for targeting and positioning of the sample with respect to the device. The acquired images of the camera may be visualized on the device display. The individual images may be overlaid by positioning marks to guide the user to an optimal position. This provides direct user interaction for the operator and helps to support and control the measurement workflow.
The positioning marks may be placed on the images based on information from the positioning sensor. If a dot pattern projector is used, the dots themselves may be used to guide the user, as shown in Figure 25, where circles indicate the range where the dots shall be. Another option is to use the position information to place marks on the image.
The SW application has stored information available to calculate the location of the detector pick-up area on the sample for different distances and angular orientations. This can be done by trigonometric calculations. This helps the SW application to provide on the display a position marker for the virtual position of the measurement field of the color sensor. The user may use this virtual marker to position it at the desired location in the sample image.
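The trigonometric calculation can be illustrated for the simplest case: with the sensor axis tilted by an angle relative to the surface normal and held at a given distance, the pick-up spot shifts laterally by roughly distance times the tangent of the tilt. The fixed offset between sensor and camera axes is a hypothetical constant here; a real device would use its calibrated optical geometry.

```python
# Sketch of placing the detector pick-up spot: lateral displacement of the
# measurement field on the sample plane as a function of distance and tilt.

import math

def pickup_offset(distance, tilt_deg, axis_offset=0.0):
    """Lateral displacement of the measurement field on the sample plane.

    axis_offset is an assumed fixed sensor-to-camera-axis offset.
    """
    return axis_offset + distance * math.tan(math.radians(tilt_deg))
```

The application would convert this displacement into image pixels via the camera's intrinsic parameters to draw the virtual marker.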
There are different possibilities to support the user graphically on the display with the distance and angular control. One possibility is to use a rectangular control window where the position of the marker corresponds to the measured angle. The goal is to shift the marker to the center of the window by tilting the non-contact multispectral measurement device 100 with the spectral sensor. The marker is red when the alignment is out of the angular tolerances. The marker becomes green as soon as the angular alignment is within the pre-determined tolerance limits. This is shown in Figure 26. A similar concept can be applied for the distance control. This is shown in Figure 27. Such alignment features may be superimposed on the real camera image of the scene.
A first method to initiate a measurement cycle and create a valid result is to align the distance and the angular orientation of the non-contact multispectral measurement device 100 until the display indicates that the position is within the tolerance bands around the reference position. Then a single measurement is started by a control interaction. The control interaction may comprise pressing a button or a voice command.
A second method is to execute a series of multiple measurements around the reference position. All measurements are stored. For each measurement the distance is corrected. The angular information for each measurement is used for an interpolation of the measurement results at the reference angle.
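The interpolation at the reference angle in this second method can be sketched as a per-channel linear interpolation between the two stored measurements whose angles bracket the reference angle. The linear form and the data layout are assumptions for illustration; the calibration may prescribe a different interpolation scheme.

```python
# Sketch of interpolating a series of distance-corrected measurements to
# the reference angle (per-channel linear interpolation between the two
# measurements bracketing the reference angle).

def interpolate_at_angle(measurements, ref_angle):
    """measurements: list of (angle, filter_values), sorted by angle."""
    for (a0, f0), (a1, f1) in zip(measurements, measurements[1:]):
        if a0 <= ref_angle <= a1:
            t = (ref_angle - a0) / (a1 - a0)
            return [x0 + t * (x1 - x0) for x0, x1 in zip(f0, f1)]
    raise ValueError("reference angle outside measured range")
```

Averaging several such interpolated results over the series would then reduce positioning-sensor error and other random effects, as described for the SMC procedure.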
Exemplary embodiments of the present invention include, but are not limited to, the following.
A multispectral measurement system for measuring reflectance properties of a surface of interest, comprising a multispectral detector configured to measure spectral information in a plurality of bands of optical radiation, each band of optical radiation corresponding to a filter function, the multispectral detector comprising a plurality of photodiodes, each photodiode having a filter corresponding to one of the filter functions, there being at least two photodiodes corresponding to each of the filter functions located in a point-symmetric arrangement in a two dimensional array; and observation optics having an aperture and being configured to observe the surface of interest; wherein each of the plurality of photodiodes is located in a different location with respect to the aperture, and wherein differences in a field of view of the surface of interest for each photodiode are compensated for by combining measurements of point-symmetric photodiodes.
The multispectral measurement system as above, wherein the plurality of bands of optical radiation comprise at least six bands of optical radiation.
The multispectral measurement system as above, further comprising at least one illumination source illuminating at least the field of view.

The multispectral measurement system as above, wherein the at least one illumination source is a non-collimated illumination source.
The multispectral measurement system as above, wherein the multispectral measurement system further comprises at least one illumination source illuminating at least the field of view, wherein the at least one illumination source emits optical radiation over a range of visible light.
The multispectral measurement system as above, wherein the multispectral measurement system comprises at least one illumination source, wherein the at least one illumination source emits visible light radiation and infra-red radiation, and wherein the multispectral detector has filter functions corresponding to visible light and near infra-red optical radiation.
The multispectral measurement system as above, wherein the multispectral measurement system comprises at least one illumination source, wherein the at least one illumination source emits visible light radiation and ultraviolet radiation, and wherein the multispectral detector has filter functions corresponding to visible light and near ultraviolet optical radiation.
A non-contact multispectral measurement device comprising the multispectral measurement system as above, and further comprising: a position measurement system for measuring position values of the multispectral measurement system relative to the surface of interest; and means to correct multispectral values from the multispectral measurement system based on detected position values from the position measurement system.
The non-contact multispectral measurement device as above, wherein the measurement device comprises a mobile communications device.
The non-contact multispectral measurement device as above, wherein the position measurement system is selected from the group consisting of: a pattern projector and a camera, a camera autofocus system, a stereo vision system, a laser range finder, and a time of flight distance sensor.

Claims

What is claimed is:
1. A multispectral measurement system for measuring reflectance properties of a surface of interest, comprising: a multispectral detector configured to measure spectral information in a plurality of bands of optical radiation, each band of optical radiation corresponding to a filter function, the multispectral detector comprising a plurality of photodiodes, each photodiode having a filter corresponding to one of the filter functions, there being at least two photodiodes corresponding to each of the filter functions located in a point-symmetric arrangement in a two dimensional array; and observation optics having an aperture and being configured to observe the surface of interest; wherein each of the plurality of photodiodes is located in a different location with respect to the aperture, and wherein differences in a field of view of the surface of interest for each photodiode are compensated for by combining measurements of point-symmetric photodiodes.
2. The multispectral measurement system of claim 1, wherein the plurality of bands of optical radiation comprise at least six bands of optical radiation.
3. The multispectral measurement system of claim 1, further comprising at least one illumination source illuminating at least the field of view.
4. The multispectral measurement system of claim 3, wherein the at least one illumination source is a non-collimated illumination source.
5. The multispectral measurement system of claim 1, wherein the multispectral measurement system further comprises at least one illumination source illuminating at least the field of view, wherein the at least one illumination source emits optical radiation over a range of visible light.
6. The multispectral measurement system of claim 1, wherein the multispectral measurement system comprises at least one illumination source, wherein the at least one illumination source emits visible light radiation and infra-red radiation, and wherein the multispectral detector has filter functions corresponding to visible light and near infra-red optical radiation.
7. The multispectral measurement system of claim 1, wherein the multispectral measurement system comprises at least one illumination source, wherein the at least one illumination source emits visible light radiation and ultraviolet radiation, and wherein the multispectral detector has filter functions corresponding to visible light and near ultraviolet optical radiation.
8. The multispectral measurement system of claim 1, wherein measurements of point-symmetric photodiodes are combined by adding the measurements of point-symmetric photodiodes.
9. A non-contact multispectral measurement device comprising the multispectral measurement system of claim 1, and further comprising: a position measurement system for measuring position values of the multispectral measurement system relative to the surface of interest; and means to correct multispectral values from the multispectral measurement system based on detected position values from the position measurement system.
10. The non-contact multispectral measurement device of claim 9, wherein the measurement device comprises a mobile communications device.
11. The non-contact multispectral measurement device of claim 9, wherein the position measurement system is selected from the group consisting of: a pattern projector and a camera, a camera autofocus system, a stereo vision system, a laser range finder, and a time of flight distance sensor.
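The point-symmetric combination recited in claims 1 and 8 can be sketched as follows. The helper name and the NumPy array layout are assumptions for illustration only; the claims do not prescribe a data layout, only that photodiodes sharing a filter function sit in point-symmetric positions and that their measurements are added.

```python
import numpy as np

def combine_point_symmetric(readings):
    """Add each photodiode reading to that of its point-symmetric
    partner in the two-dimensional array.

    readings: (rows, cols) array in which the photodiode at (r, c)
    and the one at (rows-1-r, cols-1-c) share the same filter function.
    """
    # Rotating the array by 180 degrees maps every photodiode onto its
    # point-symmetric partner; adding the rotated array to the original
    # therefore combines each pair, so the opposite field-of-view
    # offsets of the two diodes compensate each other.
    return readings + np.rot90(readings, 2)
```

For a 2x2 array, for example, the diode at (0, 0) is paired with the one at (1, 1), and (0, 1) with (1, 0).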
PCT/EP2019/061520 2018-05-04 2019-05-06 Non-contact multispectral measurement device with improved multispectral sensor WO2019211484A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862667106P 2018-05-04 2018-05-04
US62/667,106 2018-05-04

Publications (1)

Publication Number Publication Date
WO2019211484A1 true WO2019211484A1 (en) 2019-11-07

Family

ID=66440052

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/061520 WO2019211484A1 (en) 2018-05-04 2019-05-06 Non-contact multispectral measurement device with improved multispectral sensor

Country Status (2)

Country Link
TW (1) TW202006341A (en)
WO (1) WO2019211484A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114485942A (en) * 2022-02-16 2022-05-13 南京大学 Hyperspectral registration method and imaging system thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060114461A1 (en) * 2004-12-01 2006-06-01 Boon-Keat Tan Arrangement of color filters for characterizing the color of an aggregate light
US20080180665A1 (en) * 2007-01-31 2008-07-31 Redman David J Measuring color spectra using color filter arrays
US8423080B2 (en) 2008-06-30 2013-04-16 Nokia Corporation Color detection with a mobile device
US9316539B1 (en) 2015-03-10 2016-04-19 LightHaus Photonics Pte. Ltd. Compact spectrometer
US20160224861A1 (en) 2015-01-30 2016-08-04 X-Rite Switzerland GmbH Imaging Apparatus, Systems and Methods


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Proceedings of the 21st annual conference on Computer graphics and interactive techniques (SIGGRAPH '94)", ACM, pages 239-246

Also Published As

Publication number Publication date
TW202006341A (en) 2020-02-01

Similar Documents

Publication Publication Date Title
US20210372920A1 (en) Handheld non-contact multispectral measurement device with position correction
CN108051374B (en) Handheld measuring device and method for capturing a visual impression of a measuring object
CN104471361B (en) Variable angle spectroscopic imaging measurement method and device therefor
US10931924B2 (en) Method for the generation of a correction model of a camera for the correction of an aberration
KR102007309B1 (en) Method and device for measuring the colour of an object
US10257509B2 (en) Apparatus and method for multi configuration near eye display performance characterization
TW201634906A (en) Benchmark light source device for correcting bright stars of the meter and correction method using the same
JP2014077792A (en) Method for determining calibration parameter of spectrometer
US20130208285A1 (en) Method and device for measuring the colour and other properties of a surface
CN113454429A (en) System and method for spectral interpolation using multiple illumination sources
US20180184053A1 (en) Compensating for vignetting
KR101539472B1 (en) Color distribution measuring optical system, color distribution measuring device, and color distribution measuring method
US20130154830A1 (en) System and apparatus for gloss correction in color measurements
WO2019211484A1 (en) Non-contact multispectral measurement device with improved multispectral sensor
Scheeline Smartphone technology–instrumentation and applications
EP3560184A1 (en) Compensating for vignetting
CN114234857A (en) Visible and infrared multi-optical-axis parallelism detection device and method
KR102022836B1 (en) Apparatus for measuring light, system and method thereof
CN216132666U (en) Calibration device and electronic equipment
CN110108359A (en) Spectrum calibration device and method
CN113155756B (en) Light spot online calibration method
KR102655377B1 (en) Apparatus and method for measuring spectral radiance luminance and image
US11582398B2 (en) Calibrating color measurement devices
US20170082492A1 (en) Device for measuring colour properties
CN112014069B (en) Imaging measuring device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19722596

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019722596

Country of ref document: EP

Effective date: 20201204

122 Ep: pct application non-entry in european phase

Ref document number: 19722596

Country of ref document: EP

Kind code of ref document: A1