GB2583905A - Method and system for generating optical spectra - Google Patents

Method and system for generating optical spectra

Info

Publication number
GB2583905A
GB2583905A
Authority
GB
United Kingdom
Prior art keywords
optical spectrum
computing device
spectrum
light
different colours
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1905717.3A
Other versions
GB201905717D0 (en)
Inventor
Wang Hui
Current Assignee
Ulster University
Original Assignee
Ulster University
Priority date
Filing date
Publication date
Application filed by Ulster University filed Critical Ulster University
Priority to GB1905717.3A priority Critical patent/GB2583905A/en
Publication of GB201905717D0 publication Critical patent/GB201905717D0/en
Priority to PCT/EP2020/061536 priority patent/WO2020216938A1/en
Publication of GB2583905A publication Critical patent/GB2583905A/en

Classifications

    • G PHYSICS — G01 MEASURING; TESTING — G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02 Details
    • G01J3/0256 Compact construction
    • G01J3/0272 Handheld
    • G01J3/10 Arrangements of light sources specially adapted for spectrometry or colorimetry
    • G01J3/28 Investigating the spectrum
    • G01J3/2803 Investigating the spectrum using photoelectric array detector
    • G01J3/2823 Imaging spectrometer
    • G01J3/30 Measuring the intensity of spectral lines directly on the spectrum itself
    • G01J3/32 Investigating bands of a spectrum in sequence by a single detector
    • G01J3/42 Absorption spectrometry; Double beam spectrometry; Flicker spectrometry; Reflection spectrometry
    • G01J2003/2826 Multispectral imaging, e.g. filter imaging
    • G01J2003/425 Reflectance

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

A target object, e.g. a food item, is illuminated with different colours of light in sequence by device display 164, e.g. a smartphone or tablet. Digital video of the corresponding received light sequence, reflected from the target object to video camera 152, is created. Frame images are obtained from the video, with spectral values derived from pixels, preferably in the same location, in each frame. The optical spectrum thus obtained may be classified, e.g. at server 12A, into categories using known spectra possibly stored on database 16. The method may allow food authentication by consumers using a mobile application, avoiding food frauds, using a non-specialised device without providing additional equipment e.g. a prism. At least 50 different colours may be used and may be rendered over a time of between 3 and 15 seconds. Properties of the optical spectrum may be represented as a feature vector and compared.

Description

Intellectual Property Office Application No. GB1905717.3 RTM Date: 24 October 2019. The following terms are registered trade marks and should be read as such wherever they occur in this document: Wi-Fi, Bluetooth. Intellectual Property Office is an operating name of the Patent Office (www.gov.uk/ipo).

Method and System for Generating Optical Spectra
Field of the Invention
The present invention relates to optical spectrometry, and in particular to imaging spectrometry.
Background to the Invention
Optical spectrometers are scientific instruments widely used in the sciences to determine the chemical composition of materials, based on the fact that different elements (i.e., different configurations of protons/electrons) absorb/emit light of different wavelengths.
A conventional optical spectrometer splits light into an array of separate colours, called a light spectrum, which is then recorded as a spectrum vector representing light intensities at different colours (i.e., wavelengths). Conventional spectrometers typically include diffraction gratings or prisms for creating the light spectrum, and CCD sensors for recording the light spectrum as a spectrum vector. Such equipment is not normally found in consumer products such as mobile (cellular) telephones or computers and so spectrometers tend to be specialised instruments.
It is known to adapt generic devices, such as mobile phones, for use as a spectrometer by modifying the hardware of the device, or providing additional hardware for connection to the device. It would be desirable however to allow a non-specialised device to operate as a spectrometer using non-modified hardware and without having to provide additional equipment.
Summary of the Invention
A first aspect of the invention provides a method of generating an optical spectrum for a target object, the optical spectrum comprising a plurality of spectral values, the method comprising: illuminating, using a display of a computing device that conveniently has a video camera and said display screen facing in the same direction, the target object with a plurality of different colours of light one at a time in succession; receiving, at said video camera, a corresponding sequence of coloured light reflected from the target object; creating a digital video of said received light sequence; obtaining a plurality of frame images from said digital video, each frame image comprising a plurality of pixels; deriving a respective one of said spectral values from a selected pixel from each frame image.
A second aspect of the invention provides a system for generating an optical spectrum for a target object, the optical spectrum comprising a plurality of spectral values, the system comprising a computing device having a video camera and a display, preferably facing in the same direction as each other, the computing device further comprising: means for causing the display to illuminate the target object with a plurality of different colours of light one at a time in succession; and means for causing said video camera to simultaneously create a digital video of a corresponding sequence of coloured light reflected from the target object, the system further including: means for obtaining a plurality of frame images from said digital video, each frame image comprising a plurality of pixels; and means for deriving a respective one of said spectral values from a selected pixel from each frame image.
From a third aspect the invention provides a method of generating an optical spectrum for a target object, the optical spectrum comprising a plurality of spectral values, the method comprising: illuminating the target object with a plurality of different colours of light one at a time in succession, preferably using a display of a computing device; receiving, at a video camera, a corresponding sequence of coloured light reflected from the target object, the video camera preferably being provided on the same computing device as said display, and preferably facing in the same direction as the display; creating a digital video of said received light sequence; obtaining a plurality of frame images from said digital video, each frame image comprising a plurality of pixels; deriving a respective one of said spectral values from a selected pixel from each frame image.
Preferably, the respective selected pixel for each frame image has the same or substantially the same location in the respective frame image.
Typically, each pixel has respective red, green and blue (RGB) channel values, and wherein said deriving involves calculating the respective spectral value from the respective RGB channel values. Said calculating preferably involves calculating an average, preferably a weighted average, of said RGB channel values. Optionally, a weighted average of said RGB channel values with a weighting of 3:6:1 is used.
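The weighted-average calculation described above can be sketched as follows; a minimal illustration of collapsing a pixel's RGB channel values into a single spectral value using the 3:6:1 weighting mentioned in the text (the function name is an assumption, not from the patent):

```python
def spectral_value(r, g, b, weights=(3, 6, 1)):
    """Collapse one pixel's RGB channel values into a single spectral value.

    The 3:6:1 red:green:blue weighting given in the text roughly reflects
    the eye's greater sensitivity to green, similar to common luma formulas.
    """
    wr, wg, wb = weights
    return (wr * r + wg * g + wb * b) / (wr + wg + wb)
```

For example, a pixel with channel values (100, 200, 50) yields (3×100 + 6×200 + 1×50) / 10 = 155.0.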
Said obtaining may involve obtaining at least one frame image for each of said different colours of light, or obtaining a respective frame image for at least some of said different colours of light.
Preferably, said plurality of different colours of light include at least one colour from each of a plurality of wavelength bands, said wavelength bands including a red band, a green band and a blue band, and preferably also from any one or more of an orange, yellow, indigo and violet band.
In preferred embodiments, said illuminating involves causing a computing device, preferably a smartphone or other hand-held computing device, to illuminate the target object using a display of the computing device.
Typically said computing device supports a colour gamut for said display, and wherein said plurality of different colours of light include at least one colour from each of a plurality of wavelength bands of said gamut, said wavelength bands including a red band, a green band and a blue band, and preferably also from one or more of an orange band, a yellow band, an indigo band and a violet band. Said plurality of different colours may comprise all of the colours of the supported colour gamut, or at least 50 different colours, preferably between 100-500 different colours, or optionally more colours.
In preferred embodiments, said illuminating involves rendering said plurality of different colours of light over a time period of between 3 and 15 seconds, preferably between 5 and 10 seconds.
A fourth aspect of the invention provides a spectrometry method comprising generating an optical spectrum for a target object using the method of the first or third aspects of the invention, the spectrometry method further including classifying said optical spectrum into one or more categories, preferably using a plurality of known optical spectra.
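The patent does not prescribe a particular classification method, only that the optical spectrum is represented as a feature vector and compared against known spectra. One minimal sketch, under that assumption, is nearest-neighbour matching with Euclidean distance (the function name, metric, and category labels are all illustrative):

```python
import math

def classify_spectrum(spectrum, known_spectra):
    """Assign a spectrum to the category of the closest known spectrum.

    `known_spectra` maps category labels to reference spectra, each an
    equal-length list of spectral values. Euclidean distance is one
    plausible comparison metric; the patent leaves the choice open.
    """
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    return min(known_spectra, key=lambda label: distance(spectrum, known_spectra[label]))
```

In the food-authentication scenario described below, the reference spectra would be the "spectral fingerprints" stored in the database 16, and the returned label would indicate the matching product (or a failure to match any registered product).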
A fifth aspect of the invention provides a computing device having a video camera and a display, conveniently facing in the same direction as each other, said computing device comprising: means for causing said display to illuminate a target object with a plurality of different colours of light one at a time in sequence; and means for causing said video camera to simultaneously create a digital video of a corresponding sequence of coloured light reflected from the target object.
The computing device may include means for obtaining a plurality of frame images from said digital video, each frame image comprising a plurality of pixels; and means for deriving a respective one of said spectral values from at least one pixel from each frame image.
A sixth aspect of the invention provides a spectrometry system comprising a computing device of the fifth aspect of the invention, and means for classifying said optical spectrum into one or more categories, preferably using a plurality of known optical spectra.
Preferred embodiments of the invention may be used for screening agri-food samples in order to identify suspected food frauds before full investigations take place. For example, a consumer may authenticate agri-food items on-site in terms of their origins and production methods, and generate alerts if samples are not authenticated. Such a screening system puts ordinary consumers at the first line of defence against food frauds.
Preferred embodiments of the invention may be suitable for use with a web service to promote quality food products and to create brand/identity, where food producers register themselves, their food products and the spectral fingerprints of these food products. Consumers may authenticate food products using suitable embodiments of the invention and the spectral fingerprints stored at the web service.
Other advantageous aspects of the invention will be apparent to those ordinarily skilled in the art upon review of the following description of a specific embodiment and with reference to the accompanying drawings.
Brief Description of the Drawings
An embodiment of the invention is now described by way of example and with reference to the accompanying drawings in which:

Figure 1 is a schematic view of a spectrometry system embodying one aspect of the invention, the system including a spectrometry-enabled computing device embodying another aspect of the invention;

Figure 2 is a block diagram of an embodiment of the spectrometry-enabled computing device of Figure 1; and

Figure 3 is a sequence diagram illustrating exemplary operation of the system and device of Figure 1.
Detailed Description of the Drawings
Referring now to Figure 1 of the drawings there is shown a spectrometry system 10 embodying one aspect of the invention. As is described in more detail below, the system 10 is configured to perform imaging spectroscopy. The system 10 includes a computing device 20 embodying another aspect of the invention. In preferred embodiments, the device 20 comprises a mobile (cellular) telephone, preferably a smartphone.
The system 10 further includes at least one server, which typically comprises one or more server applications supported by one or more server computers configured to provide data, and/or downloadable computer program(s) and/or to provide data processing, as required. In this example the system 10 comprises a data server 12A and a web server 12B. The data server 12A includes a database 16 storing data, such as spectral data, as is described in more detail hereinafter. The data server 12A is optionally configured to perform data processing, for example spectral classification, as is described in more detail below. The web server 12B provides a computer program marketplace.
The device 20 and the server(s) 12A, 12B are capable of communication with one another via a telecommunications network 14, typically comprising any one or more of: a global computer network (e.g. the Internet); a data communications network; a mobile (cellular) telephone network; and/or other telephone network e.g. a public standard telephone network. In preferred embodiments, the server(s) 12A, 12B and the device 20 are equipped with the necessary hardware and software to enable them to communicate with each other across the network as required. The system 10 may be said to support the device 20 in that it may receive data and computer programs from the servers 12A, 12B, particularly for the purposes of any one or more of downloading, configuration, updating, data processing and diagnostics, as required.
In preferred embodiments, the device 20 is a computing device comprising a programmable processor, typically a microprocessor, computer memory and one or more communications devices for allowing the device 20 to communicate with other devices including the servers 12A, 12B. Conveniently, the device 20 comprises a portable, preferably hand-held, computing device.
The computing device 20 includes a camera 152, in particular a video camera, the lens 153 of which is shown in Figure 1. The computing device 20 also includes an electronic visual display device 164, which may also be referred to as a display, or video display, and typically comprises a screen 165. By way of example the display device may comprise an LED display, also commonly referred to as an LED video display. The display may be of any known type, including OLED, AMOLED and LED based displays. The display 164 and the camera 152 (or more particularly the lens 153 of camera 152) face in the same direction, e.g. they may both be said to be front-facing with respect to the device 20. The arrangement is such that at least some light emanating from display 164 and reflected from an object in front of the display 164 is in the field of view of the camera 152. It will be understood that the camera 152 may have other lens(es) that are not necessarily front-facing.
In preferred embodiments, the computing device 20 includes a mobile computing platform supporting a mobile computing operating system (OS). The computing device 20 preferably includes communication device(s) for communicating with other devices via a mobile (cellular) telephone network. Conveniently, the computing device 20 is a mobile (cellular) telephone (e.g. a smartphone), tablet computer or other hand-held computer.
In the illustrated embodiment, the computing device 20 is a smartphone. In addition to having mobile (cellular) telephony capabilities, smartphones are internet enabled, i.e. capable of communicating with other devices, e.g. servers 12A, 12B, across the internet, for example using GSM (Global System for Mobile Communications) technology. This can be achieved using a mobile (cellular) telephone network. Smartphones are typically also equipped with other telecommunications devices, e.g. a WiFi transceiver, to allow communication across the internet via other means, e.g. a public standard telephone network or data network. Smartphones typically also have a video camera and LED display facing in the same direction. Other hand-held computing devices, such as tablet computers, may be similarly equipped.
Referring now to Figure 2 there is shown a block diagram of an exemplary embodiment of the computing device 20, which may conveniently be a smartphone, showing only those components that are helpful for understanding the invention. The device 20 includes the camera 152 and the display device 164. The device 20 includes one or more processors 158, typically suitably programmed microprocessor(s), and computer memory 160 (including in this example RAM and flash memory for software application storage) for use by the processor(s) 158. The processor(s) 158 are configured to control, or otherwise co-operate with, a display driver 162 which drives the display device 164, and typically also a camera driver for operating the camera 152. Typically, the display device 164 comprises a touch screen 165, enabling the display device 164 to serve as a user input device as well as an output device. One or more keys 167 may be provided for user input.
The device 20 further includes communications devices 161 for communicating with other devices such as the server(s) 12A, 12B. The communications devices typically include a mobile (cellular) data connection device 168, e.g. a GSM device, and a WiFi device 172. Other communications devices such as a USB device 166 or other port device, a Bluetooth device 170 and a Near Field Communication (NFC) device 174, may be provided. One or more antennas and/or ports (not shown) may be provided in association with the respective communications device(s) as required. A power supply 176, typically a battery, is also provided.
The computing device 20 supports spectrometry application software 92 (hereinafter referred to as the mobile application) comprising one or more computer programs for causing the computing device 20 to perform spectrometry-related functions as described in more detail hereinafter (and so to render the device 20 "spectrometry-enabled"). In particular, the spectrometry-related functions relate to imaging spectrometry.
In preferred embodiments, the mobile application 92 is configured to control, facilitate and/or support, as required, any one or more of: communication between the computing device 20 and the server(s) 12A, 12B; the provision and operation of a user interface (which typically includes rendering information to the user (e.g. via the display 164) and handling user inputs (e.g. via keys 167 and/or touch screen 165)); and performing spectrometry related functions, which may include causing the display device 164 to produce an illumination spectrum, causing the camera 152 to record resulting reflected light as a spectrum video, and analysing the spectrum video or causing the spectrum video to be analysed. The functionality supported by the mobile application 92 may vary from embodiment to embodiment.
Advantageously, the mobile application 92 can be downloaded from a server (assumed to be web server 12B in this example) across network 14. It may conveniently be stored in local memory 160 and executed by processor(s) 158.
Conventional spectrometers use a device such as a prism or diffraction grating to split light in order to create a light spectrum for illuminating a target object. Such spectrometers may be described as separative spectrometers because they separate light into an array of colours.
In contrast, for illuminating a target object, the computing device 20 generates a light spectrum output comprising a sequence of successive different colours (i.e. light, typically but not necessarily visible light, of different wavelengths representing different colours). As such the device 20, when acting as a spectrometer (or as part of a spectrometry system), may be described as a generative spectrometer in that it generates a light spectrum comprising a sequence of different colours. The light spectrum output is rendered by the display device 164. In particular, the mobile application 92, when running on the processor(s) 158 causes the display driver 162 to drive the display device 164 to render the light spectrum output.
The light spectrum output may comprise, in a sequence, all of the colours (which may be referred to as the colour gamut) that the computing device 20 is capable of rendering via its display 164, or at least a plurality of colours representing the full colour range of the gamut. This may be described as generating a full spectrum of the available light colours, or at least using at least one colour from all bands of the available spectrum (which bands may for example be designated as red, orange, yellow, green, blue, indigo and violet). The colour gamut for the display 164 may for example correspond with the sRGB (standard Red Green Blue) colour gamut or other supported known colour gamut. Alternatively, the light spectrum output may comprise a sequence of only some of the available colours, although it is preferred that at least one red, at least one green and at least one blue colour is included, wherein in this context "red" refers to a wavelength band that covers more than one specific red colour (i.e. including shades), "blue" refers to a wavelength band that covers more than one specific blue colour, and "green" refers to a wavelength band that covers more than one specific green colour. More preferably, the sequence output additionally comprises any one or more of: at least one orange, at least one yellow, at least one indigo and at least one violet colour, wherein in this context "orange", "yellow", "indigo" and "violet" refer to the respective wavelength band that covers more than one specific respective colour (i.e. including shades of the respective colour). In preferred embodiments, the light spectrum output comprises at least 50 different colours, more preferably between 100-500 different colours, although more or fewer colours may be used. The colours may for example be rendered at a rate of 5 to 100 per second, although slower or faster rates may be used.
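One simple way to produce such a colour sequence, sweeping the bands from red towards violet in wavelength order, is to step the HSV hue at full saturation and value. The patent does not mandate any particular parameterisation; this is an illustrative sketch only:

```python
import colorsys

def colour_sequence(n=300):
    """Generate n display colours sweeping from red towards violet.

    Sweeping the HSV hue from 0 (red) to ~0.83 (violet) covers the red,
    orange, yellow, green, blue, indigo and violet bands named in the
    text. Returns (R, G, B) triples in the 0-255 range, suitable for
    driving an RGB display one colour at a time.
    """
    colours = []
    for i in range(n):
        hue = 0.83 * i / (n - 1)  # 0 = red, ~0.83 = violet
        r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
        colours.append((round(r * 255), round(g * 255), round(b * 255)))
    return colours
```

A sequence of 300 colours rendered at 30 colours per second would occupy the 10-second window discussed below; the first element is pure red (255, 0, 0) and subsequent elements move monotonically through the spectrum.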
Conveniently, the colours are rendered in a sequence corresponding to increasing or decreasing wavelength, although other sequences may be used. Typically, the colours correspond to the visible spectrum, although it is possible that the colour gamut of some devices 20 includes colours (light) corresponding to non-visible portions of the electromagnetic spectrum, e.g. infra red (IR) or ultraviolet (UV) light.
Causing the display device 164 to render each desired colour involves controlling the colour pixels (not shown) of the display device 164 to produce light of the desired colour at the desired time. Preferably, all of the available pixels, i.e. the entire display, are controlled to produce the desired colour at the desired time. Alternatively, only part of the display may display the desired colour, in which case the remaining portion of the display is preferably caused to be black or grey, i.e. non-coloured.
In typical embodiments, lighting the display device 164 with the desired colours is performed in accordance with an RGB colour space supported by the display device 164. As such, the display device 164 supports the generation of light of different colours encoded by red (R), green (G) and blue (B) channels. The relative intensities of the RGB channels determine the rendered colour. The pixels of the display comprise corresponding R, G and B light sources (e.g. red, green and blue LEDs in the case of an LED display) that are illuminated in accordance with the RGB channel signals.
In use, the device 20 is held close to a target object, for example a food item (not shown), so that the target is illuminated by the light spectrum output. The light reflected from the target object is captured by the camera 152. In particular, the camera 152 is operated in video capture mode and records a video, which may be referred to as a spectrum video, capturing the reflected light. The video is recorded simultaneously with the illumination of the target object, conveniently under the control of the mobile application 92. The video is a digital video and may be stored as a digital video file in any supported video file format.
The camera 152 has a digital image sensor (not shown), for example a CCD or CMOS image sensor, comprising an array of light sensitive pixels that detect the reflected light to produce corresponding output signals that are used to create the spectrum video. Typically, the camera 152 supports the detection of coloured light in accordance with an RGB colour space, which may or may not be the same as the RGB colour space supported by the display device 164, and so detects coloured light using R, G and B channels. Each pixel may comprise one or more colour sensitive photodiode, or other photodetector, and typically supports the detection of coloured light using R, G and B channels.
In use, the intensity of RGB channels of the display 164 is controlled to cause the display 164 to emit specific light colours, one at a time in accordance with the light spectrum output sequence, and the camera 152 receives the corresponding reflected light, one colour at a time in the same sequence. The camera 152 detects the coloured light as respective RGB channel intensities. Typically, each pixel of the camera's image sensor produces an output signal comprising RGB channel intensities corresponding to the detected light at any given time. The light spectrum output sequence is rendered by the display device 164 over a pre-determined period of time, and the camera 152 records the corresponding spectrum video over a coincident or substantially coincident period of time.
The time delay between a colour being emitted by the display 164 and the corresponding colour (likely modified as a result of absorption by the target object) being received by the camera 152 is close to zero; in other words, the two events are virtually instantaneous. However, the time delay between the mobile application 92 instructing the display driver 162 to display a given colour and the colour actually being displayed on the display 164 is longer. Therefore, there may be delay between the mobile application 92 initiating a colour change and it actually occurring. Similarly, there may be delay between any coloured light arriving at the camera's lens and the coloured light being recorded in the spectrum video. Therefore the time period over which the light spectrum output sequence is rendered should be long enough to ensure that the corresponding light can be detected and recorded by the camera 152, while not being so long as to produce significant duplicate (unnecessary) data. It is preferred therefore to render the light spectrum output sequence over a time period of a few seconds, for example between 3 and 15 seconds, preferably between 5 and 10 seconds. By way of example, in an embodiment where 300 colours are rendered as the light output spectrum, they may be rendered at a rate of 30 colours per second over a 10 second period, or at 60 colours per second over a 5 second period.
Higher intensities of output light colour produce a clearer signal and allow the target to be placed further from the display 164 and still be affected meaningfully by the changes in colour of the light spectrum output sequence. However, too high intensity can saturate the corresponding colour channel in the resulting video, thus losing data. It is therefore preferred to balance the intensity such that the resulting video of the illuminated target goes through as much of the available colour depth as possible while not over-saturating any of the channels. Therefore, while the maximum light output intensity of some devices 20 may be suitable and preferred, for other devices 20 a less than maximum intensity setting may be preferred.
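The intensity-balancing requirement described above can be checked programmatically: a frame is usable when its sampled pixels neither saturate nor crush any RGB channel. A minimal sketch, with illustrative thresholds not taken from the patent:

```python
def well_exposed(frame_pixels, low=5, high=250):
    """Check whether sampled pixels use the colour depth without clipping.

    `frame_pixels` is an iterable of (R, G, B) triples sampled from one
    video frame. Returns False if any channel is saturated (>= high) or
    crushed (<= low), which would lose spectral information; the low/high
    thresholds here are illustrative assumptions.
    """
    for r, g, b in frame_pixels:
        if max(r, g, b) >= high or min(r, g, b) <= low:
            return False
    return True
```

A calibration step could lower the display intensity and retry whenever this check fails, implementing the per-device tuning the text describes.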
The spectrum video is processed to extract a spectrum representing the target object. The extracted spectrum comprises at least one spectral value for each of a plurality of colours, preferably for each colour of the light spectrum output sequence. The extracted spectrum may be described as a generative spectrum in that it is constructed by combining a plurality of individually determined spectral values.
Conveniently, but not necessarily, the spectral values are arranged in a sequence, for example a sequence corresponding to increasing or decreasing wavelength, although other sequences may be 35 used. The sequence is conveniently the same as the colour sequence of the light spectrum output of the display device 164.
In order to determine the spectral values from the spectrum video, i.e. to extract the target object spectrum, one or more video analytic algorithms may be used. Spectrum extraction from the video, 40 and any other video pre-processing that may be required, may be performed by the computing device 20 (e.g. by the mobile application 92 running on the processor(s) 158) or may be performed at the server 12A, as is convenient. If performed at the server 12A, the spectrum video file is sent to the server 12A by the device 20.
The spectrum video is converted into a plurality of frames, each frame comprising an image from the spectrum video. At least one frame is provided for a plurality of colours, optionally for each colour of the light spectrum output sequence. It is preferred that at least one frame is obtained for each of the colour bands designated as red, green and blue, and preferably also for any one or more of the colour bands orange, yellow, indigo and violet. The number of frames provided for each relevant colour may depend on the rate at which colours are rendered in the light spectrum output sequence and on the rate at which frames are taken from the spectrum video. For example, frames may be extracted from the spectrum video at a rate of, say, 30 frames per second. Assuming for example that the light spectrum output sequence is rendered over a period of 10 seconds, then 300 frames are obtained from which to create the extracted spectrum. The colour associated with each frame is dependent on the colour composition of the light spectrum output sequence and on the rate at which the corresponding colours are rendered. For example, in an embodiment where 300 colours are rendered as the light output spectrum at a rate of 30 colours per second over a 10 second period, the corresponding 10 second spectrum video has 300 colours, 30 colours in each second of video footage. If the frames are extracted at a rate of 30 per second, then a frame for each of the 300 different colours can be extracted, one colour per frame. Alternatively, if 300 colours are rendered at 60 colours per second over a 5 second period, the corresponding 5 second spectrum video has 60 colours in each second of video footage. If the frames are extracted at a rate of 30 per second, then a frame for 150 of the 300 rendered colours can be extracted.
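The mapping from extracted frame to rendered colour implied by the example rates above can be expressed as follows (an illustrative sketch; the function name is arbitrary):

```python
def colour_index_for_frame(frame_i, frame_rate, colour_rate):
    """Index of the colour being rendered when frame `frame_i` is captured,
    given the frame extraction rate and the colour rendering rate
    (both expressed per second)."""
    return (frame_i * colour_rate) // frame_rate
```

With 30 colours per second and 30 frames per second the mapping is one colour per frame; with 60 colours per second and 30 frames per second every other colour is captured, so 150 of the 300 rendered colours are obtained.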
The frame images may be extracted from the spectrum video using any conventional means. For example the MATLAB (trade mark) computing tools provide a function for extracting frames from video files, although any other convenient video frame extraction process or tool may be used. Frame extraction may be performed by the computing device 20 (e.g. by the mobile application 92 running on the processor(s) 158) or may be performed at the server 12A, as is convenient.
Each frame image comprises a plurality of pixels. At least one pixel of each frame image is selected for creating a corresponding spectral value for the extracted spectrum. The, or each, selected pixel of each frame image has associated RGB channel values (the respective values being determined by the light reflected by the target and captured by the camera 152). For the, or each, selected pixel of each frame image, the respective RGB channel values are converted into a corresponding single representative value for the pixel. Preferably, the single representative value is a greyscale value. This may be performed by any suitable technique. In preferred embodiments, the single representative value is calculated as an average, preferably a weighted average, of the RGB channel values. For example, an RGB weighting of 3:6:1 (3 red, 6 green, 1 blue) may be used, in which case the single (spectral) value s may be calculated as s = 0.21R + 0.72G + 0.07B.
Other weightings may be used, preferably weightings that represent relative human perception of the colours. Alternatively, a simple average of the RGB values may be used as the single value. Alternatively still, the most and least prominent colours may be averaged, for example by (max(R, G, B) + min(R, G, B))/2. In any event, the single representative value for the pixel may be used as a spectral value for the extracted spectrum.
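The three candidate conversions described above may be sketched as follows (illustrative only; the default weights approximate the 3:6:1 weighting given earlier):

```python
def weighted_value(rgb, weights=(0.21, 0.72, 0.07)):
    """Weighted average of the RGB channels: s = 0.21R + 0.72G + 0.07B."""
    r, g, b = rgb
    wr, wg, wb = weights
    return wr * r + wg * g + wb * b

def mean_value(rgb):
    """Simple average of the three channel values."""
    return sum(rgb) / 3.0

def mid_range_value(rgb):
    """Average of the most and least prominent channels:
    (max(R, G, B) + min(R, G, B)) / 2."""
    return (max(rgb) + min(rgb)) / 2.0
```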
In a preferred embodiment, a single pixel is selected from each frame image for deriving the spectral value for the respective frame, the pixel of each frame being in the same, or substantially the same, location with respect to its frame. In embodiments, where more than one pixel is used from each 10 frame, each pixel is used to produce a respective spectral value for the frame.
The, or each, selected pixel is preferably selected from a corresponding, i.e. the same or substantially the same, location of the respective frame image. For example, the pixels of each frame image are typically arranged in an array (e.g. a 2 dimensional array), the selected pixel(s) being chosen from the same location(s) in the respective pixel array. In embodiments where more than one pixel is used from each frame, the pixels are preferably adjacent to each other in the respective frame, e.g. immediately adjacent or otherwise close to each other. Even in embodiments where only one pixel per frame image is selected, it is not essential that the selected pixels have an identical location in their respective frames (although it is preferred that they do); more generally the respective pixels should have locations that are all close to each other in the respective frame, i.e. the pixels are at substantially the same pixel location in the respective frame, e.g. pixel locations that are at or adjacent a nominated pixel location, or spaced from the nominated pixel location by a small number (e.g. up to 50 but preferably fewer than 10) of pixel locations.
For example, in an embodiment where one pixel is selected from each frame image, the RGB channel values for the corresponding pixel in each frame image are used to produce (e.g. by means of a weighted average) a respective spectral value for the frame image, and the respective spectral value for each frame image together form the extracted spectrum.
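Combining the selected-pixel and weighted-average steps, a minimal sketch of spectrum assembly follows, assuming (for illustration only) that each frame image is held as a 2-D array of (R, G, B) tuples:

```python
def extract_spectrum(frames, row, col):
    """Build the extracted spectrum by taking the pixel at a fixed
    (row, col) location in every frame image and collapsing its RGB
    channel values to a single spectral value via the weighted average."""
    spectrum = []
    for frame in frames:
        r, g, b = frame[row][col]
        spectrum.append(0.21 * r + 0.72 * g + 0.07 * b)
    return spectrum
```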
Digital images are usually affected by noise from the imaging instrument and the external environment during digitisation and transmission. It is preferred therefore to pre-process the frame images for image denoising prior to extracting the spectrum. Any conventional image denoising technique can be used. For example, a digital median filter may be applied to the frame image data for image denoising.
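A minimal 3x3 median filter of the kind referred to above may be sketched as follows (border pixels are simply copied for brevity; a production implementation would typically use a library routine):

```python
def median_filter_3x3(image):
    """Apply a 3x3 median filter to a 2-D list of greyscale values,
    suppressing impulse noise while preserving edges."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]  # border pixels copied unchanged
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(
                image[y + dy][x + dx]
                for dy in (-1, 0, 1)
                for dx in (-1, 0, 1)
            )
            out[y][x] = window[4]  # median of the 9 neighbourhood values
    return out
```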
It is noted that computing devices such as smartphones may support different video formats. Some video formats use lossy compression of video frames based around image features that are visually perceptible to humans. This is undesirable and parameters such as duration and intensity may need to be adjusted to compensate for lossy compression. The mobile application 92 may be calibrated to suit the respective computing device, e.g. smartphone, with which it is to be used such that the extracted spectrum is standardised for any given target object. The calibration may be performed using any conventional calibration technique, typically using a standard reference against which duration and intensity parameters can be calibrated.
The extracted spectrum is a representation of the target object. Information about the target object can therefore be obtained by examining its extracted spectrum. In preferred embodiments, the extracted spectrum is processed by classifying it using a database of known reference spectra. In particular, conventional machine learning algorithms may be used to classify the extracted spectrum based on the known spectra provided by the database. The known spectra may be used as training data from which a matching mathematical model may be derived. The matching model may be used to classify the extracted spectrum. For example, the known spectra may be paired, each pair having a matching spectrum and a non-matching spectrum for a given classification category. This provides a two class training dataset that can be used to create the matching model. Conventional classification and/or training algorithms may be used for this purpose, for example SVM (support vector machine) algorithms or other supervised learning algorithms. Alternatively the k-NN (k-Nearest Neighbours) algorithm, or other pattern recognition algorithms, may be used to classify the extracted spectrum using the known spectra. Classification may involve performing feature extraction on the extracted spectrum. Any conventional feature extraction algorithm(s) may be used for this purpose. It is preferred to use one or more chemometric algorithms for feature extraction, and any convenient conventional chemometric algorithm(s) may be used.
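By way of illustration, the k-NN approach mentioned above reduces, for k = 1 and Euclidean distance, to the following simplified sketch (a real deployment would use trained models and feature extraction rather than raw spectra):

```python
import math

def classify_1nn(extracted, references):
    """Nearest-neighbour classification of an extracted spectrum against
    labelled reference spectra (k-NN with k = 1, Euclidean distance).

    references: list of (label, spectrum) pairs; all spectra same length.
    """
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    best_label, _ = min(references, key=lambda ref: distance(extracted, ref[1]))
    return best_label
```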
In preferred embodiments, the extracted spectrum is classified using a subsumption technique to determine the similarity between the extracted spectrum and the known reference spectra. Each reference spectrum is defined by a set of core properties. The core properties may be determined from the respective reference spectrum in any suitable manner, usually by performing an analysis of the respective spectrum, for example performing any suitable feature extraction analysis of the spectrum. The core properties may therefore comprise or be derived from the extracted spectral features. Using the principle of subsumption, it is assumed that if the correspondingly derived properties of the extracted spectrum include all of the core properties of a given reference spectrum, then the extracted spectrum can be deemed sufficiently similar to the reference spectrum for the purposes of classification. Accordingly, the extracted spectrum is analysed, e.g. by corresponding feature extraction, to determine its properties, and these properties are then compared to the core properties of the reference spectra. The extracted spectrum can be identified as a subsumption of a reference spectrum by determining that its properties include the core properties of the reference spectrum. It is noted that the set of properties of the extracted spectrum typically includes additional properties as well as the core properties. For example, a reference spectrum of an organic apple may be obtained in a controlled environment and determined to comprise a set A of core properties. In testing, the device 20 may be used to obtain the extracted spectrum of a test apple and to determine that the extracted spectrum comprises a set B of properties. If set B contains the same properties as set A plus some additional information/properties in the form of, e.g. higher intensities at the same wavelengths or additional intensities at new wavelengths, it is determined that set B is a subsumption of set A.
By determining that set B is a subsumption of set A, it is established that the test apple is organic. Each set of properties is preferably represented as a feature vector. If the feature vector of a given reference spectrum is included in the feature vector of the extracted spectrum (i.e. obtained from the target object) then, by subsumption, the target object is classified in accordance with said reference spectrum, i.e. is deemed to match said reference spectrum for the purposes of classification. This is in contrast to alternative comparison techniques in which the equality of the respective feature vectors are compared.
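One plausible reading of the subsumption test above, with each property set modelled as a wavelength-to-intensity mapping (an assumed representation chosen purely for illustration), is:

```python
def subsumes(extracted, reference):
    """True if the extracted property set subsumes the reference's core
    properties: every core property must be present in the extracted set
    with at least the reference intensity (higher intensities and extra
    wavelengths in the extracted set are permitted)."""
    return all(
        wavelength in extracted and extracted[wavelength] >= intensity
        for wavelength, intensity in reference.items()
    )
```

In the apple example, if set A is the reference and set B (the test apple) contains every property of A plus additional ones, `subsumes(B, A)` holds and the test apple is classified as matching the reference.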
The characteristics of the RGB space supported by the camera 152 and the display 164 can vary from device to device, and therefore the characteristics of the extracted spectra may similarly be device dependent. Accordingly, the analysis, including specific machine learning models, may be specific to specific computing devices. Optionally, calibration may be performed so that different types of computing device support a common RGB space and such that the classification and other machine learning models and data are not device dependent.
Figure 3 shows an exemplary overview of the processing that may be performed once the spectrum video has been recorded by the camera 152. The mobile application 92 obtains the video file (spectrum video) captured by the camera 152 and causes it to be processed by a processing algorithm to produce the extracted spectrum. This processing may be performed at the device 20 (e.g. by the mobile application 92) or at the server 12A as is convenient. The extracted spectrum is processed by one or more classification algorithms, which processing involves obtaining known spectra from database 16, and classifying the extracted spectrum using the known spectra to determine a best match spectrum (or category) for the extracted spectrum, preferably using a subsumption comparison technique as described above. Optionally a single best match may be returned, although multiple best matches may alternatively be returned e.g. one for each of a number of matching categories. The best matches may then be used to obtain corresponding data from the database 16, which data may provide information that is assumed to be relevant to the target object, e.g. information identifying one or more characteristics of the target object such as its type and/or origin, or which may simply identify one or more categories to which the extracted spectrum, and therefore the target object, is deemed to match. The target object data is provided to the mobile application 92, which may render it to a user of the device 20 via the user interface.
In some embodiments, all of the processing of the video file, including classification of the extracted spectrum and obtaining corresponding target object data, may be performed on the computing device 20 (in which case the database 16 is provided on the computing device 20). However, it is advantageous to perform at least some of the processing at the server 12A, or at least to provide the database 16 at the server 12A. One reason for this is that the server 12A typically has more processing power than the computing device 20 and may therefore be capable of more sophisticated classification. Another advantage is that the database 16 can be updated using data gathered from multiple sources, which improves performance for all computing devices that can communicate with the server 12A. For example, it is preferred that at least the processing of the extracted spectrum, including classification, is performed at the server 12A.
Embodiments of the invention have many potential applications. One application is food authentication by consumers, where consumers can use the mobile application 92 to check if a food item is what it purports to be, e.g. in terms of production method (e.g., organic food) and origin.
The invention is not limited to the embodiment(s) described herein but can be amended or modified without departing from the scope of the present invention.

Claims (1)

CLAIMS:

1. A method of generating an optical spectrum for a target object, the optical spectrum comprising a plurality of spectral values, the method comprising:
illuminating, using a display of a computing device, the target object with a plurality of different colours of light one at a time in succession;
receiving, at a video camera of said computing device, a corresponding sequence of coloured light reflected from the target object;
creating a digital video of said received light sequence;
obtaining a plurality of frame images from said digital video, each frame image comprising a plurality of pixels; and
deriving a respective one of said spectral values from a selected pixel from each frame image.

2. The method of claim 1, wherein the respective selected pixel for each frame image has the same or substantially the same location in the respective frame image.

3. The method of claim 1 or 2, wherein each pixel has respective red, green and blue (RGB) channel values, and wherein said deriving involves calculating the respective spectral value from the respective RGB channel values.

4. The method of claim 3, wherein said calculating involves calculating an average, preferably a weighted average, of said RGB channel values.

5. The method of claim 4, wherein said calculating involves calculating a weighted average of said RGB channel values with a weighting of 3:6:1.

6. The method of any preceding claim, wherein said obtaining involves obtaining at least one frame image for each of said different colours of light, or obtaining a respective frame image for at least some of said different colours of light.

7. The method of any preceding claim, wherein said plurality of different colours of light include at least one colour from each of a plurality of wavelength bands, said wavelength bands including a red band, a green band and a blue band, and preferably also from any one or more of an orange, yellow, indigo and violet band.

8. The method of any preceding claim, wherein said computing device supports a colour gamut for said display, and wherein said plurality of different colours of light include at least one colour from each of a plurality of wavelength bands of said gamut, said wavelength bands including a red band, a green band and a blue band, and preferably also from one or more of an orange band, a yellow band, an indigo band and a violet band.

9. The method of claim 8, wherein said plurality of different colours comprises all of the colours of the supported colour gamut, or at least 50 different colours, preferably 100-500 different colours.

10. The method of any preceding claim, wherein said illuminating involves rendering said plurality of different colours of light over a time period of between 3 and 15 seconds, preferably between 5 and 10 seconds.

11. The method of any preceding claim, wherein said computing device is a smartphone or other hand-held computing device.

12. The method of any preceding claim, wherein said display and said video camera of said computing device face in the same direction.

13. A spectrometry method comprising generating an optical spectrum for a target object using the method of any one of claims 1 to 12, the spectrometry method further including classifying said optical spectrum into one or more categories, preferably using a plurality of reference optical spectra, wherein each category may be associated with one or more of said reference optical spectra.

14. The method of claim 13, wherein said classifying involves:
determining a set of properties of said optical spectrum;
comparing said set of properties to a respective set of properties of at least one of said reference optical spectra; and
classifying said optical spectrum according to a reference optical spectrum if said set of properties of said reference optical spectrum are included in said set of properties of said optical spectrum.

15. The method of claim 14, wherein said classifying involves:
representing said set of properties as a feature vector;
comparing the feature vector of said optical spectrum with the respective feature vector of at least one of said reference optical spectra; and
classifying said optical spectrum according to a reference optical spectrum if said feature vector of said reference optical spectrum is subsumed by said feature vector of said optical spectrum.

16. A system for generating an optical spectrum for a target object, the optical spectrum comprising a plurality of spectral values, the system comprising a computing device having a video camera and a display, the computing device further comprising:
means for causing the display to illuminate the target object with a plurality of different colours of light one at a time in succession; and
means for causing said video camera to simultaneously create a digital video of a corresponding sequence of coloured light reflected from the target object,
the system further including:
means for obtaining a plurality of frame images from said digital video, each frame image comprising a plurality of pixels; and
means for deriving a respective one of said spectral values from a selected pixel from each frame image.

17. The system of claim 16, wherein said display and said video camera are oriented on said computing device to face in the same direction as each other.

18. The system of claim 16 or 17, wherein said obtaining means and said deriving means are included in said computing device.

19. The system of any one of claims 16 to 18, wherein said computing device is a smartphone or other hand-held computing device.

20. A spectrometry system comprising a system as claimed in any one of claims 16 to 19, further including means for classifying said optical spectrum into one or more categories, preferably using a plurality of known optical spectra.
GB1905717.3A 2019-04-24 2019-04-24 Method and system for generating optical spectra Withdrawn GB2583905A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1905717.3A GB2583905A (en) 2019-04-24 2019-04-24 Method and system for generating optical spectra
PCT/EP2020/061536 WO2020216938A1 (en) 2019-04-24 2020-04-24 Method and system for generating optical spectra


Publications (2)

Publication Number Publication Date
GB201905717D0 GB201905717D0 (en) 2019-06-05
GB2583905A true GB2583905A (en) 2020-11-18

Family

ID=66810350


Country Status (2)

Country Link
GB (1) GB2583905A (en)
WO (1) WO2020216938A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230085600A1 (en) * 2021-09-10 2023-03-16 Arizona Board Of Regents On Behalf Of The University Of Arizona Self-calibrating spectrometer

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160187199A1 (en) * 2014-08-26 2016-06-30 Digimarc Corporation Sensor-synchronized spectrally-structured-light imaging
US20170184452A1 (en) * 2015-12-23 2017-06-29 Imec Vzw Spectrometer module
US20170195586A1 (en) * 2015-12-23 2017-07-06 Imec Vzw User device
CN109269643A (en) * 2018-11-02 2019-01-25 天津津航技术物理研究所 Spectrum demodulating system and method based on mobile device screen light source

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2987118A1 (en) * 2012-02-17 2013-08-23 Franck Hennebelle METHOD AND DEVICE FOR MEASURING THE COLOR OF AN OBJECT
US20140028799A1 (en) * 2012-07-25 2014-01-30 James Kuffner Use of Color and Intensity Modulation of a Display for Three-Dimensional Object Information
DE102016226206A1 (en) * 2016-12-23 2018-06-28 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. System and method for acquiring measurement images of a measurement object


Also Published As

Publication number Publication date
WO2020216938A1 (en) 2020-10-29
GB201905717D0 (en) 2019-06-05


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)