WO2019008330A1 - Motion compensation in phase-shifted structured light illumination for measuring dimensions of freely moving objects - Google Patents

Motion compensation in phase-shifted structured light illumination for measuring dimensions of freely moving objects

Info

Publication number
WO2019008330A1
Authority
WO
WIPO (PCT)
Prior art keywords
fish
projector
speed
spatial
processing subsystem
Prior art date
Application number
PCT/GB2018/051833
Other languages
French (fr)
Inventor
Eiolf Vikhagen
Sven Jørund KOLSTØ
Original Assignee
Optoscale As
Wilson, Timothy James
Priority date
Filing date
Publication date
Application filed by Optoscale As and Wilson, Timothy James
Publication of WO2019008330A1
Priority to NO20200134A1

Classifications

    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K - ANIMAL HUSBANDRY; CARE OF BIRDS, FISHES, INSECTS; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K61/00 - Culture of aquatic animals
    • A01K61/90 - Sorting, grading, counting or marking live aquatic animals, e.g. sex determination
    • A01K61/95 - Sorting, grading, counting or marking live aquatic animals, e.g. sex determination, specially adapted for fish
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518 - Projection by scanning of the object
    • G01B11/2522 - Projection by scanning of the object, the position of the object changing and being recorded
    • G01B11/2527 - Projection by scanning of the object, with phase change by in-plane movement of the pattern
    • G01B11/2545 - Projection of a pattern with one projection direction and several detection directions, e.g. stereo
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/269 - Analysis of motion using gradient-based methods
    • G06T7/292 - Multi-camera tracking
    • G06T7/50 - Depth or shape recovery
    • G06T7/521 - Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 - Image signal generators
    • H04N13/204 - Image signal generators using stereoscopic image cameras
    • H04N13/254 - Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules

Definitions

  • This invention relates to systems and methods for emitting structured light towards an object, and for determining spatial information about an object that is illuminated by structured light.
  • Some methods use a phase-shifting technique, where the projected stripes are shifted laterally by a given distance between successive exposures, and where the spatial phase of the pattern on a surface of the object, captured by the camera, is computed numerically by combining multiple different phase-shifted images.
  • Other methods analyse a single image— for example, using Fourier analysis or "fringe tracking" to determine the object's size and shape.
  • This technology has many application areas.
  • One application area is the non-invasive measuring of the size and weight (biomass) of caged fish in a fish farm.
  • WO 2014/098614 (EBTech AS) discloses a system for determining 3D measurements of a fish in water using one camera and a structured-light illumination system. It also discloses an alternative system that uses two cameras, as a stereoscopic pair, to determine three-dimensional information about a fish. In both cases, 3D modelling is said to be calculated by triangulation between the light and the image, or between images from several cameras, so as to provide a "depth image" where each pixel value represents a distance to the measured object.
  • WO 2014/098614 recognises that imaging a moving fish is not straightforward, but states that this can be mitigated by arranging the light source to have an appropriate wavelength, intensity and frequency. The applicant has realised, however, that this is not always sufficient to obtain measurements of a desired accuracy, and that there remains a need for systems that can provide accurate three-dimensional spatial information of an object, especially— although not exclusively— when the object is in motion.
  • the present invention provides a system for determining three- dimensional spatial information about an object, the system comprising a projector, a camera, and a processing subsystem, wherein:
  • the projector is arranged to emit structured light towards the object, wherein the structured light has, within an illumination plane, an intensity pattern that is periodic, having a first spatial frequency in a first direction;
  • the camera is arranged to capture a plurality of images of the object in a respective plurality of time frames, and to send image data representing the plurality of images to the processing subsystem;
  • the processing subsystem is arranged to:
  • the invention provides a method of determining three- dimensional spatial information about an object, the method comprising:
  • the structured light has, within an illumination plane, an intensity pattern that is periodic, having a first spatial frequency in a first direction; capturing a plurality of images of the object in a respective plurality of time frames;
  • three-dimensional spatial information is determined from spatial phase values that are calculated from a plurality of phase-shifted images.
  • This can provide improved spatial resolution compared with triangulation based on just a single image. It is also less susceptible to interference from other light sources, such as sunlight.
  • Hitherto, multi-image, phase-shifting, structured-light techniques were best suited for use on static objects, but the present invention allows such techniques to be used effectively with moving objects. It does this by appropriately controlling the structured light emissions, in dependence on the speed of the object, so that a pattern of light projected on the surface of the object is shifted, relative to the object, by a speed-independent spatial-phase step.
  • the object may be in water.
  • the object may be a fish.
  • the system may be installed in a fish cage or fish farm.
  • the invention is not limited to any particular type of object or environment.
  • the object may be only a part of a larger entity— e.g., it may be just the tail of a fish.
  • the object is preferably not an element of the system itself.
  • the structured light is controlled by shifting the intensity pattern in the first direction, between successive time frames of the plurality of time frames, by an amount that depends on the speed of the object in the first direction. In this way, the intensity pattern can be shifted by an amount that combines the desired phase step and the distance the object is expected to travel between adjacent time frames, so that the net result, relative to the object, is simply the desired phase step.
  • the processing subsystem is preferably arranged to calculate the shift amount using the data relating to the speed of the object.
  • the control signal sent by the processing subsystem preferably encodes the shift amount.
  • the shift amount may be determined in an angular unit (e.g., degrees of angle from a lens of the projector), or in a linear unit (e.g., distance within the illumination plane), or as a spatial phase shift for the structured light.
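  • As a purely illustrative sketch of this first approach (shifting the pattern between frames), the Python fragment below combines the desired phase step with the distance the object is expected to travel between frames; the function and parameter names (fringe_period_m, object_speed_mps, etc.) are assumptions for illustration and do not come from the application.

```python
import math

def projector_shift_per_frame(fringe_period_m: float,
                              object_speed_mps: float,
                              frame_interval_s: float,
                              phase_step_rad: float = 2 * math.pi / 3) -> float:
    """Lateral pattern shift (metres, in the illumination plane) to apply between
    successive frames so that, relative to the moving object, the pattern steps
    by exactly phase_step_rad.

    Sketch only: assumes the object moves along the first (e.g. horizontal)
    direction and that its speed is expressed in the illumination plane.
    """
    # The desired phase step expressed as a distance in the illumination plane.
    step_distance_m = phase_step_rad / (2 * math.pi) * fringe_period_m
    # The distance the object is expected to travel between adjacent frames.
    object_travel_m = object_speed_mps * frame_interval_s
    # Shifting the pattern by the sum makes the net shift, relative to the
    # object, equal to the speed-independent phase step.
    return step_distance_m + object_travel_m
```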
  • the structured light is controlled by controlling the spatial frequency of the intensity pattern in dependence on the speed of the object in the first direction.
  • the spatial frequency can be such that the spatial-phase step corresponds to the distance the object is expected to travel within the illumination plane, between adjacent time frames. This may desirably allow the projector to be made simpler, by avoiding a need for a mechanism in the projector to shift the whole intensity pattern.
  • the processing subsystem is preferably arranged to generate data representing the spatial frequency using the data relating to the speed of the object.
  • the control signal sent by the processing subsystem preferably encodes data representing the spatial frequency of the intensity pattern in the illumination plane.
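  • The second approach (adjusting the spatial frequency rather than shifting the pattern) can be sketched in the same hedged way: the fringe period is chosen so that the object's expected travel between frames corresponds, on its surface, to the desired phase step.

```python
import math

def fringe_period_for_speed(object_speed_mps: float,
                            frame_interval_s: float,
                            phase_step_rad: float = 2 * math.pi / 3) -> float:
    """Fringe period (metres, in the illumination plane) chosen so that the
    object's motion between frames produces the desired phase step on its
    surface, with no lateral shifting by the projector.

    Sketch only; the corresponding spatial frequency is simply 1 / period.
    """
    object_travel_m = object_speed_mps * frame_interval_s
    # travel / period = phase_step / (2*pi)  =>  period = 2*pi * travel / phase_step
    return 2 * math.pi * object_travel_m / phase_step_rad
```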
  • the spatial-phase step is preferably constant over a plurality of pairs of successive time frames— e.g., for at least three, four, or more successive time frames.
  • the spatial-phase step may be an integer fraction of 2π. For example, it may be 2π / 3, or 2π / 4.
  • the time frames are preferably spaced at regular intervals— e.g., every 0.01 seconds, or at any interval in the range 0.001 seconds to 1 second, or more or less than this. Shorter intervals may be preferred where the movement of the object may be erratic, since this minimises errors due to extrapolating the speed information, or where the object may change shape between time frames, such as may be the case for a swimming fish.
  • although the intensity pattern is periodic in the first direction within the illumination plane (which is typically a virtual plane), it will be appreciated that an actual projected pattern, on a surface of the object, will typically not be strictly periodic, but will have a varying frequency (or phase) based on the contours of the surface.
  • the illumination plane may intersect the object. It will be appreciated that, if part of the object is closer to the projector than the illumination plane, the structured light will not reach the illumination plane everywhere; in this case, references to the intensity pattern in the illumination plane should be understood as referring to the theoretical intensity if the object were not present.
  • the illumination plane may be a plane in which, out of all possible planes that intersect the object, the structured light has an intensity pattern having a minimum spatial frequency.
  • the first direction within the illumination plane may be a direction in which, out of all possible directions in the illumination plane, the intensity pattern in the illumination plane has a minimum spatial frequency.
  • the intensity pattern within the illumination plane may be periodic in all directions, or in a plurality of directions (e.g., a square grid, or concentric arcs), or there may be a direction in which the intensity pattern is not periodic (i.e., in which it has constant intensities).
  • the intensity pattern may be periodic in one direction only.
  • the intensity pattern comprises a set of parallel stripes.
  • the intensity of the stripes may vary as a sine wave.
  • the stripes may be substantially vertical, or at an angle of between 0 and 25, 10 or 5 degrees to vertical; this has been found to be particularly appropriate when measuring objects that travel mainly horizontally, such as a swimming fish.
  • the first direction is preferably substantially horizontal, or at an angle of between 0 and 25, 10 or 5 degrees to horizontal.
  • the intensity pattern may not be periodic over its entire extent, but only where the illumination plane intersects an imaging region or imaging volume. Due to limitations in the optics and the transmission medium, it will also be appreciated that the intensity pattern may not be perfectly uniform or periodic even over an imaging region— e.g., the first spatial frequency may vary within certain bounds, such as +/- 10%, or the stripes may not be exactly parallel.
  • the projector may comprise a non-coherent light source. It may be a digital projector. It may comprise a spatial light modulator.
  • the projector comprises an interference-based structured-light emitter.
  • the projector preferably comprises a source of coherent light.
  • the intensity pattern in the illumination plane and/or the actual projected pattern on the object may thus comprise an interference fringe pattern.
  • the projector may comprise an interferometer, such as a Michelson interferometer.
  • the emitted structured light may contain a majority of its energy in the visible spectrum, or may contain a majority of its energy in the infrared or ultraviolet spectrum.
  • the projector preferably emits polarized light— preferably linearly polarized.
  • the projector may comprise a light source that is inherently polarized, or it may comprise a polarizing filter.
  • the camera preferably comprises a corresponding polarizer (e.g., a linear polarizer) to filter backscattered light from particles and molecules in a medium between the system and the object, such as water.
  • the polarizer on the camera can also help to reduce interference due to specular reflection "highlights" on the object, which may otherwise be caused by the projector and/or by external light sources.
  • the camera may comprise a narrowband bandpass filter. This can be set to pass light from the projector, but to attenuate background light or sunlight scattered from particles in the medium, or reflected from the object or from other background objects.
  • the processing subsystem is preferably arranged to use the data relating to the speed of the object in the first direction when processing the image data or when processing the calculated spatial phase values. It preferably applies a lateral shift operation to the image data or to the spatial phase values, corresponding to one or more time frames, which shifts the data laterally by an amount that depends on the speed of the object in the first direction. In this way, the position of the object will be substantially the same within the image data, or within spatial phase values, from two or more successive time frames, even though the object moved relative to the projector in reality.
  • the processing subsystem is preferably arranged to translate the image data or spatial phase values into a coordinate system in which the object is fixed. In this same coordinate system, the intensity pattern should step by the spatial-phase step between successive time frames.
  • the processing subsystem may implement a shift or translation of the data in any appropriate way, which may depend on how the data is represented in a memory of the processing subsystem (e.g., as a data array, or a bitmap image file, etc.).
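  • A minimal sketch of such a shift, assuming the image data is held as a NumPy array, the motion is horizontal, and the speed has already been converted to pixels per frame, is given below; note that np.roll wraps at the image edges, so a real implementation would mask or crop the wrapped region.

```python
import numpy as np

def shift_to_object_frame(image: np.ndarray,
                          speed_px_per_frame: float,
                          frame_index: int) -> np.ndarray:
    """Translate one frame so that the moving object stays at a fixed position.

    The shift grows with the frame index, so all frames end up in a single
    object-fixed coordinate system (integer-pixel accuracy only).
    """
    # Positive speed (object moving right) means shifting the image to the left.
    shift_px = -int(round(speed_px_per_frame * frame_index))
    return np.roll(image, shift_px, axis=1)
```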
  • the projector and camera are preferably arranged in a fixed relationship to each other. They preferably have a fixed separation distance (e.g., measured between respective lenses), and preferably have fixed relative orientations. They may be connected by one or more support members, such as a metal framework.
  • the processing subsystem preferably uses a value representing a distance between the projector and camera when determining 3D information about the object.
  • the processing subsystem preferably also uses information relating to a projection axis of the projector and/or an imaging axis of the camera when determining 3D information about the object.
  • although the object is typically described as moving, it is also possible that the object is stationary and the projector (and preferably also the camera) is moving.
  • the speed of the object in the first direction may represent only a component of the object's velocity, which may also have a non-zero component in another direction.
  • the system is preferably arranged so that the object moves substantially in the first direction (i.e., so that the speed of the object in the first direction is larger than the speed of the object in any direction orthogonal to the first direction).
  • the data relating to a speed of the object, relative to the projector, in the first direction is preferably determined using the camera.
  • the speed data may be received as an input to the processing subsystem from another source.
  • the speed may be represented in angular units, such as radians/second, or as a linear speed (e.g., metres/second) in a reference plane, such as an imaging plane (which may or may not be a plane intersecting the object).
  • the imaging plane is the illumination plane.
  • the speed of the object may be determined with reference to the illumination plane.
  • Data relating to a speed of the object may be obtained only once during a
  • determining data relating to the speed of the object in the first direction comprises summing pixel values along a set of lines perpendicular to the first direction in an image from the camera, to generate a set of summed values. A leading edge of the object may then be determined from the summed values— e.g., based on where a value in the set crosses a threshold value, or on where the gradient of the summed values crosses a threshold gradient value.
  • the respective positions of a leading edge may be determined in each of two or more images taken at different time frames.
  • the speed of the object in the first direction may then be determined from these positions and the interval between the time frames.
  • the system may be arranged to illuminate the object uniformly for determining the speed of the object— i.e., not with a static pattern. This may be done with a separate light source.
  • the projector is arranged to sweep a pattern or beam over the illumination plane or an imaging region, within an exposure time of the camera.
  • the pattern or beam may be generated by a laser in the projector. This allows the system to provide illumination that is effectively uniform for each image capture in a way that does not require an additional light source, thereby saving cost and/or space.
  • the spatial phase values are preferably calculated from two or more images taken in successive time frames. The number of these images, multiplied by the spatial-phase step, may equal 2π. For example, if the phase step is 2π / 3, three successive images may be used to calculate the spatial phase values.
  • the spatial phase values may be calculated at a spatial resolution that is the same as a spatial resolution of the images, or at a coarser resolution. Known techniques may be used to do this.
  • the spatial phase values may be used to determine 3D spatial information about the object in any known way— e.g., by unwrapping the phase values and subtracting the unwrapped values from phase values calculated for a reference plane.
  • the spatial information may include a length of the object, a height of the object, or any other information. These values may be determined as functions of distance to the object, which may or may not be known at this stage.
  • the system may be arranged to adjust the images from the camera to reduce image distortion. This adjustment may be based on a suitable calibration process— e.g., using a pinhole calibration algorithm.
  • the system may be further arranged to estimate a mass or volume of the object.
  • the processing subsystem may be arranged to identify an outline of the object— e.g., in order to subtract background regions from the image data, which may contain unwanted noise such as reflections from other objects. This may comprise the processing subsystem applying a spatial gradient filter, or other edge-detection filter, to the calculated spatial phase values.
  • a distance to the object may be received by the processing subsystem, or it may be determined by the processing subsystem.
  • the system may, for instance, be arranged to direct a single laser beam at the object, from a known angle, in addition to the structured light emission; this may enable the distance to be determined by triangulation.
  • a ray or beam in the structured light emission may be encoded (e.g., using colour) so as to create an identifiable point or line in the images, thereby allowing detection of the stripe order.
  • the projector may output multiple structured light emissions having intensity patterns in the illumination plane that have different spatial frequencies.
  • the system may comprise a second camera, which may be used to determine a distance to the object.
  • the second camera is preferably spaced apart from the first camera.
  • the processing subsystem may be arranged to process images from the first camera and from the second camera to determine a distance to the object. This distance may then be used to determine final 3D spatial information about the object— e.g., by scaling the spatial information derived from the calculated spatial phase values by an appropriate amount to account for distance to the object.
  • although stereo vision systems are known in the art, including in combination with structured-light illumination, the applicant is not aware of any prior disclosure of using two cameras to determine a distance to an object, and using this distance in combination with unwrapped spatial phase information to determine three-dimensional spatial information about the object.
  • known stereo vision systems would calculate a large number of distances to different points on an object, which would thereby directly yield the desired 3D spatial information.
  • the applicant has recognised that there are situations in which additionally using unwrapped spatial phase information can produce better results. For example, this can allow the two cameras to be placed closer together than would normally be the case in order to provide a desired depth resolution, which may be desirable for avoiding interference due to specular reflection from a light source. It can also allow the use of lower-resolution cameras than might otherwise be required.
  • the invention provides a system for determining three- dimensional spatial information about an object, the system comprising a projector, a first camera, a second camera, and a processing subsystem, wherein:
  • the projector is arranged to emit structured light towards the object, wherein the structured light has, within an illumination plane, an intensity pattern that is periodic;
  • the first camera is arranged to capture a first plurality of images of the object from a first viewing position, in a respective plurality of time frames, and to send first image data representing the first plurality of images to the processing subsystem;
  • the second camera is arranged to capture a second plurality of images of the object from a second viewing position, different from the first viewing position, in a respective plurality of time frames, and to send second image data representing the second plurality of images to the processing subsystem;
  • the processing subsystem is arranged to:
  • the invention provides a method of determining three- dimensional spatial information about an object, the method comprising:
  • emitting structured light towards the object, wherein the structured light has, within an illumination plane, an intensity pattern that is periodic;
  • the third set of spatial phase values may be the same as, or may overlap with, the first set or second set, or it may be a separate set.
  • the first and second cameras may be synchronised. Their respective plurality of time frames may be the same time frames.
  • the first and second sets may be calculated using the same frames for the first and second image data. This reduces errors due to the object changing its shape or position over time.
  • the common point on the object is preferably determined using a correlation operation between the first and second sets of spatial phase or pattern amplitude values.
  • Triangulation may be used to find a distance to the common point.
  • a distance measurement to only one common point is used to determine the 3D spatial information—i.e., only one distance measurement.
  • the distance measurement is preferably used to scale distance-dependent spatial information determined from the set of unwrapped phase values.
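  • For concreteness only: the application does not reproduce the relevant equations at this point, but a standard rectified-stereo triangulation relation, together with a simple linear rescaling of the phase-derived dimensions, would take the form below, where f is the focal length in pixels, B the camera baseline, d the disparity of the common point, and L_ref a length computed from the unwrapped phase values assuming a reference distance Z_ref.

```latex
Z = \frac{f\,B}{d},
\qquad
L_{\mathrm{true}} \approx L_{\mathrm{ref}} \cdot \frac{Z}{Z_{\mathrm{ref}}}
```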
  • the structured light emission and/or the intensity pattern within the illumination plane may have a Gaussian intensity distribution.
  • the intensity of the intensity pattern may be tailored to the shape and/or size of the object that is being measured.
  • the projector may comprise at least one light source, an optical element, and an interferometer. It may be arranged to direct a first Gaussian beam into the optical element along a first path and to direct a second Gaussian beam into the optical element along a second path, the second path being laterally and/or angularly offset from the first path.
  • the optical element is preferably arranged to direct light from the first and second Gaussian beams into the interferometer.
  • the optical element and the interferometer are preferably arranged to emit structured light from the first Gaussian beam, the structured light having a first intensity pattern, within the illumination plane, that is periodic, and to emit structured light from the second Gaussian beam, the structured light having a second intensity pattern, within the illumination plane, that is periodic.
  • the first and second intensity patterns preferably have a common spatial frequency and a common spatial phase distribution within the illumination plane.
  • the first intensity pattern preferably overlaps the second intensity pattern within the illumination plane.
  • a point of maximum intensity of the first intensity pattern, in the illumination plane, is laterally offset from a point of maximum intensity of the second intensity pattern in the illumination plane.
  • the intensity patterns are preferably fringe patterns.
  • the processing subsystem may be arranged to instruct the projector to adjust the offset of the first and second paths, in response to data from the camera. This may allow, for instance, one centre of intensity to be placed on an upper half of a fish, and the other centre of intensity to be placed on a lower half of the fish.
  • This projection system is believed to be novel in its own right.
  • the invention provides a structured-light projection system for emitting structured light towards an object, the system comprising at least one light source, an optical element, and an interferometer, wherein:
  • the structured-light projection system is arranged to direct a first Gaussian beam into the optical element along a first path and to direct a second Gaussian beam into the optical element along a second path, the second path being laterally or angularly offset from the first path;
  • the optical element is arranged to direct light from the first and second Gaussian beams into the interferometer;
  • the optical element and the interferometer are arranged to emit structured light from the first Gaussian beam, wherein the first structured light has, within an illumination plane, a first intensity pattern that is periodic, and to emit structured light from the second Gaussian beam, wherein the second structured light has, within said illumination plane, a second intensity pattern that is periodic;
  • the first and second intensity patterns have a common spatial frequency; the first and second intensity patterns have a common spatial phase distribution over at least a portion of the illumination plane;
  • a point of maximum intensity of the first intensity pattern in the illumination plane is laterally offset from a point of maximum intensity of the second pattern in the illumination plane;
  • the first intensity pattern overlaps the second intensity pattern in the illumination plane.
  • the invention provides a method of emitting structured light towards an object, the method comprising: directing a first Gaussian beam into an optical element along a first path; directing a second Gaussian beam into the optical element along a second path, the second path being laterally or angularly offset from the first path;
  • the interferometer emitting first structured light from the first Gaussian beam, wherein the first structured light has, within an illumination plane, a first intensity pattern that is periodic;
  • the interferometer emitting second structured light from the second Gaussian beam, wherein the second structured light has, within said illumination plane, a second intensity pattern that is periodic,
  • the first and second intensity patterns have a common spatial frequency in the illumination plane
  • the first and second intensity patterns have a common spatial phase distribution over at least a portion of the illumination plane
  • a point of maximum intensity of the first intensity pattern in the illumination plane is laterally offset from a point of maximum intensity of the second pattern in the illumination plane;
  • the first intensity pattern overlaps the second intensity pattern in the illumination plane.
  • the illumination plane may be, or have any of the features of, an illumination plane as previously described.
  • the illumination plane may intersect the object and/or an illumination region.
  • the illumination region may contain part or all of the object; it may correspond to an imaging region as described above.
  • the optical element may be a lens, or a beam expander, or a group of lenses, a holographic element, an elliptical mirror, a pinhole, or any other appropriate element.
  • the interferometer is preferably a Michelson interferometer.
  • the first and second beams may have different polarizations.
  • in this way, the illumination polarization can be differentiated spatially over the surface of an object located in the illumination region. This can provide a more homogeneous contrast in the stripe images, e.g., by reducing specular reflection from parts of the object that have a tendency to reflect too much light specularly.
  • the camera may also have a polarizer, which may further reduce noise in the image.
  • the structured-light projection system may be arranged to direct one or more further Gaussian beams into the optical element along further respective paths— e.g., three, four, or more beams.
  • the feature of the spatial-phase step, relative to the object, being independent of the speed of the object in the first direction should be understood as covering situations where it may not be possible to track the motion of the object perfectly, but where the stepping is nevertheless sufficiently independent of the speed to allow spatial phase values to be calculated to an acceptable level of precision.
  • Figure 1 is a schematic representation of a system embodying the invention being used to measure a fish;
  • Figure 2 is a composite image of a fish illuminated by a fringe pattern, at three different spatial positions;
  • Figure 3 is a diagram showing vertical and horizontal displacement of an object;
  • Figure 4 is a phase image of a fish;
  • Figure 5a is a graph of intensity over distance for a fringe pattern with a constant background;
  • Figure 5b is a graph of intensity over distance for a fringe pattern with a changing background;
  • Figure 6 is a fringe-amplitude image of a fish;
  • Figure 7 is a phase-gradient image of a fish;
  • Figure 8 is a diagram showing how successive image frames are combined and processed;
  • Figure 9 is a schematic drawing of part of a Michelson interferometer used in an embodiment of the invention;
  • Figure 10 is a schematic drawing of a Michelson interferometer used in an embodiment of the invention;
  • Figure 11 is a schematic drawing showing parallel light beams passing through a lens of a Michelson interferometer;
  • Figure 12 is a schematic drawing showing two striped light beams against a fish;
  • Figure 13 is a schematic drawing showing angled light beams passing through a lens of a Michelson interferometer;
  • Figure 14 is a schematic diagram of an optical beam-combining arrangement used in an embodiment of the invention;
  • Figure 15 is a schematic diagram of four beams entering a lens;
  • Figure 16 is a schematic diagram showing four light beams against a fish;
  • Figure 17 is a schematic diagram of a beam-splitting arrangement used in an embodiment of the invention;
  • Figure 18 is a schematic diagram of a beam-shifting arrangement used in an embodiment of the invention.
  • Figure 1 shows a system 1 for estimating the size and weight of a fish 2.
  • the system 1 could be installed in, or adjacent, a fish cage or tank to monitor the growth of fish in the cage— e.g., to determine when fish in the cage have grown to a sufficient size to be sold, or to be moved to a different cage.
  • a commercial net-cage can be installed in the sea or in a lake to hold a large number of fish (e.g., 200,000 or more). Individual fish are measured as they swim past the measurement system 1.
  • Statistical analysis may be used to estimate the distribution of weights of the fish in the whole cage from a set of individual measurements.
  • the system 1 has an electronic control unit 3 which is connected to an optical projector 4, a left camera 5 and a right camera 6.
  • the connections could be wired or wireless.
  • the projector 4 and cameras 5, 6 are typically mounted underwater, in a fixed relationship to the earth, or to a fish net or fish cage.
  • the components may be individually sealed, or may be contained in a common housing (not shown).
  • the control unit 3 can also be mounted underwater.
  • the system 1 is of a type commonly known as "projected fringes", "moiré" or "structured light".
  • the projector 4 contains one or more lasers and a Michelson interferometer, and is arranged to project optical interference fringes onto the surface of the fish 2 as the fish 2 swims past the projector 4.
  • alternatively, it could contain two optical fibres, terminated close to each other so as to act as two adjacent point sources, which is another way of generating an interference stripe pattern.
  • the projector 4, left camera 5 and right camera 6 are all aligned to illuminate and image, respectively, the fish 2, as it passes through an imaging volume or region.
  • the fish 2 may be able to swim freely, or physical guides, such as glass walls, could be used to encourage it to pass the system 1 along a desired path.
  • the dimensions of the imaging region will depend on the optics of the projector 4 and on the width and depth of field of the cameras 5, 6.
  • the projector 4 is arranged to project a pattern of uniform, vertical fringe lines towards the side of the fish 2, while the fish 2 is in the imaging region.
  • the intensity of the fringes follows a sine-wave in the horizontal axis, with a period of between around 1 cm and 10 cm, although smaller or larger periods could be used.
  • the pattern cannot be too fine, or else the resolution limit of the cameras 5, 6 may become problematic.
  • although the pattern of light is periodic in a virtual illumination plane intersecting the fish, it will be understood that the actual pattern on the surface of the fish will not be strictly periodic, as it is affected by the curvature and orientation of the fish.
  • the control unit 3 uses triangulation and other geometrical principles, as described below, to determine the surface shape, size and distance to the fish 2, based on the output of light from the projector 4 and the collection of light, from different directions, at the cameras 5, 6.
  • the system 1 combines projected-stripe processing with stereo- vision and "digital image correlation" (DIC) processing, to estimate the size of the fish 2.
  • the mass of the fish 2 can then be estimated from its size, if required, using knowledge about the mean density of fish of the appropriate type, or using more sophisticated modelling techniques.
  • Each camera 5, 6 captures images at a rate of up to 200 Hz.
  • the cameras 5, 6 are aligned vertically and with parallel optical axes, but are separated from each other horizontally by approximately 10 cm.
  • the cameras 5, 6 are aligned vertically with the projector 4, but separated from it horizontally by approximately 70 cm.
  • the projector 4 is set at an angle of approximately 25 degrees to the axes of the cameras 5, 6. This arrangement enables the cameras 5, 6 to acquire images of the projected fringe pattern that contain information about the shape of the fish 2, while minimising the risk of imaging specular reflections from the shiny scales of the fish 2, which could otherwise hinder accurate image processing.
  • any of these numbers could be larger or smaller in alternative embodiments.
  • control unit 3 calculates the speed and direction of the fish 2 and uses this information to adjust a coordinate system used by the system 1 so that it remains centred on the fish 2. This has the effect of making the fish 2 stationary within this coordinate system, which can simplify the control and processing algorithms.
  • Figure 2 shows three successive image frames, superimposed on each other, as the fish 2 swims from left to right through the imaging region.
  • the left and right arrows above the image show how much the first and third image frames would need to be shifted by in order to be aligned exactly with the second, middle image frame.
  • the control unit 3 performs such shifting of successive images from the cameras 5, 6, based on an estimate of the fish's speed across the imaging region, so that the coordinate system of the images tracks, and remains centred on, the fish 2.
  • Figure 2 shows only horizontal panning (which is typically the most relevant for fish, as they primarily move horizontally), but the method also encompasses vertical panning.
  • Figure 3 shows how vertical (y-axis) and horizontal (x-axis) speed estimates can be used to shift the acquired images in sync with the overall motion of the fish 2.
  • a horizontal speed estimate is obtained as follows. Before a fish 2 has been identified, the projector 4, or another lamp, is set to provide uniform illumination over the imaging region. This may alternatively be simulated by shifting the fringe pattern laterally within one imaging period, so that light is cast over the whole imaging region during the one image exposure time. This horizontal shifting may be continuous or discrete.
  • the control unit 3 processes the image frames from one of the cameras 5, 6 to detect when an object, such as the fish 2, enters the field of view of the camera 5, 6. When this happens, the control unit 3 first calculates an estimate of a horizontal velocity component of the object.
  • the control unit 3 therefore makes a quick determination of the fish's horizontal speed even before the fish 2 has fully entered the field of view of the camera 5, 6.
  • the control unit 3 does this by comparing two or more images (with uniform illumination).
  • the control unit 3 performs a column-wise summation over each of two images, having a known time difference. This gives two one-dimensional sets of summation values, one for each image. Assuming that the fish 2 is alone in the imaged area, or assuming other fish in the imaging region have a significant difference in distance from the camera 5, 6 compared with the measured fish 2, the set of summation values will step from very low values in front of the fish 2 (where there is just dark, background water) to much higher values where light is reflected off the head of the fish 2. By using an appropriate threshold, a position of the leading edge of the fish can be determined from each of the summation sets. From these positions, and knowing the time intervals between the images, the fish's horizontal speed within the object plane of the camera 5, 6 can be measured. Note that these speed estimates do not necessarily reflect the actual speed of the fish 2.
  • the camera 5, 6 may have a large depth of field, such that a slow-moving fish close to the camera 5, 6 could yield the same horizontal speed estimate as a fast-moving fish further away from the camera 5, 6. To work out the true speed of the fish 2 would require a knowledge of the distance between the camera 5, 6 and the fish 2.
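  • A minimal sketch of this leading-edge speed estimate is given below; it assumes uniformly illuminated frames, a dark background, a fish entering from the left, and a fixed intensity threshold, none of which are mandated by the application.

```python
import numpy as np

def leading_edge_x(frame: np.ndarray, threshold: float) -> int:
    """Column index of the leading edge of the fish in a uniformly lit frame."""
    column_sums = frame.sum(axis=0)                 # sum pixel values down each column
    bright = np.nonzero(column_sums > threshold)[0]
    # For a fish swimming left-to-right, the rightmost bright column is its head.
    return int(bright.max()) if bright.size else -1

def horizontal_speed_px_per_s(frame_a: np.ndarray, frame_b: np.ndarray,
                              dt_s: float, threshold: float) -> float:
    """Horizontal speed, in pixels per second, in the camera's object plane.

    As noted above, this is not the true speed of the fish unless the distance
    between the camera and the fish is also known.
    """
    x_a = leading_edge_x(frame_a, threshold)
    x_b = leading_edge_x(frame_b, threshold)
    return (x_b - x_a) / dt_s
```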
  • the control unit 3 uses the horizontal speed estimate or estimates to control the fringes projected by the projector 4.
  • the present example uses an approach of illuminating the fish 2 with a uniform light pattern of vertical stripes.
  • the stripes are stepped horizontally by 2π / 3 between successive image frames.
  • a succession of three image frames is then processed mathematically to determine shape information.
  • because the fish 2 is typically in motion, the control unit 3 controls the projector 4 so that this 2π / 3 phase stepping occurs relative to an illuminated surface of the fish 2. This can be done in one of two ways.
  • first, the control unit 3 can adjust the lateral displacement of the stripes between successive images by more or less than 2π / 3, based on the horizontal speed of the fish 2, so that a step of 2π / 3 occurs in a coordinate system that is centred on (i.e., that tracks) the moving fish 2.
  • second, the control unit 3 can change the spatial frequency of the projected stripes, based on the speed of the fish 2, so that the fish 2 moves horizontally by a distance that corresponds to a 2π / 3 shift over the surface of the fish 2; in this way, an effective 2π / 3 shift can be realised without any lateral shifting by the projector 4. It is also possible for these two approaches to be combined, so that the projector 4 performs some lateral shifting, while the control unit 3 also adjusts the spatial frequency of the fringe pattern.
  • other phase-shift values than 2π / 3, relative to the fish 2, are also possible, such as 2π / n for any desired integer, or real-valued number, n. In a post-processing stage, both horizontal and vertical motion estimates are used to adjust the images acquired by the cameras 5, 6, to compensate for the motion of the fish 2.
  • These horizontal and vertical displacements can be calculated using a slower correlation approach, over a set of images, because the results are not required as fast as was the case for the initial horizontal speed estimate.
  • the present approach assumes that the fish 2 does not change shape significantly between successive frames. If the fish 2 does make a large movement, the measurement process may fail. However, a certain failure rate is acceptable.
  • let I₁, I₂ and I₃ be three images from one of the cameras 5, 6, captured using three successive 2π / 3 phase shifts of the striped illumination pattern relative to the fish 2.
  • the images I₂ and I₃ have been adjusted to remove horizontal and vertical displacement of the fish 2 over the image set, as described above.
  • the control unit 3 then evaluates the arc-tangent function
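  • The arc-tangent expression itself is not reproduced in this text. For three images with relative phase shifts of 0, 2π / 3 and 4π / 3, the standard three-step relation (assumed here; the exact formula in the application may use a different sign convention) is:

```latex
\varphi(x,y) \;=\; \arctan\!\left(
    \frac{\sqrt{3}\,\bigl(I_3(x,y) - I_2(x,y)\bigr)}
         {2\,I_1(x,y) - I_2(x,y) - I_3(x,y)} \right)
```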
  • Figure 4 shows a resulting phase image of the fish 2.
  • the grayscale pixel values, ranging from white to black, represent spatial phase values ranging from zero to 2π.
  • the phase wraps every 2π, resulting in abrupt jumps from white to black between adjacent stripes.
  • the control unit 3 unwraps the phase to remove the 2π jumps, using standard techniques.
  • the shape of the fish 2 can then be determined by performing a standard phase-to-height conversion, given the known positions of the projector 4 and the cameras 5, 6.
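  • As a hedged illustration of these two steps, the sketch below unwraps the phase row by row and applies one common small-angle phase-to-height approximation; the geometric factor and all parameter names are assumptions rather than values from the application.

```python
import numpy as np

def height_map(wrapped_phase: np.ndarray,
               reference_phase: np.ndarray,
               fringe_period_m: float,
               triangulation_angle_rad: float) -> np.ndarray:
    """Convert a wrapped phase image to heights above a reference plane.

    Sketch only: row-wise 1-D unwrapping followed by the common approximation
    h = delta_phi * p / (2 * pi * tan(theta)).
    """
    unwrapped = np.unwrap(wrapped_phase, axis=1)   # remove the 2*pi jumps along each row
    delta_phi = unwrapped - reference_phase        # phase relative to the flat reference plane
    return delta_phi * fringe_period_m / (2 * np.pi * np.tan(triangulation_angle_rad))
```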
  • the left camera 5 and right camera 6 are used as a stereo pair, in combination with the projected stripes.
  • Stereo image analysis traditionally relies on a correlation operation between captured left and right images to identify one or more common points between the two images. A distance to that point can then be calculated, based on the known geometry of the cameras 5, 6.
  • the control unit 3 operates not on the direct images from the camera 5, 6 but on processed numerical data arrays, derived from the images.
  • Data arrays from the two cameras 5, 6 are generated from images, or image series, recorded simultaneously in time from the left camera 5 and the right camera 6.
  • the laser projector 4 creates random, subjective speckle patterns in the images obtained from the two cameras 5, 6, which can make conventional image correlation problematic.
  • the control unit 3 uses numeric data arrays generated from the two cameras 5, 6, where the data values from the two cameras are identical or virtually identical point by point on the fish 2. These values form the basis for a detailed and accurate correlation between the two image sets from the two cameras 5, 6.
  • a correlation measurement then gives the relative picture distance between the fish 2 in the two camera images, from which information about the distance between the cameras 5, 6 and the fish 2 can be calculated.
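  • A minimal sketch of such a correlation-based distance measurement, operating on the processed data arrays (e.g. phase or fringe-amplitude images) rather than on raw camera images, might look as follows; the window size, focal length, baseline and search range are illustrative assumptions.

```python
import numpy as np

def stereo_distance_m(left_array: np.ndarray, right_array: np.ndarray,
                      row: int, col: int, half_window: int,
                      focal_px: float, baseline_m: float,
                      max_disparity: int) -> float:
    """Distance to a common point, found by correlating a patch of the left
    processed data array against the right one along the same row."""
    patch = left_array[row - half_window: row + half_window + 1,
                       col - half_window: col + half_window + 1]
    best_d, best_score = 1, -np.inf
    for d in range(1, max_disparity):
        candidate = right_array[row - half_window: row + half_window + 1,
                                col - d - half_window: col - d + half_window + 1]
        if candidate.shape != patch.shape:
            break
        score = float(np.sum(patch * candidate))   # simple (unnormalised) correlation score
        if score > best_score:
            best_score, best_d = score, d
    return focal_px * baseline_m / best_d          # triangulation: Z = f * B / d
```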
  • the processed data array for each camera 5, 6 may, for example, be the phase images, as shown in Figure 4. Alternatively, the processed data arrays may represent the fringe amplitude of the stripe patterns, or other values calculated using a set of camera images as input.
  • Figures 5a and 5b show what is meant by the fringe amplitude (or fringe intensity), both where the background intensity, I₀, is constant (Figure 5a) and changing (Figure 5b). How to calculate the fringe amplitude is described in detail below.
  • the control unit 3 can then use a distance measurement to a single point on the fish 2 in order to calibrate the size of the fish 2, based on the shape information determined from the phase analysis described above, derived from one of the cameras 5, 6.
  • correlation over the data arrays is used to determine the distances to a plurality of points on the fish 2.
  • This information can potentially be used alone to find the shape or size or volume of the fish 2 (i.e., without deriving shape information from the phase data), or it can be used to enhance the accuracy of the phase-based shape analysis.
  • the control unit 3 looks for the outline of a fish based on information from one or both of the cameras 5, 6, relating to the projected stripes. This can be done in various ways, but based on processed data arrays, rather than raw images from the cameras.
  • One algorithm is based on phase shifting, preferably with the dynamic coordinate system outlined previously, but in which, instead of calculating the phase of the projected stripes, the control unit 3 calculates the modulation or intensity of the projected stripes. Interfering background light and particle scattering are filtered out of the resulting processed image. This algorithm runs simultaneously for the two cameras 5, 6. The resulting image or data array is well suited as input to a pattern-recognition algorithm, and is expected to perform significantly better than a normal picture of the fish 2 for performing outline detection.
  • the intensity can be calculated using, for example, the three coordinate-shifted images, I₁, I₂ and I₃, described above, and an appropriate algorithm (one standard form is sketched below).
  • Figure 6 shows an example of such a fringe-amplitude image.
  • the control unit 3 uses a threshold value to filter the fringe-amplitude image, to yield a mask of the fish. Amplitudes over the threshold are taken to correspond to a surface of a fish, while amplitudes below the selected threshold mean no fish.
  • the amplitude masking described above may help to remove contributions from fish that are much further away from the illumination source than the closest fish 2.
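  • The amplitude expression referred to above is likewise not reproduced in this text; the standard three-step form (assumed here) and the threshold mask could be computed as sketched below.

```python
import numpy as np

def fringe_amplitude(i1, i2, i3):
    """Fringe (modulation) amplitude from three 2*pi/3-shifted images, using the
    standard three-step relation (an assumption, not the application's own formula)."""
    i1, i2, i3 = (np.asarray(a, dtype=float) for a in (i1, i2, i3))
    return np.sqrt(3.0 * (i3 - i2) ** 2 + (2.0 * i1 - i2 - i3) ** 2) / 3.0

def fish_mask(i1, i2, i3, threshold: float):
    """Boolean mask that is True where the fringe amplitude suggests a fish surface."""
    return fringe_amplitude(i1, i2, i3) > threshold
```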
  • Separating two overlapping fish can be done in various ways.
  • the control unit 3 first calculates the spatial gradient of the phase image. If vertical fringes are projected, the spatial gradients can be calculated in the horizontal direction only, by evaluating: phase(x, y) - phase(x - dx, y). However, other embodiments may calculate the spatial gradient in both horizontal and vertical directions, e.g. by evaluating a corresponding difference in each direction (see the note below).
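  • The two-direction expression is not reproduced in the text as extracted; an obvious form, stated here only as an assumption, is the pair of differences (or their combined magnitude), where φ denotes the phase image:

```latex
\Delta_x(x,y) = \varphi(x,y) - \varphi(x - \mathrm{d}x,\, y),\qquad
\Delta_y(x,y) = \varphi(x,y) - \varphi(x,\, y - \mathrm{d}y),\qquad
|\nabla\varphi| \approx \sqrt{\Delta_x^2 + \Delta_y^2}
```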
  • Figure 7 shows a resulting spatial gradient image, where gradients have been calculated in a horizontal direction.
  • any jumps or discrepancies in the spatial gradient of phase, within the image, represent a shift from one surface to another.
  • the smooth shape of fish means that, in general, the camera 5, 6 will see the whole side of the fish as a continuous surface.
  • gradient jumps will typically occur only along the boundary where two fish overlap, from the camera's point of view.
  • phase gradients will typically remain constant on a continuous fish surface while the gradients between different overlapping fish are likely to change over time, since different fish generally move differently in the images.
  • control unit 3 may alternatively or additionally use stereo information from the two cameras 5, 6 to separate fish that are at different distances, based on the fact that the objects at different depths will have different offsets in the respective image planes. This information may be used on raw images or on processed data array values, as described above.
  • the control unit 3 may calculate distances using the stereo vision algorithm described above, based on the data processed arrays, and look for quick and sudden deviations in distance along projected lines in the scene, or from one part of an area in which fringe intensity is above a threshold level to other parts of the area, which may indicate a shift from one fish to another fish that is closer or further away.
  • the control unit 3 may use results from some or all of the above analyses in an image analysis algorithm to identify the outline of a single fish 2 in the scene.
  • the control unit 3 can obtain complementary information and the possibility of averaging the data in order to obtain a more accurate volume estimate for the fish 2. This provides increased sensitivity and more accurate measurements.
  • the control unit 3 may make measurements of cross sections across the body of the fish 2. Such measurements can also be made on parts of the fish 2 even before the whole fish 2 is completely inside the image fields of the two cameras 5, 6, and after parts of the body of the fish 2 are out of the picture.
  • the control unit 3 may sequentially record many images. By letting one or both cameras 5, 6 run continuously, the phase of the stripes can be shifted between adjacent images without stopping the camera. In this way it is possible for the system 1 to obtain, for example, 100 or more phase-shifted stripe frames per second, from each of the cameras 5, 6.
  • a typical sequence will involve the stripes being phase-shifted by 2π / 3 between shots, in the following phase sequence:
  • Phase: 0, 2π / 3, 4π / 3, 2π, 8π / 3, ... (i.e., successive steps of 2π / 3).
  • the control unit 3 can then calculate a new phase value, a_N, based on the last incoming image plus the two preceding images, as indicated in Figure 8.
  • Phase here means the effective phase shift, referring to the origin of the dynamic coordinate system.
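  • A rolling computation of this kind could be sketched as follows; compute_phase stands for the three-step arc-tangent evaluation described above, applied to motion-compensated frames in the dynamic (object-fixed) coordinate system, and is an illustrative name rather than one used in the application.

```python
from collections import deque

def rolling_phase_stream(frames, compute_phase):
    """Yield a new phase image for every incoming frame, once three successive
    (2*pi/3-stepped, motion-compensated) frames are available.

    `frames` is any iterable of 2-D arrays; `compute_phase` maps three such
    frames to a phase image (e.g. the arc-tangent relation sketched earlier).
    """
    window = deque(maxlen=3)   # always holds the last three frames
    for frame in frames:
        window.append(frame)
        if len(window) == 3:
            i1, i2, i3 = window
            yield compute_phase(i1, i2, i3)
```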
  • Backscattering is reduced using a linearly polarized laser in the projector 4, or by including a polarizer within the projector 4.
  • the light reflected from small particles in the water will then also be polarized.
  • the light reflected from the surface of the fish (i.e., the stripe pattern), however, is largely depolarized.
  • polarizers can be placed in front of the cameras 5, 6, adjusted to 90 degrees to the polarization direction of the projector 4. In this way most of the backscattered light is removed, while light from the fish 2 is attenuated only by around 50 percent. This reduction is acceptable because the contrast of the stripe pattern on the fish 2 is increased significantly.
  • the projector 4 also includes a unique construction of a Michelson interferometer, for precise phase shifting of the projected stripes. To displace the stripes sideways, one mirror is translated in a linear movement without inclination. The accuracy must be within a fraction of a wavelength of light. A special mechanical suspension is used to achieve this.
  • Figure 9 shows this mechanical suspension.
  • a piezo pusher 90 presses against a proximal end of a shaft 91.
  • the distal end of the shaft 91 is fastened to an end plate
  • the springs 94, 95 ensure a pure translation of the shaft 91, and thus the mirror
  • when an object is illuminated by a laser, diode, or other source, the beam will typically have a Gaussian intensity distribution.
  • a Michelson interferometer is used within the projector 4.
  • the projector 4 illuminates the scene with multiple, partially-overlapping beams of light, where each beam can be directed towards the fish 2 from the same, or approximately the same, area, but with a somewhat different angle towards the fish 2.
  • the projector 4 can be controlled to regulate the intensity relationship between the light beams.
  • Each of these beams has its own stripe pattern, but all the patterns follow the same function in space, so that the stripes are continuous with each other in areas where the beams overlap.
  • Figure 10 shows a Michelson interferometer 100 and a laser 101 that are used within the projector 4 to generate a stripe pattern in the beam.
  • the laser 101 generates a light beam which passes through a convex input lens 102. This causes the beam to spread as it enters a beam splitter 103.
  • a first mirror 104 creates a first virtual point source 105, while a second mirror 106 creates a second virtual point source 107.
  • the diverging output beam 108 has a Gaussian-distributed intensity over its cross section.
  • the two virtual point sources 105, 107 create a striped interference fringe pattern in the output beam 108. These two point sources 105, 107 would appear to be closely spaced apart, next to each other, if an observer were to look towards the interferometer 100 from its output. In this arrangement, a single light cone is generated, containing stripes.
  • the projector 4 includes additional components so that multiple light beams pass through a Michelson interferometer.
  • This interferometer set-up is similar to the interferometer 100 in Figure 10, but may have multiple lasers.
  • the point-source couples for each light beam must be close to each other in space, in order that the stripe patterns between different light beams are not notably laterally skewed in relation to each other. Ideally the point couples for all the light beams have the exact same position in space.
  • Figure 11 shows two parallel beams 110, 111 going through an input lens 112 of a Michelson interferometer contained within the projector 4.
  • the beams 110, 111 are laterally offset from each other, but overlapping, heading towards the lens 112. They leave the lens 112 and pass through a mutual focal point 113 in different directions. Both beams 110, 111 maintain their Gaussian profiles.
  • the beams 110, 111 each generate their own independent stripe pattern when they have passed through the beam splitter and mirrors of the Michelson interferometer, and travel towards the fish 2. These two stripe patterns will be spatially identical or overlapping if the beams come from the same point initially, and thereby from identical virtual points, as shown in Figure 10.
  • the primary difference between the two striped beams is that they illuminate the fish 2 with maximal intensity in different areas.
  • the beams may also have different sizes and spreads, which depends on the shape and diameter of the beams entering the input lens 112.
  • the laser beams must have roughly the same wavelength, but can be polarized at 90 degrees to each other.
  • Figure 12 shows a first beam of stripes 120 and a second beam of stripes 121 travelling towards the fish 2.
  • the beams are indicated here by respective ovals, which represent regions where the output light exceeds a threshold intensity in an object plane containing the fish 2.
  • the two beams 120, 121 are offset vertically, so that the first beam 120 has its maximum intensity on the back (top half) of the fish 2.
  • the beams may enter the lens 112 at different angles.
  • Figure 13 shows the effect of angling the first beam 110 relative to the second beam 111 by a few degrees, so that they are not parallel when they enter the input lens 112.
  • the first beam 110 is focused to a focal point 130 that is shifted relative to the focal point 131 of the second beam 111.
  • Each beam 110, 111 creates a respective pair of virtual point sources in the interferometer. The two pairs are offset from each other, but the distance and angle between the virtual point sources in each pair is the same for both pairs. In this case, the beams will have a different angle as they leave the interferometer.
  • the point couples that the two light sources generate after passing the interferometer will have a somewhat different placement in space.
  • the distance between the points in each point couple will still be the same, and the stripe pattern will be approximately the same for the two beams so that they will still generate the same overlap as before, as shown in Figure 12.
  • the two light beams 110, 111 may be polarized 90 degrees to each other, so that the reflection properties of the two beams are differentiated. This allows for greater homogeneity in the stripe contrast over the fish 2, because the reflection from the fish can be adapted to the different viewing angles of the cameras 5, 6.
  • the camera 5, 6 may have a corresponding polarizing filter.
  • Figure 14 shows how two light beams 110, 111, generated from a first laser 140 and a second laser 141, respectively, can be combined before entering the input lens 112 of the Michelson interferometer, within the projector 4.
  • the beams 110, 111 are polarized in respective directions, which allows a mirror 142 and a polarizing beam-splitter 143 to be used to combine the two beams.
  • the beams 110, 111 can then be fully superposed, which gives totally overlapping light beams, or they can be laterally offset, as shown in Figure 11.
  • the same approach can be used with more than two lasers, generating more than two offset beams.
  • Figure 15 shows four offset beams, of similar wavelength, entering the input lens 112. It is, however, not easy to combine more than two laser beams by the same principle as shown in Figure 14 without losing energy from one or more lasers. This is because there are only two independent polarization directions. But it may still be beneficial to combine more than two laser beams in order to get a good distribution of intensity over the fish 2. This may be achieved by combining three or more beams directed towards the lens 112 with differing angles, even though the focal point and the placement of the point-couples after passing the interferometer will be somewhat different for the different beams.
  • Figure 16 shows how four light beams could be arranged to illuminate the fish 2 more uniformly with a striped pattern.
  • each of the light beams generates a synchronised stripe pattern, so that there are continuous stripes over an entire object plane containing the fish 2.
  • the only difference from a single-laser-source embodiment is that there is a differentiated polarization and illumination on the fish 2.
  • the principles described above can also be used when there is just one source laser.
  • Figure 17 shows part of the projector 4 in an alternative embodiment in which a double-refracting prism 170 is placed in front of a laser 171 in order to generate two parallel, offset beams 172, 173. These are directed towards the input lens 112.
  • the double-refracting prism 170 can be formed from a birefringent crystal such as calcite.
  • the intensity relationship between the two beams can be adjusted by rotating the prism 170 or the laser 171. If it proves to be difficult to obtain a prism 170 that offsets the beams by a desired distance, this can be corrected by adjusting the diameter of the beam before it enters the prism 170 and choosing a lens 112 with an appropriate focal length.
  • the two beams 172, 173 going out from the prism 170 are polarized 90 degrees to each other and so do not cause any interference issues between the light beams.
  • Figure 18 shows part of the projector 4 in an alternative embodiment in which a vibrating or stepping mirror 180, or another device that can laterally offset a beam dynamically, is placed in front of a laser 181 in order to generate two parallel, offset beams 182, 183, which are directed towards the input lens 112.
  • the mirror 180 shifts the beam between two parallel paths. If the lateral shifting happens fast enough so as to cycle one or more times during a single exposure period of the camera 5, 6, this will create a time-averaging effect equivalent to the beams 182, 183 being on the fish 2 simultaneously.
  • the approach of Figure 18 may be desirable where differentiated illumination is wanted over the fish 2, but not differentiated polarization.
  • system 1 may be used to measure objects other than fish 2, which may be stationary or moving. Beam patterns other than vertical stripes may be used.

Abstract

A system (1) for determining three-dimensional spatial information about an object (2), such as a fish, comprises a projector (4), a camera (5, 6), and a processing subsystem (3). The projector (4) emits structured light towards the object (2). The light has a periodic intensity pattern, having a first spatial frequency in a first direction. The camera (5, 6) captures images of the object (2) over time. The processing subsystem (3) determines a speed of the object (2), relative to the projector (4), and instructs the projector (4) to control the emitted structured light in dependence on the speed, such that the intensity pattern is shifted, relative to the object (2), between successive images, by a spatial-phase step that is independent of the speed of the object (2). The processing subsystem (3) then uses images from at least two time frames to calculate spatial phase values for points on the object (2), and uses these spatial phase values to determine three-dimensional spatial information about the object (2).

Description

MOTION COMPENSATION IN PHASE-SHIFTED STRUCTURED LIGHT ILLUMINATION FOR MEASURING DIMENSIONS OF FREELY MOVING OBJECTS
This invention relates to systems and methods for emitting structured light towards an object, and for determining spatial information about an object that is illuminated by structured light.
It is known to project a periodic pattern of light onto an object in order to determine spatial information about the object, especially three-dimensional information. A digital camera is typically used to image the illuminated object, and a processing subsystem analyses the distortion in the periodic pattern, from the viewing position of the camera, arising due to the contours of the illuminated surface or surfaces of the object. Such techniques are commonly referred to as "projected fringes", "moire", or "structured light" methods.

One common approach for using projected stripes and a camera to extract three-dimensional (3D) shape information is the phase-shifting technique, where the projected stripes are shifted laterally by a given distance between successive exposures, and where the spatial phase of the pattern on a surface of the object, captured by the camera, is computed numerically by combining multiple different phase-shifted images. Other methods analyse a single image— for example, using Fourier analysis or "fringe tracking" to determine the object's size and shape.
This technology has many application areas. One application area is the non-invasive measuring of the size and weight (biomass) of caged fish in a fish farm.
For example, WO 2014/098614 (EBTech AS) discloses a system for determining 3D measurements of a fish in water using one camera and a structured-light illumination system. It also discloses an alternative system that uses two cameras, as a stereoscopic pair, to determine three-dimensional information about a fish. In both cases, 3D modelling is said to be calculated by triangulation between the light and the image, or between images from several cameras, so as to provide a "depth image" where each pixel value represents a distance to the measured object. WO 2014/098614 recognises that imaging a moving fish is not straightforward, but states that this can be mitigated by arranging the light source to have an appropriate wavelength, intensity and frequency. The applicant has realised, however, that this is not always sufficient to obtain measurements of a desired accuracy, and that there remains a need for systems that can provide accurate three-dimensional spatial information of an object, especially— although not exclusively— when the object is in motion.
From a first aspect, the present invention provides a system for determining three- dimensional spatial information about an object, the system comprising a projector, a camera, and a processing subsystem, wherein:
the projector is arranged to emit structured light towards the object, wherein the structured light has, within an illumination plane, an intensity pattern that is periodic, having a first spatial frequency in a first direction;
the camera is arranged to capture a plurality of images of the object in a respective plurality of time frames, and to send image data representing the plurality of images to the processing subsystem; and
the processing subsystem is arranged to:
receive or determine data relating to a speed of the object, relative to the projector, in said first direction;
send a control signal to the projector to cause the projector to control the emitted structured light in dependence on the speed of the object in the first direction such that said intensity pattern is shifted, relative to the object, between successive time frames of the plurality of time frames, by a spatial- phase step that is independent of the speed of the object in the first direction; use image data corresponding to at least two of the time frames to calculate spatial phase values for a plurality of points on the object; and
use the calculated spatial phase values to determine three-dimensional spatial information about the object.
From a further aspect, the invention provides a method of determining three- dimensional spatial information about an object, the method comprising:
emitting structured light towards the object, wherein the structured light has, within an illumination plane, an intensity pattern that is periodic, having a first spatial frequency in a first direction; capturing a plurality of images of the object in a respective plurality of time frames;
receiving or determining data relating to a speed of the object in said first direction;
controlling the emitted structured light in dependence on the speed of the object in the first direction such that said intensity pattern is shifted, relative to the object, between successive time frames of the plurality of time frames by a spatial-phase step that is independent of the speed of the object in the first direction;
using at least two of the plurality of images to calculate spatial phase values for a plurality of points on the object; and
using the calculated spatial phase values to determine three-dimensional spatial information about the object.

Thus it will be seen by one skilled in the art that, in accordance with these aspects of the invention, three-dimensional spatial information is determined from spatial phase values that are calculated from a plurality of phase-shifted images. This can provide improved spatial resolution compared with triangulation based on just a single image. It is also less susceptible to interference from other light sources, such as sunlight. In the past, multi-image, phase-shifting, structured-light techniques were best suited for use on static objects, but the present invention allows such techniques to be used effectively with moving objects. It does this by appropriately controlling the structured light emissions, in dependence on the speed of the object, so that a pattern of light projected on the surface of the object is shifted, relative to the object, by a speed-independent spatial-phase step.
The object may be in water. The object may be a fish. The system may be installed in a fish cage or fish farm. However, the invention is not limited to any particular type of object or environment. The object may be only a part of a larger entity— e.g., it may be just the tail of a fish. The object is preferably not an element of the system itself.
In some situations, only a part of the object may be illuminated by the structured light. In some situations, the camera may capture images of only a part of the object. In such cases, the three-dimensional spatial information may relate only to that part of the object that is captured by the camera.

In one set of embodiments, the structured light is controlled by shifting the intensity pattern in the first direction, between successive time frames of the plurality of time frames, by an amount that depends on the speed of the object in the first direction. In this way, the intensity pattern can be shifted by an amount that combines the desired phase step and the distance the object is expected to travel between adjacent time frames, so that the net result, relative to the object, is simply the desired phase step.
The processing subsystem is preferably arranged to calculate the shift amount using the data relating to the speed of the object. The control signal sent by the processing subsystem preferably encodes the shift amount. The shift amount may be determined in an angular unit (e.g., degrees of angle from a lens of the projector), or in a linear unit (e.g., distance within the illumination plane), or as a spatial phase shift for the structured light.
In another set of embodiments, the structured light is controlled by controlling the spatial frequency of the intensity pattern in dependence on the speed of the object in the first direction. In this way, the spatial frequency can be such that the spatial-phase step corresponds to the distance the object is expected to travel within the illumination plane, between adjacent time frames. This may desirably allow the projector to be made simpler, by avoiding a need for a mechanism in the projector to shift the whole intensity pattern.
The processing subsystem is preferably arranged to generate data representing the spatial frequency using the data relating to the speed of the object. The control signal sent by the processing subsystem preferably encodes data representing the spatial frequency of the intensity pattern in the illumination plane.
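By way of a minimal numerical sketch of this spatial-frequency variant (Python/NumPy; the function name, the fixed frame interval and the example values are illustrative assumptions, not taken from the specification):

```python
import numpy as np

def fringe_period_for_phase_step(speed, frame_interval, phase_step=2 * np.pi / 3):
    """Return the fringe period (same length unit as `speed`) such that an object
    moving at `speed` advances by exactly `phase_step` radians of the pattern
    between successive frames, with no lateral shifting by the projector."""
    displacement = speed * frame_interval           # object travel per frame
    return 2 * np.pi * displacement / phase_step    # period P with (2*pi*d/P) == phase_step

# Example: an object moving at 0.5 m/s in the illumination plane, imaged every 10 ms,
# travels 5 mm per frame; a 2*pi/3 step then calls for a 15 mm fringe period.
print(fringe_period_for_phase_step(0.5, 0.01))      # -> 0.015 (metres)
```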
The spatial-phase step is preferably constant over a plurality of pairs of successive time frames— e.g., for at least three, four, or more successive time frames. The spatial-phase step may be an integer fraction of 2 π. For example, it may be 2 π / 3, or 2 π / 4.
The time frames are preferably spaced at regular intervals— e.g., every 0.01 seconds, or at any interval ranging from 0.001 seconds to 1 second, or more or less than this. Shorter intervals may be preferred where the movement of the object may be erratic, since this minimises errors due to extrapolating the speed information, or where the object may change shape between time frames, such as may be the case for a swimming fish.
While the intensity pattern is periodic in the first direction within the illumination plane (which is typically a virtual plane), it will be appreciated that an actual projected pattern, on a surface of the object, will typically not be strictly periodic, but will have a varying frequency (or phase) based on the contours of the surface.
The illumination plane may intersect the object. It will be appreciated that, if part of the object is closer to the projector than the illumination plane, the structured light will not reach the illumination plane everywhere; in this case, references to the intensity pattern in the illumination plane should be understood as referring to the theoretical intensity if the object were not present. The illumination plane may be a plane in which, out of all possible planes that intersect the object, the structured light has an intensity pattern having a minimum spatial frequency. The first direction within the illumination plane may be a direction in which, out of all possible directions in the illumination plane, the intensity pattern in the illumination plane has a minimum spatial frequency.
The intensity pattern within the illumination plane may be periodic in all directions, or in a plurality of directions (e.g., a square grid, or concentric arcs), or there may be a direction in which the intensity pattern is not periodic (i.e., in which it has constant intensities). The intensity pattern may be periodic in one direction only. In a preferred set of embodiments the intensity pattern comprises a set of parallel stripes. The intensity of the stripes may vary as a sine wave. The stripes may be substantially vertical, or at an angle of between 0 and 25, 10 or 5 degrees to vertical; this has been found to be particularly appropriate when measuring objects that travel mainly horizontally, such as a swimming fish. The first direction is preferably substantially horizontal, or at an angle of between 0 and 25, 10 or 5 degrees to horizontal. It will be appreciated that the intensity pattern may not be periodic over its entire extent, but only where the illumination plane intersects an imaging region or imaging volume. Due to limitations in the optics and the transmission medium, it will also be appreciated that the intensity pattern may not be perfectly uniform or periodic even over an imaging region— e.g., the first spatial frequency may vary within certain bounds, such as +/- 10%, or the stripes may not be exactly parallel.
The projector may comprise a non-coherent light source. It may be a digital projector. It may comprise a spatial light modulator.
However, preferably the projector comprises an interference-based structured-light emitter. The projector preferably comprises a source of coherent light. The intensity pattern in the illumination plane and/or the actual projected pattern on the object may thus comprise an interference fringe pattern. The projector may comprise an interferometer, such as a Michelson interferometer.
The emitted structured light may contain a majority of its energy in the visible spectrum, or may contain a majority of its energy in the infrared or ultraviolet spectrum.
The projector preferably emits polarized light— preferably linearly polarized. The projector may comprise a light source that is inherently polarized, or it may comprise a polarizing filter. The camera preferably comprises a corresponding polarizer (e.g., a linear polarizer) to filter backscattered light from particles and molecules in a medium between the system and the object, such as water. The polarizer on the camera can also help to reduce interference due to specular reflection "highlights" on the object, which may otherwise be caused by the projector and/or by external light sources.
The camera may comprise a narrowband bandpass filter. This can be set to pass light from the projector, but to attenuate background light or sunlight scattered from particles in the medium, or reflected from the object or from other background objects.
The processing subsystem is preferably arranged to use the data relating to the speed of the object in the first direction when processing the image data or when processing the calculated spatial phase values. It preferably applies a lateral shift operation to the image data or to the spatial phase values, corresponding to one or more time frames, which shifts the data laterally by an amount that depends on the speed of the object in the first direction. In this way, the position of the object will be substantially the same within the image data, or within spatial phase values, from two or more successive time frames, even though the object moved relative to the projector in reality. The processing subsystem is preferably arranged to translate the image data or spatial phase values into a coordinate system in which the object is fixed. In this same coordinate system, the intensity pattern should step by the spatial-phase step between successive time frames. Determining three-dimensional spatial information about the object is therefore simplified by processing the images in this coordinate system. Of course, the processing subsystem may implement a shift or translation of the data in any appropriate way, which may depend on how the data is represented in a memory of the processing subsystem (e.g., as a data array, or a bitmap image file, etc.). The projector and camera are preferably arranged in a fixed relationship to each other. They preferably have a fixed separation distance (e.g., measured between respective lenses), and preferably have fixed relative orientations. They may be connected by one or more support members, such as a metal framework. The processing subsystem preferably uses a value representing a distance between the projector and camera when determining 3D information about the object. The processing subsystem preferably also uses information relating to a projection axis of the projector and/or an imaging axis of the camera when determining 3D information about the object.
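A minimal sketch of such a shift into an object-fixed coordinate system is given below (Python/NumPy with SciPy; the pixel-based speed unit, frame interval and interpolation settings are assumptions for illustration only):

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def to_object_frame(image, speed_px_per_s, frame_interval, frame_index):
    """Shift a captured frame laterally so that an object moving at
    `speed_px_per_s` (pixels/second, in the first direction) stays at a fixed
    position across the shifted frames. Frame 0 is used as the reference."""
    dx = -speed_px_per_s * frame_interval * frame_index   # undo the object's motion
    return nd_shift(image, shift=(0.0, dx), order=1, mode="nearest")

# Usage: align three successive frames before the phase calculation.
frames = [np.random.rand(480, 640) for _ in range(3)]     # stand-in images
aligned = [to_object_frame(f, speed_px_per_s=120.0, frame_interval=0.01, frame_index=i)
           for i, f in enumerate(frames)]
```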
Although, in preferred embodiments, the object is moving, it is also possible that the object is stationary and the projector (and preferably also the camera) is moving.
It will be appreciated that the speed of the object in the first direction may represent only a component of the object's velocity, which may also have a non-zero component in another direction. In general, however, the system is preferably arranged so that the object moves substantially in the first direction (i.e., so that the speed of the object in the first direction is larger than the speed of the object in any direction orthogonal to the first direction).
The data relating to a speed of the object, relative to the projector, in the first direction is preferably determined using the camera. Alternatively, the speed data may be received as an input to the processing subsystem from another source. The speed may be represented in angular units, such as radians/second, or as a linear speed in a reference plane, such as an imaging plane (which may or may not be a plane intersecting the object), such as metres/second. In some embodiments, the imaging plane is the illumination plane. The speed of the object may be determined with reference to the illumination plane.
Data relating to a speed of the object may be obtained only once during a
measurement process, or it may be obtained continuously or at intervals. In the latter case, the processing subsystem is preferably arranged to update the control of the projector continuously or at intervals based on updated speed information— e.g., to send a plurality of control signals to the projector over time.

In one set of embodiments, determining data relating to the speed of the object in the first direction comprises summing pixel values along a set of lines perpendicular to the first direction in an image from the camera, to generate a set of summed values. A leading edge of the object may then be determined from the summed values— e.g., based on where a value in the set crosses a threshold value, or on where the gradient of the summed values crosses a threshold gradient value. The respective positions of a leading edge may be determined in each of two or more images taken at different time frames. The speed of the object in the first direction may then be determined from these positions and the interval between the time frames. The system may be arranged to illuminate the object uniformly for determining the speed of the object— i.e., not with a static pattern. This may be done with a separate light source. However, in some embodiments, the projector is arranged to sweep a pattern or beam over the illumination plane or an imaging region, within an exposure time of the camera. The pattern or beam may be generated by a laser in the projector. This allows the system to provide illumination that is effectively uniform for each image capture in a way that does not require an additional light source, thereby saving cost and/or space.
If a distance between the camera and the object is known, or if the size of the object is known, it may be possible to calculate a true speed of the object in the first direction (relative to the camera or system). However, in general, the distance may not be known, and so the object's speed may be determined as an angular speed, or as a linear speed in a reference plane, such as in the image plane of the camera.

The spatial phase values are preferably calculated from two or more images taken in successive time frames. The number of these images, multiplied by the spatial-phase step, may equal 2 π. For example, if the phase step is 2 π / 3, three successive images may be used to calculate the spatial phase values. The spatial phase values may be calculated at a spatial resolution that is the same as a spatial resolution of the images, or at a coarser resolution. Known techniques may be used to do this.
The spatial phase values may be used to determine 3D spatial information about the object in any known way— e.g., by unwrapping the phase values and subtracting the unwrapped values from phase values calculated for a reference plane. The spatial information may include a length of the object, a height of the object, or any other information. These values may be determined as functions of distance to the object, which may or may not be known at this stage. The system may be arranged to adjust the images from the camera to reduce image distortion. This adjustment may be based on a suitable calibration process— e.g., using a pinhole calibration algorithm.
The system may be further arranged to estimate a mass or volume of the object.
The processing subsystem may be arranged to identify an outline of the object— e.g., in order to subtract background regions from the image data, which may contain unwanted noise such as reflections from other objects. This may comprise the processing subsystem applying a spatial gradient filter, or other edge-detection filter, to the calculated spatial phase values.
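One plausible realisation of such a gradient-based outline step is sketched below (Python/NumPy; the threshold value and the assumption that background regions show noisy, rapidly varying phase are illustrative, not mandated by the specification):

```python
import numpy as np

def object_mask_from_phase(phase, gradient_threshold=1.0):
    """Rough object mask obtained by thresholding the magnitude of the spatial
    gradient of a phase image: on the object the phase field is smooth, whereas
    background regions (no coherent fringe signal) give noisy, rapidly varying
    phase. The 2*pi wrap lines on the object would also exceed the threshold
    and would need separate handling in practice."""
    gy, gx = np.gradient(phase)
    gradient_magnitude = np.hypot(gx, gy)
    return gradient_magnitude < gradient_threshold   # True where the phase is smooth

phase_image = np.random.uniform(0.0, 2.0 * np.pi, (480, 640))   # stand-in phase image
mask = object_mask_from_phase(phase_image)
```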
A distance to the object (e.g., from a point on the camera or the projector, or another suitable reference point, to a point on the object, which may or may not be the closest point on the object) may be received by the processing subsystem, or it may be determined by the processing subsystem. The system may, for instance, be arranged to direct a single laser beam at the object, from a known angle, in addition to the structured light emission; this may enable the distance to be determined by triangulation. Alternatively, a ray or beam in the structured light emission may be encoded (e.g., using colour) so as to create an identifiable point or line in the images, thereby allowing detection of the stripe order. Alternatively the projector may output multiple structured light emissions having intensity patterns in the illumination plane that have different spatial frequencies.
In some embodiments, the system may comprise a second camera, which may be used to determine a distance to the object. The second camera is preferably spaced apart from the first camera. The processing subsystem may be arranged to process images from the first camera and from the second camera to determine a distance to the object. This distance may then be used to determine final 3D spatial information about the object— e.g., by scaling the spatial information derived from the calculated spatial phase values by an appropriate amount to account for distance to the object.
Although stereo vision systems are known in the art, including in combination with structured-light illumination, the applicant is not aware of any prior disclosure of using two cameras to determine a distance to an object, and using this distance in combination with unwrapped spatial phase information to determine three-dimensional spatial information about the object. Typically, known stereo vision systems would calculate a large number of distances to different points on an object, which would thereby directly yield the desired 3D spatial information. However, the applicant has recognised that there are situations in which additionally using unwrapped spatial phase information can produce better results. For example, this can allow the two cameras to be placed closer together than would normally be the case in order to provide a desired depth resolution, which may be desirable for avoiding interference due to specular reflection from a light source. It can also allow the use of lower-resolution cameras than might otherwise be required.
Thus, from a further aspect, the invention provides a system for determining three- dimensional spatial information about an object, the system comprising a projector, a first camera, a second camera, and a processing subsystem, wherein:
the projector is arranged to emit structured light towards the object, wherein the structured light has, within an illumination plane, an intensity pattern that is periodic; the first camera is arranged to capture a first plurality of images of the object from a first viewing position, in a respective plurality of time frames, and to send first image data representing the first plurality of images to the processing subsystem; the second camera is arranged to capture a second plurality of images of the object from a second viewing position, different from the first viewing position, in a respective plurality of time frames, and to send second image data representing the second plurality of images to the processing subsystem; and
the processing subsystem is arranged to:
use data from the first image data corresponding to at least two of the time frames to calculate a first set of spatial phase values or pattern amplitude values for a plurality of points on the object;
use data from the second image data corresponding to at least two of the time frames to calculate a second set of spatial phase values or pattern amplitude values for a plurality of points on the object;
process the first and second sets of spatial phase or pattern amplitude values to identify a common point on the object, and use the respective positions of the common point from the first and second viewing positions to determine a distance to the object;
use data from the first image data corresponding to at least two of the time frames to calculate a third set of spatial phase values for a plurality of points on the object;
perform a phase-unwrapping operation on the third set of spatial phase values to generate a set of unwrapped phase values; and
use (i) the set of unwrapped phase values, and (ii) said distance to the object, to determine three-dimensional spatial information about the object.
From another aspect, the invention provides a method of determining three- dimensional spatial information about an object, the method comprising:
emitting structured light towards the object, wherein the structured light has, within an illumination plane, an intensity pattern that is periodic;
capturing a first plurality of images of the object from a first viewing position, in a respective plurality of time frames;
capturing a second plurality of images of the object from a second viewing position, different from the first viewing position, in a respective plurality of time frames; using images from the first plurality of images corresponding to at least two of the time frames to calculate a first set of spatial phase values or pattern amplitude values for a plurality of points on the object;
using images from the second plurality of images corresponding to at least two of the time frames to calculate a second set of spatial phase values or pattern amplitude values for a plurality of points on the object; processing the first and second sets of spatial phase or pattern amplitude values to identify a common point on the object, and using the respective positions of the common point from the first and second viewing positions to determine a distance to the object;
using images from the first plurality of images corresponding to at least two of the time frames to calculate a third set of spatial phase values for a plurality of points on the object;
performing a phase-unwrapping operation on the third set of spatial phase values to generate a set of unwrapped phase values; and
using (i) the set of unwrapped phase values, and (ii) said distance to the object, to determine three-dimensional spatial information about the object.
Features of any earlier aspect or embodiments may be features of embodiments of these aspects also.
The applicant has found that using spatial phase or pattern amplitude values to identify a common point on the object can give better results than using raw image data, especially for objects that are relatively smooth and featureless, and/or which move, such as fish. Moreover, this approach is compatible with coherent light projection, which gives rise to random, subjective speckle patterns which prevent conventional image-based stereovision methods from working properly, because the patterns are not the same for the two cameras.
The third set of spatial phase values may be the same as, or may overlap with, the first set or second set, or it may be a separate set.
The first and second cameras may be synchronised. Their respective plurality of time frames may be the same time frames. The first and second sets may be calculated using the same frames for the first and second image data. This reduces errors due to the object changing its shape or position over time.
The common point on the object is preferably determined using a correlation operation between the first and second sets of spatial phase or pattern amplitude values.
Triangulation may be used to find a distance to the common point. Preferably a distance measurement to only one common point is used to determine the 3D spatial information— i.e., only one distance measurement. The distance measurement is preferably used to scale distance-dependent spatial information determined from the set of unwrapped phase values.
In any of the above embodiments, the structured light emission and/or the intensity pattern within the illumination plane may have a Gaussian intensity distribution.
However, in some situations it may be desirable for the intensity of the intensity pattern to be tailored to the shape and/or size of the object that is being measured.
The projector may comprise at least one light source, an optical element, and an interferometer. It may be arranged to direct a first Gaussian beam into the optical element along a first path and to direct a second Gaussian beam into the optical element along a second path, the second path being laterally and/or angularly offset from the first path. The optical element is preferably arranged to direct light from the first and second Gaussian beams into the interferometer. The optical element and the interferometer are preferably arranged to emit structured light from the first Gaussian beam, the structured light having a first intensity pattern, within the illumination plane, that is periodic, and to emit structured light from the second Gaussian beam, the structured light having a second intensity pattern, within the illumination plane, that is periodic. (It will be appreciated that the illumination plane is typically a virtual plane, so these intensity patterns are not actually formed on a planar surface; instead, the structured light would typically create a pattern on a surface of an object, which may be more or less periodic, depending on the shape of the object.) The first and second intensity patterns preferably have a common spatial frequency and a common spatial phase distribution within the illumination plane. The first intensity pattern preferably overlaps the second intensity pattern within the illumination plane. Preferably, however, a point of maximum intensity of the first intensity pattern, in the illumination plane, is laterally offset from a point of maximum intensity of the second intensity pattern in the illumination plane.
In this way, it can be possible to provide more uniform illumination, by having two overlapping patterns of light, each with Gaussian-distributed intensity, rather than just one, where the two patterns are spatially synchronised to provide one continuous illumination pattern. The intensity patterns are preferably fringe patterns.
The processing subsystem may be arranged to instruct the projector to adjust the offset of the first and second paths, in response to data from the camera. This may allow, for instance, one centre of intensity to be placed on an upper half of a fish, and the other centre of intensity to be placed on a lower half of the fish.
This projection system is believed to be novel in its own right.
Thus, from a further aspect, the invention provides a structured-light projection system for emitting structured light towards an object, the system comprising at least one light source, an optical element, and an interferometer, wherein:
the structured-light projection system is arranged to direct a first Gaussian beam into the optical element along a first path and to direct a second Gaussian beam into the optical element along a second path, the second path being laterally or angularly offset from the first path;
the optical element is arranged to direct light from the first and second
Gaussian beams into the interferometer;
the optical element and the interferometer are arranged to emit structured light from the first Gaussian beam, wherein the first structured light has, within an illumination plane, a first intensity pattern that is periodic, and to emit structured light from the second Gaussian beam, wherein the second structured light has, within said illumination plane, a second intensity pattern that is periodic;
the first and second intensity patterns have a common spatial frequency; the first and second intensity patterns have a common spatial phase distribution over at least a portion of the illumination plane;
a point of maximum intensity of the first intensity pattern in the illumination plane is laterally offset from a point of maximum intensity of the second pattern in the illumination plane; and
the first intensity pattern overlaps the second intensity pattern in the illumination plane.
From another aspect, the invention provides a method of emitting structured light towards an object, the method comprising: directing a first Gaussian beam into an optical element along a first path;
directing a second Gaussian beam into the optical element along a second path, the second path being laterally or angularly offset from the first path;
directing light from the first and second Gaussian beams into an interferometer; and
the interferometer emitting first structured light from the first Gaussian beam, wherein the first structured light has, within an illumination plane, a first intensity pattern that is periodic; and
the interferometer emitting second structured light from the second Gaussian beam, wherein the second structured light has, within said illumination plane, a second intensity pattern that is periodic,
wherein:
the first and second intensity patterns have a common spatial frequency in the illumination plane;
the first and second intensity patterns have a common spatial phase distribution over at least a portion of the illumination plane;
a point of maximum intensity of the first intensity pattern in the illumination plane is laterally offset from a point of maximum intensity of the second pattern in the illumination plane; and
the first intensity pattern overlaps the second intensity pattern in the
illumination plane.
This structured-light projection system may form part or all of the projector in any of the earlier-described aspects and embodiments. The illumination plane may be, or have any of the features of, an illumination plane as previously described. The illumination plane may intersect the object and/or an illumination region. The illumination region may contain part of all of the object; it may correspond to an imaging region as described above. The optical element may be a lens, or a beam expander, or a group of lenses, a holographic element, an elliptical mirror, a pinhole, or any other appropriate element.
The interferometer is preferably a Michelson interferometer. The first and second beams may have different polarizations. In this way the illuminated polarization can be differentiated spatially over the surface of an object located in the illumination region. This can provide a more homogeneous contrast in the stripe images, e.g., by reducing specular reflection from parts of the object that have a tendency to reflect too much light specularly. In some embodiments, the camera may also have a polarizer, which may further reduce noise in the image.
The structured-light projection system may be arranged to direct one or more further Gaussian beams into the optical element along further respective paths— e.g., three, four, or more beams.
Features of any aspect or embodiment described herein may, wherever appropriate, be applied to any other aspect or embodiment described herein. Where reference is made to different embodiments or sets of embodiments, it should be understood that these are not necessarily distinct but may overlap.
It will, of course, be appreciated that all measurements and calculations mentioned herein (such as projections of motion of the object based on a speed estimate) may, in practice, be subject to error from various sources: for example, due to limitations in measuring precision (e.g., camera resolution), or alignment (e.g., caused by vibrations or knocks), or calculation (e.g., rounding errors), or assumptions that turn out not to be entirely correct (e.g., assuming that the object does not accelerate or change shape). All terms in this specification should be read in this light. For instance, the feature of the spatial-phase step being independent of the speed of the object in the first direction, relative to the object, should be understood as covering situations where it may not be possible to track the motion of the object perfectly, but where the stepping is nevertheless sufficiently independent of the speed as to allow for spatial phase values to be calculated to an acceptable level of precision.

Certain preferred embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic representation of a system embodying the invention being used to measure a fish;
Figure 2 is a composite image of a fish illuminated by a fringe pattern, at three different spatial positions;
Figure 3 is a diagram showing vertical and horizontal displacement of an object;
Figure 4 is a phase image of a fish;
Figure 5a is a graph of intensity over distance for a fringe pattern with a constant background;
Figure 5b is a graph of intensity over distance for a fringe pattern with a changing background;
Figure 6 is a fringe-amplitude image of a fish;
Figure 7 is a phase-gradient image of a fish;
Figure 8 is a diagram showing how successive image frames are combined and processed;
Figure 9 is a schematic drawing of part of a Michelson interferometer used in an embodiment of the invention;
Figure 10 is a schematic drawing of a Michelson interferometer used in an embodiment of the invention;
Figure 11 is a schematic drawing showing parallel light beams passing through a lens of a Michelson interferometer;
Figure 12 is a schematic drawing showing two striped light beams against a fish;
Figure 13 is a schematic drawing showing angled light beams passing through a lens of a Michelson interferometer;
Figure 14 is a schematic diagram of an optical beam-combining arrangement used in an embodiment of the invention;
Figure 15 is a schematic diagram of four beams entering a lens;
Figure 16 is a schematic diagram showing four light beams against a fish;
Figure 17 is a schematic diagram of beam-splitting arrangement used in an embodiment of the invention; and
Figure 18 is a schematic diagram of a beam-shifting arrangement used in an embodiment of the invention.
Figure 1 shows a system 1 for estimating the size and weight of a fish 2. The system 1 could be installed in, or adjacent, a fish cage or tank to monitor the growth of fish in the cage— e.g., to determine when fish in the cage have grown to a sufficient size to be sold, or to be moved to a different cage. A commercial net-cage can be installed in the sea or in a lake to hold a large number of fish (e.g., 200,000 or more). Individual fish are measured as they swim past the measurement system 1. Statistical analysis may be used to estimate the distribution of weights of the fish in the whole cage from a set of individual measurements. The system 1 has an electronic control unit 3 which is connected to an optical projector 4, a left camera 5 and a right camera 6. The connections could be wired or wireless. The projector 4 and cameras 5, 6 are typically mounted underwater, in a fixed relationship to the earth, or to a fish net or fish cage. The components may be individually sealed, or may be contained in a common housing (not shown). The control unit 3 can also be mounted underwater.
The system 1 is of a type commonly known as "projected fringes" or "moire" or "structured light". The projector 4 contains one or more lasers and a Michelson interferometer, and is arranged to project optical interference fringes onto the surface of the fish 2 as the fish 2 swims past the projector 4. Alternatively, it could contain two optical fibres, terminated close to each other so as to act as two adjacent point sources, which is another way of generating an interference stripe pattern.
The projector 4, left camera 5 and right camera 6 are all aligned to illuminate and image, respectively, the fish 2, as it passes through an imaging volume or region. The fish 2 may be able to swim freely, or physical guides, such as glass walls, could be used to encourage it to pass the system 1 along a desired path. The dimensions of the imaging region will depend on the optics of the projector 4 and on the width and depth of field of the cameras 5, 6.
The projector 4 is arranged to project a pattern of uniform, vertical fringe lines towards the side of the fish 2, while the fish 2 is in the imaging region. The intensity of the fringes follows a sine-wave in the horizontal axis, with a period of between around 1 cm and 10 cm, although smaller or larger periods could be used. The pattern cannot be too fine, or else the resolution limit of the cameras 5, 6 may become problematic. Although the pattern of light is periodic in a virtual illumination plane intersecting the fish, it will be understood that the actual pattern on the surface of the fish will not be strictly periodic, as it is affected by the curvature and orientation of the fish.

The control unit 3 uses triangulation and other geometrical principles, as described below, to determine the surface shape, size and distance to the fish 2, based on the output of light from the projector 4 and the collection of light, from different directions, at the cameras 5, 6. The system 1 combines projected-stripe processing with stereo-vision and "digital image correlation" (DIC) processing, to estimate the size of the fish 2. The mass of the fish 2 can then be estimated from its size, if required, using knowledge about the mean density of fish of the appropriate type, or using more sophisticated modelling techniques.

Each camera 5, 6 captures images at a rate of up to 200 Hz. The cameras 5, 6 are aligned vertically and with parallel optical axes, but are separated from each other horizontally by approximately 10 cm. The cameras 5, 6 are aligned vertically with the projector 4, but separated from it horizontally by approximately 70 cm. The projector 4 is set at an angle of approximately 25 degrees to the axes of the cameras 5, 6. This arrangement enables the cameras 5, 6 to acquire images of the projected fringe pattern that contain information about the shape of the fish 2, while minimising the risk of imaging specular reflections from the shiny scales of the fish 2, which could otherwise hinder accurate image processing. Of course, any of these numbers could be larger or smaller in alternative embodiments.
In order to accommodate movement of the fish 2 past the projector 4 and cameras 5, 6, the control unit 3 calculates the speed and direction of the fish 2 and uses this information to adjust a coordinate system used by the system 1 so that it remains centred on the fish 2. This has the effect of making the fish 2 stationary within this coordinate system, which can simplify the control and processing algorithms.
Figure 2 shows three successive image frames, superimposed on each other, as the fish 2 swims from left to right through the imaging region. The left and right arrows above the image show how much the first and third image frames would need to be shifted by in order to be aligned exactly with the second, middle image frame. The control unit 3 performs such shifting of successive images from the cameras 5, 6, based on an estimate of the fish's speed across the imaging region, so that the coordinate system of the images tracks, and remains centred on, the fish 2. Figure 2 shows only horizontal panning (which is typically the most relevant for fish, as they primarily move horizontally), but the method also encompasses vertical panning. Figure 3 shows how vertical (y-axis) and horizontal (x-axis) speed estimates can be used to shift the acquired images in sync with the overall motion of the fish 2.

A horizontal speed estimate is obtained as follows. Before a fish 2 has been identified, the projector 4, or another lamp, is set to provide uniform illumination over the imaging region. This may alternatively be simulated by shifting the fringe pattern laterally within one imaging period, so that light is cast over the whole imaging region during the one image exposure time. This horizontal shifting may be continuous or discrete. The control unit 3 processes the image frames from one of the cameras 5, 6 to detect when an object, such as the fish 2, enters the field of view of the camera 5, 6. When this happens, the control unit 3 first calculates an estimate of a horizontal velocity component of the object. At this stage, a vertical velocity component is not obtained, because the vertical component is used only in a post-processing stage, whereas the horizontal speed estimate is used to adjust the projector 4, so is needed earlier. The control unit 3 therefore makes a quick determination of the fish's horizontal speed even before the fish 2 has fully entered the field of view of the camera 5, 6.
The control unit 3 does this by comparing two or more images (with uniform
illumination of the scene) from the camera 5, 6, taken at known time intervals. While the comparison could be done by a correlation calculation between pairs of images, such a correlation operation can be time-consuming and require a lot of processing.
Instead, the control unit 3 performs a column-wise summation over each of two images, having a known time difference. This gives two one-dimensional sets of summation values, one for each image. Assuming that the fish 2 is alone in the imaged area, or assuming other fish in the imaging region have a significant difference in distance from the camera 5, 6 compared with the measured fish 2, the set of summation values will step from very low values in front of the fish 2 (where there is just dark, background water) to much higher values where light is reflected off the head of the fish 2. By using an appropriate threshold, a position of the leading edge of the fish can be determined from each of the summation sets. From these positions, and knowing the time intervals between the images, the fish's horizontal speed within the object plane of the camera 5, 6 can be measured. Note that these speed estimates do not necessarily reflect the actual speed of the fish
2 through the water, but relate to how fast the image of the fish 2 moves across the image plane of the camera 5, 6. The camera 5, 6 may have a large depth of field, such that a slow-moving fish close to the camera 5, 6 could yield the same horizontal speed estimate as a fast-moving fish further away from the camera 5, 6. To work out the true speed of the fish 2 would require a knowledge of the distance between the camera 5, 6 and the fish 2.
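A minimal sketch of this column-summation speed estimate is given below (Python/NumPy; the threshold, the assumption that the fish moves towards increasing column indices, and the pixel-to-metre scale are illustrative, not taken from the specification):

```python
import numpy as np

def leading_edge_column(image, threshold):
    """Column index of the leading edge, taken here as the highest-index column
    whose column-wise pixel sum exceeds `threshold` (assuming motion towards
    increasing column indices)."""
    column_sums = image.sum(axis=0)            # sum along lines perpendicular to the motion
    above = np.nonzero(column_sums > threshold)[0]
    return int(above[-1]) if above.size else None

def horizontal_speed(image_a, image_b, dt, threshold, metres_per_pixel):
    """Speed of the leading edge in the object plane of the camera, estimated
    from two uniformly illuminated frames captured `dt` seconds apart."""
    xa = leading_edge_column(image_a, threshold)
    xb = leading_edge_column(image_b, threshold)
    if xa is None or xb is None:
        return None                            # object not (yet) visible in one of the frames
    return (xb - xa) * metres_per_pixel / dt
```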
The control unit 3 uses the horizontal speed estimate or estimates to control the fringes projected by the projector 4.
Although various structured light techniques can be used, the present example uses an approach of illuminating the fish 2 with a uniform light pattern of vertical stripes. The stripes are stepped horizontally by 2 π / 3 between successive image frames. A succession of three image frames is then processed mathematically to determine shape information. However, because the fish 2 is typically in motion, the control unit
3 controls the projector 4 so that this 2 π / 3 phase stepping occurs relative to an illuminated surface of the fish 2. This can be done in one of two ways. The control unit 3 can adjust the lateral displacement of the stripes between successive images by more or less than 2 π / 3, based on the horizontal speed of the fish 2, so that a step of 2 π / 3 occurs in a coordinate system that is centred on (i.e., that tracks) the moving fish 2. Alternatively the control unit 3 can change the spatial frequency of the projected stripes, based on the speed of the fish 2, so that the fish 2 moves horizontally by a distance that corresponds to a 2 π / 3 shift over the surface of the fish 2; in this way, an effective 2 π / 3 shift can be realised without any lateral shifting by the projector 4. It is also possible that these two approaches can be combined, so that the projector 4 performs some lateral shifting, while the control unit 3 also adjusts the spatial frequency of the fringe pattern.
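As a minimal numerical sketch of the first of these two options, the per-frame lateral displacement of the stripes can be computed as the sum of the wanted phase step (converted to a distance) and the fish's expected travel between frames (Python; the variable names, sign convention and example values are illustrative assumptions):

```python
import numpy as np

def fringe_shift_per_frame(fringe_period, object_speed, frame_interval,
                           phase_step=2 * np.pi / 3):
    """Lateral displacement (in the illumination plane, same length unit as
    `fringe_period`) that the projector should apply between frames so that,
    in a coordinate system moving with the object, the pattern steps by
    exactly `phase_step`."""
    step_distance = phase_step / (2 * np.pi) * fringe_period   # the wanted step
    motion_compensation = object_speed * frame_interval        # the object's travel
    # Sign assumption: the stripes are stepped in the same direction as the
    # object moves; otherwise the motion term would be subtracted.
    return step_distance + motion_compensation

# Example: 30 mm fringe period, object moving 0.5 m/s, frames every 10 ms:
# the stripes must move 10 mm (the 2*pi/3 step) plus 5 mm (motion) = 15 mm.
print(fringe_shift_per_frame(0.030, 0.5, 0.01))   # -> 0.015 (metres)
```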
Of course, other phase shift values than 2 π / 3, relative to the fish 2, are also possible, such as 2 π / n for any desired integer, or real-valued number, n.

In a post-processing stage, both horizontal and vertical motion estimates are used to adjust the images acquired by the cameras 5, 6, to compensate for the motion of the fish 2. These horizontal and vertical displacements can be calculated using a slower correlation approach, over a set of images, because the results are not required as fast as was the case for the initial horizontal speed estimate.
The present approach assumes that the fish 2 does not change shape significantly between successive frames. If the fish 2 does make a large movement, the measurement process may fail. However, a certain failure rate is acceptable.
Let I1, I2 and I3 be three images from one of the cameras 5, 6, captured using three successive 2 π / 3 phase shifts of the striped illumination pattern relative to the fish 2. The images I2 and I3 have been adjusted to remove horizontal and vertical displacement of the fish 2 over the image set, as described above.
These images satisfy the equations:

$$I_1 = I_0 + I_M \cos(\theta)$$
$$I_2 = I_0 + I_M \cos(\theta + 2\pi/3)$$
$$I_3 = I_0 + I_M \cos(\theta + 4\pi/3)$$

where I0 represents the mean level (DC level) of the illumination and IM represents the intensity due to the modulated illumination. I0, IM and θ are all functions of position, (x, y), in the image plane.
The control unit 3 then evaluates the arc-tangent function
$$\alpha(x, y) = \arctan\!\left(\frac{\sqrt{3}\,(I_3 - I_2)}{2 I_1 - I_2 - I_3}\right)$$
to calculate the phase, α(x, y), of the stripe pattern at each point, yielding a two-dimensional phase image.
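A minimal NumPy sketch of this three-step evaluation, assuming I1, I2 and I3 are floating-point arrays already registered to the fish-fixed coordinate system; arctan2 is used so that the full zero to 2 π range is recovered.

```python
import numpy as np

def wrapped_phase(i1, i2, i3):
    """Wrapped spatial phase alpha(x, y) from three images with effective
    phase steps of 0, 2*pi/3 and 4*pi/3 relative to the fish."""
    alpha = np.arctan2(np.sqrt(3.0) * (i3 - i2), 2.0 * i1 - i2 - i3)
    return np.mod(alpha, 2.0 * np.pi)   # map (-pi, pi] to [0, 2*pi)
```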
Figure 4 shows a resulting phase image of the fish 2. The grayscale pixel values, ranging from white to black, represent spatial phase values ranging from zero to 2 π. The phase wraps every 2 π, resulting in abrupt jumps from white to black between adjacent stripes.
The control unit 3 unwraps the phase to remove the 2 π jumps, using standard techniques. The shape of the fish 2 can then be determined by performing a standard phase-to-height conversion, given the known positions of the projector 4 and the cameras 5, 6.
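A sketch of the unwrapping and phase-to-height step, assuming a simple row-wise unwrap and a single calibrated metres-per-radian factor; a real conversion would depend on the full projector/camera triangulation geometry determined during calibration.

```python
import numpy as np

def height_map(wrapped, fish_mask, metres_per_radian):
    """Unwrap the phase along each row (removing the 2*pi jumps) and apply a
    linear phase-to-height conversion.  metres_per_radian is a placeholder
    calibration factor; pixels outside the fish mask are set to NaN."""
    unwrapped = np.unwrap(wrapped, axis=1)
    unwrapped = np.where(fish_mask, unwrapped, np.nan)
    reference = np.nanmin(unwrapped)          # arbitrary zero-height reference
    return (unwrapped - reference) * metres_per_radian
```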
Although the shape of the fish 2 is now known, to determine the absolute size of the fish 2 also requires the distance between the system 1 and the fish 2 to be
determined, as well as the camera images to be corrected for any distortion (which can be determined during a calibration process).
For this, the left camera 5 and right camera 6 are used as a stereo pair, in combination with the projected stripes. Stereo image analysis traditionally relies on a correlation operation between captured left and right images to identify one or more common points between the two images. A distance to that point can then be calculated, based on the known geometry of the cameras 5, 6. Here, however, the control unit 3 operates not on the direct images from the camera 5, 6 but on processed numerical data arrays, derived from the images.
Data arrays from the two cameras 5, 6 are generated from images, or image series, recorded simultaneously in time from the left camera 5 and the right camera 6.
Identical algorithms are used to process the left and right images. The laser projector 4 creates random, subjective speckle patterns in the images obtained from the two cameras 5, 6, which can make conventional image correlation problematic. Instead, the control unit 3 uses numeric data arrays generated from the two cameras 5, 6, where the data values from the two cameras are identical or virtually identical point by point on the fish 2. These values form the basis for a detailed and accurate correlation between the two image sets from the two cameras 5, 6. A correlation measurement then gives the relative picture distance between the fish 2 in the two camera images, from which information about the distance between the cameras 5, 6 and the fish 2 can be calculated. The processed data array for each camera 5, 6 may, for example, be the phase images, as shown in Figure 4. Alternatively, the processed data arrays may represent the fringe amplitude of the stripe patterns, or other values calculated using a set of camera images as input.
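A hedged sketch of how such a correlation over processed data arrays might look for a single point, assuming rectified stereo images (so that matches lie on the same row) and assumed calibration values for focal length and baseline.

```python
import numpy as np

def distance_to_point(left_phase, right_phase, row, col,
                      focal_px, baseline_m, max_disparity=200, window=21):
    """Slide a small window of the left-camera phase image along the same row
    of the right-camera phase image, take the disparity with the smallest
    sum-of-squared-differences, and convert it to distance with the standard
    stereo relation z = f * B / d."""
    half = window // 2
    template = left_phase[row, col - half:col + half + 1]
    best_d, best_err = None, np.inf
    for d in range(1, max_disparity):
        start = col - d - half
        if start < 0:
            break
        err = np.sum((template - right_phase[row, start:start + window]) ** 2)
        if err < best_err:
            best_err, best_d = err, d
    if best_d is None:
        return None
    return focal_px * baseline_m / best_d
```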
Figures 5a and 5b show what is meant by the fringe amplitude (or fringe intensity), both where the background intensity, I0, is constant (Figure 5a) and where it is changing (Figure 5b). How to calculate the fringe amplitude is described in detail below. The control unit 3 can then use a distance measurement to a single point on the fish 2 in order to calibrate the size of the fish 2, based on the shape information determined from the phase analysis described above, derived from one of the cameras 5, 6.
In another set of embodiments, correlation over the data arrays is used to determine the distances to a plurality of points on the fish 2. This information can potentially be used alone to find the shape or size or volume of the fish 2 (i.e., without deriving shape information from the phase data), or it can be used to enhance the accuracy of the phase-based shape analysis. In order to determine whether a fish is present in the scene, and to filter out some of the background, the control unit 3 looks for the outline of a fish based on information from one or both of the cameras 5, 6, relating to the projected stripes. This can be done in various ways, but is preferably based on processed data arrays rather than on raw images from the cameras.
One algorithm is based on phase shift, preferably with a dynamic coordinate system as outlined previously, but in which, instead of calculating the phase of the projected stripes, the control unit 3 calculates the modulation or intensity of the projected stripes. Interfering backlight and particle dispersion are filtered out from this processed data image. This algorithm runs simultaneously for the two cameras 5, 6. The resulting image or data array is well suited as input to a pattern recognition algorithm, and is expected to perform significantly better than a normal picture of the fish 2 for performing outline detection. The intensity can be calculated using, for example, the three coordinate-shifted images, I1, I2, I3, described above, and the algorithm
$$I_M = \frac{1}{3}\sqrt{3\,(I_3 - I_2)^2 + (2 I_1 - I_2 - I_3)^2}$$
This results in a kind of image of the fish 2, based on the amplitude of the stripes.
Figure 6 shows an example of such a fringe-amplitude image.
The control unit 3 uses a threshold value to filter the fringe-amplitude image, to yield a mask of the fish. Amplitudes over the threshold are taken to correspond to a surface of a fish, while amplitudes below the selected threshold mean no fish.
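A sketch of the amplitude computation and thresholding, consistent with the three-step relations above; the threshold value is an assumption to be tuned to the actual illumination and camera settings.

```python
import numpy as np

def fringe_amplitude(i1, i2, i3):
    """Fringe (modulation) amplitude I_M from the three phase-shifted images."""
    return np.sqrt(3.0 * (i3 - i2) ** 2 + (2.0 * i1 - i2 - i3) ** 2) / 3.0

def fish_mask(i1, i2, i3, threshold):
    """Boolean mask that is True where the fringe amplitude exceeds the
    threshold, taken to correspond to an illuminated fish surface."""
    return fringe_amplitude(i1, i2, i3) > threshold
```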
So far, the processing has been described on the assumption that only one fish 2 is present in the field of view. In practice, two or more fish may be visible to the cameras 5, 6, and so it is desirable to be able to cope with this situation by identifying the outline of the single fish 2 in the images, so that other fish can be disregarded from the analysis. It is also important that an individual fish is identified with its entire outline so that the whole fish is measured, not only parts of the fish.
The amplitude masking described above may help to remove contributions from fish that are much further away from the illumination source than the closest fish 2.
However, this approach alone will not resolve all cases of overlap, especially where two fish are close to each other.
Separating two overlapping fish can be done in various ways.
In one set of embodiments, the control unit 3 first calculates the spatial gradient of the phase image. If vertical fringes are projected, the spatial gradients can be calculated in the horizontal direction only, by evaluating: phase(x, y) - phase(x - dx, y). However, other embodiments may calculate the spatial gradient in both horizontal and vertical directions, e.g. by evaluating:
$$\sqrt{\big(\mathrm{phase}(x, y) - \mathrm{phase}(x - dx, y)\big)^2 + \big(\mathrm{phase}(x, y) - \mathrm{phase}(x, y - dy)\big)^2}$$
Figure 7 shows a resulting spatial gradient image, where gradients have been calculated in a horizontal direction.
Where a surface is continuous, as viewed from one of the cameras 5, 6, the stripes will be unbroken and continuous. Therefore, any jumps or discrepancies in the spatial gradient of phase, within the image, represent a shift from one surface to another. The smooth shape of fish means that, in general, the camera 5, 6 will see the whole side of the fish as a continuous surface. There may be jumps and discontinuities in the phase gradients, for example near the base of the fins of the fish, but this can be taken into consideration in the analysis, since it is expected and known. Other than this, gradient jumps will typically occur only along the boundary where two fish overlap, from the camera's point of view.
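A sketch of how abrupt changes in the horizontal phase gradient could be flagged, assuming unwrapped phase and an illustrative jump threshold; flagged pixels are candidates for a boundary between two overlapping fish.

```python
import numpy as np

def gradient_jump_candidates(unwrapped_phase, fish_mask, jump_threshold=0.5):
    """Flag pixels where the horizontal phase gradient changes abruptly.
    jump_threshold is in radians per pixel step and is an illustrative value."""
    grad_x = np.diff(unwrapped_phase, axis=1)            # phase(x) - phase(x - dx)
    jumps = np.abs(np.diff(grad_x, axis=1)) > jump_threshold
    jumps = np.pad(jumps, ((0, 0), (1, 1)), constant_values=False)  # restore width
    return jumps & fish_mask
```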
It is also possible to separate fish by analysing multiple images over time. The phase gradients will typically remain constant on a continuous fish surface while the gradients between different overlapping fish are likely to change over time, since different fish generally move differently in the images.
In some embodiments, the control unit 3 may alternatively or additionally use stereo information from the two cameras 5, 6 to separate fish that are at different distances, based on the fact that objects at different depths will have different offsets in the respective image planes. This information may be used on raw images or on processed data array values, as described above. The control unit 3 may calculate distances using the stereo vision algorithm described above, based on the processed data arrays, and look for sudden deviations in distance along projected lines in the scene, or from one part of an area in which fringe intensity is above a threshold level to other parts of the area, which may indicate a shift from one fish to another fish that is closer or further away. The control unit 3 may use results from some or all of the above analyses in an image analysis algorithm to identify the outline of a single fish 2 in the scene.
For measuring the shape and volume of the fish 2, and for identification of individual fish, many consecutive single measurements are obtained using the phase shift and dynamic coordinate system described above. The criteria both for identifying individuals and for measuring the shape, size and volume of the fish 2 may change during the time it takes the fish 2 to swim past the system 1. For example, the fish 2 could curl or change direction. By monitoring the fish 2 with the phase-shift algorithm, and the use of dynamic coordinates, over a few dozen individual measurements, or more than one hundred individual measurements, the control unit 3 can obtain complementary information and the possibility of averaging the data in order to obtain a more accurate volume estimate for the fish 2. This provides increased sensitivity and more accurate measurements.
The control unit 3 may make measurements of cross sections across the body of the fish 2. Such measurements can also be made on parts of the fish 2 even before the whole fish 2 is completely inside the image fields of the two cameras 5, 6, and after parts of the body of the fish 2 are out of the picture.
The control unit 3 may sequentially record many images. By letting one or both cameras 5, 6 run continuously, the phase of the stripes can be shifted between adjacent images without stopping the camera. In this way it is possible for the system 1 to obtain, for example, 100 or more phase-shifted stripe frames per second, from each of the cameras 5, 6. A typical sequence will involve the stripes being phase-shifted by 2 π / 3 between shots, in the following phase sequence:
Phase = [0, 2π/3, 4π/3, 0, 2π/3, 4π/3, ...]. For each image, the control unit 3 can then calculate a new phase value, αN, based on the last incoming image plus the two preceding images, as indicated in Figure 8. Phase here means the effective phase shift, referring to the origin of the dynamic coordinate system.

Backscattering of light from the water could potentially interfere with reliable image processing. Backscattering is reduced by using a linearly polarized laser in the projector 4, or by including a polarizer within the projector 4. The light reflected from small particles in the water will then also be polarized, whereas the light reflected from the surface of the fish (i.e., the stripe pattern) will generally be depolarized. Therefore, polarizers can be placed in front of the cameras 5, 6, oriented at 90 degrees to the polarization direction of the projector 4. In this way most of the backscattered light is removed, while light from the fish 2 is attenuated by only around 50 percent. This reduction is acceptable because the contrast of the stripe pattern on the fish 2 is increased significantly.
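Returning to the continuously running camera, a sketch of how the rolling three-frame phase computation might be organised is given below, assuming the effective (fish-relative) phase steps repeat as 0, 2 π / 3, 4 π / 3; the re-ordering simply keeps the three-step formula above fed with images in step order.

```python
import math
from collections import deque

import numpy as np

def rolling_phase_images(frames):
    """Yield a new wrapped-phase image for each incoming frame, computed from
    that frame and the two preceding ones."""
    buffer = deque(maxlen=3)
    for index, frame in enumerate(frames):
        buffer.append(np.asarray(frame, dtype=float))
        if len(buffer) < 3:
            continue                    # need three frames before the first result
        oldest_step = (index - 2) % 3   # step index of the oldest buffered frame
        ordered = list(buffer)
        if oldest_step:                 # rotate so ordered = [step 0, step 1, step 2]
            ordered = ordered[-oldest_step:] + ordered[:-oldest_step]
        i1, i2, i3 = ordered
        yield np.mod(np.arctan2(math.sqrt(3.0) * (i3 - i2),
                                2.0 * i1 - i2 - i3), 2.0 * math.pi)
```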
The projector 4 also includes a unique construction of a Michelson interferometer, for precise phase shifting of the projected stripes. To displace the stripes sideways, one mirror is translated in a linear movement without inclination. The accuracy must be within a fraction of a wavelength of light. A special mechanical suspension is used to achieve this.
Figure 9 shows this mechanical suspension. A piezo pusher 90 presses against a proximal end of a shaft 91. The distal end of the shaft 91 is fastened to an end plate 92, to which a mirror 93 is attached. Two circular leaf springs 94, 95, located near the proximal and distal ends of the shaft 91, respectively, provide suspension for the shaft 91. The springs 94, 95 ensure a pure translation of the shaft 91, and thus the mirror 93, without any rotation.
When projecting the stripes onto the fish 2 it can be a challenge to obtain an acceptable fringe contrast across the whole body of the fish 2. Fish are typically more reflective on the lower half of their body, compared with the upper half, when viewed from the side. The reflectiveness is also influenced by the fish's body shape and the angle from which it is viewed. All of these factors make it challenging to obtain adequate stripe contrast over the fish 2. One approach is simply to increase the intensity of the light source, but this can lead to saturation near the middle of the fish 2, where reflection is typically highest. Because of this, the system 1 uses a lighting system that enables a differentiated intensity distribution over the fish 2, so that more light is sent to the less reflective areas of the fish 2, and less light to the more reflective areas.

In general, when an object is illuminated by a laser, diode, or other source, the beam will typically have a Gaussian intensity distribution. This is true of the Michelson interferometer used within the projector 4. In order to differentiate the illuminated intensity over the fish 2, the projector 4 illuminates the scene with multiple, partially-overlapping beams of light, where each beam can be directed towards the fish 2 from the same, or approximately the same, area, but with a somewhat different angle towards the fish 2. Additionally, the projector 4 can be controlled to regulate the intensity relationship between the light beams. Each of these beams has its own stripe pattern, but all the patterns follow the same function in space, so that the stripes are continuous with each other in areas where the beams overlap. This is important in order to be able to unwrap the stripes and calculate the surface and volume of the fish 2 based on the stripes.

Figure 10 shows a Michelson interferometer 100 and a laser 101 that are used within the projector 4 to generate a stripe pattern in the beam. The laser 101 generates a light beam which passes through a convex input lens 102. This causes the beam to spread as it enters a beam splitter 103. A first mirror 104 creates a first virtual point source 105, while a second mirror 106 creates a second virtual point source 107. The diverging output beam 108 has a Gaussian-distributed intensity over its cross section. The two virtual point sources 105, 107 create a striped interference fringe pattern in the output beam 108. These two point sources 105, 107 would appear to be closely spaced apart, next to each other, if an observer were to look towards the interferometer 100 from its output. In this arrangement, a single light cone is generated, containing stripes.
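For reference (this relation is not stated in the embodiment, but follows from the standard two-source interference geometry implied by the virtual point sources), the fringe period in a plane at distance z from two mutually coherent point sources separated by a small distance d, at laser wavelength λ, is approximately

$$\Lambda \approx \frac{\lambda z}{d}, \qquad f_{\text{spatial}} = \frac{1}{\Lambda} \approx \frac{d}{\lambda z},$$

so the spatial frequency of the projected stripes can, in principle, be tuned by adjusting the separation of the two virtual point sources 105, 107.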
In another set of embodiments, however, the projector 4 includes additional components so that multiple light beams pass through a Michelson interferometer. This interferometer set-up is similar to the interferometer 100 in Figure 10, but may have multiple lasers. In order that several light beams, each having a respective pair of virtual point sources, can illuminate the fish 2 with a single, uniform pattern of stripes, it is necessary for the distance and angle between the virtual points of each pair of virtual point sources to be the same. Additionally, the point-source couples for each light beam must be close to each other in space, in order that the stripe patterns between different light beams are not notably laterally skewed in relation to each other. Ideally the point couples for all the light beams have the exact same position in space.
Figure 11 shows two parallel beams 110, 111 going through an input lens 112 of a Michelson interferometer contained within the projector 4. The beams 110, 111 are laterally offset from each other, but overlapping, heading towards the lens 112. They leave the lens 112 and pass through a mutual focal point 113 in different directions. Both beams 110, 111 maintain their Gaussian profiles. The beams 110, 111 each generate their own independent stripe pattern when they have passed through the beam splitter and mirrors of the Michelson interferometer, and travel towards the fish 2. These two stripe patterns will be spatially identical or overlapping if the beams come from the same point initially, and thereby from identical virtual points, as shown in Figure 10. The primary difference between the two striped beams is that they illuminate the fish 2 with maximal intensity in different areas. The beams may also have different sizes and spreads, which depends on the shape and diameter of the beams entering the input lens 112. The laser beams must have roughly the same wavelength, but can be polarized at 90 degrees to each other.
Figure 12 shows a first beam of stripes 120 and a second beam of stripes 121 travelling towards the fish 2. The beams are indicated here by respective ovals, which represent regions where the output light exceeds a threshold intensity in an object plane containing the fish 2. Here, the two beams 120, 121 are offset vertically, so that the first beam 120 has its maximum intensity on the back (top half) of the fish 2. Instead of using two parallel input beams 110, 111, as shown in Figure 11, the beams may enter the lens 112 at different angles. Figure 13 shows the effect of angling the first beam 110 relative to the second beam 111 by a few degrees, so that they are not parallel when they enter the input lens 112. In this case the first beam 110 is focused to a focal point 130 that is shifted relative to the focal point 131 of the second beam 111. Each beam 110, 111 creates a respective pair of virtual point sources in the interferometer. The two pairs are offset from each other, but the distance and angle between the virtual point sources in each pair is the same for both pairs. In this case, the beams will have a different angle as they leave the interferometer in the projector 4, and the point couples that the two light beams generate after passing the interferometer will have somewhat different positions in space. The distance between the points in each point couple will still be the same, and the stripe pattern will be approximately the same for the two beams, so that they will still generate the same overlap as before, as shown in Figure 12. Whether parallel or angled, the two light beams 110, 111 may be polarized at 90 degrees to each other, so that the reflection properties of the two beams are differentiated. This allows for greater homogeneity in the stripe contrast over the fish 2, because the reflection from the fish can be adapted to the different viewing angles of the cameras 5, 6. This is because the surface of the fish 2, with varying surface angles and varying reflection properties across the fish 2, will reflect light of different polarisation directions in different ways. If some parts of the fish 2 have a tendency to reflect too much light specularly, this can be reduced by adjusting the polarization direction of the illumination of these parts of the fish 2. The cameras 5, 6 may have corresponding polarizing filters.
Figure 14 shows how two light beams 110, 111, generated from a first laser 140 and a second laser 141, respectively, can be combined before entering the input lens 112 of the Michelson interferometer, within the projector 4. The beams 110, 111 are polarized in respective directions, which allows a mirror 142 and a polarizing beam-splitter 143 to be used to combine the two beams. The beams 110, 111 can then be totally superposed, which gives totally overlapping light beams, or they can be laterally offset, as shown in Figure 11.
The same approach can be used with more than two lasers, generating more than two offset beams.
Figure 15 shows four offset beams, of similar wavelength, entering the input lens 112. It is, however, not easy to combine more than two laser beams by the same principle as shown in Figure 14 without losing energy from one or more lasers. This is because there are only two independent polarization directions. But it may still be beneficial to combine more than two laser beams in order to get a good distribution of intensity over the fish 2. This may be achieved by combining three or more beams directed towards the lens 112 with differing angles, even though the focal point and the placement of the point-couples after passing the interferometer will be somewhat different for the different beams. Figure 16 shows how four light beams could be arranged to illuminate the fish 2 more uniformly with a striped pattern. Note that each of the light beams generates a synchronised stripe pattern, so that there are continuous stripes over an entire object plane containing the fish 2. The only difference from a single-laser-source embodiment is that there is differentiated polarization and illumination on the fish 2. The principles described above can also be used when there is just one source laser.
Figure 17 shows part of the projector 4 in an alternative embodiment in which a double-refracting prism 170 is placed in front of a laser 171 in order to generate two parallel, offset beams 172, 173. These are directed towards the input lens 112. The double-refracting prism 170 can be formed from a birefringent crystal such as calcite. The intensity relationship between the two beams can be adjusted by rotating the prism 170 or the laser 171. If it proves difficult to obtain a prism 170 that offsets the beams by a desired distance, this can be corrected by adjusting the diameter of the beam before it enters the prism 170 and choosing a lens 112 with an appropriate focal length. The two beams 172, 173 going out from the prism 170 are polarized at 90 degrees to each other and so do not cause any interference issues between the light beams.
Figure 18 shows part of the projector 4 in an alternative embodiment in which a vibrating or stepping mirror 180, or another device that can laterally offset a beam dynamically, is placed in front of a laser 181 in order to generate two parallel, offset beams 182, 183, which are directed towards the input lens 112. The mirror 180 shifts the beam between two parallel paths. If the lateral shifting happens fast enough so as to cycle one or more times during a single exposure period of the camera 5, 6, this will create a time-averaging effect equivalent to the beams 182, 183 being on the fish 2 simultaneously. The approach of Figure 18 may be desirable where differentiated illumination is wanted over the fish 2, but not differentiated polarization.
It will be appreciated that many variations to the arrangements described above are possible. In particular, the system 1 may be used to measure objects other than fish 2, which may be stationary or moving. Beam patterns other than vertical stripes may be used.

Claims

1. A system for determining three-dimensional spatial information about an object, the system comprising a projector, a camera, and a processing subsystem, wherein: the projector is arranged to emit structured light towards the object, wherein the structured light has, within an illumination plane, an intensity pattern that is periodic, having a first spatial frequency in a first direction;
the camera is arranged to capture a plurality of images of the object in a respective plurality of time frames, and to send image data representing the plurality of images to the processing subsystem; and
the processing subsystem is arranged to:
receive or determine data relating to a speed of the object, relative to the projector, in said first direction;
send a control signal to the projector to cause the projector to control the emitted structured light in dependence on the speed of the object in the first direction such that said intensity pattern is shifted, relative to the object, between successive time frames of the plurality of time frames, by a spatial-phase step that is independent of the speed of the object in the first direction; use image data corresponding to at least two of the time frames to calculate spatial phase values for a plurality of points on the object; and
use the calculated spatial phase values to determine three-dimensional spatial information about the object.
2. A system as claimed in claim 1, wherein the processing subsystem is arranged to control the emitted structured light in dependence on the speed of the object in the first direction by causing the projector to shift the intensity pattern in the first direction, between successive time frames of the plurality of time frames, by an amount that depends on the speed of the object in the first direction.
3. A system as claimed in claim 2, wherein the processing subsystem is arranged to calculate an amount of said shift using the data relating to the speed of the object.
4. A system as claimed in any preceding claim, wherein the processing subsystem is arranged to control the emitted structured light in dependence on the speed of the object in the first direction by controlling the spatial frequency of the intensity pattern in dependence on the speed of the object in the first direction.
5. A system as claimed in claim 4, wherein the control signal sent by the processing subsystem encodes data representing the spatial frequency of the intensity pattern in the illumination plane.
6. A system as claimed in any preceding claim, wherein said spatial-phase step is constant over a plurality of pairs of successive time frames.
7. A system as claimed in any preceding claim, wherein the intensity pattern comprises a set of parallel stripes.
8. A system as claimed in any preceding claim, wherein the intensity pattern comprises a set of vertical stripes and wherein the first direction is horizontal.
9. A system as claimed in any preceding claim, wherein the projector comprises an interferometer.
10. A system as claimed in any preceding claim, wherein the projector is arranged to emit polarized light and wherein the camera comprises a polarizer.
11. A system as claimed in any preceding claim, wherein the processing subsystem is arranged to apply a lateral shift operation to the image data or to the spatial phase values corresponding to one or more time frames, wherein the lateral shift operation shifts the image data laterally by an amount that depends on the speed of the object in the first direction.
12. A system as claimed in any preceding claim, wherein the processing subsystem is arranged to translate the image data or the spatial phase values into a coordinate system in which the object is fixed.
13. A system as claimed in any preceding claim, arranged to use the camera when determining the data relating to a speed of the object, relative to the projector, in the first direction.
14. A system as claimed in claim 13, wherein the processing subsystem is arranged, when determining the data relating to the speed of the object in the first direction, to sum pixel values along a set of lines perpendicular to the first direction in an image from the camera, to generate a set of summed values, and to identify a leading edge of the object from the summed values.
15. A system as claimed in claim 13 or 14, wherein the processing subsystem is arranged to illuminate the object uniformly for determining the speed of the object relative to the projector, by causing the projector to sweep a pattern or beam over the illumination plane, within an exposure time of the camera.
16. A system as claimed in any preceding claim, wherein the processing subsystem is arranged to estimate a mass or volume of the object.
17. A system as claimed in any preceding claim, wherein the object is a fish.
18. A method of determining three-dimensional spatial information about an object, the method comprising:
emitting structured light towards the object, wherein the structured light has, within an illumination plane, an intensity pattern that is periodic, having a first spatial frequency in a first direction;
capturing a plurality of images of the object in a respective plurality of time frames;
receiving or determining data relating to a speed of the object in said first direction;
controlling the emitted structured light in dependence on the speed of the object in the first direction such that said intensity pattern is shifted, relative to the object, between successive time frames of the plurality of time frames by a spatial-phase step that is independent of the speed of the object in the first direction;
using at least two of the plurality of images to calculate spatial phase values for a plurality of points on the object; and
using the calculated spatial phase values to determine three-dimensional spatial information about the object.
PCT/GB2018/051833 2017-07-04 2018-06-29 Motion compensation in phase-shifted structured light illumination for measuring dimensions of freely moving objects WO2019008330A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
NO20200134A NO20200134A1 (en) 2017-07-04 2020-02-03 Motion compensation in phase-shifted structured light illumination for measuring dimensions of freely moving objects

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1710705.3 2017-07-04
GBGB1710705.3A GB201710705D0 (en) 2017-07-04 2017-07-04 Structured-Light Illumination

Publications (1)

Publication Number Publication Date
WO2019008330A1 true WO2019008330A1 (en) 2019-01-10

Family

ID=59592446

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2018/051833 WO2019008330A1 (en) 2017-07-04 2018-06-29 Motion compensation in phase-shifted structured light illumination for measuring dimensions of freely moving objects

Country Status (3)

Country Link
GB (1) GB201710705D0 (en)
NO (1) NO20200134A1 (en)
WO (1) WO2019008330A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020256566A1 (en) * 2019-06-19 2020-12-24 Subc3D As System and method for depiction and counting of external structures on a fish
WO2021216343A1 (en) 2020-04-21 2021-10-28 InnovaSea Systems, Inc. Systems and methods for fish volume estimation, weight estimation, and analytic value generation
CN114485470A (en) * 2022-01-30 2022-05-13 北京理工大学 Speckle-based composite material three-dimensional appearance and defect comprehensive measurement system and method
US11388889B2 (en) * 2020-08-14 2022-07-19 Martineau & Associates Systems and methods for aquatic organism imaging
CN115307576A (en) * 2022-08-02 2022-11-08 清华大学 Step boundary compensation method and device in structured light measurement
CN115812646A (en) * 2022-12-05 2023-03-21 中国电建集团成都勘测设计研究院有限公司 Method for analyzing fish behaviors in fishway
US11710245B2 (en) * 2020-02-25 2023-07-25 Jack Wade Real-time marine snow noise removal from underwater video

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008120457A1 (en) * 2007-03-29 2008-10-09 School Juridical Person Of Fukuoka Kogyo Daigaku Three-dimensional image measurement apparatus, three-dimensional image measurement method, and three-dimensional image measurement program of non-static object
US20120133741A1 (en) * 2005-11-10 2012-05-31 Obe Ohnmacht & Baumgaertner Gmbh & Co. Kg Camera chip, camera and method for image recording
WO2014098614A1 (en) 2012-12-20 2014-06-26 Ebtech As System and method for calculating physical dimensions for freely movable objects in water

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120133741A1 (en) * 2005-11-10 2012-05-31 Obe Ohnmacht & Baumgaertner Gmbh & Co. Kg Camera chip, camera and method for image recording
WO2008120457A1 (en) * 2007-03-29 2008-10-09 School Juridical Person Of Fukuoka Kogyo Daigaku Three-dimensional image measurement apparatus, three-dimensional image measurement method, and three-dimensional image measurement program of non-static object
WO2014098614A1 (en) 2012-12-20 2014-06-26 Ebtech As System and method for calculating physical dimensions for freely movable objects in water

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
RICARDO R GARCIA ET AL: "Temporally-Consistent Phase Unwrapping for a Stereo-Assisted Structured Light System", 3D IMAGING, MODELING, PROCESSING, VISUALIZATION AND TRANSMISSION (3DIMPVT), 2011 INTERNATIONAL CONFERENCE ON, IEEE, 16 May 2011 (2011-05-16), pages 389 - 396, XP031896510, ISBN: 978-1-61284-429-9, DOI: 10.1109/3DIMPVT.2011.56 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020256566A1 (en) * 2019-06-19 2020-12-24 Subc3D As System and method for depiction and counting of external structures on a fish
GB2599532A (en) * 2019-06-19 2022-04-06 Subc3D As System and method for depiction and counting of external structures on a fish
GB2599532B (en) * 2019-06-19 2022-10-26 Subc3D As System and method for depiction and counting of external structures on a fish
US11710245B2 (en) * 2020-02-25 2023-07-25 Jack Wade Real-time marine snow noise removal from underwater video
WO2021216343A1 (en) 2020-04-21 2021-10-28 InnovaSea Systems, Inc. Systems and methods for fish volume estimation, weight estimation, and analytic value generation
US11388889B2 (en) * 2020-08-14 2022-07-19 Martineau & Associates Systems and methods for aquatic organism imaging
US11723345B2 (en) 2020-08-14 2023-08-15 Martineau & Associates Systems and methods for aquatic organism imaging
CN114485470A (en) * 2022-01-30 2022-05-13 北京理工大学 Speckle-based composite material three-dimensional appearance and defect comprehensive measurement system and method
CN115307576A (en) * 2022-08-02 2022-11-08 清华大学 Step boundary compensation method and device in structured light measurement
CN115307576B (en) * 2022-08-02 2024-04-09 清华大学 Step boundary compensation method and device in structured light measurement
CN115812646A (en) * 2022-12-05 2023-03-21 中国电建集团成都勘测设计研究院有限公司 Method for analyzing fish behaviors in fishway
CN115812646B (en) * 2022-12-05 2023-07-11 中国电建集团成都勘测设计研究院有限公司 Fish behavior analysis method in fishway

Also Published As

Publication number Publication date
NO20200134A1 (en) 2020-02-03
GB201710705D0 (en) 2017-08-16

Similar Documents

Publication Publication Date Title
NO20200134A1 (en) Motion compensation in phase-shifted structured light illumination for measuring dimensions of freely moving objects
US20230392920A1 (en) Multiple channel locating
US9170097B2 (en) Hybrid system
US10152800B2 (en) Stereoscopic vision three dimensional measurement method and system for calculating laser speckle as texture
TWI485361B (en) Measuring apparatus for three-dimensional profilometry and method thereof
EP0877914B1 (en) Scanning phase measuring method and system for an object at a vision station
US8923603B2 (en) Non-contact measurement apparatus and method
EP1330790B1 (en) Accurately aligning images in digital imaging systems by matching points in the images
CA2805443C (en) Method and apparatus for imaging
US8755036B2 (en) Active imaging system and method
US10648789B2 (en) Method for monitoring linear dimensions of three-dimensional objects
JP7386185B2 (en) Apparatus, method, and system for generating dynamic projection patterns in a confocal camera
JP2010507079A (en) Apparatus and method for non-contact detection of 3D contours
CA2799705C (en) Method and apparatus for triangulation-based 3d optical profilometry
EP2399222A1 (en) Speckle noise reduction for a coherent illumination imaging system
EP2813809A1 (en) Device and method for measuring the dimensions of an objet and method for producing an item using said device
CN110692084B (en) Apparatus and machine-readable storage medium for deriving topology information of a scene
TWI740237B (en) Optical phase profilometry system
US20220196386A1 (en) Three-dimensional scanner with event camera
TWI588441B (en) Measuring method and apparatus for carrying out the measuring method
JP7438555B2 (en) 3D measurement method and 3D measurement device
TWI588508B (en) Stereoscopic depth measuring apparatus
Liu et al. Study of three-dimensional sensing by using inverse square law
Hassebrook et al. Super resolution structured light illumination
WO2008098759A1 (en) Three-dimensional image acquisition method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18739610

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 22.04.2020)

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 22/04/2020)

122 Ep: pct application non-entry in european phase

Ref document number: 18739610

Country of ref document: EP

Kind code of ref document: A1