GB2610376A - Portable illumination device, imaging system and method - Google Patents

Portable illumination device, imaging system and method

Info

Publication number
GB2610376A
GB2610376A GB2111830.2A GB202111830A
Authority
GB
United Kingdom
Prior art keywords
illuminators
illumination device
target
sequence
illuminator
Prior art date
Legal status
Pending
Application number
GB2111830.2A
Other versions
GB202111830D0 (en)
Inventor
Charles Wright Glynn
Broadbent Laurence
Current Assignee
Aralia Systems Ltd
Original Assignee
Aralia Systems Ltd
Priority date
Filing date
Publication date
Application filed by Aralia Systems Ltd filed Critical Aralia Systems Ltd
Priority to GB2111830.2A
Publication of GB202111830D0
Publication of GB2610376A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/8943D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/18Focusing aids
    • G03B13/20Rangefinders coupled with focusing arrangements, e.g. adjustment of rangefinder automatically focusing camera
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/02Illuminating scene
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/02Illuminating scene
    • G03B15/03Combinations of cameras with lighting apparatus; Flash units
    • G03B15/05Combinations of cameras with electronic flash apparatus; Electronic flash units
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B30/00Camera modules comprising integrated lens units and imaging units, specially adapted for being embedded in other devices, e.g. mobile phones or vehicles
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/02Stereoscopic photography by sequential recording
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/586Depth or shape recovery from multiple images from multiple light sources, e.g. photometric stereo
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/254Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B2215/00Special procedures for taking photographs; Apparatus therefor
    • G03B2215/05Combinations of cameras with electronic flash units
    • G03B2215/0514Separate unit
    • G03B2215/056Connection with camera, e.g. adapter
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B2215/00Special procedures for taking photographs; Apparatus therefor
    • G03B2215/05Combinations of cameras with electronic flash units
    • G03B2215/0564Combinations of cameras with electronic flash units characterised by the type of light source
    • G03B2215/0575Ring shaped lighting arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/271Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2213/00Details of stereoscopic systems
    • H04N2213/001Constructional or mechanical details

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)

Abstract

A portable illumination device 1 for attachment to a mobile computing device (3, Figure 5; e.g., mobile phone, smart phone, tablet, e-reader) comprises a plurality of illuminators (e.g., LEDs 9A-D; emitting visible light and/or near infra-red, NIR, radiation), each illuminator controlled independently; at least one viewing aperture 6 (e.g., moveable for alignment with the mobile computing device); at least one detector 30 (e.g., time of flight, ToF, sensor) for detecting the distance of a target from the illumination device; and a controller for controlling the lighting sequence and/or brightness of each of the plural illuminators. Also, a three-dimensional (3D) imaging method for obtaining a reconstructed 3D model of a target comprises: illuminating the target from a plurality of directions; detecting the distance between a video recording camera and the target; activating a pre-determined illumination sequence from a plurality of illuminators; recording a video data capture sequence of the target; identifying the start of the video data capture sequence using known features of the illumination sequence; extracting a plurality of frames of the video data capture sequence; and reconstructing a 3D model of the target.

Description

Portable Illumination Device, Imaging System and Method

The present invention relates to a portable illumination device, a 3D imaging system and a three-dimensional (3D) imaging method.
Single camera photometric stereo methods and devices cannot recover 3D data because there is no absolute distance recorded in the measurement process. Thus, it is common to refer to the data recovered by such known photometric stereo methods as "2.5D".
Known imaging systems for generating 3D models of a 3D object require multiple cameras to recover 3D geometry. Existing methods for photometric stereo systems require that each frame captured is synchronised with illumination of the scene by one of a plurality of illuminators in turn. A series of images associated with all of the illuminators is used to reconstruct a 3D model using known algorithms. However, known methods require a synchronisation signal from the camera that is necessary to control the illumination system. This signal is not directly available from mobile phones, such that known photometric stereo methods cannot be used to create 3D models with a mobile phone. Mobile phones are portable, easily accessible and now have high quality video camera capabilities but are not presently used for producing 3D models.
International (PCT) Patent Publication WO2019/064022 discloses a method for photometric stereo imaging using four LEDs and a video camera in the centre of the LEDs. The LEDs each illuminate from a different angle. To create the photometric video, it is necessary to know the illumination of at least part of the scene and the object to be imaged from each LED individually. The method does not require a synchronisation signal.
However, the method is computationally complex because of the need to extract individual illuminator views. This known method also requires a large number of frames to create a 3D model, which in practical terms limits the number of individual illumination directions that can be recovered by the method. Furthermore, this known method has also been found to be affected by "camera shake", i.e., when a user accidentally shakes the camera during image capture due to unsteady hands, resulting in blurry images.
The present invention sets out to provide an improved 3D imaging method and device which addresses the above-described problems.
In one aspect, the invention provides a portable illumination device for attachment to a mobile computing device comprising: a plurality of illuminators, wherein each illuminator is controlled independently; at least one viewing aperture; at least one detector for detecting the distance of a target from the illumination device; and a controller for controlling the lighting sequence and/or brightness of each of the plurality of illuminators.
It has been shown that photometric stereo methods combined with a detector for detecting the distance of a target can gather true 3D data for both flat and curved surfaces. An array of point measurements of the distance between the illuminated surface and the time of flight sensor complements the "2.5D" data collected by a photometric stereo method carried out using a controlled sequence of illumination of each of a plurality of illuminators.
The improvement offered by the present invention produces accurate 3D images that can be used in hand biometrics with associated applications in fintech. The present invention is also accurate enough to be used for cutaneous and subcutaneous medical imaging. The improved system and method of the present invention also have significant advantages for application to agriculture. Further, the invention can also be used in material testing, with particular application to the testing of innovative, environmentally-friendly materials.
It is understood that in the context of this invention, "portable" refers to the device being suitable for carrying or moving.
In the context of the present invention, an "illuminator" is understood to be a component that illuminates, i.e., directs radiation, including light, to the target area to be imaged.
Preferably, the portable illumination device is a hand-held illumination device.
Preferably, the at least one viewing aperture is moveable for alignment with a mobile computing device.
Preferably, each illuminator comprises one or more LEDs.
Preferably, each illuminator is an LED.
Preferably, the plurality of illuminators are arranged in a regular array.
Preferably, each illuminator of the plurality of illuminators is arranged at the corner of a square.
Preferably, the portable illumination device comprises at least three, or at least four, or at least five, or at least six, or at least seven, or at least eight, or at least nine, or at least ten, or at least eleven, or at least twelve illuminators.
Preferably, the portable illumination device comprises three or four or six or eight or ten or twelve illuminators.
Preferably, the portable illumination device comprises four LEDs.
Preferably, the four illuminators are arranged in an X-shape.
The arrangement of the illuminators of the present invention is carefully configured to minimise non-Lambertian illuminations, such as "glints" or reflections.
Preferably, the portable illumination device comprises a substantially annular frame having a central aperture.
More preferably, the portable illumination device comprises four illuminators arranged at points around a circle, wherein the circle shares a centre point with the substantially annular frame.
Preferably, each illuminator is equidistant from the centre point of the substantially annular frame.
Preferably, the portable illumination device comprises four illuminators arranged at equidistant points around a circle. More preferably, each illuminator has an angular separation from the adjacent illuminators of about 90 degrees.
Preferably, each illuminator is an illuminator emitting visible light and/or near infra-red radiation.
Optionally, the plurality of illuminators comprises a first plurality of illuminators emitting visible light and a second plurality of illuminators emitting near-infra-red radiation.
Optionally, the portable illumination device comprises eight illuminators wherein four illuminators emit visible light, and four illuminators emit near infra-red radiation.
The present invention can be used to provide 3D subcutaneous imaging for biometric applications. The dual wavelength approach is both highly accurate and very difficult to imitate.
Preferably, each illuminator is attached to the portable illumination device by an arm.
Preferably, each illuminator is attached to the portable illumination device by a moveable arm.
Preferably, each illuminator is attached to the portable illumination device by a hinged arm.
Optionally, each illuminator is attached to the portable illumination device by a pivotable arm.
Preferably, the portable illumination device comprises a body and each arm attached to an illuminator is moveable between a storage position and a use position.
Preferably, the portable illumination device comprises a body and each arm attached to an illuminator is moveable between a storage position wherein the arm is within the body and a use position wherein the arm projects from the body.
Preferably, the portable illumination device comprises a body and each arm attached to an illuminator is moveable between a storage position wherein the arm is within a recess in the body and a use position wherein the arm projects from the body.
Preferably, each arm is moveable to a use position wherein the arm holds the illuminator at a distance from the body of the portable illumination device.
Preferably, in a use position, each illuminator is held at a fixed position by a respective arm.
Preferably, the position of each illuminator is known to the controller.
The present invention allows for the illuminators to be protected in a storage position and then held in the required configuration for illumination of a target during use of the device.
Preferably, the portable illumination device comprises a mounting means, for example, a mount; more preferably, an elongate mounting means, for example, an elongate mount.
The elongate mounting means allows a user to hold the portable illumination device whilst affixing the mobile computing device, such as a mobile phone or smart phone to the illumination device.
Preferably, the elongate mounting means further comprises an elongate opening.
Preferably, the portable illumination device comprises an attachment means for attaching a mobile computing device thereto. Preferably, the attachment means is a securing bracket.
Preferably, the attachment means is moveable along the elongate opening of the elongate mounting means.
Preferably, the elongate opening further comprises a securing means, such as a rotatable knob.
By moving the attachment means along the elongate opening, the camera of the mobile computing device can be aligned with the circular aperture of the illumination device and so with the plurality of illuminators. When correctly positioned, the attachment means can be secured in position using the securing means. Adjustment of the position of the mobile computing device with respect to the portable illumination device does not require any additional tools; the device can quickly and conveniently be aligned, adjusted and secured by hand.
More preferably, the portable illumination device comprises an adjustable attachment means.
Preferably, the securing bracket comprises a backing plate and two opposing gripping members.
Preferably, the opposing gripping members are moveable towards and away from each other. More preferably, the opposing gripping members are moveable by adjustment of an adjuster, for example a rotatable adjuster.
The portable illumination device can be conveniently used with a broad range of mobile computing devices.
Preferably, the portable illumination device comprises a time of flight sensor.
Preferably, the time of flight sensor is a Single Photon Avalanche Diode (SPAD).
Preferably, the controller is configured to analyse data from the or each detector and data from the illuminators and video images recorded by a mobile computing device.
Optionally, the portable illumination device comprises a mmWave sensor.
Preferably, the portable illumination device comprises a thermal camera.
Preferably, the portable illumination device comprises a processor for reconstructing 3D models.
In a further aspect, the present invention provides an imaging system comprising the portable illumination device as previously described and a mobile computing device, wherein the mobile computing device comprises a camera for video recording.
Preferably, the mobile computing device comprises a camera for slow-motion video recording.
Preferably, the mobile computing device comprises a camera for slow-motion video recording of about 240 frames per second or of about 960 frames per second.
Preferably, the imaging system comprises the camera of the mobile computing device wherein the camera is positioned in the same plane as each illuminator of the portable illumination device.
Preferably, the imaging system comprises the camera of the mobile computing device for video recording wherein the camera is aligned with a central aperture in a substantially annular portion of the portable illumination device. More preferably, the camera is aligned such that the camera is central to the plurality of illuminators.
Preferably, the imaging system comprises the camera of the mobile computing device for video recording wherein the camera is within about 5mm-15mm of the detector; more preferably, wherein the camera is within about 10mm of the detector. Preferably, the camera of the mobile computing device for video recording is within about 10mm of the time of flight sensor.
Preferably, the principal axis of the time of flight sensor is aligned with the principal axis of the lens of the camera of the mobile computing device.
More preferably, the principal axis of the time of flight sensor is aligned with the principal axis of the lens of the camera of the mobile computing device and displaced by no more than about 1 cm from the optical centre of the lens of the camera of the mobile computing device.
The time of flight sensor provides a measure of the distance between the camera and the target to be modelled, which is used as an initial estimate of the scene/target geometry. By positioning the time of flight sensor close to the camera, the time of flight sensor provides data that augments the overall geometry of the recovery process.
Preferably, the mobile computing device is configured to communicate with the controller of the portable illumination device.
Optionally, the imaging system comprises a processor for reconstructing 3D models wherein the processor is connected to the portable illumination device via a high speed mobile data network connection; more preferably, via a 4G or 5G mobile data network connection.
Optionally, the mobile computing device is configured to communicate with the controller of the portable illumination device via Bluetooth®; preferably, low energy Bluetooth®.
In a further aspect, the present invention provides a 3D imaging method for obtaining a reconstructed 3D model of a target comprising the steps of: illuminating the target from a plurality of directions; detecting the distance between a video recording camera and the target; activating a pre-determined illumination sequence from a plurality of illuminators; recording a video data capture sequence of the target; identifying the start of the video data capture sequence using known features of the illumination sequence; extracting a plurality of frames of the video data capture sequence; and reconstructing a 3D model of the target.
Preferably, the pre-determined illumination sequence from a plurality of illuminators comprises: i) switching all of the plurality of illuminators on to activate simultaneous illumination from all of the plurality of illuminators; ii) switching all of the plurality of illuminators off; iii) switching each of the plurality of illuminators on sequentially.
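By way of illustration only, the following Python sketch shows one way a controller might drive steps i) to iii) above; the set_illuminator() driver hook, the 240 frames per second figure and the three-frame state duration are assumptions made for the sketch rather than details taken from the disclosure.

```python
import time

FRAME_PERIOD_S = 1.0 / 240.0  # one frame at an assumed 240 fps capture rate

def set_illuminator(index: int, on: bool) -> None:
    """Hypothetical LED driver hook (e.g., a GPIO or PWM write)."""
    pass

def run_predetermined_sequence(num_illuminators: int = 4,
                               frames_per_state: int = 3) -> None:
    def hold(states):
        for idx, on in states:
            set_illuminator(idx, on)
        time.sleep(frames_per_state * FRAME_PERIOD_S)

    # i) switch all illuminators on simultaneously
    hold([(i, True) for i in range(num_illuminators)])
    # ii) switch all illuminators off
    hold([(i, False) for i in range(num_illuminators)])
    # iii) switch each illuminator on sequentially, others off
    for i in range(num_illuminators):
        hold([(j, j == i) for j in range(num_illuminators)])
    hold([(i, False) for i in range(num_illuminators)])  # end dark
```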
Preferably, the method further comprises obtaining photometric data from the plurality of frames of the video data capture sequence using the predetermined illumination sequence.
Preferably, the method further comprises a calibration step with respect to a reference image.
Preferably, the method further comprises a step of adjusting the radiation output of the or each illuminator to the optimum level for ambient light conditions.
Preferably, each of the plurality of illuminators are synchronised with each other throughout the pre-determined illumination sequence; more preferably, each of the plurality of illuminators are in strict synchronisation with each other throughout the pre-determined illumination sequence.
It has been found that maintaining strict synchronisation of each of the plurality of illuminators allows the illumination sequence to be pre-determined for successful reconstruction of the 3D model of the target. The fundamental frequency of each illuminator is then matched to the frame rate of the video data capture sequence of the mobile computing device, such as a smart phone, to allow the recovery algorithm to reliably reconstruct the 3D model of the target.
Preferably, each of the plurality of illuminators are in strict synchronisation with each other throughout the pre-determined illumination sequence but are not synchronised with the start of each image frame extracted from the video data capture sequence.
Preferably, the method further comprises a step of: aligning the phase relationship of the illumination sequence and the video data capture sequence using a timing marker generated by a mobile computing device.
Preferably, the method further comprises illuminating the target for a fixed time period from at least one illuminator in advance of activating the predetermined illumination sequence from a plurality of illuminators.
Preferably, the method further comprises illuminating the target for a fixed time period of about 1 second from at least one illuminator in advance of activating the pre-determined illumination sequence from a plurality of illuminators.
More preferably, the method further comprises illuminating the target for a fixed time period from a single illuminator in advance of activating the predetermined illumination sequence from a plurality of illuminators.
Preferably, the method further comprises illuminating the target for a fixed time period of about 1 second from a single illuminator in advance of activating the pre-determined illumination sequence from a plurality of illuminators.
By illuminating the target for a fixed time period in advance of activating the pre-determined illumination sequence from a plurality of illuminators, the method ensures that the correct exposure setting is applied and maintained during the recording of a video data capture sequence of the target. The initial pre-amble of illumination for a fixed time period sets the automatic exposure function within the camera of the mobile computing device, such as a smart phone, to the correct value to be maintained throughout the video data capture sequence. This prevents any degradation of the reconstructed 3D model of the target due to an incorrect exposure setting.
Preferably, the method comprises use of a device or system as described herein.
It has been found that the method of the present invention greatly reduces the computational complexity when compared to known methods of extracting individual illuminator views. The number of frames that are required to be captured to create a 3D model is significantly reduced by the method of the present invention. Thus, in practical terms, it has been found that the upper limit of individual directions that can be recovered by the proposed sequence is greater. This is a significant improvement, particularly when the nature of the surface texture that is being modelled requires a long sequence of images to be recovered for the purpose of reducing the distortion of the 3D reconstruction for non-Lambertian surfaces. Furthermore, the shorter sequence of images that are required by the proposed method reduces the impact of "camera shake" on the reconstructed 3D model.
For the purposes of clarity and a concise description, features are described herein as part of the same or separate embodiments; however, it will be appreciated that the scope of the invention may include embodiments having combinations of all or some of the features described.
The invention will now be described by way of example with reference to the accompanying drawings, in which:

Figure 1 is a view from below (target-facing side) of a portable illumination device in accordance with the present invention, shown without a mobile phone attached, in a use position;
Figure 2 is a view from above (side opposing the target-facing side) of the portable illumination device of Figure 1, without a mobile phone attached, in a use position;
Figure 3 is a side view of the portable illumination device of Figures 1 and 2, without a mobile phone attached, shown in a use position;
Figure 4 is a view from below (target-facing side) of a portable illumination device in accordance with the present invention, shown with a mobile phone attached, in a use position;
Figure 5 is a view from above (side opposing the target-facing side) of the portable illumination device of Figure 4, with a mobile phone attached, in a use position;
Figure 6 is a perspective view of the portable illumination device of Figures 4 and 5, with a mobile phone attached and the portable illumination device in a use position;
Figure 7 is a view from below (target-facing side) of a portable illumination device in accordance with the present invention, shown with a mobile phone attached and the portable illumination device in a storage position;
Figure 8 is a view from above (side opposing the target-facing side) of the portable illumination device of Figure 7, with a mobile phone attached and the portable illumination device in a storage position;
Figure 9 is a schematic overview of the components of the illumination device of the present invention;
Figure 10 is a flow chart illustrating the steps of the method of the present invention;
Figure 11 is a side view of an alternative embodiment of a portable illumination device in accordance with the present invention, shown with a mobile phone attached;
Figure 12 is an exploded perspective view of the alternative embodiment of the portable illumination device shown in Figure 11, shown with a mobile phone attached; and
Figure 13 is a perspective view from below (target-facing side) of the alternative embodiment of the portable illumination device shown in Figures 11 and 12, attached to a mobile phone.
Referring to Figure 1, an illumination device 1 in accordance with the present invention is shown prior to attachment to a portable recording device. In a preferred embodiment, the portable recording device is a mobile computing device, such as a mobile phone, smart phone, tablet, e-reader or similar. Figures 4 to 8 show the illumination device attached to a mobile phone. The illumination device 1 of the present invention is of a comparable size to a mobile phone and is a portable device suitable to be hand held. In a preferred embodiment, the illumination device has a length of between about 120mm and 160mm, a maximum width of between about 60mm and 85mm, and a depth of between about 10mm and 20mm.
Referring to Figures 1 and 2, the illumination device 1 comprises a body 7, a plurality of LED illuminators 9A, 9B, 9C, 9D, a battery compartment (not shown) housing a rechargeable battery (not shown), and a USB charger port 13. The body 7 comprises a substantially annular portion having a central aperture 6. The annular portion is attached to an elongate mounting means 8. In use, to attach the illumination device 1 to a mobile phone, the elongate mounting means 8 can be held by a user.
Referring to Figure 1, the illumination device 1 further comprises a resilient, push-button power switch 15 for turning the illumination device 1 on and off. In alternative embodiments, the power switch 15 is an equivalent switch to turn the device on and off. A power indicator light 17 is used to indicate whether the illumination device 1 is on or off.
In the embodiment shown in Figures 1 to 8, each LED illuminator 9A, 9B, 9C, 9D is connected to the illumination device 1 by an arm 19, which is connected by a hinge or pivot 21 to the body 7 of the housing so that the arm is foldable into and out of the body 7. It is understood that the "arm" of the present invention is a mounting means for the illuminator, wherein the mounting means projects from the body 7. As shown in Figures 1, 4 and 6, in use, each illuminator 9A, 9B, 9C, 9D is held in a fixed position by the arm, wherein the position is distant from the body 7 of the illumination device 1, such that the illuminator 9A, 9B, 9C, 9D projects from the body 7.
It is understood that the method of the present invention requires a minimum of three illuminators for Lambertian surfaces. Many surfaces are not Lambertian, and the additional fourth illuminator, shown in Figures 1 to 8, allows for further measurements. It is understood that for certain applications of the invention, for example for use in a medical instrument, the illumination device comprises eight, twelve or more illuminators to provide greater scope to remove artefacts generated by non-Lambertian surfaces.
Equivalent mounting means to the arm 19 include a rod, elongate plate or similar. In further alternative embodiments, the arm is jointed to pivot into and out of the body 7. In alternative embodiments, the illumination device is held by a tripod or similar mounting device because of susceptibility to "camera shake" due to the relatively long exposure time. In further alternative embodiments, image stabilisation of the recorded video is used to minimise "camera shake".
Referring to Figures 4 to 6, an illumination device 1 in accordance with the present invention is shown attached to a portable recording device 3 to provide the imaging system 2 of the present invention. In the embodiment shown, the portable recording device 3 is a mobile phone or smart phone comprising a camera 5 with high-speed video recording capabilities. A suitable portable recording device 3 is a mobile phone, including a smart phone, having slow-motion ("slo-mo") functionality, typically of about 240 frames per second or super slow-motion ("super slo-mo") functionality, typically of about 960 frames per second. The image exposure setting of the smartphone camera must be accessible through an application program or be configurable to provide automatic exposure control. When the mobile phone 3 is attached to the illumination device 1, the camera 5 is central to the circular aperture 6 of the body 7.
Referring to Figures 4, 6 and 7, each foldable arm 19 is moveable between a storage position within a respective recess 23 in the body 7 of the illumination device 1 and a use position. Referring to Figures 4 and 6, the illumination device 1 is shown in a use position wherein each arm 19 projects from the body 7 and holds the respective LED illuminator 9A, 9B, 9C, 9D at a distance from the body 7 of the illumination device 1. The distance to the target is given to a high degree of accuracy by the time of flight sensor 30, such as a Single Photon Avalanche Diode (SPAD) sensor. The ideal distance from the LED illuminator 9A, 9B, 9C, 9D to the target is about 15cm.
The further the illumination device 1 is from the target beyond about 15cm, the smaller the useful imaging area becomes.
It is understood that in alternative embodiments, an alternative movement mechanism to move the arm between a storage and a use position could be used, for example, wherein the arm slides into and out of the storage position.
Referring to Figures 1 and 4, each illuminator 9A, 9B, 9C, 9D is an LED emitting visible light or near infra-red radiation in the wavelength range of about 400nm to about 960nm. In alternative embodiments, the plurality of illuminators comprises LEDs emitting at different wavelengths. For example, a first set of illuminators comprises LEDs to emit visible light and a second set of illuminators comprises LEDs to emit near infra-red radiation.
Each illuminator 9A, 9B, 9C, 9D is positioned in the same plane as the camera 5. The plurality of illuminators 9A, 9B, 9C, 9D are positioned at points around the circumference of a circle such that the illuminator LEDs are positioned in an X-configuration, wherein each LED is equidistant from the centre of the circular aperture 6 in the body 7 of the illumination device 1. The angular distance between each of the plurality of illuminators 9A, 9B, 9C, 9D is equal. For example, in the embodiment shown in Figure 1, there are four illuminators 9A, 9B, 9C, 9D with an angular separation of 90 degrees around a circle having its centre located at the centre of the circular aperture 6, which in use is aligned with the centre of the camera 5. In the alternative embodiment shown in Figures 11 to 13, there are eight illuminators having an angular separation of 45 degrees.
It is understood that, in alternative embodiments, the diameter (D) of the circle around which the illuminators are arranged, i.e., the distance of each illuminator from the camera, the number of illuminators used, and hence the angular separation between the illuminators, varies according to the application of the illumination device. For example, the diameter (D) will vary depending on the required field of view of the illumination device 1, the intensity distribution of the illuminators 9A, 9B, 9C, 9D, the required signal to noise ratio (SNR) and the required distance between the camera 5 and the target. The illuminators 9A, 9B, 9C, 9D are distributed at an angle of about 45 degrees to the surface of the target. The illumination device 1 shown in Figures 1 to 8 has been optimised to balance the field of view of the device against the size of the illumination device whilst maximising the signal to noise ratio (SNR).
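A minimal sketch of this layout geometry follows, assuming the LEDs lie on a circle in the camera plane centred on the aperture; the diameter value of 80mm is an invented example, since the disclosure leaves the diameter (D) application-dependent.

```python
import math

def illuminator_positions(n: int, diameter_mm: float):
    """Positions of n equally spaced illuminators on a circle centred
    on the camera aperture, lying in the camera plane (z = 0)."""
    r = diameter_mm / 2.0
    step = 2.0 * math.pi / n  # 90 degrees for n = 4, 45 degrees for n = 8
    return [(r * math.cos(k * step), r * math.sin(k * step), 0.0)
            for k in range(n)]

print(illuminator_positions(4, 80.0))  # X-configuration of Figure 1
print(illuminator_positions(8, 80.0))  # eight-LED variant of Figures 11 to 13
```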
The illumination device 1 comprises a thermal camera 29, a time of flight sensor 30 or a mmWave sensor (not shown). The time of flight sensor 30 allows for reconstruction of the surface of the target in 3D, i.e., in order to correctly reconstruct the surface of the target in 3D, the distance between the camera and the surface must be known. The time of flight sensor 30 provides multiple distance measurements in a matrix of point measurements across the entire surface of the target to be imaged. This provides "anchor points", i.e., known points wherein the distance of the target from the camera is recorded. These anchor points effectively represent the accuracy limits of the device, i.e., with respect to potential errors in the 3D reconstructed image.
The thermal camera 29 and mmWave sensor/s have a different wavelength to the LED illuminators 9A, 9B, 9C, 9D and can be used if reconstruction of sub-surface details is required. The wavelengths of the thermal camera 29 and mmWave sensor/s are suitable to penetrate the surface of the target to be imaged. The 3D surface images created using the illumination device 1 can be correlated with images from the thermal camera 29 and/or the mmWave sensor to classify the nature and characteristics of structures within the surface of the target.
In the embodiment shown in Figure 1, the time of flight sensor 30 is located as close as possible to the lens of the camera 5 of the mobile phone 3. For example, the time of flight sensor 30 is less than 10mm from the lens of the camera 5. It is understood that "as close as possible" refers to the proximity of the time of flight sensor 30 to an imaginary line orthogonal to the surface of the lens and passing through the optical centre of the lens, known as the principal axis of the lens. The principal axis of the time of flight sensor 30 is aligned with the principal axis of the smartphone lens and displaced by no more than about 1 cm from the optical centre of the lens.
Referring to Figures 4 to 8, the illumination device 1 is secured to the mobile phone 3 by a securing bracket 31. As shown, the securing bracket 31 is a substantially C-shaped clamp with a backing plate 32a and two opposing, tapered gripping members 32b. The bracket 31 is adjustable so that the gripping members 32b move towards and away from each other depending on the size of the device to be held therein. A mobile phone 3 can be held in place by adjusting and then tightening the securing bracket 31 using a rotatable adjuster 33. The securing bracket 31 is also moveable along the length of an elongate opening 35 in the elongate mounting means 8. Referring to Figure 4, by sliding the securing bracket 31 along the length of the mounting means 8 the camera 5 of the portable recording device 3 can be aligned with the circular aperture 6 in the illumination device 1. When the mobile phone 3 and the illumination device 1 are aligned, a rotatable knob 37 is tightened to secure the illumination device 1 to the mobile phone 3 in a fixed position.
In the embodiment shown in Figures 4 to 8, the securing bracket 31 comprises a rotatable adjuster 33, but it is understood that in alternative embodiments the adjuster 33 is an equivalent mechanism; for example, wherein the securing bracket 31 is resilient or the securing bracket is a "clip fit".
The imaging system 2 shown in Figures 4 to 8 comprises a mobile phone 3 and the illumination device 1. The illumination device 1 does not require communication with the mobile phone 3. However, in a preferred embodiment, the illumination device 1 communicates with the mobile phone via low energy Bluetooth® so that the 3D model of the target can be obtained and viewed on the mobile phone 3.
Referring to Figures 1 and 9, a processor 40 that controls the lighting sequence and the brightness of the or each LED illuminator 9A, 9B, 9C, 9D is shown. For example, in a preferred embodiment each of the LED illuminators 9A, 9B, 9C, 9D is connected to a processor 40, such as an Arduino processor. The processor 40 is also connected to the time of flight sensor 30 and a Bluetooth module 45. The processor 40 is also connected to a power conditioning module 41, from which the USB charging module 43 and the USB 47 of the thermal camera 29 are connected.
The images captured by the mobile phone camera 5 may be monochrome or colour. Colour images can either be converted to monochrome and processed or each of the individual channels in a coloured set of images can be processed. In embodiments wherein each of the individual channels in a coloured set of images are processed, this allows for the recovery of colour albedo/reflectance information.
In use, the time of flight sensor 30 provides a measure of the distance between the camera 5 and the target to be modelled (not shown), which is used as an initial estimate of the scene/target geometry from which lighting vectors can be determined. The time of flight sensor 30 provides position data for at least one, or a plurality of, positions on the target to be imaged. Furthermore, the time of flight data is used to augment the overall geometry of the recovery process by constraining the numerical integration of the recovered gradient information.
Referring to Figures 1, 4 and 6, the four LED illuminators 9A, 9B, 9C, 9D of the present invention are illuminated sequentially in a pattern that is determined by a microcontroller (not shown) that is located on a circuit board within the illuminator assembly of the illumination device 1. The LED drivers (not shown) are also located on the circuit board within the illuminator assembly. The illumination pattern includes a marker at the start of each frame that allows the individual contribution from each illuminator to be associated or matched with one or more video frames in the sequence captured by the mobile phone/capture device 3. The four LED illuminators 9A, 9B, 9C, 9D are in strict synchronisation with each other but are not synchronised with the start of each image frame recorded by the camera 5 within the mobile phone/capture device 3. The method of the present invention does not require that the phase relationship between the illuminators 9A, 9B, 9C, 9D and the camera frame capture is known, nor does it require that the LED illumination frequency is precisely the same as the frame capture frequency of the camera 5 of the mobile phone/capture device 3.
Referring to Figure 10, the method of image capture using the standard "slo-mo" or "ultra slo-mo" functions of a mobile phone or smart phone is described in more detail. It is understood that the method of the present invention can also be used with alternative devices having a high-speed video camera, including other personal computing devices, such as tablet computers. The method is based on execution of an appropriate illumination sequence during video recording/file capture. The file of images that are recorded are post-processed using start-of-frame markers that are embedded in the image sequence to create independent illuminated views of the target to be modelled. The independent illuminated images are processed using a photometric stereo algorithm to arrive at the 3D image.
Prior to the method set out in Figure 10, a calibration step to allow for systematic errors is carried out. Prior to capturing the sequence of images that are reconstructed to provide the 3D model, the method is calibrated to remove systematic errors; for example, errors that are caused by variation in surface lighting for each illuminator and between the plurality of illuminators. This is achieved by use of a reference image. Furthermore, a reference image is also used for calibration of the system to remove systematic geometrical errors caused by the camera lens of the capture device/mobile phone.
There are two calibrations required for the method of the present invention: i) Geometric calibration of each illuminator's position relative to the time of flight sensor; and ii) Optical calibration in respect of the features of the camera, which also aims to quantify any radial and/or tangential distortion and allows the system to correct for inherent systematic errors.
For geometric calibration of the illuminator positions, the method uses a "specular ball" approach. The specular ball is imaged at approximately the centre of the image. The input into the calibration algorithm is x images, wherein x is the number of illuminators in the illumination device. Each image is independently illuminated by the corresponding illuminator/light source. The calibration method detects the specular sphere, estimates the sphere's diameter and detects the specular lobe associated with the respective illuminator/light source. The location of the centre of the specular lobes and the diameter of the sphere are used to estimate the light vectors for each of the illuminator/light sources. The output from the time of flight sensor, for example from a Single Photon Avalanche Diode (SPAD), is used to adjust the estimation when the method is used in capture mode, i.e., when recording a video data capture sequence of the target. It is envisaged that the geometric calibration is carried out every time the portable illumination device is mounted to the mobile computing device but is not required between captures using the same portable illumination device with the same mobile computing device.
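The specular-ball step can be illustrated with the standard mirror-sphere model: the sphere normal at the detected lobe reflects the viewing direction into the light direction. The sketch below assumes an orthographic view along +z and is a generic rendering of that geometry, not a transcription of the disclosed calibration algorithm.

```python
import numpy as np

def light_vector(lobe_xy, centre_xy, radius_px):
    """Estimate a light direction from the specular lobe seen on a
    mirrored sphere (orthographic viewing along +z assumed)."""
    # Surface normal of the sphere at the specular highlight
    nx = (lobe_xy[0] - centre_xy[0]) / radius_px
    ny = (lobe_xy[1] - centre_xy[1]) / radius_px
    nz = np.sqrt(max(0.0, 1.0 - nx * nx - ny * ny))
    n = np.array([nx, ny, nz])
    v = np.array([0.0, 0.0, 1.0])      # viewing direction (assumed)
    l = 2.0 * np.dot(n, v) * n - v     # mirror reflection of v about n
    return l / np.linalg.norm(l)

# Example: a lobe 40 px to the right of a sphere of radius 100 px
print(light_vector((540.0, 300.0), (500.0, 300.0), 100.0))
```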
For optical calibration, to estimate the features of the camera and any distortions, the method uses a chequerboard pattern of known dimensions.
The calibration step automatically detects the corners within the image and uses multiple captures of the chequerboard pattern from different positions and perspectives. It is envisaged that the optical calibration of the portable illumination device only needs to be undertaken occasionally. The distance estimation provided by the time of flight sensor is used to provide an estimate of the focal length of the camera and of the distance to the target, as discussed further with respect to the geometry recovery step of the method.
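A sketch of such a chequerboard calibration using OpenCV's standard routines is given below; the board size and square pitch are assumed example values, and the disclosed implementation may differ.

```python
import cv2
import numpy as np

def calibrate(images, board=(9, 6), square_mm=10.0):
    """Optical calibration from multiple chequerboard views; returns
    the camera matrix and the distortion coefficients."""
    objp = np.zeros((board[0] * board[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * square_mm
    obj_pts, img_pts, size = [], [], None
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        size = gray.shape[::-1]
        ok, corners = cv2.findChessboardCorners(gray, board)
        if ok:
            obj_pts.append(objp)
            img_pts.append(corners)
    # K holds the focal length and principal point; dist holds the
    # radial and tangential distortion terms discussed above.
    err, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_pts, img_pts, size, None, None)
    return K, dist
```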
As shown in Figure 10, the 3D recovery method starts at step 101 and sets up communications at step 103. The method then asks at step 105, "is device ready?". If the device is not ready the method returns to step 105 to repeat this check until the device is ready.
When the device is ready, the method proceeds to step 107 to adjust the camera exposure. The camera exposure setting of the mobile phone is adjusted to ensure that the image sequence is correctly exposed. The target is illuminated from a single illuminator for a fixed time period of about 1 second in advance of activating the pre-determined illumination sequence from a plurality of illuminators. This includes adjusting the light output of each illuminator to the optimum level for ambient light conditions. To adjust the light output of each illuminator, the target surface is illuminated by one or more of the illuminators and the method takes measurements of the difference between the brightest and darkest regions of the resulting image as acquired by the mobile phone/capture device. In some embodiments of the method of the present invention, manual control of the camera exposure setting is not available when operating in "slo-mo" or "ultra slo-mo" mode. In this event, the target surface is illuminated by one or more of the illuminators with an intensity appropriate for the ambient lighting conditions prior to the data capture illumination sequence, as described later. The illumination is then maintained for a sufficient period, such as 1 second, so that the camera autoexposure mechanism adjusts the camera sensitivity to an appropriate level.
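As an illustration of the exposure check implied above, the following sketch compares the brightest and darkest regions of a preview frame captured during the preamble; the percentile thresholds are invented illustrative values rather than figures from the disclosure.

```python
import numpy as np

def exposure_ok(frame, min_range=0.2, clip_limit=0.98):
    """True if the preview frame is neither clipped nor flat
    (illustrative thresholds only)."""
    g = frame.astype(np.float32) / 255.0
    darkest = np.percentile(g, 1)      # darkest regions of the scene
    brightest = np.percentile(g, 99)   # brightest regions of the scene
    return brightest < clip_limit and (brightest - darkest) > min_range

# Where no manual exposure control exists, holding one illuminator on
# for about 1 second simply lets the phone's auto-exposure settle
# before the data capture sequence starts.
```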
Following the required adjustment of the camera exposure at step 107, coarse geometric data is captured using the time of flight sensor. The time of flight sensor is used to measure the distance between the camera and the contours of the target surface that is to be reconstructed in 3D.
The video recording function of the mobile phone/capture device camera is activated at step 109 immediately before the data capture illumination sequence is generated at step 111 by the processor of the illumination device. Each of the plurality of illuminators is switched on or off by the controller so that the duration of illumination is approximately equal to the duration of a single frame of the video recording frame rate.
The data capture sequence generated by the microprocessor can be defined in terms of the following sequence, wherein: N refers to all illuminators being switched off at the same time; S refers to all illuminators being switched on at the same time; and n refers to the number of the illuminator that is switched on, wherein n illuminators are arranged as a plurality of illuminators in the illumination device; for example, Figure 1 shows n = 4 illuminators. In alternative embodiments, n = 1, 2, 3, 4, ..., n.
In one example, the data capture sequence generated by the microprocessor is:

NNNSSS 111222333444 [Sequence 1]

As previously described, a 1 second duration preamble, where one of the four illuminators is switched on prior to Sequence 1, is added when there is no direct control of image exposure available to the user. For example, this additional step would be carried out for all mobile phone/capture devices that are capturing at frame rates of 240 frames per second or higher.
The length of time for each element of the data capture sequence is approximately equal to the length of time of a single frame of the selected video recording mode of the capture device/mobile phone. For example, for a slo-mo frame rate of 240 frames per second, the method of the present invention captures twelve frames in total, so the length of time taken is 12/240 seconds.
The preamble of the illumination sequence (NNNSSS) is used as a start of frame marker to identify the start of the data capture sequence by the frame extraction process. The above-described example sequence ensures that at least one frame will be illuminated for the full duration of that frame irrespective of the phase relationship between the illuminator pattern and the video capture process.
In alternative embodiments, the duration of the illumination of each phase can be increased to mitigate the effect of external variations in lighting or other parameters of the or each illuminator and the camera. For example, an alternative sequence is shown below:

NNNNSSSS 111222333444 [Sequence 2]

It is to be understood that the sequence applies to an illumination device having any number of illuminators.
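The sequences above can be generated programmatically; this sketch reproduces Sequence 1 (and the lengthened preamble of Sequence 2) together with the timing arithmetic of the worked example. The function name and structure are illustrative assumptions.

```python
def build_sequence(n_illuminators=4, repeats=3, preamble_repeats=3):
    """Build the data capture sequence as a list of state symbols:
    'N' = all off, 'S' = all on, '1'..'n' = that illuminator on."""
    seq = ['N'] * preamble_repeats + ['S'] * preamble_repeats
    for n in range(1, n_illuminators + 1):
        seq += [str(n)] * repeats
    return seq

seq = build_sequence()
frame_rate = 240.0                 # assumed slo-mo capture rate (fps)
element_s = 1.0 / frame_rate       # each element lasts ~one frame
print(''.join(seq))                # NNNSSS111222333444 (Sequence 1)
print(len(seq) * element_s)        # 18 elements -> 0.075 s at 240 fps
# The 12-element data portion alone takes 12/240 s = 0.05 s, matching
# the figure quoted above; Sequence 2 lengthens only the preamble:
print(''.join(build_sequence(preamble_repeats=4)))  # NNNNSSSS111...444
```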
The illumination device proceeds with the above-described illumination sequence and data is captured by the mobile phone/capture device at steps 109, 111. When data capture ends, the method proceeds with the frame extraction sequence at step 113. The frame extraction sequence is set out in Equations 1-5, wherein: $I_n$ = illumination at frame $n$; $I_{background}$ = background illumination; $k$ = number of frames in the sequence; and $G_n$ = the absolute value of the frame difference.

The frame extraction sequence assumes that the background illumination $I_{background}$ is the mean of the sequence, as described in Equation 1:

$I_{background} = \frac{1}{k} \sum_{n=1}^{k} I_n$ [Equation 1]

The DC offset is removed through subtraction of the background image from the sequence, as described in Equation 2:

$I'_n = I_n - I_{background}$ [Equation 2]

The frame extraction sequence then carries out frame differencing over every other frame to find the absolute value, as described in Equation 3:

$G_n = |I'_n - I'_{n-2}|$ [Equation 3]

To find the marker frame, the frame extraction sequence then locates the frame number $i$ where $G_n$ is a maximum, as described in Equation 4:

$i = \arg\max_n (G_n)$ [Equation 4]

The extracted sequence is then expressed as Equation 5:

$I_{i+3}, I_{i+6}, I_{i+9}, I_{i+12}$ [Equation 5]
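Equations 1 to 5 translate directly into a few lines of NumPy, as sketched below. One interpretive assumption: the per-frame value $G_n$ is reduced to a scalar by summing the absolute difference image over all pixels before taking the maximum.

```python
import numpy as np

def extract_frames(frames):
    """Frame extraction per Equations 1-5 for n = 4 illuminators.

    frames: array of shape (k, H, W) holding the recorded video;
    k must be large enough that frame i + 12 exists."""
    frames = frames.astype(np.float32)
    i_bg = frames.mean(axis=0)                         # Equation 1
    i_p = frames - i_bg                                # Equation 2
    # Equation 3: differencing over every other frame, reduced per frame
    g = np.abs(i_p[2:] - i_p[:-2]).sum(axis=(1, 2))
    i = int(np.argmax(g)) + 2                          # Equation 4
    return [frames[i + 3], frames[i + 6],
            frames[i + 9], frames[i + 12]]             # Equation 5
```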
Referring to Figure 10, the next step 115 is applying the geometry recovery algorithm to output a 3D model of the target at step 117; for example, to output a 3D model of a surface to be viewed on the mobile phone.
With reference to Equation 5, given the four images obtained from the frame extraction sequence, the sparse depth map obtained from the time of flight sensor and the output of the calibration process for both the lighting and optical set up are used for the geometry recovery step. This allows the method to produce a high resolution, geometrically accurate reconstruction of the imaged target. The method of the present invention uses a photometric stereo method to recover the normal field because such a method is capable of using more than three input images and has robustness to shadows and specularities. The recovered normal field provides a 2.5D representation of the target surface but has no geometrical information. To recover the geometric information, the method converts the normal field to a gradient map, which is then used to recover a height map. This height map has no scale. Thus, to recover the surface geometry of the target, the distance of the surface from the time of flight sensor data is used. The geometry of the LED illuminators' principal axes referenced to the principal axis of the camera lens, which is accurately measured during the calibration process, is used to provide the height map with a scaling value. Geometrical aspects of the recovery are improved by performing a joint optimisation of the geometry and the surface normal information from the time of flight and photometric stereo systems.
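For orientation, a textbook Lambertian least-squares photometric stereo sketch is given below, together with the gradient map derived from the recovered normal field. The disclosed method is more elaborate (robustness to shadows and specularities, and a joint optimisation with the time of flight data), which this sketch does not reproduce.

```python
import numpy as np

def recover_normals(images, light_dirs):
    """Classic least-squares photometric stereo: solve I = L (albedo n)
    per pixel, given m >= 3 images and calibrated light vectors.

    images: (m, H, W) float array; light_dirs: (m, 3) unit vectors."""
    m, h, w = images.shape
    I = images.reshape(m, -1)                            # (m, H*W)
    G, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)   # scaled normals
    albedo = np.linalg.norm(G, axis=0)
    n = G / np.maximum(albedo, 1e-8)                     # unit normal field
    return n.reshape(3, h, w), albedo.reshape(h, w)

def gradients(n):
    """Gradient map from the normal field; integrating it (e.g. by the
    Frankot-Chellappa method) yields a height map with no scale, which
    the sparse time-of-flight depths then anchor in real units."""
    p = -n[0] / np.maximum(n[2], 1e-8)   # dz/dx
    q = -n[1] / np.maximum(n[2], 1e-8)   # dz/dy
    return p, q
```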
In use, the method and system of the present invention are used to record the surface contours and albedo of a target structure to determine the surface condition. Three-dimensional data is collected at intervals across a time period and is held in a database to measure degradation over that time period. In alternative applications, the system and method are used to obtain and record a series of three-dimensional images over a shorter period of time. This allows a user to analyse dynamic changes in the surface over the required time period. Example applications of the system and method of the present invention include recording skin conditions for dermatological and cosmetic purposes; recording the palm surface or hand contours for biometric purposes. Alternative applications also include recording the surface condition of ancient monuments and artefacts to prepare records for conservation purposes. Further uses include recording the surface contour of rocks for mining and climbing/recreational purposes.
Referring to Figures 11 to 13, a further embodiment of the present invention is shown having applications to biometrics, including biometric applications to financial technology. The illumination device 1' comprises a plurality of LED illuminators 9' wherein a first plurality of LED illuminators 9A' emit visible light and a second plurality of LED illuminators 9B' emit near infrared radiation at a wavelength of about 850nm. The method of the present invention as previously described and the device of Figures 11 to 13 are used to generate a 3D surface scan of the palm or the back of the hand, which is used to identify a user. Visible light emitted from the first plurality of LED illuminators 9A' is used to reconstruct the skin layer, i.e., to reconstruct a 3D model of the skin layer. Near infrared radiation emitted from the second plurality of LED illuminators 9B' is used to reconstruct the subsurface of the skin, i.e., to reconstruct a 3D model of the subsurface of the skin. The near infrared radiation is able to penetrate the subcutaneous layer of the hand, which creates a characteristic 3D "signature" of the scanned structure, such as a palm or the back of the hand of a user. Such a scanned structure is invariant in a "live" hand and so the illumination device 1' provides a secure way of identifying a user that cannot be imitated with an alternative, e.g., a dummy hand.
It is understood that the illumination device 1' for 3D scanning of the hand can be a portable device that is carried by a user, or a portable device that is mounted to a wall or similar surface. When mounted, the illumination device can be used as an access control device.
The hand scanning/biometric device comprises the same features as the portable illumination device described with respect to Figures 1 to 8, wherein the device is connectable to a mobile phone (smartphone).
Referring to Figure 11, the illumination device 1' is attached to a portable recording device/mobile phone 3' by a securing bracket 31' having a U-shaped receiving portion into which the mobile phone 3' is inserted. The securing bracket 31' clips over the mobile phone and is then secured in place by a thumb screw clamp 33'. The thumb screw clamp 33' is adjustable so that the securing bracket 31' can be tightened against the mobile phone 3' by rotation of the thumb screw clamp 33', so that the mobile phone 3' and the illumination device 1' are aligned. It is envisaged that alternative, equivalent securing mechanisms can be used.
Referring to Figure 12, the illumination device 1' comprises an annular housing 7b' into which an annular illuminator mount 7b' is secured. The illumination device 1' comprises four LED illuminators 9A' to emit visible light and four LED illuminators 9B' to emit near infrared radiation at a wavelength of about 850nm. Each of the LED illuminators 9A', 9B' is secured to the illuminator mount 7b'. The plurality of illuminators 9A', 9B' are positioned at points around the circumference of a circle such that each LED is equidistant from a circular aperture 6' in the housing 7b' of the illumination device 1'. The angular distance between each pair of adjacent LEDs 9A', 9B' is equal. As shown in Figure 13, there are eight LEDs 9A', 9B' with an angular separation of 45 degrees around a circle having its centre located at the centre of the circular aperture 6'. As previously described, the illumination device 1' further comprises a time of flight sensor and/or a thermal camera and/or a mmWave sensor.
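The layout of Figure 13 can be expressed as a short calculation. The sketch below computes equidistant LED positions around the aperture centre; the 30mm radius is purely illustrative, as the patent does not state a radius.

```python
# A small sketch of the Figure 13 geometry: eight LEDs spaced 45 degrees
# apart on a circle centred on the aperture.
import math

def led_positions(num_leds: int = 8, radius_mm: float = 30.0):
    """(x, y) coordinates of equidistant LEDs around the aperture centre."""
    step_deg = 360.0 / num_leds            # 45 degrees for eight LEDs
    return [(radius_mm * math.cos(math.radians(i * step_deg)),
             radius_mm * math.sin(math.radians(i * step_deg)))
            for i in range(num_leds)]
```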
The method of sequential illumination of the LED illuminators 9A', 9B' is as previously described, with the 3D recovery method used to reconstruct the skin layer from the visible light emitted from the first plurality of illuminators 9A' and to reconstruct the subsurface of the skin, i.e., a 3D model of the subsurface of the skin, from the near infrared radiation emitted from the second plurality of LED illuminators 9B'.
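The patent does not set out how the two reconstructions are compared at verification time. Purely as an illustration, the sketch below combines surface and subsurface similarity into a single match score; the dict keys, cosine-similarity measure and equal weighting are all assumptions of this sketch.

```python
# Illustrative only: combining the visible-light (skin surface) and near
# infrared (subsurface) reconstructions into one verification score.
import numpy as np

def match_score(enrolled, probe, w_surface=0.5, w_subsurface=0.5):
    """Weighted similarity between an enrolled record and a probe scan."""
    def cosine(a, b):
        a, b = a.ravel(), b.ravel()
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return (w_surface * cosine(enrolled["skin_surface"], probe["skin_surface"])
            + w_subsurface * cosine(enrolled["subsurface"], probe["subsurface"]))
```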
Within this specification, the term "about" means plus or minus 20%; more preferably, plus or minus 100/0; even more preferably, plus or minus 5%; most preferably, plus or minus 2%.
Within this specification, the term "substantially" means a deviation of plus or minus 20%; more preferably, plus or minus 100/0; even more preferably, plus or minus 5°/o; most preferably, plus or minus 2°/o.
The above-described embodiment has been given by way of example only, and the skilled reader will naturally appreciate that many variations could be made thereto without departing from the scope of the claims.

Claims (25)

1. A portable illumination device for attachment to a mobile computing device comprising: a plurality of illuminators, wherein each illuminator is controlled independently; at least one viewing aperture; at least one detector for detecting the distance of a target from the illumination device; and a controller for controlling the lighting sequence and/or brightness of each of the plurality of illuminators.
2. A portable illumination device according to claim 1 wherein the at least one viewing aperture is moveable for alignment of the or each viewing aperture with a mobile computing device.
3. A portable illumination device according to any preceding claim wherein each illuminator is an LED.
4. A portable illumination device according to any preceding claim comprising at least three illuminators.
5. A portable illumination device according to any preceding claim comprising four illuminators arranged in an X-shape.
6. A portable illumination device according to any preceding claim comprising a substantially annular frame having a central aperture, wherein the portable illumination device comprises four illuminators arranged at points around a circle and wherein the circle shares a centre point with the substantially annular frame and wherein each illuminator is equidistant from the centre point of the substantially annular portion.
7. A portable illumination device according to any preceding claim comprising four illuminators arranged at equidistant points around a circle, preferably, wherein each illuminator has an angular separation from the adjacent illuminators of about 90 degrees.
8. A portable illumination device according to any preceding claim wherein each illuminator is an illuminator emitting visible light and/or near infra-red radiation.
9. A portable illumination device according to any preceding claim wherein each illuminator is attached to the portable illumination device by a moveable arm.
10. A portable illumination device according to claim 9 comprising a body wherein each moveable arm attached to an illuminator is moveable between a storage position wherein the arm is within the body and a use position wherein the arm projects from the body.
11. A portable illumination device according to claim 9 or claim 10 wherein each arm is moveable to a use position wherein the arm holds the illuminator at a distance from the body of the portable illumination device.
12. A portable illumination device according to any preceding claim comprising an adjustable attachment means for attaching a mobile computing device thereto.
13. A portable illumination device according to any preceding claim comprising a securing bracket for attaching a mobile computing device thereto.
14. A portable illumination device according to any preceding claim wherein the at least one detector comprises a time of flight sensor and/or a thermal camera and/or a mmWave sensor.
15. An imaging system comprising the portable illumination device of claims 1 to 14 and a mobile computing device, wherein the mobile computing device comprises a camera for video recording.
16. An imaging system according to claim 15 wherein the camera of the mobile computing device for video recording is within about 10mm of the detector, wherein the detector comprises a time of flight sensor.
17. An imaging system according to claim 15 or claim 16 wherein the detector is a time of flight sensor and the principal axis of the time of flight sensor is aligned with the principal axis of the lens of the camera of the mobile computing device.
18. A 3D imaging method for obtaining a reconstructed 3D model of a target comprising the steps of: illuminating the target from a plurality of directions; detecting the distance between a video recording camera and the target; activating a pre-determined illumination sequence from a plurality of illuminators; recording a video data capture sequence of the target; identifying the start of the video data capture sequence using known features of the illumination sequence; extracting a plurality of frames of the video data capture sequence; and reconstructing a 3D model of the target.
19. A 3D imaging method for obtaining a reconstructed 3D model of a target according to claim 18 wherein the pre-determined illumination sequence from a plurality of illuminators comprises: i) switching all of the plurality of illuminators on to activate simultaneous illumination from all of the plurality of illuminators; ii) switching all of the plurality of illuminators off; iii) switching each of the plurality of illuminators on sequentially.
20. A 3D imaging method for obtaining a reconstructed 3D model of a target according to claim 18 or claim 19 further comprising obtaining photometric data from the plurality of frames of the video data capture sequence using the pre-determined illumination sequence.
21. A 3D imaging method for obtaining a reconstructed 3D model of a target according to claims 18 to 20 further comprising a calibration step with respect to a reference image.
22. A 3D imaging method for obtaining a reconstructed 3D model of a target according to claims 18 to 21 further comprising a step of adjusting the radiation output of the or each illuminator to the optimum level for ambient light conditions.
23. A 3D imaging method for obtaining a reconstructed 3D model of a target according to claims 18 to 21 wherein each of the plurality of illuminators is synchronised with the others throughout the pre-determined illumination sequence.
24. A 3D imaging method for obtaining a reconstructed 3D model of a target according to claims 18 to 23 further comprising a step of aligning the phase relationship of the illumination sequence and the video data capture sequence using a timing marker generated by a mobile computing device.
25. A 3D imaging method for obtaining a reconstructed 3D model of a target according to claims 18 to 24 further comprising a step of illuminating the target for a fixed time period from at least one illuminator in advance of activating the pre-determined illumination sequence from a plurality of illuminators.
GB2111830.2A 2021-08-18 2021-08-18 Portable illumination device, imaging system and method Pending GB2610376A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2111830.2A GB2610376A (en) 2021-08-18 2021-08-18 Portable illumination device, imaging system and method

Publications (2)

Publication Number Publication Date
GB202111830D0 GB202111830D0 (en) 2021-09-29
GB2610376A true GB2610376A (en) 2023-03-08

Family

ID=77859907



Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2827175A2 (en) * 2013-07-12 2015-01-21 Princeton Optronics, Inc. 2-D planar VCSEL source for 3-D imaging
US20150077517A1 (en) * 2013-09-17 2015-03-19 Occipital, Inc. Apparatus for real-time 3d capture

