GB2551894A - Break analysis apparatus and method

Info

Publication number
GB2551894A
Authority
GB
United Kingdom
Prior art keywords
break
image
computing device
mobile computing
camera
Legal status
Withdrawn
Application number
GB1707572.2A
Other versions
GB201707572D0
Inventor
Hansen Mark
Hales Ian
Farooq Abdul
Smith Melvyn
Daniel Gwen
Current Assignee
Belron International Ltd
Original Assignee
Belron International Ltd
Application filed by Belron International Ltd
Publication of GB201707572D0
Publication of GB2551894A

Classifications

    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G06T 7/11 Region-based segmentation
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T 7/66 Analysis of geometric attributes of image moments or centre of gravity
    • G01N 21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N 21/958 Inspecting transparent materials or objects, e.g. windscreens
    • G01N 2021/8887 Scan or image signal processing based on image processing techniques
    • G01N 2021/9586 Windscreens
    • G06F 17/14 Fourier, Walsh or analogous domain transformations, e.g. Laplace, Hilbert, Karhunen-Loeve, transforms
    • G06T 2207/20036 Morphological image processing
    • G06T 2207/20056 Discrete and fast Fourier transform [DFT, FFT]
    • G06T 2207/20092 Interactive image processing based on input by user
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30164 Workpiece; Machine component

Abstract

A method, system and device are disclosed which enable the analysis of a break or crack in a vehicle glazing panel without the attendance of a technician. The driver of the vehicle photographs the break using their mobile phone and the image is transmitted to an image processing module and then to a break analysis module (S400). The break analysis module first isolates the crack from the background (S408) by applying a Fourier transform to the image (S402), followed by a Butterworth bandpass filter (S404) and an inverse Fourier transform (S406). A morphology routine (S410) is then applied to refine the image, and the break analysis module determines the size of the crack; on this basis it is determined whether the crack needs attention (S412) and the result is transmitted to the driver.

Description

(54) Title of the Invention: Break analysis apparatus and method
Abstract Title: Vehicle glazing crack analysis using mobile phone
Figure GB2551894A_D0001
Fig.4
At least one drawing originally filed was informal and the print reproduced here is taken from a later filed formal copy.
[Drawings: Figures 1 to 11, on sheets 1/8 to 8/8, accompany the description below.]
BREAK ANALYSIS APPARATUS AND METHOD
FIELD
This invention relates generally to an apparatus and a method. Particularly, but not exclusively, the invention relates to an apparatus and a method to be used to analyse breaks in vehicle glazing panels. Further particularly, but not exclusively, the invention relates to a method and an apparatus to be used to analyse cracks in glass, particularly a vehicle glazing panel.
BACKGROUND
When driving, debris and other material on the road can be deflected into the path of traffic. When such material collides with a windscreen it can cause cracks, breaks or other damage to manifest in the windscreen, which may necessitate repair or replacement of the vehicle glazing panel.
For reasons of safety and economy, it is imperative that such work is performed as quickly as possible, as cracks can propagate further into the windscreen under the influence of, say, cold weather, which can cause the crack to grow. This can change a crack which needs only a minor repair into one which necessitates a full replacement of the windscreen.
An assessment is required as to whether the damage to a vehicle glazing panel can be remedied by repair. If the assessment indicates that repair is not feasible then replacement of the glazing panel will be required.
Aspects and embodiments were conceived with the foregoing in mind.
SUMMARY
Viewed from a first aspect, there is provided a break analysis method for analysing breaks in a vehicle glazing panel, the method comprising: capturing an image of a break in a vehicle glazing panel; processing the image of the break.
Viewed from a second aspect, there is provided a break analysis apparatus for analysing breaks in a vehicle glazing panel, the apparatus comprising: a camera arranged to capture an image of a break in a vehicle glazing panel; a processing module operative to process the image of the break.
Optionally, the apparatus may comprise a mobile computing device which comprises the camera. A mobile computing device is an electronic device which is configured to capture images. This may include a mobile telephone (e.g. a smart phone), a laptop computer, a tablet, a phablet or a camera. The mobile computing device includes a camera to capture images.
The mobile computing device may also comprise the processing module.
A method or apparatus in accordance with the first and second aspects enables an image of a break in a surface to be used to analyse the break. This removes the need for physical attendance by a technician to perform any analysis on the crack.
The determination of the need for a replacement glazing panel may be based on the processing of the image. Thus, the method may include the step of determining whether or not the glazing panel needs to be replaced, and/or whether the glazing panel is suitable for repair, based on the processing of the image.
The image of the break may be captured at an angle inclined relative to the vehicle glazing panel.
The image may be captured by a mobile computing device held in contact with the surface of the glazing panel, wherein the mobile computing device includes a camera. The mobile computing device may for example be a mobile phone such as a smartphone provided with
a camera. The invention may be implemented via a software component for processing image data from the camera in order to determine whether the break can be repaired or replacement is needed. The software component may be downloaded to the device such as for example as a stand-alone app or the like. The software component may comprise an algorithm for making the determination and also preferably display user instructions for how to implement the determination method.
Accordingly according to another aspect the invention comprises a computer implemented method for determining technical information regarding a break present in a vehicle glazing panel, wherein the method includes the step of downloading a software component to a mobile computing device having a camera, the software component arranged to process image information captured via the camera in order to determine whether the break may be repaired or replacement is preferred.
The mobile computing device may be inclined to view the break at a predetermined position in a field of view of the camera. A datum position may be achieved by ensuring that the computing device is inclined to the panel with an edge contacting the panel.
The predetermined position at which the mobile computing device may be inclined to view the break may be indicated by indicia displayed on the camera or on the mobile computing device comprising the camera (for example on a screen).
Additionally or alternatively, the indicia may indicate the centre of the field of view (for example on a screen).
Optionally, the camera or mobile computing device may be initially laid flat on the surface of the glazing panel. The mobile computing device may then be pivoted or tilted away from the glazing panel, wherein at least a portion of the mobile computing device remains in contact with the glazing panel.
Optionally, the camera is positioned in a predetermined position relative to the break before the mobile computing device is pivoted. For example, the method may include
aligning a feature of the mobile computing device in a predetermined position relative to the break.
In some embodiments, an edge of the image capture module, such as the top edge, may be aligned adjacent (or immediately below) the lowest point of the break. This means that the initial distance between the camera and the lowest point of the break can be determined using the geometry of the mobile computing device.
The image capture module or mobile computing device may then be pivoted away from the glazing panel, wherein a bottom edge of the mobile computing device remains in contact with the glazing panel. The image of the break is then captured.
The method may include pivoting or tilting the mobile computing device until the break is viewable at a predetermined position in a field of view of the camera.
Optionally, the method includes pivoting the mobile computing device until the break is in the centre of the field of view of the camera.
The method may include using geometric parameters of the mobile computing device and lens parameters of the camera to estimate the parameters of the break. The parameters of the break include one or more spatial dimensions indicative of the size of the break.
For example, using the above method the geometric parameters of the mobile computing device and the lens parameters of the camera may be used to determine the length of the one or more legs of the break and/or the size (e.g. width/diameter) of the centre of the break.
Geometric parameters of the mobile computing device may be used to determine the angle of rotation (or pivot angle) between the mobile computing device and the glazing panel.
If the estimated size of the break (e.g. the length of one of the legs of the break) exceeds a given threshold then the method may determine that the glazing panel needs to be replaced. If the estimated size of the break (e.g. the length of one of the legs of the break) is less than a given threshold then the method may determine that the glazing panel is suitable for repair.
The processing of the image may be based upon geometric parameters of the mobile computing device used to capture the image of the break.
The processing of the image may be further based on chip parameters for the camera, and/or mobile computing device.
The processing of the image may generate a set of data points which are used to generate a diameter for the break which may be used to determine the requirement for a replacement vehicle glazing panel.
The method may include outputting a signal or indication indicating whether the glazing panel needs to be repaired or replaced.
The break in the glazing panel may comprise a centre and one or more legs. This break formation is common when a small stone or other object impacts a glazing panel. The one or more legs (or cracks) generally radiate from the centre of the break.
The centre of the break may be substantially circular in shape.
The determination of the need for a replacement or repair of a glazing panel may comprise generating data indicative of a centre of a break and one or more legs of a break.
The method may include identifying a centre of the break and one or more legs of the break.
The method may include generating a scaling factor indicating the relative length of the one or more legs of the break compared to the size of the centre of the break. The size of the centre of the break may be the diameter, width and/or length of the centre of the break.
The method may include estimating the length of the one or more legs of the break (i.e. the absolute length in cm or mm etc.) using the scaling factor. For example, the length of the one or more legs may be estimated by multiplying the scaling factor by a predetermined value.
The predetermined value may be an estimate of the actual (i.e. absolute) size of the centre of the break. This provides the advantage that no calibration object is required, which is at least more convenient for the user.
Thus, the method of the present invention may comprise determining the scale of the image to estimate the length of the one or more legs of the break.
If the estimated length of the one or more legs exceeds a given threshold then the method may determine that the glazing panel needs to be replaced. If the estimated length of the one or more legs is less than a given threshold then the method may determine that the break in the glazing panel is suitable for repair.
The method may comprise outputting a signal indicating that repair of the glazing panel is required if the estimated length of the one or more legs is less than the given threshold.
The method may comprise outputting a signal indicating that replacement of the glazing panel is required if the estimated length of the one or more legs exceeds the given threshold.
It has been found that the size of the centre of the break generally varies less than the length of the one or more legs of the break when comparing different breaks in glazing panels. As such, the predetermined value may be an average, or mode of the measured sizes of the centre of a break in a glazing panel.
Optionally, the estimate of the actual width (or diameter) of the centre of the break (i.e. the predetermined value) may be between 1mm and 3mm. A particularly preferred predetermined width (or diameter) of the centre of the break may be 2mm. These ranges / values have been determined from studies of breaks carried out by the applicant.
The effect of this is that the estimated size of the centre of the break, such as the estimated diameter of the centre of the break, can be used to estimate the length of the legs of the break as the predetermined value can be used to scale between the relative length of the one or more legs relative to the size of the centre of the break and the estimated actual length of the one or more legs of the break.
For example, if we know that the centre of the break is always going to be around 2mm in width (or diameter) and the generated image data indicates that the legs are twice the length of the diameter of the centre of the break, then the method may comprise multiplying 2mm by a scaling factor of 2. This estimates that the legs are 4mm in length. This helps to build a picture, in the data, of the dimensions of the break.
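As an illustration only, the calculation just described can be expressed as in the following sketch. The 2 mm centre width reflects the studies referred to above, while the repair/replace threshold of 10 mm and the pixel measurements are hypothetical values chosen purely for demonstration; the method itself does not prescribe them.

```python
# Illustrative sketch only: the threshold and pixel measurements below are
# hypothetical example values, not values prescribed by the method.
ASSUMED_CENTRE_WIDTH_MM = 2.0        # typical break-centre width from the studies above
EXAMPLE_REPLACE_THRESHOLD_MM = 10.0  # hypothetical repair/replace threshold


def estimate_leg_length_mm(leg_length_px: float, centre_width_px: float) -> float:
    """Scale a leg length measured in pixels using the break centre as the reference."""
    scaling_factor = leg_length_px / centre_width_px
    return scaling_factor * ASSUMED_CENTRE_WIDTH_MM


def needs_replacement(leg_length_mm: float) -> bool:
    """True if the estimated leg length exceeds the example threshold."""
    return leg_length_mm > EXAMPLE_REPLACE_THRESHOLD_MM


# Legs measured at twice the centre width, as in the example above:
length = estimate_leg_length_mm(leg_length_px=200.0, centre_width_px=100.0)
print(length, needs_replacement(length))  # 4.0 False -> suitable for repair
```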
The generated (or estimated) length of the one or more legs may be used to indicate the estimated size of the break. The size of the break may be compared to a threshold parameter to determine the need for replacement or repair of the glazing panel.
If the estimated size of the break exceeds a given threshold then the method may determine that the glazing panel needs to be replaced. If the estimated size of the break is less than a given threshold then the method may determine that the glazing panel is suitable for repair.
The comparison may be between the break threshold parameter and the largest distance across the break.
Optionally, the predetermined estimate of the size of the centre of the break may be dependent upon one or more parameters. The parameters may be input by the user and/or pre-set into the device or processing module. For example, the parameters may include: one or more properties of the glazing panel (such as type, size, etc.), and/or the speed the vehicle was traveling at when the break occurred.
The processing of the image may comprise filtering the image to remove a background portion to identify the break.
Morphological refinement may be applied to the image to remove any clutter from the image and improve the quality of the image data used as the basis for a determination of whether a replacement glazing panel is required.
The method may include cleaning the glazing panel prior to capturing the image of the break. This may assist in removing any dirt that could affect the processing of the image. For example, there is a risk that dirt could be construed as a break by image processing software.
The method may include disabling a flash function of the image capturing module or device before capturing an image of the break. If flash photography is used then the light may adversely affect the accuracy of the image processing software. For example, the flash may be reflected in the glazing panel which may affect the identification or analysis of the break.
The method may be implemented using computer implemented instructions which, when installed into memory, instruct a processor to implement a method as defined above. A downloadable software component (such as an app) is preferred.
It will be appreciated that any features of the method may be performed using the apparatus of the present invention.
These and other aspects of the present invention will be apparent from and elucidated with reference to, the embodiment described herein.
DESCRIPTION
First and second embodiments of the present invention will now be described, by way of example only, and with reference to the accompanying drawings, in which:
Figure 1 illustrates a windscreen with a crack;
Figure 2 illustrates the use of a camera to capture an image of a crack;
Figure 3 illustrates a processing module which may be used to analyse the crack in the windscreen of Figure 1;
Figure 4 illustrates a flow chart detailing the steps involved in assessing a crack in the windscreen using the system of Figure 3;
Figure 5 illustrates an image of a Fourier transform of a crack;
Figure 6 illustrates a delineated image of a crack;
Figure 7 illustrates an arrangement which may be used to model a crack 102;
Figure 8 illustrates the steps involved in generating break parameters;
Figure 9a schematically illustrates the field of view of a camera with a bird’s eye view of a windscreen;
Figure 9b schematically illustrates the field of view of a camera with an inclined view of a windscreen;
Figure 10 illustrates how a camera may be modelled in order to use the parameters of the camera to analyse the crack; and
Figure 11 illustrates an image which is output from the system to determine the size of a crack in a windscreen.
In a first embodiment, Figure 1 illustrates a glass windscreen 100 with a crack 102 caused by a stone which had been flicked onto the windscreen. The driver of the car in which the
glass windscreen 100 is housed then captures an image of the crack 102 using a mobile telephone 104 which comprises a camera 106 which is used to capture the image of the crack 102. This arrangement is shown from a side perspective in Figure 2. The focal length of the camera 106 is fixed at less than 100 millimetres to cause the focus of the camera to be tight at a small distance.
The image of the crack 102 is then captured, responsive to user input, and mobile telephone 104 is configured to provide a prompt to the user to request that they indicate that they would like the image to be transmitted from the camera 106 to an image processing module 108 which we will now describe with reference to Figure 3. This step enables the user to assess the quality of the image themselves so that they may elect to capture another image if they do not consider the image to be clear such as, for example, in inclement weather conditions where condensation may be deposited on the lens of camera 106.
The camera 106 converts the captured image to an array of image data using any suitable method. The camera 106 may save the image data as an exchangeable image file (EXIF) where the lens parameters of the camera are also stored.
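By way of illustration, the lens parameters stored alongside the image might be read back as in the following sketch. The use of the Pillow library is an assumption made for this example; the method does not prescribe any particular library, and the tag numbers shown are the standard EXIF identifiers.

```python
# Sketch: read the focal length from a captured image's EXIF data (Pillow assumed).
from PIL import Image

EXIF_IFD = 0x8769          # pointer to the Exif sub-IFD
FOCAL_LENGTH_TAG = 0x920A  # standard EXIF "FocalLength" tag


def read_focal_length_mm(path: str):
    """Return the focal length recorded in the image file, or None if absent."""
    exif = Image.open(path).getexif()
    # FocalLength normally lives in the Exif sub-IFD rather than the base IFD.
    value = exif.get_ifd(EXIF_IFD).get(FOCAL_LENGTH_TAG) or exif.get(FOCAL_LENGTH_TAG)
    return float(value) if value is not None else None
```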
That is to say, the camera 106 is an example of an image capture module which is operative to capture the image of the crack 102 and to transmit the captured image in the form of captured image data to the image processing module 108 where it may be processed further to extract details of the crack.
The image processing module 108 may form part of the mobile telephone 104 or it may be geographically distant relative to the mobile telephone 104. The image data is transmitted to the image processing module 108 by any suitable means such as, for example, a data bus or the internet.
In a preferred embodiment the image processing module 108 is implemented as a software component downloaded to the mobile phone. This may be implemented by means of downloading the software component as an app. The software component is capable of
processing the image data from the camera to determine whether the break is suitable for repair or whether replacement of the glazing panel may be required as the preferred solution.
The break analysis module 112 may be a software component downloaded to the mobile phone, preferably as a single download in combination with the image processing module 108. A single downloaded software component is preferably arranged to process the image data from the phone camera and analyse the break using one or more algorithms implemented in software.
In one implementation, the captured image data is received by the image processing module 108 at data input interface 110. The image data is then transmitted to the break analysis module 112 which is configured to access a routine library 114 where routines may be stored to perform operations on the captured data throughout the analysis of the captured image data.
The break analysis module 112 is also configured to access a device parameters database 116 where parameters related to the mobile telephone 104 are stored.
The parameters related to the mobile telephone 104 include chip parameters which define the image capture performance of the camera 106 such as, for example, focal length, and sensor size of the lens, and the dimensional parameters of the mobile telephone 104 such as, for example, the length of the mobile telephone 104 and the distance between the top edge of the mobile telephone 104 and the centre of the image of the camera 106.
The break analysis module 112 is also operative to interface with a display module 118 which is operative to display image data transmitted from the break analysis module 112 on a display and to display parametric data transmitted from the break analysis module 112 on a display.
We will now describe, with reference to Figure 4, the analysis of crack 102 using the break analysis module 112.
The break analysis module 112 receives the image data in a step S400. The break analysis module 112 then, in a step S402, calls a Fourier transform routine from the routine library 114 and uses the Fourier transform routine to apply a discrete two-dimensional Fourier transform to the image data to produce a transform image as illustrated in Figure 5a.
In Figure 5a we can see the transformed image. Figure 5a plots spatial frequency against the magnitude of the respective Fourier component. It will be seen that low spatial frequencies occupy the centre of the transformed image and higher spatial frequencies can be seen as we move away from the centre of the transformed image.
Taking a Fourier Transform of the image enables break analysis module 112 to perform analysis of the image in terms of its component spatial frequencies and phase. As will now be described, it enables the removal of the spatial frequencies in which we have no interest and enables us to reconstruct the image we are interested in by retaining the spatial frequencies of interest.
A Butterworth bandpass filter is then applied to the transformed image in a step S404 by break analysis module 112. The mask implemented by the Butterworth bandpass filter is illustrated in Figure 5b. The Butterworth bandpass filter implements a mask on the transformed image shown in Figure 5a and removes the low spatial frequencies (shown by the black dot in the centre of Figure 5b) and the very high spatial frequencies (the dark border in the image in Figure 5b) which represents specks of dust and dirt on the image.
The Fourier transform of the image data is then inverted in a step S406 by calling a Fourier Transform Inversion routine from the routine library 114 to perform an inverse discrete two-dimensional Fourier transform on the transformed image data.
The performance of the inverse Fourier transform on the transformed image data transforms the transformed image data from the Fourier domain into the real domain to generate real domain image data. The resulting real domain image data is illustrated in Figures 6a and 6b.
The use of the Fourier transform to produce the image illustrated in Figure 6b has the effect of isolating the crack from the background.
The real domain image data is compared to a threshold intensity of 4, in a step S408, to delineate the areas of interest more clearly.
Figure 6a shows the real domain image data without using a Butterworth bandpass filter. Figure 6b shows the real domain image data after applying the Butterworth bandpass filter to the transformed data illustrated in Figure 5a and applying thresholding to generate a binary image using a threshold intensity of 4. The Butterworth bandpass filter in this example has a rolloff value of 3.
The upper and lower cutoff frequencies of the Butterworth bandpass filter can be modelled as being linearly dependent on the number of pixels in the longest side of the image (denoted as m). This relationship can be altered using standard trials and numerical experiments.
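A minimal sketch of the frequency-domain processing of steps S402 to S408 is given below, using NumPy. The threshold intensity of 4 and the filter order (rolloff) of 3 follow the text; constructing the band-pass as the product of Butterworth high-pass and low-pass masks, and the example cutoff fractions of m, are assumptions made for this illustration.

```python
# Sketch of S402-S408: isolate the break in the frequency domain (NumPy assumed).
import numpy as np


def isolate_break(gray: np.ndarray, low_cut: float, high_cut: float,
                  order: int = 3, threshold: float = 4.0) -> np.ndarray:
    """Return a binary mask of candidate break pixels from a greyscale image."""
    F = np.fft.fftshift(np.fft.fft2(gray))                 # S402: 2-D DFT, DC at centre

    rows, cols = gray.shape
    v, u = np.ogrid[-rows // 2:rows - rows // 2, -cols // 2:cols - cols // 2]
    d = np.sqrt(u ** 2 + v ** 2) + 1e-9                    # distance from the DC component

    # S404: Butterworth band-pass built as high-pass(low_cut) * low-pass(high_cut)
    high_pass = 1.0 / (1.0 + (low_cut / d) ** (2 * order))
    low_pass = 1.0 / (1.0 + (d / high_cut) ** (2 * order))
    filtered = F * high_pass * low_pass

    # S406/S408: inverse DFT back to the spatial domain, then threshold
    restored = np.abs(np.fft.ifft2(np.fft.ifftshift(filtered)))
    return restored > threshold


# Example cutoffs proportional to the longest image side m (assumed scaling):
# mask = isolate_break(img, low_cut=0.01 * m, high_cut=0.25 * m)
```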
The image illustrated in Figure 6b is an image which can include more than just the crack 102. It may also include image data which has passed through steps S400 to S408 but is due to specks of dirt on the windscreen and other artefacts of the processing performed by break analysis module 112.
The use of the threshold intensity of 4 to generate the binary image shown in Figure 6b helps to show the areas of interest more clearly. The image shown by the real domain image data illustrated in Figure 6b highlights the crack - including the central crack area which is an area of low spatial frequency.
As can be seen, the Fourier approach does a very neat job of isolating the crack region from the cluttered background, assuming that the crack is in focus and the background is not.
The break analysis module 112 can then call a morphology routine from the routines library 114 to remove any clutter from the image illustrated in Figure 6b in a step S410.
The morphology routine performs several operations on the image illustrated in Figure 6b. This image is a binary image. Regions of black are zero-valued and regions of white are valued at non-zero. The value of the pixels is stored in the memory of the break analysis module 112 and is the result of the processing in steps S400 to S408.
The first of these operations is a fill-in operation which uses morphological reconstruction to fill in pixel-sized regions of black that are surrounded by regions of white, by replacing the zero value with a non-zero value in accordance with the process set out in [1].
The second of these operations is a clean-up operation which discards very small non-zero valued regions. Very small non-zero valued regions are defined as non-zero valued regions which occupy an area less than the square of (largest dimension of image/500). The largest dimension of the image can be determined by the break analysis module simply by comparing the width of the image with the height of the image.
The first morphological operation is then repeated to fill in any pixel sized regions of black that are surrounded by regions of white that have been generated by the second morphological operation. This is a third morphological operation.
A fourth morphological operation is then performed to join up any legs in the image of the crack 102 which have gaps in them. This is implemented using a morphological closing operation as described in [2]: a dilation is performed followed by an erosion, making use of a disk-shaped structuring element with a radius of (largest dimension of image/5312) multiplied by 20. The value of 20 has been determined empirically and may change. This value can be determined without any undue burden for different image resolutions.
The first morphological operation is then repeated to fill in any pixel sized regions of black that are surrounded by regions of white that have been generated by the second morphological operation. This is a third morphological operation.
A sixth morphological operation is then performed to discard any small non-zero regions. Small regions are defined as regions with an area equal to the square of (largest dimension of image/100).
A seventh morphological operation is then performed to remove any disconnected objects in the image. Disconnected objects are objects that are further away than one and a half times the radius of the largest object nearest the centre of the image. This means that legs of the crack that are still disjointed are included but superfluous artefacts are excluded. The seventh morphological operation is implemented by finding, for each remaining region in the image, the centroid, i.e. the centre of mass of the region, and the length of the major axis of the region. An additional weighting is assigned to each region area based on how close the centroid is to the centre of the image.
The weighting is w = 1/d², where d is the Euclidean distance between the centroid and the centre of the image. The largest region closest to the centre of the image is selected, and its major axis length is used to set a radius (three quarters of the major axis length from its centroid) outside of which any regions are discarded. That is to say, the morphology routine and the centroid-to-boundary calculations are configured to retain all "blobs" within a distance from the centre of the crack of the radius of the largest object in the image plus half of that radius, to ensure that any discontinuities in the crack 102 are not lost.
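The sketch below indicates how the seven morphological operations might be realised with SciPy and scikit-image. The numeric constants follow the text; the particular library routines, and the combination of region area with the 1/d² weighting when selecting the central region, are assumptions made for illustration.

```python
# Sketch of the morphological refinement (S410); SciPy and scikit-image assumed.
import numpy as np
from scipy import ndimage
from skimage import measure, morphology


def refine_mask(mask: np.ndarray) -> np.ndarray:
    m = max(mask.shape)                                                          # largest image dimension

    mask = ndimage.binary_fill_holes(mask)                                       # op 1: fill pixel-sized holes
    mask = morphology.remove_small_objects(mask, min_size=int((m / 500) ** 2))   # op 2: clean-up
    mask = ndimage.binary_fill_holes(mask)                                       # op 3: fill again
    radius = max(1, int(m / 5312 * 20))                                          # structuring-element radius (per text)
    mask = morphology.binary_closing(mask, morphology.disk(radius))              # op 4: join gaps in legs
    mask = ndimage.binary_fill_holes(mask)                                       # op 5: fill again
    mask = morphology.remove_small_objects(mask, min_size=int((m / 100) ** 2))   # op 6: discard small regions

    # op 7: keep only regions close to the largest region nearest the image centre
    labels = measure.label(mask)
    props = measure.regionprops(labels)
    if not props:
        return mask
    centre = np.array(mask.shape) / 2.0
    main = max(props, key=lambda r: r.area /
               (np.linalg.norm(np.array(r.centroid) - centre) ** 2 + 1.0))
    keep_radius = 1.5 * main.major_axis_length / 2.0                             # radius plus half of that radius
    for r in props:
        if np.linalg.norm(np.array(r.centroid) - np.array(main.centroid)) > keep_radius:
            mask[labels == r.label] = False
    return mask
```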
The image data, after the morphology has been applied to refine the image data, can then be used to determine the dimensions of the crack 102.
The break analysis module 112 applies further edge detection, morphology, blurring and thresholding to determine the centre of the crack 102.
It has been observed through experimentation that the centre of a crack is usually around 2mm in diameter. The break analysis module 112 is operative to, using the refined image data and the data which results from the determination of the centre of the crack 102, estimate the length of the legs of the crack 102 and determine a proportional value which characterises the length of the legs compared to the diameter of the centre of the crack 102, that is a scaling factor for the legs compared to the centre of the crack 102. Using the observation that the centre of the crack is usually 2mm, the scaling factor can then be used to determine the length of the legs. This provides uncalibrated analysis of the size of a crack 102.
The determined length of the legs can then be used to approximate the size of the crack 102 and enables the break analysis module 112 to output a determination of whether a replacement windscreen will be necessary or whether repair will be sufficient, as it is the size of a crack which is important in making this determination. By comparing the size of the crack 102 to a repair/replacement threshold the break analysis module 112 can automate this determination. The break analysis module 112 will output this determination to the display module 118.
The output, i.e. whether a replacement windscreen is needed, or not, is then displayed using display module 118 in a step S412.
The use of an observed estimate of the centre of a crack to estimate the size of the legs of a crack, which relies on the assumption of a degree of radial spikeyness in a crack, means that an image can be taken of a crack and used to analyse the crack without any calibration at the scene to provide a scale for the crack 102.
This method enables an analysis of the crack to be performed in a wide range of conditions and without the attendance of a technician.
In a second embodiment, we now describe how to derive parameters of the crack 102 using the parameters of the mobile telephone 104 and the lens of the camera 106. This can assist in correcting for any influence that the angle has on the image.
The second embodiment can be combined with the first embodiment without departing from this disclosure.
The arrangement illustrated in Figure 2 may enable the dimensions of the crack to be estimated using the chip parameters of the camera 106 and the geometric parameters of the mobile telephone 104.
In order to calculate the angle of rotation (or pivot or tilt angle) of the mobile telephone 104 relative to the windscreen, we can use the geometric parameters of the mobile telephone 104.
Positioning the crack 102 in the centre of the field of view of the lens of the camera 106 enables a right-angled triangle to be drawn. This is described with respect to Figure 7.
After the crack 102 has been discovered, the mobile telephone 104 is laid flat on the windscreen with the top edge at the base of the crack 102. This means that the distance between the bottom edge of the mobile telephone 104 and the base of the crack is equal to the length of the mobile telephone 104. The mobile telephone 104 is then inclined from the bottom edge of the mobile telephone 104 until the crack 102 is in the centre of the field of view of camera 106. Indicia may be provided on the display of mobile telephone 104 to indicate the centre of the field of view.
The distance between the bottom edge of the mobile telephone 104 and the lens of the camera 106 can be retrieved from device parameters database 116. There is therefore formed a right angled triangle defined by the angle of rotation between the bottom edge of the mobile telephone 104 and the windscreen 100, the z-axis of the camera lens and the distance formed between the bottom edge and the base of the crack.
We now describe how the geometric parameters of the mobile telephone 104 and the lens parameters can be used to estimate the parameters of the crack.
An image of the crack is captured consistently with the process described above, in that the mobile telephone 104 is rotated until the crack 102 is in the centre of the field of view of camera 106.
This enables a right angled triangle to be formed by the z-axis of the camera lens, the distance formed between the bottom edge and the base of the crack and the length between the bottom edge and the camera lens.
With reference to Figure 8, we describe how the geometry of the mobile telephone 104 and the lens parameters can be used to estimate the parameters of the break.
In step S800, break analysis module 112 retrieves the distance formed between the bottom edge of mobile telephone 104 and the base of the crack (i.e. the length of mobile telephone 104) and the length between the bottom edge of the mobile telephone 104 and the camera lens from the device parameters database 116. The angle of rotation of mobile telephone 104 can then be calculated in a step S802 using the cosine relationship between the distance formed between the bottom edge and the base of the crack and the length between the bottom edge and the camera lens.
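A short sketch of the pivot-angle calculation of step S802 follows: because the optical axis of the camera is perpendicular to the body of the phone, the phone length forms the hypotenuse of the right-angled triangle described above. The phone dimensions used in the example are hypothetical.

```python
# Sketch of S802: pivot angle from the phone geometry (values are examples only).
import math


def pivot_angle_rad(phone_length_mm: float, edge_to_lens_mm: float) -> float:
    """Angle between the phone and the glazing panel when the break is centred."""
    return math.acos(edge_to_lens_mm / phone_length_mm)


# e.g. a 140 mm phone with the lens 120 mm from its bottom edge tilts to about 31 degrees
print(math.degrees(pivot_angle_rad(140.0, 120.0)))
```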
We then need to use the camera's parameters to derive a plane-to-plane homographic mapping between the pixels of the camera and the real-world spatial dimensions of the image. A plane-to-plane homographic mapping routine is then called from the routines library 114 in a step S804 to derive the real-world spatial dimensions of the image.
The derivation of the homographic map to provide the real-world spatial dimensions of the image is based upon the "pin-hole camera model", in which the view area of the camera is treated as a rectangular-based cone expanding out relative to the lens of the camera 106. This is illustrated schematically in Figures 9a and 9b.
Figure 9a is for illustration only and illustrates the case where the mobile telephone 104 is located directly above the windscreen 100. That is to say, the camera 106 provides a bird’s
eye view of the windscreen 100. In this instance the view area A1 is a rectangle and each pixel occupies the same amount of real-world space (in millimetres).
In the present arrangement, however, as illustrated in Figure 9b, the mobile telephone 104 is at an angle relative to the windscreen 100. The angle was calculated in step S802. The view area A2 then becomes a trapezium, which means that pixels close to the camera represent fewer millimetres than pixels that are further away.
We describe the theoretical basis for how the plane-to-plane homographic map is derived, but it will be understood that this will be implemented numerically using routines made available to the break analysis module 112 by the routines library 114.
Consider the rectangular image sensor which forms part of camera 106, angled from a flat plane by an angle of rotation θ; the area observed by the sensor maps to an isosceles trapezium. The widths of the bases of this trapezium are directly dependent upon θ. Using the plane-to-plane homographic mapping routine we can use this principle to numerically estimate the parameters of the crack 102 using the knowledge of the pixels on the camera 106.
We define a 3D rotation matrix, about the x-axis, as a function of θ, as:

$$R_x(\theta) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{pmatrix}$$
It will be understood that θ is the angle of the mobile telephone 104 with respect to the windscreen. We can define an origin in Cartesian x, y and z dimensions at (0,0,0), i.e. the world origin. This is the point in the middle of the base edge of the mobile telephone 104, which is aligned with the x-axis. The y-axis of this coordinate system is then directed vertically from the base to the top of the phone. We assume, for simplicity and without loss of generality, that the camera lies upon the y-axis at some distance dc from the base of the phone. The camera centre is therefore defined as:

$$\mathbf{c} = (0, d_c, 0)^T$$
The focal length and the vertical and horizontal sensor sizes of the lens of the camera 106 can then be retrieved from device parameters database 116 in a step S806. These parameters may be called chip parameters. This enables us to calculate the area of view from the camera. The area of view is defined by two quantities, called the horizontal and vertical angle of view (respectively denoted as $\alpha_h$ and $\alpha_v$), which are defined by the following equations:

$$\alpha_h = 2\arctan\left(\frac{s_x}{2f}\right), \qquad \alpha_v = 2\arctan\left(\frac{s_y}{2f}\right)$$

where $s_x$ and $s_y$ are the horizontal and vertical sensor sizes and $f$ is the focal length.
Having calculated the horizontal and vertical angle of view, break analysis module 112 uses the plane-to-plane homographic mapping routine to calculate the edges of the view pyramid to provide us with a field of view on the windscreen 100 in a step S808. This provides us with the trapezium illustrated schematically in Figure 9b, i.e. the trapezium that we need to correct to compensate for the differing amounts of space that are occupied by the pixels further from the lens relative to the pixels nearer to the lens. That is to say, we need to scale the trapezium to ensure that the subsequent calculations attribute equal amounts of real-world space to each pixel.
This is modelled in the plane-to-plane homographic mapping routine used by the break analysis module 112 by a line, i.e. a ray, which extends from the lens along the line of view between the lens and the crack 102. This line will intersect the plane represented by the windscreen; that is, within the plane-to-plane homographic mapping routine the windscreen is modelled as a plane.
In step S810, the plane-to-plane homographic mapping routine calls a numeric solver routine from the routines library 114 to solve the simultaneous equations which define the plane of the windscreen and the line extending from the lens along the line of view between the lens and the crack 102. The plane-to-plane homographic mapping routine is programmed under the assumption that the plane defining the windscreen 100 is flat and the camera 106 is tilted with respect to it. This provides the intersection between the line extending from the lens along the line of view and the plane of the windscreen 100.
Theoretically, this can be expressed as the calculation of the rays emanating from the point at the centre of the camera through the corners of the sensor/image-plane and onto the windscreen which forms the aforementioned trapezium.
We first obtain the intersection of the rays with a plane, parallel to the image plane, at unit distance, given horizontal and vertical viewing angles of $\alpha_h$ and $\alpha_v$ respectively as defined above.
There are four rays, one for each corner of the rectangular sensor. The minimum and maximum x values can be defined as:

$$x_{min} = -\tan\left(\frac{\alpha_h}{2}\right), \qquad x_{max} = \tan\left(\frac{\alpha_h}{2}\right)$$

Similarly, we can define the minimum and maximum y values as:

$$y_{min} = -\tan\left(\frac{\alpha_v}{2}\right), \qquad y_{max} = \tan\left(\frac{\alpha_v}{2}\right)$$

We can then define the corners of the rectangular sensor as:

$$\mathbf{x}_{tl} = (x_{min}, y_{max}, 1)^T, \quad \mathbf{x}_{tr} = (x_{max}, y_{max}, 1)^T, \quad \mathbf{x}_{bl} = (x_{min}, y_{min}, 1)^T, \quad \mathbf{x}_{br} = (x_{max}, y_{min}, 1)^T$$

Normalising these coordinates by their magnitude provides us with the direction of the four rays. We define the ray direction for each of these coordinates as:

$$\mathbf{d}_i = \frac{\mathbf{x}_i}{\lVert\mathbf{x}_i\rVert}, \qquad i \in \{tl, tr, bl, br\}$$
If we suppose that the phone is rotated, about the x-axis, by θ, we can calculate that the camera centre's position is now:

$$\mathbf{c}' = R_x(\theta)\,(0, d_c, 0)^T$$

This enables us to define the direction of the rays as:

$$\mathbf{d}'_i = R_x(\theta)\,\mathbf{d}_i, \qquad i \in \{tl, tr, bl, br\}$$

This gives us the rays in Cartesian coordinates with a known point of intersection with the plane parallel to the image plane, and we know that this intersection occurs only once. This provides a trapezium indicating the field of view in the real world.
We define the corners of the trapezium as $\mathbf{X}_i$, $i \in \{tl, tr, bl, br\}$. We calculate the vertices of the trapezium using the line-plane intersection formula described in [3]. We know that the normal to the windscreen plane is the vector $\mathbf{n} = (0, 0, -1)^T$ and that the plane passes through the world origin, which means that the intersection formula simplifies to:

$$\mathbf{X}_i = \mathbf{c}' - \frac{\mathbf{c}' \cdot \mathbf{n}}{\mathbf{d}'_i \cdot \mathbf{n}}\,\mathbf{d}'_i, \qquad i \in \{tl, tr, bl, br\}$$

The points $\mathbf{X}_i$ define the vertices of the trapezium that we need in order to define the homographic mapping H from the image plane to the plane in the real world, using the four-point correspondence technique between the trapezium vertices and the image coordinates:

$$\mathbf{x}_{tl} = (0, 0), \quad \mathbf{x}_{tr} = (w, 0), \quad \mathbf{x}_{bl} = (0, h), \quad \mathbf{x}_{br} = (w, h)$$

where w is the width of the image and h is the height of the image. The algorithm by which this homographic map is obtained is discussed in [4].
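The sketch below works through steps S804 to S812 numerically: it projects the four sensor-corner rays onto the windscreen plane and derives the pixel-to-millimetre homography. The coordinate conventions, the pairing of sensor corners with image corners and the use of OpenCV's getPerspectiveTransform for the four-point correspondence are assumptions made for this illustration rather than the prescribed implementation.

```python
# Sketch of S804-S812: view trapezium on the windscreen and pixel-to-mm homography.
# NumPy and OpenCV assumed; coordinate conventions are illustrative only.
import math
import numpy as np
import cv2


def view_trapezium_and_homography(f_mm, sx_mm, sy_mm, w_px, h_px,
                                  phone_len_mm, edge_to_lens_mm):
    theta = math.acos(edge_to_lens_mm / phone_len_mm)        # pivot angle (S802)
    a_h = 2.0 * math.atan(sx_mm / (2.0 * f_mm))              # horizontal angle of view (S806)
    a_v = 2.0 * math.atan(sy_mm / (2.0 * f_mm))              # vertical angle of view (S806)

    # Camera centre after pivoting about the bottom edge (the world origin).
    c = np.array([0.0, edge_to_lens_mm * math.cos(theta), edge_to_lens_mm * math.sin(theta)])
    # The optical axis points at the base of the break, which lies at (0, phone_len, 0).
    z_cam = np.array([0.0, phone_len_mm, 0.0]) - c
    z_cam /= np.linalg.norm(z_cam)
    x_cam = np.array([1.0, 0.0, 0.0])                        # the bottom edge stays on the x-axis
    y_cam = np.cross(z_cam, x_cam)

    corners_px = [(0, 0), (w_px, 0), (w_px, h_px), (0, h_px)]
    corners_cam = [(-1, -1), (1, -1), (1, 1), (-1, 1)]       # sensor-corner signs (assumed pairing)
    world = []
    for sx, sy in corners_cam:
        # Ray through a sensor corner, expressed in world coordinates (S808).
        d = sx * math.tan(a_h / 2) * x_cam + sy * math.tan(a_v / 2) * y_cam + z_cam
        t = -c[2] / d[2]                                     # intersection with the plane z = 0 (S810)
        world.append((c + t * d)[:2])                        # (x, y) on the windscreen, in mm

    H = cv2.getPerspectiveTransform(np.float32(corners_px), np.float32(world))  # S812
    return np.array(world), H                                # trapezium corners and pixel->mm map
```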
The height of the camera above the windscreen can be calculated by the break analysis module 112 using the Pythagoras theorem as the distance formed between the bottom edge of mobile telephone 104 and the base of the crack (i.e. the length of mobile telephone 104) and the length between the bottom edge of the mobile telephone 104 and the camera lens
have been retrieved from the device parameters database 116 in step S800 and are still in the memory of the break analysis module 112.
The output from step S810 is the trapezium of view in the real world (X1, X2, X3, X4). A comparison between the parameters (X1, X2, X3, X4) and the corners of the captured image on the windscreen (performed by break analysis module 112 in step S812) provides the scaling that is needed to map the location of the pixels of camera 106 to locations in millimetres on the field of view on the windscreen 100. This provides us with the plane-to-plane homographic map. The scaling is in the form of a 3x3 matrix which represents scale, rotation, skew and translation between the field of view of the camera and the windscreen 100.
The plane-to-plane homographic map enables the correction of the effect of the perspective on the captured image and the conversion from pixel dimensions to millimetres which enables the break analysis module 112 to derive dimensional parameters for the crack 102.
The plane-to-plane homographic map is a matrix that maps the two-dimensional image plane of the camera 106 onto a plane representing the windscreen.
The output from the plane-to-plane homographic map provides an orthorectified mask, in millimetres, indicating the location and shape of the crack.
Responsive to this output from the plane-to-plane homographic map, which is, as will be understood, the output from the plane-to-plane homographic mapping routine, the break analysis module 112 calls a convex hull calculation routine from the routines library 114. The locations in millimetres on the field of view of the windscreen are provided to the convex hull calculation routine from the routines library 114.
A convex hull is, in summary, the smallest convex region which covers each of the locations in millimetres on the field of view. The output from the convex hull calculation routine is data which can be expressed, in simple terms, as a "blob" which will be the same size as the detected crack 102. This enables analysis to be performed on the detected crack 102 using the blob.
The break analysis module 112 then calls a smallest-circle routine from the routines library 114 which implements a numerical solution to the smallest circle problem for the convex hull which is output from the convex hull calculation routine. This routine outputs the smallest circle which encloses each of the points in the convex hull and therefore provides a minimum radius for the crack 102.
The data representing the convex hull, the data representing the solution to the smallest circle problem for the convex hull and the calculated radius for the crack are each stored by the break analysis module 112 in storage which is either local to the processing module 108 or remote relative to the processing module 108.
That is to say, break analysis module 112 has used the geometric parameters of the mobile telephone 104 and the parameters of the camera 106 to generate a radius for the crack 102.
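The convex hull and smallest-circle steps might be sketched as follows; OpenCV's convexHull and minEnclosingCircle are used here as stand-ins for the corresponding routines in the routines library 114, which is an assumption made for illustration.

```python
# Sketch: convex hull of the break locations (in mm) and its smallest enclosing circle.
import numpy as np
import cv2


def break_radius_mm(points_mm: np.ndarray):
    """points_mm: (N, 2) array of break pixel locations already mapped to millimetres."""
    pts = points_mm.reshape(-1, 1, 2).astype(np.float32)
    hull = cv2.convexHull(pts)                        # the "blob" covering the break
    (cx, cy), radius = cv2.minEnclosingCircle(hull)   # smallest circle enclosing the hull
    return (cx, cy), radius                           # centre and minimum radius, in mm


# A radius of 8 mm, as in the Figure 11 example, corresponds to a 16 mm diameter.
```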
The parameters and the circle output from the smallest circle routine can then be displayed using display module 118 in a step S814.
An example image which may be provided by the display module 118 is illustrated in Figure 11. In this instance the diameter of the smallest circle is indicated as 16 mm, which means a radius of 8mm. The estimated largest crack diameter in this case is 16 mm. The effect here is that a minimum size for the crack is estimated and can be used to determine the necessity for a replacement windscreen.
The estimated radius can be compared to a replacement/repair threshold by the break analysis module 112 to determine whether the crack 102 requires replacing or whether repair will be sufficient.
The presence of a case on the mobile telephone 104 is likely to introduce an error into the measured parameters as this will add to the length of the mobile telephone 104 but the error is generally around 3%. A 3 % error margin is built into the calculations of the break analysis module 112 and provided on a display by the display module 118.
It is also possible that the distance between the base of the mobile telephone 104 and the camera 106 will not be available from device parameters database 116. In this instance we can estimate the parameter to improve the robustness of the described method.
We can use an inclinometer built into mobile telephone 104 to obtain the angle of the mobile telephone when the image of the crack 102 is being captured. This can be used to calculate the height h using the equation:

$$h = l \sin(\theta)$$

where l is the length of the mobile telephone 104 and the angle θ is the angle obtained from the inclinometer.
Similarly, the phone angle could be estimated using the angle of view, the image resolution and the sensor size.
As described in relation to the first embodiment, in a preferred embodiment the image processing module 108 is implemented as a software component downloaded to the mobile phone. This may be implemented by means of downloading the software component as an app. The software component is capable of processing the image data from the camera to determine whether the break is suitable for repair or whether replacement of the glazing panel may be required as the preferred solution.
The break analysis module 112 may be a software component downloaded to the mobile phone, preferably as a single download in combination with the image processing module 108. A single downloaded software component is preferably arranged to process the image data from the phone camera and analyse the break using one or more algorithms implemented in software.
The break analysis module 112 is operative to provide an alert on a display as to whether a full windscreen replacement is required based on the radius of the smallest circle. If the crack 102 is above a specified threshold then the break analysis module 112 will indicate the need for a windscreen replacement; otherwise it will not. The alert may be displayed on a display of mobile telephone 104.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be capable of designing many alternative embodiments without departing from the scope of the invention as defined by the appended claims. In the claims, any reference signs placed in parentheses shall not be construed as limiting the claims. The words "comprising" and "comprises", and the like, do not exclude the presence of elements or steps other than those listed in any claim or the specification as a whole. In the present specification, "comprises" means "includes or consists of" and "comprising" means "including or consisting of". The singular reference of an element does not exclude the plural reference of such elements and vice-versa. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims (31)

1. Break analysis method for analysing breaks in a vehicle glazing panel, the method comprising:
capturing an image of a break in a vehicle glazing panel; processing the image of the break.
2. Method according to claim 1, the method further comprising:
determining the need for a replacement or repair of the glazing panel based on the processing of the image.
3. Method according to any preceding claim, wherein the image of the break is captured at an angle inclined relative to the vehicle glazing panel.
4. Method according to any preceding claim wherein the image is captured by a mobile computing device held in contact with the surface of the glazing panel, wherein the mobile computing device comprises a camera.
5. Method according to claim 4, wherein the mobile computing device is inclined to view the break at a predetermined position in a field of view of the camera.
6. Method according to claim 5, wherein the predetermined position is indicated by indicia displayed on the mobile computing device.
7. Method according to any of claims 4 to 6, comprising:
placing the mobile computing device flat on the surface of the glazing panel; and pivoting the mobile computing device away from the glazing panel, wherein at least part of the mobile computing device remains in contact with the glazing panel.
8. Method according to claim 7, wherein the camera is positioned in a predetermined position relative to the break before the mobile computing device is pivoted.
9. Method according to claim 8, comprising aligning a feature of the mobile computing device in a predetermined position relative to the break before pivoting the mobile computing device.
10. Method according to claim 9, comprising aligning an edge of the mobile computing device with the lowest point of the break before pivoting the mobile computing device.
11. Method according to any of claims 7 to 10, comprising pivoting the mobile computing device until the break is viewable at a predetermined position in a field of view of the camera.
12. Method according to claim 11, wherein the predetermined position is the centre of the field of view of the camera.
13. Method according to any of claims 7 to 12, including using geometric parameters of the mobile computing device and lens parameters of the camera to estimate the size of the break.
14. Method according to claim 13, wherein the geometric parameters of the mobile computing device are used to determine the pivot angle between the mobile computing device and the glazing panel.
15. Method according to claim 13 or claim 14, wherein the processing of the image is further based on chip parameters for the camera and/or mobile computing device.
16. Method according to any of claims 13 to 15, wherein the processing of the image generates a set of data points which are used to generate a diameter for the break or a zone within the break.
17. Method according to any of claims 13 to 16, wherein the method further comprises:
determining the requirement for a replacement vehicle glazing panel or a repair of the glazing panel based on the estimated size of the break.
18. Method according to any preceding claim, comprising identifying a centre of the break and one or more legs of the break.
19. Method according to claim 18, comprising:
generating a scaling factor indicating the relative length of the one or more legs of the break compared to the size of the centre of the break; and estimating the length of the one or more legs using the scaling factor.
20. Method according to claim 19, wherein estimating the length of the one or more legs of the break comprises multiplying the scaling factor by a predetermined value.
21. Method according to claim 20, wherein the predetermined value is a predetermined estimate of the actual size of the centre of the break, wherein the size is the diameter, width and/or length of the centre of the break.
22. Method according to any of claims 19 to 21, wherein the estimated length of the one or more legs is used to indicate the size of the break and the size of the break is compared to a threshold parameter to determine the need for replacement or repair of the glazing panel.
23. Method according to any of claims 19 to 22 as it depends on claim 2, wherein the step of determining the need for a replacement or repair of the glazing panel comprises determining if the estimated length of the one or more legs exceeds a given threshold.
24. Method according to any preceding claim, wherein the processing of the image comprises filtering the image to remove a background portion to identify the break.
25. Method according to claim 24, wherein filtering the image comprises applying morphological refinement to the image.
26. Method according to claim 17 or claim 23, comprising outputting a signal indicating whether the windscreen needs to be repaired or replaced.
27. Apparatus for analysing breaks in a vehicle glazing panel, the apparatus comprising:
a camera arranged to capture an image of a break in a vehicle glazing panel; a processing module operative to process the image of the break.
28. Apparatus according to claim 27, comprising a mobile computing device, wherein the mobile computing device comprises the camera.
29. Apparatus according to claim 28, wherein the mobile computing device comprises the processing module.
30. Apparatus according to any of claims 27 to 29, wherein the processing module is configured to carry out the method steps of any of claims 1 to 26.
31. A computer implemented method for determining technical information regarding a break present in a vehicle glazing panel, wherein the method includes the step of downloading a software component to a mobile computing device having a camera, the software component arranged to process image information captured via the camera in order to determine whether the break may be repaired or replacement is preferred.
GB1707572.2A 2016-05-13 2017-05-11 Break analysis apparatus and method Withdrawn GB2551894A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB1608455.0A GB201608455D0 (en) 2016-05-13 2016-05-13 Break analysis system, method and device

Publications (2)

Publication Number Publication Date
GB201707572D0 GB201707572D0 (en) 2017-06-28
GB2551894A true GB2551894A (en) 2018-01-03

Family

ID=56320372

Family Applications (3)

Application Number Title Priority Date Filing Date
GBGB1608455.0A Ceased GB201608455D0 (en) 2016-05-13 2016-05-13 Break analysis system, method and device
GB1707572.2A Withdrawn GB2551894A (en) 2016-05-13 2017-05-11 Break analysis apparatus and method
GB1819662.6A Withdrawn GB2565500A (en) 2016-05-13 2017-05-11 Break analysis apparatus and method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB1608455.0A Ceased GB201608455D0 (en) 2016-05-13 2016-05-13 Break analysis system, method and device

Family Applications After (1)

Application Number Title Priority Date Filing Date
GB1819662.6A Withdrawn GB2565500A (en) 2016-05-13 2017-05-11 Break analysis apparatus and method

Country Status (28)

Country Link
US (3) US10852249B2 (en)
EP (2) EP3455825B1 (en)
JP (1) JP2019518276A (en)
KR (1) KR20190007481A (en)
CN (2) CN109155062A (en)
AR (1) AR108460A1 (en)
AU (2) AU2017263145B2 (en)
BR (1) BR112018072977A2 (en)
CA (1) CA3021030C (en)
DK (2) DK3455825T3 (en)
ES (2) ES2848546T3 (en)
FI (1) FI3783567T3 (en)
GB (3) GB201608455D0 (en)
HR (2) HRP20231169T1 (en)
HU (2) HUE063378T2 (en)
LT (2) LT3455825T (en)
MA (1) MA44971B1 (en)
MX (2) MX2018013704A (en)
PL (2) PL3455825T3 (en)
PT (2) PT3455825T (en)
RS (2) RS64596B1 (en)
RU (1) RU2766420C2 (en)
SG (1) SG11201809865RA (en)
SI (2) SI3783567T1 (en)
TN (1) TN2018000371A1 (en)
TW (2) TW201839385A (en)
WO (1) WO2017194950A1 (en)
ZA (2) ZA201806995B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201608455D0 (en) 2016-05-13 2016-06-29 Belron Internat Ltd Break analysis system, method and device
GB2585004A (en) 2019-06-24 2020-12-30 Belron Int Ltd Analysis of damage to vehicle glazing panels
CN112784626B (en) * 2019-11-04 2022-07-01 广东新星源智能信息科技有限公司 Replacement demand field triggering device based on mobile terminal
CN111762181B (en) * 2019-11-06 2021-06-04 嵊州市万智网络科技有限公司 Automobile accident number mapping system and method
CN110893382B (en) * 2019-11-09 2020-10-27 嵊州亿源投资管理有限公司 Automatic fine grain repairing system
CN110907473B (en) * 2019-11-29 2023-04-07 上海科世达-华阳汽车电器有限公司 Photovoltaic module overhauling method, device, equipment and storage medium
CN112906688A (en) * 2019-12-04 2021-06-04 刘志方 Information analysis platform based on visual processing

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2526270A (en) * 2014-05-16 2015-11-25 Pre Chasm Res Ltd Examining vehicle glass defects

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6115118A (en) * 1997-08-25 2000-09-05 Northstar Automotive Glass, Inc. Vehicle windshield scanning system
GB2391784B (en) * 2002-08-02 2005-10-19 Carglass Luxembourg Sarl Zug Windscreen heater device particularly for windscreen repair
GB2415776B (en) * 2004-06-28 2009-01-28 Carglass Luxembourg Sarl Zug Investigation of vehicle glazing panels
US7089099B2 (en) * 2004-07-30 2006-08-08 Automotive Technologies International, Inc. Sensor assemblies
RU2366933C2 (en) * 2004-12-16 2009-09-10 Вольво Аэро Корпорейшн Method and device for detection of cracks in object
RU2444059C2 (en) * 2005-06-30 2012-02-27 Эй-Джи-Си Флэт Гласс Норт Америкэ, Инк. Integrated device and method of reading images
US20070067075A1 (en) * 2005-09-16 2007-03-22 Mcmillan Michael W Quick automotive cosmetic repair
US20070245956A1 (en) * 2006-02-23 2007-10-25 Picodeon Ltd Oy Surface treatment technique and surface treatment apparatus associated with ablation technology
US8073196B2 (en) * 2006-10-16 2011-12-06 University Of Southern California Detection and tracking of moving objects from a moving platform in presence of strong parallax
GB2479677A (en) * 2007-04-23 2011-10-19 Belron Hungary Kft Zug Branch Glazing panel investigation
DE102007021181B3 (en) * 2007-05-05 2008-07-03 Dr.Ing.H.C. F. Porsche Ag Damaged vehicle carrier e.g. side skirt, repairing method, involves attaching and connecting current alternative wall layers as cut wall layers so that new residual vehicle carrier corresponds to undamaged vehicle carrier
KR101561913B1 (en) 2009-04-17 2015-10-20 엘지전자 주식회사 Method for displaying image for mobile terminal and apparatus thereof
WO2011022490A2 (en) * 2009-08-18 2011-02-24 Serious Materials, Inc. Method and systems for retrofitting glass or insulated glass units of existing curtain wall systems for improved thermal performance
WO2012142967A1 (en) * 2011-04-21 2012-10-26 Ati-China Co., Ltd. Apparatus and method for photographing glass defects in multiple layers
US20140247348A1 (en) * 2013-03-01 2014-09-04 Hertz System, Inc. Virtual rent-a-car system and method with in-car concierge device
JP6238569B2 (en) * 2013-05-22 2017-11-29 キヤノン株式会社 Image processing apparatus, image processing method, and program
US9113306B2 (en) * 2013-08-02 2015-08-18 Google Inc. Predictive assistance information
JP2016095502A (en) * 2014-11-11 2016-05-26 株式会社半導体エネルギー研究所 Display system and display device
GB201608455D0 (en) 2016-05-13 2016-06-29 Belron Internat Ltd Break analysis system, method and device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2526270A (en) * 2014-05-16 2015-11-25 Pre Chasm Res Ltd Examining vehicle glass defects

Also Published As

Publication number Publication date
PT3455825T (en) 2021-02-03
AU2017263145A1 (en) 2018-11-08
EP3783567B1 (en) 2023-07-05
HUE063378T2 (en) 2024-01-28
GB201707572D0 (en) 2017-06-28
EP3455825A1 (en) 2019-03-20
LT3455825T (en) 2021-02-25
SG11201809865RA (en) 2018-12-28
JP2019518276A (en) 2019-06-27
RU2018140338A (en) 2020-06-15
MA44971B1 (en) 2021-03-31
AR108460A1 (en) 2018-08-22
MX2022000529A (en) 2022-02-10
TN2018000371A1 (en) 2020-06-15
SI3455825T1 (en) 2021-04-30
EP3783567A1 (en) 2021-02-24
HRP20210174T1 (en) 2021-04-02
TWI647443B (en) 2019-01-11
TW201740102A (en) 2017-11-16
CN109155062A (en) 2019-01-04
CN116205871A (en) 2023-06-02
MA44971A (en) 2019-03-20
CA3021030C (en) 2024-02-27
US11385189B2 (en) 2022-07-12
FI3783567T3 (en) 2023-09-08
DK3455825T3 (en) 2021-02-08
US20220326163A1 (en) 2022-10-13
DK3783567T3 (en) 2023-09-18
HUE053349T2 (en) 2021-06-28
SI3783567T1 (en) 2023-11-30
AU2017263145B2 (en) 2021-12-23
EP3455825B1 (en) 2020-11-11
GB201819662D0 (en) 2019-01-16
CA3021030A1 (en) 2017-11-16
AU2022201019A1 (en) 2022-03-10
GB201608455D0 (en) 2016-06-29
RS61389B1 (en) 2021-02-26
PT3783567T (en) 2023-08-18
RS64596B1 (en) 2023-10-31
AU2022201019B2 (en) 2023-04-13
BR112018072977A2 (en) 2019-03-06
US20190170662A1 (en) 2019-06-06
US11692949B2 (en) 2023-07-04
HRP20231169T1 (en) 2024-01-05
RU2018140338A3 (en) 2020-10-26
KR20190007481A (en) 2019-01-22
US10852249B2 (en) 2020-12-01
ZA201806995B (en) 2023-09-27
US20210131979A1 (en) 2021-05-06
PL3783567T3 (en) 2023-12-11
ZA202100372B (en) 2022-10-26
RU2766420C2 (en) 2022-03-15
WO2017194950A1 (en) 2017-11-16
TW201839385A (en) 2018-11-01
LT3783567T (en) 2023-10-10
GB2565500A (en) 2019-02-13
PL3455825T3 (en) 2021-07-19
MX2018013704A (en) 2019-05-02
ES2848546T3 (en) 2021-08-10
ES2956878T3 (en) 2023-12-29

Similar Documents

Publication Publication Date Title
US11692949B2 (en) Break analysis apparatus and method
US9959600B2 (en) Motion image compensation method and device, display device
CN110717942B (en) Image processing method and device, electronic equipment and computer readable storage medium
KR102438078B1 (en) Apparatus and method for providing around view
CN111861980B (en) Imaging detection method, electronic equipment and computer readable storage medium
US20190325593A1 (en) Image processing apparatus, system, method of manufacturing article, image processing method, and non-transitory computer-readable storage medium
JP2020107290A (en) Adhering matter detection device and adhering matter detection method
JPWO2011039977A1 (en) Crosswalk sign detection method and crosswalk sign detection device
JP2010048553A (en) Inspecting method of compound-eye distance measuring device and chart used for same
US10970592B2 (en) Adhering substance detection apparatus and adhering substance detection method
Lelégard et al. Multiscale Haar transform for blur estimation from a set of images
NZ787611A (en) Break analysis apparatus and method
JP6855254B2 (en) Image processing device, image processing system, and image processing method
JP2021043141A (en) Object distance estimating device and object distance estimating method
CN115147413B (en) Ghost image detection method, device, equipment and readable storage medium
CN116012322A (en) Camera dirt detection method, device, equipment and medium
CN113744200A (en) Camera contamination detection method, device and equipment

Legal Events

Date Code Title Description
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 1241546

Country of ref document: HK

WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)