GB2384305A - Human position detection by capturing spectral contents of images representative of human skin - Google Patents


Info

Publication number
GB2384305A
Authority
GB
United Kingdom
Prior art keywords
camera
image
processor
arrangement according
areas
Prior art date
Legal status
Granted
Application number
GB0212411A
Other versions
GB2384305B (en)
GB0212411D0 (en)
Inventor
Marcus Klomark
Mattias Hanqvist
Karl Munsin
Salah Hadi
Current Assignee
Autoliv Development AB
Original Assignee
Autoliv Development AB
Priority date
Filing date
Publication date
Application filed by Autoliv Development AB filed Critical Autoliv Development AB
Application filed by Autoliv Development AB
Publication of GB0212411D0
Priority to US10/502,126 (published as US20060050927A1)
Priority to AU2002359172A (published as AU2002359172A1)
Priority to PCT/SE2002/002382 (published as WO2003059697A1)
Publication of GB2384305A
Application granted
Publication of GB2384305B
Anticipated expiration
Status: Expired - Fee Related

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00: Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01: Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013: including means for detecting collisions, impending collisions or roll-over
    • B60R21/0134: responsive to imminent contact with an obstacle, e.g. using radar systems
    • B60R21/015: including means for detecting the presence or position of passengers, passenger seats or child seats, and the related safety parameters therefor, e.g. speed or timing of airbag inflation in relation to occupant position or seat belt use
    • B60R21/01512: Passenger detection systems
    • B60R21/0153: Passenger detection systems using field detection presence sensors
    • B60R21/01534: using electromagnetic waves, e.g. infrared
    • B60R21/01538: for image processing, e.g. cameras or sensor arrays
    • B60R21/01542: Passenger detection systems detecting passenger motion
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00: Position-fixing by co-ordinating two or more direction or position line determinations; position-fixing by co-ordinating two or more distance determinations
    • G01S5/16: using electromagnetic waves other than radio waves
    • G01S11/00: Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12: using electromagnetic waves other than radio waves

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Air Bags (AREA)

Abstract

A camera arrangement (1) is mounted on a vehicle to detect a human (2) in captured images. The output of the camera provides a signal to a processor (4) which analyses the signal to identify any area or areas of the captured images which have a specific spectral content representative of human skin. This task may be done in H, S, V colour space (Fig. 2). The processor (4) then determines the position of such areas. The processor may be adapted to control or modify the actuation of one or more safety devices (5). It may also determine successive positions of such areas so as to determine their movement. The camera may be directed towards a seat or a space in front of the vehicle, and the safety device may be a pedestrian protection device (35, Fig. 10). The processor may use triangulation to determine the distance from the camera to the identified areas.

Description

DESCRIPTION OF INVENTION

"IMPROVEMENTS IN OR RELATING TO A CAMERA ARRANGEMENT"

THE PRESENT INVENTION relates to a camera arrangement and more particularly relates to a camera arrangement for use with a safety device, in particular in a motor vehicle.
In connection with the deployment of a safety device in a motor vehicle it is sometimes important to be able to detect and identify objects located in the region above and in front of a vehicle seat. For example, it may be necessary to determine the position of at least part of the occupant of the seat, for example the head of the occupant of the seat, so as to be able to determine the position of the occupant of the seat within the seat. If the occupant is leaning forwardly, for example, it may be desirable to modify the deployment of safety equipment in the vehicle, such as a safety device in the form of an airbag mounted directly in front of the occupant of the seat, if an accident should occur. In the situation envisaged it may be appropriate only to inflate the airbag partially, rather than to inflate the airbag fully.
If the front seat of a vehicle is not occupied by a person, but instead has a rear-facing child seat located on it, then it may be desirable to modify the deployment of an airbag located in front of that seat, in the event that an accident should occur, in such a way that the airbag does not inflate at all. If the airbag did inflate it might eject the child from the rear facing child seat.
Many prior proposals have been made concerning techniques that can be utilised to determine the position of part of an occupant of a seat and also to determine whether a seat is occupied by a rear-facing child seat. Some prior proposals have utilised optical techniques, and others have utilised techniques involving ultrasonic radiation or even "radar". In many prior arrangements the sensors have been mounted in front of the seat, and the signals derived from the sensors have been processed to calculate the distance between the occupant of the seat, or an item on the seat, and the sensors.
It is now becoming increasingly important to be able to detect the position of a pedestrian in front of a motor vehicle, as more vehicles have safety devices which may be deployed in an accident situation to provide protection for a pedestrian. The mode of deployment of these devices may be controlled in dependence on the number of pedestrians involved in an accident, and the size of the pedestrians. A camera may actuate a safety device to provide protection for pedestrians.
The present invention seeks to provide an improved camera arrangement which can be utilised to detect and evaluate objects on and above a vehicle seat.

According to this invention there is provided a camera arrangement to be mounted in a vehicle to detect a human, the arrangement comprising a camera
to capture a light image, the camera providing an output signal; and a processor operable to analyse the signal to identify any area or areas of the captured image which have a specific spectral content representative of human skin, and to determine the position of any so identified area or areas within the image.
Preferably the processor is adapted, in response to the determined position of the area or areas, to control or modify the actuation of one or more safety devices.
Conveniently the processor is adapted to determine successive positions of the identified area or areas to determine a parameter related to the movement of the identified area or areas, the processor being adapted to control or modify the actuation of one or more safety devices in response to the determined parameter.
Advantageously the camera is directed towards a space in front of the vehicle and the safety device is a pedestrian protection device.
Preferably the camera arrangement is adapted to trigger the pedestrian protection device.
Conveniently the camera arrangement is adapted to control deployment of the pedestrian protection device.
In an alternative embodiment the camera is directed towards the space above and in front of a seat within the vehicle compartment.
In one embodiment of the invention the camera is laterally displaced relative to the seat, the viewing axis of the camera extending transversely of the vehicle.
In an alternative embodiment of the invention two cameras are provided, the cameras being located in front of the seat, the processor being adapted to use triangulation to determine the distance from the cameras to an identified area in the image.
Conveniently the processor analyses the signal to identify specific features of a head.
Preferably the processor analyses the signal to identify any area or areas of the captured image which have, in the H, S, V space, H greater than or equal to 335° or less than or equal to 25°, S between 0.2 and 0.6 inclusive, and V greater than or equal to 0.4.
Conveniently a light source is provided to illuminate the field of view of the camera, a subtractor being provided to subtract an image captured with the light source not operative from an image captured with the light source operative, the resultant image being analysed to determine the position of an identified area or areas within the image, wherein the light source emits light outside the visible spectrum, and the camera is responsive to light of a wavelength as emitted by the light source.
Preferably, the arrangement is configured such that the light source and subtractor are operable as defined in the preceding paragraph only if the ambient light in the field of view of the camera is below a first predetermined or calculated level of intensity.
Advantageously, the processor is operable to analyse the signal from the camera to identify any area or areas of the captured image which have a specific spectral content representative of human skin, only when the ambient light in the field of view of the camera is above a second predetermined or calculated level of intensity.
Conveniently, the first and second predetermined or calculated levels of intensity are equal.
In order that the invention may be more readily understood, and so that further features thereof may be appreciated, the invention will now be described, by way of example, with reference to the accompanying drawings in which:

FIGURE 1 is a representation of a first colour model provided for purposes of explanation,

FIGURE 2 is a corresponding diagram of a second colour model provided for purposes of explanation,

FIGURE 3 is a diagrammatic top plan view of part of the cabin of a motor vehicle illustrating a camera arrangement in accordance with the invention, illustrating an optional light source that forms part of one embodiment of the camera arrangement in the operative condition,

FIGURE 4 is a view corresponding to Figure 3 illustrating the light source in a non-operative condition,
FIGURE 5 is a schematic view of the image obtained from the camera arrangement with the light source in an operative condition,

FIGURE 6 is a schematic view corresponding to Figure 4 showing the image obtained when the light source is not operative,

FIGURE 7 is a view showing a resultant image obtained by subtracting the image of Figure 6 from the image of Figure 5,

FIGURE 8 is a block diagram,

FIGURE 9 is a view corresponding to Figure 3 illustrating a further embodiment of the invention,

FIGURE 10 is a diagrammatic side elevational view of the front part of a motor vehicle illustrating an alternative camera arrangement of the present invention configured to detect the position of pedestrians in front of the vehicle, and

FIGURE 11 is a graph illustrating the relative effectiveness of two modes of operation of the present invention, with varying light intensity.
There are several colour models which are used to "measure" colour.
One colour model is the R, G, B colour model, which is most widely used in computer hardware and in cameras. This model represents colour as three independent components, namely red, green and blue. Like the X, Y, Z coordinate system, the R, G, B colour model is an additive model, and combinations of R, G and B values generate a specific colour C.
<Desc/Clms Page number 7>
This model is often represented by a three-dimensional box with R, G and B axes as shown in Figure 1.
The corners of the box, on the axes, correspond to the primary colours.

Black is positioned at the origin (0, 0, 0) and white at the opposite corner of the box (1, 1, 1), and is the sum of the primary colours. The other corners, which are spaced from the axes, represent combinations of two primary colours. For example, adding red and blue gives magenta (1, 0, 1). Shades of grey are positioned along the diagonal from black to white. This model is hard for a human observer to comprehend, because the human way of understanding and describing colour is not based on combinations of red, green and blue.
Another colour model is the H, S, V colour model, which is more intuitive to humans. To specify a colour, one colour is chosen and amounts of black and white are added, which gives different shades, tints and tones. The colour parameters here are called Hue, Saturation and Value. In a three-dimensional representation, as shown in Figure 2, Hue is the colour and is represented as an angle between 0° and 360°. The Saturation varies from 0 to 1 and is representative of the "purity" of the colour: a pale colour like pink is less pure than red. Value varies from 0 at the apex of the cone, which corresponds to black, to 1 at the top, where the colours have their maximum intensity.
Studies have shown that all kinds of human skin, no matter the race of the human being, are gathered in a relatively small cluster in a suitable colour space. It has been found that human skin colours are positioned in a small cluster of the H, S, V space. It has been suggested that appropriate thresholds may be considered to be a Hue between 0° and 25° or between 335° and 360°. Of
course, 360° is the same as 0°, and thus the range can be considered to run from 335° upwards, through the origin of 0°, and continuing on to 25°. A Saturation of 0.2 to 0.6 is appropriate, and a Value of greater than or equal to 0.4 is appropriate.
It is to be appreciated that by using Hue and Saturation, it is possible to obtain an appropriate identification within a large range of lighting intensity.
Most cameras produce R, G, B pixels, and if the H, S, V system is to be used, a conversion to H, S, V has to be effected.
Since it has been found that the colour of all kinds of human skin is located within a relatively small and relatively clearly defined volume within the H, S, V space, it is possible to identify a human image on a camera by identifying regions which have a colour within the said defined volume of the H, S, V space.
The present invention therefore uses at least one camera to take a colour image either of a part of a motor vehicle where it is anticipated that there may be a human occupant, or of an area in front of the vehicle that may be occupied by a pedestrian, and the image is analysed to identify areas where the colour of the image is within the said defined volume of H, S, V space. Thus the image may be processed to determine if there is a human shown within the image and, if so, the position of that human within or relative to the vehicle. This information may be used to control the actuation of one or more active safety devices in the event that an accident should occur.
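The two processing steps just described, converting each R, G, B pixel to H, S, V and testing it against the skin-colour volume, can be sketched in a few lines. This is an illustrative sketch only, using Python's standard colorsys module and the threshold values given in this description (Hue of 335° or more, or 25° or less; Saturation between 0.2 and 0.6; Value of at least 0.4); 8-bit R, G, B input is an assumption.

```python
import colorsys

def is_skin_pixel(r, g, b):
    """Classify one R, G, B pixel (components 0..255) against the
    skin-colour volume of the H, S, V space quoted in the description."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    hue_deg = h * 360.0  # colorsys returns Hue in the range 0..1
    hue_ok = hue_deg >= 335.0 or hue_deg <= 25.0
    return hue_ok and 0.2 <= s <= 0.6 and v >= 0.4

def skin_mask(image):
    """Binary skin mask over a 2-D grid of (r, g, b) tuples."""
    return [[is_skin_pixel(*px) for px in row] for row in image]
```

A mask of this kind would then be searched for large connected areas that may correspond to a head.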
Referring now to Figure 3 of the accompanying drawings, a camera arrangement of the present invention includes a camera 1. The camera is responsive to light, and in particular is responsive to light which is within the said defined volume of the H, S, V colour model as representative of human skin.
The camera may be a conventional television camera, a charge-coupled device, a CMOS camera, or any other camera capable of capturing the appropriate image. If the camera produces an output signal in the R, G, B model, that signal is converted to the H, S, V model, or to another suitable colour model which might be used for analysis of the image.
The camera 1 is directed towards the region of a motor vehicle expected to be occupied by a human occupant 2 shown sitting, in this embodiment, on a seat 3. The lens of the camera is directed laterally across the vehicle, that is to say the camera is located to one side of the vehicle so as to obtain a side view of the occupant.
The output of the camera is passed to a processor 4 where the image is processed. The image is processed primarily to determine the position of the head of the occupant 2 of the seat 3 within the field of view of the camera. Thus the image taken by the camera is initially analysed by an analyser within the processor to identify any areas of the image which fall within the defined volume of the H, S, V colour model, those areas being identified as being human skin. The area (or areas) thus identified is further processed to identify any large area of human skin that may correspond to the head of the occupant. The image may also be processed to determine the shape and size of any identified area of human skin to isolate details, such as a nose, mouth or eyes, which will confirm that the identified area of the image is an image of a head.
The position of the head within the field of view of the camera is monitored. It would be expected, in the arrangement as shown in Figure 3, that the head would be towards the left-hand side of the image if the occupant is in the ordinary position. If the occupant is leaning forwards, the head would be towards the centre, or even to the right-hand side, of the field of view. By determining the position of the head of the occupant, the processor is adapted to determine an appropriate mode of operation for a safety device, such as a front-mounted airbag, and will ensure that the safety device 5 will, if deployed, be deployed in an appropriate manner, having regard to the position of the person to be protected.
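By way of illustration only, the mapping from the determined head position in the side-view image to a deployment mode might be expressed as follows; the zone boundaries (0.4 and 0.7 of the image width) and the mode names are hypothetical values chosen for the sketch, not figures from this description.

```python
def deployment_mode(head_x, image_width):
    """Pick an airbag deployment mode from the head's horizontal
    position in the side-view image (left side = ordinary seating
    position, right side = leaning towards the dashboard)."""
    frac = head_x / image_width
    if frac < 0.4:
        return "full"       # occupant in the ordinary position
    elif frac < 0.7:
        return "partial"    # occupant leaning forwards
    else:
        return "suppress"   # occupant far out of position
```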
The arrangement as described above, using the "colour method" described, will operate in a satisfactory manner during daylight hours or when there is a sufficient degree of illumination within the motor vehicle. However, the above-described "colour method", which identifies areas of an image having a spectral content representative of human skin, becomes less effective as the ambient light intensity reduces. This reduction in efficiency of the "colour method" is illustrated in Figure 11, which is a plot of functionality against light intensity. It is therefore proposed that an alternative mode of operation could be used when the ambient light intensity reduces below a predetermined or calculated effective level.
It is therefore proposed that the camera can be operated in the manner described above (the "colour method"), selecting parts of the image within the defined volume of the H, S, V space, but if the arrangement is unable to identify the position of a seat occupant, for example because the interior of the vehicle is dark, then the arrangement may enter a second or alternative mode of operation. Alternatively, the arrangement may simply enter the second or alternative mode
of operation upon detecting a drop in light intensity below a predetermined value. In order to facilitate the alternative mode of operation, a source of electromagnetic radiation is provided, such as a light source 6, in association with the camera.
The light source 6 generates a diverging beam of light which is directed towards the field of view of the camera 1, with the illumination intensity decreasing with distance from the light source 6.
It is preferred that the light source 6 emits light outside the visible spectrum, such as infra-red light, so as not to distract the driver of the vehicle.
The camera 1 is therefore not solely responsive to light within the said defined volume of the H, S, V space, but is also responsive to light of a wavelength as emitted by the light source, for example infra-red light.
It is envisaged that the sensitivity of the camera 1 and the radiation intensity of the light source 6 will be so adjusted that the camera 1 is responsive to light reflected from the occupant 2 of the seat, but is not responsive (or is not so responsive) to light reflected from the parts of the cabin of the motor vehicle which are remote from the occupant 2, such as the door adjacent the occupant.
It is also envisaged that in the second or alternative mode of operation the camera will, in a first step, capture an image with the light source 6 operational, as indicated in Figure 3. In a subsequent step the camera will capture an image with the light source non-operational as shown in Figure 4.
Figure 5 illustrates schematically the image obtained in the first step, that is to say with the light source operational. Part of the image is the image of the occupant, who is illuminated by the light source 6, and thus this part of the
image is relatively bright. The rest of the image includes those parts of the cabin of the vehicle detected by the camera 1, and also the part of the scene outside the vehicle that is visible through the window.
Figure 6 illustrates the corresponding image taken with the light source 6 non-operational. The occupant 2 of the vehicle is not so bright, in this image, since the occupant is not illuminated by the light source 6, but the rest of the image is virtually the same as the image of Figure 5.
As shown in Figure 8, successive signals from the camera 1 are passed to a processor 10, where signals representing the first image, with illumination, are stored in a first store 11, and signals representing the second image, without illumination, are stored in a second store 12. The two signals are subtracted in the subtractor 13. Thus, effectively, the second image, without illumination, is subtracted, pixel by pixel, from the first image as shown in Figure 5, taken with the light source 6 operative. The resultant image, as shown in Figure 7, consists substantially of an image of only the occupant. The taking of successive images, the subtraction of the signals representing the images, and the processing step are repeated continuously, in a multiplexed manner, to provide a constantly updated resultant image. Signals representing the resultant image are passed to a processor 14. It is to be appreciated that in the alternative arrangement as described above the camera is operated sequentially with the light source 6 on and off, with a subsequent subtraction of the detected images. Alternative mechanisms, such as a shutter, may be used to interrupt the beam of light.
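The pixel-by-pixel subtraction performed by the subtractor 13 can be sketched as follows. The greyscale representation (2-D grids of values 0..255) and the noise threshold of 30 are assumptions made for the illustration; the description does not specify either.

```python
def abe_subtract(lit, unlit, threshold=30):
    """Active Background Elimination sketch: subtract, pixel by pixel,
    the frame taken with the light source off from the frame taken
    with it on.  Background pixels, equally bright in both frames,
    cancel out; only the occupant, illuminated by the nearby light
    source, survives the threshold."""
    return [
        [diff if (diff := a - b) > threshold else 0
         for a, b in zip(row_on, row_off)]
        for row_on, row_off in zip(lit, unlit)
    ]
```

Repeating this over successive frame pairs yields the constantly updated resultant image described above.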
Referring to Figure 11, it will be seen that the functionality of the above-described second or alternative method (the "Active Background Elimination (ABE)" method) is greatly improved over that of the first "colour method" in
dark conditions. However, as light intensity increases, the ABE method becomes less efficient whilst the colour method becomes more efficient. It is therefore envisaged that during periods of intermediate ambient light intensity, both methods may be employed simultaneously to improve the overall reliability of the arrangement in accurately detecting the presence of a human.
Thus, the ABE method may be used when the ambient light intensity is below a first predetermined or calculated level, and the colour method may be used when the ambient light intensity is above a second predetermined or calculated level. The first and second levels need not be equal. For example, the first light intensity level could be above the second level, in which case there would be a zone of simultaneous ABE and colour operation as described above.
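The overlapping-threshold scheme described in the preceding paragraph might be sketched as follows; the two intensity values are hypothetical, chosen so that the first (ABE) level lies above the second (colour) level and a zone of simultaneous operation exists between them.

```python
# Hypothetical intensity thresholds in arbitrary ambient-light units.
ABE_MAX = 60      # first level: use ABE below this intensity
COLOUR_MIN = 40   # second level: use the colour method above this

def active_methods(ambient_intensity):
    """Return which detection methods run at a given ambient intensity.
    Because COLOUR_MIN < ABE_MAX, both run in the intermediate zone."""
    methods = []
    if ambient_intensity < ABE_MAX:
        methods.append("ABE")
    if ambient_intensity > COLOUR_MIN:
        methods.append("colour")
    return methods
```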
The processor 4 of the embodiment of Figure 3 or the processor 14 of the embodiment described with reference to Figure 8 will process the image to determine whether the seat is completely empty or is occupied in any way. The processor is configured to identify and recognise predetermined objects, such as child seats, or parts of objects, such as the head of a human occupant of the seat, or even the nose, mouth or eyes present on the head, and to determine the position thereof relative to the seat. Thus the processor will process the image by determining the nature of the image, for example by determining whether the image is an image of an occupant of a seat or an image of a rear-facing child seat, and will determine the position of part of or the whole of the image.
If the image is an image of a rear-facing child seat the processor may, for example through a control arrangement 15, inhibit deployment of a safety device 5 or 16 in the form of an airbag mounted in the dashboard in front of the seat.
If the processor 14 determines that the image is an image of an occupant, the processor will then determine if part of the occupant, such as the head of the occupant, is in a predetermined part of the image. Because the field of view of the camera is fixed in position, it is possible to determine the position in the vehicle of part of the occupant by determining the position of that part of the occupant within the image. It is thus possible to calculate the distance between part of the occupant, such as the head of the occupant, and the dashboard or steering wheel to determine if the occupant is "in position" or "out of position".
If the occupant is "out of position", the deployment of an airbag in front of the occupant may be modified, for example by the control arrangement 15. The image processor 4 or 14 may also be adapted to determine the size of the image.
Thus the processor 4 or 14 will discriminate between a small seat occupant, such as a child, and a large seat occupant, such as an obese adult. The position of the head may be monitored over a period of time, and any movement of the head may be analysed. In dependence upon the result of the processing within the processor, the manner of deployment of an airbag provided to protect the occupant of the seat may be modified, for example by the control arrangement 15.
Figure 9 illustrates a modified embodiment of the invention in which, instead of a single camera located at the side of the vehicle cabin to take a side view of the occupant, two cameras 21, 22 are positioned generally in front of an occupant 23 of a vehicle seated on a seat 24. The cameras are again connected to a processor 25 adapted to identify regions of images taken by the cameras which are within the appropriate volume of the H, S, V space. Using a triangulation technique, the position of the head of the occupant 23 can readily be determined. As in the previously described embodiment, the processor 25 will analyse the image to determine the location of the head of the occupant,
possibly determining the location of features such as the nose, mouth or eyes.
The processor may determine parameters relating to movement of the head.
The actuation of a safety device 26, such as a front-mounted air-bag or a "smart" seat belt, is controlled or modified in dependence upon an output from the processor 25.
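For the two-camera arrangement, the distance to the head can be recovered from the horizontal disparity between the two images. A pinhole-model sketch assuming parallel optical axes (the function and parameter names are assumptions, not from the patent):

```python
def triangulate_depth(x_left_px, x_right_px, baseline_m, focal_px):
    """Depth to the head from the disparity between its horizontal
    pixel coordinates in the left and right camera images, for two
    parallel cameras separated by baseline_m metres."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("non-positive disparity: feature match is invalid")
    return focal_px * baseline_m / disparity
```

The nearer the head is to the cameras, the larger the disparity, so an occupant leaning towards the dashboard is detected as a shrinking depth value.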
Referring now to Figure 10 of the accompanying drawings, a further camera arrangement in accordance with the invention is illustrated. In this embodiment, the camera arrangement includes a camera 30 which is mounted on the front part of a vehicle 31 so as to view the road in front of the vehicle. It is intended that the camera will receive an image of the road in front of the vehicle and, in particular, will receive an image of any pedestrians, such as pedestrians 32, 33, located in front of the vehicle.
The camera, as in the previously described embodiments, passes a signal to a processor 34, which again incorporates an analyser analysing the image to identify the area or areas having the specific colour representative of human skin. The processor is adapted to identify any area or areas having the colour of human skin, and to determine whether those areas represent one or more pedestrians located in front of the vehicle. The processor is adapted to actuate or deploy a safety device 35 if pedestrians are identified in front of the vehicle (in dependence on the speed of the vehicle relative to the pedestrians and the distance between the vehicle and the pedestrians), and the processor may determine the number of pedestrians and the physical size of the pedestrians and control the way in which the safety device 35 is deployed. The safety device 35 may take many forms, and may comprise an external air-bag or a device adapted to raise part of the bonnet or hood of the motor vehicle.
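The skin-colour test can be sketched per pixel using the H, S, V thresholds given in Claim 11 (reading the hue figures as degrees); the RGB-to-HSV conversion here uses Python's standard colorsys module:

```python
import colorsys

def is_skin_pixel(r, g, b):
    """True if an 8-bit RGB pixel falls inside the H,S,V volume used
    here to represent human skin: hue >= 335 deg or hue <= 25 deg,
    0.2 <= saturation <= 0.6, value >= 0.4 (thresholds from Claim 11,
    with the hue figures read as degrees)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    hue_deg = h * 360.0
    return (hue_deg >= 335.0 or hue_deg <= 25.0) and 0.2 <= s <= 0.6 and v >= 0.4
```

Scanning a frame with this predicate and grouping adjacent matching pixels yields the candidate skin areas whose positions the processor then evaluates.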
In this embodiment, a light source 36 may be provided. The light source preferably emits light which is not in the visible spectrum, such as infra-red light. The light source 36 is mounted on the vehicle, and is adapted to operate in the same way as the light source 6 of the embodiment of Figures 3 and 4. Thus, in this embodiment, the arrangement may have a second mode of operation in which the light source 36 is alternately turned on and off.
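The lit-minus-unlit subtraction used in this second mode can be sketched pixel-wise; plain nested lists of 8-bit intensities stand in for camera frames here, and clamping negative differences at zero is an assumed handling, not stated in the patent:

```python
def subtract_ambient(frame_lit, frame_unlit):
    """Subtract the frame taken with the light source off from the frame
    taken with it on. Ambient illumination common to both frames cancels,
    leaving mainly the reflection of the (infra-red) source."""
    return [[max(on - off, 0) for on, off in zip(row_on, row_off)]
            for row_on, row_off in zip(frame_lit, frame_unlit)]
```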
In the present Specification "comprise" means "includes or consists of" and "comprising" means "including or consisting of".
The features disclosed in the foregoing description, or the following Claims, or the accompanying drawings, expressed in their specific forms or in terms of a means for performing the disclosed function, or a method or process for attaining the disclosed result, as appropriate, may, separately, or in any combination of such features, be utilised for realising the invention in diverse forms thereof.

Claims (20)

CLAIMS:

1. A camera arrangement to be mounted in a vehicle to detect a human, the arrangement comprising a camera to capture a light image, the camera providing an output signal; and a processor operable to analyse the signal to identify any area or areas of the captured image which have a specific spectral content representative of human skin, and to determine the position of any so identified area or areas within the image.
2. An arrangement according to Claim 1 wherein the processor is adapted, in response to the determined position of the area or areas, to control or modify the actuation of one or more safety devices.

3. An arrangement according to Claim 1 or 2 wherein the processor is adapted to determine successive positions of the identified area or areas to determine a parameter related to the movement of the identified area or areas, the processor being adapted to control or modify the actuation of one or more safety devices in response to the determined parameter.

4. An arrangement according to Claim 2 or 3 wherein the camera is directed towards a space in front of the vehicle and the safety device is a pedestrian protection device.

5. An arrangement according to Claim 4 wherein the camera arrangement is adapted to trigger the pedestrian protection device.
6. An arrangement according to Claim 4 or 5 wherein the camera arrangement is adapted to control deployment of the pedestrian protection device.

7. An arrangement according to Claim 2 or 3 wherein the camera is directed towards the space above and in front of a seat within the vehicle compartment.

8. An arrangement according to Claim 7 wherein the camera is laterally displaced relative to the seat, the viewing axis of the camera extending transversely of the vehicle.

9. An arrangement according to Claim 7 wherein two cameras are provided, the cameras being located in front of the seat, the processor being adapted to use triangulation to determine the distance from the cameras to an identified area in the image.

10. An arrangement according to any one of the preceding Claims wherein the processor analyses the signal to identify specific features of a head.

11. An arrangement according to any one of the preceding Claims wherein the processor analyses the signal to identify any area or areas of the captured image which have, in the H, S, V space, H greater than or equal to 335° or less than or equal to 25°, S between 0.2 and 0.6 inclusive, and V greater than or equal to 0.4.

12. An arrangement according to any one of the preceding Claims wherein a light source is provided to illuminate the field of view of the camera, a subtractor being provided, the subtractor being operable to subtract an image
    with the light not operative from an image with the light operative, the resultant image being analysed to determine the position of an identified area or areas within the image, wherein the light source emits light outside the visible spectrum, and the camera is responsive to light of a wavelength as emitted by the light source.
13. An arrangement according to Claim 12, configured such that the light source and subtractor are operable as defined in Claim 12 only if the ambient light in the field of view of the camera is below a first predetermined or calculated level of intensity.

14. An arrangement according to Claim 13, wherein the processor is operable as defined in Claim 1 only when the ambient light in the field of view of the camera is above a second predetermined or calculated level of intensity.

15. An arrangement according to Claim 14, wherein the first and second predetermined or calculated levels of intensity are equal.

16. A camera arrangement substantially as herein described with reference to and as shown in Figure 3 of the accompanying drawings with a light source.

17. A camera arrangement substantially as herein described with reference to and as shown in Figures 4 to 8 of the accompanying drawings with a light source.

18. A camera arrangement substantially as herein described with reference to and as shown in Figure 9 of the accompanying drawings.
19. A camera arrangement substantially as herein described with reference to and as shown in Figure 10 of the accompanying drawings.

20. Any novel feature or combination of features disclosed herein.
GB0212411A 2002-01-16 2002-05-29 Improvements in or relating to a camera arrangement Expired - Fee Related GB2384305B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US10/502,126 US20060050927A1 (en) 2002-01-16 2002-12-19 Camera arrangement
AU2002359172A AU2002359172A1 (en) 2002-01-16 2002-12-19 A camera arrangement
PCT/SE2002/002382 WO2003059697A1 (en) 2002-01-16 2002-12-19 A camera arrangement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0200954A GB0200954D0 (en) 2002-01-16 2002-01-16 Improvements in or relating to a camera arrangement

Publications (3)

Publication Number Publication Date
GB0212411D0 GB0212411D0 (en) 2002-07-10
GB2384305A true GB2384305A (en) 2003-07-23
GB2384305B GB2384305B (en) 2005-03-16

Family

ID=9929187

Family Applications (2)

Application Number Title Priority Date Filing Date
GB0200954A Ceased GB0200954D0 (en) 2002-01-16 2002-01-16 Improvements in or relating to a camera arrangement
GB0212411A Expired - Fee Related GB2384305B (en) 2002-01-16 2002-05-29 Improvements in or relating to a camera arrangement

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GB0200954A Ceased GB0200954D0 (en) 2002-01-16 2002-01-16 Improvements in or relating to a camera arrangement

Country Status (1)

Country Link
GB (2) GB0200954D0 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008061997A1 (en) * 2006-11-23 2008-05-29 Continental Automotive Gmbh Device and method for differentiating between rearward-facing child seats and grown-up passengers in a motor vehicle
US10829072B2 (en) 2015-04-10 2020-11-10 Robert Bosch Gmbh Detection of occupant size and pose with a vehicle interior camera

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07105371A (en) * 1993-10-04 1995-04-21 Hitachi Ltd Hand print pattern recognition method
JPH07282235A (en) * 1994-04-15 1995-10-27 Matsushita Electric Ind Co Ltd Operation recognition device
JPH0898184A (en) * 1994-09-28 1996-04-12 Nec Corp Human body moving area detecting device
JPH08161498A (en) * 1994-12-05 1996-06-21 Minolta Co Ltd Object recognition device
EP0756426A2 (en) * 1995-07-28 1997-01-29 Sharp Kabushiki Kaisha Specified image-area extracting method and device for producing video information
FR2778245A1 (en) * 1998-04-29 1999-11-05 Fabre Pierre Dermo Cosmetique Measuring color of objects particularly of human skin
JP2000028354A (en) * 1998-07-13 2000-01-28 Victor Co Of Japan Ltd Three-dimensional image processing device
JP2000097676A (en) * 1998-09-24 2000-04-07 Sanyo Electric Co Ltd Method and apparatus for detecting direction of face
JP2000105819A (en) * 1998-09-28 2000-04-11 Sanyo Electric Co Ltd Face image area detecting device
JP2001005973A (en) * 1999-04-20 2001-01-12 Atr Media Integration & Communications Res Lab Method and device for estimating three-dimensional posture of person by color image
US6263113B1 (en) * 1998-12-11 2001-07-17 Philips Electronics North America Corp. Method for detecting a face in a digital image


Also Published As

Publication number Publication date
GB2384305B (en) 2005-03-16
GB0200954D0 (en) 2002-03-06
GB0212411D0 (en) 2002-07-10

Similar Documents

Publication Publication Date Title
US20040220705A1 (en) Visual classification and posture estimation of multiple vehicle occupants
US9077962B2 (en) Method for calibrating vehicular vision system
US20180302575A1 (en) Imaging system for vehicle
US7580545B2 (en) Method and system for determining gaze direction in a pupil detection system
US8589033B2 (en) Contactless obstacle detection for power doors and the like
CN103303205B (en) Vehicle surroundings monitoring apparatus
KR100440669B1 (en) Human presence detection, identification and tracking using a facial feature image sensing system for airbag deployment
JPH08290751A (en) Sensor system and safety system for vehicle
KR20020002221A (en) Optimized human presence detection through elimination of background interference
JP2024069199A (en) Infrared light source protective system
WO2020230636A1 (en) Image recognition device and image recognition method
WO2014103223A1 (en) Night-vision device
KR102420289B1 (en) Method, control device and vehicle for detecting at least one object present on a vehicle
US8560179B2 (en) Adaptive visual occupant detection and classification system
US20060050927A1 (en) Camera arrangement
EP1552988A2 (en) Infrared proximity sensor for air bag safety
JP2004350303A (en) Image processing system for vehicle
EP1800964B1 (en) Method of depth estimation from a single camera
JP2007316036A (en) Occupant detector for vehicle
GB2384305A (en) Human position detection by capturing spectral contents of images representative of human skin
JP2005033680A (en) Image processing apparatus for vehicle
US10904409B2 (en) Detection apparatus, imaging apparatus, moveable body, and detection method
Koch et al. Real-time occupant classification in high dynamic range environments
US20040249567A1 (en) Detection of the change of position of a vehicle occupant in an image sequence
EP4408010A1 (en) Method and apparatus for controlling the exposure time of an imaging device

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20090529