SE542704C2 - Method and control arrangement in a vehicle for object detection - Google Patents

Method and control arrangement in a vehicle for object detection

Info

Publication number
SE542704C2
SE542704C2 (application SE1750382A)
Authority
SE
Sweden
Prior art keywords
vehicle
electromagnetic radiation
control arrangement
moving
representation
Prior art date
Application number
SE1750382A
Other versions
SE1750382A1 (en)
Inventor
Torbjörn Boxell
Original Assignee
Scania Cv Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to SE1750382A priority Critical patent/SE542704C2/en
Priority to DE102018002256.1A priority patent/DE102018002256A1/en
Publication of SE1750382A1 publication Critical patent/SE1750382A1/en
Publication of SE542704C2 publication Critical patent/SE542704C2/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/50 Systems of measurement based on relative movement of target
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/51 Display arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/165 Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146 Display means

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

Method (400) and control arrangement (310) in a vehicle (100) for detecting an object (120) and outputting a representation of the object (120) to a driver of the vehicle (100). The method (400) comprises: emitting (401) electromagnetic radiation in a wavelength spectrum (210) more narrow than a threshold spectrum width (220); receiving (402) reflections of the emitted (401) electromagnetic radiation from the object (120); determining (404) whether the object (120) is moving or not, in relation to the vehicle (100), based on the received (402) reflections; and outputting (405) a representation of the object (120) determined (404) to be moving to the driver of the vehicle (100), comprising a visual enhancement.

Description

METHOD AND CONTROL ARRANGEMENT IN A VEHICLE FOR OBJECT DETECTION

TECHNICAL FIELD

This document relates to a method and a control arrangement in a vehicle. More particularly, a method and a control arrangement are described, for detecting an object and outputting a representation of the object to a driver of the vehicle.
BACKGROUND

It is important for a driver of a vehicle to distinguish moving objects close to the vehicle, which may present a potential accident risk, from static objects.
The vehicle driver's visibility is limited, perhaps in particular in heavy vehicles such as buses and trucks, and especially in trucks with trailers.
The problem escalates perhaps in particular when a heavy vehicle such as a city bus is driving at slow velocity in a populated environment. Passengers may be approaching the vehicle in order to enter, or leaving and then re-entering at multiple doors; some passengers (or other pedestrians or bicyclists) may take shortcuts through the driver's blind spots, etc. It is difficult for the driver to keep track of these passengers, as he/she also has to keep an eye on the surrounding traffic situation and serve or provide information to entering passengers. The driver, in particular an inexperienced driver, may have problems distinguishing relevant information in the close surroundings of the vehicle from irrelevant, or at least not urgent, information.
The driver's difficulty in perceiving the surroundings correctly is further aggravated in low-visibility conditions such as darkness.
In wintertime, frost or snow (or dirt, all year round) on the rear-view mirrors or windscreen of the vehicle may further restrict the driver's perception.
Also, the driver's direct visibility in the direction of the rear-view mirrors is further limited by the mirrors themselves.
It is also known to replace a conventional rear-view mirror of a vehicle with a pair of cameras (situated outside the vehicle) and a corresponding display (situated in the cabin). This arrangement is sometimes referred to as a digital rear-view mirror. An advantage is that air resistance may be reduced, as a camera is considerably smaller than a rear-view mirror. However, the above-discussed problems with detection of moving objects around the vehicle are not solved merely by making such a replacement.
Another problem in a vehicle is determining if/when the vehicle is not moving, i.e. when the vehicle has stopped completely, and/or whether the vehicle is moving slowly. Many functions in a vehicle may be triggered, or may only be performed, when the vehicle is stationary, such as opening the doors of a bus; opening the passenger doors otherwise may result in an accident.
Document JPH0933546 describes a method of determining vehicle velocity by directing a laser towards the vehicle, capturing reflected laser light from the vehicle and interpreting a speckle pattern.
However, it is not feasible for the vehicle driver to operate a laser while driving and direct it towards different objects in the surroundings. Further, it would be more desirable to assist the driver in discovering a moving object and distinguishing it from static or remote objects than to determine its exact speed.
The documents US2011128142, DE102008003936 and JP2004306779 all illustrate mirrors wherein the speed of surrounding vehicles is determined and outputted to the driver. All three documents, although different, use complicated and expensive techniques which do not necessarily assist the driver in detecting a very slowly moving object, such as a passenger waiting at a bus stop or a road crossing.
A possible solution for detecting a human/animal in darkness may be to apply an infrared camera in the vehicle. However, adding infrared cameras to the vehicle is expensive. Infrared cameras also have a fairly slow response time, introducing a delay in the image which may be fatal in a traffic situation. Further, focusing lenses cannot be made of glass, as glass blocks long-wave infrared light. Special materials such as germanium must be used, which are both expensive and fragile.
Yet another problem of vehicles is determining whether the vehicle is stationary or moving at low velocity. This is typically solved by a sensor at the drive shaft. However, engine vibrations make a distinct determination difficult. Many functions in vehicles can only be activated when the vehicle is stationary, such as e.g. opening the passenger doors of a bus. If the doors are opened while the vehicle is still moving, even at low velocity, severe accidents may result, as alighting passengers (who may be physically or visually impaired and/or carrying bulky luggage) expect the vehicle to be stationary.
It would thus be desirable to enhance the traffic safety of a vehicle by reducing the problems associated with rear-view mirrors, blind spots around the vehicle, and the driver's perception of reality.
SUMMARY

It is therefore an object of this invention to solve at least some of the above problems and improve traffic safety by detecting a moving object and outputting a representation thereof to the vehicle driver, and/or by accurately determining vehicle movement at low speed.
According to a first aspect of the invention, this objective is achieved by a method in a vehicle. The method aims at detecting an object and outputting a representation of the object to a driver of the vehicle. The method comprises the step of emitting coherent electromagnetic radiation in a wavelength spectrum narrower than a threshold spectrum width. Further, the method also comprises receiving reflections of the emitted coherent electromagnetic radiation from the object, creating speckles. In addition, the method also comprises determining whether the object is moving or not, in relation to the vehicle, based on the received reflections, by detecting speckles created by the received reflections from the object and analysing the speckle contrast in one sample of speckles to retrieve information regarding the relative speed difference. Additionally, the method furthermore comprises outputting a representation of the object determined to be moving to the driver of the vehicle, comprising a visual enhancement illustrating the relative speed difference.
According to a second aspect of the invention, this objective is achieved by a control arrangement in a vehicle. The control arrangement aims at detecting an object and outputting a representation of the object to a driver of the vehicle. The control arrangement is configured to trigger a source of coherent electromagnetic radiation to emit electromagnetic radiation in a wavelength spectrum narrower than a threshold spectrum width. Further, the control arrangement is configured to obtain signals from a camera, representing reflections of the emitted coherent electromagnetic radiation from the object received by the camera, creating speckles. The control arrangement is additionally configured to determine whether the object is moving or not, in relation to the vehicle, based on the obtained signals, by detecting speckles created by the received reflections from the object and analysing the speckle contrast in one sample of speckles to retrieve information regarding the relative speed difference. Also, the control arrangement is furthermore configured to generate a command to output a representation of the object determined to be moving to the driver of the vehicle, comprising a visual enhancement illustrating the relative speed difference, via an output device.
Thanks to the described aspects, by emitting electromagnetic radiation in a narrow wavelength spectrum and perceiving reflections from an object within that same spectrum, it may be determined whether the object is moving or not by studying the reflections and any possible interference. A detected interference, or speckle, of coherent light indicates that the object is moving. A representation of the moving object may then be outputted to the driver, with an enhancement making him/her aware of the moving object. The method may also be performed in low-visibility conditions, such as at night. Thereby, increased traffic safety is achieved.
Other advantages and additional novel features will become apparent from the subsequent detailed description.
FIGURES

Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which:

Figure 1A illustrates a vehicle according to an embodiment of the invention;
Figure 1B illustrates a vehicle according to an embodiment, as regarded from above;
Figure 2 illustrates an electromagnetic radiation spectrum according to an embodiment;
Figure 3 illustrates a vehicle interior according to an embodiment;
Figure 4 is a flow chart illustrating an embodiment of the method;
Figure 5 is an illustration depicting a system according to an embodiment.
DETAILED DESCRIPTION

Embodiments of the invention described herein are defined as a method and a control arrangement, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.
Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims. Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.
Figure 1A illustrates a scenario with a vehicle 100. The vehicle 100 may be stationary, or may be driving on a road in a driving direction.
The vehicle 100 may comprise a means of transportation in a broad sense, such as e.g. a truck, a car, a motorcycle, a trailer, a bus, a bike, a train, a tram, an aircraft, a watercraft, a cable transport, an aerial tramway, an elevator, a drone, a spacecraft, or other similar manned or unmanned means of conveyance running e.g. on wheels, rails, air, water, vacuum or similar media.
The vehicle 100 may be configured for running on a road, on a rail, in terrain, in water, in air, in space, etc.
The vehicle 100 may be driver controlled or driverless (i.e. autonomously controlled) in different embodiments. However, for enhanced clarity, the vehicle 100 is subsequently described as having a driver.
The vehicle 100 comprises a source of electromagnetic radiation 110, configured to emit electromagnetic radiation in a wavelength spectrum narrower than a threshold spectrum width, as illustrated in Figure 2 and further discussed in the corresponding section of the disclosure. The electromagnetic radiation source 110 may in some embodiments comprise a diode or a set of diodes; one or several lamps with a colour filter; one or several fluorescent lamps or a fluorescent tube with a colour filter; or a coherent laser device. In spatially coherent light, all photons are moving in phase, whereby interference may be created, which is referred to as speckles. The laser device may also have high temporal coherence, which allows it to emit electromagnetic radiation with a very narrow spectrum, i.e. a single colour of light.
In the illustrated embodiment, an object 120, such as e.g. a human, another vehicle, an animal, a structure, etc., is situated in the surroundings of the vehicle 100.
Further, the vehicle 100 comprises a camera 130, configured to receive reflections, from the object 120, of the electromagnetic radiation emitted by the electromagnetic radiation source 110. The camera 130 may comprise a stereo camera, an infrared camera, a video camera, etc., configured for capturing and/or streaming images.
In some embodiments, coherent laser light is emitted by the electromagnetic radiation source 110, within the visible or invisible spectrum. As the laser light is coherent, reflections of the emitted laser light from the object 120 create moving speckles at the sensor surface of the camera 130 when the object 120 is moving in relation to the camera 130 (and thereby also in relation to the vehicle 100). Thus, the speckle pattern moves when the illuminated object 120 moves in relation to the electromagnetic radiation source 110/camera 130, and the speckle pattern is stationary if the reflecting object 120 is stationary.
A speckle pattern is an intensity pattern produced by the mutual interference of a set of wave fronts, creating constructive/destructive interference due to a path length difference. Speckle patterns typically occur in diffuse reflections of monochromatic light such as laser light. The speckles may further be analysed to retrieve information regarding the relative speed difference: the faster the light-reflecting object 120 moves, the faster the speckles move across the sensor surface of the camera 130.
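As an illustration of how the speckle movement described above could be quantified, the following is a minimal sketch (not the patent's implementation) that estimates the pixel displacement of a speckle pattern between two camera frames using FFT phase correlation, on synthetic frame data:

```python
import numpy as np

def speckle_shift(newer, older):
    """Estimate how far (rows, cols) the speckle pattern moved from the
    older frame to the newer frame, via FFT phase correlation."""
    cross = np.fft.fft2(newer) * np.conj(np.fft.fft2(older))
    cross /= np.abs(cross) + 1e-12            # keep phase information only
    corr = np.abs(np.fft.ifft2(cross))        # sharp peak at the shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # map peaks in the upper half of the range to negative shifts
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))

# synthetic speckle pattern, circularly shifted by (3, 5) pixels
rng = np.random.default_rng(0)
frame_a = rng.random((64, 64))
frame_b = np.roll(frame_a, (3, 5), axis=(0, 1))
print(speckle_shift(frame_b, frame_a))        # -> (3, 5)
```

Dividing the estimated shift by the frame interval gives the speckle velocity at the sensor, which relates to the relative speed of the reflecting object.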
Measurements may be made on a plurality of distinct points, such as e.g. on some or all pixels of the camera sensor surface. Thereby, a two-dimensional image, or set of images in distinct time frames, may be created, illustrating the speed of the object 120.
Speckle contrast imaging/measurement and speckle pattern movement between successive camera frames are two different methods of sensing movement of the reflective object 120. In speckle contrast imaging, the contrast of the speckle pattern is measured. Blurry speckles indicate movement: the shutter speed of the camera 130 then allows some speckles to move during the time the picture is taken. The light spot is blurred as the energy is distributed over multiple pixels on the sensor area of the camera 130, and the contrast decreases. Stationary speckles have high contrast, as they are not moving.
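The speckle contrast measurement described above could be sketched as follows (an illustrative example only, not the patent's implementation): the local contrast K = std/mean is computed per window, and low contrast indicates that the speckles were blurred by motion during the exposure. The window size and the synthetic test frames are arbitrary choices.

```python
import numpy as np

def mean_speckle_contrast(frame, win=7):
    """Average local speckle contrast K = std/mean over win x win tiles.
    K near 1 suggests a static, fully developed speckle pattern;
    K near 0 suggests the speckles were blurred by motion."""
    h, w = frame.shape
    ks = []
    for i in range(0, h - win + 1, win):
        for j in range(0, w - win + 1, win):
            tile = frame[i:i + win, j:j + win]
            m = tile.mean()
            ks.append(tile.std() / m if m > 0 else 0.0)
    return float(np.mean(ks))

rng = np.random.default_rng(1)
static = rng.exponential(1.0, (70, 70))   # intensity statistics of static speckle
blurred = np.full((70, 70), 1.0)          # motion-averaged, nearly uniform intensity
print(mean_speckle_contrast(static) > mean_speckle_contrast(blurred))  # True
```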
The photons of the reflected electromagnetic radiation, emitted by the electromagnetic radiation source 110, coincide with light from other light sources reflected off the object 120. Thereby, it becomes possible to combine a "normal" image with superposed speed information, based on the reflected electromagnetic radiation of the electromagnetic radiation source 110. Alternatively, the reflections forming the "normal" image may be filtered out and only the speed information may be outputted. Thus, only representations of moving objects in the vehicle environment may be outputted to the driver, helping him/her focus on the moving objects in the vehicle environment; also in poor ambient light, at night, in fog, or in heavy rain, snow or hail.
The camera 130 and the electromagnetic radiation source 110, in combination with laser speckle velocimetry, may be utilised for outputting a representation of the object 120 on an output unit, which may be a digital rear-view mirror; a display in the cab of the vehicle; a head-up display; a display integrated in the windshield of the vehicle 100; a projector; a pair of intelligent glasses, i.e. an optical head-mounted display designed in the shape of a pair of eyeglasses; a loudspeaker; a tactile device, etc.; or a combination thereof.
Thereby, an image representation may be outputted with speed information superposed on moving objects 120 in the environment. In some embodiments, an enhancement may be outputted superposed on the representation of the object 120, such as e.g. a colour, or a colour scale mapped to speed information. Other examples of enhancement may be flashing the representation of the moving object 120, emitting a sound/tactile signal, putting a frame around the moving object 120, etc.
In some embodiments, the object 120 may be the ground/road under the vehicle 100; the electromagnetic radiation source 110 may then be directed towards the ground/street and the camera 130 may be directed to capture electromagnetic radiation reflected from the object 120/road. It may thereby be determined whether the vehicle 100 is stationary or not, which may be an important precondition for certain functions in the vehicle 100, such as opening the passenger doors of a bus, for example, or switching on the cab light.
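A stationary-vehicle gate of the kind described above might, purely as an illustration, look like the sketch below; the threshold value and the function names are hypothetical, not taken from the patent.

```python
def vehicle_is_stationary(road_speckle_contrast, k_threshold=0.8):
    """High speckle contrast from a road-pointing camera means the
    speckle pattern was static during the exposure, i.e. the vehicle
    is not moving. k_threshold is an illustrative value."""
    return road_speckle_contrast >= k_threshold

def doors_may_open(road_speckle_contrast):
    """Permit a stationary-only function, e.g. opening bus passenger doors."""
    return vehicle_is_stationary(road_speckle_contrast)

print(doors_may_open(0.95), doors_may_open(0.30))  # True False
```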
Figure 1B illustrates the vehicle 100 already presented in Figure 1A, as regarded from above. It is here illustrated how the object 120 moves towards the vehicle 100 over a number of distinct time frames. The electromagnetic radiation emitted from the electromagnetic radiation source 110, within a narrow electromagnetic radiation spectrum, may be reflected by the object 120 and the reflections may be captured by the camera 130.
The electromagnetic radiation source 110 and the camera 130 may be directed forwards in the regular driving direction of the vehicle 100. The object 120 may then be a vehicle in front, and in some embodiments a constant distance and velocity alignment may be made with that vehicle.
The electromagnetic radiation of the electromagnetic radiation source 110 may be expanded e.g. by an optical element or a set of optical elements such as a lens, a spherical reflector and/or a mirror, to cover a zone which is desired to be covered. In some embodiments, diffractive optics may be used.
A coherent laser produces a speckle pattern on the camera sensor when light is reflected from a moving object 120. Frequency analysis or speckle contrast measurement may then determine the speed of the object 120. Optical filters may be applied in some embodiments to filter out light disturbing the measuring wavelength. The speed information may then be extracted only at the wavelength of the laser, thereby avoiding measurement errors due to incident ambient light. Alternatively, two (or more) cameras 130 may be used: one for capturing background images and one for capturing speed information of the moving object 120. The images/information may then be matched and calibrated together. Thereby, the speed information may be superposed on the background images. In some embodiments, the static background of non-moving objects may be outputted in a black-and-white scale, while the superposed speed information of the moving object(s) 120 may be outputted in colour.
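The superposition of colour-coded speed information on a black-and-white background, as described above, could be sketched like this (an illustrative rendering only; the red tint and the v_max normalisation are arbitrary choices, not from the patent):

```python
import numpy as np

def overlay_speed(background_gray, speed_map, v_max=2.0):
    """Turn a grayscale background into RGB and tint each pixel red in
    proportion to the measured speed of the object at that pixel."""
    rgb = np.stack([background_gray] * 3, axis=-1).astype(float)
    weight = np.clip(speed_map / v_max, 0.0, 1.0)[..., None]
    red = np.zeros_like(rgb)
    red[..., 0] = 255.0
    return ((1.0 - weight) * rgb + weight * red).astype(np.uint8)

background = np.full((4, 4), 128, dtype=np.uint8)   # static scene in grey
speed = np.zeros((4, 4))
speed[1, 1] = 2.0                                   # one fast-moving pixel
out = overlay_speed(background, speed)
print(out[1, 1].tolist(), out[0, 0].tolist())       # [255, 0, 0] [128, 128, 128]
```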
An advantage of the herein disclosed embodiments is that moving objects 120, e.g. in darkness, may be detected and more easily distinguished from the surroundings. It may for example otherwise be difficult to discover an animal whose colours merge into the surroundings, or a person in dark clothes without retroreflectors.
It is possible for the driver to perceive both a “normal” view of the surroundings, e.g. outputted in a digital mirror, on a screen integrated in the windshield, etc., and also obtain speed information in the same view. Thereby the driver does not have to switch between looking at different output units, enabling the driver to focus on the most critical object in the vehicle environment.
In case the vehicle 100 is an autonomous vehicle, the obtained speed information of a detected moving object 120, in combination with a movement direction analysis, may be used for determining whether the vehicle 100 is going to collide with the moving object 120 or not, and may trigger an appropriate action such as braking, sounding the horn, etc.
Figure 2 illustrates a portion of the electromagnetic spectrum, i.e. the range of frequencies and their linked wavelengths of electromagnetic radiation.
The electromagnetic spectrum in general extends from below the low frequencies used for modern radio communication to gamma radiation at the short-wavelength (high-frequency) end, thereby covering wavelengths from thousands of kilometres down to a fraction of the size of an atom. The illustrated portion concerns visible light, with wavelengths from 400 to 700 nanometres; infrared light (700 nanometres to 1 mm); and ultraviolet light (10 to 400 nanometres).
The electromagnetic radiation of the electromagnetic radiation source 110 is emitted in a wavelength spectrum 210 narrower than a threshold spectrum width 220.
The threshold spectrum width 220 may be predetermined or configurable and may comprise a narrow spectrum interval such as e.g. 10 nanometres; 50 nanometres; etc.
An advantage of using a laser within the infrared spectrum interval is that the infrared light does not disturb other road users. Another advantage may be that heat radiation of moving objects 120 may be used as an electromagnetic radiation source.
An advantage of using a laser within the visible light spectrum is that other road users will notice the laser light and look away, and can thus avoid eye injuries caused by the laser.
Figure 3 illustrates an example of a vehicle interior of the vehicle 100 and depicts how the previously described scenario in Figure 1A and/or Figure 1B may be perceived by the driver of the vehicle 100.
The vehicle 100 comprises a control arrangement 310, a right-side output device 320a intended to display a representation of objects 120 outside a driver’s direct field of vision, situated on the right side of the vehicle 100, and a left side output device 320b intended to display a representation of objects 120 outside a driver’s direct field of vision, situated on the left side of the vehicle 100. Each such output device 320a, 320b is associated with a respective electromagnetic radiation source 110a, 110b and camera 130a, 130b. The electromagnetic radiation source 110a, 110b may comprise a laser; or a narrow light spectrum radiation source; and the camera 130a, 130b may typically comprise a respective video camera. The control arrangement 310 may comprise one control unit, or a plurality of control units.
Thus, the vehicle 100 comprises digital rear-view mirrors, each of which comprises: one output device 320a, 320b; at least one electromagnetic radiation source 110a, 110b; at least one camera 130a, 130b; and the control arrangement 310.
The output device 320a, 320b may in some alternative embodiments be complemented or replaced by another device such as e.g. a display, a loudspeaker, a projector, a head-up display, a display integrated in the windshield of the vehicle 100, a display integrated in the dashboard of the vehicle 100, a tactile device, a portable device of the vehicle driver/owner, intelligent glasses of the vehicle driver/owner, etc.; or a combination thereof.
Further, in some embodiments, the vehicle 100 may comprise a plurality of additional sensors on each side of the vehicle 100 for detecting objects 120, which may be of the same or different types, such as e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar devices in different embodiments.
The control arrangement 310 is configured to detect the object 120 and generate control signals for outputting a representation of the object 120 to a driver of the vehicle 100 via any of the output devices 320a, 320b. One or several sensors may be attached to or associated with the control arrangement 310 in some embodiments.
The control arrangement 310 is configured to trigger a source of electromagnetic radiation 110a to emit electromagnetic radiation in a wavelength spectrum 210 narrower than a threshold spectrum width 220. Further, the control arrangement 310 is configured to obtain signals from the camera 130a, representing reflections of the emitted electromagnetic radiation from the object 120 received by the camera 130a. The control arrangement 310 is also configured to determine whether the object 120 is moving or not, in relation to the vehicle 100, based on the obtained signals. Furthermore, the control arrangement 310 is configured to generate a command to output a representation of the object 120 determined to be moving to the driver of the vehicle 100, comprising a visual enhancement, via an output device 320a. The visual enhancement may be a colour, flashing lights at the contours of the representation of the object 120, or a frame around the moving object 120, possibly combined with an audio signal, a light signal, a haptic signal, etc.
Thereby the risk of an accident caused by a moving object 120 appearing in the close environment of the vehicle 100 is reduced, as the driver is made aware of the object 120 and its position in relation to his/her own vehicle 100. Traffic safety is thereby enhanced.
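The control arrangement's cycle described above (trigger the source, obtain camera signals, determine movement, command an enhanced output) can be sketched as follows; all class and method names here are hypothetical stand-ins, not the patent's interfaces:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    object_id: int
    relative_speed: float  # m/s relative to the vehicle, from speckle analysis

def control_cycle(emitter, camera, analyser, display, speed_floor=0.05):
    """One cycle of a control arrangement such as 310: trigger the
    radiation source, obtain camera signals, keep only the objects
    determined to be moving, and command the output device to show an
    enhanced representation of each. Returns the ids shown."""
    emitter.emit()                        # emit narrow-spectrum radiation
    frame = camera.read()                 # reflections received by the camera
    shown = []
    for det in analyser.detections(frame):
        if abs(det.relative_speed) > speed_floor:
            display.show_enhanced(det)    # e.g. colour mapped to speed
            shown.append(det.object_id)
    return shown

# a single stub object standing in for all four hardware interfaces
class Stub:
    def __init__(self, detections): self._dets, self.shown = detections, []
    def emit(self): pass
    def read(self): return "frame"
    def detections(self, frame): return self._dets
    def show_enhanced(self, det): self.shown.append(det.object_id)

stub = Stub([Detection(1, 0.0), Detection(2, 1.2)])
print(control_cycle(stub, stub, stub, stub))  # [2]
```

Only the object moving relative to the vehicle (id 2) is forwarded to the output device; the static object is suppressed, matching the behaviour the text describes.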
The control arrangement 310 may communicate with the electromagnetic radiation sources 110a, 110b, cameras 130a, 130b and displays 320a, 320b e.g. via a wired or wireless communication bus of the vehicle 100, or via a wired or wireless connection. The communication bus may comprise e.g. a Controller Area Network (CAN) bus, Ethernet/Local Area Network (LAN), a Media Oriented Systems Transport (MOST) bus, or similar. However, the communication may alternatively be made over an optic connection or a wireless connection based on a wireless communication technology.
The cameras 130a, 130b may be directed to capture images at the back and the sides of the vehicle 100. The cameras 130a, 130b may comprise a video camera, or a camera configured for streaming images. The cameras 130a, 130b may be part of a digital mirror, replacing the rear-view mirrors on the two respective sides of the vehicle 100, together with the connected control arrangement 310 and a display 320a, 320b outputting the images captured by the cameras 130a, 130b, possibly image processed by the control arrangement 310.
Figure 4 illustrates an example of a method 400 according to an embodiment. The flow chart in Figure 4 shows the method 400 for use in a vehicle 100. The method 400 aims at detecting an object 120 and outputting a representation of the object 120 to a driver of the vehicle 100.
The vehicle 100 may be e.g. a truck, a bus, a car, or similar means of conveyance.
The vehicle 100 may comprise one or several sources of electromagnetic radiation 110 and one or several cameras 130 pointable towards the object 120, in some embodiments simultaneously, time-shifted or sequentially in time. In case several sources of electromagnetic radiation 110 are used, they may be tuned with each other in order not to create interference between them.
The object 120 may be another vehicle, a human, an animal, the ground, etc.
In order to be able to correctly detect and visualise the object 120, the method 400 may comprise a number of steps 401-406. However, some of these steps 401-406 may be performed in various alternative manners. Some method steps may only be performed in some optional embodiments, such as steps 403 and/or 406. Further, the described steps 401-406 may be performed in a somewhat different chronological order than the numbering suggests. The method 400 may comprise the subsequent steps:

Step 401 comprises emitting electromagnetic radiation in a wavelength spectrum 210 more narrow than a threshold spectrum width 220. The electromagnetic radiation may comprise e.g. visible light, near infrared light, infrared light, and/or ultraviolet light in some embodiments.
The electromagnetic radiation may be coherent in some embodiments, i.e. comprise laser light.
The source of electromagnetic radiation 110 may be modulated in synchronisation with a camera 130, for suppressing interference from environmental electromagnetic radiation.
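The synchronised modulation described above may be sketched as follows. This is a minimal, hypothetical illustration (the function name, the alternating frame layout, and the use of Python/NumPy are assumptions for illustration, not part of the disclosure): when the source is switched on and off in sync with the camera exposures, subtracting the average source-off frame from the average source-on frame leaves only the emitted reflections.

```python
import numpy as np

def ambient_suppressed(frames, source_on):
    """Suppress environmental light by modulating the radiation source in
    synchronisation with the camera: average the source-on frames, subtract
    the average of the source-off frames, leaving only the emitted
    reflections. `source_on` is a boolean flag per frame."""
    frames = np.asarray(frames, dtype=float)
    source_on = np.asarray(source_on)
    on = frames[source_on].mean(axis=0)    # illumination + ambient
    off = frames[~source_on].mean(axis=0)  # ambient only
    return on - off                        # illumination only
```

Averaging over many frame pairs additionally suppresses uncorrelated sensor noise, which is the same principle a lock-in amplifier exploits in the continuous case.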
Step 402 comprises receiving reflections of the emitted 401 electromagnetic radiation from the object 120. The reflections may be received by a camera 130.
Further, in some particular embodiments, receiving the reflections of the object 120 may also comprise identifying the object 120 by image recognition.
Image recognition/computer vision is a technical field comprising methods for acquiring, processing, analysing, and understanding images and, in general, high-dimensional data from the real world in order to produce numerical or symbolic information. A theme in the development of this field has been to duplicate the abilities of human vision by electronically perceiving and understanding an image. Understanding in this context means the transformation of visual images (the input of the retina) into descriptions of the world that can interface with other thought processes and elicit appropriate action. This image understanding can be seen as the disentangling of symbolic information from image data using models constructed with the aid of geometry, physics, statistics, and learning theory. Computer vision may also be described as the enterprise of automating and integrating a wide range of processes and representations for vision perception.
In some embodiments, the location of the detected object 120 in relation to the vehicle 100 may be determined.
Step 403, which only may be comprised in some embodiments, comprises filtering out interfering electromagnetic radiation outside the wavelength spectrum 210, in which electromagnetic radiation has been emitted 401.
That is, reflections and incoming light from other sources of electromagnetic radiation such as the sun or ambient lights may be filtered out.
Step 404 comprises determining whether the object 120 is moving or not, in relation to the vehicle 100, based on the received 402 reflections.
In some embodiments wherein the emitted electromagnetic radiation is coherent, it may be determined whether the object 120 is moving or not by detecting speckles created by the received 402 reflections from the object 120, either by analysing sequential samples in different time frames, or by analysing the speckle contrast in one image frame.
A movement direction of the object 120 may be determined, based on a detected difference in position of the object 120, between a plurality of time frames wherein reflections of the object 120 have been received 402.
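The determination of a movement direction from a detected difference in position between time frames may be sketched as follows; a minimal illustration under the assumption that the object has been segmented into a pixel mask per frame (the function name and the mask representation are hypothetical, not from the disclosure):

```python
import numpy as np

def movement_direction(mask_t0, mask_t1):
    """Movement direction of a detected object between two time frames,
    computed as the displacement (rows, cols) of the centroid of the
    object's boolean pixel mask."""
    c0 = np.argwhere(mask_t0).mean(axis=0)  # centroid in the earlier frame
    c1 = np.argwhere(mask_t1).mean(axis=0)  # centroid in the later frame
    return tuple(c1 - c0)
```

Accumulating this displacement over a plurality of time frames, rather than a single pair, makes the estimated direction less sensitive to segmentation noise.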
In some embodiments, the determination whether the object 120 is moving or not further comprises estimating the speed of the object 120. The object speed is proportional to the movement of the speckles over the sensor of the camera 130. Thus, by determining and analysing the movement of the speckles, and/or the speckle contrast, an estimation of the object speed may be made.
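Both speckle measures mentioned above can be sketched numerically; this is an illustrative approximation (function names and the use of phase correlation are assumptions, not the patent's prescribed implementation). The speckle contrast K of a single frame drops when motion blurs the pattern during exposure, and the translation of the pattern between two frames, found at the peak of their cross-correlation, is proportional to the relative speed.

```python
import numpy as np

def speckle_contrast(frame):
    """Speckle contrast K = standard deviation / mean of pixel intensities.
    Motion blurs the speckle pattern during exposure, lowering K."""
    frame = np.asarray(frame, dtype=float)
    return frame.std() / frame.mean()

def speckle_shift(frame_a, frame_b):
    """Estimate the (rows, cols) translation of the speckle pattern from
    frame_a to frame_b via phase correlation; the object speed is
    proportional to this shift divided by the inter-frame time."""
    f_a = np.fft.fft2(frame_a)
    f_b = np.fft.fft2(frame_b)
    cross = np.conj(f_a) * f_b
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peak indices beyond half the frame size correspond to negative shifts
    return tuple(int(p) - s if p > s // 2 else int(p)
                 for p, s in zip(peak, corr.shape))
```

Phase correlation is used here because it gives a sharp, illumination-independent peak; plain cross-correlation would serve the same illustrative purpose.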
A lock-in amplifier may be used in some embodiments, to extract a signal with a known carrier wave from a noisy environment.
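The lock-in principle mentioned above may be sketched in software; a minimal dual-phase demodulation, assuming the carrier frequency of the modulated source is known (the function name and signature are illustrative):

```python
import numpy as np

def lock_in(signal, t, f_ref):
    """Recover the amplitude of a component at the known carrier frequency
    f_ref buried in a noisy signal: multiply by in-phase and quadrature
    references and average (the averaging acts as a low-pass filter)."""
    i = np.mean(signal * np.cos(2 * np.pi * f_ref * t))  # in-phase
    q = np.mean(signal * np.sin(2 * np.pi * f_ref * t))  # quadrature
    return 2 * np.hypot(i, q)  # recovered carrier amplitude
```

Noise at all other frequencies averages towards zero, which is why a known carrier wave can be extracted from an environment with much stronger interfering radiation.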
Step 405 comprises outputting a representation of the object 120 determined 404 to be moving to the driver of the vehicle 100, comprising a visual enhancement.
In some embodiments, wherein the velocity of the object 120 has been estimated, the representation of the object 120 may be outputted, based on the estimated velocity.
The outputted representation of the object 120 may comprise a confirmation that the vehicle 100 is stationary, when the object 120 is determined 404 not to be moving in relation to the vehicle 100, in embodiments wherein the object 120 is represented by the ground.
The outputted representation may take many forms, such as e.g. images, video sequences, views from multiple cameras, or multi-dimensional data from a scanner.
The visual enhancement may be represented by a colour, enhanced contours, a frame around the moving object 120, etc., which visual enhancement may be superimposed on an image representation of the vehicle environment.
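Superimposing a frame around the moving object may be sketched as follows; a minimal single-channel illustration (the function name, bounding-box convention, and pixel value are assumptions for illustration):

```python
import numpy as np

def draw_frame(image, top, left, bottom, right, value=255):
    """Superimpose a one-pixel frame around the moving object's bounding
    box onto an image representation of the vehicle environment; the
    original image is left unmodified."""
    out = image.copy()
    out[top, left:right + 1] = value      # top edge
    out[bottom, left:right + 1] = value   # bottom edge
    out[top:bottom + 1, left] = value     # left edge
    out[top:bottom + 1, right] = value    # right edge
    return out
```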
Different speeds of the object 120 may be mapped to different colours in some embodiments. Thus, in an arbitrary example, a speed below a first threshold level may be grey; a speed exceeding the first threshold level while being below a second threshold level may be green; a speed exceeding the second threshold level while being below a third threshold level may be yellow; and a speed exceeding the third threshold level may be red.
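The threshold-to-colour mapping in the arbitrary example above can be sketched as follows; the threshold values and the function name are hypothetical placeholders, chosen only to make the sketch runnable:

```python
def speed_colour(speed, thresholds=(5.0, 15.0, 30.0)):
    """Map an estimated object speed to an enhancement colour using three
    ascending threshold levels (the numeric values are arbitrary, as in
    the example in the text)."""
    t1, t2, t3 = thresholds
    if speed < t1:
        return "grey"    # below the first threshold
    if speed < t2:
        return "green"   # between the first and second thresholds
    if speed < t3:
        return "yellow"  # between the second and third thresholds
    return "red"         # above the third threshold
```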
Step 406, which only may be comprised in some embodiments, comprises switching between outputting 405 a representation of the object 120 determined 404 to be moving comprising the visual enhancement, and a camera image.
Figure 5 illustrates an embodiment of a system 500 in a vehicle 100 for detecting an object 120 and outputting a representation of the object 120 to a driver of the vehicle 100.
The system 500 may perform at least some of the previously described steps 401-406 according to the method 400 described above and illustrated in Figure 4.
The system 500 comprises at least one control arrangement 310 in the vehicle 100. The control arrangement 310 is configured to trigger a source of electromagnetic radiation 110 to emit electromagnetic radiation in a wavelength spectrum 210 more narrow than a threshold spectrum width 220. Further, the control arrangement 310 is configured to obtain signals from a camera 130, representing reflections of the emitted electromagnetic radiation from the object 120 received by the camera 130. The control arrangement 310 is in addition configured to determine whether the object 120 is moving or not, in relation to the vehicle 100, based on the obtained signals. Also, the control arrangement 310 is furthermore configured to generate a command to output a representation of the object 120 determined to be moving to the driver of the vehicle 100, comprising a visual enhancement, via an output device 320.
The control arrangement 310 may in some embodiments be configured to determine whether the object 120 is moving or not by detecting speckles created by the received reflections of the object 120, when analysing sequential samples in different time frames.
Further, the control arrangement 310 may in addition be configured to determine a movement direction of the object 120, based on a detected difference in position of the object 120, between a plurality of time frames wherein reflections of the object 120 have been received.
In addition, the control arrangement 310 may also be configured to estimate velocity of the object 120 when determining whether the object 120 is moving or not. Furthermore, the control arrangement 310 may be configured to generate a command to output the representation of the object 120, based on the estimated velocity, in some embodiments.
The control arrangement 310 may furthermore be configured to filter out interfering electromagnetic radiation outside the wavelength spectrum 210, in which electromagnetic radiation has been emitted.
Also, the control arrangement 310 may, in some embodiments, be configured to modulate a source of electromagnetic radiation 110 in synchronisation with a camera 130, for suppressing interference from environmental electromagnetic radiation, e.g. by a lock-in amplifier.
Furthermore, the control arrangement 310 may be configured to output the representation of the object 120 via the output device 320 when the object 120 is determined not to be moving in relation to the vehicle 100, in embodiments wherein the object 120 is represented by the ground.
The control arrangement 310 may also be configured to switch between outputting the representation of the object 120 determined to be moving comprising the visual enhancement, via the output device 320, and a camera image.
The control arrangement 310 comprises a receiving circuit 510 configured for receiving a signal from the cameras 130a, 130b and possibly other sensors of the vehicle 100.
Further, the control arrangement 310 comprises a processor 520 configured for performing at least some steps of the method 400, according to some embodiments.
Such a processor 520 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processing circuit, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression “processor” may thus represent processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.
Furthermore, the control arrangement 310 may comprise a memory 525 in some embodiments. The optional memory 525 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 525 may comprise integrated circuits comprising silicon-based transistors. The memory 525 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc. in different embodiments.
Further, the control arrangement 310 may comprise a signal transmitter 530 in some embodiments. The signal transmitter 530 may be configured for transmitting a signal to e.g. the output device 320a, 320b, the sources of electromagnetic radiation 110a, 110b, and/or a warning system or warning device, for example.
In addition, the system 500 also comprises at least one source of electromagnetic radiation 110, configured to emit electromagnetic radiation in a wavelength spectrum 210 more narrow than a threshold spectrum width 220. In some embodiments, the electromagnetic radiation source 110 may be configured to emit coherent electromagnetic radiation.
Furthermore, the system 500 also comprises at least one camera 130 configured to receive reflections of emitted electromagnetic radiation from the object 120.
The system 500 also comprises an output device 320 configured to output a representation of the object 120.
The at least one source of electromagnetic radiation 110 and/or the at least one camera 130 may in some embodiments have another main purpose than performing the method 400, i.e. they may already exist in the vehicle 100.
The output device 320 may in some embodiments comprise at least one digital mirror. In some embodiments, an additional camera may be comprised for capturing a stream of images, together with a display for displaying the stream of images captured by the corresponding camera.
The above described steps 401-406 to be performed in the vehicle 100 may be implemented through the one or more processors 520 within the control arrangement 310, together with a computer program product for performing at least some of the functions of the steps 401-406. Thus, a computer program product, comprising instructions for performing the steps 401-406 in the control arrangement 310, may perform the method 400 comprising at least some of the steps 401-406 for detecting an object 120 and outputting a representation of the object 120 to a driver of the vehicle 100, when the computer program is loaded into the one or more processors 520 of the control arrangement 310.
Further, some embodiments of the invention may comprise a vehicle 100, comprising the control arrangement 310, for detecting the object 120 and outputting a representation of the object 120 to a driver of the vehicle 100, according to at least some of the steps 401-406.
The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 401-406 according to some embodiments when being loaded into the one or more processors 520 of the control arrangement 310. The data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control arrangement 310 remotely, e.g., over an Internet or an intranet connection.
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method 400; the control arrangement 310; the computer program; the system 500 and/or the vehicle 100. Various changes, substitutions and/or alterations may be made, without departing from invention embodiments as defined by the appended claims.
As used herein, the term "and/or" comprises any and all combinations of one or more of the associated listed items. The term “or” as used herein, is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as “at least one”, thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components, and/or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms such as via Internet or other wired or wireless communication system.


PATENT CLAIMS
1. A method (400) in a vehicle (100) for detecting an object (120) and outputting a representation of the object (120) to a driver of the vehicle (100), wherein the method (400) comprises: emitting (401) coherent electromagnetic radiation in a wavelength spectrum (210) more narrow than a threshold spectrum width (220); receiving (402) reflections of the emitted (401) coherent electromagnetic radiation from the object (120), creating speckles; determining (404) whether the object (120) is moving or not, in relation to the vehicle (100), based on the received (402) reflections by detecting speckles created by the received (402) reflections of the object (120) and analysing speckle contrast in one sample of speckles for retrieving information regarding relative speed difference; and outputting (405) a representation of the object (120) determined (404) to be moving to the driver of the vehicle (100), comprising a visual enhancement illustrating the relative speed difference.
2. The method (400) according to claim 1, wherein a movement direction of the object (120) is determined (404), based on a detected difference in position of the object (120), between a plurality of time frames wherein reflections of the object (120) have been received (402).
3. The method (400) according to any of claims 1-2, wherein the determination (404) whether the object (120) is moving or not further comprises estimating velocity of the object (120); and wherein the representation of the object (120) is outputted, based on the estimated velocity.
4. The method (400) according to any of claims 1-3, further comprising: filtering out (403) interfering electromagnetic radiation outside the wavelength spectrum (210), in which coherent electromagnetic radiation has been emitted (401).
5. The method (400) according to any of claims 1-4, wherein a source of coherent electromagnetic radiation (110) is modulated in synchronisation with a camera (130), for suppressing interference from environmental electromagnetic radiation.
6. The method (400) according to any of claims 1-5, wherein the object (120) is the ground, and wherein the outputted (405) representation of the object (120) comprises a confirmation that the vehicle (100) is stationary, when the object (120) is determined (404) not to be moving in relation to the vehicle (100).
7. The method (400) according to any of claims 1-6, further comprising: switching (406) between outputting (405) a representation of the object (120) determined (404) to be moving comprising the visual enhancement, and a camera image.
8. A control arrangement (310) in a vehicle (100), for detecting an object (120) and outputting a representation of the object (120) to a driver of the vehicle (100), wherein the control arrangement (310) is configured to: trigger a source of coherent electromagnetic radiation (110) to emit coherent electromagnetic radiation in a wavelength spectrum (210) more narrow than a threshold spectrum width (220); obtain signals from a camera (130), representing reflections of the emitted coherent electromagnetic radiation from the object (120) received by the camera (130), creating speckles; determine whether the object (120) is moving or not, in relation to the vehicle (100), based on the obtained signals by detecting speckles created by the received reflections of the object (120) and analysing speckle contrast in one sample of speckles for retrieving information regarding relative speed difference; and generate a command to output a representation of the object (120) determined to be moving to the driver of the vehicle (100), comprising a visual enhancement illustrating the relative speed difference, via an output device (320).
9. The control arrangement (310) according to claim 8, further configured to determine a movement direction of the object (120), based on a detected difference in position of the object (120), between a plurality of time frames wherein reflections of the object (120) have been received.
10. The control arrangement (310) according to any of claims 8-9, further configured to estimate velocity of the object (120) when determining whether the object (120) is moving or not; and also, configured to generate a command to output the representation of the object (120), based on the estimated velocity.
11. The control arrangement (310) according to any of claims 8-10, further configured to filter out interfering electromagnetic radiation outside the wavelength spectrum (210), in which coherent electromagnetic radiation has been emitted.
12. The control arrangement (310) according to any of claims 8-11, further configured to modulate a source of coherent electromagnetic radiation (110) in synchronisation with the camera (130), for suppressing interference from environmental electromagnetic radiation.
13. The control arrangement (310) according to any of claims 8-12, wherein the object (120) is the ground, and wherein the control arrangement (310) is further configured to output the representation of the object (120) via the output device (320), when the object (120) is determined not to be moving in relation to the vehicle (100).
14. The control arrangement (310) according to any of claims 8-13, further configured to switch between outputting the representation of the object (120) determined to be moving comprising the visual enhancement, via the output device (320), and a camera image.
15. A system (500) in a vehicle (100) for detecting an object (120) and outputting a representation of the object (120) to a driver of the vehicle (100), which system (500) comprises: a control arrangement (310) according to claims 8-14; a source of coherent electromagnetic radiation (110), configured to emit coherent electromagnetic radiation in a wavelength spectrum (210) more narrow than a threshold spectrum width (220); a camera (130) configured to receive reflections of emitted coherent electromagnetic radiation from the object (120); and an output device (320) configured to output a representation of the object (120).
16. A computer program comprising program code for performing a method (400) according to any of claims 1-7 when the computer program is executed in a control arrangement (310), according to any of claims 8-14.
17. A vehicle (100) comprising a system (500) according to claim 15.
