WO2017211837A1 - On-board system and method for determining a relative position - Google Patents

On-board system and method for determining a relative position Download PDF

Info

Publication number
WO2017211837A1
Authority
WO
WIPO (PCT)
Prior art keywords
directions
sensor
vehicle
board system
image
Prior art date
Application number
PCT/EP2017/063725
Other languages
French (fr)
Inventor
Thomas Heitzmann
Benazouz Bradai
Original Assignee
Valeo Schalter Und Sensoren Gmbh Fr
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Valeo Schalter Und Sensoren Gmbh Fr filed Critical Valeo Schalter Und Sensoren Gmbh Fr
Publication of WO2017211837A1 publication Critical patent/WO2017211837A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865Combination of radar systems with lidar systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/42Simultaneous measurement of distance and other co-ordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/93Sonar systems specially adapted for specific applications for anti-collision purposes
    • G01S15/931Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9323Alternative operation using light waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9324Alternative operation using ultrasonic waves

Definitions

  • The present invention relates to on-board vehicle systems. It relates more particularly to an on-board system and a method for determining a relative position.
  • The invention applies particularly advantageously in the case where the on-board system comprises an image sensor and a time-of-flight sensor.
  • Image sensors are generally not very accurate when it comes to determining the distance of the observed objects; distance-measurement sensors are therefore also used, for example obstacle sensors, whose measurement directions are however limited in number in order to reduce the cost of such sensors.
  • In this context, the present invention proposes an on-board system comprising a first sensor able to measure a free distance for a plurality of directions in an environment encountered by a vehicle and a second sensor able to generate an image of said environment, characterized by a module for grouping a plurality of said directions on the basis of the free distances measured for the directions of said plurality, a module for locating at least one object within a region of the image corresponding to said directions of said plurality, and a module for selecting the directions corresponding to the location of said object from among said plurality of directions.
  • According to optional (and therefore non-limiting) features, the first sensor is a time-of-flight sensor, for example a scanning laser rangefinder or an obstacle sensor;
  • the second sensor is a video camera, a radar, or an ultrasound system;
  • the on-board system comprises a module for controlling an actuator as a function of the selected directions;
  • the actuator is a steering control system.
  • The invention also proposes a method for determining a relative position of an object with respect to a vehicle, by means of an on-board system comprising a first sensor able to measure a free distance for a plurality of directions in an environment encountered by the vehicle and a second sensor able to generate an image of said environment, characterized by the following steps: grouping a plurality of said directions on the basis of the free distances measured for the directions of said plurality; locating the object within a region of the image corresponding to said directions of said plurality; and selecting the directions corresponding to the location of said object from among said plurality of directions.
  • FIG. 1 shows schematically the main elements of a system embedded in a vehicle
  • FIG. 2 represents, in the form of functional blocks, a processing unit of the onboard system of FIG. 1.
  • FIG. 1 schematically shows the main elements of an on-board system 2 fitted in a vehicle V (here a motor vehicle).
  • This on-board system comprises an image sensor 4, a time-of-flight sensor 6, an actuator 8 and a processing unit 10.
  • The image sensor 4 is for example a video camera.
  • The image sensor 4 is placed in the vehicle so as to capture an image IMG of the environment outside the vehicle V, located at the front of the vehicle (that is to say, encountered by the vehicle V as it moves forward), within a solid angle A corresponding to the field of view of the image sensor 4.
  • As shown schematically in FIG. 1, this environment comprises a first object O, here a second motor vehicle, and a second object O', here a third motor vehicle.
  • As a variant, the image sensor 4 could be a radar or an ultrasound system. It should be noted that, in these cases in particular, the image captured by the image sensor 4 may then be a one-dimensional image (or representation) of the environment located at the front of the vehicle V.
  • The time-of-flight sensor 6 is for example an obstacle sensor (typically infrared or laser); as a variant, it could be a scanning laser rangefinder ("laser scanner").
  • Such a time-of-flight sensor 6 makes it possible to measure, for each of a plurality of directions αi in the environment facing the vehicle V, the free distance di up to the first obstacle encountered in the direction αi concerned, generally by counting the time taken by a light ray (or, more generally, an electromagnetic wave) emitted in the direction αi concerned to be detected at the time-of-flight sensor 6 after reflection on the first obstacle encountered (at a point of impact Pi).
  • Owing to the technique used, the distance measurements made by the time-of-flight sensor 6 are accurate.
  • The actuator 8 makes it possible to control, at least in part, the trajectory of the vehicle V.
  • In the example described here, the actuator 8 is a steering control system. As a variant, it could be a powertrain or a braking system.
  • As explained below, the processing unit 10 can control the actuator 8 by means of a control signal CMD determined as a function of various processing operations carried out within the processing unit 10.
  • The processing unit 10 is for example designed to control, by means of the actuator 8, a trajectory consisting in following the second vehicle (first object O in FIG. 1), either for driver-assistance purposes (this trajectory being followed in the absence of a different command from the driver) or for the purpose of automatically piloting the vehicle V (case of an autonomous vehicle).
  • FIG. 2 represents the processing unit 10 in the form of functional blocks, or modules.
  • Each module 12, 14, 16, 18 represented in FIG. 2 corresponds to a particular functionality implemented by the processing unit 10.
  • Several (or even all) modules may however in practice be implemented by the same physical entity, for example a processor executing program instructions stored in a memory associated with the processor (each module then being implemented by the execution of a particular set of instructions stored in said memory).
  • The processing unit 10 thus comprises a grouping module 12 designed to group (or associate with one another) a plurality of the observation directions αi of the time-of-flight sensor 6, on the basis of a criterion using the free distances di measured by the time-of-flight sensor 6.
  • For example, the directions may be grouped on the basis of the proximity of the measured free distances dj (at a given instant): two directions αj, αj' are in this case grouped if the free distances dj, dj' for these two directions are sufficiently close, that is to say, in practice, if |dj - dj'| < ε', with ε' a predetermined threshold.
  • The processing unit 10 also comprises a module 14 for locating object(s) O, O' within a region R of the image IMG that corresponds to said grouped directions {αj}j∈S.
  • Such a location module 14 is designed to identify one or more object(s) O, O' (for example by means of a shape-recognition algorithm) within the region R corresponding to the grouped directions {αj}j∈S and to provide data L representative of the location (for example in the image IMG or, as a variant, in the region R) of the identified object(s) O, O'.
  • The association between the different possible directions αi and the region R of the image IMG associated with each of them is predefined (according to the fixed relative position of the image sensor 4 and the time-of-flight sensor 6) and stored, for example, in a correspondence table within the processing unit 10.
  • Although the grouped directions generally correspond (by construction) to a single object, the identification (i.e. the detection) of a plurality of objects O, O' within the region corresponding to the grouped directions {αj}j∈S is possible when several objects O, O' are close to one another and/or have close behaviours (for example close speeds), as schematically represented in FIG. 1, and therefore could not be distinguished within the grouping module 12.
  • The processing unit 10 comprises a selection module 16 designed to select, for each object O, O' identified in the region R by the location module 14, the set of directions {αk}k∈S' corresponding to the object O, O' concerned from among the set of grouped directions {αj}j∈S determined by the grouping module 12.
  • To do this, for each object O, O', the selection module 16 determines, among the grouped directions, those which correspond to the data L representative of the location of the object O concerned.
  • As already indicated, the association between the different possible directions αi and the image IMG (the data L being expressed within this image) is for example stored in a correspondence table within the processing unit 10.
  • The selection module 16 can thus provide, for each object O, O', the set of selected directions {αk}k∈S' associated with this object O, O'.
  • This provides a map of the environment located at the front of the vehicle V that is based on the data provided by the time-of-flight sensor 6 but whose robustness is improved thanks to the use of the data (the image IMG) provided by the image sensor 4.
  • In the example described here, the processing unit 10 further comprises a control module 18 designed to generate a steering command CMD for the actuator 8 as a function of the aforementioned map, more precisely as a function of the selected directions {αk}k∈S' relating to the object O, that is to say, here, the vehicle to be followed (second vehicle).
  • The control module 18 for example issues a command CMD such that, through the action of the actuator 8 (here a steering control system), the vehicle V steers towards a mean direction determined on the basis of the aforementioned selected directions αk.
  • To do this, provision may be made for the control module 18 to distinguish the object O (vehicle to be followed) from another object O' by choosing the object for which the selected directions {αk}k∈S' are closest to the direction of travel of the vehicle V or, as a variant, by choosing the object tracked at the previous iteration (for example by comparing the directions selected for the different objects at the current instant with the directions selected for the followed vehicle at the previous iteration).

Abstract

The invention relates to an on-board system (2) comprising a first sensor (6) for measuring a clear distance (di) for different directions (αi) in an environment encountered by a vehicle (V) and a second sensor (4) for generating an image (IMG) of the environment. The on-board system also comprises: a grouping module for grouping together a plurality of the aforementioned directions (αi) on the basis of the clear distances (di) measured for the directions (αi) from the plurality of directions; a module for locating at least one object (O, O') within a region (R) of the image (IMG) corresponding to the directions (αi) from the plurality of directions; and a module for selecting directions (αi) corresponding to the location of the object (O; O') from among the plurality of directions (αi). The invention also relates to a method for determining a relative position of an object (O; O').

Description

Système embarqué et procédé de détermination d'une position relative  Embedded system and method for determining a relative position
DOMAINE TECHNIQUE AUQUEL SE RAPPORTE L'INVENTION La présente invention concerne les systèmes embarqués pour véhicule. Elle concerne plus particulièrement un système embarqué et un procédé de détermination d'une position relative. TECHNICAL FIELD TO WHICH THE INVENTION RELATES The present invention relates to on-board vehicle systems. It relates more particularly to an onboard system and a method for determining a relative position.
L'invention s'applique particulièrement avantageusement dans le cas où le système embarqué comprend un capteur d'image et un capteur de temps de vol.  The invention applies particularly advantageously in the case where the onboard system comprises an image sensor and a time of flight sensor.
ARRIERE-PLAN TECHNOLOGIQUE  BACKGROUND
On utilise de plus en plus de nos jours des systèmes permettant d'appréhender (au moyen de capteurs) l'environnement routier d'un véhicule et d'analyser cet environnement, dans un but d'assistance à la conduite, voire de pilotage automatique du véhicule.  More and more systems are being used nowadays to apprehend (by means of sensors) the road environment of a vehicle and to analyze this environment, for the purpose of driving assistance, or even of automatic piloting of the vehicle.
Parmi les capteurs utilisés dans ce cadre, certains permettent d'obtenir une image de l'environnement ; ils sont toutefois en général assez peu précis lorsqu'il s'agit de déterminer la distance des objets observés. C'est pourquoi on utilise également des capteurs de mesure de distance, par exemple des capteurs d'obstacles, dont les directions de mesure sont toutefois en nombre limité afin de réduire le coût de tels capteurs.  Among the sensors used in this context, some allow to obtain an image of the environment; they are, however, generally quite inaccurate when it comes to determining the distance of objects observed. For this reason, distance measurement sensors are also used, for example obstacle sensors, whose measurement directions are, however, limited in number in order to reduce the cost of such sensors.
OBJET DE L'INVENTION  OBJECT OF THE INVENTION
Dans ce contexte, la présente invention propose un système embarqué comprenant un premier capteur apte à mesurer une distance libre pour une pluralité de directions dans un environnement rencontré par un véhicule et un second capteur apte à générer une image dudit environnement, caractérisé par un module de regroupement d'une pluralité desdites directions sur la base des distances libres mesurées pour les directions de ladite pluralité, un module de localisation d'au moins un objet au sein d'une région de l'image correspondant auxdites directions de ladite pluralité, et un module de sélection des directions correspondant à la localisation dudit objet parmi ladite pluralité de directions.  In this context, the present invention proposes an on-board system comprising a first sensor able to measure a free distance for a plurality of directions in an environment encountered by a vehicle and a second sensor able to generate an image of said environment, characterized by a module for grouping a plurality of said directions on the basis of the free distances measured for the directions of said plurality, a module for locating at least one object within a region of the image corresponding to said directions of said plurality, and a module for selecting the directions corresponding to the location of said object among said plurality of directions.
On combine ainsi astucieusement les informations fournies par deux types de capteur différents pour obtenir une évaluation précise du positionnement de l'objet concerné, typiquement un autre véhicule, situé devant le véhicule équipé du système embarqué. Selon des caractéristiques envisageables à titre optionnel (et donc non limitatif) : The information provided by two different types of sensor is thus cleverly combined to obtain an accurate assessment of the positioning of the object concerned, typically another vehicle, located in front of the vehicle equipped with the onboard system. According to optional features (and therefore not limiting):
- le premier capteur est un capteur de temps de vol, par exemple un télémètre laser à balayage ou un capteur d'obstacle ;  the first sensor is a flight time sensor, for example a scanning laser rangefinder or an obstacle sensor;
- le second capteur est une caméra vidéo, ou un radar, ou encore un système à ultrasons ;  the second sensor is a video camera, or a radar, or an ultrasound system;
- le système embarqué comprend un module de commande d'un actionneur en fonction des directions sélectionnées ;  the on-board system comprises a control module of an actuator as a function of the selected directions;
- l'actionneur est un système de commande de direction.  the actuator is a steering control system.
L'invention propose également un procédé de détermination d'une position relative d'un objet par rapport à un véhicule, au moyen d'un système embarqué comprenant un premier capteur apte à mesurer une distance libre pour une pluralité de directions dans un environnement rencontré par le véhicule et un second capteur apte à générer une image dudit environnement, caractérisé par les étapes suivantes :  The invention also proposes a method for determining a relative position of an object with respect to a vehicle, by means of an onboard system comprising a first sensor able to measure a free distance for a plurality of directions in an environment encountered. by the vehicle and a second sensor adapted to generate an image of said environment, characterized by the following steps:
- regroupement d'une pluralité desdites directions sur la base des distances libres mesurées pour les directions de ladite pluralité ;  grouping a plurality of said directions on the basis of the measured free distances for the directions of said plurality;
- localisation de l'objet au sein d'une région de l'image correspondant auxdites directions de ladite pluralité ;  locating the object within a region of the image corresponding to said directions of said plurality;
- sélection des directions correspondant à la localisation dudit objet parmi ladite pluralité de directions.  selecting directions corresponding to the location of said object among said plurality of directions.
Les caractéristiques présentées à titre optionnel pour le système embarqué peuvent également s'appliquer à un tel procédé.  The features presented as optional for the embedded system may also apply to such a method.
DESCRIPTION DÉTAILLÉE D'UN EXEMPLE DE RÉALISATION La description qui va suivre en regard des dessins annexés, donnés à titre d'exemples non limitatifs, fera bien comprendre en quoi consiste l'invention et comment elle peut être réalisée.  DETAILED DESCRIPTION OF AN EXEMPLARY EMBODIMENT The following description with reference to the accompanying drawings, given by way of non-limiting examples, will make it clear what the invention consists of and how it can be implemented.
Sur les dessins annexés :  In the accompanying drawings:
- la figure 1 représente schématiquement les éléments principaux d'un système embarqué dans un véhicule ;  - Figure 1 shows schematically the main elements of a system embedded in a vehicle;
- la figure 2 représente, sous forme de blocs fonctionnels, une unité de traitement du système embarqué de la figure 1 .  FIG. 2 represents, in the form of functional blocks, a processing unit of the onboard system of FIG. 1.
La figure 1 représente schématiquement les éléments principaux d'un système embarqué 2 dans un véhicule V (ici un véhicule automobile). Ce système embarqué comprend un capteur d'image 4, un capteur de temps de vol 6, un actionneur 8 et une unité de traitement 10. Figure 1 schematically shows the main elements of an embedded system 2 in a vehicle V (here a motor vehicle). This embedded system comprises an image sensor 4, a flight time sensor 6, an actuator 8 and a processing unit 10.
Le capteur d'image 4 est par exemple une caméra vidéo. Le capteur d'image 4 est placé dans le véhicule de manière à capturer une image IMG de l'environnement extérieur du véhicule V, situé à l'avant du véhicule (c'est-à-dire rencontré par le véhicule V lorsque ce dernier avance), dans un angle solide A correspondant au champ de vision du capteur d'image 4. Comme représenté schématiquement en figure 1 , cet environnement comprend un premier objet O, ici un second véhicule automobile, et un second objet O', ici un troisième véhicule automobile.  The image sensor 4 is for example a video camera. The image sensor 4 is placed in the vehicle so as to capture an IMG image of the outside environment of the vehicle V, located at the front of the vehicle (that is to say, met by the vehicle V when the latter advance), in a solid angle A corresponding to the field of view of the image sensor 4. As shown schematically in FIG. 1, this environment comprises a first object O, here a second motor vehicle, and a second object O ', here a third motor vehicle.
En variante, le capteur d'image 4 pourrait être un radar ou un système à ultrasons. On remarque que, dans ces cas notamment, l'image capturée par le capteur d'image 4 peut alors être une image (ou représentation) monodimensionnelle de l'environnement situé à l'avant du véhicule V.  Alternatively, the image sensor 4 could be a radar or an ultrasound system. It should be noted that, in these cases in particular, the image captured by the image sensor 4 can then be a one-dimensional image (or representation) of the environment located at the front of the vehicle V.
Le capteur de temps de vol 6 est par exemple un capteur d'obstacle The flight time sensor 6 is for example an obstacle sensor
(typiquement à infrarouge ou à laser). En variante, il pourrait s'agir d'un télémètre laser à balayage (ou "laser scanner" selon l'appellation anglo-saxonne parfois utilisée). (typically infrared or laser). Alternatively, it could be a scanning laser rangefinder (or "laser scanner" according to the English name sometimes used).
Un tel capteur de temps de vol 6 permet de mesurer, pour chacune d'une pluralité de directions αi dans l'environnement faisant face au véhicule V, la distance libre di jusqu'au premier obstacle rencontré dans la direction αi concernée, généralement en décomptant le temps mis par un rayon lumineux (ou plus généralement une onde électromagnétique) émis dans la direction concernée αi pour être détecté au niveau du capteur de temps de vol 6 après réflexion sur le premier obstacle rencontré (au niveau d'un point d'impact Pi).  Such a time-of-flight sensor 6 makes it possible to measure, for each of a plurality of directions αi in the environment facing the vehicle V, the free distance di up to the first obstacle encountered in the direction αi concerned, generally by counting the time taken by a light ray (or, more generally, an electromagnetic wave) emitted in the direction αi concerned to be detected at the time-of-flight sensor 6 after reflection on the first obstacle encountered (at a point of impact Pi).
Du fait de la technique utilisée, les mesures de distance effectuées par le capteur de temps de vol 6 sont précises.  Due to the technique used, the distance measurements made by the flight time sensor 6 are accurate.
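As an illustration of the time-of-flight principle described above, the following minimal Python sketch (not part of the patent; the constant and numeric values are illustrative assumptions) converts a measured round-trip time into a free distance di:

```python
# Illustrative sketch (not from the patent): derive the free distance d_i for one
# direction alpha_i from the round-trip time of the emitted wave.
SPEED_OF_LIGHT = 299_792_458.0  # m/s, for a laser or infrared obstacle sensor

def free_distance_from_round_trip(round_trip_time_s: float) -> float:
    """Return the free distance d_i (metres) given the time elapsed between
    emission and detection of the reflected wave (out-and-back path, hence /2)."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# Example: an echo received after ~200 ns corresponds to roughly 30 m of free distance.
print(free_distance_from_round_trip(200e-9))  # ~29.98 m
```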
Dans l'exemple décrit ici, le capteur de temps de vol 6 est conçu pour mesurer les distances libres di respectivement associées à N directions αi (où i peut donc varier entre 1 et N, avec N compris entre 10 et 24, ici N = 16) ; ces N directions αi sont ici toutes horizontales (ou quasiment horizontales), et correspondent à une pluralité d'angles répartis sur une plage angulaire d'étendue inférieure à 45° située à l'avant du véhicule.  In the example described here, the time-of-flight sensor 6 is designed to measure the free distances di respectively associated with N directions αi (where i can therefore vary between 1 and N, with N between 10 and 24, here N = 16); these N directions are here all horizontal (or almost horizontal), and correspond to a plurality of angles distributed over an angular range of less than 45° located at the front of the vehicle.
L'actionneur 8 permet de commander au moins en partie la trajectoire du véhicule V. Dans l'exemple décrit ici, l'actionneur 8 est un système de commande de direction. Il pourrait s'agir en variante d'un groupe motopropulseur ou d'un système de freinage.  The actuator 8 makes it possible to control, at least in part, the trajectory of the vehicle V. In the example described here, the actuator 8 is a steering control system. As a variant, it could be a powertrain or a braking system.
Comme expliqué plus bas, l'unité de traitement 10 peut commander l'actionneur 8 au moyen d'un signal de commande CMD déterminé en fonction de divers traitements effectués au sein de l'unité de traitement 10.  As explained below, the processing unit 10 can control the actuator 8 by means of a control signal CMD determined according to various processes carried out within the processing unit 10.
L'unité de traitement 10 est par exemple conçue pour commander, au moyen de l'actionneur 8, une trajectoire consistant à suivre le second véhicule (premier objet O en figure 1 ), dans un but d'assistance à la conduite (cette trajectoire étant suivie en l'absence de commande différente du conducteur), ou dans un but de pilotage automatique du véhicule V (cas d'un véhicule autonome).  For example, the processing unit 10 is designed to control, by means of the actuator 8, a trajectory consisting in following the second vehicle (first object O in FIG. 1), with the aim of assisting the driving (this trajectory being followed in the absence of different control of the driver), or for the purpose of automatic control of the vehicle V (case of an autonomous vehicle).
La figure 2 représente l'unité de traitement 10 sous forme de blocs fonctionnels, ou modules.  FIG. 2 represents the processing unit 10 in the form of functional blocks, or modules.
Chaque module 12, 14, 16, 18 représenté sur la figure 2 correspond à une fonctionnalité particulière mise en œuvre par l'unité de traitement 10. Plusieurs (voire tous les) modules peuvent toutefois en pratique être mis en œuvre par une même entité physique, par exemple un processeur sur lequel s'exécutent des instructions de programme mémorisées dans une mémoire associée au processeur (chaque module étant alors dans ce cas mis en œuvre par l'exécution d'un jeu particulier d'instructions mémorisées dans ladite mémoire).  Each module 12, 14, 16, 18 represented in FIG. 2 corresponds to a particular functionality implemented by the processing unit 10. Several (or even all) modules may however in practice be implemented by the same physical entity, for example a processor executing program instructions stored in a memory associated with the processor (each module then being implemented by the execution of a particular set of instructions stored in said memory).
L'unité de traitement 10 comprend ainsi un module de regroupement 12 conçu pour regrouper (ou associer entre elles) une pluralité des directions d'observation αi du capteur de temps de vol 6, sur la base d'un critère utilisant les distances libres di mesurées par le capteur de temps de vol 6.  The processing unit 10 thus comprises a grouping module 12 designed to group (or associate with one another) a plurality of the observation directions αi of the time-of-flight sensor 6, on the basis of a criterion using the free distances di measured by the time-of-flight sensor 6.
On note dans la suite {αj}j∈S l'ensemble des directions ainsi regroupées (ou associées).  In the following, {αj}j∈S denotes the set of directions thus grouped (or associated).
Si on note vi(t) la vitesse relative au point d'impact Pi déterminée à l'instant t par dérivation de la distance libre di correspondante (c'est-à-dire vi(t) = [di(t) - di(t-Δt)]/Δt), deux directions αj, αj' sont par exemple regroupées (i.e. j∈S et j'∈S) si les vitesses vj(t), vj'(t) déterminées pour ces deux directions αj, αj' sont suffisamment proches (c'est-à-dire en pratique si on a : |vj(t) - vj'(t)| < ε, avec ε un seuil prédéterminé).  If vi(t) denotes the relative velocity at the point of impact Pi, determined at time t by differentiating the corresponding free distance di (that is to say vi(t) = [di(t) - di(t-Δt)]/Δt), two directions αj, αj' are for example grouped (i.e. j∈S and j'∈S) if the velocities vj(t), vj'(t) determined for these two directions αj, αj' are sufficiently close (that is to say, in practice, if |vj(t) - vj'(t)| < ε, with ε a predetermined threshold).
En variante, on pourrait regrouper les directions sur la base du critère de proximité des distances libres dj mesurées (à un instant donné) : deux directions αj, αj' sont dans ce cas regroupées si les distances libres dj, dj' pour ces deux directions αj, αj' sont suffisamment proches (c'est-à-dire en pratique si on a : |dj - dj'| < ε', avec ε' un seuil prédéterminé).  As a variant, the directions could be grouped on the basis of the proximity of the measured free distances dj (at a given instant): two directions αj, αj' are in this case grouped if the free distances dj, dj' for these two directions αj, αj' are sufficiently close (that is to say, in practice, if |dj - dj'| < ε', with ε' a predetermined threshold).
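To make the grouping criterion concrete, here is a hedged Python sketch (an illustration under stated assumptions, not the implementation disclosed in the patent): it groups adjacent measurement directions using the distance-proximity variant |dj - dj'| < ε' described just above; the velocity-based variant would only change the compared quantity. The greedy left-to-right strategy, the threshold value and all names are assumptions.

```python
# Hedged sketch of grouping module 12: cluster the N directions alpha_i into sets
# of directions whose free distances d_i are sufficiently close to one another.
from typing import List

def group_directions(free_distances: List[float], eps: float = 1.5) -> List[List[int]]:
    """Return lists of direction indices i (one list per group) such that consecutive
    directions within a group satisfy |d_j - d_j'| < eps (metres)."""
    groups: List[List[int]] = []
    current = [0]
    for i in range(1, len(free_distances)):
        if abs(free_distances[i] - free_distances[i - 1]) < eps:
            current.append(i)        # same obstacle: keep grouping
        else:
            groups.append(current)   # distance jump: start a new group
            current = [i]
    groups.append(current)
    return groups

# Example with N = 16 directions: two vehicles at ~25 m and ~40 m, free road elsewhere.
d = [80, 80, 25.2, 25.0, 24.9, 25.1, 80, 80, 40.3, 40.1, 39.9, 80, 80, 80, 80, 80]
print(group_directions(d))
# -> [[0, 1], [2, 3, 4, 5], [6, 7], [8, 9, 10], [11, 12, 13, 14, 15]]
```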
L'unité de traitement 10 comprend également un module de localisation 14 d'objet(s) O, O' au sein d'une région R de l'image IMG qui correspond auxdites directions regroupées {αj}j∈S.  The processing unit 10 also comprises a module 14 for locating object(s) O, O' within a region R of the image IMG that corresponds to said grouped directions {αj}j∈S.
Un tel module de localisation 14 est conçu pour identifier un ou plusieurs objet(s) O, O' (par exemple au moyen d'un algorithme de reconnaissance de forme) au sein de la région R correspondant aux directions regroupées {αj}j∈S et pour fournir des données L représentatives de la localisation (par exemple dans l'image IMG ou, en variante, dans la région R) du ou des objet(s) identifié(s) O, O'.  Such a location module 14 is designed to identify one or more object(s) O, O' (for example by means of a shape-recognition algorithm) within the region R corresponding to the grouped directions {αj}j∈S and to provide data L representative of the location (for example in the image IMG or, as a variant, in the region R) of the identified object(s) O, O'.
L'association entre les différentes directions possibles αi et la région R de l'image IMG associée à chacune d'elles est prédéfinie (d'après la position relative fixe du capteur d'image 4 et du capteur de temps de vol 6) et mémorisée par exemple dans une table de correspondance au sein de l'unité de traitement 10.  The association between the different possible directions αi and the region R of the image IMG associated with each of them is predefined (according to the fixed relative position of the image sensor 4 and the time-of-flight sensor 6) and stored, for example, in a correspondence table within the processing unit 10.
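The correspondence table between directions αi and image regions can be illustrated by the following sketch (an assumption-based illustration, not the patent's table): it maps each direction to an approximate image column under an ideal pinhole-camera model with a horizontally aligned camera, then takes the pixel span covered by the grouped directions as the region R in which the location module 14 would search. The resolution, field of view and function names are assumptions.

```python
# Hedged sketch: static mapping from a measurement direction alpha_i to an image column,
# derived from the fixed relative mounting of image sensor 4 and time-of-flight sensor 6.
import math

IMAGE_WIDTH_PX = 1280      # assumed camera resolution
CAMERA_HFOV_DEG = 90.0     # assumed horizontal field of view

def direction_to_column(alpha_deg: float) -> int:
    """Map a direction alpha_i (degrees, 0 = straight ahead) to an approximate
    image column, assuming an ideal pinhole camera aligned with the sensor."""
    focal_px = (IMAGE_WIDTH_PX / 2) / math.tan(math.radians(CAMERA_HFOV_DEG / 2))
    return int(IMAGE_WIDTH_PX / 2 + focal_px * math.tan(math.radians(alpha_deg)))

# Precomputed correspondence for some grouped directions; the region R is the pixel
# band they cover, inside which the object localisation (module 14) would run.
grouped_alphas_deg = [-4.0, -2.0, 0.0, 2.0]
columns = [direction_to_column(a) for a in grouped_alphas_deg]
region_R = (min(columns), max(columns))
print(region_R)
```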
On remarque ici que, bien que les directions regroupées correspondent en général (par construction) à un seul objet, l'identification (i.e. la détection) d'une pluralité d'objets O, O' au sein de la région correspondant aux directions regroupées {αj}j∈S est possible lorsque plusieurs objets O, O' sont proches et/ou ont un comportement (par exemple une vitesse) proche, comme schématiquement représenté en figure 1, et n'ont de ce fait pas pu être distingués au sein du module de regroupement 12.  It is noted here that, although the grouped directions generally correspond (by construction) to a single object, the identification (i.e. the detection) of a plurality of objects O, O' within the region corresponding to the grouped directions {αj}j∈S is possible when several objects O, O' are close to one another and/or have a close behaviour (for example a close speed), as schematically represented in FIG. 1, and therefore could not be distinguished within the grouping module 12.
L'unité de traitement 10 comprend un module de sélection 16 conçu pour sélectionner, pour chaque objet O, O' identifié dans la région R par le module de localisation 14, l'ensemble des directions {αk}k∈S' correspondant à l'objet concerné O, O' parmi l'ensemble des directions regroupées {αj}j∈S déterminé par le module de regroupement 12.  The processing unit 10 comprises a selection module 16 designed to select, for each object O, O' identified in the region R by the location module 14, the set of directions {αk}k∈S' corresponding to the object O, O' concerned from among the set of grouped directions {αj}j∈S determined by the grouping module 12.
Pour ce faire, pour chaque objet O, O', le module de sélection 16 détermine, parmi les directions regroupées, lesquelles correspondent aux données L représentatives de la localisation de l'objet O concerné. Comme déjà indiqué, l'association entre les différentes directions possibles αi et l'image IMG (les données L étant exprimées au sein de cette image IMG) est par exemple mémorisée dans une table de correspondance au sein de l'unité de traitement 10.  To do this, for each object O, O', the selection module 16 determines, among the grouped directions, those which correspond to the data L representative of the location of the object O concerned. As already indicated, the association between the different possible directions αi and the image IMG (the data L being expressed within this image IMG) is for example stored in a correspondence table within the processing unit 10.
Le module de sélection 16 peut ainsi fournir, pour chaque objet O, O', l'ensemble des directions sélectionnées {αk}k∈S' associées à cet objet O, O'.  The selection module 16 can thus provide, for each object O, O', the set of selected directions {αk}k∈S' associated with this object O, O'.
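A possible reading of this selection step is sketched below under stated assumptions (the bounding-box column test, the data layout and all names are illustrative, not taken from the patent): among the grouped directions, keep those whose image column falls inside the location data L returned for a given object.

```python
# Hedged sketch of selection module 16: from the grouped directions {alpha_j} (j in S),
# keep those whose image column lies within the bounding box L of one located object.
from typing import Dict, List, Tuple

def select_directions_for_object(
    grouped_indices: List[int],
    direction_to_col: Dict[int, int],     # correspondence table: direction index -> image column
    object_bbox_cols: Tuple[int, int],    # location data L: (left, right) columns of the object
) -> List[int]:
    """Return the subset S' of direction indices falling inside the object's image span."""
    left, right = object_bbox_cols
    return [i for i in grouped_indices if left <= direction_to_col[i] <= right]

# Example: directions 2..5 were grouped; the located object O spans columns 520..660.
table = {2: 480, 3: 540, 4: 600, 5: 655}
print(select_directions_for_object([2, 3, 4, 5], table, (520, 660)))  # -> [3, 4, 5]
```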
On obtient ainsi une cartographie de l'environnement situé à l'avant du véhicule V, basée sur les données fournies par le capteur de temps de vol 6, mais dont la robustesse est améliorée grâce à l'utilisation des données (image IMG) fournies par le capteur d'image 4.  This provides a map of the environment located at the front of the vehicle V, based on the data provided by the time of flight sensor 6, but whose robustness is improved thanks to the use of the data (IMG image) provided by the image sensor 4.
Dans l'exemple décrit ici, l'unité de traitement 10 comprend en outre un module de commande 18 conçu pour générer une commande de direction CMD destinée à l'actionneur 8 en fonction de la cartographie précitée, précisément en fonction des directions sélectionnées {αk}k∈S' relatives à l'objet O, c'est-à-dire ici le véhicule à suivre (second véhicule). Le module de commande 18 émet par exemple une commande CMD telle que, par action de l'actionneur 8 (ici un système de commande de direction), le véhicule V s'oriente vers une direction moyenne déterminée sur la base des directions sélectionnées précitées αk.  In the example described here, the processing unit 10 further comprises a control module 18 designed to generate a steering command CMD for the actuator 8 as a function of the aforementioned map, more precisely as a function of the selected directions {αk}k∈S' relating to the object O, that is to say, here, the vehicle to be followed (second vehicle). The control module 18 for example issues a command CMD such that, through the action of the actuator 8 (here a steering control system), the vehicle V steers towards a mean direction determined on the basis of the aforementioned selected directions αk.
Pour ce faire, on peut prévoir que le module de commande 18 distingue l'objet O (véhicule à suivre) d'un autre objet O' en choisissant l'objet pour lequel les directions sélectionnées {αk}k∈S' sont les plus proches de la direction d'avancement du véhicule V ou, en variante, en choisissant l'objet suivi à l'itération précédente (par exemple par comparaison des directions sélectionnées pour les différents objets à l'instant courant avec les directions sélectionnées pour le véhicule suivi à l'itération précédente).  To do this, provision may be made for the control module 18 to distinguish the object O (vehicle to be followed) from another object O' by choosing the object for which the selected directions {αk}k∈S' are closest to the direction of travel of the vehicle V or, as a variant, by choosing the object tracked at the previous iteration (for example by comparing the directions selected for the different objects at the current instant with the directions selected for the followed vehicle at the previous iteration).
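The steering behaviour described above can be illustrated by the following sketch; the proportional gain and the saturation limit are purely illustrative assumptions, the patent only stating that the vehicle is steered towards a mean direction determined from the selected directions αk.

```python
# Hedged sketch of control module 18: steer towards the mean of the directions
# selected for the followed object O (command CMD sent to actuator 8).
from typing import List

def steering_command(selected_alphas_deg: List[float], gain: float = 0.5,
                     max_cmd_deg: float = 10.0) -> float:
    """Return a steering command CMD (degrees) pointing the vehicle V towards
    the mean of the selected directions alpha_k."""
    if not selected_alphas_deg:
        return 0.0                                   # no tracked object: hold course
    mean_alpha = sum(selected_alphas_deg) / len(selected_alphas_deg)
    cmd = gain * mean_alpha                          # simple proportional steering (assumption)
    return max(-max_cmd_deg, min(max_cmd_deg, cmd))  # saturate the actuator request

print(steering_command([-2.0, 0.0, 2.0, 4.0]))  # object slightly to the right -> 0.5
```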

Claims

REVENDICATIONS
1. Système embarqué (2) comprenant un premier capteur (6) apte à mesurer une distance libre (di) pour diverses directions (αi) dans un environnement rencontré par un véhicule (V) et un second capteur (4) apte à générer une image (IMG) dudit environnement, caractérisé par : 1. On-board system (2) comprising a first sensor (6) capable of measuring a free distance (di) for various directions (αi) in an environment encountered by a vehicle (V) and a second sensor (4) capable of generating an image (IMG) of said environment, characterized by:
- un module de regroupement (12) d'une pluralité desdites directions (αi) sur la base des distances libres (di) mesurées pour les directions (αi) de ladite pluralité ; - a grouping module (12) of a plurality of said directions (αi) on the basis of the free distances (di) measured for the directions (αi) of said plurality;
- un module de localisation (14) d'au moins un objet (O, O') au sein d'une région (R) de l'image (IMG) correspondant auxdites directions (αi) de ladite pluralité ; et - a location module (14) of at least one object (O, O') within a region (R) of the image (IMG) corresponding to said directions (αi) of said plurality; and
- un module de sélection (16) des directions (αk) correspondant à la localisation (L) dudit objet (O ; O') parmi ladite pluralité de directions (αi). - a selection module (16) of directions (αk) corresponding to the location (L) of said object (O; O') among said plurality of directions (αi).
2. Système embarqué selon la revendication 1 , dans lequel le premier capteur (6) est un capteur de temps de vol. 2. On-board system according to claim 1, wherein the first sensor (6) is a time-of-flight sensor.
3. Système embarqué selon la revendication 2, dans lequel le premier capteur est un télémètre laser à balayage. 3. On-board system according to claim 2, wherein the first sensor is a scanning laser rangefinder.
4. Système embarqué selon la revendication 2, dans lequel le premier capteur (6) est un capteur d'obstacle. 4. On-board system according to claim 2, wherein the first sensor (6) is an obstacle sensor.
5. Système embarqué selon l'une des revendications 1 à 3, dans lequel le second capteur (4) est une caméra vidéo. 5. Embedded system according to one of claims 1 to 3, wherein the second sensor (4) is a video camera.
6. Système embarqué selon l'une des revendications 1 à 3, dans lequel le second capteur est un radar. 6. On-board system according to one of claims 1 to 3, in which the second sensor is a radar.
7. Système embarqué selon l'une des revendications 1 à 3, dans lequel le second capteur est un système à ultrasons. 7. On-board system according to one of claims 1 to 3, in which the second sensor is an ultrasonic system.
8. Système embarqué selon l'une des revendications 1 à 7, comprenant un module de commande (18) d'un actionneur (8) en fonction des directions sélectionnées (αk). 8. On-board system according to one of claims 1 to 7, comprising a control module (18) of an actuator (8) as a function of the selected directions (αk).
9. Système embarqué selon l'une des revendications 1 à 8, dans lequel l'actionneur (8) est un système de commande de direction. 9. On-board system according to one of claims 1 to 8, wherein the actuator (8) is a steering control system.
10. Procédé de détermination d'une position relative d'un objet (O ; O') par rapport à un véhicule (V), au moyen d'un système embarqué (2) comprenant un premier capteur (6) apte à mesurer une distance libre (di) pour diverses directions (αi) dans un environnement rencontré par le véhicule (V) et un second capteur (4) apte à générer une image (IMG) dudit environnement, caractérisé par les étapes suivantes : 10. Method for determining a relative position of an object (O; O') relative to a vehicle (V), by means of an on-board system (2) comprising a first sensor (6) capable of measuring a free distance (di) for various directions (αi) in an environment encountered by the vehicle (V) and a second sensor (4) capable of generating an image (IMG) of said environment, characterized by the following steps:
- regroupement d'une pluralité desdites directions (αi) sur la base des distances libres (di) mesurées pour les directions (αi) de ladite pluralité ; - grouping of a plurality of said directions (αi) on the basis of the free distances (di) measured for the directions (αi) of said plurality;
- localisation de l'objet (O ; O') au sein d'une région (R) de l'image (IMG) correspondant auxdites directions (αi) de ladite pluralité ; - location of the object (O; O') within a region (R) of the image (IMG) corresponding to said directions (αi) of said plurality;
- sélection des directions (αk) correspondant à la localisation (L) dudit objet (O ; O') parmi ladite pluralité de directions (αi). - selection of directions (αk) corresponding to the location (L) of said object (O; O') among said plurality of directions (αi).
PCT/EP2017/063725 2016-06-08 2017-06-06 On-board system and method for determining a relative position WO2017211837A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1655224A FR3052560B1 (en) 2016-06-08 2016-06-08 ONBOARD SYSTEM AND METHOD FOR DETERMINING A RELATIVE POSITION
FR1655224 2016-06-08

Publications (1)

Publication Number Publication Date
WO2017211837A1 true WO2017211837A1 (en) 2017-12-14

Family

ID=57190041

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/063725 WO2017211837A1 (en) 2016-06-08 2017-06-06 On-board system and method for determining a relative position

Country Status (2)

Country Link
FR (1) FR3052560B1 (en)
WO (1) WO2017211837A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6061015A (en) * 1998-01-14 2000-05-09 Honda Giken Kogyo Kabushiki Kaisha Vehicle obstacle detecting system
US20060125680A1 (en) * 2004-12-15 2006-06-15 Thackray Robert G Method and system for detecting an object using a composite evidence grid
US20140333468A1 (en) * 2013-05-07 2014-11-13 Google Inc. Methods and Systems for Detecting Weather Conditions Including Sunlight Using Vehicle Onboard Sensors

Also Published As

Publication number Publication date
FR3052560A1 (en) 2017-12-15
FR3052560B1 (en) 2019-08-16

Similar Documents

Publication Publication Date Title
US20220172487A1 (en) Method for detecting an operating capability of an environment sensor, control unit and vehicle
EP3749561A1 (en) System and method for detecting a risk of collision between a motor vehicle and a secondary object located in the traffic lanes adjacent to said vehicle when changing lanes
JP6740470B2 (en) Measuring device, measuring method and program
EP2043044B1 (en) Method and device for automobile parking assistance
WO2013169185A1 (en) A vision system and method for a motor vehicle
WO2017036927A1 (en) Vision system for a motor vehicle and method of controlling a vision system
JP2023099851A (en) Measurement device, method for measurement, and program
JP2023068009A (en) Map information creation method
FR3052560B1 (en) ONBOARD SYSTEM AND METHOD FOR DETERMINING A RELATIVE POSITION
WO2018041978A1 (en) Device for determining a speed limit, on-board system comprising such a device, and method for determining a speed limit
FR3085082A1 (en) ESTIMATION OF THE GEOGRAPHICAL POSITION OF A ROAD VEHICLE FOR PARTICIPATORY PRODUCTION OF ROAD DATABASES
JP7303365B2 (en) Sensor calibration based on string of detected values
FR3052568A1 (en) ONBOARD SYSTEM AND METHOD FOR DETERMINING A CAP
FR3061886A1 (en) METHOD FOR DETERMINING A WORK AREA END, WORK AREA MANAGEMENT DEVICE AND ON-BOARD SYSTEM COMPRISING SUCH A DEVICE
FR3062836A1 (en) METHOD AND SYSTEM FOR DETERMINING A TRUST INDEX ASSOCIATED WITH AN OBJECT OF AN ENVIRONMENT OF A VEHICLE
WO2019077010A1 (en) Data processing method and associated onboard system
FR3057693A1 (en) LOCATION DEVICE AND DEVICE FOR GENERATING INTEGRITY DATA
FR3082936A1 (en) STEREOSCOPIC PROCESSING OF CONNECTED VEHICLE DATA
US11815626B2 (en) Method for detecting intensity peaks of a specularly reflected light beam
US20220309776A1 (en) Method and system for determining ground level using an artificial neural network
WO2016146823A1 (en) Method for estimating geometric parameters representing the shape of a road, system for estimating such parameters and motor vehicle equipped with such a system
FR3107350A1 (en) METHODS AND SYSTEMS FOR REMOTE MEASUREMENT OF THE ANGULAR ORIENTATION OF AN OBJECT
FR3036498A1 (en) METHOD AND SYSTEM FOR ONLINE LOCATION OF A MOTOR VEHICLE
WO2022033902A1 (en) Method for aligning at least two images formed by three-dimensional points
WO2018212286A1 (en) Measurement device, measurement method and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17727607

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17727607

Country of ref document: EP

Kind code of ref document: A1