WO2017191212A1 - Method and an apparatus for objects recognition
- Publication number: WO2017191212A1
- Application: PCT/EP2017/060572
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- speed
- objects
- module
- detection means
- classification
- Prior art date
Classifications
- G01S13/931—Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
- G01S13/52—Discriminating between fixed and moving objects or between objects moving at different speeds
- G01S7/411—Identification of targets based on measurements of radar reflectivity
- G01S7/415—Identification of targets based on measurements of movement associated with the target
- B60W2300/17—Construction vehicles, e.g. graders, excavators
- B60W2554/00—Input parameters relating to objects
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
- G01S15/931—Sonar systems specially adapted for anti-collision purposes of land vehicles
- G01S2013/9317—Driving backwards
- G01S2013/932—Anti-collision systems using own vehicle data, e.g. ground speed, steering wheel direction
- G01S2013/9323—Alternative operation using light waves
- G01S2013/9324—Alternative operation using ultrasonic waves
- G01S2013/9327—Sensor installation details
- G01S2013/93272—Sensor installation details in the back of the vehicles
Definitions
- the apparatus 1 can include a setting module 65 connected to the classification module 64 and configured to enable the user to set said reference fixed object, i.e. to preliminary pinpoint the objects 43 which are to be considered as fixed.
- This information can be recorded and stored in a memory unit 66.
- a GPS navigation device, or the like, can be provided in the apparatus 1 so that the positions of the reference fixed objects 43 can be defined by respective coordinates.
- the processing unit 6 comprises a user interface 67 configured for displaying to a user of said vehicle 2 the objects 4, 41, 42 in said area of interest 5, associated with information regarding their respective classification as determined by the classification module 64.
- the user interface 67 can comprise a display and a keyboard or a touchscreen display or the like.
- an assessment module 68 can be provided in the processing unit 6, which is configured for enabling the user to confirm, reject or change said classification associated with the objects 4, 41, 42, by means of said user interface 67.
- the invention can either comprise a camera pointing at said area of interest 5 or an imaging module of the processing unit 6 configured for producing a visual rendering of the objects detected by the detection means 3, i.e. configured to translate e.g. sound waves into a visual rendering of those objects.
- the invention can also include both the camera and the imaging module or different means able to feed the interface with said representation of the objects comprised in the area of interest 5.
- the detection means 3 detects the objects 4, 41 , 42 in the area of interest 5, the classification of which is assessed by the classification module 64.
- a visual representation of the objects and their classification is provided to the user by means of the user interface 67, e.g. a display included in the cabin of the vehicle 2.
- the user watching the image of a detected object 4, 41 , 42 as provided by the interface 67 could also be able to look directly at the object 4, 41 , 42, e.g. through the windows of the cabin, and assess whether the displayed classification is correct or erroneous.
- the user can reject or even change it via the interface 67 itself; if the classification is correct, the user can accept it either by doing nothing or by positively confirming it, again via the interface 67.
- the invention also provides a method for objects recognition, comprising the following steps: providing a vehicle 2; detecting the positions of objects 4, 41, 42 in at least an area of interest 5; determining a speed parameter of each object 4, 41, 42 according to the position or positions detected; determining a size parameter for each object 4, 41, 42; and classifying different objects 4, 41, 42 according to the respective speed parameter and size parameter.
- the method can include the following steps:
Abstract
An apparatus (1) for objects recognition to be provided on a vehicle (2), comprising detection means (3) for detecting the positions of objects (4, 41, 42) in at least an area of interest (5) and at least a processing unit (6), connected to the detection means (3). The processing unit (6) comprises: a speed module (61) configured for determining a speed parameter for each object (4, 41, 42) according to its position or positions detected by said detection means (3); a size module (62) configured for determining a size parameter for each object (4, 41, 42), according to a signal produced by the detection means (3); and a classification module (64) configured for classifying different objects (4, 41, 42) according to the respective speed parameters and size parameters.
Description
METHOD AND AN APPARATUS FOR OBJECTS RECOGNITION
DESCRIPTION
The invention relates to a method and an apparatus for objects recognition to be provided on vehicles, particularly construction vehicles, such as excavators or the like, and agricultural vehicles, such as tractors, combines, etc.
In the automotive field, adaptive cruise control (ACC) systems have recently been introduced, which provide automatic braking or dynamic set-speed controls for cars and the like.
An ACC system uses, e.g., a laser setup to allow a car to keep pace with another car it is following, slowing when closing in and accelerating back to the preset speed when traffic allows.
Although this solution works fine in the automotive field, it has not been adopted in the field of construction or agricultural equipment, where there is a need for a system able to recognize objects, in order to control or adjust the driving of the vehicle, e.g. an excavator, according to the kind of objects it might run into.
By way of example, if an excavator moves in a construction site, especially driving in reverse, it might run into a pile of gravel, in which case there is no need to brake, since it can easily climb over it; however, if the excavator is about to cross the path of a pedestrian, it is imperative to take measures to avoid a collision.
It is an object of the present invention to provide an apparatus and a method for objects recognition able to satisfy the above-cited need.
This object is achieved by the apparatus realized in accordance with claim 1 and by the method realized according to claim 12.
Additional features and advantages of the present invention will be more apparent from the illustrative, and thus non-limiting, description of a preferred, but not exclusive, embodiment of the apparatus of the invention, as illustrated in the appended drawings, in which:
- figure 1 is a schematic view of a construction vehicle driving in reverse in a construction site, provided with the apparatus of the invention;
- figure 2 is a schematic view of a construction vehicle going towards a pile of gravel;
- figure 3 is a diagram representing the apparatus of the invention;
- figure 4 is a schematic classification chart representing a possible output of the classification module of the invention;
- figure 5 is a schematic view representing a construction vehicle, a stylized object and their speed and spatial relationship; and
- figure 6 is a diagram representing a way of determining whether a given object is standing still according to its speed and spatial relationships with two fixed objects.
With reference to the aforementioned figures, 1 indicates the apparatus for objects recognition according to the invention.
The apparatus 1 is intended to be provided on a vehicle 2, especially construction vehicles, like excavators or the like and agricultural vehicles, such as tractors, combines, etc.
The apparatus 1 includes detection means 3 to be placed on board the vehicle 2, for example provided at its back portion, which detection means 3 are able to detect the positions of objects 4, 41, 42 in at least an area of interest 5.
Preferably, the area of interest 5 is a portion of the overall zone surrounding the vehicle 2, such as the area where the vehicle 2 can go when driving in reverse (see figure 1).
In a preferred embodiment, the detection means comprise at least one echo device, such as a radar device 3, able to determine the position of the objects.
However, the detection means 3 can also, or instead, include an optical device, e.g. a laser device or the like, or an ultrasound device, etc.
The apparatus 1 also includes a processing unit 6, connected to the detection means 3 and comprising a plurality of operative modules and at least a memory module.
Please note that, in the present description, the processing unit 6 is presented as articulated into distinct operative modules in order to describe it in a clear and complete way.
In practice, the processing unit 6 may be constituted by a single electronic device, also of the type commonly present on this type of machines (like an ECU), programmed to perform the functionalities described.
Different modules can correspond to respective hardware entities and / or software routines that are part of the programmed device.
Alternatively or in addition, such features can be carried out by a plurality of electronic devices on which the aforesaid operative modules are included.
In general, the processing unit 6 may use one or more microprocessors for the execution of instructions contained in memory modules, and the above operative modules can also be distributed over a plurality of computers, local or remote, according to the network architecture in which they are provided.
The processing unit 6 of the invention comprises a speed module 61 configured for determining a speed parameter for each object 4, 41, 42, according to the position or positions of the latter as detected by said detection means 3.
The speed parameter can be a function of, or be equal to, the relative speed between the apparatus 1 and a given object 4, 41, 42, according to the changes in their positions.
In detail, the speed parameter determined by the speed module 61 can be an average or approximate value and may be calculated according to the distances between the successive positions through which the objects 4, 41, 42 moved per unit of time, using e.g. Newtonian mechanics.
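Purely as a non-limiting illustration of the position-based computation just described (the function name, the planar coordinates and the fixed sampling interval `dt` are assumptions of this sketch, not features of the patent), the speed module 61 could proceed as follows:

```python
import math

def speed_from_positions(positions, dt):
    """Average speed of an object from successive (x, y) detections.

    positions: list of (x, y) tuples detected at a fixed interval dt (s).
    Returns the mean speed in the same length unit per second.
    """
    if len(positions) < 2:
        return 0.0
    # sum the straight-line distances between consecutive detections
    travelled = sum(
        math.hypot(x1 - x0, y1 - y0)
        for (x0, y0), (x1, y1) in zip(positions, positions[1:])
    )
    # distance travelled divided by total elapsed time
    return travelled / (dt * (len(positions) - 1))
```

An object detected at (0, 0), (3, 4) and (6, 8) metres at one-second intervals would thus be assigned an average speed of 5 m/s.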
Also, in the case of detection means 3 using a wave-transmitting technology, the speed module 61 can calculate the speed of the object according to the Doppler effect.
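The Doppler alternative can be illustrated with the standard two-way (monostatic) radar relation v = f_d·c/(2·f0); the sketch and the 77 GHz carrier frequency used below are illustrative assumptions, not taken from the patent:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def radial_speed_from_doppler(doppler_shift_hz, carrier_hz):
    """Relative radial speed from the Doppler shift of a radar echo.

    Two-way radar relation: v = f_d * c / (2 * f0).
    The sign of the result follows the sign of the measured shift.
    """
    return doppler_shift_hz * C / (2.0 * carrier_hz)
```

For an assumed 77 GHz automotive radar, a 1 kHz Doppler shift corresponds to roughly 1.95 m/s of relative radial speed.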
As will be clear in a following section, by means of the speed module 61, the apparatus 1 can acquire information that can be used to determine whether a given object is standing still, like e.g. a post, a pile of gravel 42, a building, etc., or is moving/movable.
The processing unit 6 of the apparatus 1 also includes a size module 62 configured for determining a size parameter for each object 4, 41, 42, according to a signal produced by the detection means 3.
In fact, the detection means 3 produce an electronic output signal according to the detection performed, which is sent to the processing unit 6, where it is processed by the size module 62.
In a preferred embodiment, where said echo device 3 of the detection means is used, said size parameter is calculated according to the magnitude of the echo signals reflected by each object 4, 41, 42 and received by the device 3.
More preferably, the unit 6 also comprises a distance module 63 connected to the detection device 3 and configured to calculate the distances between each object 4, 41, 42 and the apparatus 1 (so, roughly, the position of the vehicle 2).
Most preferably, the distance module 63 is configured for calculating the distances of the objects along their radial directions passing through the detection means 3; in the following, this distance value will be named "radial distance".
More in general, the distance module 63 can be configured to calculate the distance between the detected positions of the objects 4, 41, 42 and a reference point in space; in the above-mentioned preferred case, the reference point is the position of the apparatus 1.
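A minimal sketch of this distance computation, assuming a planar sensor frame with the apparatus at the origin (the names and the coordinate frame are illustrative, not specified by the patent):

```python
import math

def radial_distance(obj_pos, ref_pos=(0.0, 0.0)):
    """Distance between a detected object position and a reference point.

    By default the reference point is the position of the apparatus,
    taken here as the origin of the sensor coordinate frame.
    """
    return math.hypot(obj_pos[0] - ref_pos[0], obj_pos[1] - ref_pos[1])
```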
According to an important feature of the invention, the processing unit 6 further includes a classification module 64 configured to classify different objects 4, 41, 42 according to the respective speed and size parameters.
In fact, the Applicant discovered that a realistic assessment of the type of objects a vehicle 2 has to deal with on a work site is achievable by knowing their sizes and speeds.
By way of example, if an object stands still and is very large, it might be a pile of gravel 42 or even a wall; if it is very small and moving slowly, it can be a pedestrian 41; while if it is small or medium sized and moving, it can be another vehicle, etc.
More in detail, said size parameter can be calculated according to a relation between said magnitude and said relative distance of the object.
In detail, said relation can be the ratio between the magnitude and the relative distance or a different mathematical relation.
More preferably, the size module 62 is configured for determining the size parameter by calculating the ratio between the magnitude of said output signal and the square of the detected distance of a given object, particularly said radial distance.
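This preferred size parameter can be sketched directly from the text (echo magnitude divided by the square of the radial distance); the function name and units are illustrative assumptions:

```python
def size_parameter(echo_magnitude, radial_distance_m):
    """Size parameter as the ratio between the magnitude of the echo
    signal and the square of the detected (radial) distance, as stated
    in the text."""
    return echo_magnitude / (radial_distance_m ** 2)
```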
Thus, by means of trials and in-field tests, it is possible to determine for each type of object corresponding ranges in speed and size parameters and accordingly program the classification module 64, so that an automatic classification of the objects can be done during the use of the vehicle 2 (see figure 4).
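Such a calibrated classification could be sketched as a simple range lookup; the numeric ranges below are invented placeholders standing in for the values that the trials and in-field tests would provide:

```python
def classify(speed, size, rules):
    """Return the first class whose calibrated (speed, size) ranges
    contain the measured parameters, or "unknown" if none match."""
    for label, ((s_min, s_max), (z_min, z_max)) in rules.items():
        if s_min <= speed <= s_max and z_min <= size <= z_max:
            return label
    return "unknown"

# Placeholder calibration: (speed range in m/s, size-parameter range)
RULES = {
    "pile/wall":  ((0.0, 0.2), (50.0, 1e9)),   # still and very large
    "pedestrian": ((0.2, 3.0), (0.0, 5.0)),    # small and moving slowly
    "vehicle":    ((0.2, 30.0), (5.0, 50.0)),  # small/medium and moving
}
```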
In a possible embodiment, the speed module 61 is configured for determining said speed parameter according to a speed component S along a radial direction R of each object 4 passing through the detection means 3.
In this case, said radial direction R can be determined according to an azimuth angle A between each object and a reference direction D, the latter being fixed with respect to the detection means 3, with said azimuth angle A determined according to the positions detected by the detection means 3.
Said reference direction D is preferably the axial direction of the vehicle 2 or is parallel to the axial direction and passes through the apparatus 1.
Preferably the speed parameter in the radial direction is the relative speed S between the vehicle 2 and the object or is a function of that relative speed S (see figure 5).
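Purely as an illustrative sketch, the relative radial speed and the azimuth angle might be estimated from two successive detected positions as follows; the sensor-frame convention (origin at the detection means, x axis along the reference direction D) is an assumption for the example:

```python
import math

def radial_speed(p1, p2, dt):
    """Relative radial speed estimated from two successive (x, y)
    positions of the same object in the sensor frame; positive when
    the object moves away from the detection means."""
    r1 = math.hypot(p1[0], p1[1])
    r2 = math.hypot(p2[0], p2[1])
    return (r2 - r1) / dt

def azimuth(p):
    """Azimuth angle A (radians) between the object and the reference
    direction D, taken here as the x axis of the sensor frame."""
    return math.atan2(p[1], p[0])
```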
This relative "radial speed" S between the vehicle 2 and the object can be used by the classification module 64 to determine whether an object 4 is standing still or not.
In fact, the classification module 64 can be configured for comparing said relative radial speed S with the speed component V of the vehicle 2 in said radial direction R (i.e. the radial speed of the vehicle 2), calculated by means of e.g. a known speed sensor usually provided on the vehicle 2.
For example, if the radial speed V of the vehicle 2 is different from zero and the relative radial speed S is equal to the vehicle radial speed V, then it is reasonable to assume that the object is standing still. Whereas, if the radial speed V of the vehicle 2 is different from zero and the relative radial speed S is different from the vehicle radial speed V, then it is reasonable to assume that the object is moving.
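This comparison can be sketched as follows; the tolerance for sensor noise and the choice to report "not standing still" when the test is inconclusive are illustrative assumptions:

```python
def is_standing_still(relative_radial_speed: float,
                      vehicle_radial_speed: float,
                      tolerance: float = 0.1) -> bool:
    """If the vehicle's radial speed V is non-zero and the measured
    relative radial speed S matches it within a tolerance, the object
    is presumably stationary. Tolerance value is illustrative."""
    if abs(vehicle_radial_speed) < tolerance:
        # Vehicle not moving radially: the comparison is inconclusive.
        return False
    return abs(relative_radial_speed - vehicle_radial_speed) < tolerance
```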
In another possible embodiment, the classification module 64 is configured for determining whether an object 44 is standing still according to its speed relative to a plurality of objects 43 being fixed to each other (see figure 6).
In detail, as shown in figure 6, taking into consideration (at least) two objects 43 known to be standing still, and considering the position of one of these standing-still objects 43 as the origin of a coordinate system, any further object 44 (a third object, for instance) can be assumed to be standing still if its speed relative to the two reference standing-still objects 43 equals zero.
That further object 44 then itself becomes a reference standing-still object, and so on.
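One possible sketch of this check is given below, using the first reference object as the origin of the coordinate system (the description uses at least two reference objects; a second would fix the frame orientation and is omitted here for brevity). The track format and tolerance are assumptions for the example:

```python
import math

def is_standing_still_wrt_references(track, ref_tracks, dt, tol=0.05):
    """`track` and each entry of `ref_tracks` are pairs of (x, y)
    positions of one object at times t and t + dt, in the sensor frame.
    The candidate is assumed to be standing still if its position,
    expressed relative to a reference standing-still object, does not
    change between the two frames (i.e. its relative speed is zero)."""
    (ox1, oy1), (ox2, oy2) = ref_tracks[0]   # origin reference object
    (x1, y1), (x2, y2) = track
    rel1 = (x1 - ox1, y1 - oy1)
    rel2 = (x2 - ox2, y2 - oy2)
    speed = math.hypot(rel2[0] - rel1[0], rel2[1] - rel1[1]) / dt
    return speed < tol
```

For instance, if the vehicle advances so that every world-fixed point shifts by the same amount in the sensor frame, a stationary candidate keeps a constant position relative to the references, while a moving one does not.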
In the same embodiment, a variation is possible in which the classification module 64 is configured for using a combination of the relative speed S, the vehicle radial speed V and the size parameter.
Further, the apparatus 1 can include a setting module 65 connected to the classification module 64 and configured to enable the user to set said reference fixed objects, i.e. to preliminarily pinpoint the objects 43 which are to be considered as fixed.
This information can be recorded and stored in a memory unit 66.
In this case, also a GPS navigation device, or the like, can be provided in the apparatus 1 so that the positions of the reference fixed objects 43 can be defined by respective coordinates.
In a possible embodiment of the invention, the processing unit 6 comprises a user interface 67 configured for displaying to a user of said vehicle 2 the objects 4, 41, 42 in said area of interest 5, together with information relating to the respective classification as determined by the classification module 64.
The user interface 67 can comprise a display and a keyboard or a touchscreen display or the like.
In this embodiment, an assessment module 68 can be provided in the processing unit 6, configured for enabling the user to confirm, reject or change said classification associated to the objects 4, 41, 42 by means of said user interface 67. In order to provide the user with a representation of the objects 4, 41, 42 via the interface 67, the invention can comprise either a camera pointing at said area of interest 5 or an imaging module of the processing unit 6 configured for producing a visual rendering of the objects detected by the detection means 3, i.e. configured to translate e.g. sound waves into a visual rendering of those objects.
The invention can also include both the camera and the imaging module or different means able to feed the interface with said representation of the objects comprised in the area of interest 5.
Basically, when the user drives the vehicle 2, the detection means 3 detects the objects 4, 41, 42 in the area of interest 5, the classification of which is assessed by the classification module 64.
A visual representation of the objects and their classification is provided to the user by means of the user interface 67, e.g. a display included in the cabin of the vehicle 2.
The user, watching the image of a detected object 4, 41, 42 as provided by the interface 67, can also look directly at the object 4, 41, 42, e.g. through the windows of the cabin, and assess whether the displayed classification is correct or erroneous.
If the classification is erroneous, the user can reject or even change it via the interface 67 itself; if the classification is correct, the user can accept it either by doing nothing or by positively confirming it, again via the interface 67.
In this way, the situation is avoided in which a user who has once been warned of a danger because of an erroneous classification of a given object would be warned again every time the vehicle 2 comes near that object, i.e. every time the object is in said area of interest 5 as defined above.
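A minimal sketch of this correction memory is given below; the idea of keying the stored user corrections by a coarse object position, and the grid size used, are illustrative assumptions, not details from the description:

```python
class ClassificationOverrides:
    """Stores user corrections keyed by a coarse (gridded) object
    position, so that a previously rejected or corrected classification
    is reused instead of warning the user again on every approach."""

    def __init__(self, grid: float = 1.0):
        self.grid = grid        # cell size used to match positions
        self.overrides = {}

    def _key(self, position):
        return (round(position[0] / self.grid), round(position[1] / self.grid))

    def record(self, position, user_class: str):
        """Remember the classification confirmed or entered by the user."""
        self.overrides[self._key(position)] = user_class

    def resolve(self, position, auto_class: str) -> str:
        """Prefer a stored user correction over the automatic class."""
        return self.overrides.get(self._key(position), auto_class)
```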
The invention also provides a method for objects recognition, comprising the following steps: providing a vehicle 2; detecting the positions of objects 4, 41, 42 in at least an area of interest 5; determining a speed parameter of each object 4, 41, 42 according to the position or positions detected; determining a size parameter for each object 4, 41, 42; and classifying different objects 4, 41, 42 according to the respective speed parameter and size parameter.
Further optional steps of the proposed method correspond to actions performed by the modules of the processing unit 6 as described above in detail.
In detail, in a particular embodiment, the method can include the following steps:
- displaying to a user of said vehicle (2) the objects (4, 41, 42) detected in said area of interest (5), together with information relating to the respective classification; and
- confirming, rejecting or changing the classification associated to the objects (4, 41, 42).
Claims
1. Apparatus (1) for objects recognition to be provided on a vehicle (2), comprising detection means (3) for detecting the positions of objects (4, 41, 42) in at least an area of interest (5) and at least a processing unit (6) connected to the detection means (3), characterized in that the processing unit (6) comprises: a speed module (61) configured for determining a speed parameter for each object (4, 41, 42) according to its position or positions detected by said detection means (3); a size module (62) configured for determining a size parameter for each object (4, 41, 42) according to a signal produced by the detection means (3); and a classification module (64) configured for classifying different objects (4, 41, 42) according to the respective speed parameters and size parameters.
2. Apparatus (1) according to claim 1, wherein said detection means comprise at least an echo device (3).
3. Apparatus (1) according to claim 2, wherein said detection means comprise a radar device (3).
4. Apparatus (1) according to claim 2 or claim 3, wherein said size parameter is calculated according to the magnitude of echo signals reflected by each object (4, 41, 42) and received by the detection means (3).
5. Apparatus (1) according to at least one of the preceding claims, comprising a distance module (63) configured to calculate the distances between each object (4, 41, 42) and the apparatus (1).
6. Apparatus (1) according to the preceding claim, wherein said size module (62) is configured to calculate the size parameter according to a relation between said magnitude and said distance of the respective object (4, 41, 42).
7. Apparatus (1) according to the preceding claim, wherein the size module (62) is configured to determine the size parameter by calculating the ratio between said magnitude and the square of the distance of the respective object (4, 41, 42).
8. Apparatus (1) according to at least one of the preceding claims, wherein said speed module (61) is configured to determine the relative speed between different objects (4, 41, 42) and/or the apparatus (1) itself.
9. Apparatus (1) according to the preceding claim, wherein said speed module (61) is configured for determining said relative speed along radial directions of each object, said directions passing through the detection means (3).
10. Apparatus (1) according to the preceding claim, wherein said classification module (64) is configured to determine whether an object (4) is standing still or not according to said relative speed in said radial direction.
11. Apparatus (1) according to at least one of the preceding claims, wherein said classification module (64) is configured to determine whether an object (44) is standing still or not according to its speed relative to at least another standing-still object (43).
12. Apparatus (1) according to claim 10, wherein said classification module (64) is configured to classify the object (4) using a combination of the relative speed (S), the vehicle radial speed (V) and the size parameter.
13. Apparatus according to at least one of the preceding claims, in which the processing unit (6) comprises: a user interface (67) configured for displaying to a user of said vehicle (2) said objects (4, 41, 42) detected in said area of interest (5), together with information relating to the respective classification as determined by the classification module (64); and an assessment module configured for enabling the user to confirm, reject or change said classification associated to the objects (4, 41, 42) by means of said interface.
14. Method for objects recognition, comprising the steps of: providing a vehicle (2); detecting the positions of objects (4, 41, 42) in at least an area of interest (5);
determining a speed parameter of each object according to its position or positions detected; determining a size parameter for each object (4, 41, 42); and classifying different objects (4, 41, 42) according to the respective speed parameter and size parameter.
15. Method according to the preceding claim, comprising the following steps: displaying to a user of said vehicle (2) the objects (4, 41, 42) detected in said area of interest (5), together with information relating to the respective classification; and confirming, rejecting or changing the classification associated to the objects (4, 41, 42).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP17720164.7A EP3452353A1 (en) | 2016-05-06 | 2017-05-03 | Method and an apparatus objects recognition |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
ITUA2016A003203A ITUA20163203A1 (en) | 2016-05-06 | 2016-05-06 | Method and apparatus for object recognition. |
ITUA2016A003203 | 2016-05-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017191212A1 true WO2017191212A1 (en) | 2017-11-09 |
Family
ID=56801741
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2017/060572 WO2017191212A1 (en) | 2016-05-06 | 2017-05-03 | Method and an apparatus objects recognition |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3452353A1 (en) |
IT (1) | ITUA20163203A1 (en) |
WO (1) | WO2017191212A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2595747B (en) * | 2020-06-02 | 2024-03-27 | Hastec Rail Ltd | Anti-collision apparatus for on track plant |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050232491A1 (en) * | 2004-03-02 | 2005-10-20 | Peng Chang | Method and apparatus for differentiating pedestrians, vehicles, and other objects |
US20100054540A1 (en) * | 2008-08-28 | 2010-03-04 | Lisa Marie Brown | Calibration of Video Object Classification |
US20150293216A1 (en) * | 2014-04-15 | 2015-10-15 | GM Global Technology Operations LLC | Method and system for detecting, tracking and estimating stationary roadside objects |
US20150336274A1 (en) * | 2014-05-20 | 2015-11-26 | International Business Machines Corporation | Information Technology Asset Type Identification Using a Mobile Vision-Enabled Robot |
US20150336575A1 (en) * | 2014-05-21 | 2015-11-26 | GM Global Technology Operations LLC | Collision avoidance with static targets in narrow spaces |
Worldwide applications
2016-05-06: IT ITUA2016A003203A (status unknown)
2017-05-03: WO PCT/EP2017/060572 (status unknown)
2017-05-03: EP EP17720164.7A (withdrawn)
Also Published As
Publication number | Publication date |
---|---|
EP3452353A1 (en) | 2019-03-13 |
ITUA20163203A1 (en) | 2017-11-06 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
NENP | Non-entry into the national phase |
Ref country code: DE |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17720164 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2017720164 Country of ref document: EP Effective date: 20181206 |