EP4019457A1 - Collision avoidance safety system for a vehicle for transporting and/or lifting loads - Google Patents
Collision avoidance safety system for a vehicle for transporting and/or lifting loads
- Publication number
- EP4019457A1 (application EP21212859.9A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- collision avoidance
- safety system
- image stream
- central control
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B66—HOISTING; LIFTING; HAULING
- B66F—HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
- B66F9/00—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
- B66F9/06—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
- B66F9/075—Constructional features or details
- B66F9/0755—Position control; Position detectors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B66—HOISTING; LIFTING; HAULING
- B66F—HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
- B66F17/00—Safety devices, e.g. for limiting or indicating lifting force
- B66F17/003—Safety devices, e.g. for limiting or indicating lifting force for fork-lift trucks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B66—HOISTING; LIFTING; HAULING
- B66F—HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
- B66F9/00—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
- B66F9/06—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
- B66F9/075—Constructional features or details
- B66F9/20—Means for actuating or controlling masts, platforms, or forks
- B66F9/24—Electrical devices or systems
Abstract
- A collision avoidance safety system (100) for a vehicle (1) for transporting and/or lifting loads comprises a central control unit (10) and at least one image acquisition and processing peripheral unit (20) operatively connected to the central control unit (10), wherein the image acquisition and processing peripheral unit (20) comprises:
- an image sensor (21);
- an image processing unit (22) operatively connected to the image sensor (21);
- a communication interface (23) operatively connected to the image processing unit (22);
wherein the image acquisition and processing peripheral unit (20) is adapted and configured to:
- acquire an image stream through the image sensor (21) and process said acquired image stream through the image processing unit (22) to produce a processed image stream;
- transmit the processed image stream to the central control unit (10) through the communication interface (23);
wherein the image acquisition and processing peripheral unit (20) is further adapted and configured to:
- identify at least one visible person or object (30) in the acquired image stream either transiting or arranged within the at least one detection area;
- produce the processed image stream, wherein each image of the processed image stream is obtained by superimposing on a respective image of the acquired image stream a morphological or geometric box or outline (31) which identifies said person or said object (30) within said image;
wherein the central control unit (10) comprises a screen (11) to display the processed image stream.
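The acquire → identify → overlay → display pipeline summarized above can be sketched end to end. This is an illustrative sketch only: the detector below is a stand-in stub (the actual system runs a neural-network classifier on the peripheral unit), and all function and field names are assumptions, not taken from the patent.

```python
def detect(image):
    """Stand-in for the identification step: returns a list of
    (label, confidence, box) tuples for persons/objects found in the
    detection area. A real peripheral unit would run a DNN here."""
    return [("person", 0.91, (40, 60, 120, 300))]

def overlay_boxes(image, detections):
    """Produce one frame of the processed stream: the original image
    plus the superimposed boxes/outlines (31) with their data."""
    return {"image": image, "boxes": detections}

def process_stream(acquired_stream):
    """Turn the acquired image stream into the processed image stream
    that the peripheral unit transmits to the central control unit."""
    return [overlay_boxes(img, detect(img)) for img in acquired_stream]

frames = process_stream(["frame0", "frame1"])
assert len(frames) == 2
assert frames[0]["boxes"][0][0] == "person"
```

The point of the split is visible even in this toy form: everything computationally heavy lives in `detect`, on the peripheral unit, while the central unit only receives and displays the already-annotated frames.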
Description
- The present description relates to the technical field of collision avoidance safety systems, and in particular relates to a collision avoidance safety system for vehicles for lifting and/or transporting loads, in particular industrial loads. In particular, the invention relates to a collision avoidance safety system applicable to logistics and/or industrial vehicles which, due to their structure (e.g., vehicles comprising a forklift mast), have blind spots that prevent the driver from having an unobstructed view.
- The handling of loads in ports, steelworks, industrial, logistic and commercial environments requires the vehicles for lifting and transporting loads to be driven safely to guarantee the protection of the health of the operators, or in general of the people operating or transiting through a concerned area. For this purpose, it is necessary to prevent or avoid collisions between lifting and/or transporting vehicles and persons or objects or other vehicles present or transiting through the maneuvering area.
- Companies in the sector invest heavily in the training and monitoring of the operators in charge of driving these vehicles, e.g. by subjecting operators to medical checks, providing training courses, or requiring these vehicles to be driven by operators holding special licenses.
- From a technological point of view, it is known to equip these vehicles with safety devices of various kinds, e.g., acoustic or luminous warnings, such as buzzers, sirens, flashing lights.
- The expedients adopted so far, while reducing to some extent the risk of collisions or accidents in general, do not however guarantee a satisfactory level of safety.
- It is the object of the present invention to provide a collision avoidance safety system for a vehicle for transporting and/or lifting loads, which is capable of further reducing the risk of accidents or collisions and thus increasing safety performance.
- This and other objects are achieved by a collision avoidance safety system for a vehicle for transporting and/or lifting loads as defined in claim 1 in its most general form, and in particular embodiments thereof as defined in the dependent claims.
- A load lifting and/or transport vehicle equipped with the aforesaid collision avoidance safety system is also the subject of this description.
- The invention will become more apparent from the following detailed description of the embodiments thereof, given by way of non-limiting examples, with reference to the accompanying drawings, in which:
- figure 1 shows a schematic side view of a non-limiting example of an embodiment of a vehicle for transporting and/or lifting loads on which a collision avoidance safety system is installed;
- figure 2 shows a functional block chart of a non-limiting example of an embodiment of the collision avoidance safety system in figure 1.
- Equal or similar elements will be indicated by the same reference numbers in the accompanying drawings.
- With reference to the accompanying figures, 1 globally indicates a vehicle for lifting and/or transporting loads. The teachings of the present description apply in particular to self-propelled vehicles which include a driver's station or cab intended to accommodate a driver of the vehicle, particularly wheeled vehicles or tracked vehicles.
- In the particular example shown, without because of this introducing any limitation, the vehicle 1 is a vehicle for lifting and transporting loads (in short, "the vehicle 1"), namely a self-propelled wheeled vehicle, and specifically a forklift truck. As explained above, the invention relates in general to a collision avoidance safety system applicable to logistics and/or industrial vehicles which, due to their structure (e.g., vehicles comprising a forklift mast), have blind spots that prevent the driver from having an unobstructed view. Therefore, vehicles such as trucks, vans, and trailers are also included.
- The vehicle 1 comprises a chassis 2, a pair of front wheels 3, and a pair of rear wheels 4. At least one of the two pairs of wheels 3, 4 is operatively coupled to a traction motor, e.g. electric, hybrid, or heat, not shown in the figures.
- A driver's station 5, e.g. a driver's cab, is defined either in the chassis 2 or on the chassis 2 and comprises, for example, a driver's seat 6, a steering handle 7, and one or more control devices 8, such as hand levers or pedals.
- The vehicle 1 comprises a load lifting and transporting member, which in the non-limiting example of figure 1 comprises a fork 9. In a possible embodiment variant, the load lifting and transporting member could comprise a movable lifting arm, e.g. either rigid or articulated.
- The vehicle 1 comprises a collision avoidance safety system 100 comprising a central control unit 10 and at least one peripheral image acquisition and processing unit 20 operatively connected to the central control unit 10.
- The collision avoidance safety system 100 is installed/installable on board the vehicle 1. In particular, the central control unit 10 and the at least one image acquisition and processing peripheral unit 20 are installed/installable on board the vehicle 1. In particular, they are either integrated into the vehicle or installed or installable by mechanical coupling means, such as magnets, belts, adhesive means, or reversible or irreversible screw or bolt systems. In the particular example shown, the central control unit 10 is installed in the driver's station 5, in particular in the driver's cab. The at least one imaging peripheral unit 20 is installed on the chassis 2, in particular on a rear portion of the chassis 2.
- According to a particularly advantageous embodiment, the central control unit 10 and the peripheral image acquisition and processing unit 20 are powered independently of each other. For example, each has an internal battery and/or each has a dedicated connection to a battery provided in the vehicle 1.
- In the non-limiting example shown in the figures, the collision avoidance safety system 100 comprises, without because of this introducing any limitation, two peripheral image acquisition and processing units 20, one of which faces in the opposite direction with respect to the driver's station 5 to monitor a first area arranged behind the vehicle 1, and the other faces sideways to monitor a second area arranged laterally relative to the vehicle 1. For example, a third peripheral image acquisition and processing unit, not shown in the figures, may be arranged to monitor a third area arranged opposite to the second area.
- The image acquisition and processing peripheral unit 20 comprises:
- an image sensor 21;
- an image processing unit 22 operatively connected to the image sensor 21.
- The image sensor 21 is, for example, a camera or video camera adapted to acquire images in the visible spectrum and/or the infrared spectrum.
- The image acquisition and processing peripheral unit 20 further comprises a communication interface 23 operatively connected to the image processing unit 22. According to an advantageous embodiment, the communication interface 23 is a radio communication interface. Preferably, the communication interface 23 is a short-range communication interface, e.g. a WiFi or Bluetooth communication interface.
- The image acquisition and processing peripheral unit 20 is adapted and configured, and in particular is programmed, to:
- acquire an image stream through the image sensor 21 and process the acquired image stream through the image processing unit 22 to produce a processed image stream; and
- transmit the processed image stream to the central control unit 10 through the communication interface 23.
- Preferably, the processed image stream comprises a video sequence having ten or about ten images per second, or having at least ten images per second.
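The transmit step above, combined with the compression of the processed stream described further below, can be sketched as a simple length-prefixed framing scheme over the radio link. This is a hypothetical illustration, not the patented implementation: the wire format and the use of zlib are assumptions.

```python
import struct
import zlib

def pack_frame(frame_bytes: bytes) -> bytes:
    """Compress one processed image and prefix it with its length.

    Hypothetical wire format: a 4-byte big-endian length, then the
    zlib-compressed payload. The real system's protocol is not
    specified in the patent text."""
    payload = zlib.compress(frame_bytes)
    return struct.pack(">I", len(payload)) + payload

def unpack_frame(data: bytes) -> bytes:
    """Inverse of pack_frame: read the length prefix and decompress."""
    (length,) = struct.unpack(">I", data[:4])
    return zlib.decompress(data[4:4 + length])

# Round trip: a fake 'image' survives compression and framing intact.
image = bytes(range(256)) * 100
assert unpack_frame(pack_frame(image)) == image
```

Compressing before transmission matters here because a WiFi or Bluetooth link shared by several peripheral units must carry at least ten annotated frames per second each.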
- The image acquisition and processing peripheral unit 20 is further adapted and configured to:
- identify at least one visible person and/or object 30 in the acquired image stream either transiting or arranged within the at least one detection area;
- produce the processed image stream, wherein each image of the processed image stream is obtained by superimposing, on a respective image of the acquired image stream, a morphological or geometric box or outline 31 which identifies said person and/or object 30 within said image.
- The central control unit 10 comprises at least a screen 11 to display the processed image stream. Preferably, the screen 11 is a touchscreen.
- The central control unit 10 preferably comprises a communication interface 37, which allows the central control unit 10 to be connected to one or more peripheral image acquisition and processing units 20. According to an advantageous embodiment, the communication interface 37 is also a radio communication interface. Preferably, the communication interface 37 is also a short-range communication interface, e.g. a WiFi or Bluetooth communication interface.
- The central control unit 10 comprises a processing unit 38, e.g. an FPGA or a single-board computer (such as a Raspberry Pi), which in the example is operatively connected to the communication interface 37 and to the screen 11.
- According to an advantageous embodiment, to carry out the aforesaid step of identification the peripheral image acquisition and processing unit 20, in particular the image processing unit 22, runs an object and/or person classification algorithm, e.g. to identify whether the person and/or object is: a person, a bicycle, an animal, a further identical or similar vehicle 1, etc. Said object and/or person classification algorithm is preferably an artificial intelligence algorithm, e.g. based on a neural network, preferably a deep neural network (DNN).
- According to an advantageous embodiment, the image processing unit 22 comprises a single-board computer (e.g. a Raspberry Pi) and a VPU (Visual Processing Unit) board operatively connected to each other.
- According to a particularly advantageous embodiment, the image acquisition and processing peripheral unit 20, and in particular the image processing unit 22, compresses the processed stream of images before transmitting it through the communication interface 23 to the central control unit 10.
- According to an advantageous embodiment, the collision avoidance safety system 100 comprises a first housing 14 of the central control unit 10 and a second housing 24 of said at least one peripheral image acquisition and processing unit 20. The first housing 14 and the second housing 24 are adapted and configured to be installed spaced apart from each other aboard the vehicle 1, preferably independently of each other. For example, the first housing 14 and the second housing 24 each comprise a dedicated box or case. For example, the first housing 14 is adapted and configured to be installed inside the driver's station 5 and the second housing 24 is adapted and configured to be installed on the chassis 2 of the vehicle 1 outside the driver's station 5.
- According to a particularly advantageous embodiment, the peripheral image acquisition and processing unit 20 is adapted and configured, in particular programmed, to estimate a confidence index associated with the step of identifying. The processed stream of images comprises a first datum 32 related to said confidence index, displayed superimposed on the processed stream of digital images. Such a confidence index is preferably a confidence percentage.
- Advantageously, the image acquisition and processing peripheral unit 20 is adapted and configured to associate an object and/or person identifier in the step of identifying. The processed image stream comprises a second datum 33 correlated to said object and/or person identifier, displayed superimposed on the processed digital image stream, preferably a datum identifying a type or class of person and/or object.
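The first datum (confidence percentage) and second datum (object class) described above amount to composing a small label for each detected box. A minimal sketch, in which the detection fields and the label format are illustrative assumptions not specified in the patent:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One identified person/object: a class identifier, a confidence
    in [0, 1], and the box (x, y, width, height) in image pixels.
    Field names are illustrative, not taken from the patent."""
    label: str
    confidence: float
    box: tuple

def overlay_text(det: Detection) -> str:
    """Compose the text to display near the box/outline: the class
    identifier (second datum) and the confidence expressed as a
    percentage (first datum)."""
    return f"{det.label} {round(det.confidence * 100)}%"

det = Detection(label="person", confidence=0.87, box=(40, 60, 120, 300))
assert overlay_text(det) == "person 87%"
```

Keeping this label composition on the peripheral unit means the central unit receives frames that are already fully annotated and only has to display them.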
- According to a particularly advantageous embodiment, said at least one detection area comprises a first detection area and a second detection area. The image acquisition and processing
peripheral unit 20 is adapted and configured to distinguish whether said person or object is located within either the first detection area or the second detection area and associate a corresponding area identification datum with the processed image stream. Conveniently, said area identification datum is displayed in the processed stream images. For example, the datum is displayed by changing a display parameter of said morphological box or outline, e.g. such as a color. In this manner, it will be possible to advantageously associate to the first detection area a first collision risk index and the second detection area a second collision risk index different from the first one. For example, the first detection area is an area, which is relatively farther away from the peripheral image acquisition andprocessing unit 20, and the second detection area is relatively closer. The second detection area is thus associated with a higher risk index. For example, if an object and/or a person is identified in the first detection area, the box or thesilhouette 31 may be displayed in yellow, while if an object and/or a person is identified in the second area, the box or thesilhouette 31 may be displayed in red. Note that in possible implementation variants, it is possible to distinguish between more than two survey areas. - According to an advantageous embodiment, the
central control unit 10 either comprises or is operatively connected to at least oneactuator device central control unit 10 is attached to and configured, i.e., programmed, to automatically drive the actuator device 25,26 based on information contained in or associated with said processed stream of images or conveniently transmitted by theperipheral unit 20 in parallel with the processed stream of images. For example, said actuator 25,26 may be connected to a vehicle safety device, such as an audible or optical warning device or a stop or shutdown device of thevehicle 1. - Preferably, the
aforesaid actuator device vehicle 1. - In the particular example shown in
figure 2 , thecentral control unit 10 comprises twoactuators vehicle 1, to shut down or stop thevehicle 1. According to the collision risk index associated with the detection area, the two actuators can be selectively controlled by thecentral control unit 10. For example, if a relatively lower risk is associated with a detection area, the central control unit by means of one of the twoactuators actuators vehicle 1. - Based on the above, it is thus possible to understand how a collision avoidance safety system of the type described above allows fully achieving the purposes indicated above with reference to the prior art.
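The zone-dependent behavior described above (yellow box and warning for the farther detection area, red box plus vehicle stop for the closer one) reduces to a small decision table. A hypothetical sketch; the zone names, colors, and flag names are illustrative, not drawn from the patent:

```python
def zone_response(zone: str) -> dict:
    """Map a detection zone to the display color of the box/outline 31
    and to which of the two actuators the central control unit drives
    (siren/light vs. ECU stop). Illustrative mapping only."""
    if zone == "far":    # first detection area: lower collision risk
        return {"box_color": "yellow", "siren": True, "stop_vehicle": False}
    if zone == "near":   # second detection area: higher collision risk
        return {"box_color": "red", "siren": True, "stop_vehicle": True}
    # No detection: no overlay emphasis, no actuator driven.
    return {"box_color": "green", "siren": False, "stop_vehicle": False}

assert zone_response("far")["stop_vehicle"] is False
assert zone_response("near") == {"box_color": "red", "siren": True,
                                 "stop_vehicle": True}
```

A variant with more than two detection areas, as the description allows, would simply extend this table with additional zones and intermediate responses.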
- Indeed, the collision avoidance safety system 100 described above allows for a significant increase in safety performance. Furthermore, the aforesaid collision avoidance safety system 100 is easily installed also aboard vehicles 1 which initially lack it, and is scalable, so that the number of image acquisition and processing peripheral units 20 connected to the same central control unit 10 can be increased as needed without requiring the replacement of, or expensive upgrades to, the central control unit, since the computationally onerous part of the processing operations performed on the acquired images is carried out by the peripheral units 20.
- Without prejudice to the principle of the invention, the embodiments and the constructional details may be broadly varied with respect to the above description, disclosed by way of non-limiting example, without departing from the scope of the invention as defined in the appended claims.
Claims (15)
- A collision avoidance safety system (100) for a vehicle (1) for transporting and/or lifting loads, comprising a central control unit (10) and at least one image acquisition and processing peripheral unit (20) operatively connected to the central control unit (10), wherein the central control unit (10) and the at least one image acquisition and processing peripheral unit (20) either are or can be installed aboard said vehicle (1) and wherein the image acquisition and processing peripheral unit (20) comprises:
  - an image sensor (21);
  - an image processing unit (22) operatively connected to the image sensor (21);
  - a communication interface (23) operatively connected to the image processing unit (22);
  wherein the image acquisition and processing peripheral unit (20) is adapted and configured to:
  - acquire an image stream through the image sensor (21) and process said acquired image stream through the image processing unit (22) to produce a processed image stream;
  - transmit the processed image stream to the central control unit (10) through the communication interface (23);
  wherein the image acquisition and processing peripheral unit (20) is further adapted and configured to:
  - identify at least one visible person or object (30) in the acquired image stream either transiting or arranged within at least one detection area;
  - produce the processed image stream, wherein each image of the processed image stream is obtained by superimposing on a respective image of the acquired image stream a morphological or geometric box or outline (31) which identifies said person or said object (30) within said image;
  wherein the central control unit (10) comprises a screen (11) to display the processed image stream.
- A collision avoidance safety system (100) according to claim 1, wherein the communication interface (23) is a wireless communication interface.
- A collision avoidance safety system (100) according to claim 2, wherein the communication interface (23) is a short-range communication interface, such as a Wi-Fi or Bluetooth communication interface.
- A collision avoidance safety system (100) according to any one of the preceding claims, wherein the peripheral image acquisition and processing unit (20) is adapted and configured to compress the processed image stream before transmitting it through said communication interface (23).
- A collision avoidance safety system (100) according to any one of the preceding claims, comprising a first housing (14) of said central control unit (10) and a second housing (24) of said at least one peripheral image acquisition and processing unit (20), wherein the first housing (14) and the second housing (24) are adapted and configured to be installed spaced apart from each other in said vehicle (1).
- A collision avoidance safety system (100) according to claim 5, wherein the vehicle (1) comprises a chassis (2) and a driving station (5) and wherein the first housing (14) is adapted and configured to be installed in said driving station (5) and the second housing (24) is adapted and configured to be installed on the chassis (2) outside the driving station (5).
- A collision avoidance safety system (100) according to any one of the preceding claims, wherein said peripheral image acquisition and processing unit (20) is adapted and configured to estimate a confidence index associated with the step of identifying, and wherein the processed image stream comprises a first datum (32) related to said confidence index, preferably a confidence percentage, displayed superimposed on said processed image stream.
- A collision avoidance safety system (100) according to any one of the preceding claims, wherein said image acquisition and processing peripheral unit (20) is adapted and configured to associate an object identifier with the step of identifying, and wherein said processed image stream comprises a second datum (33) related to said object identifier, preferably a data element identifying a type or class of person or object, displayed superimposed on said processed image stream.
- A collision avoidance safety system (100) according to claim 7 or 8, wherein said first datum (32) and/or said second datum (33) are displayed close to and/or adjacent to and/or superimposed on said morphological or geometric box or outline.
- A collision avoidance safety system (100) according to any one of the preceding claims, wherein said at least one detection area comprises a first detection area and a second detection area and wherein the peripheral image acquisition and processing unit (20) is adapted and configured to distinguish whether said person or object is located within the first detection area or within the second detection area and associate a corresponding area identification datum with the processed image stream.
- A collision avoidance safety system (100) according to claim 10, wherein said area identification datum is displayed in the images of the processed image stream.
- A collision avoidance safety system (100) according to claim 11, wherein said datum is displayed by changing a display parameter of said morphological box or outline (31), such as a color.
- A collision avoidance safety system (100) according to any one of the preceding claims, wherein said central control unit (10) either comprises or is operatively connected to at least one actuator device (25,26), and wherein said central control unit (10) is adapted and configured, i.e. programmed, to automatically drive said actuator device (25,26) based on information either contained in or associated with said processed image stream or transmitted by said peripheral unit (20) in parallel with said processed image stream.
- A collision avoidance safety system (100) according to claim 13, wherein said at least one actuator device (25,26) comprises at least one relay operatively either connected or adapted to be operatively connected to a safety device of the vehicle (1).
- A vehicle (1) for transporting and/or lifting loads, characterized in that it comprises a collision avoidance safety system (100) according to any one of the preceding claims installed aboard said vehicle (1).
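The processed image stream defined in the claims above carries, per identified person or object, a morphological box (31), a confidence datum (32), an object-class datum (33) and a detection-area datum. A minimal sketch of such a data model is shown below; all names are illustrative assumptions, not terminology from the patent.

```python
# Hypothetical, minimal model of a peripheral unit's processed frame:
# each identified person/object yields one overlay carrying the box,
# confidence datum, object-class datum and detection-area datum.
from dataclasses import dataclass, field

@dataclass
class Overlay:
    box: tuple           # (x, y, w, h) outline around the person/object
    confidence_pct: int  # first datum: confidence of the identification
    object_class: str    # second datum: type/class of person or object
    area_id: int         # which detection area the object is located in

@dataclass
class ProcessedFrame:
    frame_id: int
    overlays: list = field(default_factory=list)

def process_frame(frame_id, detections):
    """Superimpose one overlay per identified person/object.

    `detections` is a list of (box, confidence, object_class, area_id)
    tuples, as a detector running on the peripheral unit might emit.
    """
    frame = ProcessedFrame(frame_id)
    for box, conf, cls, area in detections:
        frame.overlays.append(Overlay(box, round(conf * 100), cls, area))
    return frame
```

Keeping the detection step on the peripheral unit, as the description notes, is what lets the central unit merely display these frames and drive its relays without bearing the computational cost.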
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IT102020000031862A IT202000031862A1 (en) | 2020-12-22 | 2020-12-22 | ANTI-COLLISION SAFETY SYSTEM FOR LIFTING AND/OR LOAD TRANSPORT VEHICLES |
Publications (1)
Publication Number | Publication Date |
---|---|
EP4019457A1 true EP4019457A1 (en) | 2022-06-29 |
Family
ID=75111684
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP21212859.9A Pending EP4019457A1 (en) | 2020-12-22 | 2021-12-07 | Collision avoidance safety system for a vehicle for transporting and/or lifting loads |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP4019457A1 (en) |
IT (1) | IT202000031862A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170061689A1 (en) * | 2015-08-24 | 2017-03-02 | Caterpillar Inc. | System for improving operator visibility of machine surroundings |
US20170212517A1 (en) * | 2016-01-27 | 2017-07-27 | Hand Held Products, Inc. | Vehicle positioning and object avoidance |
US20200134396A1 (en) * | 2018-10-25 | 2020-04-30 | Ambarella, Inc. | Obstacle detection in vehicle using a wide angle camera and radar sensor fusion |
- 2020-12-22: IT application IT102020000031862A filed (IT202000031862A1), status unknown
- 2021-12-07: EP application EP21212859.9A filed (EP4019457A1), active, Pending
Also Published As
Publication number | Publication date |
---|---|
IT202000031862A1 (en) | 2022-06-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11618441B2 (en) | Vehicular control system with remote processor | |
EP3589506B1 (en) | Autonomous trailer hitching using neural network | |
CN111372795B (en) | Automated trailer hitch using image coordinates | |
CN208101973U (en) | A kind of intelligent driving auxiliary anti-collision system | |
KR101747375B1 (en) | Apparatus and method for vehicle remote controlling and remote driving system | |
US9164507B2 (en) | Systems and methods for modeling driving behavior of vehicles | |
CN113212427A (en) | Intelligent vehicle with advanced vehicle camera system for underbody hazard and foreign object detection | |
CN112373465A (en) | Auxiliary driving system of trackless rubber-tyred vehicle and control method | |
KR102390021B1 (en) | Forklift safety system using AI and radar | |
CN106467061A (en) | For showing the camera chain in the region of outside vehicle | |
CN112078581A (en) | Vehicle indicating mobility of passenger and method of controlling the same | |
CN111071143B (en) | Light prompting system and method for automatic driving vehicle and automatic driving vehicle | |
KR20120086577A (en) | Apparatus And Method Detecting Side Vehicle Using Camera | |
CN111891121A (en) | Safety early warning system and method for low-speed running and parking of vehicle | |
EP4019457A1 (en) | Collision avoidance safety system for a vehicle for transporting and/or lifting loads | |
CN214492713U (en) | Vehicle-mounted system and vehicle information interaction system | |
US20210170821A1 (en) | Method and device for ascertaining a relative angle between two vehicles | |
JP6771653B2 (en) | Methods and devices for notifying the driving status of a car without a driver | |
SE541746C2 (en) | Method and system for facilitating safety for vulnerable road users in association with a vehicle | |
CN112074452B (en) | Safety method and control device for modular autonomous vehicle | |
US20220388512A1 (en) | Vehicle monitoring system | |
CN115884930A (en) | System and method for operating and managing an autonomous vehicle interchange area | |
CN110497854A (en) | A kind of tele-control system of slag-soil truck | |
US20220194421A1 (en) | Automated driving vehicle | |
CN213649544U (en) | Auxiliary driving system of trackless rubber-tyred vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20221207 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20230831 |