WO2011153646A1 - Method and control unit for controlling a display of a proximity warning system - Google Patents


Info

Publication number
WO2011153646A1
Authority
WO
WIPO (PCT)
Prior art keywords
cameras
positional information
camera
display
images
Prior art date
Application number
PCT/CH2010/000152
Other languages
French (fr)
Inventor
Urs Martin Rothacher
Peter Arnold Stegmaier
Original Assignee
Safemine Ag
Priority date
Filing date
Publication date
Application filed by Safemine Ag filed Critical Safemine Ag
Priority to AU2010355231A priority Critical patent/AU2010355231B2/en
Priority to PCT/CH2010/000152 priority patent/WO2011153646A1/en
Priority to CA2802122A priority patent/CA2802122C/en
Publication of WO2011153646A1 publication Critical patent/WO2011153646A1/en


Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/161Decentralised systems, e.g. inter-vehicle communication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Definitions

  • Fig. 1 a schematic representation of a mining site
  • Fig. 2 a block diagram of a monitoring system according to an embodiment of the present invention
  • Fig. 3 a top view on a schematic vehicle with cameras mounted according to an embodiment of the present invention
  • Fig. 4 a display
  • Fig. 5 a block diagram of another monitoring system according to an embodiment of the present invention.
  • Fig. 6 another display.
  • an "image" is understood as being the output of a camera filming a scene. This can be a camera working in the visible range of light, but also a camera working in the infrared range. Such image typically visualizes the scene on a display. When talking about different images it is inherently understood that these images are generated by different cameras, typically simultaneously.
  • image information may include any information provided by such camera, and, in particular, the image itself.
  • the cameras used provide images of "different scenes".
  • a scene is "different” to another scene whenever the cameras involved do not shoot or scan the same perspective. Cameras may not shoot the same perspective, for example, when they are attached to different locations of an object.
  • the at least two cameras are mounted on the same object which may be a movable object such as a vehicle, or a stationary object such as a building. In such scenario, it is preferred that the cameras are arranged such that they scan different sides of the object they are attached to in order to detect other objects in proximity at different or even all sides of the object.
  • a "section" assigned to a camera is understood as - e.g. when shooting with a camera horizontally - the horizontal area in front of the camera in which the camera is able to monitor scenes, and that consequently can be displayed in an image.
  • a section of a camera may include a sector.
  • control unit may be embodied in hardware, in software or both, and may be embodied in a single device, or its functions may be decentralized. Its functional building block “selection unit” may also be embodied in hardware, in software or both.
  • the "display” may have the form of a single display for displaying images from a single source, or may allow displaying images from many different sources, i.e. cameras, simultaneously.
  • the “display” also encompasses the totality of a multitude of separated displays which are, for example, distributed in the drivers cab of a vehicle. Summarizing, the display includes any displaying means for displaying image information delivered by the cameras.
  • the control signal may evoke additional action subject to what is displayed during the normal mode of operation, i.e. the default mode when there is no object in the vicinity detectable: If, for example, the display regularly shows images of a single camera source only, the control signal may cause a switch from displaying images from the current camera source to displaying images from the camera source selected according to the present idea. If the current image source by chance coincides with the selected image source, there may be no change visible to the monitoring person.
  • the control signal causes the selected images to be highlighted for drawing the attention to the subject images, e.g.
  • the control signal may cause the entire display to now display images only from the selected source. Or, the control signal may cause images from other sources to be shut down or completely masked or visually downsized in order to emphasize the selected images.
  • the selected image may, as indicated, claim the entire display screen, or remain in an unchanged image size on the screen.
  • the control signal may include zooming in the selected image.
  • the control signal may additionally cause acoustic warnings to be issued.
  • the control signal may either comprise the selected image itself provided the images are supplied by the cameras to the control unit, or it may cause the subject cameras to directly route the requested image to the display, or it may cause the display to accept only display information from the camera as selected. All the above holds true also for the selection of multiple images if appropriate.
  • a "warning system" and a corresponding "warning” activity may refer to any suitable activity for drawing the attention of the driver or operator to what might be identified as a scene that may become critical in terms of collision or proximity, including selecting an image to be displayed.
  • Such warning system may primarily include the display which the cameras supply with images, but may additionally include acoustic means such as a horn, a diaphone or a speaker, and possibly other visual means such as one or more LEDs, a flashlight, etc.
  • the warning character of the images displayed may be especially emphasized by displaying the images intermittently, or by alternating between the image information and its inverse colors, or by overlaying the image information with visual warning symbols.
  • a control signal separate from the control signal for the display may be issued subject to range information derived from the positional information delivered by the one or more GNSS receivers. For example, a first control signal for the display may be issued based on a first threshold condition for the object still being distant with respect to the present object, and a separate control signal for an acoustic warning element may be issued based on a second threshold condition for the object being very close with respect to the present object.
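As an illustration of the two-threshold scheme above, a minimal Python sketch follows; the threshold values, function name and signal names are assumptions chosen for illustration, not taken from the application:

```python
import math

# Hypothetical thresholds; the application leaves the concrete values open.
DISPLAY_RANGE_M = 150.0   # first threshold: object still distant, warn on display
ACOUSTIC_RANGE_M = 30.0   # second threshold: object very close, add acoustic warning

def issue_control_signals(own_pos, other_pos):
    """Return the set of control signals derived from range information.

    Positions are (east, north) tuples in metres.
    """
    distance = math.dist(own_pos, other_pos)
    signals = set()
    if distance < DISPLAY_RANGE_M:
        signals.add("display_warning")    # first threshold condition met
    if distance < ACOUSTIC_RANGE_M:
        signals.add("acoustic_warning")   # second threshold condition met
    return signals

print(issue_control_signals((0.0, 0.0), (100.0, 0.0)))  # {'display_warning'}
```

Issuing the two signals from separate threshold conditions lets the display warning fire early while the acoustic element stays silent until the situation is acute.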
  • radio based positioning system stands for a GNSS or for any other type of positioning system based on radio signals, such as a pseudolite system.
  • GNSS stands for "Global Navigation Satellite System” and encompasses all satellite based navigation systems, including GPS and Galileo.
  • a “receiver” is a receiver designed for receiving information from satel- lites and for determining its position subject to the signals received.
  • a "movable object” is any object that can change and is expected to change its position and/or orientation or configuration in space. It may e.g. be a truck or any other vehicle that moves from place to place and changes its orientation with respect to the general north-south direction, e.g. by steering, or it may be an object positioned at a fixed location but able to rotate about its axis or to change its physical configuration, e.g. by extending an arm, in such a manner that the volume of safety space attributed to it varies in significant manner.
  • Fig. 1 schematically depicts a site 1, such as a surface mine.
  • a site such as a surface mine.
  • a site covers a large area, in the case of a surface mine e.g. in the range of square kilometers, with a network of roads 2 and other traffic ways, such as rails 3.
  • a plurality of objects is present in the mine, such as:
  • Vehicles of this type may easily weigh several hundred tons, and they are generally difficult to control, have very large braking distances, and a large number of blind spots that the driver is unable to visually monitor without monitoring cameras.
  • vehicles of this type weigh 3 tons or less. They comprise passenger vehicles and small lorries.
  • a further type of object within the mine is comprised of stationary obstacles, such as temporary or permanent buildings 9, open pits, boulders, non-movable excavators, stationary cranes, deposits, etc.
  • objects according to an embodiment present in a mine 1 and subject to potential collision may be equipped with at least one GNSS receiver 11, a control unit per object, at least two cameras (not shown in Fig. 1) and a display per object (not shown in Fig. 1) .
  • Large objects may provide more than one GNSS receiver 11 per object as shown in Fig. 1.
  • the entirety of these elements per object for generating a proximity warning is called a monitoring system.
  • the GNSS receivers 11 interact with satellites 30 for determining the positional information of the object they are mounted to.
  • FIG. 2 illustrates a block diagram of a monitoring system including a control unit 13 according to an embodiment of the present invention.
  • a receiver 17 of the control unit 13 is connected to cameras 12.
  • An output 16 of the control unit 13 is connected to a display 19 and a beeper as warning means. Both connections may be implemented as wireless connections or as wired connections. One or more connections can be implemented via bus connections.
  • Each camera 12 delivers a series of images with respect to the scene monitored by the respective camera 12. Preferably, each of the cameras 12 looks into a different direction for monitoring different scenes with respect to the object these cameras are attached to.
  • the monitoring system further comprises a radio based positioning receiver 11, attached to the present object.
  • the receiver 11 provides a signal comprising positional information, i.e. the position of the present object, determined in combination with satellites 30 as shown in Figure 1. Such signal may be received by a receiving unit 15 in the control unit 13.
  • the control unit 13 comprises a microprocessor system 14, which controls the operations of the control unit 13.
  • a memory 18 comprises programs as well as various parameters, such as unique identifiers of the cameras. Such programs may comprise instructions for evaluating the positional information, and for selecting a subset of cameras currently providing the most significant image information.
  • the radio based positioning receiver 11 may provide positional information of the location it is situated at, which represents the location of the object it is attached to. Provided that other moving or stationary objects on the site are equipped with such receivers 11, too, the positional information related to the various objects may be shared between the control units of these objects, such that by comparing positional information stemming from positioning receivers located on different objects, proximity and even approximation can be detected. For further details it is referred to
  • Position information of the present object provided by the radio based positioning receiver 11 is transferred to the control unit 13 and evaluated there.
  • evaluation takes into account positional information received from other objects gathered by their own radio based positioning receivers and transmitted e.g. by a wireless interface not shown in Figure 2.
  • a proximity situation may be detected. If such proximity situation is detected by means of the positional information, a control signal may be issued which activates displaying the image from the camera looking into a direction where the proximate object is located at.
  • the selected image represents the camera that currently films the proximate object, which is of most interest to be monitored by the operator in order to avoid a collision. This is why image information stemming from this camera is emphasized in being presented to the personnel via the display.
  • Figure 2 shows an electronic map 40 stored in the control unit 13 which holds location information significant of stationary objects located on the site where the monitoring system of Figure 2 is in use.
  • the positional information supplied by the GNSS receiver 11 is compared or otherwise put in relation to the location information of the stationary objects. In case sufficient proximity or approximation is detected between these objects, the camera 12 looking into the direction of the stationary object is selected for displaying its image exclusively on the display 19.
  • the object is equipped with another sensor (not shown) for measuring the distance to another object, such as a radio detection and ranging device, a light detection and ranging device, and a sound detection and ranging device.
  • a signal is received from such sensor, and the subset of one or more cameras additionally may be selected based on the distance information provided by such sensor.
  • Fig. 3 illustrates a schematic top view on a vehicle 6 equipped with four cameras 12, one located at each side of the vehicle 6, and a single GNSS receiver 11. Sections monitored by each camera 12 are indicated by sector lines and referred to by 121. This makes each camera 12 scan a different scene at each side of the vehicle 6. Alternatively, the cameras 12 can be located at the edges of the vehicle 6. Both arrangements are beneficial for covering a large area in the vicinity of the object for proximity including collision detection purposes.
  • the selection of the camera may be based on the positional information with respect to the present vehicle 6 and such second positional information. Analyzing the positional information of both of the objects may allow identification of the direction the other object is located at with respect to the vehicle 6, and the distance between the vehicle and the other object.
  • the section 121 of the left hand camera 12 is identified as relevant section 121 when mapping the position of the other object 200 to the sections 121 of the cameras 12 of the vehicle. For such mapping, it is beneficial to permanently monitor the orientation of the vehicle 6 which may alter when moving the vehicle. This may involve e.g.
  • the identified section 121 makes its associated camera 12 the preferred camera for selection. As a result, this camera 12 will exclusively provide images of this proximity situation to the operator, provided the distance to the object 200 is not so large that any selection is suppressed.
  • the first and second positional information may be used for determining the distance between the other object and the vehicle. The distance information may be included in the selection step, and the image of the camera corresponding to the identified section may only be selected when the determined distance between the objects is below a given threshold. Otherwise, it is assumed that the other object still is too far away for justifying a warning to the operator.
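The sector mapping and distance gating described in the bullets above can be sketched as follows; the four-sector layout, the coordinate conventions (metres east/north, heading clockwise from north) and the threshold value are illustrative assumptions, not details fixed by the application:

```python
import math

SELECTION_RANGE_M = 100.0  # assumed threshold; the application leaves it open

# Four cameras, one per vehicle side; each section 121 is modelled as a
# 90-degree sector centred on the camera's viewing direction, expressed
# relative to the vehicle's longitudinal axis.
SECTIONS = {"front": 0.0, "right": 90.0, "rear": 180.0, "left": 270.0}

def select_camera(own_pos, heading_deg, other_pos):
    """Map another object's position onto the camera sections.

    own_pos/other_pos are (east, north) in metres; heading_deg is the
    vehicle orientation (0 = north, clockwise), which must be monitored
    continuously because it changes as the vehicle moves and steers.
    Returns the name of the selected camera, or None if the object is
    still too far away to justify a warning.
    """
    dx = other_pos[0] - own_pos[0]
    dy = other_pos[1] - own_pos[1]
    if math.hypot(dx, dy) > SELECTION_RANGE_M:
        return None  # selection suppressed: object beyond the threshold
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0  # bearing from north
    relative = (bearing - heading_deg) % 360.0          # bearing in vehicle frame
    # Pick the section whose centre is angularly closest to the bearing.
    return min(SECTIONS, key=lambda s: min((relative - SECTIONS[s]) % 360.0,
                                           (SECTIONS[s] - relative) % 360.0))

print(select_camera((0, 0), 0.0, (-50, 0)))  # object due west -> "left"
```

Note that the relative bearing, not the absolute one, decides the section: the same object falls into different sections as the vehicle turns.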
  • the position of the third object 300 may be determined with respect to the sections 121 of the cameras 12 of the vehicle 6.
  • the section 121 to the right hand side of the vehicle 6 is identified as the section into which the object 300 falls.
  • the right hand side camera 12 is associated to this section 121.
  • the object 200 now is assumed to be at a distance from the vehicle 6 which justifies issuing a warning to the operator.
  • both cameras i.e. the left hand and the right hand camera 12 may be selected for delivering images to the display, e.g. compared to a default display mode where all four cameras 12 show their images on the display.
  • only the object closest to the vehicle 6 shall be displayed.
  • the image of the camera being mounted to the object being closest will be allowed to display the scene it monitors, i.e. the camera 12 to the right hand as the object 300 is closer to the vehicle 6 than the object 200.
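A minimal sketch of the closest-object rule described above; the camera names and the mapping from camera to tracked object position are hypothetical:

```python
import math

def select_for_closest(own_pos, others):
    """When several objects are in range, display only the camera
    covering the closest one, as in the two-object scenario above.

    others maps a camera name to the position of the object currently
    detected in that camera's section (an illustrative data layout).
    """
    if not others:
        return None  # nothing in range, keep the default display mode
    return min(others, key=lambda cam: math.dist(own_pos, others[cam]))

# Object 300 (right, 40 m away) is closer than object 200 (left, 80 m):
print(select_for_closest((0, 0), {"left": (-80.0, 0.0), "right": (40.0, 0.0)}))
```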
  • the radio based positioning receiver 11 is always present at the vehicle 6 / object the cameras are attached to. In another embodiment, no such radio based positioning receiver 11 is attached to the object holding the cameras. Instead, the selection of cameras only relies on positional information received from other objects. Such positional information may be sufficient for selecting the one or more cameras by a mapping step equivalent to the one described in connection with the embodiment above. This holds for other objects providing their position information not in an absolute measure but e.g. in relation to the present object, or to any other known object. Or, preferably, means other than radio based positioning means may be provided for allowing an assessment of the position of the other object with respect to the own position. In case the own position is a priori limited to a small area, e.g. when the vehicle may move only within a limited radius, no such additional means are needed at all, as the own position may be known in advance, stored in the control unit and be used for putting the position of the other object, provided in absolute coordinates, into relation with its own position.
  • the display 19 in Figure 4 represents a flat panel display offering displaying images from e.g. 8 cameras across its screen.
  • the entire screen of Figure 4 may be reserved for showing the subject image information.
  • the screen of the display 19 is divided, and the image information selected is displayed on portion 20 of the display.
  • Portion 21 may be reserved for issuing visual warnings, such as a bold "DANGER" symbol or text or other kind of visual warnings.
  • the block diagram of Figure 5 differs from the block diagram of Figure 2 only in the way the control signal affects the control of the display. Instead of the control signal carrying the image information itself, the control signal now acts on AND gates 22, each of which AND gates is connected with one of the cameras 12. By activating one of the AND gates by a corresponding control signal, the subject AND gate allows for the associated camera 12 to provide image information to the display 19, while, for example, all the other AND gates are blocked and do not allow for displaying image information from the other cameras 12. There is no need for providing a receiver 17 for the image information in the control unit 13.
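The gating behaviour of the Figure 5 embodiment can be mimicked in software roughly as follows; the class, the camera names and the control-signal shape are invented for illustration, since the application describes hardware AND gates rather than code:

```python
class CameraGate:
    """Software analogue of one AND gate 22: it passes a camera's
    image stream to the display only while its enable input is set."""

    def __init__(self, camera_name):
        self.camera_name = camera_name
        self.enabled = False  # gates are blocked by default

    def forward(self, frame):
        # Output is effectively (frame AND enable): the frame passes
        # through only when the gate has been activated.
        return frame if self.enabled else None

gates = {name: CameraGate(name) for name in ("front", "rear", "left", "right")}

def apply_control_signal(selected):
    """Enable exactly the selected gate; all other gates stay blocked."""
    for name, gate in gates.items():
        gate.enabled = (name == selected)

apply_control_signal("left")
print([name for name, g in gates.items() if g.forward("frame") is not None])
```

As in the hardware variant, the control unit never touches the image data itself; it only steers which stream reaches the display.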
  • Figure 6 provides another schematic representation of a display 19, which display 19 is divided into four sub-displays 191 - 194, each sub-display 191 - 194 permanently displaying information from a camera assigned.
  • the control signal only highlights, e.g. by a blinking frame or similar, the sub-display 192 which displays image information from the camera 12 selected as most critical in terms of a potential collision.

Abstract

The present idea refers to a method and a control unit for controlling a display (19) of a proximity warning system. Vehicles and other objects (4a, 4b, 4c, 5, 6, 7, 8), for example, in a surface mine (1), are equipped with cameras (12) for providing images of different scenes. A control unit (13) of such object (4a, 4b, 4c, 5, 6, 7, 8) receives a signal representing positional information of such object from a radio based positioning receiver (11). Dependent on the positional information a subset of at least one camera (12) is selected, and a control signal is provided for the display (19) to display images provided by the selected subset of one or more cameras (12). By such method, the most relevant scene in terms of collision avoidance can be displayed to the operator.

Description

Method and control unit for controlling a display of a proximity warning system
Technical Field
The invention relates to a method and a control unit for controlling a display of a proximity warning system.
Background Art
Surface mines and similar sites or areas are generally operated by means of a large number of vehicles, some of which may be exceedingly large and difficult to maneuver and have very limited visibility for the operator.
Collision and/or proximity warning systems are established for conventional automobiles as well as for extra-large vehicles.
Proximity warning systems in form of park distance control systems make use of ultrasonic sensors located in the bumpers of a car. More sophisticated systems may rely on different sensors such as three dimensional distance cameras as proposed in WO 2004/021546 A2. There, it is suggested to provide at least a forward, a backward and a sideward looking camera at a passenger car.
For extra-large vehicles used in mining, WO 2004/047047 A2 suggests to use satellite supported radio positioning receivers on board of the vehicles and other objects, such as cranes, for generating proximity warnings in order to reduce the risk of collisions. Another approach based on GNSS receivers is disclosed in the International Application No. PCT/CH2009/000200 incorporated herein by reference. Other approaches for extra-large vehicles are introduced in "Avoiding accidents with mining vehicles", retrieved from the Internet at http://www.flir.com/uploadedFiles/Eurasia/MMC/Appl_Stories/AS_0020_EN.pdf on February 2, 2010. Sensors for avoiding collisions may include radar systems, conventional cameras or thermal imaging cameras.
In non-conventional types of vehicles such as the vehicles used in mining, each camera may display its image on a display installed in the driving cab. The more cameras there are available, the more image information the driver is exposed to, such that the driver may be distracted by images not being relevant for collision avoidance. Or, the driver may be overstrained by monitoring the output of all cameras available.
Disclosure of the Invention
In this respect, it is desired to improve the means in a multi camera based proximity warning system for drawing the attention of the operator to the most relevant camera output(s).
According to a first aspect of the present invention, a method is provided for controlling a display according to the features of independent claim 1.
Accordingly, a signal representing positional information is received from a radio based positioning receiver. A subset of at least one camera out of at least two cameras for providing images of different scenes is selected dependent on the positional information. A control signal is provided for the display to display images provided by the selected subset of one or more cameras.
According to another aspect of the present invention, a control unit is provided for controlling a display according to the features of independent claim 20. Such control unit comprises a receiving unit for receiving a signal representing positional information of an object from a radio based positioning receiver. A selection unit is designed for selecting a subset of at least one camera out of at least two cameras available for providing images of different scenes subject to the positional information. At an output of the control unit, a control signal is provided to display images provided by the selected subset of one or more cameras.
The basic idea of the present invention is to provide the operator with an aid as to which of the camera outputs to look at, by prioritizing such camera(s). For this reason, a GNSS receiver is used for determining the present location of the object the cameras are assigned to and/or the location of an object different to the one the cameras are assigned to. The location of an object, specifically presented as coordinates in a chosen coordinate system, may advantageously be subsumed under the term "positional information" as presently used.
In an advantageous scenario, the GNSS receiver and the cameras are attached to the same object. An electronic map of preferably stationary objects being critical to traffic on a site may be stored, and the current position of the object, as identified by the GNSS receiver, may be compared or otherwise put into relation to the position of one or more objects listed in such map. For example, in case the distance between the object and a stationary object listed in the map is or may become critical, e.g. as determined by subtracting the two location data from each other, it is decided to which one of the cameras the operator's attention is to be drawn, which preferably is the camera that looks into the direction the critical object is located at.
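A possible sketch of this electronic-map comparison in Python; the map contents, coordinate frame and threshold are invented for illustration:

```python
import math

# Illustrative electronic map of stationary obstacles on the site:
# name -> (east, north) position in metres. Contents are hypothetical.
ELECTRONIC_MAP = {
    "building_9": (120.0, 40.0),
    "open_pit": (-60.0, 300.0),
}

CRITICAL_RANGE_M = 150.0  # assumed; the application leaves the threshold open

def critical_map_objects(own_pos):
    """Put the current GNSS position into relation with each mapped
    position (by subtracting the two location data) and return the
    objects whose distance is critical."""
    hits = []
    for name, pos in ELECTRONIC_MAP.items():
        if math.dist(own_pos, pos) < CRITICAL_RANGE_M:
            hits.append(name)
    return hits

print(critical_map_objects((100.0, 0.0)))  # only building_9 is in range
```

Each flagged object would then be mapped to the camera looking in its direction, as described in the text.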
In another advantageous scenario, the GNSS receiver and the cameras are attached to the same object, and other objects, including movable objects, may be equipped with GNSS receivers, too, for determining their respective positions and/or trajectories. Such positional information is broadcast or individually transmitted by these objects to other objects on the site being equipped with a corresponding receiver. By means of such positional information shared amongst objects on the site, the direction and distance, and also any approaching velocity, may be determined at the present object with respect to one or more other objects around. As soon as one or more of these parameters becomes critical in terms of proximity and/or a collision scenario, it is determined again to which of the cameras the operator's attention should be drawn, which preferably is the camera that looks into the direction the critical object is located at. Summarizing, by means of other objects, e.g. operated and located on the same site, being equipped with GNSS receivers, too, and an infrastructure enabling these objects to exchange information about their current location, information about the existence of, the distance to, and the direction of such other objects in the vicinity can be generated.
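Deriving distance, direction and approach velocity from two shared position tracks, as described above, might look like the following sketch; the track representation (two consecutive fixes per object) and the sign convention for the velocity are assumptions:

```python
import math

def approach_parameters(own_track, other_track, dt):
    """Derive distance, direction and approach velocity between objects.

    Each track is a pair of consecutive (east, north) fixes in metres,
    taken dt seconds apart, as exchanged between GNSS-equipped objects.
    A positive approach velocity means the objects are closing in.
    """
    d_prev = math.dist(own_track[0], other_track[0])
    d_now = math.dist(own_track[1], other_track[1])
    dx = other_track[1][0] - own_track[1][0]
    dy = other_track[1][1] - own_track[1][1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0  # direction from north
    closing_speed = (d_prev - d_now) / dt               # m/s, positive = approaching
    return d_now, bearing, closing_speed

# A stationary own object; the other object approaches from the east at 10 m/s:
d, b, v = approach_parameters(((0, 0), (0, 0)), ((100, 0), (90, 0)), 1.0)
print(d, b, v)
```

Any of the three returned parameters becoming critical could then trigger the camera selection step.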
In another advantageous scenario, the GNSS receiver and the cameras are attached to different objects, the object comprising the cameras not necessarily including a GNSS receiver. However, other objects on a site may be equipped with a GNSS receiver and broadcast or otherwise transmit their positional information to other objects on the site equipped with a corresponding receiving unit. In case the object equipped with the cameras receives such positional information via its receiving unit, the positional information may be evaluated, and the direction and/or the distance and/or the approaching velocity of near-by or distant objects may be considered as critical in terms of a collision or a pre-collision scenario. Again, the selection step can be implemented the same way as described above, and the display is controlled such that, for the operator of the object the cameras are attached to, emphasis is put on the one or more cameras looking into the direction another object is detected at. The general purpose of the selection step is to make the operator focus on the one or more cameras by which a potential danger is currently being filmed, under the assumption that there are at least two cameras available filming different scenes, i.e. preferably different scenes around the object the cameras are attached to. Consequently, it is ensured that the image information being most relevant, especially in terms of proximity warning including collision avoidance, is displayed on the display. The determination which one or more of the cameras currently monitors the most relevant scene is performed by a selection unit comprised in the control unit. In particular, the location information provided by a GNSS receiver is analyzed in terms of proximity to potentially dangerous objects.
By automatically selecting the camera currently monitoring the scene of most interest and by displaying image information delivered from this camera, the personnel in charge of a safe operation of such object, e.g. a vehicle, may not be distracted by a multitude of image information but may instead focus on the most relevant image information being displayed.
For advantageous embodiments it is referred to the dependent claims. It is noted that embodiments referred to or claimed only in connection with the method are deemed to be disclosed in connection with the apparatus, too, and vice versa.
Brief Description of the Drawings
A number of embodiments of the present invention will now be described by way of example only and with reference to the accompanying drawings, in which the figures show:
Fig. 1 a schematic representation of a mining site, Fig. 2 a block diagram of a monitoring system according to an embodiment of the present invention,
Fig. 3 a top view on a schematic vehicle with cameras mounted according to an embodiment of the present invention,
Fig. 4 a display,
Fig. 5 a block diagram of another monitoring system according to an embodiment of the present invention, and
Fig. 6 another display.
Modes for Carrying Out the Invention
In the present application, an "image" is understood as being the output of a camera filming a scene. This can be a camera working in the visible range of light but also a camera working in the infrared range. Such image typically visualizes the scene on a display. When talking about different images, it is inherently understood that these images are generated by different cameras, typically simultaneously. In this respect, "image information" may include any information provided by such camera, and, in particular, the image itself.
The cameras used provide images of "different scenes". A scene is "different" to another scene whenever the cameras involved do not shoot or scan the same perspective. Cameras may not shoot the same perspective, for example, when they are attached to different locations of an object. In the context of the present application, it is preferred that the at least two cameras are mounted on the same object, which may be a movable object such as a vehicle, or a stationary object such as a building. In such scenario, it is preferred that the cameras are arranged such that they scan different sides of the object they are attached to, in order to detect other objects in proximity at different or even all sides of the object. A "section" assigned to a camera is understood as the area in front of the camera in which the camera is able to monitor scenes, e.g. the horizontal area when shooting horizontally, and that consequently can be displayed in an image. Typically, a section of a camera may include a sector.
The "control unit" may be embodied in hardware, in software or both, and may be embodied in a single device, or its functions may be decentralized. Its functional building block "selection unit" may also be embodied in hardware, in software or both.
The "display" may have the form of a single display for displaying images from a single source, or may allow displaying images from many different sources, i.e. cameras, simultaneously. The "display" also encompasses the totality of a multitude of separated displays which are, for example, distributed in the driver's cab of a vehicle. Summarizing, the display includes any displaying means for displaying image information delivered by the cameras.
The "control signal to display images" triggers at least displaying the image selected for displaying. The control signal may evoke additional action subject to what is displayed during the normal mode of operation, i.e. the default mode when there is no object in the vicinity detectable: If, for example, the display regularly shows images of a single camera source only, the control signal may cause a switch from displaying images from the current camera source to displaying images from the camera source selected according to the present idea. If the current image source by chance coincides with the selected image source, there may be no change visible to the monitoring person. In some embodiments, the control signal causes the selected images to be highlighted for drawing the attention to the subject images, e.g. by a flashing frame or other means. If, for example, the display by default displays images from various sources, the control signal may cause the entire display to display images only from the selected source. Or, the control signal may cause images from other sources to be shut down or completely masked or visually downsized in order to emphasize the selected images. The selected image may, as indicated, claim the entire display screen, or remain in an unchanged image size on the screen. The control signal may include zooming in the selected image. The control signal may additionally cause acoustic warnings to be issued. The control signal may either comprise the selected image itself, provided the images are supplied by the cameras to the control unit, or it may cause the subject cameras to directly route the requested image to the display, or it may cause the display to accept only display information from the camera as selected. All the above holds true also for the selection of multiple images if appropriate.
A "warning system" and a corresponding "warning" activity may refer to any suitable activity for drawing the attention of the driver or operator to what might be identified as a scene that may become critical in terms of collision or proximity, including selecting an image to be displayed. Such warning system may primarily include the display which the cameras supply with images, but may additionally include acoustic means such as a horn, a diaphone or a speaker, and possibly other visual means such as one or more LEDs, a flashlight, etc. The warning character of the images displayed may be especially emphasized by displaying the images intermittently, or by alternating between the image information and its inverse colors, or by overlaying the image information with visual warning symbols. Any warning in addition to the warning provided by the bare display of the images or selected images may be issued in combination with the control signal, such that the control signal may activate such additional warnings, too. In other embodiments, a control signal separate from the control signal for the display may be issued subject to range information derived from the positional information delivered by the one or more GNSS receivers. For example, a first control signal for the display may be issued based on a first threshold condition for the object still being distant with respect to the present object, and a separate control signal for an acoustic warning element may be issued based on a second threshold condition for the object being very close with respect to the present object.
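The two-threshold scheme in the last sentences can be sketched as follows. The numeric threshold values are purely illustrative assumptions, not values taken from the disclosure.

```python
# First threshold: issue the display control signal while the other object
# is still relatively distant; second threshold: additionally issue the
# acoustic control signal when the object is very close. Values are assumed.
DISPLAY_THRESHOLD_M = 100.0
ACOUSTIC_THRESHOLD_M = 25.0

def control_signals(distance_m):
    """Decide which control signals to issue for a given range to the other object."""
    return {
        "display": distance_m < DISPLAY_THRESHOLD_M,
        "acoustic": distance_m < ACOUSTIC_THRESHOLD_M,
    }
```

With these assumed values, an object at 60 m would trigger only the display signal, while an object at 10 m would trigger the acoustic warning as well.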
The term "radio based positioning system" stands for a GNSS or for any other type of positioning system based on radio signals, such as a pseudolite system. The term "GNSS" stands for "Global Navigation Satellite System" and encompasses all satellite based navigation systems, including GPS and Galileo. A "receiver" is a receiver designed for receiving information from satellites and for determining its position subject to the signals received.
A "movable object" is any object that can change and is expected to change its position and/or orientation or configuration in space. It may e.g. be a truck or any other vehicle that moves from place to place and changes its orientation with respect to the general north-south direction, e.g. by steering, or it may be an object positioned at a fixed location but able to rotate about its axis or to change its physical configuration, e.g. by extending an arm, in such a manner that the volume of safety space attributed to it varies in significant manner.
Fig. 1 schematically depicts a site 1, such as a surface mine. Typically, such a site covers a large area, in the case of a surface mine e.g. in the range of square kilometers, with a network of roads 2 and other traffic ways, such as rails 3. A plurality of objects is present in the mine, such as:
- Large vehicles, such as haul trucks 4a, cranes 4b or diggers 4c. Vehicles of this type may easily weigh several hundred tons, and they are generally difficult to control, have very large braking distances, and a large number of blind spots that the driver is unable to visually monitor without monitoring cameras.
- Medium sized vehicles 5, such as regular trucks. These vehicles are easier to control, but they still have several blind spots and require a skilled driver.
- Small vehicles 6. Typically, vehicles of this type weigh 3 tons or less. They comprise passenger vehicles and small lorries.
- Trains 7.
A further type of object within the mine is comprised of stationary obstacles, such as temporary or permanent buildings 9, open pits, boulders, non-movable excavators, stationary cranes, deposits, etc.
The risk of accidents in such an environment is high. In particular, the large sized vehicles can easily collide with other vehicles, or obstacles.
For this reason, objects present in a mine 1 and subject to potential collision may, according to an embodiment, be equipped with at least one GNSS receiver 11, a control unit per object, at least two cameras (not shown in Fig. 1) and a display per object (not shown in Fig. 1). Large objects may provide more than one GNSS receiver 11 per object, as shown in Fig. 1. The entirety of these elements per object for generating a proximity warning is called a monitoring system. The GNSS receivers 11 interact with satellites 30 for determining the positional information of the object they are mounted to.
Figure 2 illustrates a block diagram of a monitoring system including a control unit 13 according to an embodiment of the present invention. A receiver 17 of the control unit 13 is connected to cameras 12. An output 16 of the control unit 13 is connected to a display 19 and a beeper as warning means. Both connections may be implemented as wireless connections or as wired connections. One or more connections can be implemented via bus connections. Each camera 12 delivers a series of images with respect to the scene monitored by the respective camera 12. Preferably, each of the cameras 12 looks into a different direction for monitoring different scenes with respect to the object these cameras are attached to.
The monitoring system further comprises a radio based positioning receiver 11, attached to the present object. The receiver 11 provides a signal comprising positional information, i.e. the position of the present object, determined in combination with satellites 30 as shown in Figure 1. Such signal may be received by a receiving unit 15 in the control unit 13.
The control unit 13 comprises a microprocessor system 14, which controls the operations of the control unit 13. A memory 18 comprises programs as well as various parameters, such as unique identifiers of the cameras. Such programs may comprise instructions for evaluating the positional information, and for selecting a subset of cameras currently providing the most significant image information.
The radio based positioning receiver 11 may provide positional information of the subject location it is located at, which represents the subject location of the object it is attached to. Provided that other moving or stationary objects on the site are equipped with such receivers 11, too, the positional information related to the various objects may be shared between the control units of these objects, such that by comparing positional information stemming from positioning receivers located on different objects, proximity and even approximation can be detected. For further details it is referred to PCT/CH2009/000394, which is incorporated herein by reference.
Position information of the present object provided by the radio based positioning receiver 11 is transferred to the control unit 13 and evaluated there. Advantageously, such evaluation takes into account positional information received from other objects, gathered by their own radio based positioning receivers and transmitted e.g. by a wireless interface not shown in Figure 2. By way of evaluating the positional information from these different sources, a proximity situation may be detected. If such proximity situation is detected by means of the positional information, a control signal may be issued which activates displaying the image from the camera looking into the direction where the proximate object is located. The selected image represents the camera that currently films the proximate object, which is of most interest to be monitored by the operator in order to avoid a collision. This is why image information stemming from this camera is emphasized in being presented to the personnel via the display.
Figure 2 shows an electronic map 40 stored in the control unit 13 which holds location information significant of stationary objects located on the site where the monitoring system of Figure 2 is in use. The positional information supplied by the GNSS receiver 11 is compared or otherwise put in relation to the location information of the stationary objects. In case sufficient proximity or approximation is detected between these objects, the camera 12 looking into the direction of the stationary object is selected for displaying its image exclusively on the display 19.
In another embodiment, the object is equipped with another sensor (not shown) for measuring the distance to another object, such as a radio detection and ranging device, a light detection and ranging device, or a sound detection and ranging device. A signal is received from such sensor, and the subset of one or more cameras may additionally be selected based on the distance information provided by such sensor. There may be multiple sensors arranged at different sides of a vehicle. These sensors may operate for detecting near-by objects, and in particular objects not tagged with a GNSS receiver, thereby providing additional information on the surrounding of the vehicle. Such sensors may individually trigger the selection of the camera(s) through the control unit (13) and preferably cover similar sectors as the cameras.
Figure 3 illustrates a schematic top view on a vehicle 6 equipped with four cameras 12, one located at each side of the vehicle 6, and a single GNSS receiver 11. Sections monitored by each camera 12 are indicated by sector lines and referred to by 121. This makes each camera 12 scan a different scene at each side of the vehicle 6. Alternatively, the cameras 12 can be located at the edges of the vehicle 6. Both arrangements are beneficial for covering a large area in the vicinity of the object for proximity including collision detection purposes.
Provided that second positional information is received from an object different to the present vehicle 6, the selection of the camera may be based on the positional information with respect to the present vehicle 6 and such second positional information. Analyzing the positional information of both of the objects may allow identification of the direction the other object is located at with respect to the vehicle 6, and the distance between the vehicle and the other object. In case the other object is located at a position 200 to the left hand side of the vehicle 6, the section 121 of the left hand camera 12 is identified as the relevant section 121 when mapping the position of the other object 200 to the sections 121 of the cameras 12 of the vehicle. For such mapping, it is beneficial to permanently monitor the orientation of the vehicle 6, which may alter when moving the vehicle. This may involve e.g. a compass or any other means for determining the orientation of the vehicle with respect to the coordinate system the GNSS makes use of. The identified section 121 makes the camera 12 associated with it the preferred camera for selection. As a result, this camera 12 will exclusively provide images of this proximity situation to the operator, provided the distance to the object 200 is not so far that any selection is suppressed. The first and second positional information may be used for determining the distance between the other object and the vehicle. The distance information may be included in the selection step, and the image of the camera corresponding to the identified section may only be selected when the determined distance between the objects is below a given threshold. Otherwise, it is assumed that the other object still is too far away for justifying a warning to the operator.
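The mapping of another object's position into one of the camera sections 121 might be sketched as below. The four 90-degree sectors, the compass-style heading convention and the function names are assumptions for illustration; the actual sector geometry is left open by the description.

```python
import math

SECTORS = ["front", "right", "rear", "left"]  # one 90-degree sector per vehicle side

def select_camera(own_pos, own_heading_deg, other_pos):
    """Return the camera whose sector the other object's bearing falls into."""
    dx = other_pos[0] - own_pos[0]
    dy = other_pos[1] - own_pos[1]
    # Bearing of the other object in the GNSS frame (0 deg = north, clockwise).
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    # Express the bearing relative to the vehicle's orientation (e.g. from a compass).
    relative = (bearing - own_heading_deg) % 360.0
    # Shift by half a sector so each side's sector is centred on its camera axis.
    return SECTORS[int(((relative + 45.0) % 360.0) // 90.0)]
```

An object due north of a north-facing vehicle would map to the front camera; the same object would map to the left camera once the vehicle turns to face east, which is why the vehicle's orientation must be monitored permanently.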
Given that a third object 300 is in proximity to the vehicle 6 and given that the second object 200 still is at its position as illustrated in Figure 3, the position of the third object 300 may be determined with respect to the sections 121 of the cameras 12 of the vehicle 6. Hence, the section 121 to the right hand side of the vehicle 6 is identified as the section the object 300 maps into. The right hand side camera 12 is associated with this section 121. For this example, the object 200 now is assumed to be at a distance from the vehicle 6 which justifies issuing a warning to the operator.
Subject to the display/warning strategy, both cameras, i.e. the left hand and the right hand camera 12, may be selected for delivering images to the display, e.g. compared to a default display mode where all four cameras 12 show their images on the display. However, following another strategy, only the object closest to the vehicle 6 shall be displayed. By determining the distances between the vehicle 6 and the objects 200 and 300, only the camera monitoring the closest object will be allowed to display the scene it monitors, i.e. the camera 12 to the right hand side, as the object 300 is closer to the vehicle 6 than the object 200.
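The closest-object strategy just described can be sketched as follows, assuming each detected object has already been mapped to the camera covering its section; the camera identifiers are illustrative.

```python
import math

def closest_object_camera(own_pos, detections):
    """detections: list of (camera_id, other_object_position) pairs.
    Return the camera covering the object closest to own_pos."""
    def distance(entry):
        _, (x, y) = entry
        return math.hypot(x - own_pos[0], y - own_pos[1])
    camera_id, _ = min(detections, key=distance)
    return camera_id
```

For the situation of Fig. 3, with the object 300 on the right being closer than the object 200 on the left, this would select the right-hand camera.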
In the above examples, the radio based positioning receiver 11 always is present at the vehicle 6 / object the cameras are attached to. In another embodiment, no such radio based positioning receiver 11 is attached to the object holding the cameras. Instead, the selection of cameras only relies on positional information received from other objects. Such positional information may be sufficient for selecting the one or more cameras by a mapping step equivalent to the one described in connection with the embodiment above. This holds for other objects providing their position information not in an absolute measure but e.g. in relation to the present object, or to any other known object. Or, preferably, means other than radio based positioning means may be provided for allowing an assessment of the position of the other object with respect to the own position. In case the own position is a priori rather limited to a small area, e.g. when the vehicle may move only within a limited radius, even no such additional means are needed, as the own position may be known in advance, stored in the control unit and be used for putting the position of the other object provided in absolute coordinates into relation with its own position.
The display 19 in Figure 4 represents a flat panel display capable of displaying images from e.g. 8 cameras across its screen. Once the control signal is received from the control unit 13, and provided the control signal identifies only one camera 12 for providing the image information most relevant to be displayed, the entire screen of Figure 4 may be reserved for showing the subject image information. In Figure 4, the screen of the display 19 is divided, and the image information selected is displayed on portion 20 of the display. Portion 21 may be reserved for issuing visual warnings, such as a bold "DANGER" symbol or text or other kinds of visual warnings.
The block diagram of Figure 5 differs from the block diagram of Figure 2 only in the way the control signal affects the control of the display. Instead of the control signal carrying the image information itself, the control signal now acts on AND gates 22, each of which is connected with one of the cameras 12. By activating one of the AND gates by a corresponding control signal, the subject AND gate allows the associated camera 12 to provide image information to the display 19, while, for example, all the other AND gates are blocked and do not allow for displaying image information from the other cameras 12. There is no need for providing a receiver 17 for the image information in the control unit 13.
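Functionally, the AND-gate arrangement of Fig. 5 acts as a per-camera gate: only the camera whose gate is enabled by the control signal forwards its images to the display. A minimal software analogue, with assumed camera identifiers, could look like this:

```python
def gate_images(latest_frames, enabled_camera):
    """latest_frames: mapping camera id -> latest image.
    Only the camera enabled by the control signal passes its image through."""
    return {cam: img for cam, img in latest_frames.items() if cam == enabled_camera}
```

All other cameras remain blocked, so the display never receives their image information in the first place, which is why no receiver 17 is needed in the control unit.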
Figure 6 provides another schematic representation of a display 19, which display 19 is divided into four sub-displays 191 - 194, each sub-display 191 - 194 permanently displaying information from a camera assigned to it. In this embodiment, the control signal only highlights the sub-display 192, which displays image information from the camera 12 selected as most critical in terms of a potential collision, by a blinking frame or similar.
While presently preferred embodiments of the invention are shown and described, it is to be distinctly understood that the invention is not limited thereto but may be otherwise variously embodied and practiced within the scope of the following claims.

Claims

1. A method for controlling a display of a proximity warning system, comprising
- receiving a signal representing first positional information of an object from a radio based positioning receiver (11),
- dependent on the positional information selecting a subset of at least one camera (12) out of at least two cameras (12) available for providing images of different scenes, and
- providing a control signal for the display (19) to display images provided by the selected subset of one or more cameras (12).
2. Method according to claim 1, wherein the cameras (12) available for selection are arranged to provide images of different sections around the object.
3. Method according to claim 2, wherein the cameras (12) and the radio based positioning receiver (11) are attached to the same object.
4. Method according to any one of the preceding claims, wherein second positional information is received with respect to a second object, and wherein the second positional information originates from a radio based positioning receiver (11) of the second object.
5. Method according to claim 4, wherein the subset of at least one camera (12) is selected based on the first positional information and the second positional information.
6. Method according to claim 5, wherein a distance between the objects is determined from the positional information and the second positional information, and wherein the subset of at least one camera (12) is selected based on the distance.
7. Method according to any one of the preceding claims 4 to 6 in combination with claim 2, wherein one of the sections (121) is identified for the second positional information to map into, and wherein the camera (12) associated with the identified section (121) is selected in the selection step.
8. Method according to claim 7 in combination with claim 6, wherein the camera (12) associated with the identified section (121) is selected provided at least one of the distance between the objects and the distance to the crossing point of their trajectories is below a threshold.
9. Method according to any one of the previous claims, wherein the subset of one or more cameras (12) is selected subject to the positional information and subject to location information of stationary objects stored in an electronic map.
10. Method according to claim 9 in combination with claim 2, wherein one of the sections (121) is identified for the stationary object location information to map into, and wherein the camera (12) associated with the identified section (121) is selected in the selection step.
11. Method according to claim 10, wherein a distance between the object and the stationary object is determined from the positional information and the stationary object location information, and wherein the camera (12) associated with the identified section (121) is selected provided the determined distance between the object and the stationary object is below a threshold.
12. Method according to claim 1, wherein the object the radio based positioning receiver (11) is assigned to is different to a second object the cameras (12) available for selection are attached to, and wherein the cameras (12) available for selection are arranged to provide images of different sections (121) around the second object.
13. Method according to claim 12, wherein one of the sections (121) is identified for the positional information to map into, and wherein the camera (12) associated with the identified section (121) is selected in the selection step.
14. Method according to any one of the preceding claims, wherein a signal is received from at least one sensor for measuring the distance to another object by means different to those of the radio based positioning receiver (11), and wherein the subset of at least one camera (12) is selected based on the positional information and the distance information provided by the sensor.
15. A method according to claim 14, wherein the sensor includes at least one of a radio detection and ranging device, a light detection and ranging device, and a sound detection and ranging device.
16. Method according to any one of the preceding claims, wherein in a default mode the control signal is designed for allowing images provided by all the cameras (12) available to be displayed, and wherein based on the selection step the control signal is modified for allowing images provided by the one or more selected cameras (12) only to be displayed.
17. Method according to any one of the preceding claims, wherein the control signal is provided for the display (19) to display and highlight the images from the selected subset of one or more cameras (12) .
18. Method according to any one of the preceding claims, wherein the control signal is designed for triggering one of an acoustic and a visual warning.
19. Computer program element comprising computer program code means which, when loaded in a processor unit of a control unit, configures the control unit to perform a method as claimed in any one of the preceding claims.
20. A control unit for controlling a display of a proximity warning system, comprising a receiving unit (15) for receiving a signal representing positional information of an object from a radio based positioning receiver (11), a selection unit for selecting a subset of at least one camera (12) out of at least two cameras (12) available for providing images of different scenes dependent on the positional information, and an output (16) for providing a control signal to a display (19) for displaying images provided by the selected subset of one or more cameras (12).
21. A monitoring system, comprising a control unit (14) according to claim 20, a display (19), and at least two cameras (12) for providing images of different scenes.
22. A monitoring system according to claim 21, comprising the radio based positioning receiver (11).
23. A monitoring system according to claim 21 or claim 22, wherein the receiving unit (15) is designed for receiving positional information of a second object.
24. A monitoring system according to any one of the preceding claims 21 to 23, comprising a log (60) for logging at least one of the positional information and the selected camera signal.
25. A movable object, comprising a monitoring system according to any one of the preceding claims 21 to 24, wherein the cameras (12) are attached to different locations of the object.
26. A movable object according to claim 25, wherein the movable object (4, 5, 6, 7) is one of a vehicle (4, 5, 6), a crane (4b), a dragline, a haul truck (4a), a digger (4c) and a shovel.
PCT/CH2010/000152 2010-06-10 2010-06-10 Method and control unit for controlling a display of a proximity warning system WO2011153646A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
AU2010355231A AU2010355231B2 (en) 2010-06-10 2010-06-10 Method and control unit for controlling a display of a proximity warning system
PCT/CH2010/000152 WO2011153646A1 (en) 2010-06-10 2010-06-10 Method and control unit for controlling a display of a proximity warning system
CA2802122A CA2802122C (en) 2010-06-10 2010-06-10 Method and control unit for controlling a display of a proximity warning system


Publications (1)

Publication Number Publication Date
WO2011153646A1 true WO2011153646A1 (en) 2011-12-15

Family

ID=43431943


Country Status (3)

Country Link
AU (1) AU2010355231B2 (en)
CA (1) CA2802122C (en)
WO (1) WO2011153646A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8779934B2 (en) 2009-06-12 2014-07-15 Safemine Ag Movable object proximity warning system
US8994557B2 (en) 2009-12-11 2015-03-31 Safemine Ag Modular collision warning apparatus and method for operating the same
EP3413287A1 (en) 2010-04-19 2018-12-12 SMR Patents S.à.r.l. Method for indicating to a driver of a vehicle the presence of an object at least temporarily moving relative to the vehicle

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10703299B2 (en) 2010-04-19 2020-07-07 SMR Patents S.à.r.l. Rear view mirror simulation

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11220726A (en) * 1998-01-30 1999-08-10 Niles Parts Co Ltd Vehicle surrounding monitoring device
WO2004021546A2 (en) 2002-08-09 2004-03-11 Conti Temic Microelectronic Gmbh Means of transport with a three-dimensional distance camera and method for the operation thereof
WO2004047047A1 (en) 2002-11-15 2004-06-03 Philips Intellectual Property & Standards Gmbh Method and system for avoiding traffic collisions
US20040217851A1 (en) * 2003-04-29 2004-11-04 Reinhart James W. Obstacle detection and alerting system
WO2006079165A1 (en) * 2005-01-25 2006-08-03 Alert Systems Pty Ltd Proximity warning system
GB2452829A (en) * 2007-09-12 2009-03-18 Spillard Safety Systems Ltd Decentralised GPS based anti-collision system for vehicles and pedestrians
US20090259400A1 (en) * 2008-04-15 2009-10-15 Caterpillar Inc. Vehicle collision avoidance system

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8779934B2 (en) 2009-06-12 2014-07-15 Safemine Ag Movable object proximity warning system
US9129509B2 (en) 2009-06-12 2015-09-08 Safemine Ag Movable object proximity warning system
US8994557B2 (en) 2009-12-11 2015-03-31 Safemine Ag Modular collision warning apparatus and method for operating the same
EP3413287A1 (en) 2010-04-19 2018-12-12 SMR Patents S.à.r.l. Method for indicating to a driver of a vehicle the presence of an object at least temporarily moving relative to the vehicle

Also Published As

Publication number Publication date
CA2802122A1 (en) 2011-12-15
AU2010355231A1 (en) 2013-01-10
AU2010355231B2 (en) 2014-11-20
CA2802122C (en) 2016-05-31

Similar Documents

Publication Publication Date Title
US9797247B1 (en) Command for underground
US9457718B2 (en) Obstacle detection system
US8280621B2 (en) Vehicle collision avoidance system
JP7067067B2 (en) Traffic light recognition device and automatic driving system
AU2017261540B2 (en) Command for underground
US6919917B1 (en) Device for monitoring the environment of a vehicle being parked
AU2014213529B2 (en) Image display system
CN108749813B (en) Automatic parking system and parking method
US10493622B2 (en) Systems and methods for communicating future vehicle actions to be performed by an autonomous vehicle
WO2015111422A1 (en) Periphery monitoring device for work machine
CN108230749A (en) Vehicle and its control method
US20190265736A1 (en) Information provision system, vehicular device, and non-transitory computer-readable storage medium
JP4719590B2 (en) In-vehicle peripheral status presentation device
CA2660215A1 (en) Vehicle collision avoidance system
US20120287277A1 (en) Machine display system
JP6801786B2 (en) Parking support method and parking support device
JP2007164328A (en) Vehicle run support system
WO2011130861A1 (en) Object proximity warning system and method
US20160148421A1 (en) Integrated Bird's Eye View with Situational Awareness
CN110497919A (en) Automotive vehicle is transformed into the object space history playback of manual mode from autonomous mode
CN107406072A (en) Vehicle assisted system
CA2802122C (en) Method and control unit for controlling a display of a proximity warning system
US11697425B1 (en) Method and system for assisting drivers in locating objects that may move into their vehicle path
JP2008097279A (en) Vehicle exterior information display device
JP2008046766A (en) Vehicle external information display device

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 10725369

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2802122

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2010355231

Country of ref document: AU

Date of ref document: 20100610

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 10725369

Country of ref document: EP

Kind code of ref document: A1