CA2802122C - Method and control unit for controlling a display of a proximity warning system - Google Patents
- Publication number
- CA2802122C (application CA2802122A)
- Authority
- CA
- Canada
- Prior art keywords
- positional information
- cameras
- camera
- display
- subset
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
Abstract
The present idea refers to a method and a control unit for controlling a display (19) of a proximity warning system. Vehicles and other objects (4a, 4b, 4c, 5, 6, 7, 8), for example in a surface mine (1), are equipped with cameras (12) for providing images of different scenes. A control unit (13) of such object (4a, 4b, 4c, 5, 6, 7, 8) receives a signal representing positional information of such object from a radio based positioning receiver (11). Dependent on the positional information, a subset of at least one camera (12) is selected, and a control signal is provided for the display (19) to display images provided by the selected subset of one or more cameras (12). By such method, the most relevant scene in terms of collision avoidance can be displayed to the operator.
Description
Method and control unit for controlling a display of a proximity warning system

Technical Field

The invention relates to a method and a control unit for controlling a display of a proximity warning system.
Background Art

Surface mines and similar sites or areas are generally operated by means of a large number of vehicles, some of which may be exceedingly large and difficult to maneuver and have very limited visibility for the operator.
Collision and/or proximity warning systems are established for conventional automobiles as well as for extra-large vehicles.
Proximity warning systems in form of park distance control systems make use of ultrasonic sensors located in the bumpers of a car. More sophisticated systems may rely on different sensors such as three dimensional distance cameras as proposed in WO 2004/021546 A2. There, it is suggested to provide at least a forward, a backward and a sideward looking camera at a passenger car.
For extra-large vehicles used in mining, WO 2004/047047 A2 suggests using satellite supported radio positioning receivers on board of the vehicles and other objects, such as cranes, for generating proximity warnings in order to reduce the risk of collisions. Another approach based on GNSS receivers is disclosed in the International Publication No. WO 2010/142046.
Other approaches for extra-large vehicles are introduced in "Avoiding accidents with mining vehicles", retrieved from the Internet at http://www.flir.com/uploadedFiles/Eurasia/MMC/Appl_Stories/AS 0020 EN.pdf and accessed on February 2, 2010. Sensors for avoiding collisions may include radar systems, conventional cameras or thermal imaging cameras.
In non-conventional types of vehicles, such as the vehicles used in mining, each camera may display its image on a display installed in the driving cab. The more cameras are available, the more image information the driver is exposed to, such that the driver may be distracted by images not relevant for collision avoidance, or may be overstrained by monitoring the output of all available cameras.
It is generally desirable to overcome or ameliorate one or more of the above described difficulties, or to at least provide a useful alternative.
Summary of Invention

According to the present invention, there is provided a method for controlling a display of a proximity warning system, comprising:
receiving a signal representing first positional information of a movable object from a radio based positioning receiver;
dependent on the first positional information selecting a subset of at least one camera out of at least two cameras available for providing images of different scenes; and

providing a control signal for the display to display images provided by the selected subset of one or more cameras, wherein the subset of one or more cameras is selected subject to the first positional information and subject to location information of stationary objects stored in an electronic map.
According to the present invention, there is also provided a computer readable medium on which is stored computer program code means which, when loaded in a processor unit of a control unit, configures the control unit to perform a method as described herein.
According to the present invention, there is also provided a control unit for controlling a display of a proximity warning system, comprising:
a receiving unit for receiving a signal representing first positional information of a movable object from a radio based positioning receiver, a selection unit for selecting a subset of at least one camera out of at least two cameras available for providing images of different scenes dependent on the first positional information and subject to location information of stationary objects stored in an electronic map, and an output for providing a control signal to a display for displaying images provided by the selected subset of one or more cameras.
According to the present invention, there is also provided a proximity warning system comprising a display, at least two cameras for providing images of different scenes, and a control unit for controlling the display, the control unit comprising a receiving unit for receiving a signal representing first positional information of a movable object from a radio based positioning receiver, a selection unit for selecting a subset of at least one camera out of the at least two cameras available for providing images of different scenes dependent on the positional information and subject to location information of stationary objects stored in an electronic map, and an output for providing a control signal to the display for displaying images provided by the selected subset of one or more cameras.
According to the present invention, there is also provided a movable object, comprising a proximity warning system as described herein, wherein the at least two cameras are attached to different locations of the movable object, and wherein the movable object is a vehicle, a crane, a dragline, a haul truck, a digger or a shovel.
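The claimed structure (a receiving unit, a selection unit, and an output) can be illustrated with a minimal sketch. All names, the flat x/y coordinate convention, and the nearest-object selection rule are assumptions for illustration, not anything the claims specify:

```python
import math
from dataclasses import dataclass

@dataclass
class Camera:
    """A camera with a known mounting heading on the object (degrees, 0 = forward)."""
    name: str
    heading_deg: float

class ControlUnit:
    """Sketch of the claimed control unit: receive positional information,
    select a camera subset, and emit a control signal for the display."""

    def __init__(self, cameras, stationary_objects):
        self.cameras = cameras                        # at least two cameras, different scenes
        self.stationary_objects = stationary_objects  # electronic map: name -> (x, y)
        self.position = None

    def receive(self, own_position):
        """Receiving unit: accept positional information from the GNSS receiver."""
        self.position = own_position

    def select(self):
        """Selection unit: pick the camera facing the nearest mapped object."""
        x, y = self.position
        nearest = min(self.stationary_objects.values(),
                      key=lambda p: math.hypot(p[0] - x, p[1] - y))
        # bearing measured from the +y ("forward") axis, clockwise
        bearing = math.degrees(math.atan2(nearest[0] - x, nearest[1] - y)) % 360
        return min(self.cameras,
                   key=lambda c: min(abs(c.heading_deg - bearing),
                                     360 - abs(c.heading_deg - bearing)))

    def control_signal(self):
        """Output: a control signal naming the camera whose images to display."""
        return {"display": self.select().name}
```

A unit with four cameras and one mapped obstacle straight ahead would select the forward-facing camera.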
In this respect, it is desired to improve means in a multi camera based proximity warning system for drawing the attention of the operator to the most relevant camera output(s).
Accordingly, a signal representing positional information is received from a radio based positioning receiver. A subset of at least one camera out of at least two cameras for providing images of different scenes is selected dependent on the positional information. A control signal is provided for the display to display images provided by the selected subset of one or more cameras.
According to another preferred embodiment of the present invention, a control unit is provided for controlling a display. Such control unit comprises a receiving unit for receiving a signal representing positional information of
an object from a radio based positioning receiver. A selection unit is designed for selecting a subset of at least one camera out of at least two cameras available for providing images of different scenes subject to the positional information. At an output of the control unit, a control signal is provided to display images provided by the selected subset of one or more cameras.
The basic idea of the present invention is to provide an aid telling the operator which of the camera outputs to look at, by means of prioritizing such camera(s). For this reason, a GNSS receiver is used for determining the present location of the object the cameras are assigned to and/or the location of an object different to the one the cameras are assigned to. The location of an object, specifically presented as coordinates in a chosen coordinate system, may advantageously be subsumed under the term "positional information" as presently used.
In an advantageous scenario, the GNSS receiver and the cameras are attached to the same object.
An electronic map of preferably stationary objects being critical to traffic on a site may be stored, and the current position of the object, as identified by the GNSS receiver, may be compared or otherwise put into relation to the position of one or more objects listed in such map.
For example, in case the distance between the object and a stationary object listed in the map is, or may possibly become, critical, e.g. as determined by subtracting the two location data from each other, it is decided to which one of the cameras the operator's attention should be drawn, which preferably is the camera that looks into the direction in which the critical object is located.
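The comparison against the electronic map ("subtracting the two location data from each other") can be sketched as follows. The warning radius, the flat-coordinate model, and the returned bearing convention are illustrative assumptions:

```python
import math

def critical_map_objects(own_pos, electronic_map, warn_radius_m=50.0):
    """Compare the GNSS position against each stationary object in the
    electronic map and return those whose separation falls below a
    warning radius, nearest first, as (name, distance, bearing) tuples."""
    ox, oy = own_pos
    critical = []
    for name, (mx, my) in electronic_map.items():
        dist = math.hypot(mx - ox, my - oy)   # "subtracting the two location data"
        if dist < warn_radius_m:
            # bearing from the +y axis, clockwise; used to pick the camera
            bearing = math.degrees(math.atan2(mx - ox, my - oy)) % 360
            critical.append((name, dist, bearing))
    return sorted(critical, key=lambda t: t[1])
```

The bearing of the nearest returned object would then be matched against the camera headings, as in the selection step above.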
In another advantageous scenario, in which the GNSS receiver and the cameras are attached to the same object, other objects, including movable objects, may be equipped with GNSS receivers, too, for determining their respective positions and/or trajectories. Such positional information is broadcast or individually transmitted by
The basic idea of the present invention is to provide an aid to the operator at which of the camera lo outputs to look at by means of prioritizing such cam-era(s). For this reason, a GNSS receiver is used for de-termining the present location of the object the cameras are assigned to and/or the location of an object differ-ent to the one the cameras are assigned to. The location of an object, specifically presented as coordinates in a chosen coordinate system, may advantageously be subsumed under the term "positional information" as presently used.
In an advantageous scenario, the GNSS re-ceiver and the cameras are attached to the same object.
An electronic map of preferably stationary objects being critical to traffic on a site may be stored, and the cur-rent position of the object as identified by the GNSS re-ceiver, may be compared or otherwise put into relation to the position of one or more objects listed in such map.
For example, in case of the distance between the object and a stationary object listed in the map being or possi-bly becoming critical, e.g. by subtracting the two loca-tion data from each other, it is decided to which one of the cameras to draw the operators attention to which preferably is the camera that looks into the direction the critical object is located at.
In another advantageous scenario, the GNSS
receiver and the cameras are attached to the same object, other objects including movable objects may be equipped with GNSS receivers, too, for determining their respec-tive positions and/or trajectories. Such positional in-formation is broadcast or individually transmitted by
these objects to other objects on the site equipped with a corresponding receiver. By means of such positional information shared amongst objects on the site, the direction, the distance, and also any approaching velocity with respect to one or more other objects around may be determined at the present object. As soon as one or more of these parameters becomes critical in terms of proximity and/or a collision scenario, it is again determined to which of the cameras the operator's attention should be drawn, which preferably is the camera that looks into the direction in which the critical object is located. Summarizing, by means of other objects, e.g. operated and located on the same site, being equipped with GNSS receivers, too, and an infrastructure enabling these objects to exchange information about their current location, information about the existence, the distance to, and the direction of such other objects in the vicinity can be generated.
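The three parameters the description derives from shared positional information (distance, direction, approaching velocity) can be computed from two successive position fixes of each object. This is a sketch under assumed flat coordinates and a fixed report interval; the description does not prescribe any particular computation:

```python
import math

def relative_kinematics(own_track, other_track, dt):
    """From two successive (x, y) fixes of the present object and of another
    object (received over radio), derive the current distance, the bearing,
    and the approaching (closing) velocity over the interval dt seconds."""
    (ox0, oy0), (ox1, oy1) = own_track
    (px0, py0), (px1, py1) = other_track
    d0 = math.hypot(px0 - ox0, py0 - oy0)          # separation at the earlier fix
    d1 = math.hypot(px1 - ox1, py1 - oy1)          # separation now
    bearing = math.degrees(math.atan2(px1 - ox1, py1 - oy1)) % 360
    closing_speed = (d0 - d1) / dt                 # positive when objects approach
    return d1, bearing, closing_speed
```

Any of the three outputs crossing a criticality threshold would trigger the camera selection described above.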
In another advantageous scenario, the GNSS receiver and the cameras are attached to different objects, the object comprising the cameras not necessarily including a GNSS receiver. However, other objects on a site may be equipped with a GNSS receiver and broadcast or otherwise transmit their positional information to other objects on the site being equipped with a corresponding receiving unit. In case the object equipped with the cameras receives such positional information via its receiving unit, the positional information may be evaluated, and the direction and/or the distance and/or the approaching velocity of near-by or distant objects may be considered as critical in terms of a collision or a pre-collision scenario. Again, the selection step can be implemented the same way as described above, and the display is controlled such that, for the operator of the object the cameras are attached to, emphasis is put on the one or more cameras looking into the direction in which another object is detected.
The general purpose of the selection step is to make the operator focus on the one or more cameras by which a potential danger is currently being filmed, under the assumption that there are at least two cameras available filming different scenes, i.e. preferably different scenes around the object the cameras are attached to.
Consequently, it is ensured that the image information being most relevant, especially in terms of proximity warning including collision avoidance, is displayed on the display. The determination which one(s) of the cameras currently monitors the most relevant scene is performed by a selection unit comprised in the control unit. In particular, the location information provided by a GNSS receiver is analyzed in terms of proximity to potentially dangerous objects.
By automatically selecting the camera currently monitoring the scene of most interest and by displaying image information delivered from this camera, the personnel in charge of the safe operation of such object, being e.g. a vehicle, may not be distracted by a multitude of image information but instead may focus on the most relevant image information being displayed.
For advantageous embodiments it is referred to the dependent claims. It is noted that embodiments referred to or claimed only in connection with the method are deemed to be disclosed in connection with the apparatus, too, and vice versa.
Brief Description of the Drawings

A number of embodiments of the present invention will now be described by way of example only and with reference to the accompanying drawings, in which the figures show:
Fig. 1 a schematic representation of a mining site,

Fig. 2 a block diagram of a monitoring system according to an embodiment of the present invention,

Fig. 3 a top view on a schematic vehicle with cameras mounted according to an embodiment of the present invention,

Fig. 4 a display,

Fig. 5 a block diagram of another monitoring system according to an embodiment of the present invention, and

Fig. 6 another display.
Modes for Carrying Out the Invention

In the present application, an "image" is understood as being the output of a camera filming a scene. This can be a camera working in the visible range of light but also a camera working in the infrared range. Such image typically visualizes the scene on a display. When talking about different images it is inherently understood that these images are generated by different cameras, typically simultaneously. In this respect, "image information" may include any information provided by such camera, and, in particular, the image itself.
The cameras used provide images of "different scenes". A scene is "different" to another scene whenever the cameras involved do not shoot or scan the same perspective. Cameras may not shoot the same perspective, for example, when they are attached to different locations of an object. In the context of the present application, it is preferred that the at least two cameras are mounted on the same object, which may be a movable object such as a vehicle, or a stationary object such as a building. In such scenario, it is preferred that the cameras are arranged such that they scan different sides of the object they are attached to in order to detect other objects in proximity at different or even all sides of the object.
A "section" assigned to a camera is under-stood as - e.g. when shooting with a camera horizontally - the horizontal area in front of the camera in which the camera is able to monitor scenes in, and that conse-quently can be displayed in an image. Typically, a sec-tion of a camera may include a sector.
The "control unit" may be embodied in hard-ware, in software or both, and may be embodied in a sin-gle device, or its functions may be decentralized. Its lo functional building block "selection unit" may also be embodied in hardware, in software or both.
The "display" may have the form of a single display for displaying images from a single source, or may allow displaying images from many different sources, 15 i.e. cameras, simultaneously. The "display" also encom-passes the totality of a multitude of separated displays which are, for example, distributed in the drivers cab of a vehicle. Summarizing, the display includes any display-ing means for displaying image information delivered by 20 the cameras.
The "control signal to display images" trig-gers at least displaying the image selected for display-ing. The control signal may evoke additional action sub-ject to what is displayed during the normal mode of op-25 eration, i.e. the default mode when there is no object in the vicinity detectable: If, for example, the display regularly shows images of a single camera source only, the control signal may cause to switch from displaying images from the current camera source to displaying im-30 ages from the camera source selected according to the present idea. If the current image source by chance coin-cides with the selected image source, there may be no change visible to the monitoring person. In some embodi-ments, the control signal causes to highlight the se-35 lected images for drawing the attention to the subject images, e.g. by a flashing frame, or other means. If, for example, the display by default displays images from various sources, the control signal may cause that the entire display now displays images only from the selected source. Or, the control signal may cause images from other sources being shut down or completely masked or visually downsized in order to emphasize the selected im-ages. The selected image may, as indicated, claim the en-tire display screen, or remain in an unchanged image size on the screen. The control signal may include zooming in the selected image. The control signal may additionally cause acoustic warnings to be issued. The control signal may either comprise the selected image itself provided the images are supplied by the cameras to the control unit, or it may cause the subject cameras to directly route the requested image to the display, or it may cause the display to accept only display information from the camera as selected. All the above holds true also for the selection of multiple images if appropriate.
A "warning system" and a corresponding "warn-ing" activity may refer to any suitable activity for drawing the attention of the driver or operator to what might be identified as a scene that may become critical in terms of collision or proximity, including selecting an image to be displayed. Such warning system may primar-ily include the display which the cameras supply with im-ages, but may additionally include acoustic means such as a horn, a diaphone or a speaker, and possibly other vis-ual means such as one or more LEDs, a flashlight, etc..
The warning character of the images displayed may be especially emphasized by displaying the images intermittently, or by alternating between the image information and its inverse colors, or by overlaying the image information with visual warning symbols. Any warning in addition to the warning provided by the bare display of the images or selected images may be issued in combination with the control signal, such that the control signal may activate such additional warnings, too. In other embodiments, a control signal separate from the control signal for the display may be issued subject to range information derived from the positional information delivered by the one or more GNSS receivers. For example, a first control signal for the display may be issued based on a first threshold condition for the object still being distant with respect to the present object, and a separate control signal for an acoustic warning element may be issued based on a second threshold condition for the object being very close with respect to the present object.
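The two-stage threshold scheme just described can be sketched as below: a display control signal at a first, farther threshold, and an additional acoustic control signal at a second, nearer threshold. The threshold values are illustrative assumptions, not values from the description:

```python
def warnings_for_range(distance_m, display_threshold_m=100.0, horn_threshold_m=25.0):
    """Return the control signals to issue for a given range to another
    object: a display signal below the first (distant) threshold, plus a
    separate acoustic signal below the second (very close) threshold."""
    signals = []
    if distance_m < display_threshold_m:
        signals.append("display")      # first control signal: show the relevant camera
    if distance_m < horn_threshold_m:
        signals.append("acoustic")     # separate control signal: horn / speaker
    return signals
```

The range input would come from the positional-information comparison described earlier.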
The term "radio based positioning system"
stands for a GNSS or for any other type of positioning lo system based on radio signals, such as a pseudolite sys-tem. The term "GNSS" stands for "Global Navigation Satel-lite System" and encompasses all satellite based naviga-tion systems, including GPS and Galileo. A "receiver" is a receiver designed for receiving information from satel-lites and for determining its position subject to the signals received.
A "movable object" is any object that can change and is expected to change its position and/or ori-entation or configuration in space. It may e.g. be a 20 truck or any other vehicle that moves from place to place and changes its orientation with respect to the general north-south direction, e.g. by steering, or it may be an object positioned at a fixed location but able to rotate about its axis or to change its physical configuration, 25 e.g. by extending an arm, in such a manner that the vol-ume of safety space attributed to it varies in signifi-cant manner.
Fig. 1 schematically depicts a site 1, such as a surface mine. Typically, such a site covers a large area, in the case of a surface mine e.g. in the range of square kilometers, with a network of roads 2 and other traffic ways, such as rails 3. A plurality of objects is present in the mine, such as:
- Large vehicles, such as haul trucks 4a, cranes 4b or diggers 4c. Vehicles of this type may easily weigh several 100 tons, and they are generally difficult to control, have very large braking distances, and a large number of blind spots that the driver is unable to visually monitor without monitoring cameras.
- Medium sized vehicles 5, such as regular trucks. These vehicles are easier to control, but they
receivers, too, and an infrastructure enabling these ob-15 jects to exchange information about their current loca-tion, information about the existence, the distance to, and the direction of such other objects in the vicinity can be generated.
In another advantageous scenario, the GNSS
20 receiver and the cameras are attached to different ob-jects, the object comprising the cameras not necessarily including a GNSS receiver. However, other objects on a site may be equipped with a GNSS receiver and broadcast or otherwise transmit their positional information to 25 other objects on the site being equipped with a corre-sponding receiving unit. In case, the object equipped with the cameras receives such positional information via its receiving unit, the positional information may be evaluated and the direction and/or the distance and/or 30 the approaching velocity of near-by or distant objects may be considered as critical in terms of a collision or a pre-collision scenario. Again, the selection step can be implemented the same way as described above, and the display is controlled such that for the operator of the 35 object the cameras are attached to emphasis is put on the one or more cameras looking into the direction another object is detected.
The general purpose of the selection step is to make the operator focus to the one or more cameras by which a potential danger is currently being filmed under the assumption that there are at least two cameras avail-able filming different scenes, i.e. preferably different scenes around the object the cameras are attached to.
Consequently, it is ensured that the image information being most relevant especially in terms of proximity warning including collision avoidance is displayed on the lo display. The determination which_one/s of the cameras currently monitors the most relevant scene is performed by a selection unit comprised in the control unit. In particular, the location information provided by a GNSS
receiver is analyzed in terms of proximity to potentially n dangerous objects.
By automatically selecting the camera cur-rently monitoring the scene of most interest and by dis-playing image information delivered from this camera, the personnel in charge for a safe operation of such object 20 being e.g. a vehicle may not be distracted by a multitude of image information but instead may focus on the most relevant image information being displayed.
For advantageous embodiments it is referred to the dependent claims. It is noted that embodiments re-25 ferred to or claimed only in connection with the method are deemed to be disclosed in connection with the appara-tus, too, and vice versa.
30 Brief Description of the Drawings A number of embodiments of the present inven-tion will now be described by way of example only and with reference to the accompanying drawings, in which the 35 figures show:
Fig. 1 a schematic representation of a mining site, Fig. 2 a block diagram of a monitoring system according to an embodiment of the present invention, Fig. 3 a top view on a schematic vehicle with cameras mounted according to an embodiment of the present invention, Fig. 4 a display, Fig. 5 a block diagram of another monitoring system according to an embodiment of the present inven-tion, and Fig. 6 another display.
Modes for Carrying Out the Invention In the present application, an "image" is un-derstood as being the output of a camera filming a scene.
This can be a camera working in the visible range of light but also a camera working in the infrared range.
Such image typically visualizes the scene on a display.
When talking about different images it is inherently understood that these images are generated by different cameras, typically simultaneously. In this respect, "image information" may include any information provided by such camera, and, in particular, the image itself.
The cameras used provide images of "different scenes". A scene is "different" to another scene whenever the cameras involved do not shoot or scan the same perspective. Cameras may not shoot the same perspective, for example, when they are attached to different locations of an object. In the context of the present application, it is preferred that the at least two cameras are mounted on the same object, which may be a movable object such as a vehicle, or a stationary object such as a building. In such scenario, it is preferred that the cameras are arranged such that they scan different sides of the object they are attached to in order to detect other objects in proximity at different or even all sides of the object.
A "section" assigned to a camera is under-stood as - e.g. when shooting with a camera horizontally - the horizontal area in front of the camera in which the camera is able to monitor scenes in, and that conse-quently can be displayed in an image. Typically, a sec-tion of a camera may include a sector.
The "control unit" may be embodied in hard-ware, in software or both, and may be embodied in a sin-gle device, or its functions may be decentralized. Its lo functional building block "selection unit" may also be embodied in hardware, in software or both.
The "display" may have the form of a single display for displaying images from a single source, or may allow displaying images from many different sources, 15 i.e. cameras, simultaneously. The "display" also encom-passes the totality of a multitude of separated displays which are, for example, distributed in the drivers cab of a vehicle. Summarizing, the display includes any display-ing means for displaying image information delivered by 20 the cameras.
The "control signal to display images" trig-gers at least displaying the image selected for display-ing. The control signal may evoke additional action sub-ject to what is displayed during the normal mode of op-25 eration, i.e. the default mode when there is no object in the vicinity detectable: If, for example, the display regularly shows images of a single camera source only, the control signal may cause to switch from displaying images from the current camera source to displaying im-30 ages from the camera source selected according to the present idea. If the current image source by chance coin-cides with the selected image source, there may be no change visible to the monitoring person. In some embodi-ments, the control signal causes to highlight the se-35 lected images for drawing the attention to the subject images, e.g. by a flashing frame, or other means. If, for example, the display by default displays images from various sources, the control signal may cause that the entire display now displays images only from the selected source. Or, the control signal may cause images from other sources being shut down or completely masked or visually downsized in order to emphasize the selected im-ages. The selected image may, as indicated, claim the en-tire display screen, or remain in an unchanged image size on the screen. The control signal may include zooming in the selected image. The control signal may additionally cause acoustic warnings to be issued. The control signal may either comprise the selected image itself provided the images are supplied by the cameras to the control unit, or it may cause the subject cameras to directly route the requested image to the display, or it may cause the display to accept only display information from the camera as selected. All the above holds true also for the selection of multiple images if appropriate.
A "warning system" and a corresponding "warn-ing" activity may refer to any suitable activity for drawing the attention of the driver or operator to what might be identified as a scene that may become critical in terms of collision or proximity, including selecting an image to be displayed. Such warning system may primar-ily include the display which the cameras supply with im-ages, but may additionally include acoustic means such as a horn, a diaphone or a speaker, and possibly other vis-ual means such as one or more LEDs, a flashlight, etc..
The warning character of the images displayed may be especially emphasized by displaying the images intermittently, or by alternating between the image information and its inverse colors, or by overlaying the image information with visual warning symbols. Any warning in addition to the warning provided by the bare display of the images or selected images may be issued in combination with the control signal, such that the control signal may activate such additional warnings, too. In other embodiments, a control signal separate from the control signal for the display may be issued subject to range information derived from the positional information delivered by the one or more GNSS receivers. For example, a first control signal for the display may be issued based on a first threshold condition for the object still being distant with respect to the present object, and a separate control signal for an acoustic warning element may be issued based on a second threshold condition for the object being very close with respect to the present object.
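The two-threshold scheme described above can be sketched as follows. This is a minimal illustration only; the function names and threshold values are assumptions for the example, not part of the specification.

```python
import math

# Illustrative thresholds (metres); actual values would be site-specific.
DISPLAY_THRESHOLD_M = 150.0   # object still distant: switch the display
ACOUSTIC_THRESHOLD_M = 30.0   # object very close: additionally sound a warning

def distance_m(p1, p2):
    """Planar distance between two (x, y) site positions in metres."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def warnings_for(own_pos, other_pos):
    """Derive warning actions from range information between two objects."""
    d = distance_m(own_pos, other_pos)
    actions = set()
    if d < DISPLAY_THRESHOLD_M:
        actions.add("display")    # first control signal: select camera image
    if d < ACOUSTIC_THRESHOLD_M:
        actions.add("acoustic")   # second control signal: e.g. the beeper
    return actions
```

With these assumed values, an object 100 m away would trigger only the display switch, while one 10 m away would trigger both the display and the acoustic element.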
The term "radio based positioning system"
stands for a GNSS or for any other type of positioning lo system based on radio signals, such as a pseudolite sys-tem. The term "GNSS" stands for "Global Navigation Satel-lite System" and encompasses all satellite based naviga-tion systems, including GPS and Galileo. A "receiver" is a receiver designed for receiving information from satel-lites and for determining its position subject to the signals received.
A "movable object" is any object that can change and is expected to change its position and/or ori-entation or configuration in space. It may e.g. be a 20 truck or any other vehicle that moves from place to place and changes its orientation with respect to the general north-south direction, e.g. by steering, or it may be an object positioned at a fixed location but able to rotate about its axis or to change its physical configuration, 25 e.g. by extending an arm, in such a manner that the vol-ume of safety space attributed to it varies in signifi-cant manner.
Fig. 1 schematically depicts a site 1, such as a surface mine. Typically, such a site covers a large area, in the case of a surface mine e.g. in the range of square kilometers, with a network of roads 2 and other traffic ways, such as rails 3. A plurality of objects is present in the mine, such as:
- Large vehicles, such as haul trucks 4a, cranes 4b or diggers 4c. Vehicles of this type may easily weigh several hundred tons, and they are generally difficult to control, have very large braking distances, and a large number of blind spots that the driver is unable to visually monitor without monitoring cameras.
- Medium sized vehicles 5, such as regular trucks. These vehicles are easier to control, but they still have several blind spots and require a skilled driver.
- Small vehicles 6. Typically, vehicles of this type weigh 3 tons or less. They comprise passenger vehicles and small lorries.
- Trains 7.
A further type of object within the mine is comprised of stationary obstacles, such as temporary or permanent buildings 9, open pits, boulders, non-movable excavators, stationary cranes, deposits, etc.
The risk of accidents in such an environment is high. In particular, the large sized vehicles can easily collide with other vehicles, or obstacles.
For this reason, objects according to an embodiment present in a mine 1 and subject to potential collision may be equipped with at least one GNSS receiver 11, a control unit per object, at least two cameras (not shown in Fig. 1) and a display per object (not shown in Fig. 1). Large objects may provide more than one GNSS receiver 11 per object as shown in Fig. 1. The entirety of these elements per object for generating a proximity warning is called a monitoring system. The GNSS receivers 11 interact with satellites 30 for determining the positional information of the object they are mounted to.
Figure 2 illustrates a block diagram of a monitoring system including a control unit 13 according to an embodiment of the present invention. A receiver 17 of the control unit 13 is connected to cameras 12. An output 16 of the control unit 13 is connected to a display 19 and a beeper as warning means. Both connections may be implemented as wireless connections or as wired connections. One or more connections can be implemented via bus connections. Each camera 12 delivers a series of images with respect to the scene monitored by the respective camera 12. Preferably, each of the cameras 12 looks into a different direction for monitoring different scenes with respect to the object these cameras are attached to.
The monitoring system further comprises a radio based positioning receiver 11, attached to the present object. The receiver 11 provides a signal comprising positional information, i.e. the position of the present object, determined in combination with satellites 30 as shown in Figure 1. Such signal may be received by a receiving unit 15 in the control unit 13.
The control unit 13 comprises a microprocessor system 14, which controls the operations of the control unit 13. A memory 18 comprises programs as well as various parameters, such as unique identifiers of the cameras. Such programs may comprise instructions for evaluating the positional information, and for selecting a subset of cameras currently providing the most significant image information.
The radio based positioning receiver 11 may provide positional information of the location it is located at, which represents the location of the object it is attached to. Provided that other moving or stationary objects on the site are equipped with such receivers 11, too, the positional information related to the various objects may be shared between the control units of these objects, such that by comparing positional information stemming from positioning receivers located on different objects, proximity and even approximation can be detected. For further details, reference is made to PCT/CH2009/000394, which is incorporated herein by reference.
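The detection of approximation, i.e. of two objects approaching each other, can be illustrated by comparing successive distances derived from the shared positional information. This is a hedged sketch under the assumption of a planar two-dimensional site frame; the function name is illustrative only.

```python
import math

def is_approaching(own_track, other_track):
    """Detect approach by comparing the last two inter-object distances.

    Each track is a list of (x, y) positions, newest last; the positions
    would stem from the GNSS receivers of the two objects involved.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    d_prev = dist(own_track[-2], other_track[-2])
    d_now = dist(own_track[-1], other_track[-1])
    return d_now < d_prev
```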
Position information of the present object provided by the radio based positioning receiver 11 is transferred to the control unit 13 and evaluated there.
Advantageously, such evaluation takes into account positional information received from other objects, gathered by their own radio based positioning receivers and transmitted e.g. by a wireless interface not shown in Figure 2. By way of evaluating the positional information from these different sources, a proximity situation may be detected. If such proximity situation is detected by means of the positional information, a control signal may be issued which activates displaying the image from the camera looking into the direction where the proximate object is located. The selected image represents the camera that currently films the proximate object, which is of most interest to be monitored by the operator in order to avoid a collision. This is why image information stemming from this camera is emphasized in being presented to the personnel via the display.
Figure 2 shows an electronic map 40 stored in the control unit 13 which holds location information significant of stationary objects located on the site where the monitoring system of Figure 2 is in use. The positional information supplied by the GNSS receiver 11 is compared or otherwise put in relation to the location information of the stationary objects. In case sufficient proximity or approximation is detected between these objects, the camera 12 looking into the direction of the stationary object is selected for displaying its image exclusively on the display 19.
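The comparison of the GNSS position against the electronic map 40 can be sketched as a simple proximity query. The map contents, coordinates and threshold below are invented for illustration; a real map would hold the surveyed positions of the site's stationary obstacles.

```python
import math

# Hypothetical electronic map: stationary objects with known site coordinates.
STATIONARY_OBJECTS = {
    "building_9": (1200.0, 340.0),
    "open_pit":   (880.0, 1015.0),
}

PROXIMITY_THRESHOLD_M = 100.0  # assumed trigger distance

def nearby_stationary(own_pos):
    """Compare the GNSS position with the map; list objects in proximity."""
    hits = []
    for name, pos in STATIONARY_OBJECTS.items():
        if math.hypot(pos[0] - own_pos[0], pos[1] - own_pos[1]) < PROXIMITY_THRESHOLD_M:
            hits.append(name)
    return hits
```

Each hit would then be mapped to the camera 12 looking into the direction of the corresponding stationary object.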
In another embodiment, the object is equipped with another sensor (not shown) for measuring the distance to another object, such as a radio detection and ranging device, a light detection and ranging device, or a sound detection and ranging device. A signal is received from such sensor, and the subset of one or more cameras additionally may be selected based on the distance information provided by such sensor. There may be multiple sensors arranged at different sides of a vehicle. These sensors may operate for detecting near-by objects, and in particular objects not tagged with a GNSS receiver, by that providing additional information on the surrounding of the vehicle. Such sensors may individually trigger the selection of the camera(s) through the control unit (13) and preferably cover similar sectors as the cameras.
Figure 3 illustrates a schematic top view on a vehicle 6 equipped with four cameras 12, one located at each side of the vehicle 6, and a single GNSS receiver 11. Sections monitored by each camera 12 are indicated by sector lines and referred to by 121. This makes each camera 12 scan a different scene at each side of the vehicle 6. Alternatively, the cameras 12 can be located at the edges of the vehicle 6. Both arrangements are beneficial for covering a large area in the vicinity of the object for proximity including collision detection purposes.
Provided that second positional information is received from an object different to the present vehicle 6, the selection of the camera may be based on the positional information with respect to the present vehicle 6 and such second positional information. Analyzing the positional information of both of the objects may allow identification of the direction the other object is located at with respect to the vehicle 6, and the distance between the vehicle and the other object. In case the other object is located at a position 200 to the left hand side of the vehicle 6, the section 121 of the left hand camera 12 is identified as relevant section 121 when mapping the position of the other object 200 to the sections 121 of the cameras 12 of the vehicle. For such mapping, it is beneficial to permanently monitor the orientation of the vehicle 6, which may alter when moving the vehicle. This may involve e.g. a compass or any other means for determining the orientation of the vehicle with respect to the coordinate system the GNSS makes use of.
The identified section 121 makes the associated camera 12 the preferred camera for selection. As a result, this camera 12 will exclusively provide images of this proximity situation to the operator, provided the distance to the object 200 is not so large that any selection is suppressed. The first and second positional information may be used for determining the distance between the other object and the vehicle. The distance information may be included in the selection step, and the image of the camera corresponding to the identified section may only be selected when the determined distance between the objects is below a given threshold. Otherwise, it is assumed that the other object still is too far away for justifying a warning to the operator.
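The mapping of the other object's position to a camera section, including the orientation compensation and the distance threshold described above, can be sketched as follows. The four 90-degree sectors and the coordinate conventions (x east, y north, bearing clockwise from north) are assumptions for the example.

```python
import math

# Four cameras, one per side; centre bearing of each assumed 90-degree
# sector relative to the vehicle's longitudinal axis (cf. sections 121).
SECTORS = {"front": 0.0, "right": 90.0, "rear": 180.0, "left": 270.0}

def select_camera(own_pos, own_heading_deg, other_pos, max_range_m=150.0):
    """Map the other object's position to a camera sector.

    own_heading_deg: vehicle orientation w.r.t. the GNSS coordinate frame,
    e.g. from a compass. Returns the camera name, or None when the object
    is beyond the warning range and selection is suppressed.
    """
    dx = other_pos[0] - own_pos[0]
    dy = other_pos[1] - own_pos[1]
    if math.hypot(dx, dy) > max_range_m:
        return None
    # Bearing of the other object in the global frame (0 deg = north).
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    # Rotate into the vehicle frame using the monitored orientation.
    relative = (bearing - own_heading_deg) % 360.0
    # Pick the sector whose centre is closest to the relative bearing.
    def angular_dist(a, b):
        return min((a - b) % 360.0, (b - a) % 360.0)
    return min(SECTORS, key=lambda cam: angular_dist(relative, SECTORS[cam]))
```

For a north-facing vehicle, an object due west maps to the left hand camera, matching the situation of object 200 in Figure 3.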
Given that a third object 300 is in proximity to the vehicle 6 and given that the second object 200 still is at its position as illustrated in Figure 3, the position of the third object 300 may be determined with respect to the sections 121 of the cameras 12 of the vehicle 6. Hence, the section 121 to the right hand side of the vehicle 6 is identified as the section into which the object 300 maps/falls. The right hand side camera 12 is associated to this section 121. For this example, the object 200 now is assumed to be at a distance from the vehicle 6 which justifies issuing a warning to the operator.
Subject to the display/warning strategy, both cameras, i.e. the left hand and the right hand camera 12, may be selected for delivering images to the display, e.g. compared to a default display mode where all four cameras 12 show their images on the display. However, following another strategy, only the object closest to the vehicle 6 shall be displayed. By determining the distances between the vehicle 6 and the objects 200 and 300, the camera facing the closest object will be allowed to display the scene it monitors, i.e. the camera 12 to the right hand side, as the object 300 is closer to the vehicle 6 than the object 200.
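The closest-object strategy can be sketched as follows, reusing the sector mapping described above in a self-contained form; the four-sector layout and the function name are again assumptions for illustration.

```python
import math

def camera_for_closest(own_pos, own_heading_deg, other_positions):
    """Among several proximate objects, warn only about the closest one.

    other_positions: list of (x, y) positions of the other objects.
    Returns the name of the camera facing the closest object, or None
    when no other object is known.
    """
    if not other_positions:
        return None
    closest = min(other_positions,
                  key=lambda p: math.hypot(p[0] - own_pos[0], p[1] - own_pos[1]))
    dx, dy = closest[0] - own_pos[0], closest[1] - own_pos[1]
    relative = (math.degrees(math.atan2(dx, dy)) - own_heading_deg) % 360.0
    for cam, centre in (("front", 0.0), ("right", 90.0),
                        ("rear", 180.0), ("left", 270.0)):
        if min((relative - centre) % 360.0, (centre - relative) % 360.0) <= 45.0:
            return cam
    return "front"
```

In the Figure 3 situation, with object 300 to the right closer than object 200 to the left, only the right hand camera would be selected.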
In the above examples, the radio based positioning receiver 11 always is present at the vehicle 6 / object the cameras are attached to. In another embodiment, no such radio based positioning receiver 11 is attached to the object holding the cameras. Instead, the selection of cameras relies only on positional information received from other objects. Such positional information may be sufficient for selecting the one or more cameras by a mapping step equivalent to the one described in connection with the embodiment above. This holds for other objects providing their position information not in an absolute measure but e.g. in relation to the present object, or to any other known object. Or, preferably, means other than radio based positioning means may be provided for allowing an assessment of the position of the other object with respect to the own position. In case the own position is a priori rather limited to a small area, e.g. when the vehicle may move only within a limited radius, even no such additional means are needed, as the own position may be known in advance, stored in the control unit and be used for putting the position of the other object, provided in absolute coordinates, into relation with its own position.
The display 19 in Figure 4 represents a flat panel display offering displaying images from e.g. 8 cameras across its screen. Once the control signal is received from the control unit 13, and provided the control signal identifies only one camera 12 for providing image information most relevant to be displayed, the entire screen of Figure 4 may be reserved for showing the subject image information. In Figure 4, the screen of the display 19 is divided, and the image information selected is displayed on portion 20 of the display. Portion 21 may be reserved for issuing visual warnings, such as a bold "DANGER" symbol or text or other kinds of visual warnings.
The block diagram of Figure 5 differs from the block diagram of Figure 2 only in the way the control signal affects the control of the display. Instead of the control signal carrying the image information itself, the control signal now acts on AND gates 22, each of which AND gates is connected with one of the cameras 12. By activating one of the AND gates by a corresponding control signal, the subject AND gate allows for the associated camera 12 to provide image information to the display 19, while, for example, all the other AND gates are blocked and do not allow for displaying image information from the other cameras 12. There is no need for providing a receiver 17 for the image information in the control unit 13.
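A software analogue of this gating arrangement can be sketched briefly; the function name and data shapes are invented for the example, since the embodiment itself describes hardware AND gates.

```python
def route_images(frames, enabled):
    """Only cameras whose gate is enabled by the control signal forward
    their image to the display, mirroring the AND gates 22 of Fig. 5.

    frames:  dict mapping camera name to its latest frame (any payload)
    enabled: set of camera names activated by the control signal
    """
    return {cam: frame for cam, frame in frames.items() if cam in enabled}
```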
Figure 6 provides another schematic representation of a display 19, which display 19 is divided into four sub-displays 191 - 194, each sub-display 191 - 194 permanently displaying information from a camera assigned. In this embodiment, the control signal only highlights, by a blinking frame or similar, the sub-display 192 which displays image information from the camera 12 selected to be most critical in terms of a potential collision.
While presently preferred embodiments of the invention are shown and described, it is to be distinctly understood that the invention is not limited thereto but may be otherwise variously embodied and practiced within the scope of the following claims.
The monitoring system further comprises a ra-dio based positioning receiver 11, attached to the pre-sent object. The receiver 11 provides a signal comprising positional information, i.e. the position of the present object, determined in combination with satellites 30 as shown in Figure 1. Such signal may be received by a re-ceiving unit 15 in the control unit 13.
The control unit 13 comprises a microproces-sor system 14, which controls the operations of the con-trol unit 13. A memory 18 comprises programs as well as n various parameters, such as unique identifiers of the cameras. Such programs may comprise instructions for evaluating the positional information, and for selecting a subset of cameras currently providing the most signifi-cant image information.
The radio based positioning receiver 11 may provide positional information of the subject location it is located which represents the subject location of the object it is attached to. Provided that other moving or stationary objects on the site are equipped with such re-ceivers 11, too, the positional information related to the various objects may be shared between the control units of these objects, such that by comparing positional information stemming from positioning receivers located on different objects proximity and even approximation can be detected. For further details it is referred to PCT/CH2009/000394 which is incorporated herein by refer-ence.
Position information of the present object provided by the radio based positioning receiver 11 is transferred to the control unit 13 and evaluated there.
Advantageously, such evaluation takes into account posi-tional information received from other objects gathered by their own radio based positioning receivers and trans-mitted e.g. by a wireless interface not shown in Figure 2. By way of evaluating the positional information from these different sources, a proximity situation may be de-tected. If such proximity situation is detected by means of the positional information, a control signal may be issued which activates displaying the image from the cam-era looking into a direction where the proximate object is located at. The selected image represent the camera that currently films the proximate object which is of lo most interest to be monitored by the operator in order to avoid a collision. This is why image information stemming from this camera is emphasized in being presented to the personnel via the display.
Figure 2 shows an electronic map 40 stored in the control unit 13 which holds location information sig-nificant of stationary objects located on the site where the monitoring system of Figure 2 is in use. The posi-tional information supplied by the GNSS receiver 11 is compared or otherwise put in relation to the location in-formation of the stationary objects. In case, sufficient proximity or approximation is detected between thses ob-jects, the camera 12 looking into the direction of the stationary object is selected for displaying its image exclusively on the display 19.
In another embodiment, the object is equipped with another sensor (not shown) for measuring the dis-tance to another object, such as a radio detection and ranging device, a light detection and ranging device, and a sound detection and ranging device. A signal is re-ceived from such sensor, and the subset of one or more cameras additionally may be selected based on the dis-tance information provided by such sensor. There may be multiple sensors arranged at different sides of a vehi-cle. These sensors may operate for detecting near-by ob-jects, and in particular objects not tagged with a GNSS
receiver, by that providing additional information on the surrounding of the vehicle. Such sensors may individually trigger the selection of the camera(s) through the con-trol unit (13) and preferably cover similar sectors as the cameras.
Figure 3 illustrates a schematic top view on a vehicle 6 equipped with four cameras 12, one located at each side of the vehicle 6, and a single GNSS receiver 11. Sections monitored by each camera 12 are indicated by sector lines and referred to by 121. This makes each cam-era 12 scan a different scene at each side of the vehicle 6. Alternatively, the cameras 12 can be located at the edges of the vehicle 6. Both arrangements are beneficial for covering a large area in the vicinity of the object for proximity including collision detection purposes.
Provided that second positional information is received from an object different to the present vehi-6, the selection of the camera may be based on the positional information with respect to the present vehi-cle 6 and such second positional information. Analyzing the positional information of both of the objects may al-low identification of the direction the other object is located at with respect to the vehicle 6, and the dis-tance between the vehicle and the other object. In case the other object is located at a position 200 to the left hand side of the vehicle 6, the section 121 of the left hand camera 12 is identified as relevant section 121 when mapping the position of the other object 200 to the sec-tions 121 of the cameras 12 of the vehicle. For such map-ping, it is beneficial to permanently monitor the orien-tation of the vehicle 6 which may alter when moving the vehicle. This may involve e.g. a compass or any other means for determining the orientation of the vehicle with respect to the coordinate system the GNSS makes use of.
The identified section 121 makes the camera 12 associated to be the preferred camera for selection. As a result, this camera 12 will exclusively provide images of this proximity situation to the operator provided the distance to the object 200 is not that far that any selection is suppressed. The first and second positional information may be used for determining the distance between the ot-her object and the vehicle. The distance information may be included in the selection step, and the image of the camera corresponding to the identified section may only be selected when the determined distance between the ob-jects is below a given threshold. Otherwise, it is as-sumed that the other object still is too far away for justifying a warning to the operator.
Given that a third object 300 is in proximity to the vehicle 6 and given that the second object 200 lo still is at its position as illustrated in Figure 3, the position of the third object 300 may be determined with respect to the sections 121 of the cameras 12 of the ve-hicle 6. Hence, the section 121 to the right hand side of the vehicle 6 is identified as section the object 300 15 maps/falls. The right hand side camera 12 is associated to this section 121. For this example, the object 200 now is assumed to be at a distance from the vehicle 6 which justifies issuing a warning to the operator.
Subject to the display/warning strategy both 20 cameras, i.e. the left hand and the right hand camera 12 may be selected for delivering images to the display, e.g. compared to a default display mode where all four cameras 12 show their images on the display. However, following another strategy, only the object closest to 25 the vehicle 6 shall be displayed. By determining the dis-tances between the vehicle 6 and the objects 200 and 300, the image of the camera being mounted to the object being closest will be allowed to display the scene it monitors, i.e. the camera 12 to the right hand as the object 300 is 30 closer to the vehicle 6 than the object 200.
In the above examples, the radio based posi-tioning receiver 11 always is present at the vehicle 6 /
object the cameras are attached to. In another embodi-ment, no such radio based positioning receiver 11 is at-35 tached to the object holding the cameras. Instead, the selection of cameras only relies on positional informa-tion received from other objects. Such positional infor-mation may be sufficient for selecting the one or more cameras by a mapping step equivalent to the one described in connection with the embodiment above. This holds for other objects providing their position information not in an absolute measure but e.g. in relation to the present 5 object, or to any other known object. Or, preferably, means other than radio based positioning means may be provided for allowing an assessment of the position of the other object with respect to the own position. In case the own position is a priori rather limited to a n small area, e.g. when the vehicle may move only within a limited radius, even no such additional means are needed, as the own position may be known in advance, stored in the control unit and be used for putting the position of the other object provided in absolute coordinates into 15 relation with its own position.
The display 19 in Figure 4 represents a flat panel display offering displaying images from e.g. 8 cam-eras across its screen. Once the control signal is re-ceived from the control unit 13, and provided the control signal identifies only one camera 12 for providing image information most relevant to be displayed, the entire screen of Figure 4 may be reserved for showing the sub-ject image information. In Figure 4, the screen of the display 19 is devided, and the image information selected is displayed on portion 20 of the display. Portion 21 may be reserved for issuing visual warnings, such a bold "DANGER" symbol or text or other kind of visual warnings.
The block diagram of Figure 5 differs from the block diagram of Figure 3 only in the way the control signal affects the control of the display. Instead of the control signal carrying the image information itself, the control signal now acts on AND gates 22 each of which AND
gates is connected with one of the cameras 12. By acti-vating one of the AND gates by a corresponding control signal, the subject AND gate allows for the associated camera 12 to provide image information to the display 19, while, for example, all the other AND gates are blocked and do not allow for displaying image information from the other cameras 12. There is no need for providing a receiver 17 for the image information in the control unit 13.
Figure 6 provides another schematic represen-tation of a display 19, which display 19 is divided into four sub-displays 191 - 194, each sub-display 191 - 194 permanently displaying information from a camera as-signed. In this embodiment, the control signal only high-lights the sub-display 192 which displays image informa-lo tion from the camera 12 selected to be most critical in terms of a potential collision by a blinking frame or similar.
While presently preferred embodiments of the invention are shown and described, it is to be distinctly understood that the invention is not limited thereto but may be otherwise variously embodied and practiced within the scope of the following claims.
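Several of the claims below (e.g. claims 5, 10 and 12) condition the camera selection on a distance between objects falling below a threshold. A minimal sketch of such a distance check, assuming two-dimensional coordinates and a hypothetical threshold value of 50 metres (none of these specifics appear in the patent):

```python
def within_warning_range(own_pos, other_pos, threshold_m=50.0):
    """Return True if the other object is closer than threshold_m,
    i.e. the camera section facing it should be selected/highlighted."""
    dx = other_pos[0] - own_pos[0]
    dy = other_pos[1] - own_pos[1]
    distance = (dx * dx + dy * dy) ** 0.5  # Euclidean distance
    return distance < threshold_m
```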
Claims (23)
1. A method for controlling a display of a proximity warning system, comprising:
receiving a signal representing first positional information of a movable object from a radio based positioning receiver;
dependent on the first positional information selecting a subset of at least one camera out of at least two cameras available for providing images of different scenes; and providing a control signal for the display to display images provided by the selected subset of one or more cameras, wherein the subset of one or more cameras is selected subject to the first positional information and subject to location information of stationary objects stored in an electronic map.
2. A method according to claim 1, wherein the cameras available for selection are arranged to provide images of different sections around the movable object.
3. A method according to claim 2, wherein the cameras and the radio based positioning receiver are attached to the same movable object.
4. A method according to claim 1 or 3, wherein second positional information is received with respect to a second object, wherein the second positional information originates from a radio based positioning receiver of the second object, and wherein the subset of at least one camera is selected based on the first positional information and the second positional information.
5. A method according to claim 4, wherein a distance between the objects is determined from the first positional information and the second positional information, and wherein the subset of at least one camera is selected based on the distance.
6. A method according to claim 2, wherein second positional information is received with respect to a second object, wherein the second positional information originates from a radio based positioning receiver of the second object, and wherein the subset of at least one camera is selected based on the first positional information and the second positional information.
7. A method according to claim 6, wherein one of the sections is identified as relevant when mapping the second positional information to the sections, and wherein the camera associated with the identified section is selected in the selection step.
8. A method according to claim 6, wherein a distance between the objects is determined from the first positional information and the second positional information, and wherein the subset of at least one camera is selected based on the distance.
9. A method according to claim 8, wherein one of the sections is identified as relevant when mapping the second positional information to the sections, and wherein the camera associated with the identified section is selected in the selection step.
10. A method according to claim 9, wherein the camera associated with the identified section is selected provided at least one of the distance between the objects and the distance to a crossing point of their trajectories is below a threshold.
11. A method according to claim 2, wherein one of the sections is identified as relevant when mapping the stationary object location information to the sections, and wherein the camera associated with the identified section is selected in the selection step.
12. A method according to claim 11, wherein a distance between the movable object and the stationary object is determined from the first positional information and the stationary object location information, and wherein the camera associated with the identified section is selected provided the determined distance between the movable object and the stationary object is below a threshold.
13. A method according to claim 1, wherein the movable object the radio based positioning receiver is assigned to is different to a second object the cameras available for selection are attached to, wherein the cameras available for selection are arranged to provide images of different sections around the second object, wherein one of the sections is identified as relevant when mapping the first positional information to the sections, and wherein the camera associated with the identified section is selected in the selection step.
14. A method according to any one of claims 1 to 13, wherein a signal is received from at least one sensor for measuring the distance to another object by means different to those of the radio based positioning receiver, wherein the subset of at least one camera is selected based on the first positional information and the distance information provided by the sensor, and wherein the sensor includes at least one of a radio detection and ranging device, a light detection and ranging device, and a sound detection and ranging device.
15. A method according to any one of claims 1 to 14, wherein in a default mode the control signal is designed for allowing images provided by all the cameras available to be displayed, and wherein based on the selection step the control signal is modified for allowing images provided by the one or more selected cameras only to be displayed.
16. A method according to any one of claims 1 to 15, wherein the control signal is provided for the display to display and highlight the images from the selected subset of one or more cameras.
17. A method according to any one of claims 1 to 16, wherein the control signal is designed for triggering one of an acoustic and a visual warning.
18. A computer readable medium on which is stored computer program code means which, when loaded in a processor unit of a control unit, configures the control unit to perform a method as defined in any one of claims 1 to 17.
19. A control unit for controlling a display of a proximity warning system, comprising:
a receiving unit for receiving a signal representing first positional information of a movable object from a radio based positioning receiver, a selection unit for selecting a subset of at least one camera out of at least two cameras available for providing images of different scenes dependent on the first positional information and subject to location information of stationary objects stored in an electronic map, and an output for providing a control signal to a display for displaying images provided by the selected subset of one or more cameras.
20. A proximity warning system comprising a display, at least two cameras for providing images of different scenes, and a control unit for controlling the display, the control unit comprising a receiving unit for receiving a signal representing first positional information of a movable object from a radio based positioning receiver, a selection unit for selecting a subset of at least one camera out of the at least two cameras available for providing images of different scenes dependent on the first positional information and subject to location information of stationary objects stored in an electronic map, and an output for providing a control signal to the display for displaying images provided by the selected subset of one or more cameras.
21. A proximity warning system according to claim 20, wherein the receiving unit is designed for receiving positional information of a second object.
22. A proximity warning system according to claim 20 or 21, comprising a log for logging at least one of the first positional information and the selected camera signal.
23. A movable object, comprising a proximity warning system as defined in any one of claims 20 to 22, wherein the at least two cameras are attached to different locations of the movable object, and wherein the movable object is a vehicle, a crane, a dragline, a haul truck, a digger or a shovel.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CH2010/000152 WO2011153646A1 (en) | 2010-06-10 | 2010-06-10 | Method and control unit for controlling a display of a proximity warning system |
Publications (2)
Publication Number | Publication Date |
---|---|
CA2802122A1 CA2802122A1 (en) | 2011-12-15 |
CA2802122C true CA2802122C (en) | 2016-05-31 |
Family
ID=43431943
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA2802122A Active CA2802122C (en) | 2010-06-10 | 2010-06-10 | Method and control unit for controlling a display of a proximity warning system |
Country Status (3)
Country | Link |
---|---|
AU (1) | AU2010355231B2 (en) |
CA (1) | CA2802122C (en) |
WO (1) | WO2011153646A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2765254C (en) | 2009-06-12 | 2016-11-22 | Safemine Ag | Movable object proximity warning system |
US8994557B2 (en) | 2009-12-11 | 2015-03-31 | Safemine Ag | Modular collision warning apparatus and method for operating the same |
US10703299B2 (en) | 2010-04-19 | 2020-07-07 | SMR Patents S.à.r.l. | Rear view mirror simulation |
US10800329B2 (en) | 2010-04-19 | 2020-10-13 | SMR Patents S.à.r.l. | Rear view mirror simulation |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11220726A (en) * | 1998-01-30 | 1999-08-10 | Niles Parts Co Ltd | Vehicle surrounding monitoring device |
WO2004021546A2 (en) | 2002-08-09 | 2004-03-11 | Conti Temic Microelectronic Gmbh | Means of transport with a three-dimensional distance camera and method for the operation thereof |
DE10253192A1 (en) | 2002-11-15 | 2004-05-27 | Philips Intellectual Property & Standards Gmbh | Anti-collision system for use with road vehicle has position determining computer with GPS receiver and has radio transmitter ending signals to equipment carried by pedestrians |
US20040217851A1 (en) * | 2003-04-29 | 2004-11-04 | Reinhart James W. | Obstacle detection and alerting system |
WO2006079165A1 (en) * | 2005-01-25 | 2006-08-03 | Alert Systems Pty Ltd | Proximity warning system |
GB0717741D0 (en) * | 2007-09-12 | 2007-10-17 | Spillard Saftey Systems Ltd | Proximity apparatus |
US8170787B2 (en) * | 2008-04-15 | 2012-05-01 | Caterpillar Inc. | Vehicle collision avoidance system |
-
2010
- 2010-06-10 WO PCT/CH2010/000152 patent/WO2011153646A1/en active Application Filing
- 2010-06-10 CA CA2802122A patent/CA2802122C/en active Active
- 2010-06-10 AU AU2010355231A patent/AU2010355231B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
WO2011153646A1 (en) | 2011-12-15 |
AU2010355231B2 (en) | 2014-11-20 |
CA2802122A1 (en) | 2011-12-15 |
AU2010355231A1 (en) | 2013-01-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9797247B1 (en) | Command for underground | |
US9457718B2 (en) | Obstacle detection system | |
AU2014213529B2 (en) | Image display system | |
JP6267972B2 (en) | Work machine ambient monitoring device | |
US9335545B2 (en) | Head mountable display system | |
US10114370B2 (en) | Machine automation system with autonomy electronic control module | |
US20090259400A1 (en) | Vehicle collision avoidance system | |
WO2014045459A1 (en) | Work vehicle periphery monitoring system, and work vehicle | |
US20120287277A1 (en) | Machine display system | |
AU2010351500B2 (en) | Object proximity warning system and method | |
RU2017121325A (en) | MIRROR REPLACEMENT SYSTEM FOR VEHICLE | |
JP7232287B2 (en) | ship navigation system | |
US20160148421A1 (en) | Integrated Bird's Eye View with Situational Awareness | |
CA2802122C (en) | Method and control unit for controlling a display of a proximity warning system | |
CN107406072A (en) | Vehicle assisted system | |
JP2008097279A (en) | Vehicle exterior information display device | |
AU2018201213B2 (en) | Command for underground | |
US9910434B1 (en) | Command for underground | |
US20120249342A1 (en) | Machine display system | |
CN113060156A (en) | Vehicle periphery monitoring device, vehicle periphery monitoring method, and program | |
JP2015153208A (en) | Alarm system | |
JP5788048B2 (en) | Work vehicle periphery monitoring system and work vehicle | |
AU2011264358B2 (en) | Method and control unit for controlling a display | |
KR102497610B1 (en) | Device for safety aid using a image | |
WO2021076734A1 (en) | Method for aligning camera and sensor data for augmented reality data visualization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
EEER | Examination request |
Effective date: 20150522 |