WO2011153652A2 - Method and control unit for controlling a display - Google Patents

Method and control unit for controlling a display Download PDF

Info

Publication number
WO2011153652A2
WO2011153652A2
Authority
WO
WIPO (PCT)
Prior art keywords
range
display
image
images
cameras
Prior art date
Application number
PCT/CH2011/000136
Other languages
French (fr)
Other versions
WO2011153652A3 (en)
Inventor
Urs Martin Rothacher
Peter Arnold Stegmaier
Original Assignee
Safemine Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Safemine Ag filed Critical Safemine Ag
Priority to AU2011264358A priority Critical patent/AU2011264358B2/en
Publication of WO2011153652A2 publication Critical patent/WO2011153652A2/en
Publication of WO2011153652A3 publication Critical patent/WO2011153652A3/en
Priority to ZA2012/09336A priority patent/ZA201209336B/en

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/51Display arrangements
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/167Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Definitions

  • the invention relates to a method and a control unit for controlling a display.
  • Collision and/or proximity warning systems are established for conventional automobiles as well as for extra-large vehicles.
  • Proximity warning systems in the form of park distance control systems make use of ultrasonic sensors located in the bumpers of a car. More sophisticated systems may rely on different sensors such as three dimensional distance cameras as proposed in WO 2004/021546 A2. There, it is suggested to provide at least a forward, a backward and a sideward looking camera at a passenger car.
  • WO 2004/047047 A2 suggests using satellite supported radio positioning receivers on board of the vehicles and other objects, such as cranes, for generating proximity warnings in order to reduce the risk of collisions.
  • Sensors for avoiding collisions may include radar systems, conventional cameras or thermal imaging cameras.
  • each camera may display its images on a display installed in the driving cab.
  • signals representative of range images from at least two range cameras arranged for providing range images of different scenes are received.
  • range images are evaluated.
  • a subset of range images from the range images available is selected subject to the result of the evaluation.
  • a control signal is provided for the display to display image information stemming from the one or more range cameras in charge of providing the one or more range images selected for the subset.
  • a control unit for controlling a display according to the features of independent claim 22.
  • Such control unit comprises a receiver for receiving signals representative of range images from at least two range cameras for providing range images of different scenes.
  • An evaluation unit is designed for evaluating the at least two range images, and a selection unit is designed for selecting a subset of range images from the range images available subject to the result of the evaluation.
  • a control signal for the display to display image information stemming from the one or more range cameras in charge of providing the one or more range images selected for the subset.
  • the basic idea of the present invention is to provide range cameras and analyze the range images provided by these range cameras. It is ensured that the image information being most relevant especially in terms of collision avoidance is displayed on the display.
  • the determination of which one or more of the range cameras available currently monitors the most relevant scene is performed by an evaluation unit and a selection unit comprised in the control unit.
  • ranges provided in the range images are analyzed in terms of proximity to objects identified in the range image.
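The analysis sketched in these paragraphs — evaluate each range image for proximity and select the camera seeing the nearest object — can be illustrated with a short sketch. This is not the patented implementation; the data layout (a range image as a 2D list of per-pixel distances in metres) and all names and values are illustrative assumptions.

```python
# Hypothetical sketch: each range image is a 2D grid of per-pixel distances
# (metres); the camera whose image contains the nearest point is selected.

def nearest_range(range_image):
    """Shortest distance measured anywhere in one range image."""
    return min(min(row) for row in range_image)

def select_camera(range_images_by_camera):
    """Return the id of the camera seeing the closest object."""
    return min(range_images_by_camera,
               key=lambda cam: nearest_range(range_images_by_camera[cam]))

images = {
    "front": [[9.0, 8.5], [7.2, 6.9]],
    "rear":  [[4.1, 3.8], [5.0, 4.4]],   # closest point overall: 3.8 m
    "left":  [[12.0, 11.5], [10.2, 9.8]],
}
print(select_camera(images))  # -> rear
```

A real system would run this per frame and feed the result into the control signal generation; here the dictionary simply stands in for the receiver input.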
  • the personnel in charge of the safe operation of such a vehicle or stationary object may not be distracted by a multitude of image information but may instead focus on the most relevant image information displayed.
  • Fig. 1 a schematic representation of a mining site
  • Fig. 2 a block diagram of a monitoring system according to an embodiment of the present invention
  • Fig. 3 a top view on a schematic vehicle with range cameras mounted according to an embodiment of the present invention
  • Fig. 4 a display
  • Fig. 5 a block diagram of another monitoring system according to an embodiment of the present invention.
  • Fig. 6 another schematic representation of a display according to an embodiment of the present invention.
  • Fig. 7 a timing diagram for illustrating the operation of a range camera according to an aspect of the present invention.
  • Fig. 8 a representation of a range image.
  • a “range camera” is a sensor device for determining the distance to points in a scene.
  • the distance typically refers to a distance between the camera and the object.
  • the range camera at least provides a "range image” which represents an image in which a distance value is measured for each pixel of such image.
  • a visualization technique of a two-dimensional range image may comprise assigning different gray-scale intensities to different ranges. Other visualization techniques for range images are encompassed as well.
  • the means for providing a range image may comprise a LIDAR system, i.e. a light detection and ranging system with discrete laser emitters for illumination.
  • a range camera may at least provide a range image.
  • a range camera may provide information beyond the range image.
  • the range camera may additionally provide an intensity image of the scene, i.e. a conventional image in preferably digital form in which the light intensity is detected for each pixel.
  • the light intensity in such case does not represent a distance to the object as it does after the transformation of distances into grey-scales as illustrated above.
  • the light intensity image can be embodied as a grey-scale image, or as a color image.
  • illumination means may be activated subject to the ambient light intensity.
  • An intensity image is considered to represent the result of conventional photography or filming, where any light detected by the camera is not investigated for distance information.
  • Cameras for providing both range and intensity information are also called three-dimensional cameras.
  • a range camera may provide both intensity and range image information
  • the visualization of these images may be separated, such that an intensity image and a range image are provided, or both kinds of image information, i.e. range and intensity, are merged into one image.
  • image information may include one or both of range image information and intensity image information, and for the embodiment in which both kinds of information are provided, any image information processing may include separate handling as well as integrated handling of the underlying data.
  • the range cameras provide range images of "different scenes".
  • a scene is "different" to another scene whenever the cameras involved do not shoot or scan the same perspective. This means that whenever a camera is mounted at a different position from another camera, these cameras provide range images from different scenes.
  • the at least two range cameras are mounted to the same object which may be a movable object such as a vehicle, or a stationary object such as a building. In such scenario, it is preferred that the range cameras are arranged such that they scan different sides of the object they are mounted to in order to detect other objects in proximity at all sides of the object.
  • a camera mounted at the left-hand side of a vehicle, for example, does not provide exactly the same image content as a camera mounted at the right-hand side.
  • control unit may be embodied in hardware, in software or both, and may be embodied in a single device, or its functions may be decentralized.
  • the functional building blocks “evaluation unit” and “selection unit” may also be embodied in hardware, in software or both.
  • the "display” may have the form of a single display for displaying images from a single source, or may allow displaying images from many different sources, i.e. cameras, simultaneously.
  • the "display” also encompasses the totality of a multitude of separated displays which are, ' for example, distributed in the drivers cab of a vehicle. Summarizing, the display includes any displaying means for displaying image information delivered by the range cameras .
  • the "control signal to display image information” triggers at least displaying the image information selected for displaying.
  • the control signal may evoke additional action subject to what is displayed during the normal mode of operation: If, for example, the display regularly shows images of a single source only, the control signal may cause a switch from displaying images from the current source to displaying images from the source selected according to the present idea. If the current image source by chance coincides with the selected image source, there may be no change visible to the monitoring person.
  • the control signal causes the selected images to be highlighted for drawing attention to the subject images, e.g. by a flashing frame, or other means. If, for example, the display by default displays images from various sources, the control signal may cause the entire display to display images only from the selected source.
  • the control signal may cause images from other sources to be shut down or completely masked or visually downsized in order to emphasize the selected images.
  • the selected image may, as indicated, claim the entire display screen, or remain in an unchanged image size on the screen.
  • the control signal may include zooming into the object identified as being most close in the range images provided by the range camera selected.
  • the control signal may additionally cause acoustic warnings to be issued.
  • the control signal may either comprise the selected image information itself or it may cause the subject cameras to directly route the requested image information to the display or it may cause the display to accept only display information from the range camera as selected. All the above holds true also for the selection of multiple range images if appropriate.
  • a "warning system" and a corresponding "warning" activity may refer to any suitable activity for drawing the attention of the driver or other persons to image information selected to be displayed, and finally to the underlying scene that may become critical in terms of collision or proximity.
  • Such warning system may include acoustic means such as a horn, a diaphone or a speaker.
  • Such warning system may alternatively or additionally include visual means including the display for displaying image information itself, or, alternatively or additionally, a separate visual warning indicator such as another display, one or more LEDs, a flashlight, etc.
  • the warning especially may be implemented on the display for displaying the image information by intermittently displaying the image information, or by alternating between the image information and its inverse colors, or by overlaying the image information with visual warning elements.
  • the warning may be issued in combination with the control signal such that the control signal may activate the warning system, too.
  • a control signal separate from the control signal for the display may be issued subject to range information in the selected image.
  • the control signal to the display may be issued based on a first threshold condition of a range in the subject image.
  • the separate control signal for the warning system may be issued based on a second threshold condition of a range in the subject image, which is, for example, stricter than the first threshold condition.
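The two-threshold scheme just described — a first threshold governing the display control signal and a second, stricter one governing the separate warning signal — can be sketched as follows. The threshold values and names are illustrative assumptions, not taken from the patent.

```python
# Hypothetical two-threshold logic: the display switches at a looser range
# threshold; the warning system additionally activates at a stricter one.

DISPLAY_THRESHOLD = 20.0  # metres: switch the display to this camera
WARNING_THRESHOLD = 8.0   # metres: additionally activate the warning system

def control_signals(shortest_range):
    """Return (display_signal, warning_signal) for one evaluated range."""
    display = shortest_range < DISPLAY_THRESHOLD
    warning = shortest_range < WARNING_THRESHOLD
    return display, warning

print(control_signals(15.0))  # -> (True, False): display only
print(control_signals(5.0))   # -> (True, True): display and warning
```

Because the warning threshold is strictly tighter, the warning can only ever fire while the display signal is already active, matching the "more strict" relation stated above.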
  • radio based positioning system stands for a GNSS or for any other type of positioning system based on radio signals, such as a pseudolite system.
  • GNSS stands for “Global Navigation Satellite System” and encompasses all satellite based navigation systems, including GPS and Galileo.
  • a “receiver” is a receiver designed for receiving information from satellites and for determining its position subject to the signals received.
  • a radio based positioning receiver is "assigned" to a range camera whenever the camera and the receiver are arranged at more or less the same position, being at a maximum of two meters separated from each other.
  • a "movable object” is any object that can change and is expected to change its position and/or orientation or configuration in space. It may e.g. be a truck or any other vehicle that moves from place to place and changes its orientation in respect to the general north-south direction, e.g. by steering, or it may be an object positioned at a fixed location but able to rotate about its axis or to change its physical configuration, e.g. by extending an arm, in such a manner that the volume of safety space attributed to it varies in significant manner.
  • Fig. 1 schematically depicts a site 1, such as a surface mine.
  • a site covers a large area, in the case of a surface mine e.g. in the range of square kilometers, with a network of roads 2 and other traffic ways, such as rails 3.
  • a plurality of objects is present in the mine, such as:
  • Vehicles of this type may easily weigh several hundred tons, and they are generally difficult to control, have very large braking distances, and a large number of blind spots that the driver is unable to visually monitor without monitoring cameras.
  • vehicles of this type weigh 3 tons or less. They comprise passenger vehicles and small lorries.
  • a further type of object within the mine is comprised of stationary obstacles, such as temporary or permanent buildings 9, open pits, boulders, non-movable excavators, stationary cranes, deposits, etc.
  • objects present in a mine 1 and subject to potential collision may be equipped with at least two range cameras 12, a control unit per object, and a display per object.
  • the entirety of these elements per object for generating a proximity warning is called a monitoring system.
  • FIG. 2 illustrates a block diagram of a control unit 13 according to an embodiment of the present invention.
  • a receiver 17 of the control unit 13 is connected to range cameras 12.
  • An output 16 of the control unit 13 is connected to a display 19. Both connections may be implemented as wireless connections or as wired connections.
  • One or more connections can be implemented via bus connections.
  • the control unit 13 comprises a microprocessor system 14, which controls the operations of the control unit 13.
  • a memory 18 comprises programs as well as various parameters, such as unique identifiers of the range cameras. Such programs may comprise instructions for evaluating the incoming range images, and for selecting a subset of range images which subset of range images identifies the range cameras which currently provide the most significant image information.
  • the monitoring system further comprises radio based positioning receivers 11. These receivers 11 may determine their respective positions in combination with satellites 30 as shown in Figure 1. Positional information may be sent to a corresponding receiver 15 in the control unit 13. Each positioning receiver 11 is advantageously assigned to a range camera 12 such that the location of a camera / positioning receiver pair is more or less the same.
  • Each range camera 12 delivers a series of range images with respect to the scene monitored by the respective range camera 12.
  • Figure 3 illustrates a schematic top view on a car 6 equipped with four range cameras 12, one located at each side of the vehicle 6. The area monitored by each range camera 12 is indicated by a sector. This makes each range camera 12 scan a different scene at each side of the vehicle 6. Alternatively, the range cameras 12 can be located at the edges of the vehicle 6. Both arrangements are beneficial for covering a large area in the vicinity of the mobile object for proximity and/or collision detection purposes.
  • the range images provided by the range cameras 12 are evaluated.
  • objects in the range image are identified and their distance from the camera 12 is determined. After this, it is determined by means of selection which of the objects identified shows the shortest range to the camera 12, i.e. is closest to the camera 12 and may be identified as the object with the highest likelihood to collide with the present object.
  • the range camera 12 having provided the range image including the closest object is selected to provide image information to the personnel on board or wherever the control unit is located with respect to an object.
  • the range assigned to an object can be one of the ranges measured for such object, and more specifically one of the ranges associated with a pixel of such object, or it can be the shortest of all pixel distances contributing to such object, or, for example, it can be an average over all the range values measured for the pixels of an object.
  • Such assigned ranges are advantageously compared across all objects identified in all range images.
  • the comparison may deliver the x closest objects identified in all range images.
  • when translated into the scenario of a vehicle, this means that the x closest objects to the vehicle are identified hereby.
  • These x closest objects may be scanned by at most x range cameras, or by fewer than x range cameras provided that one of the cameras has scanned multiple objects being amongst the x closest objects.
  • the closest object of all objects of such range image is determined by determining the shortest range assigned to any object in the range image. Then, a comparison of all the closest objects results in x objects with a range amongst the x shortest ranges identified, which provides for exactly x range images as a subset of all available range images.
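The selection of exactly x range images described above can be sketched in a few lines, assuming the closest range per camera has already been computed from the range images. Camera names and distances are made up for illustration; setting x to 1 reduces the scheme to showing only the most critical camera.

```python
# Hypothetical subset selection: keep the x cameras whose closest objects
# are among the x shortest ranges overall.
import heapq

def select_subset(closest_range_per_camera, x=1):
    """Return up to x camera ids, ordered by the shortest range they see."""
    return heapq.nsmallest(x, closest_range_per_camera,
                           key=closest_range_per_camera.get)

ranges = {"front": 7.2, "rear": 3.8, "left": 9.8, "right": 3.9}
print(select_subset(ranges, x=2))  # -> ['rear', 'right']
print(select_subset(ranges, x=1))  # -> ['rear']
```

Note that two of the x closest objects may fall into the same image, in which case a real implementation would deduplicate cameras, as the surrounding text points out.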
  • x is set to one such that only image information of the range camera delivering the most relevant range image will be displayed in the following.
  • the selected range images represent the range cameras that currently film the closest objects, which are of most interest to be monitored by the personnel in charge in order to avoid a collision. This is why image information stemming from these range cameras is emphasized in being presented to the personnel via the display.
  • the selection step may include selecting a subset of range images from the range images available only when the distance of an object fulfils a threshold condition, and preferably shows a distance less than a threshold.
  • a threshold may be programmable in the control unit.
  • the object under investigation may be an object within an image, a closest object within an image, a closest object in all images available, or other objects evaluated.
  • the display 19 in Figure 4 represents a flat panel display capable of displaying images from, e.g., 16 range cameras across its screen.
  • the control signal is received from the control unit 13, and provided the control signal identifies only one range camera 12 for providing image information most relevant to be displayed, the entire screen of Figure 4 may be reserved for showing the subject image information.
  • the image information comprises a range image 20 delivered by the subject range camera as well as an intensity image 21 delivered by the subject range camera which, in the present embodiment, is a range camera providing three dimensional images.
  • the block diagram of Figure 5 differs from the block diagram of Figure 2 only in the way the control signal affects the control of the display. Instead of the control signal carrying the image information itself, the control signal now acts on AND gates 22, each of which is connected with one of the range cameras 12. By activating one of the AND gates by a corresponding control signal, the subject AND gate allows the associated range camera 12 to provide image information to the display 19, while, for example, all the other AND gates are blocked and do not allow for displaying image information from the other range cameras 12.
  • the choice of which sort of image information of a range camera 12 shall be displayed, e.g. a range image, an intensity image, or both, can be preconfigured at the range camera. By providing multiple AND gates per camera and the camera providing outputs for different kinds of image information, it may be determined by the control signal which image information shall be displayed.
  • the type of image information to be displayed may also be selected by the personnel.
  • Figure 6 provides another schematic representation of a display 19, which display 19 is divided into four sub-displays 191 - 194, each sub-display 191 - 194 permanently displaying image information from a range camera assigned.
  • the control signal only highlights the sub-display 192 which displays image information from the range camera 12 which range images were evaluated and selected to be most critical in terms of a potential collision.
  • the radio based positioning receivers 11 may provide positional information of the subject location they are arranged at. Given that there is at least one radio based positioning receiver 11 mounted to an object and provided that other moving or stationary objects on the site are equipped with such receivers, too, such positional information may be shared between the control units of the objects, or the receivers themselves, such that by comparing positional information stemming from positioning receivers located on different objects proximity and even approximation can be detected. For further details, reference is made to PCT/CH2009/000394, which is incorporated herein by reference.
  • Location information provided by the positioning receivers 11 of the same object is transferred to the control unit 13 and evaluated there.
  • such evaluation takes into account positional information received from other objects which may be provided by a wireless interface not shown in the Figures.
  • a proximity situation may be detected. If such proximity situation is detected by means of the positional information, a control signal may be issued which activates displaying image information stemming from the range camera which is assigned to the positioning receiver which is at least co-responsible - together with a positioning receiver of a different object - for issuing the proximity warning.
  • the image information from the subject range camera selected by such process prevails over the image information from a different range camera based on the evaluation of the range images.
  • Such embodiment is beneficial in scenarios where an object may come close in an area which the range cameras do not or cannot scan, or where a range camera may not provide correct range images and even may fail.
  • the analysis of the positional information serves as a safety mechanism for malfunctions in the camera system.
  • a permanent control signal implies that the selected image information will always prevail and possibly prevent other sources from displaying image information. This does not mean that it is always the same camera providing image information to be displayed, as the selection step may select a new camera to provide image information from time to time.
  • the present idea can be modified by issuing the control signal only when the closest range assigned to an object of such range image falls below a threshold value.
  • Such a threshold value has the effect that in a situation where no object is close enough to the monitoring system, a default display mode is executed, which includes for example permanently displaying image information from all the range cameras in assigned portions of the display, or which may include switching between the range cameras for displaying image information on a single display. Only if any one of the objects in any one of the range images comes closer than the threshold range does the control signal become active and grant access to the display to the selected range camera.
  • Another embodiment is related to the duration of the control signal. Even today, there are range cameras available which provide range images at a rate of more than one image per 100 ms.
  • the control signal may be allowed to switch from one range camera to another with every set of range images being evaluated, which might cause toggling. However, there might be alternate options according to which a once selected range camera remains selected for a minimum amount of time, allowing the personnel to get an idea of the object approaching.
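The minimum-dwell option mentioned here — keeping a once selected camera on the display for some minimum time so the selection does not toggle with every evaluated frame — might be sketched as follows. The hold time of 3 seconds and all names are illustrative assumptions.

```python
# Hypothetical minimum-dwell selection: a newly selected camera stays on the
# display for at least HOLD_S seconds before the selection may switch again.
HOLD_S = 3.0

class CameraSelector:
    def __init__(self):
        self.current = None   # currently displayed camera id
        self.since = None     # time of the last switch

    def update(self, candidate, now):
        """Accept a new candidate only after the hold time has elapsed."""
        if self.current is None or (now - self.since) >= HOLD_S:
            if candidate != self.current:
                self.current, self.since = candidate, now
        return self.current

sel = CameraSelector()
print(sel.update("rear", now=0.0))  # -> rear (first selection)
print(sel.update("left", now=1.0))  # -> rear (held: only 1 s elapsed)
print(sel.update("left", now=4.0))  # -> left (hold time expired)
```

A fixed hold introduces a small latency for genuinely more critical objects; a refinement could allow an immediate switch when a stricter warning threshold is crossed.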
  • the image information displayed in response to the control signal is an intensity image provided by the range camera that also provides the range image.
  • the illumination pulse may interfere with the regular shooting of an intensity image.
  • a photo-sensor of the camera for receiving radiation may be overloaded by the illumination pulse itself. This is why it is preferred that any intensity image to be displayed is masked for the time of the illumination pulse in order not to provide overloaded signals to the display. This means that an intensity image provided by the range camera as selected is suppressed for a certain time during and/or after, or in response to the illumination pulse.
  • an intensity image may, for example, be detected at times t1, t2, t3, ...
  • An illumination pulse is produced at time t2. This means, in light of the present embodiment, that at times t2 and t3 no intensity image is provided to be displayed, or alternatively, any intensity image taken at times t2 and t3 is masked from being displayed.
  • the reflection of the illumination pulse is expected to be detected and allows for a distance determination. The intensity image will again be detected at time t4 and the following time t5, etc., until the next illumination pulse is scheduled.
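The masking scheme of this example — suppress the intensity frames coinciding with and immediately following an illumination pulse (t2 and t3 for a pulse at t2) — can be sketched as follows. The one-frame mask length after the pulse is an assumption taken from the example; a real camera might mask a configurable interval.

```python
# Hypothetical frame masking: intensity frames at or just after an
# illumination pulse are dropped so overloaded images never reach the display.

def displayable_frames(frame_times, pulse_times, mask_after=1):
    """Return the frame labels that may be displayed."""
    masked = set()
    for p in pulse_times:
        idx = frame_times.index(p)
        # mask the pulse frame itself plus `mask_after` following frames
        masked.update(frame_times[idx: idx + 1 + mask_after])
    return [t for t in frame_times if t not in masked]

frames = ["t1", "t2", "t3", "t4", "t5"]
print(displayable_frames(frames, pulse_times=["t2"]))  # -> ['t1', 't4', 't5']
```

On the display side, the masked slots could either be skipped entirely or filled by repeating the last valid intensity frame.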
  • Figure 8 refers to the identification of objects in a range image provided by a range camera.
  • a possible means to identify an object is to evaluate the ranges measured for neighboring pixels of a range image. If the range measured for neighboring pixels is within a certain range interval, the respective pixels may be identified to form an object. A neighboring pixel with a range too far from the range interval may represent an edge of the object.
  • two objects in a range image are identified according to the above procedure, which objects are denoted by O1 and O2.
  • the range image is divided into pixels represented by the grid. For each pixel a distance value is measured, assigned and properly stored.
  • the entire range image may be defined as an object, and, for example, the smallest distance value of any pixel of the object represents the shortest range of an object in the range image.
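The neighbouring-pixel grouping described above might be sketched as a simple flood fill: adjacent pixels are merged into one object while their measured ranges differ by no more than a tolerance, and a larger jump marks an object edge. The 4-connectivity and the tolerance value are illustrative assumptions, not details from the patent.

```python
# Hypothetical object identification in a range image: label connected
# pixel groups whose neighbouring ranges differ by at most `tol` metres.

def segment(range_image, tol=0.5):
    h, w = len(range_image), len(range_image[0])
    labels = [[None] * w for _ in range(h)]
    next_label = 0
    for y in range(h):
        for x in range(w):
            if labels[y][x] is not None:
                continue
            # flood fill one object starting from an unlabeled pixel
            stack, labels[y][x] = [(y, x)], next_label
            while stack:
                cy, cx = stack.pop()
                for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                    if (0 <= ny < h and 0 <= nx < w and labels[ny][nx] is None
                            and abs(range_image[ny][nx] - range_image[cy][cx]) <= tol):
                        labels[ny][nx] = next_label
                        stack.append((ny, nx))
            next_label += 1
    return labels

img = [[5.0, 5.2, 9.0],
       [5.1, 5.3, 9.1]]   # near surface (~5 m) and far surface (~9 m)
print(segment(img))  # -> [[0, 0, 1], [0, 0, 1]]: two objects found
```

The per-object range discussed earlier (minimum or average) would then be computed over the pixels carrying the same label.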

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)
  • Time Recorders, Drive Recorders, Access Control (AREA)

Abstract

The present idea refers to a method and a control unit for controlling a display (19). Vehicles and other objects (4a, 4b, 4c, 5, 6, 7, 8), for example, in a surface mine (1), are equipped with range cameras (12). A control unit (13) of such object (4a, 4b, 4c, 5, 6, 7, 8) receives signals representative of range images from these at least two range cameras (12). The at least two range images are evaluated and a subset of range images is selected from the range images available subject to the result of the evaluation. A control signal is provided to the display (19) for displaying image information stemming from the one or more range cameras (12) in charge of providing the one or more range images selected for the subset. By such method, the most relevant scene in terms of collision avoidance can be displayed to the personnel in charge.

Description

Method and control unit for controlling a display
Technical Field
The invention relates to a method and a control unit for controlling a display.
Background Art
Surface mines and similar sites or areas are generally operated by means of a large number of vehicles, some of which may be exceedingly large and difficult to maneuver and have very limited visibility for the operator.
Collision and/or proximity warning systems are established for conventional automobiles as well as for extra-large vehicles.
Proximity warning systems in the form of park distance control systems make use of ultrasonic sensors located in the bumpers of a car. More sophisticated systems may rely on different sensors such as three dimensional distance cameras as proposed in WO 2004/021546 A2. There, it is suggested to provide at least a forward, a backward and a sideward looking camera at a passenger car.
For extra-large vehicles used in mining, WO 2004/047047 A2 suggests using satellite supported radio positioning receivers on board of the vehicles and other objects, such as cranes, for generating proximity warnings in order to reduce the risk of collisions.
Other approaches for extra-large vehicles are introduced in "Avoiding accidents with mining vehicles", retrieved from the Internet at http://www.flir.com/uploadedFiles/Eurasia/MMC/Appl_Stories/AS 0020 EN.pdf and accessed on February 2, 2010. Sensors for avoiding collisions may include radar systems, conventional cameras or thermal imaging cameras.
In non-conventional types of vehicles such as the vehicles used in mining, each camera may display its images on a display installed in the driving cab. The more cameras there are available, the more image information the driver is exposed to, such that the driver may be distracted by images not being relevant for collision avoidance. Or, the driver may be overstrained by monitoring the output of all cameras available.
Disclosure of the Invention
In this respect, it is desired to improve means in a multi camera based proximity warning system for drawing the attention of the personnel in charge to the most relevant camera outputs.
According to a first aspect of the present invention, there is provided a method for controlling a display according to the features of independent claim 1.
Accordingly, signals representative of range images from at least two range cameras arranged for providing range images of different scenes are received.
These range images are evaluated. A subset of range images is selected from the range images available, subject to the result of the evaluation. A control signal is provided for the display to display image information stemming from the one or more range cameras in charge of providing the one or more range images selected for the subset.
According to another aspect of the present invention, there is provided a control unit for controlling a display according to the features of independent claim 22. Such a control unit comprises a receiver for receiving signals representative of range images from at least two range cameras for providing range images of different scenes. An evaluation unit is designed for evaluating the at least two range images, and a selection unit is designed for selecting a subset of range images from the range images available subject to the result of the evaluation. At the output of the control unit, a control signal is provided for the display to display image information stemming from the one or more range cameras in charge of providing the one or more range images selected for the subset.
The basic idea of the present invention is to provide range cameras and analyze the range images provided by these range cameras. It is ensured that the image information most relevant, especially in terms of collision avoidance, is displayed on the display. The determination of which one or more of the available range cameras currently monitors the most relevant scene is performed by an evaluation unit and a selection unit comprised in the control unit. In particular, the ranges provided in the range images are analyzed in terms of proximity to objects identified in the range image.
By automatically selecting the range camera currently monitoring the scene of most interest, and by displaying image information delivered from this range camera, the personnel in charge of the safe operation of such a vehicle or stationary object are not distracted by a multitude of image information but instead may focus on the most relevant image information displayed.
For advantageous embodiments it is referred to the dependent claims. It is noted that embodiments referred to or claimed only in connection with the method are deemed to be disclosed in connection with the apparatus, too, and vice versa.
Brief Description of the Drawings
A number of embodiments of the present invention will now be described by way of example only and with reference to the accompanying drawings, in which the figures show:
Fig. 1 a schematic representation of a mining site,
Fig. 2 a block diagram of a monitoring system according to an embodiment of the present invention,
Fig. 3 a top view on a schematic vehicle with range cameras mounted according to an embodiment of the present invention,
Fig. 4 a display,
Fig. 5 a block diagram of another monitoring system according to an embodiment of the present invention,
Fig. 6 another display,
Fig. 7 a timing diagram for illustrating the operation of a range camera according to an aspect of the present invention, and
Fig. 8 a representation of a range image.
Modes for Carrying Out the Invention
Definitions:
In the present application, "range" is used as equivalent to the term "distance".
A "range camera" is a sensor device for determining the distance to points in a scene. The distance typically refers to a distance between the camera and the object. The range camera at least provides a "range image" which represents an image in which a distance value is measured for each pixel of such image. A visualization technique of a two-dimensional range image may comprise assigning different gray-scale intensities to different ranges. Other visualization techniques of range images are included.
There are many different ways for producing range images. The most prominent of these techniques include stereo triangulation, sheet-of-light triangulation, or time-of-flight measurement. A person skilled in the art will appreciate other techniques for producing a range image. Many of the techniques, except for stereo triangulation, need to provide an illumination for acquiring the distance information, which is gained by means of detecting a reflection of the illumination from the remote object. Subject to the method applied, the illumination may be operated in a pulsed manner, or the illumination may be achieved by laser light. In other embodiments, the illumination source may provide an illumination comprising multiple wavelengths, or multiple wavelengths with different modulation frequencies. Especially, the means for providing a range image may comprise a LIDAR system, i.e. a light detection and ranging system with discrete laser emitters for illumination.
In the present application, a range camera at least provides a range image. However, a range camera may provide information beyond the range image. The range camera may additionally provide an intensity image of the scene, i.e. a conventional image, preferably in digital form, in which the light intensity is detected for each pixel. The light intensity in such a case does not represent a distance to the object, as it does after the transformation of distances into grey-scales as illustrated above. The light intensity image can be embodied as a grey-scale image or as a color image. For taking the intensity image, illumination means may or may not be used, subject to the ambient light intensity. An intensity image is considered to represent the result of conventional photography or filming, where any light detected by the camera is not investigated for distance information. Cameras providing both range and intensity information are also called three-dimensional cameras.
Whenever a range camera may provide both intensity and range image information, the visualization of these images may be separated, such that an intensity image and a range image are provided, or both kinds of image information, i.e. range and intensity, are merged into one image.
In this respect, "image information" may include one or both of range image information and intensity image information, and for the embodiment in which both kinds of information are provided, any image information processing may include a separate handling as well as an integrated handling of the underlying data.
The range cameras provide range images of "different scenes". A scene is "different" from another scene whenever the cameras involved do not shoot or scan the same perspective. This means that whenever a camera is mounted at a different position from another camera, these cameras provide range images of different scenes. In the context of the present application, it is preferred that the at least two range cameras are mounted to the same object, which may be a movable object such as a vehicle, or a stationary object such as a building. In such a scenario, it is preferred that the range cameras are arranged such that they scan different sides of the object they are mounted to, in order to detect other objects in proximity at all sides of the object. A left-hand sided camera may provide exactly the same image content as the right-hand sided camera mounted on a vehicle, e.g. a dusty road; however, the scenes are not equivalent, as the right-hand sided camera provides an image of the dusty road to the right of the vehicle and the left-hand sided camera does so to the left of the vehicle, and the dusty road to the right is a scene different from the dusty road to the left.
The "control unit" may be embodied in hardware, in software or both, and may be embodied in a single device, or its functions may be decentralized. The functional building blocks "evaluation unit" and "selection unit" may also be embodied in hardware, in software or both.
The "display" may have the form of a single display for displaying images from a single source, or may allow displaying images from many different sources, i.e. cameras, simultaneously. The "display" also encompasses the totality of a multitude of separate displays which are, for example, distributed in the driver's cab of a vehicle. Summarizing, the display includes any displaying means for displaying image information delivered by the range cameras.
The "control signal to display image information" triggers at least displaying the image information selected for displaying. The control signal may evoke additional action subject to what is displayed during the normal mode of operation: If, for example, the display regularly shows images of a single source only, the control signal may cause a switch from displaying images from the current source to displaying images from the source selected according to the present idea. If the current image source by chance coincides with the selected image source, there may be no change visible to the monitoring person. In some embodiments, the control signal causes the selected images to be highlighted for drawing attention to the subject images, e.g. by a flashing frame or other means. If, for example, the display by default displays images from various sources, the control signal may cause the entire display to show images only from the selected source. Or, the control signal may cause images from other sources to be shut down, completely masked or visually downsized in order to emphasize the selected images. The selected image may, as indicated, claim the entire display screen, or remain in an unchanged image size on the screen. The control signal may include zooming into the object identified as being closest in the range images provided by the range camera selected. The control signal may additionally cause acoustic warnings to be issued. The control signal may either comprise the selected image information itself, or it may cause the subject cameras to directly route the requested image information to the display, or it may cause the display to accept only display information from the range camera as selected. All of the above also holds true for the selection of multiple range images if appropriate.
A "warning system" and a corresponding "warning" activity may refer to any suitable activity for drawing the attention of the driver or other persons to the image information selected to be displayed, and finally to the underlying scene that may become critical in terms of collision or proximity. Such a warning system may include acoustic means such as a horn, a diaphone or a speaker. Such a warning system may alternatively or additionally include visual means, including the display for displaying the image information itself, or, alternatively or additionally, a separate visual warning indicator such as another display, one or more LEDs, a flashlight, etc. The warning may especially be implemented on the display for displaying the image information by intermittently displaying the image information, or by alternating between the image information and its inverse colors, or by overlaying the image information with visual warning elements. The warning may be issued in combination with the control signal such that the control signal may activate the warning system, too. In other embodiments, a control signal separate from the control signal for the display may be issued subject to range information in the selected image. For example, the control signal for the display may be issued based on a first threshold condition on a range in the subject image, and the separate control signal for the warning system may be issued based on a second threshold condition on a range in the subject image that is, for example, stricter than the first threshold condition.
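The two-threshold behaviour described above can be sketched in a few lines. This is a minimal illustration only; the function name and the concrete threshold values are assumptions chosen for the example, not values given in the disclosure:

```python
def control_signals(closest_range_m, display_threshold_m=50.0,
                    warning_threshold_m=20.0):
    """Derive the display control signal and the separate warning signal
    from the closest range (in metres) found in the selected image.

    The warning threshold is stricter (smaller) than the display threshold:
    an approaching object first triggers the display switch and, only if it
    comes closer still, the acoustic or visual warning as well.
    """
    show_on_display = closest_range_m <= display_threshold_m
    issue_warning = closest_range_m <= warning_threshold_m
    return show_on_display, issue_warning
```

With these example thresholds, an object at 30 m would switch the display without raising the warning, while an object at 10 m would trigger both signals.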
The term "radio based positioning system" stands for a GNSS or for any other type of positioning system based on radio signals, such as a pseudolite system. The term "GNSS" stands for "Global Navigation Satellite System" and encompasses all satellite based navigation systems, including GPS and Galileo. A "receiver" is a receiver designed for receiving information from satellites and for determining its position subject to the signals received.
A radio based positioning receiver is "assigned" to a range camera whenever the camera and the receiver are arranged at more or less the same position, i.e. separated by at most two meters.
A "movable object" is any object that can change and is expected to change its position and/or orientation or configuration in space. It may e.g. be a truck or any other vehicle that moves from place to place and changes its orientation with respect to the general north-south direction, e.g. by steering, or it may be an object positioned at a fixed location but able to rotate about its axis or to change its physical configuration, e.g. by extending an arm, in such a manner that the volume of safety space attributed to it varies in a significant manner.
Fig. 1 schematically depicts a site 1, such as a surface mine. Typically, such a site covers a large area, in the case of a surface mine e.g. in the range of square kilometers, with a network of roads 2 and other traffic ways, such as rails 3. A plurality of objects is present in the mine, such as:
- Large vehicles, such as haul trucks 4a, cranes 4b or diggers 4c. Vehicles of this type may easily weigh several hundred tons, and they are generally difficult to control, have very large braking distances, and a large number of blind spots that the driver is unable to visually monitor without monitoring cameras.
- Medium sized vehicles 5, such as regular trucks. These vehicles are easier to control, but they still have several blind spots and require a skilled driver.
- Small vehicles 6. Typically, vehicles of this type weigh 3 tons or less. They comprise passenger vehicles and small lorries.
- Trains 7.
A further type of object within the mine is comprised of stationary obstacles, such as temporary or permanent buildings 9, open pits, boulders, non-movable excavators, stationary cranes, deposits, etc.
The risk of accidents in such an environment is high. In particular, the large sized vehicles can easily collide with other vehicles, or obstacles.
For this reason, objects present in a mine 1 and subject to potential collision may be equipped with at least two range cameras 12, a control unit per object, and a display per object. The entirety of these elements per object for generating a proximity warning is called a monitoring system.
Figure 2 illustrates a block diagram of a control unit 13 according to an embodiment of the present invention. A receiver 17 of the control unit 13 is connected to range cameras 12. An output 16 of the control unit 13 is connected to a display 19. Both connections may be implemented as wireless connections or as wired connections. One or more connections can be implemented via bus connections.
The control unit 13 comprises a microprocessor system 14, which controls the operations of the control unit 13. A memory 18 comprises programs as well as various parameters, such as unique identifiers of the range cameras. Such programs may comprise instructions for evaluating the incoming range images, and for selecting a subset of range images which identifies the range cameras currently providing the most significant image information.
The monitoring system further comprises radio based positioning receivers 11. These receivers 11 may determine their respective positions in combination with satellites 30 as shown in Figure 1. Positional information may be sent to a corresponding receiver 15 in the control unit 13. Each positioning receiver 11 is advantageously assigned to a range camera 12 such that the location of a camera / positioning receiver pair is more or less the same.
Each range camera 12 delivers a series of range images with respect to the scene monitored by the respective range camera 12. Figure 3 illustrates a schematic top view on a car 6 equipped with four range cameras 12, one located at each side of the vehicle 6. The area monitored by each range camera 12 is indicated by a sector. This makes each range camera 12 scan a different scene at each side of the vehicle 6. Alternatively, the range cameras 12 can be located at the edges of the vehicle 6. Both arrangements are beneficial for covering a large area in the vicinity of the mobile object for proximity and/or collision detection purposes.
In the control unit 13, the range images provided by the range cameras 12 are evaluated. In a preferred embodiment, objects in the range image are identified and their distance from the camera 12 is determined. After this, it is determined by means of selection which of the identified objects shows the shortest range to the camera 12, i.e. is closest to the camera 12, and may thus be identified as the object with the highest likelihood of colliding with the present object. The range camera 12 having provided the range image including the closest object is selected to provide image information to the personnel on board, or wherever the control unit is located with respect to an object.
Basically, there can be two ways of selecting the subset of range images most important for gaining the personnel's attention. In the first way, after object identification, each object is assigned a range, which can be one of the ranges measured for such object, and more specifically one of the ranges associated with a pixel of such object: it can be the shortest of all pixel distances contributing to such object, or, for example, it can be an average over all the range values measured for the pixels of an object.
Such assigned ranges are advantageously compared across all objects identified in all range images. The comparison may deliver the x closest objects identified in all range images. Translated to the scenario of a vehicle, this means that the x objects closest to the vehicle are hereby identified. These x closest objects may be scanned by at most x range cameras, or by fewer than x range cameras provided that one of the cameras has scanned multiple objects amongst the x closest objects.
Alternatively, for each range image the closest of all objects of such range image is determined by determining the shortest range assigned to any object in the range image. Then, a comparison of all the closest objects results in x objects with ranges amongst the x shortest ranges identified, which provides for exactly x range images as a subset of all available range images.
Advantageously, for both ways of processing, x is set to one such that only image information of the range camera delivering the most relevant range image will be displayed in the following.
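The first way of selecting the subset can be sketched as follows. The helper name `select_cameras` and the dictionary-based input format are illustrative assumptions; each object is assumed to have already been assigned a single range (e.g. its shortest pixel range), as described above:

```python
def select_cameras(object_ranges_by_camera, x=1):
    """Select the cameras whose range images contain the x closest objects.

    object_ranges_by_camera maps a camera id to the list of ranges (in
    metres) assigned to the objects identified in that camera's range image.
    """
    # Flatten to (range, camera) pairs, one pair per identified object.
    pairs = [(r, cam)
             for cam, ranges in object_ranges_by_camera.items()
             for r in ranges]
    # The x closest objects across all range images ...
    closest = sorted(pairs)[:x]
    # ... may stem from fewer than x cameras if one camera saw
    # several of the x closest objects.
    return {cam for _, cam in closest}
```

With x = 1, as preferred, the result is the single camera whose range image contains the closest object of all.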
The selected range images represent the range cameras that currently film the closest objects, which are of most interest to be monitored by the personnel in charge in order to avoid a collision. This is why image information stemming from these range cameras is emphasized in being presented to the personnel via the display.
In an advantageous embodiment, the selection step may include selecting a subset of range images from the range images available only when the distance of an object fulfils a threshold condition, and preferably shows a distance less than a threshold. Such threshold may be programmable in the control unit. The object under investigation may be an object within an image, a closest object within an image, a closest object in all images available, or other objects evaluated.
The display 19 in Figure 4 represents a flat panel display capable of displaying images from e.g. 16 range cameras across its screen. Once the control signal is received from the control unit 13, and provided the control signal identifies only one range camera 12 for providing the image information most relevant to be displayed, the entire screen of Figure 4 may be reserved for showing the subject image information. In Figure 4, the image information comprises a range image 20 delivered by the subject range camera as well as an intensity image 21 delivered by the subject range camera, which, in the present embodiment, is a range camera providing three-dimensional images.
The block diagram of Figure 5 differs from the block diagram of Figure 2 only in the way the control signal affects the control of the display. Instead of the control signal carrying the image information itself, the control signal now acts on AND gates 22, each of which is connected with one of the range cameras 12. By activating one of the AND gates by a corresponding control signal, the subject AND gate allows the associated range camera 12 to provide image information to the display 19, while, for example, all the other AND gates are blocked and do not allow for displaying image information from the other range cameras 12. The choice of which sort of image information of a range camera 12 shall be displayed, e.g. a range image, an intensity image, or both, can be preconfigured at the range camera. By providing multiple AND gates per camera, with the camera providing outputs for different kinds of image information, the control signal may determine which image information shall be displayed. Advantageously, the type of image information to be displayed may also be selected by the personnel.
Figure 6 provides another schematic representation of a display 19, which display 19 is divided into four sub-displays 191 - 194, each sub-display 191 - 194 permanently displaying image information from a range camera assigned. In this embodiment, the control signal only highlights the sub-display 192 which displays image information from the range camera 12 which range images were evaluated and selected to be most critical in terms of a potential collision.
The radio based positioning receivers 11 may provide positional information of the subject location they are arranged at. Given that there is at least one radio based positioning receiver 11 mounted to an object and provided that other moving or stationary objects on the site are equipped with such receivers, too, such positional information may be shared between the control units of the objects, or the receivers themselves, such that by comparing positional information stemming from positioning receivers located on different objects proximity and even approximation can be detected. For further details it is referred to PCT/CH2009/000394 which is incorporated herein by reference.
Location information provided by the positioning receivers 11 of the same object is transferred to the control unit 13 and evaluated there. Advantageously, such evaluation takes into account positional information received from other objects, which may be provided by a wireless interface not shown in the Figures. By way of evaluating the positional information from these different sources, a proximity situation may be detected. If such a proximity situation is detected by means of the positional information, a control signal may be issued which activates displaying image information stemming from the range camera which is assigned to the positioning receiver that is at least co-responsible - together with a positioning receiver of a different object - for issuing the proximity warning. In a preferred embodiment, the image information from the range camera selected by such process prevails over the image information from a different range camera selected based on the evaluation of the range images.
Such an embodiment is beneficial in scenarios where an object may come close in an area which the range cameras do not or cannot scan, or where a range camera may not provide correct range images and may even fail. In the latter scenario, the analysis of the positional information serves as a safety mechanism against malfunctions in the camera system.
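A sketch of this position-based override, under the assumption that positions are available as planar coordinates in metres and that each on-board receiver is paired with a camera, might look as follows (all names and the threshold value are hypothetical):

```python
import math

def position_override(own_receivers, other_positions, proximity_m=30.0):
    """Return the camera assigned to the own-object receiver closest to any
    foreign object, if that distance falls below the proximity threshold;
    otherwise None (no override of the range-image based selection).

    own_receivers: list of (camera_id, (x, y)) pairs, one per receiver.
    other_positions: list of (x, y) positions reported by other objects.
    """
    best = None
    for camera_id, (rx, ry) in own_receivers:
        for ox, oy in other_positions:
            d = math.hypot(rx - ox, ry - oy)  # planar distance in metres
            if d < proximity_m and (best is None or d < best[0]):
                best = (d, camera_id)
    return best[1] if best else None
```

The returned camera, if any, would prevail over the camera selected from the range images, as described in the preferred embodiment above.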
Focus is now turned to the timing characteristics of the control signal. A permanent control signal implies that the selected image information will always prevail and possibly prevent other sources from displaying image information. This does not mean that it is always the same camera providing the image information to be displayed, as the selection step may select a new camera to provide image information from time to time. The present idea can be modified by issuing the control signal only when the closest range assigned to an object of such range image falls below a threshold value. Such a threshold value has the effect that, in a situation where no object is close enough to the monitoring system, a default display mode is executed, which includes for example permanently displaying image information from all the range cameras in assigned portions of the display, or which may include switching between the range cameras for displaying image information on a single display. Only if any one of the objects in any one of the range images comes closer than the threshold range does the control signal become active and grant access to the display to the selected range camera.
Another embodiment relates to the duration of the control signal. Even today, there are range cameras available which provide range images at a rate of more than one image per 100 ms. The control signal may be allowed to switch from one range camera to another with every set of range images being evaluated, which might cause toggling. However, there are alternate options according to which a once-selected range camera remains selected for a minimum amount of time, allowing the personnel to get an idea of the approaching object.
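The minimum selection time can be sketched as a small state holder; the class name and the three-second default dwell time are assumptions for illustration only:

```python
class CameraSelector:
    """Holds a once-selected camera for a minimum dwell time so that the
    display does not toggle with every newly evaluated set of range images."""

    def __init__(self, min_dwell_s=3.0):
        self.min_dwell_s = min_dwell_s
        self.current = None   # currently displayed camera
        self.since = None     # time of the last switch

    def update(self, candidate, now_s):
        """Accept the newly selected candidate camera only if the dwell
        time of the current camera has elapsed; return the camera whose
        image information is to be displayed."""
        if self.current is None or now_s - self.since >= self.min_dwell_s:
            if candidate != self.current:
                self.current = candidate
                self.since = now_s
        return self.current
```

For example, a camera selected at t = 0 s would still be displayed when a different candidate appears at t = 1 s, and the switch would only happen once the dwell time has elapsed.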
In a beneficial embodiment of the present invention, the image information displayed in response to the control signal is an intensity image provided by the range camera that also provides the range image. Because most distance measuring techniques need to apply illumination of the scene in order to have a light signal of sufficient strength reflected from the object, which in turn allows for determining the distance to such object, the illumination pulse may interfere with the regular shooting of an intensity image. A photo-sensor of the camera for receiving radiation may be overloaded by the illumination pulse itself. This is why it is preferred that any intensity image to be displayed is masked for the time of the illumination pulse in order not to provide overloaded signals to the display. This means that an intensity image provided by the range camera as selected is suppressed for a certain time during and/or after, or in response to, the illumination pulse. Given that a range image is produced at a frequency f_range, and the intensity image is produced at a frequency f_intensity with f_range < f_intensity, the intensity image stream is interrupted at the rate f_range. According to the diagram in Figure 7, an intensity image may, for example, be detected at times t1, t2, t3, ... An illumination pulse is produced at time t2. In light of the present embodiment, this means that at times t2 and t3 no intensity image is provided to be displayed, or alternatively, any intensity image taken at times t2 and t3 is masked from being displayed. In contrast, during the time interval t2 - t4 the reflection of the illumination pulse is expected to be detected, which allows for a distance determination. The intensity image will again be detected at time t4 and the following time t5, etc., until the next illumination pulse is scheduled.
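The masking of intensity frames around an illumination pulse can be sketched as a filter over frame timestamps. The blanking window corresponds to the interval t2 - t4 of Figure 7; the function name and parameters are illustrative assumptions:

```python
def frames_to_display(frame_times, pulse_times, blanking_s):
    """Return the subset of intensity-frame timestamps that may be shown.

    A frame is masked if it falls within the blanking window
    [pulse, pulse + blanking_s) of any illumination pulse, during which the
    photo-sensor may be overloaded by the pulse or busy detecting its
    reflection for the distance measurement.
    """
    return [t for t in frame_times
            if not any(p <= t < p + blanking_s for p in pulse_times)]
```

For the example of Figure 7, with frames at t1 ... t5, a pulse at t2 and a blanking window reaching to t4, the frames at t2 and t3 are suppressed and display resumes at t4.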
Figure 8 refers to the identification of objects in a range image provided by a range camera. A possible means to identify an object is to evaluate the ranges measured for neighboring pixels of a range image. If the ranges measured for neighboring pixels are within a certain range interval, the respective pixels may be identified as forming an object. A neighboring pixel with a range too far from the range interval may represent an edge of the object. In Figure 8, two objects in a range image are identified according to the above procedure; these objects are denoted by O1 and O2. The range image is divided into pixels represented by the grid. For each pixel a distance value is measured, assigned and properly stored.
Alternatively, the entire range image may be defined as an object, and, for example, the smallest distance value of any pixel of the object then represents the shortest range of an object in the range image.
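The neighboring-pixel criterion of Figure 8 amounts to a region-growing segmentation of the range image. A minimal sketch, assuming the range image is given as a two-dimensional array of per-pixel ranges and using a fixed tolerance between adjacent pixels in place of the range interval, might be:

```python
def segment_objects(range_image, tolerance):
    """Group neighbouring pixels whose measured ranges differ by at most
    `tolerance` into objects; a larger jump marks an object edge.

    range_image: 2-D list of range values (metres), one per pixel.
    Returns a list of objects, each a list of (row, col) pixel coordinates.
    """
    rows, cols = len(range_image), len(range_image[0])
    seen = [[False] * cols for _ in range(rows)]
    objects = []
    for r0 in range(rows):
        for c0 in range(cols):
            if seen[r0][c0]:
                continue
            # Flood fill from this pixel over 4-connected neighbours.
            stack, pixels = [(r0, c0)], []
            seen[r0][c0] = True
            while stack:
                r, c = stack.pop()
                pixels.append((r, c))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and not seen[nr][nc]
                            and abs(range_image[nr][nc]
                                    - range_image[r][c]) <= tolerance):
                        seen[nr][nc] = True
                        stack.append((nr, nc))
            objects.append(pixels)
    return objects
```

The shortest range of an object, as used in the selection step, is then simply the minimum range over that object's pixels.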
While there are shown and described presently preferred embodiments of the invention, it is to be distinctly understood that the invention is not limited thereto but may be otherwise variously embodied and practiced within the scope of the following claims.

Claims
1. A method for controlling a display, comprising
- receiving signals representative of range images from at least two range cameras (12) arranged for providing range images of different scenes,
- evaluating the at least two range images,
- selecting a subset of range images from the range images available subject to the result of the evaluation, and
- providing a control signal for the display (19) to display image information stemming from the one or more range cameras (12) in charge of providing the one or more range images selected for the subset.
2. Method according to claim 1, wherein the evaluation step comprises evaluating range information comprised in the range images.
3. Method according to any one of the preceding claims, wherein the evaluation step comprises identifying objects (01, 02) in the range images.
4. Method according to claim 3, wherein an object is formed by neighboring pixels of the range image, a range associated with each of these pixels being within a given range interval.
5. Method according to claim 3, wherein the entire range image is defined as an object.
6. Method according to any one of the preceding claims, wherein the evaluation step comprises assigning a range to an object (01, 02) identified in the range image.
7. Method according to claim 6, wherein the shortest range of any pixel of an object (01, 02) is assigned as range to the object (01, 02).
8. Method according to claim 6, wherein an average range determined by averaging the ranges of all pixels contributing to an object (01, 02) is assigned as range to the object (01, 02).
9. Method according to any one of the preceding claims, wherein the evaluation step comprises determining for each range image a closest object amongst all objects (01, 02) identified in the range image based on range information associated with such objects (01, 02).
10. Method according to claim 9, wherein the closest object of a range image is identified by comparing the range information of all the objects (01, 02), and by identifying the object (01, 02) with the shortest range as closest object.
11. Method according to claim 9 or claim 10, wherein the selection step comprises comparing the range of the closest object to a threshold and selecting the corresponding range image only if the range of the closest object is not more than the threshold.
12. Method according to claim 9, wherein the selection step comprises selecting x range images as the subset of range images which x range images comprise the x closest objects.
13. Method according to claim 12, wherein x = 1.
14. A method according to any one of the preceding claims, wherein the control signal is provided for the display (19) to display and highlight the image information.
15. A method according to any one of the preceding claims, wherein the control signal is provided for the display (19) to exclusively display the image information.
16. A method according to any one of the preceding claims, wherein the image information to be displayed comprises an intensity image of the subject scene.
17. A method according to any one of the preceding claims, wherein the image information to be displayed comprises a range image of the subject scene.
18. A method according to any one of the preceding claims, comprising receiving signals representing positional information from radio based positioning receivers (11), and subject to the positional information providing a control signal for the display (19) to display image information stemming from one or more range cameras (12) other than the range cameras (12) in charge of providing the one or more range images selected for the subset.
19. A method according to claim 18, comprising receiving signals representing positional information from radio based positioning receivers (11) assigned to the range cameras (12).
20. A method according to claim 18, comprising receiving signals representing positional information from radio based positioning receivers (11) of objects other than the object the range cameras are mounted to.
21. A method according to any one of the preceding claims, wherein the control signal is designed for generating a warning.
22. A method according to claim 21, wherein the warning is an acoustic warning.
23. A method according to claim 21 or claim 22, wherein the warning is a visual warning.
24. A method for controlling a display, comprising:
- scanning different scenes and providing range images of these scenes,
- evaluating the range images provided and selecting a subset of range images subject to the result of the evaluation, and
- displaying image information stemming from one or more range cameras in charge of providing the one or more range images selected for the subset.
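The three steps of claim 24, combined with the shortest-pixel-range assignment of claim 7 and the threshold selection of claim 11, can be sketched as one evaluation cycle; all names, the threshold value, and the per-camera data layout are illustrative assumptions:

```python
def control_display(range_images, threshold):
    """One cycle: assign each camera the shortest pixel range in its
    image (claim 7), select cameras whose closest object lies within
    the threshold (claim 11), and return their IDs for display."""
    closest = {cam: min(pixels)
               for cam, pixels in range_images.items() if pixels}
    return sorted(cam for cam, rng in closest.items() if rng <= threshold)

# Per-camera lists of pixel ranges (metres) from one scan of the scenes.
scan = {"cam_front": [22.0, 18.5], "cam_rear": [6.1, 3.8], "cam_left": [30.2]}
print(control_display(scan, threshold=10.0))  # ['cam_rear']
```

The returned camera IDs would drive the control signal that switches the display to the corresponding image information.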
25. Computer program element comprising computer program code means which, when loaded in a processor unit of a control unit, configures the control unit to perform a method as claimed in any one of the preceding claims.
26. A control unit for controlling a display, comprising a receiver (17) for receiving signals representative of range images from at least two range cameras (12) for providing range images of different scenes, an evaluation unit for evaluating the at least two range images, a selection unit for selecting a subset of range images from the range images available subject to the result of the evaluation, and an output (16) for providing a control signal to the display (19) for displaying image information stemming from the one or more range cameras (12) in charge of providing the one or more range images selected for the subset.
27. A monitoring system, comprising a control unit according to claim 26, a display (19), and at least two range cameras (12) for providing range images of different scenes.
28. A monitoring system according to claim 27, wherein at least one range camera (12) comprises an illumination source for enabling scanning the range image.
29. A monitoring system according to claim 28, wherein the illumination source is a laser emitter.
30. A monitoring system according to claim 28, wherein the illumination source is designed for providing a pulsed illumination.
31. A monitoring system according to claim 28, wherein the illumination source is designed for providing an illumination mode comprising multiple illumination wavelengths.
32. A monitoring system according to claim 30, wherein the illumination source is designed for providing an illumination mode comprising multiple illumination wavelengths with different modulation frequencies.
33. A monitoring system according to any one of claims 27 to 32, wherein in at least one range camera (12) a LIDAR (light detection and ranging) method is implemented.
34. A monitoring system according to any one of the claims 27 to 33, wherein at least one range camera (12) is a time of flight camera.
35. A monitoring system according to any one of the preceding claims 27 to 34, wherein at least one range camera (12) is a 3D camera providing both an intensity image and a range image.
36. A monitoring system according to claim 26 in combination with claim 35, wherein the control unit (13) is designed for masking intensity image information provided subsequent to an illumination pulse from being displayed on the display (19).
37. A monitoring system according to any one of the preceding claims 27 to 36, comprising radio based positioning receivers (11) assigned to the range cameras (12), wherein the control unit is designed for receiving signals representing positional information from the radio based positioning receivers (11) and subject to the positional information for providing a control signal to the display (19) for displaying image information stemming from one or more range cameras (12) other than the one or more range cameras (12) in charge of providing the one or more range images selected for the subset.
38. A monitoring system according to any one of the preceding claims 27 to 37, comprising a warning system controlled by the control signal.
39. A movable object, comprising a monitoring system according to any one of the preceding claims 26 to 38, wherein the range cameras (12) are mounted at different locations of the movable object (4, 5, 6, 7) for providing range images of different areas around the movable object (4, 5, 6, 7).
40. A movable object according to claim 39, wherein said movable object (4, 5, 6, 7) is one of a vehicle (4, 5, 6), a crane (4b), a dragline, a haul truck (4a), a digger (4c) and a shovel.
PCT/CH2011/000136 2010-06-10 2011-06-07 Method and control unit for controlling a display WO2011153652A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2011264358A AU2011264358B2 (en) 2010-06-10 2011-06-07 Method and control unit for controlling a display
ZA2012/09336A ZA201209336B (en) 2010-06-10 2012-12-10 Method and control unit for controlling a display

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CH935/10 2010-06-10
CH9352010 2010-06-10

Publications (2)

Publication Number Publication Date
WO2011153652A2 true WO2011153652A2 (en) 2011-12-15
WO2011153652A3 WO2011153652A3 (en) 2012-06-28

Family

ID=44503450

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CH2011/000136 WO2011153652A2 (en) 2010-06-10 2011-06-07 Method and control unit for controlling a display

Country Status (3)

Country Link
AU (1) AU2011264358B2 (en)
WO (1) WO2011153652A2 (en)
ZA (1) ZA201209336B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8779934B2 (en) 2009-06-12 2014-07-15 Safemine Ag Movable object proximity warning system
US8994557B2 (en) 2009-12-11 2015-03-31 Safemine Ag Modular collision warning apparatus and method for operating the same

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004021546A2 (en) 2002-08-09 2004-03-11 Conti Temic Microelectronic Gmbh Means of transport with a three-dimensional distance camera and method for the operation thereof
WO2004047047A1 (en) 2002-11-15 2004-06-03 Philips Intellectual Property & Standards Gmbh Method and system for avoiding traffic collisions

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040217851A1 (en) * 2003-04-29 2004-11-04 Reinhart James W. Obstacle detection and alerting system
JP4847051B2 (en) * 2005-06-09 2011-12-28 クラリオン株式会社 Vehicle surrounding monitoring method and system
EP1916846B1 (en) * 2005-08-02 2016-09-14 Nissan Motor Company Limited Device and method for monitoring vehicle surroundings
US8170787B2 (en) * 2008-04-15 2012-05-01 Caterpillar Inc. Vehicle collision avoidance system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004021546A2 (en) 2002-08-09 2004-03-11 Conti Temic Microelectronic Gmbh Means of transport with a three-dimensional distance camera and method for the operation thereof
WO2004047047A1 (en) 2002-11-15 2004-06-03 Philips Intellectual Property & Standards Gmbh Method and system for avoiding traffic collisions

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8779934B2 (en) 2009-06-12 2014-07-15 Safemine Ag Movable object proximity warning system
US9129509B2 (en) 2009-06-12 2015-09-08 Safemine Ag Movable object proximity warning system
US8994557B2 (en) 2009-12-11 2015-03-31 Safemine Ag Modular collision warning apparatus and method for operating the same

Also Published As

Publication number Publication date
WO2011153652A3 (en) 2012-06-28
AU2011264358B2 (en) 2015-10-29
AU2011264358A1 (en) 2013-01-10
ZA201209336B (en) 2014-02-26

Similar Documents

Publication Publication Date Title
US11175406B2 (en) Range imaging system and solid-state imaging device
EP3357754B1 (en) Vehicle state display system
US9694736B2 (en) Vehicle state indication system
US9195894B2 (en) Vehicle and mobile device traffic hazard warning techniques
US11249473B2 (en) Remote driving managing apparatus, and computer readable storage medium
US8280621B2 (en) Vehicle collision avoidance system
EP3339999A2 (en) Information processing apparatus, operated vehicle, information processing method, and recording medium storing programm
US20180334099A1 (en) Vehicle environment imaging systems and methods
US20170192091A1 (en) System and method for augmented reality reduced visibility navigation
US9868389B2 (en) Vehicle state indication system
US20120245798A1 (en) Vehicle collision avoidance system
US20170088039A1 (en) Vehicle state indication system
KR20050121259A (en) Parking aid for a vehicle
EP3139340B1 (en) System and method for visibility enhancement
CN102740056A (en) Image display system
AU2010351500A1 (en) Object proximity warning system and method
US20040217851A1 (en) Obstacle detection and alerting system
EP3089136A1 (en) Apparatus and method for detecting an object in a surveillance area of a vehicle
US9902267B2 (en) Predicted position display for vehicle
US20130271606A1 (en) Method of displaying an assistant screen for improving driving safety of a vehicle
US11276309B2 (en) Vehicle control device
AU2020226982A1 (en) Work vehicle multi-camera vision systems
US11226616B2 (en) Information processing apparatus and computer readable storage medium for remotely driving vehicles
CA2802122C (en) Method and control unit for controlling a display of a proximity warning system
WO2011153652A2 (en) Method and control unit for controlling a display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11727391

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

ENP Entry into the national phase in:

Ref document number: 2011264358

Country of ref document: AU

Date of ref document: 20110607

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 11727391

Country of ref document: EP

Kind code of ref document: A2