AU2011264358B2 - Method and control unit for controlling a display - Google Patents

Method and control unit for controlling a display

Info

Publication number
AU2011264358B2
Authority
AU
Australia
Prior art keywords
range
image
display
cameras
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
AU2011264358A
Other versions
AU2011264358A1 (en)
Inventor
Urs Martin Rothacher
Peter Arnold Stegmaier
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Safemine AG
Original Assignee
Safemine AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Safemine AG
Publication of AU2011264358A1
Application granted
Publication of AU2011264358B2
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/51 Display arrangements
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Abstract

The present idea refers to a method and a control unit for controlling a display (19). Vehicles and other objects (4a, 4b, 4c, 5, 6, 7, 8), for example in a surface mine (1), are equipped with range cameras (12). A control unit (13) of such an object (4a, 4b, 4c, 5, 6, 7, 8) receives signals representative of range images from these at least two range cameras (12). The at least two range images are evaluated, and a subset of range images is selected from the range images available subject to the result of the evaluation. A control signal is provided to the display (19) for displaying image information stemming from the one or more range cameras (12) in charge of providing the one or more range images selected for the subset. By such a method, the scene most relevant in terms of collision avoidance can be displayed to the personnel in charge.

Description

Method and control unit for controlling a display

Technical Field

The invention relates to a method and a control unit for controlling a display.

Background Art

Surface mines and similar sites or areas are generally operated by means of a large number of vehicles, some of which may be exceedingly large and difficult to maneuver and have very limited visibility for the operator. Collision and/or proximity warning systems are established for conventional automobiles as well as for extra-large vehicles.

Proximity warning systems in the form of park distance control systems make use of ultrasonic sensors located in the bumpers of a car. More sophisticated systems may rely on different sensors such as three-dimensional distance cameras as proposed in WO 2004/021546 A2. There, it is suggested to provide at least a forward, a backward and a sideward looking camera at a passenger car.

For extra-large vehicles used in mining, WO 2004/047047 A2 suggests to use satellite supported radio positioning receivers on board of the vehicles and other objects, such as cranes, for generating proximity warnings in order to reduce the risk of collisions.

Other approaches for extra-large vehicles are introduced in "Avoiding accidents with mining vehicles", retrieved from the Internet at http://www.flir.com/uploadedFiles/Eurasia/MMC/ApplStories/AS_0020_EN.pdf, accessed on February 2, 2010. Sensors for avoiding collisions may include radar systems, conventional cameras or thermal imaging cameras.

In non-conventional types of vehicles such as the vehicles used in mining, each camera may display its images on a display installed in the driving cab. The more cameras there are available, the more image information the driver is exposed to, such that the driver may be distracted by images not relevant for collision avoidance, or may be overstrained by monitoring the output of all cameras available.

Disclosure of the Invention

In this respect, it is desired to improve means in a multi-camera based proximity warning system for drawing the attention of the personnel in charge to the most relevant camera output(s).

According to a first aspect of the present invention, there is provided a method for controlling a display according to the features of independent claims 1 or 11. Accordingly, signals representative of range images from at least two range cameras arranged for providing range images of different scenes are received. These range images are evaluated. A subset of range images is selected from the range images available subject to the result of the evaluation. A control signal is provided for the display to display image information stemming from the one or more range cameras in charge of providing the one or more range images selected for the subset.

According to another aspect of the present invention, there is provided a control unit for controlling a display according to the features of independent claim 13. Such a control unit comprises a receiver for receiving signals representative of range images from at least two range cameras for providing range images of different scenes. An evaluation unit is designed for evaluating the at least two range images, and a selection unit is designed for selecting a subset of range images from the range images available subject to the result of the evaluation.
At the output of the control unit, there is provided a control signal for the display to display image information stemming from the one or more range cameras in charge of providing the one or more range images selected for the subset.

The basic idea of the present invention is to provide range cameras and to analyze the range images provided by these range cameras. It is ensured that the image information most relevant, especially in terms of collision avoidance, is displayed on the display. The determination which one(s) of the range cameras available currently monitors the most relevant scene is performed by an evaluation and a selection unit comprised in the control unit. In particular, ranges provided in the range images are analyzed in terms of proximity to objects identified in the range image.

By automatically selecting the range camera currently monitoring the scene of most interest and by displaying image information delivered from this range camera, the personnel in charge of the safe operation of such a vehicle or stationary object may not be distracted by a multitude of image information but instead may focus on the most relevant image information displayed.

For advantageous embodiments it is referred to the dependent claims. It is noted that embodiments referred to or claimed only in connection with the method are deemed to be disclosed in connection with the apparatus, too, and vice versa.
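To make the receive-evaluate-select-signal chain described above concrete, the following is a minimal Python sketch under stated assumptions: the class and method names are illustrative inventions, the per-camera proximity measure is simply the shortest range in the image, and the optional threshold is an example parameter. It is not the claimed apparatus, only one way such a control unit could be organized.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional
import numpy as np


@dataclass
class RangeImage:
    camera_id: str       # unique identifier of the range camera (stored in memory 18)
    ranges: np.ndarray   # 2-D array of per-pixel distances in metres


class ControlUnit:
    """Hypothetical control unit: evaluates range images and selects a subset."""

    def __init__(self, threshold_m: Optional[float] = None):
        self.threshold_m = threshold_m  # optional programmable distance threshold

    def evaluate(self, images: List[RangeImage]) -> Dict[str, float]:
        # Assign each range image its shortest measured range as a proximity measure.
        return {img.camera_id: float(np.nanmin(img.ranges)) for img in images}

    def select(self, evaluation: Dict[str, float], x: int = 1) -> List[str]:
        # Keep the x cameras whose images contain the closest objects, optionally
        # only if that closest range satisfies the threshold condition.
        ranked = sorted(evaluation.items(), key=lambda kv: kv[1])
        if self.threshold_m is not None:
            ranked = [kv for kv in ranked if kv[1] <= self.threshold_m]
        return [camera_id for camera_id, _ in ranked[:x]]

    def control_signal(self, images: List[RangeImage]) -> List[str]:
        # Here the "control signal" is simply the list of selected camera ids
        # that the display should show; a real unit would drive display hardware.
        return self.select(self.evaluate(images))
```

With x left at its default of 1, this sketch reproduces the preferred behaviour described later in the specification: only the camera that currently sees the closest object is handed to the display.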
Brief Description of the Drawings

A number of embodiments of the present invention will now be described by way of example only and with reference to the accompanying drawings, in which the figures show:

Fig. 1 a schematic representation of a mining site,
Fig. 2 a block diagram of a monitoring system according to an embodiment of the present invention,
Fig. 3 a top view on a schematic vehicle with range cameras mounted according to an embodiment of the present invention,
Fig. 4 a display,
Fig. 5 a block diagram of another monitoring system according to an embodiment of the present invention,
Fig. 6 another display,
Fig. 7 a timing diagram for illustrating the operation of a range camera according to an aspect of the present invention, and
Fig. 8 a representation of a range image.

Modes for Carrying Out the Invention

Definitions:

In the present application, "range" is used equivalently to the term "distance".

A "range camera" is a sensor device for determining the distance to points in a scene. The distance typically refers to a distance between the camera and the object. The range camera at least provides a "range image", which represents an image in which a distance value is measured for each pixel of such image. A visualization technique for a two-dimensional range image may comprise assigning different gray-scale intensities to different ranges. Other visualization techniques for range images are included.

There are many different ways of producing range images. The most prominent of these techniques include stereo triangulation, sheet of light triangulation, or time-of-flight measurement. A person skilled in the art will appreciate other techniques for producing a range image. Many of the techniques, except for stereo triangulation, need to provide an illumination for acquiring the distance information, which is gained by means of detecting a reflection of the illumination from the remote object. Subject to the method applied, the illumination may be operated in a pulsed manner, or the illumination may be achieved by laser light. In other embodiments, the illumination source may provide for an illumination comprising multiple wavelengths, or multiple wavelengths with different modulation frequencies. Especially, the means for providing a range image may comprise a LIDAR system, i.e. a light detection and ranging system with discrete laser emitters for illumination.

In the present application, a range camera at least provides a range image. However, a range camera may provide information beyond the range image. The range camera may additionally provide an intensity image of the scene, i.e. a conventional image, preferably in digital form, in which the light intensity is detected for each pixel. The light intensity in such a case does not represent a distance to the object, as it does after the transformation of distances into gray-scales as illustrated above. The light intensity image can be embodied as a gray-scale image or as a color image. For taking the intensity image there may or may not be illumination means used, subject to the ambient light intensity. An intensity image is considered to represent the result of conventional photography or filming, where any light detected by the camera is not investigated for distance information.
Cameras providing both range and intensity information are also called three-dimensional cameras.

Whenever a range camera provides both intensity and range image information, the visualization of these images may be separated, such that an intensity image and a range image are provided, or both kinds of image information, i.e. range and intensity, are merged into one image. In this respect, "image information" may include one or both of range image information and intensity image information, and for the embodiment in which both kinds of information are provided, any image information processing may include a separate handling as well as an integrated handling of the underlying data.

The range cameras provide range images of "different scenes". A scene is "different" from another scene whenever the cameras involved do not shoot or scan the same perspective. This means that whenever a camera is mounted at a different position from another camera, these cameras provide range images of different scenes. In the context of the present application, it is preferred that the at least two range cameras are mounted to the same object, which may be a movable object such as a vehicle, or a stationary object such as a building. In such a scenario, it is preferred that the range cameras are arranged such that they scan different sides of the object they are mounted to, in order to detect other objects in proximity at all sides of the object. A left-hand sided camera mounted at a vehicle may provide exactly the same image content as the right-hand sided camera, e.g. a dusty road; however, the image content is not equivalent to the scene, as the right-hand sided camera provides an image of the dusty road to the right hand of the vehicle and the left-hand sided camera does so to the left hand of the vehicle, and the dusty road to the right hand is a scene different from the dusty road to the left hand.
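As a simple illustration of the gray-scale visualization mentioned in the definitions above, the following Python sketch maps per-pixel ranges to 8-bit intensities. It is a minimal sketch only; the near/far clipping limits are assumed example values, not taken from the specification.

```python
import numpy as np


def range_to_grayscale(ranges: np.ndarray,
                       near_m: float = 0.5,
                       far_m: float = 50.0) -> np.ndarray:
    """Map a 2-D array of per-pixel distances (metres) to an 8-bit image.

    Closer points become brighter; anything at or beyond `far_m` becomes black.
    The near/far limits are assumed example values.
    """
    clipped = np.clip(ranges, near_m, far_m)
    normalized = (far_m - clipped) / (far_m - near_m)   # 1.0 = close, 0.0 = far
    return (normalized * 255).astype(np.uint8)
```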
The "control unit" may be embodied in hardware, in software or both, and may be embodied in a single device, or its functions may be decentralized. The functional building blocks "evaluation unit" and "selection unit" may also be embodied in hardware, in software or both.

The "display" may have the form of a single display for displaying images from a single source, or may allow displaying images from many different sources, i.e. cameras, simultaneously. The "display" also encompasses the totality of a multitude of separate displays which are, for example, distributed in the driver's cab of a vehicle. Summarizing, the display includes any displaying means for displaying image information delivered by the range cameras.

The "control signal to display image information" triggers at least displaying the image information selected for displaying. The control signal may evoke additional action subject to what is displayed during the normal mode of operation: If, for example, the display regularly shows images of a single source only, the control signal may cause a switch from displaying images from the current source to displaying images from the source selected according to the present idea. If the current image source by chance coincides with the selected image source, there may be no change visible to the monitoring person. In some embodiments, the control signal causes the selected images to be highlighted for drawing attention to the subject images, e.g. by a flashing frame or other means. If, for example, the display by default displays images from various sources, the control signal may cause the entire display to show images only from the selected source. Or, the control signal may cause images from other sources to be shut down, completely masked or visually downsized in order to emphasize the selected images. The selected image may, as indicated, claim the entire display screen, or remain in an unchanged image size on the screen. The control signal may include zooming into the object identified as being closest in the range images provided by the range camera selected. The control signal may additionally cause acoustic warnings to be issued. The control signal may either comprise the selected image information itself, or it may cause the subject cameras to directly route the requested image information to the display, or it may cause the display to accept only display information from the range camera as selected. All the above holds true also for the selection of multiple range images, if appropriate.
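The paragraph above lists several possible effects of the control signal. Purely as an illustrative assumption, and not as the claimed mechanism, such a dispatch could be sketched as follows; the enum values and the methods assumed on the `display` object are hypothetical names standing in for whatever interface a concrete display controller would offer.

```python
from enum import Enum, auto
from typing import List


class DisplayAction(Enum):
    SWITCH_SOURCE = auto()     # show only the selected camera(s)
    HIGHLIGHT = auto()         # e.g. flashing frame around the selected image
    MASK_OTHERS = auto()       # shut down / downsize images from other sources
    ZOOM_CLOSEST = auto()      # zoom into the object identified as closest
    ACOUSTIC_WARNING = auto()  # additionally issue an acoustic warning


def apply_control_signal(display, selected_cameras: List[str],
                         actions: List[DisplayAction]) -> None:
    # `display` is assumed to expose the methods used below (hypothetical API).
    for action in actions:
        if action is DisplayAction.SWITCH_SOURCE:
            display.show_only(selected_cameras)
        elif action is DisplayAction.HIGHLIGHT:
            display.highlight(selected_cameras)
        elif action is DisplayAction.MASK_OTHERS:
            display.mask_all_except(selected_cameras)
        elif action is DisplayAction.ZOOM_CLOSEST:
            display.zoom_to_closest(selected_cameras)
        elif action is DisplayAction.ACOUSTIC_WARNING:
            display.sound_warning()
```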
A "warning system" and a corresponding "warning" activity may refer to any suitable activity for drawing the attention of the driver or other persons to image information selected to be displayed, and finally to the underlying scene that may become critical in terms of collision or proximity. Such a warning system may include acoustic means such as a horn, a diaphone or a speaker. It may alternatively or additionally include visual means, including the display for displaying the image information itself, or, alternatively or additionally, a separate visual warning indicator such as another display, one or more LEDs, a flashlight, etc. The warning may especially be implemented on the display for displaying the image information by intermittently displaying the image information, by alternating between the image information and its inverse colors, or by overlaying the image information with visual warning elements. The warning may be issued in combination with the control signal such that the control signal may activate the warning system, too. In other embodiments, a control signal separate from the control signal for the display may be issued subject to range information in the selected image. For example, the control signal to the display may be issued based on a first threshold condition of a range in the subject image, and the separate control signal for the warning system may be issued based on a second threshold condition of a range in the subject image that is, for example, stricter than the first threshold condition.

The term "radio based positioning system" stands for a GNSS or for any other type of positioning system based on radio signals, such as a pseudolite system. The term "GNSS" stands for "Global Navigation Satellite System" and encompasses all satellite based navigation systems, including GPS and Galileo. A "receiver" is a receiver designed for receiving information from satellites and for determining its position subject to the signals received. A radio based positioning receiver is "assigned" to a range camera whenever the camera and the receiver are arranged at more or less the same position, being separated from each other by a maximum of two meters.

A "movable object" is any object that can change and is expected to change its position and/or orientation or configuration in space. It may e.g. be a truck or any other vehicle that moves from place to place and changes its orientation with respect to the general north-south direction, e.g. by steering, or it may be an object positioned at a fixed location but able to rotate about its axis or to change its physical configuration, e.g. by extending an arm, in such a manner that the volume of safety space attributed to it varies in a significant manner.

Fig. 1 schematically depicts a site 1, such as a surface mine. Typically, such a site covers a large area, in the case of a surface mine e.g. in the range of square kilometers, with a network of roads 2 and other traffic ways, such as rails 3. A plurality of objects is present in the mine, such as:

- Large vehicles, such as haul trucks 4a, cranes 4b or diggers 4c. Vehicles of this type may easily weigh several hundred tons, and they are generally difficult to control, have very large braking distances, and a large number of blind spots that the driver is unable to visually monitor without monitoring cameras.
- Medium sized vehicles 5, such as regular trucks. These vehicles are easier to control, but they still have several blind spots and require a skilled driver.
- Small vehicles 6. Typically, vehicles of this type weigh 3 tons or less. They comprise passenger vehicles and small lorries.
- Trains 7.

A further type of object within the mine is comprised of stationary obstacles, such as temporary or permanent buildings 9, open pits, boulders, non-movable excavators, stationary cranes, deposits, etc.

The risk of accidents in such an environment is high. In particular, the large sized vehicles can easily collide with other vehicles or obstacles.
For this reason, objects present in a mine 1 and subject to potential collision may be equipped with at least two range cameras 12, a control unit per object, and a display per object. The entirety of these elements per object for generating a proximity warning is called a monitoring system.

Figure 2 illustrates a block diagram of a control unit 13 according to an embodiment of the present invention. A receiver 17 of the control unit 13 is connected to range cameras 12. An output 16 of the control unit 13 is connected to a display 19. Both connections may be implemented as wireless connections or as wired connections. One or more connections can be implemented via bus connections. The control unit 13 comprises a microprocessor system 14, which controls the operations of the control unit 13. A memory 18 comprises programs as well as various parameters, such as unique identifiers of the range cameras. Such programs may comprise instructions for evaluating the incoming range images, and for selecting a subset of range images which identifies the range cameras that currently provide the most significant image information.

The monitoring system further comprises radio based positioning receivers 11. These receivers 11 may determine their respective positions in combination with satellites 30 as shown in Figure 1. Positional information may be sent to a corresponding receiver 15 in the control unit 13. Each positioning receiver 11 is advantageously assigned to a range camera 12 such that the location of a camera / positioning receiver pair is more or less the same.

Each range camera 12 delivers a series of range images with respect to the scene monitored by the respective range camera 12. Figure 3 illustrates a schematic top view on a car 6 equipped with four range cameras 12, one located at each side of the vehicle 6. The area monitored by each range camera 12 is indicated by a sector. This makes each range camera 12 scan a different scene at each side of the vehicle 6. Alternatively, the range cameras 12 can be located at the edges of the vehicle 6. Both arrangements are beneficial for covering a large area in the vicinity of the mobile object for proximity and/or collision detection purposes.

In the control unit 13, the range images provided by the range cameras 12 are evaluated. In a preferred embodiment, objects in the range image are identified and their distance from the camera 12 is determined. After this, it is determined by means of selection which of the objects identified shows the shortest range to the camera 12, i.e. is closest to the camera 12 and may be identified as the object with the highest likelihood to collide with the present object. The range camera 12 having provided the range image including the closest object is selected to provide image information to the personnel on board, or wherever the control unit is located with respect to an object.

Basically, there can be two ways of selecting the subset of range images most important for gaining the personnel's attention.
Either, after object identification, all the objects are assigned a range, which can be one of the ranges measured for such object, and more specifically one of the ranges associated with a pixel of such object, or it can be the shortest distance of a pixel out of all pixel distances contributing to such object, or, for example, it can be an average over all the range values measured for the pixels of an object. Such assigned ranges are advantageously compared across all objects identified in all range images. The comparison may deliver the x closest objects identified in all range images. When translated into the scenario of a vehicle, this means that the x objects closest to the vehicle are identified hereby. These x closest objects may be scanned by at most x range cameras, or by fewer than x range cameras provided that one of the cameras has scanned multiple objects amongst the x closest objects.

Alternatively, for each range image the closest object of all objects of such range image is determined by determining the shortest range assigned to any object in the range image. Then, a comparison of all the closest objects results in x objects with a range amongst the x shortest ranges identified, which provides for exactly x range images as a subset of all available range images.

Advantageously, for both ways of processing, x is set to one such that only image information of the range camera delivering the most relevant range image will be displayed in the following.

The selected range images represent the range cameras that currently film the closest objects, which are of most interest to be monitored by the personnel in charge in order to avoid a collision. This is why image information stemming from these range cameras is emphasized in being presented to the personnel via the display.

In an advantageous embodiment, the selection step may include selecting a subset of range images from the range images available only when the distance of an object fulfils a threshold condition, and preferably shows a distance less than a threshold. Such a threshold may be programmable in the control unit. The object under investigation may be an object within an image, a closest object within an image, a closest object in all images available, or other objects evaluated.

The display 19 in Figure 4 represents a flat panel display offering the display of images from e.g. 16 range cameras across its screen. Once the control signal is received from the control unit 13, and provided the control signal identifies only one range camera 12 for providing the image information most relevant to be displayed, the entire screen of Figure 4 may be reserved for showing the subject image information. In Figure 4, the image information comprises a range image 20 delivered by the subject range camera as well as an intensity image 21 delivered by the subject range camera, which, in the present embodiment, is a range camera providing three-dimensional images.
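As an illustrative sketch only (the function names and data layout are assumptions, not the claimed method), the two selection strategies described above might look as follows in Python. `objects_per_camera` is assumed to map each camera identifier to the list of per-object ranges obtained by a prior object-identification step.

```python
from typing import Dict, List


def select_by_global_closest(objects_per_camera: Dict[str, List[float]],
                             x: int = 1) -> List[str]:
    """First strategy: rank all objects from all range images together and keep
    the cameras that saw any of the x closest objects."""
    all_objects = [(rng, cam) for cam, ranges in objects_per_camera.items()
                   for rng in ranges]
    closest = sorted(all_objects)[:x]
    # Fewer than x cameras may result if one camera saw several of the x closest objects.
    return list(dict.fromkeys(cam for _, cam in closest))


def select_by_per_image_closest(objects_per_camera: Dict[str, List[float]],
                                x: int = 1) -> List[str]:
    """Second strategy: reduce each range image to its closest object first,
    then keep exactly the x images with the shortest such ranges."""
    per_image = sorted((min(ranges), cam)
                       for cam, ranges in objects_per_camera.items() if ranges)
    return [cam for _, cam in per_image[:x]]


# Example: with x = 1, both strategies pick the camera seeing the nearest object.
example = {"front": [12.4, 30.1], "rear": [8.7], "left": [25.0], "right": [40.2]}
assert select_by_global_closest(example) == ["rear"]
assert select_by_per_image_closest(example) == ["rear"]
```

The threshold condition mentioned above would simply be applied as an additional filter on the ranges before either ranking step.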
The block diagram of Figure 5 differs from the block diagram of Figure 2 only in the way the control signal affects the control of the display. Instead of the control signal carrying the image information itself, the control signal now acts on AND gates 22, each of which is connected with one of the range cameras 12. By activating one of the AND gates by a corresponding control signal, the subject AND gate allows the associated range camera 12 to provide image information to the display 19, while, for example, all the other AND gates are blocked and do not allow displaying image information from the other range cameras 12. The choice of which sort of image information of a range camera 12 shall be displayed, e.g. a range image, an intensity image, or both, can be preconfigured at the range camera. By providing multiple AND gates per camera, with the camera providing outputs for different kinds of image information, it may be determined by the control signal which image information shall be displayed. Advantageously, the type of image information to be displayed may also be selected by the personnel.

Figure 6 provides another schematic representation of a display 19, which display 19 is divided into four sub-displays 191 - 194, each sub-display 191 - 194 permanently displaying image information from an assigned range camera. In this embodiment, the control signal only highlights the sub-display 192, which displays image information from the range camera 12 whose range images were evaluated and selected to be most critical in terms of a potential collision.

The radio based positioning receivers 11 may provide positional information of the subject location they are arranged at. Given that there is at least one radio based positioning receiver 11 mounted to an object, and provided that other moving or stationary objects on the site are equipped with such receivers, too, such positional information may be shared between the control units of the objects, or the receivers themselves, such that by comparing positional information stemming from positioning receivers located on different objects, proximity and even approximation can be detected. For further details it is referred to PCT/CH2009/000394, which is incorporated herein by reference.

Location information provided by the positioning receivers 11 of the same object is transferred to the control unit 13 and evaluated there. Advantageously, such evaluation takes into account positional information received from other objects, which may be provided via a wireless interface not shown in the Figures. By way of evaluating the positional information from these different sources, a proximity situation may be detected. If such a proximity situation is detected by means of the positional information, a control signal may be issued which activates displaying image information stemming from the range camera which is assigned to the positioning receiver which is at least co-responsible - together with a positioning receiver of a different object - for issuing the proximity warning. In a preferred embodiment, the image information from the subject range camera selected by such a process prevails over the image information from a different range camera selected based on the evaluation of the range images.

Such an embodiment is beneficial in scenarios where an object may become close in an area which the range cameras do not or cannot scan, or where a range camera may not provide correct range images and may even fail. In the latter scenario, the analysis of the positional information serves as a safety mechanism for malfunctions in the camera system.
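The positional-information fallback described above amounts to comparing receiver positions from different objects against a distance limit and, if it is undershot, activating the camera assigned to the receiver involved. A minimal sketch, assuming flat-earth local east/north coordinates and an illustrative warning distance (both assumptions, not values from the specification):

```python
import math
from typing import Dict, Optional, Tuple

# Positions are assumed to be local east/north coordinates in metres, e.g. after
# projecting GNSS fixes; each own receiver id stands for the camera assigned to it.
Position = Tuple[float, float]


def proximity_camera(own_receivers: Dict[str, Position],
                     other_object_positions: Dict[str, Position],
                     warning_distance_m: float = 50.0) -> Optional[str]:
    """Return the own receiver id (and hence its assigned camera) closest to any
    other object's receiver, if that distance is below the warning distance;
    otherwise return None (no proximity situation)."""
    best: Optional[Tuple[float, str]] = None
    for rid, (e1, n1) in own_receivers.items():
        for (e2, n2) in other_object_positions.values():
            d = math.hypot(e1 - e2, n1 - n2)
            if best is None or d < best[0]:
                best = (d, rid)
    if best is not None and best[0] <= warning_distance_m:
        return best[1]   # trigger display of the camera assigned to this receiver
    return None
```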
Focus is now turned to the timing characteristics of the control signal. A permanent control signal implies that the selected image information will always prevail and possibly prevent other sources from displaying image information. This does not mean that it is always the same camera providing the image information to be displayed, as the selection step may select a new camera to provide image information from time to time. The present idea can be modified by issuing the control signal only when the closest range assigned to an object of such a range image meets a threshold condition, e.g. falls below a threshold value. Such a threshold value has the effect that, in a situation where no object is close enough to the monitoring system, a default display mode is executed, which includes, for example, permanently displaying image information from all the range cameras in assigned portions of the display, or which may include switching between the range cameras for displaying image information on a single display. Only if the threshold condition is met by any one of the objects in any one of the range images does the control signal become active and grant access to the display to the selected range camera.

Another embodiment is related to the duration of the control signal. Even today, there are range cameras available which provide range images at a rate of more than one image per 100 ms. The control signal may be allowed to switch from one range camera to another with every set of range images being evaluated, which might cause toggling. However, there are alternate options according to which a once selected range camera rests for a minimum amount of time, allowing the personnel to get an idea of the object approaching.
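To avoid the toggling mentioned above, a once selected camera can be held for a minimum time before the selection is allowed to change. A small, hypothetical sketch; the hold time is an assumed example value, not one given in the specification.

```python
import time
from typing import Optional


class SelectionHold:
    """Keep a once-selected camera active for at least `hold_s` seconds so the
    display does not toggle with every newly evaluated set of range images."""

    def __init__(self, hold_s: float = 2.0):
        self.hold_s = hold_s
        self._current: Optional[str] = None
        self._since: float = 0.0

    def update(self, newly_selected: str, now: Optional[float] = None) -> str:
        now = time.monotonic() if now is None else now
        # Only allow a change if nothing is selected yet or the hold time elapsed.
        if self._current is None or now - self._since >= self.hold_s:
            if newly_selected != self._current:
                self._current, self._since = newly_selected, now
        return self._current
```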
In a beneficial embodiment of the present invention, the image information displayed in response to the control signal is an intensity image provided by the range camera that also provides the range image. Because most of the distance measuring techniques need to apply illumination of the scene in order to have a light signal of sufficient strength reflected from the object, which in turn allows for determining the distance to such object, the illumination pulse may interfere with the regular shooting of an intensity image. A photo-sensor of the camera for receiving radiation may be overloaded by the illumination pulse itself. This is why it is preferred that any intensity image to be displayed is masked for the time of the illumination pulse in order not to provide overloaded signals to the display. This means that an intensity image provided by the range camera as selected is suppressed for a certain time during and/or after, or in response to, the illumination pulse. Given that a range image is produced at a frequency f_range, and the intensity image is produced at a frequency f_intensity with f_range < f_intensity, the intensity image is interrupted every f_range times. According to the diagram in Figure 7, an intensity image may, for example, be detected at times t1, t2, t3, ... An illumination pulse is produced at time t2. In light of the present embodiment, this means that at times t2 and t3 no intensity image is provided to be displayed, or alternatively, any intensity image taken at times t2 and t3 is masked from being displayed. In contrast, during the time interval t2 - t4 the reflection of the illumination pulse is expected to be detected, which allows for a distance determination. The intensity image will again be detected at time t4 and the following time t5, etc., until the next illumination pulse is scheduled.

Figure 8 refers to the identification of objects in a range image provided by a range camera. A possible means to identify an object is to evaluate the ranges measured for neighboring pixels of a range image. If the ranges measured for neighboring pixels are within a certain range interval, the respective pixels may be identified to form an object. A neighboring pixel with a range too far from the range interval may represent an edge of the object. In Figure 8, two objects in a range image are identified according to the above procedure, which objects are denoted by O1 and O2. The range image is divided into pixels represented by the grid. For each pixel a distance value is measured, assigned and properly stored. Alternatively, the entire range image may be defined as an object, and, for example, the smallest distance value of any pixel of the object represents the shortest range of an object in the range image.
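The neighboring-pixel grouping just described is essentially a region-growing segmentation over the range grid. The following Python sketch is one possible, assumed implementation; the range-interval tolerance and the 4-connected neighborhood are illustrative choices, not taken from the specification.

```python
from collections import deque
from typing import List, Tuple
import numpy as np


def segment_range_image(ranges: np.ndarray, tolerance_m: float = 0.5) -> np.ndarray:
    """Group neighboring pixels whose ranges differ by at most `tolerance_m`
    into objects. Returns an integer label image; labels start at 1."""
    h, w = ranges.shape
    labels = np.zeros((h, w), dtype=int)
    next_label = 1
    for y in range(h):
        for x in range(w):
            if labels[y, x]:
                continue
            # Flood fill from this seed pixel over its 4-connected neighbors.
            labels[y, x] = next_label
            queue = deque([(y, x)])
            while queue:
                cy, cx = queue.popleft()
                for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                    if 0 <= ny < h and 0 <= nx < w and not labels[ny, nx] \
                            and abs(ranges[ny, nx] - ranges[cy, cx]) <= tolerance_m:
                        labels[ny, nx] = next_label
                        queue.append((ny, nx))
            next_label += 1
    return labels


def shortest_range_per_object(ranges: np.ndarray,
                              labels: np.ndarray) -> List[Tuple[int, float]]:
    # Assign each identified object the shortest range of any of its pixels.
    return [(int(lbl), float(ranges[labels == lbl].min())) for lbl in np.unique(labels)]
```

The pulse-masking scheme discussed with Figure 7 above can likewise be sketched as a simple time-window check; the blanking interval below is an assumed illustration only.

```python
def intensity_frame_displayable(frame_time_s: float,
                                pulse_times_s: list,
                                blank_after_s: float = 0.02) -> bool:
    """Suppress intensity frames captured during or shortly after an
    illumination pulse, so overloaded sensor readings are never displayed."""
    return not any(p <= frame_time_s <= p + blank_after_s for p in pulse_times_s)
```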
While there are shown and described presently preferred embodiments of the invention, it is to be distinctly understood that the invention is not limited thereto but may be otherwise variously embodied and practiced within the scope of the following claims.

The reference in this specification to any prior publication (or information derived from it), or to any matter which is known, is not, and should not be taken as, an acknowledgment or admission or any form of suggestion that that prior publication (or information derived from it) or known matter forms part of the common general knowledge in the field of endeavour to which this specification relates.

Throughout this specification and the claims which follow, unless the context requires otherwise, the word "comprise", and variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated integer or group of integers or steps but not the exclusion of any other integer or group of integers.

Claims (19)

1. A method for controlling a display, comprising: - receiving signals representative of range images from at least two range cameras arranged for providing range images of different scenes; - evaluating the at least two range images; - selecting a subset of range images from the range images available subject to the result of the evaluation; - providing a control signal for the display to display image information stemming from the one or more range cameras in charge of providing the one or more range images selected for the subset; and - receiving signals representing positional information from radio based positioning receivers, and subject to the positional information providing a control signal for the display to display image information stemming from one or more range cameras other than the range cameras in charge of providing the one or more range images selected for the subset; - wherein the image information to be displayed comprises an intensity image of the subject scene or a range image of the subject scene.

2. Method according to claim 1, wherein the evaluation step comprises identifying objects in the range images, wherein an object is formed by neighboring pixels of the range image, a range associated with each of these pixels being within a given range interval.

3. Method according to any one of the preceding claims, wherein the evaluation step comprises assigning a range to an object identified in the range image, wherein the shortest range of any pixel of an object or an average range determined by averaging the ranges of all pixels contributing to an object is assigned as range to the object.

4. Method according to any one of the preceding claims, wherein the evaluation step comprises determining for each range image a closest object amongst all objects identified in the range image based on range information associated with such objects, wherein the closest object of a range image is identified by comparing the range information of all the objects, and by identifying the object with the shortest range as closest object.

5. Method according to claim 4, wherein the selection step comprises comparing the range of the closest object to a threshold and selecting the corresponding range image only if the range of the closest object is not more than the threshold.

6. Method according to claim 4, wherein the selection step comprises selecting x range images as the subset of range images, which x range images comprise the x closest objects.

7. A method according to any one of the preceding claims, wherein the control signal is provided for the display to display and highlight the image information, and in particular to exclusively display the image information.

8. A method according to any one of the preceding claims, comprising receiving signals representing positional information from radio based positioning receivers assigned to the range cameras.

9. A method according to claim 8, comprising receiving signals representing positional information from radio based positioning receivers of objects other than the object the range cameras are mounted to.

10. A method according to any one of the preceding claims, wherein the control signal is designed for generating an acoustic or visual warning.

11. A method for controlling a display, comprising: - scanning different scenes and providing range images of these scenes, - evaluating the range images provided and selecting a subset of range images subject to the result of the evaluation, - displaying image information stemming from one or more range cameras in charge of providing the one or more range images selected for the subset, and - receiving signals representing positional information from radio based positioning receivers, and subject to the positional information providing a control signal for the display to display image information stemming from one or more range cameras other than the range cameras in charge of providing the one or more range images selected for the subset; - wherein the image information to be displayed comprises an intensity image of the subject scene or a range image of the subject scene.

12. Computer program element comprising computer program code means which, when loaded in a processor unit of a control unit, configures the control unit to perform a method as claimed in any one of the preceding claims.

13. A control unit for controlling a display, comprising a receiver for receiving signals representative of range images from at least two range cameras for providing range images of different scenes, an evaluation unit for evaluating the at least two range images, a selection unit for selecting a subset of range images from the range images available subject to the result of the evaluation, an output for providing a control signal to the display for displaying image information stemming from the one or more range cameras in charge of providing the one or more range images selected for the subset, and receiving signals representing positional information from radio based positioning receivers, and subject to the positional information providing a control signal for the display to display image information stemming from one or more range cameras other than the range cameras in charge of providing the one or more range images selected for the subset, wherein the image information to be displayed comprises an intensity image of the subject scene or a range image of the subject scene.

14. A monitoring system, comprising a control unit according to claim 13, a display, and at least two range cameras for providing range images of different scenes.

15. A monitoring system according to claim 14, wherein in at least one range camera a LIDAR light detection and ranging method is implemented.

16. A monitoring system according to claim 14 or claim 15, wherein at least one range camera is a time of flight camera or a 3D camera providing both an intensity image and a range image.

17. A monitoring system according to any one of the preceding claims 14 to 16, comprising radio based positioning receivers assigned to the range cameras, wherein the control unit is designed for receiving signals representing positional information from the radio based positioning receivers and, subject to the positional information, for providing a control signal to the display for displaying image information stemming from one or more range cameras other than the one or more range cameras in charge of providing the one or more range images selected for the subset.

18. A monitoring system according to any one of the preceding claims 14 to 17, comprising a warning system controlled by the control signal.

19. A movable object, comprising a monitoring system according to any one of the preceding claims 14 to 18, wherein the range cameras are mounted at different locations of the movable object for providing range images of different areas around the movable object, and wherein said movable object is one of a vehicle, a crane, a dragline, a haul truck, a digger and a shovel.
AU2011264358A 2010-06-10 2011-06-07 Method and control unit for controlling a display Active AU2011264358B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CH9352010 2010-06-10
CH935/10 2010-06-10
PCT/CH2011/000136 WO2011153652A2 (en) 2010-06-10 2011-06-07 Method and control unit for controlling a display

Publications (2)

Publication Number Publication Date
AU2011264358A1 AU2011264358A1 (en) 2013-01-10
AU2011264358B2 true AU2011264358B2 (en) 2015-10-29

Family

ID=44503450

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2011264358A Active AU2011264358B2 (en) 2010-06-10 2011-06-07 Method and control unit for controlling a display

Country Status (3)

Country Link
AU (1) AU2011264358B2 (en)
WO (1) WO2011153652A2 (en)
ZA (1) ZA201209336B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102460536B (en) 2009-06-12 2016-06-22 矿山安全股份公司 movable object proximity warning system
WO2011069267A1 (en) 2009-12-11 2011-06-16 Safemine Ag Modular collision warning apparatus and method for operating the same

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070008091A1 (en) * 2005-06-09 2007-01-11 Hitachi, Ltd. Method and system of monitoring around a vehicle

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004021546A2 (en) 2002-08-09 2004-03-11 Conti Temic Microelectronic Gmbh Means of transport with a three-dimensional distance camera and method for the operation thereof
DE10253192A1 (en) 2002-11-15 2004-05-27 Philips Intellectual Property & Standards Gmbh Anti-collision system for use with road vehicle has position determining computer with GPS receiver and has radio transmitter ending signals to equipment carried by pedestrians
US20040217851A1 (en) * 2003-04-29 2004-11-04 Reinhart James W. Obstacle detection and alerting system
WO2007015446A1 (en) * 2005-08-02 2007-02-08 Nissan Motor Co., Ltd. Device for monitoring around vehicle and method for monitoring around vehicle
US8170787B2 (en) * 2008-04-15 2012-05-01 Caterpillar Inc. Vehicle collision avoidance system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070008091A1 (en) * 2005-06-09 2007-01-11 Hitachi, Ltd. Method and system of monitoring around a vehicle

Also Published As

Publication number Publication date
ZA201209336B (en) 2014-02-26
WO2011153652A2 (en) 2011-12-15
AU2011264358A1 (en) 2013-01-10
WO2011153652A3 (en) 2012-06-28

Similar Documents

Publication Publication Date Title
US11175406B2 (en) Range imaging system and solid-state imaging device
US20170192091A1 (en) System and method for augmented reality reduced visibility navigation
US20190294160A1 (en) Remote driving managing apparatus, and computer readable storage medium
US9785845B2 (en) Drive support display device
US20180334099A1 (en) Vehicle environment imaging systems and methods
JP4556742B2 (en) Vehicle direct image display control apparatus and vehicle direct image display control program
EP2910971A1 (en) Object recognition apparatus and object recognition method
US20060220910A1 (en) Parking aid for a vehicle
AU2010351500B2 (en) Object proximity warning system and method
JPH06293236A (en) Travel environment monitoring device
US20040217851A1 (en) Obstacle detection and alerting system
WO2020196513A1 (en) Object detection device
JP2001052171A (en) Surrounding environment recognizing device
CN111742235A (en) Method and system for identifying an empty parking space suitable for a vehicle
US11226616B2 (en) Information processing apparatus and computer readable storage medium for remotely driving vehicles
US9524644B2 (en) Notification control method and notification control device
AU2011264358B2 (en) Method and control unit for controlling a display
US10343603B2 (en) Image processing device and image processing method
JP2016162204A (en) Dirt determination device
AU2010355231B2 (en) Method and control unit for controlling a display of a proximity warning system
KR101449288B1 (en) Detection System Using Radar
JP5436652B1 (en) Vehicle periphery monitoring device and vehicle periphery monitoring method
JP4261321B2 (en) Pedestrian detection device
CN113752945B (en) Vehicle-mounted display system
KR102497610B1 (en) Device for safety aid using a image

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)