WO2011153646A1 - Method and control unit for controlling a display of a proximity warning system - Google Patents
Method and control unit for controlling a display of a proximity warning system
- Publication number
- WO2011153646A1 (PCT/CH2010/000152)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cameras
- positional information
- camera
- display
- images
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
Definitions
- the invention relates to a method and a control unit for controlling a display of a proximity warning system.
- Collision and/or proximity warning systems are established for conventional automobiles as well as for extra-large vehicles.
- Proximity warning systems in the form of park distance control systems make use of ultrasonic sensors located in the bumpers of a car. More sophisticated systems may rely on different sensors, such as the three-dimensional distance cameras proposed in WO 2004/021546 A2, which suggests providing at least a forward-, a backward- and a sideward-looking camera on a passenger car.
- WO 2004/047047 suggests using satellite-supported radio positioning receivers on board vehicles and other objects, such as cranes, for generating proximity warnings in order to reduce the risk of collisions.
- Another approach based on GNSS receivers is disclosed in the International Application No. PCT/CH2009/000200, incorporated herein by reference.
- Other approaches for extra-large vehicles are introduced in "Avoiding accidents with mining vehicles", retrieved from the Internet.
- Sensors for avoiding collisions may include radar systems, conventional cameras or thermal imaging cameras.
- each camera may display its image on a display installed in the driving cab.
- a signal representing positional information is received from a radio based positioning receiver.
- a subset of at least one camera out of at least two cameras for providing images of different scenes is selected dependent on the positional information.
- a control signal is provided for the display to display images provided by the selected subset of one or more cameras.
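The three steps above (receive positional information, select a camera subset, emit a display control signal) can be sketched roughly as follows. The bearing-based selection rule, the camera headings and every function name here are illustrative assumptions, not taken from the claims:

```python
import math

def select_subset(own_pos, other_pos, cameras):
    """Pick the camera whose viewing direction best matches the bearing
    from the own position to the detected object (assumed selection rule)."""
    dx = other_pos[0] - own_pos[0]
    dy = other_pos[1] - own_pos[1]
    bearing = math.degrees(math.atan2(dy, dx)) % 360.0

    def angular_gap(name):
        d = abs(bearing - cameras[name]) % 360.0
        return min(d, 360.0 - d)

    # cameras: dict of name -> boresight heading in degrees (0 = +x axis)
    return [min(cameras, key=angular_gap)]

def control_signal(selected):
    """Emit a control signal telling the display which images to show."""
    return {"show": selected}

cameras = {"front": 0.0, "left": 90.0, "rear": 180.0, "right": 270.0}
subset = select_subset((0.0, 0.0), (-5.0, 0.5), cameras)  # object behind us
signal = control_signal(subset)
```

An object at (-5.0, 0.5) relative to the own position lies at a bearing of roughly 174°, so the rear-looking camera is selected.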
- a control unit for controlling a display according to the features of independent claim 20.
- Such control unit comprises a receiving unit for receiving a signal representing positional information of an object from a radio based positioning receiver.
- a selection unit is designed for selecting a subset of at least one camera out of at least two cameras available for providing images of different scenes subject to the positional information.
- a control signal is provided to display images provided by the selected subset of one or more cameras.
- the basic idea of the present invention is to provide the operator with an aid indicating which of the camera outputs to look at, by prioritizing such camera(s).
- a GNSS receiver is used for determining the present location of the object the cameras are assigned to and/or the location of an object different to the one the cameras are assigned to.
- the location of an object, specifically represented as coordinates in a chosen coordinate system, may advantageously be subsumed under the term "positional information" as presently used.
- the GNSS receiver and the cameras are attached to the same object.
- An electronic map of preferably stationary objects that are critical to traffic on a site may be stored, and the current position of the object, as identified by the GNSS receiver, may be compared or otherwise put into relation to the position of one or more objects listed in such a map. For example, in case the distance between the object and a stationary object listed in the map is critical or may become critical, determined e.g. by subtracting the two location data from each other, it is decided which of the cameras the operator's attention is to be drawn to, which preferably is the camera looking in the direction of the critical object.
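Under the assumption of a planar site coordinate system, the map comparison described above might look as follows. The critical range and all names are illustrative; the text specifies no concrete threshold:

```python
import math

CRITICAL_RANGE_M = 50.0  # assumed threshold, not specified in the text

# Assumed shape of the electronic map of stationary obstacles.
stationary_map = [
    {"name": "building 9", "pos": (120.0, 40.0)},
    {"name": "open pit",   "pos": (30.0, -10.0)},
]

def critical_obstacles(own_pos, obstacle_map, max_range=CRITICAL_RANGE_M):
    """Return the names of map entries within the critical range,
    computed by 'subtracting the two location data' and taking the norm."""
    hits = []
    for entry in obstacle_map:
        dx = entry["pos"][0] - own_pos[0]
        dy = entry["pos"][1] - own_pos[1]
        if math.hypot(dx, dy) <= max_range:
            hits.append(entry["name"])
    return hits

hits = critical_obstacles((25.0, 0.0), stationary_map)
```

From (25.0, 0.0) only the open pit lies within the assumed 50 m range, so only that entry would trigger a camera selection.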
- the GNSS receiver and the cameras are attached to the same object, other objects including movable objects may be equipped with GNSS receivers, too, for determining their respective positions and/or trajectories.
- Such positional information is broadcast or individually transmitted by these objects to other objects on the site being equipped with a corresponding receiver.
- the direction and distance, and also any approaching velocity, may be determined at the present object with respect to one or more other objects around.
- it is again determined which of the cameras the operator's attention should be drawn to, which preferably is the camera looking in the direction of the critical object.
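A minimal sketch of deriving direction, distance and approaching velocity from two consecutive broadcast position fixes of another object; the own object is assumed stationary for simplicity, and all names are hypothetical:

```python
import math

def relative_state(own_pos, other_p0, other_p1, dt):
    """Return (bearing_deg, distance_m, closing_speed_mps) of another
    object relative to the own position, given two of its position
    fixes taken dt seconds apart."""
    d0 = math.hypot(other_p0[0] - own_pos[0], other_p0[1] - own_pos[1])
    d1 = math.hypot(other_p1[0] - own_pos[0], other_p1[1] - own_pos[1])
    bearing = math.degrees(math.atan2(other_p1[1] - own_pos[1],
                                      other_p1[0] - own_pos[0])) % 360.0
    closing = (d0 - d1) / dt  # positive while the object approaches
    return bearing, d1, closing

# An object straight ahead moving from 100 m to 90 m away within 2 s.
bearing, dist, closing = relative_state((0.0, 0.0), (100.0, 0.0), (90.0, 0.0), 2.0)
```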
- the GNSS receiver and the cameras are attached to different objects, the object comprising the cameras not necessarily including a GNSS receiver.
- other objects on a site may be equipped with a GNSS receiver and broadcast or otherwise transmit their positional information to other objects on the site being equipped with a corresponding receiving unit.
- the object equipped with the cameras receives such positional information via its receiving unit, the positional information may be evaluated and the direction and/or the distance and/or the approaching velocity of near-by or distant objects may be considered as critical in terms of a collision or a pre-collision scenario.
- the selection step can be implemented in the same way as described above, and the display is controlled such that, for the operator of the object the cameras are attached to, emphasis is put on the one or more cameras looking in the direction in which another object is detected.
- the general purpose of the selection step is to make the operator focus on the one or more cameras by which a potential danger is currently being filmed, under the assumption that there are at least two cameras available filming different scenes, i.e. preferably different scenes around the object the cameras are attached to. Consequently, it is ensured that the image information that is most relevant, especially in terms of proximity warning including collision avoidance, is displayed on the display.
- the determination which one or ones of the cameras currently monitors the most relevant scene is performed by a selection unit comprised in the control unit. In particular, the location information provided by a GNSS receiver is analyzed in terms of proximity to potentially dangerous objects.
- the personnel in charge of the safe operation of such an object, e.g. a vehicle, is not distracted by a multitude of image information but may instead focus on the most relevant image information being displayed.
- Fig. 1 a schematic representation of a mining site
- Fig. 2 a block diagram of a monitoring system according to an embodiment of the present invention
- Fig. 3 a top view on a schematic vehicle with cameras mounted according to an embodiment of the present invention
- Fig. 4 a display
- Fig. 5 a block diagram of another monitoring system according to an embodiment of the present invention.
- Fig. 6 another display.
- an "image" is understood as the output of a camera filming a scene. This can be a camera working in the visible range of light, but also a camera working in the infrared range. Such an image typically visualizes the scene on a display. When talking about different images, it is inherently understood that these images are generated by different cameras, typically simultaneously.
- image information may include any information provided by such camera, and, in particular, the image itself.
- the cameras used provide images of "different scenes".
- a scene is "different” to another scene whenever the cameras involved do not shoot or scan the same perspective. Cameras may not shoot the same perspective, for example, when they are attached to different locations of an object.
- the at least two cameras are mounted on the same object, which may be a movable object such as a vehicle, or a stationary object such as a building. In such a scenario, it is preferred that the cameras are arranged such that they scan different sides of the object they are attached to, in order to detect other objects in proximity at different or even all sides of the object.
- a "section" assigned to a camera is understood as, e.g. when shooting with a camera horizontally, the horizontal area in front of the camera in which the camera is able to monitor scenes, and which consequently can be displayed in an image.
- a section of a camera may include a sector.
- control unit may be embodied in hardware, in software or both, and may be embodied in a single device, or its functions may be decentralized. Its functional building block “selection unit” may also be embodied in hardware, in software or both.
- the "display” may have the form of a single display for displaying images from a single source, or may allow displaying images from many different sources, i.e. cameras, simultaneously.
- the "display" also encompasses the totality of a multitude of separate displays which are, for example, distributed in the driver's cab of a vehicle. In summary, the display includes any displaying means for displaying image information delivered by the cameras.
- the control signal may evoke additional action subject to what is displayed during the normal mode of operation, i.e. the default mode when there is no object detectable in the vicinity: if, for example, the display regularly shows images from a single camera source only, the control signal may cause a switch from displaying images from the current camera source to displaying images from the camera source selected according to the present idea. If the current image source by chance coincides with the selected image source, there may be no change visible to the monitoring person.
- the control signal causes the selected images to be highlighted in order to draw attention to them.
- the control signal may cause the entire display to show images only from the selected source. Or, the control signal may cause images from other sources to be shut off, completely masked or visually downsized, in order to emphasize the selected images.
- the selected image may, as indicated, claim the entire display screen, or remain in an unchanged image size on the screen.
- the control signal may include zooming in the selected image.
- the control signal may additionally cause acoustic warnings to be issued.
- the control signal may either comprise the selected image itself, provided the images are supplied by the cameras to the control unit; or it may cause the subject cameras to route the requested image directly to the display; or it may cause the display to accept display information only from the camera as selected. All of the above also holds true for the selection of multiple images, if appropriate.
- a "warning system" and a corresponding "warning” activity may refer to any suitable activity for drawing the attention of the driver or operator to what might be identified as a scene that may become critical in terms of collision or proximity, including selecting an image to be displayed.
- Such a warning system may primarily include the display which the cameras supply with images, but may additionally include acoustic means such as a horn, a diaphone or a speaker, and possibly other visual means such as one or more LEDs, a flashlight, etc.
- the warning character of the images displayed may be especially emphasized by displaying the images intermittently, or by alternating between the image information and its inverse colors, or by overlaying the image information with visual warning symbols.
- a control signal separate from the control signal for the display may be issued subject to range information derived from the positional information delivered by the one or more GNSS receivers. For example, a first control signal for the display may be issued based on a first threshold condition for the object still being distant with respect to the present object, and a separate control signal for an acoustic warning element may be issued based on a second threshold condition for the object being very close with respect to the present object.
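The two-threshold scheme described above could be sketched as follows. The concrete range values and names are assumptions; the text only specifies that the display warning fires at a larger distance than the acoustic one:

```python
DISPLAY_RANGE_M = 100.0   # assumed first threshold (object still distant)
ACOUSTIC_RANGE_M = 30.0   # assumed second threshold (object very close)

def warning_signals(distance_m):
    """Return the list of warning channels to activate for a given range."""
    signals = []
    if distance_m <= DISPLAY_RANGE_M:
        signals.append("display")   # first threshold condition
    if distance_m <= ACOUSTIC_RANGE_M:
        signals.append("acoustic")  # second threshold condition
    return signals
```

At 60 m only the display control signal would be issued; at 10 m the acoustic warning element would be activated as well.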
- a "radio based positioning system" stands for a GNSS or for any other type of positioning system based on radio signals, such as a pseudolite system.
- GNSS stands for "Global Navigation Satellite System” and encompasses all satellite based navigation systems, including GPS and Galileo.
- a “receiver” is a receiver designed for receiving information from satel- lites and for determining its position subject to the signals received.
- a "movable object” is any object that can change and is expected to change its position and/or orientation or configuration in space. It may e.g. be a truck or any other vehicle that moves from place to place and changes its orientation with respect to the general north-south direction, e.g. by steering, or it may be an object positioned at a fixed location but able to rotate about its axis or to change its physical configuration, e.g. by extending an arm, in such a manner that the volume of safety space attributed to it varies in significant manner.
- Fig. 1 schematically depicts a site 1, such as a surface mine.
- a site covers a large area, in the case of a surface mine e.g. in the range of square kilometers, with a network of roads 2 and other traffic ways, such as rails 3.
- a plurality of objects is present in the mine, such as:
- Vehicles of this type may easily weigh several hundred tons; they are generally difficult to control, have very large braking distances, and have a large number of blind spots that the driver is unable to monitor visually without monitoring cameras.
- vehicles of this type weigh 3 tons or less. They comprise passenger vehicles and small lorries.
- a further type of object within the mine is comprised of stationary obstacles, such as temporary or permanent buildings 9, open pits, boulders, non-movable excavators, stationary cranes, deposits, etc.
- objects according to an embodiment present in a mine 1 and subject to potential collision may be equipped with at least one GNSS receiver 11, a control unit per object, at least two cameras (not shown in Fig. 1) and a display per object (not shown in Fig. 1) .
- Large objects may provide more than one GNSS receiver 11 per object as shown in Fig. 1.
- the entirety of these elements per object for generating a proximity warning is called a monitoring system.
- the GNSS receivers 11 interact with satellites 30 for determining the positional information of the object they are mounted to.
- FIG. 2 illustrates a block diagram of a monitoring system including a control unit 13 according to an embodiment of the present invention.
- a receiver 17 of the control unit 13 is connected to cameras 12.
- An output 16 of the control unit 13 is connected to a display 19 and a beeper as warning means. Both connections may be implemented as wireless connections or as wired connections. One or more connections can be implemented via bus connections.
- Each camera 12 delivers a series of images with respect to the scene monitored by the respec- tive camera 12. Preferably, each of the cameras 12 looks into a different direction for monitoring different scenes with respect to the object these cameras are attached to.
- the monitoring system further comprises a radio based positioning receiver 11, attached to the present object.
- the receiver 11 provides a signal comprising positional information, i.e. the position of the present object, determined in combination with satellites 30 as shown in Figure 1. Such signal may be received by a receiving unit 15 in the control unit 13.
- the control unit 13 comprises a microprocessor system 14, which controls the operations of the control unit 13.
- a memory 18 comprises programs as well as various parameters, such as unique identifiers of the cameras. Such programs may comprise instructions for evaluating the positional information, and for selecting a subset of cameras currently providing the most significant image information.
- the radio based positioning receiver 11 may provide positional information of its subject location, which represents the location of the object it is attached to. Provided that other moving or stationary objects on the site are equipped with such receivers 11, too, the positional information related to the various objects may be shared between the control units of these objects, such that, by comparing positional information stemming from positioning receivers located on different objects, proximity and even approximation can be detected. For further details it is referred to
- Position information of the present object provided by the radio based positioning receiver 11 is transferred to the control unit 13 and evaluated there.
- evaluation takes into account positional information received from other objects, gathered by their own radio based positioning receivers and transmitted e.g. by a wireless interface not shown in Figure 2.
- a proximity situation may be detected. If such a proximity situation is detected by means of the positional information, a control signal may be issued which activates displaying the image from the camera looking in the direction where the proximate object is located.
- the selected image represents the camera that currently films the proximate object, which is of most interest to be monitored by the operator in order to avoid a collision. This is why image information stemming from this camera is emphasized when presented to the personnel via the display.
- Figure 2 shows an electronic map 40 stored in the control unit 13 which holds location information significant of stationary objects located on the site where the monitoring system of Figure 2 is in use.
- the positional information supplied by the GNSS receiver 11 is compared or otherwise put in relation to the location information of the stationary objects. In case sufficient proximity or approximation is detected between these objects, the camera 12 looking in the direction of the stationary object is selected for displaying its image exclusively on the display 19.
- the object is equipped with another sensor (not shown) for measuring the distance to another object, such as a radio detection and ranging device, a light detection and ranging device, or a sound detection and ranging device.
- a signal is received from such a sensor, and the subset of one or more cameras may additionally be selected based on the distance information provided by such a sensor.
- Fig. 3 illustrates a schematic top view of a vehicle 6 equipped with four cameras 12, one located at each side of the vehicle 6, and a single GNSS receiver 11. Sections monitored by each camera 12 are indicated by sector lines and referred to by 121. This makes each camera 12 scan a different scene at each side of the vehicle 6. Alternatively, the cameras 12 can be located at the edges of the vehicle 6. Both arrangements are beneficial for covering a large area in the vicinity of the object for proximity detection, including collision detection.
- the selection of the camera may be based on the positional information with respect to the present vehicle 6 and such second positional information. Analyzing the positional information of both of the objects may allow identification of the direction the other object is located at with respect to the vehicle 6, and the distance between the vehicle and the other object.
- the section 121 of the left-hand camera 12 is identified as the relevant section 121 when mapping the position of the other object 200 to the sections 121 of the cameras 12 of the vehicle. For such mapping, it is beneficial to permanently monitor the orientation of the vehicle 6, which may change when the vehicle moves. This may involve e.g.
- the identified section 121 makes the associated camera 12 the preferred camera for selection. As a result, this camera 12 will exclusively provide images of this proximity situation to the operator, provided the distance to the object 200 is not so large that any selection is suppressed.
- the first and second positional information may be used for determining the distance between the other object and the vehicle. The distance information may be included in the selection step, and the image of the camera corresponding to the identified section may only be selected when the determined distance between the objects is below a given threshold. Otherwise, it is assumed that the other object is still too far away to justify a warning to the operator.
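The mapping of another object's position onto the camera sections 121, taking the vehicle's own orientation into account as described above, might be sketched as follows. The 90° sector layout and all names are assumptions, and the distance threshold is left out for brevity:

```python
import math

# Sector centres in vehicle-fixed degrees (0 = straight ahead); each
# camera is assumed to cover a 90-degree sector around the vehicle.
SECTORS = {"front": 0.0, "left": 90.0, "rear": 180.0, "right": 270.0}

def camera_for(own_pos, own_heading_deg, other_pos):
    """Map the other object's world position to the camera section it
    falls into, compensating for the vehicle's current heading."""
    dx = other_pos[0] - own_pos[0]
    dy = other_pos[1] - own_pos[1]
    world_bearing = math.degrees(math.atan2(dy, dx)) % 360.0
    vehicle_bearing = (world_bearing - own_heading_deg) % 360.0

    def gap(name):
        d = abs(vehicle_bearing - SECTORS[name]) % 360.0
        return min(d, 360.0 - d)

    return min(SECTORS, key=gap)
```

Note that the same world position maps to different sections as the vehicle turns, which is why the orientation must be monitored permanently.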
- the position of the third object 300 may be determined with respect to the sections 121 of the cameras 12 of the vehicle 6.
- the section 121 on the right-hand side of the vehicle 6 is identified as the section into which the object 300 falls.
- the right-hand side camera 12 is associated with this section 121.
- the object 200 now is assumed to be at a distance from the vehicle 6 which justifies issuing a warning to the operator.
- both cameras, i.e. the left-hand and the right-hand camera 12, may be selected for delivering images to the display, e.g. compared to a default display mode where all four cameras 12 show their images on the display.
- only the object closest to the vehicle 6 shall be displayed.
- the camera looking towards the closest object will be allowed to display the scene it monitors, i.e. the right-hand camera 12, as the object 300 is closer to the vehicle 6 than the object 200.
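The "closest object wins" policy described above can be sketched as follows; the detection representation and names are illustrative:

```python
import math

def nearest_object_camera(own_pos, detections):
    """detections: list of (camera_name, object_position) pairs.
    Return the camera covering the object nearest to the own position."""
    def dist(item):
        _, pos = item
        return math.hypot(pos[0] - own_pos[0], pos[1] - own_pos[1])
    camera, _ = min(detections, key=dist)
    return camera

# Object 200 to the left at ~40 m, object 300 to the right at ~15 m:
cam = nearest_object_camera((0.0, 0.0), [("left", (-40.0, 5.0)),
                                         ("right", (15.0, -3.0))])
```

As in the embodiment above, the right-hand camera wins because its object is closer.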
- the radio based positioning receiver 11 is always present at the vehicle 6, i.e. the object the cameras are attached to. In another embodiment, no such radio based positioning receiver 11 is attached to the object holding the cameras. Instead, the selection of cameras relies only on positional information received from other objects. Such positional information may be sufficient for selecting the one or more cameras by a mapping step equivalent to the one described in connection with the embodiment above. This holds for other objects that provide their position information not as an absolute measure but e.g. relative to the present object, or to any other known object. Alternatively, and preferably, means other than radio based positioning means may be provided for assessing the position of the other object with respect to the own position. In case the own position is a priori limited to a small area, e.g. when the vehicle may move only within a limited radius, no such additional means are needed at all, as the own position may be known in advance, stored in the control unit, and used for putting the position of the other object, provided in absolute coordinates, into relation with the own position.
- the display 19 in Figure 4 represents a flat panel display capable of displaying images from e.g. 8 cameras across its screen.
- the entire screen of Figure 4 may be reserved for showing the subject image information.
- the screen of the display 19 is divided, and the selected image information is displayed on portion 20 of the display.
- Portion 21 may be reserved for issuing visual warnings, such as a bold "DANGER" symbol or text, or other kinds of visual warnings.
- the block diagram of Figure 5 differs from the block diagram of Figure 2 only in the way the control signal affects the control of the display. Instead of the control signal carrying the image information itself, the control signal now acts on AND gates 22, each of which is connected to one of the cameras 12. By activating one of the AND gates with a corresponding control signal, the subject AND gate allows the associated camera 12 to provide image information to the display 19, while, for example, all the other AND gates are blocked and do not allow image information from the other cameras 12 to be displayed. There is no need to provide a receiver 17 for the image information in the control unit 13.
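The gating of Figure 5 can be mimicked in software as follows; in the patent the gates 22 are hardware AND gates between each camera and the display, so this is only an illustrative model with assumed names:

```python
def route_images(frames, enabled_gates):
    """frames: dict camera_name -> latest image; enabled_gates: set of
    cameras whose AND gate the control signal has opened. Only frames
    whose gate is enabled reach the display; the control unit itself
    never receives the image data."""
    return {cam: img for cam, img in frames.items() if cam in enabled_gates}

frames = {"front": "img_f", "left": "img_l", "rear": "img_r", "right": "img_rt"}
shown = route_images(frames, {"left"})  # control signal opens the left gate
```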
- Figure 6 provides another schematic representation of a display 19, which display 19 is divided into four sub-displays 191 - 194, each sub-display 191 - 194 permanently displaying information from a camera assigned.
- the control signal only highlights the sub-display 192, which displays image information from the camera 12 selected as most critical in terms of a potential collision, by a blinking frame or similar.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Traffic Control Systems (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA2802122A CA2802122C (en) | 2010-06-10 | 2010-06-10 | Method and control unit for controlling a display of a proximity warning system |
AU2010355231A AU2010355231B2 (en) | 2010-06-10 | 2010-06-10 | Method and control unit for controlling a display of a proximity warning system |
PCT/CH2010/000152 WO2011153646A1 (en) | 2010-06-10 | 2010-06-10 | Method and control unit for controlling a display of a proximity warning system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CH2010/000152 WO2011153646A1 (en) | 2010-06-10 | 2010-06-10 | Method and control unit for controlling a display of a proximity warning system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011153646A1 true WO2011153646A1 (en) | 2011-12-15 |
Family
ID=43431943
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CH2010/000152 WO2011153646A1 (en) | 2010-06-10 | 2010-06-10 | Method and control unit for controlling a display of a proximity warning system |
Country Status (3)
Country | Link |
---|---|
AU (1) | AU2010355231B2 (en) |
CA (1) | CA2802122C (en) |
WO (1) | WO2011153646A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8779934B2 (en) | 2009-06-12 | 2014-07-15 | Safemine Ag | Movable object proximity warning system |
US8994557B2 (en) | 2009-12-11 | 2015-03-31 | Safemine Ag | Modular collision warning apparatus and method for operating the same |
EP3413287A1 (en) | 2010-04-19 | 2018-12-12 | SMR Patents S.à.r.l. | Method for indicating to a driver of a vehicle the presence of an object at least temporarily moving relative to the vehicle |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10703299B2 (en) | 2010-04-19 | 2020-07-07 | SMR Patents S.à.r.l. | Rear view mirror simulation |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11220726A (en) * | 1998-01-30 | 1999-08-10 | Niles Parts Co Ltd | Vehicle surrounding monitoring device |
WO2004021546A2 (en) | 2002-08-09 | 2004-03-11 | Conti Temic Microelectronic Gmbh | Means of transport with a three-dimensional distance camera and method for the operation thereof |
WO2004047047A1 (en) | 2002-11-15 | 2004-06-03 | Philips Intellectual Property & Standards Gmbh | Method and system for avoiding traffic collisions |
US20040217851A1 (en) * | 2003-04-29 | 2004-11-04 | Reinhart James W. | Obstacle detection and alerting system |
WO2006079165A1 (en) * | 2005-01-25 | 2006-08-03 | Alert Systems Pty Ltd | Proximity warning system |
GB2452829A (en) * | 2007-09-12 | 2009-03-18 | Spillard Safety Systems Ltd | Decentralised GPS based anti-collision system for vehicles and pedestrians |
US20090259400A1 (en) * | 2008-04-15 | 2009-10-15 | Caterpillar Inc. | Vehicle collision avoidance system |
2010
- 2010-06-10 CA CA2802122A patent/CA2802122C/en active Active
- 2010-06-10 AU AU2010355231A patent/AU2010355231B2/en active Active
- 2010-06-10 WO PCT/CH2010/000152 patent/WO2011153646A1/en active Application Filing
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11220726A (en) * | 1998-01-30 | 1999-08-10 | Niles Parts Co Ltd | Vehicle surrounding monitoring device |
WO2004021546A2 (en) | 2002-08-09 | 2004-03-11 | Conti Temic Microelectronic Gmbh | Means of transport with a three-dimensional distance camera and method for the operation thereof |
WO2004047047A1 (en) | 2002-11-15 | 2004-06-03 | Philips Intellectual Property & Standards Gmbh | Method and system for avoiding traffic collisions |
US20040217851A1 (en) * | 2003-04-29 | 2004-11-04 | Reinhart James W. | Obstacle detection and alerting system |
WO2006079165A1 (en) * | 2005-01-25 | 2006-08-03 | Alert Systems Pty Ltd | Proximity warning system |
GB2452829A (en) * | 2007-09-12 | 2009-03-18 | Spillard Safety Systems Ltd | Decentralised GPS based anti-collision system for vehicles and pedestrians |
US20090259400A1 (en) * | 2008-04-15 | 2009-10-15 | Caterpillar Inc. | Vehicle collision avoidance system |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8779934B2 (en) | 2009-06-12 | 2014-07-15 | Safemine Ag | Movable object proximity warning system |
US9129509B2 (en) | 2009-06-12 | 2015-09-08 | Safemine Ag | Movable object proximity warning system |
US8994557B2 (en) | 2009-12-11 | 2015-03-31 | Safemine Ag | Modular collision warning apparatus and method for operating the same |
EP3413287A1 (en) | 2010-04-19 | 2018-12-12 | SMR Patents S.à.r.l. | Method for indicating to a driver of a vehicle the presence of an object at least temporarily moving relative to the vehicle |
Also Published As
Publication number | Publication date |
---|---|
CA2802122C (en) | 2016-05-31 |
CA2802122A1 (en) | 2011-12-15 |
AU2010355231A1 (en) | 2013-01-10 |
AU2010355231B2 (en) | 2014-11-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9797247B1 (en) | Command for underground | |
US9457718B2 (en) | Obstacle detection system | |
US8280621B2 (en) | Vehicle collision avoidance system | |
AU2017261540B2 (en) | Command for underground | |
JP7067067B2 (en) | Traffic light recognition device and automatic driving system | |
US6919917B1 (en) | Device for monitoring the environment of a vehicle being parked | |
JP6267972B2 (en) | Work machine ambient monitoring device | |
AU2014213529B2 (en) | Image display system | |
JP4846426B2 (en) | Vehicle perimeter monitoring device | |
CN108749813B (en) | Automatic parking system and parking method | |
US10493622B2 (en) | Systems and methods for communicating future vehicle actions to be performed by an autonomous vehicle | |
US20190265736A1 (en) | Information provision system, vehicular device, and non-transitory computer-readable storage medium | |
CN108230749A (en) | Vehicle and its control method | |
CA2660215A1 (en) | Vehicle collision avoidance system | |
US20120287277A1 (en) | Machine display system | |
JP6801786B2 (en) | Parking support method and parking support device | |
JP2007164328A (en) | Vehicle run support system | |
WO2011130861A1 (en) | Object proximity warning system and method | |
US20160148421A1 (en) | Integrated Bird's Eye View with Situational Awareness | |
CA2802122C (en) | Method and control unit for controlling a display of a proximity warning system | |
US11697425B1 (en) | Method and system for assisting drivers in locating objects that may move into their vehicle path | |
JP2008097279A (en) | Vehicle exterior information display device | |
JP2008046766A (en) | Vehicle external information display device | |
CN109313859B (en) | Method for automatically activating an obstacle recognition device of a motor vehicle and obstacle assistance device | |
AU2018201213B2 (en) | Command for underground |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10725369 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2802122 Country of ref document: CA |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2010355231 Country of ref document: AU Date of ref document: 20100610 Kind code of ref document: A |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 10725369 Country of ref document: EP Kind code of ref document: A1 |