WO2013101075A1 - Systems and methods for proximal object awareness - Google Patents

Systems and methods for proximal object awareness Download PDF

Info

Publication number
WO2013101075A1
WO2013101075A1 (PCT/US2011/067860)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
image
audio
objects
enhanced
Prior art date
Application number
PCT/US2011/067860
Other languages
French (fr)
Inventor
David L. Graumann
Carlos MONTESINOS
Original Assignee
Intel Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intel Corporation filed Critical Intel Corporation
Priority to PCT/US2011/067860 priority Critical patent/WO2013101075A1/en
Priority to CN201180076060.6A priority patent/CN104010909B/en
Priority to US13/977,617 priority patent/US20150130937A1/en
Priority to EP11878486.7A priority patent/EP2797793B1/en
Publication of WO2013101075A1 publication Critical patent/WO2013101075A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q9/00Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • B60Q9/002Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle
    • B60Q9/004Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors
    • B60Q9/005Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors using a video camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q9/00Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • B60Q9/002Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle
    • B60Q9/004Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors
    • B60Q9/006Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors using a distance sensor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/26Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/165Anti-collision systems for passive traffic, e.g. including static obstacles, trees

Definitions

  • This invention generally relates to systems and methods for awareness of proximal objects.
  • vehicles may experience limited visibility when driving in reverse.
  • the limited visibility may lead to accidents, such as those that lead to injury, death, or property damage.
  • vehicles may be outfitted with rear image sensors that provide an image of what is behind the vehicle when the vehicle is driven in reverse.
  • the images from the rear of the vehicle may be provided only when the vehicle is put in reverse gear.
  • the images viewed from the rear of the vehicle may be displayed on a display device within the cockpit of the vehicle, such as on a display panel provided on a center console of a car.
  • a fish eye lens camera provided on the rear exterior of the vehicle may be used for the purposes of generating an image as viewed from the rear of the vehicle.
  • Such systems may generate images of poor quality, such as images that are distorted.
  • Range sensors may be provided on the rear of a vehicle to provide information about the range of an object at the rear of a vehicle.
  • range sensors do not provide a visual image as viewed from the rear of the vehicle and, therefore, it may be difficult for a driver to visualize and comprehend the relative distance between the vehicle and an obstruction.
  • having visual and other sensory aids provided on a vehicle for use while driving in reverse, a driver of a vehicle may benefit from a comprehensive solution that provides user-friendly and easy-to-interpret information on the range of obstructions, the direction of obstructions, and an image of the rear of the vehicle.
  • FIG. 1A is a simplified top-down view schematic diagram illustrating an example vehicle providing sensory information pertaining to obstructions at a rear side of the vehicle in accordance with embodiments of the invention.
  • FIG. 1B is a simplified side view schematic diagram illustrating the example vehicle of FIG. 1A operating in accordance with embodiments of the invention.
  • FIG. 2 is a simplified block diagram illustrating an example system for receiving sensor input and providing sensory information regarding proximal objects at the rear of the vehicle of FIG. 1A in accordance with embodiments of the invention.
  • FIG. 3 is a flow diagram illustrating an example method of providing an enhanced image and audio rendering of obstructions at the rear of the vehicle of FIG. 1A in accordance with embodiments of the invention.
  • FIG. 4A is a simplified schematic diagram illustrating an example enhanced image of obstructions detected at the rear of the vehicle of FIG. 1A generated in accordance with embodiments of the invention.
  • FIG. 4B is a simplified schematic diagram illustrating an example enhanced image of obstructions detected at the rear of the vehicle of FIG. 1A generated in accordance with embodiments of the invention.
  • FIG. 4C is a simplified schematic diagram illustrating an example enhanced image of obstructions detected at the rear of the vehicle of FIG. 1A generated in accordance with embodiments of the invention.
  • FIG. 4D is a simplified schematic diagram illustrating an example enhanced image of obstructions detected at the rear of the vehicle of FIG. 1A generated in accordance with embodiments of the invention.
  • FIG. 5 is a simplified diagram illustrating an example audio rendering for representing obstructions detected at the rear of the vehicle of FIG. 1A generated in accordance with embodiments of the invention.
  • Embodiments of the invention may provide apparatus, systems, and methods for providing awareness of proximal objects, particularly to a driver of a vehicle.
  • the vehicle may be traveling in reverse, and the driver of the vehicle may be made aware of obstructions at the rear of the vehicle.
  • a driver may have limited visibility when operating a vehicle in reverse. Therefore, making the driver aware of objects at the rear of the vehicle may enhance safety.
  • Various sensory-based information such as, for example, enhanced imagery and enhanced audio, may be provided to make the driver aware of obstructions at the rear of the vehicle.
  • the enhanced images may, in one aspect, provide user-friendly and easy to interpret information about objects that may be in proximity of the rear of the vehicle when the vehicle is driven in reverse.
  • the enhanced images may result in improved safety while operating the vehicle, particularly while driving the vehicle in reverse.
  • the enhanced images as displayed to a driver may provide a wide view from the rear of the vehicle and may provide images of certain objects enhanced relative to other objects based on certain parameters, such as the relative distance to each of the objects.
  • an example scenario 100 may include a vehicle 102 with an emitter 110, an image sensor 112, and a range sensor 114.
  • the emitter may be configured to emit waves 120, for example, electromagnetic radiation, such as visible light, or compression waves, such as ultrasonic sound.
  • the image sensor 112 and the range sensor 114 may detect a variety of objects at the rear of the vehicle, such as a tree 130, a basketball 132, and a wall 134, and may provide a variety of ranges and angles relative to the vehicle 102.
  • Vectors 140, 142, and 144 may be defined from the range sensor 114 to their corresponding objects 130, 132, and 134, respectively.
  • the vectors 140, 142, and 144 may characterize both a distance to the respective object, as well as a respective angle from a reference plane 150.
  • the reference plane 150 is depicted as projecting in a normal direction from the rear of the vehicle 102, but in other embodiments may be at any angle relative to the vehicle 102.
  • the angles between the vectors 140, 142, and 144 and the reference plane 150 may be defined as ψ, φ, and θ, respectively.
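  • The distance-and-angle representation above can be illustrated with a short sketch (hypothetical Python, not part of the patent; the coordinate convention and units are assumptions): an object's offset from the range sensor 114 is converted into the length of its vector and its angle from the reference plane 150.

```python
import math

def vector_to_range_and_angle(dx_m, dy_m):
    """Convert an object offset (dx_m, dy_m) measured from the range sensor,
    with the y-axis lying along the reference plane 150 (normal to the rear
    of the vehicle), into a (distance, angle) pair like vectors 140/142/144.

    Returns the distance in meters and the signed angle in degrees from the
    reference plane (0 degrees = directly behind the vehicle).
    """
    distance_m = math.hypot(dx_m, dy_m)
    angle_deg = math.degrees(math.atan2(dx_m, dy_m))  # negative = left, positive = right
    return distance_m, angle_deg

# Example: a hypothetical basketball 1.5 m behind and 0.4 m to the right
print(vector_to_range_and_angle(0.4, 1.5))  # -> (~1.55 m, ~14.9 degrees)
```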
  • a vehicle 102 can include, but is not limited to, a car, a truck, a light-duty truck, a heavy-duty truck, a pickup truck, a minivan, a crossover vehicle, a van, a commercial vehicle, a private vehicle, a sports utility vehicle, a tractor-trailer, an aircraft, an airplane, a jet, a helicopter, a space vehicle, a watercraft, or any other suitable vehicle having a relatively closed cockpit.
  • the image sensor 112 may be any known device that converts an optical image to an electronic signal.
  • The image sensor 112 may be of any known variety including a charge-coupled device (CCD), complementary metal oxide semiconductor (CMOS) sensors, or the like.
  • the image sensor 112 may be of any pixel count and aspect ratio.
  • the range sensor 114 may be of any known variety including, for example, an infrared detector.
  • the emitter 110 may be a radiation emitter and may emit infrared radiation 120 that may reflect off of an object. The reflected radiation may be detected by the range sensor 114 to determine a range or distance between the range sensor 114 and the object.
  • the emitter 110 may emit infrared radiation that may reflect off of objects 130, 132, and 134 located at the rear of the vehicle. The reflected radiation may then be detected by the range sensor 114 to determine the distance between the range sensor 114 and the one or more objects at the rear of the vehicle 102.
  • the range sensor 114 may be a light detection and ranging (LIDAR) detector.
  • the emitter 110 may be an electromagnetic radiation emitter that emits coherent radiation, such as a light amplification by a stimulated emission of radiation (laser) beam at one or more wavelengths across a relatively wide range, including near-infrared, visible, or near-ultraviolet (UV).
  • the laser beam may be generated by providing the emitter 110 with electrical signals.
  • the LIDAR detector may detect a scattered laser beam reflecting off of an obstruction object 130, 132, and 134 and determine a range to the objects 130, 132, and 134.
  • the LIDAR detector may apply Mie solutions to interpret scattered laser light to determine range based thereon.
  • the LIDAR detector may apply Rayleigh scattering solutions to interpret scattered laser light to determine range based thereon.
  • the range sensor 114 may be a radio detection and ranging (RADAR) detector.
  • the emitter 110 may be an electromagnetic radiation emitter that emits microwave radiation.
  • the emitter 110 may be actuated with electrical signals to generate the microwave radiation 120.
  • the microwave radiation 120 may be of a variety of amplitudes and frequencies. In certain embodiments, the microwave radiation 120 may be mono-tonal or have substantially a single frequency component.
  • the RADAR detector may detect scattered microwaves reflecting off of an obstruction object 130, 132, and 134 and determine a range to the object 130, 132, and 134.
  • the range may be related to the power of the reflected microwave radiation.
  • RADAR may further use Doppler analysis to determine the change in range between the range sensor 114 and an obstruction object 130, 132, and 134. Therefore, in certain embodiments, the range sensor 114 may provide both range information, as well as information about the change in range to an object 130, 132, and 134.
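  • The Doppler relationship mentioned above can be sketched as follows (illustrative Python only, not the patent's implementation; the example frequencies are assumptions): the shift between the transmitted and received frequencies gives the rate at which the range to the object is changing.

```python
def doppler_range_rate(f_transmit_hz, f_received_hz, wave_speed_m_s=3.0e8):
    """Estimate the rate of change of range from the Doppler shift of a
    reflected wave. For RADAR the wave speed is the speed of light; for a
    SONAR variant it would be the speed of sound (~343 m/s in air).

    A positive result means the object is approaching the sensor.
    """
    doppler_shift_hz = f_received_hz - f_transmit_hz
    # The reflection doubles the apparent shift, hence the factor of 2.
    return doppler_shift_hz * wave_speed_m_s / (2.0 * f_transmit_hz)

# Example: a 24 GHz radar return shifted up by 160 Hz -> object closing at ~1 m/s
print(doppler_range_rate(24.0e9, 24.0e9 + 160.0))
```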
  • the range sensor 114 may be a sound navigation and ranging (SONAR) detector.
  • the emitter 110 may be an acoustic emitter that emits compression waves 120 at any frequency, such as frequencies in the ultrasonic range.
  • the emitter 110 may be actuated with electrical signals to generate the sound 120.
  • the sound 120 may be of a variety of tones, magnitude, and rhythm.
  • Rhythm, as used herein, is a succession of sounds and silences.
  • the sound 120 may be a white noise spanning a relatively wide range of frequencies with a relatively consistent magnitude across the range of frequencies.
  • the sound 120 may be pink noise spanning a relatively wide range of frequencies with a variation in magnitude across the range of frequencies.
  • the sound 120 may be mono-tonal or may have a finite number of tones corresponding to a finite number of frequencies of sound compression waves.
  • the emitter 110 may emit a pulse of sound 120, also referred to as a ping.
  • the SONAR detector may detect the ping as it reflects off of an obstruction object 130, 132, and 134 and determine a range to the object 130, 132, and 134 by measuring the time it takes for the sound to arrive at the range sensor 114.
  • the range may be related to the total time it takes for a ping to traverse the distance from the emitter 110 to the obstruction objects 130, 132, and 134 and then to the range sensor 114.
  • the determined range may be further related to the speed of sound.
  • SONAR may further use Doppler analysis to determine the change in range between the range sensor 114 and an obstruction object 130, 132, and 134. Therefore, in certain embodiments, the range sensor 114 may provide both range information, as well as information about the change in range to an object 130, 132, and 134.
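  • A minimal sketch of the time-of-flight relationship described above (hypothetical Python, not part of the patent; the speed of sound in air is an assumed constant):

```python
def sonar_range_m(round_trip_time_s, speed_of_sound_m_s=343.0):
    """Estimate the range to an object from the round-trip time of a ping.
    The ping travels from the emitter to the object and back to the range
    sensor, so the one-way distance is half of the total path length.
    """
    return 0.5 * round_trip_time_s * speed_of_sound_m_s

# Example: an echo received 11.7 ms after the ping -> object roughly 2 m away
print(sonar_range_m(11.7e-3))
```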
  • The objects shown in FIGS. 1A and 1B are depicted for illustrative purposes only. It should be appreciated that the systems, methods, and apparatus disclosed herein can be applied to any number of obstructions behind the vehicle 102 at any distance and at any angle.
  • The system 160 may include one or more controllers 164, each controller 164 having one or more processors 168 communicatively coupled to memory 170.
  • the one or more processors 168 may further be communicatively coupled to the image sensor 112, the range sensor 114, a user interface 174, a display 176, and one or more speakers 178.
  • Image sensor signals generated by the image sensor 112 and range sensor signals generated by the range sensor 114 may be provided to the one or more processors 168.
  • the one or more processors 168 may further receive input from or provide output to the user interface 174.
  • the processor(s) 168 may include, without limitation, a central processing unit (CPU), a digital signal processor (DSP), a reduced instruction set computer (RISC), a complex instruction set computer (CISC), a microprocessor, a microcontroller, a field programmable gate array (FPGA), or any combination thereof.
  • the system 160 may also include a chipset (not shown) for controlling communications between the processor(s) 168 and one or more of the other components of the system 160.
  • the system 160 may be based on an Intel® Architecture system, and the processor(s) 168 and chipset may be from a family of Intel® processors and chipsets, such as the Intel® Atom® processor family.
  • the processor(s) 168 may also include one or more processors as part of one or more application-specific integrated circuits (ASICs) or application-specific standard products (ASSPs) for handling specific data processing functions or tasks.
  • the memory 170 may include one or more volatile and/or non-volatile memory devices including, but not limited to, random access memory (RAM), dynamic RAM (DRAM), static RAM (SRAM), synchronous dynamic RAM (SDRAM), double data rate (DDR) SDRAM (DDR-SDRAM), RAMBUS DRAM (RDRAM), flash memory devices, electrically erasable programmable read only memory (EEPROM), non-volatile RAM (NVRAM), universal serial bus (USB) removable memory, or combinations thereof.
  • the one or more processors 168 may be part of an in-vehicle infotainment (IVI) system. In other embodiments, the one or more processors 168 may be dedicated to the system 160 for providing enhanced images and enhanced sounds indicative of obstructive objects 130, 132, and 134. Therefore, in such embodiments, the system 160 is separate from the IVI system. However, the system 160 may optionally communicate with the IVI system of the vehicle 102.
  • the user interface 174 may be any known input device, output device, or input and output device that can be used by a user to communicate with the one or more processors 168.
  • the user interface 174 may include, but is not limited to, a touch panel, a keyboard, a display, speakers, a switch, a visual indicator, an audio indicator, a tactile indicator, a speech-to-text engine, or combinations thereof.
  • the user interface 174 may be used by a user, such as the driver of the vehicle 102, to selectively activate or deactivate the system 160.
  • the user interface 174 may be used by the user to provide parameter settings for the system 160.
  • Non-limiting examples of the parameter settings may include power settings of the system 160, the sensitivity of the range sensor 114, the optical zoom associated with the image sensor 112, the frame rate of the image sensor 112, the brightness of the display 176, the volume of the one or more speakers 178, other parameters associated with enhancements of images displayed on a display 176, and other parameters associated with enhancements of sounds played by the one or more speakers 178.
  • the user interface 174 may further communicate with the one or more processors 168 and provide information to the user, such as an indication that the system 160 is operational.
  • the display 176 may be any known type of display including, but not limited to, a touch screen, a liquid crystal display (LCD), a thin-film transistor (TFT) display, an organic light-emitting diode (OLED) display, a plasma display, a cathode ray tube (CRT) display, or combinations thereof.
  • the display 176 may receive display signals and, based upon the display signals, provide still or moving images corresponding to the display signals.
  • the images displayed on the display 176 may be viewed by one or more users, such as a driver of the vehicle 102.
  • the one or more speakers 178 may be of any known type including, but not limited to, a cone diaphragm-type speaker, a dynamic speaker, a piezoelectric speaker, a full-range speaker, a subwoofer, a woofer, a tweeter, or combinations thereof.
  • the one or more speakers 178 may receive speaker signals and, based upon the speaker signals, provide sound corresponding to the speaker signals.
  • the sounds generated by the one or more speakers 178 may be heard by one or more users, such as the driver of the vehicle 102.
  • the one or more processors 168 may generate display signals that are provided to the display 176 based at least in part on the received image sensor signals, the range sensor signals, and optionally input from the user interface 174.
  • the display signals may correspond to a display image that may be shown on the display 176.
  • the display image may be an enhanced display image of the image corresponding to the image sensor signals provided by the image sensor 112.
  • the enhancement associated with the enhanced display image may entail rendering one of the objects 130, 132, and 134 differently from the other objects 130, 132, and 134.
  • the rendering of one of the objects 130, 132, and 134 may entail a different color, an oscillation, a different frequency of oscillation, a different magnitude of oscillation, a surrounding halo, a different size of a surrounding halo, a different color of a surrounding halo, a disproportionate size, a different level of pixel dithering, or combinations thereof relative to the other objects 130, 132, and 134. Therefore, in the enhanced display image, one or more of the objects 130, 132, and 134 may be displayed more prominently than the other objects 130, 132, and 134. In other words, the user viewing the enhanced display image may notice one or more of the objects 130, 132, and 134 more readily than some of the other objects 130, 132, and 134.
  • the most proximal of the objects 130, 132, and 134 may be displayed more prominently than the other objects 130, 132, and 134 in the enhanced display image as displayed on a display 176.
  • the basketball 132 may be more proximal to the vehicle 102 than the tree 130 or a wall 134. Accordingly, the basketball 132 may be displayed more prominently in the enhanced display image as displayed on the display 176.
  • the user may notice the basketball 132 more readily than the tree 130 and the wall 134.
  • the user may be aware that the basketball 132 is closer to the vehicle 102 than the other two objects 130 and 134.
  • the level of prominence accorded to each of the objects 130, 132, and 134 may be related to the relative distance between the objects 130, 132, and 134 and the rear of the vehicle 102. Therefore, the basketball 132 may be displayed more prominently than the wall 134, which, in turn, may be displayed more prominently than the tree 130, since the basketball 132 is more proximal than the wall 134, and the wall 134 is more proximal to the vehicle than the tree 130. As the vehicle 102 moves and the relative distances between the vehicle 102 and the objects 130, 132, and 134 change, so might the enhancement applied to each object.
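  • One plausible way to turn relative distances into display prominence is sketched below (hypothetical Python; the linear falloff and weight range are assumptions, not the patent's method): the nearest object receives full prominence and farther objects receive progressively less.

```python
def prominence_weights(distances_m, min_weight=0.2):
    """Map object distances to display prominence values in [min_weight, 1.0].
    The nearest object gets 1.0; the farthest gets min_weight; objects in
    between fall off linearly with distance beyond the nearest object.
    """
    nearest = min(distances_m)
    farthest = max(distances_m)
    span = max(farthest - nearest, 1e-6)  # avoid division by zero
    return [1.0 - (1.0 - min_weight) * (d - nearest) / span for d in distances_m]

# Example: basketball 1.5 m, wall 3.0 m, tree 6.0 m behind the vehicle
print(prominence_weights([1.5, 3.0, 6.0]))  # -> [1.0, ~0.73, 0.2]
```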
  • the image may change in a manner where the relatively high prominence shifts from the image of the basketball 132 to the image of the tree 130.
  • the level of prominence accorded to each of the objects 130, 132, and 134 and the enhanced display image may be related to the relative angle between the objects 130, 132, and 134 and the reference plane 150.
  • objects such as the basketball 132 with a smaller angle φ to the reference plane 150 may be displayed more prominently than objects 130 and 134 that have relatively greater angles ψ and θ, respectively, to the reference plane 150.
  • the one or more processors 168 may provide speaker signals to the one or more speakers 178.
  • the speaker signals may correspond with audio output from the one or more speakers 178.
  • the audio output may be an enhanced audio output that is generated based in part upon the image sensor signal, the range sensor signal, and optionally any input from the user interface 174.
  • the enhanced audio output may be indicative of the location of the one or more objects 130, 132, and 134.
  • a user, such as the driver of the vehicle 102, may hear the enhanced audio output and gain awareness of the objects 130, 132, and 134 at the rear of the vehicle 102. Therefore, each of the audio signals sent to corresponding speakers 178 may be rendered in a manner that can be combined with audio output from all of the actuated speakers 178 to produce the desired directionality, magnitude, frequency, rhythm, and repetition to provide object proximity awareness to the user, such as the driver of the vehicle 102.
  • as an example, the one or more speakers 178 may consist of two speakers 178, and the most proximal object 132 may be rendered from a direction that is equidistant between the two speakers 178.
  • the one or more processors may generate audio signals corresponding to each of the two speakers 178 so that an equal magnitude of sound is produced by the two speakers 178 such that it appears to someone listening to the sounds from a particular location that the sound originates from some point between the two speakers 178.
  • the enhanced audio output may provide sound from a plurality of speakers in a manner such that the audio output is perceived as originating from the direction of the most proximal object 130, 132, and 134 by a user, such as the driver of the vehicle 102.
  • the audio signals provided to the one or more speakers 178 may be such that the driver of the vehicle 102 may perceive a relatively substantial magnitude of sound originating from the direction of the basketball 132, and relatively less magnitude of sound or no sound from the direction of the tree 130 and the wall 134.
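  • For the two-speaker case described above, an equal-power pan law is one common way to make the sound appear to originate from a point between the speakers (hypothetical Python; the pan law and the 90-degree limit are assumptions, not the patent's method):

```python
import math

def stereo_gains(bearing_deg, max_bearing_deg=90.0):
    """Map an object's bearing (negative = left of the reference plane,
    positive = right) to left/right speaker gains using an equal-power pan.
    An object straight behind the vehicle (0 degrees) is played equally from
    both speakers, so it is perceived as coming from between them.
    """
    b = max(-max_bearing_deg, min(max_bearing_deg, bearing_deg))
    pan = (b / max_bearing_deg + 1.0) / 2.0        # 0.0 = hard left, 1.0 = hard right
    left_gain = math.cos(pan * math.pi / 2.0)
    right_gain = math.sin(pan * math.pi / 2.0)
    return left_gain, right_gain

print(stereo_gains(0.0))   # ~ (0.707, 0.707): perceived between the speakers
print(stereo_gains(45.0))  # louder from the right speaker
```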
  • the level of audio output from each of one or more speakers 178 may be rendered spatially in a manner such that the level of sound perceived by a user corresponds to the proximity of the various proximal objects 130, 132, and 134.
  • a driver of the vehicle 102 may perceive a greater magnitude of sound from the direction of the basketball 132, a relatively lower magnitude of sound from the direction of the wall 134, and a still lower magnitude of sound from the direction of the tree 130.
  • the level of audio output from each of the one or more speakers 178 may be rendered spatially in a manner such that the level of sound perceived by a user corresponds to the angle of the objects 130, 132, and 134 relative to the reference plane 150. For example, sound may be perceived more prominently from the direction of objects, such as the basketball 132, with a smaller angle φ to the reference plane 150 than from the direction of objects 130 and 134 that have relatively greater angles ψ and θ, respectively, to the reference plane 150. It should be noted that in some embodiments, the one or more processors 168 may also optionally receive information pertaining to the transmission (not shown) of the vehicle 102.
  • the one or more processors 168 may receive information that indicates if the vehicle 102 is in a reverse gear.
  • the vehicle 102 may be driven in a reverse direction when the vehicle 102 is in a reverse gear.
  • the system 160 may generate enhanced display images and enhanced audio output only when the vehicle 102 is in a reverse gear.
  • only the enhanced image may be generated and displayed on the display 176.
  • only the enhanced audio output may be generated and played on the one or more speakers 178.
  • both the enhanced image may be displayed on the display 176 and the enhanced audio played on the one or more speakers 178.
  • the user of the system 160 may determine whether an enhanced image, an enhanced audio output, or both are desired. Referring now to FIG. 3, an example method 180 for providing an enhanced image and an enhanced audio output in accordance with embodiments of the disclosure is illustrated. The method 180 may use the elements and the system 160 as described with reference to FIGS. 1A, 1B, and 2.
  • the determination may be performed by the one or more processors 168 based upon a communicative signal received by the one or more processors 168.
  • the communicative signal may, in one aspect, be provided by one or more of an engine controller, a transmission controller, a vehicle main computer, an IVI system, or combinations thereof. If it is determined that the vehicle 102 is not in reverse, then the method 180 continues to monitor if the vehicle 102 transmission is placed in reverse.
  • the image sensor signal generated by the image sensor 112 may be received via a communicative link by the one or more processors 168 of the system 160.
  • input from the range sensor 114 may be received.
  • the range sensor signal generated by the range sensor 114 may be received via a communicative link by the one or more processors 168 of the system 160. Therefore, at blocks 184 and 186, the image sensor signals and the range sensor signals may be received concurrently by the one or more processors 168.
  • the angles ψ, φ, and θ of each of the obstruction objects 130, 132, and 134 may be determined.
  • the determination of the angles ψ, φ, and θ may be conducted by the one or more processors 168.
  • the combination of the image sensor information with the range sensor information is sufficient to determine the angles ψ, φ, and θ to each of the obstruction objects 130, 132, and 134.
  • only one of the image sensor information and the range sensor information may be needed to determine the angles ψ, φ, and θ to each of the obstruction objects 130, 132, and 134.
  • determining the angles ψ, φ, and θ may entail analyzing the image that is generated by the image sensor 112 to identify each of the objects 130, 132, and 134. Upon identifying the relative positions of each of the objects 130, 132, and 134, information on the distance of each of the objects 130, 132, and 134 from the range sensor 114 may be used to determine the angles ψ, φ, and θ of each of the objects 130, 132, and 134. In one aspect, trigonometric mathematical manipulations may be applied to the relative positions determined using the image sensor 112 and the distances determined using the range sensor 114 to arrive at the angles ψ, φ, and θ of each of the objects 130, 132, and 134. Such mathematical manipulations may incorporate aspects of triangulation to determine angles from the images and distances as provided by the sensors 112 and 114.
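  • The triangulation step can be sketched as follows (hypothetical Python assuming a simple pinhole camera; the field of view and pixel values are illustrative, not from the patent): the object's horizontal pixel position gives its bearing, which can then be combined with the range-sensor distance.

```python
import math

def pixel_to_bearing_deg(pixel_x, image_width_px, horizontal_fov_deg):
    """Approximate bearing of an object from its horizontal pixel position,
    assuming a pinhole camera whose optical axis lies on the reference
    plane 150. Pixels left of center give negative angles, right of center
    positive angles.
    """
    focal_px = (image_width_px / 2.0) / math.tan(math.radians(horizontal_fov_deg / 2.0))
    offset_px = pixel_x - image_width_px / 2.0
    return math.degrees(math.atan2(offset_px, focal_px))

def lateral_offset_m(range_m, bearing_deg):
    """Combine a range-sensor distance with an image-derived bearing to
    estimate how far left or right of the reference plane the object sits."""
    return range_m * math.sin(math.radians(bearing_deg))

# Example: object centered at pixel 800 in a 1280-px-wide, 120-degree camera, 2 m away
bearing = pixel_to_bearing_deg(800, 1280, 120.0)
print(bearing, lateral_offset_m(2.0, bearing))  # ~23.4 degrees, ~0.79 m to the right
```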
  • the determination of the angles ψ, φ, and θ at block 188 may be optional and may not be needed for generating an enhanced display image or an enhanced audio output.
  • the distance to each of the obstruction objects 130, 132, and 134 may be determined.
  • the distance information may be provided by the range sensor 114 to the one or more processors 168.
  • the received range sensor signal may be analyzed in conjunction with the received image sensor signal to determine the distance to each of the objects 130, 132, and 134.
  • as an example, the range sensor 114 may be a SONAR detector.
  • the range sensor 114 may receive three separate return signals corresponding to each ping that is transmitted by the acoustic emitter 110 or transducer. From the three separate return signals, the one or more processors 168 may be able to determine three different ranges.
  • the one or more processors 168 may not be able to determine which object 130, 132, and 134 corresponds to each of the determined ranges from the range sensor data. With the image sensor signals, the one or more processors 168 may be able to identify the three objects 130, 132, and 134 and then be able to estimate which of the objects 130, 132, and 134 are likely to be the nearest. Based upon these estimations, the ranges to each of the identified proximal objects 130, 132, and 134 may be determined.
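  • A simple way to pair unlabeled range returns with objects found in the camera image is a greedy rank-based match (hypothetical Python; the nearness estimate and the greedy pairing are illustrative heuristics, not the patent's method):

```python
def associate_ranges(objects, ranges_m):
    """Pair range-sensor returns with objects detected in the camera image.

    `objects` is a list of (label, estimated_nearness) tuples, where the
    nearness estimate might come from apparent size or position in the image;
    `ranges_m` are the unlabeled distances reported by the range sensor.
    The object judged nearest in the image is assigned the shortest return.
    """
    ranked = sorted(objects, key=lambda obj: obj[1])   # nearest-looking first
    ordered_ranges = sorted(ranges_m)                  # shortest return first
    return {label: rng for (label, _), rng in zip(ranked, ordered_ranges)}

detections = [("tree", 0.9), ("basketball", 0.1), ("wall", 0.5)]
print(associate_ranges(detections, [6.2, 1.4, 3.1]))
# -> {'basketball': 1.4, 'wall': 3.1, 'tree': 6.2}
```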
  • Blocks 188 and 190 may provide information, such as the relative angle and the relative distance, of each of the obstruction objects 130, 132, and 134. Therefore, using such information, the vectors 140, 142, and 144, corresponding to each of the objects 130, 132, and 134, respectively, may be known.
  • the enhanced image signal may be generated.
  • the enhanced image signal may be generated by the one or more processors 168 based upon one or more of the image sensor signal, the range sensor signal, and inputs from the user interface 174.
  • the angle and range information corresponding to each of the objects 130, 132, and 134, as determined at blocks 188 and 190, may be used to enhance one or more of the objects 130, 132, and 134 relative to the other objects 130, 132, and 134.
  • the nearest object, in this case the basketball 132, may be made more prominent in the enhanced image relative to the more distal objects, in this case the tree 130 and the wall 134.
  • an enhanced audio signal may be generated.
  • the enhanced audio signal may be generated by the one or more processors 168 based upon one or more of the image sensor signal, the range sensor signal, and inputs from the user interface 174.
  • the angle and range information corresponding to each of the objects 130, 132, and 134, as determined at blocks 188 and 190, may be used to provide the enhanced audio output corresponding to the relative angle or the relative distance of one or more of the objects 130, 132, and 134 relative to the other objects 130, 132, and 134.
  • the enhanced audio signal may be output from one or more speakers 178 in a manner such that it appears to someone sitting in the driver's seat of the vehicle 102 that the sound is originating from the direction of the nearest object, in this case the basketball 132. Audio output from the direction of the more distal objects, in this case the tree 130 and the wall 134, may be fainter than the sound coming from the direction of the basketball 132. It should be noted that in certain embodiments, the determination of the enhanced audio output and signal, at block 194, may be optional and that the method 180 may be performed without providing an audio output.
  • the enhanced audio output signal may be output to the one or more speakers 178, and the enhanced image signal may be output to the display 176.
  • the user, such as the driver of the vehicle 102, may view the enhanced image on the display 176 and hear the enhanced audio on the one or more speakers 178. Therefore, by viewing the enhanced display, or hearing the enhanced audio, or both, the user may be better informed about obstructions at the rear of the vehicle 102.
  • the method 180 may be modified in various ways in accordance with certain embodiments of the disclosure. For example, one or more operations of the method 180 may be eliminated or executed out of order in other embodiments of the disclosure. For example, in certain embodiments, it may not be necessary to place the vehicle 102 in reverse as shown in block 182 for the remainder of the method 180 to be executed. Additionally, other operations may be added to the method 180 in accordance with other embodiments of the disclosure.
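  • The overall flow of method 180 can be summarized in a short skeleton (hypothetical Python; every injected callable here is a placeholder name, not an API defined by the patent):

```python
def proximity_awareness_step(in_reverse, read_frame, read_ranges, detect_objects,
                             estimate_angles, associate_ranges, render_image,
                             render_audio, show, play):
    """One pass of the method-180 flow with all dependencies injected."""
    if not in_reverse():                              # block 182: act only in reverse gear
        return
    frame = read_frame()                              # block 184: image sensor signal
    ranges = read_ranges()                            # block 186: range sensor signal
    objects = detect_objects(frame)                   # identify obstruction objects
    angles = estimate_angles(objects, frame)          # block 188 (optional)
    distances = associate_ranges(objects, ranges)     # block 190
    enhanced_image = render_image(frame, objects, distances, angles)  # block 192
    enhanced_audio = render_audio(angles, distances)                  # block 194 (optional)
    show(enhanced_image)   # output to the display 176
    play(enhanced_audio)   # output to the one or more speakers 178
```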
  • the enhanced image of one object 130, 132, and 134 relative to the other objects 130, 132, and 134 may be displayed on the display 176.
  • the more proximal objects may be displayed more prominently than the more distal objects.
  • the image of the most proximal object, such as the basketball 132, may be displayed in a red color or with a red halo.
  • the next most proximal object, such as the wall 134, may be displayed in yellow, and the most distal object, such as the tree 130, may be displayed in green. Therefore, the various colors used for each of the objects 130, 132, and 134 as displayed on the display 176 may draw greater relative attention to the most proximal object, such as the basketball 132, versus the most distal object, such as the tree 130.
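  • The red/yellow/green scheme described above maps naturally to a proximity rank (hypothetical Python; the palette simply mirrors the colors named in the text):

```python
def proximity_color(rank):
    """Return a highlight color by proximity rank: the nearest object is drawn
    in red, the next nearest in yellow, and everything farther in green."""
    palette = ["red", "yellow"]
    return palette[rank] if rank < len(palette) else "green"

# Objects sorted by distance: basketball (nearest), wall, tree (farthest)
for rank, name in enumerate(["basketball", "wall", "tree"]):
    print(name, proximity_color(rank))
```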
  • Referring to FIG. 4A, an example enhanced display image as displayed on the display 176 is described.
  • An image of the tree 200 corresponding to the tree 130 of FIGS. 1A and 1B, an image of the basketball 202 corresponding to the basketball 132 of FIGS. 1A and 1B, and an image of the wall 204 corresponding to the wall 134 of FIGS. 1A and 1B may be shown on the enhanced image.
  • the enhanced image may further contain a halo 210 surrounding and corresponding to the image of the tree 200, a halo 212 surrounding and corresponding to the image of the basketball 202, and a halo 214 surrounding and corresponding to the image of the wall 204.
  • the halo 212 surrounding the basketball may be more prominent than the halo 214 surrounding the wall, and the halo 214 surrounding the wall may, in turn, be more prominent than the halo 210 surrounding the tree to indicate that the basketball 132 is more proximal than the wall 134, which is more proximal than the tree 130 to the vehicle 102. Therefore, the one or more processors 168 may receive image sensor signals from the image sensor 112 and range sensor signals from the range sensor 114 and modify the received image based on determined angle and range information from blocks 188 and 190 to generate differentiated surrounding halos 210, 212, and 214 for each of the images of the objects 200, 202, and 204, respectively.
  • prominence may be conveyed by a larger halo surrounding the image of more proximal objects, such as the image of the basketball 202 relative to the image of other objects 200 and 204.
  • prominence may be conveyed by a thicker halo surrounding the image of more proximal objects, such as the image of the basketball 202 relative to the image of other objects 200 and 204.
  • the prominence may be conveyed by a different colored halo surrounding the image of more proximal objects, such as the image of the basketball 202 relative to the image of other objects 200 and 204.
  • Referring to FIG. 4B, another example enhanced display image as displayed on the display 176 is described.
  • An image of the tree 220 corresponding to the tree 130, an image of the basketball 222 corresponding to the basketball 132, and an image of the wall 224 corresponding to the wall 134 may be shown on the enhanced image.
  • the various images of objects 220, 222, and 224 may further be shaded.
  • the image of the basketball 222 may be less shaded than the image of the wall 224.
  • the image of the wall 224 may, in turn, be less shaded than the image of the tree 220 to indicate that the basketball 132 is more proximal than the wall 134, which is more proximal than the tree 130 to the vehicle 102.
  • the one or more processors 168 may receive image sensor signals from the image sensor 112 and range sensor signals from the range sensor 114 and modify the received image based on determined angle and range information from blocks 188 and 190 to generate differentiated shading for each of the images of the objects 220, 222, and 224, respectively.
  • prominence of the image of one object relative to the image of another object may be conveyed by less shading, or greater brightness, such as less shading of the image of the basketball 222 relative to the image of other objects 220 and 224.
  • prominence may be conveyed by more shading, or less brightness, of the image of more proximal objects, such as the image of the basketball 222 relative to the image of other objects 220 and 224.
  • the prominence may be conveyed by a differently colored shading of the image of more proximal objects, such as the image of the basketball 222 relative to the image of other objects 220 and 224.
  • Certain objects, such as the wall 134, may span a length, where certain portions of the wall 134 are relatively more proximal to the rear of the vehicle 102 than other portions of the wall 134. Therefore, in certain embodiments, the shading of the image 224 corresponding to a proximal portion of the wall 226 may be less than the shading of the image 224 corresponding to a more distal portion of the wall 228.
  • Referring to FIG. 4C, yet another example enhanced display image as displayed on the display 176 is described.
  • An image of the tree 230 corresponding to the tree 130, an image of the basketball 232 corresponding to the basketball 132, and an image of the wall 234 corresponding to the wall 134 may be shown on the enhanced image.
  • the various images of objects 230, 232, and 234 may further oscillate at various oscillation magnitudes.
  • the image of the basketball 232 may oscillate, as indicated by the relatively large arrows 242, more than the image of the wall 234, as indicated by relatively smaller arrows 244 and 246.
  • the image of the wall 234 may, in turn, be oscillated more than the image of the tree 230, as indicated by arrows 240.
  • the relative oscillations may indicate that the basketball 132 is more proximal than the wall 134, which is more proximal than the tree 130 to the vehicle 102.
  • the one or more processors 168 may receive image sensor signals from the image sensor 112 and range sensor signals from the range sensor 114 and generate the enhanced image such that one or more of the images of the objects oscillate differently from the images of the other objects based on determined angle and range information from blocks 188 and 190.
  • prominence of the image of one object relative to the image of another object may be conveyed by the greater magnitude of oscillation of the image of the object corresponding to the more proximal object, such as greater oscillation of the image of the basketball 232 relative to the image of other objects 230 and 234.
  • prominence may be conveyed by less magnitude of oscillation of the image of more proximal objects, such as the image of the basketball 232 relative to the image of other objects 230 and 234.
  • the prominence may be conveyed by a different frequency of oscillation of the image of more proximal objects, such as the image of the basketball 232 relative to the image of other objects 230 and 234.
  • the oscillation of the image 234 corresponding to a proximal portion of the wall 244 may be greater than the oscillation of the image 234 corresponding to a more distal portion of the wall 246.
  • Referring to FIG. 4D, a yet further example of an enhanced display image as displayed on the display 176 is described.
  • An image of the tree 250 corresponding to the tree 130, an image of the basketball 252 corresponding to the basketball 132, and an image of the wall 254 corresponding to the wall 134 may be shown on the enhanced image.
  • the various images of objects 250, 252, and 254 may be sized relative to each other corresponding to their relative proximity to the vehicle 102.
  • the image of the basketball 252 may be rendered as disproportionately large relative to the wall 254, and the image of the wall 254 may, in turn, be shown as disproportionately larger than the image of the tree 250.
  • the relatively disproportionate sizes of the images 250, 252, and 254 may indicate that the basketball 132 is more proximal than the wall 134, which is more proximal than the tree 130 to the vehicle 102. Therefore, the one or more processors 168 may receive image sensor signals from the image sensor 112 and range sensor signals from the range sensor 114 and modify the received image based on determined angle and range information from blocks 188 and 190 to generate a differentiated size for each of the images of the objects 250, 252, and 254, respectively. In certain embodiments, prominence of the image of one object relative to the image of another object may be conveyed by a relatively greater disproportionate size, such as a disproportionately large size of the image of the basketball 252 relative to the image of other objects 250 and 254.
  • Certain objects, such as the wall 134, may span a length, where certain portions of the wall 134 are relatively more proximal to the rear of the vehicle 102 than other portions of the wall 134. Therefore, in certain embodiments, the relative size of the image 254 corresponding to a proximal portion of the wall 256 may be greater than the relative size of the image 254 corresponding to a more distal portion of the wall 258.
  • a particular enhanced image may render a proximal object both with a disproportionately large size and with a relatively large halo compared to more distal objects from the vehicle 102.
  • the one or more speakers 178 may comprise speakers 178A, 178B, 178C, and 178N. Although four speakers 178A-N are depicted for illustrative purposes, there may be any number of speakers. In one aspect, the speakers 178A-N may be provided within the interior or cockpit of the vehicle 102.
  • the speakers 178A-N may be provided within the cockpit of the vehicle 102 near the rear, such that sound generated by the speakers 178A-N may be heard by a user, such as the driver of the vehicle 102, from behind, when facing a front of the vehicle 102.
  • the one or more processors 168 may analyze the object vectors 140, 142, and 144 and generate spatialized sound vectors 270, 272, and 274 corresponding to objects 130, 132, and 134, respectively.
  • the sound vectors 270, 272, and 274 may represent the magnitude and direction of sound.
  • the direction of sound as represented by the sound vectors 270, 272, and 274 may appear to originate substantially from the direction of the obstruction objects 130, 132, and 134, from a predesignated position, such as the driver's seat of the vehicle.
  • the magnitude of the sound generated by the speakers 178A-N from a particular direction may be related to the distance of an obstruction in that direction.
  • the vector 142 corresponding to the basketball 132, may be the shortest vector due to the basketball being the most proximal of the obstruction objects 130, 132, and 134 behind the vehicle 102.
  • the corresponding sound vector 272 may have a relatively greater magnitude compared to the other sound vectors 270 and 274, as a result of the proximity of the basketball 132 to the vehicle 102 compared to the proximity of the other objects 130 and 134.
  • the angles of the sound vectors 270, 272, and 274 with reference to the reference plane 150 may be the same or substantially similar to the angles ψ, φ, and θ of the objects 130, 132, and 134 relative to the reference plane 150.
  • the one or more processors 168 may provide acoustic signals 280A, 280B, 280C, and 280N to output sound from each of the speakers 178A-N in a manner so that the sound appears to a listener to have substantially the directionality and magnitude as depicted by the sound vectors 270, 272, and 274. To produce the desired sounds, the one or more processors 168 may provide different magnitudes of acoustic signals 280A-N to the one or more speakers 178A-N.
  • the acoustic signals 280C and 280N, provided to speakers 178C and 178N, may be of greater magnitude than the acoustic signals 280A and 280B, provided to speakers 178A and 178B, to generate a greater audio output consistent with the direction of sound vector 272, corresponding to the basketball 132.
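  • One way to derive per-speaker magnitudes from an object's direction and distance is sketched below (hypothetical Python; the speaker bearings, the 5 m loudness cutoff, and the angular falloff are all assumptions, not values from the patent):

```python
def speaker_gains(object_bearing_deg, object_distance_m,
                  speaker_bearings_deg=(-60.0, -20.0, 20.0, 60.0),
                  max_distance_m=5.0):
    """Compute a gain for each cockpit speaker so that the combined output is
    perceived as coming from the object's direction, and is louder for nearer
    objects. Speakers pointing closer to the object's bearing get more signal.
    """
    # Overall loudness grows as the object gets closer (clamped to [0, 1]).
    loudness = max(0.0, 1.0 - object_distance_m / max_distance_m)
    # Weight each speaker by how closely its bearing matches the object's.
    raw = [max(0.0, 1.0 - abs(object_bearing_deg - b) / 90.0)
           for b in speaker_bearings_deg]
    total = sum(raw) or 1.0
    return [loudness * r / total for r in raw]

# A basketball 1.5 m away, slightly to the right of the reference plane:
print(speaker_gains(15.0, 1.5))  # right-of-center speakers receive most of the signal
```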
  • Embodiments described herein may be implemented using hardware, software, and/or firmware, for example, to perform the methods and/or operations described herein.
  • Certain embodiments described herein may be provided as a tangible machine-readable medium storing machine-executable instructions that, if executed by a machine, cause the machine to perform the methods and/or operations described herein.
  • the tangible machine-readable medium may include, but is not limited to, any type of disk including floppy disks, optical disks, compact disk read-only memories (CD-ROMs), compact disk rewritables (CD-RWs), magneto-optical disks, semiconductor devices such as read-only memories (ROMs), random access memories (RAMs) such as dynamic and static RAMs, erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), flash memories, magnetic or optical cards, or any type of tangible media suitable for storing electronic instructions.
  • the machine may include any suitable processing or computing platform, device or system and may be implemented using any suitable combination of hardware and/or software.
  • the instructions may include any suitable type of code and may be implemented using any suitable programming language.
  • machine-executable instructions for performing the methods and/or operations described herein may be embodied in firmware.

Abstract

Systems and methods are presented for providing enhanced sensory awareness of proximal objects to a vehicle. The enhanced sensory awareness may be determined based upon sensor signals sensing proximal objects.

Description

SYSTEMS AND METHODS FOR PROXIMAL OBJECT AWARENESS
TECHNICAL FIELD
This invention generally relates to systems and methods for awareness of proximal objects.
BACKGROUND
Drivers of vehicles, such as cars, may experience limited visibility when driving in reverse. The limited visibility may lead to accidents, such as those that lead to injury, death, or property damage. As a result, vehicles may be outfitted with rear image sensors that provide an image of what is behind the vehicle when the vehicle is driven in reverse. In some cases, the images from the rear of the vehicle may be provided only when the vehicle is put in reverse gear. The images viewed from the rear of the vehicle may be displayed on a display device within the cockpit of the vehicle, such as on a display panel provided on a center console of a car.
Typically, a fish eye lens camera provided on the rear exterior of the vehicle may be used for the purposes of generating an image as viewed from the rear of the vehicle.
Such systems may generate images of poor quality, such as images that are distorted.
Drivers of vehicles may find it difficult, in many cases, to interpret such images.
Therefore, it may be difficult for a driver to determine, either in a qualitative fashion or in a quantitative fashion, the distance between the vehicle and the nearest obstruction on the rear side of the vehicle. Further, it may be difficult for the driver to determine the angle of an obstruction relative to the rear of the vehicle.
Range sensors may be provided on the rear of a vehicle to provide information about the range of an object at the rear of a vehicle. However, range sensors do not provide a visual image as viewed from the rear of the vehicle and, therefore, it may be difficult for a driver to visualize and comprehend the relative distance between the vehicle and an obstruction. Despite having visual and other sensory aids provided on a vehicle for use while driving in reverse, a driver of a vehicle may benefit from a comprehensive solution that provides user-friendly and easy-to-interpret information on the range of obstructions, the direction of obstructions, and an image of the rear of the vehicle.
BRIEF DESCRIPTION OF THE FIGURES
Reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
FIG. 1A is a simplified top-down view schematic diagram illustrating an example vehicle providing sensory information pertaining to obstructions at a rear side of the vehicle in accordance with embodiments of the invention.
FIG. 1B is a simplified side view schematic diagram illustrating the example vehicle of FIG. 1A operating in accordance with embodiments of the invention.
FIG. 2 is a simplified block diagram illustrating an example system for receiving sensor input and providing sensory information regarding proximal objects at the rear of the vehicle of FIG. 1A in accordance with embodiments of the invention.
FIG. 3 is a flow diagram illustrating an example method of providing an enhanced image and audio rendering of obstructions at the rear of the vehicle of FIG. 1A in accordance with embodiments of the invention.
FIG. 4A is a simplified schematic diagram illustrating an example enhanced image of obstructions detected at the rear of the vehicle of FIG. 1A generated in accordance with embodiments of the invention.
FIG. 4B is a simplified schematic diagram illustrating an example enhanced image of obstructions detected at the rear of the vehicle of FIG. 1A generated in accordance with embodiments of the invention. FIG. 4C is a simplified schematic diagram illustrating an example enhanced image of obstructions detected at the rear of the vehicle of FIG. 1A generated in accordance with embodiments of the invention.
FIG. 4D is a simplified schematic diagram illustrating an example enhanced image of obstructions detected at the rear of the vehicle of FIG. 1A generated in accordance with embodiments of the invention.
FIG. 5 is a simplified diagram illustrating an example audio rendering for representing obstructions detected at the rear of the vehicle of FIG. 1A generated in accordance with embodiments of the invention.
DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION
Embodiments of the invention are described more fully hereinafter with reference to the accompanying drawings, in which embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout.
Embodiments of the invention may provide apparatus, systems, and methods for providing awareness of proximal objects, particularly to a driver of a vehicle. In one aspect, the vehicle may be traveling in reverse, and the driver of the vehicle may be made aware of obstructions at the rear of the vehicle. Oftentimes, a driver may have limited visibility when operating a vehicle in reverse. Therefore, making the driver aware of objects at the rear of the vehicle may enhance safety. Various sensory-based information, such as, for example, enhanced imagery and enhanced audio, may be provided to make the driver aware of obstructions at the rear of the vehicle. The enhanced images may, in one aspect, provide user-friendly and easy-to-interpret information about objects that may be in proximity of the rear of the vehicle when the vehicle is driven in reverse. Therefore, the enhanced images may result in improved safety while operating the vehicle, particularly while driving the vehicle in reverse. The enhanced images as displayed to a driver may provide a wide view from the rear of the vehicle and may provide images of certain objects enhanced relative to other objects based on certain parameters, such as the relative distance to each of the objects. Example embodiments of the invention will now be described with reference to the accompanying figures.
Referring now to FIGS. 1A and 1B, an example scenario 100 may include a vehicle 102 with an emitter 110, an image sensor 112, and a range sensor 114. The emitter may be configured to emit waves 120, for example, electromagnetic radiation, such as visible light, or compression waves, such as ultrasonic sound. The image sensor 112 and the range sensor 114 may detect a variety of objects at the rear of the vehicle, such as a tree 130, a basketball 132, and a wall 134, and may provide a variety of ranges and angles relative to the vehicle 102. Vectors 140, 142, and 144 may be defined from the range sensor 114 to their corresponding objects 130, 132, and 134, respectively. The vectors 140, 142, and 144 may characterize both a distance to the respective object, as well as a respective angle from a reference plane 150. The reference plane 150 is depicted as projecting in a normal direction from the rear of the vehicle 102, but in other embodiments may be at any angle relative to the vehicle 102. The angles between the vectors 140, 142, and 144 and the reference plane 150 may be defined as ψ, φ, and θ, respectively.
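As an illustration of the range-and-angle representation described above, the following sketch shows one way detected objects might be stored as simple range/angle records relative to the reference plane. The class, field names, and example values are hypothetical and are not taken from the disclosure.

```python
# A minimal sketch of representing detected objects as range/angle vectors
# relative to the reference plane; names and values are illustrative only.
from dataclasses import dataclass
import math

@dataclass
class ProximalObject:
    label: str        # e.g. "tree", "basketball", "wall"
    range_m: float    # distance from the range sensor, in meters
    angle_rad: float  # angle from the reference plane, in radians

    def lateral_offset_m(self) -> float:
        # Perpendicular distance of the object from the reference plane.
        return self.range_m * math.sin(self.angle_rad)

objects = [
    ProximalObject("tree", 6.0, math.radians(35.0)),
    ProximalObject("basketball", 1.5, math.radians(5.0)),
    ProximalObject("wall", 3.0, math.radians(-20.0)),
]
nearest = min(objects, key=lambda o: o.range_m)  # the basketball in this example
```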
For the purposes of this discussion, a vehicle 102 can include, but is not limited to, a car, a truck, a light-duty truck, a heavy-duty truck, a pickup truck, a minivan, a crossover vehicle, a van, a commercial vehicle, a private vehicle, a sports utility vehicle, a tractor-trailer, an aircraft, an airplane, a jet, a helicopter, a space vehicle, a watercraft, or any other suitable vehicle having a relatively closed cockpit. However, it will be appreciated that embodiments of the disclosure may also be utilized in other environments in which a relatively closed area is provided. The image sensor 112 may be any known device that converts an optical image to an electronic signal. The image sensor 112 may be of any known variety including a charge-coupled device (CCD), complementary metal oxide semiconductor (CMOS) sensors, or the like. The image sensor 112 may be of any pixel count and aspect ratio. The range sensor 114 may be of any known variety including, for example, an infrared detector. The emitter 110 may be a radiation emitter and may emit infrared radiation 120 that may reflect off of an object. The reflected radiation may be detected by the range sensor 114 to determine a range or distance between the range sensor 114 and the object. For example, the emitter 110 may emit infrared radiation that may reflect off of objects 130, 132, and 134 located at the rear of the vehicle. The reflected radiation may then be detected by the range sensor 114 to determine the distance between the range sensor 114 and the one or more objects at the rear of the vehicle 102.
In certain embodiments, the range sensor 114 may be a light detection and ranging (LIDAR) detector. In such an implementation, the emitter 110 may be an electromagnetic radiation emitter that emits coherent radiation, such as a light amplification by stimulated emission of radiation (laser) beam at one or more wavelengths across a relatively wide range, including near-infrared, visible, or near-ultraviolet (UV). In one aspect, the laser beam may be generated by providing the emitter 110 with electrical signals. The LIDAR detector may detect a scattered laser beam reflecting off of an obstruction object 130, 132, and 134 and determine a range to the objects 130, 132, and 134. In one aspect, the LIDAR detector may apply Mie scattering solutions to interpret scattered laser light to determine range based thereon. In other aspects, the LIDAR detector may apply Rayleigh scattering solutions to interpret scattered laser light to determine range based thereon. In certain other embodiments, the range sensor 114 may be a radio detection and ranging (RADAR) detector. In such an implementation, the emitter 110 may be an electromagnetic radiation emitter that emits microwave radiation. In one aspect, the emitter 110 may be actuated with electrical signals to generate the microwave radiation 120. The microwave radiation 120 may be of a variety of amplitudes and frequencies. In certain embodiments, the microwave radiation 120 may be mono-tonal or have substantially a single frequency component. The RADAR detector may detect scattered microwaves reflecting off of an obstruction object 130, 132, and 134 and determine a range to the object 130, 132, and 134. In one aspect, the range may be related to the power of the reflected microwave radiation. RADAR may further use Doppler analysis to determine the change in range between the range sensor 114 and an obstruction object 130, 132, and 134. Therefore, in certain embodiments, the range sensor 114 may provide both range information, as well as information about the change in range to an object 130, 132, and 134. In yet other embodiments, the range sensor 114 may be a sound navigation and ranging (SONAR) detector. In such an implementation, the emitter 110 may be an acoustic emitter that emits compression waves 120 at any frequency, such as frequencies in the ultrasonic range. In one aspect, the emitter 110 may be actuated with electrical signals to generate the sound 120. The sound 120 may be of a variety of tones, magnitudes, and rhythms. Rhythm, as used herein, is a succession of sounds and silences. In one aspect, the sound 120 may be a white noise spanning a relatively wide range of frequencies with a relatively consistent magnitude across the range of frequencies. Alternatively, the sound 120 may be pink noise spanning a relatively wide range of frequencies with a variation in magnitude across the range of frequencies. In yet other alternatives, the sound 120 may be mono-tonal or may have a finite number of tones corresponding to a finite number of frequencies of sound compression waves. In certain embodiments, the emitter 110 may emit a pulse of sound 120, also referred to as a ping. The SONAR detector may detect the ping as it reflects off of an obstruction object 130, 132, and 134 and determine a range to the object 130, 132, and 134 by measuring the time it takes for the sound to arrive at the range sensor 114.
In one aspect, the range may be related to the total time it takes for a ping to traverse the distance from the emitter 110 to the obstruction objects 130, 132, and 134 and then to the range sensor 114. The determined range may be further related to the speed of sound. SONAR may further use Doppler analysis to determine the change in range between the range sensor 114 and an obstruction object 130, 132, and 134. Therefore, in certain embodiments, the range sensor 114 may provide both range information, as well as information about the change in range to an object 130, 132, and 134.
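For readers unfamiliar with time-of-flight ranging, the following minimal sketch shows how a range could be derived from the round-trip time of a ping and the speed of sound, as the paragraph above describes. The constant and function names are illustrative assumptions.

```python
# A minimal sketch of time-of-flight ranging: range follows from the round-trip
# ping time and the speed of sound; values and names are illustrative only.
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 °C

def range_from_ping(round_trip_time_s: float) -> float:
    # The ping travels to the object and back, so halve the total path length.
    return (round_trip_time_s * SPEED_OF_SOUND_M_S) / 2.0

# Example: a ping returning after roughly 8.7 ms corresponds to about 1.5 m.
print(round(range_from_ping(0.0087), 2))
```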
It should be noted that three objects 130, 132, and 134 are depicted in FIGS. 1A and 1B for illustrative purposes only. It should be appreciated that the systems, methods, and apparatus disclosed herein can be applied to any number of obstructions behind the vehicle 102 at any distance and at any angle.
Referring now to FIG. 2, an example system 160 for providing enhanced images and enhanced sounds indicative of obstructive objects 130, 132, and 134 in accordance with embodiments of the disclosure is illustrated. The system 160 may include one or more controllers 164, each controller 164 having one or more processors 168 communicatively coupled to memory 170. The one or more processors 168 may further be communicatively coupled to the image sensor 112, the range sensor 114, a user interface 174, a display 176, and one or more speakers 178. Image sensor signals generated by the image sensor 112 and range sensor signals generated by the range sensor 114 may be provided to the one or more processors 168. The one or more processors 168 may further receive input from or provide output to the user interface 174.
The processor(s) 168 may include, without limitation, a central processing unit (CPU), a digital signal processor (DSP), a reduced instruction set computer (RISC), a complex instruction set computer (CISC), a microprocessor, a microcontroller, a field programmable gate array (FPGA), or any combination thereof. The system 160 may also include a chipset (not shown) for controlling communications between the processor(s) 168 and one or more of the other components of the system 160. In certain embodiments, the system 160 may be based on an Intel® Architecture system, and the processor(s) 168 and chipset may be from a family of Intel® processors and chipsets, such as the Intel® Atom® processor family. The processor(s) 168 may also include one or more processors as part of one or more application-specific integrated circuits (ASICs) or application-specific standard products (ASSPs) for handling specific data processing functions or tasks. The memory 170 may include one or more volatile and/or non-volatile memory devices including, but not limited to, random access memory (RAM), dynamic RAM (DRAM), static RAM (SRAM), synchronous dynamic RAM (SDRAM), double data rate (DDR) SDRAM (DDR-SDRAM), RAMBUS DRAM (RDRAM), flash memory devices, electrically erasable programmable read only memory (EEPROM), non-volatile RAM (NVRAM), universal serial bus (USB) removable memory, or combinations thereof.
In certain embodiments, the one or more processors 168 may be part of an in-vehicle infotainment (IVI) system. In other embodiments, the one or more processors 168 may be dedicated to the system 160 for providing enhanced images and enhanced sounds indicative of obstructive objects 130, 132, and 134. Therefore, in such embodiments, the system 160 is separate from the IVI system. However, the system 160 may optionally communicate with the IVI system of the vehicle 102.
The user interface 174 may be any known input device, output device, or input and output device that can be used by a user to communicate with the one or more processors 168. The user interface 174 may include, but is not limited to, a touch panel, a keyboard, a display, speakers, a switch, a visual indicator, an audio indicator, a tactile indicator, a speech-to-text engine, or combinations thereof. In one aspect, the user interface 174 may be used by a user, such as the driver of the vehicle 102, to selectively activate or deactivate the system 160. In another aspect, the user interface 174 may be used by the user to provide parameter settings for the system 160. Non-limiting examples of the parameter settings may include power settings of the system 160, the sensitivity of the range sensor 114, the optical zoom associated with the image sensor 112, the frame rate of the image sensor 112, the brightness of the display 176, the volume of the one or more speakers 178, other parameters associated with enhancements of images displayed on the display 176, and other parameters associated with enhancements of sounds played by the one or more speakers 178. The user interface 174 may further communicate with the one or more processors 168 and provide information to the user, such as an indication that the system 160 is operational. The display 176 may be any known type of display including, but not limited to, a touch screen, a liquid crystal display (LCD), a thin-film transistor (TFT) display, an organic light-emitting diode (OLED) display, a plasma display, a cathode ray tube (CRT) display, or combinations thereof. In one aspect, the display 176 may receive display signals and, based upon the display signals, provide still or moving images corresponding to the display signals. In another aspect, the images displayed on the display 176 may be viewed by one or more users, such as a driver of the vehicle 102.
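A sketch of the kind of user-adjustable parameter settings listed above is shown below. The field names and default values are hypothetical; the disclosure does not define a concrete settings schema.

```python
# A minimal sketch of a settings record for the awareness system; fields and
# defaults are assumptions chosen to mirror the parameters named in the text.
from dataclasses import dataclass

@dataclass
class AwarenessSettings:
    enabled: bool = True
    range_sensor_sensitivity: float = 0.8   # 0.0 .. 1.0
    image_zoom: float = 1.0                 # optical zoom factor
    frame_rate_hz: int = 30                 # image sensor frame rate
    display_brightness: float = 0.7         # 0.0 .. 1.0
    speaker_volume: float = 0.5             # 0.0 .. 1.0

settings = AwarenessSettings(speaker_volume=0.8)  # e.g., a driver turning audio up
```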
The one or more speakers 178 may be of any known type including, but not limited to, a cone diaphragm-type speaker, a dynamic speaker, a piezoelectric speaker, a full-range speaker, a subwoofer, a woofer, a tweeter, or combinations thereof. In one aspect, the one or more speakers 178 may receive speaker signals and, based upon the speaker signals, provide sound corresponding to the speaker signals. In another aspect, the sounds generated by the one or more speakers 178 may be heard by one or more users, such as the driver of the vehicle 102.

During operation, the one or more processors 168 may generate display signals that are provided to the display 176 based at least in part on the received image sensor signals, the range sensor signals, and optionally input from the user interface 174. In one aspect, the display signals may correspond to a display image that may be shown on the display 176. In certain embodiments, the display image may be an enhanced display image of the image corresponding to the image sensor signals provided by the image sensor 112. The enhancement associated with the enhanced display image may entail rendering one of the objects 130, 132, and 134 differently from the other objects 130, 132, and 134. For example, the rendering of one of the objects 130, 132, and 134 may entail a different color, an oscillation, a different frequency of oscillation, a different magnitude of oscillation, a surrounding halo, a different size of a surrounding halo, a different color of a surrounding halo, a disproportionate size, a different level of pixel dithering, or combinations thereof relative to the other objects 130, 132, and 134. Therefore, in the enhanced display image, one or more of the objects 130, 132, and 134 may be displayed more prominently than the other objects 130, 132, and 134. In other words, the user viewing the enhanced display image may notice one or more of the objects 130, 132, and 134 more readily than some of the other objects 130, 132, and 134.
In certain embodiments, the most proximal of the objects 130, 132, and 134 may be displayed more prominently than the other objects 130, 132, and 134 in the enhanced display image as displayed on a display 176. For example, the basketball 132 may be more proximal to the vehicle 102 than the tree 130 or a wall 134. Accordingly, the basketball 132 may be displayed more prominently in the enhanced display image as displayed on the display 176. When viewed by a user, such as the driver of the vehicle 102, the user may notice the basketball 132 more readily than the tree 130 and the wall 134. In one aspect, based upon the enhanced display image, the user may be aware that the basketball 132 is closer to the vehicle 102 than the other two objects 130 and 134.
In certain other embodiments, the level of prominence accorded to each of the objects 130, 132, and 134 may be related to the relative distance between the objects 130, 132, and 134 and the rear of the vehicle 102. Therefore, the basketball 132 may be displayed more prominently than the wall 134, which, in turn, may be displayed more prominently than the tree 130, since the basketball 132 is more proximal than the wall 134, and the wall 134 is more proximal to the vehicle than the tree 130. As the vehicle 102 moves and the relative distances between the vehicle 102 and the objects 130, 132, and 134 change, so might the enhancement applied to each object. For example, if the vehicle 102 moves from a position where the basketball 132 is the most proximal object to a position where the tree 130 is the most proximal object, then the image may change in a manner where the relatively high prominence shifts from the image of the basketball 132 to the image of the tree 130.
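One simple way to realize the distance-based prominence described above is to map each object's range to a prominence value between 0 and 1, with nearer objects receiving higher values. The linear mapping and the example ranges below are assumptions for illustration only.

```python
# A minimal sketch of mapping distance to prominence; the linear law and the
# example ranges are assumptions, not a formula prescribed by the disclosure.
def prominence(range_m: float, max_range_m: float = 10.0) -> float:
    clipped = min(max(range_m, 0.0), max_range_m)
    return 1.0 - clipped / max_range_m  # 1.0 at the bumper, 0.0 at maximum range

ranges_m = {"basketball": 1.5, "wall": 3.0, "tree": 6.0}  # illustrative distances
for name, r in ranges_m.items():
    print(name, round(prominence(r), 2))  # basketball 0.85, wall 0.7, tree 0.4
```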
In yet other embodiments, the level of prominence accorded to each of the objects 130, 132, and 134 in the enhanced display image may be related to the relative angle between the objects 130, 132, and 134 and the reference plane 150. For example, objects, such as the basketball 132, with a smaller angle φ to the reference plane 150 may be displayed more prominently than objects 130 and 134 that have relatively greater angles ψ and θ, respectively, to the reference plane 150.

Continuing on with the operation of the system 160, the one or more processors 168 may provide speaker signals to the one or more speakers 178. In one aspect, the speaker signals may correspond with audio output from the one or more speakers 178. The audio output may be an enhanced audio output that is generated based in part upon the image sensor signal, the range sensor signal, and optionally any input from the user interface 174. In one aspect, the enhanced audio output may be indicative of the location of the one or more objects 130, 132, and 134. In another aspect, a user, such as the driver of the vehicle 102, may hear the enhanced audio output and gain awareness of the objects 130, 132, and 134 at the rear of the vehicle 102. Therefore, each of the audio signals sent to corresponding speakers 178 may be rendered in a manner that can be combined with audio output from all of the actuated speakers 178 to produce the desired directionality, magnitude, frequency, rhythm, and repetition to provide object proximity awareness to the user, such as the driver of the vehicle 102.
As a non-limiting example, consider that the one or more speakers 178 consist of two speakers 178, and the most proximal object 132 should be rendered from a direction that is equidistant between the two speakers 178. In such a case, the one or more processors may generate audio signals corresponding to each of the two speakers 178 so that an equal magnitude of sound is produced by the two speakers 178, such that it appears, to someone listening from a particular location, that the sound originates from some point between the two speakers 178.
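The two-speaker example above corresponds to the familiar notion of phantom-center panning. A minimal sketch using a constant-power pan law follows; the pan law is a common audio convention chosen for illustration and is not prescribed by the disclosure.

```python
# A minimal sketch of constant-power stereo panning: equal gains from both
# speakers place the phantom source midway between them.
import math

def stereo_gains(pan: float) -> tuple[float, float]:
    # pan in [-1.0 (full left), +1.0 (full right)]; 0.0 is centered.
    theta = (pan + 1.0) * math.pi / 4.0
    return math.cos(theta), math.sin(theta)

left, right = stereo_gains(0.0)    # a centered (most proximal) source
assert abs(left - right) < 1e-9    # equal magnitude from both speakers
```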
In certain embodiments, the enhanced audio output may provide sound from a plurality of speakers in a manner such that the audio output is perceived as originating from the direction of the most proximal object 130, 132, and 134 by a user, such as the driver of the vehicle 102. For example, the audio signals provided to the one or more speakers 178 may be such that the driver of the vehicle 102 may perceive a relatively substantial magnitude of sound originating from the direction of the basketball 132, and relatively less magnitude of sound or no sound from the direction of the tree 130 and the wall 134.
In certain other embodiments, the level of audio output from each of the one or more speakers 178 may be rendered spatially in a manner such that the level of sound perceived by a user corresponds to the proximity of the various proximal objects 130, 132, and 134. In other words, a driver of the vehicle 102 may perceive a greater magnitude of sound from the direction of the basketball 132, a relatively lower magnitude of sound from the direction of the wall 134, and a yet lower magnitude of sound from the direction of the tree 130.
In yet other embodiments, the level of audio output from each of the one or more speakers 178 may be rendered spatially in a manner such that the level of sound perceived by a user corresponds to the angle of the objects 130, 132, and 134 relative to the reference plane 150. For example, sound may be perceived more prominently from the direction of objects, such as the basketball 132, with a smaller angle φ to the reference plane 150 than from the direction of objects 130 and 134 that have relatively greater angles ψ and θ, respectively, to the reference plane 150. It should be noted that in some embodiments, the one or more processors 168 may also optionally receive information pertaining to the transmission (not shown) of the vehicle 102. For example, the one or more processors 168 may receive information that indicates if the vehicle 102 is in a reverse gear. The vehicle 102 may be driven in a reverse direction when the vehicle 102 is in a reverse gear. In one aspect, the system 160 may generate enhanced display images and enhanced audio output only when the vehicle 102 is in a reverse gear.
It should also be noted that in certain embodiments, only the enhanced image may be generated and displayed on the display 176. In other embodiments, only the enhanced audio output may be generated and played on the one or more speakers 178. In yet other embodiments, both the enhanced image may be displayed on the display 176, as well as the enhanced audio played on the one or more speakers 178. In one aspect, the user of the system 160 may determine if an enhanced image is desired, an enhanced audio is desired, or if both are desired.

Referring now to FIG. 3, an example method 180 for providing an enhanced image and an enhanced audio output in accordance with embodiments of the disclosure is illustrated. The method 180 may use the elements and the system 160 as described with reference to FIGS. 1A, 1B, and 2. At block 182, it is determined if the vehicle is in reverse. The determination may be performed by the one or more processors 168 based upon a communicative signal received by the one or more processors 168. The communicative signal may, in one aspect, be provided by one or more of an engine controller, a transmission controller, a vehicle main computer, an IVI system, or combinations thereof. If it is determined that the vehicle 102 is not in reverse, then the method 180 continues to monitor if the vehicle 102 transmission is placed in reverse.
If at block 182 it is determined that the vehicle 102 is in reverse, then input from the image sensor 112 may be received at block 184. As described with reference to FIG. 2, the image sensor signal generated by the image sensor 112 may be received via a communicative link by the one or more processors 168 of the system 160.
At block 186, input from the range sensor 114 may be received. Again, the range sensor signal generated by the range sensor 114 may be received via a communicative link by the one or more processors 168 of the system 160. Therefore, at blocks 184 and 186, the image sensor signals and the range sensor signals may be received concurrently by the one or more processors 168.
At block 188, the angles ψ, φ, and θ of each of the obstruction objects 130, 132, and 134 may be determined. The determination of the angles ψ, φ, and θ may be conducted by the one or more processors 168. In one aspect, the combination of the image sensor information with the range sensor information is sufficient to determine the angles ψ, φ, and θ to each of the obstruction objects 130, 132, and 134. In one alternative, only one of the image sensor information and the range sensor information may be needed to determine the angles ψ, φ, and θ to each of the obstruction objects 130, 132, and 134. In certain embodiments, determining the angles ψ, φ, and θ may entail analyzing the image that is generated by the image sensor 112 to identify each of the objects 130, 132, and 134. Upon identifying the relative positions of each of the objects 130, 132, and 134, information on the distance of each of the objects 130, 132, and 134 from the range sensor 114 may be used to determine the angles ψ, φ, and θ of each of the objects 130, 132, and 134. In one aspect, trigonometric mathematical manipulations may be applied to the relative positions determined using the image sensor 112 and the distance using the range sensor 114 to arrive at the angles ψ, φ, and θ of each of the objects 130, 132, and 134. Such mathematical manipulations may incorporate aspects of triangulation to determine angles from the images and distances as provided by the sensors 112 and 114.
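As one illustration of combining image information with range information, the sketch below estimates an object's angle from its horizontal pixel position under a simple pinhole-camera assumption. The model and the parameter names are assumptions and do not represent the specific trigonometric manipulations of any particular embodiment.

```python
# A minimal sketch of estimating an object's bearing from its pixel position
# and the camera's horizontal field of view (pinhole-camera assumption).
import math

def angle_from_pixel(pixel_x: int, image_width: int, horizontal_fov_rad: float) -> float:
    # Offset of the object from the image center, normalized to [-0.5, 0.5].
    normalized = (pixel_x - image_width / 2.0) / image_width
    # Map the normalized offset to an angle within the field of view.
    return math.atan(2.0 * normalized * math.tan(horizontal_fov_rad / 2.0))

# Example: an object 60 pixels right of center in a 1280-pixel-wide, 120° image.
phi = angle_from_pixel(pixel_x=700, image_width=1280, horizontal_fov_rad=math.radians(120.0))
print(round(math.degrees(phi), 1))  # roughly 9 degrees from the reference plane
```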
It should be noted that in certain embodiments, the determination of the angles ψ, φ, and θ at block 188 may be optional and may not be needed for generating an enhanced display image or an enhanced audio output.
Next, at block 190, the distance to each of the obstruction objects 130, 132, and 134 may be determined. In certain embodiments, the distance information may be provided by the range sensor 114 to the one or more processors 168. In other embodiments, the received range sensor signal may be analyzed in conjunction with the received image sensor signal to determine the distance to each of the objects 130, 132, and 134. As a non-limiting example, consider the scenario 100 of FIGS. 1A and 1B, where the range sensor 114 is a SONAR detector. The range sensor 114 may receive three separate return signals corresponding to each ping that is transmitted by the acoustic emitter 110 or transducer. From the three separate return signals, the one or more processors 168 may be able to determine three different ranges. However, based only on the range sensor 114 information, the one or more processors 168 may not be able to determine which object 130, 132, and 134 corresponds to each of the determined ranges from the range sensor data. With the image sensor signals, the one or more processors 168 may be able to identify the three objects 130, 132, and 134 and then be able to estimate which of the objects 130, 132, and 134 are likely to be the nearest. Based upon these estimations, the ranges to each of the identified proximal objects 130, 132, and 134 may be determined.
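A minimal sketch of the association step described above follows: objects identified in the image are ordered by their estimated nearness and paired with the sorted range returns. This greedy matching heuristic is an assumption for illustration, not the disclosed method.

```python
# A minimal sketch of pairing SONAR range returns with objects identified in
# the image; the ordering heuristic and example values are assumptions.
def associate_ranges(objects_by_estimated_nearness: list[str], returns_m: list[float]) -> dict[str, float]:
    # Nearest-looking object in the image gets the shortest return, and so on.
    return dict(zip(objects_by_estimated_nearness, sorted(returns_m)))

pairs = associate_ranges(["basketball", "wall", "tree"], [6.0, 1.5, 3.0])
print(pairs)  # {'basketball': 1.5, 'wall': 3.0, 'tree': 6.0}
```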
Blocks 188 and 190, in combination, may provide information, such as the relative angle and the relative distance, of each of the obstruction objects 130, 132, and 134. Therefore, using such information, the vectors 140, 142, and 144, corresponding to each of the objects 130, 132, and 134, respectively, may be known.
At block 192, the enhanced image signal may be generated. As discussed in conjunction with FIG. 2, the enhanced image signal may be generated by the one or more processors 168 based upon one or more of the image sensor signal, the range sensor signal, and inputs from the user interface 174. In one aspect, the angle and range information corresponding to each of the objects 130, 132, and 134, as determined at blocks 188 and 190, may be used to enhance one or more of the objects 130, 132, and 134 relative to the other objects 130, 132, and 134. For example, the nearest object, in this case the basketball 132, may be made more prominent in the enhanced image relative to the more distal objects, in this case the tree 130 and the wall 134.
At block 194, an enhanced audio signal may be generated. As described in conjunction with FIG. 2, the enhanced audio signal may be generated by the one or more processors 168 based upon one or more of the image sensor signal, the range sensor signal, and inputs from the user interface 174. In one aspect, the angle and range information corresponding to each of the objects 130, 132, and 134, as determined at blocks 188 and 190, may be used to provide the enhanced audio output corresponding to the relative angle or the relative distance of one or more of the objects 130, 132, and 134 relative to the other objects 130, 132, and 134. For example, the enhanced audio signal may be output from one or more speakers 178 in a manner such that it appears to someone sitting in the driver's seat of the vehicle 102 that the sound is originating from the direction of the nearest object, in this case the basketball 132. Audio output from the direction of the more distal objects, in this case the tree 130 and the wall 134, may be fainter than the sound coming from the direction of the basketball 132. It should be noted that in certain embodiments, the determination of the enhanced audio output and signal, at block 194, may be optional and that the method 180 may be performed without providing an audio output.
At block 196, the enhanced audio output signal may be output to the one or more speakers 178, and the enhanced image signal may be output to the display 176. The user, such as the driver of the vehicle 102, may view the enhanced image on the display 176 and hear the enhanced audio on the one or more speakers 178. Therefore, by viewing the enhanced display, or hearing the enhanced audio, or both, the user may be better informed about obstructions at the rear of the vehicle 102. It should be noted that the method 180 may be modified in various ways in accordance with certain embodiments of the disclosure. For example, one or more operations of the method 180 may be eliminated or executed out of order in other embodiments of the disclosure. For example, in certain embodiments, it may not be necessary to place the vehicle 102 in reverse as shown in block 182 for the remainder of the method 180 to be executed. Additionally, other operations may be added to method 180 in accordance with other embodiments of the disclosure.
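For orientation, the following sketch strings the blocks of the example method 180 together in pseudocode-like Python. All of the interfaces (transmission, sensors, renderer, audio, display, speakers) are hypothetical placeholders standing in for the components described with reference to FIG. 2, not an actual API.

```python
# An illustrative sketch of the flow of example method 180; every interface
# used here is a hypothetical placeholder.
def run_method_180(transmission, image_sensor, range_sensor, renderer, audio, display, speakers):
    while True:
        if not transmission.in_reverse():            # block 182: wait for reverse gear
            continue
        frame = image_sensor.read()                  # block 184: receive image sensor input
        returns = range_sensor.read()                # block 186: receive range sensor input
        located = renderer.locate(frame, returns)    # blocks 188/190: angles and distances
        image_signal = renderer.enhance(frame, located)   # block 192: enhanced image signal
        audio_signal = audio.enhance(located)             # block 194: enhanced audio signal
        display.show(image_signal)                   # block 196: output to the display
        speakers.play(audio_signal)                  # block 196: output to the speakers
```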
As discussed with reference to FIG. 2, the enhanced image of one object 130, 132, and 134 relative to the other objects 130, 132, and 134 may be displayed on the display 176. In one aspect, the more proximal objects may be displayed more prominently than the more distal objects.
For example, the image of the most proximal object, such as the basketball 132, may be displayed on the display 176 with a different color than the other objects 130 and 134. The basketball 132 may be displayed in a red color or with a red halo. The next most proximal object, such as the wall 134, may be displayed in yellow, and the most distal object, such as the tree 130, may be displayed in green. Therefore, the various colors used for each of the objects 130, 132, and 134 as displayed on the display 176 may draw greater relative attention to the most proximal object, such as the basketball 132, versus the most distal object, such as the tree 130.

Referring now to FIG. 4A, an example enhanced display image as displayed on display 176 is described. An image of the tree 200 corresponding to the tree 130 of FIGS. 1A and 1B, an image of the basketball 202 corresponding to the basketball 132 of FIGS. 1A and 1B, and an image of the wall 204 corresponding to the wall 134 of FIGS. 1A and 1B may be shown on the enhanced image. The enhanced image may further contain a halo 210 surrounding and corresponding to the image of the tree 200, a halo 212 surrounding and corresponding to the image of the basketball 202, and a halo 214 surrounding and corresponding to the image of the wall 204. In one aspect, the halo 212 surrounding the basketball may be more prominent than the halo 214 surrounding the wall, and the halo 214 surrounding the wall may, in turn, be more prominent than the halo 210 surrounding the tree to indicate that the basketball 132 is more proximal than the wall 134, which is more proximal than the tree 130 to the vehicle 102. Therefore, the one or more processors 168 may receive image sensor signals from the image sensor 112 and range sensor signals from the range sensor 114 and modify the received image based on determined angle and range information from blocks 188 and 190 to generate differentiated surrounding halos 210, 212, and 214 for each of the images of the objects 200, 202, and 204, respectively. In certain embodiments, prominence may be conveyed by a larger halo surrounding the image of more proximal objects, such as the image of the basketball 202 relative to the image of other objects 200 and 204. In other embodiments, prominence may be conveyed by a thicker halo surrounding the image of more proximal objects, such as the image of the basketball 202 relative to the image of other objects 200 and 204. In yet other embodiments, the prominence may be conveyed by a different colored halo surrounding the image of more proximal objects, such as the image of the basketball 202 relative to the image of other objects 200 and 204. It should be noted that certain objects, such as the wall 134, may span a length, where certain portions of the wall 134 are relatively more proximal to the rear of the vehicle 102 than other portions of the wall 134. Therefore, in certain embodiments, the rendered halo corresponding to a proximal portion of the wall 216 may be more prominent than the halo corresponding to a more distal portion of the wall 218.

Referring now to FIG. 4B, another example enhanced display image as displayed on display 176 is described. An image of the tree 220 corresponding to the tree 130, an image of the basketball 222 corresponding to the basketball 132, and an image of the wall 224 corresponding to the wall 134 may be shown on the enhanced image. The various images of objects 220, 222, and 224 may be shaded further.
In one aspect, the image of the basketball 222 may be less shaded than the image of the wall 224, and the image of the wall 224 may, in turn, be less shaded than the image of the tree 220 to indicate that the basketball 132 is more proximal than the wall 134, which is more proximal than the tree 130 to the vehicle 102. In another aspect, the one or more processors 168 may receive image sensor signals from the image sensor 112 and range sensor signals from the range sensor 114 and modify the received image based on determined angle and range information from blocks 188 and 190 to generate differentiated shading for each of the images of the objects 220, 222, and 224, respectively. Therefore, in certain embodiments, prominence of the image of one object relative to the image of another object may be conveyed by less shading, or greater brightness, such as less shading of the image of the basketball 222 relative to the image of other objects 220 and 224. In other embodiments, prominence may be conveyed by more shading, or less brightness, of the image of more proximal objects, such as the image of the basketball 222 relative to the image of other objects 220 and 224. In yet other embodiments, the prominence may be conveyed by a differently colored shading of the image of more proximal objects, such as the image of the basketball 222 relative to the image of other objects 220 and 224.
Certain objects, such as the wall 134, may span a length, where certain portions of the wall 134 are relatively more proximal to the rear of the vehicle 102 than other portions of the wall 134. Therefore, in certain embodiments, the shading of the image 224 corresponding to a proximal portion of the wall 226 may be less than the shading of the image 224 corresponding to a more distal portion of the wall 228.
Referring now to FIG. 4C, yet another example enhanced display image as displayed on display 176 is described. An image of the tree 230 corresponding to the tree 130, an image of the basketball 232 corresponding to the basketball 132, and an image of the wall 234 corresponding to the wall 134 may be shown on the enhanced image. The various images of objects 230, 232, and 234 may further oscillate at various oscillation magnitudes. In one aspect, the image of the basketball 232 may oscillate, as indicated by the relatively large arrows 242, more than the image of the wall 234, as indicated by relatively smaller arrows 244 and 246. The image of the wall 234 may, in turn, be oscillated more than the image of the tree 230, as indicated by arrows 240. The relative oscillations, as described, may indicate that the basketball 132 is more proximal than the wall 134, which is more proximal than the tree 130 to the vehicle 102. In another aspect, the one or more processors 168 may receive image sensor signals from the image sensor 112 and range sensor signals from the range sensor 114 and generate the enhanced image such that one or more of the images of the objects oscillate differently from the images of the other objects based on determined angle and range information from blocks 188 and 190. Therefore, in certain embodiments, prominence of the image of one object relative to the image of another object may be conveyed by the greater magnitude of oscillation of the image of the object corresponding to the more proximal object, such as greater oscillation of the image of the basketball 232 relative to the image of other objects 230 and 234. In other embodiments, prominence may be conveyed by less magnitude of oscillation of the image of more proximal objects, such as the image of the basketball 232 relative to the image of other objects 230 and 234. In yet other embodiments, the prominence may be conveyed by a different frequency of oscillation of the image of more proximal objects, such as the image of the basketball 232 relative to the image of other objects 230 and 234.
As discussed earlier, certain objects, such as the wall 134, may span a length, where certain portions of the wall 134 are relatively more proximal to the rear of the vehicle 102 than other portions of the wall 134. Therefore, in certain embodiments, the oscillation of the image 234 corresponding to a proximal portion of the wall 244 may be greater than the oscillation of the image 234 corresponding to a more distal portion of the wall 246.
Referring now to FIG. 4D, a yet further example of an enhanced display image as displayed on display 176 is described. An image of the tree 250 corresponding to the tree 130, an image of the basketball 252 corresponding to the basketball 132, and an image of the wall 254 corresponding to the wall 134 may be shown on the enhanced image. The various images of objects 250, 252, and 254 may be sized relative to each other corresponding to their relative proximity to the vehicle 102. In one aspect, the image of the basketball 252 may be rendered as disproportionately large relative to the wall 254, and the image of the wall 254 may, in turn, be shown as disproportionately larger than the image of the tree 250. The relatively disproportionate sizes of the images 250, 252, and 254 may indicate that the basketball 132 is more proximal than the wall 134, which is more proximal than the tree 130 to the vehicle 102. Therefore, the one or more processors 168 may receive image sensor signals from the image sensor 112 and range sensor signals from the range sensor 114 and modify the received image based on determined angle and range information from blocks 188 and 190 to generate a differentiated size for each of the images of the objects 250, 252, and 254, respectively. In certain embodiments, prominence of the image of one object relative to the image of another object may be conveyed by a relatively greater disproportionate size, such as a disproportionately large size of the image of the basketball 252 relative to the image of other objects 250 and 254.
Certain objects, such as the wall 134, may span a length, where certain portions of the wall 134 are relatively more proximal to the rear of the vehicle 102 than other portions of the wall 134. Therefore, in certain embodiments, the relative size of the image 254 corresponding to a proximal portion of the wall 256 may be greater than the relative size of the image 254 corresponding to a more distal portion of the wall 258.
It should be noted that the various enhancements to portions of the image displayed on the display 176 may be combined. Therefore, prominence of the image of one object relative to another object may be conveyed with any combinations of colors, halos, oscillations, shading, brightness, and disproportionate size. As a non-limiting example, a particular enhanced image may render a proximal object both with a disproportionately large size and with a relatively large halo compared to more distal objects from the vehicle 102.
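The sketch below illustrates one way such combined enhancements might be selected: each object is assigned a color, halo width, and scale factor from its proximity rank. The particular colors, widths, and scale factors are illustrative assumptions, not values taken from the disclosure.

```python
# A minimal sketch of choosing combined render attributes by proximity rank;
# the style table and example ordering are assumptions for illustration.
RENDER_STYLES = [
    {"color": "red",    "halo_px": 12, "scale": 1.3},  # most proximal object
    {"color": "yellow", "halo_px": 8,  "scale": 1.1},
    {"color": "green",  "halo_px": 4,  "scale": 1.0},  # most distal object
]

def styles_for(objects_sorted_by_range: list[str]) -> dict[str, dict]:
    styles = {}
    for rank, obj in enumerate(objects_sorted_by_range):
        # Objects beyond the table reuse the least prominent style.
        styles[obj] = RENDER_STYLES[min(rank, len(RENDER_STYLES) - 1)]
    return styles

print(styles_for(["basketball", "wall", "tree"]))  # nearest object listed first
```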
Referring now to FIG. 5, the generation of example enhanced audio signals for conveying distance and direction of proximal objects relative to the vehicle 102 is illustrated. For convenience, the vectors 140, 142, and 144 from FIGS. 1A and 1B, indicating the range to the vehicle 102 and the angles ψ, φ, and θ relative to the reference plane 150 of each of the objects 130, 132, and 134, respectively, are shown. The one or more speakers 178 may comprise speakers 178A, 178B, 178C, and 178N. Although four speakers 178A-N are depicted for illustrative purposes, there may be any number of speakers. In one aspect, the speakers 178A-N may be provided within the interior or cockpit of the vehicle 102. In another aspect, the speakers 178A-N may be provided within the cockpit of the vehicle 102 near the rear, such that sound generated by the speakers 178A-N may be heard by a user, such as the driver of the vehicle 102, from behind, when facing a front of the vehicle 102.
The one or more processors 168 may analyze the object vectors 140, 142, and 144 and generate spatialized sound vectors 270, 272, and 274 corresponding to objects 130, 132, and 134, respectively. In one aspect, the sound vectors 270, 272, and 274 may represent the magnitude and direction of sound. In certain embodiments, the direction of sound as represented by the sound vectors 270, 272, and 274 may appear to originate substantially from the direction of the obstruction objects 130, 132, and 134, from a predesignated position, such as the driver's seat of the vehicle. Additionally, the magnitude of the sound generated by the speakers 178A-N from a particular direction may be related to the distance of an obstruction in that direction. For example, the vector 142, corresponding to the basketball 132, may be the shortest vector due to the basketball being the most proximal of the obstruction objects 130, 132, and 134 behind the vehicle 102. The corresponding sound vector 272 may have a relatively greater magnitude compared to the other sound vectors 270 and 274, as a result of the proximity of the basketball 132 to the vehicle 102 compared to the proximity of the other objects 130 and 134. Furthermore, the angles of the sound vectors 270, 272, and 274 with reference to the reference plane 150 may be the same as or substantially similar to the angles ψ, φ, and θ of the objects 130, 132, and 134, respectively, relative to the reference plane 150.
The one or more processors 168 may provide acoustic signals 280A, 280B, 280C, and 280N to output sound from each of the speakers 178A-N in a manner so that the sound appears to a listener to have substantially the directionality and magnitude as depicted by the sound vectors 270, 272, and 274. To produce the desired sounds, the one or more processors 168 may provide different magnitudes of acoustic signals 280A-N to the one or more speakers 178A-N. For example, the acoustic signals 280C and 280N, provided to speakers 178C and 178N, may be of greater magnitude than the acoustic signals 280A and 280B, provided to speakers 178A and 178B, to generate a greater audio output consistent with the direction of sound vector 272, corresponding to the basketball 132.
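A minimal sketch of spreading one spatialized sound vector across several speakers follows: each speaker's gain grows with the object's proximity and with how closely the speaker's bearing matches the object's bearing. The speaker layout and weighting function are assumptions for illustration, not values taken from the disclosure.

```python
# A minimal sketch of per-speaker gain computation from an object's angle and
# range; the four speaker bearings and the cosine weighting are assumptions.
import math

SPEAKER_BEARINGS_RAD = [math.radians(a) for a in (-60.0, -20.0, 20.0, 60.0)]

def speaker_gains(obj_angle_rad: float, obj_range_m: float, max_range_m: float = 10.0) -> list[float]:
    loudness = max(0.0, 1.0 - obj_range_m / max_range_m)  # nearer object -> louder overall
    # Weight each speaker by how closely its bearing matches the object's bearing.
    weights = [max(0.0, math.cos(obj_angle_rad - b)) for b in SPEAKER_BEARINGS_RAD]
    total = sum(weights) or 1.0
    return [loudness * w / total for w in weights]

gains = speaker_gains(math.radians(5.0), 1.5)  # e.g., a nearby object just right of center
print([round(g, 2) for g in gains])
```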
Embodiments described herein may be implemented using hardware, software, and/or firmware, for example, to perform the methods and/or operations described herein. Certain embodiments described herein may be provided as a tangible machine-readable medium storing machine-executable instructions that, if executed by a machine, cause the machine to perform the methods and/or operations described herein. The tangible machine-readable medium may include, but is not limited to, any type of disk including floppy disks, optical disks, compact disk read-only memories (CD-ROMs), compact disk rewritables (CD-RWs), magneto-optical disks, semiconductor devices such as read-only memories (ROMs), random access memories (RAMs) such as dynamic and static RAMs, erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), flash memories, magnetic or optical cards, or any type of tangible media suitable for storing electronic instructions. The machine may include any suitable processing or computing platform, device or system and may be implemented using any suitable combination of hardware and/or software. The instructions may include any suitable type of code and may be implemented using any suitable programming language. In other embodiments, machine-executable instructions for performing the methods and/or operations described herein may be embodied in firmware.
Various features, aspects, and embodiments have been described herein. The features, aspects, and embodiments are susceptible to combination with one another as well as to variation and modification, as will be understood by those having skill in the art. The present disclosure should, therefore, be considered to encompass such combinations, variations, and modifications. The terms and expressions which have been employed herein are used as terms of description and not of limitation. In the use of such terms and expressions, there is no intention of excluding any equivalents of the features shown and described (or portions thereof), and it is recognized that various modifications are possible within the scope of the claims. Other modifications, variations, and alternatives are also possible. Accordingly, the claims are intended to cover all such equivalents.
While certain embodiments of the invention have been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only, and not for purposes of limitation.
This written description uses examples to disclose certain embodiments of the invention, including the best mode, and also to enable any person skilled in the art to practice certain embodiments of the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of certain embodiments of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.

Claims

CLAIMS
The claimed invention is:
1. A method comprising:
receiving, by at least one processor associated with a vehicle, at least one sensor signal;
determining, by the at least one processor, a range to at least one object based on the at least one sensor signal;
generating, by the at least one processor, an enhanced image signal corresponding to an enhanced image based in part on the range to the at least one object; and
providing, by the at least one processor, the enhanced image signal to a display device associated with the vehicle.
2. The method of claim 1 , wherein the at least one sensor signal comprises an image sensor signal.
3. The method of claim 1, wherein the at least one sensor signal comprises a range sensor signal.
4. The method of claim 1 , wherein the enhanced image provides an image of the at least one object, wherein one of the at least one object is visually enhanced relative to the other of the at least one object.
5. The method of claim 4, wherein the visual enhancement is at least one of: (i) an enhanced brightness; (ii) a different color; (iii) an oscillation; (iv) a different frequency of oscillation; (v) a different magnitude of oscillation; (vi) a surrounding halo; (vii) a different size of a surrounding halo; (viii) a different color of a surrounding halo; (ix) a disproportionate size; or (x) a different level of pixel dithering.
6. The method of claim 1, further comprising generating, by the at least one processor, at least one audio signal corresponding to an audio output based in part on the at least one sensor signal and providing the at least one audio signal to at least one audio speaker.
7. The method of claim 6, wherein generating the at least one audio signal further comprises determining an angle corresponding to each of the at least one object.
8. The method of claim 6, wherein the audio output comprises audio features corresponding to the proximity of the at least one object.
9. The method of claim 6, wherein the audio output corresponding to each of the at least one speaker provides sound with the greatest magnitude from substantially the direction of the most proximate of the at least one object relative to a reference point within the vehicle.
10. The method of claim 6, wherein the at least one audio speaker comprises four audio speakers, each provided with its corresponding respective audio signal, wherein the resulting audio output corresponds to the position of one or more of the at least one object.
11. A vehicle comprising:
at least one sensor configured to provide information on at least one object;
at least one processor configured to receive the information and generate an enhanced image signal corresponding to an enhanced image based on the information; and
a display configured to receive the enhanced image signal from the at least one processor and display the enhanced image.
12. The vehicle of claim 11, wherein the at least one sensor comprises an image sensor.
13. The vehicle of claim 11, wherein the at least one sensor comprises a range sensor.
14. The vehicle of claim 11, wherein the enhanced image provides an image of the at least one object, wherein one of the at least one object is visually enhanced relative to the other of the at least one object.
15. The vehicle of claim 14, wherein the visual enhancement is at least one of: (i) an enhanced brightness; (ii) a different color; (iii) an oscillation; (iv) a different frequency of oscillation; (v) a different magnitude of oscillation; (vi) a surrounding halo; (vii) a different size of a surrounding halo; (viii) a different color of a surrounding halo; (ix) a disproportionate size; or (x) a different level of pixel dithering.
16. The vehicle of claim 11, wherein the at least one processor is further configured to generate at least one audio signal corresponding to an audio output based in part on the information on the at least one object.
17. The vehicle of claim 16, further comprising at least one speaker configured to receive the at least one audio signal and provide the audio output.
18. The vehicle of claim 16, wherein the audio output comprises audio features corresponding to the proximity of the at least one object.
19. The vehicle of claim 17, wherein the audio output corresponding to each of the at least one speaker provides sound with the greatest magnitude from substantially the direction of a most proximate of the at least one object relative to a reference point within the vehicle.
20. The vehicle of claim 17, wherein the at least one audio speaker comprises four audio speakers, each provided with its corresponding respective audio signal, wherein the resulting audio output corresponds to the position of one or more of the at least one object.
21. A computer-readable medium associated with a vehicle comprising computer-executable instructions that, when executed by one or more processors, execute a method comprising:
receiving at least one sensor signal;
determining a range to at least one object based on the at least one sensor signal;
generating an enhanced image signal corresponding to an enhanced image; and
providing the enhanced image signal to a display device.
22. The computer-readable medium of claim 21, further comprising generating, by the at least one processor, at least one audio signal corresponding to an audio output based on the at least one sensor signal, and providing each of the at least one audio signal to an audio speaker.
23. The computer-readable medium of claim 22, wherein generating the at least one audio signal further comprises determining an angle corresponding to each of the at least one object.
PCT/US2011/067860 2011-12-29 2011-12-29 Systems and methods for proximal object awareness WO2013101075A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
PCT/US2011/067860 WO2013101075A1 (en) 2011-12-29 2011-12-29 Systems and methods for proximal object awareness
CN201180076060.6A CN104010909B (en) 2011-12-29 2011-12-29 Systems and methods for proximal object awareness
US13/977,617 US20150130937A1 (en) 2011-12-29 2011-12-29 Systems and methods for proximal object awareness
EP11878486.7A EP2797793B1 (en) 2011-12-29 2011-12-29 Systems and methods for proximal object awareness

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2011/067860 WO2013101075A1 (en) 2011-12-29 2011-12-29 Systems and methods for proximal object awareness

Publications (1)

Publication Number Publication Date
WO2013101075A1 true WO2013101075A1 (en) 2013-07-04

Family

ID=48698311

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2011/067860 WO2013101075A1 (en) 2011-12-29 2011-12-29 Systems and methods for proximal object awareness

Country Status (4)

Country Link
US (1) US20150130937A1 (en)
EP (1) EP2797793B1 (en)
CN (1) CN104010909B (en)
WO (1) WO2013101075A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3413287A1 (en) 2010-04-19 2018-12-12 SMR Patents S.à.r.l. Method for indicating to a driver of a vehicle the presence of an object at least temporarily moving relative to the vehicle

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10703299B2 (en) 2010-04-19 2020-07-07 SMR Patents S.à.r.l. Rear view mirror simulation
US9785846B2 (en) * 2015-12-23 2017-10-10 Automotive Research & Test Center Method for quantifying classification confidence of obstructions

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0948282A (en) * 1995-08-08 1997-02-18 Mazda Motor Corp Indirect visual confirmation device for vehicle
KR100196383B1 (en) * 1995-12-27 1999-06-15 류정열 A distance indicator with an ultrasonic sensor
US20050276450A1 (en) 2004-06-14 2005-12-15 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus
JP2007142545A (en) * 2005-11-15 2007-06-07 Denso Corp Vehicle periphery image processing apparatus and program
US20100259371A1 (en) * 2009-04-10 2010-10-14 Jui-Hung Wu Bird-View Parking Aid Apparatus with Ultrasonic Obstacle Marking and Method of Maneuvering the same

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2689792B2 (en) * 1991-10-30 1997-12-10 日産自動車株式会社 Three-dimensional sound field alarm device
US6087961A (en) * 1999-10-22 2000-07-11 Daimlerchrysler Corporation Directional warning system for detecting emergency vehicles
US7298247B2 (en) * 2004-04-02 2007-11-20 Denso Corporation Vehicle periphery monitoring system
US8885045B2 (en) * 2005-08-02 2014-11-11 Nissan Motor Co., Ltd. Device and method for monitoring vehicle surroundings
KR100696392B1 (en) * 2005-12-07 2007-03-19 주식회사단해 System and method for monitoring outside of automobile
CN201145741Y (en) * 2007-11-28 2008-11-05 严伟文 2.4G wireless screen display language prompting reverse drive indicator
GB0807953D0 (en) * 2008-05-01 2008-06-11 Ying Ind Ltd Improvements in motion pictures
US9126525B2 (en) * 2009-02-27 2015-09-08 Magna Electronics Inc. Alert system for vehicle
JP2011205513A (en) * 2010-03-26 2011-10-13 Aisin Seiki Co Ltd Vehicle periphery monitoring device
US20120056995A1 (en) * 2010-08-31 2012-03-08 Texas Instruments Incorporated Method and Apparatus for Stereo-Based Proximity Warning System for Vehicle Safety
JP5619077B2 (en) * 2011-07-04 2014-11-05 キヤノン株式会社 Sheet conveying apparatus and image forming apparatus
US20130093583A1 (en) * 2011-10-14 2013-04-18 Alan D. Shapiro Automotive panel warning and protection system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0948282A (en) * 1995-08-08 1997-02-18 Mazda Motor Corp Indirect visual confirmation device for vehicle
KR100196383B1 (en) * 1995-12-27 1999-06-15 류정열 A distance indicator with an ultrasonic sensor
US20050276450A1 (en) 2004-06-14 2005-12-15 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus
JP2007142545A (en) * 2005-11-15 2007-06-07 Denso Corp Vehicle periphery image processing apparatus and program
US20100259371A1 (en) * 2009-04-10 2010-10-14 Jui-Hung Wu Bird-View Parking Aid Apparatus with Ultrasonic Obstacle Marking and Method of Maneuvering the same

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2797793A4

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3413287A1 (en) 2010-04-19 2018-12-12 SMR Patents S.à.r.l. Method for indicating to a driver of a vehicle the presence of an object at least temporarily moving relative to the vehicle

Also Published As

Publication number Publication date
US20150130937A1 (en) 2015-05-14
EP2797793A4 (en) 2015-12-23
EP2797793A1 (en) 2014-11-05
EP2797793B1 (en) 2019-03-06
CN104010909B (en) 2017-04-12
CN104010909A (en) 2014-08-27

Similar Documents

Publication Publication Date Title
JP6394281B2 (en) In-vehicle alert system, alarm control device
JP5765995B2 (en) Image display system
JP5601930B2 (en) Vehicle display device
US20150365743A1 (en) Method and apparatus for including sound from an external environment into a vehicle audio system
US10102438B2 (en) Information display device
EP3239958B1 (en) Collision avoidance system and collision avoidance method
JP2011162189A (en) Dynamic range display for automotive rear-view and parking systems
TWM454349U (en) Omnidirectional alarming system for vehicle
EP2797793B1 (en) Systems and methods for proximal object awareness
CN111186435B (en) Anti-collision method and device for automobile and storage medium
CN113232659A (en) Blind spot information acquisition device and method, vehicle, and recording medium having program recorded thereon
WO2017007643A1 (en) Systems and methods for providing non-intrusive indications of obstacles
US9139132B2 (en) Vehicle approach information device with indicator to the driver
US20210319701A1 (en) Information providing device for vehicle, and vehicle
JP6638527B2 (en) Vehicle equipment, vehicle program
JP2023184778A (en) Vehicle display system and vehicle display method
US10227040B2 (en) Vehicle parking assistance method and device
JP6131752B2 (en) Vehicle ambient environment notification system
JP2020154795A (en) Output control device, output control method, and output control program
JP2007153078A (en) On-vehicle voice output unit
KR20190114388A (en) Method and apparatus for preventing collisions during parking using ultra wideband radar
CN106772397B (en) Vehicle data processing method and vehicle radar system
KR20130021978A (en) Vehicle camera system for measuring distance and method for providing the distance information
JP2017027181A (en) Driving support voice output control device and driving support voice output control method
JP2004220333A (en) Device for supporting vehicle traveling

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11878486

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2011878486

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 13977617

Country of ref document: US