WO2007032427A1 - Driving support device, imaging control method, imaging control program, and recording medium - Google Patents


Info

Publication number
WO2007032427A1
WO2007032427A1 (PCT/JP2006/318246)
Authority
WO
WIPO (PCT)
Prior art keywords
information
moving body
unit
imaging
traveling direction
Prior art date
Application number
PCT/JP2006/318246
Other languages
English (en)
Japanese (ja)
Inventor
Katsuaki Kawamura
Original Assignee
Pioneer Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation filed Critical Pioneer Corporation
Publication of WO2007032427A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Definitions

  • Driving support device, photographing control method, photographing control program, and recording medium
  • The present invention relates to a driving support device, a photographing control method, a photographing control program, and a recording medium that are mounted on a moving body such as a vehicle.
  • However, use of the present invention is not limited to the above-described driving support device, photographing control method, photographing control program, and recording medium.
  • Such a driving support apparatus is configured as follows, for example.
  • A stereo camera is installed on the vehicle; the traffic light, its state (display color), and the distance between the traffic light and the vehicle are recognized from the image in front of the vehicle captured by the stereo camera, and a warning can be output to the driver of the vehicle by voice or image to call attention (see, for example, Patent Document 1).
  • Patent Document 1: Japanese Patent Application Laid-Open No. 2004-199148
  • The driving support apparatus according to the invention includes: imaging means for imaging the front of a moving body; acquisition means for acquiring behavior information relating to the behavior of the moving body; determination means for determining, based on the behavior information, whether or not a change has occurred in the traveling direction of the moving body; control means for changing the imaging direction of the imaging means so as to follow the change in the traveling direction when it is determined that such a change has occurred; and output means for outputting information based on the image captured by the imaging means.
  • The imaging control method according to the invention is an imaging control method in a driving support device comprising imaging means for imaging the front of a moving body and output means for outputting information based on an image captured by the imaging means, the method including: an acquisition step of acquiring behavior information relating to the behavior of the moving body; a determination step of determining, based on the behavior information, whether or not a change has occurred in the traveling direction of the moving body; and a control step of changing the imaging direction of the imaging means so as to follow the change in the traveling direction when it is determined that such a change has occurred.
  • a shooting control program according to the invention of claim 7 causes a computer to execute the shooting control method of claim 6.
  • a recording medium according to the invention of claim 8 is characterized in that the photographing control program according to claim 7 is recorded so as to be readable by a computer.
  • FIG. 1 is a block diagram illustrating an example of a functional configuration of a driving assistance apparatus according to an embodiment.
  • FIG. 2 is a flowchart illustrating an example of imaging control processing and driving support processing procedures of the driving support device according to the embodiment.
  • FIG. 3 is a block diagram illustrating an example of a hardware configuration of the driving support apparatus according to the embodiment.
  • FIG. 4 is an explanatory diagram for explaining the outline of the driving support using the photographing control method in the driving support device according to the embodiment.
  • FIG. 5 is a diagram illustrating a shooting control process and a driving support process of the driving support apparatus according to the embodiment.
  • FIG. 1 is a block diagram showing an example of a functional configuration of a driving support apparatus according to an embodiment of the present invention.
  • The driving support apparatus is mounted on a moving body such as a vehicle (including four-wheeled and two-wheeled vehicles), for example, and includes a photographing unit 101, an acquisition unit 102, a determination unit 103, a control unit 104, an extraction unit 105, a determination unit 106, and an output unit 107.
  • the photographing unit 101 photographs the front of the moving body.
  • The image in front of the moving body photographed by the photographing unit 101 is, for example, an image having an angle of view equivalent to the viewing angle that the driver of the vehicle sees when driving.
  • the photographing unit 101 is specifically configured by one or a plurality of photographing devices, for example, and may photograph not only the front of the moving body but also the surroundings of the moving body.
  • the acquisition unit 102 acquires behavior information regarding the behavior of the moving object.
  • the behavior information is detected and acquired, for example, by a detection unit (not shown) of the moving body.
  • The behavior information is, specifically, information indicating the running state of the vehicle when the moving body is a vehicle, for example, information on the speed of the vehicle (speed information, acceleration information, angular velocity information, etc.) and information on the steering of the vehicle.
  • behavior information includes vehicle tilt angle information, current location information, and information associated with operation inputs.
  • The determination unit 103 determines, based on the behavior information acquired by the acquisition unit 102, whether or not a change has occurred in the traveling direction of the moving body. Specifically, the determination unit 103 determines whether the traveling direction of the moving body has changed toward its lateral direction based on, for example, information on the acceleration of the moving body, or on information on both the speed in the traveling direction and the acceleration.
  • When it is determined that a change has occurred in the traveling direction, the control unit 104 changes the shooting direction of the shooting unit 101 so as to follow the change. For example, when the determination unit 103 determines, based on the information on the acceleration of the moving body, that the traveling direction has changed toward the lateral direction of the moving body, the control unit 104 changes the shooting direction of the shooting unit 101 laterally.
  • Alternatively, when the determination unit 103 determines, based on the information on the speed in the traveling direction and the information on the acceleration, that the traveling direction has changed toward the lateral direction, the control unit 104 changes the shooting direction laterally using the speed in the traveling direction indicated by the speed information and the lateral acceleration of the moving body indicated by the acceleration information.
  • The control unit 104 may also change the shooting direction of the shooting unit 101 laterally based on the steering direction of the moving body indicated by information on the steering system, or set the shooting direction of the shooting unit 101 based on the information on the acceleration and the speed in the traveling direction together with the steering information.
  • Specifically, when the determination unit 103 determines, based on the behavior information acquired by the acquisition unit 102, that the traveling direction of the moving body has changed toward its lateral direction, the control unit 104 changes the shooting direction as follows. For example, if it is determined that the traveling direction has changed to the left and the moving body has turned left, the shooting direction of the shooting unit 101 is changed so that a range including the left front is captured, rather than the direction directly to the left; likewise, if it is determined that the traveling direction has changed to the right and the moving body has turned right, the shooting direction of the shooting unit 101 is changed so that a range including the right front is captured, rather than the direction directly to the right.
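The patent specifies this follow-the-turn behavior only functionally. As an illustrative sketch (not part of the disclosure), the decision could be rendered as follows; the sign convention and the threshold value are assumptions:

```python
def follow_turn_pan(lateral_g: float, threshold: float = 0.05) -> str:
    """Decide the camera pan from lateral acceleration (m/s^2).

    Positive lateral_g is taken here to mean a right turn, negative a
    left turn; within the threshold the camera keeps facing straight
    ahead. The camera pans toward the left-front or right-front range
    rather than fully sideways, as described in the embodiment.
    """
    if lateral_g > threshold:
        return "right-front"   # turning right: include the right front
    if lateral_g < -threshold:
        return "left-front"    # turning left: include the left front
    return "straight"          # no significant change in direction
```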
  • The extraction unit 105 performs image processing and the like on the image in front of the moving body captured by the imaging unit 101 to extract a signal image relating to a traffic light, which is one of the road structures.
  • The road structures include, for example, members that straddle the road, members that cover the driving lane, bridges, road surfaces (road pavement), tunnels, and the like, as well as traffic lights and white road lines.
  • The determination unit 106 determines the display color of the traffic light based on the signal image extracted by the extraction unit 105. Specifically, the determination unit 106 determines, for example, whether the display color of the traffic light is red or yellow. The determination unit 106 need not determine whether the display color is blue (green), but may be configured to do so.
  • the output unit 107 outputs information based on an image in front of the moving object photographed by the photographing unit 101.
  • The output unit 107 also outputs alarm information to the driver of the moving body based on the determination result of the determination unit 106. Specifically, the information based on these images and the warning information are, for example, displayed and output by the output unit 107.
  • the determination unit 103 determines whether or not a change has occurred in the traveling direction of the moving object.
  • Then, when it is determined that a change has occurred, the control unit 104 changes the shooting direction of the shooting unit 101 so as to follow the change in the traveling direction of the moving body. For this reason, even when the road is curved, for example, a traffic light on the road can be captured accurately and information based on it can be output, without the traffic light falling outside the imaging range of the imaging unit 101.
  • Further, a signal image is extracted by the extraction unit 105 from the image captured by the imaging unit 101, the display color of the traffic light is determined by the determination unit 106 based on the signal image, and alarm information can be output to the driver of the moving body by the output unit 107 based on the determination result, so the driver can be reliably alerted.
  • According to this driving support device, even if the road on which the moving body travels is curved, the imaging direction reliably follows the traveling direction of the moving body, an image in front of the moving body is captured, and information based on the captured image can be output. As a result, driving assistance can be provided by alerting the driver with information based on the image ahead of the moving body.
  • FIG. 2 is a flowchart showing an example of imaging control processing and driving support processing procedures of the driving support device according to the embodiment of the present invention.
  • First, the photographing unit 101 photographs the front of the moving body (step S201). Next, the acquisition unit 102 acquires the behavior information of the moving body (step S202).
  • the determination unit 103 determines whether there is a change in the traveling direction of the moving body (step S203). If it is determined that there is no change in the traveling direction of the moving body (step S203: No), the process returns to step S201 and the front of the moving body is photographed.
  • If it is determined that there is a change in the traveling direction of the moving body (step S203: Yes), the control unit 104 changes the shooting direction of the shooting unit 101 so as to follow the change in the traveling direction (step S204).
  • Next, the control unit 104 determines whether or not there is a signal image relating to a traffic light in the image captured by the imaging unit 101 (step S205). If it is determined that there is no signal image (step S205: No), the process returns to step S201 and the front of the moving body is photographed.
  • If it is determined that there is a signal image (step S205: Yes), the extraction unit 105 extracts the signal image from the image captured by the imaging unit 101 (step S206).
  • The determination unit 106 then determines the display color of the traffic light based on the signal image (step S207), and based on the determination result, the control unit 104 determines, for example, whether the display color is red or yellow (step S208).
  • If it is determined that the display color is not red or yellow (step S208: No), the process returns to step S201 to photograph the front of the moving body. If it is determined that the display color is red or yellow (step S208: Yes), the output unit 107 outputs alarm information relating to the determined red or yellow color, for example by display output or audio output (step S209), and the process returns to the shooting process of step S201 and loops.
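The flowchart steps S201 through S209 can be sketched as the following loop. This is a hypothetical rendering only: the unit objects and their method names are illustrative stand-ins for the photographing, acquisition, extraction, determination, and output units, not an API defined by the document:

```python
# Hypothetical stand-ins for the units of FIG. 1; all names are illustrative.
def driving_support_loop(camera, acquirer, detector, max_cycles=1000):
    """One possible rendering of the S201-S209 flowchart as a loop."""
    for _ in range(max_cycles):
        image = camera.shoot()                      # S201: photograph the front
        behavior = acquirer.get_behavior()          # S202: acquire behavior info
        if behavior.direction_changed:              # S203: direction changed?
            camera.follow(behavior.new_direction)   # S204: follow the change
        signal = detector.find_signal(image)        # S205/S206: find + extract
        if signal is None:
            continue                                # back to S201
        color = detector.classify_color(signal)     # S207: determine color
        if color in ("red", "yellow"):              # S208: red or yellow?
            yield f"warning: {color} light ahead"   # S209: output alarm
```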
  • By the shooting control process and the driving support process in the driving support device described above, even if the road on which the moving body travels is curved, the shooting direction reliably follows the change in the traveling direction of the moving body, an image in front of the moving body can be captured, and warning information can be output by determining the presence of a signal image and its display color from the captured image. Therefore, driving assistance can be provided by alerting the driver accurately and reliably based on the signal image.
  • As described above, the photographing direction of the photographing apparatus is reliably set to the traveling direction of the moving body, an image of the front of the moving body is captured, and information based on the captured image is displayed as warning information. Accordingly, information on the road ahead of the moving body can be accurately conveyed to the driver while images of the road on which the moving body is traveling are accurately acquired.
  • FIG. 3 is a block diagram showing an example of a hardware configuration of the driving support apparatus according to the embodiment of the present invention.
  • the same parts as those already described are denoted by the same reference numerals and the description thereof is omitted.
  • The driving support device 300 includes a control unit 301, a user operation unit 302, a display unit 303, an information acquisition unit 304, a recording medium 305, a recording medium decoding unit 306, an audio output unit 307, a communication unit 308, a camera 309, a camera drive unit 310, an image processing unit 311, an audio generation unit 312, and a speaker 313.
  • the control unit 301 controls the entire driving support apparatus 300, for example.
  • The control unit 301 includes, for example, a CPU (Central Processing Unit) that executes predetermined arithmetic processing, a ROM (Read Only Memory) that stores various control programs, and a RAM (Random Access Memory) that functions as a work area of the CPU, and can be realized by, for example, a microcomputer such as an ECU (Electronic Control Unit).
  • User operation unit 302 outputs information input by the user, such as characters, numerical values, and various instructions, to control unit 301.
  • the user operation unit 302 includes, for example, a push button switch, a touch panel, a remote controller, and the like. Further, the user operation unit 302 may be configured to perform an input operation by voice using a microphone that inputs voice from the outside.
  • The display unit 303 includes, for example, a CRT (Cathode Ray Tube), a TFT (Thin Film Transistor) liquid crystal display, an organic EL (Electroluminescence) display, a plasma display, or the like.
  • The display unit 303 can be configured by, for example, a video I/F and a video display device connected to the video I/F. Specifically, the video I/F includes, for example, a graphic controller that controls the entire display device, a buffer memory such as a VRAM (Video RAM) that temporarily stores image information ready for immediate display, and a control IC that controls display on the display device based on image information output from the graphic controller.
  • the display unit 303 displays video (image) information, alarm information, and various other information captured by a camera 309, which will be described later.
  • The information acquisition unit 304 includes a GPS receiver and various sensors, and acquires information on the position of the moving body on which the driving support device 300 is mounted, vehicle behavior information, and the like.
  • the information acquisition unit 304 acquires information related to navigation from the navigation device. Then, the acquired various information is output to the control unit 301.
  • the GPS receiver receives the radio wave from the GPS satellite and obtains the geometric position with respect to the GPS satellite.
  • GPS is an abbreviation for Global Positioning System, and is a system that accurately obtains the position on the ground by receiving radio waves from four or more satellites.
  • the GPS receiver consists of an antenna for receiving radio waves from GPS satellites, a tuner that demodulates the received radio waves, and an arithmetic circuit that calculates the current position based on the demodulated information.
  • the various sensors include, for example, a speed sensor, an inclination angle sensor, an acceleration sensor, an angular velocity sensor (gyro sensor), a lateral G sensor, and the like.
  • the speed sensor detects the vehicle speed from the output shaft of the vehicle transmission and outputs speed information.
  • the tilt angle sensor detects, for example, a tilt angle on the road surface of the vehicle, and outputs tilt angle information.
  • the angular velocity sensor detects, for example, the traveling direction of the vehicle, detects the angular velocity at the time of cornering of the vehicle, and outputs angular velocity information and relative azimuth information.
  • The lateral G sensor detects the lateral G, the outward lateral acceleration generated by centrifugal force when the vehicle corners, for example, and outputs lateral G information.
  • The information acquisition unit 304 may also acquire, for example, steering angle information of the steering wheel of the vehicle.
  • the information acquisition unit 304 obtains, for example, the movement displacement, movement speed, movement direction, and inclination angle of the vehicle from information output from these various sensors. Then, by using the output information from the various sensors together with the information obtained from the radio wave received by the GPS receiver, the position of the moving body can be recognized with higher accuracy.
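As one generic illustration of how sensor outputs (speed, direction) can supplement GPS fixes between receptions, a dead-reckoning position update from speed and heading might look like the following; the specific combination method is not specified in the document, so this is only a sketch:

```python
import math

def dead_reckon(x, y, heading_deg, speed_mps, dt):
    """Advance a 2-D position estimate from speed and heading.

    heading_deg is measured clockwise from north (0 deg = north,
    90 deg = east); speed_mps is the vehicle speed and dt the elapsed
    time in seconds. Returns the new (east, north) position.
    """
    heading = math.radians(heading_deg)
    return (x + speed_mps * dt * math.sin(heading),   # east component
            y + speed_mps * dt * math.cos(heading))   # north component
```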
  • the navigation-related information acquired by the information acquisition unit 304 includes, for example, map information used for route search and route guidance in the navigation device.
  • This map information includes, for example, background information representing features such as buildings, rivers, and the ground surface, and road shape information representing the shape of roads, and is rendered two-dimensionally or three-dimensionally.
  • this map information and a mark indicating the current position of the moving body are displayed in an overlapping manner on the display screen of the display unit 303. Further, when warning information or the like is output from the driving support device 300, the warning information or the like is displayed on the display screen of the display unit 303 in an overlapping manner.
  • the background information includes background shape information representing the shape of the background and background type information representing the type of the background.
  • The background shape information includes, for example, the coordinates of representative points, polylines, and polygons of the features.
  • The background type information includes, for example, text information indicating the name, address, and telephone number of a feature, and type information indicating the kind of feature, such as building, river, or ground surface.
  • the road shape information is information relating to a road network having a plurality of nodes and links.
  • A node indicates an intersection where a plurality of roads meet, such as a three-way junction, a crossroads, or a five-way intersection.
  • the link indicates a road connecting the nodes.
  • the road shape information further includes traffic condition information.
  • The traffic condition information includes, for example, the presence or absence of traffic signals and pedestrian crossings, the presence or absence of expressway entrances and junctions, the length (distance) of each link, vehicle width, travel direction, traffic prohibitions, and road type (expressway, toll road, general road, etc.).
  • The traffic condition information also stores past traffic information that has been statistically processed by season, day of the week, holiday periods, time of day, and the like.
  • the information acquisition unit 304 acquires road traffic information such as traffic congestion and traffic rules distributed from a VICS (Vehicle Information and Communication System) (registered trademark) center and received by a navigation device. It may be.
  • When the map information acquired from the navigation device by the information acquisition unit 304 is used in the driving support device 300, not only the behavior information of the vehicle but also, for example, the positions of traffic lights on the road, the road shape, and the like can be read ahead and used for driving support.
  • In the recording medium 305, various control programs and various information are recorded in a state readable by a computer.
  • The recording medium 305 can be realized by, for example, an HD (Hard Disk), a DVD (Digital Versatile Disk), a CD (Compact Disk), a memory card, or the like.
  • the recording medium 305 may accept writing of information by the recording medium decoding unit 306 and may record the written information in a nonvolatile manner.
  • the recording medium decoding unit 306 controls reading (reading) / writing (writing) of information with respect to the recording medium 305.
  • the map information is acquired from the navigation device by the information acquisition unit 304.
  • This map information may be recorded in the recording medium 305, for example.
  • the map information may be recorded on an external server or the like.
  • the driving support apparatus 300 may acquire map information from an external server via a network via a communication unit 308 described later, for example.
  • the acquired map information is temporarily or permanently stored in RAM.
  • the audio output unit 307 reproduces sound such as a warning sound by controlling output to the connected speaker 313.
  • one speaker 313 or a plurality of speakers 313 may be provided.
  • The audio output unit 307 can be configured with, for example, a D/A converter that performs D/A conversion of digital audio information, an amplifier that amplifies the analog audio signal output from the D/A converter, and an A/D converter that performs A/D conversion of analog audio signals.
  • The communication unit 308 includes, for example, an FM multiplex tuner, a wireless communication device, and other communication devices, and communicates with other communication equipment.
  • the communication unit 308 may be configured to perform communication via a communication medium such as a mobile phone, PHS, communication card, and wireless LAN.
  • The camera 309 is configured by a photographing device such as a digital still camera (DSC) or digital video camera (DVC) mounted, for example, on the back of the vehicle's rearview mirror, includes a photoelectric conversion element such as a CMOS or CCD sensor, and photographs the front of the vehicle.
  • the camera 309 takes an image of the front of the vehicle with an angle of view in the range of ⁇ 30 ° to 45 ° left and right around the traveling direction.
  • The camera driving unit 310, for example, mechanically drives the fixed shaft of the camera 309 under control of the control unit 301 based on the vehicle behavior information acquired by the information acquisition unit 304, and when there is a change in the traveling direction, changes the shooting direction of the camera 309 so as to follow the traveling direction.
  • the camera driving unit 310 can be realized by a rotary motor, a solenoid, or the like.
  • the image processing unit 311 performs overall image processing in the driving support apparatus 300 and also performs image processing of an image taken by the camera 309.
  • The image processing unit 311 is configured by, for example, a GPU (Graphics Processing Unit) or a DSP (Digital Signal Processor).
  • The image processing by the image processing unit 311 is performed, for example, by analyzing the image in front of the vehicle captured by the camera 309, extracting the signal image relating to the traffic light from the analyzed image, and discriminating the display color of the traffic light represented by the signal image, for example by comparison with preset color sample information.
  • Information on the display color of the traffic light determined by the image processing unit 311 is output to the control unit 301, and the control unit 301 controls the output of warning information, described later, based on this display color information. Further, the image ahead of the vehicle processed by the image processing unit 311 may be displayed on the display screen of the display unit 303 via the control unit 301.
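A minimal sketch of the comparison with preset color sample information might look as follows; the sample RGB values and the nearest-sample (squared Euclidean distance) rule are assumptions, since the document does not specify them:

```python
# Illustrative preset color samples; the RGB values are assumptions,
# not taken from the document.
COLOR_SAMPLES = {
    "red":    (255, 40, 40),
    "yellow": (255, 210, 40),
    "green":  (40, 200, 120),
}

def classify_signal_color(mean_rgb):
    """Return the sample color nearest (squared Euclidean) to mean_rgb,
    the average RGB of the extracted signal-lamp region."""
    def dist(sample):
        return sum((a - b) ** 2 for a, b in zip(mean_rgb, sample))
    return min(COLOR_SAMPLES, key=lambda name: dist(COLOR_SAMPLES[name]))
```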
  • The sound generation unit 312 generates information for various sounds, such as an alarm sound, based on the alarm information. Specifically, it sets a virtual sound source corresponding to the alarm, generates voice guidance information, and outputs the generated voice guidance information via the control unit 301.
  • the speaker 313 reproduces (outputs) an alarm sound output from the sound output unit 307 and various sounds output from the control unit 301 via the sound output unit 307.
  • A headphone or the like may be provided in place of or in addition to the speaker 313, and the output form of the guidance sound or voice may be changed as appropriate so that a sound field such as an alarm sound or voice is not generated throughout the entire vehicle interior.
  • The functions of the functional configuration of the driving support device shown in FIG. 1 are realized as follows: the photographing unit 101 is realized by, for example, the camera 309 and the camera driving unit 310; the acquisition unit 102 by, for example, the information acquisition unit 304 and the communication unit 308; and the determination unit 103 by, for example, the control unit 301.
  • The control unit 104 in FIG. 1 realizes its function by, for example, the control unit 301 and the camera driving unit 310; the extraction unit 105 and the discrimination unit 106 realize their functions by, for example, the control unit 301 and the image processing unit 311; and the output unit 107 realizes its function by, for example, the display unit 303, the audio output unit 307, and the speaker 313.
  • FIG. 4 is an explanatory diagram for explaining an outline of driving support using the photographing control method in the driving support device according to the embodiment of the present invention.
  • A vehicle 401 traveling on a road 410 is equipped with a camera 309 (see FIG. 3) that captures the front. While the vehicle 401 travels straight ahead, this camera 309 photographs the front of the vehicle 401 in the photographing range PD1 from broken line A to broken line B.
  • Here, suppose the traveling direction of the vehicle 401 changes from straight ahead to the direction indicated by arrow 421 in the figure (toward the right front). If the shooting range PD1 of the camera 309 remained unchanged, shooting of the traffic light 411 near the exit of the right curve would be delayed. For this reason, as behavior information, the lateral G information of the vehicle 401 indicated by arrow 422 in the figure, generated when the traveling direction changes in the direction of arrow 421, is acquired by the information acquisition unit 304 (see FIG. 3).
  • Based on this lateral G information, the control unit 301 calculates the traveling direction of the vehicle 401, and the camera driving unit 310 (see FIG. 3) changes the shooting direction of the camera 309 so as to follow the calculated traveling direction, so that, for example, the front of the vehicle 401 is shot in the shooting range PD2 from straight line C to straight line D.
  • As a result, the traffic light 411 near the exit of the right curve can be reliably photographed, and alarm information based on, for example, the display color of the traffic light 411 can be output from the signal image of the photographed traffic light 411. It is therefore possible to alert the driver in advance and to prevent an accident caused by panic braking when the traffic light 411 appears suddenly in the middle of the curve.
  • Other behavior information, for example speed information and acceleration information in the traveling direction of the vehicle 401, or steering angle information of the steering device, may also be used to change the shooting direction of the camera 309 more precisely.
  • In general, when a vehicle turns at a constant speed, the acceleration applied in the lateral direction of the vehicle 401 (lateral G) is inversely proportional to the turning radius. From the lateral G information used as behavior information, it can therefore be seen that the greater the lateral acceleration (lateral G) of the vehicle 401, the smaller the turning radius (that is, the sharper the curve).
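The inverse relation between lateral G and turning radius follows from the circular-motion formula a_lat = v²/r. As a generic physics illustration (not code from the patent), the radius implied by a measured lateral acceleration can be recovered as:

```python
def turning_radius(speed_mps: float, lateral_g_mps2: float) -> float:
    """Radius implied by circular motion: a_lat = v^2 / r  =>  r = v^2 / a_lat.

    At a fixed speed, a larger lateral acceleration means a smaller
    radius, i.e. a sharper curve.
    """
    if lateral_g_mps2 <= 0.0:
        raise ValueError("lateral acceleration must be positive while turning")
    return speed_mps ** 2 / lateral_g_mps2

# At 20 m/s, doubling the measured lateral G halves the estimated radius.
print(turning_radius(20.0, 2.0))  # 200.0 (m)
print(turning_radius(20.0, 4.0))  # 100.0 (m)
```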
  • Accordingly, when the lateral acceleration (lateral G) is large, the camera driving unit 310 controlled by the control unit 301 changes the shooting range PD1 of the camera 309 greatly in the left-right direction relative to the front of the vehicle 401, and when the acceleration (lateral G) is small, it changes the range only slightly in the left-right direction.
  • Furthermore, since the lateral acceleration (lateral G) is proportional to the square of the speed, the lateral acceleration (lateral G) is large when the speed of the turning vehicle 401 is high and small when it is low.
  • Therefore, when the camera driving unit 310 changes the shooting range of the camera 309, using speed information in the traveling direction in addition to the lateral G information of the vehicle 401 allows a more precise adjustment. The same adjustment can also be made using the steering angle information of the steering device of the vehicle 401.
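One plausible way to combine lateral G and speed is sketched below. The yaw rate ω = a_lat / v, the one-second look-ahead, and the 45° actuator limit are illustrative assumptions, not values taken from the patent:

```python
import math

def pan_angle_deg(lateral_g_mps2: float, speed_mps: float,
                  look_ahead_s: float = 1.0, max_pan_deg: float = 45.0) -> float:
    """Pan the camera toward the anticipated heading.

    For circular motion the yaw rate is omega = a_lat / v (rad/s), so the
    heading change over the look-ahead window is omega * look_ahead_s.
    The result is clamped to the mechanical range of the camera drive.
    """
    if speed_mps < 0.1:                      # effectively stopped: keep camera straight
        return 0.0
    omega = lateral_g_mps2 / speed_mps       # signed yaw rate, rad/s
    angle = math.degrees(omega * look_ahead_s)
    return max(-max_pan_deg, min(max_pan_deg, angle))
```

At the same lateral G, a lower speed implies a higher yaw rate and hence a larger pan, matching the sharp-curve behavior described above.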
  • FIG. 5 is a flowchart illustrating an example of the imaging control process and the driving support process procedure of the driving support apparatus according to the embodiment of the present invention.
  • The processing shown in FIG. 5 is implemented by the CPU of the control unit 301 controlling each part of the driving support device 300 using a program stored (recorded) in the RAM or ROM of the control unit 301 shown in FIG. 3, or on the recording medium 305.
  • The description below refers mainly to FIG. 3 and FIG. 4; portions that overlap with those already described are denoted by the same reference numerals unless otherwise specified, and their description is omitted.
  • First, under the control of the control unit 301, the front of the vehicle 401 is photographed by the camera 309 (step S501), and the behavior information of the vehicle 401, such as lateral G information, speed information, and steering angle information, is acquired by the information acquisition unit 304 (step S502).
  • Next, the control unit 301 determines whether or not there is a lateral change in the traveling direction of the vehicle 401 (step S503).
  • If it is determined that there is no lateral change in the traveling direction of the vehicle 401 (step S503: No), the process returns to step S501 and the front of the vehicle 401 is photographed again. If it is determined that there is a lateral change in the traveling direction of the vehicle 401 (step S503: Yes), the camera driving unit 310, under the control of the control unit 301, changes the shooting direction of the camera 309 so as to follow the change in the traveling direction of the vehicle 401 (step S504).
  • Next, the control unit 301 determines whether or not a signal image representing the traffic light 411 is present in the image in front of the vehicle 401 photographed by the camera 309 and processed through the image processing unit 311 (step S505). If it is determined that there is no signal image in the photographed image (step S505: No), the process returns to step S501 to photograph the front of the vehicle 401. On the other hand, if it is determined that there is a signal image in the captured image (step S505: Yes), the image processing unit 311 extracts the signal image from the captured image (step S506), and the display color of the traffic light 411 is determined from the signal image (step S507). Then, the control unit 301 determines, for example, whether or not the determined display color is red (step S508).
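The color discrimination in step S507 is not specified in detail by the patent; as a hedged sketch, a minimal classifier over the average RGB of the extracted lamp region might look like this (the thresholds are purely illustrative assumptions):

```python
def classify_signal_color(r: int, g: int, b: int) -> str:
    """Classify the average RGB colour of a lit lamp region.

    Order matters: red is tested before yellow because both have a
    strong red channel. A production system would use a calibrated
    colour space rather than fixed RGB thresholds.
    """
    if r > 180 and g < 120 and b < 120:
        return "red"
    if r > 180 and g > 140 and b < 120:
        return "yellow"
    if g > 150 and r < 120:
        return "green"
    return "unknown"
```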
  • If the display color is determined to be red (step S508: Yes), the control unit 301 determines whether or not this is the first detection of the photographed traffic light 411 (step S509). If it is determined to be the first detection (step S509: Yes), the control unit 301 sets a timer for measuring a predetermined time (step S510), and controls the display unit 303, or the voice generation unit 312, the audio output unit 307, and the speaker 313, to output alarm information indicating that the display color of the traffic light 411 is red, that is, a red signal (step S511).
  • On the other hand, if it is determined that this is not the first detection (step S509: No), the control unit 301 determines whether or not the timer has timed out (step S512); when it is determined that the timer has timed out (step S512: Yes), the process proceeds to step S511 and alarm information is output.
  • If the display color is determined not to be red in step S508 (step S508: No), the control unit 301 determines whether the display color is yellow (step S513). If the display color is determined to be yellow (step S513: Yes), the control unit 301 determines whether or not this is the first detection of the photographed traffic light 411 (step S514).
  • If it is determined that this is the first detection (step S514: Yes), the control unit 301 sets a timer for measuring a predetermined time (step S515), and controls the display unit 303, or the voice generation unit 312, the audio output unit 307, and the speaker 313, to output alarm information indicating that the display color of the traffic light 411 is yellow, that is, a yellow signal (step S516).
  • On the other hand, if it is determined that this is not the first detection (step S514: No), the control unit 301 determines whether or not the timer has timed out (step S517); when it is determined that the timer has timed out (step S517: Yes), the process proceeds to step S516 and alarm information is output.
  • When it is determined that the display color is not yellow (step S513: No), when it is determined in step S512 or step S517 that the timer has not timed out (step S512, step S517: No), and after the alarm information is output in step S511 or step S516, the control unit 301 determines whether the vehicle has passed the traffic light 411, for example based on whether a signal image still exists in the image captured by the camera 309 according to information from the image processing unit 311 (step S518).
  • If it is determined that the vehicle has not passed the traffic light 411 (step S518: No), the process returns to step S501 and repeats the processing from step S501 to step S518. If it is determined that the vehicle has passed the traffic light 411 (step S518: Yes), the control unit 301 clears the memory to initialize the various information temporarily stored in the RAM (step S519), and the process returns to step S501 and loops.
  • The determinations in steps S508 and S513 may be performed in a different order.
  • When the display color of the traffic light 411 determined in step S507 remains red or yellow, the above-described processing is repeated every predetermined time set in the timer in step S510 or step S515. In step S511 and step S516, alarm information such as an alarm sound and an alarm image is output, alerting the driver of the vehicle 401.
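The first-detection and timer pattern of steps S509 to S512 (and S514 to S517) can be sketched as a small throttle object. The 5-second interval and the injectable clock are assumptions for illustration, since the patent only says a "predetermined time":

```python
import time

class AlarmThrottle:
    """Issue an alarm on first detection, then again only after a timeout.

    Mirrors the pattern of steps S509-S512: the first detection sets the
    timer and alarms; later detections alarm only once the timer expires.
    """
    def __init__(self, interval_s: float = 5.0, clock=time.monotonic):
        self.interval_s = interval_s
        self.clock = clock                # injectable for testing
        self._last_alarm = None           # None => next detection is the first

    def should_alarm(self) -> bool:
        now = self.clock()
        if self._last_alarm is None or now - self._last_alarm >= self.interval_s:
            self._last_alarm = now
            return True
        return False

    def reset(self) -> None:
        """Clear state once the traffic light is passed (cf. step S519)."""
        self._last_alarm = None
```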
  • As described above, the shooting direction of the camera 309 can be changed so as to follow the change in the traveling direction of the vehicle 401. For this reason, even when the road 410 is curved, the traffic light 411 on the road 410 can be photographed accurately without falling outside the shooting range of the camera 309, and alarm information based on the photographed image can be output.
  • In addition, a signal image representing the traffic light 411 is extracted by the image processing unit 311 from an image photographed by the camera 309, the display color of the traffic light 411 is determined based on this signal image, and alarm information can be output to, for example, the driver of the vehicle 401. For this reason, the driver's attention can be drawn without fail.
  • In this way, the shooting direction can be made to reliably follow the moving direction of the moving body, an image ahead of the moving body can be captured, and information based on the captured image can be output. Driving can therefore be assisted by informing the driver of information based on the image ahead of the moving body accurately and reliably.
  • In the present embodiment, the case where the driving support device 300 outputs alarm information when the display color of the traffic light 411 is red or yellow has been described as an example. The device may also be configured to discriminate blue (green) and to display information based on the blue (green) signal on the display unit 303 or to output sound from the speaker 313.
  • Furthermore, each part of the vehicle 401 may be controlled by the control unit 301 based on this information so as to improve traveling safety. In this case, for example, when the display color is red, control such as applying the brakes so that the vehicle 401 stops safely within a predetermined distance of the traffic light 411 is conceivable.
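The braking idea above can be quantified with the stopping-distance relation v² = 2ad; this is a generic kinematic illustration, not a control law taken from the patent:

```python
def required_deceleration(speed_mps: float, distance_m: float) -> float:
    """Constant deceleration needed to stop within distance_m.

    From v^2 = 2 * a * d  =>  a = v^2 / (2 * d).
    """
    if distance_m <= 0.0:
        raise ValueError("distance must be positive")
    return speed_mps ** 2 / (2.0 * distance_m)

# 20 m/s (72 km/h) with 50 m left to the stop line:
print(required_deceleration(20.0, 50.0))  # 4.0 (m/s^2)
```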
  • In the present embodiment, the camera 309 is arranged behind the rear-view mirror of the vehicle 401 so as to photograph the front of the vehicle 401, but the arrangement position of the camera 309 is not limited to this, provided it is a position from which the front of the vehicle 401 can be photographed. A camera 309 may also be arranged at a position from which the rear of the vehicle 401 can be photographed, and information based on the rear image of the vehicle 401 may be displayed as appropriate on the display screen of the display unit 303.
  • The imaging control method described in the present embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
  • The program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read from the recording medium by the computer.
  • The program may also be distributed through a transmission medium over a network such as the Internet.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

A driving support device comprising an imaging unit (101), an acquisition unit (102), a determination unit (103), a control unit (104), an extraction unit (105), a discrimination unit (106), and an output unit (107). The imaging unit (101) photographs the area in front of a moving body. The acquisition unit (102) acquires behavior information on the moving body. The determination unit (103) determines whether any change has occurred in the traveling direction of the moving body. If it is determined that a change has occurred in the traveling direction of the moving body, the control unit (104) changes the imaging direction of the imaging unit (101) so as to follow the change in the traveling direction of the moving body. The extraction unit (105) extracts a signal image representing a traffic light from the captured image of the area in front of the moving body. The discrimination unit (106) discriminates the display color of the traffic light based on the extracted signal image. Depending on the result of the discrimination, the output unit (107) outputs alarm information for the driver of the moving body.
PCT/JP2006/318246 2005-09-16 2006-09-14 Dispositif d’assistance de conduite, méthode de commande d’imagerie, programme de commande d’imagerie et support d’enregistrement WO2007032427A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005-270835 2005-09-16
JP2005270835 2005-09-16

Publications (1)

Publication Number Publication Date
WO2007032427A1 true WO2007032427A1 (fr) 2007-03-22

Family

ID=37865011

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/318246 WO2007032427A1 (fr) 2005-09-16 2006-09-14 Dispositif d’assistance de conduite, méthode de commande d’imagerie, programme de commande d’imagerie et support d’enregistrement

Country Status (1)

Country Link
WO (1) WO2007032427A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05342497A (ja) * 1992-06-09 1993-12-24 Mazda Motor Corp 車両の障害物検出装置
JPH1134829A (ja) * 1997-07-17 1999-02-09 Aisin Seiki Co Ltd 進行方向補正装置
JPH11296799A (ja) * 1998-04-09 1999-10-29 Fujitsu Ten Ltd 走行路形状認識装置
JPH11306498A (ja) * 1998-04-16 1999-11-05 Matsushita Electric Ind Co Ltd 車載カメラシステム
JP2001048034A (ja) * 1999-08-10 2001-02-20 Nissan Motor Co Ltd 車線追従装置
JP2002312898A (ja) * 2001-04-10 2002-10-25 Honda Motor Co Ltd 赤外線画像処理装置
JP2004289738A (ja) * 2003-03-25 2004-10-14 Minolta Co Ltd 撮像装置

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103632556B (zh) * 2008-10-08 2015-12-09 丰田自动车株式会社 驾驶辅助装置及方法
JP2016505450A (ja) * 2013-01-29 2016-02-25 ローベルト ボッシュ ゲゼルシャフト ミット ベシュレンクテル ハフツング 車両用のカメラシステム、車両用の車載カメラの画像の画像領域を制御するための方法および装置
CN105432074A (zh) * 2013-01-29 2016-03-23 罗伯特·博世有限公司 用于车辆的摄像机系统、用于控制用于车辆的车辆摄像机的图像的图像区域的方法和设备
JP2014175913A (ja) * 2013-03-11 2014-09-22 Nec Engineering Ltd 撮像装置及び撮像装置の制御方法、制御プログラム
WO2018101063A1 (fr) 2016-11-30 2018-06-07 京セラ株式会社 Caméra et système de surveillance, dispositif de traitement d'image, véhicule et procédé de traitement d'image
US10893214B2 (en) 2016-11-30 2021-01-12 Kyocera Corporation Camera monitoring system, image processing device, vehicle and image processing method
CN108172003A (zh) * 2017-11-30 2018-06-15 广州华夏职业学院 交通信号灯车内提醒系统
CN108172003B (zh) * 2017-11-30 2020-11-27 广州华夏职业学院 交通信号灯车内提醒系统
JPWO2019155569A1 (ja) * 2018-02-08 2020-09-24 三菱電機株式会社 障害物検出装置および障害物検出方法
WO2019155569A1 (fr) * 2018-02-08 2019-08-15 三菱電機株式会社 Dispositif de détection d'obstacle et procédé de détection d'obstacle
US11845482B2 (en) 2018-02-08 2023-12-19 Mitsubishi Electric Corporation Obstacle detection device and obstacle detection method
CN109827182B (zh) * 2018-12-06 2020-09-08 上海金山环境再生能源有限公司 垃圾焚烧发电生产安全监测系统
CN109827182A (zh) * 2018-12-06 2019-05-31 上海金山环境再生能源有限公司 垃圾焚烧发电生产安全监测系统

Similar Documents

Publication Publication Date Title
JP4622928B2 (ja) 車載カメラ制御装置および車載カメラ制御方法。
JP4434224B2 (ja) 走行支援用車載装置
US8600655B2 (en) Road marking recognition system
JP6451844B2 (ja) 車両位置判定装置及び車両位置判定方法
JP4886597B2 (ja) レーン判定装置及びレーン判定方法、並びにそれを用いたナビゲーション装置
JP4421549B2 (ja) 運転支援装置
JP4783430B2 (ja) 駆動制御装置、駆動制御方法、駆動制御プログラムおよび記録媒体
JP4848893B2 (ja) 交差点情報提供システム及び運転支援システム
JP2006209510A (ja) 画像認識装置及び画像認識方法
WO2007032427A1 (fr) Dispositif d’assistance de conduite, méthode de commande d’imagerie, programme de commande d’imagerie et support d’enregistrement
JP2008064687A (ja) 走行情報案内装置
WO2008056780A1 (fr) Dispositif d'assistance à la conduite, procédé d'assistance à la conduite, et programme
JP2006209511A (ja) 画像認識装置及び画像認識方法、並びにそれを用いた位置特定装置、車両制御装置及びナビゲーション装置
JP2010271155A (ja) 現在位置特定装置とその現在位置特定方法
JP2008070955A (ja) 移動体を表示するための表示システム、車載装置、画像送信装置及び表示方法
JP2007015525A (ja) カメラが撮影した前方画像に基づいて、先行車両と自車両との間の接近の危険に対処するための信号を出力する出力装置、および、当該出力装置のためのプログラム
JP2003178397A (ja) 道路表示等検出警告装置
JP2005214857A (ja) ナビゲーション装置、案内画像作成方法
JP2009193507A (ja) 逆走防止システム
JP5156307B2 (ja) 車載カメラシステム
JP2007047886A (ja) 路面標示認識システム
JP2009105741A (ja) 車載周辺監視装置
JP7445040B2 (ja) 通知装置及び通知方法
JP2009294882A (ja) 車両画像記録装置、車両画像記録システム
WO2007135856A1 (fr) Appareil, procédé et programme de commande de prise de vue et support d'enregistrement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06810140

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP