WO2008038369A1 - Drive control device, drive control method, drive control program, and recording medium - Google Patents
- Publication number
- WO2008038369A1 (PCT/JP2006/319328)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- information
- drive
- drive control
- risk index
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096725—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information generates an automatic action on the vehicle control
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096733—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place
- G08G1/09675—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place where a selection from the received information takes place in the vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096791—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is another vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/20—Direction indicator values
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/22—Psychological state; Stress level or workload
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/221—Physiology, e.g. weight, heartbeat, health or special needs
Definitions
- Drive control apparatus, drive control method, drive control program, and recording medium
- The present invention relates to a drive control device, a drive control method, a drive control program, and a recording medium that control a drive unit based on the possibility of a dangerous event occurring for a moving body.
- The use of the present invention is not limited to the drive control device, drive control method, drive control program, and recording medium described above.
- Conventionally, there is a driving information output device that uses a drive unit such as a robot to present information related to car navigation for guiding a vehicle to a destination point and information related to driving.
- Patent Document 1 JP 2004-93354 A
- The drive control device according to the invention of claim 1 includes drive means for driving a sensor mounted on a movable body, information acquisition means for acquiring information about the movable body, calculation means for calculating a risk index indicating the possibility of a dangerous event occurring for the movable body based on the information acquired by the information acquisition means, and control means for controlling the drive means according to the calculation result calculated by the calculation means.
- The drive control method according to the invention of claim 8 includes an information acquisition step of acquiring information about the moving body, and a calculation step of calculating a risk index indicating the possibility of a dangerous event for the moving body based on the information acquired in the information acquisition step.
- The drive control program according to the invention of claim 9 causes a computer to execute the drive control method according to claim 8.
- The recording medium according to the invention of claim 10 is characterized in that the drive control program according to claim 9 is recorded thereon in a computer-readable state.
- FIG. 1 is a block diagram showing a functional configuration of a drive control device according to the present embodiment.
- FIG. 2 is a flowchart showing a drive control processing procedure of the drive control device according to the present embodiment.
- FIG. 3 is a block diagram showing a hardware configuration of a navigation device according to the present embodiment.
- FIG. 4 is a block diagram showing a functional configuration of a drive control device according to the present embodiment.
- FIG. 5 is a flowchart showing an outline of processing of the drive control device according to the present embodiment.
- FIG. 6 is a flowchart showing the contents of processing for acquiring vehicle information and image information.
- FIG. 7 is a flowchart showing the contents of a process for calculating a first attention point.
- FIG. 8 is an explanatory diagram showing the vertical distance of one point on the road in the photographed image.
- FIG. 9 is an explanatory diagram showing the relationship between the distance in the horizontal direction and the distance in the vertical direction from the center point of the photographed image.
- FIG. 10 is an explanatory diagram for explaining points on the road.
- FIG. 11 is a flowchart showing the contents of processing for calculating a control signal.
- FIG. 12 is a flowchart showing the contents of the process for controlling the drive unit.
- FIG. 13 is a flowchart showing the contents of processing for acquiring image information.
- FIG. 14 is a flowchart showing the contents of processing for calculating a risk index.
- FIG. 15 is an explanatory diagram showing image information acquired at time T.
- FIG. 16 is an explanatory diagram showing image information acquired at time (T + ΔT).
- FIG. 17 is an explanatory view showing image information in which white lines are matched and overlaid.
- FIG. 18 is an explanatory diagram showing a motion vector M.
- FIG. 19 is an explanatory diagram showing the magnitude of the risk index D depending on the relationship between the unit vector I and the motion vector M.
- FIG. 20 is a flowchart showing the contents of the process for calculating the second attention point.
- FIG. 21 is a flowchart showing the contents of a process for determining whether or not operation has returned to normal.
- Explanation of symbols
- FIG. 1 is a block diagram showing a functional configuration of the drive control device according to the present embodiment.
- The drive control device 100 includes an information acquisition unit 101, a drive unit 102, a calculation unit 103, a notification unit 104, a determination unit 105, and a control unit 106.
- The information acquisition unit 101 acquires information about a moving body.
- The information related to the moving body is, for example, image information of the surroundings of the moving body taken by a camera mounted on the moving body.
- The information on the moving body may also be information obtained from various sensors.
- The information acquired from the various sensors includes, for example, the speed and acceleration of the moving body, its current position, the volume and direction of sound inside and outside the moving body, temperature, humidity, and air components, the contact or pressure of any object, magnetic information, the distance from the moving body to any object, and user information such as heart rate, brain waves, and breathing.
- The drive unit 102 drives a sensor mounted on the moving body.
- The sensor mounted on the moving body is one of the various sensors described above, for example, an image sensor (camera). The drive unit 102 may also drive the information acquisition unit 101; accordingly, the information acquisition unit 101 and the various sensors may be the same body or separate bodies. The drive unit 102 drives the sensor or the information acquisition unit 101 mounted on the moving body in the yaw direction and the pitch direction. Furthermore, the drive unit 102 may be, for example, a robot imitating the shape of an animal or a person.
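The yaw/pitch drive just described can be sketched as a simple direction computation. The axis convention and function name below are illustrative assumptions, not taken from the patent.

```python
import math

def view_direction(yaw_deg, pitch_deg):
    """Unit vector of the sensor's viewing direction for given yaw and
    pitch angles (degrees).  Convention assumed here: yaw 0 / pitch 0
    looks along +x, positive yaw turns left, positive pitch tilts up."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))
```

With such a mapping, a controller can point the camera at any direction reachable within the drive unit's yaw/pitch range.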
- The calculation unit 103 calculates a risk index that indicates the possibility that a dangerous event will occur for the moving body.
- A dangerous event is, for example, the moving body deviating from its route, the moving body coming into contact with an object, or the state of the user of the moving body differing from normal.
- The risk index is a numerical value that represents the magnitude of the likelihood of these dangerous events.
- The notification unit 104 notifies a passenger of the moving body of the possibility of a dangerous event occurring for the moving body according to the calculation result calculated by the calculation unit 103.
- The notification unit 104 provides notification by a warning sound, voice information, lighting of a light, and the like. Specifically, when notifying by voice information, it announces things that may cause an accident, such as "There is an obstacle on the right side".
- The passenger may also be notified of the possibility of a dangerous event such as "Open the window because the air is bad" or "Please lower the volume of the audio". Further, these notifications may be given by driving the drive unit 102.
- The determination unit 105 determines whether or not the risk index calculated by the calculation unit 103 is greater than a predetermined value.
- The predetermined value may be set by the user or may be changed based on past history. Further, the determination unit 105 may determine whether or not the risk index calculated by the calculation unit 103, after becoming greater than the predetermined value, has returned to a value smaller than the predetermined value.
- The control unit 106 controls the drive unit 102 according to the calculation result calculated by the calculation unit 103. When the determination unit 105 determines that the risk index is greater than the predetermined value, the control unit 106 stops driving by the drive unit 102. Then, when the determination unit 105 determines that the risk index has returned to a value smaller than the predetermined value, the control unit 106 may resume driving by the drive unit 102.
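The stop/resume interaction between the determination unit 105 and the control unit 106 can be sketched as a small state machine. The class name and the threshold value are illustrative assumptions, not figures from the patent.

```python
class DriveController:
    """Sketch of the stop/resume logic: stop driving when the risk index
    exceeds a predetermined value, resume once it falls back below it."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold  # the "predetermined value" (assumed)
        self.driving = True

    def update(self, risk_index):
        if risk_index > self.threshold:
            self.driving = False   # stop driving by the drive unit
        elif not self.driving and risk_index < self.threshold:
            self.driving = True    # risk returned below the value: resume
        return self.driving
```

In practice the threshold could be user-set or adapted from past history, as the text notes.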
- FIG. 2 is a flowchart showing a drive control processing procedure of the drive control device according to the present embodiment.
- First, the information acquisition unit 101 acquires information about the moving body (step S201). The calculation unit 103 then calculates a risk index indicating the possibility of a dangerous event for the moving body (step S202).
- The notification unit 104 notifies the passenger of the possibility of a dangerous event occurring for the moving body (step S203).
- The determination unit 105 determines whether or not the risk index calculated in step S202 is greater than a predetermined value (step S204). If the risk index is greater than the predetermined value (step S204: Yes), the control unit 106 controls the drive unit 102 to stop driving (step S205), and the series of processing ends. On the other hand, if the risk index is not greater than the predetermined value (step S204: No), the process returns to step S201 and the subsequent processing is repeated.
- Although the passenger is notified in step S203 in the above description, notification is not limited to this. Specifically, for example, the process may proceed to step S204 without notifying the passenger. Alternatively, in step S205 it may be determined whether the risk index is within a predetermined range whose upper limit is the predetermined value, and the passenger may be notified when the risk index is within that range.
- Although the driving of the drive unit 102 is stopped in step S205 in the above description, control is not limited to this.
- The driving of the drive unit 102 may be restricted without being stopped.
- The restriction means, for example, narrowing the drive range of the drive unit 102, or performing only blinking of a light or output of sound while the drive unit 102 is stopped.
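Steps S201 to S205 can be sketched as one pass of a control loop. The callables, return labels, and threshold below are placeholders for illustration; the "restricted" branch stands in for the drive-range limitation just described.

```python
def drive_control_step(acquire_info, compute_risk, notify,
                       threshold=0.5, restrict_instead_of_stop=False):
    """One pass of the flowchart of FIG. 2 (sketch with assumed names)."""
    info = acquire_info()            # S201: acquire moving-body information
    risk = compute_risk(info)        # S202: calculate the risk index
    notify(risk)                     # S203: notify the passenger
    if risk > threshold:             # S204: compare with predetermined value
        # S205: stop the drive, or merely restrict it as a variation
        return "restricted" if restrict_instead_of_stop else "stopped"
    return "continue"                # back to S201 in the real loop
```

A real implementation would call this repeatedly from a timer or sensor-event loop.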
- As described above, according to the drive control device 100 of the present embodiment, the calculation unit 103 calculates a risk index indicating the possibility of a dangerous event occurring for the moving body based on the information about the moving body acquired by the information acquisition unit 101, and the control unit 106 can control the drive unit 102 that drives the sensor mounted on the moving body according to the calculated result. Therefore, for example, even if the user is not gazing at the drive unit 102, the drive control device 100 can control the drive unit 102 when it is determined that a dangerous event may occur for the moving body.
- In other words, the drive unit 102 can be controlled during dangerous driving so that it does not disturb the user's driving judgment.
- In addition, according to the calculation result calculated by the calculation unit 103, the passenger of the moving body can be notified of the possibility of a dangerous event occurring for the moving body. Therefore, the drive control device 100 can warn the driver that the vehicle is being driven dangerously when the risk index is within a predetermined range. The user can thus know that driving is in a dangerous state and can avoid dangerous events such as accidents.
- Further, when the determination unit 105 determines that the risk index is greater than the predetermined value, the control unit 106 can stop driving by the drive unit 102. Therefore, by stopping the drive unit 102 in the driver's field of view during dangerous driving, the drive control device 100 can prevent the drive unit 102 from being mistaken for an object (a person or an oncoming vehicle) outside the vehicle. This allows the user to concentrate on driving because there is no extra movement in the field of view.
- Furthermore, the drive unit 102 can drive the information acquisition unit 101. Therefore, a wide range of information can be acquired by having the drive unit 102 drive the information acquisition unit 101, which makes it possible to detect the possibility of dangerous events over a wide range.
- The information acquisition unit 101 can also acquire image information of the surroundings of the moving body. Therefore, an object with a large risk index for the moving body can be identified from changes in the acquired image information, allowing the user to avoid dangerous events such as accidents.
- Moreover, with the drive control device 100 of the present embodiment, when the determination unit 105 determines that the risk index has returned to a value smaller than the predetermined value, driving by the drive unit 102 can be resumed. Therefore, the drive unit 102 can be driven during normal operation; for example, even when the movement of a robot mounted on the vehicle has been stopped, it can move again after the dangerous event has been avoided.
- The drive unit 102 can drive the sensor mounted on the moving body in the yaw and pitch directions. Therefore, the drive control device 100 can acquire information about the moving body over a wide range, which makes it possible to detect the possibility of dangerous events over a wide area.
- FIG. 3 is a block diagram showing the hardware configuration of the navigation device according to the present embodiment.
- The navigation device 300 includes a CPU 301, a ROM 302, a RAM 303, a magnetic disk drive 304, a magnetic disk 305, an optical disk drive 306, an optical disk 307, an audio I/F (interface) 308, a microphone 309, a speaker 310, an input device 311, a video I/F 312, a display 313, a communication I/F 314, a GPS unit 315, various sensors 316, and a camera 317.
- The components 301 to 317 are connected by a bus 320.
- The CPU 301 governs overall control of the navigation device 300.
- The ROM 302 records programs such as a boot program and a data update program.
- The RAM 303 is used as a work area for the CPU 301. That is, the CPU 301 controls the entire navigation device 300 by executing various programs recorded in the ROM 302 while using the RAM 303 as a work area.
- The magnetic disk drive 304 controls reading and writing of data to and from the magnetic disk 305 under the control of the CPU 301.
- The magnetic disk 305 records the data written under the control of the magnetic disk drive 304.
- As the magnetic disk 305, for example, an HD (hard disk) or an FD (flexible disk) can be used.
- The optical disk drive 306 controls reading and writing of data to and from the optical disk 307 under the control of the CPU 301.
- The optical disk 307 is a detachable recording medium from which data is read under the control of the optical disk drive 306.
- A writable recording medium can be used as the optical disk 307.
- Besides the optical disk 307, the detachable recording medium may be an MO, a memory card, or the like.
- Examples of information recorded on the magnetic disk 305 and the optical disk 307 include map data and function data.
- The map data includes background data representing features such as buildings, rivers, and the ground surface, and road shape data representing the shape of roads, and consists of multiple data files divided by district.
- The road shape data further includes traffic condition data.
- The traffic condition data includes, for example, the presence or absence of traffic lights and pedestrian crossings, the presence or absence of expressway entrances/exits and junctions, the length (distance) of each link, road width, direction of travel, and road type (expressway, toll road, general road, etc.).
- The function data is three-dimensional data representing the shapes of facilities on the map, character data representing descriptions of facilities, and various other data other than the map data.
- The map data and function data are recorded in blocks divided by district or function. Specifically, for example, the map data is recorded in a state in which it can be divided into blocks each representing a predetermined district of the map displayed on the display screen. Also, for example, the function data is recorded in a state in which it can be divided into a plurality of blocks each realizing one function.
- The function data is data for realizing functions, such as program data that realizes route search, calculation of required time, route guidance, and the like.
- Each of the map data and the function data is composed of multiple data files divided by district or function.
- The audio I/F 308 is connected to a microphone 309 for audio input and a speaker 310 for audio output.
- Voice received by the microphone 309 is A/D converted by the audio I/F 308.
- The microphone 309 may be installed, for example, near the sun visor of the vehicle, and one or more microphones may be provided.
- The speaker 310 outputs sound obtained by D/A conversion of a given audio signal by the audio I/F 308. Note that sound input from the microphone 309 can be recorded on the magnetic disk 305 or the optical disk 307 as audio data.
- Examples of the input device 311 include a remote controller, a keyboard, and a touch panel provided with a plurality of keys for inputting characters, numerical values, various instructions, and the like.
- The input device 311 may be realized by any one of the remote controller, keyboard, and touch panel forms.
- The input device 311 may also be realized by a plurality of these forms.
- The video I/F 312 is connected to the display 313.
- The video I/F 312 includes, for example, a graphic controller that controls the entire display 313, a buffer memory such as a VRAM (Video RAM) that temporarily records immediately displayable image information, and a control IC that controls the display 313 based on image data output from the graphic controller.
- The display 313 displays icons, cursors, menus, windows, and various data such as characters and images.
- On the display 313, the above-described map data is drawn in two or three dimensions.
- The map data displayed on the display 313 can be displayed together with a mark representing the current position of the vehicle on which the navigation device 300 is mounted.
- The current position of the vehicle is calculated by the CPU 301.
- As the display 313, for example, a CRT, a TFT liquid crystal display, a plasma display, or the like can be used.
- The display 313 is installed, for example, near the dashboard of the vehicle.
- A plurality of displays 313 may also be installed in the vehicle, for example, near the dashboard and around the rear seats.
- The communication I/F 314 is wirelessly connected to a network and functions as an interface between the navigation device 300 and the CPU 301.
- The communication I/F 314 is also wirelessly connected to a communication network such as the Internet and functions as an interface between the communication network and the CPU 301.
- Communication networks include LANs, WANs, public line networks, mobile phone networks, and the like.
- The communication I/F 314 is composed of, for example, an FM tuner, a VICS (Vehicle Information and Communication System)/beacon receiver, a wireless communication device, and other navigation devices, and acquires road traffic information such as traffic regulations. VICS is a registered trademark.
- The GPS unit 315 receives radio waves from GPS satellites and outputs information indicating the current position of the vehicle.
- The output information of the GPS unit 315 is used by the CPU 301, together with the output values of the various sensors 316 described later, to calculate the current position of the vehicle.
- The information indicating the current position is information that identifies one point on the map data, such as latitude, longitude, and altitude.
- The various sensors 316, such as a vehicle speed sensor, an acceleration sensor, and an angular velocity sensor, output information for determining the position and behavior of the vehicle.
- The output values of the various sensors 316 are used by the CPU 301 to calculate the current position of the vehicle and the amounts of change in speed and direction.
- The various sensors 316 also output information for determining the state inside the vehicle and information for determining the state of the driver.
- The information for determining the state inside the vehicle includes the temperature, humidity, amount of smoke, and air components inside the vehicle.
- The information for determining the driver's state is specifically information such as the driver's heartbeat, brain waves, and breathing.
- The camera 317 captures images inside or outside the vehicle.
- The images can be either still images or moving images.
- The camera 317 captures, for example, the behavior of passengers inside the vehicle, and the captured video is output via the video I/F 312 to a recording medium such as the magnetic disk 305 or the optical disk 307.
- The camera 317 also captures the situation outside the vehicle, and that video is likewise output via the video I/F 312 to a recording medium such as the magnetic disk 305 or the optical disk 307.
- The camera 317 has an infrared camera function, and the surface temperature distributions of objects inside the vehicle can be compared relative to one another based on video captured using the infrared camera function.
- The video output to the recording medium is overwritten and saved.
- The information acquisition unit 101, drive unit 102, calculation unit 103, notification unit 104, determination unit 105, and control unit 106 of the drive control device 100 illustrated in FIG. 1 realize their functions in the navigation device 300 illustrated in FIG. 3 by the CPU 301 executing a predetermined program using the programs and data recorded in the ROM 302, RAM 303, magnetic disk 305, optical disk 307, and the like, and thereby controlling each part of the navigation device 300.
- That is, the navigation device 300 of this embodiment can execute the function of the drive control device 100 shown in FIG. 1 according to the drive control processing procedure shown in FIG. 2 by executing the drive control program recorded in the ROM 302, which serves as a recording medium in the navigation device 300.
- FIG. 4 is a block diagram showing a functional configuration of a drive control device according to the present embodiment.
- The drive control device 400 includes a drive unit 401, a control unit 402, a storage unit 403, an information input unit 404, an information output unit 405, a vehicle information I/F 406, an external device I/F 407, an image processing unit 408, and a sensor unit 410.
- The drive control device 400 may be the same body as the navigation device 300 or a separate body, and the two may be configured to share some functions.
- The drive unit 401 is controlled by the control unit 402 and can be driven with multiple degrees of freedom, such as the yaw and pitch directions and the accompanying roll direction.
- The drive unit 401 may be equipped with one or more of the sensors of the sensor unit 410 described later. Further, the drive unit 401 may be installed at a position where it can enter the field of view of the driver and acquire information around the vehicle. For example, when the drive unit 401 carries the image sensor unit 411, it is installed on top of the dashboard or around the rearview mirror.
- The drive unit 401 may also be a robot imitating the appearance of an animal or a person.
- The drive unit 401 may include all or some of the components described later.
- For example, an information output unit 405 may be provided to present information to the user by displaying images or outputting sound.
- The drive unit 401 can also present information to the user through its own movement, for example by raising an arm or shaking its head.
- The drive unit 401 can also change the imaging direction of a camera.
- The viewing angle of a typical digital camera or movie camera is about 40 degrees horizontally and 30 degrees vertically. Therefore, the camera can image a wide range when the drive unit 401 changes its viewing direction.
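Given the roughly 40-degree horizontal viewing angle quoted above, the gain from panning can be estimated with a one-line calculation. The pan limits used in the example are assumed values, not figures from the patent.

```python
def pan_coverage_deg(fov_deg, pan_min_deg, pan_max_deg):
    """Total horizontal angle observable when the drive unit sweeps the
    camera between the two pan limits (degrees), capped at a full circle."""
    return min(360.0, (pan_max_deg - pan_min_deg) + fov_deg)
```

For example, a 40-degree camera swept over an assumed ±60-degree yaw range covers 160 degrees, four times its fixed field of view.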
- The control unit 402 controls the drive unit 401. Specifically, it outputs a control signal for controlling the drive unit 401 to the drive unit 401: for example, a control signal for rotating the drive unit 401 in the yaw or pitch direction, or a control signal for switching ON/OFF the various sensors of the sensor unit 410 mounted on the drive unit 401. Further, when the drive unit 401 is a robot imitating an animal or a person, the control unit 402 can simultaneously control a plurality of outputs such as light output, sound output, and motion output.
- The storage unit 403 stores various types of information, for example map information.
- The map information includes road network data composed of nodes and links, and image data drawn using features relating to facilities, roads, and other terrain (mountains, rivers, land). The map data may include text information, information such as facility names and addresses, and images of roads and facilities.
- The map information may also include the types and installation positions of signs and signboards, the positions of traffic lights, the extent of school zones, and the like.
- The storage unit 403 also stores vehicle information, image information, the coordinate values of each point, the risk index, and the like acquired via the information input unit 404, the vehicle information I/F 406, the external device I/F 407, and the sensor unit 410.
- The information input unit 404 inputs various types of information.
- Specific examples include a touch panel, a remote controller, a mouse, a touch sensor, a microphone, and a camera; these functions may also be performed by a touch sensor unit, the sound sensor unit 415, the image sensor unit 411, or the like.
- The information output unit 405 outputs various types of information; specific examples include a display screen for displaying images and an output device for outputting sound.
- The vehicle information I/F 406 acquires vehicle information relating to vehicle speed and vehicle operation.
- The vehicle information is, for example, information on the operation of the blinkers, hazard lights, steering wheel angle, lights, wipers, and the like, and information on vehicle speed from the acceleration sensor unit 413 or the like.
- The vehicle information I/F 406 may also acquire information about the inside of the vehicle, for example information on the behavior of the driver or on conditions such as the temperature, humidity, and air components inside the vehicle.
- The external device I/F 407 functions as an interface with external devices and is connected to various external devices.
- Specifically, for example, it specifies the position of the vehicle from information from GPS satellites,
- refers to map information in a map information database,
- or connects to the navigation device 300 and sets a route to a destination point.
- The external device I/F 407 may also realize these functions through, for example, the GPS sensor unit 414 and the storage unit 403.
- the image processing unit 408 performs image processing using information stored in the storage unit 403, image information acquired from the image sensor unit 411, and the like. Specifically, for example, it calculates the direction and speed of movement of objects and recognizes road signs and signboards.
- the sensor unit 410 includes various sensor units. Specifically, for example, there are sensors that acquire information on the state of the vehicle and sensors that acquire information on the state of the driver.
- the image sensor unit 411 acquires image information. Specifically, for example, an image inside or outside the vehicle is captured by a CCD camera or the like and acquired as image information.
- the drive unit position detection unit 412 detects the position or rotation of the drive unit 401. Specifically, for example, the drive angle of the drive unit 401 in the horizontal direction and in the pitch direction is detected.
- the acceleration sensor unit 413 detects the acceleration of the vehicle by a gyro or the like.
- as for the acceleration of the vehicle, for example, a change in the vehicle speed may be detected from the acceleration in the front-rear direction, and vehicle sway may be detected from the acceleration in the left-right direction.
- the GPS sensor unit 414 detects the current position of the vehicle based on radio waves from GPS satellites.
- the sound sensor unit 415 obtains the volume of sound inside or outside the vehicle, the direction in which the sound is generated, and the like using a microphone or the like.
- Temperature sensor unit 416 measures the temperature inside or outside the vehicle.
- the humidity sensor unit 417 measures the humidity inside or outside the vehicle.
- the illuminance sensor unit 418 measures the intensity of light inside or outside the vehicle. Specifically, for example, it detects whether or not the vehicle has entered a tunnel and whether or not the sun is out. Further, the illuminance sensor unit 418 may detect the amount of ultraviolet rays in sunlight.
- the smoke sensor unit 419 detects smoke inside or outside the vehicle. Specifically, for example, cigarette smoke is detected.
- the air sensor unit 420 measures air components. Specifically, for example, carbon monoxide concentration and impurity concentration inside the vehicle are measured.
- the ultrasonic sensor unit 421 measures the distance to an object around the vehicle. Specifically, the time until an ultrasonic wave emitted from the ultrasonic sensor unit 421 mounted on the vehicle returns to the vehicle is acquired, and the distance from the vehicle to the measurement target is measured from that time.
- the microwave sensor unit 422 measures the distance to an object around the vehicle. Specifically, the distance between the vehicle and the object is measured using microwaves.
- the laser sensor unit 423 measures the distance to an object around the vehicle. Specifically, the distance from the vehicle to the target is measured using a laser.
- the infrared sensor unit 424 acquires image information using infrared rays.
- the touch sensor unit 425 determines whether or not an object is in contact with a target part inside or outside the vehicle.
- the pressure sensor unit 426 measures the air pressure inside the vehicle and the force applied to each part.
- the biometric sensor unit 427 acquires information such as a user's heartbeat, brain waves, and respiration.
- the magnetic sensor unit 428 measures magnetism.
- FIG. 5 is a flowchart showing an outline of the processing of the drive control apparatus according to this embodiment.
- the drive angle of the drive unit 401 is initialized (step S501). Specifically, for example, the angle of the drive unit 401 in the horizontal direction and the angle in the pitch direction are each set to 0°.
- step S502 vehicle information and image information are acquired (step S502).
- next, based on the vehicle information acquired in step S502, it is determined whether or not the turn signal is in use (step S503). If the turn signal is in use (step S503: Yes), the first attention point is calculated (step S504). On the other hand, if the turn signal is not in use in step S503 (step S503: No), the process returns to step S502, and the subsequent processing is repeated.
- the first attention point is the attention point at time T.
- the attention point is a point on the road that serves as a reference when calculating the risk index of the vehicle.
- the control signal to the drive unit 401 is calculated using the first attention point calculated in step S504 (step S505).
- the control signal is an electrical signal for controlling the drive unit 401. Then, it is determined whether or not the first attention point calculated in step S504 will fall within the photographable range when the control signal calculated in step S505 is input to the drive unit 401 (step S506).
- the photographable range is the range covered by the camera's viewing angle.
- if the first attention point falls within the photographable range in step S506 (step S506: Yes), the drive unit 401 is controlled in accordance with the control signal calculated in step S505 (step S507).
- if the first attention point does not fall within the photographable range (step S506: No), the process returns to step S501 and the subsequent processing is repeated.
- image information is acquired by the camera mounted on the drive unit 401 controlled in step S507 (step S508), the risk index is calculated based on this image information and the vehicle information acquired in step S502 (step S509), and the second attention point is calculated (step S510).
- the second attention point is the attention point at time (T + Δt).
- next, if the risk index is greater than or equal to threshold A (step S511: Yes), it is determined whether or not the risk index is greater than or equal to threshold B (step S512).
- here, threshold A is smaller than threshold B.
- if the risk index is greater than or equal to threshold B (step S512: Yes), the process waits until normal operation is restored (step S513: No loop); once normal operation is restored (step S513: Yes), it is determined whether or not to continue control of the drive unit 401 (step S514). If control is not to be continued (step S514: No), the series of processing ends.
- if the risk index is less than threshold A in step S511 (step S511: No), the process returns to step S505, and the subsequent processing is repeated.
- if the risk index is less than threshold B in step S512 (step S512: No), the driver is warned (step S515), the process returns to step S505, and the subsequent processing is repeated.
- W is a range in the horizontal direction
- H is a range in the height direction.
- the drive angle range of the drive unit 401 is -90° to +90° in the horizontal direction, where the straight-ahead direction is 0° and the rightward direction is positive, and -45° to +45° in the pitch direction, where the direction parallel to the road is 0° and the downward direction is positive.
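As a small illustration, the drive angle ranges described above can be enforced with a simple clamping helper. This is a sketch only; the function names are introduced here and do not appear in the patent.

```python
# Sketch of the drive range described above: horizontal angle limited to
# -90..+90 deg (straight ahead = 0, right positive) and pitch limited to
# -45..+45 deg (road-parallel = 0, down positive). clamp() and
# clamp_drive_angles() are illustrative names introduced for this sketch.

def clamp(value, lo, hi):
    """Limit value to the closed interval [lo, hi]."""
    return max(lo, min(hi, value))

def clamp_drive_angles(horizontal_deg, pitch_deg):
    """Return the commanded angles restricted to the drive unit's range."""
    return clamp(horizontal_deg, -90.0, 90.0), clamp(pitch_deg, -45.0, 45.0)
```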
- the control unit 402 sets the angle of the drive unit 401 in the horizontal direction to 0° and the angle in the pitch direction to 0°.
- [0077] (Contents of processing for acquiring vehicle information and image information)
- FIG. 6 is a flowchart showing the contents of processing for acquiring vehicle information and image information.
- vehicle information and image information are acquired (step S601).
- the vehicle information is information acquired by the vehicle information I/F 406, specifically, information on the turn signals, information on the vehicle speed, and the like.
- the image information is information acquired by the image sensor unit 411, and specifically, information related to an image around the vehicle captured by a camera or the like. Then, the vehicle information and the image information acquired in step S601 are stored in the storage unit 403 (step S602), and the series of processes is terminated.
- FIG. 7 is a flowchart showing the contents of the process of calculating the first attention point.
- vehicle information and image information stored in the storage unit 403 are read out in step S602 of FIG. 6 (step S701).
- the image information read in step S701 is assumed to have been captured by a camera installed at a height of 1 m above the road with a pitch angle of 0°.
- FIG. 8 is an explanatory diagram showing the vertical distance of one point on the road in the captured image.
- the distance from the camera to the lowest visible point of the road in the image is 1 × (1/tan 15°).
- the distance from the camera to the road point at the center of the lower half of the vertically divided image is calculated as 1 × (1/tan 15°) × 2.
- Fig. 9 is an explanatory diagram showing the relationship between the horizontal distance and the vertical distance from the center point of the photographed image.
- the distance D from the camera to the point at coordinates (x, y) in the image is
- D = 1 × (1/tan 15°) × (240/y) ... [Equation 1].
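The distance formula above (Equation 1) can be sketched in code as follows, assuming, as the text describes, a camera 1 m above the road at 0° pitch, a 15° half viewing angle, and a 480-pixel-high image with y measured in pixels downward from the image center. All names here are illustrative; the patent gives only the formula.

```python
import math

# Sketch of Equation 1: distance along the road to the point imaged
# y_px below the image centre, for a camera 1 m above the road at 0 deg
# pitch with a 15 deg half viewing angle and a 480-pixel-high image
# (240 px from centre to bottom edge). Illustrative names only.

CAMERA_HEIGHT_M = 1.0
HALF_ANGLE_DEG = 15.0
HALF_IMAGE_HEIGHT_PX = 240

def vertical_distance_m(y_px):
    """D = 1 x (1/tan 15 deg) x (240 / y), as in Equation 1."""
    return (CAMERA_HEIGHT_M
            * (1.0 / math.tan(math.radians(HALF_ANGLE_DEG)))
            * (HALF_IMAGE_HEIGHT_PX / y_px))
```

The bottom edge of the image (y = 240) gives 1/tan 15° ≈ 3.73 m, and the center of the lower half (y = 120) gives exactly twice that, matching the two distances described above.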
- based on the vehicle information and the image information read out in step S701, a road edge break in the turning direction is detected (step S702). Specifically, the turning direction is determined from the turn-signal information in the vehicle information, and the road edge break is detected by analyzing the image information.
- FIG. 10 is an explanatory diagram for explaining points on the road.
- Figure 10 shows the point at the road edge break in the turning direction far from the vehicle (point A), the point at the road edge break in the turning direction near the vehicle (point B), the reference point of the road to turn into (point C), and the first attention point (point D). Points A, B, C, and D all lie on the road.
- in step S702, two points, point A and point B, are detected as the road edge breaks.
- a length L between road edge breaks is calculated (step S703).
- the length L between the road edge breaks is the length from point A to point B detected in step S702.
- a reference point (point C) of the road ahead is detected (step S704).
- point C is the point on the straight line connecting points A and B at which the distance AC is 1/4 of the distance AB.
- point C is the center point of the road along which the vehicle passes after making the turn indicated by the turn signal.
- a first point of interest (point D) is determined (step S705).
- point D is a point offset from point C, perpendicular to the camera optical axis, by the same distance as AB.
- the coordinate value of the first target point determined in step S705 is stored in the storage unit 403 (step S706), and the series of processes is terminated.
- the coordinate value of the point D is obtained as follows.
- Pd = (253, 45).
- FIG. 11 is a flowchart showing the contents of the process for calculating the control signal.
- first, the coordinate value of the first attention point stored in the storage unit 403 in step S706 of FIG. 7 is read (step S1101).
- next, the rotation angle is calculated from the coordinate value of the first attention point read in step S1101 (step S1102).
- in step S1102, the rotation angle is calculated with the coordinates of point D taken as Pd.
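The patent does not spell out how the rotation angle is derived from Pd. One conventional way, sketched here under a pinhole-camera assumption, converts the pixel offset from the image center into horizontal and pitch angles; the 640 × 480 image size and the 60° horizontal field of view are assumptions of this sketch, not values given in the patent.

```python
import math

# Hedged sketch of step S1102: converting the first attention point's
# pixel coordinates Pd into horizontal/pitch rotation angles for the
# drive unit 401. A pinhole camera, a 640x480 image, and a 60 deg
# horizontal field of view are assumptions of this sketch.

IMAGE_W, IMAGE_H = 640, 480
HFOV_DEG = 60.0

def rotation_angles(x_px, y_px):
    """Return (horizontal_deg, pitch_deg) that would centre the camera on (x_px, y_px)."""
    focal_px = (IMAGE_W / 2) / math.tan(math.radians(HFOV_DEG / 2))
    horizontal = math.degrees(math.atan((x_px - IMAGE_W / 2) / focal_px))
    pitch = math.degrees(math.atan((y_px - IMAGE_H / 2) / focal_px))  # downward positive
    return horizontal, pitch
```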
- FIG. 12 is a flowchart showing the contents of the process for controlling the drive unit.
- the control signal stored in the storage unit 403 in step S1103 of FIG. 11 is read (step S1201).
- the drive unit 401 is controlled according to the control signal read in step S1201 (step S1202), and the series of processing ends.
- control of the drive unit 401 is, for example, control that rotates the camera to an angle at which point D described in FIG. 7 to FIG. 10 appears in the image photographed by the camera. Specifically, by rotating the camera by the rotation angle in the horizontal direction and the rotation angle in the pitch direction calculated in FIG. 11 from the coordinate value of point D, point D comes to overlap the center point of the image, so point D appears in the image.
- FIG. 13 is a flowchart showing the contents of the process for acquiring image information.
- image information at time T is acquired by the camera controlled in step S1202 of FIG. 12 (step S1301). The center point of this image information therefore coincides with point D.
- an attention point template covering a range of 10 × 10 pixels centered on the center point of the image information acquired in step S1301 is acquired (step S1302). Then, the attention point template and the image information at time T are stored in the storage unit 403 (step S1303). Next, image information Δt after the acquisition in step S1301, that is, image information at time (T + Δt), is acquired (step S1304). Then, the image information at time (T + Δt) is stored in the storage unit 403 (step S1305), and the series of processing ends.
- FIG. 14 is a flowchart showing the contents of the process for calculating the risk index.
- FIG. 15 is an explanatory diagram showing the image information acquired at time T.
- FIG. 16 is an explanatory diagram showing the image information acquired at time (T + Δt).
- image information at time T is read from the storage unit 403 (step S1401).
- the image information at time T is the image information stored in step S1303 in FIG. 13; specifically, for example, the image information shown in FIG. 15. In FIG. 15, the road 1501, the animal 1502, and the person 1503 are imaged.
- image information at time (T + Δt) is read from the storage unit 403 (step S1402).
- the image information at time (T + Δt) is the image information stored in step S1305 in FIG. 13; specifically, for example, the image information shown in FIG. 16. In FIG. 16, the road 1601, the animal 1602, and the person 1603 are imaged.
- the image information at time T read in step S1401 is superimposed on the image information at time (T + Δt) read in step S1402 by matching white lines (step S1403).
- in white line matching, for example, pedestrian crossings, stop lines, white lines between lanes, and white lines at the edges of lanes are superimposed.
- FIG. 17 is an explanatory diagram showing image information in which white lines are matched and overlaid.
- the road 1501 in FIG. 15 and the road 1601 in FIG. 16 are overlapped by matching the white lines. This makes it possible to represent the displacement during Δt of the objects imaged other than the road. Specifically, for example, the positions of the animal 1502 and the person 1503 at time T and the positions of the animal 1602 and the person 1603 at time (T + Δt) can be shown simultaneously.
- next, a place where the difference between the two sets of image information is large is detected as a moving object (step S1404). Specifically, the animal 1502 (or the animal 1602) and the person 1503 (or the person 1603) are detected as moving objects.
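The difference-based detection of steps S1403 and S1404 can be sketched as follows. Images are represented as plain 2-D lists of grey values, and the threshold value is an illustrative choice of this sketch, not a value from the patent.

```python
# Toy sketch of steps S1403-S1404: after the two frames have been
# aligned on the white lines, pixels where the absolute difference
# between the frame at time T and the frame at time T+dt exceeds a
# threshold are treated as belonging to moving objects.

def moving_object_mask(frame_t, frame_t_dt, threshold=30):
    """Return a boolean mask that is True where the two frames differ strongly."""
    return [[abs(a - b) > threshold for a, b in zip(row_t, row_dt)]
            for row_t, row_dt in zip(frame_t, frame_t_dt)]
```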
- next, feature points are extracted within the moving object region (step S1405), and a template near the feature points is acquired (step S1406).
- the template acquired in step S1406 is matched against the image information at time (T + Δt) to calculate a motion vector M (step S1407). Specifically, for example, the motion vector M1 and the motion vector M2 shown in FIG. 18 are calculated.
- FIG. 18 is an explanatory diagram showing the motion vector M.
- the roads 1501 and 1601, the animals 1502 and 1602, the people 1503 and 1603, and the vanishing point 1801 are shown.
- the motion vector M indicates the distance and direction moved during Δt. Accordingly, the motion vector of the animal 1502 moving to the position of the animal 1602 during Δt is M1, and the motion vector of the person 1503 moving to the position of the person 1603 during Δt is M2.
- the risk index D is calculated from the motion vector M calculated in step S1407 and the vehicle speed V (step S1408).
- FIG. 19 is an explanatory diagram showing the magnitude of the risk index D based on the relationship between the unit vector I and the motion vector M.
- the road 1601, the animal 1602, the person 1603, the animal's motion vector M1, the person's motion vector M2, the vanishing point 1801, the attention point 1901, and the unit vector I are shown.
- since the person's motion vector M2 is directed toward the attention point 1901, the risk index D is assumed to be large.
- since the animal's motion vector M1 is parallel to the unit vector I, the risk index D is small. Then, the risk index D calculated in step S1408 is stored (step S1409), and the series of processing ends.
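The text states only that D is small when a motion vector M is parallel to the unit vector I and large when the object is heading across the vehicle's path. One simple formula with that property, scaling the component of M perpendicular to I by the vehicle speed V, is sketched below; it is an illustrative choice for step S1408, not the patented formula.

```python
# Hedged sketch of step S1408: a risk index D that is zero when the
# motion vector M is parallel to the unit vector I (object moving along
# the road) and grows with vehicle speed V and with the component of M
# perpendicular to I (object cutting across the path). Illustrative only.

def risk_index(motion_vec, road_unit_vec, vehicle_speed):
    mx, my = motion_vec
    ix, iy = road_unit_vec
    perpendicular = abs(mx * iy - my * ix)  # magnitude of the 2-D cross product
    return vehicle_speed * perpendicular
```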
- FIG. 20 is a flowchart showing the contents of the process of calculating the second attention point.
- first, the attention point template at time T acquired in step S1302 of FIG. 13 is read (step S2001).
- next, the image information at time (T + Δt) stored in step S1305 of FIG. 13 is read out, and the attention point template at time T is matched against it to acquire the attention point template at time (T + Δt) (step S2002). Then, with the center coordinates of the attention point template at time (T + Δt) acquired in step S2002 taken as the second attention point, the coordinate value of the second attention point is stored in the storage unit 403 (step S2003), and the series of processing ends.
- suppose, for example, that threshold A in step S511 of FIG. 5 is 20 and threshold B in step S512 is 40.
- if the risk index is less than 20, the process returns to step S505 without warning the driver and calculates the control signal.
- if the risk index is 20 or more but less than 40, the process proceeds to step S515 to warn the driver, and then returns to step S505 to calculate the control signal.
- if the risk index is 40 or more, the process proceeds to step S513 without warning the driver and waits until normal operation is restored.
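With the example thresholds A = 20 and B = 40 given in the text, the branching of steps S511 through S515 can be summarized as:

```python
# Sketch of the branching in steps S511-S515 of FIG. 5, using the
# example thresholds A = 20 and B = 40 from the text. The returned
# string names the next action; the real device loops back into the
# flowchart rather than returning strings.

THRESHOLD_A = 20
THRESHOLD_B = 40

def decide_action(risk_index):
    if risk_index < THRESHOLD_A:
        return "recalculate"            # S511: No -> back to S505, no warning
    if risk_index < THRESHOLD_B:
        return "warn_driver"            # S512: No -> S515, then back to S505
    return "wait_for_normal_operation"  # S512: Yes -> S513
```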
- FIG. 21 is a flowchart showing the contents of the process for determining whether or not the normal operation is restored.
- first, it is determined whether or not the vehicle speed is 0 (step S2101). If the vehicle speed is 0 (step S2101: Yes), it is determined that the vehicle has returned to normal operation (step S2102), and the series of processing ends.
- if the vehicle speed is not 0 in step S2101 (step S2101: No), it is determined whether or not the steering angle has been 0 for a predetermined time or more (step S2103). If the steering angle has been 0 for a predetermined time or more (step S2103: Yes), the process proceeds to step S2102, it is determined that normal operation has been restored, and the series of processing ends. If the steering angle has not been 0 for the predetermined time (step S2103: No), it is determined that the vehicle is being driven dangerously (step S2104), the process returns to step S2101, and the subsequent processing is repeated.
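The return-to-normal check of FIG. 21 can be sketched as below. The sample count standing in for "a predetermined time or more" is an assumption of this sketch, as is the use of a list of recent steering-angle samples.

```python
# Minimal sketch of FIG. 21 (steps S2101-S2104): normal operation is
# deemed restored when the vehicle speed is 0, or when the steering
# angle has stayed at 0 for a run of recent samples standing in for
# "a predetermined time or more". The sample count is illustrative.

def returned_to_normal(speed, steering_angle_history, required_samples=5):
    if speed == 0:
        return True  # S2101: Yes -> S2102
    if len(steering_angle_history) < required_samples:
        return False  # not enough history to satisfy S2103 yet
    return all(a == 0 for a in steering_angle_history[-required_samples:])
```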
- next, the processing for warning the driver in step S515 of FIG. 5 will be described in detail. If it is determined in step S2104 of FIG. 21 that the vehicle is being driven dangerously, the drive control device 400 warns the driver.
- the warning to the driver is given, for example, by an image or sound output from the information output unit 405 or by a movement of the drive unit 401. A combination of these may also be used to warn the driver of dangerous driving.
- as described above, when the risk index is less than threshold A (step S511: No in FIG. 5), the process returns to step S505; when the risk index is less than threshold B (step S512: No), the process proceeds to step S515, warns the driver, and then returns to step S505.
- the driving unit 401 may be controlled to track the moving object as it is.
- the risk may be further avoided by calculating the risk index specifically for these moving objects.
- in the above description, the risk index is calculated based on the positional relationship of objects using the image information, but the present invention is not limited to this. For example, the risk index may be made larger than during normal driving when turning left or right, changing lanes, looking at signs or signboards, passing through accident-prone areas, passing through narrow roads, passing through branch points such as those on expressways, and so on.
- when turning left or right or changing lanes, a risk index is calculated based on the turn-signal information.
- specifically, operation of the turn signal is detected, and the risk index is set to a value greater than that during normal driving.
- when turning right, the risk index is made larger than when turning left because the vehicle crosses the opposite lane. Therefore, when the turn-signal information indicates a right turn, the risk index is set to a value greater than that for a left turn.
- map information is used to calculate the risk index when a sign or signboard should be viewed, when passing through an accident-prone area, or when passing through a branch point such as on an expressway. For example, when the current position of the vehicle in the map information approaches a school zone, an important sign or signboard on the route to the destination point, a point where accidents occur frequently, or a branch point such as on an expressway, the risk index is set to a value greater than that during normal driving.
- when passing through a narrow road or changing lanes, the risk index is calculated from road conditions and other vehicle information.
- specifically, the road width is calculated and a lane change is detected from information from the vehicle information I/F 406, the external device I/F 407, and the image sensor unit 411,
- and the risk index is made larger than that during normal driving.
- when calculating the risk index according to other external situations, for example when a horn is sounding outside, the sound sensor unit 415 acquires the sound, the driving situation is judged to be high-stress, and the risk index is made larger than that during normal driving.
- as another example of calculating the risk index according to the external situation, bad weather may be judged from weather information acquired from mobile devices via the external device I/F 407, from image processing by the image sensor unit 411, or from sounds such as rain acquired by the sound sensor unit 415, and the risk index may be set to a value larger than that during normal driving.
- the risk index may be calculated based on changes in the driver's condition. For example, if it is determined from information obtained from the image sensor unit 411, the acceleration sensor unit 413, the biometric sensor unit 427, and the like that the driver feels drowsy, the risk index is set larger than that during normal driving. Further, the threshold A and threshold B in steps S511 and S512 of FIG. 5 may be reduced. Furthermore, the imaging direction of the image sensor unit 411 may be directed toward the driver depending on the determined degree of sleepiness. The driver's condition can then be determined in detail from the captured image information and the risk index calculated.
- more specifically, in calculating the risk index from the change in the driver's condition, it is determined from the image information outside the vehicle captured by the image sensor unit 411 whether or not the vehicle has deviated from the lane, and when it is determined to be dangerous, the risk index is made larger than that during normal driving. In addition, the number and length of the driver's blinks are detected from the image information inside the vehicle captured by the image sensor unit 411, and the risk index is made larger than that during normal driving according to the detected result.
- more specifically, vehicle sway may be detected by the acceleration sensor unit 413, the driver's drowsiness may be detected from this sway, and the risk index may be made larger than that during normal driving.
- more specifically, the driver's pulse, respiration, brain waves, and the like may be detected by the biometric sensor unit 427, and if it is determined that the driver is drowsy, highly stressed, or overly tense, the risk index may be made higher than during normal driving.
- the risk index may also be calculated depending on the vehicle information acquired by the vehicle information I/F 406. Specifically, for example, when the turn signal or wiper is operated, when the hazard lamps are lit, or when the lights are lit, the risk index is set larger than that during normal driving. In addition, when the vehicle speed is higher than the speed limit obtained from map information or signboard/sign recognition, the risk index may be made larger than that during normal driving, and when the steering angle deviates by a predetermined value or more from that expected from the map information, the risk index may be made higher than during normal driving.
- the risk index may be made larger than that during normal driving according to the distance from the preceding vehicle.
- the distance from the preceding vehicle is obtained by a sensor that measures the distance to an object, such as the image sensor unit 411, the ultrasonic sensor unit 421, or the infrared sensor unit 424; if the inter-vehicle distance is less than a predetermined distance, or if the relative speed with the preceding vehicle is greater than or equal to a predetermined value, the risk index may be made larger than during normal driving.
- also, when the image sensor unit 411 or the like detects that the preceding vehicle has turned on its hazard lamps, the risk index is increased from that during normal driving.
- as described above, according to the navigation device 300 or the drive control device 400 of the present embodiment, the calculation unit 103 calculates a risk index indicating the possibility that a dangerous event will occur, based on the information about the moving body acquired by the information acquisition unit 101, and the control unit 106 can control the drive unit 102, which drives the sensor mounted on the moving body, according to the calculated result. Accordingly, even when the user is not gazing at the drive unit 102, the navigation device 300 or the drive control device 400 can control the drive unit 102 if it determines that a dangerous event may occur to the moving body. Thus, the drive unit 102 can be controlled during dangerous driving so that it does not hinder the user's driving judgment.
- further, according to the navigation device 300 or the drive control device 400 of the present embodiment, the possibility that a dangerous event will occur to the moving body can be notified according to the calculation result of the calculation unit 103. Therefore, the navigation device 300 or the drive control device 400 can warn the driver that the vehicle is being driven dangerously when the risk index is within a predetermined range. As a result, the user can know that driving is in a dangerous state and can avoid dangerous events such as accidents. [0122] Further, according to the navigation device 300 or the drive control device 400 of the embodiment, when the determination unit 105 determines that the risk index is larger than a predetermined value, the control unit 106 can stop the driving by the drive unit 102.
- therefore, the navigation device 300 or the drive control device 400 stops the drive unit 102 within the driver's field of view during dangerous driving, which prevents the drive unit 102 from being mistaken for an object outside the vehicle (such as a person or an oncoming vehicle). This allows the user to concentrate on driving because there is no extra movement in the field of view.
- the drive unit 102 can drive the information acquisition unit 101. Therefore, a wide range of information can be acquired by driving the information acquisition unit 101 by the driving unit 102. As a result, the user can know the possibility of dangerous events over a wide range.
- the information acquisition unit 101 can acquire image information related to the periphery of the moving object. Therefore, it is possible to identify an object having a large risk index for the moving object from the change in the acquired image information. This allows users to avoid dangerous events such as accidents.
- furthermore, when the risk index returns to the normal range, the drive by the drive unit 102 can be resumed. Therefore, the drive unit 102 can be driven during normal operation. As a result, for example, even if the movement of a vehicle-mounted robot has been stopped, the user can have it move again after the dangerous event has been avoided.
- furthermore, the drive unit 102 can drive the sensor mounted on the moving body in the horizontal direction and the pitch direction. Therefore, the drive control apparatus 100 can acquire information about the moving body over a wide range. This allows the user to know the possibility of dangerous events over a wide area.
- as described above, according to the drive control device, drive control method, drive control program, and recording medium of the present invention, the navigation device 300 or the drive control device 400 calculates a risk index based on information acquired by the various sensors included in the sensor unit 410, and can control the drive unit 401 when the risk index is equal to or greater than a threshold value. Therefore, even when the driver is not gazing at the drive unit 401, the drive unit 401 can be stopped during dangerous driving. This allows the driver to concentrate on driving because there is no extra movement in the field of view.
- the drive control method described in this embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
- this program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read from the recording medium by the computer.
- the program may be a transmission medium that can be distributed through a network such as the Internet.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2006/319328 WO2008038369A1 (fr) | 2006-09-28 | 2006-09-28 | Dispositif de commande de conduite, procédé de commande de conduite, programme de commande de conduite et support d'enregistrement |
JP2008536250A JP4783430B2 (ja) | 2006-09-28 | 2006-09-28 | 駆動制御装置、駆動制御方法、駆動制御プログラムおよび記録媒体 |
US12/442,793 US8160772B2 (en) | 2006-09-28 | 2006-09-28 | Drive control apparatus, drive control method, drive control program, and recording medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2006/319328 WO2008038369A1 (fr) | 2006-09-28 | 2006-09-28 | Dispositif de commande de conduite, procédé de commande de conduite, programme de commande de conduite et support d'enregistrement |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2008038369A1 true WO2008038369A1 (fr) | 2008-04-03 |
Family
ID=39229817
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/319328 WO2008038369A1 (fr) | 2006-09-28 | 2006-09-28 | Dispositif de commande de conduite, procédé de commande de conduite, programme de commande de conduite et support d'enregistrement |
Country Status (3)
Country | Link |
---|---|
US (1) | US8160772B2 (ja) |
JP (1) | JP4783430B2 (ja) |
WO (1) | WO2008038369A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2525336A1 (en) * | 2010-01-12 | 2012-11-21 | Toyota Jidosha Kabushiki Kaisha | Collision position predicting device |
KR20170137689A (ko) * | 2017-12-01 | 2017-12-13 | 팅크웨어(주) | 전자 기기 및 상기 전자 기기의 영상 촬영 방법 |
WO2019022042A1 (ja) * | 2017-07-25 | 2019-01-31 | 株式会社デンソー | 車両用報知装置 |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102004051599A1 (de) * | 2004-10-22 | 2006-05-04 | Siemens Ag | Device and method for reproducing multimedia data in a motor vehicle |
TWI486147B (zh) * | 2011-10-04 | 2015-06-01 | Univ Nat Taiwan Science Tech | Real-time physiological signal measurement and feedback system |
US9293054B2 (en) * | 2011-11-11 | 2016-03-22 | Aptima, Inc. | Systems and methods to react to environmental input |
US9418674B2 (en) * | 2012-01-17 | 2016-08-16 | GM Global Technology Operations LLC | Method and system for using vehicle sound information to enhance audio prompting |
US9007229B1 (en) | 2012-06-20 | 2015-04-14 | Amazon Technologies, Inc. | Sensor based recommendations |
US20150338227A1 (en) * | 2012-11-22 | 2015-11-26 | Freescale Semiconductor, Inc. | Navigation system |
US10347127B2 (en) * | 2013-02-21 | 2019-07-09 | Waymo Llc | Driving mode adjustment |
JP6170757B2 (ja) * | 2013-06-26 | 2017-07-26 | Fujitsu Ten Limited | Display control device, display system, information providing method, and program |
KR101539302B1 (ko) * | 2013-12-11 | 2015-07-30 | Hyundai Motor Company | Vehicle and control method thereof |
JP2016001464A (ja) | 2014-05-19 | 2016-01-07 | Ricoh Company, Ltd. | Processing device, processing system, processing program, and processing method |
JP2017528788A (ja) * | 2014-07-04 | 2017-09-28 | Koninklijke Philips N.V. | Air quality alarm system and method |
JP2016103249A (ja) * | 2014-11-28 | 2016-06-02 | Fujitsu Limited | Driving support device and driving support method |
US9610893B2 (en) | 2015-03-18 | 2017-04-04 | Car1St Technologies, Llc | Methods and systems for providing alerts to a driver of a vehicle via condition detection and wireless communications |
US10328855B2 (en) | 2015-03-18 | 2019-06-25 | Uber Technologies, Inc. | Methods and systems for providing alerts to a connected vehicle driver and/or a passenger via condition detection and wireless communications |
US9507974B1 (en) * | 2015-06-10 | 2016-11-29 | Hand Held Products, Inc. | Indicia-reading systems having an interface with a user's nervous system |
US10373143B2 (en) | 2015-09-24 | 2019-08-06 | Hand Held Products, Inc. | Product identification using electroencephalography |
US9925867B2 (en) * | 2016-01-11 | 2018-03-27 | Ford Global Technologies, Llc | Fuel control regulator system with acoustic pliability |
CN110662682B (zh) * | 2017-06-01 | 2022-08-26 | Mitsubishi Electric Corporation | Mobile body control device, mobile body control method, and computer-readable storage medium |
US10460185B2 (en) | 2018-01-30 | 2019-10-29 | Toyota Motor Engineering & Manufacturing North America, Inc. | Roadside image tracking system |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1038601A (ja) * | 1996-07-29 | 1998-02-13 | Takaaki Nagai | Vehicle information display device |
JP2000255319A (ja) * | 1999-03-10 | 2000-09-19 | Fuji Heavy Ind Ltd | Vehicle traveling direction recognition device |
JP2001099661A (ja) * | 1999-09-30 | 2001-04-13 | Toshiba Corp | Virtual navigator |
JP2001304899A (ja) * | 2000-04-25 | 2001-10-31 | Sony Corp | Auxiliary display device for car navigation system |
JP2002331890A (ja) * | 2001-05-10 | 2002-11-19 | Toyota Motor Corp | Recommended vehicle operation presentation system |
JP2004009833A (ja) * | 2002-06-05 | 2004-01-15 | Nissan Motor Co Ltd | Driving condition recording device |
JP2004259069A (ja) * | 2003-02-26 | 2004-09-16 | Aisin Seiki Co Ltd | Alarm device that outputs an alarm signal according to the degree of vehicle danger |
Family Cites Families (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3448822A (en) * | 1966-06-23 | 1969-06-10 | Micro Electronics Intern Inc | Vehicle anti-collision automatic control-system |
JPS6378299A (ja) * | 1986-09-22 | 1988-04-08 | Aisin AW Co., Ltd. | Automobile with a signal recognition device using image processing technology |
US4833469A (en) * | 1987-08-03 | 1989-05-23 | David Constant V | Obstacle proximity detector for moving vehicles and method for use thereof |
US5327990A (en) * | 1989-05-16 | 1994-07-12 | Busquets Albert B | Integral automatic system for protection and rescue of occupants in crashed automobiles |
JPH05265547A (ja) * | 1992-03-23 | 1993-10-15 | Fuji Heavy Ind Ltd | Vehicle exterior monitoring device |
IL102097A (en) * | 1992-06-04 | 1995-05-26 | Davidian Dan | Anti-collision system for vehicles |
US5465079A (en) * | 1992-08-14 | 1995-11-07 | Vorad Safety Systems, Inc. | Method and apparatus for determining driver fitness in real time |
DE4317831C1 (de) * | 1993-05-28 | 1994-07-07 | Daimler Benz Ag | Display for indicating the danger level of a motor vehicle's current driving situation |
JPH07186876A (ja) * | 1993-12-27 | 1995-07-25 | Asuko Kk | Control device for a vehicle safety device |
US7085637B2 (en) * | 1997-10-22 | 2006-08-01 | Intelligent Technologies International, Inc. | Method and system for controlling a vehicle |
US6640174B2 (en) * | 2001-01-31 | 2003-10-28 | Ford Global Technologies, Llc | Restraint and fuel system cutoff control module |
ITTO20010282A1 (it) * | 2001-03-26 | 2002-09-26 | Fiat Ricerche | Driving assistance system for a motor vehicle |
JP4615139B2 (ja) * | 2001-03-30 | 2011-01-19 | Honda Motor Co., Ltd. | Vehicle periphery monitoring device |
EP1412930B1 (de) * | 2001-07-11 | 2008-03-26 | Robert Bosch Gmbh | Method and device for automatically triggering deceleration of a vehicle |
US6519519B1 (en) * | 2002-02-01 | 2003-02-11 | Ford Global Technologies, Inc. | Passive countermeasure methods |
US7124027B1 (en) * | 2002-07-11 | 2006-10-17 | Yazaki North America, Inc. | Vehicular collision avoidance system |
JP2004093354A (ja) | 2002-08-30 | 2004-03-25 | Casio Comput Co Ltd | Driving information output device and program |
WO2004102500A1 (ja) * | 2003-05-16 | 2004-11-25 | Fujitsu Limited | Alarm system, alarm control device, and alarm control program |
JP2005024507A (ja) * | 2003-07-03 | 2005-01-27 | Denso Corp | Navigation device and program |
US7055640B2 (en) * | 2003-09-10 | 2006-06-06 | Ford Global Technologies, Llc | Fuel cut-off control system for a vehicle |
JP4277081B2 (ja) * | 2004-03-17 | 2009-06-10 | Denso Corporation | Driving support device |
JP2006115376A (ja) * | 2004-10-18 | 2006-04-27 | Matsushita Electric Ind Co Ltd | In-vehicle display device |
JP2007128430A (ja) * | 2005-11-07 | 2007-05-24 | Toyota Motor Corp | Vehicle alarm device |
KR100776415B1 (ko) * | 2006-07-18 | 2007-11-16 | Samsung Electronics Co., Ltd. | Video playback method and system |
2006
- 2006-09-28 JP JP2008536250A patent/JP4783430B2/ja not_active Expired - Fee Related
- 2006-09-28 WO PCT/JP2006/319328 patent/WO2008038369A1/ja active Application Filing
- 2006-09-28 US US12/442,793 patent/US8160772B2/en not_active Expired - Fee Related
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2525336A1 (en) * | 2010-01-12 | 2012-11-21 | Toyota Jidosha Kabushiki Kaisha | Collision position predicting device |
EP2525336A4 (en) * | 2010-01-12 | 2014-06-11 | Toyota Motor Co Ltd | COLLISION POINT PREDICTION DEVICE |
US8849558B2 (en) | 2010-01-12 | 2014-09-30 | Toyota Jidosha Kabushiki Kaisha | Collision position predicting device |
WO2019022042A1 (ja) * | 2017-07-25 | 2019-01-31 | Denso Corporation | Vehicle notification device |
JP2019025929A (ja) * | 2017-07-25 | 2019-02-21 | Denso Corporation | Vehicle notification device |
KR20170137689A (ko) * | 2017-12-01 | 2017-12-13 | Thinkware Corporation | Electronic device and image capturing method of the electronic device |
KR102091017B1 (ko) * | 2017-12-01 | 2020-03-19 | Thinkware Corporation | Electronic device and image capturing method of the electronic device |
Also Published As
Publication number | Publication date |
---|---|
US20100094502A1 (en) | 2010-04-15 |
US8160772B2 (en) | 2012-04-17 |
JPWO2008038369A1 (ja) | 2010-01-28 |
JP4783430B2 (ja) | 2011-09-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4783430B2 (ja) | Drive control apparatus, drive control method, drive control program, and recording medium | |
US20230311749A1 (en) | Communication between autonomous vehicle and external observers | |
JP4729905B2 (ja) | Vehicle notification device and program | |
KR102046468B1 (ko) | Side mirror for vehicle | |
US9723243B2 (en) | User interface method for terminal for vehicle and apparatus thereof | |
JP5198835B2 (ja) | Method and system for presenting video images | |
US9126533B2 (en) | Driving support method and driving support device | |
JP4311426B2 (ja) | Display system, in-vehicle device, and display method for displaying a moving body | |
CN107010063A (zh) | Perception-based speed limit estimation and learning | |
EP2085944A1 (en) | Driving assistance device, driving assistance method, and program | |
US20120109521A1 (en) | System and method of integrating lane position monitoring with locational information systems | |
JP2007145095A (ja) | Travel control device, travel control method, travel control program, and recording medium | |
US20230135641A1 (en) | Superimposed image display device | |
JP4787196B2 (ja) | In-vehicle navigation device | |
US11828613B2 (en) | Superimposed-image display device | |
JP2003178397A (ja) | Device for detecting and warning of road signs and the like | |
KR101281629B1 (ko) | Driving guidance system using sensors | |
CN109070799B (zh) | Moving body surroundings display method and moving body surroundings display device | |
JP2016161483A (ja) | Information providing device and information providing program | |
JP2017126213A (ja) | Intersection situation confirmation system, imaging device, in-vehicle device, intersection situation confirmation program, and intersection situation confirmation method | |
KR20220046553A (ko) | Autonomous vehicle interaction system | |
JP7215191B2 (ja) | Driving support control device, driving support control method, and program | |
CN117445810A (zh) | Vehicle assisted driving method, device, medium, and vehicle | |
TW201017106A (en) | Road object image display system | |
WO2008041284A1 (fr) | Route guidance device, route guidance method, route guidance program, and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 06810775 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 2008536250 Country of ref document: JP |
NENP | Non-entry into the national phase |
Ref country code: DE |
WWE | Wipo information: entry into national phase |
Ref document number: 12442793 Country of ref document: US |
122 | Ep: pct application non-entry in european phase |
Ref document number: 06810775 Country of ref document: EP Kind code of ref document: A1 |