WO2007066696A1 - Information recording device, information recording method, information recording program and computer readable recording medium - Google Patents


Info

Publication number
WO2007066696A1
WO2007066696A1 · PCT/JP2006/324375 · JP2006324375W
Authority
WO
WIPO (PCT)
Prior art keywords
information
information recording
vehicle
video data
time
Prior art date
Application number
PCT/JP2006/324375
Other languages
French (fr)
Japanese (ja)
Inventor
Hiroaki Shibasaki
Original Assignee
Pioneer Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation filed Critical Pioneer Corporation
Publication of WO2007066696A1 publication Critical patent/WO2007066696A1/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/433: Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N 21/4334: Recording operations
    • H04N 21/41: Structure of client; Structure of client peripherals
    • H04N 21/414: Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N 21/41422: Specialised client platforms located in transportation means, e.g. personal vehicle
    • H04N 21/422: Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/4223: Cameras
    • H04N 5/00: Details of television systems
    • H04N 5/76: Television signal recording
    • H04N 5/765: Interface circuits between an apparatus for recording and another apparatus
    • H04N 5/77: Interface circuits between a recording apparatus and a television camera

Definitions

  • This relates to an information recording device that records information, an information recording method, an information recording program, and a computer-readable recording medium on which the program is recorded.
  • The present invention, however, is not limited to the above-mentioned information recording device, information recording method, information recording program, and computer-readable recording medium.
  • Conventionally, drive recorders have been known that record the situation of a running vehicle, in the same way that a flight recorder on board an aircraft records the situation of a flight.
  • Such a drive recorder comprises, for example, a front camera that captures the view ahead of the vehicle and a rear camera that captures the view behind it.
  • However, the image from the drive recorder does not include information about the history of the vehicle's movement before and after an accident. Therefore, when analyzing the cause of an accident, the accident point cannot be grasped accurately if information that helps identify it, such as the name of an intersection, does not happen to appear in the drive recorder image. This is given as one example of the problem.
  • Even when information that helps identify the accident point is captured in the drive recorder image, it is not always possible to identify the accident point from that information alone.
  • To solve the problems above, the information recording device for recording video data captured by an imaging means installed on a moving body comprises a detecting means that detects the time of the video data and an acquiring means that acquires information on the history of movement of the moving body.
  • It further comprises a storing means that stores the information obtained by the acquiring means together with the video data, based on the time of the video data detected by the detecting means.
  • Likewise, the information recording method for recording video data captured by an imaging means installed on a moving body includes a detecting step of detecting the time of the video data and an acquiring step of acquiring information on the history of movement of the moving body.
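The core of the claim is associating each video frame with the movement-history sample taken at the same moment. As a minimal sketch of that association (the `PositionSample` type, the fixed frame rate, and the function name are illustrative, not from the patent):

```python
from bisect import bisect_left
from dataclasses import dataclass

@dataclass
class PositionSample:
    t: float      # seconds since the start of recording
    lat: float
    lon: float

def position_for_frame(frame_index: int, fps: float,
                       history: list[PositionSample]) -> PositionSample:
    """Return the movement-history sample closest in time to a video frame.

    The frame's capture time is derived from its frame number and the
    frame rate, then matched against the timestamped position history.
    """
    t = frame_index / fps
    times = [s.t for s in history]
    i = bisect_left(times, t)
    candidates = history[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda s: abs(s.t - t))
```

Storing the returned sample keyed by the frame number gives exactly the time-and-position-per-frame record the storing means describes.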
  • FIG. 1 is a diagram showing an example of the functional configuration of the information recording device according to the present embodiment.
  • FIG. 2 is a flowchart showing the processing of the information recording device according to the present embodiment.
  • FIG. 3 is a diagram showing an example of the vicinity of the driver's seat of a vehicle equipped with the navigation device of the present example.
  • FIG. 4 is a block diagram showing an example of the hardware configuration of the navigation device of the present example.
  • FIG. 5 is a diagram showing an example of a drive recorder image of the area ahead of the vehicle in the present example.
  • FIG. 6 is a diagram showing an example of a drive recorder image of the vehicle in the present example.
  • FIG. 7 is a flowchart showing the processing of the navigation device of the present example.
  • In FIG. 1, the information recording device 100, which records video data captured by an imaging means installed on a moving body, comprises a detecting unit 101, an acquiring unit 102, a storing unit 103, a detecting unit 104, and a display control unit 105.
  • The detecting unit 101 detects the time of the video data captured by the imaging means installed on the moving body. The time may be detected, for example, from time information provided by a timer provided in the information recording device. The video data whose time has been detected may be stored in the storing unit 103 described later.
  • The acquiring unit 102 acquires information regarding the history of movement of the moving body.
  • This may be information that includes the position of the moving body, and the position of the moving body at the time detected by the detecting unit 101 may be displayed on a map as points or as a trajectory.
  • The storing unit 103 stores the information acquired by the acquiring unit 102 together with the video data, based on the time of the video data detected by the detecting unit 101. More specifically, it may be configured so that the detected time and the information including the position of the moving body are stored for the video data in correspondence with its frame numbers.
  • The storing unit 103 may also be configured so that, when a sudden movement of the moving body is detected by the detecting unit 104 described later, the time and the information including the position of the moving body are stored, in correspondence with the frame numbers, for the video data of a fixed period including the detection point.
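Keeping only a fixed period around the detection point can be sketched with a pre-event ring buffer plus a post-event counter. This is a generic sketch, not the patent's implementation; the class and parameter names are invented for illustration:

```python
from collections import deque

class EventClipSaver:
    """Keep the last `pre` frames; on a detected event, capture those
    frames plus the next `post` frames as one saved clip."""

    def __init__(self, pre: int, post: int):
        self.buffer = deque(maxlen=pre)   # frames before a possible event
        self.post = post
        self._remaining = 0               # post-event frames still to collect
        self._clip = []
        self.saved_clips = []

    def add_frame(self, frame, event_detected=False):
        self.buffer.append(frame)
        if event_detected and self._remaining == 0:
            self._clip = list(self.buffer)    # frames up to the event
            self._remaining = self.post
        elif self._remaining:
            self._clip.append(frame)
            self._remaining -= 1
            if self._remaining == 0:
                self.saved_clips.append(self._clip)
                self._clip = []
```

Each saved clip is the "fixed period including the detection point" the text describes; time and position information would be stored alongside it, keyed by frame number.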
  • The detecting unit 104 detects a sudden movement of the moving body. It may be configured to detect the movement based on the outputs of various sensors mounted on the moving body.
  • The sensors may be, for example, a vibration sensor, a G sensor, a speed sensor of the moving body, and sensors capable of outputting information on driving operations such as steering, direction indication, accelerator operation, and brake-pedal operation.
  • The detecting unit 104 may detect a movement as dangerous when, for example, the output of a sensor such as the vibration sensor or G sensor exceeds a predetermined value, when the output approximates a predetermined pattern indicating an abnormality, or when a contact sensor is activated. It may also judge a movement to be dangerous based on information on the driving operations of the moving body; more specifically, it may be configured to judge the action of the moving body as dangerous when, for example, there is a sudden steering operation without a direction indication, or an excessive speed.
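The two detection criteria above, a fixed threshold and similarity to a stored abnormality pattern, can be sketched as follows. The threshold values and the similarity measure are illustrative assumptions, not values from the patent:

```python
def is_dangerous(g_samples, threshold=0.8, pattern=None, similarity=0.9):
    """Flag a sudden movement when any G-sensor sample exceeds a fixed
    threshold, or when the recent samples closely match a stored
    pattern known to indicate an abnormality."""
    # criterion 1: any sample over the predetermined value
    if any(abs(g) >= threshold for g in g_samples):
        return True
    # criterion 2: approximation to a predetermined abnormality pattern
    if pattern and len(g_samples) == len(pattern):
        # crude similarity: 1 minus the normalized mean absolute difference
        diff = sum(abs(a - b) for a, b in zip(g_samples, pattern)) / len(pattern)
        scale = max(max(abs(p) for p in pattern), 1e-9)
        return 1 - diff / scale >= similarity
    return False
```

A real detector would run continuously over a sliding window of sensor samples; this shows only the per-window decision.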
  • The display control unit 105 controls a display screen to combine and display the information on the position of the moving body and the video data stored by the storing unit 103. More specifically, it may be configured so that the video data captured by the imaging means installed on the moving body and the information including the position of the moving body are displayed together. At this time, the time detected by the detecting unit 101 may also be displayed. It is also possible to combine the information on the position of the moving body with the video data before they are stored in the storing unit 103, and to have the display control unit 105 display the resulting image data on the display screen.
  • FIG. 2 is a flowchart showing the processing of the information recording device according to this embodiment.
  • In the flowchart of FIG. 2, the information recording device first judges whether input of video data captured by the imaging means installed on the moving body has been accepted (step S21). The input may be started, for example, by a user operating an operation unit, or the device may be configured so that input starts when the moving body starts running.
  • In step S21, the device waits for video data to be input, and when it is input (step S21: Yes), the detecting unit 101 detects the time of the video data captured by the imaging means installed on the moving body (step S22). For example, the time may be detected from time information provided by a timer provided in the device. Then the acquiring unit 102 acquires information on the history of movement of the moving body (step S23). For example, this may be information including the position of the moving body, with the points it has passed displayed on a map as points or as a trajectory. The detecting unit 104 then detects whether there is a sudden movement of the moving body (step S24), for example based on the outputs of the various sensors mounted on the moving body.
  • A sudden movement may be detected as dangerous when, for example, the output of a sensor such as the vibration sensor or G sensor exceeds a predetermined value, approximates a predetermined pattern indicating an abnormality, or when a contact sensor is activated. A movement may also be judged dangerous based on information on the driving operations of the moving body.
  • When a sudden movement of the moving body is detected by the detecting unit 104 in step S24 (step S24: Yes), the storing unit 103 stores the information on the moving body acquired by the acquiring unit 102 together with the video data, based on the detected time, and the series of processing ends (step S25). More specifically, it may be configured so that the time and the information including the position of the moving body are stored, in correspondence with the frame numbers, for the video data captured by the imaging means installed on the moving body. The storing unit 103 may also store the time and the information including the position of the moving body, in correspondence with the frame numbers, for the video data of the fixed period including the detection point.
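The flow of steps S21 through S25 above can be condensed into a sketch. The four callables stand in for units 101 to 104 of the device and are hypothetical names, not the patent's API:

```python
def record_with_history(video_input, detect_time, acquire_history, store):
    """Sketch of steps S21-S25: accept video data, detect its capture
    time, acquire the movement history, and store the two associated
    by frame number."""
    frames = video_input()                  # S21: accept video data input
    record = []
    for number, frame in enumerate(frames):
        t = detect_time(number)             # S22: detect the time of the data
        history = acquire_history(t)        # S23: movement history at time t
        record.append((number, t, history, frame))
    store(record)                           # S25: store, keyed by frame number
    return record
```

Step S24 (sudden-movement detection) would gate the `store` call in the full device; it is omitted here to keep the control flow visible.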
  • As described above, according to the present embodiment, when a sudden movement of the moving body is detected, the time and the information including the position of the moving body are stored in correspondence with the video data captured by the imaging means installed on the moving body. Therefore, when analyzing the circumstances of an accident, information that leads to identifying its cause, such as the video data and the position of the moving body, can be grasped accurately.
  • This example describes a case where the present invention is implemented by a navigation device mounted on a moving body such as a vehicle (including four-wheeled and two-wheeled vehicles).
  • FIG. 3 shows an example of the vicinity of the driver's seat of a vehicle equipped with the navigation device of this example.
  • The navigation device 300 is installed on the dashboard of the vehicle.
  • The navigation device 300 is composed of a main unit and a display unit (display), and the display unit shows the current position of the vehicle, map information, the current time, and the like.
  • The navigation device 300 is connected to an in-vehicle camera 311 installed on the dashboard and an in-vehicle microphone 312 installed on the sun visor.
  • The camera 311 may be of a small size and may be configured to capture images inside and outside the vehicle.
  • The microphone 312 is used, for example, when recording sound and for voice input to the navigation device 300.
  • The camera 311 may be, for example, fixed to the dashboard and capture images outside the vehicle.
  • The in-vehicle camera 311 may also be attached to the rear of the vehicle. When attached to the rear, the area behind the vehicle can be checked, and the situation of a rear-end collision can be recorded when the vehicle is hit from behind by another vehicle.
  • The in-vehicle camera 311 may be an infrared camera capable of recording in dark places.
  • A plurality of in-vehicle cameras 311 and microphones 312 may be provided on the vehicle, and each may be of a fixed type or a movable type.
  • The navigation device 300 has a navigation function that guides the vehicle to a destination point and a drive recorder function that records the running state of the vehicle.
  • The drive recorder function records, for example, the position of the vehicle obtained by a GPS unit 415 and various sensors 416 described later, and the video and audio obtained from the in-vehicle camera 311 and the microphone 312.
  • The recording medium may have an overwrite area in which the running state is constantly recorded and overwritten, and a save area in which the running state is saved in case of an accident; alternatively, separate recording media may be provided for overwriting and for saving.
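The two-area layout above can be sketched with a ring buffer for the overwrite area and a plain list for the save area. Class and parameter names are illustrative; a real device would manage disk regions, not Python objects:

```python
from collections import deque

class DriveRecorderStorage:
    """An overwrite area that is continually overwritten with the
    newest footage, and a save area that permanently keeps footage
    copied out when an accident is detected."""

    def __init__(self, overwrite_capacity=100):
        self.overwrite_area = deque(maxlen=overwrite_capacity)  # ring buffer
        self.save_area = []

    def record(self, frame):
        self.overwrite_area.append(frame)   # oldest frame drops off the end

    def on_accident(self):
        # copy the current contents to protected storage
        self.save_area.extend(self.overwrite_area)
```

The `deque` with `maxlen` gives the constant-overwrite behavior for free: once full, every `record` discards the oldest frame.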
  • FIG. 4 is a block diagram showing an example of the hardware configuration of the navigation device of this example.
  • In FIG. 4, the navigation device 300 is mounted on a moving body such as a vehicle and comprises a CPU 401, a ROM 402, a RAM 403, a magnetic disk drive 404, a magnetic disk 405, an optical disk drive 406, an optical disk 407, an audio I/F (interface) 408, a microphone 409, a speaker 410, an input device 411, a video I/F 412, a display 413, a communication I/F 414, a GPS unit 415, various sensors 416, and a camera 417. The components 401 to 417 are connected to one another by a bus 420.
  • The CPU 401 controls the entire navigation device 300. The ROM 402 stores programs such as a boot program, a route search program, a route guidance program, a voice generation program, a map information display program, a communication program, and database programs. The RAM 403 is used as a work area of the CPU 401.
  • The route search program searches for an optimum route from a departure point to a destination point using the map information recorded on the optical disk 407 described later.
  • The optimum route is the shortest (or fastest) route to the destination point, or the route that best matches conditions specified by the user. It is found by executing the route search program, and the result is output to the audio I/F 408 and the video I/F 412 via the CPU 401.
  • The route guidance program generates real-time route guidance information based on the guidance route information obtained by executing the route search program, the position information of the navigation device 300 obtained by the communication I/F 414, and the map information read from the optical disk 407.
  • The route guidance information generated by executing the route guidance program is output to the audio I/F 408 and the video I/F 412 via the CPU 401.
  • The voice generation program generates tone and voice information corresponding to patterns. That is, based on the route guidance information generated by executing the route guidance program, it generates voice guidance information corresponding to guidance points, which is output to the audio I/F 408 via the CPU 401.
  • The map information display program determines the display format of the map information to be shown on the display 413 by the video I/F 412, and displays the map information on the display 413 in the determined format.
  • While the vehicle is running, the CPU 401 captures a drive recorder image from the camera 417. The CPU 401 then detects time information on the time at which this drive recorder image was captured, and information on the frame numbers of the image. In addition, the CPU 401 acquires information including the position of the moving body at the time the image was captured. The CPU 401 then combines this position information, the time information, and the frame numbers with the drive recorder image using the video I/F 412 and the RAM 403 described later, and overwrites the result on the overwrite recording medium. Further, when a shock is detected by the various sensors 416 described later, the CPU 401 stores this image on the save recording medium.
  • Alternatively, the CPU 401 may record the drive recorder image captured by the camera 417 on the overwrite recording medium as it is while the vehicle is running. Then, when a shock is detected by the various sensors 416 described later, the CPU 401 saves the drive recorder image on the save recording medium and obtains the time information, the frame numbers, and the information including the position of the moving body. The CPU 401 may then combine the time information, the frame numbers, and the map information with the drive recorder image using the video I/F 412 or the RAM 403, and save the combined image on the save recording medium. In addition, when there is an instruction to display the drive recorder image, the CPU 401 displays it on the display 413 described later.
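The compositing step can be sketched by pairing each frame with its overlay data. A real device would burn the text into the pixels via the video I/F; here the overlay is kept as structured metadata, which is a simplification, and all names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Overlay:
    frame_number: int
    timestamp: str
    position: tuple          # (latitude, longitude) at capture time

@dataclass
class CompositedFrame:
    pixels: bytes            # raw image data stands in for the frame
    overlay: Overlay

def composite(pixels, frame_number, timestamp, position):
    """Attach frame number, capture time and vehicle position to a
    frame before it is written to the recording medium."""
    return CompositedFrame(pixels, Overlay(frame_number, timestamp, position))
```

Keeping the overlay as metadata rather than rendered text also matches the second variant above, where compositing happens only after a shock is detected.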
  • The magnetic disk drive 404 controls reading and writing of data to the magnetic disk 405 under the control of the CPU 401.
  • The magnetic disk 405 records the data written under the control of the magnetic disk drive 404.
  • As the magnetic disk 405, for example, an HD (hard disk) or an FD (flexible disk) can be used.
  • The optical disk drive 406 controls reading and writing of data to the optical disk 407 under the control of the CPU 401.
  • The optical disk 407 is a removable recording medium from which data is read under the control of the optical disk drive 406.
  • A writable recording medium can also be used as the optical disk 407.
  • Besides the optical disk, the removable recording medium may be an MO, a memory card, or the like.
  • Examples of the information recorded on the magnetic disk 405 and the optical disk 407 include the position of the vehicle detected by the GPS unit 415 described later, the images and sounds of the inside and outside of the vehicle obtained by the in-vehicle camera 311 and the microphone 312, and the values output by the various sensors 416 described later. These are recorded by the drive recorder function of the navigation device 300 and used as material in the event of a traffic accident.
  • Another example is the map information used for route search and route guidance. The map information includes background data representing features such as buildings, rivers, and the ground surface, together with road shape data, and is drawn two-dimensionally or three-dimensionally on the display screen of the display 413.
  • When the navigation device 300 is providing route guidance, the map information and the current position acquired by the GPS unit 415 described later are displayed so as to overlap.
  • The road shape data also includes traffic condition data.
  • The traffic condition data includes, for example, for each node, the presence of traffic lights and pedestrian crossings and of highway entrances and junctions, and, for each link, the length, road width, direction of travel, and road type (highway, toll road, general road, and so on).
  • The traffic condition data stores past congestion information processed statistically by season, day of the week, long holidays, and time of day.
  • The navigation device 300 obtains information on current congestion from road traffic information received by the communication I/F 414 described later, but the past congestion information makes it possible to estimate conditions at a specified time.
  • The map information is recorded on the magnetic disk 405 and the optical disk 407 in this example, but it is not limited to this. The map information need not be recorded on a medium integrated with the hardware of the navigation device 300 and may be provided outside the navigation device 300. In that case, the navigation device 300 acquires the map information via a network, for example through the communication I/F 414. The acquired map information is stored in the RAM 403 or the like.
  • The audio I/F 408 is connected to the microphone 409 for voice input (for example, the microphone 312 on the sun visor of the vehicle) and to the speaker 410 for voice output.
  • Sound received by the microphone 409 is A/D-converted in the audio I/F 408.
  • Sounds input from the microphone 409 can be recorded on the magnetic disk 405 or the optical disk 407 as audio data.
  • The input device 411 comprises a remote controller provided with keys for inputting characters, numeric values, and various instructions, a keyboard, a mouse, a touch panel, and the like.
  • The video I/F 412 is connected to the display 413 and to the camera 417 (for example, the camera 311 on the dashboard of the vehicle). The video I/F 412 is composed of, for example, a graphics controller that controls the entire display 413, a buffer memory such as a VRAM (Video RAM) that temporarily records image information that can be displayed immediately, and a control IC that controls the display 413 based on the image data output from the graphics controller.
  • The video I/F 412 may add a frame number to each frame of the video captured by the camera 417.
  • The video I/F 412 also controls the display 413 to superimpose the time information, the frame number, and the position of the moving body on the drive recorder image described later.
  • The display 413 displays icons, cursors, menus, windows, and various data such as text and images.
  • As the display 413, for example, a CRT, a TFT liquid-crystal display, or a plasma display can be used.
  • The display 413 is installed, for example, near the driver's seat of the vehicle as shown in FIG. 3.
  • A plurality of displays 413 may be provided in the vehicle, for example one for the driver's seat and one for the rear seats.
  • The camera 417 captures images of the inside or the outside of the vehicle. The images may be still images or moving images; for example, the camera 417 captures the situation around the vehicle, and the captured video is output via the video I/F 412 to a recording medium such as the magnetic disk 405 or the optical disk 407.
  • The camera 417 also captures the conditions outside the vehicle, and the video output to the recording medium is overwritten and recorded as a drive recorder image.
  • The communication I/F 414 is connected to a network by radio and functions as an interface between the navigation device 300 and the CPU 401.
  • The communication I/F 414 is also connected to a communication network such as the Internet by radio and functions as an interface between this communication network and the CPU 401.
  • The communication network includes LANs, WANs, public mobile-telephone networks, and the like.
  • The GPS unit 415 uses radio waves from GPS satellites and the outputs of the various sensors 416 described later (for example, an angular-velocity sensor, an acceleration sensor, and a wheel-speed sensor) to calculate information indicating the current position of the vehicle (the position of the navigation device 300). The position information identifies a point on the map, such as latitude, longitude, and altitude.
  • The GPS unit 415 can also use the various sensors 416 to output the distance traveled, changes in speed, and the heading. This makes it possible to analyze conditions such as braking and steering.
  • There are a total of 24 GPS satellites, four in each of six orbits around the Earth. The orbits are adjusted so that the same satellites pass the same places at the same times every day, so that five or six satellites can always be seen from any point on Earth (provided the line of sight is clear). Each GPS satellite carries cesium (Cs) atomic clocks, synchronized with satellite time, to keep extremely accurate time.
  • The satellites also carry rubidium (Rb) clocks as a reserve. This is because accurate time is essential for GPS position measurement.
  • The GPS satellites transmit on two frequencies, 1575.42 MHz (L1) and 1227.6 MHz (L2) (hereinafter, GPS signals). The signals are modulated with a code called a pseudo-random noise code, and when the GPS unit 415 or another receiver receives a signal, it refers to the corresponding code and decodes it.
  • The GPS unit 415 measures the difference between the time at which the GPS signal was transmitted from the GPS satellite and the time at which the unit received it, using the decoded data and the unit's own clock. The time difference is then multiplied by the propagation speed of the radio wave to calculate the distance from the GPS satellite to the unit. The satellite clocks are synchronized with Coordinated Universal Time (UTC).
  • Since each GPS satellite transmits accurate information on its own orbit, the exact position of the satellite is known. Therefore, if the distance from a GPS satellite is known, the position of the receiver lies somewhere on a sphere centered on that satellite whose radius is the calculated distance. The transmission time is embedded in the GPS signal at fixed intervals, so the receiver can determine the propagation delay whenever it needs to find its position.
  • In practice, the GPS unit 415 receives GPS signals from a total of four GPS satellites. The clock offset on the GPS unit 415 side is treated as an additional unknown, and the fourth satellite contributes the extra equation needed to solve for it. In this way, by receiving GPS signals from four satellites, the GPS unit 415 can obtain a nearly exact current position that converges to a single point.
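The four-satellite solution described above, three unknowns of position plus the receiver clock offset, can be sketched as a Newton iteration on the pseudorange equations. This is a generic textbook formulation, not the patent's receiver internals; the clock bias `b` is expressed directly as a distance (offset times the speed of light):

```python
import math

def solve_gps(sats, pseudoranges, iters=20):
    """Solve receiver position (x, y, z) and clock-bias distance b from
    four satellite positions and measured pseudoranges by Newton
    iteration on f_i = |p - s_i| + b - rho_i = 0."""
    x = y = z = b = 0.0
    for _ in range(iters):
        J, f = [], []
        for (sx, sy, sz), rho in zip(sats, pseudoranges):
            dx, dy, dz = x - sx, y - sy, z - sz
            r = math.sqrt(dx*dx + dy*dy + dz*dz) or 1.0
            J.append([dx/r, dy/r, dz/r, 1.0])   # row of the Jacobian
            f.append(r + b - rho)               # residual
        # solve J * delta = -f by Gauss-Jordan elimination
        n = 4
        A = [J[i] + [-f[i]] for i in range(n)]
        for col in range(n):
            piv = max(range(col, n), key=lambda k: abs(A[k][col]))
            A[col], A[piv] = A[piv], A[col]
            for row in range(n):
                if row != col:
                    k = A[row][col] / A[col][col]
                    A[row] = [a - k*p for a, p in zip(A[row], A[col])]
        delta = [A[i][4] / A[i][i] for i in range(n)]
        x, y, z, b = x + delta[0], y + delta[1], z + delta[2], b + delta[3]
    return x, y, z, b
```

With three satellites the system is underdetermined once the clock error is unknown; the fourth row is what lets the solution converge to a point, exactly as the text explains.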
  • The various sensors 416 are sensors such as an acceleration sensor, a G sensor, and an angular-velocity sensor, and their outputs are used in calculating the current position output by the GPS unit 415 and in measuring changes in speed and heading.
  • The various sensors 416 also include sensors that detect vehicle operations performed by the driver. They may be configured to detect, for example, operation of the turn signals, how far the accelerator is depressed, and how far the brake pedal is depressed. The outputs of the various sensors 416 can also be used as data recorded by the drive recorder function.
  • For each of the various sensors 416, the conditions for saving the drive recorder image can be specified in advance, and the image is saved when a sensor detects those conditions.
  • The condition for the various sensors 416 may be an output above a predetermined threshold value or an output approximating a predetermined pattern. More specifically, for example, the condition may be met when the vibration sensor among the various sensors 416 detects a vibration above a predetermined level or matching a predetermined pattern.
  • The predetermined pattern is any pattern that indicates an abnormal condition, such as a sudden rise in vibration. The condition may also be met when the G sensor detects a G above a predetermined level or matching a predetermined pattern in a given direction.
  • The predetermined pattern of G should likewise be a pattern that indicates an abnormality, such as a sharply rising G.
  • The functions of the detecting unit 101 and the display control unit 105 shown in FIG. 1 are realized by the CPU 401, the function of the acquiring unit 102 by the GPS unit 415 and the various sensors 416, the function of the storing unit 103 by the magnetic disk drive 404 and the optical disk drive 406, and the function of the detecting unit 104 by the various sensors 416.
  • FIG. 5 shows an example of a drive recorder image of the area ahead of the vehicle in this example.
  • In FIG. 5, the drive recorder image ahead of the vehicle is composed of an image 5a captured by the drive recorder function of the navigation device 300 and a map 5b showing the history of the vehicle's movement.
  • In image 5a, the frame number 52 of image 5a and the time 53 at which image 5a was captured are displayed.
  • Image 5a shows a situation in which, for example, the vehicle is at an intersection and an oncoming vehicle 54 is approaching it.
  • Information near the intersection, such as the intersection 55, the traffic light 56, the pedestrian crossing 57, the stop line 58, and a pedestrian or bicycle 59 crossing, is also shown.
  • The map 5b may be, for example, map information on which the points the vehicle has passed are displayed, showing the path of movement as points or as a trajectory.
  • The points near the intersection and at the intersection in map 5b correspond, respectively, to the positions of the vehicle near the intersection and at the intersection in image 5a.
  • In FIG. 5, the map 5b is displayed at the upper right of the drive recorder image, but it is not limited to the upper right and may be displayed anywhere relative to image 5a.
  • FIG. 6 shows an example of a drive recorder image of the vehicle according to this embodiment.
  • In FIG. 6, the drive recorder image of the vehicle is composed of an image 6a captured by the drive recorder function of the navigation device 300 and a map 6b showing the history of the vehicle's movement.
  • Image 6a shows, for example, the situation in which the vehicle collided with an oncoming vehicle 54 at an intersection. In this situation, not only the frame number 62 of image 6a and the time 63 at which image 6a was captured, but also information such as the intersection 55, the traffic light 56, the pedestrian crossing 57, and the stop line 58 becomes information for understanding the conditions at the time of the accident.
  • The map 6b is likewise map information showing the movement of the vehicle up to the moment it collided with the oncoming vehicle 54.
  • The position of the vehicle on the map 6b may be displayed with a mark indicating that the accident occurred.
  • The map 6b may be a map centered on the accident point, or a map oriented with the vehicle's direction of travel upward. The map in 6b may also cover the distance the vehicle moved during the fixed period around the accident point. (Processing of the navigation device 300)
  • FIG. 7 is a flowchart showing the processing of the navigation device of this example.
  • In the flowchart of FIG. 7, the navigation device 300 first judges whether or not the vehicle is running (step S71). The judgment may be made, for example, by referring to the outputs of the various sensors 416.
  • In step S71 the device waits until the vehicle is running, and when it is running (step S71: Yes), the CPU 401 controls the camera 417 to start capturing the drive recorder image (step S72). The drive recorder image may be, for example, an image captured in the direction of travel of the vehicle, as shown in FIGS. 5 and 6 above. Such an image shows, for example, oncoming vehicles, intersections, pedestrians near the intersections, traffic lights, and road signs, in addition to the vehicle itself.
  • Next, the CPU 401 overwrites and records the drive recorder image captured in step S72 (step S73). More specifically, the CPU 401 may control the magnetic disk drive 404 to overwrite the overwrite recording medium with the drive recorder image.
  • The CPU 401 then judges whether or not a shock has been detected (step S74).
  • The detection may be configured as detection of a predetermined value by the various sensors 416.
  • The outputs of the various sensors 416 can trigger the saving of the drive recorder image; more specifically, for example, when the vibration sensor detects a vibration above a predetermined level or matching a predetermined pattern.
  • The predetermined value may be any value that indicates an abnormality, such as a sudden rise. The same applies, for example, when the G sensor detects a G above a predetermined level or matching a predetermined pattern.
  • The predetermined pattern of G can be any pattern that shows an abnormality, such as a sharply rising G.
  • The CPU 401 may also be configured to detect a driving operation that constitutes a sudden movement of the vehicle from the outputs of the various sensors 416. More specifically, this may be an unusual operation such as a sudden steering movement above a predetermined angular speed without the turn signal being operated. It may also be unusual acceleration or deceleration, such as acceleration or deceleration above a predetermined rate, failure to decelerate at an intersection without traffic lights, or failure to decelerate at a red traffic light. It is also possible to register the operation patterns of normal driving and compare the current operation against the registered patterns. Intersections without traffic lights and points that require a stop may be obtained from the map information recorded on the magnetic disk 405 or the optical disk 407.
  • If no shock is detected in step S74 (step S74: No), the device judges whether or not the running of the vehicle has finished (step S710). On the other hand, when a shock is detected in step S74 (step S74: Yes), the CPU 401 saves the drive recorder image (step S75). More specifically, the CPU 401 may control the magnetic disk drive 404 to store, on the save recording medium, the drive recorder image from the detection point detected in step S74 through a fixed time thereafter. The fixed time can be set by the user, and the saving time can be extended if a shock is detected again within the fixed time from the detection point.
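The extend-on-redetection behavior just described can be sketched as a countdown that is reset by each new event. Durations are counted in frames and the values are illustrative:

```python
class SaveWindow:
    """Track the post-event save interval; if another shock is detected
    inside the fixed interval, the interval restarts, extending the
    total saving time."""

    def __init__(self, duration=150):
        self.duration = duration
        self.frames_left = 0

    def on_event(self):
        self.frames_left = self.duration   # start or extend the window

    def tick(self):
        """Advance one frame; return True while saving is active."""
        if self.frames_left > 0:
            self.frames_left -= 1
            return True
        return False
```

Because `on_event` simply reloads the counter, two shocks close together produce one continuous saved span rather than two overlapping clips.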
•   Next, the CPU 401 detects time information on the time at which the drive recorder image was captured and information on the frame number of the drive recorder image at that time (step S76). The time may be detected, for example, from a timer provided in the navigation device 300. Alternatively, the CPU 401 may control the GPS unit 415 to detect it. The frame numbers may be assigned, for example, by the video I/F 412 to each frame of the image captured by the camera 417.
•   Next, the CPU 401 acquires map information on the movement history of the vehicle based on the time information detected in step S76 (step S77). For example, the CPU 401 may control the GPS unit 415 and the various sensors 416 to acquire information on the position of the vehicle and, as shown in FIGS. 5 and 6 above, may be configured to display the movement of the vehicle on the map for the drive recorder using points or a trajectory.
•   Next, the CPU 401 combines the time information and the frame number detected in step S76 and the map information acquired in step S77 with the drive recorder image saved in step S75 (step S78). More specifically, the CPU 401 combines, through the video I/F 412 or the like, the time information and the map information into the frame image corresponding to the frame number at that time. For example, as shown in FIGS. 5 and 6, the time information and the frame number may be displayed on the drive recorder image together with map information showing the movement history of the moving body.
•   Next, the CPU 401 saves the image combined in step S78 (step S79). More specifically, a configuration is possible in which the CPU 401 controls the magnetic disk drive 404 to store the composite image in the storage medium.
•   Next, the navigation device 300 judges whether or not the traveling of the vehicle has ended (step S710). The judgment on traveling may be made, for example, by referring to the outputs of the various sensors 416. More specifically, it may be judged that the traveling of the vehicle has ended when the outputs of the various sensors 416 have stopped.
•   If the traveling of the vehicle has not ended in step S710 (step S710: No), the process returns to step S73 and the overwrite recording of the drive recorder image is repeated. If the traveling of the vehicle has ended in step S710 (step S710: Yes), the series of processing ends.
•   In the above, the captured image is overwrite-recorded as the drive recorder image, but a configuration is also possible in which information on the traveling state, such as the traveling speed, is acquired and recorded together with it. In that case, when an impact is detected in step S74, the recorded output from the detection point through a fixed time thereafter may also be saved.
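•   The constant overwrite recording and the save-on-detection behavior above can be sketched as a ring buffer from which the buffered frames are copied out to permanent storage when an impact is detected. This is a minimal sketch under an assumed buffer size; the class and method names are invented for the example.

```python
from collections import deque

class OverwriteRecorder:
    """Keep only the most recent frames, silently overwriting the oldest ones."""

    def __init__(self, capacity):
        self.buffer = deque(maxlen=capacity)  # old frames fall out automatically
        self.saved = []                       # stands in for the storage medium

    def record(self, frame):
        self.buffer.append(frame)

    def save_on_detection(self):
        """On impact detection, copy the currently buffered frames to storage."""
        self.saved.extend(self.buffer)

recorder = OverwriteRecorder(capacity=3)
for frame in ["f1", "f2", "f3", "f4"]:
    recorder.record(frame)       # "f1" has been overwritten by the time "f4" arrives
recorder.save_on_detection()     # saves ["f2", "f3", "f4"]
```

A real recorder would also keep capturing for a fixed time after the detection point; the buffer here covers only the frames leading up to it.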
•   In the above, an impact is detected and the drive recorder image is saved, but the device may instead be configured to detect signs of an accident; in that case, the configuration may be such that the drive recorder image is saved when a sign of an accident is detected.
•   As described above, according to Embodiment 1, the drive recorder image is constantly overwrite-recorded while the vehicle is traveling. Then, when an impact is detected, the time information, the frame number, and the map information are combined into the drive recorder image. Therefore, when analyzing the circumstances of an accident, information leading to clarification of the cause of the accident, such as the time when the accident occurred, the point where the accident occurred, and the route of the vehicle before and after the accident, can be accurately grasped from the drive recorder image.
•   Further, the drive recorder image into which the time information, the frame number, and the map information are combined is saved. Therefore, the saved image can also be used as material for safe driving.
•   Next, Embodiment 2 will be described, in which the navigation device 300 described above combines the time information, the frame number, and the map information with the drive recorder image and overwrite-records the result before an impact is detected. Note that the peripheral configuration of the navigation device 300 in Embodiment 2 is almost the same as that shown in FIG. 3, so its description is omitted. Likewise, the hardware configuration of the navigation device 300 in Embodiment 2 is almost the same as that shown in FIG. 4, so its description is omitted.
•   FIG. 8 is a flowchart showing the contents of processing of the navigation device according to Embodiment 2.
•   In the flowchart of FIG. 8, the navigation device 300 first determines whether the vehicle is traveling (step S81). The judgment on traveling may be made, for example, by referring to the outputs of the various sensors 416.
•   In step S81, the process waits for the vehicle to start traveling; when the vehicle is traveling (step S81: Yes), the CPU 401 controls the camera 417 to start capturing the drive recorder image (step S82).
•   The drive recorder image may be, for example, an image captured in the traveling direction of the vehicle by the camera 417 mounted on the vehicle, as shown in FIGS. 5 and 6 above. Besides the road ahead, this image may show, for example, oncoming vehicles, intersections, children near the intersections, traffic lights, and road signs.
•   Next, the CPU 401 detects information on the time at which the drive recorder image was captured and information on the frame number of the drive recorder image at that time (step S83). The time may be detected, for example, from a timer provided in the navigation device 300. Alternatively, the CPU 401 may control the GPS unit 415 to detect it.
•   As for the frame numbers, for example, the video I/F 412 may be configured to assign a frame number to each frame of the image captured by the camera 417.
•   Next, the CPU 401 acquires map information on the movement history of the vehicle based on the time information detected in step S83 (step S84).
•   For example, the CPU 401 may control the GPS unit 415 and the various sensors 416 to acquire information on the position of the vehicle and, as shown in FIGS. 5 and 6 above, may be configured to display the movement of the vehicle on the map using points or a trajectory.
•   Next, the CPU 401 combines the time information and the frame number detected in step S83 and the map information acquired in step S84 with the drive recorder image captured in step S82 (step S85). More specifically, the CPU 401 combines, through the video I/F 412 or the like, the time information and the map information into the frame image corresponding to the frame number at that time. For example, as shown in FIGS. 5 and 6, the time information and the frame number may be displayed on the drive recorder image together with map information showing the movement history of the moving body. Then, the CPU 401 overwrite-records the image combined in step S85 (step S86). More specifically, the CPU 401 may control the magnetic disk drive 404 to overwrite-record the composite image in the recording medium for overwrite recording.
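•   The per-frame combining of steps S85 and S86 can be sketched as attaching the time stamp, the frame number, and the route history so far to each frame record before it is overwrite-recorded. The field names and data shapes are assumptions for illustration; an actual device would render this information into the pixels of the frame.

```python
from datetime import datetime

def compose_frame(frame_no, pixels, route_so_far, captured_at):
    """Combine time, frame number, and movement-history info with one frame."""
    return {
        "frame": frame_no,
        "time": captured_at.isoformat(timespec="seconds"),
        "route": list(route_so_far),  # copied: the live route keeps growing
        "image": pixels,
    }

record = compose_frame(
    frame_no=42,
    pixels=b"\x00\x01",  # placeholder for the captured frame data
    route_so_far=[(35.0, 139.0), (35.1, 139.1)],
    captured_at=datetime(2006, 12, 5, 10, 30, 0),
)
```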
•   Next, the CPU 401 determines whether or not an impact has been detected (step S87). The detection may be performed, for example, when the various sensors 416 detect a value above a predetermined level.
•   The outputs of the various sensors 416 may be used as the trigger for saving the drive recorder image; more specifically, the image may be saved when the vibration sensor detects a vibration above a predetermined level or a predetermined pattern of vibration.
•   The predetermined pattern may be any pattern that indicates an abnormality, such as a sudden rise. The trigger may also be, for example, the G sensor detecting a G value above a predetermined level or a predetermined pattern of G changes.
•   The predetermined pattern of G may be any pattern that indicates an abnormality, such as a sharply rising G. Alternatively, a contact sensor mounted on the vehicle body may be used, actuated by a contact or a collision.
•   The CPU 401 may also be configured to detect a dangerous driving operation, that is, an abnormal movement of the vehicle, from the outputs of the various sensors 416. More specifically, an unusual steering action may be used, such as a steering operation beyond a predetermined angle performed without signaling with the turn indicator while traveling above a predetermined speed. In addition, unusual acceleration or deceleration may be used, such as acceleration above a specified level, failure to decelerate at an intersection without traffic lights, or failure to decelerate at a red signal, as well as other characteristic operations observed when the driver's attention lapses. It is also possible to register patterns of normal driving operation in advance and compare the current operation against the registered patterns.
•   If no impact is detected in step S87 (step S87: No), the CPU 401 determines whether or not the traveling of the vehicle has ended (step S89). On the other hand, if an impact is detected in step S87 (step S87: Yes), the image combined in step S85 is saved (step S88). More specifically, the CPU 401 may control the magnetic disk drive 404 to store, in the storage medium, the composite image from the detection point in step S87 through a fixed time thereafter.
•   The fixed time may be set by the user, or the saving time may be extended if an impact is detected again within the fixed time from the detection point.
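•   The extendable saving time mentioned here can be sketched with a deadline that is pushed back each time an impact is detected again within the window. The window length is an assumed value and the class name is invented for the example.

```python
SAVE_WINDOW = 10.0  # seconds of footage kept after a detection (assumed value)

class SaveTimer:
    """Track how long saving should continue after an impact, extending on re-detection."""

    def __init__(self):
        self.deadline = None

    def on_impact(self, now):
        # A further impact inside the current window simply pushes the deadline back.
        self.deadline = now + SAVE_WINDOW

    def still_saving(self, now):
        return self.deadline is not None and now <= self.deadline

timer = SaveTimer()
timer.on_impact(0.0)   # first impact: keep saving until t = 10
timer.on_impact(8.0)   # re-detected at t = 8: keep saving until t = 18
```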
•   Next, the navigation device 300 judges whether the traveling of the vehicle has ended (step S89). The judgment on traveling may be made, for example, by referring to the outputs of the various sensors 416. More specifically, it may be judged that the traveling of the vehicle has ended when the outputs of the various sensors 416 have stopped.
•   If the traveling of the vehicle has not ended in step S89 (step S89: No), the process returns to step S86 and the overwrite recording of the drive recorder image is repeated. If the traveling of the vehicle has ended in step S89 (step S89: Yes), the series of processing ends.
•   As described above, according to Embodiment 2, the time information, the frame number, and the map information are combined with the drive recorder image captured while the vehicle is traveling, and the result is overwrite-recorded. Then, when an impact is detected, this image is saved. Therefore, when analyzing the circumstances of an accident, information leading to clarification of the cause of the accident, such as the time when the accident occurred, the point where the accident occurred, and the route of the vehicle before and after the accident, can be accurately grasped from the drive recorder image.
•   Further, the time information, the frame number, and the map information are combined into the drive recorder image before an impact is detected, so the combined image covers not only the moment the vehicle encounters an accident but also near-miss situations. Then, when an impact is detected at such a point, the composite image is saved. Therefore, the saved image can be used as material for safe driving.
•   Furthermore, according to Embodiment 2, for the drive recorder image at the detection point where the impact was detected, the time at which the image was captured and the map information on the movement history of the vehicle can be displayed simultaneously. Therefore, the accident point and the situation before and after it can be accurately identified.
•   As described above, according to the information recording device, information recording method, information recording program, and computer-readable recording medium of the present invention, information showing the time at which the image was captured, the frame number of the image, and the movement history of the vehicle is combined into the drive recorder image captured while the vehicle is traveling. Therefore, when analyzing the circumstances of an accident, information leading to clarification of the cause of the accident, such as the time when the accident occurred, the point where the accident occurred, and the route of the vehicle before and after the accident, can be accurately grasped from the drive recorder image.
•   The information recording method described in the present embodiments can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
•   This program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, or a CD-ROM, and is executed by being read out from the recording medium by the computer.
•   This program may also be a transmission medium that can be distributed via a network such as the Internet.

Abstract

An information recording device (100) records video data captured by a photographing means installed in a mobile body. The information recording device includes a detecting section (101) for detecting the time at which the video data is captured; an acquiring section (102) for acquiring history information relating to the movement history of the mobile body; a storing section (103) for storing the history information acquired by the acquiring section (102) in association with the video data, based on the capture time detected by the detecting section (101); and a display control section (105) for controlling a display screen to combine and display the movement route and the video data stored in association with each other by the storing section (103).

Description

Specification
Information recording device, information recording method, information recording program, and computer-readable recording medium
Technical Field
[0001] The present invention relates to an information recording device, an information recording method, an information recording program, and a computer-readable recording medium for recording information. However, the use of the present invention is not limited to the above-mentioned information recording device, information recording method, information recording program, and computer-readable recording medium.
Background Art
[0002] Conventionally, drive recorders that record the surrounding conditions of a traveling vehicle, similar to the flight recorders mounted on airplanes, have been known. Such a drive recorder has, for example, a front camera that photographs the area ahead of the vehicle, a rear camera that photographs the area behind it, and a function of writing the front and rear images into a predetermined area of an image memory in synchronization with a reference signal, and constantly records, in a buffer memory, record information in which vehicle position information and time information are added to the image memory information. It has been proposed to save the video with the output of a shock detection sensor reaching a predetermined value or more as a trigger, and to use the saved video to identify the offending vehicle in a hit-and-run incident or as verification material for an accident (for example, see Patent Document 1 below).
[0003] Patent Document 1: Japanese Patent Application Laid-Open No. 2004-224105
Disclosure of the Invention
Problems to be Solved by the Invention
[0004] However, according to the conventional technology described above, the video of the drive recorder does not include information on the movement history, such as the route the vehicle followed before and after the time of an accident. Therefore, one example of the resulting problems is that, when analyzing the cause of an accident, the accident site cannot be accurately determined if information leading to its identification, such as the name of an intersection, does not appear in the drive recorder video. Another example is that, even when information leading to identification of the accident site does appear in the drive recorder video, the user cannot necessarily identify the accident site on the basis of that information.
Means for Solving the Problems
[0005] In order to solve the above problems and achieve an object, an information recording device according to the invention of claim 1 is an information recording device for recording video data captured by a photographing means installed in a mobile body, characterized by comprising: a detecting means for detecting the capture time of the video data; an acquiring means for acquiring history information on the history of movement of the mobile body; and a storing means for storing the history information acquired by the acquiring means in association with the video data, based on the capture time of the video data detected by the detecting means.
[0006] An information recording method according to the invention of claim 5 is an information recording method for recording video data captured by a photographing means installed in a mobile body, characterized by including: a detecting step of detecting the capture time of the video data; an acquiring step of acquiring history information on the history of movement of the mobile body; and a storing step of storing the history information acquired in the acquiring step in association with the video data, based on the capture time of the video data detected in the detecting step.
[0007] An information recording program according to the invention of claim 6 is characterized by causing a computer to execute the information recording method according to claim 5.
[0008] A computer-readable recording medium according to the invention of claim 7 is characterized in that the information recording program according to claim 6 is recorded thereon.
Brief Description of the Drawings
[0009] [FIG. 1] FIG. 1 is a block diagram showing an example of the functional configuration of the information recording device according to the present embodiment.
[FIG. 2] FIG. 2 is a flowchart showing the contents of processing of the information recording device according to the present embodiment.
[FIG. 3] FIG. 3 is an explanatory diagram showing an example of the vicinity of the dashboard of a vehicle in which the navigation device according to Embodiment 1 is installed.
[FIG. 4] FIG. 4 is a block diagram showing an example of the hardware configuration of the navigation device according to Embodiment 1.
[FIG. 5] FIG. 5 is an explanatory diagram showing an example of a drive recorder image before an accident of the vehicle according to Embodiment 1.
[FIG. 6] FIG. 6 is an explanatory diagram showing an example of a drive recorder image at the time of an accident of the vehicle according to Embodiment 1.
[FIG. 7] FIG. 7 is a flowchart showing the contents of processing of the navigation device according to Embodiment 1.
[FIG. 8] FIG. 8 is a flowchart showing the contents of processing of the navigation device according to Embodiment 2. Explanation of Reference Numerals
[0010] 100 Information recording device
101 Detection unit
102 Acquisition unit
103 Storage unit
104 Sensing unit
105 Display control unit
Best Mode for Carrying Out the Invention
[0011] Preferred embodiments of an information recording device, an information recording method, an information recording program, and a computer-readable recording medium according to the present invention will be described below in detail with reference to the accompanying drawings.
[0012] (Embodiment)
(Functional configuration of the information recording device)
The functional configuration of the information recording device according to the present embodiment will be described with reference to FIG. 1. FIG. 1 is a block diagram showing an example of the functional configuration of the information recording device according to the present embodiment.
[0013] In FIG. 1, an information recording device 100 for recording video data captured by a photographing means installed in a mobile body comprises a detection unit 101, an acquisition unit 102, a storage unit 103, a sensing unit 104, and a display control unit 105. [0014] The detection unit 101 detects the capture time of the video data captured by the photographing means installed in the mobile body. The capture time may be detected, for example, from time information given by a timer provided in the information recording device 100. The detected capture time may be stored by the storage unit 103, described later, in association with the video data.
[0015] The acquisition unit 102 acquires history information on the history of movement of the mobile body. The history information may be, for example, information including the route along which the mobile body has moved, and the device may be configured to display on a map the movement route including the position of the mobile body at the capture time detected by the detection unit 101.
[0016] Based on the capture time of the video data detected by the detection unit 101, the storage unit 103 stores the history information acquired by the acquisition unit 102 in association with the video data. More specifically, for each frame of the video data, the capture time detected by the detection unit 101 and the history information, including the movement route of the mobile body, acquired by the acquisition unit 102 may be stored in correspondence with the frame number. Further, when dangerous behavior of the mobile body is sensed by the sensing unit 104, described later, the storage unit 103 may store, for each frame of the video data for a predetermined period including the sensing time, the capture time and the history information including the movement route of the mobile body for that period, in correspondence with the frame number.
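The frame-number association described in this paragraph can be sketched as a table keyed by frame number, each entry holding the frame image together with its capture time and the movement route up to that moment. The function name and field names are assumptions made for illustration.

```python
def associate_by_frame_number(frames, capture_times, routes):
    """Build a frame-number -> (image, capture time, route history) table."""
    return {
        frame_no: {"image": image, "time": time, "route": route}
        for frame_no, (image, time, route)
        in enumerate(zip(frames, capture_times, routes))
    }

table = associate_by_frame_number(
    frames=["img0", "img1"],
    capture_times=["10:30:00", "10:30:01"],
    routes=[[(35.0, 139.0)], [(35.0, 139.0), (35.1, 139.1)]],
)
```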
[0017] The sensing unit 104 senses dangerous behavior of the mobile body. The dangerous behavior may be sensed based on, for example, the outputs of various sensors mounted on the mobile body, reflecting the movement and operation of the mobile body. The various sensors may be, for example, a vibration sensor, a G sensor, a contact sensor for the mobile body, and sensors capable of outputting information on operations such as steering, input of a turn signal, operation of the accelerator pedal, and operation of the brake pedal.
[0018] The sensing unit 104 may sense dangerous behavior, for example, when the output values of the various sensors, such as the vibration sensor and the G sensor, are above specified values or approximate a predetermined pattern indicating an abnormality, or when the contact sensor is actuated. Based on information on the operation of the mobile body, an operation may be judged to result in dangerous behavior when it is a dangerous operation. More specifically, an operation of the mobile body may be sensed as dangerous behavior when, for example, information is output indicating a sudden steering operation beyond a predetermined angle, a steering operation without a turn signal, or unnecessary acceleration or deceleration.
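The two sensor-based triggers in this paragraph, an output above a specified value and an output approximating a registered abnormal pattern, can be sketched as follows. The threshold, tolerance, and pattern values are assumptions made for the example.

```python
G_THRESHOLD = 4.0  # assumed trigger level for the G sensor

def exceeds_threshold(g_value):
    """Trigger when a single G-sensor reading is above the specified value."""
    return abs(g_value) >= G_THRESHOLD

def approximates_pattern(window, abnormal_pattern, tolerance=1.0):
    """Trigger when every sample in the window is close to a registered abnormal pattern."""
    if len(window) != len(abnormal_pattern):
        return False
    return all(abs(w - p) <= tolerance for w, p in zip(window, abnormal_pattern))

# A sharply rising G trace registered as an abnormality pattern:
collision_pattern = [0.5, 2.0, 5.0]
```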
[0019] The display control unit 105 controls a display screen to combine and display the history information of the mobile body and the video data stored in association with each other by the storage unit 103. More specifically, the video data captured by the photographing means installed in the mobile body and the history information including the movement route of the mobile body may be displayed together. At this time, the capture time detected by the detection unit 101 may also be displayed. Alternatively, the history information of the mobile body, the video data, and the capture time may be combined before being stored by the storage unit 103, and the combined video data may be displayed on the display screen by the display control unit 105.
[0020] (Contents of processing of the information recording device)
Next, the contents of the processing of the information recording device 100 according to the present embodiment will be described with reference to FIG. 2. FIG. 2 is a flowchart showing the contents of processing of the information recording device according to the present embodiment. In the flowchart of FIG. 2, the information recording device 100 first judges whether input of video data captured by the photographing means installed in the mobile body has been accepted by an input unit, not shown (step S201). The input of video data may be performed, for example, by the user operating an operation unit, not shown, or may be triggered by the mobile body starting to travel.
[0021] In step S201, the device waits for input of video data to be accepted; when it has been accepted (step S201: Yes), the detection unit 101 detects the capture time of the video data captured by the photographing means installed in the mobile body (step S202). The capture time may be detected, for example, from time information given by a timer provided in the information recording device 100.
[0022] Next, the acquisition unit 102 acquires history information on the history of movement of the mobile body (step S203). The history information may be, for example, information including the route along which the mobile body has moved, and the movement route of the mobile body may be displayed on a map using points or a trajectory.
[0023] Next, the sensing unit 104 senses dangerous behavior of the mobile body (step S204). The dangerous behavior may be sensed based on, for example, the outputs of various sensors mounted on the mobile body, reflecting the movement and operation of the mobile body. Dangerous behavior may be sensed, for example, when the output values of various sensors such as the vibration sensor and the G sensor are above specified values or approximate a predetermined pattern indicating an abnormality, or when the contact sensor is actuated. Based on information on the operation of the mobile body, an operation may be judged to result in dangerous behavior when it is a dangerous operation.
[0024] Then, when the behavior of the mobile body is sensed by the sensing unit 104 in step S204, the storage unit 103 stores the history information of the mobile body and the video data acquired by the acquisition unit 102 in association with the capture time detected by the detection unit 101 (step S205), and the series of processing ends. More specifically, for each frame of the video data captured by the photographing means installed in the mobile body, the history information, including the capture time and the movement route of the mobile body, may be stored in correspondence with the frame number. Further, when dangerous behavior of the mobile body is sensed, the storage unit 103 may store, for each frame of the video data for a predetermined period including the sensing time, the capture time and the history information including the route of the mobile body for that period, in correspondence with the frame number.
[0025] As described above, according to the present embodiment, when dangerous behavior of the mobile body is sensed, the capture time and the history information including the movement route of the mobile body are stored in correspondence with the video data captured by the photographing means installed in the mobile body. Therefore, when analyzing the circumstances of an accident, information leading to clarification of the cause of the accident, such as the capture time and the movement route of the mobile body, can be accurately grasped from the video data.
Embodiment 1
[0026] Embodiment 1 of the present invention will be described below. In Embodiment 1, an example will be described in which the information recording device of the present invention is implemented by a navigation device mounted on a mobile body such as a vehicle (including four-wheeled and two-wheeled vehicles).
[0027] (Peripheral configuration of the navigation device)
First, the peripheral configuration of the navigation device according to Embodiment 1 will be described with reference to FIG. 3. FIG. 3 is an explanatory diagram showing an example of the vicinity of the dashboard of a vehicle in which the navigation device according to Embodiment 1 is installed. [0028] In FIG. 3, the navigation device 300 is installed on the dashboard of the vehicle. The navigation device 300 comprises a main body M and a display D, and the display D shows the current position of the vehicle, map information, the current time, and the like.
[0029] また、ナビゲーション装置300には、ダッシュボード上に設置された車載用カメラ311、サンバイザーに設置された車載用マイク312が接続されている。車載用カメラ311は、レンズの向きを変化させることができ、車内および車外の映像を撮影することができる構成でもよい。車載用マイク312は、ナビゲーション装置300の音声入力による操作や車内の様子を記録する際などに用いられる。なお、この車載用カメラ311は、たとえば、バックミラーの裏面に固定され、車外の映像を撮影する固定カメラでもよい。 [0029] Furthermore, an in-vehicle camera 311 installed on the dashboard and an in-vehicle microphone 312 installed on the sun visor are connected to the navigation device 300. The in-vehicle camera 311 may have a configuration in which the direction of the lens can be changed so that images can be taken both inside and outside the vehicle. The in-vehicle microphone 312 is used to operate the navigation device 300 by voice input and to record the situation inside the vehicle. Note that the in-vehicle camera 311 may instead be, for example, a fixed camera that is mounted on the back of the rearview mirror and captures images of the outside of the vehicle.
[0030] また、図示しないが、車載用カメラ311は、車両の後部に取り付けられていてもよい。車両の後部に車載用カメラ311が取り付けられている場合、車両の後方の安全確認ができる他、他の車両から追突された際に追突時の状況を記録することができる。この他、車載用カメラ311は、暗所の記録をおこなう赤外線カメラであってもよい。また、車載用カメラ311および車載用マイク312は、車両に複数設置されていてもよいし、固定式でなく可動式のカメラであってもよい。 [0030] Although not shown, the in-vehicle camera 311 may also be attached to the rear of the vehicle. When the in-vehicle camera 311 is attached to the rear of the vehicle, it is possible to check safety behind the vehicle, and also to record the situation at the moment of impact when the vehicle is rear-ended by another vehicle. In addition, the in-vehicle camera 311 may be an infrared camera that records in dark places. Furthermore, a plurality of in-vehicle cameras 311 and in-vehicle microphones 312 may be installed in the vehicle, and the cameras may be movable rather than fixed.
[0031] ナビゲーション装置300は、目的地点までの経路探索および情報記録をおこなう他、車両の走行状態について記録するドライブレコーダ機能を有している。ドライブレコーダ機能は、車載用カメラ311や車載用マイク312で得られた映像および音声や、後述するGPSユニット415や各種センサ416で得られた車両の現在地点情報や走行速度の変化などを、ナビゲーション装置300の記録媒体(後述する磁気ディスク405、光ディスク407)に記録する。 [0031] The navigation device 300 not only searches for a route to a destination and records information, but also has a drive recorder function that records the running state of the vehicle. The drive recorder function records video and audio obtained from the in-vehicle camera 311 and in-vehicle microphone 312, as well as the vehicle's current location information and changes in running speed obtained from a GPS unit 415 and various sensors 416 (described later), on a recording medium of the navigation device 300 (a magnetic disk 405 and an optical disk 407, described later).
[0032] このようなドライブレコーダ機能を用いて走行状態を常時記録することによって、自車が事故に巻き込まれた場合や、自車の周囲で事故が発生した場合に、事実関係の究明に用いる資料を得ることができる。ドライブレコーダ機能を用いて記録する情報は、記録媒体の記録容量を超えない限り蓄積してもよいし、所定時間分の記録を残して逐次消去してもよい。また、記録媒体は、走行状態を常時上書き記録する上書き用記録領域と、事故に巻き込まれた場合に走行状態を保存する保存用記録領域を有するものであってもよいし、上書き記録用の記録媒体と保存用の記録媒体をそれぞれ複数備える構成としてもよい。 [0032] By constantly recording the running state using such a drive recorder function, materials for establishing the facts can be obtained when the host vehicle is involved in an accident or when an accident occurs around the host vehicle. The information recorded using the drive recorder function may be accumulated as long as it does not exceed the capacity of the recording medium, or may be successively erased while retaining a predetermined period of records. The recording medium may also have an overwrite recording area that constantly overwrites the running state and a storage recording area that preserves the running state when the vehicle is involved in an accident, or the device may be provided with a plurality of recording media for overwriting and a plurality of recording media for storage.
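The overwrite-then-preserve behavior described above can be sketched as a fixed-size ring buffer that is copied out when a trigger fires. This is a minimal illustration, not the patented implementation; the frame rate, retention window, and class names are assumptions.

```python
from collections import deque

FPS = 30           # assumed capture rate
KEEP_SECONDS = 10  # assumed size of the overwrite region

class DriveRecorder:
    def __init__(self):
        # Overwrite region: old frames fall off automatically once full.
        self.buffer = deque(maxlen=FPS * KEEP_SECONDS)
        # Storage region: clips preserved permanently after a trigger.
        self.saved_clips = []

    def record_frame(self, frame):
        self.buffer.append(frame)

    def on_trigger(self):
        # Copy the pre-trigger footage out before it can be overwritten.
        self.saved_clips.append(list(self.buffer))

rec = DriveRecorder()
for i in range(600):          # 20 s of frames, numbered 0..599
    rec.record_frame(i)
rec.on_trigger()
print(len(rec.buffer))        # 300: only the last 10 s are retained
print(rec.saved_clips[0][0])  # 300: oldest frame still in the buffer
```

The `deque(maxlen=...)` models the "overwrite recording area": appending to a full buffer silently discards the oldest frame, which is exactly the constant-overwrite behavior the text describes.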
[0033] (ナビゲーシヨン装置 300のハードウェア構成) [0033] (Hardware configuration of navigation device 300)
つぎに、図 4を用いて、本実施例 1にかかるナビゲーシヨン装置 300のハードウェア 構成について説明する。図 4は、本実施例 1にかかるナビゲーシヨン装置のハードウ エア構成の一例を示すブロック図である。 Next, the hardware configuration of the navigation device 300 according to the first embodiment will be described using FIG. 4. FIG. 4 is a block diagram showing an example of the hardware configuration of the navigation device according to the first embodiment.
[0034] 図4において、ナビゲーション装置300は、車両などの移動体に搭載されており、CPU401と、ROM402と、RAM403と、磁気ディスクドライブ404と、磁気ディスク405と、光ディスクドライブ406と、光ディスク407と、音声I/F(インターフェース)408と、マイク409と、スピーカ410と、入力デバイス411と、映像I/F412と、ディスプレイ413と、通信I/F414と、GPSユニット415と、各種センサ416と、カメラ417と、を備えている。また、各構成部401〜417はバス420によってそれぞれ接続されている。 [0034] In FIG. 4, the navigation device 300 is mounted on a moving body such as a vehicle, and includes a CPU 401, a ROM 402, a RAM 403, a magnetic disk drive 404, a magnetic disk 405, an optical disk drive 406, an optical disk 407, an audio I/F (interface) 408, a microphone 409, a speaker 410, an input device 411, a video I/F 412, a display 413, a communication I/F 414, a GPS unit 415, various sensors 416, and a camera 417. The components 401 to 417 are each connected by a bus 420.
[0035] まず、 CPU401は、ナビゲーシヨン装置 300の全体の制御を司る。 ROM402は、 ブートプログラム、経路探索プログラム、経路誘導プログラム、音声生成プログラム、 地図情報表示プログラム、通信プログラム、データベース作成プログラム、データ解 析プログラムなどのプログラムを記録している。また、 RAM403は、 CPU401のヮー クエリアとして使用される。 [0035] First, CPU 401 controls the entire navigation device 300. ROM402 records programs such as a boot program, route search program, route guidance program, voice generation program, map information display program, communication program, database creation program, and data analysis program. Additionally, RAM403 is used as a work area for CPU401.
[0036] ここで、経路探索プログラムは、後述する光ディスク 407に記録されている地図情報 などを利用して、出発地点から目的地点までの最適な経路を探索させる。ここで、最 適な経路とは、 目的地点までの最短 (あるいは最速)経路やユーザが指定した条件 に最も合致する経路などである。経路探索プログラムを実行することによって探索さ れた誘導経路は、 CPU401を介して音声 IZF408や映像 IZF412へ出力される。 [0036] Here, the route search program searches for the optimal route from the departure point to the destination point using map information recorded on the optical disc 407, which will be described later. Here, the optimal route is the shortest (or fastest) route to the destination, or the route that best matches the conditions specified by the user. The guidance route searched by executing the route search program is output to the audio IZF 408 and video IZF 412 via the CPU 401.
[0037] また、経路誘導プログラムは、経路探索プログラムを実行することによって探索され た誘導経路情報、通信 IZF414によって取得されたナビゲーシヨン装置 300の現在 地点情報、光ディスク 407から読み出された地図情報に基づいて、リアルタイムな経 路誘導情報の生成をおこなわせる。経路誘導プログラムを実行することによって生成 された経路誘導情報は、 CPU401を介して音声 IZF408や映像 IZF412へ出力さ れる。 [0038] また、音声生成プログラムは、パターンに対応したトーンと音声の情報を生成させる [0037] The route guidance program also uses the guidance route information searched by executing the route search program, the current location information of the navigation device 300 acquired by the communication IZF414, and the map information read from the optical disk 407. Based on this information, real-time route guidance information can be generated. The route guidance information generated by executing the route guidance program is output to the audio IZF 408 and video IZF 412 via the CPU 401. [0038] Furthermore, the voice generation program generates tone and voice information corresponding to the pattern.
。すなわち、経路誘導プログラムを実行することによって生成された経路誘導情報に 基づいて、案内ポイントに対応した仮想音源の設定と音声ガイダンス情報の生成を おこない、 CPU401を介して音声 IZF408へ出力する。 . That is, based on the route guidance information generated by executing the route guidance program, it sets a virtual sound source corresponding to the guidance point and generates voice guidance information, and outputs it to the audio IZF 408 via the CPU 401.
[0039] また、地図情報表示プログラムは、映像 I/F412によってディスプレイ 413に表示 する地図情報の表示形式を決定させ、決定された表示形式によって地図情報をディ スプレイ 413に表示させる。 [0039] Furthermore, the map information display program causes the video I/F 412 to determine the display format of map information to be displayed on the display 413, and causes the map information to be displayed on the display 413 in the determined display format.
[0040] また、 CPU401は、車両の走行中、カメラ 417からドライブレコーダ用画像を撮影す る。そして、 CPU401は、このドライブレコーダ用画像が撮影された時刻に関する時 刻情報と、このドライブレコーダ用画像のフレームナンバーに関するフレーム情報を 検出する。また、 CPU401は、ドライブレコーダ用画像が撮影された時刻における移 動体の移動ルートを含む地図情報を取得する。そして、 CPU401は、この地図情報 と時刻情報およびフレーム情報を後述する ROM402または RAM403などのバッフ ァメモリにおいてドライブレコーダ用画像に合成して、上書き記録用の記録媒体に上 書き記録する。また、 CPU401は、後述する各種センサ 416によって、トリガーが検 知された場合に、この合成画像を保存用の記録媒体に保存する。 [0040] Furthermore, the CPU 401 captures images for the drive recorder from the camera 417 while the vehicle is running. Then, the CPU 401 detects time information regarding the time when this drive recorder image was taken and frame information regarding the frame number of this drive recorder image. Further, the CPU 401 acquires map information including the moving route of the moving object at the time when the drive recorder image was taken. Then, the CPU 401 combines this map information, time information, and frame information into a drive recorder image in a buffer memory such as ROM 402 or RAM 403 (described later), and overwrites the image on a recording medium for overwriting. Furthermore, when a trigger is detected by various sensors 416, which will be described later, the CPU 401 saves this composite image in a storage recording medium.
[0041] また、CPU401は、車両の走行中、カメラ417から撮影されたドライブレコーダ用画像を一旦上書き記録用の記録媒体に記録する。そして、CPU401は、後述する各種センサ416によってトリガーが検知された場合に、ドライブレコーダ用画像を保存用の記録媒体に保存し、時刻情報とフレーム情報および移動体の移動ルートを含む地図情報を取得する。そして、CPU401は、後述するROM402またはRAM403などのバッファメモリにおいて時刻情報とフレーム情報および地図情報をドライブレコーダ用画像に合成し、この合成画像を保存用の記録媒体に保存する構成でもよい。また、CPU401は、ユーザなどによりドライブレコーダ用画像を表示する指示があれば後述するディスプレイ413に表示する。 [0041] Alternatively, while the vehicle is running, the CPU 401 may first record the drive recorder images taken by the camera 417 on the recording medium for overwriting. Then, when a trigger is detected by the various sensors 416 (described later), the CPU 401 saves the drive recorder images on the storage recording medium and acquires the time information, the frame information, and map information including the travel route of the moving body. The CPU 401 may then combine the time information, frame information, and map information with the drive recorder image in a buffer memory such as the ROM 402 or RAM 403 (described later) and save this composite image on the storage recording medium. In addition, if the user or the like gives an instruction to display a drive recorder image, the CPU 401 displays it on a display 413 (described later).
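The compositing step in the two paragraphs above, where each image is paired with its capture time, frame number, and recent travel route, can be sketched as follows. Field names, the function name, and the sample values are illustrative assumptions, not part of the patent.

```python
from datetime import datetime

def annotate_frame(frame_no, image_bytes, capture_time, route_points):
    """Bundle one drive-recorder frame with its time, frame, and map metadata."""
    return {
        "frame": frame_no,                 # frame information
        "time": capture_time.isoformat(),  # time information
        "route": list(route_points),       # recent (lat, lon) fixes: map/route info
        "image": image_bytes,              # the captured frame itself
    }

record = annotate_frame(
    602,
    b"<jpeg bytes>",
    datetime(2006, 12, 5, 10, 15, 30),
    [(35.6812, 139.7671), (35.6815, 139.7668)],
)
print(record["frame"], record["time"])  # 602 2006-12-05T10:15:30
```

Whether this record is built for every frame before buffering, or only once a trigger fires, corresponds to the two alternative configurations described in paragraphs [0040] and [0041].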
[0042] 磁気ディスクドライブ404は、CPU401の制御にしたがって磁気ディスク405に対するデータの読み取り/書き込みを制御する。磁気ディスク405は、磁気ディスクドライブ404の制御で書き込まれたデータを記録する。磁気ディスク405としては、たとえば、HD(ハードディスク)やFD(フレキシブルディスク)を用いることができる。 [0042] The magnetic disk drive 404 controls reading and writing of data on the magnetic disk 405 under the control of the CPU 401. The magnetic disk 405 records data written under the control of the magnetic disk drive 404. As the magnetic disk 405, for example, an HD (hard disk) or an FD (flexible disk) can be used.
[0043] また、光ディスクドライブ406は、CPU401の制御にしたがって光ディスク407に対するデータの読み取り/書き込みを制御する。光ディスク407は、光ディスクドライブ406の制御にしたがってデータの読み出される着脱自在な記録媒体である。光ディスク407は、書き込み可能な記録媒体を利用することもできる。また、この着脱自在な記録媒体として、光ディスク407のほか、MO、メモリカードなどであってもよい。 [0043] The optical disk drive 406 controls reading and writing of data on the optical disk 407 under the control of the CPU 401. The optical disk 407 is a removable recording medium from which data is read under the control of the optical disk drive 406. A writable recording medium can also be used as the optical disk 407. Besides the optical disk 407, this removable recording medium may be an MO, a memory card, or the like.
[0044] 磁気ディスク 405、光ディスク 407に記録される情報の一例として、図 3に示した車 載用カメラ 311や車載用マイク 312で得られた車内外の映像や音声、後述する GPS ユニット 415で検出された車両の現在地点情報、後述する各種センサ 416からの出 力値などが挙げられる。これらの情報は、ナビゲーシヨン装置 300が有するドライブレ コーダ機能によって記録され、交通事故発生時の検証用資料などとして用いられる。 [0044] Examples of information recorded on the magnetic disk 405 and the optical disk 407 include images and sounds inside and outside the vehicle obtained by the in-vehicle camera 311 and in-vehicle microphone 312 shown in FIG. Examples include current location information of the detected vehicle, output values from various sensors 416, which will be described later. This information is recorded by the drive recorder function of the navigation device 300 and is used as verification materials in the event of a traffic accident.
[0045] その他、磁気ディスク 405、光ディスク 407に記録される情報の他の一例として、経 路探索'経路誘導などに用いる地図情報が挙げられる。地図情報は、建物、河川、 地表面などの地物 (フィーチャ)をあらわす背景データと、道路の形状をあらわす道 路形状データとを有しており、ディスプレイ 413の表示画面において 2次元または 3次 元に描画される。ナビゲーシヨン装置 300が経路誘導中の場合は、地図情報と後述 する GPSユニット 415によって取得された自車の現在地点とが重ねて表示されること となる。 [0045] Another example of information recorded on the magnetic disk 405 and the optical disk 407 is map information used for route searching, route guidance, and the like. The map information includes background data representing features such as buildings, rivers, and the ground surface, and road shape data representing the shape of roads. drawn originally. When the navigation device 300 is providing route guidance, the map information and the current location of the own vehicle obtained by the GPS unit 415, which will be described later, are displayed superimposed.
[0046] 道路形状データは、さらに交通条件データを有する。交通条件データには、たとえ ば、各ノードについて、信号や横断歩道などの有無、高速道路の出入り口やジャンク シヨンの有無、各リンクについての長さ(距離)、道幅、進行方向、道路種別(高速道 路、有料道路、一般道路など)などの情報が含まれている。 [0046] The road shape data further includes traffic condition data. Traffic condition data includes, for example, the presence or absence of traffic lights and crosswalks for each node, the presence or absence of expressway entrances and exits, junctions, the length (distance) of each link, road width, direction of travel, and road type (highway Contains information such as roads, toll roads, general roads, etc.
[0047] また、交通条件データには、過去の渋滞情報を、季節 ·曜日'大型連休 '時刻など を基準に統計処理した過去渋滞情報を記憶して 、る。ナビゲーシヨン装置 300は、 後述する通信 IZF414によって受信される道路交通情報によって現在発生している 渋滞の情報を得るが、過去渋滞情報により、指定した時刻における渋滞状況の予想 をおこなうことが可能となる。 [0047] In addition, the traffic condition data stores past traffic congestion information that is statistically processed based on season, day of the week, long holidays, time, etc. The navigation device 300 obtains information on currently occurring traffic jams from the road traffic information received by communication IZF414, which will be described later, but it also makes it possible to predict the traffic jam situation at a specified time based on past traffic jam information. .
[0048] なお、本実施例 1では地図情報を磁気ディスク 405、光ディスク 407に記録するよう にしたが、これに限るものではない。地図情報は、ナビゲーシヨン装置 300のハードウ エアと一体に設けられているものに限って記録されているものではなぐナビゲーショ ン装置 300外部に設けられていてもよい。その場合、ナビゲーシヨン装置 300は、た とえば、通信 IZF414を通じて、ネットワークを介して地図情報を取得する。取得され た地図情報は RAM403などに記憶される。 [0048] In the first embodiment, map information is recorded on the magnetic disk 405 and the optical disk 407. However, it is not limited to this. The map information is not limited to being recorded integrally with the hardware of the navigation device 300, but may be provided outside the navigation device 300. In that case, the navigation device 300 acquires map information via the network, for example, through the communication IZF414. The acquired map information is stored in RAM403, etc.
[0049] また、音声I/F408は、音声入力用のマイク409(たとえば、図3の車載用マイク312)および音声出力用のスピーカ410に接続される。マイク409に受音された音声は、音声I/F408内でA/D変換される。また、スピーカ410からは音声が出力される。なお、マイク409から入力された音声は、音声データとして磁気ディスク405あるいは光ディスク407に記録可能である。 [0049] The audio I/F 408 is connected to a microphone 409 for audio input (for example, the in-vehicle microphone 312 in FIG. 3) and a speaker 410 for audio output. The audio received by the microphone 409 is A/D converted in the audio I/F 408. Audio is output from the speaker 410. Note that the audio input from the microphone 409 can be recorded on the magnetic disk 405 or the optical disk 407 as audio data.
[0050] また、入力デバイス 411は、文字、数値、各種指示などの入力のための複数のキー を備えたリモコン、キーボード、マウス、タツチパネルなどが挙げられる。 [0050] Examples of the input device 411 include a remote control, a keyboard, a mouse, a touch panel, etc., each having a plurality of keys for inputting characters, numbers, various instructions, and the like.
[0051] また、映像 IZF412は、ディスプレイ 413およびカメラ 417 (たとえば、図 3の車載用 カメラ 311)と接続される。映像 IZF412は、具体的には、たとえば、ディスプレイ 413 全体の制御をおこなうグラフィックコントローラと、即時表示可能な画像情報を一時的 に記録する VRAM (Video RAM)などのバッファメモリと、グラフィックコントローラか ら出力される画像データに基づいて、ディスプレイ 413を表示制御する制御 ICなどに よって構成される。なお、本実施例 1において映像 I/F412は、カメラ 417から撮影さ れたフレーム画像の各フレームごとにフレームナンバーを付与していく構成でもよい 。また、本実施例 1において映像 IZF412は、ディスプレイ 413を表示制御して、後 述するドライブレコーダ用画像に、時刻情報と、フレームナンバーと、移動体の移動 ルートに関する地図情報を合わせて表示してもよい。 [0051] Further, the video IZF 412 is connected to a display 413 and a camera 417 (for example, the in-vehicle camera 311 in FIG. 3). Specifically, the video IZF412 includes, for example, a graphics controller that controls the entire display 413, a buffer memory such as VRAM (Video RAM) that temporarily records image information that can be displayed immediately, and output from the graphics controller. It is composed of a control IC that controls the display 413 based on the image data that is displayed. Note that in the first embodiment, the video I/F 412 may be configured to assign a frame number to each frame of the frame image photographed by the camera 417. Furthermore, in the first embodiment, the video IZF 412 controls the display 413 to display an image for a drive recorder, which will be described later, along with time information, a frame number, and map information regarding the moving route of the moving object. Good too.
[0052] ディスプレイ 413には、アイコン、カーソル、メニュー、ウィンドウ、あるいは文字や画 像などの各種データが表示される。このディスプレイ 413は、たとえば、 CRT、 TFT 液晶ディスプレイ、プラズマディスプレイなどを採用することができる。ディスプレイ 41 3は、たとえば、図 3の表示部 Dのような態様で設置される。また、ディスプレイ 413は 、車両に複数備えられていてもよぐたとえば、ドライバーに対するものと後部座席に 着座する搭乗者に対するものなどである。 [0053] カメラ 417は、車両内部あるいは外部の映像を撮影する。映像は静止画あるいは動 画のどちらでもよぐたとえば、カメラ 417によって車両内部のドライバーの挙動を撮 影し、撮影した映像を映像 IZF412を介して磁気ディスク 405や光ディスク 407など の記録媒体に出力する。また、カメラ 417によって車両外部の状況を撮影し、撮影し た映像を映像 I/F412を介して磁気ディスク 405や光ディスク 407などの記録媒体 に出力する。また、記録媒体に出力された映像は、ドライブレコーダ用画像として上 書き記録や保存がおこなわれる。 [0052] The display 413 displays icons, cursors, menus, windows, and various data such as characters and images. This display 413 can be, for example, a CRT, a TFT liquid crystal display, a plasma display, or the like. The display 413 is installed, for example, in a manner similar to display section D in FIG. 3. Further, the vehicle may be equipped with a plurality of displays 413, for example, one for the driver and one for passengers seated in the rear seats. [0053] Camera 417 takes images of the inside or outside of the vehicle. The video can be either a still image or a video. For example, the camera 417 records the behavior of the driver inside the vehicle, and the video is output to a recording medium such as a magnetic disk 405 or an optical disk 407 via the video IZF 412. . Further, the camera 417 photographs the situation outside the vehicle, and the photographed image is output to a recording medium such as a magnetic disk 405 or an optical disk 407 via a video I/F 412. In addition, the video output to the recording medium is overwritten and saved as an image for the drive recorder.
[0054] また、通信 IZF414は、無線を介してネットワークに接続され、ナビゲーシヨン装置 300と CPU401とのインターフェースとして機能する。通信 IZF414は、さらに、無線 を介してインターネットなどの通信網に接続され、この通信網と CPU401とのインター フェースとしても機能する。 [0054] Furthermore, the communication IZF 414 is connected to the network via wireless, and functions as an interface between the navigation device 300 and the CPU 401. Communication IZF414 is further connected to a communication network such as the Internet via wireless, and also functions as an interface between this communication network and CPU401.
[0055] 通信網には、LAN、WAN、公衆回線網や携帯電話網などがある。具体的には、通信I/F414は、たとえば、FMチューナー、VICS(Vehicle Information and Communication System)/ビーコンレシーバ、無線ナビゲーション装置、およびその他のナビゲーション装置によって構成され、VICSセンターから配信される渋滞や交通規制などの道路交通情報を取得する。なお、VICSは登録商標である。 [0055] Communication networks include LANs, WANs, public line networks, and mobile phone networks. Specifically, the communication I/F 414 is composed of, for example, an FM tuner, a VICS (Vehicle Information and Communication System)/beacon receiver, a wireless navigation device, and other navigation devices, and acquires road traffic information, such as congestion and traffic regulations, distributed from the VICS center. Note that VICS is a registered trademark.
[0056] また、GPSユニット415は、GPS衛星からの受信波や後述する各種センサ416(たとえば、角速度センサや加速度センサ、タイヤの回転数など)からの出力値を用いて、車両の現在地点(ナビゲーション装置300の現在地点)を示す情報を算出する。現在地点を示す情報は、たとえば緯度・経度、高度などの、地図情報上の1点を特定する情報である。また、GPSユニット415は、各種センサ416からの出力値を用いて、オドメーター、速度変化量、方位変化量を出力する。これにより、急ブレーキ、急ハンドルなどの動態を解析することができる。 [0056] The GPS unit 415 uses waves received from GPS satellites and output values from the various sensors 416 described later (for example, an angular velocity sensor, an acceleration sensor, tire rotation speed, etc.) to calculate information indicating the current location of the vehicle (the current location of the navigation device 300). The information indicating the current location is information that specifies one point on the map information, such as latitude, longitude, and altitude. The GPS unit 415 also uses the output values from the various sensors 416 to output an odometer reading, the amount of change in speed, and the amount of change in heading. This makes it possible to analyze dynamics such as sudden braking and sudden steering.
[0057] ここで、GPSユニット415を用いた現在地点の特定方法について説明する。まず、GPSにおいては、地球の周りの6つの軌道面に4個ずつ、合計24個のGPS衛星が配置されている。これらの衛星は、毎日同じ時刻に同じ衛星が位置するように軌道が調整され、地球上のどの地点からも(ただし、見通しのよい場所である必要がある)常に5ないし6個の衛星が見える。 [0058] GPS衛星には、セシウム(Cs)の原子時計(発振器)が搭載されており、各衛星の時刻と同期を受けつつ正確な時刻を刻んでいる。さらに、各衛星には予備としてセシウム発振器が2台、ルビジウム(Rb)発振器が2台搭載されている。これは、GPSによる位置計測には正確な時刻が不可欠なためである。 [0057] Here, a method of identifying the current location using the GPS unit 415 will be described. First, in GPS, a total of 24 GPS satellites are arranged, four in each of six orbital planes around the earth. The orbits of these satellites are adjusted so that the same satellites are in the same positions at the same time every day, and five or six satellites are always visible from any point on the earth (provided the location has a clear view of the sky). [0058] Each GPS satellite is equipped with a cesium (Cs) atomic clock (oscillator), which keeps accurate time while being synchronized with the time of the other satellites. Furthermore, each satellite carries two backup cesium oscillators and two rubidium (Rb) oscillators. This is because accurate time is essential for position measurement using GPS.
[0059] GPS衛星からは1575.42MHz(L1)および1227.60MHz(L2)の2つの周波数の電波(以下、GPS信号という)が送信されている。この電波は疑似ランダム符号(Pseudo Random Noise Code)と呼ばれる乱数符号で変調されており、GPSユニット415などで受信した場合には、乱数表に相当するコードを参照し信号内容を解読する。 [0059] GPS satellites transmit radio waves (hereinafter referred to as GPS signals) on two frequencies: 1575.42 MHz (L1) and 1227.60 MHz (L2). These radio waves are modulated with a random-number code called a pseudo-random noise code; when received by the GPS unit 415 or the like, the signal contents are decoded by referring to a code corresponding to a random-number table.
[0060] GPSユニット 415は、解読したコードと自装置内の時計によって、 GPS衛星力も GP S信号が発射された時刻と、自装置が GPS信号を受信した時刻との信号の時間差を 計測する。そして、時間差に電波の伝播速度を掛け合わせ、 GPS衛星から自装置ま での距離を算出する (距離 =速度 X時間)。なお、この時刻は協定世界時 (UTC)に 同期されている。 [0060] The GPS unit 415 uses the decoded code and the clock in its own device to measure the time difference between the GPS satellite signal and the time when the GPS signal is emitted and the time when its own device receives the GPS signal. Then, the distance from the GPS satellite to the device is calculated by multiplying the time difference by the radio wave propagation speed (distance = speed x time). Note that this time is synchronized with Coordinated Universal Time (UTC).
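The "distance = speed × time" calculation described above can be written out directly. The sample times below are illustrative values, not figures from the patent.

```python
# Pseudorange from the transmit/receive time difference: distance = speed x time.
C = 299_792_458.0   # propagation speed of the GPS signal (speed of light), m/s
t_transmit = 0.000  # time the satellite emitted the code, s (UTC-synchronized)
t_receive = 0.070   # time the receiver observed the same code, s (receiver clock)

pseudorange_m = (t_receive - t_transmit) * C
print(pseudorange_m / 1000)  # roughly 20985 km, a plausible satellite range
```

Because `t_receive` comes from the receiver's less-accurate clock, this quantity is a *pseudo*range: it contains a common offset that paragraph [0063] resolves by adding a fourth satellite.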
[0061] GPS衛星からは、軌道の正確な情報が送られてくるため、GPS衛星の現在地点は正確に知ることができる。したがって、GPS衛星からの距離が分かれば、自装置の現在地点はGPS衛星を中心として、求めた距離を半径とする球面上のいずれかの地点となる。なお、GPS信号の符号列は約1msの間隔で繰り返し送られる。GPS信号の伝播速度は、300,000km/秒であるため、最大測定距離は、300,000×0.001=300kmとなる。したがって、100km程度の精度においては、あらかじめ自装置の現在地点を知っておく必要がある。 [0061] Since accurate orbit information is sent from the GPS satellites, the current position of each GPS satellite can be known precisely. Therefore, if the distance from a GPS satellite is known, the current location of the device is some point on a sphere centered on that GPS satellite with the calculated distance as its radius. Note that the code string of the GPS signal is sent repeatedly at intervals of about 1 ms. Since the propagation speed of the GPS signal is 300,000 km/s, the maximum unambiguous measurement distance is 300,000 × 0.001 = 300 km. Therefore, the device's current location must be known in advance to an accuracy of about 100 km.
[0062] このように、各GPS衛星のうち3つの衛星からの距離を算出すれば、自装置の現在地点は3つの球面が交わる2点のうちのいずれか一方となる。また、2点のうち一方は、予測できる地点からかけ離れているため、原理的には1点が決定されることとなる。しかしながら、実際には算出される現在地点の候補点(3つの面の交点)は2点にならない。これは、主にGPSユニット415に搭載された時計の精度が、GPS衛星に搭載された原子時計に比べて低いため、計算結果に誤差が生じてしまうためである。 [0062] Thus, if the distances from three of the GPS satellites are calculated, the current location of the device is one of the two points where the three spheres intersect. Moreover, one of the two points is far from any plausible location, so in principle a single point is determined. In practice, however, the calculated candidate points for the current location (the intersections of the three surfaces) do not converge to two points. This is mainly because the clock in the GPS unit 415 is less accurate than the atomic clocks on the GPS satellites, so errors arise in the calculation results.
[0063] このため、 GPSユニット 415では、合計 4つの GPS衛星から GPS信号を受信する。 これは、 GPSユニット 415側の時計の誤差分を別の未知数として、新たな情報 (方程 式)を導入することで解を得ると考えることができる。このように、 GPSユニット 415は、 4つの GPS衛星からの GPS信号を受信することによって、 1点に収束するほぼ正確 な現在地点を求めることができる。 [0063] Therefore, GPS unit 415 receives GPS signals from a total of four GPS satellites. This can be thought of as being solved by introducing new information (equation) with the clock error on the GPS unit 415 as another unknown. In this way, by receiving GPS signals from four GPS satellites, the GPS unit 415 can determine an almost accurate current location that converges on one point.
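The idea of treating the receiver clock error as an extra unknown can be illustrated with a simplified two-dimensional analogue: three "satellites" at known positions, and three unknowns (x, y, clock bias expressed in distance units), solved by Gauss-Newton iteration. All positions and the bias value are made-up test data; real GPS solves the 3D case with four satellites in exactly the same way.

```python
import math

SATS = [(0.0, 100.0), (80.0, 90.0), (-70.0, 60.0)]  # known satellite positions
TRUE_POS = (10.0, 20.0)                              # receiver position to recover
TRUE_BIAS = 5.0                                      # clock error, distance units

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Measured pseudoranges: geometric distance plus the common clock-bias term.
pseudoranges = [dist(s, TRUE_POS) + TRUE_BIAS for s in SATS]

def solve(sats, rho, iters=20):
    x, y, b = 0.0, 0.0, 0.0
    for _ in range(iters):
        # Residuals r_i = |s_i - p| + b - rho_i and their Jacobian rows.
        A, r = [], []
        for (sx, sy), m in zip(sats, rho):
            d = math.hypot(x - sx, y - sy)
            A.append([(x - sx) / d, (y - sy) / d, 1.0])
            r.append(d + b - m)
        # Solve the 3x3 system A * delta = -r by Gaussian elimination.
        n = 3
        M = [A[i] + [-r[i]] for i in range(n)]
        for col in range(n):
            piv = max(range(col, n), key=lambda k: abs(M[k][col]))
            M[col], M[piv] = M[piv], M[col]
            for k in range(col + 1, n):
                f = M[k][col] / M[col][col]
                for j in range(col, n + 1):
                    M[k][j] -= f * M[col][j]
        delta = [0.0] * n
        for i in reversed(range(n)):
            s = sum(M[i][j] * delta[j] for j in range(i + 1, n))
            delta[i] = (M[i][n] - s) / M[i][i]
        x, y, b = x + delta[0], y + delta[1], b + delta[2]
    return x, y, b

x, y, b = solve(SATS, pseudoranges)
print(round(x, 3), round(y, 3), round(b, 3))
```

With three range equations and three unknowns, the iteration converges to a single point and simultaneously recovers the clock bias, which is why adding one more satellite (the fourth, in 3D) resolves the ambiguity the text describes.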
[0064] 各種センサ 416は、車速センサや加速度センサ、 Gセンサ、角速度センサなどであ り、その出力値は、 GPSユニット 415による現在地点の算出や、速度や方位の変化 量の測定などに用いられる。また、各種センサ 416は、ドライバーによる車両の各操 作を検知するセンサなども含む。車両の各操作の検知は、たとえば、ハンドル操作や ウィンカーの入力やアクセルペダルの踏み込みやブレーキペダルの踏み込みなどを 検知する構成としてもよい。また、各種センサ 416の出力値は、ドライブレコーダ機能 で記録するデータとしてもよ 、。 [0064] Various sensors 416 include vehicle speed sensors, acceleration sensors, G sensors, angular velocity sensors, etc., and their output values are used for calculation of the current location by the GPS unit 415, measurement of changes in speed and direction, etc. It will be done. The various sensors 416 also include sensors that detect various operations of the vehicle by the driver. The detection of each operation of the vehicle may be configured to detect, for example, steering wheel operation, turn signal input, accelerator pedal depression, brake pedal depression, etc. Additionally, the output values of the various sensors 416 can be used as data to be recorded with a drive recorder function.
[0065] また、各種センサ416においては、あらかじめ、ドライブレコーダ用画像を保存する際のトリガーを設定しておき、トリガーが検知された場合にドライブレコーダ用画像を保存する構成としてもよい。トリガーは、たとえば、ドライブレコーダ用画像を保存するきっかけとなるものなどで、各種センサ416における、所定のしきい値以上の出力や所定のパターンと近似する出力などをトリガーとする構成でもよい。より具体的には、たとえば、各種センサ416におけるトリガーは、振動センサで規定以上の振動や所定の振動パターンを検知した場合に設定してもよい。所定の振動パターンは、急激な立ち上がりなど、異常を示す振動パターンであればよい。また、トリガーは、たとえば、Gセンサで規定以上のGや所定のGのかかり方のパターンを検知した場合に設定してもよい。所定のGのかかり方は、急激な立ち上がりなど、異常を示すパターンであればよい。あるいは、車体の接触センサによる、他との接触の有無やエアバッグなどの作動や車両の停止をトリガーとする構成でもよい。さらに、前述のトリガーは一つ以上であればよく、複数を組み合わせてトリガーとしてもよい。 [0065] A trigger for saving drive recorder images may be set in advance for the various sensors 416, so that drive recorder images are saved when the trigger is detected. A trigger is, for example, an event that causes a drive recorder image to be saved; an output from the various sensors 416 at or above a predetermined threshold, or an output approximating a predetermined pattern, may serve as the trigger. More specifically, a trigger on the various sensors 416 may be set, for example, when a vibration sensor detects vibration above a specified level or a predetermined vibration pattern. The predetermined vibration pattern may be any pattern that indicates an abnormality, such as a sudden rise. A trigger may also be set, for example, when a G sensor detects G-forces above a specified level or a predetermined pattern of applied G-force; the predetermined pattern may likewise be any pattern that indicates an abnormality, such as a sudden rise. Alternatively, the trigger may be the presence or absence of contact with another object detected by a body contact sensor, the deployment of an airbag or the like, or the stopping of the vehicle. Furthermore, one or more of the above triggers may be used, and several may be combined into a single trigger.
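The threshold-based trigger logic above can be sketched as a simple predicate. The threshold values and function names are illustrative assumptions, not figures from the patent; as the text notes, any one condition firing is enough, and conditions may be combined.

```python
# Assumed, illustrative thresholds for abnormal readings.
G_THRESHOLD = 2.5           # |acceleration| above this (in g) counts as abnormal
VIBRATION_THRESHOLD = 4.0   # vibration amplitude above this counts as abnormal

def is_save_trigger(g_value, vibration, airbag_deployed, contact_detected):
    """Return True when any configured save-trigger condition fires."""
    return (abs(g_value) >= G_THRESHOLD
            or vibration >= VIBRATION_THRESHOLD
            or airbag_deployed
            or contact_detected)

print(is_save_trigger(0.3, 1.0, False, False))  # False: normal driving
print(is_save_trigger(3.1, 0.5, False, False))  # True: hard deceleration
print(is_save_trigger(0.2, 0.1, True, False))   # True: airbag deployment
```

Pattern-based triggers (e.g. a sudden rise in the vibration waveform) would replace the simple threshold comparison with a check of recent samples against a reference pattern, but the any-condition-fires structure stays the same.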
[0066] なお、実施の形態にかかる情報記録装置100の機能的構成のうち、検出部101および表示制御部105はCPU401によって、取得部102はGPSユニット415および各種センサ416によって、保存部103は磁気ディスクドライブ404や光ディスクドライブ406によって、検知部104は各種センサ416によって、それぞれその機能を実現する。 [0066] Of the functional configuration of the information recording device 100 according to the embodiment, the detection unit 101 and the display control unit 105 are realized by the CPU 401, the acquisition unit 102 by the GPS unit 415 and the various sensors 416, the storage unit 103 by the magnetic disk drive 404 and the optical disk drive 406, and the sensing unit 104 by the various sensors 416.
[0067] つぎに、図5、図6を用いて、本実施例1にかかるドライブレコーダ用画像の概略について説明する。図5は、本実施例1にかかる車両の事故前におけるドライブレコーダ用画像の一例を示す説明図である。図5において、車両の事故前におけるドライブレコーダ用画像500は、自車両501に搭載されたナビゲーション装置300のドライブレコーダ機能によって撮影された画像500aと、自車両501が移動した履歴に関する地図情報500bとによって構成される。画像500aには、画像500aのフレームナンバー502と、画像500aが撮影された時刻503が表示されている。本図において、画像500aは、たとえば、自車両501が交差点にさしかかり、対向車504が自車両501に接近している状況が映し出されている。この他にも、たとえば、交差点の名称505、信号機506、横断歩道507、停止線508、横断歩道または自転車横断帯ありを示す標識509、交差点付近の建物510などの情報が映し出されている。 [0067] Next, an overview of drive recorder images according to the first embodiment will be described using FIGS. 5 and 6. FIG. 5 is an explanatory diagram showing an example of a drive recorder image before a vehicle accident according to the first embodiment. In FIG. 5, the pre-accident drive recorder image 500 is composed of an image 500a captured by the drive recorder function of the navigation device 300 mounted in the host vehicle 501, and map information 500b on the history of the host vehicle 501's movement. The image 500a displays the frame number 502 of the image 500a and the time 503 at which the image 500a was captured. In this figure, the image 500a shows, for example, a situation in which the host vehicle 501 is approaching an intersection and an oncoming vehicle 504 is approaching the host vehicle 501. In addition, information such as the intersection name 505, a traffic light 506, a crosswalk 507, a stop line 508, a sign 509 indicating a crosswalk or bicycle crossing, and buildings 510 near the intersection is shown.
[0068] また、地図情報 500bは、たとえば、自車両 501が移動した移動ルートを地図上に 表示する地図情報などでもよぐ自車両 501が移動した移動ルート 511を点または軌 跡で示す構成でもよい。地図情報 500bにおける交差点付近の建物 512および交差 点の名称 513は、画像 500aにおける交差点付近の建物 510および交差点の名称 5 05にそれぞれ対応している。なお、図 5において、地図情報 500bはドライブレコーダ 用画像 500の右上に表示されている力 これは右上に限ることはなぐ画像 500aと関 連付けて表示される構成であればどこでもよ 、。 [0068] The map information 500b may also be configured to show the travel route 511 traveled by the host vehicle 501 as a point or a trajectory, such as map information that displays the travel route traveled by the host vehicle 501 on a map. good. The building 512 near the intersection and the name 513 of the intersection in the map information 500b correspond to the building 510 near the intersection and the name 505 of the intersection in the image 500a, respectively. In FIG. 5, the map information 500b is the power displayed at the upper right of the drive recorder image 500. This is not limited to the upper right, but may be displayed anywhere as long as it is displayed in association with the image 500a.
[0069] FIG. 6 is an explanatory diagram showing an example of a drive recorder image at the time of a vehicle accident according to the first embodiment. In FIG. 6, an image 600 at the time of a vehicle accident is composed of an image 600a captured by the drive recorder function of the navigation device 300 mounted on the host vehicle 501, and map information 600b concerning the history of movement of the host vehicle 501. The image 600a shows, for example, a situation in which the host vehicle 501 has collided with the oncoming vehicle 504 at an intersection. In such a situation, not only the frame number 602 of the image 600a and the time 603 at which the image 600a was captured, but also information such as the intersection name 505, the traffic light 506, the crosswalk 507, and the buildings 510 near the intersection serve as information for grasping the circumstances at the time the accident occurred.

[0070] The map information 600b may be map information that displays the travel route 611 of the host vehicle 501 up to the collision with the oncoming vehicle 504. When an accident occurs, the position of the host vehicle 501 in the map information 600b may be displayed with a mark 612 that makes it clear that an accident has occurred. The map information 600b may be a map centered on the accident location, or a map with the traveling direction of the host vehicle 501 facing upward. The scale of the map in the map information 600b may be set assuming the distance the host vehicle 501 travels in a predetermined period before and after the time of the accident.
[0071] (Processing performed by the navigation device 300)
Next, the processing performed by the navigation device 300 according to the first embodiment will be described with reference to FIG. 7. FIG. 7 is a flowchart showing the processing of the navigation device according to the first embodiment. In the flowchart of FIG. 7, the navigation device 300 first determines whether the vehicle has started traveling (step S701). This determination may be made, for example, by referring to the outputs of the various sensors 416.
[0072] In step S701, the device waits until the vehicle starts traveling. When the vehicle is traveling (step S701: Yes), the CPU 401 controls the camera 417 to start capturing drive recorder images (step S702). A drive recorder image may be, for example, an image of the area ahead of the vehicle captured by the camera 417 mounted on the vehicle, as shown in FIG. 5 or FIG. 6 described above. Such an image shows, for example, oncoming vehicles, intersections, the area around intersections, traffic lights, and road signs, in addition to the host vehicle.
[0073] The CPU 401 then overwrite-records the drive recorder images captured in step S702 (step S703). More specifically, the CPU 401 may control the magnetic disk drive 404 to overwrite-record the drive recorder images onto a recording medium reserved for overwrite recording.
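The overwrite recording of step S703 behaves like a fixed-capacity ring buffer: once the overwrite area is full, the oldest frames are discarded to make room for new ones. A minimal software sketch of that behavior follows; the class name, capacity, and dict-based frame representation are illustrative assumptions, since the patent itself writes to a magnetic disk region via the magnetic disk drive 404.

```python
from collections import deque

class OverwriteRecorder:
    """Fixed-capacity ring buffer: the oldest frame is evicted when full.

    Software stand-in for the overwrite-recording medium of step S703;
    illustrative only, not the patent's disk-based implementation.
    """

    def __init__(self, capacity):
        self._frames = deque(maxlen=capacity)

    def record(self, frame):
        # A deque created with maxlen silently drops the oldest entry
        # once capacity is reached, i.e. it "overwrites" old data.
        self._frames.append(frame)

    def snapshot(self):
        # The frames currently retained, oldest first.
        return list(self._frames)

recorder = OverwriteRecorder(capacity=3)
for n in range(5):              # record frames 0..4
    recorder.record({"frame_no": n})
print(recorder.snapshot())      # prints [{'frame_no': 2}, {'frame_no': 3}, {'frame_no': 4}]
```

Sizing the capacity to somewhat more than the pre/post-trigger window of step S705 would ensure the frames needed at a trigger are always still in the buffer.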
[0074] Next, the CPU 401 determines whether a trigger has been detected (step S704). A trigger may be detected, for example, by the various sensors 416 detecting a predetermined condition. A trigger is an event, derived from the outputs of the various sensors 416, that prompts saving of the drive recorder images; more specifically, it may be the vibration sensor detecting vibration exceeding a specified level or a predetermined vibration pattern. The predetermined vibration pattern may be any pattern indicating an abnormality, such as a sharply rising vibration. The trigger may also be, for example, the G sensor detecting G-force exceeding a specified level or a predetermined G-force pattern. The predetermined G-force pattern may be any pattern indicating an abnormality, such as a sharply rising G-force. Alternatively, the trigger may be the presence or absence of contact with another object detected by a body contact sensor, or the deployment of an airbag or the like.
[0075] The CPU 401 may also be configured to detect, as a trigger, a driver operation that results in dangerous vehicle behavior, based on the outputs of the various sensors 416. More specifically, the trigger may be an unusual steering operation, such as sudden steering exceeding a predetermined angular velocity, steering beyond a specified angle without using the turn signal, or steering characteristic of a drowsy driver. The trigger may also be an unusual pedal operation, such as acceleration or deceleration exceeding a specified level, failure to decelerate at an intersection without traffic lights, failure to decelerate at a red (or yellow) light, or pedal operation characteristic of a drowsy driver. Abnormal steering or pedal operations may be detected by registering operation patterns in advance and comparing actual operations against the registered patterns. Intersections without traffic lights and other points where stopping is required may be obtained from the map information recorded in the ROM 402 or elsewhere. The color of a traffic light may be determined from the images captured by the camera 417.
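The trigger conditions of paragraphs [0074] and [0075] reduce to threshold and pattern checks over the outputs of the various sensors 416. The sketch below shows such a check in its simplest form; every threshold value and field name here is an illustrative assumption, not a value disclosed in the patent:

```python
# Illustrative thresholds -- the patent only speaks of "specified" levels.
G_LIMIT = 0.8              # longitudinal/lateral acceleration, in g
VIBRATION_LIMIT = 5.0      # arbitrary vibration-sensor units
STEERING_LIMIT_DEG = 90.0  # steering angle allowed without a turn signal

def trigger_detected(sample):
    """Return True when a sensor sample indicates an accident or
    dangerous driving, per the trigger examples of [0074]-[0075].
    `sample` is a dict such as:
    {"g": 0.3, "vibration": 1.2, "steering_deg": 10,
     "turn_signal": False, "airbag": False}
    """
    if sample.get("airbag"):                        # airbag deployment
        return True
    if sample.get("g", 0.0) >= G_LIMIT:             # excessive G-force
        return True
    if sample.get("vibration", 0.0) >= VIBRATION_LIMIT:
        return True
    # Steering beyond a set angle while the turn signal stays off.
    if (abs(sample.get("steering_deg", 0.0)) >= STEERING_LIMIT_DEG
            and not sample.get("turn_signal", False)):
        return True
    return False

print(trigger_detected({"g": 1.2}))  # prints True
```

A production implementation would also compare time series against registered abnormal operation patterns (sharply rising vibration or G-force, drowsiness-specific steering), which a single-sample check like this cannot express.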
[0076] If no trigger is detected in step S704 (step S704: No), it is determined whether the vehicle has finished traveling (step S710). If a trigger is detected in step S704 (step S704: Yes), the CPU 401 saves the drive recorder images (step S705). More specifically, the CPU 401 may control the magnetic disk drive 404 to save, to a recording medium reserved for preservation, the drive recorder images covering the time at which the trigger was detected in step S704 and a fixed period before and after it. The fixed period may be user-configurable, and the saved period may be extended if another trigger is detected within the fixed period after the first detection.
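The saving rule of paragraph [0076] — preserve the frames inside a fixed window around the trigger, and let a second trigger inside that window extend the saved period — can be sketched as a union of per-trigger windows. The function name, timestamps, and window lengths are illustrative assumptions:

```python
def frames_to_save(frames, trigger_times, pre, post):
    """Select frames whose timestamps fall within [t - pre, t + post]
    of any trigger time t. Overlapping windows from repeated triggers
    merge naturally, which extends the saved period as in [0076].
    `frames` is a list of (timestamp, frame) pairs.
    """
    keep = set()
    for t in trigger_times:
        for ts, _ in frames:
            if t - pre <= ts <= t + post:
                keep.add(ts)
    return [(ts, f) for ts, f in frames if ts in keep]

frames = [(t, f"frame-{t}") for t in range(10)]       # timestamps 0..9
one = frames_to_save(frames, trigger_times=[4], pre=2, post=2)
print([ts for ts, _ in one])                          # prints [2, 3, 4, 5, 6]
```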
[0077] Next, the CPU 401 detects time information indicating the time at which a drive recorder image was captured, and frame information indicating the frame number of the drive recorder image at that time (step S706). The time information may be obtained, for example, from a timer provided in the navigation device 300, or the CPU 401 may control the GPS unit 415 to obtain it. As for the frame numbers, the video I/F 412 may, for example, assign a frame number to each frame of the frame images captured by the camera 417.
[0078] Next, the CPU 401 acquires map information concerning the history of movement of the vehicle, based on the time information detected in step S706 (step S707). For example, the CPU 401 may control the GPS unit 415 and the various sensors 416 to acquire information on the vehicle's position, and the map information may display the route traveled by the vehicle on a map using points or a trajectory, as shown in FIGS. 5 and 6 described above.
[0079] The CPU 401 then combines the time information and frame information detected in step S706 and the map information acquired in step S707 with the drive recorder images saved in step S705 (step S708). More specifically, in a buffer memory such as the ROM 402 or the RAM 403, the CPU 401 combines the time information and the map information into the frame image corresponding to the frame number at that time. The resulting composite image may, for example, display the frame information and time information at the upper left of the drive recorder image and the map information concerning the movement history of the mobile body at the upper right, as shown in FIGS. 5 and 6.
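The layout of step S708 — frame number and time at the upper left, the map tile at the upper right — can be shown in miniature by treating images as character grids. This is purely illustrative; the device composites real pixel data in the buffer memory (ROM 402 / RAM 403):

```python
def composite(frame, frame_no, time_str, map_tile):
    """Stamp "#frame_no time" into the top-left corner and paste the
    map tile into the top-right corner, mirroring the layout of FIG. 5
    and FIG. 6. Images are lists of equal-length strings, one character
    per "pixel" -- an illustrative stand-in for real bitmaps.
    """
    rows = [list(r) for r in frame]
    width = len(rows[0])
    label = f"#{frame_no} {time_str}"
    for i, ch in enumerate(label[:width]):       # top-left: frame no. + time
        rows[0][i] = ch
    for y, map_row in enumerate(map_tile):       # top-right: map tile
        for x, ch in enumerate(map_row):
            rows[y][width - len(map_row) + x] = ch
    return ["".join(r) for r in rows]

frame = ["." * 16 for _ in range(4)]
out = composite(frame, 602, "10:23:45", ["MM", "MM"])
print(out[0])   # prints "#602 10:23:45.MM"
```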
[0080] Next, the CPU 401 saves the composite image generated in step S708 (step S709). More specifically, the CPU 401 may control the magnetic disk drive 404 to save the composite image to a recording medium reserved for preservation.
[0081] The navigation device 300 then determines whether the vehicle has finished traveling (step S710). This determination may be made, for example, by referring to the outputs of the various sensors 416; more specifically, the vehicle may be judged to have finished traveling when the outputs of the various sensors 416 stop.
[0082] If the vehicle has not finished traveling in step S710 (step S710: No), the process returns to step S703 and the overwrite recording of drive recorder images is repeated. If the vehicle has finished traveling (step S710: Yes), the series of processes ends.
[0083] In the description of this figure, images of the vehicle's surroundings are captured as drive recorder images and overwrite-recorded in step S703, but information on the traveling state, such as the outputs of the various sensors 416, may also be overwrite-recorded together with the images. In that case, when a trigger is detected in step S704, the sensor outputs for the detection time and a fixed period before and after it may also be saved. Furthermore, although this figure describes saving drive recorder images upon detecting a trigger, the device may instead be configured to detect the near-miss stage of an accident; in that case, the trigger settings may be changed so that attempted accidents (near misses) can also be detected.
[0084] As described above, according to the first embodiment, drive recorder images are continuously overwrite-recorded while the vehicle is traveling. When a trigger is detected, the time information, frame information, and map information are combined with the drive recorder images and saved. Therefore, when analyzing the circumstances of a vehicle accident, information that helps clarify the cause of the accident, such as the time of occurrence, the location of occurrence, and the vehicle's travel route before and after the accident, can be accurately grasped from the drive recorder images.
[0085] Furthermore, according to the first embodiment, when a trigger is detected not only at the moment of an accident but also at the moment of a near miss, a drive recorder image combined with the time information, frame information, and map information is saved. The saved composite images can therefore be widely used to promote safe driving.
Embodiment 2
[0086] Next, a second embodiment of the present invention will be described. The second embodiment describes a case in which the navigation device 300 described in the first embodiment combines the time information, frame information, and map information with the drive recorder images before a trigger is detected, and then overwrite-records the result. The peripheral device configuration of the navigation device 300 according to the second embodiment is substantially the same as in FIG. 3, and its description is therefore omitted. The hardware configuration of the navigation device 300 according to the second embodiment is likewise substantially the same as in FIG. 4, and its description is omitted.
[0087] (Processing performed by the navigation device 300)
Here, the processing performed by the navigation device 300 according to the second embodiment will be described with reference to FIG. 8. FIG. 8 is a flowchart showing the processing of the navigation device according to the second embodiment. In the flowchart of FIG. 8, the navigation device 300 first determines whether the vehicle has started traveling (step S801). This determination may be made, for example, by referring to the outputs of the various sensors 416.

[0088] The device waits until the vehicle starts traveling. When the vehicle is traveling (step S801: Yes), the CPU 401 controls the camera 417 to start capturing drive recorder images (step S802). A drive recorder image may be, for example, an image of the area ahead of the vehicle captured by the camera 417 mounted on the vehicle, as shown in FIG. 5 or FIG. 6 described above. Such an image shows, for example, oncoming vehicles, intersections, the area around intersections, traffic lights, and road signs, in addition to the host vehicle.
[0089] Next, the CPU 401 detects time information indicating the time at which a drive recorder image was captured, and frame information indicating the frame number of the drive recorder image at that time (step S803). The time information may be obtained, for example, from a timer provided in the navigation device 300, or the CPU 401 may control the GPS unit 415 to obtain it. As for the frame numbers, the video I/F 412 may, for example, assign a frame number to each frame of the frame images captured by the camera 417.
[0090] Next, the CPU 401 acquires map information concerning the history of movement of the vehicle, based on the time information detected in step S803 (step S804). For example, the CPU 401 may control the GPS unit 415 and the various sensors 416 to acquire information on the vehicle's position, and the map information may display the route traveled by the vehicle on a map using points or a trajectory, as shown in FIGS. 5 and 6 described above.
[0091] The CPU 401 then combines the time information and frame information detected in step S803 and the map information acquired in step S804 with the drive recorder images of step S802 (step S805). More specifically, in a buffer memory such as the ROM 402 or the RAM 403, the CPU 401 combines the time information and the map information into the frame image corresponding to the frame number at that time. The composite image may, for example, display the frame information and time information at the upper left of the drive recorder image and the map information concerning the movement history of the mobile body at the upper right, as shown in FIGS. 5 and 6.

[0092] Subsequently, the CPU 401 overwrite-records the composite image generated in step S805 (step S806). More specifically, the CPU 401 controls the magnetic disk drive 404 to overwrite-record the composite image onto a recording medium reserved for overwrite recording.
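What distinguishes the second embodiment is the ordering: each frame is annotated with its time, frame number, and map information in step S805 before being overwrite-recorded in step S806, so that saving on a trigger (step S808) is a plain copy of an already-composited buffer. A compressed sketch of that per-frame loop follows; all identifiers and the buffer capacity are illustrative assumptions:

```python
from collections import deque

buffer = deque(maxlen=100)    # overwrite-recording area; capacity is illustrative

def annotate(frame_no, pixels, time_str, map_tile):
    # Step S805: the metadata is attached BEFORE recording,
    # not after a trigger as in the first embodiment.
    return {"frame_no": frame_no, "time": time_str,
            "map": map_tile, "pixels": pixels}

def on_new_frame(frame_no, pixels, time_str, map_tile):
    buffer.append(annotate(frame_no, pixels, time_str, map_tile))  # step S806

def on_trigger():
    # Step S808: every buffered frame is already composited, so saving
    # requires no further processing at the (time-critical) trigger moment.
    return list(buffer)

for n in range(3):
    on_new_frame(n, b"...", f"10:23:4{n}", "MAP")
saved = on_trigger()
print(len(saved), saved[0]["frame_no"])   # prints 3 0
```

The trade-off is doing the compositing work on every frame rather than only on the (rare) triggered ones, in exchange for zero extra processing at the moment a trigger fires.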
[0093] Next, the CPU 401 determines whether a trigger has been detected (step S807). A trigger may be detected, for example, by the various sensors 416 detecting a predetermined condition. A trigger is an event, derived from the outputs of the various sensors 416, that prompts saving of the drive recorder images; more specifically, it may be the vibration sensor detecting vibration exceeding a specified level or a predetermined vibration pattern. The predetermined vibration pattern may be any pattern indicating an abnormality, such as a sharply rising vibration. The trigger may also be, for example, the G sensor detecting G-force exceeding a specified level or a predetermined G-force pattern. The predetermined G-force pattern may be any pattern indicating an abnormality, such as a sharply rising G-force. Alternatively, the trigger may be the presence or absence of contact with another object detected by a body contact sensor, or the deployment of an airbag or the like.
[0094] The CPU 401 may also be configured to detect, as a trigger, a driver operation that results in dangerous vehicle behavior, based on the outputs of the various sensors 416. More specifically, the trigger may be an unusual steering operation, such as sudden steering exceeding a predetermined angular velocity, steering beyond a specified angle without using the turn signal, or steering characteristic of a drowsy driver. The trigger may also be an unusual pedal operation, such as acceleration or deceleration exceeding a specified level, failure to decelerate at an intersection without traffic lights, failure to decelerate at a red (or yellow) light, or pedal operation characteristic of a drowsy driver. Abnormal steering or pedal operations may be detected by registering operation patterns in advance and comparing actual operations against the registered patterns. Intersections without traffic lights and other points where stopping is required may be obtained from the map information recorded in the ROM 402 or elsewhere. The color of a traffic light may be determined from the images captured by the camera 417.
[0095] If no trigger is detected in step S807 (step S807: No), the CPU 401 determines whether the vehicle has finished traveling (step S809). If a trigger is detected in step S807 (step S807: Yes), the composite images generated in step S805 are saved (step S808). More specifically, the CPU 401 may control the magnetic disk drive 404 to save, to a recording medium reserved for preservation, the composite images covering the time at which the trigger was detected in step S807 and a fixed period before and after it. The fixed period may be user-configurable, and the saved period may be extended if another trigger is detected within the fixed period after the first detection.
[0096] The navigation device 300 then determines whether the vehicle has finished traveling (step S809). This determination may be made, for example, by referring to the outputs of the various sensors 416; more specifically, the vehicle may be judged to have finished traveling when the outputs of the various sensors 416 stop.
[0097] If the vehicle has not finished traveling in step S809 (step S809: No), the process returns to step S806 and the overwrite recording of drive recorder images is repeated. If the vehicle has finished traveling (step S809: Yes), the series of processes ends.
[0098] As described above, according to the second embodiment, the time information, frame information, and map information are combined with the drive recorder images captured while the vehicle is traveling, before any trigger is detected, and the result is overwrite-recorded. When a trigger is detected, the composite images are saved. Therefore, when analyzing the circumstances of a vehicle accident, information that helps clarify the cause of the accident, such as the time of occurrence, the location of occurrence, and the vehicle's travel route before and after the accident, can be accurately grasped from the drive recorder images.
[0099] Furthermore, according to the second embodiment, the time information, frame information, and map information are combined with the drive recorder images before a trigger is detected, whether at the moment of an accident or at the moment of a near miss. When a trigger is detected at the moment of a near miss, the composite images are saved. The saved composite images can therefore be widely used to promote safe driving.
[0100] The present invention need only have at least one of the functions of the first and second embodiments. For example, when the functions of both the first and second embodiments are provided, the time at which the video was captured and the map information concerning the vehicle's history at that time can be displayed simultaneously on the drive recorder images for the moment the trigger was detected and the periods before and after it. The situation at the time of the accident and before and after it can therefore be accurately grasped.
[0101] As described above, according to the present invention, the time at which a drive recorder image captured while the vehicle is traveling was taken, the frame number of the image, and map information showing the vehicle's history are combined with the image and saved. Therefore, when analyzing the circumstances of a vehicle accident, information that helps clarify the cause of the accident, such as the time of occurrence, the location of occurrence, and the vehicle's travel route before and after the accident, can be accurately grasped from the drive recorder images.
[0102] The information recording method described in the present embodiments can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation. The program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read from the recording medium by the computer. The program may also be a transmission medium that can be distributed via a network such as the Internet.

Claims

請求の範囲 The scope of the claims
[1] 移動体に設置された撮影手段により撮像された映像データを記録する情報記録装 ¾【こ; i l /、て、 [1] An information recording device that records video data captured by a camera installed on a moving object.
前記映像データの撮像時刻を検出する検出手段と、 detection means for detecting an imaging time of the video data;
前記移動体が移動した履歴に関する履歴情報を取得する取得手段と、 前記検出手段によって検出された前記映像データの撮像時刻に基づいて、前記 取得手段によって取得された履歴情報と前記映像データとを関連付けて保存する保 存手段と、 an acquisition unit that acquires history information regarding a history of movement of the moving body; and an acquisition unit that associates the history information acquired by the acquisition unit with the video data based on the imaging time of the video data detected by the detection unit. a storage means for storing it,
を備えることを特徴とする情報記録装置。 An information recording device comprising:
[2] 前記移動体の危険な挙動を検知する検知手段を備え、 [2] comprising a detection means for detecting dangerous behavior of the moving object,
前記保存手段は、 The storage means is
さらに、前記検知手段によって検知された検知結果に基づいて、前記履歴情報と 前記映像データとを関連付けて保存することを特徴とする請求項 1に記載の情報記 録装置。 The information recording device according to claim 1, further comprising storing the history information and the video data in association with each other based on a detection result detected by the detection means.
[3] 前記取得手段は、 [3] The acquisition means includes:
少なくとも前記移動体が移動した移動ルートを取得するものであり、 At least a travel route traveled by the mobile object is acquired,
前記保存手段は、 The storage means is
前記移動ルートと前記映像データとを関連付けて保存することを特徴とする請求項 Claim characterized in that the travel route and the video data are stored in association with each other.
1に記載の情報記録装置。 The information recording device described in 1.
[4] The information recording device according to any one of claims 1 to 3, further comprising display control means for controlling a display screen to combine and display the travel route and the video data stored in association with each other by the storage means.
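For the combined display of claim 4, one piece of machinery a display controller needs is a lookup from playback time to route position, so a map marker can track the video. A minimal sketch, assuming the route is a time-sorted list of `(time, lat, lon)` tuples (a representation chosen here for illustration, not specified by the patent):

```python
from bisect import bisect_left

def route_point_at(route, t):
    """route: list of (time, lat, lon) tuples sorted by time.
    Return the route point whose timestamp is closest to playback
    time t, so the map marker can be drawn alongside the frame."""
    times = [p[0] for p in route]
    i = bisect_left(times, t)
    # The closest point is either just before or just after t.
    candidates = route[max(0, i - 1):i + 1] or [route[-1]]
    return min(candidates, key=lambda p: abs(p[0] - t))

route = [(0.0, 35.000, 139.000),
         (1.0, 35.001, 139.000),
         (2.0, 35.002, 139.001)]
```

For example, `route_point_at(route, 1.4)` selects the point recorded at t = 1.0, the nearer of the two neighbors.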
[5] An information recording method for recording video data captured by an imaging means installed on a moving body, the method comprising:
a detection step of detecting an imaging time of the video data;
an acquisition step of acquiring history information on a movement history of the moving body; and
a storage step of storing, based on the imaging time of the video data detected in the detection step, the history information acquired in the acquisition step in association with the video data.
[6] An information recording program causing a computer to execute the information recording method according to claim 5.
[7] A computer-readable recording medium on which the information recording program according to claim 6 is recorded.
PCT/JP2006/324375 2005-12-09 2006-12-06 Information recording device, information recording method, information recording program and computer readable recording medium WO2007066696A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005355978 2005-12-09
JP2005-355978 2005-12-09

Publications (1)

Publication Number Publication Date
WO2007066696A1 true WO2007066696A1 (en) 2007-06-14

Family

ID=38122842

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/324375 WO2007066696A1 (en) 2005-12-09 2006-12-06 Information recording device, information recording method, information recording program and computer readable recording medium

Country Status (1)

Country Link
WO (1) WO2007066696A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06344832A (en) * 1993-06-14 1994-12-20 Nippondenso Co Ltd Road condition recording system
JPH08235491A (en) * 1995-02-27 1996-09-13 Toyota Motor Corp Recorder and analyzer for running state of vehicle
JPH1053165A (en) * 1996-08-07 1998-02-24 Nippon Soken Inc Accident situation recorder
JP2002236990A (en) * 2000-12-06 2002-08-23 Tsukuba Multimedia:Kk Map guide image system
JP2003063459A (en) * 2001-08-29 2003-03-05 Matsushita Electric Ind Co Ltd Drive recorder device for vehicle
JP2003123185A (en) * 2001-10-11 2003-04-25 Hitachi Ltd Danger information collection and distribution equipment, alarm generator, vehicle danger information transmitter and route searching device
JP2004009833A (en) * 2002-06-05 2004-01-15 Nissan Motor Co Ltd Driving situation recording device
JP2004259069A (en) * 2003-02-26 2004-09-16 Aisin Seiki Co Ltd Alarm system for outputting alarm signal depending on vehicle hazard level

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009083815A (en) * 2007-10-03 2009-04-23 Fujitsu Ten Ltd Drive recorder device and accident analysis simulation device
JP2009246503A (en) * 2008-03-28 2009-10-22 Denso It Laboratory Inc Drive video image summary apparatus
JP2011128005A (en) * 2009-12-17 2011-06-30 Fujitsu Ten Ltd Navigation device, on-vehicle display system, and map display method
JP6152232B1 (en) * 2017-02-23 2017-06-21 京セラ株式会社 Electronics
US10459315B2 (en) 2017-02-23 2019-10-29 Kyocera Corporation Electronic apparatus for displaying overlay images
JP2018137697A (en) * 2017-02-23 2018-08-30 京セラ株式会社 Electronic apparatus
CN110178166A (en) * 2017-04-17 2019-08-27 Jvc建伍株式会社 Recording control apparatus, recording device, navigation device, recording method and program
WO2018193679A1 (en) * 2017-04-17 2018-10-25 株式会社Jvcケンウッド Recording control device, recording apparats, navigation apparatus, recording method, and program
JP2018181043A (en) * 2017-04-17 2018-11-15 株式会社Jvcケンウッド Record controller, recorder, navigator, recording method, and program
EP3614358A4 (en) * 2017-04-17 2020-04-08 Jvckenwood Corporation Recording control device, recording apparats, navigation apparatus, recording method, and program
JP2018137728A (en) * 2017-11-08 2018-08-30 京セラ株式会社 Electronic apparatus, method for overlay, and program for overlay
JP6346701B1 (en) * 2017-11-08 2018-06-20 京セラ株式会社 Electronic device, overlay method, and overlay program
EP3734556A4 (en) * 2017-12-27 2021-01-27 JVCKenwood Corporation Recording control device, recording device, recording control method, and recording control program
CN113874920A (en) * 2019-06-07 2021-12-31 马自达汽车株式会社 Moving body external environment recognition device

Similar Documents

Publication Publication Date Title
JP4799565B2 (en) Information recording apparatus, information recording method, information recording program, and computer-readable recording medium
WO2007066696A1 (en) Information recording device, information recording method, information recording program and computer readable recording medium
JP5585194B2 (en) Accident situation recording system
WO2007049596A1 (en) Information recording apparatus, information recording method, information recording program and computer readable recording medium
WO2008010391A1 (en) Information distribution device, information processing device, information distribution method, information processing method, information distribution program, information processing program, and computer readable recording medium
JP2008250463A (en) Information recording device, information recording method, information recording program and computer-readable recording medium
JP2007255907A (en) Route searching system, and device, method, and program for information registration, and computer-readable recording medium
WO2007063849A1 (en) Information recording apparatus, information recording method, information recording program, and computer readable recording medium
WO2008010392A1 (en) Information processing device, information processing method, information processing program, and computer readable recording medium
JP4845481B2 (en) Information recording apparatus, information recording method, information recording program, and computer-readable recording medium
JP4866061B2 (en) Information recording apparatus, information recording method, information recording program, and computer-readable recording medium
JP2009090927A (en) Information management server, parking assist device, navigation system equipped with parking assist device, information management method, parking assist method, information management program, parking assist program, and record medium
JP4825810B2 (en) Information recording apparatus, information recording method, information recording program, and recording medium
WO2008072282A1 (en) Information recording device, information processing device, information recording method, information processing method, information recording program, and computer-readable recording medium
JP4521036B2 (en) Route search device, route search method, route search program, and computer-readable recording medium
WO2008038333A1 (en) Data recording device, data recording method, data recording program, and computer readable recording medium
WO2007119348A1 (en) Information providing apparatus, information providing method, information providing program and recording medium
JP4776627B2 (en) Information disclosure device
WO2007049520A1 (en) Information recording apparatus, information recording method, information recording program, and computer readable recording medium
WO2007055241A1 (en) Information recording device, information recording method, information recording program and recording medium
JP2013061763A (en) Determining device
JP4987872B2 (en) Information recording apparatus, information recording method, information recording program, and computer-readable recording medium
WO2007066612A1 (en) Information recording device, communication device, information recording method, communication method, information recording program, communication program, and computer-readable recording medium
JP4289296B2 (en) Guide device and program
WO2007055194A1 (en) Information recording device, information recording method, information recording program and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06834130

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP