WO2007066696A1 - Information recording device, information recording method, information recording program, and computer-readable recording medium - Google Patents


Info

Publication number
WO2007066696A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
information recording
vehicle
video data
time
Prior art date
Application number
PCT/JP2006/324375
Other languages
English (en)
Japanese (ja)
Inventor
Hiroaki Shibasaki
Original Assignee
Pioneer Corporation
Priority date
Filing date
Publication date
Application filed by Pioneer Corporation filed Critical Pioneer Corporation
Publication of WO2007066696A1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/433 Content storage operation, e.g. storage operation in response to a pause request or caching operations
    • H04N 21/4334 Recording operations
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N 21/41422 Specialised client platforms located in transportation means, e.g. personal vehicle
    • H04N 21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/4223 Cameras
    • H04N 5/00 Details of television systems
    • H04N 5/76 Television signal recording
    • H04N 5/765 Interface circuits between an apparatus for recording and another apparatus
    • H04N 5/77 Interface circuits between a recording apparatus and a television camera

Definitions

  • The present invention relates to an information recording device that records information, an information recording method, an information recording program, and a computer-readable recording medium.
  • However, use of the present invention is not limited to the above-mentioned information recording device, information recording method, information recording program, and computer-readable recording medium.
  • Conventionally, drive recorders have been known that record the situation of a running vehicle, in the same way that a flight recorder on board an aircraft records its flight. Such a drive recorder includes, for example, a front camera that captures the view ahead of the vehicle and a rear camera that captures the view behind it.
  • However, the image recorded by the drive recorder does not include information about the history of the vehicle's route before and after an accident. Therefore, when analyzing the cause of an accident, the accident point cannot be grasped accurately if, for example, information that would identify the location, such as the name of an intersection, does not appear in the recorded image. This is given as one example of the problem.
  • Moreover, even when information that could identify the accident point is captured in the recorded image, it is not always possible to identify the accident point from that information alone.
  • To solve the above problem, the information recording device according to the invention, which records video data captured by an imaging means installed on a moving body, includes a detecting means that detects the shooting time of the video data and an acquiring means that acquires information on the history of movement of the moving body.
  • It further includes a storing means that stores the information obtained by the acquiring means in association with the video data, based on the shooting time detected by the detecting means.
  • Likewise, the information recording method according to the invention, for recording video data captured by an imaging means installed on a moving body, includes a detecting step of detecting the shooting time of the video data and an acquiring step of acquiring information on the history of movement of the moving body.
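The association that the storing means maintains, between each frame of video data, its shooting time, and the moving body's position, can be sketched as follows. This is a minimal illustration; none of these names or types appear in the patent.

```python
from dataclasses import dataclass

# Illustrative sketch of the storing means: each video frame is kept in
# association with the detected shooting time and the moving body's
# position. All names here are assumptions, not from the patent text.

@dataclass
class FrameRecord:
    frame_number: int
    timestamp: float   # shooting time detected for this frame
    position: tuple    # (latitude, longitude) from the movement history

def associate(frame_numbers, times, positions):
    """Store time and position keyed by frame number."""
    store = {}
    for n, t, p in zip(frame_numbers, times, positions):
        store[n] = FrameRecord(n, t, p)
    return store

records = associate(range(3), [0.0, 0.033, 0.066],
                    [(35.0, 139.0), (35.0001, 139.0), (35.0002, 139.0)])
```

Keying by frame number (rather than by wall-clock time alone) lets the later display step recover the position that corresponds to any individual frame.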
  • FIG. 1 is a diagram showing an example of the functional configuration of the information recording device according to the present embodiment.
  • FIG. 2 is a flowchart showing the processing of the information recording device according to the present embodiment.
  • FIG. 3 is a diagram showing an example of a vehicle equipped with the navigation device of the present example, viewed from the vicinity of the dashboard.
  • FIG. 4 is a block diagram showing an example of the hardware configuration of the navigation device of the present example.
  • FIG. 6 is a diagram showing an example of a drive-recorder image of the vehicle in the present example.
  • FIG. 1 is a diagram showing an example of the functional configuration of the information recording device according to the present embodiment.
  • The information recording device 100, which records video data captured by an imaging means installed on a moving body, comprises a detecting unit 101, an acquiring unit 102, a storing unit 103, a behavior detecting unit 104, and a display unit 105.
  • The detecting unit 101 detects the shooting time of the video data captured by the imaging means installed on the moving body. It may, for example, detect the time from time information provided by a timer provided in the information recording device. The video data whose time has been detected may be stored in the storing unit 103 described later.
  • The acquiring unit 102 obtains information regarding the history of movement of the moving body.
  • This may be information that includes the position of the moving body, or it may be configured as map information on which the position of the moving body at the time detected by the detecting unit 101 is displayed.
  • The storing unit 103 stores the information acquired by the acquiring unit 102 together with the video data, based on the shooting time detected by the detecting unit 101. More specifically, it may be configured so that the time detected by the detecting unit 101 and the information including the position of the moving body acquired by the acquiring unit 102 are stored in association with the video data, frame number by frame number.
  • The storing unit 103 may also be configured so that, when a sudden movement of the moving body is detected by the behavior detecting unit 104 described later, it stores the shooting time and, for the video data of a fixed period including the detection point, the information including the position of the moving body during that period, in association with the frame numbers.
  • The behavior detecting unit 104 detects a sudden movement of the moving body. It may be configured to detect such movement based on the outputs of various sensors mounted on the moving body.
  • The sensors may be, for example, a vibration sensor, a G sensor, a speed sensor of the moving body, and sensors capable of outputting information regarding driving operations, such as operation of the direction indicator, the accelerator pedal, and the brake pedal.
  • The behavior detecting unit 104 may detect a dangerous movement when, for example, the output of a sensor such as the vibration sensor or G sensor exceeds a predetermined value, when the output approximates a predetermined pattern indicating an abnormality, or when a contact sensor is activated. It may also judge a movement to be dangerous based on information on the driving operations of the moving body; more specifically, it may be configured to judge the movement of the moving body to be dangerous when, for example, the sensor outputs indicate an operation such as an unnecessary direction-indicator operation or unnecessary speed beyond a predetermined level.
  • The display unit 105 controls a display screen and composites and displays the information on the position of the moving body together with the video data stored by the storing unit 103. More specifically, it may be configured so that the video data captured by the imaging means installed on the moving body and the information including the position of the moving body are displayed together; at this time, the time detected by the detecting unit 101 may also be displayed. It is also possible to composite the information on the moving body with the video data before it is stored in the storing unit 103, and to display the generated image data on the display screen by the display unit 105.
  • FIG. 2 is a flowchart showing the processing of the information recording device according to the present embodiment.
  • The information recording device judges whether or not input of video data captured by the imaging means installed on the moving body has been accepted (step S21). The input may be started, for example, by the user operating an operation unit, or it may be configured to start when the moving body starts running.
  • In step S21, the device waits for video data to be received; when it is received (step S21: Yes), the detecting unit 101 detects the shooting time of the video data captured by the imaging means installed on the moving body (step S22). For example, it may detect the time from time information provided by a timer provided in the device. Then the acquiring unit 102 acquires information about the history of movement of the moving body (step S23). This may, for example, be information including the position of the moving body, or map information displaying the points the moving body has passed as points or a trajectory. Next, the behavior detecting unit 104 detects a sudden movement of the moving body (step S24). It may be configured to detect such movement based on the outputs of the various sensors mounted on the moving body.
  • A sudden movement may be detected as a dangerous movement when, for example, the output of a sensor such as the vibration sensor or G sensor exceeds a predetermined value, approximates a predetermined pattern indicating an abnormality, or when a contact sensor is activated. In addition, it may be judged to be dangerous based on information on the driving operations of the moving body.
  • When a sudden movement of the moving body is detected in step S24, the storing unit 103 stores the information acquired by the acquiring unit 102 together with the video data, based on the detected time, and the series of processing ends (step S25). More specifically, it may be configured so that, for the video data captured by the imaging means installed on the moving body, the time and the information including the position of the moving body are stored in association with the frame numbers. The storing unit 103 may also store, for the video data of a fixed period including the point at which the sudden movement was detected, the time and the information including the position of the moving body during that period, in association with the frame numbers.
  • As described above, according to the present embodiment, when a sudden movement of the moving body is detected, the time and the information on the position of the moving body are stored in association with the video data captured by the imaging means installed on the moving body. Therefore, when analyzing the circumstances of an accident, information that leads to identifying its cause, such as the video data and the movement of the moving body, can be grasped accurately.
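The input-to-store flow of FIG. 2 (steps S21 through S25) can be sketched as a single pass, with each means modeled as a callable. The callables are hypothetical stand-ins for the device's units, not part of the patent.

```python
# Minimal sketch of the flow in FIG. 2 (steps S21-S25); the callables are
# hypothetical stand-ins for the means described in the text.

def recording_flow(get_frame, detect_time, get_history, detect_sudden, store):
    frame = get_frame()
    if frame is None:                      # S21: no video input accepted
        return None
    t = detect_time(frame)                 # S22: detect shooting time
    history = get_history(t)               # S23: acquire movement history
    if detect_sudden(history):             # S24: sudden movement detected?
        return store(frame, t, history)    # S25: store in association
    return None

calls = []
result = recording_flow(lambda: "frame0",
                        lambda f: 1.0,
                        lambda t: [(35.0, 139.0)],
                        lambda h: True,
                        lambda f, t, h: calls.append((f, t, h)) or "stored")
```

In the real device this pass would repeat per frame while the moving body is running; the sketch shows only one iteration.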
  • In the present example, the information recording device is implemented by a navigation device mounted on a moving body such as a vehicle (including four-wheeled and two-wheeled vehicles).
  • FIG. 3 is a diagram showing an example of a vehicle equipped with the navigation device of the present example, viewed from the vicinity of the dashboard.
  • The navigation device 300 is installed on the dashboard of the vehicle.
  • The navigation device 300 is composed of a main unit and a display unit (display), and the display unit shows the current position of the vehicle, map information, the current time, and the like.
  • The navigation device 300 is connected to an in-vehicle camera 301 installed on the dashboard and a microphone 302 installed on the sun visor.
  • The in-vehicle camera 301 may be of a small size and may be configured to capture images inside and outside the vehicle.
  • The microphone 302 is used for voice input to the navigation device 300 and when recording sound inside the vehicle.
  • The in-vehicle camera 301 may be, for example, a camera that is fixed toward the front of the vehicle and captures the situation outside the vehicle.
  • The in-vehicle camera 301 may also be attached to the rear of the vehicle. When in-vehicle cameras 301 are attached to both the front and rear, the situation behind the vehicle can be fully confirmed, and the circumstances of a rear-end collision can be recorded when the vehicle is hit by another vehicle.
  • The in-vehicle camera 301 may be an infrared camera capable of recording in dark places.
  • A plurality of in-vehicle cameras 301 and microphones 302 may be provided on the vehicle, and they may be of a fixed type or a movable type.
  • The navigation device 300 has a navigation function that guides the vehicle to a destination point and a drive-recorder function that records the running condition of the vehicle.
  • The running condition is recorded using the position of the vehicle obtained by the navigation function, the video and audio obtained from the in-vehicle camera 301 and the microphone 302, and the outputs of the GPS unit 415 and the various sensors 416 described later.
  • The recording medium may have an overwrite recording area that constantly records the running condition while overwriting old data, and a save area that saves the running condition in the event of an accident; alternatively, it may be configured to have one recording medium for overwriting and another for saving.
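The two-area scheme just described, a bounded overwrite area plus a save area that survives further recording, can be sketched with a fixed-length buffer. The class and method names are illustrative assumptions.

```python
from collections import deque

# Sketch of the two-area recording scheme: a bounded overwrite area that
# continuously records the running condition, and a save area preserving a
# snapshot in the event of an accident. Names are illustrative assumptions.

class RecorderStorage:
    def __init__(self, capacity):
        self.overwrite_area = deque(maxlen=capacity)  # oldest frames drop off
        self.save_area = []

    def record(self, frame):
        self.overwrite_area.append(frame)

    def save_on_event(self):
        # copy the current window so later overwriting cannot destroy it
        self.save_area.append(list(self.overwrite_area))

storage = RecorderStorage(capacity=4)
for f in range(6):          # frames 0..5; frames 0 and 1 get overwritten
    storage.record(f)
storage.save_on_event()
```

The `deque(maxlen=…)` models the overwrite behavior: once capacity is reached, each new frame silently evicts the oldest one, just as the overwrite area described above continuously reuses its space.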
  • FIG. 4 is a block diagram showing an example of the hardware configuration of the navigation device of the present example.
  • The navigation device 300 is mounted on a moving body such as a vehicle, and comprises a CPU 401, a ROM 402, a RAM 403, a magnetic disk drive 404, a magnetic disk 405, an optical disk drive 406, an optical disk 407, an audio I/F (interface) 408, a microphone 409, a speaker 410, an input device 411, a video I/F 412, a display 413, a communication I/F 414, a GPS unit 415, various sensors 416, and a camera 417. The components 401 to 417 are each connected by a bus 420.
  • The CPU 401 controls the whole of the navigation device 300. The ROM 402 records programs such as a boot program, a route search program, a route guidance program, a voice generation program, a map-information display program, a communication program, and a database program. The RAM 403 is used as a work area of the CPU 401.
  • The route search program searches for an optimal route from a departure point to a destination point using the map information recorded on the optical disk 407 described later.
  • The optimal route is the route with the least cost to the destination point (or via points), or the route that best matches conditions specified by the user. It is found by the CPU 401 executing the route search program, and the result is output to the audio I/F 408 and the video I/F 412 via the CPU 401.
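The patent does not name a search algorithm for the route search described above, so the following is only one plausible sketch: Dijkstra's algorithm over road links, where the link cost could stand for distance, travel time, or any condition-weighted value.

```python
import heapq

# Hypothetical least-cost route search over a road network; the graph,
# node names, and costs are illustrative, not from the patent.

def shortest_route(graph, start, goal):
    dist, prev = {start: 0}, {}
    queue = [(0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue                        # stale queue entry, skip it
        for nxt, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(queue, (nd, nxt))
    path, node = [goal], goal
    while node != start:                    # walk predecessors back to start
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[goal]

roads = {"A": [("B", 2), ("C", 5)], "B": [("C", 1)], "C": []}
```

Searching "A" to "C" on this toy network prefers the two-hop route through "B" (total cost 3) over the direct link (cost 5), which is the sense in which a least-cost route can differ from the geometrically shortest one.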
  • The route guidance program generates real-time route guidance information based on the route information obtained by executing the route search program, the position information of the navigation device 300 obtained by the communication I/F 414, and the map information read from the optical disk 407.
  • The route guidance information generated by executing the route guidance program is output to the audio I/F 408 and the video I/F 412 via the CPU 401.
  • The voice generation program generates tone and voice information corresponding to guidance patterns. That is, based on the route guidance information generated by executing the route guidance program, it makes settings corresponding to guidance points, generates voice guidance information, and outputs it to the audio I/F 408 via the CPU 401.
  • The map-information display program determines the display format of the map information to be displayed on the display 413 by the video I/F 412, and displays the map information on the display 413 in the determined display format.
  • The CPU 401 captures a drive-recorder image from the camera 417 while the vehicle is running. The CPU 401 then detects time information regarding the time when this drive-recorder image was shot and information regarding the frame numbers of this drive-recorder image. In addition, the CPU 401 acquires information including the position of the moving body at the time when the drive-recorder image was shot. The CPU 401 then composites this position information, time information, and frame-number information into the drive-recorder image using the video I/F 412 or the RAM 403 described later, and overwrites it onto the overwrite recording medium. Further, the CPU 401 saves this image to the save recording medium when an impact is detected by the various sensors 416 described later.
  • Alternatively, the CPU 401 may record the drive-recorder image captured by the camera 417 onto the overwrite recording medium as it is while the vehicle is running. Then, when the various sensors 416 described later detect an impact, the CPU 401 saves the drive-recorder image, acquires the time information, the frame numbers, and the information including the position of the moving body, composites them into the drive-recorder image using the video I/F 412 or the RAM 403 described later, and saves this image to the save recording medium. In addition, the CPU 401 displays the drive-recorder image on the display 413 described later when, for example, there is an instruction to display it.
  • The magnetic disk drive 404 controls reading and writing of data to the magnetic disk 405 according to the control of the CPU 401.
  • The magnetic disk 405 records the data written under the control of the magnetic disk drive 404.
  • As the magnetic disk 405, for example, an HD (hard disk) or an FD (flexible disk) can be used.
  • The optical disk drive 406 controls reading and writing of data to the optical disk 407 according to the control of the CPU 401.
  • The optical disk 407 is a removable recording medium from which data is read according to the control of the optical disk drive 406.
  • As the optical disk 407, a writable recording medium can also be used.
  • The removable recording medium may be, besides the optical disk 407, an MO, a memory card, or the like.
  • Examples of the information recorded on the magnetic disk 405 and the optical disk 407 include the images and sound inside and outside the vehicle obtained by the in-vehicle camera 301 and the microphone 302, the position information of the vehicle detected by the GPS unit 415 described later, and the output values of the various sensors 416 described later. These are recorded by the drive-recorder function of the navigation device 300 and used as material for analyzing the circumstances in the event of a traffic accident.
  • Another example of the recorded information is the map information used for route search and route guidance. The map information includes background data representing features such as buildings, rivers, and the ground surface, together with road-shape data, and is drawn in two or three dimensions on the display screen of the display 413.
  • When the navigation device 300 is guiding a route, the map information and the current position acquired by the GPS unit 415 described later are displayed in an overlapping manner.
  • The road-shape data also includes traffic condition data.
  • The traffic condition data includes, for example, for each node, information such as the presence of traffic lights and pedestrian crossings and the presence or absence of highway entrances and junctions, and, for each link, the length (distance), road width, direction of travel, and road type (highway, toll road, general road, etc.).
  • The traffic condition data also stores past congestion information obtained statistically by season, day of the week, large holidays, time of day, and the like.
  • The navigation device 300 obtains information on current congestion from the road traffic information received by the communication I/F 414 described later, but the past congestion information makes it possible to get an idea of the situation at a specified time.
  • In the present example, the map information is described as being recorded on the magnetic disk 405 and the optical disk 407, but it is not limited to this. The map information is not limited to being recorded on a medium integrated with the navigation device 300, and may be provided outside the navigation device 300. In that case, the navigation device 300 acquires the map information via a network, for example, through the communication I/F 414. The acquired map information is stored in the RAM 403 or the like.
  • The audio I/F 408 is connected to the microphone 409 for voice input (for example, the microphone 302 of FIG. 3) and to the speaker 410 for voice output.
  • The voice received by the microphone 409 is A/D-converted in the audio I/F 408.
  • The voice input from the microphone 409 can be recorded on the magnetic disk 405 or the optical disk 407 as voice data.
  • The input device 411 comprises a remote controller provided with keys for inputting characters, numeric values, and various instructions, a keyboard, a mouse, a touch panel, and the like.
  • The video I/F 412 is connected to the display 413 and the camera 417 (for example, the in-vehicle camera 301 of FIG. 3). The video I/F 412 is composed of, for example, a graphic controller that controls the entire display 413, a buffer memory such as a VRAM (Video RAM) that temporarily records image information that can be displayed immediately, and a control IC that controls the display 413 based on image data output from the graphic controller.
  • The video I/F 412 may be configured to add a frame number to each frame of the video captured by the camera 417.
  • The video I/F 412 controls the display 413 and displays the time, the frame number, and the position of the moving body on the drive-recorder image described later.
  • The display 413 displays icons, cursors, menus, windows, or various data such as characters and images.
  • As the display 413, for example, a CRT, a TFT liquid-crystal display, or a plasma display can be used.
  • The display 413 is installed, for example, in the manner of the display unit of the navigation device 300 in FIG. 3.
  • A plurality of displays 413 may be provided in the vehicle, for example, one for the driver's seat and one for the rear seats.
  • The camera 417 captures images inside or outside the vehicle. The images may be still images or moving images; for example, the camera 417 captures the situation ahead of the vehicle as the drive-recorder image, and the captured video is output via the video I/F 412 to a recording medium such as the magnetic disk 405 or the optical disk 407.
  • The camera 417 thus captures the situation outside the vehicle, and the video output to the recording medium is overwritten and recorded as the drive-recorder image.
  • The communication I/F 414 is connected to a network via radio and functions as an interface between the network and the CPU 401 of the navigation device 300.
  • The communication I/F 414 is also connected to a communication network such as the Internet via radio, and functions as an interface between this communication network and the CPU 401 as well.
  • The communication network includes a LAN, a WAN, and public mobile telephone networks.
  • The GPS unit 415 uses signals received from GPS satellites and the outputs of the various sensors 416 described later (for example, an angular velocity sensor, an acceleration sensor, and a wheel-speed sensor) to calculate information indicating the current position of the vehicle (the position of the navigation device 300). The position information is information that identifies one point on the map, such as latitude, longitude, and altitude.
  • The GPS unit 415 can also use the various sensors 416 to output the mileage, speed changes, and heading. This makes it possible to analyze conditions such as braking and steering operations.
  • There are a total of 24 GPS satellites, placed four in each of six orbits around the Earth. The orbits are adjusted so that the same satellites pass over the same place at the same time every day, and five or six satellites can always be seen from any point on Earth (provided the line of sight is clear).
  • Each GPS satellite carries cesium (Cs) atomic clocks, with rubidium (Rb) clocks as a reserve, which are synchronized with the clocks of the other satellites to keep exact time. This is because accurate time is essential for GPS position measurement.
  • GPS satellites transmit on two frequencies, 1575.42 MHz (L1) and 1227.6 MHz (L2) (hereinafter, the GPS signal). The signal is modulated by a code called a pseudo-random noise code, and when it is received by the GPS unit 415 or the like, the receiver refers to the corresponding code and decodes it.
  • The GPS unit 415 measures the difference between the time when the GPS signal was transmitted from the GPS satellite and the time when its own device received it, using the decoded data and its own clock. The time difference is then multiplied by the propagation speed of the radio wave to calculate the distance from the GPS satellite to the device. The satellite clocks are synchronized with Coordinated Universal Time (UTC).
  • Since each GPS satellite transmits accurate information on its own orbit, the exact location of the GPS satellite can be known. Therefore, if the distance from a GPS satellite is known, the location of the device must be a point on the sphere centered on that GPS satellite with the calculated distance as its radius. The GPS signal is transmitted repeatedly at fixed intervals, so the position of the device can be determined whenever it is needed.
  • The GPS unit 415 receives GPS signals from a total of four GPS satellites. The offset of the clock on the GPS unit 415 side is treated as an additional unknown, which is resolved by introducing the additional equation from the fourth satellite. In this way, by receiving GPS signals from four GPS satellites, the GPS unit 415 can obtain an almost accurate current position that converges to a single point.
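The distance calculation described above, signal travel time multiplied by the propagation speed of the radio wave, can be written down directly. The one-microsecond example shows why the receiver's clock error has to be solved for as a fourth unknown, which is what the fourth satellite provides.

```python
# Sketch of the GPS pseudorange calculation: travel time of the signal
# multiplied by the propagation speed of the radio wave (speed of light).
# Function and variable names are illustrative.

C = 299_792_458.0  # propagation speed of radio waves, in m/s

def pseudorange(t_transmit, t_receive):
    """Distance implied by the signal's travel time."""
    return (t_receive - t_transmit) * C

# A receiver clock offset of just 1 microsecond already shifts the computed
# distance by roughly 300 m, so the receiver clock error cannot be ignored.
clock_error_m = pseudorange(0.0, 1e-6)
```

Each satellite yields one sphere of possible positions; with the receiver clock bias as a fourth unknown alongside the three position coordinates, four such measurements give four equations and a unique solution.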
  • The various sensors 416 are sensors such as an acceleration sensor, a G sensor, and an angular velocity sensor, and their outputs are used for calculating the current position by the GPS unit 415 and for measuring changes in speed and heading.
  • The various sensors 416 may also include sensors that detect the driver's operation of the vehicle. They may be configured to detect the operation of the turn-signal lever, the depression of the accelerator pedal, the depression of the brake pedal, and so on. The outputs of the various sensors 416 can also be used as data to be recorded by the drive-recorder function.
  • For each of the various sensors 416, it is possible to specify in advance the conditions for saving the drive-recorder image, and to save the drive-recorder image when a condition is detected.
  • The condition for the various sensors 416 may be an output above a predetermined threshold value or an output approximating a predetermined pattern. More specifically, for example, the condition may be met when the vibration sensor among the various sensors 416 detects a vibration above a predetermined level or a vibration of a predetermined pattern.
  • The predetermined pattern is any pattern that indicates an abnormal condition, such as a sharply rising vibration. The condition may also be met when the G sensor detects a G above a predetermined level or of a predetermined pattern in a given direction.
  • The predetermined G pattern is likewise one that indicates an abnormality, such as a sharply rising G.
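The save trigger described for the sensors, an output above a predetermined threshold, or an output approximating a predetermined abnormal pattern, can be sketched as follows. The threshold, pattern, and tolerance values are all illustrative assumptions.

```python
# Sketch of the sensor-based save trigger: fire when any output exceeds a
# predetermined threshold, or when the most recent outputs approximate a
# predetermined pattern indicating an abnormality such as a sharp rise.
# Threshold, pattern, and tolerance values are illustrative assumptions.

def detects_event(samples, threshold, abnormal_pattern, tolerance=0.5):
    if any(abs(s) >= threshold for s in samples):
        return True                          # output above the threshold
    if len(samples) >= len(abnormal_pattern):
        tail = samples[-len(abnormal_pattern):]
        if all(abs(a - b) <= tolerance
               for a, b in zip(tail, abnormal_pattern)):
            return True                      # output approximates the pattern
    return False
```

The pattern branch matters because a collision can produce a characteristic rising signature whose individual samples never cross a simple threshold; comparing the recent tail against a registered abnormal pattern catches that case.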
  • The detecting unit 101 and the display unit 105 realize their functions by the CPU 401.
  • The acquiring unit 102 realizes its function by the GPS unit 415 and the various sensors 416.
  • The storing unit 103 realizes its function by the magnetic disk drive 404 and the optical disk drive 406.
  • The behavior detecting unit 104 realizes its function by the various sensors 416.
  • FIG. 5 is a diagram showing an example of a drive-recorder image ahead of the vehicle in the present example.
  • The drive-recorder image 500 ahead of the vehicle 501 is composed of an image 500a captured by the drive-recorder function of the navigation device 300 mounted on the vehicle 501, and a map 500b showing the history of movement of the vehicle 501.
  • On the image 500a, the frame number 502 of the image 500a and the time 503 when the image 500a was shot are displayed.
  • The image 500a shows a situation where, for example, the vehicle 501 is at an intersection and an oncoming vehicle 504 is approaching it.
  • In addition, information such as the intersection name 505, a traffic light 506, a pedestrian crossing 507, a stop line 508, and a pedestrian or bicycle 509 near the intersection is shown.
  • The map 500b may be, for example, map information on which the points the vehicle 501 has passed are displayed, with the route shown by points or a trajectory.
  • Points on the map 500b near the intersection and at the intersection correspond to the positions of the vehicle 501 near the intersection and at the intersection in the image 500a, respectively.
  • In FIG. 5 the map 500b is displayed at the upper part of the drive-recorder image 500, but it is not limited to that position and can be displayed anywhere in relation to the image 500a.
  • FIG. 6 is a diagram showing an example of a drive-recorder image of the vehicle according to the present example at the time of an accident.
  • The drive-recorder image 600 is composed of an image 600a captured by the drive-recorder function of the navigation device 300 and a map 600b showing the history of movement of the vehicle 501.
  • The image 600a shows, for example, the situation in which the vehicle 501 has collided with the oncoming vehicle 504 at the intersection. In this situation, not only the frame number 602 in the image 600a and the time 603 at which the image 600a was shot, but also information such as the intersection name 505, the traffic light 506, the pedestrian crossing 507, and the stop line 508 becomes information for understanding the situation at the time of the accident.
  • The map 600b is likewise map information showing the movement of the vehicle 501 up to the point where it collided with the oncoming vehicle 504.
  • The position of the vehicle 501 on the map 600b may be displayed with a mark indicating that the accident has occurred.
  • The map 600b may be a map centered on the accident point or a map oriented with the direction of travel of the vehicle 501 up. The range of the map 600b may also be set to the distance that the vehicle 501 moves in a fixed time around the accident point. (Processing of the navigation device 300)
  • FIG. 7 is a flowchart showing the processing of the navigation device of the present example.
  • The navigation device 300 first judges whether or not the vehicle is running (step S71). The judgment on the running of the vehicle may be made, for example, by referring to the outputs of the various sensors 416.
  • In step S71, the device waits until the vehicle is running; when it is running (step S71: Yes), the CPU 401 controls the camera 417 to start shooting the drive-recorder image (step S72). The drive-recorder image may be, for example, an image taken from the vehicle in its direction of travel, as shown in FIGS. 5 and 6 above. This image shows, for example, oncoming vehicles, the intersection, pedestrians near the intersection, traffic lights, and road signs in addition to the road ahead.
  • Next, the CPU 401 overwrites and records the drive-recorder image captured in step S72 (step S73). More specifically, it may be configured so that the CPU 401 controls the magnetic disk drive 404 to overwrite the overwrite recording medium with the drive-recorder image.
  • Next, the CPU 401 judges whether or not an impact has been detected (step S74).
  • The detection may be configured as detection of a predetermined value by the various sensors 416.
  • The outputs of the various sensors 416 can be used as the condition for saving the drive-recorder image; more specifically, the condition may be met when the vibration sensor detects a vibration above a predetermined level or of a predetermined pattern.
  • The predetermined value may be any value that indicates an abnormality such as a sharp rise. The condition may also be met, for example, when the G sensor detects a G above a predetermined level or of a predetermined pattern.
  • The predetermined G pattern can likewise be any pattern that shows an abnormality, such as a sharply rising G.
  • The CPU 401 may also be configured to detect a driving operation that constitutes a sudden movement of the vehicle from the outputs of the various sensors 416. More specifically, an unusual operation may be detected, such as turning through a certain angle or more at or above a predetermined speed without operating the turn signal. In addition, unusual acceleration or deceleration, such as acceleration or deceleration above a predetermined rate, failure to decelerate at an intersection without traffic lights, or failure to stop at a red traffic light, may be used as the condition. It is also possible to register patterns of the driver's normal operations in advance and compare the current operation against the registered operation patterns. Intersections without traffic lights and points that require a stop may be obtained based on the map information recorded in the ROM 402.
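The idea above of registering normal operation patterns and flagging anything that matches none of them can be sketched as a simple membership test. The data model, an operation sequence as a tuple of labels, is an assumption, since the patent only describes the concept.

```python
# Sketch of comparing a driving-operation sequence against registered
# normal operation patterns; the tuple-of-labels model and the example
# patterns below are illustrative assumptions.

def is_abnormal_driving(operations, normal_patterns):
    """Flag an operation sequence that matches no registered pattern."""
    return operations not in normal_patterns

normal_patterns = [
    ("decelerate", "signal", "turn"),        # e.g. slowing before a turn
    ("decelerate", "stop", "accelerate"),    # e.g. behaviour at a stop line
]
```

A real implementation would tolerate noise (approximate rather than exact matching), but the structure is the same: abnormality is defined as distance from every registered normal pattern, not by a fixed rule per maneuver.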
  • If no impact is detected in step S74 (step S74: No), it is judged whether or not the running of the vehicle is completed (step S710). On the other hand, when an impact is detected in step S74 (step S74: Yes), the CPU 401 saves the drive-recorder image (step S75). More specifically, the CPU 401 may control the magnetic disk drive 404 to store, in the save recording medium, the drive-recorder image for a fixed time including the detection point of step S74. The fixed time may be set by the user, and the saved period may be extended if an impact is detected again within the fixed time from the detection point.
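The save interval around the detection point, including its extension when the sensors fire again within the current window, can be sketched as follows. Times are in seconds and all parameter names are illustrative.

```python
# Sketch of the save interval for step S75: keep the drive-recorder video
# for a fixed time around the detection point, extending the period when
# another detection falls inside the current window. Names and the
# symmetric before/after span are illustrative assumptions.

def save_window(detect_time, span, redetect_times):
    start, end = detect_time - span, detect_time + span
    for t in sorted(redetect_times):
        if start <= t <= end:   # re-detection inside the current window
            end = t + span      # extend the period to be saved
    return start, end
```

Extending (rather than opening a second window) keeps one continuous clip across a multi-impact event, such as a chain collision.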
• The CPU detects time information on the time at which the drive video was captured, and the frame number of the drive video at that time (step S76). The time may be detected, for example, from a timer provided in the navigation device, or the CPU may control the GPS unit to detect it. The frame number may be assigned, for example, by the video I/F to each frame of video captured by the camera.
• The CPU acquires map information on the history of movement of the vehicle on the basis of the time information detected in step S76 (step S77). For example, the CPU may control the GPS unit and the various sensors to acquire information on the position of the vehicle and, as shown in FIGs. 5 and 6 described above, display the movement of the vehicle as a point or a locus on the map.
• The CPU combines the time information detected in step S76 and the map information acquired in step S77 with the drive video saved in step S75 (step S78). More specifically, the CPU combines the time information and the map information with the frame corresponding to the frame number at that time, using, for example, the video I/F. As shown in FIGs. 5 and 6 described above, the time information may be displayed over the video together with map information showing the movement history of the vehicle.
• The CPU saves the video combined in step S78 (step S79). More specifically, the CPU may control the magnetic disk drive so as to save the combined video to the recording medium.
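Combining the time and map information with the frame matching its frame number (step S78) is, in effect, a join keyed on frame number. A minimal sketch follows, with dictionaries standing in for the saved video and the detected metadata; these structures are assumptions, since the source specifies only the matching itself.

```python
def combine_saved_footage(frames, times, positions):
    """Merge time and map information into saved frames by frame number.

    `frames` maps frame number -> image payload; `times` and `positions`
    map frame number -> capture time and movement-history point.  Missing
    metadata is left as None rather than guessed.
    """
    combined = {}
    for no, img in frames.items():
        combined[no] = {
            "image": img,
            "time": times.get(no),          # overlay: capture time
            "position": positions.get(no),  # overlay: map/movement history
        }
    return combined
```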
• The navigation device judges whether travel of the vehicle has ended (step S710). This judgment may be made, for example, by referring to the output of the various sensors; more specifically, travel may be judged to have ended when output from the various sensors stops.
• If travel of the vehicle has not ended in step S710 (step S710: No), the process returns to step S73 and the capture and overwrite recording of the drive video are repeated. If travel has ended in step S710 (step S710: Yes), the series of processing ends.
• In step S73, the video captured while the vehicle is travelling is overwrite-recorded as the drive video.
• In this embodiment the captured video is overwrite-recorded as the drive video, but the device may also be configured to acquire and record information on the running state, such as speed, together with the video. In that case, when an impact is detected in step S74, the sensor output from the detection point until a predetermined time thereafter may be saved as well.
• In this embodiment an impact is detected and the drive video is then saved, but signs of an accident may be detected instead; in that case, the device may be configured to predict an accident and detect it in advance.
• As described above, in Embodiment 1 the drive video is constantly overwrite-recorded while the vehicle is travelling, and when an impact is detected, the time information and the map information are combined with the drive video. Therefore, when analyzing the situation of the vehicle, information linked to the cause of the accident, such as the time the accident occurred, the point where it occurred, and the movement of the vehicle before and after the accident, can be grasped accurately from the drive video.
• In addition, the drive video combined with the time information and the map information is saved, so the saved video can also be used to promote safe driving.
• Next, Embodiment 2 will be described. In Embodiment 2, in the navigation device described above, the time information and the map information are combined with the drive video and overwrite-recorded before an impact is detected. The hardware configuration and the functional configuration of the navigation device according to Embodiment 2 are substantially the same as those described above, and their description is therefore omitted.
• FIG. 8 is a flowchart showing the operation of the navigation device according to Embodiment 2.
• First, the navigation device determines whether the vehicle is travelling (step S81). This determination may be made, for example, by referring to the output of the various sensors.
• If the vehicle is travelling (step S81: Yes), the CPU controls the camera to start capturing the drive video (step S82).
• The drive video is, for example, video captured in the travelling direction of the vehicle by the camera mounted on the vehicle, as shown in FIGs. 5 and 6 described above. This video shows, for example, oncoming vehicles, intersections, children near intersections, traffic lights, and road signs.
• The CPU detects time information on the time at which the drive video was captured, and the frame number of the drive video at that time (step S83). The time may be detected, for example, from a timer provided in the navigation device, or the CPU may control the GPS unit to detect it.
• The frame number may be assigned, for example, by the video I/F to each frame of video captured by the camera.
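Assigning a frame number to each captured frame, as the video I/F is described as doing, is simple sequential numbering. A sketch, with a generator standing in for the video I/F receiving frames from the camera (the names are illustrative):

```python
import itertools

def numbered_frames(capture):
    """Attach a sequential frame number, starting at 1, to each raw frame."""
    for no, frame in zip(itertools.count(1), capture):
        yield no, frame
```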
• The CPU acquires map information on the history of movement of the vehicle on the basis of the time information detected in step S83 (step S84). For example, the CPU may control the GPS unit and the various sensors to acquire information on the position of the vehicle and, as shown in FIGs. 5 and 6 described above, display the movement of the vehicle as a point or a locus on the map.
• The CPU combines the time information detected in step S83 and the map information acquired in step S84 with the drive video captured in step S82 (step S85). More specifically, the CPU combines the time information and the map information with the frame corresponding to the frame number at that time, using, for example, the video I/F. As shown in FIGs. 5 and 6 described above, the time information may be displayed over the video together with map information showing the movement history of the vehicle. The CPU then overwrite-records the video combined in step S85 (step S86). More specifically, the CPU controls the magnetic disk drive so as to overwrite-record the combined video on the recording medium.
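The ordering that distinguishes Embodiment 2, combine first (steps S83 to S85) and then overwrite-record (step S86), can be sketched as below. The dictionary "frame" format and the `now` / `position_at` callables are illustrative stand-ins for the timer/GPS unit; none of these names come from the source.

```python
from collections import deque

def make_ring(size):
    """Fixed-size buffer standing in for the overwrite-recording area."""
    return deque(maxlen=size)

def capture_and_record(ring, frame_no, pixels, now, position_at):
    """Combine time and map information into a frame, then overwrite-record it.

    Because the overlay is merged before recording, any footage later saved
    from the ring already carries the time and movement-history information.
    """
    t = now()                    # time the frame was captured (step S83)
    pos = position_at(t)         # movement-history/map information (step S84)
    combined = {"frame": frame_no, "pixels": pixels, "time": t, "pos": pos}  # S85
    ring.append(combined)        # overwrite recording (step S86)
    return combined
```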
• The CPU determines whether an impact has been detected (step S87). The detection may be performed, for example, by the various sensors detecting a value at or above a predetermined value.
• The output of the various sensors can be used as the trigger for saving the drive video; more specifically, the video may be saved when a vibration sensor detects a vibration at or above a predetermined value or a predetermined vibration pattern.
• The predetermined vibration pattern may be any pattern that indicates an abnormality, such as a sharply rising vibration. Detection may also be triggered, for example, when the G sensor detects a G at or above a predetermined value or a predetermined G pattern.
• The predetermined G pattern may be any pattern that indicates an abnormality, such as a sharply rising G. Alternatively, a sensor mounted on the vehicle body may be used to detect contact or a collision.
• The CPU may also be configured to detect an abnormal driving operation, that is, a sudden operation of the vehicle, from the output of the various sensors. More specifically, an unusual steering operation may be used as the trigger, such as the steering wheel being turned by a predetermined angle or more without the turn signal (winker) being operated while the vehicle is travelling at or above a predetermined speed. An unusual accelerator or brake operation may also be used, such as acceleration or deceleration at or above a predetermined rate, failure to decelerate at an intersection without traffic lights, or failure to decelerate at a red light (ignoring the signal). It is also possible to register the operation pattern of normal driving in advance and compare the current operation against the registered pattern.
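Impact detection from G readings, either by a value at or above a predetermined threshold or by matching a registered abnormal G pattern, can be sketched as below. The threshold value and the exact-match pattern comparison are simplifying assumptions; the patent leaves both the value and the pattern-matching method open.

```python
def impact_detected(g_samples, g_limit=2.5, patterns=()):
    """Return True when the G readings indicate an impact.

    Fires on any sample at or above `g_limit` (a stand-in for the
    "predetermined value"), or when the most recent samples exactly match
    one of the registered abnormal G patterns.
    """
    if any(abs(g) >= g_limit for g in g_samples):
        return True
    return any(tuple(g_samples[-len(p):]) == tuple(p) for p in patterns if p)
```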
• If no impact is detected in step S87 (step S87: No), the CPU determines whether travel of the vehicle has ended (step S89). On the other hand, if an impact is detected in step S87 (step S87: Yes), the video combined in step S85 is saved (step S88). More specifically, the CPU may control the magnetic disk drive so as to save, to the recording medium, the combined video from the detection point of the impact detected in step S87 until a predetermined time thereafter.
• The predetermined time may be set by the user, and when an impact is detected again within the predetermined time from the detection point, the saving period may be extended.
• The navigation device judges whether travel of the vehicle has ended (step S89). This judgment may be made, for example, by referring to the output of the various sensors; more specifically, travel may be judged to have ended when output from the various sensors stops.
• If travel of the vehicle has not ended in step S89 (step S89: No), the process returns to step S86 and the combining and overwrite recording of the drive video are repeated. If travel has ended in step S89 (step S89: Yes), the series of processing ends.
• As described above, in Embodiment 2 the time information and the map information are combined with the drive video captured while the vehicle is travelling, and the result is overwrite-recorded. When an impact is detected, this combined video is saved. Therefore, when analyzing the situation of the vehicle, information linked to the cause of the accident, such as the time the accident occurred, the point where it occurred, and the movement of the vehicle before and after the accident, can be grasped accurately from the drive video.
• In addition, the time information and the map information are combined with the drive video before an impact is detected, so video can be saved not only when an accident is encountered but also at a near-miss ("hiyari-hatto") point when an impact is detected there. The saved video can therefore be used to promote safe driving.
• Furthermore, in Embodiment 2, for the drive video at the detection point where the impact was detected, the time at which the video was captured and map information on the movement history of the vehicle can be displayed simultaneously. Therefore, the accident point and the situation before and after it can be identified accurately.
• As described above, according to the information recording device, information recording method, information recording program and computer-readable recording medium of the present invention, the time at which the video was captured, its frame number, and information showing the movement history of the vehicle are combined with the drive video captured while the vehicle is travelling. Therefore, when analyzing the situation of the vehicle, information linked to the cause of an accident, such as the time the accident occurred, the point where it occurred, and the movement of the vehicle before and after the accident, can be grasped accurately from the drive video.
• The information recording method described in the present embodiments can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
• This program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, or a CD-ROM, and is executed by being read out from the recording medium by the computer.
• This program may also be a transmission medium that can be distributed via a network such as the Internet.

Abstract

The invention concerns an information recording device (100) designed to record video data captured by a photographing means mounted on a mobile body. The information recording device comprises: a detecting section (101) for detecting the time at which the video data was captured; an acquiring section (102) for acquiring history information on the history of movement of the mobile body; a storage section (103) for storing the history information acquired by the acquiring section (102) and the video data in association with each other, on the basis of the capture time of the video data detected by the detecting section (101); and a display control section (105) for controlling the display screen so as to obtain a composite of the movement route and the video data associated and stored by the storage section (103), and to display that composite.
PCT/JP2006/324375 2005-12-09 2006-12-06 Dispositif d'enregistrement d'informations, procede d'enregistrement d'informations, programme d'enregistrement d'informations et support d'enregistrement lisible par un ordinateur WO2007066696A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005355978 2005-12-09
JP2005-355978 2005-12-09

Publications (1)

Publication Number Publication Date
WO2007066696A1 true WO2007066696A1 (fr) 2007-06-14

Family ID=38122842

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/324375 WO2007066696A1 (fr) 2005-12-09 2006-12-06 Dispositif d'enregistrement d'informations, procede d'enregistrement d'informations, programme d'enregistrement d'informations et support d'enregistrement lisible par un ordinateur

Country Status (1)

Country Link
WO (1) WO2007066696A1 (fr)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06344832A (ja) * 1993-06-14 1994-12-20 Nippondenso Co Ltd 道路状況記録システム
JPH08235491A (ja) * 1995-02-27 1996-09-13 Toyota Motor Corp 車両の走行状態記録装置及び車両の走行状態解析装置
JPH1053165A (ja) * 1996-08-07 1998-02-24 Nippon Soken Inc 事故状況記録装置
JP2002236990A (ja) * 2000-12-06 2002-08-23 Tsukuba Multimedia:Kk 地図誘導映像システム
JP2003063459A (ja) * 2001-08-29 2003-03-05 Matsushita Electric Ind Co Ltd 車両用ドライブレコーダ装置
JP2003123185A (ja) * 2001-10-11 2003-04-25 Hitachi Ltd 危険情報集配信装置、警報発生装置、車両危険情報送信装置および経路探索装置
JP2004009833A (ja) * 2002-06-05 2004-01-15 Nissan Motor Co Ltd 運転状況記録装置
JP2004259069A (ja) * 2003-02-26 2004-09-16 Aisin Seiki Co Ltd 車両危険度に応じた警報信号を出力する警報装置


Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009083815A (ja) * 2007-10-03 2009-04-23 Fujitsu Ten Ltd ドライブレコーダ装置および事故解析シミュレーション装置
JP2009246503A (ja) * 2008-03-28 2009-10-22 Denso It Laboratory Inc ドライブ映像要約装置
JP2011128005A (ja) * 2009-12-17 2011-06-30 Fujitsu Ten Ltd ナビゲーション装置、車載表示システム及び地図表示方法
JP2018137697A (ja) * 2017-02-23 2018-08-30 京セラ株式会社 電子機器
JP6152232B1 (ja) * 2017-02-23 2017-06-21 京セラ株式会社 電子機器
US10459315B2 (en) 2017-02-23 2019-10-29 Kyocera Corporation Electronic apparatus for displaying overlay images
WO2018193679A1 (fr) * 2017-04-17 2018-10-25 株式会社Jvcケンウッド Dispositif de commande d'enregistrement, appareil d'enregistrement, appareil de navigation, procédé d'enregistrement et programme
JP2018181043A (ja) * 2017-04-17 2018-11-15 株式会社Jvcケンウッド 記録制御装置、記録装置、ナビゲーション装置、記録方法、及びプログラム
CN110178166A (zh) * 2017-04-17 2019-08-27 Jvc建伍株式会社 记录控制装置、记录装置、导航装置、记录方法以及程序
EP3614358A4 (fr) * 2017-04-17 2020-04-08 Jvckenwood Corporation Dispositif de commande d'enregistrement, appareil d'enregistrement, appareil de navigation, procédé d'enregistrement et programme
JP6346701B1 (ja) * 2017-11-08 2018-06-20 京セラ株式会社 電子機器、オーバーレイ方法、及びオーバーレイ用プログラム
JP2018137728A (ja) * 2017-11-08 2018-08-30 京セラ株式会社 電子機器、オーバーレイ方法、及びオーバーレイ用プログラム
EP3734556A4 (fr) * 2017-12-27 2021-01-27 JVCKenwood Corporation Dispositif de commande d'enregistrement, dispositif d'enregistrement, procédé de commande d'enregistrement, et programme de commande d'enregistrement
CN113874920A (zh) * 2019-06-07 2021-12-31 马自达汽车株式会社 移动体外部环境识别装置

Similar Documents

Publication Publication Date Title
JP4799565B2 (ja) 情報記録装置、情報記録方法、情報記録プログラムおよびコンピュータに読み取り可能な記録媒体
WO2007066696A1 (fr) Dispositif d'enregistrement d'informations, procede d'enregistrement d'informations, programme d'enregistrement d'informations et support d'enregistrement lisible par un ordinateur
JP5585194B2 (ja) 事故状況記録システム
WO2007049596A1 (fr) Dispositif, procede et programme d'enregistrement de donnees, et support d'enregistrement lisible par ordinateur
WO2008010391A1 (fr) dispositif de distribution d'informations, dispositif de traitement d'informations, procédé de distribution d'informations, procédé de traitement d'informations, programme de distribution d'informations, programme de traitement d'informations, et support d'enregistrement lisible par un ordinateur
JP2008250463A (ja) 情報記録装置、情報記録方法、情報記録プログラムおよびコンピュータに読み取り可能な記録媒体
JP2007255907A (ja) 情報登録装置、経路探索装置、情報登録方法、情報登録プログラムおよびコンピュータに読み取り可能な記録媒体
WO2007063849A1 (fr) Appareil d’enregistrement d’information, procédé d’enregistrement d’information, programme d’enregistrement d’information et support d’enregistrement lisible par ordinateur
WO2008010392A1 (fr) dispositif de traitement d'informations, procédé de traitement d'informations, programme de traitement d'informations et support d'enregistrement lisible par un ordinateur
JP4845481B2 (ja) 情報記録装置、情報記録方法、情報記録プログラムおよびコンピュータに読み取り可能な記録媒体
JP4866061B2 (ja) 情報記録装置、情報記録方法、情報記録プログラムおよびコンピュータに読み取り可能な記録媒体
JP2009090927A (ja) 情報管理サーバ、駐車支援装置、駐車支援装置を備えたナビゲーション装置、情報管理方法、駐車支援方法、情報管理プログラム、駐車支援プログラム、および記録媒体
JP4825810B2 (ja) 情報記録装置、情報記録方法、情報記録プログラムおよび記録媒体
WO2008072282A1 (fr) Dispositif, procédé et programme d'enregistrement d'information,s dispositif et procédé de traitement d'informations et support d'enregistrement lisible par ordinateur
JP4521036B2 (ja) 経路探索装置、経路探索方法、経路探索プログラムおよびコンピュータに読み取り可能な記録媒体
WO2008038333A1 (fr) Dispositif, procédé et programme d'enregistrement de données et support d'enregistrement lisible par ordinateur
WO2007119348A1 (fr) appareil d'obtention d'informations, procédé d'obtention d'informations, programme d'obtention d'informations et support d'enregistrement
JP4776627B2 (ja) 情報開示装置
WO2007049520A1 (fr) Dispositif, procede et programme d'enregistrement de donnees, et support d'enregistrement lisible par ordinateur
WO2007055241A1 (fr) Dispositif, procede, programme et support d'enregistrement
JP2013061763A (ja) 判定装置
JP4987872B2 (ja) 情報記録装置、情報記録方法、情報記録プログラムおよびコンピュータに読み取り可能な記録媒体
WO2007066612A1 (fr) Dispositif d'enregistrement d'informations, dispositif de communication, procede d'enregistrement d'informations, procede de communication, programme d'enregistrement d'informations, programme de communication et support d'enregistrement lisible par un ordinateur
JP4289296B2 (ja) 案内装置およびプログラム
WO2007055194A1 (fr) Dispositif, procede, programme et support d'enregistrement d'informations

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06834130

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP