WO2010150348A1 - Video recording/reproduction device, video recording/reproduction method, and video recording/reproduction program


Info

Publication number
WO2010150348A1
Authority
WO
WIPO (PCT)
Prior art keywords
scene
video
importance
recording
video recording
Prior art date
Application number
PCT/JP2009/061379
Other languages
French (fr)
Japanese (ja)
Inventor
毅 中村
健一郎 矢野
隆之 枝久保
智 仲野
Original Assignee
Pioneer Corporation (パイオニア株式会社)
Priority date
Filing date
Publication date
Application filed by Pioneer Corporation (パイオニア株式会社)
Priority to PCT/JP2009/061379 (WO2010150348A1)
Priority to JP2011519410A (JPWO2010150348A1)
Publication of WO2010150348A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/4147 PVR [Personal Video Recorder]
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42202 Input-only peripherals involving environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
    • H04N21/4223 Cameras
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433 Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4334 Recording operations
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/8126 Monomedia components involving additional data, e.g. news, sports, stocks, weather forecasts
    • H04N21/8133 Monomedia components involving additional data specifically related to the content, e.g. biography of the actors in a movie, detailed information about an article seen in a video program
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/765 Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77 Interface circuits between a recording apparatus and a television camera
    • H04N5/772 Interface circuits where the recording apparatus and the television camera are placed in the same enclosure
    • H04N9/00 Details of colour television systems
    • H04N9/79 Processing of colour television signals in connection with recording
    • H04N9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/82 Transformation of the television signal for recording with the individual colour picture signal components being recorded simultaneously only
    • H04N9/8205 Transformation of the television signal for recording involving the multiplexing of an additional signal and the colour video signal
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841 Registering performance data
    • G07C5/0875 Registering performance data using magnetic data carriers
    • G07C5/0891 Video recorder in combination with video camera

Definitions

  • The present invention relates to a video recording/reproducing apparatus that records and reproduces video captured by a photographing device mounted on a moving body.
  • One new way to enjoy a video camera is to shoot, record, and later view the scenery visible while a vehicle is traveling (such footage is hereinafter referred to as a "travel video").
  • Some people attempt to share such travel videos on the Internet. For example, on video distribution sites where viewers can post comments, travel videos recorded while driving on roads that are difficult for vehicles to pass have been uploaded and have attracted many comments from viewers.
  • This way of enjoying a video camera is not limited to recording travel videos from a vehicle; it has the potential to spread to a wider range of uses, such as recording travel videos while cycling, skiing, or snowboarding.
  • However, a travel video shot in this way is not edited at all, so it can be said to contain many scenes that are monotonous and boring to watch as they are.
  • Since the entire drive is recorded, viewing all of it takes as long as the drive itself, which is not realistic.
  • When a user watches a travel video, in many cases he or she does not want to see every scene. It is natural to assume that the user only wants to see, for example, scenes of smooth driving, scenes with good scenery near famous tourist spots, or scenes in which some special event occurred.
  • Patent Document 1 describes a technique that sets the summary segments to be extracted, and their importance, based on the positions on the time axis and the lengths of silent sections and noise sections in audio-video information.
  • Patent Document 2 describes a method in which travel images captured at predetermined time intervals are stored sequentially and cyclically in a plurality of semiconductor storage devices, and the storage operation is stopped when an automobile accident occurs, so that a plurality of still images from immediately before the accident remain stored in the semiconductor storage devices.
  • In Patent Document 1, however, only background audio is used as the criterion for digest playback scheduling. Because other information such as images is not used, it is difficult to perform appropriate digest playback without missing important scenes. The technique described in Patent Document 2 is intended to record accident video of a vehicle and addresses a different problem from the present invention, which aims to perform digest playback.
  • An object of the present invention is to perform appropriate digest playback that does not miss important scenes.
  • The invention according to claim 1 is a video recording/reproducing apparatus comprising: photographing means provided in a moving body; video recording means for recording the video photographed by the photographing means; traveling state analysis means for obtaining the traveling state of the moving body; and importance determination means for determining and setting an importance for each scene of the video based on the obtained traveling state.
  • The invention according to claim 14 is a video recording/playback method comprising: a photographing step of photographing a video with a photographing unit provided in a moving body; a video recording step of recording the video photographed by the photographing unit; a traveling state analysis step of obtaining the traveling state of the moving body; and an importance determination step of determining and setting an importance for each scene of the video based on the obtained traveling state.
  • Another aspect of the invention is a video recording/reproducing program executed by a navigation device having a photographing unit, which causes the navigation device to function as: video recording means for recording the video photographed by the photographing unit; traveling state analysis means for obtaining the traveling state of the moving body; and importance determination means for determining and setting an importance for each scene of the video based on the obtained traveling state.
  • FIG. 1 is a diagram showing an example of a vehicle equipped with a navigation device to which the video recording/reproducing apparatus of the present invention is applied. Further figures show the device configuration of the navigation device, the functional configuration of the video recording/reproducing apparatus according to the first embodiment, an example of the information recorded in the travel video database, and so on.
  • In one aspect, a video recording/reproducing apparatus includes photographing means provided in a moving body, video recording means that records the video photographed by the photographing means, traveling state analysis means for obtaining the traveling state of the moving body, and importance determination means for determining and setting an importance for each scene of the video based on the obtained traveling state.
  • the video recording / playback apparatus includes a photographing unit such as a camera provided in the moving body, a video recording unit, a running state analyzing unit, and an importance level determining unit.
  • the photographing means photographs a traveling image around the moving moving body.
  • the video recording unit records the video shot by the shooting unit.
  • the traveling state analyzing means obtains the traveling state of the moving body.
  • The importance determination means determines and sets the importance for each scene of the video based on the obtained traveling state. In this way, an appropriate importance can be set for each scene of the video, and important scenes need not be missed when scheduling digest playback.
  • In one aspect of the video recording/reproducing apparatus, the traveling state analysis unit obtains the traveling state based on feature amounts obtained by analyzing at least one of the images and sounds included in the video.
  • The image feature amounts are, for example, luminance, color histograms, edge information, and motion information.
  • The sound feature amounts are, for example, sound pressure power and sound frequency characteristics.
  • In another aspect, the traveling state analysis means obtains the traveling state based on sensor signals from at least one of a GPS sensor, an acceleration sensor, a gyro sensor, and a vehicle speed pulse sensor. This makes it possible to obtain information such as the travel position, vehicle speed state, acceleration/deceleration state, heading, turning state, and inclination state as information indicating the traveling state of the moving body.
  • Another aspect of the video recording / reproducing apparatus includes a map information recording unit that records map information, and the traveling state analyzing unit obtains the traveling state based on the map information.
  • the traveling state analyzing unit obtains the traveling state based on the map information.
  • Another aspect of the video recording / reproducing apparatus includes a travel history recording unit that records a travel history of the mobile body, and the importance level determination unit is configured to determine the importance level based on the travel history and the travel state. The importance is determined and set for each scene of the video. Thereby, the appropriate importance according to the frequency
  • Another aspect of the video recording / playback apparatus includes personal information recording means for recording personal information of a user, and the importance level determination means is configured to determine the importance of the video based on the personal information and the running state. The importance is judged and set for each scene. Thereby, the appropriate importance according to personal information can be set about each scene of an image
  • Another aspect of the video recording / reproducing apparatus includes an instruction unit for a user to instruct information indicating an important scene, and the importance level determination unit includes the information instructed by the user, the running state, and the like. Based on the above, the importance is determined and set for each scene of the video. This makes it possible to prioritize important scenes instructed by the user when performing digest playback scheduling.
  • Another aspect of the video recording / reproducing apparatus includes a scene dividing unit that divides the video into a plurality of scenes based on the running state. This makes it possible to divide the scene into meaningful units.
  • the importance level determination unit determines and sets the importance level for each frame of the video based on the traveling state of the moving body. As a result, it is possible to add an index to the important frame, and the user can find an important scene based on the index.
  • Another aspect of the video recording / playback apparatus includes digest playback control means for controlling playback by creating a schedule for digest playback according to the importance set for each scene of the video. This makes it possible to appropriately perform digest playback scheduling.
  • the digest playback control means preferentially schedules the video scene as the importance set for the video scene increases.
  • the digest playback control means plays back the video scene faster as the importance set in the video scene becomes lower. Thereby, digest reproduction can be performed in a short time.
  • the digest playback control means plays back the video scene faster as the time length of the video scene becomes longer. This also makes it possible to perform digest reproduction in a short time.
  • A video recording/playback method includes a photographing step of photographing a video with a photographing unit provided in a moving body, a video recording step of recording the video photographed by the photographing unit, a traveling state analysis step of obtaining the traveling state of the moving body, and an importance determination step of determining and setting an importance for each scene of the video based on the obtained traveling state.
  • A video recording/playback program executed by a navigation apparatus having a photographing unit causes the navigation apparatus to function as video recording means for recording the video photographed by the photographing unit, traveling state analysis means for obtaining the traveling state of the moving body, and importance determination means for determining and setting an importance for each scene of the video based on the obtained traveling state.
  • This video recording/playback program can also set an appropriate importance for each scene of the video.
  • FIG. 1 shows an example of a vehicle equipped with a navigation device to which the video recording / reproducing apparatus of the present invention is applied.
  • FIG. 1 illustrates a vehicle 90 driven by a user, on which the navigation device 1 is mounted.
  • the navigation device 1 mounted on the vehicle 90 includes a photographing device 70.
  • the imaging device 70 captures a scene outside the traveling vehicle 90, that is, captures a traveling image of the vehicle 90.
  • The shooting direction of the imaging device 70 is not limited to the front of the vehicle 90; it may shoot toward the left or right side of the vehicle 90, toward the rear, or in other directions.
  • the photographing direction of the photographing apparatus 70 is not limited to one direction, and may be a plurality of directions.
  • Since the navigation device 1 functions as a video recording/reproducing device, it determines and sets an importance for each scene of the video captured by the imaging device 70 based on the traveling state of the vehicle 90.
  • The navigation device 1 includes a self-supporting positioning device 10, a GPS receiver 18, a system controller 20, a disk drive 31, a data recording unit 36, a communication interface 37, a communication device 38, a display unit 40, an audio output unit 50, an input device 60, and a photographing device 70.
  • the self-supporting positioning device 10 includes an acceleration sensor 11, a gyro sensor 12, and a vehicle speed pulse sensor 13.
  • the acceleration sensor 11 is made of, for example, a piezoelectric element, detects the acceleration of the vehicle 90, and outputs acceleration data.
  • the gyro sensor 12 is composed of, for example, a vibrating gyroscope, detects the angular velocity of the vehicle 90 when the direction of the vehicle 90 is changed, and outputs angular velocity data and relative azimuth data.
  • the vehicle speed pulse sensor 13 measures a vehicle speed pulse composed of a pulse signal generated with the rotation of the wheel of the vehicle 90.
  • The GPS (Global Positioning System) receiver 18 is a part that receives radio waves 19 carrying downlink data, including GPS positioning data.
  • The system controller 20 includes an interface 21, a CPU (Central Processing Unit) 22, a ROM (Read Only Memory) 23, and a RAM (Random Access Memory) 24, and is configured to control the entire navigation device 1.
  • the interface 21 performs an interface operation with the acceleration sensor 11, the gyro sensor 12, the vehicle speed pulse sensor 13, and the GPS 18. From these, acceleration data, relative azimuth data, angular velocity data, GPS positioning data, absolute azimuth data and the like are input to the system controller 20 in addition to the vehicle speed pulse.
  • the CPU 22 controls the entire system controller 20.
  • the ROM 23 includes a nonvolatile memory (not shown) in which a control program for controlling the system controller 20 is stored.
  • the RAM 24 stores various data such as route data preset by the user via the input device 60 so as to be readable, and provides a working area to the CPU 22.
  • the system controller 20, the disk drive 31, the data recording unit 36, the communication interface 37, the display unit 40, the audio output unit 50, the input device 60 and the photographing device 70 are connected to each other via the bus line 30.
  • The data recording unit 36 is configured by, for example, an HDD, and records various data used for navigation processing, such as map information, travel video data captured by the imaging device 70, and analysis data produced by the system controller 20.
  • the disk drive 31 reads and outputs content data such as music data and video data from the disk 33 under the control of the system controller 20. Further, the disk drive 31 records the above-described traveling video data and analysis data on the disk 33 under the control of the system controller 20.
  • For the disk 33, various writable discs such as a DVD±RW, for example, are applicable.
  • the communication device 38 includes, for example, an FM tuner, a beacon receiver, and the like, and acquires information distributed from a VICS (Vehicle Information Communication System) center or the like.
  • the photographing device 70 is, for example, a camera, and captures a traveling image while the vehicle 90 is traveling as described above.
  • the photographing device 70 also includes a microphone, and the traveling video includes sound as data in addition to images.
  • the imaging device 70 is controlled by the system controller 20.
  • The display unit 40 displays various display data and the video captured by the imaging device 70 on the display screen of the display 44 under the control of the system controller 20. For example, when map information is to be displayed as display data, the system controller 20 reads the map information from the data recording unit 36, and the display unit 40 displays the read map information on the display screen.
  • The display unit 40 includes a graphic controller 41 that controls the entire display unit 40 based on control data sent from the CPU 22 via the bus line 30, and a memory such as a VRAM (Video RAM) that temporarily stores image information ready for immediate display.
  • a display 44 is composed of, for example, a liquid crystal display device having a diagonal size of about 5 to 10 inches, and is mounted near the front panel in the vehicle.
  • The audio output unit 50 includes a D/A converter 51 that performs D/A (digital-to-analog) conversion of audio digital data sent from the RAM 24 or the like via the bus line 30 under the control of the system controller 20, an amplifier (AMP) 52 that amplifies the analog audio signal output from the D/A converter 51, and a speaker 53 that converts the amplified analog audio signal into sound and outputs it into the vehicle.
  • the input device 60 includes keys, switches, buttons, a remote controller, a voice input device, and the like for inputting various commands and data.
  • the input device 60 is disposed around the front panel and the display 44 of the main body of the in-vehicle electronic system mounted in the vehicle.
  • When the display 44 uses a touch panel system, the touch panel provided on the display screen of the display 44 also functions as the input device 60.
  • FIG. 3 shows the functional configuration of the video recording/reproducing apparatus 100 according to the first embodiment. Note that the video recording/reproducing apparatus 100 is substantially realized by the components of the navigation apparatus 1 described above. This will be described specifically below.
  • The video recording/reproducing apparatus 100 includes the imaging device 70, the GPS 18, the acceleration sensor 11, the gyro sensor 12, the vehicle speed pulse sensor 13, a traveling video recording unit 221, an image/audio analysis unit 222, a traveling state analysis unit 223, a scene division unit 224, a scene importance determination unit 225, a digest playback control unit 226, a traveling video database 361, a traveling state database 362, a scene database 364, and a map database 363.
  • The traveling video recording unit 221, the image/audio analysis unit 222, the traveling state analysis unit 223, the scene division unit 224, the scene importance determination unit 225, and the digest playback control unit 226 are realized, for example, by programs executed by the system controller 20.
  • the traveling video database 361, the traveling state database 362, the scene database 364, and the map database 363 are recorded in the data recording unit 36 or the writable disc 33.
  • the video recording / reproducing apparatus 100 obtains the running state of the vehicle 90 based on sensor signals from various sensors and performs scene division of the running video based on the obtained running state. Further, the video recording / reproducing apparatus 100 determines and sets the importance for each divided scene based on the obtained running state.
  • each function of the video recording / reproducing apparatus 100 according to the first embodiment will be described in detail.
  • the traveling image recording unit 221 records the traveling image captured by the imaging device 70 in the traveling image database 361.
  • An example of information recorded in the traveling video database 361 is shown in FIG. As shown in FIG. 4, the traveling video database 361 records a traveling video ID that uniquely identifies the traveling video, a shooting start date and time, a time length, a file name, and the like.
  • the image / speech analysis unit 222 analyzes the image and sound from the running video and extracts the feature amount.
  • The feature amounts are, for example, image feature amounts (hereinafter "image feature amounts") such as luminance, color histograms, edge information, and motion information, and audio feature amounts such as sound pressure power and sound frequency characteristics.
  • The map database 363 records in advance the map information used as reference information by the traveling state analysis unit 223.
  • The map information includes, for example, road traffic information such as addresses, elevations, slopes, intersections, grade-separated intersections, railroad crossings, tunnels, and bridges; road attribute information such as road width, number of lanes, expressways, ordinary roads, residential roads, and parking lots; landmark information such as shops, facilities, famous buildings, parks, amusement parks, and tourist attractions; topographical information such as seas, lakes, rivers, mountains, and viewing spots; regulatory information under road traffic law such as speed limits and one-way streets; and near-miss points.
  • The traveling state analysis unit 223 obtains the various traveling states of the vehicle 90 based on the sensor signals detected by the GPS 18, the acceleration sensor 11, the gyro sensor 12, and the vehicle speed pulse sensor 13, the feature amounts obtained by the image/audio analysis unit 222, and the map information recorded in the map database 363.
  • the traveling state analysis unit 223 records the obtained traveling state together with various sensor signals and feature quantities in the traveling state database 362.
  • In the traveling state database 362, the various sensor signals, the feature amounts, the traveling states obtained from them, and the frame IDs that uniquely identify the corresponding frames (or the time positions within the travel video) are recorded in association with each other.
  • The traveling state database 362 need not manage the sensor signals, feature amounts, and traveling states in an integrated manner; it may be configured as a plurality of databases, each managing one of them.
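  • As an illustration only (the patent excerpt does not specify a concrete schema), such a record could be modeled as follows; every field name in this sketch is a hypothetical choice.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class TravelStateRecord:
    """One row of a hypothetical traveling state database (362).

    Associates raw sensor signals and feature amounts with the traveling
    states derived from them, keyed by the frame they correspond to
    (or, equivalently, a time position within the travel video).
    """
    frame_id: int                  # uniquely identifies the video frame
    time_position_s: float         # time position within the travel video
    sensor_signals: Dict[str, float] = field(default_factory=dict)   # e.g. {"vehicle_speed_kmh": 42.0}
    feature_amounts: Dict[str, float] = field(default_factory=dict)  # e.g. {"luminance": 0.7}
    travel_states: Dict[str, str] = field(default_factory=dict)      # e.g. {"speed_state": "low_speed"}
```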
  • the scene dividing unit 224 performs scene division of the running video based on the running state obtained by the running state analyzing unit 223. As a reference for performing scene division, for example, a change point of the running state or a characteristic time point of the running state can be given.
  • the scene division unit 224 records scene information indicating the divided scenes in the scene database 364.
  • the scene importance level determination unit 225 determines and sets the importance level (hereinafter referred to as “scene importance level”) for each scene based on the running state in each scene divided by the scene division unit 224. The higher the importance of this scene, the higher the probability that the scene will be selected and played back during digest playback.
  • the scene importance determination unit 225 records the scene importance obtained for each scene in the scene database 364.
  • the digest playback control unit 226 performs playback control by creating a schedule for digest playback of a running video. Specifically, the digest playback control unit 226 selects a scene to be played back and determines a playback speed according to the target playback time and the scene importance level.
  • the display unit 401 is a display device that performs normal reproduction and digest reproduction of a running video.
  • the display unit 401 corresponds to the display unit 40 when reproduction is performed in the vehicle 90.
  • the display unit 401 is not limited to the display unit 40, and a display device such as another display may be used.
  • the display unit 401 also displays GUI (Graphical User Interface) display of the operation menu, information related to the traveling video, and the like.
  • GUI Graphic User Interface
  • FIG. 5 shows an example of information stored in the running state database 362.
  • Types of traveling states of the vehicle include, for example, traveling position, vehicle speed state, acceleration/deceleration state, heading, turning state, inclination state, image information, audio information, address, section boundary, altitude state, geographic spots, road type, landscape state, and near-miss points.
  • The traveling state analysis unit 223 obtains each traveling state based on the various sensor signals and feature amounts, and records the obtained traveling state in the traveling state database 362 in association with a frame in the travel video (or a time position within the travel video). Each of these traveling states is described below.
  • The traveling position of the vehicle 90 is obtained based on sensors such as the GPS 18. Specifically, the traveling state analysis unit 223 obtains the traveling position of the vehicle 90 as latitude and longitude based on the sensor signal from the GPS 18. Alternatively, the traveling state analysis unit 223 may obtain the traveling position of the vehicle 90 using autonomous navigation based on sensor signals from the gyro sensor 12 and the vehicle speed pulse sensor 13. Further, the traveling state analysis unit 223 may correct the obtained traveling position of the vehicle 90 using map matching, which collates the position with map information. The traveling state analysis unit 223 records the obtained traveling position of the vehicle 90 in the traveling state database 362.
  • The vehicle speed state of the vehicle 90 is obtained based on the sensor signal from the vehicle speed pulse sensor 13. Specifically, the traveling state analysis unit 223 obtains the vehicle speed based on the sensor signal from the vehicle speed pulse sensor 13 and, based on the obtained vehicle speed, determines which vehicle speed state the vehicle 90 is in, for example stopped, normal traveling, traffic jam section, low-speed traveling, or high-speed traveling. The traveling state analysis unit 223 records the obtained vehicle speed and vehicle speed state in the traveling state database 362.
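  • A minimal sketch of this kind of classification; the thresholds, the state labels, and the use of a recent average speed to distinguish a traffic jam from an ordinary stop are assumptions for illustration, not values taken from the patent.

```python
def classify_speed_state(speed_kmh: float, recent_avg_kmh: float) -> str:
    """Map a measured vehicle speed to one of the speed states named in the text.

    The thresholds and the recent-average heuristic are illustrative only.
    """
    if speed_kmh < 1.0:
        return "stopped"
    if recent_avg_kmh < 15.0:
        return "traffic_jam_section"   # creeping along for a sustained period
    if speed_kmh < 30.0:
        return "low_speed_traveling"
    if speed_kmh < 80.0:
        return "normal_traveling"
    return "high_speed_traveling"
```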
  • the acceleration / deceleration state of the vehicle 90 is obtained based on sensor signals from the acceleration sensor 11 and the like. Specifically, the running state analysis unit 223 obtains acceleration based on a sensor signal from the acceleration sensor 11 or by analyzing a change in the running position. Then, based on the obtained acceleration, the traveling state analysis unit 223 determines whether the vehicle 90 is in an acceleration / deceleration state such as sudden acceleration or sudden braking. The traveling state analysis unit 223 records the obtained acceleration and acceleration / deceleration state in the traveling state database 362.
  • the direction and turning state of the vehicle 90 are obtained based on sensor signals from the gyro sensor 12 and the like. Specifically, the traveling state analysis unit 223 obtains the azimuth, the angular velocity, and the angular acceleration based on a sensor signal from the gyro sensor 12 or by analyzing a change in the traveling position. Then, based on the obtained angular velocity and angular acceleration, the traveling state analysis unit 223 determines which state the vehicle 90 is in, for example, each turning state of a right / left turn section, a straight traveling section, or a winding section. The traveling state analysis unit 223 records the obtained azimuth, angular velocity, angular acceleration, and turning state in the traveling state database 362.
  • The inclination state of the vehicle 90 is obtained based on sensor signals from the acceleration sensor 11 and the like. Specifically, the traveling state analysis unit 223 obtains the inclination angle of the vehicle 90 based on a sensor signal from the acceleration sensor 11 or by collating the traveling position with map information. Then, based on the obtained inclination angle, the traveling state analysis unit 223 determines which inclination state the vehicle 90 is in, for example flat, uphill, steep uphill, downhill, or steep downhill. The traveling state analysis unit 223 records the obtained inclination angle and inclination state in the traveling state database 362.
  • the image information is obtained based on the image feature amount extracted by the image / speech analysis unit 222.
  • Specifically, by analyzing image feature amounts such as luminance, color histograms, edge information, and motion information, the traveling state analysis unit 223 obtains image information indicating white lines, road signs, guide signs, traffic lights, vehicles, pedestrians, and the like.
  • the traveling state analysis unit 223 may increase the detection accuracy of these pieces of image information by collating the traveling position with the map information or using the audio feature amount.
  • the traveling state analysis unit 223 records the obtained luminance, color histogram, edge information, motion information, image information, and the display position of the image information in the frame in the traveling state database 362.
  • the audio information is obtained based on the audio feature amount extracted by the image / audio analysis unit 222.
  • Specifically, by analyzing audio feature amounts such as sound pressure power and sound frequency characteristics, the traveling state analysis unit 223 obtains audio information indicating human voices, bustle, horns, brake squeals, and the like.
  • the traveling state analysis unit 223 may increase the detection accuracy of the audio information by collating the traveling position with the map information or by using the image feature amount.
  • the traveling state analysis unit 223 records the obtained sound pressure power, frequency characteristics, and voice information in the traveling state database 362.
  • Information such as the address of the travel position and whether or not the travel position is at a section boundary is obtained by collating the travel position with the map information.
  • Section boundaries include prefectural boundaries, municipal boundaries, town and aza boundaries, and chome (city block) boundaries.
  • The traveling state analysis unit 223 obtains the address by collating the travel position with the map information and determines whether or not the vehicle 90 is located at a section boundary. Alternatively, the traveling state analysis unit 223 may obtain the address by performing text analysis based on the image feature amounts. The traveling state analysis unit 223 records the obtained address and section boundary in the traveling state database 362.
  • The altitude state is obtained by collating the travel position with the map information. Specifically, the traveling state analysis unit 223 obtains the altitude by collating the travel position with the map information and, based on the obtained altitude, determines whether the vehicle 90 is located at a characteristic altitude point such as the highest point of a pass, the highest point on the route, or the lowest point on the route. Alternatively, the navigation device 1 may include an altimeter, and the traveling state analysis unit 223 may obtain the altitude from the altimeter value. The traveling state analysis unit 223 records the obtained altitude and information such as whether the vehicle 90 is located at a characteristic altitude point in the traveling state database 362.
  • Geographic spot information about the travel position or its vicinity is obtained by the traveling state analysis unit 223 by collating the travel position with the map information.
  • Geographic spot information includes items related to road traffic such as intersections, grade-separated intersections, railroad crossings, tunnels, and bridges; landmarks such as stores, facilities, famous buildings, parks, amusement parks, and tourist attractions; and items related to topography such as seas, lakes, rivers, mountains, and viewing spots.
  • the traveling state analysis unit 223 records the obtained geographical spot information in the traveling state database 362.
  • the road type of the traveling position is obtained by the traveling state analysis unit 223 collating the traveling position with the map information.
  • Road types include, for example, expressways, toll roads, general roads, residential roads, and parking lots, as well as road attributes such as road width and number of lanes, and road traffic regulations such as speed limits and one-way traffic.
  • the traveling state analysis unit 223 records the road type of the traveling position thus obtained in the traveling state database 362.
  • the landscape state information indicating the scenery around the travel position is obtained by the travel state analysis unit 223 collating the travel position with the map information.
  • Examples of the landscape state information include an urban area, countryside, a mountainous area, and a coast.
  • the traveling state analysis unit 223 records the landscape state information thus obtained in the traveling state database 362.
  • Whether or not the travel position is at a near-miss point is determined by the traveling state analysis unit 223 by collating the travel position with the map information.
  • Near-miss points mean places that are dangerous or difficult in terms of road traffic, such as places where accidents or near-accidents have occurred.
  • The traveling state analysis unit 223 records information indicating whether or not the travel position is at a near-miss point in the traveling state database 362.
  • the scene dividing unit 224 performs scene division of the running video at the change point or characteristic time point of the running state recorded in the running state database 362.
  • FIG. 6 lists examples of travel state change points
  • FIG. 7 lists examples of characteristic time points of the travel state.
  • Not all of the change points and characteristic time points of the traveling state need be used for scene division; scene division may of course be performed using only those that are effective in the system.
  • the scene division unit 224 performs scene division on the running video with reference to the time points shown in FIGS. 6 and 7, and then records the divided scenes in the scene database 364.
  • a travel video ID, a scene ID for identifying a scene in the travel video, and a start time and an end time in the travel video of the scene are recorded in the scene database 364 (see FIG. 10 described later).
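  • As a rough sketch of dividing the travel video at change points of the traveling state (the single-label timeline and the equality test are simplifying assumptions; a real implementation would also use the characteristic time points of FIG. 7):

```python
from typing import List, Tuple

def split_into_scenes(timeline: List[Tuple[float, str]]) -> List[Tuple[float, float]]:
    """Cut the travel video into scenes at change points of a traveling state.

    `timeline` is a list of (time position in seconds, traveling state label)
    pairs in shooting order; a new scene starts whenever the label changes.
    Returns (start_time, end_time) pairs for the divided scenes.
    """
    scenes: List[Tuple[float, float]] = []
    if not timeline:
        return scenes
    start, prev_state = timeline[0]
    for t, state in timeline[1:]:
        if state != prev_state:          # change point of the traveling state
            scenes.append((start, t))
            start, prev_state = t, state
    scenes.append((start, timeline[-1][0]))
    return scenes

# e.g. split_into_scenes([(0.0, "straight_section"), (12.5, "right_left_turn"),
#                         (15.0, "straight_section"), (60.0, "straight_section")])
# -> [(0.0, 12.5), (12.5, 15.0), (15.0, 60.0)]
```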
  • FIG. 8 and FIG. 9 show the importance determination tables for the running state.
  • The traveling state importance determination tables predefine the importance of various traveling states. In FIGS. 8 and 9, the importance is represented by a numerical value: the larger a positive value, the higher the importance, and the larger in magnitude a negative value, the lower the importance.
  • This importance determination table is recorded in advance in the data recording unit 36 or the like.
  • For each scene divided by the scene division unit 224, the scene importance determination unit 225 refers to the importance determination tables, obtains the importance of each of the various traveling states in the scene, and determines the scene importance of the scene by summing the obtained importance values. For example, if in a scene the vehicle 90 is traveling at low speed, a guide sign is included as image information, an intersection and a tourist attraction are included as geographic spot information, and the landscape state is an urban area, the scene importance is obtained by adding the corresponding values from FIGS. 8 and 9.
  • the scene importance determination unit 225 determines the scene importance for each scene as described above, and records the calculated scene importance in the scene database 364.
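  • A minimal sketch of this table-lookup-and-sum computation. The values for a right/left turn point (50), a straight section (-5), and a winding section (10) are quoted later in the text; every other table entry below is a placeholder assumption.

```python
# Importance determination table (FIGS. 8 and 9 give values such as +50 for a
# right/left turn point, -5 for a straight section and +10 for a winding
# section; the remaining entries here are illustrative placeholders).
IMPORTANCE_TABLE = {
    "right_left_turn_point": 50,
    "straight_section": -5,
    "winding_section": 10,
    "low_speed_traveling": 5,      # placeholder
    "guide_sign": 8,               # placeholder
    "intersection": 12,            # placeholder
    "tourist_attraction": 20,      # placeholder
    "urban_area": -3,              # placeholder
}

def scene_importance(travel_states_in_scene: set) -> int:
    """Sum the table values of every traveling state observed in the scene."""
    return sum(IMPORTANCE_TABLE.get(state, 0) for state in travel_states_in_scene)

# e.g. scene_importance({"low_speed_traveling", "guide_sign",
#                        "intersection", "tourist_attraction", "urban_area"})
```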
  • An example of data recorded in the scene database 364 is shown in FIG. 10. As shown in FIG. 10, the scene database 364 records, in addition to the travel video ID, the scene ID identifying the scene within the travel video, and the start and end times of the scene within the travel video, the scene importance determined by the scene importance determination unit 225.
  • FIG. 11 shows the tendency of the priority obtained when the scene importance is determined based on FIGS. 8 and 9.
  • the priority indicates the priority to be played back during digest playback.
  • For example, the importance of a "right/left turn point" is set to 50, the importance of a "straight section" is set to -5, and the importance of a "winding section" is set to 10. Therefore, as a tendency, the priority increases when a scene includes a right/left turn point or a winding section, and decreases when a scene includes a straight section. Since a right/left turn point is more important than a winding section, a scene including a right/left turn point has a higher priority than a scene including a winding section.
  • The digest playback control unit 226 performs scheduling to select the scenes to be played back and performs digest playback by playing back the scenes according to that schedule. More specifically, the digest playback control unit 226 selects the scenes to be played back based on the scene importance and the scene time length (that is, the time length of each scene), subject to the target playback time of the digest playback. This is described in more detail below.
  • the digest reproduction control unit 226 sets a target reproduction time T for digest reproduction.
  • The target playback time T is set, for example, to a time corresponding to a fixed ratio of the total time of the travel video, or to a predetermined time.
  • the target reproduction time T may be set to the time input by the user via the input device 60.
  • the digest playback control unit 226 sequentially selects scenes having higher scene importance from the scene database 364, and determines the scene playback speed for the selected scene.
  • The playback speed indicates the playback rate, such as normal-speed playback, double-speed playback, or N-times-speed playback.
  • Playback becomes faster as N increases.
  • To determine the scene playback speed, any one of several criteria is used; for example, the lower the scene importance or the longer the scene time length, the faster the scene is played back. In this way, digest playback can be performed in a short time.
  • the digest playback control unit 226 calculates the scene playback time Tsi for the selected scene based on the scene time length and the scene playback speed.
  • the scene playback time Tsi is calculated as follows.
  • Scene playback time Tsi = scene time length / scene playback speed
  • The digest playback control unit 226 calculates the scene playback time Tsi for each scene selected from the scene database 364 in descending order of scene importance, and calculates the digest playback time Td by adding up the calculated scene playback times Tsi.
  • When the digest playback time Td reaches the target playback time T, the digest playback control unit 226 schedules the scenes selected so far to be played back at their designated playback speeds in the time order in which they were shot. The digest playback control unit 226 then plays back each scene on the display unit 401 according to the scheduling determined in this way. By doing so, digest playback that gives priority to important scenes can be performed.
  • FIG. 12 is a flowchart showing the overall control process in which the travel video is divided into scenes based on the traveling state of the vehicle, an importance is set for each scene, and digest playback is performed.
  • FIG. 13 is a flowchart showing a control process for performing scheduling for digest reproduction.
  • In step S101, a travel video is captured by the imaging device 70, and the captured travel video is input to the video recording/reproducing apparatus 100.
  • the travel video recording unit 221 records the input travel video in the travel video database 361.
  • In step S102, various sensor signals are input to the video recording/reproducing apparatus 100 from the GPS 18, the acceleration sensor 11, the gyro sensor 12, the vehicle speed pulse sensor 13, and the like.
  • In step S103, the image/audio analysis unit 222 analyzes the travel video and extracts image feature amounts and audio feature amounts.
  • In step S104, the traveling state analysis unit 223 obtains various traveling states based on the various sensor signals and feature amounts and records them in the traveling state database 362.
  • In step S105, the scene division unit 224 performs scene division of the travel video recorded in the travel video database 361 based on the traveling states obtained in step S104, more specifically based on the change points and characteristic time points of the traveling state. As a result, the travel video is divided into scenes of meaningful units.
  • the scene dividing unit 224 records the divided scenes in the scene database 364.
  • In step S106, the scene importance determination unit 225 determines and sets the scene importance for each scene recorded in the scene database 364 based on the various traveling states in each scene. In this way, an appropriate scene importance can be set for each scene.
  • In step S107, the digest playback control unit 226 performs scheduling to select the important scenes to be played back.
  • The scheduling control process in step S107 will be described with reference to FIG. 13.
  • In step S201, the digest playback control unit 226 sets a target playback time T.
  • In step S202, the digest playback control unit 226 sets the digest playback time Td to 0. This initializes the digest playback time Td.
  • In step S203, the digest playback control unit 226 selects the scene with the highest scene importance from the scenes recorded in the scene database 364.
  • In step S204, the digest playback control unit 226 determines the scene playback speed for the selected scene based on its scene importance and scene time length.
  • In step S205, the digest playback control unit 226 obtains the scene playback time Tsi by dividing the scene time length by the scene playback speed for the selected scene.
  • In step S206, the digest playback control unit 226 sets the value obtained by adding the scene playback time Tsi to the digest playback time Td as the new digest playback time Td.
  • In step S207, the digest playback control unit 226 determines whether the digest playback time Td is equal to or longer than the target playback time T. If the digest playback time Td is less than the target playback time T (step S207: No), the digest playback control unit 226 returns to step S203 and, for the scene with the next highest importance after the previously selected scene, performs processing similar to that described in steps S204 to S206. On the other hand, if the digest playback time Td is equal to or longer than the target playback time T (step S207: Yes), the process proceeds to step S208, in which the scenes selected so far are scheduled in the time order in which they were shot and at the designated playback speeds. Thereafter, the digest playback control unit 226 proceeds to step S108 in FIG. 12.
  • In step S108, the digest playback control unit 226 plays back the scenes according to the scheduling set in the processing of FIG. 13. Thereafter, the digest playback control unit 226 ends this control process.
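  • The following is a compact sketch of the scheduling of steps S201 to S208, assuming hypothetical field names and a deliberately simple playback-speed rule (lower-importance and longer scenes are played faster, as suggested above); it is an illustration, not the patent's implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Scene:
    scene_id: int
    start_s: float
    end_s: float
    importance: int

    @property
    def length_s(self) -> float:
        return self.end_s - self.start_s

def playback_speed(scene: Scene) -> float:
    """Illustrative rule only: play low-importance or long scenes faster."""
    speed = 1.0
    if scene.importance < 20:
        speed *= 2.0
    if scene.length_s > 60.0:
        speed *= 2.0
    return speed

def schedule_digest(scenes: List[Scene], target_time_s: float) -> List[Scene]:
    """Steps S201-S208: pick scenes in descending importance until the summed
    scene playback times (length / speed) reach the target playback time,
    then return them in shooting (time) order."""
    digest_time = 0.0                                                        # S202
    selected: List[Scene] = []
    for scene in sorted(scenes, key=lambda s: s.importance, reverse=True):   # S203
        digest_time += scene.length_s / playback_speed(scene)                # S204-S206
        selected.append(scene)
        if digest_time >= target_time_s:                                     # S207
            break
    return sorted(selected, key=lambda s: s.start_s)                         # S208
```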
  • the running state of the vehicle is obtained based on sensor signals from various sensors, and the scene division of the running image is performed based on the obtained running state.
  • scene importance is set for each divided scene based on the running state.
  • the running video can be divided into scenes of meaningful content.
  • By setting the scene importance for each divided scene based on the traveling state, an appropriate scene importance can be set for each scene, and important scenes need not be missed when scheduling digest playback.
  • In the video recording/reproducing apparatus according to the second embodiment, the scene importance is set for each scene of the travel video based on the travel history in addition to the traveling state.
  • FIG. 14 shows the functional configuration of the video recording/reproducing apparatus 100a according to the second embodiment. In FIG. 14, functions similar to those of the video recording/reproducing apparatus 100 according to the first embodiment are denoted by the same reference numerals as in FIG. 3. Like the video recording/reproducing apparatus 100 according to the first embodiment, the video recording/reproducing apparatus 100a according to the second embodiment is actually realized by the components of the navigation apparatus 1 described above.
  • the video recording / reproducing apparatus 100a has a configuration including a travel history database 365 in addition to the functional configuration of the video recording / reproducing apparatus 100 according to the first embodiment.
  • The traveling state analysis unit 223 records the traveling states obtained based on the various sensor signals and feature amounts in the traveling state database 362, and appropriately backs up the travel positions recorded in the traveling state database 362 to the travel history database 365.
  • the scene importance level determination unit 225 obtains the number of times the route has traveled in each divided scene based on the travel history recorded in the travel history database 365.
  • The scene importance determination unit 225 determines an importance according to the number of times the route in each scene has been traveled, using, for example, a travel frequency importance determination table. The scene importance determination unit 225 then calculates the scene importance by adding the importance corresponding to the number of travels to the sum of the importance values of the traveling states for the scene.
  • In this table, the importance is set larger for routes that have been traveled fewer times and smaller for routes that have been traveled more often; accordingly, scenes corresponding to rarely traveled routes receive a larger importance, and scenes corresponding to frequently traveled routes receive a smaller importance.
  • As described above, in the second embodiment, the scene importance is set for each scene of the travel video based on the travel history in addition to the traveling state. In this way, an appropriate scene importance corresponding to the number of travels can be set for each scene.
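  • As a sketch of this adjustment only: the bands and bonus values below stand in for the travel frequency importance determination table, which the excerpt does not reproduce.

```python
def travel_count_bonus(times_traveled: int) -> int:
    """Hypothetical travel-frequency table: rarely traveled routes gain
    importance, frequently traveled routes lose it."""
    if times_traveled == 0:
        return 20
    if times_traveled <= 3:
        return 10
    if times_traveled <= 10:
        return 0
    return -10

def scene_importance_with_history(state_importance_sum: int, times_traveled: int) -> int:
    """Add the travel-frequency importance to the summed traveling-state importance."""
    return state_importance_sum + travel_count_bonus(times_traveled)
```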
  • In the video recording/reproducing apparatus according to the third embodiment, the scene importance is set for each scene of the travel video based on the user's personal information in addition to the traveling state.
  • FIG. 16 shows a functional configuration of the video recording / reproducing apparatus 100b according to the third embodiment.
  • In FIG. 16, functions similar to those of the video recording/reproducing apparatus 100 according to the first embodiment are denoted by the same reference numerals as in FIG. 3. Like the video recording/reproducing apparatus 100 according to the first embodiment, the video recording/reproducing apparatus 100b according to the third embodiment is actually realized by the components of the navigation apparatus 1 described above.
  • the video recording / playback apparatus 100b has a personal information database 366 in addition to the functional configuration of the video recording / playback apparatus 100 according to the first embodiment.
  • In the personal information database 366, as personal information, the locations of familiar places such as the user's home or workplace, commuting routes, the locations of memorable places, and the like are recorded. These data are recorded in advance by the user.
  • The scene importance determination unit 225 determines whether the travel position in each scene is near a position or route recorded as personal information in the personal information database 366. If it is, the scene importance determination unit 225 obtains an importance corresponding to the personal information for the scene using, for example, a personal information importance determination table, and calculates the scene importance by adding that importance to the sum of the importance values of the traveling states for the scene.
  • In the personal information importance determination table, for example, the importance is set small near the user's home and commuting route, and set large at memorable places.
  • As a result, scenes near the user's home or commuting route have low priority during digest playback, and scenes at the user's memorable places have high priority during digest playback.
  • As described above, in the third embodiment, the scene importance is set for each scene of the travel video based on the personal information in addition to the traveling state. In this way as well, an appropriate scene importance corresponding to the personal information can be set for each scene.
  • the scene importance level of each scene of the running video is automatically determined.
  • the video recording/playback apparatus according to each of the above embodiments may therefore miss a scene that is important to the user when scheduling digest playback. Therefore, in the video recording/playback apparatus according to the fourth embodiment, the scene importance is set for each scene of the traveling video based on information on important scenes explicitly indicated by the user, in addition to the traveling state.
  • FIG. 18 shows a functional configuration of the video recording / reproducing apparatus 100c according to the fourth embodiment.
  • the same components as those of the video recording/reproducing apparatus 100 according to the first embodiment are denoted by the same reference numerals as those shown in FIG. 3.
  • the video recording/reproducing apparatus 100c according to the fourth embodiment is in practice realized by the components of the navigation device 1 shown in FIG. 2.
  • the video recording/reproducing apparatus 100c includes an important scene instruction means 14 and an important scene instruction information database 367.
  • the important scene instruction means 14 is a means by which the user notifies the video recording/reproducing apparatus 100c when the user considers the traveling video at the current location to be an important scene or wants to watch it later.
  • the important scene instruction unit 14 is an input device such as a button, a switch, or a touch panel.
  • the important scene instruction unit 14 may be an input device that performs voice recognition for recognizing a user's voice or gesture recognition for recognizing a user's gesture.
  • the important scene instruction unit 14 records the important scene instruction information instructed by the user in the important scene instruction information database 367.
  • the important scene instruction information recorded in the important scene instruction information database 367 includes, for example, the time when the instruction is given by the user and the traveling position.
  • for a scene containing an important scene indicated by the user, the scene importance level determination unit 225 calculates the scene importance by adding a fixed importance value to the sum of the driving-state importance levels. In this way, the importance of scenes containing user-indicated important scenes is increased, raising their priority during digest playback.
  • the scene importance is set for each scene of the running video based on the important scene instruction information in addition to the running state. In this way, it is possible to perform digest playback without missing a scene that the user wants to see.
  • FIG. 19 shows a functional configuration of a video recording / reproducing apparatus 100d according to an application example.
  • the video recording/reproducing apparatus 100d shown in FIG. 19 is an application example based on the video recording/reproducing apparatus 100 according to the first embodiment.
  • the same components as those of the video recording / reproducing apparatus 100 according to the first embodiment are denoted by the same reference numerals as those shown in FIG.
  • the video recording / reproducing apparatus 100d includes an index database 368 instead of the scene dividing unit 224 and the scene database 364.
  • the scene importance determination unit 225 calculates the scene importance for each frame of the driving video based on the driving state.
  • FIG. 20 is a graph showing a change in the importance of the scene with respect to the time of the running video.
  • when the scene importance exceeds a predetermined threshold, the scene importance level determination unit 225 assigns an index to the frame with the highest importance within the section exceeding the threshold.
  • the frame ID of the indexed frame, or its time position, is recorded in the index database 368.
  • the scene importance level determination unit 225 displays a list of frames with indexes as representative images on the display unit 401.
  • the user selects an arbitrary image from the representative images displayed as a list on the display unit 401. In this way, the user can perform skip playback that skips unnecessary scenes.
  • the processing from steps S301 to S304 is the same as the processing from steps S101 to S104 described with reference to FIG. 12.
  • in step S305, the scene importance level determination unit 225 sets the scene importance for each frame of the traveling video recorded in the traveling video database 361 based on the driving state (or additionally on the travel history and personal information).
  • in step S306, for the frames for which the scene importance has been set, when the scene importance exceeds a predetermined threshold, the scene importance level determination unit 225 assigns an index to the frame with the highest importance within the section exceeding the threshold and records it in the index database 368.
  • in step S307, the scene importance level determination unit 225 displays a list of the indexed frames as representative images on the display unit 401, and this control process ends.
  • the user can thus skip unnecessary scenes. As a result, the user can check only the important scenes and can fully grasp the content of the traveling video.
  • in each of the above embodiments, the imaging device 70 captures a traveling video, that is, video of the surroundings of the traveling vehicle 90; however, the invention is not limited to this. In addition, video of the interior of the vehicle 90 (hereinafter referred to as "vehicle interior video") may be captured, and scene division and scene importance setting may be performed on the vehicle interior video in accordance with the traveling state.
  • the video recording / reproducing apparatus uses the GPS 18, the acceleration sensor 11, the gyro sensor 12, and the vehicle speed pulse sensor 13 to obtain the traveling state.
  • the video recording / reproducing apparatus may obtain the running state using some of these sensors.
  • the video recording/reproducing apparatus may also obtain the traveling state using other sensors in addition to all or some of the above-described sensors. Further, in each of the above embodiments, the video recording/reproducing apparatus analyzes the images and audio included in the video to obtain image feature amounts and audio feature amounts, and uses these feature amounts to obtain the traveling state; however, the apparatus is not limited to this. It goes without saying that the video recording/reproducing apparatus may instead obtain the traveling state using only one of the image feature amounts and the audio feature amounts.
  • the system controller of the navigation device 1 may have the functions of the traveling video recording unit, the image/audio analysis unit, and the traveling state analysis unit, while another audio device separate from the navigation device 1 has the functions of the scene division unit, the scene importance level determination unit, and the digest playback control unit.
  • the navigation device 1 obtains the traveling state based on sensor signals from various sensors and records the obtained traveling state in the traveling state database.
  • the traveling video database and the traveling state database are recorded on a disc or the like.
  • the other audio apparatus performs scene division and scene importance determination using the traveling video database and the traveling state database recorded on the disc or the like, and performs digest playback.
  • the video recording/reproducing apparatus of the present invention is not limited to application to a vehicle navigation device, and may instead be applied to a navigation device mounted on a bicycle, or to a mobile phone. That is, the video recording/reproducing apparatus of the present invention can be applied to other mobile navigation devices, not limited to vehicles.
  • the present invention can be used for a car navigation device equipped with a camera, a PND (Personal Navigation Device), a car AV system, a mobile phone, a cycle computer, a GPS logger, and other in-vehicle or mobile devices.

Abstract

A video recording/reproduction device includes: an imaging means such as a camera arranged on a mobile body; a video recording means, a travelling state analysis means, and an importance judgment means.  The imaging means captures a travelling video around the travelling mobile body.  The video recording means records the video captured by the imaging means.  The travelling state analysis means obtains a travelling state of the mobile body.  The importance judgment means judges and sets an importance degree of each video scene on the basis of the obtained travelling state.

Description

Video recording/playback apparatus, video recording/playback method, and video recording/playback program
 The present invention relates to a video recording/playback apparatus that records and plays back video captured by an imaging device of a mobile body.
 In recent years, one new way of enjoying a video camera has been to shoot and record video of the scenery seen while a vehicle is traveling (hereinafter referred to as a "traveling video") and to view it afterwards. Some users even attempt to share such traveling videos on the Internet. For example, on video distribution sites where viewers can post comments, traveling videos recorded while deliberately driving on roads that are difficult for vehicles to pass have been uploaded, and these videos have attracted many comments from viewers. There are also many personal websites that publish such traveling videos. Furthermore, this way of enjoying a video camera is not limited to recording video while driving a vehicle; it has the potential to expand to a wider range of uses, such as recording video while riding a bicycle or while skiing or snowboarding.
 Unlike a television broadcast program, however, a traveling video shot in this manner has not been edited at all, and can therefore be said to contain many monotonous and boring scenes when viewed as it is. In addition, when the entire course of a drive is recorded, watching all of it requires the same amount of time as the drive itself, which is not realistic. When viewing a traveling video, the user in most cases does not want to see every scene; it is natural to assume the user wants to see only, for example, scenes of pleasant driving, scenes with good scenery near famous sightseeing spots, or scenes in which some special event occurred.
 Therefore, as a technique for grasping a summary of content such as the above-described traveling video in a short time, methods have been devised that create a digest of the content and play back that digest. For example, Patent Document 1 describes a method that sets the summary segments to be extracted and their importance based on the positions on the time axis and the lengths of silent sections and noise sections of audio-visual information, and performs digest playback based on the determined summary segments and importance. As a related technique, Patent Document 2 describes a method that sequentially and cyclically stores traveling images captured at predetermined time intervals in a plurality of semiconductor storage devices and stops the storing operation when an automobile accident occurs, thereby storing a plurality of still images from before the accident in the semiconductor storage devices.
Patent Document 1: Japanese Patent Laid-Open No. 2003-87728
Patent Document 2: Japanese Unexamined Patent Publication No. S63-16785
 However, the method described in Patent Document 1 uses only background audio as the criterion for scheduling digest playback. Since it does not use other information such as images, it is difficult to perform appropriate digest playback that does not miss important scenes. The method described in Patent Document 2 is intended to record video of vehicle accidents, and thus addresses a different problem from the present invention, which aims to perform digest playback.
 The problems described above are examples of the problems to be solved by the present invention. An object of the present invention is to perform appropriate digest playback that does not miss important scenes.
 The invention according to claim 1 is a video recording/playback apparatus comprising: imaging means provided on a mobile body; video recording means for recording video captured by the imaging means; traveling state analysis means for obtaining a traveling state of the mobile body; and importance determination means for determining and setting an importance level for each scene of the video based on the obtained traveling state.
 The invention according to claim 14 is a video recording/playback method comprising: an imaging step of capturing video with imaging means provided on a mobile body; a video recording step of recording the video captured by the imaging means; a traveling state analysis step of obtaining a traveling state of the mobile body; and an importance determination step of determining and setting an importance level for each scene of the video based on the obtained traveling state.
 The invention according to claim 15 is a video recording/playback program executed by a navigation device having imaging means, the program causing the navigation device to function as: video recording means for recording video captured by the imaging means; traveling state analysis means for obtaining a traveling state of the mobile body; and importance determination means for determining and setting an importance level for each scene of the video based on the obtained traveling state.
FIG. 1 shows an example of a vehicle equipped with a navigation device to which the video recording/playback apparatus of the present invention is applied.
FIG. 2 shows the device configuration of the navigation device.
FIG. 3 shows the functional configuration of the video recording/playback apparatus according to the first embodiment.
FIG. 4 shows an example of information recorded in the traveling video database.
FIG. 5 shows an example of information recorded in the traveling state database.
FIG. 6 lists examples of change points of the traveling state.
FIG. 7 lists examples of characteristic points in time of the traveling state.
FIG. 8 shows an example of an importance determination table for the traveling state.
FIG. 9 shows another example of an importance determination table for the traveling state.
FIG. 10 shows an example of information recorded in the scene database.
FIG. 11 shows the tendency of priority obtained by calculating scene importance.
FIG. 12 is a flowchart showing the overall control processing in the first embodiment.
FIG. 13 is a flowchart of scheduling for digest playback.
FIG. 14 shows the functional configuration of the video recording/playback apparatus according to the second embodiment.
FIG. 15 shows an example of an importance determination table for the number of times traveled.
FIG. 16 shows the functional configuration of the video recording/playback apparatus according to the third embodiment.
FIG. 17 shows an example of an importance determination table for personal information.
FIG. 18 shows the functional configuration of the video recording/playback apparatus according to the fourth embodiment.
FIG. 19 shows the functional configuration of the video recording/playback apparatus according to an application example.
FIG. 20 is a graph showing the change in scene importance with respect to the time of the traveling video.
FIG. 21 is a flowchart showing the overall control processing in the application example.
 DESCRIPTION OF SYMBOLS
 11 Acceleration sensor
 12 Gyro sensor
 13 Vehicle speed pulse sensor
 18 GPS sensor
 20 System controller
 70 Imaging device
 100 Video recording/playback apparatus
 In one aspect of the present invention, a video recording/playback apparatus comprises: imaging means provided on a mobile body; video recording means for recording video captured by the imaging means; traveling state analysis means for obtaining a traveling state of the mobile body; and importance determination means for determining and setting an importance level for each scene of the video based on the obtained traveling state.
 The above video recording/playback apparatus includes imaging means such as a camera provided on the mobile body, video recording means, traveling state analysis means, and importance determination means. The imaging means captures, for example, a traveling video of the surroundings of the moving mobile body. The video recording means records the video captured by the imaging means. The traveling state analysis means obtains the traveling state of the mobile body. The importance determination means determines and sets an importance level for each scene of the video based on the obtained traveling state. In this way, an appropriate importance level can be set for each scene of the video, and important scenes are not missed when scheduling digest playback.
 In another aspect of the above video recording/playback apparatus, the traveling state analysis means obtains the traveling state based on feature amounts obtained by analyzing at least one of the images and the audio included in the video. Here, image feature amounts are, for example, luminance, color histograms, edge information, and motion information, and audio feature amounts are, for example, sound pressure power and frequency characteristics of the sound. By using image feature amounts, image information such as road markings, vehicles, and pedestrians can be obtained as information indicating the traveling state of the mobile body. By using audio feature amounts, audio information such as human voices, crowd noise, and horn sounds can be obtained as information indicating the traveling state of the mobile body.
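 As a rough illustration of such feature extraction, the sketch below computes a few of the feature amounts named above (luminance, a coarse color histogram, edge and motion information, sound pressure power, and a frequency characteristic). The array shapes, bin counts, and function names are assumptions for illustration only and are not taken from the specification.

```python
import numpy as np

def image_features(frame: np.ndarray) -> dict:
    """Simple image feature amounts for one RGB frame (H x W x 3, uint8)."""
    gray = frame.mean(axis=2)                                # rough per-pixel luminance
    luminance = float(gray.mean())                           # average luminance
    hist, _ = np.histogram(frame, bins=16, range=(0, 255))   # coarse color histogram
    gy, gx = np.gradient(gray)                               # gradients as edge information
    edge_strength = float(np.hypot(gx, gy).mean())
    return {"luminance": luminance,
            "color_hist": (hist / hist.sum()).tolist(),
            "edge_strength": edge_strength}

def motion_feature(prev: np.ndarray, cur: np.ndarray) -> float:
    """Very rough motion information: mean absolute frame difference."""
    return float(np.abs(cur.astype(int) - prev.astype(int)).mean())

def audio_features(samples: np.ndarray, rate: int) -> dict:
    """Sound pressure power (RMS) and one coarse frequency characteristic."""
    rms = float(np.sqrt(np.mean(samples.astype(float) ** 2)))
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    centroid = float((freqs * spectrum).sum() / (spectrum.sum() + 1e-9))
    return {"rms_power": rms, "spectral_centroid_hz": centroid}
```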
 In another aspect of the above video recording/playback apparatus, the traveling state analysis means obtains the traveling state based on sensor signals from at least one of a GPS sensor, an acceleration sensor, a gyro sensor, and a vehicle speed pulse sensor. This makes it possible to obtain information such as the traveling position, vehicle speed state, acceleration/deceleration state, heading, turning state, and inclination state as information indicating the traveling state of the mobile body.
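 The following is a minimal sketch of how such quantities might be derived from raw sensor samples; the pulse count per wheel revolution, the wheel circumference, and the sample structure are illustrative assumptions, not values from the specification.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    t: float            # time [s]
    speed_pulses: int   # vehicle speed pulses counted since the previous sample
    yaw_rate: float     # gyro angular velocity [deg/s]

PULSES_PER_REV = 4          # assumed pulses per wheel revolution
WHEEL_CIRCUMFERENCE = 1.9   # assumed wheel circumference [m]

def travel_quantities(prev: SensorSample, cur: SensorSample,
                      prev_speed: float, prev_heading: float):
    """Return (speed [m/s], acceleration [m/s^2], heading [deg]) for the current sample."""
    dt = cur.t - prev.t
    distance = cur.speed_pulses / PULSES_PER_REV * WHEEL_CIRCUMFERENCE
    speed = distance / dt
    acceleration = (speed - prev_speed) / dt
    heading = (prev_heading + cur.yaw_rate * dt) % 360.0
    return speed, acceleration, heading
```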
 In another aspect, the above video recording/playback apparatus has map information recording means that records map information, and the traveling state analysis means obtains the traveling state based on the map information. This makes it possible to obtain information such as the address, altitude state, nearby geographical spots, road state, and scenery information as information indicating the traveling state of the mobile body.
 In another aspect, the above video recording/playback apparatus has travel history recording means that records a travel history of the mobile body, and the importance determination means determines and sets an importance level for each scene of the video based on the travel history and the traveling state. This makes it possible to set, for each scene of the video, an appropriate importance level corresponding to the number of times the route has been traveled. For example, when scheduling digest playback, scenes of familiar roads can be given lower priority and scenes of unfamiliar roads can be given higher priority.
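 As a hedged sketch of this idea, the snippet below adds a travel-count bonus to the summed driving-state importance, in the spirit of the travel-count importance determination table (FIG. 15); the thresholds and values are invented for illustration, since the actual table contents are not reproduced here.

```python
# Illustrative travel-frequency importance table; values are assumptions.
TRAVEL_COUNT_IMPORTANCE = [
    (1, 3),    # first time on this route -> large importance bonus
    (5, 2),    # traveled a few times     -> medium bonus
    (20, 1),   # fairly familiar          -> small bonus
]

def travel_count_importance(times_traveled: int) -> int:
    """Fewer past traversals of the route -> larger added importance."""
    for limit, importance in TRAVEL_COUNT_IMPORTANCE:
        if times_traveled <= limit:
            return importance
    return 0   # very familiar route: no bonus

def scene_importance_with_history(state_importances: list[int], times_traveled: int) -> int:
    """Scene importance = sum of driving-state importances + travel-count importance."""
    return sum(state_importances) + travel_count_importance(times_traveled)

print(scene_importance_with_history([2, 1], times_traveled=1))   # -> 6
print(scene_importance_with_history([2, 1], times_traveled=50))  # -> 3
```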
 In another aspect, the above video recording/playback apparatus has personal information recording means that records personal information of the user, and the importance determination means determines and sets an importance level for each scene of the video based on the personal information and the traveling state. This makes it possible to set, for each scene of the video, an appropriate importance level corresponding to the personal information. For example, when scheduling digest playback, scenes of familiar places such as the area around the user's home or the commuting route can be given lower priority, and scenes of places associated with the user's memories can be given higher priority.
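 A minimal sketch of this adjustment is shown below; the coordinates, the distance threshold, and the adjustment values stand in for the contents of the personal information database 366 and the table of FIG. 17 and are purely illustrative.

```python
import math

def distance_m(p: tuple[float, float], q: tuple[float, float]) -> float:
    """Approximate great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371000 * 2 * math.asin(math.sqrt(a))

# Illustrative entries standing in for the personal information database 366.
HOME = (35.6586, 139.7454)
MEMORY_PLACES = [(35.3606, 138.7274)]   # e.g. a memorable trip destination
NEAR = 500.0                            # "near" threshold in meters (assumed)

def personal_info_importance(scene_position: tuple[float, float]) -> int:
    """Negative adjustment near home or the commute, positive at places with memories."""
    if distance_m(scene_position, HOME) < NEAR:
        return -2
    if any(distance_m(scene_position, p) < NEAR for p in MEMORY_PLACES):
        return +3
    return 0
```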
 In another aspect, the above video recording/playback apparatus has instruction means with which the user indicates information identifying an important scene, and the importance determination means determines and sets an importance level for each scene of the video based on the information indicated by the user and the traveling state. This makes it possible, when scheduling digest playback, to preferentially schedule the important scenes indicated by the user.
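 The following sketch raises the importance of scenes containing an instant indicated through such instruction means; the Scene structure and the size of the bonus are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Scene:
    start: float        # scene start time within the traveling video [s]
    end: float          # scene end time [s]
    importance: int     # importance from driving-state analysis

USER_BONUS = 5          # fixed bonus for user-indicated scenes (assumed value)

def apply_user_indications(scenes: list[Scene], indicated_times: list[float]) -> None:
    """Raise the importance of every scene containing a user-indicated instant."""
    for scene in scenes:
        if any(scene.start <= t < scene.end for t in indicated_times):
            scene.importance += USER_BONUS

scenes = [Scene(0, 30, 2), Scene(30, 90, 1)]
apply_user_indications(scenes, indicated_times=[45.0])
print([s.importance for s in scenes])   # -> [2, 6]
```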
 In another aspect, the above video recording/playback apparatus comprises scene division means that divides the video into a plurality of scenes based on the traveling state. This makes it possible to divide the video into meaningful units.
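 A minimal sketch of scene division at traveling-state change points, assuming a per-frame state label has already been derived (the label set is illustrative):

```python
def split_into_scenes(frame_states: list[str]) -> list[tuple[int, int]]:
    """Split a video into scenes at change points of a per-frame traveling state.

    frame_states holds one label per frame, e.g. "stopped", "normal", "winding".
    Returns (start_frame, end_frame) pairs, end exclusive.
    """
    scenes, start = [], 0
    for i in range(1, len(frame_states)):
        if frame_states[i] != frame_states[i - 1]:   # traveling-state change point
            scenes.append((start, i))
            start = i
    if frame_states:
        scenes.append((start, len(frame_states)))
    return scenes

# Example: a stop, normal driving, then a winding section become three scenes.
print(split_into_scenes(["stopped"] * 3 + ["normal"] * 4 + ["winding"] * 2))
# -> [(0, 3), (3, 7), (7, 9)]
```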
 In another aspect of the above video recording/playback apparatus, the importance determination means determines and sets an importance level for each frame of the video based on the traveling state of the mobile body. This makes it possible to assign indexes to important frames, and the user can find important scenes based on those indexes.
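 The following sketch illustrates this per-frame variant: within each contiguous section whose importance exceeds a threshold, the frame with the highest importance is indexed (in the spirit of FIG. 20). The data layout and threshold are assumptions.

```python
def index_frames(importance_per_frame: list[float], threshold: float) -> list[int]:
    """Return the frame numbers to index: within each contiguous run of frames whose
    importance exceeds the threshold, pick the frame with the highest importance."""
    indexes, run_start = [], None
    for i, value in enumerate(importance_per_frame + [threshold]):  # sentinel closes last run
        if value > threshold and run_start is None:
            run_start = i
        elif value <= threshold and run_start is not None:
            run = importance_per_frame[run_start:i]
            indexes.append(run_start + run.index(max(run)))
            run_start = None
    return indexes

# Example: two sections exceed a threshold of 3; their peak frames are indexed.
print(index_frames([1, 4, 5, 2, 1, 6, 7, 6, 1], threshold=3))  # -> [2, 6]
```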
 In another aspect, the above video recording/playback apparatus has digest playback control means that controls playback by creating a digest playback schedule according to the importance level set for each scene of the video. This makes it possible to schedule digest playback appropriately.
 In a preferred embodiment of the above video recording/playback apparatus, the digest playback control means schedules a video scene more preferentially as the importance level set for that scene becomes higher.
 In a preferred embodiment of the above video recording/playback apparatus, the digest playback control means plays back a video scene faster as the importance level set for that scene becomes lower. This makes it possible to perform digest playback in a short time.
 In a preferred embodiment of the above video recording/playback apparatus, the digest playback control means plays back a video scene faster as the time length of that scene becomes longer. This also makes it possible to perform digest playback in a short time.
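 The sketch below combines these two tendencies into a single playback-speed rule. The formula and constants are assumptions, not taken from the specification; only the monotonic behavior (lower importance and longer scenes play faster) follows the text.

```python
def playback_speed(importance: int, length_s: float,
                   max_importance: int = 10, base_length_s: float = 30.0) -> float:
    """Illustrative rule: speed grows as importance drops and as scene length grows."""
    importance_factor = 1.0 + (max_importance - importance) / max_importance  # 1.0 .. 2.0
    length_factor = max(1.0, length_s / base_length_s)                        # >= 1.0
    return min(importance_factor * length_factor, 4.0)   # cap at 4x for watchability

# A short, highly important scene plays near 1x; a long, unimportant one plays fast.
print(playback_speed(importance=9, length_s=20))    # -> about 1.1x
print(playback_speed(importance=2, length_s=120))   # -> capped at 4.0x
```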
 In another aspect of the present invention, a video recording/playback method comprises: an imaging step of capturing video with imaging means provided on a mobile body; a video recording step of recording the video captured by the imaging means; a traveling state analysis step of obtaining a traveling state of the mobile body; and an importance determination step of setting an importance level for each scene of the video based on the obtained traveling state. This video recording/playback method also makes it possible to set an appropriate importance level for each scene of the video, so that important scenes are not missed when scheduling digest playback.
 In still another aspect of the present invention, a video recording/playback program executed by a navigation device having imaging means causes the navigation device to function as: video recording means for recording video captured by the imaging means; traveling state analysis means for obtaining a traveling state of the mobile body; and importance determination means for determining and setting an importance level for each scene of the video based on the obtained traveling state. This video playback program also makes it possible to set an appropriate importance level for each scene of the video.
 Hereinafter, preferred embodiments of the present invention will be described with reference to the drawings.
 [Navigation device]
 First, an example of a navigation device to which the video recording/playback apparatus of the present invention is applied will be described.
 FIG. 1 shows an example of a vehicle equipped with a navigation device to which the video recording/playback apparatus of the present invention is applied. FIG. 1 shows a vehicle 90 driven by a user, on which a navigation device 1 is mounted.
 In the first embodiment, the navigation device 1 mounted on the vehicle 90 includes an imaging device 70. The imaging device 70 captures the scenery outside the traveling vehicle 90, that is, it captures a traveling video of the vehicle 90. The imaging direction of the imaging device 70 is not limited to the front of the vehicle 90; the left and right sides, the rear, or any other direction may be captured. The imaging direction is also not limited to a single direction and may cover a plurality of directions. When the navigation device 1 functions as a video recording/playback apparatus, it determines and sets an importance level for each scene of the video captured by the imaging device 70 based on the traveling state of the vehicle 90.
 The device configuration of the navigation device 1 will be described with reference to FIG. 2. FIG. 2 shows an example of the device configuration of the navigation device. As shown in FIG. 2, the navigation device 1 comprises a self-contained positioning device 10, a GPS receiver 18, a system controller 20, a disk drive 31, a data recording unit 36, a communication interface 37, a communication device 38, a display unit 40, an audio output unit 50, an input device 60, and the imaging device 70.
 The self-contained positioning device 10 includes an acceleration sensor 11, a gyro sensor 12, and a vehicle speed pulse sensor 13. The acceleration sensor 11 is formed of, for example, a piezoelectric element, detects the acceleration of the vehicle 90, and outputs acceleration data. The gyro sensor 12 is formed of, for example, a vibrating gyroscope, detects the angular velocity of the vehicle 90 when the vehicle 90 changes direction, and outputs angular velocity data and relative heading data. The vehicle speed pulse sensor 13 measures vehicle speed pulses, which are pulse signals generated as the wheels of the vehicle 90 rotate.
 The GPS sensor (hereinafter simply referred to as "GPS") 18 receives radio waves 19 carrying downlink data including positioning data (i.e., GPS data) from a plurality of GPS satellites, which is used to detect the absolute position of the vehicle 90 from latitude and longitude information and the like.
 The system controller 20 includes an interface 21, a CPU (Central Processing Unit) 22, a ROM (Read Only Memory) 23, and a RAM (Random Access Memory) 24, and is configured to control the entire navigation device 1.
 The interface 21 performs interface operations with the acceleration sensor 11, the gyro sensor 12, the vehicle speed pulse sensor 13, and the GPS 18. From these, it inputs the vehicle speed pulses as well as acceleration data, relative heading data, angular velocity data, GPS positioning data, absolute heading data, and the like to the system controller 20. The CPU 22 controls the entire system controller 20. The ROM 23 has a nonvolatile memory (not shown) or the like in which a control program for controlling the system controller 20 and the like are stored. The RAM 24 readably stores various data such as route data set in advance by the user via the input device 60, and provides a working area for the CPU 22.
 The system controller 20, the disk drive 31, the data recording unit 36, the communication interface 37, the display unit 40, the audio output unit 50, the input device 60, and the imaging device 70 are connected to one another via a bus line 30.
 The data recording unit 36 is configured by, for example, an HDD, and records various data used for navigation processing such as map information, traveling video data captured by the imaging device 70, and analysis data analyzed by the system controller 20. The disk drive 31 reads and outputs content data such as music data and video data from a disc 33 under the control of the system controller 20. The disk drive 31 also records the above-described traveling video data and analysis data on the disc 33 under the control of the system controller 20. As the disc 33, various discs such as a DVD±RW are applicable.
 The communication device 38 is configured by, for example, an FM tuner or a beacon receiver, and acquires information distributed from a VICS (Vehicle Information Communication System) center or the like.
 The imaging device 70 is, for example, a camera and, as described above, captures a traveling video while the vehicle 90 is traveling. Here, the imaging device 70 is assumed to also include a microphone, and the traveling video is assumed to include audio data in addition to images. The imaging device 70 is controlled by the system controller 20.
 The display unit 40 displays various display data and video captured by the imaging device 70 on the display screen of a display 44 under the control of the system controller 20. For example, when map information is displayed on the display screen as display data, the system controller 20 reads the map information from the data recording unit 36, and the display unit 40 displays on the display screen the map information thus read. The display unit 40 comprises a graphic controller 41 that controls the entire display unit 40 based on control data sent from the CPU 22 via the bus line 30, a buffer memory 42 formed of a memory such as a VRAM (Video RAM) that temporarily stores image information ready for immediate display, a display control unit 43 that controls the display 44, such as a liquid crystal display or a CRT (Cathode Ray Tube), based on the image data output from the graphic controller 41, and the display 44 itself. The display 44 is formed of, for example, a liquid crystal display device with a diagonal size of about 5 to 10 inches, and is mounted near the front panel inside the vehicle.
 The audio output unit 50 comprises a D/A converter 51 that performs D/A (Digital to Analog) conversion of digital audio data sent from the RAM 24 or the like via the bus line 30 under the control of the system controller 20, an amplifier (AMP) 52 that amplifies the analog audio signal output from the D/A converter 51, and a speaker 53 that converts the amplified analog audio signal into sound and outputs it into the vehicle.
 The input device 60 is composed of keys, switches, buttons, a remote controller, a voice input device, and the like for inputting various commands and data. The input device 60 is arranged around the front panel of the main body of the in-vehicle electronic system mounted in the vehicle and around the display 44. When the display 44 is of a touch panel type, the touch panel provided on the display screen of the display 44 also functions as the input device 60.
 [First embodiment]
 A video recording/playback apparatus according to the first embodiment will be described. FIG. 3 shows the functional configuration of the video recording/playback apparatus 100 according to the first embodiment. The video recording/playback apparatus 100 is in practice realized by the components of the navigation device 1 shown in FIG. 2. This is described in detail below.
 The video recording/playback apparatus 100 according to the first embodiment comprises the imaging device 70, the GPS 18, the acceleration sensor 11, the gyro sensor 12, the vehicle speed pulse sensor 13, a traveling video recording unit 221, an image/audio analysis unit 222, a traveling state analysis unit 223, a scene division unit 224, a scene importance determination unit 225, a digest playback control unit 226, a traveling video database 361, a traveling state database 362, a scene database 364, and a map database 363.
 Here, the traveling video recording unit 221, the image/audio analysis unit 222, the traveling state analysis unit 223, the scene division unit 224, the scene importance determination unit 225, and the digest playback control unit 226 are realized, for example, by the system controller 20 executing a program. The traveling video database 361, the traveling state database 362, the scene database 364, and the map database 363 are recorded in the data recording unit 36 or on the writable disc 33.
 The video recording/playback apparatus 100 according to the first embodiment obtains the traveling state of the vehicle 90 based on sensor signals from the various sensors and the like, and divides the traveling video into scenes based on the obtained traveling state. The video recording/playback apparatus 100 also determines and sets an importance level for each of the divided scenes based on the obtained traveling state. Each function of the video recording/playback apparatus 100 according to the first embodiment is described in detail below.
 The traveling video recording unit 221 records the traveling video captured by the imaging device 70 in the traveling video database 361. FIG. 4 shows an example of the information recorded in the traveling video database 361. As shown in FIG. 4, the traveling video database 361 records a traveling video ID that uniquely identifies the traveling video, the recording start date and time, the time length, the file name, and the like.
 The image/audio analysis unit 222 analyzes the images and audio of the traveling video and extracts feature amounts. Here, the feature amounts are, for image feature amounts (hereinafter referred to as "image feature amounts"), for example luminance, color histograms, edge information, and motion information, and for audio feature amounts (hereinafter referred to as "audio feature amounts"), for example sound pressure power and frequency characteristics of the sound.
 The map database 363 stores in advance map information used as reference information by the traveling state analysis unit 223. The map information includes, for example, road traffic information such as addresses, elevations, slopes, intersections, grade-separated crossings, railroad crossings, tunnels, and bridges; road attribute information such as road width, number of lanes, speed limits, expressways, ordinary roads, residential roads, and parking lots; landmark information such as shops, facilities, famous buildings, parks, amusement parks, and tourist attractions; terrain information such as seas, lakes, rivers, mountains, and scenic viewpoints; regulatory information under road traffic law such as speed limits and one-way streets; and near-miss locations.
 The traveling state analysis unit 223 obtains various traveling states of the vehicle 90 based on the sensor signals detected by the GPS 18, the acceleration sensor 11, the gyro sensor 12, and the vehicle speed pulse sensor 13, the feature amounts obtained by the image/audio analysis unit 222, and the map information recorded in the map database 363. The traveling state analysis unit 223 records the obtained traveling states, together with the various sensor signals and feature amounts, in the traveling state database 362. In the traveling state database 362, the various sensor signals, the feature amounts, the traveling states obtained from them, and the frame IDs that uniquely identify the frames corresponding to those traveling states (or the time positions within the traveling video) are recorded in association with one another. It goes without saying that the traveling state database 362 need not manage the sensor signals, feature amounts, and traveling states in a single database, and may instead be composed of a plurality of databases that manage them separately.
 The scene division unit 224 divides the traveling video into scenes based on the traveling states obtained by the traveling state analysis unit 223. Criteria for scene division include, for example, change points of the traveling state and characteristic points in time of the traveling state. The scene division unit 224 records scene information indicating the divided scenes in the scene database 364.
 The scene importance determination unit 225 determines and sets an importance level (hereinafter referred to as "scene importance") for each scene based on the traveling state in each of the scenes divided by the scene division unit 224. The higher the scene importance, the higher the probability that the scene will be selected and played back during digest playback. The scene importance determination unit 225 records the scene importance obtained for each scene in the scene database 364.
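 As a hedged sketch, the snippet below sums per-state importance values over a scene, in the spirit of the importance determination tables of FIG. 8 and FIG. 9; the state names and values are invented for illustration and do not reproduce the actual tables.

```python
# Illustrative driving-state importance table; entries are assumptions.
STATE_IMPORTANCE = {
    "sudden_braking": 3,
    "winding_section": 2,
    "scenic_viewpoint": 3,
    "railroad_crossing": 1,
    "traffic_jam": 0,
}

def scene_importance(states_in_scene: list[str]) -> int:
    """Scene importance = sum of the importance values of the driving states
    observed in the scene (unknown states contribute nothing)."""
    return sum(STATE_IMPORTANCE.get(state, 0) for state in states_in_scene)

# Example: a scene containing a winding section with a scenic viewpoint.
print(scene_importance(["winding_section", "scenic_viewpoint"]))  # -> 5
```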
 The digest playback control unit 226 performs playback control by creating a schedule for digest playback of the traveling video. Specifically, the digest playback control unit 226 selects the scenes to be played back and determines the playback speed according to the target playback time and the scene importance.
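 A minimal scheduling sketch under the assumption of a simple greedy rule: scenes are taken in descending order of importance until a target playback time is filled, then played in their original order. The greedy rule itself is an assumption; only the tendency that higher-importance scenes are scheduled preferentially comes from the text.

```python
from dataclasses import dataclass

@dataclass
class Scene:
    scene_id: int
    length_s: float
    importance: int

def schedule_digest(scenes: list[Scene], target_s: float) -> list[int]:
    """Pick the most important scenes that fit in target_s, keep original order."""
    chosen, total = [], 0.0
    for scene in sorted(scenes, key=lambda s: s.importance, reverse=True):
        if total + scene.length_s <= target_s:
            chosen.append(scene)
            total += scene.length_s
    return [s.scene_id for s in sorted(chosen, key=lambda s: s.scene_id)]

scenes = [Scene(1, 40, 2), Scene(2, 30, 5), Scene(3, 60, 1), Scene(4, 20, 4)]
print(schedule_digest(scenes, target_s=60))  # -> [2, 4]
```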
 The display unit 401 is a display device that performs normal playback and digest playback of the traveling video. When playback is performed inside the vehicle 90, the display unit 401 corresponds, for example, to the display unit 40. Needless to say, the display unit 401 is not limited to the display unit 40, and another display device may be used. The display unit 401 also provides a GUI (Graphical User Interface) display of the operation menu, displays information related to the traveling video, and so on.
 (Method of calculating the traveling state)
 Next, the method of calculating the traveling state will be described in detail. FIG. 5 shows an example of the information stored in the traveling state database 362. As shown in FIG. 5, the types of traveling state of the vehicle include, for example, the traveling position, vehicle speed state, acceleration/deceleration state, heading, turning state, inclination state, image information, audio information, address, section boundary, altitude state, geographical spot, road state, scenery state, and near-miss location. The traveling state analysis unit 223 obtains each traveling state based on the various sensors and feature amounts. The traveling state analysis unit 223 then records the obtained traveling state in the traveling state database 362 in association with a frame in the traveling video (or a time position within the traveling video). Each traveling state is described in detail below.
 The traveling position of the vehicle 90 is obtained based on a sensor such as the GPS 18. Specifically, the traveling state analysis unit 223 obtains the traveling position of the vehicle 90 as latitude and longitude based on the sensor signal from the GPS 18. Alternatively, the traveling state analysis unit 223 may obtain the traveling position of the vehicle 90 by dead reckoning based on the sensor signals from the gyro sensor 12 and the vehicle speed pulse sensor 13. Furthermore, the traveling state analysis unit 223 may correct the obtained traveling position of the vehicle 90 using map matching, which collates it with the map information. The traveling state analysis unit 223 records the obtained traveling position of the vehicle 90 in the traveling state database 362.
 The vehicle speed state of the vehicle 90 is obtained based on the sensor signal from the vehicle speed pulse sensor 13. Specifically, the traveling state analysis unit 223 obtains the vehicle speed based on the sensor signal from the vehicle speed pulse sensor 13. Then, based on the obtained vehicle speed, the traveling state analysis unit 223 determines which vehicle speed state the vehicle 90 is in, for example stopped, normal traveling, congested section, low-speed traveling, or high-speed traveling. The traveling state analysis unit 223 records the obtained vehicle speed and vehicle speed state in the traveling state database 362.
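 A minimal classification sketch, assuming the speed has already been computed in km/h; the thresholds are illustrative, only the set of states follows the text.

```python
def vehicle_speed_state(speed_kmh: float, congested: bool = False) -> str:
    """Classify the vehicle speed state from the current speed (thresholds assumed)."""
    if speed_kmh < 1:
        return "stopped"
    if congested:
        return "congested_section"
    if speed_kmh < 20:
        return "low_speed_traveling"
    if speed_kmh < 80:
        return "normal_traveling"
    return "high_speed_traveling"

print(vehicle_speed_state(55))         # -> normal_traveling
print(vehicle_speed_state(8, True))    # -> congested_section
```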
 The acceleration/deceleration state of the vehicle 90 is obtained based on sensor signals such as those from the acceleration sensor 11. Specifically, the traveling state analysis unit 223 obtains the acceleration based on the sensor signal from the acceleration sensor 11 or by analyzing changes in the traveling position. Then, based on the obtained acceleration, the traveling state analysis unit 223 determines whether the vehicle 90 is in an acceleration/deceleration state such as sudden acceleration or sudden braking. The traveling state analysis unit 223 records the obtained acceleration and acceleration/deceleration state in the traveling state database 362.
 The heading and turning state of the vehicle 90 are obtained based on sensor signals such as those from the gyro sensor 12. Specifically, the traveling state analysis unit 223 obtains the heading, angular velocity, and angular acceleration based on the sensor signal from the gyro sensor 12 or by analyzing changes in the traveling position. Then, based on the obtained angular velocity and angular acceleration, the traveling state analysis unit 223 determines which turning state the vehicle 90 is in, for example a right/left-turn section, a straight section, or a winding section. The traveling state analysis unit 223 records the obtained heading, angular velocity, angular acceleration, and turning state in the traveling state database 362.
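 A minimal sketch of turning-state classification from a short window of gyro yaw rates; the thresholds and windowing are assumptions, only the set of states comes from the text.

```python
def turning_state(yaw_rates_deg_s: list[float]) -> str:
    """Classify the turning state over a window of yaw-rate samples (deg/s)."""
    strong = [r for r in yaw_rates_deg_s if abs(r) > 10]   # noticeable turning
    if not strong:
        return "straight_section"
    signs = {r > 0 for r in strong}
    if len(signs) == 2:                                    # alternating left and right turns
        return "winding_section"
    return "right_left_turn_section"

print(turning_state([0.5, 1.0, -0.8]))          # -> straight_section
print(turning_state([15, 18, 12, 9]))           # -> right_left_turn_section
print(turning_state([14, -16, 13, -15, 12]))    # -> winding_section
```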
 The inclination state of the vehicle 90 is obtained based on sensor signals such as those from the acceleration sensor 11. Specifically, the traveling state analysis unit 223 obtains the inclination angle of the vehicle 90 based on the sensor signal from the acceleration sensor 11 or by collating the traveling position with the map information. Then, based on the obtained inclination angle, the traveling state analysis unit 223 determines which inclination state the vehicle 90 is in, for example flat, uphill, steep uphill, downhill, or steep downhill. The traveling state analysis unit 223 records the obtained inclination angle and inclination state in the traveling state database 362.
 The image information is obtained based on the image feature amounts extracted by the image/audio analysis unit 222. Specifically, the traveling state analysis unit 223 analyzes image feature amounts such as luminance, color histograms, edge information, and motion information to obtain image information indicating markings and signs such as white lines, road signs, and guide signs, as well as image information indicating traffic lights, vehicles, pedestrians, and the like. In addition, the traveling state analysis unit 223 may improve the detection accuracy of this image information by collating the traveling position with the map information or by using the audio feature amounts. The traveling state analysis unit 223 records the obtained luminance, color histograms, edge information, motion information, image information, and the display position of that image information within the frame in the traveling state database 362.
 The audio information is obtained based on the audio feature amounts extracted by the image/audio analysis unit 222. Specifically, the traveling state analysis unit 223 analyzes audio feature amounts such as sound pressure power and frequency characteristics to obtain audio information indicating human voices, crowd noise, horn sounds, brake squeal, and the like. In addition, the traveling state analysis unit 223 may improve the detection accuracy of this audio information by collating the traveling position with the map information or by using the image feature amounts. The traveling state analysis unit 223 records the obtained sound pressure power, frequency characteristics, and audio information in the traveling state database 362.
 Information such as the address of the traveling position and whether the traveling position is on an administrative boundary is obtained by matching the traveling position against the map information. Examples of administrative boundaries include prefectural borders, municipal borders, town or district borders, and block (chome) borders. Specifically, the traveling state analysis unit 223 obtains the address by matching the traveling position against the map information and determines whether the vehicle 90 is located on an administrative boundary. Alternatively, the traveling state analysis unit 223 may obtain the address by performing text analysis based on the image feature quantities. The traveling state analysis unit 223 records the obtained address and boundary information in the traveling state database 362.
 The altitude state is obtained by matching the traveling position against the map information. Specifically, the traveling state analysis unit 223 obtains the altitude by matching the traveling position against the map information. Based on the obtained altitude, the traveling state analysis unit 223 then determines whether the vehicle 90 is located at a characteristic altitude point, for example the highest point of a mountain pass, or the highest or lowest altitude point on the route. Alternatively, the navigation device 1 may be equipped with an altimeter, and the traveling state analysis unit 223 may obtain the altitude from the altimeter reading. The traveling state analysis unit 223 records the obtained altitude and whether the vehicle 90 is located at a characteristic altitude point in the traveling state database 362.
 Geographic spot information at or near the traveling position is obtained by the traveling state analysis unit 223 matching the traveling position against the map information. Geographic spot information includes, for example, items related to road traffic such as intersections, grade-separated crossings, railroad crossings, tunnels, and bridges; items related to landmarks such as stores, facilities, famous buildings, parks, amusement parks, and tourist attractions; and items related to terrain such as seas, lakes, rivers, mountains, and scenic viewpoints. The traveling state analysis unit 223 records the obtained geographic spot information in the traveling state database 362.
 The road type at the traveling position is obtained by the traveling state analysis unit 223 matching the traveling position against the map information. Road types include, for example, categories such as expressway, toll road, ordinary road, residential road, and parking lot, as well as road attributes such as road width and number of lanes, and road-traffic regulations such as speed limits and one-way restrictions. The traveling state analysis unit 223 records the road type of the traveling position obtained in this way in the traveling state database 362.
 The landscape state information indicating the scenery around the traveling position is obtained by the traveling state analysis unit 223 matching the traveling position against the map information. Examples of landscape state information include urban areas, farmland, mountainous areas, and coastline. The traveling state analysis unit 223 records the landscape state information obtained in this way in the traveling state database 362.
 Information such as whether the vehicle 90 is located at a near-miss point is obtained by the traveling state analysis unit 223 matching the traveling position against the map information. Here, a near-miss point means a dangerous or difficult location for road traffic, for example an intersection where traffic accidents occur frequently, a location with poor visibility, or a location where incidents one step short of an accident have occurred. The traveling state analysis unit 223 records information indicating whether the traveling position is a near-miss point in the traveling state database 362.
 (Scene division method)
 Next, the scene division method will be described. Since broadcast programs are generally edited content, they can be divided into scenes by detecting "shooting discontinuities", that is, scene changes. Scenes divided in this way roughly correspond to single meaningful units and play a very important role in understanding the meaning of the content. A traveling video, by contrast, contains no shooting discontinuities at all from the start of shooting until shooting ends, unless the user deliberately stops recording. In other words, for a traveling video it is difficult to divide scenes by detecting shooting discontinuities. For digest playback, however, understanding the meaning of the content is essential, and this requires dividing the traveling video into meaningful units.
 In the first embodiment, therefore, the scene division unit 224 divides the traveling video into scenes at the change points or characteristic time points of the traveling state recorded in the traveling state database 362. FIG. 6 lists examples of change points of the traveling state, and FIG. 7 lists examples of characteristic time points of the traveling state. Not all of these change points and characteristic time points need be used for scene division; it goes without saying that scene division may instead be performed using only those that are effective in a given system. The scene division unit 224 divides the traveling video into scenes with reference to the time points shown in FIGS. 6 and 7, and then records the divided scenes in the scene database 364. At this stage, the scene database 364 records, for example, the traveling video ID, a scene ID identifying each scene within the traveling video, and the start and end times of the scene within the traveling video (see FIG. 10 described later).
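 A minimal sketch of this division step is shown below, assuming the traveling state database can be read as a time-ordered list of (timestamp, state label) records; the record layout and function name are assumptions made for illustration.

def split_into_scenes(state_records):
    """Return (start_time, end_time) scene boundaries, cutting the
    traveling video wherever the traveling-state label changes between
    consecutive records."""
    scenes = []
    if not state_records:
        return scenes
    scene_start, prev_label = state_records[0]
    for timestamp, label in state_records[1:]:
        if label != prev_label:             # change point -> close the current scene
            scenes.append((scene_start, timestamp))
            scene_start = timestamp
            prev_label = label
    scenes.append((scene_start, state_records[-1][0]))
    return scenes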
 (Scene importance calculation method)
 Next, the method for calculating the scene importance will be described. Here, as an example, a method using a traveling-state importance determination table is described.
 FIGS. 8 and 9 show importance determination tables for traveling states. The traveling-state importance determination table predefines the importance of various traveling states. In FIGS. 8 and 9, the importance is expressed as a numerical value: a positive integer represents a positive evaluation, with larger values indicating higher importance, and a negative integer represents a negative evaluation, with smaller values indicating lower importance. The importance determination table is recorded in advance in the data recording unit 36 or the like.
 For each scene produced by the scene division unit 224, the scene importance determination unit 225 refers to the importance determination table to obtain the importance of each traveling state, and determines the scene importance of the scene by summing the obtained importance values. For example, suppose that in a given scene the vehicle 90 is traveling at low speed, the image information includes a guide sign, the geographic spot information includes an intersection and a tourist attraction, and the landscape state is an urban area. Referring to FIGS. 8 and 9, the scene importance is then obtained as follows.
  Scene importance = 0 (traveling) + (-5) (low-speed traveling) + 25 (guide sign) + 30 (intersection) + 50 (tourist attraction) + 15 (urban area) = 115
 The scene importance determination unit 225 obtains the scene importance of each scene in this way and records the obtained scene importance in the scene database 364. FIG. 10 shows an example of the data recorded in the scene database 364. As shown in FIG. 10, the scene database 364 at this point records, in addition to the traveling video ID, the scene ID identifying each scene within the traveling video, and the start and end times of the scene within the traveling video, the scene importance determined by the scene importance determination unit 225.
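 The table lookup and summation can be sketched as follows; the dictionary below holds only the example values quoted above, not the full contents of FIGS. 8 and 9, and the state labels are illustrative.

# Subset of the importance determination table, using only the values
# cited in the worked example above.
IMPORTANCE_TABLE = {
    "traveling": 0, "low_speed": -5, "guide_sign": 25,
    "intersection": 30, "tourist_attraction": 50, "urban_area": 15,
}

def scene_importance(observed_states):
    """Sum the predefined importance of every traveling state observed in a scene."""
    return sum(IMPORTANCE_TABLE.get(state, 0) for state in observed_states)

# The worked example from the text:
# scene_importance(["traveling", "low_speed", "guide_sign",
#                   "intersection", "tourist_attraction", "urban_area"]) == 115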
 FIG. 11 shows the tendency of the priority obtained by calculating the scene importance based on FIGS. 8 and 9. Here, the priority indicates the priority with which a scene is played back during digest playback. In FIG. 8, for example, for the traveling-state type "turning state", the importance of a "right/left-turn point" is set to 50, the importance of a "straight section" is set to -5, and the importance of a "winding section" is set to 10. Consequently, the tendency is that the priority of a scene rises when it contains a right/left-turn point or a winding section, and falls when it contains a straight section. Since a right/left-turn point has a higher importance than a winding section, a scene containing a right/left-turn point has a higher priority than a scene containing a winding section.
 In the examples shown in FIGS. 8 and 9 above, both positive and negative values are used for the importance, but the present invention is not limited to this. Instead, the importance may be expressed as an integer value of 0 or more, for example on a continuous scale from 0 to 100; in this case, the importance is set so that larger values indicate higher importance. Alternatively, the importance may be expressed on a three-level scale, for example important = 2, ordinary = 1, and dull = 0.
 By proceeding as described above, the scene importance of each scene in the traveling video can be obtained appropriately.
 (Digest playback method)
 Next, the digest playback method will be described. The digest playback control unit 226 performs scheduling to select the scenes to be played back and performs digest playback by playing back the scenes according to that schedule. More specifically, in the scheduling, the digest playback control unit 226 selects the scenes to be played back based on the scene importance and the scene time length (that is, the temporal length of each scene), subject to the target playback time of the digest playback. This is described in more detail below.
 First, the digest playback control unit 226 sets a target playback time T for the digest playback. The target playback time T is set to a time proportional to the total length of the traveling video, or to a predetermined time. Alternatively, the target playback time T may be set to a time entered by the user via the input device 60.
 The digest playback control unit 226 selects scenes from the scene database 364 in descending order of scene importance and determines a scene playback speed for each selected scene. The playback speed indicates the degree of fast playback, for example normal-speed playback, double-speed playback, ..., N-times-speed playback, where playback becomes faster as N increases. The playback speed is determined using, for example, one of the criteria (a) to (d) below. In this way, digest playback can be completed in a short time.
  (a) Scenes with high scene importance are played at normal speed; the lower the importance, the faster the playback.
  (b) Scenes selected earlier are played at normal speed; the later the selection, the faster the playback.
  (c) Scenes with a short scene time length are played at normal speed; the longer the scene, the faster the playback.
  (d) A combination of (a), (b), and (c) above.
 For each selected scene, the digest playback control unit 226 calculates a scene playback time Tsi from the scene time length and the scene playback speed. The scene playback time Tsi is calculated as follows.
  Scene playback time Tsi = scene time length / scene playback speed
 The digest playback control unit 226 calculates the scene playback time Tsi for each of the scenes selected from the scene database 364 in descending order of scene importance, and calculates the digest playback time Td by adding up these scene playback times Tsi. When the digest playback time Td reaches or exceeds the target playback time T, the digest playback control unit 226 schedules the scenes selected so far in the chronological order in which they were shot and at their designated playback speeds. The digest playback control unit 226 then plays back each scene on the display unit 401 according to the schedule determined in this way. In this way, digest playback that gives priority to important scenes can be performed.
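 A minimal sketch of this scheduling loop is shown below, assuming scenes are available as (scene ID, start time, end time, importance) tuples and that the playback speed follows criterion (b) with hypothetical speed steps; the concrete speed values are not fixed by the text.

def schedule_digest(scenes, target_time_t):
    """Pick scenes by descending importance until the summed playback
    time Td reaches the target T, then restore chronological order."""
    ranked = sorted(scenes, key=lambda s: s[3], reverse=True)
    selected, td = [], 0.0
    for rank, (scene_id, start, end, importance) in enumerate(ranked):
        speed = min(1.0 + rank, 4.0)        # (b): later selections play faster (placeholder steps)
        tsi = (end - start) / speed         # Tsi = scene time length / playback speed
        selected.append((start, scene_id, speed))
        td += tsi
        if td >= target_time_t:             # Td has reached the target T
            break
    return sorted(selected)                 # play back in shooting order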
 (Control processing)
 Next, the control processing performed by the video recording/playback apparatus 100 described above will be described with reference to the flowcharts of FIGS. 12 and 13.
 FIG. 12 is a flowchart showing the overall control processing in which the traveling video is divided into scenes based on the traveling state of the vehicle, an importance is set for each scene, and digest playback is performed. FIG. 13 is a flowchart showing the control processing for scheduling the digest playback.
 First, in step S101, a traveling video is shot by the imaging device 70, and the shot traveling video is input to the video recording/playback apparatus 100. The traveling video recording unit 221 records the input traveling video in the traveling video database 361.
 In step S102, various sensor signals from the GPS 18, the acceleration sensor 11, the gyro sensor 12, the vehicle speed pulse sensor 13, and the like are input to the video recording/playback apparatus 100. In step S103, the image/audio analysis unit 222 analyzes the traveling video and extracts image feature quantities and audio feature quantities. In step S104, the traveling state analysis unit 223 obtains the various traveling states based on the various sensor signals and feature quantities and records them in the traveling state database 362.
 In step S105, the scene division unit 224 divides the traveling video recorded in the traveling video database 361 into scenes based on the traveling state obtained in step S104, more specifically, using the change points and characteristic time points of the traveling state as boundaries. In this way, the traveling video can be divided into scenes of meaningful content. The scene division unit 224 records the divided scenes in the scene database 364.
 In step S106, the scene importance determination unit 225 determines and sets the scene importance for each scene recorded in the scene database 364, based on the various traveling states in that scene. In this way, an appropriate scene importance can be set for each scene.
 In step S107, the digest playback control unit 226 performs scheduling to select the important scenes to be played back. The scheduling control processing of step S107 will be described with reference to FIG. 13.
 First, in step S201, the digest playback control unit 226 sets the target playback time T. In the following step S202, the digest playback control unit 226 sets the digest playback time Td to "0". This step initializes the digest playback time Td.
 In step S203, the digest playback control unit 226 selects the scene with the highest scene importance among the scenes recorded in the scene database 364. In the following step S204, the digest playback control unit 226 determines the scene playback speed for the selected scene based on the scene importance, the scene time length, and the like.
 In step S205, the digest playback control unit 226 obtains the scene playback time Tsi for the selected scene by dividing the scene time length by the scene playback speed. In the following step S206, the digest playback control unit 226 sets the value obtained by adding the scene playback time Tsi to the digest playback time Td as the new digest playback time Td.
 In step S207, the digest playback control unit 226 determines whether the digest playback time Td is equal to or longer than the target playback time T. If it determines that the digest playback time Td is less than the target playback time T (step S207: No), it returns to step S203 and performs the same processing as described in steps S204 to S206 for the scene with the next-highest importance after the previously selected scene. If, on the other hand, it determines that the digest playback time Td is equal to or longer than the target playback time T (step S207: Yes), it proceeds to step S208 and schedules the scenes selected so far in the chronological order in which they were shot and at their designated playback speeds. The digest playback control unit 226 then proceeds to step S108 in FIG. 12.
 Returning to FIG. 12, in step S108 the digest playback control unit 226 plays back the scenes according to the schedule set in the processing of FIG. 13. The digest playback control unit 226 then ends this control processing.
 As described above, in the first embodiment, the traveling state of the vehicle is obtained based on sensor signals from the various sensors and the like, and the traveling video is divided into scenes based on the obtained traveling state. In the first embodiment, a scene importance is also set for each divided scene based on the traveling state. By dividing the traveling video into scenes based on the traveling state, the traveling video can be divided into scenes of meaningful content. Furthermore, by setting a scene importance for each divided scene based on the traveling state, an appropriate scene importance can be set for each scene, so that no important scene is missed when scheduling the digest playback.
 [Second Embodiment]
 Next, a video recording/playback apparatus according to the second embodiment will be described.
 A road on which the user always drives is a place the user is familiar with, so scenes shot while driving on such a road should arguably be excluded from digest playback. Conversely, a road driven for the first time or only a few times is a place of high novelty for the user, so scenes shot while driving on such a road should be actively played back during digest playback. In the video recording/playback apparatus according to the second embodiment, therefore, the scene importance of each scene of the traveling video is set based on the travel history in addition to the traveling state.
 FIG. 14 shows the functional configuration of a video recording/playback apparatus 100a according to the second embodiment. In FIG. 14, functions identical to those of the video recording/playback apparatus 100 according to the first embodiment are denoted by the same reference numerals as in FIG. 3. As with the video recording/playback apparatus 100 according to the first embodiment, the video recording/playback apparatus 100a according to the second embodiment is in practice realized by the components of the navigation device 1 shown in FIG. 1.
 As can be seen from FIG. 14, the video recording/playback apparatus 100a according to the second embodiment has a travel history database 365 in addition to the functional configuration of the video recording/playback apparatus 100 according to the first embodiment.
 In addition to recording the traveling states obtained from the various sensor signals or feature quantities in the traveling state database 362, the traveling state analysis unit 223 backs up the traveling positions recorded in the traveling state database 362 to the travel history database 365 as appropriate.
 The scene importance determination unit 225 obtains, based on the travel history recorded in the travel history database 365, the number of times the route in each divided scene has been traveled. Based on the obtained travel count, the scene importance determination unit 225 obtains an importance corresponding to the travel count of the route in each scene, using, for example, a travel-count importance determination table such as that shown in FIG. 15. The scene importance determination unit 225 then calculates the scene importance of each scene by adding the importance corresponding to the travel count to the sum of the importance values of the traveling states.
 For example, in the travel-count importance determination table shown in FIG. 15, routes traveled fewer times are assigned higher importance and routes traveled more times are assigned lower importance. As a result, the scene importance of each scene is set higher for scenes whose corresponding route has been traveled fewer times and lower for scenes whose corresponding route has been traveled more times. Consequently, roads on which the user always drives receive a low priority during digest playback, while roads driven for the first time or only a few times receive a high priority.
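 A minimal sketch of this adjustment is shown below; since FIG. 15 is not reproduced in the text, the travel-count breakpoints and importance values are placeholders.

def travel_count_importance(travel_count):
    """Fewer past traversals -> higher importance (placeholder values)."""
    if travel_count == 0:
        return 50        # first time on this route
    if travel_count <= 5:
        return 20
    return -30           # routinely traveled route

def scene_importance_with_history(base_importance, travel_count):
    """Add the travel-count importance to the traveling-state sum."""
    return base_importance + travel_count_importance(travel_count)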
 As can be seen from the above, in the second embodiment the scene importance of each scene of the traveling video is set based on the travel history in addition to the traveling state. In this way, a scene importance appropriate to the travel count can be set for each scene.
 [Third Embodiment]
 Next, a video recording/playback apparatus according to the third embodiment will be described.
 The area around the user's home and commuting route is familiar territory, so scenes of such places should arguably be excluded from digest playback. Conversely, scenes of places that hold memories for the user carry strong personal significance and should be actively played back during digest playback. In the video recording/playback apparatus according to the third embodiment, therefore, the scene importance of each scene of the traveling video is set based on personal information about the user in addition to the traveling state.
 FIG. 16 shows the functional configuration of a video recording/playback apparatus 100b according to the third embodiment. In FIG. 16, functions identical to those of the video recording/playback apparatus 100 according to the first embodiment are denoted by the same reference numerals as in FIG. 3. As with the video recording/playback apparatus 100 according to the first embodiment, the video recording/playback apparatus 100b according to the third embodiment is in practice realized by the components of the navigation device 1 shown in FIG. 1.
 As can be seen from FIG. 16, the video recording/playback apparatus 100b according to the third embodiment has a personal information database 366 in addition to the functional configuration of the video recording/playback apparatus 100 according to the first embodiment.
 The personal information database 366 records, as personal information, for example the locations of well-known places such as the user's home, parents' home, and workplace, the commuting route, and the locations of places that hold memories. These data are recorded in advance by the user.
 The scene importance determination unit 225 determines whether the traveling position in each scene is near a position or route recorded as personal information in the personal information database 366. If so, the scene importance determination unit 225 obtains an importance corresponding to the personal information for that scene, using, for example, a personal-information importance determination table such as that shown in FIG. 17. The scene importance determination unit 225 then calculates the scene importance of each scene by adding the importance corresponding to the personal information to the sum of the importance values of the traveling states. For example, in the personal-information importance determination table shown in FIG. 17, the importance is set low near the home and the commuting route and high at places that hold memories. As a result, scenes near the user's home or commuting route receive a low priority during digest playback, while scenes at the user's memorable places receive a high priority.
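 The proximity test against the personal information entries could be sketched as below, assuming positions are latitude/longitude pairs; the distance approximation, the radius, and the per-place adjustment values are assumptions of this sketch.

import math

def near_personal_place(position, personal_places, radius_m=300.0):
    """Return the importance adjustment of the first personal place
    within radius_m of the traveling position, or 0 if none is near.

    personal_places: iterable of (lat, lon, adjustment) tuples, e.g. a
    negative adjustment near home, a positive one at a memory spot.
    """
    lat, lon = position
    for place_lat, place_lon, adjustment in personal_places:
        # crude equirectangular approximation, adequate for short distances
        dx = (lon - place_lon) * 111320.0 * math.cos(math.radians(lat))
        dy = (lat - place_lat) * 110540.0
        if math.hypot(dx, dy) <= radius_m:
            return adjustment
    return 0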
 As can be seen from the above, in the third embodiment the scene importance of each scene of the traveling video is set based on personal information in addition to the traveling state. In this way as well, a scene importance appropriate to the personal information can be set for each scene.
 [Fourth Embodiment]
 Next, a video recording/playback apparatus according to the fourth embodiment will be described.
 The video recording/playback apparatuses according to the embodiments described above determine the scene importance of each scene of the traveling video automatically. Even so, they may fail to schedule a scene that is important to the user. In the video recording/playback apparatus according to the fourth embodiment, therefore, the scene importance of each scene of the traveling video is set based on information about important scenes explicitly indicated by the user, in addition to the traveling state.
 FIG. 18 shows the functional configuration of a video recording/playback apparatus 100c according to the fourth embodiment. In FIG. 18, components identical to those of the video recording/playback apparatus 100 according to the first embodiment are denoted by the same reference numerals as in FIG. 3. As with the video recording/playback apparatus 100 according to the first embodiment, the video recording/playback apparatus 100c according to the fourth embodiment is in practice realized by the components of the navigation device 1 shown in FIG. 1.
 As can be seen from FIG. 18, the video recording/playback apparatus 100c according to the fourth embodiment has important scene instruction means 14 and an important scene instruction information database 367 in addition to the functional configuration of the video recording/playback apparatus 100 according to the first embodiment.
 The important scene instruction means 14 is a means by which the user, while driving, conveys to the video recording/playback apparatus 100c the intention that the traveling video at the current location is an important scene or one the user wants to watch later. Specifically, the important scene instruction means 14 is an input device such as a button, a switch, or a touch panel. The important scene instruction means 14 may also be an input device that performs voice recognition of the user's speech or gesture recognition of the user's movements. The important scene instruction means 14 records the important scene instruction information indicated by the user in the important scene instruction information database 367. The important scene instruction information recorded in the important scene instruction information database 367 includes, for example, the time and the traveling position at which the user gave the instruction.
 For a scene containing an important scene indicated by the user, the scene importance determination unit 225 calculates the scene importance by adding a fixed importance to the sum of the importance values of the traveling states. In this way, a scene containing an important scene indicated by the user is assigned a high importance and receives a high priority during digest playback.
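 A minimal sketch of this adjustment is shown below, assuming the instruction records are timestamps and each scene carries start and end times; the size of the fixed bonus is a placeholder, as the text does not specify it.

USER_FLAG_BONUS = 100   # hypothetical fixed importance added to a flagged scene

def apply_user_flags(scene_start, scene_end, base_importance, flag_times):
    """Add the fixed bonus if any user instruction falls inside the scene."""
    flagged = any(scene_start <= t <= scene_end for t in flag_times)
    return base_importance + (USER_FLAG_BONUS if flagged else 0)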
 As can be seen from the above, in the fourth embodiment the scene importance of each scene of the traveling video is set based on the important scene instruction information in addition to the traveling state. In this way, digest playback can be performed without missing the scenes the user wants to see.
 [Application Example]
 Next, an application example of the video recording/playback apparatuses according to the embodiments described above will be described. In each of the above embodiments, the traveling video is divided into scenes and a scene importance is set for each divided scene. However, the invention is not limited to this; the scene importance may instead be set for each frame, the minimum unit of a scene, without dividing the traveling video into scenes.
 FIG. 19 shows the functional configuration of a video recording/playback apparatus 100d according to the application example. The video recording/playback apparatus 100d shown in FIG. 19 is, as an example, an application of the video recording/playback apparatus 100 according to the first embodiment. In FIG. 19, components identical to those of the video recording/playback apparatus 100 according to the first embodiment are denoted by the same reference numerals as in FIG. 3.
 As can be seen from FIG. 19, unlike the video recording/playback apparatus 100 according to the first embodiment, the video recording/playback apparatus 100d according to the application example has an index database 368 instead of the scene division unit 224 and the scene database 364.
 In the video recording/playback apparatus 100d according to the application example, the scene importance determination unit 225 calculates the scene importance for each frame of the traveling video based on the traveling state. FIG. 20 is a graph showing the change in scene importance over the duration of the traveling video.
 When the scene importance exceeds a predetermined threshold, the scene importance determination unit 225 assigns an index to the frame with the highest importance within the section that exceeds the threshold, and records the frame ID of that frame, or its time position, in the index database 368. The scene importance determination unit 225 then displays the indexed frames as representative images in a list on the display unit 401. The user selects an arbitrary image from the representative images listed on the display unit 401. In this way, the user can perform skip playback that skips unnecessary scenes.
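 A minimal sketch of this indexing step is shown below, assuming the per-frame importance values are available as a list and treating the threshold as a free parameter.

def index_peak_frames(frame_importances, threshold):
    """Return the frame index of the highest-importance frame in every
    contiguous run of frames whose importance exceeds the threshold."""
    indices, run_start = [], None
    for i, importance in enumerate(frame_importances):
        if importance > threshold:
            if run_start is None:
                run_start = i               # a new above-threshold section begins
        elif run_start is not None:         # the section just ended: index its peak
            run = frame_importances[run_start:i]
            indices.append(run_start + run.index(max(run)))
            run_start = None
    if run_start is not None:               # close a section that reaches the end
        run = frame_importances[run_start:]
        indices.append(run_start + run.index(max(run)))
    return indices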
 The control processing performed by the video recording/playback apparatus 100d according to the application example described above will now be described with reference to the flowchart of FIG. 21.
 The processing of steps S301 to S304 is the same as that of steps S101 to S104 described with reference to FIG. 12, so its description is omitted.
 In step S305, the scene importance determination unit 225 sets a scene importance for each frame of the traveling video recorded in the traveling video database 361, based on the traveling state (or, in addition, the travel history, personal information, and the like).
 In step S306, for each frame for which a scene importance has been set, if the scene importance exceeds the predetermined threshold, the scene importance determination unit 225 records in the index database 368 the frame ID, or the time position within the traveling video, of the frame with the highest importance within the section that exceeds the threshold.
 In step S307, the scene importance determination unit 225 displays the indexed frames as representative images in a list on the display unit 401 and ends this control processing.
 The video recording/playback apparatus 100d according to the application example described above also allows the user to skip unnecessary scenes. The user can thereby check only the important scenes and sufficiently confirm the content of the traveling video in a short time.
 [Modifications]
 The present invention is not limited to the embodiments described above and can be modified as appropriate without departing from the gist or spirit of the invention as can be read from the claims and the specification as a whole; video recording/playback apparatuses incorporating such modifications are also included within the technical scope of the present invention.
 For example, although the control in the first to fourth embodiments and the application example has been described separately above, the control in the first to fourth embodiments and the application example may be executed in any appropriate combination.
 In each of the embodiments described above, the imaging device 70 shoots a traveling video, that is, video of the surroundings of the traveling vehicle 90, but the invention is not limited to this. In addition, video of the interior of the vehicle 90 (hereinafter referred to as "in-cabin video") may also be shot, and scene division and scene importance setting may be performed on the in-cabin video as well in accordance with the traveling state. Furthermore, in each of the embodiments described above, the video recording/playback apparatus obtains the traveling state using the GPS 18, the acceleration sensor 11, the gyro sensor 12, and the vehicle speed pulse sensor 13. However, this is only an example, and it goes without saying that the video recording/playback apparatus may obtain the traveling state using only some of these sensors.
 Furthermore, the video recording/playback apparatus may obtain the traveling state using other sensors in addition to all or some of the sensors described above. In each of the embodiments described above, the video recording/playback apparatus analyzes the images and audio contained in the video to obtain image feature quantities and audio feature quantities and uses these feature quantities to obtain the traveling state, but the invention is not limited to this. Instead, it goes without saying that the video recording/playback apparatus may obtain the traveling state using only one of the image feature quantities and the audio feature quantities.
 In the first to fourth embodiments, all of the functions of the video recording/playback apparatus are realized by the navigation device 1, but the invention is not limited to this. Instead, all or some of the functions of the video recording/playback apparatus may be realized by a device other than the navigation device 1. For example, the system controller of the navigation device 1 may provide the functions of the traveling video recording unit, the image/audio analysis unit, and the traveling state analysis unit, while another audio device different from the navigation device 1 provides the functions of the scene division unit, the scene importance determination unit, and the digest playback control unit. In this case, the navigation device 1 obtains the traveling state based on sensor signals from the various sensors and records the obtained traveling state in the traveling state database, and the traveling video database and traveling state database are recorded on a disc or the like. The other audio device then performs scene division and scene importance determination using the traveling video database and traveling state database recorded on the disc or the like, and performs the digest playback.
 The video recording/playback apparatus of the present invention is not limited to application to a vehicle navigation device; it goes without saying that it may instead be applied to a navigation device mounted on a bicycle, a mobile phone, or the like. In other words, the video recording/playback apparatus of the present invention can be applied to navigation devices for mobile bodies other than vehicles.
 The present invention can be used in in-vehicle or mobile devices equipped with a camera for shooting, as typified by car navigation devices, PNDs (Personal Navigation Devices), car AV systems, mobile phones, cycle computers, and GPS loggers.

Claims (15)

  1.  A video recording/playback apparatus comprising:
     imaging means provided in a mobile body;
     video recording means for recording video shot by the imaging means;
     traveling state analysis means for obtaining a traveling state of the mobile body; and
     importance determination means for determining and setting an importance for each scene of the video based on the obtained traveling state.
  2.  The video recording/playback apparatus according to claim 1, wherein the traveling state analysis means obtains the traveling state based on feature quantities obtained by analyzing at least one of the images and the audio contained in the video.
  3.  The video recording/playback apparatus according to claim 1 or 2, wherein the traveling state analysis means obtains the traveling state based on a sensor signal from at least one of a GPS sensor, an acceleration sensor, a gyro sensor, and a vehicle speed pulse sensor.
  4.  The video recording/playback apparatus according to claim 2 or 3, further comprising map information recording means in which map information is recorded,
     wherein the traveling state analysis means obtains the traveling state based on the map information.
  5.  The video recording/playback apparatus according to any one of claims 1 to 4, further comprising travel history recording means for recording a travel history of the mobile body,
     wherein the importance determination means determines and sets the importance for each scene of the video based on the travel history and the traveling state.
  6.  The video recording/playback apparatus according to any one of claims 1 to 5, further comprising personal information recording means for recording personal information of a user,
     wherein the importance determination means determines and sets the importance for each scene of the video based on the personal information and the traveling state.
  7.  The video recording/playback apparatus according to any one of claims 1 to 6, further comprising instruction means for the user to indicate information designating an important scene,
     wherein the importance determination means determines and sets the importance for each scene of the video based on the information indicated by the user and the traveling state.
  8.  The video recording/playback apparatus according to any one of claims 1 to 7, further comprising scene division means for dividing the video into a plurality of scenes based on the traveling state.
  9.  The video recording/playback apparatus according to any one of claims 1 to 7, wherein the importance determination means determines and sets the importance for each frame of the video based on the traveling state of the mobile body.
  10.  The video recording/playback apparatus according to any one of claims 1 to 8, further comprising digest playback control means for controlling playback by creating a digest playback schedule according to the importance set for each scene of the video.
  11.  The video recording/playback apparatus according to claim 10, wherein the digest playback control means schedules a scene of the video with higher priority as the importance set for that scene becomes higher.
  12.  The video recording/playback apparatus according to claim 10 or 11, wherein the digest playback control means plays back a scene of the video faster as the importance set for that scene becomes lower.
  13.  The video recording/playback apparatus according to any one of claims 10 to 12, wherein the digest playback control means plays back a scene of the video faster as the time length of that scene becomes longer.
  14.  A video recording/playback method comprising:
     an imaging step of shooting video with imaging means provided in a mobile body;
     a video recording step of recording the video shot by the imaging means;
     a traveling state analysis step of obtaining a traveling state of the mobile body; and
     an importance determination step of determining and setting an importance for each scene of the video based on the obtained traveling state.
  15.  A video recording/playback program executed by a navigation device having imaging means, the program causing the navigation device to function as:
     video recording means for recording video shot by the imaging means;
     traveling state analysis means for obtaining a traveling state of the mobile body; and
     importance determination means for determining and setting an importance for each scene of the video based on the obtained traveling state.
PCT/JP2009/061379 2009-06-23 2009-06-23 Video recording/reproduction device, video recording/reproduction method, and video recording/reproduction program WO2010150348A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2009/061379 WO2010150348A1 (en) 2009-06-23 2009-06-23 Video recording/reproduction device, video recording/reproduction method, and video recording/reproduction program
JP2011519410A JPWO2010150348A1 (en) 2009-06-23 2009-06-23 Video recording / playback apparatus, video recording / playback method, and video recording / playback program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2009/061379 WO2010150348A1 (en) 2009-06-23 2009-06-23 Video recording/reproduction device, video recording/reproduction method, and video recording/reproduction program

Publications (1)

Publication Number Publication Date
WO2010150348A1 true WO2010150348A1 (en) 2010-12-29

Family

ID=43386143

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/061379 WO2010150348A1 (en) 2009-06-23 2009-06-23 Video recording/reproduction device, video recording/reproduction method, and video recording/reproduction program

Country Status (2)

Country Link
JP (1) JPWO2010150348A1 (en)
WO (1) WO2010150348A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7002765B2 (en) * 2019-12-10 2022-01-20 株式会社ユピテル Electronic devices and programs
JP7026958B2 (en) * 2020-01-21 2022-03-01 株式会社ユピテル Recording device and program

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006209333A (en) * 2005-01-26 2006-08-10 Toyota Central Res & Dev Lab Inc Risk degree deciding device and communication equipment
JP2007011908A (en) * 2005-07-01 2007-01-18 Japan Automobile Research Inst Inc Driving recorder

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09147472A (en) * 1995-11-27 1997-06-06 Sanyo Electric Co Ltd Video and audio reproducing device
JP2005075253A (en) * 2003-09-03 2005-03-24 Nippon Signal Co Ltd:The Accident situation record reporting device
JP2006157651A (en) * 2004-11-30 2006-06-15 Funai Electric Co Ltd Data reproducing device
JP2007011907A (en) * 2005-07-01 2007-01-18 Horiba Ltd Driving recorder
WO2007055241A1 (en) * 2005-11-10 2007-05-18 Pioneer Corporation Information recording device, information recording method, information recording program and recording medium
JP2007172484A (en) * 2005-12-26 2007-07-05 Kayaba Ind Co Ltd Operating status storage device
JP2007284049A (en) * 2007-03-31 2007-11-01 Wataru Horikawa Economical drive support device, car-navigation system, and economical drive support program

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012244193A (en) * 2011-05-13 2012-12-10 Pioneer Electronic Corp Reproduction section extraction method, program and storage medium, and reproduction section extraction device and transport equipment mounting apparatus
CN103907140A (en) * 2011-12-05 2014-07-02 夏普株式会社 Drive recorder and display device
WO2013084446A1 (en) * 2011-12-05 2013-06-13 シャープ株式会社 Drive recorder and display device
JPWO2013084446A1 (en) * 2011-12-05 2015-04-27 シャープ株式会社 Drive recorder and display device
JP2013143002A (en) * 2012-01-11 2013-07-22 Luna Co Ltd Operation management method and operation state management system for moving body
WO2013111493A1 (en) * 2012-01-23 2013-08-01 日産自動車株式会社 Monitoring system
JP2013197995A (en) * 2012-03-21 2013-09-30 Casio Comput Co Ltd Moving image capturing apparatus, moving image capturing method, and program
US8938154B2 (en) 2012-03-21 2015-01-20 Casio Computer Co., Ltd. Moving image capturing apparatus, moving image capturing method and storage medium storing moving image capturing program, and digest playback setting apparatus, digest playback setting method and storage medium storing digest playback setting program
CN103327233A (en) * 2012-03-21 2013-09-25 卡西欧计算机株式会社 Moving image capturing apparatus, moving image capturing method and storage medium storing moving image capturing program, and digest playback setting apparatus, digest playback setting method and storage medium storing digest playback setting program
CN103327233B (en) * 2012-03-21 2016-07-06 卡西欧计算机株式会社 Video image pickup device and method and make a summary playback setting device and method
US9350952B2 (en) 2012-07-26 2016-05-24 Denso Corporation Drive video recording device and method, drive video recording system, and summarized moving image creating device
JP2014027481A (en) * 2012-07-26 2014-02-06 Denso Corp Drive video recording device and drive video recording system
WO2014167793A1 (en) * 2013-04-09 2014-10-16 株式会社デンソー Vehicle outside image saving device, and portable terminal with imaging function
JP2015186193A (en) * 2014-03-26 2015-10-22 カシオ計算機株式会社 Data processing apparatus, data processing system, reproduction time reduction method, and program
JP2016167771A (en) * 2015-03-10 2016-09-15 株式会社デンソー Digest video generation device
US10230779B2 (en) 2015-05-20 2019-03-12 Ricoh Company, Ltd. Content provision system, information processing apparatus and content reproduction method
JP2018163425A (en) * 2017-03-24 2018-10-18 パイオニア株式会社 Information processing apparatus
JP2022003581A (en) * 2017-03-24 2022-01-11 パイオニア株式会社 Information processing device
JP6391078B1 (en) * 2017-08-18 2018-09-19 クックパッド株式会社 Information processing device, terminal device, display shelf, information processing system, information processing method, and program
JP2019036886A (en) * 2017-08-18 2019-03-07 クックパッド株式会社 Information processing device, terminal device, display shelf, information processing system, information processing method and program
CN110019060A (en) * 2017-12-27 2019-07-16 郑州畅想高科股份有限公司 A kind of engine video frequency file and log sheet automatic synchronous method and device
JP2021077115A (en) * 2019-11-08 2021-05-20 本田技研工業株式会社 Output system and control method thereof, and program
JP7117282B2 (en) 2019-11-08 2022-08-12 本田技研工業株式会社 Output system, its control method, and program

Also Published As

Publication number Publication date
JPWO2010150348A1 (en) 2012-12-06

Similar Documents

Publication Publication Date Title
WO2010150348A1 (en) Video recording/reproduction device, video recording/reproduction method, and video recording/reproduction program
US8170795B2 (en) Navigation system with animated intersection view
JP3876462B2 (en) Map information providing apparatus and method
US20060142941A1 (en) Car navigation device and car navigation method for three-dimensional display of route information
JP2007108257A (en) Map mobile device
JP4561675B2 (en) Driving support device and computer program
WO2006101012A1 (en) Map information update device, map information update method, map information update program, and computer-readable recording medium
JP3607166B2 (en) Car navigation system and playback method for car audio system
JP5672814B2 (en) Information processing apparatus, information processing method, and program
JP5726626B2 (en) Reproduction section extraction method, program and storage medium, reproduction section extraction apparatus and transport equipment mounting apparatus
JP2007259146A (en) Caption detector, caption detecting method, caption detecting program and recording medium
JP2000214766A (en) Navigation device
JP2009265061A (en) Navigation device
JP2004212232A (en) Scenery displaying dynamic image navigation system
JP4059074B2 (en) In-vehicle information presentation device
JP4721228B2 (en) Regional representative point acquisition device
JP6052274B2 (en) Information processing apparatus, information processing method, and program
JP5923901B2 (en) Information providing apparatus, information providing system, and information providing method
WO2006109469A1 (en) Music composition support device, music composition support method, music composition support program, and recording medium
JPH08219798A (en) Navigation system
JP4509215B2 (en) Navigation device, control method, control program, and recording medium
JP2010175854A (en) Engine sound output device, output control method, output control program, and recording medium
KR100738991B1 (en) Memory device and car-navigation system for play the memory device
JP4915847B2 (en) Information processing apparatus and method, information processing program, and storage medium
JPH0545171A (en) Navigation system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09846481

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2011519410

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09846481

Country of ref document: EP

Kind code of ref document: A1