WO2006082886A1 - Image editing device, image editing method, image editing program and computer readable recording medium - Google Patents


Info

Publication number
WO2006082886A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
information
unit
image editing
data
Application number
PCT/JP2006/301757
Other languages
French (fr)
Japanese (ja)
Inventor
Takeshi Sato
Kenichiro Yano
Koji Koga
Goro Kobayashi
Original Assignee
Pioneer Corporation
Application filed by Pioneer Corporation filed Critical Pioneer Corporation
Priority to JP2007501613A priority Critical patent/JP4516111B2/en
Priority to US11/815,495 priority patent/US20090027399A1/en
Priority to EP06712900A priority patent/EP1845491A4/en
Publication of WO2006082886A1 publication Critical patent/WO2006082886A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00132Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
    • H04N1/00185Image output
    • H04N1/00198Creation of a soft photo presentation, e.g. digital slide-show
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/32Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B27/322Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier used signal is digitally coded
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00132Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00132Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
    • H04N1/00161Viewing or previewing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00132Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
    • H04N1/00167Processing or editing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00132Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
    • H04N1/00169Digital image input
    • H04N1/00172Digital image input directly from a still digital camera or from a storage medium mounted in a still digital camera
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00132Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
    • H04N1/00185Image output
    • H04N1/00196Creation of a photo-montage, e.g. photoalbum

Definitions

  • Image editing apparatus, image editing method, image editing program, and computer-readable recording medium
  • The present invention relates to an image editing apparatus that edits image data such as photographs, an image editing method, an image editing program, and a computer-readable recording medium.
  • However, use of the present invention is not limited to the above-described image editing apparatus, image editing method, image editing program, and computer-readable recording medium.
  • Conventionally, there has been an electronic album creating apparatus that makes it possible to create an electronic album and publish it easily on a web page or the like. In such an apparatus, an electronic album is created as follows.
  • A server connected to the Internet is provided with program software that edits digital image data to create an electronic album, and the server can receive image data captured by a digital camera, photographing time data of the images, position data acquired by a mobile terminal, and time data indicating when the position data was acquired.
  • The server creates an electronic album from these data (see, for example, Patent Document 1 below).
  • Patent Document 1: JP 2002-183742 A
  • In the electronic album creating apparatus described in Patent Document 1, however, the electronic album is created by associating, for example, the photographing time and photographing place of the image data with the program software in the server. Consequently, establishing a connection environment with the server is indispensable, and the entire device configuration becomes complicated.
  • The image editing apparatus according to the invention of claim 1 includes: an input unit that receives input of image data including information related to date and time; an acquisition unit that acquires information about the route and time over which a moving body has moved; and an associating unit that associates the image data with map information based on the information related to the date and time of the image data received by the input unit and the information about the route and time acquired by the acquisition unit.
  • The image editing method according to the invention of claim 10 includes: an input step of inputting image data including information related to date and time; an acquisition step of acquiring information about the route and time of movement of the moving body; and an association step of associating the image data with map information based on the information related to the date and time of the image data input in the input step and the information about the route and time acquired in the acquisition step.
  • An image editing program according to claim 11 causes a computer to execute the image editing method according to claim 10.
  • A computer-readable recording medium according to the invention of claim 12 records the image editing program according to claim 11.
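  • To make the claimed flow concrete, the following minimal Python sketch pairs each time-stamped image with the nearest logged route point; it is an illustration only, with all names (ImageRecord, RoutePoint, associate) invented here, and no part of it is taken from the claims.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ImageRecord:       # image data with date-and-time information (input step)
    path: str
    timestamp: datetime  # e.g. an EXIF-style time stamp

@dataclass
class RoutePoint:        # route-and-time information (acquisition step)
    timestamp: datetime
    lat: float
    lon: float

def associate(images, route):
    """Association step: pair each image with the route point whose
    recorded time is closest to the image's time stamp, yielding a
    position that can then be linked to map information."""
    pairs = []
    for img in images:
        nearest = min(route,
                      key=lambda p: abs((p.timestamp - img.timestamp).total_seconds()))
        pairs.append((img, (nearest.lat, nearest.lon)))
    return pairs
```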
  • FIG. 1 is a block diagram showing an example of a functional configuration of an image editing apparatus according to an embodiment.
  • FIG. 2 is a flowchart showing an example of an image editing process procedure of the image editing apparatus according to the embodiment.
  • FIG. 3 is an explanatory diagram showing an example of the inside of a vehicle equipped with an image editing apparatus according to the embodiment.
  • FIG. 4 is a block diagram illustrating an example of a hardware configuration of an image editing apparatus according to the embodiment.
  • FIG. 5 is a block diagram illustrating an example of an internal configuration of an image editing unit in the image editing apparatus according to the embodiment.
  • FIG. 6 is a block diagram showing an example of an internal configuration of an audio reproduction unit in an image editing apparatus according to the embodiment.
  • FIG. 7 is a flowchart showing an example of an image editing process procedure of the image editing apparatus according to the embodiment.
  • FIG. 8 is a flowchart illustrating an example of another procedure for associating audio data in the image editing process of the image editing apparatus according to the embodiment.
  • FIG. 9 is a flowchart illustrating an example of another procedure for associating audio data in the image editing process of the image editing apparatus according to the embodiment.
  • FIG. 10 is an explanatory diagram of an example of an image data distribution process procedure in the image editing process of the image editing apparatus according to the embodiment.
  • FIG. 11 is an explanatory diagram of a specific processing example of the image editing processing of the image editing apparatus according to the embodiment.
  • FIG. 1 is a block diagram showing an example of a functional configuration of an image editing apparatus according to an embodiment of the present invention.
  • In FIG. 1, the image editing apparatus is mounted on a moving body such as a vehicle (including four-wheeled and two-wheeled vehicles), for example, and includes a photographing unit 101, a sound collecting unit 102, an input unit 103, an acquisition unit 104, an associating unit 105, a display unit 106, a detection unit 107, a control unit 108, and an audio reproduction unit 109.
  • the imaging unit 101 captures an image.
  • the image photographed by the photographing unit 101 may be an image taken inside or outside the vehicle.
  • the photographing unit 101 is configured to be integral with or removable from the image editing apparatus.
  • the sound collection unit 102 collects sound inside the vehicle, for example. Examples of the sound collected by the sound collection unit 102 include those collected from the sound field inside the vehicle.
  • the input unit 103 accepts input of image data including information related to date and time (for example, time stamp data).
  • the input unit 103 also accepts input of image data of an image captured by the imaging unit 101 and audio data of the sound collected by the sound collection unit 102.
  • the acquisition unit 104 acquires information regarding the route and time of travel of the vehicle.
  • The acquisition unit 104 also acquires behavior information regarding the behavior of the vehicle. Specifically, this behavior information is information indicating the moving or stopped state of the vehicle, and includes at least one of, for example, information on vehicle speed (speed information, acceleration information, angular velocity information, etc.), inclination angle information, lateral G (gravity) information, and current location information.
  • The associating unit 105 associates the image data with map information based on the information related to the date and time of the image data received by the input unit 103 and the vehicle route and time information and behavior information acquired by the acquisition unit 104. Through this association, when and where the image data captured by the imaging unit 101 was captured is determined.
  • Display unit 106 displays the image data associated by association unit 105.
  • The display unit 106 may display the image data arranged in chronological order of capture or in the order in which the vehicle passed the corresponding points.
  • The detection unit 107 detects the feature amount of the image included in the image data captured by the imaging unit 101 and the feature amount of the audio parameters included in the sound data collected by the sound collection unit 102.
  • The image feature amount includes, for example, the feature amount of the face image of a person appearing in the image data.
  • Examples of the audio parameter feature amounts include those of a volume component (volume level), a time component (sound duration), and a frequency component (frequency level).
  • The control unit 108 controls the photographing unit 101 based on the image feature amounts and the audio parameter feature amounts detected by the detection unit 107. In particular, the control unit 108 controls the photographing unit 101 to take an image when a feature amount detected by the detection unit 107 changes.
  • The audio reproduction unit 109 reproduces audio data. When the image data is displayed on the display unit 106, the audio reproduction unit 109 selects the audio data to be reproduced based on, for example, the feature amounts detected by the detection unit 107 and the behavior information acquired by the acquisition unit 104. Examples of the sound reproduced by the audio reproduction unit 109 include music and sound effects.
  • FIG. 2 is a flowchart showing an example of an image editing process procedure of the image editing apparatus according to the embodiment of the present invention.
  • In FIG. 2, first, the input unit 103 (see FIG. 1; the same applies hereinafter) receives input of image data, including date and time information, of an image of a person or a landscape captured by one or a plurality of photographing units 101 (see FIG. 1; the same applies hereinafter) (step S201).
  • Next, the acquisition unit 104 acquires information about the route and time over which the vehicle has traveled (step S202).
  • The associating unit 105 (see FIG. 1; the same applies hereinafter) then associates the image data with the map information based on the information related to the date and time of the image data input in step S201 and the information about the route and time acquired in step S202 (step S203). After the image data is associated with the map information in this manner, the image data is displayed by the display unit 106 (see FIG. 1; the same applies hereinafter) (step S204). The image editing process according to this flowchart is then completed.
  • During the display processing of the image data by the display unit 106 in step S204, the audio reproduction unit 109 may select audio data to be played back based on the image feature amounts and audio parameter feature amounts detected by the detection unit 107 and on the behavior information acquired by the acquisition unit 104, and may play back the selected audio data.
  • Further, when the feature amount detected by the detection unit 107 changes, the control unit 108 may control the imaging unit 101 to capture an image.
  • As described above, the input image data can be associated with map information based on the information related to the date and time of the image data and the acquired information about the route and time, without going through a server. For this reason, image data obtained while the vehicle is being driven can be automatically linked to the vehicle's passing points and passing times without complicating the configuration of the device, and edited in chronological order or passing-route order. As a result, the labor and cost of image editing can be reduced.
  • FIG. 3 is an explanatory view showing an example of the inside of a vehicle in which an image editing apparatus according to the embodiment of the present invention is mounted.
  • The passenger seat 312 is provided with a monitor 302b, serving as the display unit 106, facing the passenger in the rear seat 313.
  • The image editing apparatus 310 (310a, 310b) includes these monitors 302 (302a, 302b), a speaker 304, a camera 305, and a microphone 306. A camera 305 and a microphone 306 may be mounted for each image editing device 310 (310a, 310b).
  • the image editing device 310 (310a, 310b) may be provided with a structure that can be attached to and detached from the vehicle.
  • FIG. 4 is a block diagram showing an example of a hardware configuration of an image editing apparatus according to the embodiment of the present invention.
  • The image editing device 310 is detachably mounted on the vehicle as described above, and includes a control unit 400, a user operation unit (remote control, touch panel) 401, a display unit (monitor) 402, a position acquisition unit (GPS, sensors) 403, a recording medium 404, a recording medium decoding unit 405, a guidance sound output unit 406, a communication unit 407, a route search unit 408, a route guidance unit 409, a guidance sound generation unit 410, a speaker 411, an image editing unit 412, an image input/output I/F 413, an audio reproduction unit 414, an audio output unit 415, a shooting unit 416, a sound collection unit 417, and a voice input I/F 418.
  • the control unit 400 controls the entire image editing apparatus 310, for example, and executes various arithmetic processes according to a control program, thereby comprehensively controlling each unit included in the image editing apparatus 310.
  • The control unit 400 includes, for example, a CPU (Central Processing Unit) that executes predetermined arithmetic processing, a ROM (Read Only Memory) that stores various control programs, and a RAM (Random Access Memory) that functions as a work area for the CPU, and can be realized by a microcomputer composed of these.
  • During route guidance of the vehicle, for example, the control unit 400 calculates the position on the map where the vehicle is traveling based on the current position information of the vehicle acquired by the position acquisition unit 403 and the map information obtained from the recording medium 404 via the recording medium decoding unit 405, and outputs the calculation result to the display unit 402.
  • The control unit 400 also exchanges information related to route guidance with the route search unit 408, the route guidance unit 409, and the guidance sound generation unit 410, and outputs the resulting information to the display unit 402 and the guidance sound output unit 406.
  • The user operation unit 401 outputs information input by the user, such as characters, numerical values, and various instructions, to the control unit 400.
  • As the user operation unit 401, various known forms can be employed, such as a push-button switch that detects physical pressing/non-pressing, a touch panel, a keyboard, and a joystick.
  • The user operation unit 401 may also be configured to accept input operations by voice, using a microphone that picks up external sound, such as the sound collection unit 417 described later.
  • The user operation unit 401 may be provided integrally with the image editing apparatus 310, or, like a remote controller, may be operable from a position separate from the image editing apparatus 310. The user operation unit 401 may take any one of the various forms described above, or a plurality of them. The user inputs information by performing an input operation appropriate to the form of the user operation unit 401.
  • Information input by an input operation of the user operation unit 401 includes, for example, destination information for navigation. Specifically, for example, when the image editing apparatus 310 is provided in a vehicle or the like, a point set as the destination by a person on board the vehicle is specified. Information input to the user operation unit 401 regarding image editing includes, for example, the display format of the electronic album for image data input to the image editing unit 412 from the image input/output I/F 413 described later. Specifically, for example, the display format of the electronic album desired by a person on board the vehicle is set.
  • When a touch panel is adopted as the form of the user operation unit 401, the touch panel is used stacked on the display screen side of the display unit 402. In this case, input information is recognized by managing the display timing of the display unit 402 together with the operation timing on the touch panel (user operation unit 401) and its position coordinates.
  • By adopting a touch panel stacked on the display unit 402 as the form of the user operation unit 401, a large amount of information can be input without enlarging the user operation unit 401.
  • As the touch panel, various known types, such as resistive-film and pressure-sensitive types, can be adopted.
  • Display unit 402 includes, for example, a CRT (Cathode Ray Tube), a TFT liquid crystal display, an organic EL display, a plasma display, and the like.
  • The display unit 402 can be configured by, for example, a video I/F (not shown) and a video display device connected to the video I/F.
  • Specifically, the video I/F is composed of, for example, a graphics controller that controls the entire display device, a buffer memory such as VRAM (Video RAM) that temporarily stores immediately displayable image information, a control IC that controls the display of the display device based on image information output from the graphics controller, a GPU (Graphics Processing Unit), and the like.
  • the display unit 402 displays icons, cursors, menus, windows, or various information such as characters and images. Further, the display unit 402 displays image data edited by an image editing unit 412 described later.
  • The position acquisition unit 403 acquires the current position information (latitude and longitude information) of the vehicle on which the image editing device 310 is mounted, for example, by receiving radio waves from artificial satellites.
  • The current position information is obtained by receiving radio waves from artificial satellites and determining the geometric position relative to the satellites, and can be measured anywhere on the earth.
  • The position acquisition unit 403 includes a GPS (Global Positioning System) antenna (not shown).
  • The position acquisition unit 403 can be configured by, for example, a tuner that demodulates the radio waves received from the artificial satellites, an arithmetic circuit that calculates the current position based on the demodulated information, and the like.
  • The radio waves from the artificial satellites are L1 waves with a 1.57542 GHz carrier, carrying the C/A (Coarse/Acquisition) code and navigation messages.
  • By receiving these radio waves, the current position (latitude and longitude) of the vehicle on which the image editing apparatus 310 is mounted is detected.
  • In acquiring the current position, information collected by various sensors, such as a vehicle speed sensor and a gyro sensor, may also be taken into consideration.
  • the vehicle speed sensor detects the vehicle speed from the output shaft of the transmission of the vehicle on which the image editing device 310 is mounted.
  • the angular velocity sensor detects the angular velocity when the vehicle rotates and outputs angular velocity information and relative orientation information.
  • The mileage sensor counts the pulses of a pulse signal output as the wheel rotates over a given period, calculates the number of pulses per wheel rotation, and outputs mileage information based on that count.
  • the inclination angle sensor detects the inclination angle of the road surface and outputs inclination angle information.
  • the lateral G sensor detects lateral G, which is an outward force (gravity) generated by centrifugal force during vehicle cornering, and outputs lateral G information.
  • The vehicle current position information acquired by the position acquisition unit 403 and the information detected by the vehicle speed sensor, gyro sensor, angular velocity sensor, mileage sensor, inclination angle sensor, and lateral G sensor are output to the control unit 400 as behavior information regarding the vehicle's behavior, as sketched below.
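  • A compact sketch of how this behavior information might be bundled; the field names are invented (only the sensor set comes from the description above), and the helper implements the distance computation the mileage sensor paragraph describes.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class BehaviorInfo:
    """Behavior information handed to the control unit 400 (illustrative)."""
    speed_kmh: Optional[float] = None               # vehicle speed sensor
    angular_velocity_dps: Optional[float] = None    # angular velocity sensor
    mileage_km: Optional[float] = None              # mileage sensor
    incline_deg: Optional[float] = None             # inclination angle sensor
    lateral_g: Optional[float] = None               # lateral G sensor
    position: Optional[Tuple[float, float]] = None  # (lat, lon) from position acquisition unit 403

def mileage_km(pulses: int, pulses_per_rotation: int,
               wheel_circumference_m: float) -> float:
    """Distance from wheel pulses: rotations = pulses / pulses-per-rotation,
    distance = rotations x wheel circumference (converted to km)."""
    return pulses / pulses_per_rotation * wheel_circumference_m / 1000.0
```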
  • the recording medium 404 records various control programs and various information in a state that can be read by a computer.
  • the recording medium 404 accepts writing of information by the recording medium decoding unit 405 and records the written information in a nonvolatile manner.
  • the recording medium 404 can be realized by HD (Hard Disk), for example.
  • The recording medium 404 is not limited to an HD. Instead of or in addition to an HD, a portable medium such as a DVD (Digital Versatile Disk) or CD (Compact Disk) that can be attached to and detached from the recording medium decoding unit 405 may be used as the recording medium 404.
  • The recording medium 404 is not limited to DVDs and CDs; portable media that can be attached to and detached from the recording medium decoding unit 405, such as CD-ROMs (CD-R, CD-RW), MOs (Magneto-Optical disks), and memory cards, can also be used.
  • the recording medium 404 stores an image editing program, a navigation program, image data, map information, and the like that realize the present invention.
  • The image data refers to, for example, two-dimensional array values representing an image of a person or a landscape.
  • The map information includes background information representing features such as buildings, rivers, and the ground surface, and road shape information representing the shapes of roads, and is rendered in two or three dimensions.
  • the background information includes background shape information representing the shape of the background and background type information representing the type of the background.
  • the background shape information includes information indicating, for example, representative points of features, polylines, polygons, coordinates of features, and the like.
  • The background type information includes, for example, text information indicating the name, address, telephone number, and the like of a feature, and type information indicating the kind of feature, such as a building or a river.
  • the road shape information is information relating to a road network having a plurality of nodes and links.
  • A node is information indicating an intersection where a plurality of roads meet, such as a three-way junction, a crossroads, or a five-way junction.
  • a link is information indicating a road connecting nodes. Some links have shape interpolation points that allow the expression of curved roads.
  • Road shape information has traffic condition information.
  • the traffic condition information is information indicating the characteristics of the intersection, the length (distance) of each link, vehicle width, traveling direction, traffic prohibition, road type, and the like.
  • The characteristics of an intersection include, for example, complex intersections such as three-way and five-way junctions, intersections where roads branch at shallow angles, intersections near the destination, expressway junctions and exits, and intersections with a high route deviation rate.
  • The route deviation rate can be calculated, for example, from past driving history. Examples of road types include expressways, toll roads, and general roads.
  • Although the image data and map information are described here as being recorded on the recording medium 404, this is not a limitation. The image data and map information are not limited to those provided integrally with the hardware of the image editing apparatus 310, and may be provided outside the image editing apparatus 310.
  • In that case, the image editing apparatus 310 acquires the image data via a network through the communication unit 407, for example. Similarly, the image editing device 310 acquires the map information via the network using the communication unit 407. The image data and map information acquired in this way may be stored in, for example, the RAM of the control unit 400.
  • the recording medium decoding unit 405 controls reading and writing of information on the recording medium 404.
  • When an HD is used as the recording medium 404, the recording medium decoding unit 405 is an HDD (Hard Disk Drive); when a DVD or CD is used, it is a DVD drive or CD drive. When a writable and removable CD-ROM (CD-R, CD-RW), MO, memory card, or the like is used as the recording medium 404, a dedicated drive device capable of writing information to and reading information from the various recording media may be used as the recording medium decoding unit 405 as appropriate.
  • the guidance sound output unit 406 reproduces navigation guidance sound by controlling output to the connected speaker 411.
  • The guidance sound output unit 406 can be realized by an audio I/F (not shown) connected to the audio output speaker 411. Specifically, the audio I/F can be configured by, for example, a D/A converter that performs D/A conversion of digital audio data, an amplifier that amplifies the analog audio signal output from the D/A converter, and an A/D converter that performs A/D conversion of analog audio signals.
  • the communication unit 407 performs communication with other image editing apparatuses.
  • The communication unit 407 of this embodiment may be a communication module, such as a mobile phone, that communicates with a communication server (not shown) via a base station (not shown), or a communication module that performs direct wireless communication with another device.
  • The wireless communication is communication performed using radio waves, infrared rays, or ultrasonic waves, without using a wired line as the communication medium.
  • Standards that enable wireless communication include wireless LAN, IrDA (Infrared Data Association), HomeRF (Home Radio Frequency), Bluetooth, and so on; in this embodiment, various known wireless communication technologies can be used.
  • The communication unit 407 may receive road traffic information, such as traffic jams and traffic regulations, regularly or irregularly. The reception of road traffic information by the communication unit 407 may be performed at the timing when the VICS (Vehicle Information and Communication System) center delivers the road traffic information, or by periodically requesting the road traffic information from the VICS center.
  • The communication unit 407 can be realized as, for example, an AM/FM tuner, a TV tuner, a VICS/beacon receiver, or another communication device.
  • VICS, a well-known technology whose detailed explanation is omitted here, is an information communication system that transmits road traffic information, such as traffic jams and traffic regulations, edited and processed at the VICS center in real time and displays it as text and graphics on in-vehicle devices. The road traffic information delivered by VICS is called VICS information.
  • Methods of receiving VICS information include beacons installed along roads and FM multiplex broadcasting. Beacons include radio beacons, mainly used on expressways, and optical beacons, used on major general roads. When FM multiplex broadcasting is used, road traffic information over a wide area can be received.
  • When beacons are used, it becomes possible to receive the road traffic information needed at the vehicle's location, such as detailed information on the roads nearest the vehicle's position.
  • When a plurality of such communication methods are used, the communication unit 407 includes a plurality of communication means corresponding to each.
  • The route search unit 408 calculates an optimal route from the current position to the destination based on the current position information of the vehicle acquired by the position acquisition unit 403 and the destination information input by the user.
  • The route guidance unit 409 generates real-time route guidance information based on the guidance route information searched by the route search unit 408 or the route information received by the communication unit 407, the current position information acquired by the position acquisition unit 403, and the map information obtained from the recording medium 404 via the recording medium decoding unit 405.
  • the route guidance information generated by the route guidance unit 409 is output to the display unit 402 via the control unit 400.
  • The guidance sound generation unit 410 generates tone and voice information corresponding to guidance patterns. That is, based on the route guidance information generated by the route guidance unit 409, it sets a virtual sound source corresponding to the guidance point, generates voice guidance information, and outputs it to the guidance sound output unit 406 via the control unit 400.
  • the speaker 411 reproduces (outputs) a navigation guidance sound output from the guidance sound output unit 406 and a sound output from the audio output unit 415 described later.
  • Headphones or the like may be provided in place of the speaker 411, and the output form of the guidance sound and audio may be changed as appropriate so that the guidance sound and audio are not output throughout the entire vehicle interior.
  • The image editing unit 412 performs image editing processing on image data acquired through the image input/output I/F 413 from the photographing unit 416 and the communication unit 407 (described later), and on image data recorded in the recording medium 404.
  • the image editing unit 412 is configured by a GPU, for example.
  • the image editing unit 412 performs processing for creating electronic album (hereinafter referred to as “album”) data using image data in accordance with a control command from the control unit 400.
  • Album data refers to digital data in which image data taken by the photographing unit 416, such as a digital still camera (DSC) or a digital video camera (DVC), can be viewed like a photo album on the display screen of the display unit 402, or viewed and edited on a personal computer.
  • The image input/output I/F 413 inputs and outputs image data between the image editing unit 412 and the outside.
  • Specifically, the image input/output I/F 413 outputs to the image editing unit 412 image data read from a recording medium 404 storing image data captured by a DSC or DVC, image data transferred from a DSC or DVC by USB (Universal Serial Bus) or IEEE 1394 (Institute of Electrical and Electronics Engineers 1394), and image data input from the communication unit 407 by communication such as infrared, and outputs image data output from the image editing unit 412 to the recording medium 404 and the communication unit 407.
  • The image input/output I/F 413 preferably has a controller function for controlling read/write of the recording medium 404 when inputting/outputting image data to/from the recording medium 404. Further, the image input/output I/F 413 may have a communication controller function that controls communication with the communication unit 407 when inputting/outputting image data to/from the communication unit 407.
  • The audio reproduction unit 414 selects audio data obtained from the recording medium 404 through the recording medium decoding unit 405, audio data obtained from the communication unit 407 through the control unit 400, and the like, and reproduces the selected audio data.
  • the audio playback unit 414 plays back audio data stored in a storage device such as an audio database (hereinafter referred to as “audio DB”) 611 (see FIG. 6) described later.
  • Audio data to be played back includes audio data such as music and sound effects.
  • the audio reproduction unit 414 may be configured to reproduce radio audio, for example.
  • The audio output unit 415 controls the output of the sound emitted from the speaker 411 based on the audio data selected and reproduced by the audio reproduction unit 414. Specifically, it controls the sound output state by, for example, adjusting the volume and performing equalizing processing.
  • The audio output control by the audio output unit 415 is performed in response to, for example, an input operation from the user operation unit 401 or control by the control unit 400.
  • The imaging unit 416 includes the camera 305 mounted in the vehicle in FIG. 3 and external imaging devices such as the above-described DSC and DVC, has a photoelectric conversion element such as a CMOS sensor or CCD, and takes images inside or outside the vehicle.
  • The photographing unit 416 is connected to the image editing device 310 by wire or wirelessly, and photographs, for example, a person on board the vehicle according to a photographing command from the control unit 400. Image data of an image photographed by the photographing unit 416 is output to the image editing unit 412 via the image input/output I/F 413.
  • The sound collection unit 417 is configured by, for example, the microphone 306 mounted in the vehicle in FIG. 3, and collects sound, such as the speech of persons on board, from the sound field inside the vehicle.
  • The voice input I/F 418 converts the voice collected by the sound collection unit 417 into digital voice data and outputs it to the control unit 400. More specifically, the voice input I/F 418 can be configured by, for example, an A/D converter that converts input analog voice data into digital voice data. The voice input I/F 418 may also include a filter circuit that filters the digital audio data, an amplification circuit that amplifies the analog audio data, and the like.
  • The control unit 400 performs processing based on the image data captured by the imaging unit 416 and output from the image editing unit 412, or the audio data collected by the sound collection unit 417 and output from the voice input I/F 418. The control unit 400 may be configured to have a DSP (Digital Signal Processor) function, for example.
  • The photographing unit 101 in FIG. 1 realizes its function by, for example, the photographing unit 416, and the sound collecting unit 102 by, for example, the sound collection unit 417. The input unit 103 in FIG. 1 realizes its function by, for example, the image input/output I/F 413 and the voice input I/F 418, and the acquisition unit 104 by, for example, the position acquisition unit 403. The functions of the associating unit 105, the detection unit 107, and the control unit 108 in FIG. 1 are realized by, for example, the control unit 400 and the image editing unit 412. The display unit 106 in FIG. 1 realizes its function by, for example, the display unit 402, and the audio reproduction unit 109 by, for example, the audio reproduction unit 414, the audio output unit 415, and the speaker 411.
  • FIG. 5 is a block diagram showing an example of the internal configuration of the image editing unit in the image editing apparatus according to the embodiment of the present invention.
  • FIG. 6 is a block diagram showing an example of the internal configuration of the audio playback unit in the image editing apparatus according to the embodiment of the present invention.
  • In FIG. 5, the image editing unit 412 includes an image editing processing unit 510, a display control unit 511, an image recognition unit 512, an image storage unit 513, a person recognition unit 514, and a person database (hereinafter, "person DB") 515.
  • The image editing processing unit 510 performs image editing processing on image data input to the image editing unit 412 from the outside, that is, from the image capturing unit 416 (see FIG. 4; the same applies hereinafter) through the image input/output I/F 413, from the recording medium decoding unit 405 (see FIG. 4; the same applies hereinafter), and from the control unit 400 (see FIG. 4; the same applies hereinafter), as well as on image data read out from an image storage unit 513 described later.
  • the contents of the image editing process include, for example, editing image data into album data.
  • the display control unit 511 performs control for displaying the image data output from the image editing processing unit 510 as an album form on the display screen of the display unit 402.
  • The image recognition unit 512 recognizes what kind of image is included in the image data based on the image data input to the image editing processing unit 510.
  • the image storage unit 513 stores the image data input to the image editing processing unit 510.
  • When the image data input to the image editing processing unit 510 includes an image related to a person, the person recognition unit 514 reads out the images related to persons stored in the person DB 515 and recognizes the person represented by the image.
  • The recognition process is performed, for example, by face authentication based on a human face image. Since face authentication is a known technique, its description is omitted here.
  • the person DB 515 stores image data including image images of persons on the vehicle, and personal identification data such as the age and sex of these persons.
  • The image editing unit 412 detects the feature amounts of the image recognized by the image recognition unit 512 and of the person-related image recognized by the person recognition unit 514, and outputs them to the control unit 400.
  • These image feature quantities are detected as, for example, color tone data of the image or emotion parameters of a human face image. Specifically, the color tone data indicates whether the tone of the entire image is close to red, blue, green, and so on, and the emotion parameter indicates which emotion the facial expression of the person's face image is closest to.
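  • As a loose illustration of "color tone data" only (the computation below is an assumption, not the patent's method, and it assumes the Pillow imaging library is available), the dominant channel of the average image color can serve as a crude tone feature:

```python
from PIL import Image  # Pillow, assumed available for this sketch

def color_tone(path: str) -> str:
    """Crude color-tone feature: average the RGB channels over the whole
    image and report the dominant primary (red / green / blue)."""
    img = Image.open(path).convert("RGB").resize((64, 64))  # downsample for speed
    pixels = list(img.getdata())
    n = len(pixels)
    means = {
        "red":   sum(p[0] for p in pixels) / n,
        "green": sum(p[1] for p in pixels) / n,
        "blue":  sum(p[2] for p in pixels) / n,
    }
    return max(means, key=means.get)
```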
  • In FIG. 6, the audio reproduction unit 414 includes an audio reproduction processing unit 610, an audio database (hereinafter, "audio DB") 611, and a music selection history database (hereinafter, "music selection history DB") 612.
  • The audio reproduction processing unit 610 performs selection and reproduction processing of audio data input to the audio reproduction unit 414 and of audio data stored in the audio DB 611.
  • The audio reproduction processing unit 610 performs this selection and reproduction processing in association with the image data in album data created by, for example, the image editing unit 412 (see FIG. 4; the same applies hereinafter).
  • The audio data association here may be performed based on, for example, the time stamp data included in the image data in the album data, the color tone data of the images, and the feature quantities of human face images.
  • the audio DB 611 stores audio data to be reproduced by the audio reproducing unit 414.
  • The sound data stored in the audio DB 611 may be sound data input to the audio reproduction unit 414 from the recording medium 404 (see FIG. 4; the same applies hereinafter) or the communication unit 407 (see FIG. 4; the same applies hereinafter), or audio data provided in the image editing device 310 in advance.
  • the music selection history DB 612 stores information related to the music playback history and music selection history when the audio data played back by the audio playback unit 414 is music data. This music selection history DB 612 stores, for example, information related to the reproduction history of music played during driving and information related to music selection history when the image editing apparatus 310 is mounted on a vehicle.
  • FIG. 7 is a flowchart showing an example of an image editing process procedure of the image editing apparatus according to the embodiment of the present invention.
  • In FIG. 7, first, an image inside the vehicle is taken by the imaging unit 416 (see FIG. 4; the same applies hereinafter) provided inside the vehicle (step S701), and sounds from the persons on board (hereinafter, "passengers") are collected by the sound collection unit 417 (see FIG. 4; the same applies hereinafter) provided inside the vehicle (step S702).
  • Image data photographed by the photographing unit 416 is input to the image editing unit 412 (see FIG. 4; the same applies hereinafter) through the image input/output I/F 413 (see FIG. 4; the same applies hereinafter), and audio data collected by the sound collection unit 417 is input to the control unit 400 (see FIG. 4; the same applies hereinafter) through the voice input I/F 418 (see FIG. 4; the same applies hereinafter). The image editing unit 412 and the control unit 400 then detect the feature amounts of the input image data and the audio parameter feature amounts of the sound data (step S703).
  • In step S703, information regarding the image feature amounts detected by the image editing unit 412 is output to the control unit 400.
  • The control unit 400 determines whether the atmosphere in the vehicle has changed based on the detected feature amounts (step S704). This determination is made, for example, by judging whether the detected image feature value changes from an emotion parameter representing a smile to one representing a crying face, or whether the detected audio feature value changes from a frequency component representing laughter to one representing a scream.
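  • A minimal sketch of this check follows; the labels and the trigger rule are invented for illustration, since the patent only requires detecting a change in the feature amounts.

```python
def atmosphere_changed(prev: dict, curr: dict) -> bool:
    """Step S704 as a comparison of the latest feature labels (e.g. the
    emotion parameter and the dominant voice component) against the
    previous ones; a change in either modality counts as a changed
    atmosphere."""
    return (prev.get("emotion") != curr.get("emotion")
            or prev.get("voice") != curr.get("voice"))

prev = {"emotion": "smile", "voice": "laughing"}
curr = {"emotion": "crying", "voice": "screaming"}
if atmosphere_changed(prev, curr):
    print("capture")  # the control unit 400 would command the photographing unit 416
```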
  • If the control unit 400 determines in step S704 that there is no change in the atmosphere in the vehicle (step S704: No), the process returns to step S701, and the processing from step S701 to step S704 is repeated. If it is determined in step S704 that the atmosphere in the vehicle has changed (step S704: Yes), the image editing unit 412 acquires image data, including time stamp data, photographed by the photographing unit 416 through the image input/output I/F 413 (step S705).
  • After the image data is acquired in step S705, the control unit 400 acquires the current position information of the vehicle from the position acquisition unit 403 (see FIG. 4; the same applies hereinafter) (step S706), acquires map information from the recording medium 404 (see FIG. 4; the same applies hereinafter) (step S707), and further acquires information about the route and time over which the vehicle has traveled (step S708).
  • After the information about the traveled route and time is acquired in step S708, the control unit 400 collates the time stamp data of the image data acquired by the image editing unit 412 with that information, detects the point on the map that the vehicle was passing at the time indicated by the time stamp data, and associates the image data with the map information (step S709).
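  • One way to realize step S709 is sketched below, under assumptions the patent does not state: the route log is a time-sorted list of (timestamp, lat, lon) tuples, and positions between log entries are linearly interpolated.

```python
from bisect import bisect_left
from datetime import datetime

def point_at(route, ts: datetime):
    """Return the (lat, lon) the vehicle was passing at time `ts`,
    interpolating between the two route-log entries that bracket it."""
    times = [t for t, _, _ in route]
    i = bisect_left(times, ts)
    if i == 0:
        return route[0][1:]   # time stamp precedes the log
    if i == len(route):
        return route[-1][1:]  # time stamp follows the log
    (t0, lat0, lon0), (t1, lat1, lon1) = route[i - 1], route[i]
    if t1 == t0:
        return (lat1, lon1)
    f = (ts - t0).total_seconds() / (t1 - t0).total_seconds()
    return (lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0))
```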
  • the image editing unit 412 creates album data using the image data (step S710).
  • Next, the position acquisition unit 403 and the like acquire behavior information relating to the vehicle's behavior, such as vehicle speed information and inclination angle information (step S711).
  • The behavior information acquired in this way is output to the audio reproduction unit 414 (see FIG. 6; the same applies hereinafter) through the control unit 400, and the audio reproduction unit 414 acquires the album data from the image editing unit 412. The audio reproduction processing unit 610 (see FIG. 6; the same applies hereinafter) then associates audio data with the album data, drawing on the audio data in the audio DB 611 (see FIG. 6; the same applies hereinafter) and referring to the music selection history information in the music selection history DB 612 (see FIG. 6; the same applies hereinafter) (step S712).
  • In associating the audio data, for example, the terrain and road type at the time of image capture are determined based on the map information and behavior information associated with the album data, and audio data, such as music matching the determined terrain and road type, is read from the audio DB 611 and associated, as in the sketch below. Alternatively, audio data suited to the above-mentioned image feature values and audio parameter feature values may be associated.
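  • A hypothetical mapping from the determined road type to a music mood tag, followed by a lookup in a toy audio DB; the tags and DB layout are invented here, since the patent only says music "matching" the terrain and road type is selected.

```python
# Invented road-type-to-mood table for illustration.
ROAD_TYPE_MOOD = {
    "expressway": "up-tempo",
    "toll road": "steady",
    "general road": "relaxed",
}

def pick_track(audio_db: list, road_type: str):
    """Return the first track whose mood tag matches the road type, if any."""
    mood = ROAD_TYPE_MOOD.get(road_type, "neutral")
    return next((t for t in audio_db if t.get("mood") == mood), None)

audio_db = [{"title": "Drive", "mood": "up-tempo"},
            {"title": "Calm", "mood": "relaxed"}]
print(pick_track(audio_db, "expressway"))  # {'title': 'Drive', 'mood': 'up-tempo'}
```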
  • After the audio data is associated with the album data in step S712, the image editing unit 412 and the control unit 400 determine whether the album data is complete (step S713). If it is determined that the album data is not complete (step S713: No), the process returns to step S701, and the processing from step S701 to step S713 is repeated. If it is determined that the album data is complete (step S713: Yes), the series of image editing processes in this flowchart ends.
  • FIG. 8 and FIG. 9 are flowcharts showing an example of another procedure for associating audio data in the image editing process of the image editing apparatus according to the embodiment of the present invention.
  • FIG. 8 shows the association process based on the time stamp data of the image data
  • FIG. 9 shows the association process based on the color tone data of the images in the image data.
  • In FIG. 8, the audio reproduction processing unit 610 (see FIG. 6; the same applies hereinafter) of the audio reproduction unit 414 (see FIG. 6; the same applies hereinafter) acquires, from the music selection history DB 612 (see FIG. 6; the same applies hereinafter), information about the reproduction history of music played back by the image editing device 310 (step S801).
  • the audio playback processing unit 610 refers to the time stamp data of the image data in the album data (step S802).
  • After referring to the time stamp data in step S802, the audio reproduction processing unit 610 selects the audio data of the music whose reproduction history is closest in time to the referenced time stamp data (step S803). After the audio data is selected in this way, the audio reproduction processing unit 610 associates the selected audio data with the album data (step S804). In step S804, for example, the chorus portion (highlight) of the selected audio data may be associated with the album data.
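  • Steps S802 to S803 could look like the following sketch; the playback-history format (played_at, title) is invented for illustration.

```python
from datetime import datetime

def select_music(history: list, image_ts: datetime) -> str:
    """From a playback history of (played_at, title) pairs, pick the track
    played closest in time to the image's time stamp."""
    best = min(history, key=lambda e: abs((e[0] - image_ts).total_seconds()))
    return best[1]

history = [
    (datetime(2006, 2, 1, 8, 10), "Track A"),
    (datetime(2006, 2, 1, 8, 40), "Track B"),
]
print(select_music(history, datetime(2006, 2, 1, 8, 14)))  # Track A
```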
  • In FIG. 9, the audio reproduction processing unit 610 refers to the color tone data detected by, for example, the image editing processing unit 510 (see FIG. 5) as the feature value of the images in the entire album data (step S901). Then, the audio reproduction processing unit 610 selects, from the audio DB 611 (see FIG. 6; the same applies hereinafter), audio data corresponding to the referenced color tone feature value (step S902).
  • the sound data is selected, for example, when the color of the image of the entire image data is blue, the sound data of a sad atmosphere is selected, and when the color is green, the sound of the atmosphere of the healing atmosphere is selected. If the audio data is selected and it is red, the uptempo music data is selected.
  • the audio reproduction processing unit 610 associates the selected audio data with the album data (step S903).
  • the chorus portion (highlight portion) of the selected audio data is associated with the album data so as to associate the audio data.
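The blue/green/red rule of steps S901 to S903 can be illustrated with the following sketch, which classifies the dominant channel of a mean RGB value as the color tone and looks up a mood-tagged track; the mood table and track names are assumptions.

```python
# Sketch of steps S901-S903: map the dominant color tone of the album's
# images to a music mood, following the blue/green/red rule in the text.
MOOD_BY_TONE = {"blue": "sad", "green": "healing", "red": "uptempo"}
AUDIO_BY_MOOD = {  # hypothetical audio DB keyed by mood
    "sad": "rainy_ballad.mp3",
    "healing": "forest_ambient.mp3",
    "uptempo": "festival_beat.mp3",
}

def dominant_tone(rgb_means):
    """Classify the dominant channel of the mean RGB value as a tone label."""
    r, g, b = rgb_means
    return {r: "red", g: "green", b: "blue"}[max(r, g, b)]

def select_audio_for_album(rgb_means):
    mood = MOOD_BY_TONE[dominant_tone(rgb_means)]
    return AUDIO_BY_MOOD[mood]  # step S903 would attach this to the album

print(select_audio_for_album((60.0, 90.0, 140.0)))  # rainy_ballad.mp3 (blue)
```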
• Note that the selection of audio data in step S902 may also be performed based on, for example, emotion parameters represented by face images in the image data.
• As for the emotion parameters represented by the face images, for example, when a face image shows joy, audio data with a bright, cheerful tone is selected, and when a face image shows anger, audio data of intense, up-tempo music is selected.
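A sketch of this emotion-based variant of step S902 follows; the emotion labels are assumed to come from a separate face-image classifier, which is not shown.

```python
# Sketch of the emotion-based variant of step S902: a detected facial
# emotion label is mapped directly to a music style. The labels and the
# mapping are illustrative assumptions.
EMOTION_TO_STYLE = {
    "joy": "bright_cheerful",
    "anger": "intense_uptempo",
    "neutral": "easy_listening",
}

def select_audio_by_emotion(face_emotion: str) -> str:
    style = EMOTION_TO_STYLE.get(face_emotion, "easy_listening")
    return f"{style}_track.mp3"  # stand-in for an audio DB lookup

print(select_audio_by_emotion("joy"))  # bright_cheerful_track.mp3
```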
• FIG. 10 and FIG. 11 are explanatory diagrams showing a specific processing example of the image editing processing of the image editing apparatus according to the embodiment of the present invention.
• In FIG. 10, the image editing unit 412 (see FIG. 5; the same applies hereinafter) of the image editing device 310 (see FIG. 3; the same applies hereinafter) acquires, through the control unit 400 (see FIG. 4; the same applies hereinafter), the current position information of the vehicle and the information on its route and time obtained by the position acquisition unit 403 (see FIG. 4; the same applies hereinafter), as well as the map information recorded on the recording medium 404 (see FIG. 4; the same applies hereinafter).
• Image data is acquired from the photographing unit 416 (see FIG. 4; the same applies hereinafter) and the sound collection unit 417 (see FIG. 4; the same applies hereinafter) when it is determined, based on the feature amounts of the image data and the audio parameters of the sound data, that the atmosphere in the vehicle has changed.
• The image editing unit 412 then creates album data by associating the image data acquired at photo acquisition points A to D with the map information, based on the time stamp data of the image data acquired at photo acquisition points A to D, the current position information of the vehicle acquired at photo acquisition points A to D, and the information on the route and time of the vehicle. Since this album data is also associated with audio data such as music, music can be automatically played back as appropriate.
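The following sketch illustrates the association of FIG. 10 under the assumption that the route is available as (time, latitude, longitude) samples: each photo's time stamp is matched to the nearest route sample to obtain its map position. Coordinates and times are invented for illustration.

```python
# Sketch of the association in FIG. 10: each photo's time stamp is matched
# to the nearest recorded (time, position) sample of the vehicle's route,
# which yields the map position for the album page. Data are illustrative.
from datetime import datetime

route = [  # (time, latitude, longitude) samples from the position acquisition unit
    (datetime(2005, 5, 1, 8, 14), 36.11, 139.19),   # near point A
    (datetime(2005, 5, 1, 8, 37), 36.09, 139.11),   # near point B
    (datetime(2005, 5, 1, 13, 20), 35.99, 139.08),  # near point C
    (datetime(2005, 5, 1, 14, 53), 35.97, 139.14),  # near point D
]

def locate_photo(photo_time: datetime):
    """Return the route sample recorded closest to the photo's time stamp."""
    return min(route, key=lambda s: abs(s[0] - photo_time))

album = []
for photo, t in [("1120.jpg", datetime(2005, 5, 1, 8, 15)),
                 ("1140.jpg", datetime(2005, 5, 1, 13, 22))]:
    _, lat, lon = locate_photo(t)
    album.append({"photo": photo, "lat": lat, "lon": lon, "time": t})
print(album)
```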
• The album data created by the image editing unit 412 in this way can be displayed, for example, in the form of a double-page album spread on the display screen of the display unit 402 (see FIG. 4; the same applies hereinafter), as shown in FIG. 11.
• In FIG. 11, the image data 1120, 1130, 1140, and 1150 acquired at photo acquisition points A to D can be displayed in chronological order.
• The displayed image data 1120, 1130, 1140, and 1150 show the captured images 1121, 1131, 1141, and 1151 together with the scenery 1122, 1132, 1142, and 1152 outside the vehicle at photo acquisition points A to D.
• Each of the displayed image data 1120, 1130, 1140, and 1150 may also display place information obtained from the associated map information and text information representing the time when the image was taken or when the vehicle passed, as in the following examples; a sketch of such caption generation follows the list.
• The image data 1120 was taken at "8:14 am", and photo acquisition point A where it was taken is "near Yorii Station".
• The image data 1130 was taken at "8:37 am", and photo acquisition point B is "near Nagatoro".
• The image data 1140 was taken at "1:20 pm", and photo acquisition point C is "near Chichibu Station".
• The image data 1150 was taken at "2:53 pm", and photo acquisition point D is "near Masamaru Pass".
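A small sketch of how such captions could be generated from the associated map information, assuming a toy reverse-geocoding table in place of a real map database:

```python
# Sketch of caption generation for an album page: look up a place label
# from rounded coordinates and format the capture time. The place table
# and rounding scheme are assumptions for illustration.
from datetime import datetime

PLACE_NAMES = {  # (rounded lat, rounded lon) -> place label, illustrative
    (36.1, 139.2): "near Yorii Station",
    (36.1, 139.1): "near Nagatoro",
}

def caption(lat: float, lon: float, taken: datetime) -> str:
    place = PLACE_NAMES.get((round(lat, 1), round(lon, 1)), "location unknown")
    time_str = taken.strftime("%I:%M %p").lstrip("0").lower()
    return f"{time_str} / {place}"

print(caption(36.11, 139.19, datetime(2005, 5, 1, 8, 14)))  # 8:14 am / near Yorii Station
```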
• The image data 1120, 1130, 1140, and 1150 may also be displayed in the order of the route that the vehicle has passed.
• As described above, the image editing apparatus can create album data by associating image data with map information, without using a server or the like, based on the time stamp data of image data captured when the atmosphere in the vehicle changes and on the information about the route and time the vehicle has traveled. For this reason, image data can be automatically edited in chronological order or route order to create album data, reducing the effort of image editing.
• Moreover, since the image editing apparatus can associate the image data of the album data with audio data without using a server or the like, it can improve the entertainment quality while reducing the labor and cost of the image editing processing.
• Furthermore, images are automatically captured as appropriate, and an electronic album can be created by automatically associating the acquired image data with map information and audio data as appropriate.
• The image editing method described in the present embodiment can be realized by executing a prepared program on a computer such as a personal computer or a workstation.
• This program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read from the recording medium by the computer.
• This program may also be distributed as a transmission medium through a network such as the Internet.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Television Signal Processing For Recording (AREA)
  • Navigation (AREA)
  • Processing Or Creating Images (AREA)
  • Management Or Editing Of Information On Record Carriers (AREA)

Abstract

An image editing device is provided with an input section (103) for receiving input of image data including information relating to date and time; an acquiring section (104) for acquiring information relating to a route and a time by which and at which a mobile object moved; and an associating section (105) for associating the image data with map information based on information relating to the date and the time of the image data received by the input section (103) and the information relating to the route and the time acquired by the acquiring section (104). The image editing device automatically edits the image data in time series or in route order.

Description

Image editing apparatus, image editing method, image editing program, and computer-readable recording medium

Technical Field

[0001] The present invention relates to an image editing apparatus that edits image data such as photographs, an image editing method, an image editing program, and a computer-readable recording medium. However, use of the present invention is not limited to the image editing apparatus, image editing method, image editing program, and computer-readable recording medium described above.

Background Art

[0002] In recent years, with the spread of digital still cameras (DSC) and digital video cameras (DVC), electronic album creating apparatuses have been provided that create so-called electronic albums from the image data of captured still images and moving images and make it possible to publish them easily on web pages and the like. In such an electronic album creating apparatus, an electronic album is created as follows.

[0003] Specifically, for example, a server connected to the Internet is provided with program software that edits digital image data to create an electronic album. The server can receive image data captured by a digital camera, shooting time data of the images, position data acquired by a mobile terminal, and time data from when the position data was acquired, and the program software creates the electronic album by associating these received data with one another (see, for example, Patent Document 1 below).

[0004] Patent Document 1: JP 2002-183742 A

Disclosure of the Invention

Problems to be Solved by the Invention

[0005] However, since the electronic album creating apparatus described in Patent Document 1 creates the electronic album by using program software in the server to associate, for example, the shooting time and shooting location of the image data, examples of the resulting problems are that establishing a connection environment with the server is indispensable and that the overall apparatus configuration becomes complicated.

Means for Solving the Problems

[0006] An image editing apparatus according to the invention of claim 1 comprises: input means for receiving input of image data including information on date and time; acquisition means for acquiring information on the route and time along which a mobile body has moved; and association means for associating the image data with map information based on the information on the date and time of the image data received by the input means and the information on the route and time acquired by the acquisition means.

[0007] An image editing method according to the invention of claim 10 includes: an input step of inputting image data including information on date and time; an acquisition step of acquiring information on the route and time along which a mobile body has moved; and an association step of associating the image data with map information based on the information on the date and time of the image data input in the input step and the information on the route and time acquired in the acquisition step.

[0008] An image editing program according to the invention of claim 11 causes a computer to execute the image editing method according to claim 10.

[0009] A computer-readable recording medium according to the invention of claim 12 records the image editing program according to claim 11.
Brief Description of the Drawings

[0010] [FIG. 1] FIG. 1 is a block diagram showing an example of the functional configuration of an image editing apparatus according to an embodiment.

[FIG. 2] FIG. 2 is a flowchart showing an example of the image editing processing procedure of the image editing apparatus according to the embodiment.

[FIG. 3] FIG. 3 is an explanatory diagram showing an example of the inside of a vehicle equipped with an image editing apparatus according to an example.

[FIG. 4] FIG. 4 is a block diagram showing an example of the hardware configuration of the image editing apparatus according to the example.

[FIG. 5] FIG. 5 is a block diagram showing an example of the internal configuration of an image editing unit in the image editing apparatus according to the example.

[FIG. 6] FIG. 6 is a block diagram showing an example of the internal configuration of an audio reproduction unit in the image editing apparatus according to the example.

[FIG. 7] FIG. 7 is a flowchart showing an example of the image editing processing procedure of the image editing apparatus according to the example.

[FIG. 8] FIG. 8 is a flowchart showing an example of another procedure for associating audio data in the image editing processing of the image editing apparatus according to the example.

[FIG. 9] FIG. 9 is a flowchart showing an example of another procedure for associating audio data in the image editing processing of the image editing apparatus according to the example.

[FIG. 10] FIG. 10 is an explanatory diagram showing an example of an image data distribution processing procedure in the image editing processing of the image editing apparatus according to the example.

[FIG. 11] FIG. 11 is an explanatory diagram showing a specific processing example of the image editing processing of the image editing apparatus according to the example.
Explanation of Reference Numerals

102 Sound collection unit
103 Input unit
104 Acquisition unit
105 Association unit
106 Display unit
107 Detection unit
108 Control unit
109, 414 Audio reproduction unit
310 Image editing apparatus
412 Image editing unit
510 Image editing processing unit
610 Audio reproduction processing unit
Best Mode for Carrying Out the Invention

[0012] Preferred embodiments of an image editing apparatus, an image editing method, an image editing program, and a computer-readable recording medium on which the program is recorded according to the present invention will be described in detail below with reference to the accompanying drawings.

[0013] (Embodiment)

(Functional Configuration of the Image Editing Apparatus)

First, the contents of the image editing apparatus according to the embodiment of the present invention will be described. FIG. 1 is a block diagram showing an example of the functional configuration of the image editing apparatus according to the embodiment of the present invention. In FIG. 1, the image editing apparatus is mounted on a mobile body such as a vehicle (including four-wheeled and two-wheeled vehicles), for example, and includes a photographing unit 101, a sound collection unit 102, an input unit 103, an acquisition unit 104, an association unit 105, a display unit 106, a detection unit 107, a control unit 108, and an audio reproduction unit 109.

[0014] The photographing unit 101 captures images. The images captured by the photographing unit 101 include images of the inside or outside of the vehicle. The photographing unit 101 is configured either integrally with the image editing apparatus or so as to be detachable from it. The sound collection unit 102 collects sound inside the vehicle, for example. The sound collected by the sound collection unit 102 includes sound collected from the sound field inside the vehicle.
[0015] The input unit 103 receives input of image data including information on date and time (for example, time stamp data). The input unit 103 also receives input of image data of images captured by the photographing unit 101 and audio data of sound collected by the sound collection unit 102. The acquisition unit 104 acquires information on the route and time along which the vehicle has moved. The acquisition unit 104 also acquires behavior information on the behavior of the vehicle. Specifically, the behavior information is information indicating the moving or stopped state of the vehicle and includes, for example, at least one of information on the speed of the vehicle (speed information, acceleration information, angular velocity information, and the like), tilt angle information, lateral G (gravity) information, and current position information.
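As one way to picture the behavior information, the following sketch collects the fields named above into a small record type; the class and field names are illustrative assumptions.

```python
# Sketch of the behavior information as a small record type; the field set
# follows the text (speed, acceleration, angular velocity, tilt angle,
# lateral G, current position), the class itself is hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class BehaviorInfo:
    speed_kmh: Optional[float] = None          # speed information
    acceleration: Optional[float] = None       # m/s^2
    angular_velocity: Optional[float] = None   # deg/s, from the gyro
    tilt_angle_deg: Optional[float] = None     # road surface tilt
    lateral_g: Optional[float] = None          # outward force when cornering
    position: Optional[tuple] = None           # (latitude, longitude)

    def is_moving(self) -> bool:
        """Behavior info indicates the moving or stopped state of the vehicle."""
        return bool(self.speed_kmh) and self.speed_kmh > 0

sample = BehaviorInfo(speed_kmh=45.0, lateral_g=0.12, position=(36.1, 139.2))
print(sample.is_moving())  # True
```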
[0016] The association unit 105 associates the image data with map information based on the information on the date and time of the image data received by the input unit 103 and on the information on the route and time of the vehicle and the behavior information acquired by the acquisition unit 104. The association performed by the association unit 105 determines when and where the image data captured by the photographing unit 101 was taken.

[0017] The display unit 106 displays the image data associated by the association unit 105. The display unit 106 may display the image data arranged, for example, in the chronological order in which they were captured or in the order of the route the vehicle passed. The detection unit 107 detects feature amounts of the images included in the image data captured by the photographing unit 101 and feature amounts of the audio parameters included in the audio data of the sound collected by the sound collection unit 102.

[0018] Specifically, the feature amounts of an image include, for example, the feature amounts of a person's face image included in the image. Specific examples of the feature amounts of the audio parameters include feature amounts of the volume component (loudness), the time component (sound duration), and the frequency component (pitch).

[0019] The control unit 108 controls the photographing unit 101 based on the image feature amounts and audio parameter feature amounts detected by the detection unit 107. When a change occurs in the feature amounts detected by the detection unit 107, the control unit 108 controls the photographing unit 101 to capture an image.
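A minimal sketch of this trigger, reducing the detected features to a single scalar stream (for example, in-vehicle loudness) and commanding a capture whenever it jumps by more than a threshold; the threshold and data are invented.

```python
# Sketch of the trigger in paragraph [0019]: capture an image whenever the
# detected feature amount changes by more than a threshold. A real detector
# would use face-image and audio-parameter features as described above.
def monitor_and_capture(feature_stream, threshold=10.0):
    captured = []
    prev = None
    for value in feature_stream:
        if prev is not None and abs(value - prev) > threshold:
            captured.append(value)  # stand-in for commanding the camera
        prev = value
    return captured

loudness = [42.0, 43.5, 41.9, 70.2, 68.0, 44.1]  # sudden laughter, then quiet
print(monitor_and_capture(loudness))  # [70.2, 44.1] -> two capture events
```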
[0020] The audio reproduction unit 109 reproduces audio data. When image data is displayed on the display unit 106, the audio reproduction unit 109 selects the audio data to be reproduced based on, for example, the feature amounts detected by the detection unit 107 and the behavior information acquired by the acquisition unit 104. The audio reproduced by the audio reproduction unit 109 includes music and sound effects.

[0021] (Image Editing Processing Procedure of the Image Editing Apparatus)

Next, the image editing processing procedure of the image editing apparatus according to the embodiment of the present invention will be described. FIG. 2 is a flowchart showing an example of the image editing processing procedure of the image editing apparatus according to the embodiment of the present invention.

[0022] In the flowchart of FIG. 2, first, the input unit 103 (see FIG. 1; the same applies hereinafter) inputs, from one or more photographing units 101 (see FIG. 1; the same applies hereinafter), image data that contains images of, for example, people or scenery and includes information on date and time (step S201). Next, the acquisition unit 104 (see FIG. 1; the same applies hereinafter) acquires information on the route and time along which the vehicle has moved (step S202). [0023] Then, the association unit 105 (see FIG. 1; the same applies hereinafter) associates the image data with map information based on the information on the date and time of the image data input in step S201 and the information on the route and time acquired in step S202 (step S203). After the image data has been associated with the map information in this way, the display unit 106 (see FIG. 1; the same applies hereinafter) displays the image data (step S204). The image editing processing in this flowchart then ends.
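The four steps S201 to S204 can be pictured as a single pipeline, as in the following sketch; the three callables stand in for the input, acquisition, and display units of FIG. 1 and are assumptions for illustration.

```python
# Sketch of the four steps S201-S204 as one pipeline function.
def image_editing_pipeline(read_images, read_route, display):
    images = read_images()                      # S201: image data with date/time
    route = read_route()                        # S202: route and time information
    album = []
    for img in images:                          # S203: associate with map info
        nearest = min(route, key=lambda s: abs(s["time"] - img["time"]))
        album.append({**img, "position": nearest["position"]})
    display(album)                              # S204: display the image data
    return album

album = image_editing_pipeline(
    lambda: [{"name": "a.jpg", "time": 10}, {"name": "b.jpg", "time": 40}],
    lambda: [{"time": 8, "position": (36.1, 139.2)},
             {"time": 42, "position": (36.0, 139.1)}],
    print,
)
```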
[0024] Although not shown, during the display processing of the image data by the display unit 106 in step S204, the audio reproduction unit 109 (see FIG. 1; the same applies hereinafter) may select the audio data to be reproduced, based on the image feature amounts and audio parameter feature amounts detected by the detection unit 107 (see FIG. 1; the same applies hereinafter) and on the behavior information acquired by the acquisition unit 104, and reproduce the selected audio data. In addition, when a change occurs in the feature amounts detected by the detection unit 107, the control unit 108 (see FIG. 1; the same applies hereinafter) may control the photographing unit 101 to capture an image.

[0025] As described above, according to the image editing apparatus of the embodiment of the present invention, input image data can be associated with map information, without going through a server, based on the information on the date and time of the image data and the acquired information on the route and time. For this reason, image data obtained while driving the vehicle can be automatically linked to the passing points and passing times of the vehicle and edited in chronological order or route order without complicating the apparatus configuration, which reduces the labor and cost of image editing.
[0026] Next, an example based on the embodiment of the present invention will be described in detail. Here, a case where the image editing apparatus according to the embodiment is applied to an in-vehicle navigation apparatus is described as an illustration.

Example

[0027] (Description of the Inside of a Vehicle Equipped with the Image Editing Apparatus)
First, the inside of a vehicle equipped with the image editing apparatus according to the example of the present invention will be described. FIG. 3 is an explanatory diagram showing an example of the inside of a vehicle equipped with the image editing apparatus according to the example of the present invention. In FIG. 3, for example, around the driver's seat 311 and the passenger seat 312, a monitor 302a as the display unit 106 in FIG. 1 and a speaker 304 as the audio output device of the audio reproduction unit 109 are provided. On the ceiling 314 of the vehicle, a camera 305 as the photographing unit 101 in FIG. 1 and a microphone 306 as the sound collection unit 102 are provided.

[0028] The passenger seat 312 is provided with a monitor 302b as the display unit 106, facing passengers in the rear seat 313. The image editing apparatus 310 (310a, 310b) comprises these monitors 302 (302a, 302b), the speaker 304, the camera 305, and the microphone 306. The camera 305 and the microphone 306 may instead be mounted on the image editing apparatuses 310 (310a, 310b) themselves. The image editing apparatus 310 (310a, 310b) may also have a structure that allows it to be attached to and detached from the vehicle.
[0029] (Hardware Configuration of the Image Editing Apparatus)

Next, the hardware configuration of the image editing apparatus according to the example of the present invention will be described. FIG. 4 is a block diagram showing an example of the hardware configuration of the image editing apparatus according to the example of the present invention.

[0030] In FIG. 4, the image editing apparatus 310 is detachably mounted on the vehicle as described above and includes a control unit 400, a user operation unit (remote control, touch panel) 401, a display unit (monitor) 402, a position acquisition unit (GPS, sensors) 403, a recording medium 404, a recording medium decoding unit 405, a guidance sound output unit 406, a communication unit 407, a route search unit 408, a route guidance unit 409, a guidance sound generation unit 410, a speaker 411, an image editing unit 412, an image input/output I/F 413, an audio reproduction unit 414, an audio output unit 415, a photographing unit 416, a sound collection unit 417, and an audio input I/F 418.
[0031] The control unit 400 governs the overall control of the image editing apparatus 310, for example, and comprehensively controls each unit of the image editing apparatus 310 by executing various arithmetic processes according to control programs. The control unit 400 can be realized, for example, by a microcomputer composed of a CPU (Central Processing Unit) that executes predetermined arithmetic processing, a ROM (Read Only Memory) that stores various control programs, a RAM (Random Access Memory) that functions as a work area of the CPU, and the like. [0032] During route guidance of the vehicle, for example, the control unit 400 calculates where on the map the vehicle is traveling based on the information on the current position of the vehicle (current position information) acquired by the position acquisition unit 403 and the map information obtained from the recording medium 404 via the recording medium decoding unit 405, and outputs the calculation result to the display unit 402. During the route guidance, the control unit 400 inputs and outputs information on route guidance among the route search unit 408, the route guidance unit 409, and the guidance sound generation unit 410, and outputs the resulting information to the display unit 402 and the guidance sound output unit 406.
[0033] The user operation unit 401 outputs information input by user operation, such as characters, numerical values, and various instructions, to the control unit 400. As the configuration of the user operation unit 401, various known forms can be adopted, such as push-button switches that detect physical pressing or non-pressing, a touch panel, a keyboard, and a joystick. The user operation unit 401 may also be configured to perform input operations by voice, using a microphone that inputs sound from outside, such as the sound collection unit 417 described later.

[0034] The user operation unit 401 may be provided integrally with the image editing apparatus 310, or may be operable from a position separated from the image editing apparatus 310, like a remote control. The user operation unit 401 may be configured in any one of the various forms described above, or in a plurality of forms. The user inputs information by performing input operations appropriate to the form of the user operation unit 401.

[0035] Information input by operation of the user operation unit 401 includes, for navigation, destination information, for example. Specifically, when the image editing apparatus 310 is provided in a vehicle, for example, the point that the people aboard the vehicle aim to reach is set. As information input to the user operation unit 401 for image editing, for example, there is information on the display format, in the electronic album, of the image data input to the image editing unit 412 from the image input/output I/F 413 described later. Specifically, for example, the display format of the electronic album desired by the people aboard the vehicle is set.
[0036] Here, when a touch panel is adopted as the form of the user operation unit 401, for example, the touch panel is used stacked on the display screen side of the display unit 402. In this case, the information input by an input operation is recognized by managing the display timing on the display unit 402 together with the operation timing on the touch panel (user operation unit 401) and its position coordinates. By adopting a touch panel stacked on the display unit 402 as the form of the user operation unit 401, a large amount of information can be input without enlarging the user operation unit 401. As this touch panel, various known touch panels, such as resistive and pressure-sensitive types, can be adopted.

[0037] The display unit 402 includes, for example, a CRT (Cathode Ray Tube), a TFT liquid crystal display, an organic EL display, or a plasma display. Specifically, the display unit 402 can be configured, for example, by a video I/F (not shown) and a video display device connected to the video I/F. Specifically, the video I/F is composed of, for example, a graphic controller that controls the entire display device, a buffer memory such as a VRAM (Video RAM) that temporarily stores image information that can be displayed immediately, and a control IC or GPU (Graphics Processing Unit) that controls display on the display device based on the image information output from the graphic controller. The display unit 402 displays icons, cursors, menus, windows, and various information such as characters and images. The display unit 402 also displays the image data edited by the image editing unit 412 described later.
[0038] The position acquisition unit 403 acquires the current position information (latitude and longitude information) of the vehicle on which the image editing apparatus 310 is mounted, for example, by receiving radio waves from artificial satellites. Here, the current position information is obtained by receiving radio waves from the satellites and determining the geometric position relative to the satellites, and can be measured anywhere on the earth. The position acquisition unit 403 includes a GPS antenna (not shown). Here, GPS (Global Positioning System) is a system that accurately determines a position on the ground by receiving radio waves from four or more satellites. Since it is a known technique, a description of GPS is omitted here. The position acquisition unit 403 can be configured, for example, by a tuner that demodulates the radio waves received from the satellites and an arithmetic circuit that calculates the current position based on the demodulated information.

[0039] The radio waves from the satellites used are, for example, L1 radio waves, a 1.57542 GHz carrier carrying the C/A (Coarse/Acquisition) code and navigation messages. With these, the current position (latitude and longitude) of the vehicle on which the image editing apparatus 310 is mounted is detected. In detecting the current position of the vehicle, information collected by various sensors such as a vehicle speed sensor and a gyro sensor may also be taken into account. The vehicle speed sensor detects the vehicle speed from the output shaft of the transmission of the vehicle on which the image editing apparatus 310 is mounted.
[0040] In addition, in detecting the current position of the vehicle, information collected by various sensors such as an angular velocity sensor, a travel distance sensor, a tilt angle sensor, and a lateral G (gravity) sensor may be taken into account. The angular velocity sensor detects the angular velocity when the vehicle turns and outputs angular velocity information and relative direction information. The travel distance sensor calculates the number of pulses per wheel revolution by counting the pulses of a pulse signal of a predetermined period output as the wheel rotates, and outputs travel distance information based on the number of pulses per revolution. The tilt angle sensor detects the tilt angle of the road surface and outputs tilt angle information. The lateral G sensor detects the lateral G, the outward force (gravity) generated by centrifugal force when the vehicle corners, and outputs lateral G information. The current position information of the vehicle acquired by the position acquisition unit 403 and the information detected by the vehicle speed sensor, gyro sensor, angular velocity sensor, travel distance sensor, tilt angle sensor, and lateral G sensor are output to the control unit 400 as behavior information on the behavior of the vehicle.
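The travel distance sensor's arithmetic can be made concrete with a short worked sketch: distance follows from the pulse count, the pulses per wheel revolution, and the wheel circumference. The constants below are assumed, not taken from the document.

```python
# Worked sketch of the travel distance sensor's arithmetic: distance is
# derived from the pulse count, the pulses emitted per wheel revolution,
# and the wheel circumference. All numbers are illustrative.
import math

PULSES_PER_REV = 4            # pulses output per wheel revolution (assumed)
WHEEL_DIAMETER_M = 0.65       # typical passenger-car wheel, assumed

def travel_distance_m(pulse_count: int) -> float:
    revolutions = pulse_count / PULSES_PER_REV
    return revolutions * math.pi * WHEEL_DIAMETER_M

print(round(travel_distance_m(2000), 1))  # 1021.0 m for 2000 pulses
```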
[0041] The recording medium 404 records various control programs and various information in a computer-readable state. The recording medium 404 accepts writing of information by the recording medium decoding unit 405 and records the written information in a nonvolatile manner. The recording medium 404 can be realized, for example, by an HD (Hard Disk). The recording medium 404 is not limited to an HD; instead of or in addition to an HD, removable and portable media such as a DVD (Digital Versatile Disk) or a CD (Compact Disk) that can be attached to and detached from the recording medium decoding unit 405 may be used as the recording medium 404. The recording medium 404 is also not limited to DVDs and CDs; removable and portable media such as CD-ROMs (CD-R, CD-RW), MOs (Magneto-Optical disks), and memory cards can also be used.

[0042] The recording medium 404 records the image editing program realizing the present invention, a navigation program, image data, map information, and the like. Here, image data refers to two-dimensional arrays of values representing images of, for example, people or scenery. The map information includes background information representing features such as buildings, rivers, and the ground surface, and road shape information representing the shapes of roads, and is rendered in two or three dimensions on the display screen of the display unit 402.

[0043] The background information includes background shape information representing the shapes of the background and background type information representing the types of the background. The background shape information includes, for example, information indicating representative points, polylines, polygons, and coordinates of features. The background type information includes, for example, text information indicating the names, addresses, and telephone numbers of features, and type information indicating the types of features such as buildings and rivers.

[0044] The road shape information is information on a road network having a plurality of nodes and links. A node is information indicating an intersection where several roads meet, such as a three-way, four-way, or five-way junction. A link is information indicating a road connecting nodes. Some links have shape interpolation points that allow curved roads to be represented. The road shape information also has traffic condition information. The traffic condition information is information indicating the characteristics of intersections, the length (distance) of each link, vehicle width, direction of travel, traffic prohibitions, road type, and the like.
[0045] The characteristics of intersections include, for example, complex intersections such as three-way and five-way junctions, intersections where roads branch at shallow angles, intersections around the destination, expressway entrances, exits, and junctions, and intersections with high route deviation rates. The route deviation rate can be calculated, for example, from past travel histories. Road types include, for example, expressways, toll roads, and ordinary roads.
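One way to picture the node/link structure is the following sketch, where nodes are intersections and links carry some of the traffic condition attributes named above; the attribute names and values are illustrative.

```python
# Sketch of the node/link road network as plain Python structures: nodes
# are intersections, links carry traffic condition attributes such as
# length and road type, and curved roads keep shape interpolation points.
nodes = {
    "N1": {"kind": "three_way"},
    "N2": {"kind": "five_way"},
    "N3": {"kind": "ordinary"},
}

links = [
    {"from": "N1", "to": "N2", "length_m": 850, "road_type": "ordinary",
     "shape_points": [(36.10, 139.20), (36.11, 139.21)]},  # curved road
    {"from": "N2", "to": "N3", "length_m": 2400, "road_type": "expressway",
     "shape_points": []},
]

def links_from(node_id: str):
    """Enumerate the links leaving a given node (intersection)."""
    return [l for l in links if l["from"] == node_id]

print([l["to"] for l in links_from("N2")])  # ['N3']
```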
[0046] In this example, image data and map information, for example, are recorded on the recording medium 404, but this is not a limitation. The image data and map information are not limited to being recorded on media provided integrally with the hardware of the image editing apparatus 310 and may be provided outside the image editing apparatus 310. In that case, the image editing apparatus 310 acquires image data via a network through the communication unit 407, for example. Likewise, the image editing apparatus 310 acquires map information via the network through the communication unit 407, for example. The image data and map information acquired in this way may be stored, for example, in the RAM of the control unit 400.

[0047] The recording medium decoding unit 405 controls reading and writing of information on the recording medium 404. For example, when an HD is used as the recording medium 404, the recording medium decoding unit 405 is an HDD (Hard Disk Drive). Similarly, when a DVD or CD (including CD-R and CD-RW) is used as the recording medium 404, the recording medium decoding unit 405 is a DVD drive or CD drive. When a writable and removable recording medium 404 such as a CD-ROM (CD-R, CD-RW), MO, or memory card is used, a dedicated drive device capable of writing information to and reading information from the various recording media may be used as appropriate as the recording medium decoding unit 405.

[0048] The guidance sound output unit 406 reproduces navigation guidance sound by controlling output to the connected speaker 411. There may be one speaker 411 or several. Specifically, the guidance sound output unit 406 can be realized by an audio I/F (not shown) connected to the speaker 411 for audio output. More specifically, the audio I/F can be composed of, for example, a D/A converter that performs D/A conversion of digital audio data, an amplifier that amplifies the analog audio signal output from the D/A converter, and an A/D converter that performs A/D conversion of analog audio signals.
[0049] The communication unit 407 communicates with other image editing apparatuses. The communication unit 407 of this example may be a communication module that communicates with a communication server (not shown) via a base station (not shown), like a mobile phone, or a communication module that performs direct wireless communication with other image editing apparatuses. Here, wireless communication is communication performed using radio waves, infrared rays, or ultrasonic waves, without using wire lines as the communication medium. Standards enabling wireless communication include various technologies such as wireless LAN, IrDA (Infrared Data Association), HomeRF (Home Radio Frequency), and Bluetooth; in this example, any of various known wireless communication technologies can be used. In terms of information transfer speed and the like, a wireless LAN can be cited as a preferable example.

[0050] Here, the communication unit 407 may receive road traffic information, such as congestion and traffic regulations, regularly (or irregularly). The reception of road traffic information by the communication unit 407 may be performed at the timing when road traffic information is delivered from a VICS (Vehicle Information and Communication System) center, or by periodically requesting road traffic information from the VICS center. The communication unit 407 can also be realized as, for example, an AM/FM tuner, a TV tuner, a VICS/beacon receiver, or other communication equipment.

[0051] Although a detailed description is omitted since it is a known technology, "VICS" is an information communication system that transmits, in real time, road traffic information such as congestion and traffic regulations edited and processed at a VICS center, and displays it as text and graphics on in-vehicle equipment such as car navigation devices. Methods of conveying the road traffic information (VICS information) edited and processed at the VICS center to navigation devices include using "beacons" installed along roads and "FM multiplex broadcasting". "Beacons" include "radio wave beacons" used mainly on expressways and "optical beacons" used on major ordinary roads. When "FM multiplex broadcasting" is used, road traffic information over a wide area can be received. When "beacons" are used, it is possible to receive the road traffic information needed at the location of the vehicle, such as detailed information on the nearest roads based on the vehicle's position. When the communication method used with other image editing apparatuses differs from the communication method for receiving image data and road traffic information, the communication unit 407 may comprise a plurality of communication means corresponding to each.
[0052] The route search unit 408 calculates an optimal route from the current position to the destination based on the current position information of the vehicle acquired by the position acquisition unit 403 and the destination information input by the user. The route guidance unit 409 generates real-time route guidance information based on the information on the guidance route searched by the route search unit 408 or the route information received by the communication unit 407, the current position information acquired by the position acquisition unit 403, and the map information obtained from the recording medium 404 via the recording medium decoding unit 405. The route guidance information generated by the route guidance unit 409 is output to the display unit 402 via the control unit 400. [0053] The guidance sound generation unit 410 generates tone and voice information corresponding to patterns. That is, based on the route guidance information generated by the route guidance unit 409, it sets virtual sound sources corresponding to guidance points, generates voice guidance information, and outputs these to the guidance sound output unit 406 via the control unit 400.
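The document does not name a specific search algorithm for the route search unit 408; the following sketch uses Dijkstra's algorithm, one common choice, over the node/link structure sketched earlier, with link length as the cost.

```python
# Sketch of the route search in paragraph [0052] as a shortest-path search
# over a node/link structure, using link length as the cost. The algorithm
# choice (Dijkstra) is an assumption, not stated in the document.
import heapq

def search_route(links, start, goal):
    graph = {}
    for l in links:
        graph.setdefault(l["from"], []).append((l["to"], l["length_m"]))
    queue, best = [(0, start, [start])], {}
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if best.get(node, float("inf")) <= cost:
            continue
        best[node] = cost
        for nxt, length in graph.get(node, []):
            heapq.heappush(queue, (cost + length, nxt, path + [nxt]))
    return None  # no route found

links = [{"from": "A", "to": "B", "length_m": 500},
         {"from": "B", "to": "C", "length_m": 700},
         {"from": "A", "to": "C", "length_m": 1500}]
print(search_route(links, "A", "C"))  # (1200, ['A', 'B', 'C'])
```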
[0054] The speaker 411 reproduces (outputs) the navigation guidance sound output from the guidance sound output unit 406 and the audio output from the audio output unit 415 described later. For example, headphones may be provided for the speaker 411, and the output form of the guidance sound and audio may be changed as appropriate so that the guidance sound and audio are not output as a sound field throughout the entire vehicle interior.

[0055] The image editing unit 412 performs image editing processing on image data acquired through the image input/output I/F 413 from the photographing unit 416 and the communication unit 407 described later, on image data recorded on the recording medium 404, and the like. Specifically, the image editing unit 412 is configured by a GPU, for example. The image editing unit 412 performs processing for creating electronic album (hereinafter "album") data using image data in accordance with control commands from the control unit 400. Here, album data is digital data, such as image data captured by the photographing unit 416 composed of a photographing device such as a digital still camera (DSC) or digital video camera (DVC), that can be viewed like a picture diary or photo album on the display screen of the display unit 402, or browsed and edited on a personal computer or the like.

[0056] The image input/output I/F 413 inputs and outputs image data to and from the image editing unit 412. For example, the image input/output I/F 413 outputs to the image editing unit 412 image data from the recording medium 404 storing image data captured by the above DSC or DVC, as well as image data stored in a DSC, DVC, or the like and input from the communication unit 407 through communication such as USB (Universal Serial Bus), IEEE 1394 (Institute of Electrical and Electronic Engineers 1394), and infrared, and outputs image data output from the image editing unit 412 to the recording medium 404 and the communication unit 407. When inputting and outputting image data to and from the recording medium 404, the image input/output I/F 413 preferably has the function of a controller that controls reads and writes on the recording medium 404. When inputting and outputting image data to and from the communication unit 407, the image input/output I/F 413 preferably has the function of a communication controller that controls communication in the communication unit 407.
[0058] The audio output unit 415 controls the audio output from the speaker 411 based on the audio data selected and played back by the audio playback unit 414. Specifically, it controls the audio output state by, for example, adjusting the volume and performing equalizing processing. This control of the audio output by the audio output unit 415 is performed, for example, by an input operation from the user operation unit 401 or under the control of the control unit 400.
[0059] The imaging unit 416 consists of the camera 305 mounted on the vehicle in FIG. 3 or an external imaging device such as the DSC or DVC described above, has a photoelectric conversion element such as a CMOS or CCD sensor, and captures images of the interior and exterior of the vehicle. The imaging unit 416 is connected to the image editing apparatus 310 by wire or wirelessly and, in response to an imaging command from the control unit 400, photographs, for example, the persons riding in the vehicle. Image data of images captured by the imaging unit 416 is output to the image editing unit 412 via the image input/output I/F 413.
[0060] The sound collecting unit 417 consists of the microphone 306 mounted on the vehicle in FIG. 3 or the like, and collects sound, such as the voices of the persons riding in the vehicle, from the sound field inside the vehicle. The audio input I/F 418 converts the sound collected by the sound collecting unit 417 into digital audio data and outputs it to the control unit 400. More specifically, the audio input I/F 418 can be configured with, for example, an A/D converter that converts input analog audio data into digital audio data. The audio input I/F 418 may further include a filter circuit for filtering the digital audio data, an amplifier circuit for amplifying the analog audio data, and the like.

[0061] Here, the control unit 400 described above judges the atmosphere inside the vehicle based on the image data captured by the imaging unit 416 and output from the image editing unit 412 and on the audio data collected by the sound collecting unit 417 and output from the audio input I/F 418. Specifically, the atmosphere inside the vehicle is judged by, for example, detecting changes in feature quantities such as the facial expressions and voices of the persons on board. For this purpose, the control unit 400 may be configured to have the functions of a DSP (Digital Signal Processor), for example.
[0062] Specifically, the imaging unit 101 in FIG. 1 is realized by, for example, the imaging unit 416, and the sound collecting unit 102 is realized by, for example, the sound collecting unit 417. The input unit 103 in FIG. 1 is realized by, for example, the image input/output I/F 413 and the audio input I/F 418, and the acquisition unit 104 is realized by, for example, the position acquisition unit 403.
[0063] The associating unit 105, the detecting unit 107, and the control unit 108 in FIG. 1 are specifically realized by, for example, the control unit 400 and the image editing unit 412. Further, the display unit 106 in FIG. 1 is realized by, for example, the display unit 402, and the audio playback unit 109 is realized by, for example, the audio playback unit 414, the audio output unit 415, and the speaker 411.
[0064] Here, the internal configurations of the image editing unit 412 and the audio playback unit 414 will be described. FIG. 5 is a block diagram showing an example of the internal configuration of the image editing unit in the image editing apparatus according to the embodiment of the present invention. FIG. 6 is a block diagram showing an example of the internal configuration of the audio playback unit in the same apparatus.
[0065] In FIG. 5, the image editing unit 412 includes an image editing processing unit 510, a display control unit 511, an image recognition unit 512, an image storage unit 513, a person recognition unit 514, and a person database (hereinafter "person DB") 515. The image editing processing unit 510 performs image editing processing on image data input to the image editing unit 412 from the imaging unit 416 (see FIG. 4; the same applies hereinafter) or from outside through the image input/output I/F 413, and on image data input to the image editing unit 412 from the recording medium 404 (see FIG. 4) through the recording medium decoding unit 405 (see FIG. 4) and the control unit 400 (see FIG. 4). The image editing processing unit 510 also reads out and edits image data stored in the image storage unit 513 described later. The image editing processing includes, for example, editing image data into album data.
[0066] The display control unit 511 performs control for displaying the image data output from the image editing processing unit 510 in album form on the display screen of the display unit 402. The image recognition unit 512 recognizes, based on the image data input to the image editing processing unit 510, what images that image data contains. The image storage unit 513 stores the image data input to the image editing processing unit 510.
[0067] When the images in the image data input to the image editing processing unit 510 include an image of a person, the person recognition unit 514 reads out the person images stored in advance in the person DB 515 and performs recognition processing of the person represented by the image. Specifically, the recognition processing is performed by, for example, face authentication based on a face image of the person. Since face authentication is a known technique, its description is omitted here. The person DB 515 stores image data including images of the persons riding in the vehicle, as well as personal identification data such as the age and gender of those persons.
[0068] The image editing unit 412 detects feature quantities of the images in the image data recognized by the image recognition unit 512 and of the person images recognized by the person recognition unit 514, and outputs them to the control unit 400. These image feature quantities are detected as, for example, color tone data of the image and emotion parameters of a person's face image. Specifically, the color tone data indicates which hue, such as red, blue, or green, the overall tone of the image is closest to, and the emotion parameter indicates which facial expression (joy, anger, sorrow, or pleasure) the person's face image is closest to.
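As a concrete illustration of this kind of feature detection, the following is a minimal Python sketch of deriving a color-tone label from an image; the use of OpenCV, the hue thresholds, and the three-way red/green/blue split are assumptions made for illustration rather than details specified in the present embodiment, and the emotion parameter would in practice be produced by a separate facial-expression classifier.

    import numpy as np
    import cv2  # OpenCV, assumed available for the color-space conversion

    def color_tone(image_bgr: np.ndarray) -> str:
        """Label the overall tone of an image as red, green, or blue.

        The mean hue is a crude statistic (hue is circular), but it is
        enough to illustrate a color-tone feature quantity.
        """
        hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
        hue = float(hsv[:, :, 0].mean())  # OpenCV hue range: 0..179
        if hue < 30 or hue >= 150:        # the red band wraps around zero
            return "red"
        return "green" if hue < 90 else "blue"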
[0069] Meanwhile, in FIG. 6, the audio playback unit 414 includes an audio playback processing unit 610, an audio database (hereinafter "audio DB") 611, and a music selection history database (hereinafter "music selection history DB") 612. The audio playback processing unit 610 selects and plays back audio data input to the audio playback unit 414 and audio data stored in the audio DB 611. The audio playback processing unit 610 also selects and plays back audio data in association with the image data in album data created by, for example, the image editing unit 412 (see FIG. 4; the same applies hereinafter). This association of audio data is preferably performed based on feature quantities such as the time stamp data contained in the image data in the album data, the color tone data of the images, and the face images of persons.
[0070] The audio DB 611 stores the audio data to be played back by the audio playback unit 414. The audio data stored in the audio DB 611 may be audio data input to the audio playback unit 414 from the recording medium 404 (see FIG. 4; the same applies hereinafter) or the communication unit 407 (see FIG. 4), or audio data provided in the image editing apparatus 310 in advance. When the audio data played back by the audio playback unit 414 is music data, the music selection history DB 612 stores information on the playback history and selection history of that music. For example, when the image editing apparatus 310 is mounted on a vehicle, the music selection history DB 612 stores information on the playback history and selection history of music played during a drive.
[0071] (Image editing processing procedure of the image editing apparatus)
Next, the image editing processing procedure of the image editing apparatus according to the embodiment of the present invention will be described. FIG. 7 is a flowchart showing an example of the image editing processing procedure of the image editing apparatus according to the embodiment. In FIG. 7, first, an image of the vehicle interior is captured by the imaging unit 416 (see FIG. 4; the same applies hereinafter) provided inside the vehicle (step S701), and sound uttered by the persons riding in the vehicle (hereinafter "passengers") is collected by the sound collecting unit 417 (see FIG. 4) provided inside the vehicle (step S702).
[0072] Image data of the image captured by the imaging unit 416 is input to the image editing unit 412 (see FIG. 4; the same applies hereinafter) through the image input/output I/F 413 (see FIG. 4), and audio data of the sound collected by the sound collecting unit 417 is input to the control unit 400 (see FIG. 4) through the audio input I/F 418 (see FIG. 4). The image editing unit 412 and the control unit 400 then detect the feature quantity of the image in the image data and the feature quantity of the audio parameters in the audio data, respectively (step S703). In step S703, information on the image feature quantity detected by the image editing unit 412 is output to the control unit 400.
[0073] After detecting the image feature quantity and the audio parameter feature quantity, the control unit 400 judges, based on the detected feature quantities, whether a change has occurred in the atmosphere inside the vehicle (step S704). This judgment is made, for example, by determining that the detected image feature quantity has changed from an emotion parameter representing a "smile" to one representing a "crying face", or that the audio parameter feature quantity has changed from a frequency component representing "laughter" to one representing "shouting".
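A minimal sketch of such a change test is shown below; the expression labels, the use of a voice spectrum vector, and the cosine-distance threshold are assumptions made for illustration, not the prescribed method of the present embodiment.

    import numpy as np

    def mood_changed(prev_emotion: str, curr_emotion: str,
                     prev_spectrum: np.ndarray, curr_spectrum: np.ndarray,
                     threshold: float = 0.5) -> bool:
        """Judge a change in the in-vehicle atmosphere from two snapshots.

        Fires when the facial-expression label flips (e.g. "smile" to
        "crying face") or the voice frequency spectrum drifts by more than
        `threshold` in cosine distance (e.g. laughter giving way to shouting).
        """
        if curr_emotion != prev_emotion:
            return True
        cos = float(np.dot(prev_spectrum, curr_spectrum) /
                    (np.linalg.norm(prev_spectrum) * np.linalg.norm(curr_spectrum)))
        return (1.0 - cos) > threshold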
[0074] If the control unit 400 judges in step S704 that no change has occurred in the atmosphere inside the vehicle (step S704: No), the process returns to step S701 and repeats steps S701 to S704. If it is judged in step S704 that the atmosphere inside the vehicle has changed (step S704: Yes), the image editing unit 412 acquires, through the image input/output I/F 413, image data including time stamp data captured by the imaging unit 416 (step S705).
[0075] Along with acquiring the image data in step S705, the control unit 400 acquires the current position information of the vehicle from the position acquisition unit 403 (see FIG. 4; the same applies hereinafter) (step S706), acquires map information from the recording medium 404 (see FIG. 4) through the recording medium decoding unit 405 (see FIG. 4) (step S707), and further acquires information on the route traveled by the vehicle and the corresponding times (step S708).
[0076] After acquiring the route and time information in step S708, the control unit 400 collates the time stamp data of the image data acquired by the image editing unit 412 with the route and time information, detects the point on the map that the vehicle passed at the time indicated by the image's time stamp data, and associates the image data with the map information (step S709).
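In essence, step S709 is a nearest-neighbor lookup of the image's time stamp in the vehicle's time-sorted position log; one possible sketch follows, where the (time, latitude, longitude) log format is an assumption made for illustration.

    from bisect import bisect_left
    from datetime import datetime

    def locate_photo(timestamp: datetime,
                     route_log: list[tuple[datetime, float, float]]) -> tuple[float, float]:
        """Return the (lat, lon) the vehicle passed closest in time to the photo.

        `route_log` is assumed to be a non-empty, time-sorted list of
        (sample_time, latitude, longitude) tuples from the position log.
        """
        times = [t for t, _, _ in route_log]
        i = bisect_left(times, timestamp)
        # consider the samples on either side of the insertion point
        candidates = [j for j in (i - 1, i) if 0 <= j < len(route_log)]
        best = min(candidates,
                   key=lambda j: abs((times[j] - timestamp).total_seconds()))
        _, lat, lon = route_log[best]
        return lat, lon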
[0077] After the image data is associated with the map information, the image editing unit 412 creates album data using the image data (step S710). After the album data is created in this way, behavior information relating to the behavior of the vehicle, such as information on the vehicle's speed and inclination angle information, is acquired by the position acquisition unit 403 and the like (step S711).
[0078] The behavior information acquired in this way is output through the control unit 400 to the audio playback unit 414 (see FIG. 6; the same applies hereinafter). The audio playback unit 414 acquires the album data from the image editing unit 412, and the audio playback processing unit 610 (see FIG. 6) associates audio data with the album data by referring to the audio data from the audio DB 611 (see FIG. 6), information on the music selection history from the music selection history DB 612 (see FIG. 6), and the like (step S712).
[0079] Here, the audio data is associated by, for example, judging the terrain, road type, and the like at the time the images were captured based on the map information and behavior information associated with the album data, and reading out from the audio DB 611 audio data such as music that suits the judged terrain and road type. Alternatively, the image feature quantities and audio parameter feature quantities described above may be referred to, and audio data suited to those feature quantities may be associated.
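One simple way to realize such a rule is a lookup table from the judged terrain or road type to a musical mood; in the sketch below, the terrain labels, the mood names, and the shape of the audio DB are all made-up assumptions for illustration.

    def pick_track_for_terrain(terrain: str, audio_db: dict[str, list[str]]) -> str:
        """Pick a track suited to the terrain/road type judged from the map
        and behavior information; `audio_db` is assumed to map mood labels
        to lists of track identifiers and to contain a "neutral" bucket."""
        mood = {"coastline": "breezy", "mountain_pass": "driving",
                "city_street": "pop", "expressway": "up-tempo"}.get(terrain, "neutral")
        tracks = audio_db.get(mood) or audio_db["neutral"]
        return tracks[0]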
[0080] After the audio data is associated with the album data in step S712, the image editing unit 412 and the control unit 400 judge whether the album data is complete (step S713). If it is judged that the album data is not complete (step S713: No), the process returns to step S701 and repeats steps S701 to S713. If it is judged that the album data is complete (step S713: Yes), the series of image editing processes in this flowchart ends.
[0081] Here, other processes for associating audio data with the album data in step S712 will be briefly described. FIGS. 8 and 9 are flowcharts showing examples of other procedures for associating audio data in the image editing processing of the image editing apparatus according to the embodiment of the present invention. FIG. 8 shows an association process based on the time stamp data of the image data, and FIG. 9 shows an association process based on the color tone data of the images in the image data.
[0082] In FIG. 8, first, the audio playback processing unit 610 (see FIG. 6; the same applies hereinafter) of the audio playback unit 414 (see FIG. 6) acquires, from the music selection history DB 612 (see FIG. 6), information on the playback history of music that has been played on, for example, the image editing apparatus 310 (see FIG. 4; the same applies hereinafter) (step S801). After acquiring the playback history information, the audio playback processing unit 610 refers to the time stamp data of the image data in the album data (step S802).
[0083] After referring to the time stamp data in step S802, the audio playback processing unit 610 selects the audio data of the track whose playback history information indicates it was played at the time closest to the referenced time stamp data (step S803). After selecting the audio data in this way, the audio playback processing unit 610 associates the selected audio data with the album data (step S804). In step S804, the audio data may be associated with the album data by, for example, mapping the hook (highlight) portion of the selected audio data to it.
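A minimal sketch of the selection rule of step S803, assuming history entries of the form (played_at, track_id), might be:

    from datetime import datetime

    def pick_track_by_history(photo_time: datetime,
                              history: list[tuple[datetime, str]]) -> str:
        """Select the track whose recorded playback time is closest to the
        image's time stamp (nearest entry in the music selection history DB)."""
        played_at, track_id = min(
            history, key=lambda e: abs((e[0] - photo_time).total_seconds()))
        return track_id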
[0084] Meanwhile, in FIG. 9, the audio playback processing unit 610 (see FIG. 6; the same applies hereinafter) refers to the album data from, for example, the image editing processing unit 510 (see FIG. 5; the same applies hereinafter), and refers to the feature quantity of the color tone data as the image feature quantity of the entire image data in the album data (step S901). The audio playback processing unit 610 then selects, from the audio DB 611 (see FIG. 6; the same applies hereinafter), audio data corresponding to the feature quantity of the referenced color tone data (step S902).
[0085] The selection of audio data in step S902 is performed, for example, as follows: if the color tone of the images in the entire image data is blue, audio data with a sad melody is selected; if it is green, audio data with a soothing melody is selected; and if it is red, audio data with an up-tempo melody is selected. After selecting the audio data in this way, the audio playback processing unit 610 associates the selected audio data with the album data (step S903). In step S903, the audio data may be associated with the album data by, for example, mapping the hook (highlight) portion of the selected audio data to it.
[0086] The selection of audio data in step S902 may also be performed based on, for example, the emotion parameter represented by a face image in the image data. In this case, for example, when the face image expresses joy, audio data with a bright melody is selected; when it expresses anger, audio data with an intense melody is selected; and when it expresses pleasure, audio data with an up-tempo melody is selected.
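Both rules amount to mapping a detected feature to a musical mood; a combined sketch, with the mood labels and the audio DB shape assumed purely for illustration, might be:

    TONE_TO_MOOD = {"blue": "sad", "green": "soothing", "red": "up-tempo"}
    EMOTION_TO_MOOD = {"joy": "bright", "anger": "intense", "pleasure": "up-tempo"}

    def pick_track_by_features(tone: str, emotion: str | None,
                               audio_db: dict[str, list[str]]) -> str:
        """Map the color tone of the album's images, or the face-image
        emotion parameter when one is available, to a mood and pick a track.
        `audio_db` is assumed to map mood labels to lists of track IDs and
        to contain a "neutral" bucket."""
        mood = (EMOTION_TO_MOOD.get(emotion) if emotion else None) \
            or TONE_TO_MOOD.get(tone, "neutral")
        tracks = audio_db.get(mood) or audio_db["neutral"]
        return tracks[0]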
[0087] Next, a specific example of the image editing processing of the image editing apparatus according to the embodiment of the present invention will be described. FIGS. 10 and 11 are explanatory diagrams showing specific processing examples of the image editing processing of the image editing apparatus according to the embodiment. As shown in FIG. 10, the image editing unit 412 (see FIG. 5; the same applies hereinafter) of the image editing apparatus 310 (see FIG. 4) refers to the current position information of the vehicle and the route and time information acquired by the position acquisition unit 403 (see FIG. 4) and input through the control unit 400 (see FIG. 4), as well as the map information input from the recording medium 404 (see FIG. 4) through the control unit 400, and, along the route traveled from the vehicle's starting point S to its end point E, acquires image data whenever it judges, based on the respective feature quantities of the images and audio parameters in the image data and audio data from the imaging unit 416 (see FIG. 4) and the sound collecting unit 417 (see FIG. 4), that the atmosphere inside the vehicle has changed.
[0088] In the case shown in FIG. 10, it is judged at photo acquisition points A to D that the atmosphere inside the vehicle has changed, and image data is acquired there. The image editing unit 412 then creates album data by associating the image data acquired at photo acquisition points A to D with the map information, based on, for example, the time stamp data of that image data, the current position information of the vehicle acquired at photo acquisition points A to D, and the route and time information. Since audio data such as music is associated with this album data, music and the like can be played back automatically as appropriate.
[0089] As shown in FIG. 11, the album data created by the image editing unit 412 in this way can be displayed, for example, in the form of a double-page-spread book album on the display screen of the display unit 402 (see FIG. 4; the same applies hereinafter). In the displayed album data 1110, for example, the image data 1120, 1130, 1140, and 1150 acquired at photo acquisition points A to D (see FIG. 10; the same applies hereinafter) can be displayed in chronological order. Each of the displayed image data 1120, 1130, 1140, and 1150 may display the in-vehicle photo images 1121, 1131, 1141, and 1151 taken at photo acquisition points A to D and the images 1122, 1132, 1142, and 1152 of the scenery A to D outside the vehicle.
[0090] Each of the displayed image data 1120, 1130, 1140, and 1150 may also display text information representing the place name obtained from the associated map information and the time the photo was taken or the time the vehicle passed. In FIG. 11, for example, the image data 1120 indicates that it was taken at "8:14 a.m." and that photo acquisition point A is "near Yorii Station". Similarly, the image data 1130 indicates that it was taken at "8:37 a.m." and that photo acquisition point B is "near Nagatoro". The image data 1140 indicates that it was taken at "1:20 p.m." and that photo acquisition point C is "near Chichibu Station". Further, the image data 1150 indicates that it was taken at "2:53 p.m." and that photo acquisition point D is "near Shomaru Pass". Although the image data 1120, 1130, 1140, and 1150 are displayed here in chronological order, they may also be displayed, for example, in the order of the route traveled by the vehicle.
[0091] As described above, the image editing apparatus according to the present embodiment can create album data by associating image data captured when the atmosphere inside the vehicle changes with map information, based on the time stamp data of that image data and information on the route and times traveled by the vehicle, without going through a server or the like. Image data can thus be automatically edited in chronological or route order to create album data, reducing the effort of image editing.
[0092] Furthermore, since the image editing apparatus according to the present embodiment can also associate the image data of the album data with audio data without going through a server or the like, it can enhance the entertainment value and reduce the effort and cost of image editing processing.
[0093] As described above, the image editing apparatus, image editing method, image editing program, and computer-readable recording medium according to the present invention provide the effect that images can be captured automatically as appropriate, and that an electronic album can be created by automatically associating the acquired image data with map information and audio data as appropriate.
[0094] The image editing method described in the present embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation. This program is recorded on a computer-readable recording medium such as a hard disk, flexible disk, CD-ROM, MO, or DVD, and is executed by being read from the recording medium by the computer. The program may also be a transmission medium that can be distributed via a network such as the Internet.

Claims

The scope of the claims
[1] An image editing apparatus comprising:
input means for accepting input of image data including information on a date and time;
acquisition means for acquiring information on a route traveled by a mobile body and the corresponding times; and
association means for associating the image data with map information based on the date-and-time information of the image data accepted by the input means and the route and time information acquired by the acquisition means.
[2] The image editing apparatus according to claim 1, further comprising display means for displaying the image data associated by the association means.
[3] The image editing apparatus according to claim 1 or 2, further comprising:
imaging means for capturing an image;
sound collecting means for collecting sound;
detection means for detecting a feature quantity of the image contained in the image data captured by the imaging means and a feature quantity of an audio parameter contained in the audio data of the sound collected by the sound collecting means; and
control means for controlling the imaging means based on the feature quantities detected by the detection means.
[4] The image editing apparatus according to claim 3, wherein the detection means detects a feature quantity of a face image of a person from an image relating to the person contained in the image data.
[5] The image editing apparatus according to claim 3, further comprising audio playback means for playing back audio data, wherein
the acquisition means acquires behavior information relating to the behavior of the mobile body, and
the audio playback means, when the image data is displayed on the display means, selects the audio data to be played back based on the feature quantities detected by the detection means and the behavior information acquired by the acquisition means.
[6] The image editing apparatus according to claim 3, wherein the display means displays, together with the image data associated by the association means, text information contained in the map information based on that map information.
[7] The image editing apparatus according to claim 4, wherein the detection means detects, as the feature quantity of the image, a feature quantity of an emotion parameter of the person's face image, and detects, as the feature quantity of the audio parameter, feature quantities of a volume component, a time component, and a frequency component.
[8] The image editing apparatus according to claim 3, wherein the control means causes the imaging means to capture the image when a change occurs in the feature quantities detected by the detection means.
[9] The image editing apparatus according to any one of claims 5 to 8, wherein the acquisition means acquires, as the behavior information, at least one of information on the speed of the mobile body, inclination angle information, lateral G (gravity) information, and current position information.
[10] An image editing method comprising:
an input step of inputting image data including information on a date and time;
an acquisition step of acquiring information on a route traveled by a mobile body and the corresponding times; and
an association step of associating the image data with map information based on the date-and-time information of the image data input in the input step and the route and time information acquired in the acquisition step.
[11] An image editing program causing a computer to execute the image editing method according to claim 10.
[12] A computer-readable recording medium on which the image editing program according to claim 11 is recorded.
PCT/JP2006/301757 2005-02-03 2006-02-02 Image editing device, image editing method, image editing program and computer readable recording medium WO2006082886A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2007501613A JP4516111B2 (en) 2005-02-03 2006-02-02 Image editing apparatus, image editing method, image editing program, and computer-readable recording medium
US11/815,495 US20090027399A1 (en) 2005-02-03 2006-02-02 Image editing apparatus, image editing method, image editing program, and computer-readable recording medium
EP06712900A EP1845491A4 (en) 2005-02-03 2006-02-02 Image editing device, image editing method, image editing program and computer readable recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005028277 2005-02-03
JP2005-028277 2005-02-03

Publications (1)

Publication Number Publication Date
WO2006082886A1 true WO2006082886A1 (en) 2006-08-10

Family

ID=36777269

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/301757 WO2006082886A1 (en) 2005-02-03 2006-02-02 Image editing device, image editing method, image editing program and computer readable recording medium

Country Status (5)

Country Link
US (1) US20090027399A1 (en)
EP (1) EP1845491A4 (en)
JP (1) JP4516111B2 (en)
CN (1) CN100533477C (en)
WO (1) WO2006082886A1 (en)


Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4895313B2 (en) * 2006-04-28 2012-03-14 パナソニック株式会社 Navigation apparatus and method
JP2009252250A (en) * 2008-04-01 2009-10-29 Alpine Electronics Inc Content reproducing apparatus and method
EP2443055A1 (en) * 2008-06-17 2012-04-25 Digigage Ltd. System for altering virtual views
CN102194316A (en) * 2011-03-23 2011-09-21 中兴通讯股份有限公司 Method and system for acquiring road condition information in real time
JP5779938B2 (en) * 2011-03-29 2015-09-16 ソニー株式会社 Playlist creation device, playlist creation method, and playlist creation program
JP5598484B2 (en) * 2012-01-19 2014-10-01 株式会社デンソー Audio output device
US9456239B2 (en) * 2012-11-30 2016-09-27 Mitsubishi Electric Corporation Wireless communication apparatus
CN103327263A (en) * 2013-07-16 2013-09-25 无锡方圆环球显示技术股份有限公司 Method for adding gravity acceleration data in shooting and recording process of multimedia
RU2694802C2 (en) * 2014-06-30 2019-07-16 Марио АМУРА Creating electronic images, editing images and simplified audio/video editing device, film production method starting from still images and audio tracks
US9826203B2 (en) 2014-09-08 2017-11-21 Intel Corporation Method and system for controlling a laser-based lighting system
US9864559B2 (en) * 2016-03-29 2018-01-09 Panasonic Avionics Corporation Virtual window display system
JP6669216B2 (en) * 2018-08-30 2020-03-18 株式会社Jvcケンウッド Electronic equipment, operation method, program
CN109218646A (en) * 2018-10-11 2019-01-15 惠州市德赛西威智能交通技术研究院有限公司 Vehicle electronics photograph album control method, device, car-mounted terminal and storage medium
US20210356288A1 (en) 2020-05-15 2021-11-18 Apple Inc. User interfaces for providing navigation directions
US11788851B2 (en) 2020-06-11 2023-10-17 Apple Inc. User interfaces for customized navigation routes
JP2022030591A (en) * 2020-08-07 2022-02-18 本田技研工業株式会社 Edition device, edition method, and program
US20220390248A1 (en) 2021-06-07 2022-12-08 Apple Inc. User interfaces for maps and navigation


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6914626B2 (en) * 2000-02-21 2005-07-05 Hewlett Packard Development Company, L.P. Location-informed camera
US6928230B2 (en) * 2000-02-21 2005-08-09 Hewlett-Packard Development Company, L.P. Associating recordings and auxiliary data
US7327505B2 (en) * 2002-02-19 2008-02-05 Eastman Kodak Company Method for providing affective information in an imaging system
GB2400667B (en) * 2003-04-15 2006-05-31 Hewlett Packard Development Co Attention detection

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10233985A (en) * 1997-02-18 1998-09-02 Fuji Photo Film Co Ltd Image reproducing method and image data managing method
JP2002123814A (en) * 2000-08-07 2002-04-26 Sony Corp Information processing device, information processing method, program storage medium, and program
JP2002183742A (en) 2000-12-18 2002-06-28 Yamaha Motor Co Ltd Preparation method and device for electronic album of trip and mobile tool for preparing electronic album
JP2003233555A (en) * 2002-02-13 2003-08-22 Zenrin Datacom Co Ltd Information managing system
EP1404105A2 (en) 2002-09-27 2004-03-31 Fuji Photo Film Co., Ltd. Method, apparatus, and computer program for generating albums

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1845491A4

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8391544B2 (en) 2009-06-30 2013-03-05 Kabushiki Kaisha Toshiba Image processing apparatus and method for processing image
JP2013216241A (en) * 2012-04-10 2013-10-24 Denso Corp Affect-monitoring system
JP2018195336A (en) * 2015-11-12 2018-12-06 キヤノンマーケティングジャパン株式会社 Information processing device, control method thereof, and program, and information processing system, control method thereof, and program

Also Published As

Publication number Publication date
JP4516111B2 (en) 2010-08-04
CN100533477C (en) 2009-08-26
US20090027399A1 (en) 2009-01-29
EP1845491A1 (en) 2007-10-17
JPWO2006082886A1 (en) 2008-06-26
CN101111863A (en) 2008-01-23
EP1845491A4 (en) 2008-02-13

Similar Documents

Publication Publication Date Title
JP4516111B2 (en) Image editing apparatus, image editing method, image editing program, and computer-readable recording medium
JP3876462B2 (en) Map information providing apparatus and method
JP4533897B2 (en) PROCESS CONTROL DEVICE, ITS PROGRAM, AND RECORDING MEDIUM CONTAINING THE PROGRAM
WO2011079241A1 (en) Method of generating building facade data for a geospatial database for a mobile device
WO2006101012A1 (en) Map information update device, map information update method, map information update program, and computer-readable recording medium
JP2009162722A (en) Guidance device, guidance technique, and guidance program
JP3708141B2 (en) Electronic map device
WO2006095688A1 (en) Information reproduction device, information reproduction method, information reproduction program, and computer-readable recording medium
JP4652099B2 (en) Image display device, image display method, image display program, and recording medium
WO2006103955A1 (en) Advertisement display device, advertisement display method, and advertisement display program
JP2007259146A (en) Caption detector, caption detecting method, caption detecting program and recording medium
JP2008014881A (en) Navigation device
JP2006189977A (en) Device, method and program for image editing, and medium for computer-readable recording medium
JP2009113725A (en) Device, method and program for controlling instrument, and recording medium
WO2007023900A1 (en) Content providing device, content providing method, content providing program, and computer readable recording medium
JP4207317B2 (en) Destination identification support device and navigation system provided with the device
JP5221120B2 (en) Facility information output device, program, facility information output method, and facility information display device
JP2006064671A (en) Information-providing device and portable information terminal
WO2006095689A1 (en) Drive assistance device, drive assistance method, and drive assistance program
JP2001083872A (en) Navigation device
JP2007232578A (en) System and method for providing route information and program
JP2008249654A (en) Record reproducing device, method, program, and recording medium
JP2006064654A (en) Navigation apparatus and method
JP2008160447A (en) Broadcast program receiving device, broadcast program reception planning device, broadcast program receiving method, broadcast program reception planning method, program, and recording medium
JP4584176B2 (en) Information transfer system, portable information processing apparatus, information transfer method, information transfer program, and computer-readable recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2006712900

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2007501613

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 200680003740.4

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 11815495

Country of ref document: US

WWP Wipo information: published in national office

Ref document number: 2006712900

Country of ref document: EP