WO2005015899A1 - Imaging Apparatus and Imaging Method - Google Patents
Imaging Apparatus and Imaging Method
- Publication number
- WO2005015899A1 (PCT/JP2004/007675; JP2004007675W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- scenario
- information
- synthesis
- video
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0007—Image acquisition
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
- G11B27/32—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00132—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
- H04N1/00185—Image output
- H04N1/00198—Creation of a soft photo presentation, e.g. digital slide-show
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H04N1/00405—Output means
- H04N1/00408—Display of information to the user, e.g. menus
- H04N1/0044—Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H04N1/00405—Output means
- H04N1/00408—Display of information to the user, e.g. menus
- H04N1/0044—Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet
- H04N1/00458—Sequential viewing of a plurality of images, e.g. browsing or scrolling
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2621—Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
- H04N5/772—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B2206/00—Systems for exchange of information between different pieces of apparatus, e.g. for exchanging trimming information, for photo finishing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2101/00—Still video cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/0077—Types of the still picture apparatus
- H04N2201/0084—Digital still camera
Definitions
- the present invention relates to an apparatus and a method for synthesizing an image obtained by shooting and a scenario image.
- Chroma key synthesis is known as one such technique.
- In chroma key synthesis, a subject is photographed against a background of a specific color (generally blue), and that background color is made transparent to create an image in which only the subject is extracted.
- A composite video is then created by superimposing the extracted image on another video.
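The chroma key extraction described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the patent's implementation: the key color, the distance threshold, and the function name are all assumptions.

```python
import numpy as np

def chroma_key_composite(foreground, background, key=(0, 0, 255), tol=80):
    """Replace pixels near the key color in `foreground` with `background`.

    foreground, background: HxWx3 uint8 RGB arrays of equal shape.
    key: the backdrop color to make transparent (pure blue here).
    tol: per-pixel Euclidean distance threshold (an assumed, tunable value).
    """
    fg = foreground.astype(np.int16)
    dist = np.linalg.norm(fg - np.array(key, dtype=np.int16), axis=-1)
    mask = dist < tol                    # True where the blue backdrop shows
    out = foreground.copy()
    out[mask] = background[mask]         # inlay: subject stays, backdrop is replaced
    return out
```

Pixels close to the key color are treated as backdrop and replaced; everything else (the extracted subject) is kept as-is.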
- Japanese Patent Laid-Open Publication No. 2001-346099 discloses a device for accumulating video so that a composite video by chroma key synthesis can easily be created. By storing the actual video of the subject shot with a camera in association with the shooting conditions of the camera at the time of shooting, the stored video can be used to create composite videos as needed later.
- Japanese Patent Laid-Open No. 4-183183 discloses an image signal processing apparatus for synthesizing a singing person's face and figure onto a reproduced image of karaoke video software.
Disclosure of the Invention
- The device disclosed in Japanese Patent Application Laid-Open No. 2000-344609 stores captured images so that they can be synthesized later.
- When highly accurate video synthesis is required, editing with such a device is preferable.
- In other cases, however, ease of operation at the time of video synthesis and speedy video synthesis may be required.
- The apparatus disclosed in Japanese Patent Application Laid-Open No. 4-183183 assumes that the captured image to be synthesized is the face or figure of a person singing karaoke. Under this assumption, the singer stands at an almost constant position in front of the screen displaying the lyrics, so an appropriate captured image can be obtained without adjusting the position and orientation of the video camera. Also, synthesis of the captured image need only be performed at the start of the performance, so there is no need to control the synthesis timing.
- In that apparatus, the photographed target to be synthesized is fixed to a predetermined object (the face or figure of a person singing karaoke), and the synthesis timing is likewise fixed, namely during playback of the karaoke video software.
- The apparatus therefore cannot control the target of the shot images to be synthesized, or the synthesis timing, according to the scenario video being played.
- The photographing apparatus of the present invention comprises: photographing means for acquiring a captured image; reading means for reading out stored data so as to form a scenario video; storage means for storing synthesis information for synthesizing the captured image into the scenario video; synthesizing means for synthesizing the captured image obtained by the photographing means with the scenario video read by the reading means, in accordance with the synthesis information stored in the storage means; and display means for displaying the composite video obtained by the synthesizing means.
- The composite video is thus presented to the user on the display means.
- the “scenario video” is a video having a concept of time to be combined with a captured image, and is, for example, a moving image, an animation, a slide show, or the like. Scenario images are generally displayed according to scenario information.
- the scenario information is, for example, information describing what data is output when, where, and how in multimedia data such as moving images, still images, text, or audio.
- the “captured image” is an image obtained by the capturing means during the reproduction of the scenario video, and is not constituted by data stored in a storage means such as a file in advance.
- the captured image may be a moving image or a still image.
- the synthesis information includes information on a time at which the captured image is synthesized in the scenario image.
- the information on the time is defined, for example, by the time in the scenario video at which the synthesis starts or the time in the scenario video at which the synthesis ends.
- the synthesis information includes at least one of information on an outer shape of a synthesis area where the captured image is synthesized and information on a position of the synthesis area in an image forming the scenario video.
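The synthesis information described above — a start time, an end time, and the outer shape and position of the synthesis area — might be represented as a simple record. The field names below are illustrative assumptions, not the patent's data format.

```python
from dataclasses import dataclass

@dataclass
class SynthesisInfo:
    # Times are offsets (seconds) into the scenario video.
    start_time: float      # time in the scenario video at which synthesis starts
    end_time: float        # time in the scenario video at which synthesis ends
    # Synthesis area within the image forming the scenario video
    # (position of the top-left corner plus a rectangular outer shape).
    region_x: int
    region_y: int
    region_w: int
    region_h: int

    def active_at(self, t: float) -> bool:
        """True while the captured image should be synthesized."""
        return self.start_time <= t < self.end_time

info = SynthesisInfo(start_time=5.0, end_time=12.0,
                     region_x=40, region_y=30, region_w=160, region_h=120)
```

A player could consult `active_at` on every frame to decide whether the synthesis area is currently in effect.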
- The user can thereby obtain information on the synthesis area before the time at which the captured image is synthesized into the scenario video.
- This allows the user to change the position or orientation of the camera, or the shooting conditions, so as to capture an image suitable for synthesis with the scenario video.
- the time before the captured image is combined with the scenario video may be before the time when the combining is started or may be the time when the combining is started.
- the information on the combination area to be presented to the user may be based on at least one of the information on the outer shape and the position in the combination information.
- the user can predict an image to be combined with the combination area, and can prepare for shooting.
- As a method of presenting information based on the outer-shape information, for example, the outline of the synthesis area may be displayed.
- In this way, the shape and size of the synthesis area can be presented.
- Presenting information about the position is effective both when a part of the captured image is synthesized into the synthesis area, and when the entire captured image is synthesized as-is over the entire image forming the scenario video. That is, the position or orientation of the camera can be adjusted so that the position of the object to be synthesized in the captured image matches the position of the synthesis area in the image forming the scenario video.
- the information on the combination area presented to the user may include an image obtained by processing a captured image acquired by the imaging unit in accordance with at least one of information on an outer shape and a position in the combination information.
- the user can grasp what kind of photographed image is obtained by the photographing means.
- Processing in accordance with the synthesis information means, for example, cutting out or enlarging/reducing the shot image; this lets the user grasp what kind of image will be displayed when the shot image is fitted into the synthesis area.
- the captured image can be displayed together with the images constituting the scenario video at the time when the synthesis is started, for example.
- it can be displayed together with the scenario image that is currently being displayed on the display means, that is, before the time when the synthesis is started.
- the captured image and the image constituting the displayed scenario image may be displayed in an overlapping manner or may be displayed side by side.
- The information on the synthesis area presented to the user may include a composite preview image, obtained by synthesizing the captured image acquired by the photographing means into an image forming the scenario video in which the synthesis area appears, according to the synthesis information.
- By presenting such a composite preview image to the user, the user can make the captured image synthesized into the synthesis area match the desired result. That is, by looking at the composite preview image, the user can select and shoot an image suitable for the synthesis area, adjust the zoom ratio of the photographing means according to the shape and size of the synthesis area, and adjust the brightness of the captured image to match the background scenario video.
- the composite preview image may be displayed by suspending the reproduction of the scenario video, or may be displayed on a screen different from the screen on which the scenario video is being reproduced.
- The image capturing apparatus may include a unit that temporarily pauses the reproduction of the scenario video at the time when synthesis of the captured image into the scenario video is started.
- the image capturing apparatus may be set so that a position of the combination area in an image forming the scenario image moves with time.
- the composition area can be set to be movable by expressing information on the position of the composition area included in the composition information as a function of time.
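Expressing the position as a function of time, as described above, can be as simple as interpolating between two anchor positions. The endpoints and times below are illustrative assumptions.

```python
def region_position(t, t0=5.0, t1=12.0, start=(40, 30), end=(200, 30)):
    """Linearly move the synthesis area from `start` to `end` over [t0, t1].

    t: current time in the scenario video (seconds).
    Returns the (x, y) position of the synthesis area at time t.
    All endpoint values here are illustrative assumptions.
    """
    if t <= t0:
        return start
    if t >= t1:
        return end
    f = (t - t0) / (t1 - t0)             # fraction of the motion completed
    return (round(start[0] + f * (end[0] - start[0])),
            round(start[1] + f * (end[1] - start[1])))
```

The same pattern — a function of scenario time returning geometry — could equally drive a time-varying outer shape.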
- the imaging device may be set so that the outer shape of the synthesis area changes with time.
- the outer shape of the combining area can be changed with the passage of time.
- the image capturing apparatus may further include a unit that notifies a user that the time to start combining the captured image with the scenario image has approached.
- Means for informing the user include, for example, a configuration for counting down by text, a configuration for providing notification by sound or light, and a configuration for providing notification by slowing down the scenario video playback speed.
- the “user” includes a person who is a subject as well as a person who performs shooting.
- the image capturing apparatus may further include means for notifying a user of information regarding the start of synthesis within a predetermined time including a time at which synthesis of the shot image with the scenario image is started.
- Means for informing the user of information on the start of synthesis can be realized by, for example, a configuration that outputs a sound, or one that blinks a light, within a predetermined time including the synthesis start time. By outputting sound or light around the synthesis start time, the user can be prompted to acquire a captured image. Note that even if the user starts shooting after the synthesis start time has elapsed, an appropriate composite video can still be obtained by complementing the interval from the synthesis start time until shooting begins with, for example, the captured image obtained after shooting starts.
- the “user” includes a person who is a subject as well as a person who shoots.
- the photographing apparatus may further include means for recording the composite video.
- the photographing apparatus may further include a unit that records the photographed image acquired by the photographing unit in association with the identification information of the scenario image.
- The composite video can then be reproduced by playing back the associated scenario video while synthesizing the recorded captured image into it.
- the photographing means may perform photographing based on information on a photographing condition included in the composite information.
- the shooting conditions include, for example, brightness, presence of photo light, zoom ratio, and color tone (sepia, black and white, color).
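The color-tone conditions listed above (sepia, black and white, color) can be illustrated with a small tone-mapping sketch. The function and tone names are assumptions; the luminance weights are the standard BT.601 coefficients.

```python
import numpy as np

def apply_tone(frame, tone="color"):
    """Apply a shooting-condition color tone to an HxWx3 uint8 RGB frame.

    `tone` is one of "color", "monochrome", "sepia" (assumed names).
    """
    if tone == "color":
        return frame
    # Standard luminance weights for grayscale conversion (BT.601).
    gray = (frame @ np.array([0.299, 0.587, 0.114])).astype(np.uint8)
    if tone == "monochrome":
        return np.stack([gray] * 3, axis=-1)
    if tone == "sepia":
        # Tint the luminance toward warm brown (R > G > B).
        sepia = np.stack([gray * 1.07, gray * 0.74, gray * 0.43], axis=-1)
        return np.clip(sepia, 0, 255).astype(np.uint8)
    raise ValueError(f"unknown tone: {tone}")
```

Applying the tone on the camera side, before synthesis, keeps the captured image consistent with the look the scenario video calls for.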
- the imaging device may include an input unit that allows a user to input at least one of a synthesis start instruction and a synthesis end instruction of the captured image and the scenario image.
- the user can select the timing to start or end the synthesis of the captured image and the scenario image.
- With a compositing start instruction, for example, compositing can be started once the user is ready to shoot the composite video.
- With a compositing end instruction, for example, compositing can be terminated when shooting of the composite video becomes impossible.
- the user is a person who performs photographing.
- When the synthesis start instruction is input after the synthesis start time of the captured image into the scenario video defined by the synthesis information has elapsed, the image to be synthesized in the synthesis area from the defined synthesis start time until the instruction is input may be complemented with the captured image or the scenario video.
- an appropriate synthesized image can be created even when a user inputs a synthesis start instruction after the synthesis start time specified by the synthesis information has elapsed.
- As a method of complementing with the captured image, for example, the captured image at the time the synthesis start instruction is input may be applied to the synthesis area from the synthesis start time specified by the synthesis information until the instruction is input.
- a method of complementing with a scenario image for example, there is a method of displaying a default scenario image in a synthesis area from a synthesis start time specified by synthesis information until a synthesis start instruction is input.
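The late-start complementing described above reduces to choosing, per frame, what occupies the synthesis area. This sketch shows the default-image strategy; the function and parameter names are assumptions.

```python
def synthesis_area_frame(t, sched_start, instructed_at, captured, default):
    """Return the image shown in the synthesis area at scenario time t.

    sched_start:   synthesis start time from the synthesis information.
    instructed_at: time the user actually input the start instruction
                   (may be later than sched_start).
    captured:      the captured image, available once shooting has begun.
    default:       a default scenario image used to fill the gap.
    """
    if t < sched_start:
        return None                  # synthesis not yet scheduled
    if t < instructed_at:
        return default               # complement the late-start gap
    return captured                  # normal synthesis from here on
```

The symmetric early-end case would return `captured` (frozen at the instruction time) or `default` between the end instruction and the scheduled end time.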
- When the synthesis end instruction is input before the synthesis end time of the captured image into the scenario video defined by the synthesis information, the image to be synthesized in the synthesis area from the input of the end instruction until the defined synthesis end time may be complemented with the captured image or the scenario video.
- As a method of complementing with the captured image, for example, the captured image at the time the synthesis end instruction is input may be applied to the synthesis area from the input of the end instruction until the synthesis end time specified by the synthesis information.
- the display means may include a plurality of screens, and may display the scenario image or the composite image on at least one screen.
- The user can view the scenario video or the composite video on one screen while checking other information on another screen.
- the image being photographed by the photographing means can be displayed on another screen.
- By showing the composite video to the subject, the subject too can check the composite video.
- the photographing apparatus may include an irradiation unit that emits light indicating a point or a range to be photographed by the photographing unit.
- the user can predict a captured image obtained by the image capturing means using light emitted to the subject, and can control the position and orientation of the camera based on the prediction.
- the photographed image acquired by the photographing unit may be a moving image that is continuously photographed, and the combining unit may combine the scenario image with the moving image.
- In this way, the moving image acquired by the photographing means can be synthesized with the scenario video based on the synthesis information.
- The captured image obtained by the photographing means may be one or more still images, and the synthesizing unit may use one of the still images, within a predetermined time, as the image to be synthesized with the continuously read scenario video.
- the still image acquired by the photographing means can be combined with the scenario image based on the combination information.
- the predetermined time may be a synthesis time specified by the synthesis information, or may be a part of the synthesis time. If one of the still images is synthesized during a part of the synthesis time, one or more other still images may be synthesized during the remaining synthesis time.
- The image capturing apparatus may include a partial image cutout unit that cuts out a part of the captured image acquired by the photographing means as a partial image, and the synthesizing unit may synthesize the partial image cut out by the partial image cutout unit with the scenario video.
- a partial image cut out from a captured image can be combined with a scenario image.
- The synthesis information stored in the storage unit may include partial image specifying information that specifies the region and the time zone from which the partial image is cut out by the partial image cutout unit.
- The partial image may then be cut out from the captured image based on the partial image specifying information.
- a partial image cut out from a captured image can be defined in the composite information. For example, it is possible to define a partial image that matches the scenario video.
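The partial image specifying information — a cutout rectangle plus a time zone — can be sketched as a small lookup. The `frame` here is a plain 2-D list of pixels, and the field names in `spec` are assumptions, not the patent's format.

```python
def cut_partial_image(frame, spec, t):
    """Cut out the partial image defined by `spec` at scenario time t.

    frame: 2-D list of pixel values (rows of columns).
    spec:  {"t0": ..., "t1": ..., "rect": (x, y, w, h)} — a time zone and
           a cutout rectangle (illustrative field names).
    Returns the cropped region, or None outside the cutout time zone.
    """
    if not (spec["t0"] <= t < spec["t1"]):
        return None                       # outside the specified time zone
    x, y, w, h = spec["rect"]
    return [row[x:x + w] for row in frame[y:y + h]]
```

Making `rect` a function of `t`, as in the moving-region example earlier, would give the time-varying cutout position and shape the text describes.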
- the position of the cutout region of the partial image specified by the partial image specifying information may move as time passes.
- the position of the partial image to be combined with the scenario video can be moved over time without changing the position and orientation of the camera.
- the shape of the cutout region of the partial image specified by the partial image specifying information may change over time.
- the shape of the partial image can be changed over time.
- The partial image specifying information may specify the partial image by an object that the partial image should include, and the partial image cutout unit may cut out from the captured image, as the partial image, the part in which the object specified by the partial image specifying information appears.
- The synthesis information may specify an object to be included in the synthesis area, and the synthesizing unit may use, as the synthesis area, the portion of the image forming the scenario video in which the specified object is displayed, and synthesize the captured image into that area.
- The photographing apparatus may further include control means that, when there is data whose reproduction starts at the time when synthesis of the captured image is started, performs control to start reproduction of that data and then temporarily pause the scenario video.
- The photographing apparatus may include a preview image storage unit that stores, as a preview image, the scenario image displayed at the time when synthesis of the captured image is started, and control means that displays the preview image and temporarily pauses the scenario video at that time.
- Because the scenario image displayed at the synthesis start time is stored as a preview image and shown at that time, the preview can be displayed easily.
- The control means may move the position at which reproduction of the scenario video is resumed back to the synthesis start time.
- The scenario data of the present invention includes scenario information describing a method of outputting stored data, and synthesis information for synthesizing a captured image obtained by shooting with the scenario video output according to the description of the scenario information.
- the synthesis information is information for synthesizing the captured image with the scenario image, and is, for example, information describing at what timing and at what position the captured image is to be synthesized.
- Scenario information is information that describes how to output the stored data. Whereas the scenario information describes in advance which data to output, the synthesis information does not fix the data to be output in advance; it describes the output of a different captured image each time shooting is performed. By outputting the captured image obtained by shooting based on the synthesis information, and outputting the stored data based on the scenario information, the scenario video and the captured image can easily be combined.
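Putting the two kinds of information together, a per-frame compositor would draw the scenario frame and, while the synthesis information is active, paste the captured frame into the synthesis area. A minimal sketch with plain 2-D lists and assumed field names:

```python
def render(t, scenario_frame, captured_frame, info):
    """Compose one output frame at scenario time t.

    scenario_frame: 2-D list of pixels from the stored scenario data.
    captured_frame: 2-D list of pixels from the camera.
    info: {"start": ..., "end": ..., "pos": (x, y)} — assumed synthesis
          information fields (active time window plus area position).
    """
    out = [row[:] for row in scenario_frame]       # copy the scenario frame
    if info["start"] <= t < info["end"]:
        x, y = info["pos"]
        for dy, row in enumerate(captured_frame):
            for dx, px in enumerate(row):
                out[y + dy][x + dx] = px           # inlay the captured pixels
    return out
```

Outside the active window the scenario frame passes through unchanged, which is exactly the separation the text draws: scenario information decides the stored output, synthesis information decides when and where the live capture joins it.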
- the combination information may include information on a time at which the captured image is combined with the scenario image.
- The information on the time is defined, for example, by the time in the scenario video at which the synthesis starts or the time in the scenario video at which the synthesis ends.
- the synthesis information may include at least one of information on an outer shape of a synthesis area where a captured image is synthesized and information on a position of the synthesis area in an image forming the scenario video.
- the synthesis information may include partial image specifying information for specifying a region and a time zone where a part of the captured image is cut out as a partial image.
- the position of the cut-out area of the partial image specified by the partial image specifying information may move as time passes.
- the position of the partial image to be synthesized with the scenario image can be moved over time without changing the position and orientation of the camera.
- the shape of the cut-out region of the partial image specified by the partial image specifying information may change over time.
- the shape of the partial image can be changed over time.
- The partial image specifying information may specify the region and the time zone of the partial image according to an object to be included in the partial image.
- The synthesis information may specify the synthesis area, and the time zone in which the scenario video includes the synthesis area, according to an object to be included in the synthesis area into which a captured image is synthesized.
- the combination region and the time period in which the scenario image includes the combination region are specified by the target object, so that the combination region can be easily specified.
- The photographing method of the present invention includes a step of reading stored data so as to form a scenario video, a step of acquiring a captured image, a step of reading synthesis information for synthesizing the captured image into the scenario video, a step of synthesizing the captured image with the scenario video according to the synthesis information, and a step of displaying the composite video obtained in the synthesizing step.
- the user can shoot while viewing the displayed composite video.
- The user can capture a desired shooting target by moving the camera that acquires the captured image while viewing the displayed composite video.
- By performing composition while shooting, speedy video composition is realized in which the composition is completed when shooting ends.
- The program according to the present invention causes a computer to execute: a step of reading stored data so as to form a scenario video; a step of acquiring a captured image; a step of reading synthesis information for synthesizing the captured image into the scenario video; a step of synthesizing the captured image with the scenario video according to the synthesis information; and a step of displaying the composite video obtained in the synthesizing step.
- By causing a computer to execute this program, the computer synthesizes the captured image and the scenario video according to the synthesis information, and displays the composite video obtained by the synthesis.
- the user can shoot while watching the displayed composite video.
- the user can move the camera for acquiring the photographed image while looking at the displayed composite video, and capture a desired photographing target.
- By performing composition while shooting, speedy video synthesis is realized in which the synthesis is completed when shooting ends.
- The photographing apparatus of the present invention synthesizes the captured image and the scenario video based on the synthesis information for synthesizing the captured image into the scenario video, and displays the composite video on the display means. This provides the excellent effect that the user can shoot while watching the displayed composite video, and can adjust the position and orientation of the photographing apparatus so that an appropriate composite video is shot.
- FIG. 1 is a diagram illustrating a configuration of a photographing device according to a first embodiment.
- FIG. 2 is a diagram showing an example of scenario data.
- FIG. 3 is a diagram showing an example of a scenario video display.
- FIG. 4A is a diagram illustrating a recording method of a composite video.
- FIG. 4B is a diagram explaining a recording method of a composite video.
- FIG. 5A is a diagram for explaining a composite video recording method.
- FIG. 5B is a diagram for explaining a composite video recording method.
- FIG. 6 is a diagram showing a flow of an image synthesizing operation by the imaging device.
- FIG. 7 is a diagram showing the relationship between the operation of the photographing means, the display contents of the display means, and scenario data.
- FIG. 8A is a diagram showing an example of a composite preview image.
- FIG. 8B is a diagram showing an example of a composite video.
- FIG. 9 is a diagram showing a configuration of a photographing apparatus according to the second embodiment.
- FIG. 10 is a diagram showing the relationship between the operation of the photographing means, the display contents of the display means, and the scenario data.
- FIG. 11A is a diagram showing an example of a composite video.
- FIG. 11B is a diagram showing an example of presentation of a synthesis area.
- FIG. 12 is a diagram showing a configuration of an imaging device according to the third embodiment.
- FIG. 13 is a diagram showing a configuration of an imaging device according to the fourth embodiment.
- FIG. 14 is a diagram showing the relationship between the operation of the photographing means, the display contents of the display means, and the scenario data.
- FIG. 15 is a diagram showing the relationship between the operation of the photographing means, the display contents of the display means, and the scenario.
- FIG. 16 is a diagram showing a configuration of an imaging device according to the fifth embodiment.
- FIG. 17 is a diagram showing the relationship between the operation of the photographing means, the display contents of the display means, and the scenario.
- FIG. 18 is a diagram showing the relationship between the operation of the photographing means, the display contents of the display means, and the scenario.
- FIG. 19 is a diagram showing a configuration of an imaging device according to the sixth embodiment.
- FIG. 20 is a diagram showing an example of scenario data.
- FIG. 21A is a diagram showing an example of a captured image.
- FIG. 21B is a diagram showing an example of a partial image.
- FIG. 22 is a diagram showing an example of composite information.
- FIG. 23A is a diagram showing an example of a change in the synthesis area.
- FIG. 23B is a diagram showing an example of a partial image.
- FIG. 24 is a diagram showing an example of the composite information.
- FIG. 25 is a diagram showing a configuration of an imaging device according to the seventh embodiment.
- FIG. 26 is a diagram showing an example of scenario data.
- FIG. 27A is a diagram illustrating an example of a captured image.
- FIG. 27B is a diagram showing an example of a partial image.
- FIG. 28 is a diagram showing an example of scenario data.
- FIG. 29A is a diagram showing an example of a scenario video.
- FIG. 29B is a diagram showing a combined area in a scenario image.
- FIG. 30 is a diagram showing a configuration of a synthesis / reproduction control unit according to the ninth embodiment.
- Fig. 31A shows an example of a scenario.
- FIG. 31B is a diagram showing an example of a schedule list.
- FIG. 32 is a diagram showing a configuration of a synthesis reproduction control unit according to the tenth embodiment.
- FIG. 33A is a diagram showing an example of scenario data.
- FIG. 33B is a diagram showing an example of the schedule list.
- FIG. 34 is a diagram showing a configuration of a synthesis / reproduction control unit according to the eleventh embodiment.
- FIG. 35A is a diagram showing an example of scenario data.
- FIG. 35B is a diagram showing an example of a schedule list.
- FIG. 36 is a diagram showing a display example on each screen when the imaging device has two screens.
- FIG. 37 is a diagram showing a display example on each screen when the imaging device has two screens.
- FIG. 38 is a diagram showing a display example on each screen when the imaging device has two screens.
- FIG. 39 is a diagram showing an example of the program of the present invention.
BEST MODE FOR CARRYING OUT THE INVENTION
- FIG. 1 is a diagram illustrating a configuration of an imaging device 10 according to the first embodiment of the present invention.
- the imaging device 10 according to the first embodiment is a portable imaging device whose imaging target can be changed by the user changing its position or orientation.
- portable video cameras and mobile phones with cameras are included.
- the photographing device 10 includes a scenario storage unit 12 for storing scenario data, a photographing unit 18 for acquiring a photographed image, a synthesis playback unit 20 for reproducing the scenario video and synthesizing the scenario video and the photographed image, a synthesis playback control unit 22 that controls the synthesis playback unit 20, and a display unit 24 that displays the synthesized scenario video.
- the imaging unit 18 is configured by, for example, a semiconductor device such as a CCD or a CMOS, and has a function of acquiring a digital image.
- the photographing unit 18 can photograph at least one of a moving image and a still image.
- the scenario data includes scenario information 14 and synthetic information 16.
- Scenario information 14 is information describing an output method of entity data to be output.
- the scenario video reproduced based on the scenario information 14 is a video that changes with time, and corresponds to, for example, a moving image, an animation, a slide show, and the like.
- the synthesis information 16 is information on the outer shape of the synthesis area for synthesizing the captured image in the scenario video, the position of the synthesis area in the images constituting the scenario video, and the time when the synthesis area appears.
- the time when the composite area appears is the time at which the captured image should be composited with the scenario video.
- FIG. 2 is a diagram showing an example of scenario data.
- FIG. 3 is a diagram showing an example of a scenario image and a combination area displayed on a screen by the data shown in FIG.
- the scenario image may be composed of one medium or may be composed of a plurality of media.
- the scenario video is composed of media A and media B, and the composite area is located on the media A video.
- the combining area is an area where the captured image acquired by the capturing unit 18 is combined.
- the scenario video may include a time zone during which no media is played during the playback time, as in the example shown in FIG. 2.
- the scenario data has scenario information 14 and synthesis information 16.
- the scenario information 14 has scenario information 14 a describing the output method of the media A and scenario information 14 b describing the output method of the media B.
- the scenario information 14a and 14b each include: i) a media identifier, ii) the storage location of the entity data, iii) information about the time to play and display the media, and iv) information about the area where the video is displayed.
- the scenario information 14a and 14b each include four pieces of information, but may include other information as needed.
- the media identifier is information for specifying a medium.
- the storage location of the entity data is information indicating the directory and file name where the entity data of the media is stored.
- the scenario information 14 has information on the storage location of the entity data, and the entity data is stored separately from the scenario data.
- the entity data is stored in the “home” directory for both media A and B, but the location where the entity data of each media is stored may be different. With the configuration in which the storage location of the entity data can be designated in this manner, for example, various existing data can be easily used as a scenario image.
- the information about the time for playing and displaying the media is information for specifying the time for playing the video based on the media.
- the timing of the playback start and the playback end is specified by the elapsed time from the start of the playback of the scenario video, which determines the playback display time.
- the media A starts playing at a time T1 seconds after the start of the playback of the scenario video is instructed, and ends playing at a time T2 seconds after the start.
- the information on the image display area is information on the position and the shape of the image to be displayed based on the media. For example, media A is displayed at the position of coordinates (Xa, Ya), and it can be seen that its shape is rectangular.
- the scenario information 14 can include information on the time and area for displaying the media. Based on this information, the video of the media is displayed at the specified time and area.
- the scenario data is described for each media, but other description methods can be adopted.
- the output method may be described for each scenario video playback time.
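As an illustration only (the patent does not specify a concrete encoding), the per-media scenario information of FIG. 2 might be modelled as plain records; the field names (`media_id`, `entity_path`, `start`, `end`, `position`, `shape`) and the concrete times are assumptions for this sketch, not the document's actual format:

```python
# Hypothetical sketch of scenario data: one record per medium plus the
# composite information. All concrete values are illustrative.
scenario_data = {
    "scenario_info": [
        {"media_id": "A", "entity_path": "home/mediaA.mpg",
         "start": 1.0, "end": 9.0,             # T1..T2, seconds from playback start
         "position": (10, 20), "shape": "rectangle"},
        {"media_id": "B", "entity_path": "home/mediaB.mpg",
         "start": 3.0, "end": 6.0,             # T3..T4
         "position": (120, 20), "shape": "rectangle"},
    ],
    "synthesis_info": {
        "start": 5.0, "end": 8.0,              # T5..T6: when the synthesis area appears
        "position": (80, 60), "shape": ("circle", 15),  # centre (Xc, Yc), radius R
    },
}

def media_active_at(data, t):
    """Return the media identifiers that should be displayed at elapsed time t."""
    return [m["media_id"] for m in data["scenario_info"]
            if m["start"] <= t < m["end"]]
```

With these assumed times, both media are shown at t = 4.0 and only media A at t = 2.0, matching the per-media timing description above.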
- the composition information 16 will be described.
- the synthesis information 16 is information for synthesizing the captured image with the scenario video.
- the synthesis information 16 includes time information on the time at which the synthesis area appears, and area information on the position and outer shape of the synthesis area.
- the time information is information that defines the time at which the composite area appears in the scenario video by the elapsed time from the start of the playback of the scenario video. In the example shown in FIG. 2, the composite area appears during the period from T5 seconds to T6 seconds after the start of the scenario video playback. That is, this period is the period in which the captured image is combined with the scenario video.
- the area information is information that defines the position and outer shape of the combined area that appears during the period specified by the time information.
- the position where the combining area appears is the coordinates (Xc, Yc), and the shape of the combining area is a circle having a radius R.
- the position at which the combined area appears can be set so that the position of the combined area moves by using the coordinates defining the appearance position as a function of time.
- the outline of the composite area can be changed as the scenario video playback time elapses by setting the time parameter.
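The time-parameterized position and outline just described can be sketched as follows; the linear drift and growth, and all concrete numbers, are assumptions for illustration only:

```python
# Illustrative sketch: the circular synthesis area's centre and radius as
# functions of elapsed time, so the area moves and its outline changes
# during the appearance period T5..T6.
T5, T6 = 5.0, 8.0

def synthesis_area_at(t):
    """Return (cx, cy, r) of the synthesis area, or None outside T5..T6."""
    if not (T5 <= t <= T6):
        return None
    progress = (t - T5) / (T6 - T5)
    cx = 80 + 40 * progress          # centre drifts 40 px to the right
    cy = 60
    r = 15 + 5 * progress            # outline grows from radius 15 to 20
    return (cx, cy, r)
```

Outside the period the function returns `None`, i.e. no captured image is composited.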
- the video synthesized in the synthesis area is displayed in a format that overwrites the other video. If the composite area and the media image overlap, the composite area video is displayed with priority. For example, as shown in FIG. 3, when the combining area and the display area of the video of the media A overlap, the combining area is arranged in front of the video of the media A, and the image captured by the imaging unit 18 is displayed. Overwrites the video on media A.
- the scenario storage unit 12 stores scenario data as described in FIG.
- the imaging device 10 reads out the scenario data stored in the scenario storage unit 12, reproduces the scenario image on the display unit 24, and combines the image captured by the imaging unit 18 with the scenario image. It has a reproducing section 20 and a synthetic reproducing control section 22 for controlling the synthetic reproducing section 20.
- the synthesis / reproduction control unit 22 reads out the scenario information 14 and the synthesis information 16 from the scenario storage unit 12 and controls the synthesis / reproduction unit 20 based on these information.
- based on the composite information 16, the synthesis playback control unit 22 fits the image shot by the photographing unit 18 into the synthesis area in the scenario video, thereby combining the shot image and the scenario video.
- the image combined at this time may be the entire image acquired by the imaging unit 18 or a part of the acquired image.
- the images are processed so as to match the size of the synthesis area; for example, the image to be synthesized is cut out and/or scaled up or down to fit the synthesis area.
- processing in which the portion where the composition area is located in the scenario video is cut out from the captured image at the appearance start time of the composition area and used for composition is easy for the user to understand. That is, when the displayed scenario video and the image obtained by the photographing unit 18 are superimposed on the display unit 24, the portion overlapping the synthesis area is cut out from the captured image and used for synthesis. This makes it easy to adjust the direction of the imaging device 10 so that the object to be put in the synthesis area is captured by the photographing unit 18.
- a portion at a fixed position (for example, the center) of the captured image may be cut out according to the outer shape of the combination area and used for combination.
- Information regarding the position to be cut out from the captured image may be included in the composite information, and based on this information, the image to be combined with the composite area may be cut out from the captured image.
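The cutting-out according to the outer shape of the synthesis area can be sketched in a toy form, treating a frame as a row-major grid of pixel values; the `None` sentinel for "outside the area" is an assumption of this sketch:

```python
# Minimal sketch of cutting a circular portion out of a captured frame so
# that it fits a circular synthesis area. Pixels outside the circle become
# None, i.e. they take no part in the composition.
def cut_out_circle(frame, cx, cy, r):
    h, w = len(frame), len(frame[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            inside = (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2
            row.append(frame[y][x] if inside else None)
        out.append(row)
    return out
```

The cut-out position (`cx`, `cy`) may be fixed (e.g. the frame centre), taken from the overlap with the synthesis area, or read from the composite information, as described above.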
- the synthesis playback section 20 plays the scenario video and plays back the synthesis video under the control of the synthesis playback control section 22.
- the entity data stored in the storage unit (not shown in FIG. 1) is read out according to the scenario information, and the read entity data is reproduced.
- the storage unit for the entity data is not shown in FIG. 1, and the entity data may be stored anywhere; it is also possible to store the entity data together with the scenario data in the scenario storage unit 12.
- the combining and reproducing unit 20 reads out and reproduces the entity data from the storage location specified by the scenario information. It is also possible to adopt a configuration in which entity data is read out by the synthesis reproduction control section 22 and sent to the synthesis reproduction section 20.
- the video synthesized by the synthesis and reproduction control unit 22 is recorded.
- as the recording method for the video recording unit 26, a method of recording the synthesized video reproduced on the screen as it is may be adopted, or a method of separately recording the captured image to be synthesized and the scenario video may be adopted.
- the composite video can be reproduced as it is by reading the recorded composite video.
- when recording separately, the captured image is associated with the identification information of the scenario video with which it is to be synthesized. For example, as shown in FIG. 4A, i) a composite information identifier is included in the composite information in advance; the identifier "Camera A" is then added, indicating that the image captured by Camera A is to be combined.
- alternatively, scenario data may be generated in which the composite information includes the information of the captured image. That is, as shown in FIG. 5A, the composite information includes i) a composite information identifier in advance. Then, when recording the captured image, as shown in FIG. 5B, the captured image captured by the photographing unit 18 is treated as one of the media constituting the scenario data (media C), and new scenario data is generated.
- like the media A and the media B, the media C includes: i) a media identifier, ii) the storage location of the entity data, iii) information on the time for playing and displaying the media, and iv) information on the area for displaying the video.
- as for the recording range, a method of recording from the start of the reproduction of the scenario video to the end of the reproduction may be employed, or a method of recording only the portion of the scenario video in which the synthesis area appears may be employed.
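The "media C" recording scheme of FIG. 5A/5B can be sketched as regenerating the scenario data with one extra media record that inherits the synthesis area's time and region; the function name and dictionary fields are hypothetical, not the patent's encoding:

```python
# Hedged sketch: instead of saving the rendered composite video, record the
# captured stream as a new medium ("media C") and regenerate scenario data
# that references it. Field names are illustrative.
def add_captured_media(scenario_data, entity_path):
    synth = scenario_data["synthesis_info"]
    media_c = {
        "media_id": "C",
        "entity_path": entity_path,
        "start": synth["start"], "end": synth["end"],  # plays while the area appears
        "position": synth["position"], "shape": synth["shape"],
    }
    new_data = dict(scenario_data)
    new_data["scenario_info"] = scenario_data["scenario_info"] + [media_c]
    return new_data
```

Because media C carries the same four kinds of information as media A and B, the regenerated scenario data replays the composite video with an ordinary scenario playback.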
- the photographing condition control unit 28 has a function of controlling the photographing conditions and the like of the photographing unit 18.
- the shooting conditions include, for example, conditions such as brightness, presence / absence of photo light, zoom ratio, and color tone (sepia, black and white, and color).
- when the composite information 16 includes shooting conditions, the imaging condition control unit 28 sets the imaging conditions based on that information. Further, the photographing condition control unit 28 can also accept the setting of the photographing conditions by the user.
- Fig. 6 is a diagram showing the flow of the video compositing operation by the photographing device 10.
- Fig. 7 shows the relationship between the operation of the photographing unit 18, the display contents on the display unit 24, and the scenario data when performing the video composition.
- the user instructs the photographing device 10 to start reproducing the scenario video (step S1).
- the instruction to start the reproduction of the scenario video is input to the composite reproduction control unit 22.
- the synthesis playback control unit 22 controls the synthesis playback unit 20 (step S2), so that the synthesis playback unit 20 reproduces the scenario video. That is, the synthesis playback unit 20 reads the entity data from the entity data storage unit 40 and reproduces the scenario video based on the scenario information 14 (step S3).
- the entity data storage unit 40 may be located anywhere, and is indicated by a dotted line in FIG. 6.
- a scenario video is generated according to the scenario data shown in FIG. 2; that is, the period during which the composite area appears in the scenario video is between time T5 and time T6, as shown in FIG. 7.
- the synthesis playback control unit 22 manages the appearance timing of the synthesis area based on the synthesis information 16 read from the scenario storage unit 12, while playing back the scenario video by the synthesis playback unit 20. At the appearance timing of the combination area (time T5), the combination reproduction control unit 22 instructs the combination reproduction unit 20 to pause the reproduction of the scenario image (step S4). The composition / playback control unit 22 further instructs the shooting condition control unit 28 to shoot in preview mode (a mode in which a shot image is displayed on the screen without being recorded). The photographing condition control unit 28 controls the photographing unit 18 in accordance with the instruction from the synthesis and reproduction control unit 22 to start photographing in the preview mode (step S4).
- when the composite information 16 includes shooting conditions, the shooting condition control unit 28 controls the photographing unit 18 based on those conditions. This makes it possible to acquire a captured image that matches the scenario video. For example, if the color tone of the scenario video is sepia and the creator of the scenario video considers that the image to be synthesized should preferably also be sepia, the composite information 16 can include sepia as the color tone shooting condition. The photographing condition control unit 28 then controls the photographing unit 18 based on the shooting conditions included in the composite information 16 and acquires a photographed image with a sepia color tone.
- the image photographed by the photographing unit 18 is sent to the synthesis playback unit 20, and the photographed image and the scenario video are synthesized by the synthesis playback unit 20 (step S5). More specifically, the scenario video at time T5 is displayed as a still image on the display unit 24, and the captured image is combined with the synthesis area included in the scenario video to create a composite video. This composite video is the composite preview image. As shown in FIG. 7, the playback of the scenario video is paused at time T5, and the composite preview image is displayed on the display unit 24.
- FIG. 8A and FIG. 8B are diagrams showing examples of the composite preview image. Playback of the scenario video is stopped, and a still image is displayed. The captured image acquired by the imaging unit 18 is displayed in the combination area. When the imaging device 10 is moved, the image acquired by the imaging unit 18 changes, and the image synthesized in the synthesis area changes as shown in FIGS. 8A and 8B.
- the user adjusts the position and orientation of the photographing device 10, the zoom ratio, the brightness, and the like so that the desired image is included in the synthesis area while watching the synthesis preview image.
- the scenario video is an image at the moment when the keeper catches the ball.
- the synthesis area is located at the keeper's face, and the image captured by the photographing unit 18 is fitted into the synthesis area.
- in FIG. 8A, the captured image shows the ball and feet, which is not suitable as the video to be combined with the scenario video. Therefore, a face image is shot by moving the photographing device 10 to change the shooting location. This makes it possible to obtain a composite image suitable for the scenario video, as shown in FIG. 8B. By displaying the composite preview image in this way, the user can adjust the image obtained by the photographing unit 18.
- the user instructs the imaging device to release the pause.
- This instruction is given, for example, when the user determines that the desired image in the combining area has been combined.
- the instruction to release the pause is input to the composite reproduction control unit 22 (step S6).
- the composite playback control unit 22 instructs the composite playback unit 20 to restart the scenario video playback, and also instructs the shooting condition control unit 28 to perform shooting in the recording mode (step S6).
- the time of the pause instruction and the time of the pause release are both T5 in the scenario data; the time in the scenario data is stopped during the pause while real time elapses.
- the image photographed by the photographing unit 18 is sent to the combining and reproducing unit 20 and combined with the scenario image (step S7).
- a composite video is created by fitting the captured image into the synthesis area included in the scenario video. Note that when the photographing unit 18 captures a moving image, the image fitted into the synthesis area changes, for example, when the subject moves.
- if the synthesis area in the scenario video is made to move as the scenario video progresses, it is possible to create a composite video in which the position of the synthesized captured image moves. Also, when a part of a captured image is cut out and synthesized into the synthesis area, if the cutout position is fixed, the same object can be captured without moving the imaging device, and a composite video can be created in which the synthesis area into which the image of the object is fitted moves in the scenario video. Further, when a part of a captured image is cut out and synthesized into the synthesis area, the cutout position can be moved in accordance with the position of the synthesis area.
- the composite reproduction unit 20 displays the composite video on the display unit 24.
- the photographing unit 18 sends the photographed image to the video recording unit 26, which records it (step S7), as indicated by the dashed arrow in FIG. 6.
- the captured image is recorded together with the identification information of the scenario video being reproduced.
- the recording of the composite video can also be performed by another method. That is, as shown by the dashed arrow in FIG. 6, instead of the photographed image from the photographing unit 18, the composite video output from the synthesis playback unit 20 is recorded in the video recording unit 26.
- the synthesis playback control unit 22 manages the timing at which the appearance of the synthesis area ends based on the composite information 16 read from the scenario storage unit 12. Then, at time T6, when the synthesis area disappears from the scenario video, the synthesis playback control unit 22 instructs the shooting condition control unit 28 to end the shooting. Thereby, the photographing condition control unit 28 terminates the photographing by the photographing unit 18 (step S8).
- at the end time T7 of the reproduction of the scenario video, the synthesis playback control unit 22 notifies the synthesis playback unit 20 of the completion of the reproduction of the scenario video (step S9).
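The sequence of steps S3 through S9 can be sketched as a toy control loop; the one-second tick, the event names, and the immediate release of the pause are simplifications assumed for this illustration, not the patent's implementation:

```python
# Hedged sketch of the combined playback control flow: play until the
# synthesis area appears (T5), pause in preview mode, resume in recording
# mode on release, stop capture when the area disappears (T6), and finish
# at the end of the scenario video (T7).
T5, T6, T7 = 5.0, 8.0, 10.0

def controller():
    events = []
    t, pause_done = 0.0, False
    while t <= T7:
        if t == T5 and not pause_done:
            events.append("pause+preview")    # step S4: still image + preview mode
            # ...user adjusts the camera here; real time passes,
            # scenario time does not...
            events.append("release+record")   # step S6: resume + recording mode
            pause_done = True
        elif t == T6:
            events.append("stop_capture")     # step S8: synthesis area gone
        t += 1.0
    events.append("playback_done")            # step S9
    return events
```

In a real device the release would come from the user (or from automatic content judgment, as described below), not immediately after the pause.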
- the scenario data includes synthesis information 16 for synthesizing the captured images.
- under the control of the synthesis playback control unit 22 based on the composite information 16, the synthesis playback unit 20 combines the image captured by the photographing unit 18 with the scenario video to create a composite video, and displays it on the display unit 24. This makes it possible to combine the scenario video and the captured image while shooting. Therefore, the captured image and the scenario video can be quickly synthesized to obtain a composite video.
- the user can view the composite video and can adjust the image to be composited while capturing. That is, by capturing the target to be combined with the photographing unit 18 during the reproduction of the scenario video, the captured image can be combined as it is, without shooting first, playing back the footage later, cutting out the target to be synthesized from the shot image, and then combining it with the scenario video. In particular, when the screen or keypad is limited, as in a mobile terminal, it is difficult to play back and edit both the scenario video and the captured image, so it is effective to adjust the photographed image with the photographing unit 18 while viewing the composite video.
- the scenario video playback may be restarted in response to a scenario video playback instruction from the user. Alternatively, a means may be provided for judging the content of the image being photographed, and when it is determined that the photographed image is an image that should enter the synthesis area, the reproduction may be automatically resumed. For example, in the examples shown in FIG. 8A and FIG. 8B, a routine for recognizing a face can be incorporated in the imaging device 10, and when it is determined that a face image has been synthesized in the synthesis area, the scenario video playback can be restarted.
- in the above, a configuration in which the captured image to be combined with the synthesis area overwrites the scenario video has been described as the method for combining images, but another combination method can also be employed.
- for example, a transparent GIF format capable of specifying a transparent color may be used for the images constituting the scenario video, and the image may be synthesized by superimposing the captured image below the scenario video. As a result, the captured image below is displayed in the transparent color portion of the scenario video.
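The transparent-color layering just described can be sketched as follows; the sentinel value used for the transparent color is an assumption of this sketch:

```python
# Minimal sketch of transparent-colour composition: wherever the scenario
# frame uses the designated transparent colour, the captured frame layered
# underneath shows through. TRANSPARENT is an assumed sentinel value.
TRANSPARENT = -1

def compose_transparent(scenario_frame, captured_frame):
    return [[c if s == TRANSPARENT else s
             for s, c in zip(srow, crow)]
            for srow, crow in zip(scenario_frame, captured_frame)]
```

This is the inverse of the overwrite method described earlier: the scenario video has priority everywhere except in its transparent portions.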
- information on the rotation angle of the captured image may be included in the composite information. This makes it possible to synthesize an image at an angle that matches the scenario video without moving the camera. For example, if the scenario video contains a character standing upside down and a captured face image is to be combined at the face position, the rotation angle (180°) of the captured image can be included in the composite information; if the captured image is rotated based on this information and then combined, an appropriate image can be combined without having the subject stand upside down or shooting with the camera upside down.
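For the 180° case mentioned above, the rotation applied before fitting can be sketched on a row-major pixel grid (only this multiple of 90° is handled; a general angle would need resampling):

```python
# Illustrative sketch: apply the rotation angle from the composite
# information before fitting the captured image. A 180° rotation reverses
# both the row order and each row.
def rotate_180(frame):
    return [list(reversed(row)) for row in reversed(frame)]
```

Applying the rotation twice returns the original frame, which is a quick sanity check on the transform.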
- FIG. 9 is a diagram illustrating a configuration of an imaging device 10 according to the second embodiment.
- the imaging device 10 according to the second embodiment has the same basic configuration as the imaging device 10 according to the first embodiment, but differs in that it further has a synthesis area presentation unit 30 that presents information on the outer shape of the synthesis area.
- the synthesis area presentation unit 30 has a function of presenting information on the outer shape of the synthesis area to the user based on the synthesis information 16 before the time when the synthesis area appears.
- the combining area presenting unit 30 displays, for example, the outline of the combining area as a frame on the scenario image.
- the frame is a line for making the outline of the synthesis area stand out, and the shape and size of the synthesis area are indicated to the user.
- FIG. 10 is a diagram showing the relationship between the operation of the photographing unit 18, the display contents on the display unit 24, and the scenario data when performing video synthesis.
- with reference to FIG. 10, a description will be given of the video composition operation by the imaging device 10 according to the second embodiment.
- the horizontal axis is the time axis.
- the user instructs the image capturing apparatus 10 to start reproducing a scenario video.
- the instruction to start playback is input to the synthesis playback control unit 22, which controls the synthesis playback unit 20 to start the reproduction of the scenario video.
- the imaging device 10 reads the scenario information 14 by the combining and reproducing unit 20 and reproduces the scenario video from the start of the reproduction of the scenario video until time T5 is reached.
- the synthesis playback control unit 22 refers to the composite information 16 and, at time Tp before time T5 is reached, sends information on the outline of the synthesis area that will appear at time T5 to the synthesis area presentation unit 30, and the synthesis area presentation unit 30 displays a frame surrounding the synthesis area over the screen of the scenario video currently displayed on the display unit 24.
- FIG. 11A is a diagram illustrating an example of a composite video at time T5
- FIG. 11B is a diagram illustrating an example in which information on the outer shape of the composite area is displayed on a screen.
- the scenario video at time T5 is an image of the keeper who has caught the ball, and a circular synthesis area is located at the face.
- the imaging device 10 displays the information on the outer shape of the combined area on the display unit 24 at the time Tp before the time T5 when the combined area appears.
- the outline of the combined area is displayed as a frame F overlaid on the scenario image.
- the imaging device 10 displays information on the outer shape of the synthesis area from time Tp to time T5.
- the scenario video reproduction and the video synthesis after time T5 are the same as those of the imaging device 10 of the first embodiment. That is, after time T5, the synthesis playback control unit 22 combines the scenario video and the image captured by the photographing unit 18 to create a composite video. Then, the imaging device 10 displays the composite video on the display unit 24 by the synthesis playback unit 20. According to the composite information 16 of the scenario data, no synthesis area appears after time T6, so the imaging device 10 reproduces only the scenario video by the synthesis playback unit 20.
- the outline of the synthesis area is displayed on the scenario video before the synthesis area appears and the image synthesis is started. Based on this, it is possible to predict an image to be synthesized in the synthesis area. This allows the user to prepare for capturing an image to be combined.
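The display switching of the second embodiment can be sketched as follows; the patent leaves the choice of Tp open, so the fixed lead time here is an assumption of the sketch:

```python
# Hedged sketch of the second embodiment's presentation timing: the outline
# frame F is shown from Tp up to T5, actual synthesis runs from T5 to T6,
# and only the scenario video is shown otherwise.
T5, T6 = 5.0, 8.0
LEAD = 2.0            # assumed lead time for the advance notice
Tp = T5 - LEAD

def display_mode(t):
    if Tp <= t < T5:
        return "scenario+frame_outline"   # frame F overlaid on the scenario video
    if T5 <= t < T6:
        return "composite_video"          # captured image fitted into the area
    return "scenario_only"
```

The advance-notice interval gives the user time to aim the camera before composition actually begins.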
- as with the imaging device 10 of the first embodiment, the imaging device 10 of the second embodiment can synthesize the captured image and the scenario video while shooting, and can synthesize the video quickly with simple operations.
- the synthesis / reproduction control unit 22 also sends information on the position where the synthesis area is displayed at the time of the start to the synthesis area presentation unit 30.
- a frame F indicating the outline of the composite area is displayed at that position in the scenario image. Displaying the position information in advance in this way is effective in an implementation in which the position of the synthesis area in the scenario image at the synthesis start time matches the position of the cutout portion in the captured image.
- information on the outline of the composite area may be displayed anywhere on the display unit 24.
- if the imaging device 10 has a sub-screen in addition to the screen for reproducing and displaying the scenario video, the information on the outer shape of the synthesis area may be displayed on the sub-screen.
- the outline information of the combining area is displayed by the frame F.
- the outline information may be indicated by a display other than the frame F.
- for example, the scenario video inside the combined area may be displayed more faintly than the rest. This also allows the user to recognize the outline of the composite area.
- the outline information presented by the composite area presentation unit 30 is indicated by a frame F having the same shape and size as the composite area that actually appears.
- information relating to only the shape or the size of the composite area may be presented by the composite area presentation unit 30.
- in the above description, information on the outer shape of the combining area is displayed.
- in addition, the image currently being captured by the capturing unit 18 may be displayed.
- the captured image displayed on the display unit 24 may be the entire image captured by the capturing unit 18 or may be an image of only the combined range of the captured images.
- when the entire captured image is displayed, it is displayed transparently over the entire scenario video.
- when only the combined range is displayed, the portion corresponding to the synthesis range is cut out from the currently captured image, displayed transparently at the position of the synthesis area that appears at the synthesis start time, and superimposed on the scenario video.
- the user cannot view the scenario image at time T5, which should be displayed together with the combined area, before time T5. Therefore, even if information on the outer shape of the combined area is displayed before time T5 as described above, it may not be clear what image should be captured to match the scenario video.
- information on the attributes of the combined area (e.g., "face") may also be displayed from time Tp to time T5 (however, the user can ignore this attribute and freely select the shooting target).
- FIG. 12 is a diagram showing an imaging device 10 according to the third embodiment of the present invention.
- the image capturing apparatus 10 according to the third embodiment has the same basic configuration as the image capturing apparatus 10 according to the second embodiment, but differs in that it further includes a timing notification unit 32 that notifies the user that the appearance time of the combined area is approaching.
- the timing notification unit 32 detects the approach of the appearance time of the synthesis area based on the information on the appearance time of the synthesis area included in the synthesis information 16 and the elapsed time from the start of the scenario playback.
- the timing notifying unit 32 may employ a configuration for notifying that the appearance time of the combined area is approaching, for example, by emitting a sound. As the appearance time of the synthesis area approaches, the sound to be generated can be made louder or the sound generation interval can be shortened to notify the approach of the appearance time of the synthesis area.
- the timing notifying unit 32 informs the user that the appearance time of the combined area is approaching, starting from the time Tp at which the combined area information is presented.
- the user can know that the appearance time of the combination area is approaching, and can prepare for shooting an image to be combined.
- the timing notifying unit 32 may notify that the appearance time of the combined area is approaching by a method other than sound. For example, a configuration that notifies the approach of the appearance time by slowing down the playback speed of the scenario video, or a configuration that displays a countdown number on the screen, may be adopted. The notification may also be continued after the appearance time of the combined area has elapsed, so as to notify the user immediately after the appearance time. This can prompt the user to shoot an image to be combined. In this case, a configuration may be adopted in which the pitch and volume of the sound are changed at the appearance time.
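- the idea of making the notification sound more frequent as the appearance time approaches can be sketched as follows. This is an illustrative Python sketch only; the function name, the linear shaping, and the three-second window are assumptions, not specified by the patent:

```python
def beep_interval(t, t5, max_interval=1.0, min_interval=0.1, window=3.0):
    """Seconds between notification beeps; shrinks linearly over the last
    `window` seconds before the synthesis area appears at time t5."""
    remaining = max(t5 - t, 0.0)
    frac = min(remaining / window, 1.0)   # 1.0 far away, 0.0 at the appearance time
    return min_interval + (max_interval - min_interval) * frac

assert beep_interval(0.0, 10.0) == 1.0    # far from t5: slow beeps
assert beep_interval(10.0, 10.0) == 0.1   # at the appearance time: fastest beeps
```

the same shaping could instead drive the sound volume upward, matching the alternative described above.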
- FIG. 13 is a diagram illustrating a configuration of an imaging device 10 according to the fourth embodiment of the present invention.
- the photographing apparatus 10 according to the fourth embodiment has the same basic configuration as the photographing apparatus 10 according to the second embodiment, but differs in that it further includes a synthesis timing input unit 34 for inputting instructions to start and end video synthesis.
- the timing information input from the composite timing input unit 34 is sent to the composite reproduction control unit 22.
- the synthesis / playback control unit 22 controls the start or end of synthesis of the captured image and the scenario video based on the timing information from the synthesis timing input unit 34.
- with the composition timing input section 34 provided in this way, the start and end of video composition can be input, so the user can select the video composition timing.
- the user can start the composition when the image to be combined is ready to be photographed, and can terminate the composition when the image to be combined cannot be photographed.
- FIG. 14 is a diagram showing the operation of the photographing unit 18 and the relationship between the display contents on the display unit 24 and the scenario video when performing video synthesis.
- FIG. 14 a description will be given of an operation of video synthesis by the imaging device 10 according to the embodiment.
- the horizontal axis is the time axis.
- the user instructs the image capturing apparatus 10 to start reproducing a scenario video.
- a synthesis area is displayed on the scenario video reproduced on the screen of the imaging device 10.
- the image being photographed by the photographing unit 18 is displayed in the combining area.
- while the synthesis start instruction has not been input to the synthesis timing input unit 34, the video displayed on the display unit 24 is not recorded.
- when a synthesis start instruction is input at time Ts, the imaging apparatus 10 records the captured image displayed in the synthesis area in the video recording unit 26. In addition, for the period from the appearance time T5 of the synthesis area to the time Ts at which the start of synthesis was instructed, the imaging device 10 creates and records a composite video by combining the captured image (still image) at time Ts with the synthesis area. Thus, the image of the synthesis area for the period before the synthesis start instruction is input can be complemented by the captured image at time Ts.
- the method of complementing the synthesis area when the input timing of the synthesis start instruction is later than the appearance timing of the synthesis area has been described.
- alternatively, a buffer may be provided between the synthesizing / reproducing unit 20 and the video recording unit 26; for the scenario video from time T5 to Ts, the first captured image input to the buffer at time Ts is synthesized, and the resulting composite video is recorded in the video recording unit 26.
- a description will be given of a complementing method when a synthesis end instruction is input while a synthesis area still appears.
- FIG. 15 is a diagram showing the operation of the photographing unit 18 and the relationship between the display contents on the display unit 24 and the scenario image when performing video synthesis.
- FIG. 15 a description will be given of the operation of video synthesis by the imaging device 10 according to the embodiment.
- the horizontal axis is the time axis.
- the user instructs the photographing device to start reproducing the scenario video.
- the composite area is displayed in the video scenario reproduced on the screen of the imaging device 10.
- An image captured by the imaging unit 18 is displayed in the combination area.
- the photographing device 10 records the video synthesized on the screen in the video recording unit 26 after time T5.
- when a synthesis end instruction is input, photographing apparatus 10 terminates recording of the image photographed by photographing section 18.
- the captured image (still image) at the time Te is combined with the combined area from the time Te instructed to finish combining to the appearance end time T6 of the combined area to create a combined video and recorded.
- the image of the combined area from when the combining end instruction is input until the display of the combined area disappears at time T6 can be complemented by the captured image at time Te.
- by complementing the image of the synthesis area in the shifted period in this way, the situation in which no captured image is synthesized in the synthesis area can be avoided.
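- the two complementing rules above (a late start instruction at Ts and an early end instruction at Te) can be summarized as a per-frame source selection. The following Python sketch is illustrative only; the function name and labels are assumptions, not from the patent:

```python
def synthesis_source(t, ts, te):
    """Which image fills the synthesis area at scenario time t, when the user
    started synthesis at ts and ended it at te (both inside the area's
    appearance period [T5, T6))."""
    if t < ts:
        return "still@Ts"   # complemented with the frame captured at Ts
    if t < te:
        return "live"       # image currently captured by the photographing unit
    return "still@Te"       # complemented with the frame captured at Te

# Area appears over [5.0, 10.0); synthesis starts at Ts = 6.0, ends at Te = 9.0.
assert synthesis_source(5.5, 6.0, 9.0) == "still@Ts"
assert synthesis_source(7.0, 6.0, 9.0) == "live"
assert synthesis_source(9.5, 6.0, 9.0) == "still@Te"
```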
- the image of the combined area is complemented by using the image at the time of starting or finishing the combining of the captured images.
- the image of the combined area may be complemented by another method. It is also possible to stretch and record the captured image so that it equals the length of time during which the combined area appears. For example, if the appearance time of the composite area is 10 seconds and the playback time of the captured image is 5 seconds, the playback speed of the captured image is halved so that its length matches the appearance time of the composite area.
- by performing the above-mentioned complementation during playback, the situation in which the captured image is not synthesized in the composite area can be avoided, and the composite video can be played back appropriately.
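- the stretching method described above reduces to a simple speed factor and a time mapping. A minimal illustrative Python sketch (function names are assumptions, not from the patent):

```python
def stretch_speed(appearance_sec, captured_sec):
    """Playback-speed factor that stretches the captured clip to the
    appearance duration of the synthesis area."""
    return captured_sec / appearance_sec

def source_time(t_in_area, appearance_sec, captured_sec):
    """Map a time offset inside the synthesis area to a time in the captured clip."""
    return t_in_area * stretch_speed(appearance_sec, captured_sec)

# The example above: a 10 s appearance period and a 5 s captured clip.
assert stretch_speed(10.0, 5.0) == 0.5       # half-speed playback
assert source_time(10.0, 10.0, 5.0) == 5.0   # end of area maps to end of clip
```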
- FIG. 16 is a diagram illustrating a configuration of an imaging device 10 according to the fifth embodiment.
- the configuration of the imaging device 10 according to the fifth embodiment is basically the same as that of the imaging device 10 according to the first embodiment, except that a still image is combined with a scenario video.
- the imaging device 10 according to the fifth embodiment includes a shutter 36.
- the shutter 36 is connected to the image capturing section 18. When the shutter 36 is pressed, the image capturing section 18 captures a still image at the timing of the pressing as an image to be recorded.
- FIG. 17 is a diagram showing the relationship between the operation of the photographing unit 18, the display contents on the display unit 24, and the scenario when performing video composition.
- FIG. 17 a description will be given of an operation of video synthesis by the imaging device 10 of the embodiment.
- the horizontal axis is the time axis.
- the user instructs the photographing device to start reproducing the scenario video.
- the scenario data does not include the combination information, and the scenario video is reproduced.
- the reproduction of the scenario video is started by the composite reproduction control unit 22 to which the reproduction start instruction is input, controlling the composite reproduction unit 20.
- at time T5, the imaging device 10 temporarily stops playing the scenario video.
- the scenario image at time T5 is displayed on the screen.
- the scenario video contains a composite area.
- the composite playback control unit 22 combines the image in the field of view of the photographing unit 18 and the paused scenario video to create a composite video.
- the synthesizing / reproducing unit 20 displays a video image synthesized under the control of the synthesizing / reproducing control unit 22. This composite video is a composite preview image.
- the user adjusts the position or orientation of the imaging device 10 based on the composite preview image and takes a still image to be combined.
- the user presses the shutter 36 to acquire the still image.
- the scenario video reproduction may be resumed when the shutter 36 is pressed by the user, or may be resumed after a predetermined time has elapsed from when the scenario video was paused.
- the composite reproduction control unit 22 composites the scenario video with the still image captured by the capturing unit 18 to create a composite video. Specifically, under the control of the composite playback control unit 22, the composite playback unit 20 performs processing such as clipping or scaling on the acquired still image, and combines the processed image with the composite area in the scenario video. Combine with the area. Then, the imaging device 10 displays the composite video on the display unit 24 by the composite reproduction unit 20. In this case, unlike the case of shooting a moving image, the scenario image proceeds, but the image that fits in the composite area remains a still image.
- the synthesis / playback control unit 22 controls the synthesis / playback unit 20 so that the captured image is synthesized with the scenario image while the synthesis region appears in the scenario image. That is, from time T5 to time T6, the combining and reproducing unit 20 combines the captured image and the scenario video.
- the photographing device 10 reproduces the scenario video by the synthesizing reproduction unit 20.
- the photographing device 10 records the video while the scenario video is being reproduced in the video recording unit 26. From time T5 to time T6, the composite video is recorded by either the method of recording the output from the composite playback unit 20 or the recording of a shot still image. Note that the composite preview image is not recorded.
- the operation of the imaging device 10 according to the fifth embodiment has been described above.
- the case of a still image has been described in accordance with the operation example of the first embodiment.
- it is of course also possible to apply the operation example of the second embodiment, the timing notification of the third embodiment, and the complementing method of the fourth embodiment to the synthesis of still images.
- the imaging device 10 can quickly synthesize a captured still image with the scenario video to obtain a composite video.
- the user can view the composite video and can adjust the image to be composited when capturing. That is, by capturing the target to be combined by the photographing unit 18 during the reproduction of the scenario video, the captured image can be combined as it is. This makes it possible to easily compose the video without the hassle of playing back the captured image once taken, extracting the target to be combined from the captured image, and synthesizing it at the playback timing of the scenario video. Can be.
- when the screen and keypad are limited, as in a mobile terminal, it is difficult to play back the scenario video, display the captured still image, and edit it; in such a case, it is effective to adjust the photographed image with the photographing unit 18 while the composite image is displayed.
- a still image may be imaged a plurality of times.
- FIG. 18 is a diagram for explaining video synthesis when a still image is shot a plurality of times.
- the horizontal axis is the time axis.
- a combined image is displayed in which the still image captured the first time and the scenario image are combined.
- a combined image obtained by combining the still image captured with the second shot and the scenario image is displayed.
- a synthesized video obtained by synthesizing the still image shot with the third shot and the scenario video is displayed until the appearance period of the synthesis area ends.
- a plurality of still images can be displayed like a slide show in the composite area of the scenario video.
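- the slide-show behaviour described above amounts to showing, at each scenario time, the most recent still captured so far. An illustrative Python sketch (function name and data layout are assumptions, not from the patent):

```python
import bisect

def slideshow_index(t, shot_times):
    """Index of the still shown at scenario time t: the most recent shot taken
    at or before t, or None before the first shot. `shot_times` is sorted."""
    i = bisect.bisect_right(shot_times, t) - 1
    return i if i >= 0 else None

shots = [6.0, 8.0, 9.0]                    # times of the 1st, 2nd and 3rd shutter presses
assert slideshow_index(7.0, shots) == 0    # first still shown until the 2nd shot
assert slideshow_index(8.5, shots) == 1    # second still shown until the 3rd shot
assert slideshow_index(5.0, shots) is None # before the first shot
```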
- after the n-th shot, the composite video obtained by synthesizing the n-th still image and the scenario video is recorded in the video recording unit 26, but the display unit 24 may display an image obtained by synthesizing a preview image in the field of view of the imaging unit 18 with the scenario video. This allows the user to prepare for the next, (n+1)-th, shot with reference to the displayed preview image while the scenario video is played back.
- the user may be allowed to select an image to be used for synthesis from a plurality of still images, or a composite video may be created and recorded using the still image captured last. If the still image taken last is synthesized, the user can simply end shooting when the best image has been captured, and can thus create a composite video from the best shot. Also, for example, in FIG.
- FIG. 19 is a diagram illustrating an imaging device 10 according to the sixth embodiment of the present invention.
- the imaging device 10 according to the sixth embodiment has the same basic configuration as the imaging device 10 according to the third embodiment, but differs in that it is provided with a clipping unit 50 that cuts out a part of a captured image.
- the clipping unit 50 has a function of cutting out a part of the image captured by the image capturing unit 18 as a partial image.
- the clipping unit 50 cuts out a partial image according to the composite information 16 stored in the scenario storage unit 12.
- FIG. 20 is a diagram illustrating an example of scenario data stored in the scenario storage unit 12.
- the scenario data includes scenario information 14 and synthesis information 16 as in the above-described embodiment.
- the synthesis information 16 includes clipping information for cutting out a partial image from a captured image, in addition to information defining a synthesis area.
- clipping position for specifying the cutout position from the captured image
- clipping shape for specifying the cutout shape
- clipping change for indicating the change of the partial image
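- the clipping information fields listed above can be represented, for example, as a small data structure. The following Python sketch is illustrative only; class and field names are assumptions, not taken from the patent:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ClippingInfo:
    position: Tuple[int, int]      # clipping position: where to cut in the captured image
    shape: str                     # clipping shape: e.g. "ellipse"
    change: Optional[str] = None   # clipping change: how the partial image varies over time

@dataclass
class SynthesisInfo:
    appear: float          # time at which the synthesis area appears
    disappear: float       # time at which the synthesis area disappears
    clipping: ClippingInfo

info = SynthesisInfo(5.0, 6.0, ClippingInfo(position=(120, 80), shape="ellipse"))
assert info.clipping.shape == "ellipse"
```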
- the basic operation of the photographing apparatus 10 according to the sixth embodiment is the same as the operation of the photographing apparatus 10 according to the above-described embodiments, except that a partial image cut out from the photographed image by the clipping unit 50 is combined with the scenario video.
- the operation of the imaging device 10 according to the sixth embodiment will be described.
- the synthesis / reproduction control unit 22 reads the synthesis information 16 from the scenario storage unit 12 and passes the clipping information included in the synthesis information 16 to the clipping unit 50.
- the clipping unit 50 receives a photographed image from the photographing unit 18 and cuts out a partial image from the received photographed image. Specifically, the region of the position and shape specified by the clipping information passed from the synthesis / reproduction control unit 22 is cut out from the captured image as a partial image.
- FIG. 21A is a diagram illustrating an example of a captured image
- FIG. 21B is a diagram illustrating an example of a partial image.
- when the clipping shape is described as "ellipse," as in the example shown in FIG. 20, the elliptical partial image P shown in FIG. 21B is cut out from the captured image shown in FIG. 21A.
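- conceptually, cutting out an elliptical partial image amounts to keeping only the pixels inside the ellipse. A minimal illustrative Python sketch (the pixel representation and function names are assumptions, not from the patent):

```python
def in_ellipse(x, y, cx, cy, rx, ry):
    """Standard ellipse-interior test for pixel (x, y)."""
    return ((x - cx) / rx) ** 2 + ((y - cy) / ry) ** 2 <= 1.0

def clip_ellipse(image, cx, cy, rx, ry, blank=0):
    """`image` is a list of rows of pixel values; pixels outside the ellipse
    are replaced with `blank`, yielding the partial image P."""
    return [[px if in_ellipse(x, y, cx, cy, rx, ry) else blank
             for x, px in enumerate(row)]
            for y, row in enumerate(image)]

image = [[1] * 5 for _ in range(5)]
partial = clip_ellipse(image, cx=2, cy=2, rx=2, ry=1)
assert partial[2][2] == 1   # centre pixel kept
assert partial[0][0] == 0   # corner pixel blanked
```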
- the clipping unit 50 passes the partial image cut out from the photographed image to the combining and reproducing unit 20. Further, the combined playback control unit 22 reads the scenario information 14 from the scenario storage unit 12 and passes the read scenario information 14 to the combined playback unit 20. The synthesizing and reproducing unit 20 synthesizes the partial image passed from the clipping unit 50 with the synthesizing area of the scenario video. Then, the combining and reproducing unit 20 sends the combined video to the display unit 24 and the video recording unit 26. The display unit 24 displays the composite video sent from the composite playback unit 20. The video recording unit 26 records the composite video sent from the composite reproduction unit 20.
- the imaging device 10 according to the sixth embodiment of the present invention has been described above.
- the imaging device 10 of the sixth embodiment includes the clipping unit 50, and the clipping unit 50 combines a partial image cut out from the captured image with the scenario video, so that a part of the captured image can be synthesized with the scenario video.
- the clipping unit 50 cuts out the partial image according to the clipping information described in the synthesis information 16, it is possible to specify in the synthesis information which part is cut out as the partial image.
- the clipping information specifies the position and the shape of the partial image to be extracted, but the partial image may be specified in another mode.
- FIG. 22 is a diagram illustrating another example of the composite information describing the clipping information. As shown in FIG. 22, it is also possible to define a partial image to be cut out as an area linked to the synthesis area.
- FIG. 23A is a diagram showing a change in the position and shape of the combining region
- FIG. 23B is a diagram showing a partial image cut out from a captured image (see FIG. 21A) in conjunction with the combining region.
- in the above description, the data recorded in the video recording unit 26 is the composite video synthesized by the composite playback unit 20, but the captured image itself may instead be recorded in the video recording unit 26.
- the clipping unit 50 sends the partial image cut out from the photographed image to the synthesizing and reproducing unit 20, while sending the photographed image to the video recording unit 26.
- the partial image and the scenario image are combined by the combining and reproducing unit 20, the combined image is displayed on the display unit 24, and the captured image is recorded in the image recording unit 26.
- in this case, i) an identifier and ii) the storage location of the captured video to be combined are recorded in the synthesis information of the scenario data.
- at playback time, the captured video indicated by the identifier i) recorded in the scenario data is read from the video recording unit 26 and synthesized with the scenario video by the composite reproduction unit 20 in accordance with the synthesis information.
- in this way, the composite video can be reproduced.
- FIG. 25 is a diagram illustrating a configuration of an imaging device 10 according to the seventh embodiment.
- the imaging device 10 according to the seventh embodiment has the same basic configuration as the imaging device 10 according to the sixth embodiment, but differs from it in that the partial image to be cut out from the captured image is specified by specifying the type of object to be included in the image. Accordingly, the photographing device 10 of the seventh embodiment further includes an image recognition unit 52.
- FIG. 26 is a diagram illustrating an example of scenario data according to the seventh embodiment.
- the scenario data includes scenario information 14 and composite information 16 as in the above-described embodiment.
- the synthesis information 16 includes clipping information for cutting out a partial image in addition to information for specifying a synthesis area.
- the clipping information specifies a partial image by specifying an object included in the partial image.
- "clipping target" is the information on the target object that specifies the partial image.
- in the example of FIG. 26, it is described that a portion including the clipping target "face" is cut out from the captured image as a partial image.
- the image recognizing unit 52 has a function of performing image recognition on the image captured by the image capturing unit 18 and identifying where the clipping target appears in the captured image. Then, the image recognizing unit 52 passes information on the region where the clipping target is displayed to the clipping unit 50.
- the operation of the imaging device 10 according to the seventh embodiment is basically the same as the operation of the imaging device 10 according to the sixth embodiment, except that the region to be cut out by the clipping unit 50 is obtained through recognition performed by the image recognition unit 52.
- the synthesis / reproduction control unit 22 of the photographing device 10 reads the synthesis information 16 from the scenario data stored in the scenario storage unit 12 and acquires information on a clipping target that specifies a partial image. Then, the synthesis / reproduction control unit 22 passes the information of the clipping target to the image recognition unit 52.
- the image recognizing unit 52 identifies an area where the object to be clipped is displayed in the image captured by the image capturing unit 18 and passes information on the area to the clipping unit 50.
- the clipping unit 50 cuts out the area indicated by the information passed from the image recognition unit 52 as a partial image.
- FIG. 27A is a diagram illustrating an example of a captured image
- FIG. 27B is a diagram illustrating an example of a partial image. If the clipping target is specified as "face" in the composite information 16, the area where the "face" is displayed in the captured image shown in FIG. 27A is identified and extracted as the partial image P, as shown in FIG. 27B.
- in this example, a circular region including the "face" is cut out as the partial image
- however, the shape of the region from which the image including the clipping target is cut out is not limited to a circle. For example, a rectangular area may be cut out, or an area that matches the shape of the clipping target may be cut out.
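- the recognition-driven clipping described above can be sketched as follows. The recognizer here is a stand-in that looks up bounding boxes from supplied metadata; a real device would run a face or object detector. All names and the margin value are assumptions, not from the patent:

```python
def recognize(target, detections):
    """Stand-in for the image recognition unit 52: return the bounding box
    (x, y, w, h) where `target` appears in the captured image."""
    return detections[target]

def clipping_circle(target, detections, margin=1.5):
    """Circle (cx, cy, r) enclosing the recognised target, with some margin,
    to be cut out by the clipping unit 50 as the partial image."""
    x, y, w, h = recognize(target, detections)
    return (x + w / 2.0, y + h / 2.0, margin * max(w, h) / 2.0)

# Hypothetical detection: a 40x40 face whose top-left corner is at (10, 20).
cx, cy, r = clipping_circle("face", {"face": (10, 20, 40, 40)})
assert (cx, cy, r) == (30.0, 40.0, 30.0)
```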
- the clipping unit 50 passes the partial image cut out from the photographed image to the combining and reproducing unit 20.
- the synthesizing and reproducing unit 20 synthesizes the partial image passed from the clipping unit 50 with the synthesizing area of the scenario video.
- the combining and reproducing unit 20 sends the combined image to the display unit 24 and the image recording unit 26.
- the display unit 24 displays the composite video sent from the composite playback unit 20.
- the video recording unit 26 records the composite video sent from the composite playback unit 20.
- the imaging device 10 of the seventh embodiment specifies a partial image by specifying a clipping target to be included in the partial image, so that the partial image to be clipped can be easily specified. Then, the image recognizing unit 52 of the photographing device 10 can identify the region where the clipping target is displayed, and cut out the region including the clipping target.
- the image capturing apparatus 10 of the seventh embodiment combines the partial image cut out by the clipping unit 50 with the scenario video, as in the image capturing apparatus 10 of the sixth embodiment, so that a part of the captured image can be combined with the scenario video.
- in the imaging apparatus 10 according to the seventh embodiment described above, a configuration was described in which the composite video obtained by combining the scenario video with the partial image cut out from the captured image is recorded in the video recording unit 26.
- however, as described for the sixth embodiment, the captured image itself may instead be recorded in the video recording unit 26.
- the imaging device 10 according to the eighth embodiment has the same basic configuration as the imaging device 10 according to the seventh embodiment (see FIG. 25), but the content of the synthesis information 16 stored in the scenario storage unit 12 differs.
- the image recognition unit 52 has a function of identifying a target object both in a captured image and in a scenario video.
- FIG. 28 is a diagram illustrating an example of the scenario data stored in the scenario storage unit 12 of the imaging device 10 according to the eighth embodiment.
- the synthesis reproduction control unit 22 of the photographing device 10 reads the scenario information 14 and the synthesis information 16 from the scenario data stored in the scenario storage unit 12, and determines a target object for specifying the synthesis area. Get information.
- the combined playback control unit 22 passes the scenario information 14 to the combined playback unit 20.
- the synthesizing / reproducing unit 20 passes the scenario image to the image recognizing unit 52 together with information on the target for specifying the synthesizing area.
- the image recognition unit 52 identifies an area in the scenario video where the image of the target object is displayed, and passes information on the area to the synthesis and reproduction control unit 22 as synthesis area information.
- the image recognizing unit 52 passes the information on the position and the shape of the combining area to the combining and reproducing control unit 22.
- the combined playback control unit 22 passes the combined area information to the combined playback unit 20.
- FIG. 29A is a diagram showing an example of a scenario video
- FIG. 29B is a diagram showing a composite area in the scenario video. If it is specified in the composition information 16 that the composite area includes the target object "ball," the area where the "ball" is displayed is identified in the scenario video shown in FIG. 29A, and as shown in FIG. 29B, the area including the "ball" becomes the composite area G.
- the combining / reproducing unit 20 combines the partial video passed from the clipping unit 50 with the scenario video based on the combining area information passed from the combining / playback controlling unit 22.
- the process of clipping a part of the captured image as a partial image by the clipping unit 50 is the same as that of the image capturing apparatus 10 according to the seventh embodiment.
- the synthesizing and reproducing unit 20 sends the synthesized video to the display unit 24 and the video recording unit 26.
- the display unit 24 displays the composite video sent from the composite playback unit 20.
- the video recording unit 26 records the composite video sent from the composite playback unit 20.
- the imaging device 10 according to the eighth embodiment can easily specify a combination area by specifying an object to be included in the combination area.
- Clipping information may be defined according to the position and shape of the cutout area. Also, clipping information is not always necessary, and the synthesis information 16 need not include clipping information. In this case, the entire image captured by the capturing unit 18 is combined with the combining area specified by the target object.
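- the eighth embodiment's target-specified synthesis area, and the fallback to the whole captured image when no clipping information is given, can be sketched as follows. The detection map stands in for the image recognition unit 52; all names are illustrative assumptions:

```python
def synthesis_area_for(target, detections):
    """Synthesis area G: the region of the scenario frame where `target`
    appears, as a (x, y, w, h) box, or None if the target is not found."""
    return detections.get(target)

def image_for_area(captured, clip_fn=None):
    """Without clipping information the entire captured image is composited;
    otherwise a supplied clip function extracts the partial image first."""
    return captured if clip_fn is None else clip_fn(captured)

# Hypothetical recognition result for one scenario frame.
frame_detections = {"ball": (5, 5, 10, 10)}
assert synthesis_area_for("ball", frame_detections) == (5, 5, 10, 10)
assert synthesis_area_for("face", frame_detections) is None
assert image_for_area("whole-frame") == "whole-frame"   # no clipping info
```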
- the photographing apparatus 10 according to the ninth embodiment has the same basic configuration as the photographing apparatus 10 according to the first embodiment, but differs in the function of the composition and reproduction control unit 22.
- FIG. 30 is a diagram showing the configuration of the combined reproduction control unit 22 according to the ninth embodiment.
- the combined playback control unit 22 includes a schedule generation unit 54, a schedule execution unit 56, and a combined playback instruction unit 58.
- the schedule generation unit 54 has a function of generating a schedule list based on the scenario information 14 read from the scenario storage unit 12.
- the schedule execution unit 56 has a function of extracting the schedule at the current time from the schedule list generated by the schedule generation unit 54.
- the composite reproduction instructing section 58 has a function of instructing the composite reproducing section 20 to perform composite display or reproduction based on the contents of the schedule extracted by the schedule executing section 56.
- FIG. 31A is a diagram showing an example of scenario data
- FIG. 31B is a diagram showing a schedule list generated from the scenario data shown in FIG. 31A.
- A PAUSE schedule is added to the schedule list and is executed at t = 10 s.
- the scenario video is paused after the simultaneous playback media is played back, so that the scenario video including the simultaneous playback media can be displayed in the paused state. Therefore, the user can shoot an appropriate image to be combined with the scenario video while viewing the displayed image.
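The schedule-generation and execution behavior described above can be sketched as follows. The scenario data shape, the field names, and the 10 s figure are illustrative assumptions, not the patent's actual format.

```python
def generate_schedule(scenario):
    """Turn scenario information into a time-ordered schedule list."""
    schedule = [(m["start"], "PLAY", m["name"]) for m in scenario["media"]]
    # Pause the scenario video when the synthesis area appears, so the user
    # can line up the shot against the paused frame.
    schedule.append((scenario["synthesis_start"], "PAUSE", None))
    return sorted(schedule)

def due_entries(schedule, now):
    """Extract the entries whose time has arrived (the 'current' schedule)."""
    return [e for e in schedule if e[0] <= now]

scenario = {"media": [{"name": "A", "start": 0}, {"name": "B", "start": 5}],
            "synthesis_start": 10}
sched = generate_schedule(scenario)
```

At t = 6 only the two PLAY entries are due; the PAUSE entry becomes due at t = 10, when the scenario video is held still for shooting.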
- the photographing apparatus 10 according to the tenth embodiment has the same basic configuration as the photographing apparatus 10 according to the first embodiment, but differs in the function of the composition and reproduction control unit 22.
- FIG. 32 is a diagram showing a configuration of the synthesis / reproduction control unit 22 in the tenth embodiment.
- the synthetic reproduction control unit 22 according to the tenth embodiment includes a state monitoring unit 60 and a schedule correction unit 62 in addition to the configuration of the synthetic reproduction control unit 22 according to the ninth embodiment.
- the state monitoring unit 60 has a function of monitoring the operation rate of the CPU, the memory usage, and the operation rate of the device.
- the schedule correction unit 62 has a function of setting a setup time for media playback based on each state obtained by the state monitoring unit 60.
- FIG. 33A is a diagram showing an example of scenario data
- FIG. 33B is a diagram showing a schedule list generated from the scenario data shown in FIG. 33A.
- the schedule correction unit 62 obtains or calculates the time required for media playback or device startup preparation, and determines the setup time based on that time.
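A minimal sketch of this setup-time idea follows. The patent only says that the setup time is determined from the measured states; the take-the-longest-preparation-time rule, the millisecond values, and all names here are assumptions for illustration.

```python
def determine_setup_time(prep_times_ms):
    """Use the longest preparation time among media/devices as the setup time."""
    return max(prep_times_ms.values(), default=0)

def correct_schedule(schedule, setup_ms):
    """Shift each scheduled start earlier by the setup time (never before t = 0)."""
    return [(max(0, t - setup_ms), op, arg) for (t, op, arg) in schedule]

# Hypothetical measured preparation times
setup = determine_setup_time({"camera_startup": 800, "media_B_open": 300})
adjusted = correct_schedule([(0, "PLAY", "A"), (1000, "PLAY", "B")], setup)
```

Starting each entry earlier by the setup time means the media or device is ready by its nominal time in the scenario.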
- the photographing apparatus 10 according to the eleventh embodiment has the same basic configuration as the photographing apparatus 10 according to the first embodiment, but differs in the function of the composition and reproduction control unit 22.
- FIG. 34 is a diagram illustrating the configuration of the synthesis/reproduction control unit 22 according to the eleventh embodiment.
- the synthesis/reproduction control unit 22 in the eleventh embodiment includes a composite image information generation unit 64 and a composite image generation unit 66 in addition to the configuration of the synthesis/reproduction control unit 22 in the ninth embodiment.
- the composite image information generation unit 64 has a function of determining the synthesis start time with reference to the scenario data, and of passing to the composite image generation unit 66 the information for generating a composite image, which is a still image of the scenario video at the synthesis start time.
- the composite image generation unit 66 has a function of generating a composite image based on the composite image information passed from the composite image information generation unit 64.
- the composite image generation unit 66 stores the generated composite image in the scenario storage unit 12.
- the schedule generation unit 54 generates a schedule list describing the pause of the scenario image after displaying the composite image. Further, the schedule generation unit 54 describes, in the schedule list, a process of adjusting the position at which the reproduction of the paused scenario image is resumed to the start time of the synthesis.
- FIG. 35A is a diagram showing an example of scenario data
- FIG. 35B is a diagram showing a schedule list generated from the scenario data shown in FIG. 35A.
- the media whose playback is to be started (media B) is played back, and the imaging unit 18 is started.
- the composite image generated by the composite image generation unit 66 is displayed, and the schedule list then describes pausing the scenario video.
- the composite image previously stored in the scenario storage unit 12 is displayed, so that the composite image can be displayed smoothly.
- a process of adjusting the position at which the paused playback resumes to the synthesis start time (hereinafter referred to as the PAUSE position correction process) is described in the schedule list, as indicated by "SEEK:" in FIG. 35B.
- the SEEK process moves the playback position to a designated time; here, it moves the playback position back to the time at which the PAUSE process was issued.
- the PAUSE position correction compensates for the time lag between issuing the PAUSE process and playback actually pausing.
- the PAUSE position correction process itself is not shown on the screen; instead, the composite of the composite image and the captured image displayed in the synthesis area is shown. In this way, even if it takes time until playback of the scenario video pauses, playback can be resumed from the synthesis start time.
- the schedule generation unit 54 in the ninth or tenth embodiment may likewise generate a schedule list describing a process of adjusting the resume position of scenario video playback to the synthesis start time.
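The PAUSE position correction (SEEK) described above can be sketched as follows. The `Player` class, its method names, and the 0.4 s latency figure are invented for illustration; the patent only describes the behavior, not an API.

```python
class Player:
    """Toy playback model: pausing takes effect with some latency."""
    def __init__(self):
        self.position = 0.0

    def play_until(self, t):
        self.position = t

    def pause(self, requested_at, latency):
        # Playback actually stops a little after the PAUSE request,
        # so the position overshoots the requested time.
        self.position = requested_at + latency

    def seek(self, t):
        # The SEEK process moves the playback position to a designated time.
        self.position = t

p = Player()
synthesis_start = 10.0
p.play_until(synthesis_start)
p.pause(requested_at=synthesis_start, latency=0.4)  # overshoots to 10.4
p.seek(synthesis_start)  # PAUSE position correction: back to the synthesis start
```

After the SEEK, resuming playback continues exactly from the synthesis start time, regardless of how long the pause took to take effect.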
- FIGS. 36 to 38 are diagrams showing display examples on each screen when the imaging device 10 has two screens.
- one screen is on the left side, and the other screen is on the right side.
- a frame F indicating the outer shape of the combining area is displayed on the scenario video display screen in the same manner as described in the second embodiment.
- the scenario image is reproduced and displayed on one screen (left side), and the composite image of the scenario image and the current captured image at the appearance time of the composite area is displayed on the other screen (right side).
- a preview image of the synthesis can also be displayed. As a result, the composite preview image can be confirmed before the synthesis area appears, without pausing the reproduction of the scenario video as in the embodiments described above.
- the synthesized video is displayed on one screen (left side), and the image captured by the shooting unit 18 is displayed on the other screen (right side).
- the entire captured image can be viewed even during synthesis with the scenario video, so that an appropriate captured image can be obtained.
- when the imaging device 10 has two screens, one of the screens can be provided on the subject side.
- by displaying the composite video or the composite preview image on the subject-side screen, showing the subject its own captured image synthesized with the scenario video, the subject can grasp what kind of composite video is being created and can, for example, decide what pose to take.
- in the above embodiments, the range captured by the image capturing apparatus 10 is confirmed by looking at the captured image displayed on the screen.
- however, a configuration in which the range captured by the photographing device 10 can be confirmed by other methods may be adopted.
- for example, an irradiating unit may be provided in the imaging device 10 so that light is emitted over the range of its viewfinder.
- the photographing range can then be confirmed from the light cast on the subject.
- the photographing apparatus and the photographing method of the present invention have been described.
- a program for configuring the photographing apparatus or a program for executing the above photographing method is also included in the scope of the present invention.
- FIG. 39 is a diagram showing an example of the program of the present invention.
- the program 70 has a scenario data management module 72, a readout module 74, an imaging control module 76, a synthesis reproduction module 78, a display module 80, and a presentation module 82.
- the computer on which the program 70 is installed is a computer having a photographing unit, for example, a digital camera.
- the scenario data management module 72 is a module for causing a computer to manage scenario data.
- the computer has a storage unit similar to the scenario storage unit 12 in the imaging device 10 of the above embodiment.
- the read module 74 is a module for causing a computer to read scenario data.
- the shooting control module 76 is a module for causing a computer to shoot and obtain a shot image.
- by executing the photographing control module 76, the computer realizes the same function as the photographing condition control unit 28 in the photographing apparatus 10 of the above embodiment.
- the synthesizing and reproducing module 78 is a module for synthesizing the captured image and the scenario video while causing the computer to reproduce the scenario video.
- the display module 80 is a module for displaying a scenario image or a composite image on a computer.
- the presentation module 82 is a module for causing a computer to present synthesis information to the user of the computer. By executing the presentation module 82, the same function as the synthesis area presentation unit 30 in the imaging device 10 of the second embodiment described above can be realized.
- a photographing device similar to the photographing device 10 of the second embodiment can be realized, and a scenario video and a photographed image can be synthesized while photographing.
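As a loose sketch of how the modules of program 70 might cooperate, the following wires the roles named above into one pipeline. The class, the method names, and the string values are invented for illustration; the patent only names each module's role.

```python
class Program70:
    """Toy model of program 70: read scenario, capture, then composite."""
    def __init__(self, scenario):
        self.scenario = scenario              # scenario data management (module 72)

    def read(self):                           # readout module (74)
        return self.scenario

    def capture(self):                        # photographing control module (76)
        return "captured-image"               # stand-in for a real camera frame

    def composite(self, frame, shot):         # synthesis/reproduction module (78)
        return f"{frame}+{shot}"              # stand-in for pixel compositing

    def run(self):
        data = self.read()
        shot = self.capture()
        # The result would be shown by the display module (80).
        return self.composite(data["frame"], shot)

out = Program70({"frame": "scenario-frame"}).run()
```

A real implementation would replace the string stand-ins with frame buffers and the compositing sketched earlier.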
- as described above, the imaging apparatus of the present invention displays a composite video of a scenario video and a captured image on the display means, so that the user can shoot while viewing the composite video displayed on the display means and can adjust the position and orientation of the shooting device so that the video to be synthesized is shot properly. The invention is therefore useful as a device that synthesizes images obtained by shooting with a scenario video.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
- Studio Circuits (AREA)
- Television Signal Processing For Recording (AREA)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005510261A JPWO2005015899A1 (ja) | 2003-08-11 | 2004-05-27 | 撮影装置及び撮影方法 |
US10/543,359 US20060120623A1 (en) | 2003-08-11 | 2004-05-27 | Photographing system and photographing method |
EP04735101A EP1679885A1 (en) | 2003-08-11 | 2004-05-27 | Photographing system and photographing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-291100 | 2003-08-11 | ||
JP2003291100 | 2003-08-11 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005015899A1 true WO2005015899A1 (ja) | 2005-02-17 |
Family
ID=34131628
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/007675 WO2005015899A1 (ja) | 2003-08-11 | 2004-05-27 | 撮影装置及び撮影方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20060120623A1 (ja) |
EP (1) | EP1679885A1 (ja) |
JP (1) | JPWO2005015899A1 (ja) |
KR (1) | KR20060059862A (ja) |
CN (1) | CN1748410A (ja) |
WO (1) | WO2005015899A1 (ja) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090142001A1 (en) * | 2007-11-30 | 2009-06-04 | Sanyo Electric Co., Ltd. | Image composing apparatus |
US8666191B2 (en) | 2011-03-02 | 2014-03-04 | Canon Kabushiki Kaisha | Systems and methods for image capturing |
US20140111640A1 (en) * | 2012-10-19 | 2014-04-24 | Candid Color Systems, Inc. | Method of sending and processing event images |
CN103150761A (zh) * | 2013-04-02 | 2013-06-12 | 乐淘奇品网络技术(北京)有限公司 | 通过网页高速逼真3d渲染进行物品设计定制的方法 |
JP2015032952A (ja) * | 2013-08-01 | 2015-02-16 | ソニー株式会社 | 表示制御装置、表示制御方法および記録媒体 |
EP3345160A4 (en) | 2015-09-02 | 2019-06-05 | Thumbroll LLC | CAMERA SYSTEM AND METHOD OF ALIGNING IMAGES AND PRESENTING A SERIES OF ALIGNED IMAGES |
WO2017168949A1 (ja) | 2016-03-29 | 2017-10-05 | ソニー株式会社 | 情報処理装置、撮像装置、画像再生装置、および方法とプログラム |
CN109474787B (zh) * | 2018-12-28 | 2021-05-14 | 维沃移动通信有限公司 | 一种拍照方法、终端设备及存储介质 |
CN115883751A (zh) * | 2021-09-28 | 2023-03-31 | 北京字跳网络技术有限公司 | 一种视频生成方法、装置、设备及存储介质 |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06243023A (ja) * | 1993-02-12 | 1994-09-02 | Nec Corp | シナリオ編集装置 |
JPH08305720A (ja) * | 1995-05-01 | 1996-11-22 | Nippon Telegr & Teleph Corp <Ntt> | マルチメディアデータ管理方法及び装置 |
JPH11196362A (ja) * | 1997-01-10 | 1999-07-21 | Casio Comput Co Ltd | 撮像装置および撮像画像加工方法 |
WO1999067949A1 (fr) * | 1998-06-22 | 1999-12-29 | Fuji Photo Film Co., Ltd. | Imageur et procede |
JP2001223876A (ja) * | 2000-02-07 | 2001-08-17 | Fuji Photo Film Co Ltd | 画像出力装置 |
JP2001285784A (ja) * | 2000-01-26 | 2001-10-12 | Sony Corp | 情報処理装置および方法、並びにプログラム格納媒体 |
JP2002084457A (ja) * | 2000-09-07 | 2002-03-22 | Sony Corp | 画像処理装置及び方法、並びに記録媒体 |
JP2002368978A (ja) * | 2001-06-07 | 2002-12-20 | Mitsubishi Electric Corp | 映像信号編集装置 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US154158A (en) * | 1874-08-18 | Improvement in apparatus for drawing water from the ocean | ||
US5045876A (en) * | 1988-08-03 | 1991-09-03 | Canon Kabushiki Kaisha | Camera with flash device |
US5148477A (en) * | 1990-08-24 | 1992-09-15 | Board Of Regents Of The University Of Oklahoma | Method and apparatus for detecting and quantifying motion of a body part |
US6343987B2 (en) * | 1996-11-07 | 2002-02-05 | Kabushiki Kaisha Sega Enterprises | Image processing device, image processing method and recording medium |
US6621524B1 (en) * | 1997-01-10 | 2003-09-16 | Casio Computer Co., Ltd. | Image pickup apparatus and method for processing images obtained by means of same |
JP3736706B2 (ja) * | 1997-04-06 | 2006-01-18 | ソニー株式会社 | 画像表示装置及び方法 |
JP2001309212A (ja) * | 2000-04-21 | 2001-11-02 | Ricoh Co Ltd | デジタルカメラ |
WO2003025859A1 (fr) * | 2001-09-17 | 2003-03-27 | National Institute Of Advanced Industrial Science And Technology | Dispositif d'interface |
US7203338B2 (en) * | 2002-12-11 | 2007-04-10 | Nielsen Media Research, Inc. | Methods and apparatus to count people appearing in an image |
- 2004-05-27 JP JP2005510261A patent/JPWO2005015899A1/ja not_active Withdrawn
- 2004-05-27 KR KR1020057014735A patent/KR20060059862A/ko not_active Application Discontinuation
- 2004-05-27 EP EP04735101A patent/EP1679885A1/en not_active Withdrawn
- 2004-05-27 CN CNA2004800040323A patent/CN1748410A/zh active Pending
- 2004-05-27 US US10/543,359 patent/US20060120623A1/en not_active Abandoned
- 2004-05-27 WO PCT/JP2004/007675 patent/WO2005015899A1/ja not_active Application Discontinuation
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008538675A (ja) * | 2005-04-19 | 2008-10-30 | マイクロソフト コーポレーション | メディアタイムライン処理インフラストラクチャ |
US8736698B2 (en) | 2005-07-27 | 2014-05-27 | Sharp Kabushiki Kaisha | Video synthesizing apparatus and program |
EP2200290A3 (en) * | 2005-07-27 | 2010-10-06 | Sharp Kabushiki Kaisha | Video synthesis device and program |
US9100619B2 (en) | 2005-07-27 | 2015-08-04 | Sharp Kabushiki Kaisha | Video synthesizing apparatus and program |
EP2200290A2 (en) | 2005-07-27 | 2010-06-23 | Sharp Kabushiki Kaisha | Video synthesis device and program |
US8836803B2 (en) | 2005-07-27 | 2014-09-16 | Sharp Kabushiki Kaisha | Video synthesizing apparatus and program |
EP2200286A3 (en) * | 2005-07-27 | 2010-10-06 | Sharp Kabushiki Kaisha | Video synthesis device and program |
US8836804B2 (en) | 2005-07-27 | 2014-09-16 | Sharp Kabushiki Kaisha | Video synthesizing apparatus and program |
US8687121B2 (en) | 2005-07-27 | 2014-04-01 | Sharp Kabushiki Kaisha | Video synthesizing apparatus and program |
EP2200286A2 (en) * | 2005-07-27 | 2010-06-23 | Sharp Kabushiki Kaisha | Video synthesis device and program |
US8743228B2 (en) | 2005-07-27 | 2014-06-03 | Sharp Kabushiki Kaisha | Video synthesizing apparatus and program |
JP2007110660A (ja) * | 2005-09-13 | 2007-04-26 | Sony Corp | 情報処理装置および方法、並びにプログラム |
JP4600766B2 (ja) * | 2005-09-13 | 2010-12-15 | ソニー株式会社 | 情報処理装置および方法、並びにプログラム |
US7978957B2 (en) | 2005-09-13 | 2011-07-12 | Sony Corporation | Information processing apparatus and method, and program |
JP2009135724A (ja) * | 2007-11-30 | 2009-06-18 | Sanyo Electric Co Ltd | 画像合成装置 |
JP2011015279A (ja) * | 2009-07-03 | 2011-01-20 | Olympus Imaging Corp | デジタルカメラ及び動画像再生方法 |
JP2012094103A (ja) * | 2010-02-24 | 2012-05-17 | Dainippon Printing Co Ltd | 画像表示システム |
JP2017046162A (ja) * | 2015-08-26 | 2017-03-02 | 隆正 光信 | 合成動画作成システム、合成動画作成補助システム及び合成動画作成プログラム |
Also Published As
Publication number | Publication date |
---|---|
CN1748410A (zh) | 2006-03-15 |
EP1679885A1 (en) | 2006-07-12 |
KR20060059862A (ko) | 2006-06-02 |
US20060120623A1 (en) | 2006-06-08 |
JPWO2005015899A1 (ja) | 2007-10-04 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 2005510261; Country of ref document: JP |
AK | Designated states | Kind code of ref document: A1; Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
WWE | Wipo information: entry into national phase | Ref document number: 2004735101; Country of ref document: EP |
ENP | Entry into the national phase | Ref document number: 2006120623; Country of ref document: US; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 10543359; Country of ref document: US |
WWE | Wipo information: entry into national phase | Ref document number: 1020057014735; Country of ref document: KR |
WWE | Wipo information: entry into national phase | Ref document number: 20048040323; Country of ref document: CN |
WWP | Wipo information: published in national office | Ref document number: 1020057014735; Country of ref document: KR |
WWP | Wipo information: published in national office | Ref document number: 10543359; Country of ref document: US |
WWP | Wipo information: published in national office | Ref document number: 2004735101; Country of ref document: EP |
WWW | Wipo information: withdrawn in national office | Ref document number: 2004735101; Country of ref document: EP |