CN101617531A - Image processing apparatus, moving image playing device and processing method and program - Google Patents

Image processing apparatus, moving image playing device and processing method and program

Info

Publication number
CN101617531A
CN101617531A (Application CN200880005527A)
Authority
CN
China
Prior art keywords
image
transformation
captured image
composite image
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN200880005527A
Other languages
Chinese (zh)
Inventor
鹤见辰吾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Publication of CN101617531A publication Critical patent/CN101617531A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/144Movement detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/102Programmed access in sequence to addressed parts of tracks of operating record carriers
    • G11B27/105Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/32Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B27/322Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier used signal is digitally coded
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34Indicating arrangements 
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • G06T3/147
    • G06T3/16
    • G06T3/18
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4038Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/20Disc-shaped record carriers
    • G11B2220/25Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2537Optical discs
    • G11B2220/2541Blu-ray discs; Blue laser DVR discs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

Abstract

The object is to allow a viewer browsing a moving image to easily understand the details of a moving image shot with an image capture apparatus. When a moving image received from a moving-image input unit (110) is played, a camera-work detecting unit (120) detects the amount of motion of the image capture apparatus and, based on that amount of motion, calculates affine transformation parameters for transforming images on a frame-by-frame basis. Based on the calculated affine transformation parameters, an image transforming unit (160) applies an affine transformation to at least one of the history image held in an image memory (170) and the current captured image. An image combining unit (180) combines, frame by frame, the captured image and the history image, at least one of which has been transformed, and causes the image memory (170) to hold the resulting composite image. The composite image generated by the image combining unit (180) is displayed on a display unit (191).
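The abstract's frame-by-frame flow (accumulate per-frame affine parameters derived from camera motion, transform each frame into a common coordinate system, and write it onto a composite held in memory) can be sketched in Python. This is a minimal illustrative sketch, not the patent's implementation: the function name `composite`, grayscale frames, nearest-neighbor writing, and the canvas/offset layout are all assumptions.

```python
import numpy as np

def composite(frames, params, canvas_shape, offset=(0, 0)):
    """Paste each grayscale frame onto a shared "history image" canvas.

    params[i] is a 3x3 homogeneous affine matrix for frame i; composing the
    matrices maps every frame into the coordinate system of frame 0. The
    newest frame is written last, so it stays fully visible on top of the
    accumulated panorama-like composite.
    """
    canvas = np.zeros(canvas_shape, dtype=frames[0].dtype)
    cum = np.eye(3)
    for frame, A in zip(frames, params):
        cum = cum @ A                      # accumulate motion relative to frame 0
        h, w = frame.shape
        ys, xs = np.mgrid[0:h, 0:w]
        pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
        tx, ty, _ = cum @ pts              # transformed pixel positions
        tx = np.round(tx + offset[0]).astype(int)
        ty = np.round(ty + offset[1]).astype(int)
        ok = (tx >= 0) & (tx < canvas_shape[1]) & (ty >= 0) & (ty < canvas_shape[0])
        canvas[ty[ok], tx[ok]] = frame.ravel()[ok]
    return canvas
```

With two 2x2 frames and a pure 2-pixel rightward translation between them, the second frame lands beside the first on the canvas, as a real panning shot would.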

Description

Image processing apparatus, moving image playing device and processing method and program
Technical field
The present invention relates to an image processing apparatus, and more specifically to an image processing apparatus capable of playing a moving image, a moving-image playing device, a processing method therefor, and a program for causing a computer to execute the method.
Background Art
In recent years, digital video cameras have become widespread. For example, at events in which kindergarten children participate, parents and others commonly shoot the scene with a digital video camera. While such parents mainly film their own children, they also film scenes of the event itself as needed, so that how the event proceeded can be understood later.
Moving images shot in this way can be played at home, for example, on a moving-image playing device with a display. When a parent browses a moving image shot mainly of his or her own child, the played moving image consists mostly of that child. However, when a viewer keeps watching the same subject over a long playback time, the viewer may gradually lose interest in the moving image as playback continues. To keep the viewer interested, one could imagine displaying other images related to the currently displayed image, and so on.
For example, an image display method has been proposed in which a video index (a still image) is scroll-displayed as the moving image advances (see, for example, Japanese Unexamined Patent Application Publication No. 11-289517 (Fig. 7)).
With the above conventional technique, past, current, and future still images referenced to the current moving image are displayed as video indexes, so that past, current, and future still images can be browsed together with the currently displayed moving image. Thus, for example, when a parent is browsing a moving image shot at a kindergarten event, even while an image of the parent's child is being displayed as the current moving image, a scene of the event related to the current moving image can be displayed as a past or future still image. In this case, the parent can watch the child while also seeing scenes of the event, which helps in understanding the event and keeps the viewer interested.
However, with the above conventional technique, while an image of the child is displayed as the current moving image, scenes of the event may not be displayed, and still images substantially identical in content to the current moving image may be displayed as past or future still images. Keeping the viewer interested in the moving image is important here. If, while browsing a central figure or the like who is the subject of the captured images, the viewer could appropriately grasp what that central figure's surroundings look like, the viewer could easily understand the details of the moving image and would thus remain interested in it.
Accordingly, an object of the present invention is to make it easy, when browsing a moving image, to understand the details of a moving image shot with an image capture apparatus.
Summary of the invention
The present invention has been made to solve the problems described above. A first aspect of the present invention is an image processing apparatus, a processing method therefor, and a program for causing a computer to execute the method. The image processing apparatus is characterized by including: moving-image input means for receiving a captured moving image captured by an image capture apparatus; transformation-information calculating means for calculating transformation information relating to a first captured image included in the captured moving image and a second captured image positioned after the first captured image along the time axis of the captured moving image, based on the first and second captured images; image holding means for holding, as a history image, each image that includes the first captured image and is positioned before the second captured image along the time axis of the captured moving image; image transforming means for transforming at least one of the history image and the second captured image based on the calculated transformation information; operation accepting means for accepting a selection operation that selects the image to be transformed by the image transforming means; image combining means for combining the history image and the second captured image, at least one of which has been transformed by the image transforming means in response to the accepted selection operation, to generate a composite image; output means for outputting the composite image; and control means for causing the output means to sequentially output the composite image. This produces the operation of calculating transformation information from the first and second captured images, transforming at least one of the history image and the second captured image based on it, combining the two in accordance with the accepted selection operation to generate a composite image, and sequentially outputting the composite image.
Further, in the first aspect, the image combining means may generate the composite image by writing the second captured image over the history image, at least one of the two having been transformed by the image transforming means, thereby combining them, and may cause the image holding means to hold the composite image as a new history image. This produces the operation of overwriting the history image with the second captured image to generate the composite image, which is then held as a new history image.
Further, in the first aspect, the image combining means may combine the second captured image with the history image by writing over the history image a second captured image that has been transformed by the image transforming means and whose image quality has additionally been converted in accordance with the history image. This produces the operation of converting the quality of the transformed second captured image to match the history image, writing it over the history image, and combining the two.
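The quality conversion described above might look like the following sketch, where matching the history image's quality is modeled as reducing effective resolution by block averaging. The integer-factor block-averaging scheme and the function name are illustrative assumptions, not the patent's concrete conversion.

```python
import numpy as np

def match_history_quality(frame, factor):
    """Reduce a grayscale frame's effective resolution by an integer factor
    via block averaging, then expand back to the original size, so its
    quality matches a history image kept at lower resolution."""
    h, w = frame.shape
    low = frame[:h - h % factor, :w - w % factor].reshape(
        h // factor, factor, w // factor, factor).mean(axis=(1, 3))
    return np.repeat(np.repeat(low, factor, axis=0), factor, axis=1)
```

The unconverted (full-quality) frame is what the later paragraphs write onto the output image, while this degraded version is what enters the history memory.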
Further, in the first aspect, the image combining means may generate a new composite image by writing over the new history image the transformed second captured image as it was before the quality conversion, and the control means may cause the output means to sequentially output the new composite image.
Further, in the first aspect, the image processing apparatus may further include output-image extracting means for extracting, from the new history image held in the image holding means, an output image to be output by the output means. The image combining means may generate a new output image by writing over the output image the transformed second captured image as it was before the quality conversion, thereby combining it with the output image, and the control means may cause the output means to sequentially output the new output image.
Further, in the first aspect, the output-image extracting means may calculate the position and size at which the transformed, pre-quality-conversion second captured image is to be written on the output image, based on the position and size of the transformed second captured image in the holding area of the image holding means and on the position and size of the output image in that holding area; and the image combining means may write the second captured image onto the output image at the calculated position and size, thereby combining it with the output image.
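The position-and-size calculation above amounts to a change of coordinates from the holding area of the image memory to the pixels of the output image. A plain-Python sketch, under the assumption that rectangles are `(x, y, w, h)` tuples in memory coordinates (the names and rectangle convention are hypothetical):

```python
def frame_rect_on_output(frame_rect, out_area, out_px):
    """Map the transformed frame's rectangle, given in image-memory
    coordinates, into output-image pixel coordinates.

    frame_rect, out_area: (x, y, w, h) in memory coordinates.
    out_px: the output image's (width, height) in pixels.
    """
    fx, fy, fw, fh = frame_rect
    ax, ay, aw, ah = out_area
    sx, sy = out_px[0] / aw, out_px[1] / ah  # memory-to-output scale
    return ((fx - ax) * sx, (fy - ay) * sy, fw * sx, fh * sy)
```

The combining step then writes the full-quality frame into the returned rectangle of the output image.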
Further, in the first aspect, when at least part of the transformed second captured image included in the new history image protrudes from the output area (that is, the area from which the output image is extracted), the output-image extracting means may move the output area in the direction of the protruding image portion and then extract the output image from the new history image.
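The area-moving step above can be sketched as shifting the output area just far enough, in the direction of the protrusion, to contain the newly written frame. The `(x, y, w, h)` tuple layout and the clamp-to-memory-bounds policy are assumptions for illustration:

```python
def shift_output_area(area, frame_rect, memory_size):
    """Move the output area toward a protruding frame so the frame fits,
    clamped to the history-image memory bounds (width, height)."""
    ax, ay, aw, ah = area
    fx, fy, fw, fh = frame_rect
    # Negative term: frame sticks out on the left/top; positive: right/bottom.
    dx = min(0, fx - ax) + max(0, (fx + fw) - (ax + aw))
    dy = min(0, fy - ay) + max(0, (fy + fh) - (ay + ah))
    ax = max(0, min(ax + dx, memory_size[0] - aw))
    ay = max(0, min(ay + dy, memory_size[1] - ah))
    return (ax, ay, aw, ah)
```

A frame protruding 3 pixels past the right edge shifts the area 3 pixels right; a frame already inside leaves the area unchanged.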
Further, in the first aspect, the image quality may be at least one of resolution and compression ratio, so that the transformed second captured image is converted in resolution and/or compression ratio in accordance with the history image before being written over it.
Further, in the first aspect, the image processing apparatus may further include output-image extracting means for extracting, from the new history image held in the image holding means, the image contained in an area calculated from the calculated transformation information, as the output image to be output by the output means. The image combining means may generate a new output image by writing over the output image the second captured image as it was before being transformed by the image transforming means, thereby combining the two, and the control means may cause the output means to sequentially output the new output image.
Further, in the first aspect, the output-image extracting means may transform the output image, based on the calculated transformation information, in the direction opposite to that in which the image transforming means transforms the second captured image; and the image combining means may generate a new output image by writing the untransformed second captured image over the transformed output image, thereby combining the two.
Further, in the first aspect, the image transforming means may transform the history image, based on the calculated transformation information, in the direction opposite to that in which the second captured image is transformed.
Further, in the first aspect, the transformation information may include elements relating to enlargement/reduction (zoom), movement, and rotation; and the image transforming means may transform the second captured image using the movement- and rotation-related elements included in the calculated transformation information, while transforming the history image using the enlargement/reduction-related elements.
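The element split above — movement and rotation applied to the new frame, the zoom element applied (in the opposite direction, per the adjacent paragraphs) to the history image so the new frame keeps its on-screen size — might be sketched as follows. The `(scale, angle, shift)` parameterisation of the transformation information is an illustrative assumption:

```python
import numpy as np

def split_transform(scale, theta, tx, ty):
    """Split a similarity-type transform into the matrix applied to the new
    frame (movement + rotation) and the matrix applied to the history image
    (the zoom element, inverted)."""
    c, s = np.cos(theta), np.sin(theta)
    frame_m = np.array([[c, -s, tx],
                        [s,  c, ty],
                        [0., 0., 1.]])               # move + rotate only
    hist_m = np.array([[1.0 / scale, 0., 0.],
                       [0., 1.0 / scale, 0.],
                       [0., 0., 1.]])                # inverse zoom for history
    return frame_m, hist_m
```

For a 2x zoom-in with no pan or rotation, the new frame is left untouched while the history image is shrunk to half size around it.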
Further, in the first aspect, the image transforming means may transform the history image in the direction opposite to that in which the second captured image is transformed.
Further, in the first aspect, the transformation-information calculating means may sequentially calculate the transformation information for each frame constituting the captured moving image; the image transforming means may transform at least one of the history image and the second captured image for each frame; the image combining means may sequentially combine the two for each frame; and the control means may cause the composite image to be output sequentially for each frame.
Further, in the first aspect, the first captured image and the second captured image may be images corresponding to two consecutive frames included in the captured moving image, so that transformation-information calculation, image combination, and composite-image output are performed on consecutive frames.
Further, in the first aspect, the transformation information may be motion information of the image capture apparatus at the time when the first captured image or the second captured image was captured; and the transformation-information calculating means may calculate the transformation information by comparing the first captured image with the second captured image.
Further, in the first aspect, the transformation information may be motion information relating to the relative amount of motion between the image capture apparatus and the photographed subject at the time when the first captured image or the second captured image was captured; and the transformation-information calculating means may calculate it by comparing the two captured images.
Further, in the first aspect, the transformation-information calculating means may include: feature-point extracting means for extracting feature points in the first and second captured images based on the pixels constituting those images; motion-amount calculating means for calculating a motion amount relating to the first and second captured images based on the extracted feature points; and transformation-parameter calculating means for calculating the transformation information by computing predetermined transformation parameters from the calculated motion amount.
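The final stage of the three-step calculation above (feature points → per-feature motion → transformation parameters) can be illustrated with a least-squares fit of the six affine parameters to matched feature points. This is a generic stand-in, not the patent's specific parameter calculation; robust variants would additionally reject outlier matches.

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares estimate of the 3x3 affine matrix mapping feature
    points src (N, 2) onto their matches dst (N, 2).

    Requires N >= 3 non-collinear points. Solves the stacked linear system
    x' = a*x + b*y + c, y' = d*x + e*y + f for p = (a..f).
    """
    n = src.shape[0]
    A = np.zeros((2 * n, 6))
    A[0::2, 0:2] = src        # rows for the x' equations
    A[0::2, 2] = 1.0
    A[1::2, 3:5] = src        # rows for the y' equations
    A[1::2, 5] = 1.0
    p, *_ = np.linalg.lstsq(A, np.asarray(dst, float).ravel(), rcond=None)
    return np.array([[p[0], p[1], p[2]],
                     [p[3], p[4], p[5]],
                     [0.0, 0.0, 1.0]])
```

Feeding in points displaced by a pure translation recovers exactly that translation in the matrix's last column.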
Further, in the first aspect, the feature-point extracting means may be constituted by a multi-core processor, which extracts feature quantities in the first and second captured images by processing the pixels constituting those images in parallel using SIMD operations.
Further, in the first aspect, the motion-amount calculating means may be constituted by a multi-core processor, which calculates the motion amount relating to the first and second captured images by processing the extracted feature points in parallel using SIMD operations.
Further, in the first aspect, the image processing apparatus may further include compressing means for compressing captured images. In this case, the history image at the time the composite image is output may be a compressed image, while the second captured image may be an uncompressed image or an image captured at a higher resolution than the compressed history image.
A second aspect of the present invention is an image processing apparatus, a processing method therefor, and a program for causing a computer to execute the method. The image processing apparatus is characterized by including: moving-image acquiring means for acquiring a captured moving image captured by an image capture apparatus, with which transformation information for transforming at least one of a first captured image and a second captured image included in the captured moving image is recorded in association; transformation-information extracting means for extracting the transformation information from the acquired captured moving image; image holding means for holding, as a history image, each image that includes the first captured image and is positioned before the second captured image along the time axis of the captured moving image; image transforming means for transforming at least one of the history image and the second captured image based on the extracted transformation information; operation accepting means for accepting a selection operation that selects the image to be transformed by the image transforming means; image combining means for combining the history image and the second captured image, at least one of which has been transformed by the image transforming means in response to the accepted selection operation, to generate a composite image; output means for outputting the composite image; and control means for causing the output means to sequentially output the composite image. This produces the operation of transforming at least one of the history image and the second captured image based on the extracted transformation information, combining the two in accordance with the accepted selection operation to generate a composite image, and sequentially outputting the composite image.
A third aspect of the present invention is an image processing apparatus, a processing method therefor, and a program for causing a computer to execute the method. The image processing apparatus is characterized by including: transformation-information storing means for storing, in association with each frame constituting a captured moving image captured by an image capture apparatus, transformation information for transforming at least one of a first captured image and a second captured image included in the captured moving image; moving-image acquiring means for acquiring the captured moving image; transformation-information acquiring means for acquiring the transformation information stored in the transformation-information storing means in association with the acquired captured moving image; image holding means for holding, as a history image, each image that includes the first captured image and is positioned before the second captured image along the time axis of the captured moving image; image transforming means for transforming at least one of the history image and the second captured image based on the acquired transformation information; operation accepting means for accepting a selection operation that selects the image to be transformed by the image transforming means; image combining means for combining the history image and the second captured image, at least one of which has been transformed by the image transforming means in response to the accepted selection operation, to generate a composite image; output means for outputting the composite image; and control means for causing the output means to sequentially output the composite image. This produces the operation of transforming at least one of the history image and the second captured image based on the acquired transformation information, combining the two in accordance with the accepted selection operation to generate a composite image, and sequentially outputting the composite image.
Further, a fourth aspect of the present invention is an image processing apparatus, a processing method therefor, and a program that causes a computer to execute the method. The image processing apparatus is characterized by including: moving-image input means for receiving a captured moving image captured by an image capture apparatus; transformation-information calculation means for calculating, for each of the captured images constituting the captured moving image, transformation information for transforming another captured image with reference to at least one of the captured images constituting the captured moving image; image transformation means for transforming the captured image corresponding to the transformation information on the basis of the calculated transformation information, with at least one of the captured images constituting the captured moving image used as a reference image; image holding means for holding the transformed captured image; and control means for causing output means to sequentially output the captured image held last in the image holding means. This brings about the following operation: for each captured image, transformation information for transforming another captured image with reference to at least one of the captured images constituting the captured moving image is calculated; the captured image is transformed on the basis of the transformation information, with at least one of the captured images used as a reference image; the transformed captured image is held; and the captured image held last is sequentially output.
Further, a fifth aspect of the present invention is a moving-image processing apparatus, a processing method therefor, and a program that causes a computer to execute the method. The moving-image processing apparatus is characterized by including: moving-image input means for receiving a captured moving image captured by an image capture apparatus; transformation-information calculation means for calculating transformation information relating to a first captured image included in the captured moving image and a second captured image positioned after the first captured image along the time axis of the captured moving image, on the basis of the first and second captured images; image holding means for holding, as history images, the images that include the first captured image and precede the second captured image along the time axis of the captured moving image; image transformation means for transforming at least one of the history image and the second captured image on the basis of the calculated transformation information; operation acceptance means for accepting a selection operation that selects the image to be transformed by the image transformation means; image combining means for combining, in response to the accepted selection operation, the second captured image with the history image, at least one of which has been transformed by the image transformation means, to generate a composite image; display means for displaying the composite image; and control means for causing the display means to sequentially display the composite image. This brings about the following operation: the transformation information is calculated on the basis of the first captured image and the second captured image; at least one of the history image and the second captured image is transformed on the basis of the calculated transformation information; in response to the accepted selection operation, the second captured image and the history image, at least one of which has been transformed, are combined to generate a composite image; and the composite image is sequentially displayed.
Further, a sixth aspect of the present invention is an image processing apparatus, a processing method therefor, and a program that causes a computer to execute the method. The image processing apparatus is characterized by including: moving-image input means for receiving a captured moving image captured by an image capture apparatus; captured-moving-image storage means for storing the captured moving image; transformation-information calculation means for calculating, for each of the frames constituting the captured moving image, transformation information for transforming another captured image with reference to at least one of the captured images constituting the captured moving image; and recording control means for recording the calculated transformation information in the captured-moving-image storage means in association with each of the frames. This brings about the following operation: for each frame, transformation information for transforming another captured image with reference to at least one of the captured images constituting the captured moving image is calculated; and the calculated transformation information is recorded in association with each of the frames.
Further, a seventh aspect of the present invention is an image processing apparatus, a processing method therefor, and a program that causes a computer to execute the method. The image processing apparatus is characterized by including: moving-image input means for receiving a moving image captured by an image capture apparatus as a captured moving image; metadata storage means for storing metadata relating to the captured moving image; transformation-information calculation means for calculating, for each of the frames constituting the captured moving image, transformation information for transforming another captured image with reference to at least one of the captured images constituting the captured moving image; and recording control means for recording the calculated transformation information in the metadata storage means as metadata, in association with the captured moving image and the frames. This brings about the following operation: for each frame, transformation information for transforming another captured image with reference to at least one of the captured images constituting the captured moving image is calculated; and the calculated transformation information is recorded as metadata in association with the captured moving image and the frames.
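The per-frame association of transformation information with a captured moving image described in this aspect can be sketched as follows; the dict-based store and the field names ("frame", "affine") are illustrative assumptions, not the patent's actual metadata format.

```python
def record_metadata(store, movie_id, frame_params):
    """Associate each frame's transformation parameters with a captured
    moving image as metadata. The storage layout here is an assumed,
    simplified stand-in for the patent's metadata storage means."""
    store[movie_id] = [
        {"frame": i, "affine": params}
        for i, params in enumerate(frame_params)
    ]
    return store
```

A caller would then look up `store[movie_id][n]["affine"]` to transform frame n at playback time.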
Further, in the seventh aspect, the metadata may include at least position information and posture information described in the coordinate system of the image capture apparatus. This brings about the operation of recording metadata that includes at least position information and posture information described in the coordinate system of the image capture apparatus.
According to the present invention, the following principal advantage can be obtained: when browsing a moving image, the details of a moving image shot by an image capture apparatus can be easily grasped.
Brief Description of the Drawings
[Fig. 1] Fig. 1 is a block diagram showing a functional configuration example of an image processing apparatus 100 according to an embodiment of the present invention.
[Fig. 2] Fig. 2 includes diagrams showing an example of the images corresponding to the frames included in a moving image.
[Fig. 3] Fig. 3 includes diagrams showing simplified images obtained by omitting the background and the like from the images corresponding to the frames included in the moving image.
[Fig. 4] Fig. 4 is a flowchart showing the processing procedure of an affine-transformation-parameter detection process performed by the image processing apparatus 100 according to the embodiment of the present invention.
[Fig. 5] Fig. 5 includes diagrams showing an example of transition in a moving image captured by a camera.
[Fig. 6] Fig. 6 includes diagrams in which, for each of the images shown in Fig. 5, the image corresponding to the immediately preceding frame is indicated by broken lines and exemplary detected optical flows are shown.
[Fig. 7] Fig. 7 includes diagrams showing display examples in the case of playing a moving image including the images 401 to 403 shown in Fig. 5.
[Fig. 8] Fig. 8 includes diagrams showing display examples in the case of playing a moving image including the images 401 to 403 shown in Fig. 5.
[Fig. 9] Fig. 9 includes diagrams showing an example of transition in a moving image captured by a camera.
[Fig. 10] Fig. 10 includes diagrams in which, for each of the images shown in Fig. 9, the image corresponding to the immediately preceding frame is indicated by broken lines and exemplary detected optical flows are shown.
[Fig. 11] Fig. 11 includes diagrams showing display examples in the case of playing a moving image including the images 421 to 423 shown in Fig. 9.
[Fig. 12] Fig. 12 includes diagrams showing display examples in the case of playing a moving image including the images 421 to 423 shown in Fig. 9.
[Fig. 13] Fig. 13 includes diagrams showing an example of transition in a moving image captured by a camera.
[Fig. 14] Fig. 14 includes diagrams in which, for each of the images shown in Fig. 13, the image corresponding to the immediately preceding frame is indicated by broken lines and exemplary detected optical flows are shown.
[Fig. 15] Fig. 15 includes diagrams showing display examples in the case of playing a moving image including the images 441 to 443 shown in Fig. 13.
[Fig. 16] Fig. 16 includes diagrams showing display examples in the case of playing a moving image including the images 441 to 443 shown in Fig. 13.
[Fig. 17] Fig. 17 includes diagrams showing an example of transition in a moving image captured by a camera.
[Fig. 18] Fig. 18 includes diagrams showing an example of transition in a moving image captured by a camera.
[Fig. 19] Fig. 19 includes diagrams showing an example of transition in a moving image captured by a camera.
[Fig. 20] Fig. 20 includes diagrams showing an example of transition in a moving image captured by a camera.
[Fig. 21] Fig. 21 includes diagrams showing an example of transition in a moving image captured by a camera.
[Fig. 22] Fig. 22 includes diagrams showing an example of transition in a moving image captured by a camera.
[Fig. 23] Fig. 23 includes diagrams showing an example of transition in a moving image captured by a camera.
[Fig. 24] Fig. 24 includes diagrams showing an example of transition in a moving image captured by a camera.
[Fig. 25] Fig. 25 is a flowchart showing the processing procedure of a moving-image playback process performed by the image processing apparatus 100 according to the embodiment of the present invention.
[Fig. 26] Fig. 26 is a flowchart showing the processing procedure of a moving-image playback process performed by the image processing apparatus 100 according to the embodiment of the present invention.
[Fig. 27] Fig. 27 is a flowchart showing the processing procedure of a moving-image playback process performed by the image processing apparatus 100 according to the embodiment of the present invention.
[Fig. 28] Fig. 28 includes diagrams showing an example of transition in a moving image captured by a camera.
[Fig. 29] Fig. 29 includes diagrams showing an example of transition in a moving image captured by a camera.
[Fig. 30] Fig. 30 includes diagrams showing an example of transition in a moving image captured by a camera.
[Fig. 31] Fig. 31 includes diagrams showing an example of transition in a moving image captured by a camera.
[Fig. 32] Fig. 32 is a block diagram showing a functional configuration example of an image processing apparatus 650 according to an embodiment of the present invention.
[Fig. 33] Fig. 33 includes diagrams schematically showing the files recorded in a moving-image storage unit 660 and a metadata storage unit 670 according to the embodiment of the present invention.
[Fig. 34] Fig. 34 is a block diagram showing a functional configuration example of an image processing apparatus 680 according to an embodiment of the present invention.
[Fig. 35] Fig. 35 includes diagrams schematically showing the relationship between the frames of a moving-image file stored in the moving-image storage unit 660 according to the embodiment of the present invention and the display area.
[Fig. 36] Fig. 36 includes diagrams schematically showing a display-area moving method used in the case where the current image protrudes from the display area.
[Fig. 37] Fig. 37 includes diagrams showing an example of transition in the case where the display area is moved using the moving method shown in Fig. 36.
[Fig. 38] Fig. 38 includes diagrams schematically showing the relationship between the frames of a moving-image file stored in the moving-image storage unit 660 according to the embodiment of the present invention and the display area.
[Fig. 39] Fig. 39 includes diagrams showing an overview of an enlargement method used in the case of magnifying and displaying a moving image displayed on a display unit 689 when a display mode in which the current image is fixed on the display unit 689 is specified.
[Fig. 40] Fig. 40 includes diagrams schematically showing the flow of the frames of a moving-image file stored in the moving-image storage unit 660 according to the embodiment of the present invention.
[Fig. 41] Fig. 41 includes diagrams schematically showing the flow of the frames of a moving-image file stored in the moving-image storage unit 660 according to the embodiment of the present invention.
[Fig. 42] Fig. 42 includes a diagram showing a display example (image 750) in the case of playing a moving image captured by a camera, and a diagram showing an image 754 in the state before an affine transformation is performed on the current image 752 in the image 750.
[Fig. 43] Fig. 43 includes a diagram showing an image 755 in the case where the image region surrounded by the border 753 shown in Fig. 42 is enlarged and displayed, and a diagram showing an image 757 held in a display memory 686 in the state where the affine-transformed current image is stored in an image memory 684.
[Fig. 44] Fig. 44 is a flowchart showing the processing procedure of a moving-image playback process performed by the image processing apparatus 680 according to the embodiment of the present invention.
[Fig. 45] Fig. 45 is a flowchart showing the processing procedure of a moving-image playback process performed by the image processing apparatus 680 according to the embodiment of the present invention.
[Fig. 46] Fig. 46 is a diagram showing a configuration example of a multi-core processor 800 according to an embodiment of the present invention.
[Fig. 47] Fig. 47 is a diagram showing a configuration example of a control processor core 801 according to the embodiment of the present invention.
[Fig. 48] Fig. 48 is a diagram showing a configuration example of an arithmetic processor core 811 (#1) according to the embodiment of the present invention.
[Fig. 49] Fig. 49 is a diagram schematically showing a computation method of the multi-core processor 800 according to the embodiment of the present invention.
[Fig. 50] Fig. 50 is a diagram schematically showing the programs and data flows in the case where computations are performed by the multi-core processor 800 according to the embodiment of the present invention.
[Fig. 51] Fig. 51 includes a diagram schematically showing an overview of a scheme in which a plurality of data items are processed with respective instructions, and a diagram showing an overview of a SIMD (Single Instruction, Multiple Data) operation in which a plurality of data items are processed with a single instruction.
[Fig. 52] Fig. 52 is a diagram showing a configuration example of programs executed by the control processor core 801 or the arithmetic processor core 811 (#1) according to the embodiment of the present invention.
[Fig. 53] Fig. 53 is a diagram schematically showing the processing flow and data structure in the case where a filtering process is performed, using a Sobel filter 830, on image data stored in a main memory 781 according to the embodiment of the present invention.
[Fig. 54] Fig. 54 is a diagram schematically showing the data flow in the case where a SIMD operation is performed, using the Sobel filter 830, on image data stored in the main memory 781 according to the embodiment of the present invention.
[Fig. 55] Fig. 55 is a diagram schematically showing a vector generation method of generating nine vectors from image data stored in a first buffer 831 in the case where a filtering process is performed using the Sobel filter 830 according to the embodiment of the present invention.
[Fig. 56] Fig. 56 is a diagram schematically showing a vector operation method of performing vector operations on vector data items 841 to 849 using SIMD instructions in the case where a filtering process is performed using the Sobel filter 830 according to the embodiment of the present invention.
[Fig. 57] Fig. 57 is a diagram schematically showing the flow of a camera-work-parameter calculation process performed in time series according to the embodiment of the present invention.
[Fig. 58] Fig. 58 includes a diagram schematically showing a Blu-ray Disc 880 as an example of a recording medium, a diagram schematically showing data items 881 to 884 recorded on the Blu-ray Disc 880, and a diagram schematically showing the internal configuration of a Blu-ray player 890 capable of playing the Blu-ray Disc 880.
Embodiments
Embodiments of the present invention will now be described in detail with reference to the drawings.
Fig. 1 is a block diagram showing a functional configuration example of an image processing apparatus 100 according to an embodiment of the present invention. The image processing apparatus 100 includes a moving-image input unit 110, a camera work detection unit 120, a recording control unit 130, a moving-image storage unit 200, a moving-image acquisition unit 140, a camera-work-parameter extraction unit 150, an image transformation unit 160, an image memory 170, an image combining unit 180, a display control unit 190, a display unit 191, and an operation acceptance unit 195. The image processing apparatus 100 can be realized by, for example, a personal computer capable of extracting feature quantities from a moving image shot by an image capture apparatus such as a digital video camera by performing video image analysis, and applying various types of image processing using the extracted feature quantities.
The moving-image input unit 110 is a moving-image input unit that receives a moving image captured by an image capture apparatus such as a digital video camera (hereinafter simply referred to as a "camera") and outputs the received moving image to the camera work detection unit 120.
The camera work detection unit 120 is configured to detect camera motion information (camera work) at the time of image shooting by analyzing the moving image output from the moving-image input unit 110. The camera work detection unit 120 includes a feature point extraction unit 121, an optical flow calculation unit 122, and a camera-work-parameter calculation unit 123. That is, the camera work detection unit 120 extracts feature points from each of the images constituting the moving image, further extracts optical flows (motion vectors) corresponding to the feature points, selects the feature points that exhibit a dominant motion by analyzing the optical flows corresponding to the extracted feature points, and estimates the camera motion on the basis of the optical flows corresponding to the feature points exhibiting the dominant motion. Here, the dominant motion refers to a regular motion indicated by a relatively large number of optical flows among the optical flows corresponding to the plurality of feature points.
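As a rough illustration of selecting the dominant motion from per-feature-point optical flows, the sketch below lets each flow vector vote for every flow within a tolerance and returns the mean of the largest group. The voting scheme and tolerance value are assumptions for illustration, not the estimator actually used in the apparatus.

```python
def dominant_motion(flows, tol=1.0):
    """Pick the motion supported by the most optical flows.

    flows: list of (dx, dy) motion vectors, one per feature point.
    Each candidate gathers the flows within `tol` of itself; the
    dominant motion is the mean of the largest such group, so a few
    outliers (e.g. a moving subject) are voted down by the majority."""
    best_support = []
    for cand in flows:
        support = [f for f in flows
                   if abs(f[0] - cand[0]) <= tol and abs(f[1] - cand[1]) <= tol]
        if len(support) > len(best_support):
            best_support = support
    n = len(best_support)
    return (sum(f[0] for f in best_support) / n,
            sum(f[1] for f in best_support) / n)
```

With most features moving uniformly (a panning background) and a few moving differently (a subject), the majority motion wins.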
The feature point extraction unit 121 is configured to extract feature points from the images corresponding to the frames constituting the moving image output from the moving-image input unit 110, and to output the extracted feature points to the optical flow calculation unit 122. Here, for the head frame among the frames constituting the moving image output from the moving-image input unit 110, the feature point extraction unit 121 extracts feature points from the entire image; for the frames other than the head frame, it extracts feature points from the region newly captured in comparison with the image corresponding to the immediately preceding frame. Note that, for example, points having steep edge gradients in the vertical or horizontal direction (generally called "corner points"; hereinafter referred to as "corner points") can be extracted as feature points. Corner points are feature points that are robust for calculating optical flows, and can be obtained using edge detection. Note that the extraction of corner points will be described in detail with reference to Fig. 2 and Fig. 3. Further, in this example, the feature point extraction unit 121 extracts feature points from the entire image for the head frame and, for the frames other than the head frame, from the region newly captured in comparison with the immediately preceding image; however, depending on the processing capability and the like, the feature point extraction unit 121 may extract feature points from the entire image for each of the frames other than the head frame as well.
The optical flow calculation unit 122 is configured to calculate optical flows corresponding to the feature points output from the feature point extraction unit 121, and to output the calculated optical flows to the camera-work-parameter calculation unit 123. Specifically, by comparing the images corresponding to two consecutive frames included in the moving image output from the moving-image input unit 110 (the current frame and the immediately preceding frame), the optical flow calculation unit 122 obtains, as the optical flows of the current frame, the optical flows corresponding to the feature points in the image corresponding to the immediately preceding frame. Optical flows are obtained for each of the frames constituting the moving image. Note that detection methods such as the gradient method and the block matching method can be used as the method for detecting the optical flows. Note that the calculation of the optical flows will be described in detail with reference to Fig. 2 and Fig. 3.
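The block matching mentioned as one flow-detection method can be sketched minimally as follows: the displacement of a block around a feature point is found by minimizing the sum of absolute differences (SAD) over a small search window. Block size, search radius, and the SAD criterion are illustrative choices, not parameters specified by the text.

```python
def block_match(prev, curr, px, py, bs=3, search=2):
    """Toy block-matching optical flow for one feature point.

    prev, curr: 2-D lists of grayscale values (consecutive frames).
    Returns the (dx, dy) displacement of the bs x bs block centered at
    (px, py) in `prev` that best matches `curr`, by minimal SAD."""
    half = bs // 2

    def sad(dx, dy):
        total = 0
        for y in range(-half, half + 1):
            for x in range(-half, half + 1):
                total += abs(prev[py + y][px + x] - curr[py + y + dy][px + x + dx])
        return total

    candidates = [(dx, dy) for dy in range(-search, search + 1)
                           for dx in range(-search, search + 1)]
    return min(candidates, key=lambda d: sad(*d))
```

Running this for every feature point yields the set of motion vectors that the camera-work-parameter calculation consumes.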
The camera-work-parameter calculation unit 123 is configured to perform a camera-work-parameter calculation process of calculating camera work parameters using the optical flows corresponding to the feature points output from the optical flow calculation unit 122, and to output the calculated camera work parameters to the recording control unit 130. Here, in the embodiment of the present invention, the images constituting the moving image to be played are transformed and displayed in accordance with the camera motion. In order to perform this image transformation, the camera motion is extracted using the optical flows calculated by the optical flow calculation unit 122, and the camera work parameters (transformation parameters) are calculated on the basis of the extracted motion. Note that, in the embodiment of the present invention, an example will be described in which an affine transformation is used as the image transformation method for transforming the images constituting the moving image to be played. Further, an example will be described in which affine transformation parameters corresponding to the inverse matrix of the matrix of affine transformation parameters calculated on the basis of the optical flows are used as the camera work parameters. That is, in the embodiment of the present invention, the affine transformation parameters used as the transformation information are defined not as the affine matrix expressing the motion of the feature points between consecutive images, but as the affine transformation parameters corresponding to the affine matrix indicating, when one of the consecutive images is used as a reference image, the position to which the subsequent image moves relative to the reference image. In addition, although an example in which affine transformation parameters are used as the camera work parameters will be described, other image transformation methods such as projective transformation may also be used. Note that an affine transformation parameter can be obtained by a calculation using three vectors, while a projective transformation parameter can be obtained by a calculation using four vectors. Here, the camera work parameters are transformation information for transforming another captured image with reference to at least one of the captured images constituting the captured moving image, and include at least position information and posture information described in the camera coordinate system. That is, the camera work parameters include information on the position and posture of the camera at the time the photographer shoots the images. In addition, on the basis of the affine transformation parameters obtained by the camera-work-parameter calculation unit 123, the camera motion in response to the photographer's input operations (for example, zoom in, zoom out, pan, tilt, and rotation) can be estimated. Note that the calculation of the affine transformation parameters will be described in detail with reference to Fig. 2 and Fig. 3.
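The "calculation using three vectors" can be illustrated concretely: three point correspondences determine the six affine parameters, since each pair gives two linear equations. The sketch below solves the two resulting 3x3 systems with Cramer's rule; it is a minimal stand-in, assuming non-collinear points, not the apparatus's actual solver.

```python
def det3(m):
    # Determinant of a 3x3 matrix given as nested lists.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve3(m, r):
    # Cramer's rule for the 3x3 linear system m @ x = r.
    d = det3(m)
    out = []
    for col in range(3):
        mc = [row[:] for row in m]
        for i in range(3):
            mc[i][col] = r[i]
        out.append(det3(mc) / d)
    return out

def affine_from_3_points(src, dst):
    """Affine parameters (a, b, c, d, e, f) with u = a*x + b*y + c and
    v = d*x + e*y + f, mapping the three src points onto the three
    dst points (three feature-point motion vectors)."""
    m = [[x, y, 1.0] for (x, y) in src]
    a, b, c = solve3(m, [u for (u, _) in dst])
    d, e, f = solve3(m, [v for (_, v) in dst])
    return a, b, c, d, e, f
```

A projective transformation would need four correspondences, matching the "four vectors" remark; inverting the resulting matrix yields the inverse-direction parameters the text uses as camera work parameters.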
The recording control unit 130 is configured to record the moving image output from the moving-image input unit 110 and the affine transformation parameters output from the camera-work-parameter calculation unit 123 in the moving-image storage unit 200 as a moving-image file, with the corresponding frames and affine transformation parameters associated with each other.
The moving-image storage unit 200 is configured to store a moving-image file in which the mutually corresponding frames and affine transformation parameters are associated with each other. In addition, the moving-image storage unit 200 supplies the moving-image file to the moving-image acquisition unit 140 in response to a request from the moving-image acquisition unit 140.
The moving-image acquisition unit 140 is configured to acquire the moving-image file stored in the moving-image storage unit 200 in response to an operation input related to moving-image acquisition from the operation acceptance unit 195, and to output the acquired moving-image file to the camera-work-parameter extraction unit 150, the image transformation unit 160, and the image combining unit 180.
The camera-work-parameter extraction unit 150 is configured to extract, frame by frame, the affine transformation parameters recorded in association with the moving-image file output from the moving-image acquisition unit 140, and to output the extracted affine transformation parameters to the image transformation unit 160.
The image transformation unit 160 is configured to apply, frame by frame, an affine transformation to the images constituting the moving image of the moving-image file output from the moving-image acquisition unit 140, or to the images held in the image memory 170, using the affine transformation parameters output from the camera-work-parameter extraction unit 150, and to output the affine-transformed images to the image combining unit 180. Specifically, in the case of playing and displaying a moving image while fixing the composite image generated by combining the images corresponding to the frames preceding the current frame, the image transformation unit 160 applies an affine transformation to the image corresponding to the current frame output from the moving-image acquisition unit 140, using the affine transformation parameters output from the camera-work-parameter extraction unit 150. Conversely, in the case of playing and displaying a moving image while fixing the image corresponding to the current frame, the image transformation unit 160 applies an affine transformation, in the direction opposite to that of the affine transformation parameters, to the composite image held in the image memory 170 that was generated by combining the images corresponding to the preceding frames, using the affine transformation parameters output from the camera-work-parameter extraction unit 150. In addition, in the case of playing and displaying a moving image while fixing the display magnification of the image corresponding to the current frame, the image transformation unit 160 separates the affine transformation parameters output from the camera-work-parameter extraction unit 150 into an enlargement/reduction element (zoom component) and the elements other than enlargement/reduction (elements related to movement or rotation). The image transformation unit 160 then applies, using the enlargement/reduction element, an affine transformation in the direction opposite to that of the affine transformation parameters to the composite image corresponding to the preceding frames held in the image memory 170, and applies, using the movement or rotation elements, an affine transformation to the image corresponding to the current frame output from the moving-image acquisition unit 140. These transformations are performed in accordance with an operation input related to a playback instruction from the operation acceptance unit 195. Note that these image transformations will be described in detail with reference to Fig. 5 to Fig. 16 and the like.
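The separation into a zoom component and movement/rotation elements can be illustrated under the simplifying assumption that the camera work is a similarity transform (uniform scale plus rotation plus translation); a general affine transform with shear would need a fuller decomposition.

```python
import math

def split_affine(a, b, c, d, e, f):
    """Split an affine transform (u = a*x + b*y + c, v = d*x + e*y + f)
    into its components, assuming a similarity transform:
      scale -- the zoom (enlargement/reduction) element,
      rot   -- the rotation element in radians,
      trans -- the translation (movement) element."""
    scale = math.sqrt(a * e - b * d)   # uniform scale = sqrt(det)
    rot = math.atan2(d, a)             # rotation angle
    trans = (c, f)                     # translation offsets
    return scale, rot, trans
```

With such a split, the zoom element can be applied in reverse to the held composite image while the movement/rotation elements are applied to the current image, as described above.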
The image memory 170 is a work buffer that holds the composite images generated by the combining performed by the image combining unit 180, and is configured to supply the held composite image to the image transformation unit 160 or the image combining unit 180. That is, the image memory 170 is an image memory that holds the history images.
The image combining unit 180 is configured to combine the image output from the image transformation unit 160, the composite image held in the image memory 170, or the image output from the moving-image acquisition unit 140, and to output the composite image generated by the combining to the image memory 170 and the display unit 191. Specifically, when playing and displaying a moving image while fixing the composite image generated by combining the images corresponding to the frames preceding the current frame, the image combining unit 180 combines the images by writing the affine-transformed image generated by the image transformation unit 160 onto the composite image corresponding to the preceding frames held in the image memory 170. Conversely, when playing and displaying a moving image while fixing the image corresponding to the current frame, the image combining unit 180 combines the images by writing the image corresponding to the current frame output from the moving-image acquisition unit 140 onto the image generated by the image transformation unit 160 applying an affine transformation to the composite image corresponding to the preceding frames held in the image memory 170. Alternatively, when playing and displaying a moving image while fixing the display magnification of the image corresponding to the current frame, the image combining unit 180 combines the images by writing the affine-transformed image corresponding to the current frame generated by the image transformation unit 160 onto the affine-transformed composite image corresponding to the preceding frames generated by the image transformation unit 160. These combining operations are performed in accordance with an operation input related to a playback instruction from the operation acceptance unit 195. Note that these image combining operations will be described in detail with reference to Fig. 5 to Fig. 16 and the like.
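The "writing onto the composite image" step common to all three modes amounts to overwriting the history pixels under the newly placed frame. A minimal sketch, with 2-D lists standing in for image buffers and an integer offset standing in for the affine placement:

```python
def overwrite(composite, image, ox, oy):
    """Write `image` onto `composite` at offset (ox, oy), overwriting
    the history pixels underneath -- the combining step that grows the
    panorama-like composite as playback proceeds. Assumes the image
    fits inside the composite at the given offset."""
    for y, row in enumerate(image):
        for x, px in enumerate(row):
            composite[oy + y][ox + x] = px
    return composite
```

In the apparatus, the offset (and warp) of each written frame comes from the accumulated affine transformation, so successive frames land at their camera-work-determined positions in the work buffer.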
The display control unit 190 is configured to sequentially display, frame by frame, on the display unit 191 the composite image generated by the combining performed by the image combining unit 180.
The display unit 191 is configured to display, under the control of the display control unit 190, the composite image generated by the combining performed by the image combining unit 180. For example, the display unit 191 can be implemented by the display of a personal computer or by a television set. Note that display examples of composite images will be described in detail with reference to Fig. 17 to Fig. 24, Fig. 28 to Fig. 31, and the like.
The operation accepting unit 195 includes various operation keys and the like, and is configured to, upon accepting an operation input entered using these keys, output the details of the accepted operation input to the moving-image acquiring unit 140, the image transforming unit 160, or the image combining unit 180. The operation accepting unit 195 includes, for example, a setting key for setting a display mode in the case of playing a moving image. As such display modes, for example, there is a display mode in which an affine transformation is applied to the image corresponding to the current frame, a composite image is generated by combining the affine-transformed image with the composite images corresponding to the preceding frames, and this composite image is displayed; and there is a display mode in which an affine transformation is applied, in the direction opposite to the direction of the affine transformation parameters, to the composite image corresponding to each of the preceding frames, a composite image is generated by combining the affine-transformed composite image with the image corresponding to the current frame, and this composite image is displayed. That is, according to the embodiment of the present invention, display can be performed while arbitrarily switching between the following methods: an image combining/display method in which the past history images are transformed while the display frame of the current image is fixed, and an image combining/display method in which the display frame of the current image is moved on the basis of the camera work.
Next, a detection method for detecting the affine transformation parameters used in the image transformation will be described in detail with reference to the drawings.
Parts (a) to (c) of Fig. 2 are diagrams illustrating an example of an image corresponding to a frame included in a moving image. Part (a) of Fig. 3 is a diagram illustrating a simplified image, obtained by omitting the background and the like, of the image corresponding to the frame immediately preceding the frame corresponding to the image 300 illustrated in Fig. 2. Parts (b) and (c) of Fig. 3 are diagrams illustrating simplified images obtained by omitting the background and the like of the image 300 illustrated in Fig. 2.
The images 300, 320, and 330 illustrated in Fig. 2 and Fig. 3 include images 301, 321, and 331 of a horse on which a person is riding, and images 302, 322, and 332 of a snake located in front of the images 301, 321, and 331 of the horse. Furthermore, as illustrated in Fig. 2, flags, chairs, and the like exist in the background of these images, and the flags are fluttering in the wind.
The image 320 illustrated in part (a) of Fig. 3 is a simplified image of the image corresponding to the frame preceding the frames corresponding to the images 300 and 330 illustrated in parts (a) to (c) of Fig. 2 and parts (b) and (c) of Fig. 3. The images 320 and 330 corresponding to two successive frames illustrate a transition in the case where an object within the screen gradually becomes larger. That is, at the time of this shooting, a zoom-in operation, which is an operation for gradually increasing the size of an object within the screen, was performed.
In the embodiment of the present invention, a method will be described by way of example in which feature points are detected in each of the images constituting a moving image, and affine transformation parameters are calculated using optical flows corresponding to the feature points. Furthermore, in this example, the case where corner points are used as the feature points will be described.
Here, with reference to parts (a) to (c) of Fig. 3, a method of calculating affine transformation parameters using optical flows corresponding to three corner points detected in the images 320 and 330 will be described by way of example.
For example, assume that, in the image 320 illustrated in part (a) of Fig. 3, a corner point 323 near the mouth of the image 321 of the horse, a corner point 324 near the hip of the person in the image 321 of the horse, and a corner point 325 near the mouth of the image 322 of the snake are detected as feature points. In this case, in the image 330 illustrated in part (b) of Fig. 3, optical flows 337, 338, and 339 corresponding to the corner points 323, 324, and 325 in the image 320 are detected using a gradient method, a block matching method, or the like. On the basis of the detected optical flows 337, 338, and 339, corner points 333, 334, and 335 corresponding to the corner points 323, 324, and 325 in the image 320 are detected.
Here, for example, since the images 321 and 331 of the horse and the images 322 and 332 of the snake included in the images 320 and 330 illustrated in parts (a) and (b) of Fig. 3 are positioned on the ground, they do not move independently of the camera motion. Therefore, the camera motion can be accurately estimated on the basis of the optical flows obtained for the corner points detected in the images 321 and 331 of the horse and the images 322 and 332 of the snake. For example, as illustrated in part (c) of Fig. 3, on the basis of the three optical flows 337 to 339 detected in the image 330, it can be estimated that the image 330 was obtained by enlarging the image 320 about a point 336. Accordingly, the camera motion at the time of shooting the image 330 can be determined to be a zoom-in operation about the point 336. In this manner, corner points can be detected in objects that do not move independently of the camera motion, and, on the basis of the optical flows obtained for these corner points, the camera motion having a certain regularity can be accurately detected. Therefore, the affine transformation parameters can be calculated using the optical flows obtained for these corner points.
However, as in the case of a flag fluttering in the wind, a case is conceivable where an object that moves independently of the camera motion is included in the screen. For example, the image 300 illustrated in Fig. 2 includes flags fluttering in the wind. In the case where corner points are detected in such an object that moves independently of the camera motion, and the camera motion is estimated using the optical flows obtained for these corner points, the camera motion cannot be estimated accurately.
For example, the optical flows detected in the image 300 illustrated in part (b) of Fig. 2 are indicated by arrows, and the corner points detected from the optical flows are indicated by the hollow circles at the leading ends of the arrows. Here, corner points 303 to 305 are corner points corresponding to the corner points 333 to 335 illustrated in parts (b) and (c) of Fig. 3. Corner points 306 to 311 are corner points detected in the flags existing in the background of the image 301 of the horse. Since these flags are fluttering in the wind, the motion of the flags caused by the wind is detected as optical flows. That is, the optical flows respectively corresponding to the corner points 306 to 311 are detected in the flags, which move independently of the camera motion. Therefore, in the case where the three optical flows used for calculating the affine transformation parameters include an optical flow corresponding to at least one of the corner points 306 to 311, the accurate camera motion cannot be detected. In this case, accurate affine transformation parameters cannot be calculated.
As described above, there is a case where, in a captured image, optical flows corresponding to objects that move independently of the camera motion (the optical flows respectively corresponding to the corner points 306 to 311 illustrated in part (b) of Fig. 2) and optical flows having a certain regularity with respect to the camera motion (the optical flows other than those corresponding to the corner points 306 to 311 illustrated in part (b) of Fig. 2) are both detected.
Therefore, in the embodiment of the present invention, an example will be described in which an affine-transformation-parameter calculating process of calculating affine transformation parameters on the basis of three optical flows is performed a plurality of times, thereby obtaining a plurality of sets of affine transformation parameters, and an optimum set of affine transformation parameters is selected from among them. Note that, in this example, it is assumed that the size of a moving object included in each of the images constituting the moving image is relatively small with respect to the area of the image.
Here, the affine transformation will be described briefly. In two dimensions, when the position of a movement source is (x, y) and the position of the movement destination after the affine transformation is (x', y'), the matrix of the affine transformation can be expressed by Equation 1.
[Equation 1]

$$\begin{pmatrix} x' \\ y' \\ 1 \end{pmatrix} = \begin{pmatrix} a & b & c \\ d & e & f \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ 1 \end{pmatrix}$$
Here, a to f are the affine transformation parameters. Furthermore, the affine matrix AM including these affine transformation parameters can be expressed by the following equation. In this case, the zoom component XZ in the X direction, the zoom component YZ in the Y direction, the translation component XT in the X direction, the translation component YT in the Y direction, and the rotation component R can be obtained by the following equations, respectively. Note that, in the case of a unit matrix, a = e = 1 and b = c = d = f = 0.
[Equation 2]

$$AM = \begin{pmatrix} a & b & c \\ d & e & f \\ 0 & 0 & 1 \end{pmatrix}$$

$$XZ = \sqrt{a^2 + d^2} \qquad YZ = \sqrt{b^2 + e^2}$$

$$XT = c \qquad YT = f$$

$$R = \tan^{-1}\!\left(\frac{d}{a}\right)$$
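As a concrete illustration of the decomposition in Equation 2, the following Python sketch (not part of the patent; the function name and key names are my own) extracts the camera-work components from a 3 × 3 affine matrix:

```python
import math

def camera_work_components(am):
    """Decompose a 3x3 affine matrix [[a, b, c], [d, e, f], [0, 0, 1]]
    into the camera-work components of Equation 2."""
    (a, b, c), (d, e, f), _ = am
    return {
        "zoom_x": math.hypot(a, d),    # XZ = sqrt(a^2 + d^2)
        "zoom_y": math.hypot(b, e),    # YZ = sqrt(b^2 + e^2)
        "translate_x": c,              # XT
        "translate_y": f,              # YT
        "rotation": math.atan2(d, a),  # R = tan^-1(d / a)
    }

# A pure 30-degree rotation: no zoom, no translation.
theta = math.radians(30.0)
rot = [[math.cos(theta), -math.sin(theta), 0.0],
       [math.sin(theta),  math.cos(theta), 0.0],
       [0.0, 0.0, 1.0]]
comps = camera_work_components(rot)
```

For the rotation-only matrix above, both zoom components come out as 1, the translations as 0, and the rotation as 30 degrees, consistent with the component formulas.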
Next, the affine-transformation-parameter calculating method will be described.
First, in the image corresponding to the current frame, which is one of the frames constituting the moving image, three feature points are selected from among the feature points at which optical flows have been detected. For example, three corner points are selected at random from among the corner points (indicated by hollow circles) detected in the image 300 illustrated in part (b) of Fig. 2. Note that, in the case where projective transformation parameters are used as the camera work parameters, four feature points are selected at random.
Then, affine transformation parameters are calculated using the three optical flows corresponding to the selected three feature points. For example, affine transformation parameters are calculated using the optical flows (indicated by the arrows connected to the hollow circles) corresponding to the three corner points selected from among the corner points (indicated by hollow circles) in the image 300 illustrated in part (b) of Fig. 2. The affine transformation parameters can be obtained using Equation 1.
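Three point correspondences yield six linear constraints, exactly enough to determine the six parameters a to f of Equation 1. A minimal Python sketch solving the two resulting 3 × 3 systems by Cramer's rule (the function name and data layout are illustrative, not from the patent):

```python
def affine_from_three_points(src, dst):
    """Solve Equation 1, x' = a*x + b*y + c and y' = d*x + e*y + f,
    for (a, b, c, d, e, f) from three point correspondences.
    src and dst are lists of three (x, y) tuples."""
    (x1, y1), (x2, y2), (x3, y3) = src
    # Determinant of the 3x3 system matrix whose rows are (x, y, 1).
    det = x1 * (y2 - y3) - y1 * (x2 - x3) + (x2 * y3 - x3 * y2)
    if abs(det) < 1e-12:
        raise ValueError("degenerate (collinear) feature points")

    def solve(v1, v2, v3):
        # Cramer's rule: replace each column in turn with the targets.
        p = (v1 * (y2 - y3) - y1 * (v2 - v3) + (v2 * y3 - v3 * y2)) / det
        q = (x1 * (v2 - v3) - v1 * (x2 - x3) + (x2 * v3 - x3 * v2)) / det
        r = (x1 * (y2 * v3 - y3 * v2) - y1 * (x2 * v3 - x3 * v2)
             + v1 * (x2 * y3 - x3 * y2)) / det
        return p, q, r

    a, b, c = solve(*(dx for dx, _ in dst))
    d, e, f = solve(*(dy for _, dy in dst))
    return a, b, c, d, e, f
</```>

For example, the correspondences (0,0)→(1,−2), (1,0)→(3,−2), (0,1)→(1,1) recover the parameters (a, b, c, d, e, f) = (2, 0, 1, 0, 3, −2).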
Then, on the basis of the obtained affine transformation parameters, the score of the affine transformation parameters is calculated. Specifically, using the obtained affine transformation parameters, the positions of the movement destinations of all the feature points in the image corresponding to the frame immediately preceding the current frame are obtained. The position of each feature point obtained using the affine transformation parameters is compared with the position of the corresponding feature point detected in the current frame, and the difference between the positions of the two corresponding feature points is calculated feature point by feature point. For example, the absolute distance between the positions of the two corresponding feature points is calculated as the difference. Then, the calculated difference is compared with a predetermined threshold feature point by feature point, and the number of feature points whose difference is smaller than the threshold is obtained as the score of the affine transformation parameters. In this manner, three feature points are selected at random from among the feature points at which optical flows have been detected, and the process of calculating the score of the affine transformation parameters on the basis of the optical flows corresponding to these feature points is repeated a predetermined number of times, whereby the scores of a plurality of sets of affine transformation parameters are calculated. This predetermined number of times may be set as appropriate in accordance with the type of images to be compared, the processing capability of the image processing apparatus 100, or the like, or a fixed value may be used as the predetermined number of times. As the predetermined number of times, for example, about 20 times may be set in consideration of the processing capability of the image processing apparatus 100.
For example, consider the case where three corner points other than the corner points 306 to 311 are selected from among the corner points detected in the image 300 illustrated in part (b) of Fig. 2. When affine transformation parameters are calculated using the three optical flows corresponding to the three corner points selected in this manner, as described above, since these three optical flows have a certain regularity, affine transformation parameters that transform the image corresponding to the immediately preceding frame in accordance with this certain regularity are obtained. Therefore, regarding the positions of the corner points obtained using the affine transformation parameters and the positions of the corner points detected in the current frame, the differences obtained for the corner points other than the corner points 306 to 311 are calculated as relatively small values. Accordingly, the score of the affine transformation parameters takes a large value.
In contrast, consider the case where three corner points including at least one of the corner points 306 to 311 are selected from among the corner points detected in the image 300 illustrated in part (b) of Fig. 2. When affine transformation parameters are calculated using the three optical flows corresponding to the three corner points selected in this manner, as described above, since these three optical flows include an optical flow that does not have the certain regularity, affine transformation parameters that do not transform the image corresponding to the immediately preceding frame in accordance with the certain regularity are obtained. Therefore, the differences obtained for the positions of the corner points obtained using the affine transformation parameters and the positions of the corner points detected in the current frame are calculated as relatively large values at arbitrary corner points. Accordingly, the score of the affine transformation parameters takes a small value.
Then, among the scores of the plurality of obtained sets of affine transformation parameters, the set of affine transformation parameters whose score has the largest value is selected as the representative affine transformation parameters. The selected representative affine transformation parameters are recorded in the moving-image storage unit 200 in association with the current frame. In this manner, when an affine transformation is applied to the images constituting the moving image, the affine transformation can be performed using the optimum affine transformation parameters.
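The repeated random selection and score-based choice of representative parameters described above is essentially a RANSAC-style voting procedure. A minimal Python sketch under that reading (all names are illustrative; `estimate_affine` stands for any three-point solver returning (a, b, c, d, e, f)):

```python
import math
import random

def apply_affine(p, params):
    a, b, c, d, e, f = params
    x, y = p
    return (a * x + b * y + c, d * x + e * y + f)

def representative_affine(prev_pts, cur_pts, estimate_affine,
                          iterations=20, threshold=2.0):
    """Repeat the three-point estimation a predetermined number of times
    and keep the parameters with the highest score (inlier count)."""
    best_params, best_score = None, -1
    indices = list(range(len(prev_pts)))
    for _ in range(iterations):
        sample = random.sample(indices, 3)
        try:
            params = estimate_affine([prev_pts[i] for i in sample],
                                     [cur_pts[i] for i in sample])
        except ValueError:  # degenerate (collinear) sample
            continue
        # Score: number of feature points whose transformed position lands
        # within `threshold` of the position detected in the current frame.
        score = sum(
            1 for p, q in zip(prev_pts, cur_pts)
            if math.dist(apply_affine(p, params), q) < threshold
        )
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

Samples contaminated by independently moving points (the fluttering flags) produce few inliers and lose the vote, which is exactly why the representative parameters end up reflecting the camera motion.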
As described above, even in the case where each of the images constituting the moving image includes an object in motion (a moving object) such as a person or a car, if the size of the moving object is relatively small with respect to the area of the image, the camera motion can be extracted without being affected by the moving object.
Furthermore, by extracting the camera motion, motions regarded as having been intentionally caused by the photographer, such as zoom-in, zoom-out, pan, tilt, and rotation, can be estimated.
Next, the operation of the image processing apparatus 100 in the embodiment of the present invention will be described with reference to the drawings.
Fig. 4 is a flowchart illustrating the processing procedure of the affine-transformation-parameter detecting process performed by the image processing apparatus 100 in the embodiment of the present invention.
First, a moving-image file is input to the moving-image input unit 110 (step S900). Then, the moving-image file input to the moving-image input unit 110 is decoded, and the image of one frame is obtained in time-series order (step S901). Then, it is determined whether or not the obtained frame is the head frame of the moving-image file input to the moving-image input unit 110 (step S902). When the obtained frame is the head frame (step S902), feature points are extracted from the entire image corresponding to the head frame (step S903). For example, as illustrated in part (b) of Fig. 2, a plurality of corner points are extracted in the image. Then, affine transformation parameters of a unit matrix are selected as the affine transformation parameters (step S904), and the flow proceeds to step S914.
In contrast, when the obtained frame is not the head frame (step S902), feature points are extracted from the region in which the image has been newly captured, with reference to the image corresponding to the immediately preceding frame (step S905). That is, since the feature points already extracted in the image corresponding to the immediately preceding frame can be obtained from the optical flows corresponding to these feature points, these feature points are not extracted again in the image corresponding to the current frame.
Then, the optical flows corresponding to the individual feature points extracted from the image corresponding to the immediately preceding frame are calculated (step S906). That is, as illustrated in part (b) of Fig. 2, the optical flow corresponding to each corner point is calculated.
Then, a variable i is initialized to "1" (step S907). Then, M feature points are selected from among the feature points at which optical flows have been detected (step S908). For example, in the case where affine transformation parameters are used as the camera work parameters, three feature points are selected at random. In the case where projective transformation parameters are used as the camera work parameters, four feature points are selected at random. Then, affine transformation parameters are calculated on the basis of the M optical flows calculated for the selected M feature points (step S909).
Then, on the basis of the affine transformation parameters obtained by the calculation, the score of the affine transformation parameters is calculated (step S910). Specifically, using the affine transformation parameters obtained by the calculation, the positions of the movement destinations of all the feature points in the image corresponding to the immediately preceding frame are obtained. The position of each feature point obtained using the affine transformation parameters is compared with the position of the corresponding feature point in the image corresponding to the current frame, obtained when the optical flows were calculated in step S906, and the difference between the positions of the two corresponding feature points is calculated feature point by feature point. For example, the absolute distance between the two corresponding positions is calculated as the difference. Then, the calculated difference is compared with a predetermined threshold feature point by feature point, and the number of feature points whose difference is smaller than the threshold is obtained as the score of the affine transformation parameters.
Then, "1" is added to the variable i (step S911), and it is determined whether or not the variable i is greater than a constant N (step S912). When the variable i is less than or equal to the constant N (step S912), the flow returns to step S908, and the score calculating process for affine transformation parameters is repeated (steps S908 to S910). For example, 20 may be used as the constant N.
In contrast, when the variable i is greater than the constant N (step S912), among the scores of the obtained sets of affine transformation parameters, the set of affine transformation parameters whose score has the largest value is selected as the representative affine transformation parameters (step S913). Then, affine transformation parameters of the inverse matrix of the matrix of the selected representative affine transformation parameters are recorded in the moving-image storage unit 200 in association with the current frame (step S914). Note that, when the current frame is the head frame, the selected affine transformation parameters of a unit matrix are recorded in the moving-image storage unit 200 in association with the head frame. Then, the image corresponding to the current frame and the feature points in this image are written over and stored (step S915).
Then, it is determined whether or not the current frame is the last frame of the moving-image file input to the moving-image input unit 110 (step S916). When the current frame is not the last frame (step S916), the flow returns to step S901, and the affine-transformation-parameter detecting process is repeated (steps S901 to S915). In contrast, when the current frame is the last frame (step S916), the affine-transformation-parameter detecting process ends.
In the embodiment of the present invention, the example has been described in which, as the detection of camera work parameters, affine transformation parameters are detected on the basis of the optical flows detected in the images constituting the moving image. However, a sensor such as an acceleration sensor or a gyro sensor, or a zoom button used when performing a zoom operation, may be provided on the camera; the amount of camera motion at the time of shooting may be detected using the sensor or the zoom button; and the camera work parameters may be obtained on the basis of this amount of camera motion. Note that the amount of camera motion detected at the time of shooting can be used to determine whether or not the camera work parameters obtained by the camera-work-parameter calculating unit 123 are correct. Furthermore, a plurality of sets of camera work parameters may be detected by the camera-work-parameter calculating unit 123, and one set may be selected from among them on the basis of the amount of camera motion detected at the time of shooting.
Next, the case of playing and displaying a moving image using the above-described affine transformation parameters will be described in detail with reference to the drawings. Note that, for the simplicity of description, each of the images illustrated in Fig. 5 to Fig. 16 is simplified, and the amount of movement between two successive frames is exaggerated in the illustrations.
First, the case will be described where, at the time of shooting with the camera, although the magnification remains unchanged, the lens of the camera is moved, about the position of the camera, in one of the upward, downward, leftward, and rightward directions.
Fig. 5 includes diagrams illustrating an example of the transition of a moving image shot by a camera. In Fig. 5, the diagrams illustrate images 401 to 403 corresponding to successive frames included in a moving image in the case where the image of a person 400 is shot against the background of a mountain. In this example, the case is illustrated where the photographer is shooting the image while moving the lens of the camera in the rightward and upward directions. In this case, the person 400 included in the moving image shot by the camera moves from the right to the left, and additionally moves downward, in the images constituting the moving image.
Fig. 6 includes diagrams in which, in each of the images illustrated in Fig. 5, the image corresponding to the immediately preceding frame is indicated by broken lines, and exemplary detected optical flows are additionally illustrated. The image 401 illustrated in part (a) of Fig. 6 is the same as the image 401 illustrated in part (a) of Fig. 5. The portion indicated by solid lines in the image 402 illustrated in part (b) of Fig. 6 is the same as the image 402 illustrated in part (b) of Fig. 5, and the portion indicated by broken lines in the image 402 illustrated in part (b) of Fig. 6 is the same as the portion indicated by solid lines in the image 401 illustrated in part (a) of Fig. 6. The arrows 404 to 406 in the image 402 illustrated in part (b) of Fig. 6 indicate exemplary optical flows detected in the image 402. Similarly, the portion indicated by solid lines in the image 403 illustrated in part (c) of Fig. 6 is the same as the image 403 illustrated in part (c) of Fig. 5, and the portion indicated by broken lines in the image 403 illustrated in part (c) of Fig. 6 is the same as the portion indicated by solid lines in the image 402 illustrated in part (b) of Fig. 6. The arrows 407 to 409 in the image 403 illustrated in part (c) of Fig. 6 indicate exemplary optical flows detected in the image 403.
As illustrated in parts (b) and (c) of Fig. 6, the person 400 and the mountain in the background included in the images move in accordance with the camera motion. On the basis of the optical flows detected from this motion, the affine transformation parameters can be obtained frame by frame.
Fig. 7 includes diagrams illustrating display examples in the case of playing a moving image including the images 401 to 403 illustrated in Fig. 5. Note that, in the embodiment of the present invention, since the images constituting the moving image are combined, the image displayed on the display unit 191 becomes larger than a normal image as the playback time elapses. Therefore, the image displayed first is displayed relatively small compared with the size of the display area of the display unit 191. Note that the size, the position, and the like of the image displayed first may be specified by the user.
As illustrated in part (a) of Fig. 7, at first, only the image 401 corresponding to the head frame is displayed. Here, when the matrix (a 3 × 3 matrix) of the affine transformation parameters corresponding to the image 401 is A1, the value of A1 is obtained, and, with reference to the position and the size of the image 401 of the head frame, the image 401 is affine-transformed using the obtained matrix A1. Here, since A1 is a unit matrix, the position and the size of the image 401 are not transformed. Then, when the image 402 corresponding to the next frame is to be displayed, the image 402 is affine-transformed using the affine transformation parameters associated with this frame. Specifically, when the matrix of the affine transformation parameters corresponding to the image 402 is A2 and the matrix of the affine transformation parameters corresponding to the image 401 is A1, the value of A1 × A2 is obtained, and, with reference to the position and the size of the image 401 of the head frame, the image 402 is affine-transformed using the obtained matrix A1 × A2. In the image illustrated in part (b) of Fig. 7, only the position of the image 402 is transformed. The image 402 affine-transformed using the affine transformation parameters is written over so as to overlap the image 401 corresponding to the immediately preceding frame. That is, in the area of the image 401, the image of the image 402 is written over an area 410 overlapping the image 402. Also, in the area of the image 401, the image of the image 401 is combined in an area 411 not overlapping the image 402. That is, when the image 402 corresponding to the second frame is to be displayed, as illustrated in part (b) of Fig. 7, a composite image generated by combining the entire portion of the image 402 and the portion of the image 401 corresponding to the area 411 is displayed. Furthermore, an image frame indicating that the image is the latest image among the displayed images can be displayed around the image corresponding to the current frame. In part (b) of Fig. 7, the image frame is displayed around the image 402. Also, the affine transformation parameters used for affine-transforming the image 402 are held in the image transforming unit 160.
Then, when the image 403 corresponding to the next frame is to be displayed, the image 403 is affine-transformed using the affine transformation parameters associated with this frame. That is, the image 403 is affine-transformed using the affine transformation parameters obtained by using the matrix of the affine transformation parameters corresponding to the image 403 and the matrix of the affine transformation parameters corresponding to the image 402 used in the immediately preceding affine transformation. Specifically, when the matrix of the affine transformation parameters corresponding to the image 403 is A3, the matrix corresponding to the image 402 is A2, and the matrix corresponding to the image 401 is A1, the value of A1 × A2 × A3 is obtained, and, with reference to the position and the size of the image 401 of the head frame, the image 403 is affine-transformed using the obtained matrix A1 × A2 × A3. In the image illustrated in part (c) of Fig. 7, only the position of the image 403 is transformed. The image 403 affine-transformed using the affine transformation parameters is written over so as to overlap the composite image of the images 401 and 402 corresponding to the preceding frames. That is, in the area of the composite image of the images 401 and 402, the image of the image 403 is written over areas 413 and 414 overlapping the image 403. Also, in the area of the composite image of the images 401 and 402, the composite image of the images 401 and 402 is combined in the areas 411 and 412 not overlapping the image 403. That is, when the image 403 corresponding to the third frame is to be displayed, as illustrated in part (c) of Fig. 7, a composite image generated by combining the entire portion of the image 403, the portion of the image 401 corresponding to the area 411, and the portion of the image 402 corresponding to the area 412 is displayed. Furthermore, in the case where an image frame indicating that the image is the latest image among the displayed images is to be displayed around the image corresponding to the current frame, the image frame is displayed around the image 403 illustrated in part (c) of Fig. 7. Also, the affine transformation parameters used for affine-transforming the image 403 are held in the image transforming unit 160. That is, the affine transformation parameters obtained by multiplying the matrices of the affine transformation parameters respectively corresponding to the images 402 and 403 are held in the image transforming unit 160. In this manner, when the image corresponding to the current frame is to be affine-transformed, it is affine-transformed using the affine transformation parameters obtained by using the matrix of the affine transformation parameters corresponding to the current frame and the matrices of the affine transformation parameters corresponding to each of the frames preceding the current frame. The affine transformation parameters obtained at the time of this affine transformation are held in the image transforming unit 160 and are used in the next affine transformation. The same applies to the cases of Fig. 11 and Fig. 15.
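The cumulative multiplication described above (A1, then A1 × A2, then A1 × A2 × A3, each applied with reference to the head-frame image) can be sketched as follows in illustrative Python, not the patent's implementation; the matrices follow the 3 × 3 layout of Equation 1:

```python
def mat_mul(p, q):
    """Multiply two 3x3 matrices represented as nested lists."""
    return [[sum(p[i][k] * q[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def cumulative_transforms(per_frame):
    """Given per-frame affine matrices A1, A2, A3, ..., return the
    running products A1, A1*A2, A1*A2*A3, ..., which place each frame's
    image with reference to the position and size of the head-frame image."""
    acc, out = None, []
    for a in per_frame:
        acc = a if acc is None else mat_mul(acc, a)
        out.append(acc)
    return out

identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
shift_right = [[1.0, 0.0, 10.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
# A1 is the unit matrix; each later frame shifts 10 pixels rightward.
placed = cumulative_transforms([identity, shift_right, shift_right])
```

With the translation-only matrices above, the third frame ends up placed 20 pixels to the right of the head frame, matching the intuition that per-frame motions accumulate.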
Fig. 8 includes diagrams illustrating display examples in the case of playing a moving image including the images 401 to 403 illustrated in Fig. 5. In the display examples illustrated in Fig. 7, the composite image corresponding to the frames preceding the current frame (a single image at the beginning) is fixed, the affine-transformed image corresponding to the current frame is written over and combined with the composite image, and the image generated by the combining is displayed. In contrast, in the display examples illustrated in Fig. 8, the position of the image corresponding to the current frame is fixed, the composite image corresponding to the frames preceding the current frame is affine-transformed in the direction opposite to the direction of the affine transformation parameters, the image corresponding to the current frame is written over and combined with the affine-transformed composite image, and the image generated by the combining is displayed. That is, in the display examples illustrated in Fig. 7 and Fig. 8, although the image displayed at a fixed position and the image to be affine-transformed differ, the other portions are common. Therefore, portions common to those of Fig. 7 are given common reference numerals and described.
As shown in part (a) of Fig. 8, at first only image 401 corresponding to the first frame is displayed. Here, since image 401 corresponds to the first frame, there is no preceding frame. Then, when image 402 corresponding to the next frame is to be displayed, the affine transformation parameters associated with this frame are used to affine-transform image 401, the immediately preceding image. Specifically, when the affine transformation parameter matrix corresponding to image 402 is A2 and the affine transformation parameter matrix corresponding to image 401 is A1, the value of Inv(A1 × A2) is obtained, and image 401 is affine-transformed using the obtained Inv(A1 × A2) matrix. Here, Inv A (where A is a matrix) denotes the inverse matrix of A. In the image shown in part (b) of Fig. 8, only the position of image 401 is transformed. The image 402 corresponding to the current frame is overwritten so as to overlap the affine-transformed image 401. Note that the composite image generated by writing image 402 onto image 401 is the same as the composite image shown in part (b) of Fig. 7, so its description is omitted here.
Then, when image 403 corresponding to the next frame is to be displayed, the affine transformation parameters associated with this frame are used to affine-transform the composite image of image 401 and image 402, corresponding to the preceding frames, in the direction opposite to that of the affine transformation parameters. Specifically, when the affine transformation parameter matrix corresponding to image 403 is A3, the matrix corresponding to image 402 is A2, and the matrix corresponding to image 401 is A1, the value of Inv(A1 × A2 × A3) is obtained, and the composite image of images 401 and 402 is affine-transformed using the Inv(A1 × A2 × A3) matrix. In the image shown in part (c) of Fig. 8, only the position of the composite image of images 401 and 402 is transformed. The image 403 corresponding to the current frame is overwritten so as to overlap the affine-transformed composite image of images 401 and 402. Note that the composite image generated by writing image 403 onto images 401 and 402 is the same as the composite image shown in part (c) of Fig. 7, so its description is omitted here.
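A minimal sketch of the Inv(A1 × A2 × A3) computation used here, assuming the affine parameters are held in homogeneous 3 × 3 form; the matrix values are illustrative and not taken from the patent.

```python
def mat_mul(a, b):
    """Multiply two 3x3 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def affine_inv(m):
    """Inverse of an affine matrix [[a,b,tx],[c,d,ty],[0,0,1]] (Inv A in the text)."""
    a, b, tx = m[0]
    c, d, ty = m[1]
    det = a * d - b * c
    ia, ib, ic, id_ = d / det, -b / det, -c / det, a / det
    return [[ia, ib, -(ia * tx + ib * ty)],
            [ic, id_, -(ic * tx + id_ * ty)],
            [0.0, 0.0, 1.0]]

# Hypothetical per-frame parameter matrices.
A1 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
A2 = [[1, 0, 5], [0, 1, 0], [0, 0, 1]]
A3 = [[1, 0, 3], [0, 1, 2], [0, 0, 1]]

# Inv(A1 x A2 x A3): the matrix that moves the composite image of the
# preceding frames so that the current frame can stay at a fixed position.
inv = affine_inv(mat_mul(mat_mul(A1, A2), A3))

# Composing the forward product with its inverse gives the identity.
check = mat_mul(mat_mul(mat_mul(A1, A2), A3), inv)
print(round(check[0][2], 6), round(check[1][2], 6))  # 0.0 0.0
```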
Next, a case will be described in which, when images are captured with a camera, the lens direction of the camera remains unchanged but the magnification is changed.
Fig. 9 includes diagrams showing an example of the transition in a moving image captured by a camera. In Fig. 9, the diagrams show images 421 to 423 corresponding to consecutive frames included in a moving image in the case where the image of a person 420 is captured with a mountain in the background. This example illustrates a case in which the photographer captures images while increasing the magnification of the camera lens. In this case, the person 420 included in the moving image captured by the camera gradually becomes larger in the images constituting the moving image. Note that although the camera position may move slightly when the magnification is increased, movement of the camera position is not considered in the description of this example.
Fig. 10 includes diagrams in which, in each image shown in Fig. 9, the image corresponding to the immediately preceding frame is indicated by broken lines and, in addition, exemplary detected optical flows are shown. The image 421 shown in part (a) of Fig. 10 is the same as the image 421 shown in part (a) of Fig. 9. The solid-line portion of the image 422 shown in part (b) of Fig. 10 is the same as the image 422 shown in part (b) of Fig. 9, and the broken-line portion of the image 422 shown in part (b) of Fig. 10 is the same as the solid-line portion of the image 421 shown in part (a) of Fig. 9. In addition, arrows 424 to 426 in the image 422 shown in part (b) of Fig. 10 indicate exemplary optical flows detected in image 422. Similarly, the solid-line portion of the image 423 shown in part (c) of Fig. 10 is the same as the image 423 shown in part (c) of Fig. 9, and the broken-line portion of the image 423 shown in part (c) of Fig. 10 is the same as the solid-line portion of the image 422 shown in part (b) of Fig. 9. In addition, arrows 427 to 429 in the image 423 shown in part (c) of Fig. 10 indicate exemplary optical flows detected in image 423.
As shown in parts (b) and (c) of Fig. 10, the sizes of the person 420 and of the mountain in the background included in the images change as the magnification changes. Based on the optical flows detected from this change, affine transformation parameters can be obtained frame by frame.
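As a hedged illustration of obtaining affine transformation parameters from detected optical flows, the sketch below solves the six parameters exactly from three point correspondences. An actual implementation would typically use many flows with a least-squares fit, which the patent does not detail here; the point data mimicking a magnification increase are invented for the example.

```python
def det3(m):
    """Determinant of a 3x3 matrix."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve3(M, v):
    """Solve the 3x3 linear system M x = v by Cramer's rule."""
    d = det3(M)
    xs = []
    for col in range(3):
        Mc = [row[:] for row in M]
        for r in range(3):
            Mc[r][col] = v[r]
        xs.append(det3(Mc) / d)
    return xs

def affine_from_flow(pairs):
    """Estimate (a, b, tx, c, d, ty) from three (src, dst) point pairs,
    as one might do from the optical flows sketched in Fig. 10."""
    M = [[x, y, 1] for (x, y), _ in pairs]
    a, b, tx = solve3(M, [dx for _, (dx, _) in pairs])
    c, d, ty = solve3(M, [dy for _, (_, dy) in pairs])
    return a, b, tx, c, d, ty

# Hypothetical flow: every point moves outward by a factor of 1.2 about the
# origin, mimicking an increase in lens magnification.
pairs = [((0, 0), (0, 0)), ((10, 0), (12, 0)), ((0, 10), (0, 12))]
print(affine_from_flow(pairs))  # (1.2, 0.0, 0.0, 0.0, 1.2, 0.0)
```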
Fig. 11 includes diagrams showing a display example in the case of playing a moving image that includes images 421 to 423 shown in Fig. 9.
As shown in part (a) of Fig. 11, at first only image 421 corresponding to the first frame is displayed. Then, when image 422 corresponding to the next frame is to be displayed, the affine transformation parameters associated with this frame are used to affine-transform image 422. In the image shown in part (b) of Fig. 11, only the size of image 422 is transformed. The image 422 affine-transformed using the affine transformation parameters is overwritten so as to overlap image 421, which corresponds to the immediately preceding frame. That is, within the region of image 421, the image of image 422 is written onto the region overlapping image 422. In this case, since the entire region of image 422 overlaps the region of image 421, the entire image of image 422 is written onto image 421. In addition, within the region of image 421, the image of image 421 in region 431, which does not overlap image 422, is combined. That is, when image 422 corresponding to the second frame is to be displayed, as shown in part (b) of Fig. 11, the composite image generated by combining the entire image of image 422 and the portion of image 421 corresponding to region 431 is displayed. In addition, a border indicating that the image is the latest of the displayed images can be displayed around the image corresponding to the current frame. In part (b) of Fig. 11, the border is displayed around image 422. Furthermore, the affine transformation parameters used to affine-transform image 422 are stored in the image transformation unit 160.
Then, when image 423 corresponding to the next frame is to be displayed, the affine transformation parameters associated with this frame are used to affine-transform image 423. That is, image 423 is affine-transformed using the affine transformation parameters obtained from the affine transformation parameter matrix corresponding to image 423 and the affine transformation parameter matrix corresponding to image 422 used in the immediately preceding affine transformation. In the image shown in part (c) of Fig. 11, only the size of image 423 is transformed. The affine-transformed image 423 is overwritten so as to overlap the composite image of images 421 and 422, which correspond to the preceding frames. That is, within the region of the composite image of images 421 and 422, the image of image 423 is written onto the region overlapping image 423. In this case, since the entire region of image 423 overlaps the composite image of images 421 and 422, the entire image of image 423 is written onto the composite image of images 421 and 422. In addition, within the region of the composite image of images 421 and 422, the composite image of images 421 and 422 in regions 432 and 433, which do not overlap image 423, is combined. That is, when image 423 corresponding to the third frame is to be displayed, as shown in part (c) of Fig. 11, the composite image generated by combining the entire region of image 423, the portion of image 421 corresponding to region 432, and the portion of image 422 corresponding to region 433 is displayed. In addition, when a border indicating that the image is the latest of the displayed images is to be displayed around the image corresponding to the current frame, the border is displayed around image 423 as shown in part (c) of Fig. 11. Furthermore, the affine transformation parameters used to affine-transform image 423 are stored in the image transformation unit 160. That is, the affine transformation parameters obtained by using the affine transformation parameters corresponding respectively to images 422 and 423 are stored in the image transformation unit 160.
Fig. 12 includes diagrams showing a display example in the case of playing a moving image that includes images 421 to 423 shown in Fig. 9. The difference between the display examples shown in Figs. 11 and 12 is similar to the difference between the display examples shown in Figs. 7 and 8. Although the image displayed at the fixed position and the image to be affine-transformed differ, the other parts are common. Therefore, parts common to those of Fig. 11 are given common reference numerals and described.
As shown in part (a) of Fig. 12, at first only image 421 corresponding to the first frame is displayed. Then, when image 422 corresponding to the next frame is to be displayed, the affine transformation parameters associated with this frame are used to affine-transform image 421, the immediately preceding image, in the direction opposite to that of the affine transformation parameters. In the image shown in part (b) of Fig. 12, only the size of image 421 is transformed. The image 422 corresponding to the current frame is overwritten so as to overlap the affine-transformed image 421. Note that although the composite image generated by writing image 422 onto image 421 has a different size from the composite image shown in part (b) of Fig. 11, the other aspects are the same as those of the composite image shown in part (b) of Fig. 11, so its description is omitted here.
Then, when image 423 corresponding to the next frame is to be displayed, the affine transformation parameters associated with this frame are used to affine-transform the composite image of image 421 and image 422, corresponding to the preceding frames, in the direction opposite to that of the affine transformation parameters. In the image shown in part (c) of Fig. 12, only the size of the composite image of images 421 and 422 is transformed. The image 423 corresponding to the current frame is overwritten so as to overlap the affine-transformed composite image of images 421 and 422. Note that although the composite image generated by writing image 423 onto the composite image of images 421 and 422 has a different size from the composite image shown in part (c) of Fig. 11, the other aspects are the same as those of the composite image shown in part (c) of Fig. 11, so its description is omitted here.
Next, a case will be described in which, when images are captured with a camera, the lens direction and magnification of the camera remain unchanged but the camera rotates about the image-capturing direction as a pivot.
Fig. 13 includes diagrams showing an example of the transition in a moving image captured by a camera. In Fig. 13, the diagrams show images 441 to 443 corresponding to consecutive frames included in a moving image in the case where the image of a person 440 is captured with a mountain in the background. This example illustrates a case in which the photographer captures images while rotating the camera about the image-capturing direction as a pivot. In this case, the person 440 included in the moving image captured by the camera rotates in the images constituting the moving image. Note that although the camera position may move slightly due to the rotation of the camera, movement of the camera position is not considered in the description of this example.
Fig. 14 includes diagrams in which, in each image shown in Fig. 13, the image corresponding to the immediately preceding frame is indicated by broken lines and, in addition, exemplary detected optical flows are shown. The image 441 shown in part (a) of Fig. 14 is the same as the image 441 shown in part (a) of Fig. 13. The solid-line portion of the image 442 shown in part (b) of Fig. 14 is the same as the image 442 shown in part (b) of Fig. 13, and the broken-line portion of the image 442 shown in part (b) of Fig. 14 is the same as the solid-line portion of the image 441 shown in part (a) of Fig. 13. In addition, arrows 444 to 446 in the image 442 shown in part (b) of Fig. 14 indicate exemplary optical flows detected in image 442. Similarly, the solid-line portion of the image 443 shown in part (c) of Fig. 14 is the same as the image 443 shown in part (c) of Fig. 13, and the broken-line portion of the image 443 shown in part (c) of Fig. 14 is the same as the solid-line portion of the image 442 shown in part (b) of Fig. 13. In addition, arrows 447 to 449 in the image 443 shown in part (c) of Fig. 14 indicate exemplary optical flows detected in image 443.
As shown in parts (b) and (c) of Fig. 14, the person 440 and the mountain in the background included in the images rotate along with the rotation of the camera. Based on the optical flows detected from this rotational movement, affine transformation parameters can be obtained frame by frame.
Fig. 15 includes diagrams showing a display example in the case of playing a moving image that includes images 441 to 443 shown in Fig. 13.
As shown in part (a) of Fig. 15, at first only image 441 corresponding to the first frame is displayed. Then, when image 442 corresponding to the next frame is to be displayed, the affine transformation parameters associated with this frame are used to affine-transform image 442. In the image shown in part (b) of Fig. 15, only the angle of image 442 is transformed. The affine-transformed image 442 is overwritten so as to overlap image 441, which corresponds to the immediately preceding frame. That is, within the region of image 441, the image of image 442 is written onto region 450, which overlaps image 442. In addition, within the region of image 441, the image of image 441 in regions 451 and 452, which do not overlap image 442, is combined. That is, when image 442 corresponding to the second frame is to be displayed, as shown in part (b) of Fig. 15, the composite image generated by combining the entirety of image 442 and the portions of image 441 corresponding to regions 451 and 452 is displayed. In addition, a border indicating that the image is the latest of the displayed images can be displayed around the image corresponding to the current frame. In part (b) of Fig. 15, the border is displayed around image 442. Furthermore, the affine transformation parameters used to affine-transform image 442 are stored in the image transformation unit 160.
Then, when image 443 corresponding to the next frame is to be displayed, the affine transformation parameters associated with this frame are used to affine-transform image 443. That is, image 443 is affine-transformed using the affine transformation parameters obtained from the affine transformation parameter matrix corresponding to image 443 and the affine transformation parameter matrix corresponding to image 442 used in the immediately preceding affine transformation. In the image shown in part (c) of Fig. 15, only the angle of image 443 is transformed. The affine-transformed image 443 is overwritten so as to overlap the composite image of images 441 and 442, which correspond to the preceding frames. That is, within the region of the composite image of images 441 and 442, the image of image 443 is written onto regions 453 to 457, which overlap image 443. In addition, within the region of the composite image of images 441 and 442, the composite image of images 441 and 442 in regions 458 to 461, which do not overlap image 443, is combined. That is, when image 443 corresponding to the third frame is to be displayed, as shown in part (c) of Fig. 15, the composite image generated by combining the entire region of image 443, the portion of image 441 corresponding to region 459, and the portions of image 442 corresponding to regions 458 and 460 is displayed. In addition, when a border indicating that the image is the latest of the displayed images is to be displayed around the image corresponding to the current frame, the border is displayed around image 443 as shown in part (c) of Fig. 15. Furthermore, the affine transformation parameters used to affine-transform image 443 are stored in the image transformation unit 160. That is, the affine transformation parameters obtained by using the affine transformation parameters corresponding respectively to images 442 and 443 are stored in the image transformation unit 160.
Fig. 16 includes diagrams showing a display example in the case of playing a moving image that includes images 441 to 443 shown in Fig. 13. The difference between the display examples shown in Figs. 15 and 16 is similar to the difference between the display examples shown in Figs. 7 and 8. Although the image displayed at the fixed position and the image to be affine-transformed differ, the other parts are common. Therefore, parts common to those of Fig. 15 are given common reference numerals and described.
As shown in part (a) of Fig. 16, at first only image 441 corresponding to the first frame is displayed. Then, when image 442 corresponding to the next frame is to be displayed, the affine transformation parameters associated with this frame are used to affine-transform image 441, the immediately preceding image, in the direction opposite to that of the affine transformation parameters. In the image shown in part (b) of Fig. 16, only the angle of image 441 is transformed. The image 442 corresponding to the current frame is overwritten so as to overlap the affine-transformed image 441. Note that although the composite image generated by writing image 442 onto image 441 has a different angle from the composite image shown in part (b) of Fig. 15, the other aspects are the same as those of the composite image shown in part (b) of Fig. 15, so its description is omitted here.
Then, when image 443 corresponding to the next frame is to be displayed, the affine transformation parameters associated with this frame are used to affine-transform the composite image of image 441 and image 442, corresponding to the preceding frames, in the direction opposite to that of the affine transformation parameters. In the image shown in part (c) of Fig. 16, only the angle of the composite image of images 441 and 442 is transformed. The image 443 corresponding to the current frame is overwritten so as to overlap the affine-transformed composite image of images 441 and 442. Note that although the composite image generated by writing image 443 onto the composite image of images 441 and 442 has a different angle from the composite image shown in part (c) of Fig. 15, the other aspects are the same as those of the composite image shown in part (c) of Fig. 15, so its description is omitted here.
The cases in which the position, magnification, and angle of each image constituting a moving image are changed sequentially have been described above. However, the embodiment is similarly applicable to cases in which these changes are combined.
Next, display examples in the case of playing a moving image actually captured by a camera will be shown. In the display examples below, a composite image is displayed only in the region of the display area of the display unit 191 in which the images corresponding to the current frame and the preceding frames are shown, and the other region is shown as black. In addition, a border is displayed around the image corresponding to the current frame. Furthermore, in the display examples shown below, playback starts partway through the moving image.
Figs. 17 to 24 include diagrams showing examples of the transition in a moving image captured by a camera. Figs. 17 and 18 are diagrams showing images 500 to 505 constituting the moving image in the case where images of a parent and child playing on the plaza of an apartment complex are captured while the camera is moved. In this example, a case is illustrated in which the image corresponding to the current frame is affine-transformed, and the affine-transformed image is written onto the composite image corresponding to each of the preceding frames.
In images 500 to 505 shown in Figs. 17 and 18, the images corresponding to the current frame are images 506 to 511. In addition, the composite images are images 512 to 517, which are generated by combining the images corresponding to the preceding frames. As shown in Figs. 17 and 18, subjects included in the captured images (the plaza of the apartment complex, etc.) are fixed on the screen, and the images 506 to 511 corresponding to the current frame move on the screen in accordance with the movement of the camera. By displaying images in this way, images can be shown to the viewer in such a manner that the image corresponding to the current frame appears to advance, in accordance with the movement of the camera, through the display area shown as black on the display unit 191.
Figs. 19 and 20 are diagrams showing images 520 to 525 corresponding to frames constituting the moving image in the case where images of the parent and child playing on the plaza of the apartment complex are captured while a zoom-in operation is performed. In this example, a case is illustrated in which the image corresponding to the current frame is affine-transformed, and the affine-transformed image is written onto the composite image corresponding to each of the preceding frames.
In images 520 to 525 shown in Figs. 19 and 20, the images corresponding to the current frame are images 526 to 531. In addition, the composite images are images 532 to 537, which are generated by combining the images corresponding to the preceding frames. As shown in Figs. 19 and 20, subjects included in the captured images (the plaza of the apartment complex, etc.) are fixed on the screen, and the images 526 to 531 corresponding to the current frame move on the screen in accordance with the motion of the camera. By displaying images in this way, the viewer can easily identify the person who is the zoom target within the overall space.
As described above, in the display examples shown in Figs. 17 to 20, the image corresponding to the current frame moves on the display while undergoing enlargement/reduction in size, whereby a wide image is generated. In addition, in the sequentially generated composite images, only the objects included in the image corresponding to the current frame move, while in the portion outside the image corresponding to the current frame, the objects displayed in the past are shown in a still state.
Figs. 21 and 22 are diagrams showing images 540 to 545 corresponding to frames constituting the moving image in the case where images of the parent and child playing on the plaza of the apartment complex are captured while the camera is moved. In this example, a case is illustrated in which the composite image corresponding to each of the preceding frames is affine-transformed in the direction opposite to that of the affine transformation parameters, and the image corresponding to the current frame is written onto the affine-transformed composite image.
In images 540 to 545 shown in Figs. 21 and 22, the images corresponding to the current frame are images 546 to 551. In addition, the composite images are images 552 to 557, which are generated by combining the images corresponding to the preceding frames. As shown in Figs. 21 and 22, the images 546 to 551 corresponding to the current frame are fixed on the screen, and, among the subjects included in the captured images (the plaza of the apartment complex, etc.), those other than the image corresponding to the current frame move on the screen in accordance with the movement of the camera. By displaying images in this way, images can be shown to the viewer in such a manner that the composite image corresponding to the preceding frames appears to advance, in accordance with the movement of the camera, through the display area shown as black on the display unit 191. That is, the composite image corresponding to the preceding frames can be displayed in such a manner that it appears to advance in the direction opposite to that shown in Figs. 17 and 18.
Figs. 23 and 24 are diagrams showing images 560 to 565 corresponding to frames constituting the moving image in the case where images of the parent and child playing on the plaza of the apartment complex are captured while a zoom-in operation is performed. In this example, a case will be illustrated in which the composite image corresponding to each of the preceding frames is affine-transformed in the direction opposite to that of the affine transformation parameters, and the image corresponding to the current frame is written onto the affine-transformed composite image.
In images 560 to 565 shown in Figs. 23 and 24, the images corresponding to the current frame are images 566 to 571. In addition, the composite images are images 572 to 577, which are generated by combining the images corresponding to the preceding frames. As shown in Figs. 23 and 24, the images 566 to 571 corresponding to the current frame are fixed on the screen, and, among the subjects included in the captured images (the plaza of the apartment complex, etc.), those other than the image corresponding to the current frame move on the screen in accordance with the motion of the camera. By displaying images in this way, the person who is the zoom target can be gradually increased in size in accordance with the zoom, and therefore the viewer can easily identify this person within the overall space.
As described above, in the display examples shown in Figs. 21 to 24, the image corresponding to the current frame is fixed at a fixed position, and the peripheral images around the image corresponding to the current frame move on the display screen while undergoing enlargement/reduction in size, whereby a wide image is generated. In addition, in the sequentially generated composite images, only the objects included in the image corresponding to the current frame move, while in the portion outside the image corresponding to the current frame, the objects displayed in the past are shown in a state in which they move as a whole.
Next, the operation of the image processing apparatus 100 in the embodiment of the present invention will be described with reference to the drawings.
Fig. 25 is a flowchart showing the processing procedure of a moving-image playback process performed by the image processing apparatus 100 in the embodiment of the present invention. In this processing procedure, an example will be illustrated in which the image corresponding to the current frame is affine-transformed, and the affine-transformed image is written onto the composite image corresponding to each of the preceding frames.
First, a work buffer larger than the size of the images constituting the moving image is secured in the image memory 170 (step S921). Then, a moving image file is obtained from the moving image storage unit 200 (step S922). Then, the obtained moving image file is decoded, and the current frame, which is one frame, is obtained (step S923).
Then, the affine transformation parameters corresponding to the obtained current frame are extracted from the moving image file (step S924). Here, when the current frame is the first frame, affine transformation parameters of a unit matrix are extracted. Then, the obtained affine transformation parameters are used to affine-transform the image corresponding to the current frame (step S925). Here, when the current frame is the first frame, the affine transformation is performed using the affine transformation parameters of the unit matrix, and therefore the actual image is not transformed. Then, the affine-transformed image corresponding to the current frame is overwritten onto and combined with the composite image of the images corresponding to the frames preceding the current frame, and the composite image into which the image corresponding to the current frame has been combined is stored in the image memory 170 (step S926). Here, when the current frame is the first frame, the image corresponding to the first frame is stored in the image memory 170. Then, the composite image into which the image corresponding to the current frame was combined in step S926 is displayed on the display unit 191 (step S927). Here, when the current frame is the first frame, the image corresponding to the first frame is displayed on the display unit 191.
Then, it is determined whether the current frame is the last frame among the frames constituting the input moving image file (step S928). When the current frame is not the last frame (step S928), the flow returns to step S923, and the composite-image display process is repeated (steps S923 to S927).
In contrast, when the current frame is the last frame (step S928), the secured work buffer is released (step S929), and the moving-image playback process ends.
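The loop of steps S923 to S927 can be sketched as follows, modeling images as small pixel grids and the work buffer as a dictionary from canvas pixel to the id of the frame last written there. All names, sizes, and frame data here are illustrative assumptions, not the patent's implementation.

```python
def mat_mul(a, b):
    """Multiply two 3x3 affine matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply_affine(m, x, y):
    """Map a point (x, y) through the 3x3 affine matrix m."""
    return (m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2])

def play(frames, size=4):
    buffer = {}                                   # work buffer (step S921)
    stored = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]    # cumulative parameters (unit matrix)
    for frame_id, params in frames:
        stored = mat_mul(stored, params)          # affine-transform current frame (S925)
        for x in range(size):                     # overwrite onto the composite (S926)
            for y in range(size):
                px = apply_affine(stored, x, y)
                buffer[(round(px[0]), round(px[1]))] = frame_id
    return buffer

# Two hypothetical 4x4 frames; the second is shifted 2 px to the right.
frames = [(1, [[1, 0, 0], [0, 1, 0], [0, 0, 1]]),
          (2, [[1, 0, 2], [0, 1, 0], [0, 0, 1]])]
buf = play(frames)
# Pixels up to x=1 keep frame 1; x=2..5 are overwritten or newly covered by frame 2.
print(buf[(0, 0)], buf[(3, 0)])  # 1 2
```

The newer frame always wins in overlapping pixels, matching the overwrite-and-combine behavior of step S926.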
Fig. 26 is a flowchart showing the processing procedure of a moving-image playback process performed by the image processing apparatus 100 in the embodiment of the present invention. In this processing procedure, an example will be illustrated in which the composite image corresponding to each frame preceding the current frame is affine-transformed in the direction opposite to that of the affine transformation parameters, and the image corresponding to the current frame is written onto the affine-transformed composite image. Note that in the processing procedure shown in Fig. 26, steps S921 to S924 and steps S927 to S929 are similar to those in the processing procedure shown in Fig. 25, so their description is omitted here.
The affine transformation parameters corresponding to the current frame obtained in step S923 are extracted from the moving image file (step S924). Then, using the obtained affine transformation parameters, the composite image, stored in the image memory 170, corresponding to each frame preceding the current frame is affine-transformed in the direction opposite to that of the affine transformation parameters (step S941). Here, when the current frame is the first frame, no composite image is stored in the image memory 170, so no image is transformed. Then, the image corresponding to the current frame is overwritten onto and combined with the affine-transformed composite image, and the composite image into which the image corresponding to the current frame has been combined is stored in the image memory 170 (step S942). Here, when the current frame is the first frame, the image corresponding to the first frame is stored in the image memory 170. Then, the composite image into which the image corresponding to the current frame was combined in step S942 is displayed on the display unit 191 (step S927).
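Steps S941 and S942 can be sketched with the same toy modeling of images as pixel grids: the existing composite is moved by the inverse of the current parameters, and the current frame is then written at a fixed position. Everything here is an illustrative assumption, not the patent's implementation.

```python
def apply_affine(m, x, y):
    """Map a point (x, y) through the 3x3 affine matrix m."""
    return (m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2])

def affine_inv(m):
    """Inverse of an affine matrix [[a,b,tx],[c,d,ty],[0,0,1]]."""
    a, b, tx = m[0]
    c, d, ty = m[1]
    det = a * d - b * c
    ia, ib, ic, id_ = d / det, -b / det, -c / det, a / det
    return [[ia, ib, -(ia * tx + ib * ty)],
            [ic, id_, -(ic * tx + id_ * ty)],
            [0.0, 0.0, 1.0]]

def play_fixed_current(frames, size=4):
    buffer = {}
    for frame_id, params in frames:
        inv = affine_inv(params)
        # Step S941: move the existing composite opposite to the parameters.
        buffer = {apply_affine(inv, x, y): fid for (x, y), fid in buffer.items()}
        # Step S942: write the current frame at the fixed position.
        for x in range(size):
            for y in range(size):
                buffer[(float(x), float(y))] = frame_id
    return buffer

# Two hypothetical 4x4 frames; the second frame's parameters shift by (2, 0).
frames = [(1, [[1, 0, 0], [0, 1, 0], [0, 0, 1]]),
          (2, [[1, 0, 2], [0, 1, 0], [0, 0, 1]])]
buffer = play_fixed_current(frames)
# Frame 1's pixels slid left to x = -2..1; frame 2 occupies the fixed x = 0..3.
print(buffer[(-2.0, 0.0)], buffer[(0.0, 0.0)])  # 1 2
```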
The following cases have been described: the case where a composite image is generated by applying an affine transformation to the image corresponding to the current frame, and the case where a composite image is generated by applying an affine transformation, in the direction opposite to that of the affine transformation parameters, to the composite image corresponding to the individual preceding frames. However, a composite image may also be generated by applying an affine transformation to the image corresponding to the current frame and, in addition, applying an affine transformation, in the direction opposite to that of the affine transformation parameters, to the composite image corresponding to the individual preceding frames. Here, such an example will be described in detail with reference to the drawings: the affine transformation parameters are separated into elements related to enlargement/reduction (zoom components) and elements other than enlargement/reduction (elements related to movement or rotation), the composite image corresponding to the individual preceding frames is affine-transformed, using the elements related to enlargement/reduction, in the direction opposite to that of the affine transformation parameters, and the image corresponding to the current frame is affine-transformed using the elements related to movement or rotation, thereby generating a composite image.
Figure 27 is a flowchart showing the processing procedure of a moving image playback process performed by the image processing apparatus 100 in the embodiment of the present invention. In this processing procedure, an example will be described in which the image corresponding to the current frame is affine-transformed using the elements related to movement or rotation, and additionally the composite image corresponding to the individual preceding frames is affine-transformed, using the elements related to enlargement/reduction, in the direction opposite to that of the affine transformation parameters, whereby the affine-transformed image corresponding to the current frame is written over the affine-transformed composite image. Note that, in the processing procedure shown in Figure 27, steps S921 to S924 and steps S927 to S929 are similar to those of the processing procedure shown in Figure 25, and a description thereof is therefore omitted here.
The affine transformation parameters corresponding to the current frame obtained in step S923 are extracted from the moving image file (step S924). Then, the elements related to enlargement/reduction are separated from the individual elements of the obtained affine transformation parameters (step S951). Then, using the separated elements related to enlargement/reduction, the composite image, corresponding to the individual frames preceding the current frame, stored in the image memory 170 is affine-transformed in the direction opposite to that of the affine transformation parameters (step S952). Here, when the current frame is the first frame, no composite image is stored in the image memory 170, and the image is therefore not transformed. Then, using the separated elements related to movement or rotation, the image corresponding to the current frame is affine-transformed (step S953). Here, when the current frame is the first frame, the affine transformation is performed using the affine transformation parameters of a unit matrix, and the actual image is therefore not transformed.
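As a hedged illustration of the separation in step S951 (not the patent's own implementation), the following sketch splits a zoom factor out of an affine parameter set, under the simplifying assumption that the linear part is a rotation combined with a uniform scale; a general matrix with shear would need a fuller decomposition. Function names are illustrative only.

```python
import math

def split_zoom(p):
    """Split (a, b, c, d, e, f) into a uniform zoom factor and a zoom-free
    remainder (rotation plus translation), assuming no shear is present."""
    a, b, c, d, e, f = p
    z = math.sqrt(a * a + d * d)          # scale of the first column vector
    # remainder keeps rotation and translation, with the scale divided out
    rest = (a / z, b / z, c, d / z, e / z, f)
    return z, rest
```

With this split, the composite image of the preceding frames would be transformed using the inverse of `z` (the opposite direction), while the current image would be transformed using `rest`, matching the division between steps S952 and S953.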
Then, the affine-transformed image corresponding to the current frame is written over and combined with the affine-transformed composite image, and the composite image with which the image corresponding to the current frame has been combined is stored in the image memory 170 (step S954). Here, when the current frame is the first frame, the image corresponding to the first frame is stored in the image memory 170. Then, the composite image with which the image corresponding to the current frame has been combined in step S954 is displayed on the display unit 191 (step S927).
Next, a display example in the case of playing a moving image actually shot by a camera, using the processing procedure of the moving image playback process shown in Figure 27, will be described.
Figures 28 to 31 include diagrams showing an example of transitions in a moving image shot by a camera. Figures 28 and 29 are diagrams showing images 580 to 585 constituting a moving image in the case where images of a parent and child playing in the plaza of a residential area were shot while moving the camera. Note that Figures 28 and 29 show the case where no zoom operation is performed.
In the images 580 to 585 shown in Figures 28 and 29, the images corresponding to the current frame are images 586 to 591. In addition, the composite images, generated by combining the images corresponding to the individual preceding frames, are images 592 to 597. As shown in Figures 28 and 29, the subjects included in the captured images (the plaza of the residential area, etc.) are fixed on the screen, and the images 586 to 591 corresponding to the current frame move on the screen in accordance with the movement of the camera. Here, since the images 580 to 585 shown in Figures 28 and 29 are captured images in which no zoom operation is performed, the affine transformation parameters contain almost no elements related to enlargement/reduction. Therefore, the display examples in Figures 28 and 29 are basically the same as the display examples shown in Figures 17 and 18.
Figures 30 and 31 are diagrams showing images 600 to 605 corresponding to frames constituting a moving image in the case where a zoom-in operation is performed while images of the parent and child playing in the plaza of the residential area are shot.
In the images 600 to 605 shown in Figures 30 and 31, the images corresponding to the current frame are images 606 to 611. In addition, the composite images, generated by combining the images corresponding to the individual preceding frames, are images 612 to 617. As shown in Figures 30 and 31, the composite images 612 to 617 including the subjects in the captured images (the plaza of the residential area, etc.) are enlarged in accordance with the zoom motion of the camera, while the images 606 to 611 corresponding to the current frame keep the same size and move on the screen in accordance with the motion of the camera. That is, the peripheral images are gradually enlarged as the images 606 to 611 corresponding to the current frame move. By displaying images in this manner, various display styles can be provided to the viewer.
The example in which the affine transformation parameters are recorded in the moving image file has been described above. However, the affine transformation parameters may also be recorded, for each frame constituting the moving image, as accompanying information (for example, metadata) in a format different from that of the moving image file. Hereinafter, an example in which the affine transformation parameters are recorded, as accompanying information, in a metadata file in a format different from that of the moving image file will be described in detail with reference to the drawings.
Figure 32 is a block diagram showing an example of the functional structure of an image processing apparatus 650 according to an embodiment of the present invention. Here, the image processing apparatus 650 is an apparatus obtained by modifying part of the image processing apparatus 100 shown in Figure 1. The image processing apparatus 650 is an image processing apparatus in which, in place of the recording control unit 130, the moving image storage unit 200, the moving image acquiring unit 140, and the camera operation parameter extraction unit 150 of the image processing apparatus 100, a recording control unit 651, a moving image storage unit 660, a metadata storage unit 670, and a file acquiring unit 652 are provided. Note that the structure other than the recording control unit 651, the moving image storage unit 660, the metadata storage unit 670, and the file acquiring unit 652 is similar to that of the image processing apparatus 100 shown in Figure 1, and a description of these other structures is therefore omitted.
The recording control unit 651 is configured to record the moving image output from the moving image input unit 110 in the moving image storage unit 660 as a moving image file, and additionally to record the affine transformation parameters output from the camera operation parameter calculation unit 123 in the metadata storage unit 670 as a metadata file, in association with the corresponding moving image and frames.
The moving image storage unit 660 is configured to store the moving image output from the moving image input unit 110 as a moving image file. In addition, the moving image storage unit 660 supplies the moving image file to the file acquiring unit 652 in response to a request from the file acquiring unit 652. Note that the moving image file stored in the moving image storage unit 660 will be described in detail with reference to Figure 33.
The metadata storage unit 670 is configured to store the affine transformation parameters output from the camera operation parameter calculation unit 123 as a metadata file. In addition, the metadata storage unit 670 supplies the metadata file to the file acquiring unit 652 in response to a request from the file acquiring unit 652. Note that the metadata file stored in the metadata storage unit 670 will be described in detail with reference to Figure 33.
In response to an operation input, relating to moving image playback, accepted by the operation acceptance unit 195, the file acquiring unit 652 is configured to obtain the moving image file stored in the moving image storage unit 660 and the metadata file stored, in association with that moving image file, in the metadata storage unit 670. The file acquiring unit 652 outputs the moving image in the obtained moving image file and the affine transformation parameters in the metadata file to the image transforming unit 160, and outputs the moving image in the obtained moving image file to the image combining unit 180.
Figure 33 includes diagrams schematically showing the individual files recorded in the moving image storage unit 660 and the metadata storage unit 670 according to the embodiment of the present invention. Part (a) of Figure 33 shows moving image files 661 to 663 stored in the moving image storage unit 660, and metadata files 671 to 673 stored in the metadata storage unit 670 in association with the moving image files 661 to 663. Here, it is assumed that a moving image ID, that is, identification information for identifying each moving image file stored in the moving image storage unit 660, is given to each moving image file. For example, "#1" is given to the moving image file 661; "#2" is given to the moving image file 662; and "#n" is given to the moving image file 663.
Part (b) of Figure 33 schematically shows the moving image file 661 stored in the moving image storage unit 660, and the metadata file 671 stored in the metadata storage unit 670 in association with the moving image file 661. Here, the moving image file 661 is a file containing a moving image constituted of n frames, and these n frames are represented as frames 1 (664) to n (667).
In addition, a moving image ID 674, frame numbers 675, and affine transformation parameters 676 are stored in the metadata file 671 in association with one another.
The moving image ID 674 is the moving image ID given to the corresponding moving image file. For example, "#1", given to the moving image file 661, is stored.
The frame numbers 675 are the sequence numbers of the individual frames constituting the moving image in the corresponding moving image file. For example, "1" to "n", corresponding to the frames 1 (664) to n (667) constituting the moving image in the moving image file 661, are stored.
The affine transformation parameters 676 are the affine transformation parameters calculated for the individual frames, corresponding to the frame numbers 675, of the moving image. Note that the affine transformation parameters 676 "a1, b1, c1, d1, e1, and f1" corresponding to the frame number 675 "1" are the affine transformation parameters of a unit matrix. In addition, the affine transformation parameters 676 "am, bm, cm, dm, em, and fm" corresponding to the frame number 675 "m" (where m is an integer greater than or equal to 2) are the affine transformation parameters of the frame "m" relative to the immediately preceding frame "m-1".
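The layout of the metadata file 671 described above can be pictured with a small data-model sketch. This is an illustrative reading of the record structure only, not the file format used by the apparatus; the class and field names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# One affine parameter set per frame: (a, b, c, d, e, f)
Affine = Tuple[float, float, float, float, float, float]

# Frame "1" carries the unit matrix; frame m (m >= 2) carries the
# transform relative to the immediately preceding frame m-1.
UNIT: Affine = (1.0, 0.0, 0.0, 0.0, 1.0, 0.0)

@dataclass
class MetadataFile:
    """Sketch of metadata file 671: a moving image ID (e.g. "#1") plus one
    (frame number, affine parameters) record per frame of the moving image."""
    movie_id: str
    frames: List[Tuple[int, Affine]] = field(default_factory=list)

    def add_frame(self, number: int, params: Affine) -> None:
        self.frames.append((number, params))
```

Under this reading, a playback component can look up the parameter chain for any frame by walking the records from frame "1" up to the current frame number.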
The following cases have been described: the case where, with the current image corresponding to the current frame fixed at the central portion of the display unit 191 or the like, a composite image is generated by applying an affine transformation to the image corresponding to the current frame; and the case where a composite image is generated by applying an affine transformation, in the direction opposite to that of the affine transformation parameters, to the composite image corresponding to the individual preceding frames. However, affine transformations may also be sequentially applied to the current image corresponding to the current frame, composite images may be generated and sequentially stored in an image memory, and a region to be displayed, namely a display area, may be extracted from the composite images in the image memory and displayed. In this manner, the display mode of the display unit can be switched during moving image playback. These moving image playing methods will be described in detail below with reference to the drawings.
Figure 34 is a block diagram showing an example of the functional structure of an image processing apparatus 680 according to the embodiment of the present invention. Here, the image processing apparatus 680 is an apparatus obtained by modifying part of the image processing apparatus 650 shown in Figure 32. The image processing apparatus 680 includes a moving image input unit 110, a camera operation detecting unit 120, a recording control unit 651, a moving image storage unit 660, a metadata storage unit 670, a file acquiring unit 681, an image transforming unit 682, an image combining unit 683, an image memory 684, a display area extraction unit 685, a display memory 686, a display control unit 687, an operation acceptance unit 688, and a display unit 689. Note that the structures of the moving image input unit 110, the camera operation detecting unit 120, the recording control unit 651, the moving image storage unit 660, and the metadata storage unit 670 are similar to the corresponding units of the image processing apparatus 650 shown in Figure 32, and a description of these structures is therefore omitted. In addition, in this example, an example in which part of the image processing apparatus 650 shown in Figure 32 is modified will be described; however, the present embodiment is also applicable to the image processing apparatus 100 shown in Figure 1.
In response to an operation input, relating to moving image playback, accepted by the operation acceptance unit 688, the file acquiring unit 681 is configured to obtain the moving image file stored in the moving image storage unit 660 and the metadata file stored, in association with that moving image file, in the metadata storage unit 670. The file acquiring unit 681 outputs the moving image in the obtained moving image file and the affine transformation parameters in the metadata file to the image transforming unit 682, and outputs the moving image in the obtained moving image file to the image combining unit 683.
The image transforming unit 682 is configured to apply, frame by frame, affine transformations, using the affine transformation parameters corresponding to the moving image, to the images constituting the moving image in the moving image file output from the file acquiring unit 681, and to output the affine-transformed images to the image combining unit 683.
The image combining unit 683 is configured to combine the affine-transformed image generated by the image transforming unit 682 with the composite image, corresponding to the individual preceding frames, held in the image memory 684 by writing the affine-transformed image over that composite image, and to store the new composite image generated by the combination in the image memory 684. In addition, the image combining unit 683 combines the current image with the composite image held in the display memory 686 by writing the current image over the composite image on the basis of the position of the current image within the display area output from the display area extraction unit 685. Specifically, when a display mode in which the current image is fixed is designated, the image combining unit 683 combines the current image output from the file acquiring unit 681 with the composite image held in the display memory 686 by writing the current image at the central portion of the composite image. On the contrary, when a display mode in which the composite image of the images preceding the current image is fixed is designated, the image combining unit 683 combines the affine-transformed current image generated by the image transforming unit 682 with the composite image held in the display memory 686 by writing the affine-transformed current image over the composite image on the basis of the position of the current image within the display area output from the display area extraction unit 685. In addition, the image combining unit 683 compresses the affine-transformed image generated by the image transforming unit 682 and writes the compressed affine-transformed image over the composite image held in the image memory 684, so that the current image written over the composite image held in the display memory 686 is a non-compressed image, or a captured image with a resolution higher than that of the compressed history images. Therefore, the history images at the time the composite image is output are compressed images, while the current image is a non-compressed image or a captured image with a resolution higher than that of the compressed history images. Here, the size of the current image to be combined in the display memory 686 is determined in accordance with the value of the display magnification. Note that the combining of the current image in the display memory 686 will be described in detail with reference to Figures 40 and 41.
The image memory 684 is a work buffer that holds the composite images generated by the combining performed by the image combining unit 683, and is configured to supply the held composite image to the image combining unit 683 or the display area extraction unit 685. That is, the image memory 684 is an image memory that holds the history images.
The display area extraction unit 685 is configured to extract, from the composite image held in the image memory 684, the image present within the range of the display area, that is, the region to be displayed, and to cause the display memory 686 to hold the extracted image. In addition, when at least part of the current image corresponding to the current frame, within the composite image held in the image memory 684, protrudes from the range of the display area, the display area extraction unit 685 moves the display area so that the whole current image is included within the range of the display area, and thereafter extracts the image present within the range of the display area from the composite image held in the image memory 684. Furthermore, when a display mode in which the composite image of the images preceding the current image is fixed is designated, the display area extraction unit 685 calculates the position of the current image within the display area and outputs that position to the image combining unit 683. Note that the extraction of the image included within the range of the display area will be described in detail with reference to Figures 35 to 41 and the like, the movement of the display area will be described in detail with reference to Figures 36, 37, and the like, and the calculation of the position of the current image within the display area will be described in detail with reference to Figure 40.
The display memory 686 is a display buffer that holds the image extracted from the image memory 684 by the display area extraction unit 685, and the held image is displayed on the display unit 689. Note that the image held in the display memory 686 will be described in detail with reference to Figures 35, 36, and the like.
The display control unit 687 is configured to sequentially display, frame by frame, the composite image held in the display memory 686.
The display unit 689 is configured to display the composite image held in the display memory 686 under the control of the display control unit 687. For example, the display unit 689 can be realized by the display of a personal computer or by a television set. Note that display examples of composite images will be described in detail with reference to Figure 42 and the like.
The operation acceptance unit 688 includes various operation keys and the like, and is configured, upon accepting an operation input entered using these keys, to output the details of the accepted operation input to the file acquiring unit 681 or the display area extraction unit 685. In the operation acceptance unit 688, for example, a playback command key for giving an instruction to play a moving image, a display magnification designation key for designating the display magnification of the moving image, and a setting key for setting the display mode in the case of playing the moving image are provided. As the display modes, for example, there are the following: a display mode in which the current image corresponding to the current frame is affine-transformed and displayed in a state in which the composite image corresponding to the individual frames preceding the current frame is fixed, and a display mode in which the composite image corresponding to the individual preceding frames is affine-transformed, in the direction opposite to that of the affine transformation parameters, and displayed in a state in which the current image corresponding to the current frame is fixed. Switching between these display modes can be performed even during playback of the moving image. That is, according to the embodiment of the present invention, display can be performed while arbitrarily switching between the following methods: an image combining/displaying method in which the past history images are transformed while the display frame of the current image is fixed, and an image combining/displaying method in which the display frame of the current image is moved on the basis of the camera operation.
Figure 35 includes diagrams schematically showing the relationship between the individual frames of the moving image file stored in the moving image storage unit 660 according to the embodiment of the present invention and the display area. Here, only the image memory 684, the metadata storage unit 670, and the operation acceptance unit 688 are illustrated, and the structures other than these parts are omitted from the drawing. In addition, by way of example, a case will be described in which a composite image is generated in the image memory 684 from frames "1" to "3" included in the moving image file 661 shown in part (b) of Figure 33, using the affine transformation parameters 676 stored in the metadata file 671. Note that Figure 35 shows the case where the composite image corresponding to the individual frames preceding the current frame is fixed on the display unit 689.
Part (a) of Figure 35 shows the case where frame 1 (664), which is the first of the frames constituting the moving image file 661 shown in part (b) of Figure 33, is stored in the image memory 684. For example, when the operation acceptance unit 688 accepts a playback command operation input giving an instruction to fix the composite image corresponding to the individual frames preceding the current frame and to play the moving image file 661 stored in the moving image storage unit 660, an image 351 corresponding to frame 1 (664) of the moving image file 661 is stored in the image memory 684, as shown in part (a) of Figure 35. Here, regarding the position at which the image 351 corresponding to the first frame is held in the image memory 684, the image 351 may be held at a position designated in advance, or at a position designated by the user using the operation acceptance unit 688. Alternatively, for example, the size of the composite image of frames "1" to "n" may be calculated using the affine transformation parameters 676, relating to the moving image file 661, held in the metadata file 671, and the position at which the image 351 is to be held may be determined on the basis of this calculation. Note that, in this example, a description will be given with the upper-left position of the image 351 placed in the image memory 684 serving as the origin, the horizontal direction (abscissa) serving as the x-axis, and the vertical direction (ordinate) serving as the y-axis.
As shown in part (a) of Figure 35, suppose that the display area in the case where the image 351 is placed in the image memory 684 is a display area 361. The display area 361 is determined, on the basis of the position at which the image 351 is held and the size thereof, in accordance with the value of the display magnification accepted by the operation acceptance unit 688. For example, when a display magnification of "0.5 times", which zooms out the current image, is designated, the display area 361 is twice the size of the image 351, centered on the image 351. Note that the position of the display area 361 relative to the image 351 can be determined using affine transformation parameters. That is, when a display magnification of "0.5 times", which zooms out the current image, is designated, the display area is set using affine transformation parameters in which the zoom components in the x direction and the y direction are doubled. Also, when the display area is translated or rotated relative to the current image, the position and range of the display area can be determined using affine transformation parameters.
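The relationship between the display magnification and the display area described above can be sketched numerically. This is a minimal illustration, assuming axis-aligned rectangles and a centered display area as in the "0.5 times" example for display area 361; the function name and the rectangle representation are assumptions, not part of the apparatus.

```python
def display_area_for(image_pos, image_size, magnification):
    """Compute a display area centered on the current image: a magnification
    of 0.5 yields an area twice the image size, i.e. the zoom components are
    scaled by 1/magnification in both the x and y directions."""
    (x, y), (w, h) = image_pos, image_size
    aw, ah = w / magnification, h / magnification
    # keep the image center fixed while growing or shrinking the area
    ax = x - (aw - w) / 2.0
    ay = y - (ah - h) / 2.0
    return (ax, ay), (aw, ah)
```

For a 640 x 480 image at the origin with a magnification of "0.5 times", this yields a 1280 x 960 display area whose upper-left corner lies at (-320, -240), so the image sits at its center.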
Part (b) of Figure 35 shows the case where frame 2 (665), among the frames constituting the moving image file 661 shown in part (b) of Figure 33, is stored in the image memory 684. In this case, as described above, an image 352 corresponding to frame 2 (665) is transformed using the affine transformation parameters 676 stored in the metadata file 671 in association with the frame numbers 675 "1" and "2", and the transformed image 352 is written over and combined with the image 351. In this case, for example, when the image 352 corresponding to the current frame does not protrude from the range of the display area 361, the position and size of the display area 361 remain unchanged. Here, the case where the current image protrudes from the range of the current display area will be described in detail with reference to Figures 36 and 37. Note that the display area 361 may be translated relative to the image 351 in accordance with the movement of the image 352.
Part (c) of Figure 35 shows the case where frame 3, among the frames constituting the moving image file 661 shown in part (b) of Figure 33, is stored in the image memory 684. Also in this case, as described above, an image 353 corresponding to frame 3 is transformed using the affine transformation parameters 676 stored in the metadata file 671 in association with the frame numbers 675 "1" to "3", and the transformed image 353 is written over and combined with the images 351 and 352.
Next, processing in the case where the display area is moved in accordance with the movement of the current image will be described in detail with reference to the drawings.
Figure 36 includes diagrams schematically showing a display area moving method for the case where the current image protrudes from the display area. Part (a) of Figure 36 is a diagram showing the relationship between a plurality of images, including a current image 760, held in the image memory 684 and a display area 759. As shown in part (a) of Figure 36, since the whole current image 760 is included within the range of the display area 759, the whole current image 760 is displayed on the display unit 689 together with the other images.
Part (b) of Figure 36 is a diagram showing the relationship between a plurality of images, including a current image 762, held in the image memory 684 and the display area 759. Here, the current image 762 is the image corresponding to the frame next after that of the current image 760 shown in part (a) of Figure 36. As shown in part (b) of Figure 36, when part of the current image 762 protrudes from the range of the display area 759, that part of the current image 762 is not displayed on the display unit 689. Therefore, in this case, as shown in part (b) of Figure 36, the display area extraction unit 685 calculates the difference 763 between one side of the display area 759 and the current image 762 protruding from the range of the display area 759, and the display area 759 is moved by a value obtained by adding an additional value 764 to the difference 763. Here, the additional value 764 can be, for example, 5 pixels. Alternatively, the display area may be moved by the difference alone, without adding the additional value. Note that part (b) of Figure 36 describes, by way of example, the case where the current image 762 protrudes from the right-side portion of the display area 759. However, when the current image protrudes from the upper, lower, or left portion, the display area can be moved using a similar method. Furthermore, when the current image protrudes from at least two of the upper, lower, left, and right portions, the difference on each side is calculated, and the display area can be moved in the direction of each such side on the basis of each calculated difference.
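The moving method just described, with its per-side difference plus a small additional value, can be sketched as follows. This is an illustrative reading of the method of Figure 36, assuming axis-aligned rectangles given as (left, top, right, bottom); the names and the 5-pixel default (the additional value 764) follow the text but are otherwise hypothetical.

```python
ADDED_MARGIN = 5  # corresponds to the additional value 764 (e.g. 5 pixels)

def move_display_area(area, image, margin=ADDED_MARGIN):
    """Return the display area shifted so the current image fits inside it.
    Each protruding side contributes its difference (763) plus `margin`."""
    al, at, ar, ab = area
    il, it, ir, ib = image
    dx = dy = 0
    if ir > ar:                # protrudes past the right side
        dx = (ir - ar) + margin
    elif il < al:              # protrudes past the left side
        dx = (il - al) - margin
    if ib > ab:                # protrudes past the bottom side
        dy = (ib - ab) + margin
    elif it < at:              # protrudes past the top side
        dy = (it - at) - margin
    return (al + dx, at + dy, ar + dx, ab + dy)
```

When the current image lies entirely inside the area, the area is returned unchanged, which matches the case shown in part (a) of Figure 36.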
Part (c) of Figure 36 shows a display area 765 after being moved on the basis of the difference 763 calculated in the state shown in part (b) of Figure 36.
Figure 37 includes diagrams showing an example of transitions in the case where the display area is moved using the moving method shown in Figure 36. Part (a) of Figure 37 is a diagram showing an example of the transition of the display area on the image memory 684 in the case where the display area is moved, and part (b) of Figure 37 is a diagram showing an example of the transition of the images displayed on the display unit 689 in that case. As shown in these diagrams, even when the images following a current image 767 protrude from a display area 766, the display area 766 can be moved sequentially in accordance with the position of the current image. For example, when the current image on the image memory 684 advances from the current image 767 to a current image 769, the display area 766 moves to the position of a display area 768 in accordance with this movement. In this case, the image displayed on the display unit 689 changes from an image 770 to an image 771. In this manner, even when the image displayed on the display unit 689 is to be enlarged/reduced, the whole current image can be constantly displayed on the display unit 689.
Next, the case where the current image corresponding to the current frame is fixed on the display unit 689 will be described in detail with reference to the drawings.
Figure 38 includes diagrams schematically showing the relationship between the individual frames of the moving image file stored in the moving image storage unit 660 according to the embodiment of the present invention and the display area. Here, as in Figure 35, only the image memory 684, the metadata storage unit 670, and the operation acceptance unit 688 are illustrated, and the structures other than these parts are omitted from the drawing. In addition, by way of example, the case is described in which a composite image is generated in the image memory 684 from frames "1" to "3" included in the moving image file 661 shown in part (b) of Figure 33, using the affine transformation parameters 676 stored in the metadata file 671.
In the part (a) of Figure 38,, show the situation that frame 1 (664) is kept in the video memory 684 as in the part (a) of Figure 35.Note, because therefore identical shown in the part (a) of the position of image 351 shown in the part (a) of Figure 38 and viewing area 361 and big or small and Figure 35, omit detailed description here.Notice that in this example, the viewing area is transformed with the conversion of present image.But, since with frame 1 (664) corresponding affine transformation parameter be the parameter of unit matrix, therefore, only by taking into account to determine and frame 1 (664) corresponding viewing area 361 from the appointment to showing enlargement ratio of operation acceptance unit 688.
Part (b) of Figure 38, as in part (b) of Figure 35, shows the case where frame 2 (665) is held in the image memory 684. In this case, as in part (b) of Figure 35, the image 352 corresponding to frame 2 (665) is transformed, and the transformed image 352 is overwritten onto and combined with the image 351. In addition, the affine transformation is also applied to the display area. That is, with the position and size of the image 351 as a reference, the image 352 corresponding to frame 2 (665) is transformed using the affine transformation parameters 676 stored in the metadata file 671 in association with the frame numbers 675 "1" and "2". The position and size of the image 352 are then transformed using the affine transformation parameters determined according to the display-magnification value accepted by the operation acceptance unit 688, and the area determined by the transformed position and size becomes the display area 362. Specifically, when the matrices of the affine transformation parameters corresponding to the frame numbers 675 "1" and "2" are A1 and A2, and the matrix of the affine transformation parameters determined according to the display-magnification value accepted by the operation acceptance unit 688 is B (for example, a matrix with the current image as a reference), the value of A1 × A2 × B is obtained, and the display area 362 is determined using the obtained matrix A1 × A2 × B with the position and size of the image 351 as a reference.
Part (c) of Figure 38, as in part (c) of Figure 35, shows the case where frame 3 is held in the image memory 684. In this case as well, as described above, the image 353 corresponding to the current frame 3 is transformed, and the transformed image 353 is overwritten onto and combined with the images 351 and 352. In addition, the affine transformation is also applied to the display area, and the display area 363 is determined with respect to the image 353. Specifically, when the matrices of the affine transformation parameters corresponding to the frame numbers 675 "1" to "3" are A1 to A3, and the matrix of the affine transformation parameters determined according to the display-magnification value accepted by the operation acceptance unit 688 is B, the value of A1 × A2 × A3 × B is obtained, and the display area 363 is determined using the obtained matrix A1 × A2 × A3 × B with the position and size of the image 351 as a reference.
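The display-area computation described above can be sketched as follows: the per-frame affine matrices A1, ..., Ai are multiplied together with the display-magnification matrix B, and the product maps the corners of the reference image to the display area. All concrete matrix values below (the translation for frame 2, the 0.5× magnification, the 320×240 reference size) are assumptions for illustration only, not values from the patent.

```python
import numpy as np

def display_area_matrix(frame_params, zoom_matrix):
    """Compose A1 x A2 x ... x Ai x B as 3x3 homogeneous matrices."""
    m = np.eye(3)
    for a in frame_params:
        m = m @ a
    return m @ zoom_matrix

def transform_corners(m, width, height):
    """Apply the composed matrix to the four corners of the reference image."""
    corners = np.array([[0, 0, 1], [width, 0, 1],
                        [width, height, 1], [0, height, 1]], dtype=float)
    return (m @ corners.T).T[:, :2]

A1 = np.eye(3)                        # frame 1: unit matrix, as stated in the text
A2 = np.array([[1.0, 0.0, 20.0],      # hypothetical camera motion for frame 2
               [0.0, 1.0,  5.0],
               [0.0, 0.0,  1.0]])
B = np.diag([0.5, 0.5, 1.0])          # hypothetical display-magnification matrix

area_362 = transform_corners(display_area_matrix([A1, A2], B), 320, 240)
```

For frame 3, the same function would simply be called with `[A1, A2, A3]`, mirroring the A1 × A2 × A3 × B product in the text.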
Figure 39 comprises diagrams showing an overview of the enlargement method in the case where, with the display mode in which the current image is fixed on the display unit 689 designated, the moving image displayed on the display unit 689 is enlarged and displayed. Part (a) of Figure 39 is a diagram schematically showing the transition of the display area in the case where the moving image displayed on the display unit 689 is enlarged and displayed. Part (b) of Figure 39 is a diagram showing display examples in the case where the images within the display areas 698 and 699 shown in part (a) of Figure 39 are displayed on the display unit 689.
Part (b) of Figure 39 shows the image 700 that is extracted from the image memory 684 according to the display area 698 shown in part (a) of Figure 39 and displayed on the display unit 689. Here, when the operation acceptance unit 688 accepts an enlargement/display instruction operation in the state where the image 700 shown in part (b) of Figure 39 is displayed, the display area extraction unit 685 reduces the size of the display area 698 in accordance with the enlargement/display instruction operation. Note that this size-reduction processing is performed with the current image 697 at the center. That is, as described above, the position and size of the image 679 are transformed using the affine transformation parameters determined according to the display-magnification value accepted by the operation acceptance unit 688, and the display area 698 is determined from the transformed position and size. In this example, since an operation input that enlarges the display magnification has been given, the zoom components of the affine transformation parameters are determined according to the enlargement of the display magnification.
For example, as shown in part (a) of Figure 39, the display area 698 is reduced in size and becomes the display area 699. Part (b) of Figure 39 shows the image 701 that is extracted from the image memory 684 according to the display area 699 shown in part (a) of Figure 39 and displayed on the display unit 689. In this way, merely by changing the size of the display area, the image including the current image can be enlarged or reduced and displayed.
As described above, by displaying the images within the range of the display area arranged in the image memory 684, the composite image being played back can be displayed sequentially. Here, when the current image is affine-transformed and combined in the image memory 684, image-quality conversion such as resolution conversion to a lower resolution or compression processing may be applied to the current image. Consequently, when the current image is to be enlarged and displayed at a high display magnification, the composite image including the current image may become blurred. Therefore, in this example, for the current image being played back, the composite image is displayed using the image as it exists before the combination in the image memory 684 is performed. This display method will be described in detail below with reference to the drawings.
Figures 40 and 41 comprise diagrams schematically showing the flow of each frame of the moving-image file stored in the moving-image storage unit 660 in the embodiment of the present invention. Here, only the relationships among the moving-image storage unit 660, the metadata storage unit 670, the image memory 684, and the display memory 686 are illustrated, and the other components are omitted from the drawings. Figure 40 shows the case where the composite image corresponding to the frames preceding the current frame is fixed on the display unit 689, and Figure 41 shows the case where the current image corresponding to the current frame is fixed on the display unit 689.
In part (a) of Figure 40, the moving-image file 661 and the metadata file 671 shown in part (b) of Figure 33 are illustrated in simplified form. Below, an example in which the image corresponding to frame i (666) included in the moving-image file 661 is displayed will be described. That is, it is assumed that a composite image has been generated from the images corresponding to frames 1 to "i-1" constituting the moving-image file 661. It is also assumed that the display area 361 shown in Figure 35 has been moved to the right in accordance with the movement of the current image.
Part (b) of Figure 40 schematically shows the image memory 684 holding the composite image generated by combining the images corresponding to the frames constituting the moving-image file 661. As shown in part (b) of Figure 35, the image 351 corresponding to frame 1 (664) included in the moving-image file 661 is first held in the image memory 684. After the image 351 is held in the image memory 684, the images corresponding to frames 2 to "i-1" constituting the moving-image file 661 are sequentially affine-transformed using the values of the affine transformation parameters 676 stored in the metadata file 671 in association with frames 2 to "i-1", and the affine-transformed images are sequentially overwritten and held in the image memory 684. Frame by frame, the display area extraction unit 685 extracts the image present within the display area from the composite image held in the image memory 684, the display area being determined according to the operation input relating to the designation of the display magnification from the operation acceptance unit 688.
In the state where the composite image of the images corresponding to frames 1 to "i-1" is held in the image memory 684, the image corresponding to frame i (666) included in the moving-image file 661 is affine-transformed using the affine transformation parameters 676 stored in the metadata file 671 in association with frames 1 to i, and the affine-transformed current image 692 is overwritten and held in the image memory 684. From the composite image held in the image memory 684, the display area extraction unit 685 extracts the image present within the display area 690 determined according to the operation input relating to the designation of the display magnification from the operation acceptance unit 688, and causes the display memory 686 to hold the extracted image, as shown in part (c) of Figure 40.
Part (c) of Figure 40 schematically shows the display memory 686 holding the image extracted by the display area extraction unit 685. Here, of the image extracted by the display area extraction unit 685, the current image 693 corresponding to the current frame is not the current image 692 extracted from the image memory 684 by the display area extraction unit 685, but an image that has been obtained from the moving-image storage unit 660 and affine-transformed by the image transformation unit 682. The holding position of the current image 693 in the display memory 686 can be determined based on the position and size of the current image 692 in the image memory 684 and the position and size of the display area 690 in the image memory 684. For example, when the matrices of the affine transformation parameters stored in the metadata file 671 in association with the frame numbers 675 "1" to "i" are A1, ..., Ai, and the matrix of the affine transformation parameters used to determine the display area 690 (for example, a matrix with the image memory 684 as a reference) is C, the holding position of the current image 693 in the display memory 686 can be determined using Inv(C) × A1 × ... × Ai, with the position of the image 351 as a reference.
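The mapping Inv(C) × A1 × ... × Ai described above can be sketched as a composition of homogeneous matrices: the cumulative frame transforms place the current image in image-memory coordinates, and the inverse of the display-area matrix C converts that position into display-memory coordinates. The concrete translation values below are hypothetical, chosen only to make the example checkable.

```python
import numpy as np

def position_in_display_memory(frame_matrices, area_matrix_c):
    """Compute Inv(C) x A1 x ... x Ai, which maps the current image's
    image-memory position into display-memory coordinates."""
    m = np.eye(3)
    for a in frame_matrices:
        m = m @ a
    return np.linalg.inv(area_matrix_c) @ m

A1 = np.eye(3)                       # frame 1: unit matrix
A2 = np.array([[1.0, 0.0, 40.0],     # hypothetical cumulative motion up to frame i
               [0.0, 1.0, 10.0],
               [0.0, 0.0,  1.0]])
C = np.array([[1.0, 0.0, 30.0],      # hypothetical display-area offset in image memory
              [0.0, 1.0,  0.0],
              [0.0, 0.0,  1.0]])

m = position_in_display_memory([A1, A2], C)
origin = m @ np.array([0.0, 0.0, 1.0])   # where the current image's origin lands
```

With these values the current image sits 40 pixels right of the reference while the display area starts 30 pixels in, so the origin lands at (10, 10) in display memory.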
As shown in part (c) of Figure 40, the image extracted by the display area extraction unit 685 is held in the display memory 686. Further, the image obtained from the moving-image storage unit 660 and affine-transformed by the image transformation unit 682 is written over the image extracted by the display area extraction unit 685, and the composite image is held in the display memory 686. The image held in the display memory 686 is displayed on the display unit 689. In this way, for the current image, a relatively clean image can be displayed by using the image in the state before processing such as size reduction is applied to the affine-transformed image and it is held in the image memory 684. Moreover, even when enlargement or the like is performed in response to a user operation, the current image can be displayed in a clean state.
In part (a) of Figure 41, the moving-image file 661 and the metadata file 671 shown in part (a) of Figure 33 are illustrated in simplified form. Note that the moving-image storage unit 660 and the metadata storage unit 670 shown in part (a) of Figure 41, and the composite image held in the image memory 684 shown in part (b) of Figure 41, are the same as those in parts (a) and (b) of Figure 40; their description is therefore omitted here.
Part (b) of Figure 41 schematically shows the image memory 684 holding the composite image from the image 351 to the current image 692 shown in part (b) of Figure 40, with the display area 361 shown in part (b) of Figure 38 indicated by a dashed line. In this example, as shown in Figure 38, in order to fix the position of the current image corresponding to the current frame on the display unit 689, the display area is calculated by affine transformation based on the current image 692. That is, with the image 351 as a reference, the image corresponding to frame i (666) is transformed into the current image 692 using the affine transformation parameters 676 stored in the metadata file 671 in association with the frame numbers 675 "1" to "i", and the current image 692 is held in the image memory 684. For the display area 695 corresponding to the current frame i (666), the position and size of the current image 692 are transformed using the affine transformation parameters determined according to the display-magnification value accepted by the operation acceptance unit 688, and the display area 695 is determined from the transformed position and size. The determination of the display area is performed by the display area extraction unit 685.
Part (c) of Figure 41 schematically shows the display memory 686 holding the image extracted by the display area extraction unit 685. Here, the image held in the display memory 686 (the image other than the current image 696) is the image obtained by transforming the image extracted by the display area extraction unit 685 (the image present within the range of the display area 695) using the inverse of the matrix of the affine transformation parameters used to transform the display area 695. That is, the shape of the display area arranged in the image memory 684 may become a parallelogram or the like as a result of the affine transformation. In order to display the composite image within the affine-transformed display area on the display unit 689 in the same orientation as before, the composite image within the display area is transformed using the inverse of the matrix of the affine transformation parameters currently used to affine-transform the current image. For example, when the matrices of the affine transformation parameters stored in the metadata file 671 in association with the frame numbers 675 "1" to "i" are A1, ..., Ai, and the matrix of the affine transformation parameters used to determine the display area 695 (for example, a matrix with the current image as a reference) is B, Inv(A1 × ... × Ai × B) is used as the matrix for transforming the composite image within the display area. Accordingly, for example, as shown in part (c) of Figure 41, an image that has been transformed into a parallelogram can be transformed into a rectangle and displayed on the display unit 689. Further, of the image extracted by the display area extraction unit 685, the image corresponding to the current frame is an image that has been obtained from the moving-image storage unit 660 without being affine-transformed, rather than the image extracted from the image memory 684 by the display area extraction unit 685. Here, the holding position and size of the current image 696 in the display memory 686 are determined according to the display magnification from the operation acceptance unit 688.
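The inverse-matrix step above can be sketched numerically: the cumulative transform A1 × ... × Ai × B may shear the display area into a parallelogram, and applying Inv(A1 × ... × Ai × B) restores the extracted content to a rectangle before display. The shear value below is a hypothetical example, not a value from the patent.

```python
import numpy as np

A_cum = np.array([[1.0, 0.3, 0.0],   # assumed cumulative A1 x ... x Ai (a shear)
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])
B = np.eye(3)                        # display-magnification matrix (identity here)

inverse = np.linalg.inv(A_cum @ B)   # Inv(A1 x ... x Ai x B)

# A unit-square display area sheared into a parallelogram...
corners = np.array([[0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1]], dtype=float).T
sheared = (A_cum @ B) @ corners
# ...comes back to the original rectangle under the inverse transform.
restored = inverse @ sheared
```

This is exactly why the displayed composite image in part (c) of Figure 41 appears as an upright rectangle even though its source region in the image memory is a parallelogram.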
As shown in part (c) of Figure 41, the image extracted by the display area extraction unit 685 is held in the display memory 686. Further, the image obtained from the moving-image storage unit 660 is written over the image extracted by the display area extraction unit 685, and the composite image is held in the display memory 686. Thus, when the display mode in which the current image is displayed at a fixed position has been designated, the composite image that has undergone the affine transformation can be restored by the inverse matrix to its state before the affine transformation, and the composite image can be displayed without the affine transformation. Also, for the current image, a relatively clear image can be displayed, as in Figure 40.
As described above, since the composite image held in the image memory 684 is generated by the same generation method in both cases and the playback of the moving image can be realized in the two display modes, switching between the two display modes can be performed during the playback of the moving image. Accordingly, even during playback, a viewer of the moving image being played back can switch to the desired display mode. For example, while a moving image is being played back in the display mode shown in Figure 40, if a person the viewer likes appears at the center of the current image and the viewer wishes to place that person at the central part of the display unit 689 and watch him or her, the display mode can be switched by a display-mode switching operation from the operation acceptance unit 688, so that the moving image is played back in the display mode shown in Figure 41. Moreover, an image obtained from the moving-image storage unit 660, rather than the composite image held in the image memory 684, can be used as the current image, so a relatively clear image can be viewed. This display example will be described in detail with reference to Figures 42 and 43.
Part (a) of Figure 42 is a diagram showing a display example in the case where a moving image captured by a camera is played back. In this example, the image 750 is shown during the playback of a moving image of a parent and child playing on the lawn in front of a building, captured while the camera was being moved in the horizontal direction. Here, in the image 750, the images 751 combined from the frames constituting the moving image form a panorama. The image corresponding to the current frame in the image 750 is the current image 752.
Here, the case of enlarging and displaying the image area surrounded by the border 753 will be described. When the image displayed on the display unit 689 is to be enlarged or reduced and displayed, the user operates the display-magnification designation key on the operation acceptance unit 688, whereby the user can designate a desired display magnification. For example, as shown in part (a) of Figure 42, in the case where the image 750 is displayed on the display unit 689, if the image area surrounded by the border 753 is to be enlarged and displayed, the user designates the display magnification by operating the display-magnification designation key on the operation acceptance unit 688 and additionally designates the position, whereby the image area surrounded by the border 753 can be enlarged and displayed.
Part (b) of Figure 42 is a diagram showing the image 754 in the state before the affine transformation is applied to the current image 752 in the image 750.
Part (a) of Figure 43 is a diagram showing the image 755 in the case where the image area surrounded by the border 753 shown in part (a) of Figure 42 is enlarged and displayed. The image 755 shown in part (a) of Figure 43 is an image generated by performing the combination in the display memory 686 with the current image in the state before it is affine-transformed and held in the image memory 684. In this way, in the area of the current image 756, a relatively fine image in the state before being held in the image memory 684 is displayed. Therefore, when the current image 756 is compared with the area other than it, the current image 756 can be seen relatively more clearly than the other area. In contrast, the image 757 shown in part (b) of Figure 43 is an image held in the display memory 686 in the state where the affine-transformed current image has been held in the image memory 684. When the image is displayed in this way, even in the area of the current image 758, an image of the same quality as that of the other area is displayed. That is, according to the embodiment of the present invention, when images are combined and the composite image is displayed, the history image held in the display memory 686 may be compressed, whereas an uncompressed image or an image with a higher resolution than the history image can be used as the current image. Therefore, high-quality image combination and display can be realized.
Figures 44 and 45 are flowcharts showing the processing procedure of the moving-image playback processing performed by the image processing apparatus 680 in the embodiment of the present invention. Note that in the processing procedures shown in Figures 44 and 45, steps S921, S925, S926, S928, and S929 are similar to the processing procedure shown in Figure 25; therefore, the same reference numerals are given to them and their description is omitted here.
In response to an operation input from the operation acceptance unit 688, the file acquisition unit 681 acquires a moving-image file stored in the moving-image storage unit 660, and acquires the metadata file stored in the metadata storage unit 670 in association with the moving-image file (step S961). Then, the file acquisition unit 681 decodes the moving-image file and acquires the current frame, i.e., one frame included in the moving-image file (step S962). Then, the file acquisition unit 681 acquires the affine transformation parameters corresponding to the acquired current frame from the metadata file (step S963).
Then, after the affine-transformed current image corresponding to the current frame is written over the composite image and held in the image memory 170 (step S926), the display area extraction unit 685 judges whether the display mode in which the current image is fixed has been designated (step S964). When the display mode in which the current image is fixed has been designated (step S964), the display area extraction unit 685 determines the position and size of the display area using the affine transformation parameters from the first frame to the current frame and the affine transformation parameters corresponding to the display magnification (step S965). Then, the display area extraction unit 685 extracts the composite image included in the display area from the image memory 684 (step S966). Then, the display area extraction unit 685 affine-transforms the composite image extracted from the image memory 684, using the inverse of the matrix of the affine transformation parameters used to determine the display area (step S967).
Then, the display area extraction unit 685 holds the affine-transformed composite image extracted from the image memory 684 in the display memory 686 (step S968). Then, the image combining unit 683 writes the current image over the composite image held in the display memory 686 and combines them (step S969). Then, the display unit 689 displays the composite image held in the display memory 686 (step S970).
When the display mode in which the current image is fixed has not been designated (step S964), the display area extraction unit 685 determines the position and size of the display area using the affine transformation parameters corresponding to the display magnification (step S971). Note that when the display area has been moved in accordance with the transformation of the current image, the position of the display area that has just been moved may be used.
Then, the display area extraction unit 685 judges whether the current image held in the image memory 684 protrudes from the display area (step S972). When the current image held in the image memory 684 does not protrude from the display area (that is, when the entire current image is included within the range of the display area) (step S972), the display area extraction unit 685 extracts the composite image included in the display area from the image memory 684 (step S973). Then, the display area extraction unit 685 holds the composite image extracted from the image memory 684 in the display memory 686 (step S974).
Then, the display area extraction unit 685 determines the position of the current image in the display memory 686, using the matrix of the affine transformation parameters used to transform the current image and the inverse of the matrix of the affine transformation parameters used to determine the display area (step S975). Then, the image combining unit 683 writes the affine-transformed current image over the composite image held in the display memory 686 and combines them (step S976). Then, the flow advances to step S970.
On the other hand, when the current image held in the image memory 684 protrudes from the display area (that is, when at least a portion of the current image is not included within the range of the display area) (step S972), the display area extraction unit 685 calculates the difference between the side of the display area and the portion of the current image protruding from the display area (step S977). Then, the display area extraction unit 685 moves the display area based on the calculated difference (step S978). Then, the flow advances to step S973.
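One axis of the protrusion check and shift in steps S977-S978 can be sketched as follows. This is a minimal interpretation assuming the current image fits within the display area on each axis; the function name and interval representation are illustrative, not from the patent.

```python
def shift_display_area(area_min, area_max, image_min, image_max):
    """If the current image protrudes past either side of the display area
    along one axis, shift the area by the protrusion difference so the
    whole image falls back inside it (cf. steps S977-S978)."""
    shift = 0.0
    if image_max > area_max:        # protrudes past the far side
        shift = image_max - area_max
    elif image_min < area_min:      # protrudes past the near side
        shift = image_min - area_min
    return area_min + shift, area_max + shift
```

For a 100-pixel-wide display area at [0, 100], a current image spanning [50, 120] would move the area to [20, 120], while an image spanning [10, 90] would leave it unchanged; the same check would be applied independently on the vertical axis.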
Next, the case where the feature point extraction processing and the optical flow calculation processing in the embodiment of the present invention are performed using a multi-core processor will be described in detail with reference to the drawings.
Figure 46 is a diagram showing a structural example of the multi-core processor 800 in the embodiment of the present invention. The multi-core processor 800 is a processor in which a plurality of processor cores of different types are mounted in a single CPU (central processing unit) package. That is, in order to maintain the processing capability of each processor core and to keep the configuration simple, a plurality of processor cores of the following two types are mounted in the multi-core processor 800: one type suited to all applications, and another type optimized to a certain extent for predetermined applications.
The multi-core processor 800 includes a control processor core 801, arithmetic processor cores (#1) 811 to (#8) 818, and a bus 802, and is connected to a main memory 781. The multi-core processor 800 is also connected to other devices such as a graphics device 782 and an I/O device 783. As the multi-core processor 800, for example, the "Cell (Cell Broadband Engine)", a microprocessor developed by the applicant of the present application and others, can be adopted.
The control processor core 801 is a control processor core that mainly performs frequent thread switching, as in an operating system. The control processor core 801 will be described in detail with reference to Figure 47.
The arithmetic processor cores (#1) 811 to (#8) 818 are simple, small arithmetic processor cores suited to multimedia processing. The arithmetic processor cores (#1) 811 to (#8) 818 will be described in detail with reference to Figure 48.
The bus 802 is a high-speed bus called an EIB (Element Interconnect Bus). The control processor core 801 and each of the arithmetic processor cores (#1) 811 to (#8) 818 are connected to the bus 802, and each processor core accesses data via the bus 802.
The main memory 781 is a main memory connected to the bus 802. The main memory 781 stores various programs to be loaded into each processor core and the data required for the processing performed by each processor core, and also stores the data processed by each processor core.
The graphics device 782 is a graphics device connected to the bus 802, and the I/O device 783 is an external input/output device connected to the bus 802.
Figure 47 is a diagram showing a structural example of the control processor core 801 in the embodiment of the present invention. The control processor core 801 includes a control processor unit 803 and a control processor storage system 806.
The control processor unit 803 is a unit serving as the core that performs the arithmetic processing executed by the control processor core 801, and includes an instruction set based on a microprocessor architecture. An instruction cache 804 and a data cache 805 are mounted as primary caches. The instruction cache 804 is, for example, a 32 KB instruction cache, and the data cache 805 is, for example, a 32 KB data cache.
The control processor storage system 806 is a unit that controls data access from the control processor unit 803 to the main memory 781. A 512 KB secondary cache 807 is mounted in order to increase the speed of memory access from the control processor unit 803.
Figure 48 is a diagram showing a structural example of the arithmetic processor core (#1) 811 in the embodiment of the present invention. The arithmetic processor core (#1) 811 includes an arithmetic processor unit 820 and a memory flow controller 822. Note that since the arithmetic processor cores (#2) 812 to (#8) 818 have structures similar to that of the arithmetic processor core (#1) 811, their description is omitted here.
The arithmetic processor unit 820 is a unit serving as the core that performs the arithmetic processing executed by the arithmetic processor core (#1) 811, and includes a unique instruction set different from that of the control processor unit 803 in the control processor core 801. A local store (LS) 821 is also mounted in the arithmetic processor unit 820.
The local store (LS) 821 is a dedicated memory of the arithmetic processor unit 820 and, at the same time, the only memory that can be referenced directly from the arithmetic processor unit 820. A memory with a capacity of, for example, 256 KB can be used as the local store 821. Note that for the arithmetic processor unit 820 to access the main memory 781 or a local store in another arithmetic processor core (the arithmetic processor cores (#2) 812 to (#8) 818), it needs to use the memory flow controller 822.
The memory flow controller 822 is a unit for exchanging data with the main memory 781, the other arithmetic processor cores, and the like, and is a unit called an MFC (Memory Flow Controller). Here, the arithmetic processor unit 820 requests the memory flow controller 822 to transfer data via an interface called a channel.
Various programming models have been proposed for the multi-core processor 800 described above. Among these programming models, a model in which a main program is executed on the control processor core 801 and subprograms are executed on the arithmetic processor cores (#1) 811 to (#8) 818 is known as the basic model. In the embodiment of the present invention, the operation method of the multi-core processor 800 using this model will be described in detail with reference to the drawings.
Figure 49 is a diagram schematically showing the operation method of the multi-core processor 800 in the embodiment of the present invention. In this example, the following situation is shown by way of example: when the control processor core 801 executes a task 784 using data 785, it causes each arithmetic processor core to execute a task 786 (part of the task 784) using the data 787 (part of the data 785) required for processing the task 786.
In the embodiment of the present invention, each arithmetic processor core performs arithmetic processing on each of the frames constituting the moving image.
As shown in the figure, when polycaryon processor 800 executable operations, arithmetic processor nuclear (#1) 811 can be used and many operations can be performed in the short relatively time by parallel to (#8) 818.Perhaps, can be by to (#8) 818, using SIMD (single instrction/multidata) operation to utilize the many relatively calculation process of a small amount of command execution at arithmetic processor nuclear (#1) 811.Note, will describe the SIMD operation in detail with reference to Figure 53 to 56 grade.
Fig. 50 is a diagram schematically illustrating programs and data flow in the case where the multi-core processor 800 performs operations in the embodiment of the present invention. Here, the description takes, by way of example, the arithmetic processor core (#1) 811 among the arithmetic processor cores (#1) 811 to (#8) 818; however, the arithmetic processor cores (#2) 812 to (#8) 818 can similarly be used to perform the operations.
First, the control processor core 801 sends the arithmetic processor core (#1) 811 an instruction to load an arithmetic processor core program 823 stored in the main memory 781 into the local store 821 of the arithmetic processor core (#1) 811. Accordingly, the arithmetic processor core (#1) 811 loads the arithmetic processor core program 823 stored in the main memory 781 into the local store 821.
Then, the control processor core 801 instructs the arithmetic processor core (#1) 811 to execute an arithmetic processor core program 825 stored in the local store 821.
Then, the arithmetic processor core (#1) 811 transfers, from the main memory 781 to the local store 821, data 824 required for executing the arithmetic processor core program 825 stored in the local store 821.
Then, on the basis of the arithmetic processor core program 825 stored in the local store 821, the arithmetic processor core (#1) 811 processes data 826 transferred from the main memory 781, performs processing according to the conditions, and stores the processing result in the local store 821.
Then, the arithmetic processor core (#1) 811 transfers, from the local store 821 to the main memory 781, the result of the processing executed on the basis of the arithmetic processor core program 825 stored in the local store 821.
Then, the arithmetic processor core (#1) 811 notifies the control processor core 801 of the termination of the arithmetic processing.
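The load/transfer/execute/write-back/notify sequence above can be modeled as a small simulation. This is only an illustrative sketch of the protocol described in the text, not the real Cell/MFC API; the class, dictionary keys, and the doubling "core program" are all made up for illustration.

```python
# Hypothetical model of the control-core / arithmetic-core protocol:
# (1) load the core program into the local store, (2) DMA the required data in,
# (3) process it locally, (4) DMA the result back, (5) notify completion.
class ArithmeticCore:
    """Models one arithmetic processor core with a private local store."""
    def __init__(self):
        self.local_store = {}  # stands in for the 256 KB local store 821

    def load_program(self, main_memory, name):
        # step (1): copy the core program from main memory into the local store
        self.local_store[name] = main_memory[name]

    def run(self, main_memory, program, data_key):
        # step (2): transfer the required data into the local store
        self.local_store[data_key] = main_memory[data_key]
        # step (3): process the data locally
        result = self.local_store[program](self.local_store[data_key])
        # step (4): transfer the result back to main memory
        main_memory[data_key + "_result"] = result
        # step (5): notify the control core of termination
        return "done"

main_memory = {"core_program": lambda frame: [2 * x for x in frame],
               "frame_data": [1, 2, 3, 4]}
core = ArithmeticCore()
core.load_program(main_memory, "core_program")
status = core.run(main_memory, "core_program", "frame_data")
print(status, main_memory["frame_data_result"])
```

In the embodiment, the control core would repeat this sequence for each frame of the moving image, with one such exchange per arithmetic core.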
Next, the SIMD operations performed using the multi-core processor 800 will be described in detail with reference to the drawings. Here, a SIMD operation is an operation scheme in which processing of multiple data items is performed with a single instruction.
Part (a) of Fig. 51 is a diagram schematically illustrating an overview of an operation scheme in which multiple data items are processed with an individual instruction for each item. The operation scheme shown in part (a) of Fig. 51 is an ordinary operation scheme and is called, for example, a scalar operation. For example, an instruction to add data "A1" and data "B1" gives the result as data "C1". The other three operations are performed similarly: in each piece of processing, an instruction is executed to add the data items "A2", "A3", and "A4" and the data items "B2", "B3", and "B4" in the same rows. With these instructions, the values in each row are added, and the results are obtained as "C2", "C3", and "C4". As above, in scalar operations, in order to process multiple data items, an instruction needs to be executed for each of the data items.
Part (b) of Fig. 51 is a diagram schematically illustrating an overview of a SIMD operation, i.e., an operation scheme in which multiple data items are processed with a single instruction. Here, the data items grouped for SIMD operations (the data items surrounded by dotted lines 827 and 828) may be called vector data, and SIMD operations performed using such vector data may be called vector operations.
For example, a single instruction to add the vector data surrounded by the dotted line 827 ("A1", "A2", "A3", and "A4") and the vector data surrounded by the dotted line 828 ("B1", "B2", "B3", and "B4") gives the results "C1", "C2", "C3", and "C4" (the data items surrounded by a dotted line 829). As above, since multiple data items can be processed with a single instruction in a SIMD operation, arithmetic processing can be performed quickly. Also, the control processor core 801 of the multi-core processor 800 executes the instructions for these SIMD operations, and the arithmetic processor cores (#1) 811 to (#8) 818 perform the arithmetic processing of the data items in parallel in response to the instructions.
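The contrast in Fig. 51 can be sketched in a few lines. Plain Python lists stand in for the four-lane vector registers here; this is only an illustration of the scalar-versus-vector idea, not real SIMD hardware.

```python
# Scalar vs. vector-style addition, illustrating Fig. 51. The scalar version
# issues one add per pair of items; the "SIMD" version stands in for a single
# vector instruction that adds all four lanes at once.
a = [1, 2, 3, 4]        # "A1".."A4"
b = [10, 20, 30, 40]    # "B1".."B4"

# scalar operation: one instruction per data item
c_scalar = []
for i in range(4):
    c_scalar.append(a[i] + b[i])

# vector operation: conceptually a single instruction over the whole vector
def simd_add(va, vb):
    return [x + y for x, y in zip(va, vb)]

c_simd = simd_add(a, b)
print(c_scalar == c_simd)   # both give "C1".."C4"
```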
In contrast, for example, processing such as adding data "A1" and "B1", subtracting data "B2" from "A2", multiplying data "A3" and "B3", and dividing data "A4" by "B4" cannot be performed by a SIMD operation. That is, when different processing is to be performed on each of multiple data items, the processing cannot be performed by a SIMD operation.
Next, a specific operation method of SIMD operations in the case of performing the feature point extraction processing and the optical flow computation processing will be described in detail with reference to the drawings.
Fig. 52 is a diagram illustrating a structural example of the programs executed by the control processor core 801 or the arithmetic processor core (#1) 811 in the embodiment of the present invention. Here, only the arithmetic processor core (#1) 811 is shown; however, the arithmetic processor cores (#2) 812 to (#8) 818 perform similar processing.
The control processor core 801 executes decode 852, interlace 853, and resize 854 as decode 851. The decode 852 is a process of decoding a moving image file. The interlace 853 is a process of deinterlacing each decoded frame. The resize 854 is a process of reducing the size of each deinterlaced frame.
Also, the control processor core 801 executes instruction sending 857 and 859 and termination notification reception 858 and 860 as arithmetic processor core management 856. The instruction sending 857 and 859 is a process of sending, to the arithmetic processor cores (#1) 811 to (#8) 818, instructions to execute SIMD operations. The termination notification reception 858 and 860 is a process of receiving, from the arithmetic processor cores (#1) 811 to (#8) 818, termination notifications of the SIMD operations in response to the above instructions. Furthermore, the control processor core 801 executes camera operation parameter computation processing 862 as camera operation detection 861. The camera operation parameter computation processing 862 is a process of computing affine transformation parameters frame by frame on the basis of the optical flows computed by the SIMD operations performed by the arithmetic processor cores (#1) 811 to (#8) 818.
The arithmetic processor core (#1) 811 executes Sobel filtering 864, second moment matrix computation 865, separable filtering 866, Harris corner extraction (Calc Harris) 867, dilation 868, and sorting 869, as feature point extraction processing 863.
The Sobel filtering 864 is a process of computing a value dx in the x direction obtained using a P2 filter (x direction) and a value dy in the y direction obtained using a y-direction filter. Note that the computation of the value dx in the x direction will be described in detail with reference to Figs. 53 to 56.
The second moment matrix computation 865 is a process of computing the values dx², dy², and dx·dy using the dx and dy computed by the Sobel filtering 864.
The separable filtering 866 is a process of applying Gaussian filtering (blurring) to the values dx², dy², and dx·dy obtained by the second moment matrix computation 865.
The Harris corner extraction 867 is a process of computing the score of a Harris corner using the values dx², dy², and dx·dy to which blurring has been applied by the separable filtering 866. The score S of a Harris corner is computed, for example, by the following equation:
S = (dx² × dy² − (dx·dy) × (dx·dy)) / (dx² + dy² + ε)
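The equation above can be written directly as a function. This is a sketch of the score computation only; the input values and the choice of ε are illustrative, and in the embodiment the inputs would be the blurred dx², dy², and dx·dy values from the separable filtering 866.

```python
# Harris corner score from the equation above:
# S = (dx2 * dy2 - (dxdy)^2) / (dx2 + dy2 + eps).
def harris_score(dx2, dy2, dxdy, eps=1e-6):
    """Per-pixel Harris response from the smoothed second-moment terms."""
    return (dx2 * dy2 - dxdy * dxdy) / (dx2 + dy2 + eps)

# strong gradients in both directions -> high score (corner-like)
corner = harris_score(100.0, 100.0, 0.0)
# gradient in one direction only -> score near zero (edge-like)
edge = harris_score(100.0, 0.0, 0.0)
print(corner > edge)
```

The ε term merely keeps the division defined where both gradients vanish, so flat regions score near zero along with edges.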
The dilation 868 is a process of applying blurring to the image constituted by the Harris corner scores computed by the Harris corner extraction 867.
The sorting 869 is a process of sorting pixels in descending order of the Harris corner scores computed by the Harris corner extraction 867, picking a predetermined number of pixels starting from the highest score, and extracting the picked points as feature points.
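The sorting step amounts to ranking pixels by score and keeping the top N. The sketch below uses made-up pixel coordinates and scores; N and the data are illustrative only.

```python
# Rank candidate pixels by Harris score in descending order and keep the
# top N positions as feature points (scores and N are made up).
scores = {(0, 0): 0.1, (5, 3): 9.5, (2, 7): 4.2, (8, 1): 7.7, (3, 3): 0.4}

def top_features(score_map, n):
    ranked = sorted(score_map.items(), key=lambda kv: kv[1], reverse=True)
    return [pos for pos, _ in ranked[:n]]

print(top_features(scores, 3))
```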
The arithmetic processor core (#1) 811 executes pyramid image generation (Make Pyramid Image) processing 871 and optical flow computation (Calc Optical Flow) processing 872, as optical flow computation processing 870.
The pyramid image generation processing 871 is a process of sequentially generating images whose sizes are reduced, over a plurality of levels, from the size of an image at the time of image capture using a camera. A generated image is called a multi-resolution image.
The optical flow computation processing 872 is a process of computing optical flows in the image with the lowest resolution among the multi-resolution images generated by the pyramid image generation processing 871, and then, using the computed result, computing optical flows again in the image with the next higher resolution. This sequence of processing is repeatedly performed until the image with the highest resolution is reached.
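The coarse-to-fine idea can be sketched in one dimension. This is only an illustration of the control flow, with a 1-D "image", halving by decimation, and a placeholder refinement step standing in for the actual per-level flow computation.

```python
# Coarse-to-fine sketch of pyramid optical flow: estimate on the smallest
# level, then refine at each level of double the resolution.
def build_pyramid(signal, levels):
    pyramid = [signal]
    for _ in range(levels - 1):
        signal = signal[::2]          # halve the resolution
        pyramid.append(signal)
    return pyramid                    # pyramid[-1] is the coarsest level

def refine_flow(levels):
    flow = 0
    for level in reversed(levels):    # coarsest -> finest
        flow *= 2                     # scale the previous estimate up
        flow += 1                     # placeholder local refinement step
    return flow

pyr = build_pyramid(list(range(16)), 3)
print([len(p) for p in pyr], refine_flow(pyr))
```

The point of the scheme is that large displacements become small at the coarsest level, so each per-level refinement only needs to correct a small residual.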
As above, for example, for the feature point extraction processing performed by the feature point extraction unit 121 shown in Fig. 1 and the optical flow computation processing performed by the optical flow computation unit 122, processing results can be obtained by performing SIMD operations in parallel using the multi-core processor 800. Note that the feature point extraction processing and optical flow computation processing shown in Fig. 52 are only exemplary. Other processing constituted by various types of filtering, threshold processing, and the like performed on the images constituting a moving image may also be performed by SIMD operations using the multi-core processor 800.
Fig. 53 is a diagram schematically illustrating the data structure and the processing flow in the case where image data stored in the main memory 781 (image data corresponding to one of the frames included in a moving image captured by a camera) is filtered using a Sobel filter 830 in the embodiment of the present invention. Note that the image data stored in the main memory 781 shown in the figure is simplified and shown as having 32 horizontal pixels. Also, the Sobel filter 830 is a 3 × 3 edge extraction filter. As shown in the figure, the image data stored in the main memory 781 is filtered using the Sobel filter 830, and the filtering result is output. In this example, an example in which four filtering results are obtained at a time using SIMD operations will be described.
Fig. 54 is a diagram schematically illustrating the data flow in the case where SIMD operations are performed on the image data stored in the main memory 781 using the Sobel filter 830 in the embodiment of the present invention. First, a predetermined number of rows (for example, three rows) including the first row of the image data stored in the main memory 781 are DMA (Direct Memory Access)-transferred to a first buffer 831 included in the local store 821 of an arithmetic processor core. In addition, the rows obtained by shifting the rows DMA-transferred to the first buffer 831 down by one row, i.e., the next predetermined number of rows, are DMA-transferred to a second buffer 832. As above, delays caused by DMA transfer can be hidden by using double buffering.
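The double-buffering scheme can be sketched as follows. The row data and the per-row "processing" are placeholders, and the alternating list indices merely model the two buffers; real DMA would overlap the prefetch with the processing in time, which a sequential sketch cannot show.

```python
# Double-buffering sketch: while the core processes the rows resident in one
# buffer, the next rows are (conceptually) DMA-transferred into the other,
# hiding the transfer latency.
def process_image(image_rows, rows_per_chunk=3):
    chunks = [image_rows[i:i + rows_per_chunk]
              for i in range(0, len(image_rows), rows_per_chunk)]
    buffers = [chunks[0], None]      # first DMA transfer fills buffer 0
    results = []
    for i in range(len(chunks)):
        if i + 1 < len(chunks):      # prefetch next rows into the idle buffer
            buffers[(i + 1) % 2] = chunks[i + 1]
        for row in buffers[i % 2]:   # process the rows already resident
            results.append(sum(row))
    return results

out = process_image([[1, 2], [3, 4], [5, 6], [7, 8], [9, 10], [11, 12]])
print(out)
```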
Fig. 55 is a diagram schematically illustrating a vector generation method of generating nine vectors from the image data stored in the first buffer 831 in the case where filtering is performed using the Sobel filter 830 in the embodiment of the present invention. As shown in Fig. 55, after the DMA transfer has been performed, nine vectors are generated from the image data stored in the first buffer 831. Specifically, vector data 841 is generated from the four data items at the left end of the first row of the image data stored in the first buffer 831. The four data items are shifted rightward by one, and vector data 842 is generated from the next four data items. Similarly, the four data items are shifted rightward by one, and vector data 843 is generated from the next four data items. In addition, vector data items 844 to 849 are similarly generated, from four data items each, from the second row and the third row.
Fig. 56 is a diagram schematically illustrating a vector operation method of performing vector operations on the vector data items 841 to 849 using SIMD instructions in the case where filtering is performed using the Sobel filter 830 in the embodiment of the present invention. Specifically, SIMD operations are sequentially performed on the vector data items 841 to 843, and a vector A is obtained. In the SIMD operations, first, a SIMD operation "'−1' × 'vector data 841'" is performed. Then, a SIMD operation "'0' × 'vector data 842'" is performed, and a SIMD operation "'1' × 'vector data 843'" is performed. Here, since the result of "'0' × 'vector data 842'" is determined to be "0", this operation can be omitted. Also, since the result of "'1' × 'vector data 843'" is determined to be the same value as the "vector data 843", this operation can be omitted.
Then, an addition of the result of "'−1' × 'vector data 841'" and the result of "'0' × 'vector data 842'" is performed by a SIMD operation. Then, an addition of the result of this addition and the result of "'1' × 'vector data 843'" is performed by a SIMD operation. Here, for example, an operation on a data structure of the form "vector data 1" × "vector data 2" + "vector data 3" can be performed by a single SIMD operation. Therefore, for the operation for the vector A, the SIMD operations "'0' × 'vector data 842'" and "'1' × 'vector data 843'" can be omitted, for example, and "'−1' × 'vector data 841'" + "vector data 843" can be executed with a single SIMD operation.
Similarly, SIMD operations are performed on the vector data items 844 to 846, and a vector B is obtained. SIMD operations are performed on the vector data items 847 to 849, and a vector C is obtained.
Then, a SIMD operation is performed on the vectors A to C obtained by the SIMD operations, and a vector D is obtained. As above, by performing SIMD operations, results whose number is the number of vector components (four data items in this example) can be obtained at a time.
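The whole windowed computation of Figs. 55 and 56 can be sketched in plain Python, with lists standing in for the four-lane SIMD registers. The Sobel x-kernel rows are assumed here to be [−1, 0, 1], [−2, 0, 2], [−1, 0, 1], the conventional coefficients; the patent text does not spell out the signs. The sample rows form a horizontal gradient, so every lane of the result is the same.

```python
# Sketch of the windowed vector computation: nine shifted 4-wide windows
# (vector data 841..849), per-row fused multiply-adds giving vectors A, B,
# and C, and a final sum giving vector D (four Sobel outputs at once).
rows = [[c + 6 * r for c in range(6)] for r in range(3)]   # 3 buffered rows
kernel = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]              # assumed Sobel-x

def simd_mul_add(coeff, vec, acc):
    """Models a fused multiply-add over a 4-lane vector register."""
    return [coeff * v + a for v, a in zip(vec, acc)]

vec_d = [0.0] * 4
for r in range(3):
    partial = [0.0] * 4                      # becomes vector A, B, or C
    for s in range(3):                       # three windows shifted by 0, 1, 2
        window = rows[r][s:s + 4]
        partial = simd_mul_add(kernel[r][s], window, partial)
    vec_d = [p + d for p, d in zip(partial, vec_d)]   # A + B + C -> D

print(vec_d)
```

The zero-coefficient multiply-adds are performed here for clarity; as the text notes, a real implementation would skip them and fold the remaining multiply and add into one fused instruction.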
After the vector D has been computed, the position at which data is taken out of the image data stored in the first buffer 831 shown in Fig. 55 is shifted rightward by one, and similar processing is repeatedly performed, whereby the vectors D are sequentially computed. When the processing up to the item at the right end of the image data stored in the first buffer 831 shown in Fig. 55 has been completed, the processing results are DMA-transferred to the main memory 781.
Then, among the image data stored in the main memory 781, the rows obtained by shifting the rows DMA-transferred to the second buffer 832 down by one row, i.e., the next predetermined number of rows, are DMA-transferred to the first buffer 831, and the above-described processing is repeatedly performed on the image data stored in the second buffer 832. Similar processing is repeatedly performed until the bottom row among the rows of the image data stored in the main memory 781 is reached.
Similarly, the major part of the feature point extraction and the optical flow computation processing is performed by SIMD operations, whereby speed improvements can be realized.
Fig. 57 is a diagram schematically illustrating the flow of the camera operation parameter computation processing performed in time series in the embodiment of the present invention. As described above, the decoding and the analysis processing of a moving image can be performed in parallel, for example, by performing SIMD operations using the multi-core processor 800. Therefore, the analysis time for one frame included in a moving image can be reduced to be shorter than the decode time.
For example, in the figure, t1 denotes the time required for the control processor core 801 to perform decode processing on one frame included in a moving image; t2 denotes the time required for the arithmetic processor cores (#1) 811 to (#8) 818 to perform feature point extraction processing on one frame included in the moving image; t3 denotes the time required for the arithmetic processor cores (#1) 811 to (#8) 818 to perform optical flow computation processing on one frame included in the moving image; and t4 denotes the time required for the control processor core 801 to perform camera operation detection processing on one frame included in the moving image. Note that t5 denotes the time required for the control processor core 801 and the arithmetic processor cores (#1) 811 to (#8) 818 to perform camera operation detection processing on one frame included in the moving image. Also, t6 denotes the time required for the control processor core 801 to perform processing for managing the arithmetic processor cores (#1) 811 to (#8) 818. For example, t1 can be set to 25.0 ms, t2 to 7.9 ms, t3 to 6.7 ms, t4 to 1.2 ms, and t5 to 15.8 ms.
Next, the case of playing moving image content using a metadata file in the embodiment of the present invention will be described in detail with reference to the drawings.
Part (a) of Fig. 58 is a top view schematically illustrating a Blu-ray Disc (registered trademark) 880 as an example of a recording medium, and part (b) of Fig. 58 is a diagram schematically illustrating data items 881 to 884 recorded on the Blu-ray Disc 880. On the Blu-ray Disc 880 are recorded, for example, moving image content 882 (a moving image captured by a camera or the like), subtitles 883 of the moving image content 882, metadata 884 obtained by analyzing the moving image content 882 (for example, the metadata file shown in part (b) of Fig. 33), and a Java (registered trademark) program 881 relating to the playback of moving images in the embodiment of the present invention.
Part (c) of Fig. 58 is a diagram schematically illustrating the internal structure of a Blu-ray player 890 capable of playing the Blu-ray Disc 880. Here, since the Blu-ray player 890 capable of playing Blu-ray Discs includes, as standard, a Java (registered trademark) VM (Java (registered trademark) virtual machine) and library 893 in addition to a CPU 891 and an OS 892, the Blu-ray player 890 can execute Java (registered trademark) programs. Therefore, when the Blu-ray Disc 880 is loaded into the Blu-ray player 890, the Blu-ray player 890 can load and execute the Java (registered trademark) program 881. Accordingly, when the Blu-ray player 890 plays the moving image content 882 using the metadata 884, the playback of moving images in the embodiment of the present invention can be performed. That is, the playback of moving images in the embodiment of the present invention can be realized on all Blu-ray players without using dedicated PC software or the like.
Also, the information processing apparatus in the embodiment of the present invention may be connected to a network such as the Internet, and may play a moving image by combining it with an image or a moving image received via the network. For example, the image processing apparatus may receive a landscape image of a certain park via the network, use the received landscape image of the park as a background image, and play a captured moving image of a child combined with this background image. Accordingly, a pseudo playback image in which the child appears to be moving through the park can be provided.
As described above, in the embodiment of the present invention, since images corresponding to frames preceding the currently displayed frame are combined with the current image and the composite image is displayed, an object serving as the center of image capture can be easily browsed together with the background and the like captured in at least part of the time period. Therefore, for example, when a viewer wishes to view the background and the like captured in at least part of the time period again, the viewer can view them simultaneously with the currently displayed image without performing a rewind operation, a search operation, or the like. Also, when browsing a moving image captured by a camera, the viewer can easily understand the details of the moving image.
Also, in the display examples shown in Figs. 7, 11, 15, and so forth, since the images corresponding to the preceding frames are fixed, the viewer can easily recognize the spatial extension. In addition, in the display examples shown in Figs. 8, 12, 16, and so forth, since the image corresponding to the current frame is displayed at a fixed position, the viewer can easily recognize the currently displayed portion.
That is, using the past frames, a moving image can be spatially expanded and appreciated. Accordingly, for example, an appreciation method in which a panoramic image is completed while the moving image is being played can be provided, whereby the viewer can appreciate the moving image with amusement.
Also, by playing a moving image using the moving image playback methods shown in Figs. 35 to 43 and so forth, switching to another display mode can be performed easily even during the playback of the moving image. Accordingly, for example, the appreciation of completing a panoramic image while playing the moving image can be enjoyed, and in addition, switching among a plurality of display modes can be performed easily, whereby the viewer can appreciate the moving image with even more amusement. Also, since the images in the state before being saved to the image memory 684 can be sequentially displayed as the current image, relatively clear images can be displayed.
Also, in the embodiment of the present invention, the example in which playback and display are performed using affine transformation parameters detected in advance has been described. However, affine transformation parameters may be computed at the time of playback, and playback and display may be performed using the computed affine transformation parameters. For example, by computing affine transformation parameters by SIMD operations using a multi-core processor, the affine transformation parameters for one frame can be computed within the time of the decode processing of one frame. Accordingly, even in the case of playing a moving image whose affine transformation parameters have not been computed, the affine transformation parameters can be computed while the moving image is being played. Therefore, appreciation in which the moving image is spatially expanded can be performed promptly.
Also, for example, when a moving image captured at SD (Standard Definition) image quality is to be appreciated on a high-definition TV (television set), or when a moving image captured using the moving image capture function of a digital still camera or a mobile phone is to be appreciated, if the moving image is displayed in the state of its original image size, the number of pixels of the high-definition TV cannot be fully utilized. Also, when enlarged display is performed, the roughness of the image becomes noticeable in many cases. Therefore, by performing the display described in the embodiment of the present invention, appreciation that makes full use of the number of pixels of the high-definition TV can be performed without making the roughness of the image noticeable.
Note that a composite image generated by the combining in steps S926, S942, S954, and so forth may be recorded on a recording medium or the like so that the composite image can be used for other types of playback and display. Also, in the embodiment of the present invention, the example in which the composite image corresponding to the frames preceding the current frame is displayed has been described. However, the composite image may also be sequentially erased as time passes. In this case, rendering may be performed in such a manner that afterimages remain as the composite image is erased. Also, rendering may be performed in such a manner that the image corresponding to the current frame is displayed in color, and the composite image corresponding to the preceding frames is displayed with its color display changing to sepia as time passes.
Also, in the embodiment of the present invention, the case has been described in which the camera motion is obtained in a situation where the area of a moving object is relatively small with respect to the area of the images included in a moving image, and the moving image is played using the camera motion. However, the embodiment of the present invention is also applicable to the case where the area of a moving object is relatively large with respect to the area of the images included in a moving image. For example, when the image of a train leaving a platform is captured with the train as the center object of the image so that the proportion of the train with respect to the image area becomes large, if the above-described affine transformation parameters are computed, the motion of the train is computed. In this case, using the motion of the train, the moving image can be played by the above-described display method. When the moving image is played in this way, the background is fixed, and in addition, the train is displayed so that it advances along with the advance of the current image. As above, motion information relating to the relative amount of motion between the camera and the photographic subject at the time of image capture may be computed and used as transformation information for transforming the images constituting the moving image.
That is, according to the embodiment of the present invention, in the case of displaying a first image and a second image captured by an image capture apparatus such as a digital video camera, the second image can be placed on the first image and displayed using the relative positional relationship between the first image and the second image, the relative positional relationship indicating the motion of the image capture apparatus at the time of image capture or the motion of the photographic subject. Therefore, playback of a moving image on the display unit in a manner in which time passes only in a small window at which the user gazes can be realized.
Also, in the embodiment of the present invention, the image processing apparatus in which the composite image generated by the image combining unit is displayed on the display unit has been described by way of example. However, the embodiment of the present invention is also applicable to an image processing apparatus having image output means for outputting image information for displaying the composite image generated by the image combining unit on another image display device. Furthermore, the embodiment of the present invention is also applicable to a moving image playback apparatus capable of playing a moving image, an image capture apparatus such as a digital video camera capable of playing a captured moving image, and the like.
Also, in the embodiment of the present invention, a moving image captured by a camera has been described. However, the embodiment of the present invention is also applicable to, for example, an edited moving image obtained by editing a moving image captured by a camera, a moving image to parts of which animation or the like has been added, and the like. Also, although the example in which part or all of the history images are displayed has been described in the embodiment of the present invention, only the transformed current images may be displayed. That is, only the current image most recently stored in the image memory may be sequentially displayed. Also, when computing affine transformation parameters, by narrowing the region in which the motion vectors of the feature points in the captured image are computed (for example, with the camera placed so as to face a moving train), a moving image of the train and a large image thereof can be generated.
Note that the embodiment of the present invention has been described as an example for realizing the present invention. Although there are correspondences between the features of the embodiment and the features of the claims described below, the present invention is not limited thereto, and various modifications can be made without departing from the scope of the present invention.
That is, in claims 1 to 25, the moving image input means corresponds to, for example, the moving image input unit 110. The transformation information computing means corresponds to, for example, the camera operation detecting unit 120. The image holding means corresponds to, for example, the image memory 170 or the image memory 684. The image transforming means corresponds to, for example, the image transforming unit 160 or the image transforming unit 682. The operation accepting means corresponds to, for example, the operation accepting unit 195. The image combining means corresponds to, for example, the image combining unit 180 or the image combining unit 683. The output means corresponds to, for example, the display unit 191 or the display unit 689. The control means corresponds to, for example, the display control unit 190 or the display control unit 687.
Also, in claim 25, the display means corresponds to, for example, the display unit 191 or the display unit 689.
Also, in claim 5 or 9, the output image extracting means corresponds to, for example, the display region extraction unit 685.
Also, in claim 18, the feature point extracting means corresponds to, for example, the feature point extraction unit 121. The motion amount computing means corresponds to, for example, the optical flow computation unit 122. The transformation parameter computing means corresponds to, for example, the camera operation parameter computation unit 123.
Also, in claim 21, the compressing means corresponds to, for example, the image combining unit 683.
Also, in claim 22, the moving image acquiring means corresponds to, for example, the moving image acquiring unit 140. The transformation information extracting means corresponds to, for example, the camera operation parameter extraction unit 150. The image holding means corresponds to, for example, the image memory 170 or the image memory 684. The image transforming means corresponds to, for example, the image transforming unit 160 or the image transforming unit 682. The operation accepting means corresponds to, for example, the operation accepting unit 195. The image combining means corresponds to, for example, the image combining unit 180 or the image combining unit 683. The output means corresponds to, for example, the display unit 191 or the display unit 689. The control means corresponds to, for example, the display control unit 190 or the display control unit 687.
Also, in claim 23, the transformation information storage means corresponds to, for example, the metadata storage unit 670. The moving image acquiring means corresponds to, for example, the file acquiring unit 652 or the file acquiring unit 681. The transformation information acquiring means corresponds to, for example, the file acquiring unit 652 or the file acquiring unit 681. The image holding means corresponds to, for example, the image memory 170 or the image memory 684. The image transforming means corresponds to, for example, the image transforming unit 160 or the image transforming unit 682. The operation accepting means corresponds to, for example, the operation accepting unit 195. The image combining means corresponds to, for example, the image combining unit 180 or the image combining unit 683. The output means corresponds to, for example, the display unit 191 or the display unit 689. The control means corresponds to, for example, the display control unit 190 or the display control unit 687.
Also, in claim 24, the moving image input means corresponds to, for example, the moving image input unit 110. The transformation information computing means corresponds to, for example, the camera operation detecting unit 120. The image transforming means corresponds to, for example, the image transforming unit 160 or the image transforming unit 682. The control means corresponds to, for example, the display control unit 190 or the display control unit 687.
Also, in claim 26, the moving image input means corresponds to, for example, the moving image input unit 110. The captured moving image storage means corresponds to, for example, the moving image storage unit 200. The transformation information computing means corresponds to, for example, the camera operation detecting unit 120. The recording control means corresponds to, for example, the recording control unit 130.
Also, in claim 27, the moving image input means corresponds to, for example, the moving image input unit 110. The metadata storage means corresponds to, for example, the metadata storage unit 670. The transformation information computing means corresponds to, for example, the camera operation detecting unit 120. The recording control means corresponds to, for example, the recording control unit 651.
And in claim 29 or 30, the moving image input step is for example corresponding with step S900.And the information converting calculation procedure is for example corresponding with step S903 to S913.And it is for example corresponding with step S926, S942 and S954 that image is preserved step.The image transform step is for example corresponding with step S925, S941, S952 and S953.The operation acceptance step is for example carried out by operation acceptance unit 195.And the image combination step is for example corresponding with step S926, S942 and S954.And controlled step is for example corresponding with step S927 or S970.
Notice that the processing procedure of Miao Shuing can be regarded the method with process sequence as in embodiments of the present invention, perhaps can regard the program that allows computer implementation sequence as, perhaps can regard the recording medium that has program stored therein on it as.

Claims (30)

1. An image processing apparatus, characterized by comprising:
moving image input means for receiving a captured moving image captured by an image capture apparatus;
transformation information calculation means for calculating, on the basis of a first captured image included in the captured moving image and a second captured image located after the first captured image along the time axis of the captured moving image, transformation information relating to the first captured image and the second captured image;
image storage means for storing, as history images, the images that include the first captured image and are located before the second captured image along the time axis of the captured moving image;
image transformation means for transforming at least one of the history image and the second captured image on the basis of the calculated transformation information;
operation acceptance means for accepting a selection operation for selecting an image to be transformed by the image transformation means;
image combining means for combining, in response to the accepted selection operation, the history image and the second captured image, at least one of which has been transformed by the image transformation means, to generate a composite image;
output means for outputting the composite image; and
control means for causing the output means to sequentially output the composite image.
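The combining mechanism recited in claim 1 can be sketched in code. The following is a minimal illustration only, not the claimed implementation: it assumes, purely for illustration, that the transformation information is a 2×3 affine matrix and that images are NumPy arrays — concrete choices the claim itself does not make.

```python
import numpy as np

def warp_affine_nn(img, A, out_shape):
    """Nearest-neighbour affine warp: destination pixel x takes the value of
    source pixel A_inv @ x, where A is a 2x3 source->destination affine."""
    H, W = out_shape
    A33 = np.vstack([A, [0.0, 0.0, 1.0]])
    Ainv = np.linalg.inv(A33)
    ys, xs = np.mgrid[0:H, 0:W]
    dst = np.stack([xs, ys, np.ones_like(xs)]).reshape(3, -1).astype(float)
    src = Ainv @ dst
    sx = np.round(src[0]).astype(int).reshape(H, W)
    sy = np.round(src[1]).astype(int).reshape(H, W)
    valid = (sx >= 0) & (sx < img.shape[1]) & (sy >= 0) & (sy < img.shape[0])
    out = np.zeros(out_shape, dtype=img.dtype)
    out[valid] = img[sy[valid], sx[valid]]
    return out, valid

def combine(history, frame, A):
    """Write the transformed current frame over the history image, yielding
    the composite image that becomes the new history image (claim 2 style)."""
    warped, valid = warp_affine_nn(frame, A, history.shape)
    new_history = history.copy()
    new_history[valid] = warped[valid]
    return new_history
```

Sequential playback would repeat `combine` for each frame, outputting each composite image in turn.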
2. The image processing apparatus according to claim 1, characterized in that: the image combining means generates the composite image by combining the history image and the second captured image, at least one of which has been transformed by the image transformation means, writing the second captured image over the history image, and causes the image storage means to store the composite image as a new history image.
3. The image processing apparatus according to claim 2, characterized in that: the image combining means combines the second captured image with the history image by writing over the history image a transformed second captured image obtained by converting the image quality of the second captured image, transformed by the image transformation means, to match that of the history image.
4. The image processing apparatus according to claim 3, characterized in that: the image combining means generates a new composite image by writing over the new history image the second captured image as it was before the image quality conversion, after transformation by the image transformation means, and
the control means causes the output means to sequentially output the new composite image.
5. The image processing apparatus according to claim 4, characterized by further comprising: output image extraction means for extracting, from the new history image stored in the image storage means, an output image to be output by the output means,
wherein the image combining means generates a new output image by writing over the output image the second captured image as it was before the image quality conversion, after transformation by the image transformation means, thereby combining the second captured image with the output image, and
the control means causes the output means to sequentially output the new output image.
6. The image processing apparatus according to claim 5, characterized in that: the output image extraction means calculates, on the basis of the position and size of the transformed second captured image in the storage area of the image storage means and the position and size of the output image in that storage area, the position and size at which the second captured image as it was before the image quality conversion, after transformation by the image transformation means, is to be written over the output image, and
the image combining means writes that second captured image over the output image at the calculated position and size, thereby combining the second captured image with the output image.
7. The image processing apparatus according to claim 5, characterized in that: when at least a portion of the transformed second captured image included in the new history image protrudes from the output area, that is, the area from which the output image is extracted, the output image extraction means moves the output area in the direction of the protruding image portion and extracts the output image from the new history image.
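The output-area behavior of claim 7 — sliding the extraction area toward a protruding portion of the transformed image — can be illustrated with simple rectangle arithmetic. A hedged sketch only; the `(x, y, w, h)` rectangle representation and the final clamp to the canvas are assumptions, not claim language.

```python
def shift_output_area(area, img_rect, canvas_w, canvas_h):
    """area and img_rect are (x, y, w, h) rectangles on the history-image
    canvas. If img_rect sticks out of area, slide area toward the
    protruding side, then clamp it to the canvas bounds."""
    ax, ay, aw, ah = area
    ix, iy, iw, ih = img_rect
    if ix < ax:                      # protrudes on the left
        ax = ix
    elif ix + iw > ax + aw:          # protrudes on the right
        ax = ix + iw - aw
    if iy < ay:                      # protrudes above
        ay = iy
    elif iy + ih > ay + ah:          # protrudes below
        ay = iy + ih - ah
    ax = min(max(ax, 0), canvas_w - aw)   # keep area inside the canvas
    ay = min(max(ay, 0), canvas_h - ah)
    return (ax, ay, aw, ah)
```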
8. The image processing apparatus according to claim 3, characterized in that: the image quality is at least one of resolution and compression ratio.
9. The image processing apparatus according to claim 2, characterized by further comprising: output image extraction means for extracting, from the new history image stored in the image storage means, an image contained in an area calculated on the basis of the calculated transformation information, as an output image to be output by the output means,
wherein the image combining means generates a new output image by writing over the output image the second captured image as it was before transformation by the image transformation means, thereby combining the second captured image with the output image, and
the control means causes the output means to sequentially output the new output image.
10. The image processing apparatus according to claim 9, characterized in that: the output image extraction means transforms the output image, on the basis of the calculated transformation information, in the direction opposite to that in which the image transformation means transforms the second captured image, and
the image combining means generates a new output image by writing over the transformed output image the second captured image as it was before transformation by the image transformation means, thereby combining the second captured image with the transformed output image.
11. The image processing apparatus according to claim 2, characterized in that: the image transformation means transforms the history image, on the basis of the calculated transformation information, in the direction opposite to that in which the second captured image is transformed.
12. The image processing apparatus according to claim 2, characterized in that: the transformation information includes elements relating to zoom (enlargement/reduction), translation, and rotation, and
the image transformation means transforms the second captured image on the basis of the translation and rotation elements included in the calculated transformation information, and transforms the history image on the basis of the zoom element included in the calculated transformation information.
13. The image processing apparatus according to claim 12, characterized in that: the image transformation means transforms the history image in the direction opposite to that in which the second captured image is transformed.
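Claims 12 and 13 split the transformation elements: translation and rotation are applied to the second captured image, while the zoom element — inverted, per claim 13's opposite direction — is applied to the history image, so the newest frame keeps its full resolution. A minimal sketch of that split, under the assumption (not made by the claims) that the transformation is parameterized as `(scale, angle, tx, ty)`:

```python
import numpy as np

def build_affine(scale, theta, tx, ty):
    """2x3 affine matrix for a similarity transform: scale, rotate, translate."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[scale * c, -scale * s, tx],
                     [scale * s,  scale * c, ty]])

def split_elements(scale, theta, tx, ty):
    """Claim 12 split: rotation + translation go to the second captured image;
    the reciprocal of the zoom (claim 13: opposite direction) goes to the
    history image, so zooming in never down-samples the current frame."""
    frame_xform = build_affine(1.0, theta, tx, ty)            # translate + rotate
    history_xform = build_affine(1.0 / scale, 0.0, 0.0, 0.0)  # inverse zoom
    return frame_xform, history_xform
```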
14. The image processing apparatus according to claim 1, characterized in that: the transformation information calculation means sequentially calculates transformation information for each frame constituting the captured moving image,
the image transformation means transforms, for each frame, at least one of the history image and the second captured image,
the image combining means sequentially combines, for each frame, the history image and the second captured image, at least one of which has been transformed by the image transformation means, and
the control means causes the composite image to be output sequentially for each frame.
15. The image processing apparatus according to claim 1, characterized in that: the first captured image and the second captured image are images corresponding to two consecutive frames included in the captured moving image.
16. The image processing apparatus according to claim 15, characterized in that: the transformation information is motion information of the image capture apparatus at the time the first captured image or the second captured image was captured, and
the transformation information calculation means calculates the transformation information by comparing the first captured image with the second captured image.
17. The image processing apparatus according to claim 15, characterized in that: the transformation information is motion information relating to the amount of relative motion between the image capture apparatus and the photography target at the time the first captured image or the second captured image was captured, and
the transformation information calculation means calculates the transformation information by comparing the first captured image with the second captured image.
18. The image processing apparatus according to claim 1, characterized in that the transformation information calculation means comprises:
feature point extraction means for extracting feature points in the first captured image and the second captured image on the basis of the pixels constituting the first captured image and the second captured image,
motion amount calculation means for calculating an amount of motion relating to the first captured image and the second captured image on the basis of the extracted feature points, and
transformation parameter calculation means for calculating the transformation information by computing predetermined transformation parameters on the basis of the calculated amount of motion.
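The transformation parameter calculation of claim 18 — fitting predetermined parameters to per-feature-point motion — can be illustrated as a least-squares affine fit over point correspondences. This is one plausible instantiation, not the patented algorithm; the 2×3 affine model and the least-squares solver are illustrative assumptions.

```python
import numpy as np

def estimate_affine(src_pts, dst_pts):
    """Least-squares 2x3 affine A such that dst ≈ A @ [x, y, 1].
    src_pts, dst_pts: (N, 2) arrays of matched feature points, N >= 3."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    X = np.hstack([src, np.ones((len(src), 1))])   # (N, 3) design matrix
    B, *_ = np.linalg.lstsq(X, dst, rcond=None)    # (3, 2) solution
    return B.T                                     # (2, 3) affine matrix
```

In the embodiment's pipeline, the matched points would come from optical flow between the two frames; here any correspondence source works.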
19. The image processing apparatus according to claim 18, characterized in that: the feature point extraction means is constituted by a multi-core processor, and
the multi-core processor extracts the feature amounts in the first captured image and the second captured image by performing parallel processing, using single-instruction multiple-data (SIMD) operations, on each pixel constituting the first captured image and the second captured image.
20. The image processing apparatus according to claim 18, characterized in that: the motion amount calculation means is constituted by a multi-core processor, and
the multi-core processor calculates the amount of motion relating to the first captured image and the second captured image by performing parallel processing, using SIMD operations, on each extracted feature point.
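The SIMD parallelism of claims 19 and 20 applies one instruction stream to many pixels or feature points at once. As a loose analogy only (whole-array NumPy operations, not actual SIMD intrinsics or a multi-core implementation), the following sketch computes a per-pixel gradient feature in that same data-parallel shape; the specific gradient feature is an assumption, not the claimed feature amount.

```python
import numpy as np

def gradient_features(img):
    """Per-pixel central-difference gradient magnitude, computed with
    whole-array operations: one operation applied to all pixels at once,
    mirroring the data-parallel structure a SIMD implementation exploits."""
    gx = np.zeros_like(img, dtype=float)
    gy = np.zeros_like(img, dtype=float)
    gx[:, 1:-1] = (img[:, 2:] - img[:, :-2]) / 2.0   # horizontal gradient
    gy[1:-1, :] = (img[2:, :] - img[:-2, :]) / 2.0   # vertical gradient
    return np.hypot(gx, gy)
```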
21. The image processing apparatus according to claim 1, characterized by further comprising: compression means for compressing captured images,
wherein the history image at the time the composite image is output is a compressed image, and the second captured image is an uncompressed image or a captured image of higher resolution than the compressed history image.
22. An image processing apparatus comprising:
moving image acquisition means for acquiring a captured moving image captured by an image capture apparatus, with which is recorded, in association with the captured moving image, transformation information for transforming at least one of a first captured image and a second captured image included in the captured moving image;
transformation information extraction means for extracting the transformation information from the acquired captured moving image;
image storage means for storing, as history images, the images that include the first captured image and are located before the second captured image along the time axis of the captured moving image;
image transformation means for transforming at least one of the history image and the second captured image on the basis of the extracted transformation information;
operation acceptance means for accepting a selection operation for selecting an image to be transformed by the image transformation means;
image combining means for combining, in response to the accepted selection operation, the history image and the second captured image, at least one of which has been transformed by the image transformation means, to generate a composite image;
output means for outputting the composite image; and
control means for causing the output means to sequentially output the composite image.
23. An image processing apparatus comprising:
transformation information storage means for storing, in association with each frame constituting a captured moving image captured by an image capture apparatus, transformation information for transforming at least one of a first captured image and a second captured image included in the captured moving image;
moving image acquisition means for acquiring the captured moving image;
transformation information acquisition means for acquiring the transformation information stored in the transformation information storage means in association with the acquired captured moving image;
image storage means for storing, as history images, the images that include the first captured image and are located before the second captured image along the time axis of the captured moving image;
image transformation means for transforming at least one of the history image and the second captured image on the basis of the acquired transformation information;
operation acceptance means for accepting a selection operation for selecting an image to be transformed by the image transformation means;
image combining means for combining, in response to the accepted selection operation, the history image and the second captured image, at least one of which has been transformed by the image transformation means, to generate a composite image;
output means for outputting the composite image; and
control means for causing the output means to sequentially output the composite image.
24. An image processing apparatus, characterized by comprising:
moving image input means for receiving a captured moving image captured by an image capture apparatus;
transformation information calculation means for calculating, for each of the captured images constituting the captured moving image, transformation information for transforming another captured image with reference to at least one of the captured images;
image transformation means for transforming the captured image corresponding to the transformation information, on the basis of the transformation information calculated with the at least one of the captured images constituting the captured moving image as a reference image;
image storage means for storing the transformed captured images; and
control means for causing output means to sequentially output the captured image most recently stored in the image storage means.
25. A moving image playing device, characterized by comprising:
moving image input means for receiving a captured moving image captured by an image capture apparatus;
transformation information calculation means for calculating, on the basis of a first captured image included in the captured moving image and a second captured image located after the first captured image along the time axis of the captured moving image, transformation information relating to the first captured image and the second captured image;
image storage means for storing, as history images, the images that include the first captured image and are located before the second captured image along the time axis of the captured moving image;
image transformation means for transforming at least one of the history image and the second captured image on the basis of the calculated transformation information;
operation acceptance means for accepting a selection operation for selecting an image to be transformed by the image transformation means;
image combining means for combining, in response to the accepted selection operation, the history image and the second captured image, at least one of which has been transformed by the image transformation means, to generate a composite image;
display means for displaying the composite image; and
control means for causing the display means to sequentially display the composite image.
26. An image processing apparatus, characterized by comprising:
moving image input means for receiving a captured moving image captured by an image capture apparatus;
captured moving image storage means for storing the captured moving image;
transformation information calculation means for calculating, for each frame constituting the captured moving image, transformation information for transforming another captured image with reference to at least one of the captured images constituting the captured moving image; and
recording control means for recording the calculated transformation information in the captured moving image storage means in association with each frame.
27. An image processing apparatus, characterized by comprising:
moving image input means for receiving a captured moving image captured by an image capture apparatus;
metadata storage means for storing metadata relating to the captured moving image;
transformation information calculation means for calculating, for each frame constituting the captured moving image, transformation information for transforming another captured image with reference to at least one of the captured images constituting the captured moving image; and
recording control means for recording the calculated transformation information as the metadata in the metadata storage means, in association with the captured moving image and the frame.
28. The image processing apparatus according to claim 27, characterized in that: the metadata includes at least position information and posture information described in the coordinate system of the image capture apparatus.
29. An image processing method, characterized by comprising:
a moving image input step of receiving a captured moving image captured by an image capture apparatus;
a transformation information calculation step of calculating, on the basis of a first captured image included in the captured moving image and a second captured image located after the first captured image along the time axis of the captured moving image, transformation information relating to the first captured image and the second captured image;
an image storing step of causing image storage means to store, as history images, the images that include the first captured image and are located before the second captured image along the time axis of the captured moving image;
an image transformation step of transforming at least one of the history image and the second captured image on the basis of the calculated transformation information;
an operation acceptance step of accepting a selection operation for selecting an image to be transformed by image transformation means;
an image combining step of combining, in response to the accepted selection operation, the history image and the second captured image, at least one of which has been transformed in the image transformation step, to generate a composite image; and
a control step of causing the composite image to be output sequentially.
30. A program characterized by causing a computer to execute:
a moving image input step of receiving a captured moving image captured by an image capture apparatus;
a transformation information calculation step of calculating, on the basis of a first captured image included in the captured moving image and a second captured image located after the first captured image along the time axis of the captured moving image, transformation information relating to the first captured image and the second captured image;
an image storing step of causing image storage means to store, as history images, the images that include the first captured image and are located before the second captured image along the time axis of the captured moving image;
an image transformation step of transforming at least one of the history image and the second captured image on the basis of the calculated transformation information;
an operation acceptance step of accepting a selection operation for selecting an image to be transformed by image transformation means;
an image combining step of combining, in response to the accepted selection operation, the history image and the second captured image, at least one of which has been transformed in the image transformation step, to generate a composite image; and
a control step of causing the composite image to be output sequentially.
CN200880005527A 2007-08-24 2008-08-22 Image processing apparatus, moving image playing device and processing method and program Pending CN101617531A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2007219093 2007-08-24
JP219093/2007 2007-08-24
JP317767/2007 2007-12-07

Publications (1)

Publication Number Publication Date
CN101617531A true CN101617531A (en) 2009-12-30

Family

ID=40611874

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200880005527A Pending CN101617531A (en) 2007-08-24 2008-08-22 Image processing apparatus, moving image playing device and processing method and program

Country Status (1)

Country Link
CN (1) CN101617531A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109863538A (en) * 2016-08-26 2019-06-07 奇跃公司 The continuous time distortion and binocular time warp system and method shown for virtual and augmented reality
CN109863538B (en) * 2016-08-26 2023-10-27 奇跃公司 Continuous time warping and binocular time warping systems and methods for virtual and augmented reality displays
WO2021027596A1 (en) * 2019-08-09 2021-02-18 北京字节跳动网络技术有限公司 Image special effect processing method and apparatus, and electronic device and computer readable storage medium

Similar Documents

Publication Publication Date Title
CN101606384B (en) Image processing device, dynamic image reproduction device, and processing method
EP2180701A1 (en) Image processing device, dynamic image reproduction device, and processing method and program in them
JP4623199B2 (en) Image processing apparatus, image processing method, and program
CN101617530B (en) Image processor, animation reproduction apparatus, and processing method and program for the processor and apparatus
JP4678404B2 (en) Imaging apparatus, control method thereof, and program
JP4623201B2 (en) Image processing apparatus, image processing method, and program
JP4623200B2 (en) Image processing apparatus, image processing method, and program
JP5092722B2 (en) Image processing apparatus, image processing method, and program
EP2073539A1 (en) Image processing device, dynamic image reproduction device, and processing method and program in them
CN101622868B (en) Image processor, and processing method and program for the same
CN101627623A (en) Image processing device, dynamic image reproduction device, and processing method and program in them
KR20100103776A (en) Image processor, animation reproduction apparatus, and processing method and program for the processor and apparatus
CN101617531A (en) Image processing apparatus, moving image playing device and processing method and program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20091230