Embodiments
Embodiments of the invention will now be described in detail with reference to the accompanying drawings.
Fig. 1 is a diagram illustrating the structure of an imaging device 100 according to an embodiment of the invention. The imaging device 100 includes a lens unit 110, an imaging unit 120, a signal processing unit 130, an image memory 140, a display control unit 150, a display 160, an image recording unit 170, an external storage interface (I/F) 180, and a system control unit 190.
The lens unit 110 is an optical unit configured to guide light reflected from an object to the light receiving surface of the imaging unit 120, and includes a group of lenses, such as a focus lens and a zoom lens, an aperture (iris), and a driver for driving those lenses. The lens unit 110 forms an object image, corresponding to the light passing along an optical path 119, on the light receiving surface of the imaging unit 120.
The imaging unit 120 is configured to convert the light provided from the lens unit 110 into an electrical signal, and to transmit the signal to the signal processing unit 130 via a signal line 129 as a captured image signal. The imaging unit 120 changes its transfer rate and its resolution-related operation mode according to instructions provided from the system control unit 190.
The signal processing unit 130 is configured to receive the captured image signal transmitted from the imaging unit 120, and to perform predetermined signal processing on the received signal. The signal processing unit 130 converts the image signal from analog form into digital form, and performs signal processing such as white balance correction, gamma correction, and enlargement and reduction on the signal. The signal processing unit 130 outputs the resulting image signal to the image memory 140 via a signal line 139 as captured image data.
The image memory 140 is configured to store the captured image data that has undergone the signal processing. The image data stored in the image memory 140 is supplied to the display control unit 150 via a signal line 149, and is supplied to the image recording unit 170 via a signal line 148. The captured image data is composed of a plurality of pixels arranged two-dimensionally in matrix form. Each group of pixels arranged in the horizontal direction is called a row (or line). A pixel position within each row is called a column.
The display control unit 150 is configured to read image data from the image memory 140 via the signal line 149, and to output the read data to the display 160 via a signal line 159. In this embodiment of the invention, the display control unit 150 reads image data from the image memory 140 without waiting for the operation of writing one field (or one frame) of image data to the image memory 140 to finish. At this time, the display control unit 150 masks the image display on a line-by-line or pixel-by-pixel basis to prevent distorted images from being displayed. The system control unit 190 controls those timings.
The display 160 displays the image output from the display control unit 150, and includes, for example, a liquid crystal display (LCD). Image display on the display 160 is achieved by scanning the image line by line, starting at the upper-left corner, from left to right in the horizontal direction, and sequentially from top to bottom.
The image recording unit 170 reads image data from the image memory 140 via the signal line 148, and compresses the data in a predetermined file format (for example, Joint Photographic Experts Group (JPEG)). The compressed image data is output to the external memory interface 180 via a signal line 179. The external memory interface 180 is an interface for connection to an external storage 210. The image data compressed by the image recording unit 170 is recorded to the external storage 210 through the external memory interface 180. The external storage 210 includes, for example, a hard disk or a flash memory.
The system control unit 190 controls the entire imaging device 100. For example, the system control unit 190 monitors operation inputs from the user. When an imaging instruction is received, the system control unit 190 changes the operation mode of the imaging unit 120. In addition, the system control unit 190 sends information indicating that the imaging instruction has been received to the display control unit 150.
When the progressive method is used, the imaging unit 120 outputs, for example, one frame of image data every 1/30 second. When the interlaced method is used, the imaging unit 120 outputs, for example, one field (the even lines or the odd lines of a frame) of image data every 1/60 second. As for operation modes, the imaging unit 120 has a monitoring mode, in which a moving image is displayed, and a still mode, in which a still image is captured after the object has been determined. The imaging unit 120 performs different operations in these operation modes. In the monitoring mode, a small number of pixels is sufficient to capture an image of the whole object, and the resolution is adjusted to that of the display 160 in consideration of the balance between resolution and power consumption. In the still mode, image data of higher resolution and higher density is output. Since the number of pixels is large, outputting the image data takes longer than in the monitoring mode.
To capture a still image, the imaging unit 120 normally operates in the monitoring mode so as to display the latest image on the display 160. When an imaging instruction is given, for example when the user presses a shutter release button, the imaging unit 120 is switched to the still mode.
Switching between the monitoring mode and the still mode can take a relatively long time. In addition, a distorted image may be output during the transition. Therefore, the display on the display 160 is switched to a masked display, for example a black display, starting from the field (or frame) following the field (or frame) in which the imaging instruction is given. The masked display is maintained until the output of the image data captured in the still mode is complete, thereby preventing a distorted image from being output.
Fig. 2 is a diagram illustrating the functional structure of an essential part of the imaging device according to this embodiment of the invention. The essential part includes an image data providing section 141, a mask data providing section 151, a drawing section 152, a line counter 153, a column counter 154, a field counter 155, a mask control register 156, a transition segment register 157, a transition counter 158, a display section 161, and an imaging instruction accepting section 191.
The image data providing section 141 provides captured image data to the drawing section 152. The image data is obtained by the signal processing unit 130 performing signal processing on the image signal captured by the imaging unit 120, and is supplied from the image memory 140 to the display control unit 150.
The mask data providing section 151 provides the data to be used as the mask data displayed on the display 160. The mask data may include color information and a transmissivity. For example, data representing black with 100% transmissivity may be used.
The drawing section 152 draws the pixels to be output to the display section 161, using the image data provided from the image data providing section 141 or the mask data provided from the mask data providing section 151.
The line counter 153 indicates the line of the pixel that the drawing section 152 is to draw. The column counter 154 indicates the column of the pixel that the drawing section 152 is to draw. When display is performed according to the interlaced method, the field counter 155 indicates the field that the drawing section 152 is to draw. When display is performed according to the progressive method, the field counter 155 is not used. The drawing section 152 refers to the image data provided from the image data providing section 141 according to the information indicated by the line counter 153, the column counter 154, and the field counter 155, and outputs the drawing result.
The mask control register 156 holds an imaging instruction acceptance signal, which indicates that an operation input corresponding to an imaging instruction has been accepted. The mask control register 156 asserts this imaging instruction acceptance signal according to an instruction from the imaging instruction accepting section 191. The drawing section 152 refers to this imaging instruction acceptance signal.
The transition segment register 157 holds the transition segment, which lies after the operation input corresponding to the imaging instruction is accepted and before the mask data is displayed. The transition segment is a parameter that depends on the structure of the imaging device. Therefore, an optimum value can be set in the transition segment register 157 according to the structure of the imaging device.
The transition counter 158 holds the current position within the transition segment. In general, the transition counter 158 is reset when the operation input corresponding to the imaging instruction is accepted, and counts until the transition segment ends. The transition counter 158 is used when mixing the image data and the mask data.
The imaging instruction accepting section 191 accepts an operation input corresponding to an imaging instruction from the user. The imaging instruction accepting section 191 is realized by, for example, a shutter release button.
The display section 161 displays the pixels output from the drawing section 152. The display section 161 corresponds to the display 160.
The mask data providing section 151, the drawing section 152, the line counter 153, the column counter 154, the field counter 155, the mask control register 156, the transition segment register 157, and the transition counter 158 are implemented as functions of the display control unit 150.
Fig. 3 is a timing chart illustrating the operation of the imaging device 100 according to this embodiment of the invention.
When the interlaced method is used, the image signals output from the imaging unit 120 via the signal line 129 are transmitted field by field. When the progressive method is used, those signals are transmitted frame by frame. In Fig. 3, the interlaced method is assumed. The odd field signal 711 of the first frame, the even field signal 712 of the first frame, and the odd field signal 713 of the second frame are shown. In some image sensors, each field or each frame includes invalid data at its end. In the example shown in Fig. 3, invalid data blocks 721 to 723 at the ends of the respective signals 711 to 713 are illustrated with stripes.
A dotted line 701 indicates the time at which the user inputs the imaging instruction. The captured image signal provided after the imaging instruction is input may include the distorted image section described above. In Fig. 3, the part including this distorted image section is hatched.
Each processed signal output from the signal processing unit 130 via the signal line 139 is output with a slight lag behind the corresponding image signal. This delay is caused by the imaging circuit. The transfer rate of these signals is the same as that of the image signals. Since the invalid data blocks 721 to 723 are not needed for subsequent processing, those data blocks are not output from the signal processing unit 130.
A memory output display 359, illustrated below the processed signal, shows the timing obtained on the assumption that the display control unit 150 always waits until the recording of the signal provided from the signal processing unit 130 to the image memory 140 is finished (finish of output 349) before subsequently reading the data. In this case, the masked display can be performed starting from the head of the even field signal 712 of the first frame, but the display itself may be delayed. According to the embodiment of the invention, the display control unit 150 reads data via the signal line 149 without waiting for the recording of the signal to the image memory 140 to finish.
In other words, the image data read from the image memory 140 via the signal line 149 is output earlier than the finish of output 349. In this example, since the imaging instruction is given in the middle of the even field signal 712 of the first frame, the display would be switched to the masked display only from the next field, so that a distorted image section might remain (see a comparative display output 459). According to this embodiment of the invention, the masked display is performed immediately after the imaging instruction 701 is given, thereby controlling the display so that the distorted image section is not output to the display 160 as the display output via the signal line 159.
In this embodiment, the mask data representing black may be displayed immediately after the imaging instruction is given. In this case, the mask data appears in the middle of a field image, as shown in Fig. 4A, and the abrupt change from the original image may give an unnatural impression. According to a modification of the present embodiment, the display is not switched completely to the mask data immediately after the imaging instruction is given. Instead, when the image is drawn, the image data representing the pixel value of each pixel in the original image is mixed with the mask data, so that during a transition segment 703 between the time at which the imaging instruction is accepted and the start of the distorted image section 702, the pixel value of each pixel is gradually changed into the mask data. At this time, in addition to changing the color signal, the transmissivity is also increased gradually, so that the image data changes into the mask data gradually. The image can thus be displayed so that it switches to the masked display naturally, as shown in Fig. 4B.
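The gradual transition described above can be sketched as a simple per-pixel blend whose ratio advances with the position in the transition segment. This is an illustrative sketch only, assuming a linear ramp and single-channel pixel values; the function name `blend_pixel` and the handling of transmissivity are not taken from the embodiment.

```python
def blend_pixel(image_value, mask_value, position, transition_length):
    """Linearly blend an image pixel value toward the mask value.

    position:          how far the drawing has advanced into the transition segment
    transition_length: total length of the transition segment (e.g. in pixels)
    """
    ratio = min(position / transition_length, 1.0)  # 0.0 at the start, 1.0 at the end
    return (1.0 - ratio) * image_value + ratio * mask_value

# Example: a mid-gray pixel (128) fading to black (0) over a 4-step transition
faded = [blend_pixel(128, 0, p, 4) for p in range(5)]
# faded == [128.0, 96.0, 64.0, 32.0, 0.0]
```

In an actual device the same ramp would also be applied to the transmissivity of the mask data, as the modification describes.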
The relation between the imager and the display control will now be described in detail with reference to Figs. 5A and 5B.
Figs. 5A and 5B are diagrams illustrating the relation between the lens unit 110 and an imager, designated by 121. The imager 121 corresponds to the imaging unit 120, and includes a CCD or CMOS image sensor.
Fig. 5A illustrates the relation between a typical optical system and the imager 121, viewed from a side of the housing of the imaging device 100. In this case, the optical system, which includes a group of lenses 111, is disposed in front of the imager 121. The lens group 111, set between an object 11 and the imager 121, converges the light reflected from the object 11 onto the imaging surface of the imager 121, so that the object image corresponding to the light is reversed in the vertical direction (upside down) and the horizontal direction (left-right reversed).
Fig. 5B illustrates the relation between a bending optical system and the imager 121, viewed from a side of the housing of the imaging device 100. In this case, the bending optical system includes a group of lenses 113 disposed in front of the imager 121, and a reflecting member 112, disposed between an object 12 and the lens group 113, which bends the optical axis of the light. The reflecting member 112 reflects the light from the object 12 substantially at a right angle, thereby reversing the object image vertically. The reflecting member 112 is realized by, for example, a prism. The lens group 113 converges the light from the reflecting member 112 onto the imaging surface of the imager 121, so that the object image corresponding to the light is reversed in the vertical direction (upside down) and the horizontal direction (left-right reversed). Bending optical systems are often used in low-profile imaging devices.
Figs. 6A to 6D are diagrams illustrating the relation between the object and the display on the display 160. Fig. 6A illustrates an object image 19 viewed from the imaging device 100. For ease of understanding, the thick arrow in the object image 19 is arranged along the scanning direction on the display surface of a typical display.
Fig. 6B illustrates the image formed on the imaging surface of the imager 121 when the typical optical system shown in Fig. 5A is used, viewed from directly in front of the imager 121 (that is, from the object). The thick arrow corresponds to the formed object image 19. The dotted arrow indicates the direction in which image data is output from the imager 121. In this case, the thick arrow and the dotted arrow shown outside the imager 121 match each other in the vertical direction. Therefore, the image signals are read from the imager 121 from bottom to top, so that the image is displayed on the display 160 from top to bottom.
Fig. 6C illustrates the image formed on the imaging surface when the bending optical system shown in Fig. 5B is used, viewed from the upper side of a background-art imaging device (that is, from directly in front of the imager 121). The thick arrow corresponds to the formed object image 19. The dotted arrow indicates the direction in which image data is output from the imager 121. In this case, the thick arrow and the dotted arrow shown outside the imager 121 are opposite to each other in the vertical direction. When the image signals are read from the imager 121 from bottom to top, the image is displayed on the display 160 from bottom to top. If the imager 121 includes a CMOS sensor, either the bottom-to-top order or the top-to-bottom order can be selected as the order of outputting the image data. If the imager 121 includes a CCD sensor, however, the direction of outputting the image data is structurally limited to one direction. Therefore, when the imager includes a typical CCD sensor, the lower side of the imaging surface of the imager is made to face the object, so that, as with the general optical system, the image data is output from the imaging surface from bottom to top and from left to right.
According to the embodiment of the invention, when the bending optical system is used, the imager is arranged so that the upper side of the imaging surface faces the object, and the data is read from the image memory 140 from top to bottom. Therefore, the image is displayed on the display 160 from top to bottom.
Fig. 6D illustrates the image formed on the imaging surface when the bending optical system shown in Fig. 5B is used, viewed from the upper side of the imaging device 100 according to the present embodiment (that is, from directly in front of the imager 121). The thick arrow corresponds to the formed object image 19. The dotted arrow indicates the direction in which image data is output from the imager 121. In this case, the thick arrow and the dotted arrow shown outside the imager 121 match each other in the vertical direction. The image signals are read from the imager 121 from top to bottom, so that the image is displayed on the display 160 from top to bottom.
The operation of the imaging device 100 according to the embodiment of the invention will now be described with reference to the accompanying drawings.
Fig. 7 is a flowchart illustrating a first example of the processing performed by the imaging device 100 according to the embodiment of the invention. In the first example, the masked display is performed on a line-by-line basis. It is assumed that an image of VGA display size (640 × 480 pixels) is displayed according to the interlaced method at a transfer rate of 1/60 second. A variable "x" held in the column counter 154 indicates the column position in the horizontal direction. A variable "y" held in the line counter 153 indicates the line position in the vertical direction. A variable "field" is held in the field counter 155; when the variable "field" has a value of "1", it represents the odd field, and when it has a value of "2", it represents the even field. A variable "RegMaskOnOff" is held in the mask control register 156; when this variable indicates "on", the masked display is to be performed, and when it indicates "off", the masked display is not to be performed. A variable "MaskOnOff" is used to refer to the value of the mask control register 156.
First, the variable "field" is initialized to "1" (step S911), the variable "y" is initialized to the value of the variable "field" (initially "1") (step S913), and the variable "x" is initialized to "1" (step S915). The variable "MaskOnOff" is set to the value of the variable "RegMaskOnOff" (step S917). A pixel (x, y) is displayed according to the value of the variable "MaskOnOff" (step S920).
The variable "x" is incremented by "1" (step S932), and the processing in step S920 and the subsequent steps is repeated until the pixels of one line (640 pixels in this example) have been displayed (step S933).
When the pixels of one line have been displayed, the processing advances to the display processing for the next line. In this example, since the interlaced method is assumed, the variable "y" is incremented by "2" (step S935), and the processing in step S915 and the subsequent steps is repeated until the pixels of one field (equivalent to 240 lines in this example) have been displayed (step S936).
When the pixels of one field have been displayed, the processing advances to the display processing for the next field. The variable "field" is incremented by "1" (step S937). If the value of the variable "field" is "2" or less (step S938), the processing in step S913 and the subsequent steps is repeated. When the value of the variable "field" exceeds "2", the processing in step S911 and the subsequent steps is repeated to display the next frame (step S939). If the next frame is not to be displayed, the processing ends.
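As a concrete illustration, the loop structure of Fig. 7 can be sketched as nested iterations over fields, lines, and columns, with the mask control register sampled once per line. This is a reconstruction under stated assumptions, not the embodiment's implementation: the callback names (`read_pixel`, `draw`, `reg_mask_on`) and the parameterized dimensions are invented for the sketch.

```python
def display_frame(read_pixel, draw, reg_mask_on, width=640, height=480):
    """Sketch of the Fig. 7 loop: interlaced scan with line-by-line masking.

    read_pixel(x, y) -> pixel value read from the image memory
    draw(x, y, d)    -> output one pixel value d to the display
    reg_mask_on()    -> current value of the mask control register ("RegMaskOnOff")
    """
    MASK_COLOR = 0  # mask data representing black (an assumption)
    for field in (1, 2):                       # odd field, then even field (steps S911, S937)
        for y in range(field, height + 1, 2):  # interlace: every other line (step S935)
            mask_on = reg_mask_on()            # register sampled once per line (step S917)
            for x in range(1, width + 1):      # one line of pixels (steps S915 to S933)
                # display A (step S920): mask color while masking, image data otherwise
                draw(x, y, MASK_COLOR if mask_on else read_pixel(x, y))
```

Because the register is sampled only once per line, a shutter press that arrives mid-line takes effect only from the next line, which is what makes this variant line-granular.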
Fig. 8 is a flowchart illustrating an example of the processing for handling the imaging instruction (hereinafter "imaging instruction processing") in the first example according to the embodiment of the invention. This imaging instruction processing is performed asynchronously with the processing shown in Fig. 7. In this processing, the value of the variable "RegMaskOnOff" is set when the shutter release button is pressed.
Specifically, when the imaging instruction is given by pressing the shutter release button (step S811), the variable "RegMaskOnOff" is set to "on" (step S812). The variable "RegMaskOnOff" is referred to in step S917 of Fig. 7.
Fig. 9 is a flowchart illustrating an example of the processing for display A (step S920) in the first example according to the embodiment of the invention.
In the processing for display A, when the variable "MaskOnOff" indicates "off" (step S921), the pixel (x, y) in the image memory is read into a variable "d" (step S922) and is displayed on the display 160 (step S929). On the contrary, when the variable "MaskOnOff" indicates "on" (step S921), mask data MaskColor, having a value representing, for example, black, is set to the variable "d" (step S924) and is displayed on the display 160 (step S929).
As described above, according to the first example of the embodiment of the invention, whether the shutter release button has been pressed is determined every time the pixels of one line are displayed. Advantageously, the masked display can thus be performed on a line-by-line basis.
Fig. 10 is a flowchart illustrating a second example of the processing performed by the imaging device 100 according to the embodiment of the invention. In the second example, the masked display is performed on a pixel-by-pixel basis. The other preconditions are the same as in the first example.
In the second example, the timing at which the value indicated by the variable "RegMaskOnOff" is referred to differs from that in the first example. In the first example, the variable "MaskOnOff" is set to the value of the variable "RegMaskOnOff" in step S917 every time the pixels of one line are displayed. In the second example, the variable "MaskOnOff" is set to the value indicated by the variable "RegMaskOnOff" in step S917 every time one pixel is displayed. Therefore, since whether to perform the masked display is determined pixel by pixel in the processing for display A (step S920), the masked display can be performed on a pixel-by-pixel basis.
Fig. 11 is a flowchart illustrating a third example of the processing performed by the imaging device 100 according to the embodiment of the invention. In the third example, the masked display is performed on a pixel-by-pixel basis in a manner similar to the second example, and the preconditions are the same as in the second example. The third example uses a variable "PixelCounter", which is reset when the shutter release button is pressed and is incremented every time one pixel is displayed.
The variable "PixelCounter" is held in the transition counter 158, and is incremented by "1" (step S931) every time one pixel is displayed (step S940). The variable "PixelCounter" is used to obtain the pixel value mixing ratio in the processing for display B (step S940).
Fig. 12 is a flowchart illustrating an example of the imaging instruction processing in the third example according to the embodiment of the invention. This processing is performed asynchronously with the processing described with reference to Fig. 11. In this processing, the value of the variable "RegMaskOnOff" is set and the variable "PixelCounter" is initialized when the shutter release button is pressed.
Specifically, when the imaging instruction is given by pressing the shutter release button (step S821), the variable "RegMaskOnOff" is set to "on" (step S822), and the variable "PixelCounter" is initialized to "0" (step S823). The variable "RegMaskOnOff" is referred to in step S931 of Fig. 11.
Fig. 13 is a flowchart illustrating the processing for display B (step S940) in the third example according to the embodiment of the invention.
In the processing for display B, when the variable "MaskOnOff" indicates "off" (step S941), the pixel (x, y) in the image memory is read into the variable "d" (step S942) and is displayed on the display 160 (step S949). The processing in these steps is the same as that of the processing for display A described with reference to Fig. 9.
On the contrary, when the variable "MaskOnOff" indicates "on" (step S941), the relation between the variable "PixelCounter" and a variable "PixelMaskNum" is determined. The variable "PixelMaskNum" is held in the transition segment register 157, and indicates the number of pixels in the transition segment 703 shown in Fig. 3. When the variable "PixelCounter" is less than the variable "PixelMaskNum" (step S943), the pixel to be displayed lies within the transition segment 703. Therefore, the pixel (x, y) in the image memory is mixed with the mask data (step S945). Specifically, the pixel value is obtained based on the ratio of the variable "PixelCounter" to the variable "PixelMaskNum", and the obtained value is set to the variable "d".
When the variable "PixelCounter" is not less than the variable "PixelMaskNum" (step S943), the pixel to be displayed does not lie within the transition segment 703. Therefore, the mask data MaskColor, having a value representing, for example, black, is set to the variable "d" (step S944). In either case, the pixel whose pixel value is set to the variable "d" is displayed on the display 160.
As described above, in the third example, the pixel (x, y) read from the image memory is mixed with the mask data pixel by pixel within the transition segment 703, so that the mask data can be displayed gradually.
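The decision structure of display B can be sketched as follows. The embodiment refers to a mixing formula that is not reproduced in the text, so the linear blend below is an assumption, as are the function name `display_b` and the black mask value.

```python
MASK_COLOR = 0  # mask data representing black, as suggested in the text

def display_b(read_pixel, x, y, mask_on, pixel_counter, pixel_mask_num):
    """Return the value "d" to display for pixel (x, y) (Fig. 13 sketch)."""
    if not mask_on:                        # step S941: no masking requested
        return read_pixel(x, y)            # step S942: image data as-is
    if pixel_counter < pixel_mask_num:     # step S943: inside transition segment 703
        ratio = pixel_counter / pixel_mask_num
        # step S945: mix image data with mask data (assumed linear blend)
        return (1 - ratio) * read_pixel(x, y) + ratio * MASK_COLOR
    return MASK_COLOR                      # step S944: past the transition segment

# "PixelCounter" grows as pixels are drawn, so the display darkens gradually
values = [display_b(lambda x, y: 100, 1, 1, True, c, 4) for c in range(5)]
# values == [100.0, 75.0, 50.0, 25.0, 0]
```

Since the counter advances per pixel, consecutive pixels within one line already differ in brightness, which is what distinguishes this variant from the line-granular display C.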
Fig. 14 is a flowchart illustrating a fourth example of the processing performed by the imaging device 100 according to the embodiment of the invention. In the fourth example, the masked display is performed on a line-by-line basis, and the preconditions are the same as in the first example. The fourth example uses a variable "LineCounter", which is reset when the shutter release button is pressed and is incremented every time the pixels of one line are displayed.
The variable "LineCounter" is held in the transition counter 158, and is incremented by "1" (step S934) every time the pixels of one line are displayed (step S950). The variable "LineCounter" is used to obtain the pixel value mixing ratio in the processing for display C (step S950).
Fig. 15 is a flowchart illustrating an example of the imaging instruction processing in the fourth example according to the embodiment of the invention. This processing is performed asynchronously with the processing of Fig. 14 described above. In this processing, the value of the variable "RegMaskOnOff" is set and the variable "LineCounter" is initialized when the shutter release button is pressed.
Specifically, when the imaging instruction is given by pressing the shutter release button (step S831), the variable "RegMaskOnOff" is set to "on" (step S832), and the variable "LineCounter" is initialized to "0" (step S833). The variable "RegMaskOnOff" is referred to in step S917 of Fig. 14. The variable "LineCounter" is referred to in step S934 of Fig. 14.
Fig. 16 is a flowchart illustrating an example of the processing for display C (step S950) in the fourth example according to the embodiment of the invention.
In the processing for display C, when the variable "MaskOnOff" indicates "off" (step S951), the pixel (x, y) in the image memory is read into the variable "d" (step S952) and is displayed on the display 160 (step S959). The processing in these steps is the same as that of the processing for display A described with reference to Fig. 9.
On the contrary, when the variable "MaskOnOff" indicates "on" (step S951), the relation between the variable "LineCounter" and a variable "LineMaskNum" is determined. The variable "LineMaskNum" is held in the transition segment register 157, and indicates the number of lines in the transition segment 703 shown in Fig. 3. When the variable "LineCounter" is less than the variable "LineMaskNum" (step S953), the pixel to be displayed lies within the transition segment 703. Therefore, the pixel (x, y) read from the image memory is mixed with the mask data (step S955). Specifically, the pixel value is obtained based on the ratio of the variable "LineCounter" to the variable "LineMaskNum", and the obtained value is set to the variable "d".
On the contrary, when the variable "LineCounter" is not less than the variable "LineMaskNum" (step S953), the pixel to be displayed does not lie within the transition segment 703. Therefore, the mask data MaskColor, having a value representing, for example, black, is set to the variable "d" (step S954). In either case, the pixel whose pixel value is set to the variable "d" is displayed on the display 160.
As described above, in the fourth example, the pixel (x, y) read from the image memory is mixed with the mask data line by line within the transition segment 703, so that the mask data can be displayed gradually.
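Display C differs from display B only in the granularity of the counter: every pixel in a line shares one mixing ratio. A sketch under the same assumed linear blend (the function name and mask value are, again, invented for illustration):

```python
MASK_COLOR = 0  # black mask, an assumption

def display_c_line(line_pixels, mask_on, line_counter, line_mask_num):
    """Return the displayed values for one whole line (Fig. 16 sketch)."""
    if not mask_on:                                  # step S951
        return list(line_pixels)                     # step S952: image data as-is
    if line_counter < line_mask_num:                 # step S953: inside segment 703
        ratio = line_counter / line_mask_num         # one ratio for the entire line
        return [(1 - ratio) * p + ratio * MASK_COLOR for p in line_pixels]
    return [MASK_COLOR] * len(line_pixels)           # step S954: fully masked

# Successive lines darken in steps; pixels within one line stay uniform
print(display_c_line([200, 100], True, 1, 2))  # → [100.0, 50.0]
```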
Fig. 17 is a flowchart illustrating a fifth example of the processing performed by the imaging device 100 according to the embodiment of the invention. In the fifth example, the masked display is performed on a pixel-by-pixel basis, and the preconditions are the same as in the third example.
In the fifth example, the variable "PixelCounter" is incremented by "1" (step S931) in a manner similar to the third example. The fifth example differs from the third example in that, when a new field is displayed, the variable "PixelCounter" is set to the value of the variable "PixelMaskNum" (step S914). This processing is a countermeasure for the case in which the display changes to another, new field before the variable "PixelCounter" reaches the number of pixels corresponding to the transition segment 703. It is considered that even if the mask data is displayed from the beginning of the new field, the masked display does not give a sense of incongruity.
As described above, in the fifth example, when the display changes to another, new field before the variable "PixelCounter" reaches the number of pixels corresponding to the transition segment 703, the mask data can be displayed from the beginning of the new field.
Fig. 18 is a flowchart illustrating a sixth example of the processing performed by the imaging device 100 according to the embodiment of the invention. In the sixth example, the mask display is performed line by line, and the preconditions are the same as in the fourth example.
The sixth example is obtained by modifying the fourth example in a manner similar to the fifth example. The variable "LineCounter" is incremented by "1" (step S934), and when the display is switched to another new field, the variable "LineCounter" is set to the value of the variable "LineMaskNum" (step S914). Accordingly, even when the display is switched to another new field before the number of counted lines reaches the number of lines corresponding to the transition zone 703, the mask data can be displayed from the outset.
Fig. 19 is a flowchart illustrating a seventh example of the processing performed by the imaging device 100 according to the embodiment of the invention. In the seventh example, the mask display is performed line by line, and the preconditions are the same as in the first example.
In the seventh example, the variable "y", which indicates the line number counted in the vertical direction, is changed so as to decrease (steps S963, S975, and S976). This is the only difference between the seventh example and the first example. Accordingly, the processing of the seventh example can cope with the dioptric system described with reference to Fig. 6D.
The seventh example has been described for the case in which the mask display is performed line by line. In each of the other examples as well, the variable "y" may be changed so as to decrease in order to cope with the dioptric system.
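As a rough illustration of the change introduced in the seventh example, the following hypothetical helper contrasts the increasing line order of the first example with the decreasing order of steps S963, S975, and S976; the name `line_order` and the boolean flag are invented for this sketch.

```python
def line_order(num_lines, dioptric=False):
    """Order in which the line index y is visited (sketch).

    In the first example y counts upward from 1; in the seventh example
    y is changed so as to decrease, which suits an optical path that
    inverts the image vertically."""
    if dioptric:
        return list(range(num_lines, 0, -1))  # y decreases: n, n-1, ..., 1
    return list(range(1, num_lines + 1))      # y increases: 1, 2, ..., n
```

Because only the direction of the line counter changes, the masking decisions made per line (for example, steps S951 through S955) are unaffected.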
Fig. 20 is a flowchart illustrating an eighth example of the processing performed by the imaging device 100 according to the embodiment of the invention. In the eighth example, it is assumed that a progressive method with a transfer rate of 1/60 second is used. The other preconditions are the same as in the first example.
In the eighth example, since the progressive method is assumed, the variable "field" is not used. The variable "y" is initialized to "1" (step S983), and is incremented by "1" each time one line of pixels has been displayed (step S995).
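The eighth example's line counting without a "field" variable can be sketched as follows; `progressive_scan` is a hypothetical name used only for this illustration of steps S983 and S995.

```python
def progressive_scan(num_lines):
    """Sketch of the eighth example's line counting.

    With the progressive method no "field" variable is needed: y is
    initialized to 1 (step S983) and incremented by 1 each time one
    line of pixels has been displayed (step S995). Returns the lines
    in display order."""
    y = 1                      # step S983: initialize y
    displayed = []
    while y <= num_lines:
        displayed.append(y)    # one line of pixels is displayed
        y += 1                 # step S995: advance to the next line
    return displayed
```

Every line of the frame is thus visited exactly once in top-to-bottom order, with no interleaving of fields.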
The eighth example has been described for the case in which the mask display is performed line by line. The progressive method may also be used in each of the other examples.
The above examples have been described for the case in which the mask display is performed while the screen is divided in the horizontal direction as shown in Figs. 4A and 4B. The mask display may also be performed in other display modes.
Figs. 21A and 21B are diagrams illustrating other display examples produced by the imaging device 100 according to the embodiment of the invention. Fig. 21A illustrates a display example in which the mask display area is circular. Fig. 21B illustrates another display example in which the mask display area is shown obliquely.
As described above, according to the present embodiment of the invention, when the imaging device 100 displays an image on the display 160 without waiting for the image data of one frame (or one field) to be completely recorded in the video memory 140, the device can mask the image distortion caused by switching between the operation modes of the image sensor.
The embodiment of the invention is an example for implementing the invention. Although there are correspondences between the features of the claims and the specific elements of the embodiment of the invention, as described below, the invention is not limited to this embodiment. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations, and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
The image data providing section 141 corresponds, for example, to an image data generator according to an embodiment of the invention. The mask data providing section 151 corresponds, for example, to mask data supplying means according to this embodiment of the invention. The imaging instruction accepting section 191 corresponds, for example, to imaging instruction receiving means according to this embodiment of the invention. The drawing section 152 corresponds, for example, to drawing means according to this embodiment of the invention.
In addition, each of steps S921, S941, and S951 corresponds, for example, to a detecting step according to an embodiment of the invention. Each of step S924, the combination of steps S944 and S945, and the combination of steps S954 and S955 corresponds, for example, to a drawing step according to this embodiment of the invention.
The processing steps described in the foregoing embodiment of the invention may be regarded as a method including those processing steps, as a program for causing a computer to execute those processing steps, or as a recording medium storing the program.
Cross References to Related Applications
The present invention contains subject matter related to Japanese Patent Application JP 2007-272112 filed in the Japan Patent Office on October 19, 2007, the entire contents of which are incorporated herein by reference.