CN101692694B - Image display control device, and method for controlling image display control device - Google Patents

Image display control device, and method for controlling image display control device

Info

Publication number
CN101692694B
CN101692694B CN2008101749178A
Authority
CN
China
Prior art keywords
pixel
imaging instruction
data
image
variable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2008101749178A
Other languages
Chinese (zh)
Other versions
CN101692694A (en)
Inventor
土井宣洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Priority to CN2008101749178A priority Critical patent/CN101692694B/en
Publication of CN101692694A publication Critical patent/CN101692694A/en
Application granted granted Critical
Publication of CN101692694B publication Critical patent/CN101692694B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Studio Devices (AREA)

Abstract

The invention provides an image display control device, a method for controlling the image display control device, and a program. The image display control device comprises an image data providing part, a mask data providing part, an imaging instruction accepting part, and a drawing part. The image data providing part provides, as captured image data, an image composed of a plurality of pixels arranged two-dimensionally in a matrix. The mask data providing part provides mask data for masking the image data. The imaging instruction accepting part asserts an imaging instruction acceptance signal when an operation input corresponding to an imaging instruction is received. The drawing part sequentially draws each pixel of the image data at intervals of a predetermined period, each pixel having a pixel value. When the image display control device detects that the imaging instruction acceptance signal is asserted, the drawing part draws the pixels after the detection so that each pixel takes, as its pixel value, a value that includes the mask data.

Description

Image display control apparatus and method for controlling the same
Technical Field
The present invention relates to an image display control apparatus, and more particularly to an image display control apparatus that performs display control when a captured image is displayed on a display, to a method for controlling the image display control apparatus, and to a program that causes a computer to execute the method.
Background Art
In an imaging device such as a digital camera, an image captured by an image sensor (for example, a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) sensor) is displayed on a liquid crystal display (LCD) or an electronic viewfinder (EVF). Many recent imaging devices do not include an optical viewfinder, so in many cases the composition of the subject is determined based on the display on the LCD or EVF.
In this case, there is a time lag between the image the user intends to capture and the image actually recorded after the user presses the shutter release. Specifically, after the user recognizes the subject and presses the shutter release, a time corresponding to the circuit latency of the imaging device elapses before the image sensor exposes the image. When the composition of the subject is determined based on the display on the LCD or EVF, the time taken until the image is displayed on the LCD or EVF after exposure starts is added to the above time. Reducing this delay is therefore important for providing a user-friendly imaging device.
According to one background technique for realizing high-speed monitor display, when the operation mode is switched alternately between a still-image mode (still mode) and a moving-image mode (monitoring mode), higher priority is given to the moving-image mode to reduce the display delay of the monitor during continuous shooting. Japanese Unexamined Patent Application Publication No. 2005-142707, for example, discloses such a technique.
Summary of the Invention
An imaging device generally includes an image memory for storing captured image data corresponding to the image captured by the image sensor. The image data is read from the image memory, and the image is displayed on the display. In general, the image data is read from the image memory only after the image data of one frame (or one field) to be displayed has been written to the image memory.
When the shutter release is pressed, the operation mode of the image sensor is switched, which changes the transfer rate of the image data. The image data may therefore be distorted. If the image data is read from the image memory while the data is being written to it, the image data distorted at the moment the shutter release is pressed is output as it is. If, to prevent distorted output, the image data is read from the image memory only after the writing of the image data has been completed, the display delay is prolonged.
In addition, in some optical systems, the order in which the image data is written to the image memory differs from the order in which the image data is displayed. Therefore, if the image data is read from the image memory while it is being written, an unnatural image may be displayed.
It is desirable to reduce the display delay while preventing image distortion.
The present invention has been made to address the above problems. According to a first embodiment of the present invention, there is provided an image display control apparatus including: an image data providing unit configured to provide, as captured image data, an image composed of a plurality of pixels arranged two-dimensionally in a matrix; a mask data providing unit configured to provide mask data for masking the image data; an imaging instruction accepting unit configured to assert an imaging instruction acceptance signal when an operation input corresponding to an imaging instruction is received; and a drawing unit configured to sequentially draw each pixel of the image data at intervals of a predetermined period, each of the pixels having a pixel value. When the imaging instruction acceptance signal is detected to be asserted, the drawing unit draws the pixels after the detection so that each pixel takes, as its pixel value, a value that includes the mask data.
According to a second embodiment of the present invention, there is provided a method for controlling image display in an image display control apparatus.
According to a third embodiment of the present invention, there is provided a program that causes a computer to execute the method in the image display control apparatus.
According to each of the above embodiments, advantageously, the pixels are drawn such that, after the imaging instruction acceptance signal is detected to be asserted, each pixel takes, as its pixel value, a value that includes the mask data.
In the first embodiment, when the imaging instruction acceptance signal is detected to be asserted, the drawing unit may, after the detection, draw the pixels with pixel values blended with the mask data so that the pixel values of the pixels change gradually into the mask data over a predetermined segment. Advantageously, the transition to the mask data can be displayed naturally.
In the first embodiment, when a new image is displayed after the imaging instruction acceptance signal is detected to be asserted, the drawing unit may draw the mask data as the pixel value of each pixel. Advantageously, when the display moves on to a new image before the transition to the mask data is complete, the mask data can be displayed without being blended with the pixel values.
In the first embodiment, when the imaging instruction acceptance signal is detected to be asserted, the drawing unit may, after the detection, draw the mask data as the pixel value of each pixel. Advantageously, after the imaging instruction acceptance signal is detected to be asserted, the mask data is drawn as the pixel value of each pixel.
In the first embodiment, the drawing unit may perform the detection each time it finishes drawing a group of pixels arranged continuously in one direction in the arrangement. Advantageously, whether the imaging instruction acceptance signal is asserted is determined line by line, so that pixels having values that include the mask data can be drawn on a line basis.
In the first embodiment, the drawing unit may perform the detection each time it finishes drawing one pixel in the arrangement. Advantageously, whether the imaging instruction acceptance signal is asserted is determined pixel by pixel, so that pixels having values that include the mask data can be drawn on a pixel basis.
In the first embodiment, the drawing unit may draw the pixels in the vertical direction such that the direction in which the pixels are drawn matches the scanning direction on the display surface of the display to be controlled. Advantageously, the pixels can be drawn in a direction suited to the characteristics of the optical system.
In the first embodiment, the mask data may include color information and a transmissivity. Advantageously, the image data can be displayed so as to change naturally into the mask data.
Embodiments of the present invention have the significant advantage that the time delay in the imaging device can be reduced while image distortion is prevented.
Brief Description of the Drawings
Fig. 1 is a block diagram showing the structure of an imaging device 100 according to an embodiment of the present invention;
Fig. 2 is a block diagram showing the functional structure of the essential part of the device according to the embodiment of the present invention;
Fig. 3 is a timing chart showing the operation of the imaging device 100 according to the embodiment of the present invention;
Figs. 4A and 4B are diagrams showing examples of display by the imaging device 100 according to the embodiment of the present invention;
Figs. 5A and 5B are diagrams each showing the relationship between a lens unit 110 and an imager 121;
Figs. 6A to 6D are diagrams each showing the relationship between a subject and the display on a display 160;
Fig. 7 is a flowchart showing a first example of processing performed by the imaging device 100 according to the embodiment of the present invention;
Fig. 8 is a flowchart showing an example of processing for handling an imaging instruction in the first example according to the embodiment of the present invention;
Fig. 9 is a flowchart showing an example of display processing A in the first example according to the embodiment of the present invention;
Fig. 10 is a flowchart showing a second example of processing performed by the imaging device 100 according to the embodiment of the present invention;
Fig. 11 is a flowchart showing a third example of processing performed by the imaging device 100 according to the embodiment of the present invention;
Fig. 12 is a flowchart showing an example of processing for handling an imaging instruction in the third example according to the embodiment of the present invention;
Fig. 13 is a flowchart showing an example of display processing B in the third example according to the embodiment of the present invention;
Fig. 14 is a flowchart showing a fourth example of processing performed by the imaging device 100 according to the embodiment of the present invention;
Fig. 15 is a flowchart showing an example of processing for handling an imaging instruction in the fourth example according to the embodiment of the present invention;
Fig. 16 is a flowchart showing an example of display processing C in the fourth example according to the embodiment of the present invention;
Fig. 17 is a flowchart showing a fifth example of processing performed by the imaging device 100 according to the embodiment of the present invention;
Fig. 18 is a flowchart showing a sixth example of processing performed by the imaging device 100 according to the embodiment of the present invention;
Fig. 19 is a flowchart showing a seventh example of processing performed by the imaging device 100 according to the embodiment of the present invention;
Fig. 20 is a flowchart showing an eighth example of processing performed by the imaging device 100 according to the embodiment of the present invention; and
Figs. 21A and 21B are diagrams showing other examples of display by the imaging device 100 according to the embodiment of the present invention.
Embodiment
Embodiments of the present invention will now be described in detail with reference to the accompanying drawings.
Fig. 1 is a diagram showing the structure of an imaging device 100 according to an embodiment of the present invention. The imaging device 100 includes a lens unit 110, an imaging unit 120, a signal processing unit 130, an image memory 140, a display control unit 150, a display 160, an image recording unit 170, an external memory interface (I/F) 180, and a system control unit 190.
The lens unit 110 is an optical unit configured to guide light reflected from a subject to the light receiving surface of the imaging unit 120, and includes a group of lenses such as a focus lens and a zoom lens, an iris, and drivers for driving the lenses. The lens unit 110 forms a subject image corresponding to the light passing through an optical path 119 on the light receiving surface of the imaging unit 120.
The imaging unit 120 is configured to convert the light supplied from the lens unit 110 into an electrical signal and to send the signal to the signal processing unit 130 as a captured image signal via a signal line 129. The imaging unit 120 changes the transfer rate and the operation mode, which is associated with the resolution, according to instructions supplied from the system control unit 190.
The signal processing unit 130 is configured to receive the captured image signal transmitted from the imaging unit 120 and to perform predetermined signal processing on the received signal. The signal processing unit 130 converts the image signal from analog form to digital form, and performs signal processing such as white balance correction, gamma correction, enlargement, and reduction on the signal. The signal processing unit 130 outputs the resulting image signal to the image memory 140 as captured image data via a signal line 139.
The image memory 140 is configured to store the captured image data that has undergone the signal processing. The image data stored in the image memory 140 is supplied to the display control unit 150 via a signal line 149 and to the image recording unit 170 via a signal line 148. The captured image data is composed of a plurality of pixels arranged two-dimensionally in a matrix. Each group of pixels arranged in the horizontal direction is called a row (or line). The pixel positions within each row are called columns.
The display control unit 150 is configured to read the image data from the image memory 140 via the signal line 149 and to output the read data to the display 160 via a signal line 159. In this embodiment of the present invention, the display control unit 150 does not have to wait for the operation of writing one frame (or one field) of image data to the image memory 140 to finish before reading the image data from the image memory 140. Instead, the display control unit 150 masks the image display on a line basis or on a pixel basis to prevent image distortion. The system control unit 190 controls these timings.
The display 160 displays the image output from the display control unit 150 and includes, for example, a liquid crystal display (LCD). The image is displayed on the display 160 by scanning it line by line, from left to right in the horizontal direction and sequentially from top to bottom, starting at the upper left corner.
The image recording unit 170 reads the image data from the image memory 140 via the signal line 148 and compresses the data in a predetermined file format (for example, Joint Photographic Experts Group (JPEG)). The compressed image data is output to the external memory interface 180 via a signal line 179. The external memory interface 180 is an interface for connecting to an external memory 210. The image data compressed by the image recording unit 170 is recorded to the external memory 210 through the external memory interface 180. The external memory 210 includes, for example, a hard disk or a flash memory.
The system control unit 190 controls the entire imaging device 100. For example, the system control unit 190 monitors operation inputs from the user. When an imaging instruction is received, the system control unit 190 changes the operation mode of the imaging unit 120. In addition, the system control unit 190 sends information indicating that the imaging instruction has been received to the display control unit 150.
When the progressive method is used, the imaging unit 120 outputs image data of one frame every 1/30 second, for example. When the interlace method is used, the imaging unit 120 outputs image data of one field (the even lines or the odd lines of a frame) every 1/60 second, for example. As operation modes, the imaging unit 120 has a monitoring mode, in which a moving image is displayed while the subject is being framed, and a still mode, in which a still image is captured, and performs different operations in these modes. In the monitoring mode, a small number of pixels is sufficient to capture the image of the whole subject, and the resolution is adjusted to that of the display 160 in consideration of the balance between resolution and power consumption. In the still mode, image data of higher resolution and higher density is output. Because the number of pixels is large, outputting the image data takes longer than in the monitoring mode.
To capture a still image, the imaging unit 120 normally operates in the monitoring mode so that the latest image is displayed on the display 160. When an imaging instruction is given, for example when the user presses the shutter release, the imaging unit 120 is switched to the still mode.
Switching between the monitoring mode and the still mode takes a relatively long time. In addition, a distorted image may be output during the switching. Therefore, the display on the display 160 is switched to a masked display of, for example, black, starting from the field (or frame) following the field (or frame) in which the imaging instruction was given. The masked display is continued until the output of the image data captured in the still mode is completed, thereby preventing the distorted image from being output.
Fig. 2 is a diagram showing the functional structure of the essential part of the imaging device according to the embodiment of the present invention. The essential part includes an image data providing part 141, a mask data providing part 151, a drawing part 152, a line counter 153, a column counter 154, a field counter 155, a mask control register 156, a transition segment register 157, a transition counter 158, a display part 161, and an imaging instruction accepting part 191.
The image data providing part 141 supplies the captured image data to the drawing part 152. The image data is obtained by performing signal processing with the signal processing unit 130 on the image signal captured by the imaging unit 120, and is supplied from the image memory 140 to the display control unit 150.
The mask data providing part 151 provides the data to be used as the mask data displayed on the display 160. The mask data may include color information and a transmissivity. For example, data representing black with a transmissivity of 100% can be used.
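Purely as an illustration of the form the mask data can take, the color information and the transmissivity can be held together in a small structure. The following C sketch is not taken from the patent; the type and field names are assumptions, and the example value simply restates the black-with-100%-transmissivity example above.

#include <stdint.h>

/* Assumed representation of one mask-data entry: color information plus
 * a transmissivity expressed in percent. */
typedef struct {
    uint8_t r, g, b;        /* color information */
    uint8_t transmissivity; /* percent */
} MaskData;

/* Example from the text: black with a transmissivity of 100%. */
static const MaskData kDefaultMask = { 0, 0, 0, 100 };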
The drawing part 152 draws the pixels to be output to the display part 161, using the image data supplied from the image data providing part 141 or the mask data supplied from the mask data providing part 151.
The line counter 153 indicates the line of the pixel to be drawn by the drawing part 152. The column counter 154 indicates the column of the pixel to be drawn by the drawing part 152. When display is performed according to the interlace method, the field counter 155 indicates the field to be drawn by the drawing part 152. When display is performed according to the progressive method, the field counter 155 is not used. The drawing part 152 refers to the image data supplied from the image data providing part 141 according to the information indicated by the line counter 153, the column counter 154, and the field counter 155, and outputs the drawing result.
The mask control register 156 holds the imaging instruction acceptance signal, which indicates that an operation input corresponding to an imaging instruction has been accepted. The mask control register 156 asserts the imaging instruction acceptance signal in response to an instruction from the imaging instruction accepting part 191. The drawing part 152 refers to the imaging instruction acceptance signal.
The transition segment register 157 holds the transition segment between the point at which the operation input corresponding to the imaging instruction is accepted and the point at which the mask data is displayed. The transition segment is a parameter that depends on the structure of the imaging device, so an optimum value can be set in the transition segment register 157 according to the structure of the imaging device.
The transition counter 158 holds the current position within the transition segment. In general, the transition counter 158 is reset when the operation input corresponding to the imaging instruction is accepted, and counts up until the transition segment ends. The transition counter 158 is used when the image data and the mask data are blended.
The imaging instruction accepting part 191 accepts the operation input corresponding to the imaging instruction from the user. The imaging instruction accepting part 191 is realized, for example, by the shutter release.
The display part 161 displays the pixels output from the drawing part 152. The display part 161 corresponds to the display 160.
The mask data providing part 151, the drawing part 152, the line counter 153, the column counter 154, the field counter 155, the mask control register 156, the transition segment register 157, and the transition counter 158 are implemented as functions of the display control unit 150.
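To keep the later flowchart descriptions concrete, the counters and registers of Fig. 2 can be pictured as a single block of state. The structure below is only an assumed software model; the patent defines functional blocks rather than a data layout, and the variable names are those used in the examples that follow.

#include <stdbool.h>

/* Assumed software model of the display-control state of Fig. 2. */
typedef struct {
    int  x;               /* column counter 154 */
    int  y;               /* line counter 153 */
    int  field;           /* field counter 155: 1 = odd field, 2 = even field */
    bool RegMaskOnOff;    /* mask control register 156 (imaging instruction acceptance signal) */
    bool MaskOnOff;       /* latched copy referred to by the drawing part 152 */
    int  PixelCounter;    /* transition counter 158 when counted per pixel */
    int  LineCounter;     /* transition counter 158 when counted per line */
    int  PixelMaskNum;    /* transition segment register 157, in pixels */
    int  LineMaskNum;     /* transition segment register 157, in lines */
} DisplayControlState;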
Fig. 3 is a timing chart showing the operation of the imaging device 100 according to the embodiment of the present invention.
When the interlace method is used, the image signal output from the imaging unit 120 via the signal line 129 is transferred field by field. When the progressive method is used, the signal is transferred frame by frame. In Fig. 3, the interlace method is assumed. The figure shows an odd field signal 711 of a first frame, an even field signal 712 of the first frame, and an odd field signal 713 of a second frame. In some image sensors, each field or each frame contains invalid data at its end. In the example shown in Fig. 3, invalid data blocks 721 to 723 at the ends of the signals 711 to 713 are shown with stripes.
A dotted line 701 indicates the time at which the user inputs the imaging instruction. The captured image signal supplied after the imaging instruction is input may contain the distorted image section described above. In Fig. 3, the portions containing this distorted image section are shaded.
Each processed signal output from the signal processing unit 130 via the signal line 139 is output with a slight delay relative to the corresponding image signal. This delay is caused by the imaging circuitry. The transfer rate of these signals is the same as that of the image signals. Because the invalid data blocks 721 to 723 are not needed for subsequent processing, they are not output from the signal processing unit 130.
Shown below the processed signals are an output 349 after recording to the image memory is completed and a memory output display 359, which represent the timing obtained on the assumption that the display control unit 150 always waits for the recording of the signal supplied from the signal processing unit 130 to the image memory 140 to finish before reading the data. In this case, the masked display can start from the head of the even field signal 712 of the first frame, but the display itself is delayed. According to the embodiment of the present invention, the display control unit 150 reads data via the signal line 149 without waiting for the recording of the signal to the image memory 140 to finish.
In other words, the image data read from the image memory 140 via the signal line 149 is output earlier than the output 349. In this example, the imaging instruction is given in the middle of the even field signal 712 of the first frame, so if the display were switched to the masked display only from the next field, the distorted image section might remain (see the comparative display output 459). According to the embodiment of the present invention, the masked display is performed immediately after the imaging instruction 701 is given, so that the distorted image section is not output to the display 160 as the display output via the signal line 159.
In this embodiment, the mask data representing black may be displayed immediately after the imaging instruction is given. In this case, the mask data appears in the middle of a field image, as shown in Fig. 4A, which may give an unnatural impression depending on the original image. According to a modification of the present embodiment, the data is not switched completely to the mask data immediately after the imaging instruction is given. Instead, the image data representing the pixel value of each pixel of the original image is blended with the mask data, so that the pixel value of each pixel changes gradually into the mask data during a transition segment 703 between the time the imaging instruction is accepted and the start of a distorted image section 702. At this time, in addition to changing the color signal, the transmissivity is also increased gradually so that the image data changes gradually into the mask data. The image can thus be displayed so that the switch to the masked display looks natural, as shown in Fig. 4B.
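As one possible reading of this modification, the gradual change can be modeled as a blend weight that grows across the transition segment 703 and is applied to the mask color. The sketch below is an interpretation only; the weight, its range, and the per-channel arithmetic are assumptions, and the concrete per-pixel and per-line variants appear in the flowchart examples later in this description.

#include <stdint.h>

typedef struct { uint8_t r, g, b; } Rgb;

/* Weight of the mask at position pos within a transition segment of
 * length len: 0 where the imaging instruction is accepted, 255 (fully
 * masked) at the end of the segment. */
static uint8_t mask_weight(int pos, int len)
{
    if (pos >= len) return 255;
    return (uint8_t)((255 * pos) / len);
}

/* Blend one image pixel toward the mask color with the ramped weight. */
static Rgb ramp_to_mask(Rgb img, Rgb mask, int pos, int len)
{
    uint8_t w = mask_weight(pos, len);
    Rgb out = {
        (uint8_t)(((255 - w) * img.r + w * mask.r) / 255),
        (uint8_t)(((255 - w) * img.g + w * mask.g) / 255),
        (uint8_t)(((255 - w) * img.b + w * mask.b) / 255),
    };
    return out;
}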
The relationship between the imager and the display control will now be described in detail with reference to Figs. 5A and 5B.
Figs. 5A and 5B are diagrams each showing the relationship between the lens unit 110 and an imager, denoted 121. The imager 121 corresponds to the imaging unit 120 and includes a CCD or CMOS image sensor.
Fig. 5A shows the relationship between a typical optical system and the imager 121 as viewed from the side of the housing of the imaging device 100. In this case, an optical system including a group of lenses 111 is arranged in front of the imager 121. The lens group 111, which is placed between a subject 11 and the imager 121, converges the light reflected from the subject 11 onto the imaging surface of the imager 121, so that the subject image corresponding to the light is inverted both vertically (upside down) and horizontally (left-right reversed).
Fig. 5B shows the relationship between a folded (bending) optical system and the imager 121 as viewed from the side of the housing of the imaging device 100. In this case, the folded optical system includes a group of lenses 113 arranged in front of the imager 121 and a reflecting member 112, arranged between a subject 12 and the lens group 113, that bends the optical axis of the light. The reflecting member 112 reflects the light from the subject 12 at substantially a right angle, inverting the subject image vertically. The reflecting member 112 is realized, for example, by a prism. The lens group 113 converges the light from the reflecting member 112 onto the imaging surface of the imager 121, so that the subject image corresponding to the light is inverted both vertically (upside down) and horizontally (left-right reversed). Folded optical systems are often used in low-profile (thin) imaging devices.
Figs. 6A to 6D are diagrams showing the relationship between the subject and the display on the display 160. Fig. 6A shows a subject image 19 as viewed from the imaging device 100. For ease of understanding, the thick arrow in the subject image 19 is aligned with the scanning direction on the display surface of a typical display.
Fig. 6B shows the image formed on the imaging surface of the imager 121 when the typical optical system shown in Fig. 5A is used, as viewed from directly in front of the imager 121 (that is, from the subject side). The thick arrow corresponds to the formed subject image 19. The dotted arrow indicates the direction in which the image data is output from the imager 121. In this case, the thick arrow and the dotted arrow shown outside the imager 121 match in the vertical direction. Therefore, the image signal is read from the imager 121 from bottom to top, and the image is displayed on the display 160 from top to bottom.
Fig. 6C shows the image formed on the imaging surface when the folded optical system shown in Fig. 5B is used, as viewed from the upper side of a background-art imaging device (that is, from directly in front of the imager 121). The thick arrow corresponds to the formed subject image 19. The dotted arrow indicates the direction in which the image data is output from the imager 121. In this case, the thick arrow and the dotted arrow shown outside the imager 121 are opposite in the vertical direction. If the image signal is read from the imager 121 from bottom to top, the image is displayed on the display 160 from bottom to top. If the imager 121 is a CMOS sensor, either the bottom-to-top order or the top-to-bottom order can be selected as the order in which the image data is output. If the imager 121 is a CCD sensor, the direction in which the image data is output is structurally limited to one direction. Therefore, when the imager includes a typical CCD sensor, the imager is oriented so that the lower side of its imaging surface faces the subject for use with a general optical system, and the image data is output from the imaging surface from bottom to top and from left to right.
According to the embodiment of the present invention, when the folded optical system is used, the imager is arranged so that the upper side of its imaging surface faces the subject, and the data is read from the image memory 140 from top to bottom. The image is therefore displayed on the display 160 from top to bottom.
Fig. 6D shows the image formed on the imaging surface when the folded optical system shown in Fig. 5B is used, as viewed from the upper side of the imaging device 100 according to the present embodiment (that is, from directly in front of the imager 121). The thick arrow corresponds to the formed subject image 19. The dotted arrow indicates the direction in which the image data is output from the imager 121. In this case, the thick arrow and the dotted arrow shown outside the imager 121 match in the vertical direction. The image signal is read from the imager 121 from top to bottom, and the image is displayed on the display 160 from top to bottom.
The operation of the imaging device 100 according to the embodiment of the present invention will now be described with reference to the drawings.
Fig. 7 is a flowchart showing a first example of the processing performed by the imaging device 100 according to the embodiment of the present invention. In the first example, the masked display is performed on a line basis. It is assumed that an image of VGA display size (640 x 480 pixels) is transferred according to the interlace method at a rate of 1/60 second per field. A variable "x" held in the column counter 154 indicates the column position in the horizontal direction. A variable "y" held in the line counter 153 indicates the line position in the vertical direction. A variable "field" is held in the field counter 155; the value "1" represents the odd field and the value "2" represents the even field. A variable "RegMaskOnOff" is held in the mask control register 156; "on" means that the masked display is performed, and "off" means that it is not performed. A variable "MaskOnOff" is used to refer to the value of the mask control register 156.
First, the variable "field" is initialized to "1" (step S911), the variable "y" is initialized to the value of the variable "field" (initially "1") (step S913), and the variable "x" is initialized to "1" (step S915). The variable "MaskOnOff" is set to the value of the variable "RegMaskOnOff" (step S917). A pixel (x, y) is displayed according to the value of the variable "MaskOnOff" (step S920).
The variable "x" is incremented by "1" (step S932), and the processing in step S920 and the subsequent steps is repeated until the pixels of one line (640 pixels in this example) have been displayed (step S933).
When the pixels of one line have been displayed, the processing advances to the display processing for the next line. In this example, because the interlace method is assumed, the variable "y" is incremented by "2" (step S935), and the processing in step S915 and the subsequent steps is repeated until the pixels of one field (equivalent to 240 lines in this example) have been displayed (step S936).
When the pixels of one field have been displayed, the processing advances to the display processing for the next field. The variable "field" is incremented by "1" (step S937). If the value of the variable "field" is "2" or less (step S938), the processing in step S913 and the subsequent steps is repeated. When the value of the variable "field" exceeds "2", the processing in step S911 and the subsequent steps is repeated to display the next frame (step S939). If the next frame is not to be displayed, the processing ends.
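The flow of Fig. 7 can be summarized by the following C sketch. It is one reading of the flowchart rather than code from the patent; display_pixel_A() stands for display processing A of Fig. 9, and next_frame_requested() is an assumed placeholder for the decision of step S939.

#include <stdbool.h>

#define WIDTH  640
#define HEIGHT 480

extern volatile bool RegMaskOnOff;    /* set by the imaging instruction processing of Fig. 8 */
bool MaskOnOff;

void display_pixel_A(int x, int y);   /* display processing A of Fig. 9 */
bool next_frame_requested(void);      /* assumed: whether another frame is to be displayed */

void first_example(void)
{
    do {
        for (int field = 1; field <= 2; field++) {        /* S911, S937, S938 */
            for (int y = field; y <= HEIGHT; y += 2) {    /* S913, S935, S936 */
                MaskOnOff = RegMaskOnOff;                 /* S917: latch once per line */
                for (int x = 1; x <= WIDTH; x++)          /* S915, S932, S933 */
                    display_pixel_A(x, y);                /* S920 */
            }
        }
    } while (next_frame_requested());                     /* S939 */
}

Note that the register value is latched into "MaskOnOff" once per line here; the second example described below moves that latch inside the innermost loop.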
Fig. 8 is a flowchart showing an example of the processing for handling the imaging instruction ("imaging instruction processing" hereinafter) in the first example according to the embodiment of the present invention. The imaging instruction processing is executed asynchronously with the processing shown in Fig. 7. In this processing, the value of the variable "RegMaskOnOff" is set when the shutter release is pressed.
Specifically, when the imaging instruction is given by pressing the shutter release (step S811), the variable "RegMaskOnOff" is set to "on", which indicates that the signal is asserted (step S812). The variable "RegMaskOnOff" is referred to in step S917 of Fig. 7.
Fig. 9 is a flowchart showing an example of display processing A (step S920) in the first example according to the embodiment of the present invention.
In display processing A, when the variable "MaskOnOff" indicates "off" (step S921), the pixel (x, y) in the image memory is read into a variable "d" (step S922) and displayed on the display 160 (step S929). Conversely, when the variable "MaskOnOff" indicates "on" (step S921), mask data MaskColor having a value representing black, for example, is set to the variable "d" (step S924) and displayed on the display 160 (step S929).
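Display processing A of Fig. 9 then reduces to a choice between the memory pixel and the mask color. The sketch below assumes a Mem(x, y) accessor for the image memory 140 and a put_pixel() output routine; these names, and the three-channel pixel format, are placeholders rather than anything defined in the patent.

#include <stdbool.h>
#include <stdint.h>

typedef struct { uint8_t r, g, b; } Pixel;   /* assumed pixel format */

extern bool MaskOnOff;              /* latched value of the mask control register 156 */
extern const Pixel MaskColor;       /* mask data, e.g. black */

Pixel Mem(int x, int y);                 /* assumed accessor for the image memory 140 */
void  put_pixel(int x, int y, Pixel d);  /* assumed output to the display 160 */

void display_pixel_A(int x, int y)
{
    Pixel d;
    if (!MaskOnOff)
        d = Mem(x, y);              /* S922: use the captured image data */
    else
        d = MaskColor;              /* S924: use the mask data */
    put_pixel(x, y, d);             /* S929: display the pixel */
}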
As described above, according to the first example of the embodiment of the present invention, whether the shutter release has been pressed is determined each time the pixels of one line are displayed. Advantageously, the masked display can be performed on a line basis.
Fig. 10 is a flowchart showing a second example of the processing performed by the imaging device 100 according to the embodiment of the present invention. In the second example, the masked display is performed on a pixel basis. The other preconditions are the same as in the first example.
The second example differs from the first example in the timing at which the value indicated by the variable "RegMaskOnOff" is referred to. In the first example, the variable "MaskOnOff" is set to the value of the variable "RegMaskOnOff" in step S917 each time the pixels of one line are displayed. In the second example, the variable "MaskOnOff" is set to the value indicated by the variable "RegMaskOnOff" in step S917 each time one pixel is displayed. Because display processing A then determines pixel by pixel whether to perform the masked display (step S920), the masked display can be performed on a pixel basis.
Fig. 11 is a flowchart showing a third example of the processing performed by the imaging device 100 according to the embodiment of the present invention. In the third example, the masked display is performed on a pixel basis, as in the second example, and the preconditions are the same as in the second example. The third example uses a variable "PixelCounter", which is reset when the shutter release is pressed and incremented each time one pixel is displayed.
The variable "PixelCounter" is held in the transition counter 158 and is incremented by "1" (step S931) each time one pixel is displayed (step S940). The variable "PixelCounter" is used to obtain the blending ratio of the pixel value in display processing B (step S940).
Fig. 12 is a flowchart showing an example of the processing for handling the imaging instruction in the third example according to the embodiment of the present invention. This processing is executed asynchronously with the processing described with reference to Fig. 11. When the shutter release is pressed, the value of the variable "RegMaskOnOff" is set and the variable "PixelCounter" is initialized.
Specifically, when the imaging instruction is given by pressing the shutter release (step S821), the variable "RegMaskOnOff" is set to "on" (step S822), and the variable "PixelCounter" is initialized to "0" (step S823). The variable "RegMaskOnOff" is referred to in step S931 of Fig. 11.
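The imaging instruction processing of Figs. 8 and 12 amounts to setting the register and, from the third example on, resetting the transition counter. A sketch, assuming the routine is invoked asynchronously from the shutter-release input (the function name is a placeholder):

#include <stdbool.h>

extern volatile bool RegMaskOnOff;   /* mask control register 156 */
extern volatile int  PixelCounter;   /* transition counter 158 (third example) */

/* Called asynchronously when the shutter release is pressed (S811/S821). */
void on_imaging_instruction(void)
{
    RegMaskOnOff = true;   /* S812/S822: assert the imaging instruction acceptance signal */
    PixelCounter = 0;      /* S823: restart the transition segment (the fourth example resets "LineCounter" instead) */
}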
Fig. 13 is a flowchart showing an example of display processing B (step S940) in the third example according to the embodiment of the present invention.
In display processing B, when the variable "MaskOnOff" indicates "off" (step S941), the pixel (x, y) in the image memory is read into the variable "d" (step S942) and displayed on the display 160 (step S949). The processing in these steps is the same as that of display processing A described with reference to Fig. 9.
Conversely, when the variable "MaskOnOff" indicates "on" (step S941), the relationship between the variable "PixelCounter" and a variable "PixelMaskNum" is examined. The variable "PixelMaskNum" is held in the transition segment register 157 and indicates the number of pixels in the transition segment 703 shown in Fig. 3. When the variable "PixelCounter" is less than the variable "PixelMaskNum" (step S943), the pixel to be displayed lies within the transition segment 703. Therefore, the pixel (x, y) in the image memory is blended with the mask data (step S945). Specifically, the pixel value is obtained from the ratio of the variable "PixelCounter" to the variable "PixelMaskNum" using the following formula, and the obtained value is set to the variable "d".
d ← ((PixelMaskNum - PixelCounter) / PixelMaskNum) × Mem(x, y) + (PixelCounter / PixelMaskNum) × MaskColor
When the variable "PixelCounter" is not less than the variable "PixelMaskNum" (step S943), the pixel to be displayed is outside the transition segment 703. Therefore, the mask data MaskColor having a value representing black, for example, is set to the variable "d" (step S944). In either case, a pixel whose pixel value is the value set to the variable "d" is displayed on the display 160 (step S949).
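A C sketch of display processing B, implementing the above blending formula per color channel. The same assumed names as in the display processing A sketch are used (Pixel, Mem(), put_pixel(), MaskColor); the patent states the formula over pixel values without fixing the arithmetic, so the integer evaluation here is one possible realization.

#include <stdbool.h>
#include <stdint.h>

typedef struct { uint8_t r, g, b; } Pixel;   /* assumed pixel format */

extern bool  MaskOnOff;         /* latched mask control register 156 */
extern int   PixelCounter;      /* transition counter 158, counted per pixel */
extern int   PixelMaskNum;      /* transition segment register 157, in pixels */
extern const Pixel MaskColor;   /* mask data, e.g. black */

Pixel Mem(int x, int y);                 /* assumed image-memory accessor */
void  put_pixel(int x, int y, Pixel d);  /* assumed display output */

/* ((n - c) / n) * img + (c / n) * mask, evaluated in integer arithmetic. */
static uint8_t mix(uint8_t img, uint8_t mask, int c, int n)
{
    return (uint8_t)(((n - c) * img + c * mask) / n);
}

void display_pixel_B(int x, int y)
{
    Pixel d;
    if (!MaskOnOff) {
        d = Mem(x, y);                            /* S942 */
    } else if (PixelCounter < PixelMaskNum) {     /* S943: inside transition segment 703 */
        Pixel m = Mem(x, y);
        d.r = mix(m.r, MaskColor.r, PixelCounter, PixelMaskNum);   /* S945 */
        d.g = mix(m.g, MaskColor.g, PixelCounter, PixelMaskNum);
        d.b = mix(m.b, MaskColor.b, PixelCounter, PixelMaskNum);
    } else {
        d = MaskColor;                            /* S944 */
    }
    put_pixel(x, y, d);                           /* S949 */
}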
As described above, in the third example, the pixel (x, y) read from the image memory is blended with the mask data on a pixel basis within the transition segment 703, so that the mask data can be displayed gradually.
Fig. 14 is a flowchart showing a fourth example of the processing performed by the imaging device 100 according to the embodiment of the present invention. In the fourth example, the masked display is performed on a line basis, and the preconditions are the same as in the first example. The fourth example uses a variable "LineCounter", which is reset when the shutter release is pressed and incremented each time the pixels of one line are displayed.
The variable "LineCounter" is held in the transition counter 158 and is incremented by "1" (step S934) each time the pixels of one line are displayed (step S950). The variable "LineCounter" is used to obtain the blending ratio of the pixel value in display processing C (step S950).
Fig. 15 is a flowchart showing an example of the processing for handling the imaging instruction in the fourth example according to the embodiment of the present invention. This processing is executed asynchronously with the processing of Fig. 14 described above. When the shutter release is pressed, the value of the variable "RegMaskOnOff" is set and the variable "LineCounter" is initialized.
Specifically, when the imaging instruction is given by pressing the shutter release (step S831), the variable "RegMaskOnOff" is set to "on" (step S832), and the variable "LineCounter" is initialized to "0" (step S833). The variable "RegMaskOnOff" is referred to in step S917 of Fig. 14. The variable "LineCounter" is referred to in step S934 of Fig. 14.
Fig. 16 is a flowchart showing an example of display processing C (step S950) in the fourth example according to the embodiment of the present invention.
In display processing C, when the variable "MaskOnOff" indicates "off" (step S951), the pixel (x, y) in the image memory is read into the variable "d" (step S952) and displayed on the display 160 (step S959). The processing in these steps is the same as that of display processing A described with reference to Fig. 9.
Conversely, when the variable "MaskOnOff" indicates "on" (step S951), the relationship between the variable "LineCounter" and a variable "LineMaskNum" is examined. The variable "LineMaskNum" is held in the transition segment register 157 and indicates the number of lines in the transition segment 703 shown in Fig. 3. When the variable "LineCounter" is less than the variable "LineMaskNum" (step S953), the pixel to be displayed lies within the transition segment 703. Therefore, the pixel (x, y) read from the image memory is blended with the mask data (step S955). Specifically, the pixel value is obtained from the ratio of the variable "LineCounter" to the variable "LineMaskNum" using the following formula, and the obtained value is set to the variable "d".
d ← ((LineMaskNum - LineCounter) / LineMaskNum) × Mem(x, y) + (LineCounter / LineMaskNum) × MaskColor
Conversely, when the variable "LineCounter" is not less than the variable "LineMaskNum" (step S953), the pixel to be displayed is outside the transition segment 703. Therefore, the mask data MaskColor having a value representing black, for example, is set to the variable "d" (step S954). In either case, a pixel whose pixel value is the value set to the variable "d" is displayed on the display 160.
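Display processing C differs from display processing B only in which counter drives the blending ratio. Assuming it sits in the same file as the display processing B sketch above (so Pixel, Mem(), put_pixel(), MaskColor, and mix() are already declared), it can be written as:

extern int LineCounter;    /* transition counter 158, counted per line */
extern int LineMaskNum;    /* transition segment register 157, in lines */

void display_pixel_C(int x, int y)
{
    Pixel d;
    if (!MaskOnOff) {
        d = Mem(x, y);                            /* S952 */
    } else if (LineCounter < LineMaskNum) {       /* S953: inside transition segment 703 */
        Pixel m = Mem(x, y);
        d.r = mix(m.r, MaskColor.r, LineCounter, LineMaskNum);   /* S955 */
        d.g = mix(m.g, MaskColor.g, LineCounter, LineMaskNum);
        d.b = mix(m.b, MaskColor.b, LineCounter, LineMaskNum);
    } else {
        d = MaskColor;                            /* S954 */
    }
    put_pixel(x, y, d);                           /* S959 */
}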
As described above, in the fourth example, the pixel (x, y) read from the image memory is blended with the mask data on a line basis within the transition segment 703, so that the mask data can be displayed gradually.
Fig. 17 is a flowchart showing a fifth example of the processing performed by the imaging device 100 according to the embodiment of the present invention. In the fifth example, the masked display is performed on a pixel basis, and the preconditions are the same as in the third example.
In the fifth example, the variable "PixelCounter" is incremented by "1" (step S931) in the same way as in the third example. The fifth example differs from the third example in that the variable "PixelCounter" is set to the value of the variable "PixelMaskNum" when a new field is displayed (step S914). This processing is a countermeasure for the case where the display moves on to another new field before the variable "PixelCounter" reaches the number of pixels corresponding to the transition segment 703. Even if the mask data is displayed from the very beginning of the new field, the masked display does not give an unnatural impression.
As described above, in the fifth example, when the display is switched to another new field before the variable "PixelCounter" reaches the number of pixels corresponding to the transition segment 703, the mask data can be displayed from the beginning of that field.
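In code terms, the fifth example only adds one assignment at the start of each field. A sketch under the same assumed names as before:

extern int PixelCounter;    /* transition counter 158 */
extern int PixelMaskNum;    /* transition segment register 157, in pixels */

/* Step S914: called when drawing of a new field begins. After this call,
 * masked drawing in the new field uses the full mask, so a transition that
 * did not finish within the previous field does not carry over. */
void on_new_field(void)
{
    PixelCounter = PixelMaskNum;
}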
Fig. 18 is a flowchart showing a sixth example of the processing performed by the imaging device 100 according to the embodiment of the present invention. In the sixth example, the masked display is performed on a line basis, and the preconditions are the same as in the fourth example.
The sixth example is obtained by modifying the fourth example in the same way as the fifth example. The variable "LineCounter" is incremented by "1" (step S934), and when the display moves on to another new field, the variable "LineCounter" is set to the value of the variable "LineMaskNum" (step S914). Therefore, even if the display is switched before the number of lines counted reaches the number of lines corresponding to the transition segment 703, the mask data can be displayed from the beginning of the new field.
Fig. 19 is a flowchart showing a seventh example of the processing performed by the imaging device 100 according to the embodiment of the present invention. In the seventh example, the masked display is performed on a line basis, and the preconditions are the same as in the first example.
In the seventh example, the variable "y", which indicates the line position in the vertical direction, is changed so that it is counted down (steps S963, S975, and S976). This is the only difference from the first example. The processing of the seventh example can therefore handle the folded optical system described with reference to Fig. 6D.
The seventh example has been described for the case where the masked display is performed on a line basis. In each of the other examples as well, the variable "y" can be changed to be decremented to handle the use of the folded optical system.
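As an illustration of the seventh example, only the direction of the line loop of the first-example sketch changes. The function below is assumed to sit in the same file as that sketch; the exact starting line for each field is not given in the text, so the value used here is a guess made for illustration.

/* Seventh example: count the line position downward (steps S963, S975, S976)
 * so that the drawing order suits an imager used with the folded optical
 * system of Fig. 6D. */
void folded_optics_field(int field)
{
    for (int y = HEIGHT - (field - 1); y >= 1; y -= 2) {   /* assumed start line per field */
        MaskOnOff = RegMaskOnOff;                          /* latch once per line */
        for (int x = 1; x <= WIDTH; x++)
            display_pixel_A(x, y);
    }
}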
Fig. 20 is a flowchart showing an eighth example of the processing performed by the imaging device 100 according to the embodiment of the present invention. In the eighth example, the progressive method with a transfer rate of 1/60 second is assumed. The other preconditions are the same as in the first example.
In the eighth example, because the progressive method is assumed, the variable "field" is not used. The variable "y" is initialized to "1" (step S983) and incremented by "1" (step S995) each time the pixels of one line are displayed.
The eighth example has been described for the case where the masked display is performed on a line basis. The progressive method can be used in each of the other examples as well.
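For the eighth example, the field loop disappears and the line step becomes one. A sketch, again assumed to share the placeholder names of the first-example sketch:

/* Eighth example: progressive method, so every line of the frame is drawn
 * in order and the field counter is not used. */
void progressive_frame(void)
{
    for (int y = 1; y <= HEIGHT; y++) {    /* S983: start at line 1; S995: advance one line */
        MaskOnOff = RegMaskOnOff;          /* latch once per line */
        for (int x = 1; x <= WIDTH; x++)
            display_pixel_A(x, y);
    }
}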
The above examples have been described for the case where the masked display divides the screen horizontally, as shown in Figs. 4A and 4B. The masked display can also be performed in other display forms.
Figs. 21A and 21B are diagrams showing other examples of display by the imaging device 100 according to the embodiment of the present invention. Fig. 21A shows a display example in which the masked display area is circular. Fig. 21B shows another display example in which the masked display area is shown obliquely.
As described above, according to the present embodiment of the present invention, when the imaging device 100 displays an image on the display 160 without waiting for the recording of one frame (or one field) of image data to the image memory 140 to finish, the device can mask the image distortion caused by switching between the operation modes of the image sensor.
The embodiment of the present invention is an example for implementing the present invention. Although there are correspondences between the features of the claims and the specific elements of the embodiment of the present invention, as described below, the present invention is not limited to this embodiment. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations, and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
The image data providing part 141 corresponds to, for example, the image data providing unit according to the embodiment of the present invention. The mask data providing part 151 corresponds to, for example, the mask data providing unit according to the embodiment of the present invention. The imaging instruction accepting part 191 corresponds to, for example, the imaging instruction accepting unit according to the embodiment of the present invention. The drawing part 152 corresponds to, for example, the drawing unit according to the embodiment of the present invention.
In addition, each of steps S921, S941, and S951 corresponds to, for example, the detecting step according to the embodiment of the present invention. Each of step S924, the combination of steps S944 and S945, and the combination of steps S954 and S955 corresponds to, for example, the drawing step according to the embodiment of the present invention.
The processing steps described in the foregoing embodiment of the present invention may be regarded as a method including those processing steps, as a program for causing a computer to execute those processing steps, or as a recording medium storing the program.
Cross-Reference to Related Applications
The present invention contains subject matter related to Japanese Patent Application JP 2007-272112 filed in the Japan Patent Office on October 19, 2007, the entire contents of which are incorporated herein by reference.

Claims (9)

1. An image display control apparatus comprising:
an image data providing unit configured to provide, as captured image data, an image composed of a plurality of pixels arranged two-dimensionally in a matrix;
a mask data providing unit configured to provide mask data for masking the image data;
an imaging instruction accepting unit configured to assert an imaging instruction acceptance signal when an operation input corresponding to an imaging instruction is received; and
a drawing unit configured to sequentially draw each pixel of the image data at intervals of a predetermined period, each of the pixels having a pixel value, wherein
when the imaging instruction acceptance signal is detected to be asserted, the drawing unit draws the pixels after the detection so that each pixel takes, as its pixel value, a value that includes the mask data.
2. The apparatus according to claim 1, wherein, when the imaging instruction acceptance signal is detected to be asserted, the drawing unit draws, after the detection, the pixels with pixel values blended with the mask data so that the pixel values of the pixels change gradually into the mask data over a predetermined segment.
3. The apparatus according to claim 2, wherein, when a new image is displayed after the imaging instruction acceptance signal is detected to be asserted, the drawing unit draws the mask data as the pixel value of each pixel.
4. The apparatus according to claim 1, wherein, when the imaging instruction acceptance signal is detected to be asserted, the drawing unit draws, after the detection, the mask data as the pixel value of each pixel.
5. The apparatus according to claim 1, wherein the drawing unit performs the detection each time it finishes drawing a group of pixels arranged continuously in one direction in the arrangement.
6. The apparatus according to claim 1, wherein the drawing unit performs the detection each time it finishes drawing one pixel in the arrangement.
7. The apparatus according to claim 1, wherein the drawing unit draws the pixels in a vertical direction such that the direction in which the pixels are drawn matches the scanning direction on a display surface of a display to be controlled.
8. The apparatus according to claim 1, wherein the mask data includes color information and a transmissivity.
9. An image display control method in an image display control apparatus, the image display control apparatus including an image data providing unit configured to provide, as captured image data, an image composed of a plurality of pixels arranged two-dimensionally in a matrix, a mask data providing unit configured to provide mask data for masking the image data, an imaging instruction accepting unit configured to assert an imaging instruction acceptance signal when an operation input corresponding to an imaging instruction is received, and a drawing unit configured to sequentially draw each pixel of the image data at intervals of a predetermined period, each of the pixels having a pixel value, the method comprising the steps of:
detecting whether the imaging instruction acceptance signal is asserted each time drawing of a group of pixels arranged continuously in one direction in the arrangement is finished, or each time drawing of one pixel in the arrangement is finished; and
when the imaging instruction acceptance signal is detected to be asserted, drawing the pixels after the detection so that each pixel takes, as its pixel value, a value that includes the mask data.
CN2008101749178A 2008-10-20 2008-10-20 Image display control device, and method for controlling image display control device Expired - Fee Related CN101692694B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2008101749178A CN101692694B (en) 2008-10-20 2008-10-20 Image display control device, and method for controlling image display control device

Publications (2)

Publication Number Publication Date
CN101692694A CN101692694A (en) 2010-04-07
CN101692694B true CN101692694B (en) 2011-11-23

Family

ID=42081353

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008101749178A Expired - Fee Related CN101692694B (en) 2008-10-20 2008-10-20 Image display control device, and method for controlling image display control device

Country Status (1)

Country Link
CN (1) CN101692694B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110249022A1 (en) * 2010-04-08 2011-10-13 Rajesh Poornachandran Techniques for managing power use

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1455382A (en) * 2002-03-20 2003-11-12 株式会社日立制作所 Display device
CN1705370A (en) * 2004-06-02 2005-12-07 罗伯特·博世有限公司 Virtual mask for use in autotracking video camera images
KR20070116325A (en) * 2006-06-05 2007-12-10 주식회사 코아로직 Lens shading correction device and method of image sensor
JP2008252581A (en) * 2007-03-30 2008-10-16 Kddi Corp Noticed region image encoding device

Also Published As

Publication number Publication date
CN101692694A (en) 2010-04-07

Similar Documents

Publication Publication Date Title
KR102077967B1 (en) Electronic device
US7839444B2 (en) Solid-state image-pickup device, method of driving solid-state image-pickup device and image-pickup apparatus
US6947082B2 (en) Image-taking apparatus and image-taking method
US20180152624A1 (en) Control method, control device and electronic device
CN102334332B (en) Imaging apparatus, image display apparatus, imaging method, method of displaying image and method of correcting position of focusing-area frame
US9538085B2 (en) Method of providing panoramic image and imaging device thereof
CN102348059A (en) Image pickup apparatus
KR101889932B1 (en) Apparatus and Method for photographing image
KR20130046174A (en) Vision recognition apparatus and method
US20130076940A1 (en) Photographing apparatus and method of detecting distortion thereof
US9357133B2 (en) Digital photographing apparatus, auto-focusing method, and computer-readable storage medium for executing the auto-focusing method
US9137446B2 (en) Imaging device, method of capturing image, and program product for capturing image
CN101692694B (en) Image display control device, and method for controlling image display control device
JP4609315B2 (en) Imaging device, method of displaying angle frame at zoom, and program
US7193647B2 (en) Filtering still image data from a multimode stream
JP4935616B2 (en) Image display control apparatus, control method thereof, and program
US20080225146A1 (en) Imaging apparatus and image data recording method
JP2011066699A (en) Imaging apparatus and display program for same
JP4455545B2 (en) Imaging apparatus and imaging method
JP3846860B2 (en) Imaging apparatus and imaging method
JP4687619B2 (en) Image processing apparatus, image processing method, and program
CN112312007A (en) Information processing apparatus, image pickup apparatus, control method, and storage medium
JP4016473B2 (en) Imaging method and imaging apparatus
JP5030721B2 (en) Data processing circuit
JP4557831B2 (en) Image processing apparatus and imaging apparatus using the same

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CI01 Publication of corrected invention patent application

Correction item: Priority

Correct: 2007-272112 2007.10.19 JP

False: Non priority data

Number: 47

Volume: 27

CI02 Correction of invention patent application

Correction item: Priority

Correct: 2007-272112 2007.10.19 JP

False: Non priority data

Number: 47

Page: The title page

Volume: 27

ERR Gazette correction

Free format text: CORRECT: PRIORITY DATA; FROM: NONE PRIORITY DATA TO: 2007-272112 2007.10.19 JP

CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20111123

Termination date: 20201020

CF01 Termination of patent right due to non-payment of annual fee