CN103503435B - Image perspective error correction device and method - Google Patents
- Publication number: CN103503435B
- Application number: CN201280021986.XA
- Authority
- CN
- China
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H04N23/6812 — Control of cameras or camera modules for stable pick-up of the scene; motion detection based on additional sensors, e.g. acceleration sensors
- H04N23/687 — Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
- H04N25/75 — Circuitry of solid-state image sensors; circuitry for providing, modifying or processing image signals from the pixel array
- H04N25/76 — Addressed sensors, e.g. MOS or CMOS sensors
(All within H04N — Pictorial communication, e.g. television.)
Abstract
A method, an apparatus and a computer program are provided. The apparatus comprises: at least one processor; and at least one memory storing a computer program comprising computer program instructions which, when executed by the at least one processor, cause at least the following to be performed: determining at least one region of interest in a scene imaged by an image sensor; initiating capture of an image including the at least one region of interest by exposing the image sensor to light conveyed by an optical arrangement; detecting movement of the apparatus during capture of the image; and controlling movement of at least one of the image sensor and the optical arrangement, in dependence upon the detected movement of the apparatus and the at least one region of interest, in order to mitigate perspective error at the at least one region of interest in the captured image.
Description
Technical field
Embodiments of the present invention relate to imaging. In particular, they relate to mitigating perspective error in images.
Background
If a camera and its digital image sensor move during exposure (for example, due to shaking of the user's hand), perspective error may occur. Perspective error occurs when the viewpoint of the digital image sensor relative to the scene changes as the sensor moves. For example, hand shake may alter the orientation of the digital image sensor relative to the scene being imaged, causing perspective error. Perspective error may result in objects in the captured image having blurred edges.
The content of the invention
It is of the invention various and may not all of embodiment, there is provided a kind of device, the device include:At least one
Processor;And be stored with including computer program instructions computer program at least one memorizer, the computer program
Instruction is when by least one computing device, at least so that following operation is performed:In the scape of imageing sensor imaging
As at least one region interested of middle determination;Optical devices are exposed to by causing imageing sensor(optical
arragement)The light for being transmitted and the seizure of image for initiating to include at least one region interested;Detect the device
Movement during the seizure of image;And come to image according to the movement and at least one region interested of the device of detection
The movement of at least one of sensor and Optical devices is controlled, to feel emerging at least one of image being captured
Mitigate visual angle error at the region of interest.
It is of the invention various and may not all of embodiment, there is provided a kind of method, including:Pass in the image of device
At least one region interested is determined in the scene of sensor imaging;Optical devices institute is exposed to by causing imageing sensor
The light of transmission and initiate to include the picture catching at least one region interested;Detect the device during the seizure of image
Movement;And according to detection rock the region interested with least one come in imageing sensor and Optical devices extremely
Few one movement is controlled, and misses to mitigate visual angle at least one of the image being captured region interested
Difference.
It is of the invention various and may not all of embodiment, there is provided a kind of calculating including computer program instructions
Machine program, the computer program instructions are when by least one computing device so that above method is performed.
It is of the invention various and may not all of embodiment, there is provided one kind is stored with including computer program instructions
Computer program non-transient computer-readable media, the computer program instructions are by least one computing device
When, at least so that following operation is performed:Determine that at least one feels in a kind of scene of the imageing sensor imaging of device
The region of interest;Initiate to include that at least one sense is emerging by imageing sensor is exposed to light that Optical devices are transmitted
The seizure of the image in the region of interest;Detect movement of the device during the seizure of image;And according to rocking and extremely for detection
Lack a region interested to be controlled the movement of at least one of imageing sensor and Optical devices, so as in quilt
Mitigate visual angle error at least one of image of seizure region interested.
It is of the invention various and may not all of embodiment, there is provided a kind of device, the device include:For in figure
Device as at least one region interested is determined in the scene of sensor imaging;For by causing imageing sensor sudden and violent
The device of the seizure for being exposed to the Optical devices light for being transmitted and initiating the image for including at least one region interested;For
Detect the device of movement of the device during the seizure of image;And for the movement and at least one of the device according to detection
Region interested being controlled to the movement of at least one of imageing sensor and Optical devices, to be captured
Mitigate the device of visual angle error at least one of image region interested.
Description of the drawings
For a better understanding of various embodiments of the present invention, reference will now be made, by way of example only, to the accompanying drawings in which:
Fig. 1 illustrates an apparatus such as a chipset;
Fig. 2 illustrates an apparatus such as a digital camera;
Fig. 3 illustrates a flow chart of a method;
Fig. 4A illustrates a user capturing a still image of a scene;
Fig. 4B illustrates an apparatus displaying a scene for still image capture;
Fig. 5 schematically illustrates controlling movement of an image sensor and/or an optical arrangement during still image capture; and
Fig. 6 schematically illustrates how a correction vector may be determined.
Detailed description of embodiments
Embodiments of the invention relate to mitigating perspective error in still images.
The figures illustrate an apparatus 10/20 comprising: at least one processor 12; and at least one memory 14 storing a computer program 16 comprising computer program instructions 18 which, when executed by the at least one processor 12, cause at least the following to be performed: determining at least one region of interest 74 in a scene 70 imaged by an image sensor 22; initiating capture of an image including the at least one region of interest 74 by exposing the image sensor 22 to light conveyed by an optical arrangement 24; detecting movement of the apparatus 10/20 during capture of the image; and controlling movement of at least one of the image sensor 22 and the optical arrangement 24, in dependence upon the detected movement of the apparatus 10/20 and the at least one region of interest 74, in order to mitigate perspective error at the at least one region of interest 74 in the captured image.
Fig. 1 illustrates an apparatus 10. The apparatus 10 may, for example, be a chipset. The apparatus 10 comprises at least one processor 12 and at least one memory 14. A single processor 12 is shown schematically in Fig. 1; in practice, the apparatus 10 may comprise more than one processor.
The processor 12 is configured to read from and write to the memory 14. The processor 12 may also comprise an output interface via which data and/or commands are output by the processor 12, and an input interface via which data and/or commands are input to the processor 12.
The memory 14 stores a computer program 16 comprising computer program instructions 18 that control the operation of the apparatus 10/20 when loaded into the processor 12. The computer program instructions 18 provide the logic and routines that enable the apparatus 10/20 to perform the method illustrated in Fig. 3. The processor 12, by reading the memory 14, is able to load and execute the computer program instructions 18.
The computer program 16 may arrive at the apparatus 10/20 via any suitable delivery mechanism 40. The delivery mechanism 40 may be, for example, a non-transitory computer-readable storage medium, a computer program product, a memory device, a record medium such as a compact disc read-only memory (CD-ROM) or a digital versatile disc (DVD), and/or an article of manufacture that tangibly embodies the computer program 16. The delivery mechanism 40 may be a signal configured to reliably transfer the computer program 16.
Although the memory 14 is illustrated as a single component, it may be implemented as one or more separate components, some or all of which may be integrated/removable and/or may provide permanent/semi-permanent/dynamic/cached storage.
Fig. 2 illustrates an apparatus 20. The apparatus 20 may, for example, be a digital camera. In some embodiments of the invention, the digital camera may have cellular telephone functionality.
The apparatus 20 illustrated in Fig. 2 comprises: an image sensor 22, an optical arrangement 24, one or more drives 25, one or more motion detectors 26, a display 27, user input circuitry 28 and the apparatus 10 illustrated in Fig. 1.
The elements 12, 14, 22, 24, 25, 26, 27 and 28 are operationally coupled, and any number or combination of intervening elements may exist between them (including no intervening elements). The elements 12, 14, 22, 24, 25, 26, 27, 28 may, for example, be co-located within a housing.
The image sensor 22 may, for example, be a digital image sensor such as a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor. The processor 12 is configured to receive inputs from the image sensor 22. For example, the processor 12 may obtain image data (for example, in the form of still images or video images) from the image sensor 22 and store it in the memory 14.
The optical arrangement 24 may, for example, comprise one or more optical devices, such as one or more lenses. The optical arrangement 24 is configured to receive light from a scene external to the apparatus 20 through an aperture in the apparatus 20. The optical arrangement 24 is configured to convey the light to the image sensor 22 (for example, for still image capture).
The one or more drives 25 may, for example, comprise one or more motors. The processor 12 is configured to provide control signals to the one or more drives 25 in order to control movement of the optical arrangement 24 and/or the image sensor 22. If the optical arrangement 24 comprises multiple optical devices, the processor 12 may be configured to control movement of the optical arrangement 24 by controlling movement of some or all of those optical devices.
The one or more motion detectors 26 may, for example, comprise one or more accelerometers and/or gyroscopes. The processor 12 is configured to receive inputs from the one or more motion detectors 26, in order to detect motion of the apparatus 10/20 and to determine the direction and magnitude of that motion.
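The paragraph above says the processor determines both a direction and a magnitude for the detected motion. A minimal sketch of that determination for a two-dimensional motion sample is shown below; the units and sample values are invented for illustration, and a real apparatus would derive them from its accelerometer/gyroscope hardware.

```python
import math

def motion_direction_and_magnitude(vx, vy):
    """Return (angle in degrees from the +x axis, magnitude) of a 2D motion sample."""
    return math.degrees(math.atan2(vy, vx)), math.hypot(vx, vy)

# Apparatus 20 moving purely in the +y direction:
angle, magnitude = motion_direction_and_magnitude(0.0, 2.0)
print(angle, magnitude)
```

In practice the motion detectors 26 deliver a stream of such samples, and the processor would evaluate them once per discrete time period.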
The processor 12 is configured to control the display 27. The display 27 may, for example, display images, graphical items and text. It comprises a plurality of pixels and may, for example, be a liquid crystal display (LCD) or an organic light-emitting diode (OLED) display.
The processor 12 is configured to receive inputs from the user input circuitry 28. The user input circuitry 28 may, for example, comprise one or more switches corresponding to one or more keys. In some embodiments of the invention, at least some of the user input circuitry 28 may be integrated with the display 27 as a touch-sensitive display 29. The touch-sensitive display 29 may be any type of touch-sensitive display; for example, it may employ capacitive, resistive and/or acoustic wave technology.
The processor 12 may be configured to control the display 27 to display a menu system that enables a user to access the functionality of the apparatus 20. A user may navigate the menu system by providing inputs at the user input circuitry 28, which are interpreted by the processor 12.
A method according to embodiments of the present invention will now be described in relation to the flow chart illustrated in Fig. 3.
The user 60 navigates the menu system of the apparatus 20 and provides one or more inputs via the user input circuitry 28 to access an image capture mode of the apparatus 20. In this example, when the apparatus 20 is in the image capture mode, light enters the apparatus 20 through the aperture in its housing. The light is conveyed by the optical arrangement 24 to the image sensor 22. The processor 12 obtains video image data from the image sensor 22 and uses it to display a real-time video image on the display 27. While the apparatus 20 is in the image capture mode, the real-time video image displayed on the display 27 changes as the user 60 changes the position and orientation of the apparatus 20.
In this example, the user 60 directs the aperture of the apparatus 20 towards a scene 70. This is illustrated in Fig. 4A. The processor 12 obtains real-time video image data including the scene 70 from the image sensor 22. The processor 12 controls the display 27 to display the scene 70 imaged by the image sensor 22. The scene 70 illustrated in Fig. 4A includes a house 72 and a tree 73.
Cartesian coordinate axes 50 are illustrated in Fig. 4A. The y-axis and the z-axis are parallel to the plane of the page and the x-axis extends outwards from the page. The coordinate axes 50 appear in Figs. 4A to 6 to indicate the orientation of those figures relative to one another.
Fig. 4B illustrates the scene 70 being displayed on the display 27 of the apparatus 20. In Fig. 4B, the x-axis and the y-axis are parallel to the plane of the page and the z-axis extends into the page.
At block 302 in Fig. 3, the processor 12 determines which sub-region (or sub-regions) of the scene represents a region of interest (or multiple regions of interest). In some embodiments of the invention, the user 60 may provide user input via the user input circuitry 28 to designate one or more sub-regions of the scene 70 as regions of interest. For example, if the apparatus 20 comprises a touch-sensitive display 29, the user may designate a region of interest by touching the display 29 at the area of interest in the displayed scene 70.
In some embodiments of the invention, the user 60 need not provide user input to designate a region of interest. The processor 12 may instead designate a region of interest automatically. For example, the processor 12 may perform image processing to determine whether a face is present, and if so, the processor 12 may designate that face as a region of interest.
The processor 12 may process the displayed scene 70 using image processing in order to demarcate different sub-regions of the scene. Different items in the scene 70 may be demarcated as different sub-regions. For example, the scene 70 illustrated in Fig. 4B may be demarcated such that the house 72 is in a different sub-region from the tree 73. Consequently, if the user 60 touches a sub-region (for example, the tree 73) to designate it as a region of interest, the other sub-regions (such as the house 72) are not designated as regions of interest unless further user input is provided in respect of them. This means that the different items in the scene 70 are treated as separate sub-regions for the purpose of designating regions of interest.
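The touch-to-sub-region designation described above can be sketched as a simple point-in-box lookup. The sub-region names and bounding boxes below are invented for illustration; a real apparatus would obtain them from the image-processing demarcation step.

```python
def find_region_of_interest(touch, subregions):
    """Return the name of the sub-region whose bounding box contains the
    touch point (x, y), or None if the touch misses every sub-region."""
    x, y = touch
    for name, (left, top, right, bottom) in subregions.items():
        if left <= x <= right and top <= y <= bottom:
            return name
    return None

# Hypothetical demarcated sub-regions of the displayed scene 70, in
# display coordinates (left, top, right, bottom).
subregions = {
    "house 72": (40, 60, 200, 220),
    "tree 73": (230, 30, 320, 240),
}

print(find_region_of_interest((250, 100), subregions))  # tree 73
```

A touch that lands outside every demarcated sub-region designates nothing, matching the requirement that further input be provided per sub-region.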
In this example, the user provides an input and the processor 12 determines, from that user input, that the region of interest in the scene 70 is the tree 73. The region of interest is indicated in Fig. 4B by the dotted line 74. In some embodiments of the invention, after the processor 12 has determined the region of interest, it may control the display 27 to identify the determined region of interest to the user 60. For example, the processor 12 may highlight the region of interest 74 on the display 27.
In this example, the processor 12 also responds to the determination of the region of interest 74 by controlling the one or more drives 25 to move the optical arrangement 24 and/or the image sensor 22 so as to focus on the region of interest 74. For example, the region of interest 74 may be in focus while other regions are out of focus.
The user 60 then provides user input at the user input circuitry 28 to cause the apparatus 20 to capture a still image. In response to that user input, at block 303 of Fig. 3, the processor 12 initiates capture of a still image including the region of interest 74. For example, the processor 12 may reset the image sensor 22 and then cause the image sensor 22 to be exposed to light emanating from the scene 70 and conveyed to the image sensor 22 by the optical arrangement 24.
The apparatus 20 may be moved while the still image is being captured (and the image sensor 22 is being exposed). For example, the apparatus 20 may shake (that is, move irregularly) while the still image is being captured because external forces are applied to it. If the apparatus 20 is held in the user's hand(s) when the still image is captured, the external forces may be provided by shaking of the user's hands. Alternatively, if the apparatus 20 is positioned on a tripod when the still image is captured, the external forces may be provided by the wind.
Movement of the apparatus 20 causes the image sensor 22 to move, changing the scene 70 imaged at the image sensor 22 and causing the positions of the imaged items 72, 73 at the image sensor 22 to change. Furthermore, since the viewpoint of the image sensor 22 relative to the scene 70 changes as the apparatus 20 moves, the shapes of the imaged items 72, 73 also change. For example, in relation to the scene 70 illustrated in Fig. 4B, the shape of the house 72 and the shape of the tree 73 will change.
At block 304 of Fig. 3, the processor 12 receives inputs from the one or more motion detectors 26 during image capture. From those inputs, it detects that the apparatus 20 is moving.
At block 305 of Fig. 3, the processor 12 controls movement of at least one of the image sensor 22 and the optical arrangement 24, in dependence upon the detected movement of the apparatus 20 and the one or more regions of interest 74, in order to mitigate perspective error at the one or more regions of interest 74 in the captured still image.
Fig. 5 illustrates a schematic diagram including the image sensor 22 and the optical arrangement 24. The arrow designated by the reference numeral 90 illustrates light arriving from the scene 70. The arrow designated by the reference numeral 92 illustrates light being conveyed from the optical arrangement 24 to the image sensor 22. The processor 12 compensates for movement of the apparatus 20 by causing the image sensor 22 and/or the optical arrangement 24 to move. In this example, the image sensor 22 and/or the optical arrangement 24 are configured to move translationally relative to one another, without rotation. In other words, the image sensor 22 and/or the optical arrangement 24 are configured to move in at least one of the following directions: the +/- x direction, the +/- y direction and the +/- z direction. Preferably, at least one of the image sensor 22 and the optical arrangement 24 is configured to move at least in the +/- x direction and the +/- y direction.
In this example, the image sensor 22 cannot tilt/rotate relative to the optical arrangement 24, and the optical arrangement 24 cannot tilt/rotate relative to the image sensor 22.
When the apparatus 20 moves during still image capture (for example, due to shaking of the user's hand), the processor 12 uses the one or more drives 25 to move the image sensor 22 and/or the optical arrangement 24 to compensate for the movement of the apparatus 20, such that the scene 70 imaged by the image sensor 22 remains substantially constant at the image sensor 22 (in other words, the imaged items 72, 73 remain at substantially the same places on the image sensor 22). This mitigates blur in the resulting still image.
The processor 12 also uses the one or more drives 25 to move the image sensor 22 and/or the optical arrangement 24 to mitigate perspective error in the captured image. This further mitigates blur in the resulting still image.
The processor 12 controls the image sensor 22 and/or the optical arrangement 24 to move in accordance with a determined correction vector. Fig. 6 illustrates an example of how a correction vector 84 may be determined.
In order to determine the correction vector 84, the processor 12 may, for example, monitor the movement of the apparatus 20 (and therefore the movement of the image sensor 22) over discrete periods of time.
In this example, over a particular discrete period of time, the apparatus 20 moves in the +y direction, such that the items 72, 73 in the displayed scene 70 move in the -y direction. The processor 12 determines that translational movement of the image sensor 22 and/or the optical arrangement 24 in the -y direction will compensate for the movement of the apparatus 20 (such that the positions of the items 72, 73 in the scene 70 remain substantially constant at the image sensor 22). The translational movement vector determined by the processor 12 for this compensation is illustrated in Fig. 6 as the "apparatus movement vector 80".
The processor 12 also determines that translational movement of the image sensor 22 and/or the optical arrangement 24 in the +x direction will mitigate, at the region of interest 74, the perspective error caused by movement of the apparatus 20 during still image capture. The translational movement vector determined by the processor 12 to compensate for that perspective error is illustrated in Fig. 6 as the "perspective error vector 82". The direction and magnitude of the perspective error vector 82 depend upon the scene 70 being imaged at the region of interest 74 and upon the direction and magnitude of the movement of the apparatus 20 over the discrete period of time.
In this example, the movement of the image sensor 22 and/or the optical arrangement 24 caused by the processor 12 mitigates the perspective error at the region of interest rather than at other regions of the still image. Those skilled in the art will appreciate that movement of the image sensor 22 and/or the optical arrangement 24 that mitigates the perspective error at the region of interest 74 may introduce perspective error into other parts of the still image in which the user 60 is less interested. Mitigating the perspective error at one or more regions of interest therefore causes the one or more regions of interest in the still image to appear sharper than they otherwise would, and may cause other regions of the still image to appear more blurred than they otherwise would.
The correction vector 84 illustrated in Fig. 6 indicates how the image sensor 22 and/or the optical arrangement 24 are moved to compensate for the movement of the apparatus 20. It is the sum of the apparatus movement vector 80 and the perspective error vector 82.
The processor 12 monitors the movement of the apparatus 20 over successive discrete periods of time and controls the image sensor 22 and/or the optical arrangement 24 to move as described above. In practice, each discrete period of time may be very short, so that the image sensor 22 and/or the optical arrangement 24 appear to be in continuous motion.
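The repeated per-period correction can be sketched as an accumulation of small translations; because each discrete period is short, the resulting sequence of offsets approximates continuous motion. The per-period vectors below are invented for illustration.

```python
def apply_corrections(period_corrections, start=(0.0, 0.0)):
    """Accumulate per-period (x, y) correction vectors into a sequence of
    total sensor/optics offsets, one entry per discrete time period."""
    x, y = start
    offsets = []
    for dx, dy in period_corrections:
        x, y = x + dx, y + dy
        offsets.append((x, y))
    return offsets

# Three successive short periods of monitored movement:
print(apply_corrections([(1.5, -3.0), (0.5, -1.0), (-0.2, 0.4)]))
```

Each offset would be realized by the drives 25 before the next period's correction vector is determined.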
In some embodiments of the invention, as explained above, the user 60 may designate more than one region of interest. In these embodiments, the processor 12 may control the image sensor 22 and/or the optical arrangement 24 such that the perspective error in each region of interest is reduced/mitigated to some extent.
In some embodiments, the extent to which the perspective error is mitigated may be the same for each region of interest. In other embodiments, the processor 12 may prioritize a particular region of interest (or particular regions of interest) over other regions of interest, such that the perspective error in the prioritized region is mitigated to a greater extent than the perspective error in the other regions. For example, a region of interest that includes a face may be prioritized over other regions of interest that do not include a face. Larger faces in the captured image (that is, the faces closest to the image sensor 22) may be prioritized over smaller faces. Alternatively or additionally, the processor 12 may recognize faces using image recognition techniques and may prioritize recognized faces over unrecognized faces.
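The face-based prioritization described above amounts to an ordering rule: regions containing faces outrank those that do not, and larger faces outrank smaller ones. A minimal sketch, with invented field names and region data:

```python
def prioritize(regions):
    """Order regions of interest: faces before non-faces, larger area first."""
    return sorted(regions, key=lambda r: (not r["is_face"], -r["area"]))

regions = [
    {"name": "tree 73", "is_face": False, "area": 9000},
    {"name": "small face", "is_face": True, "area": 400},
    {"name": "large face", "is_face": True, "area": 2500},
]

print([r["name"] for r in prioritize(regions)])
# ['large face', 'small face', 'tree 73']
```

The processor would then weight the perspective-error mitigation towards the regions at the front of this ordering.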
The embodiments of the invention described above provide a method of mitigating the perspective error in a region of interest of a still image. Advantageously, in embodiments of the invention, it is not necessary to rotate/tilt the image sensor 22 or the optical arrangement 24. This enables the apparatus 20 to have a relatively small thickness.
References to "computer-readable storage medium", "computer program product", "tangibly embodied computer program" etc., or to a "controller", "computer", "processor" etc., should be understood to encompass not only computers having different architectures such as single/multi-processor architectures and sequential (Von Neumann)/parallel architectures, but also specialized circuits such as field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), signal processing devices and other processing circuitry. References to computer program, instructions, code etc. should be understood to encompass software for a programmable processor or firmware, such as the programmable content of a hardware device, whether instructions for a processor or configuration settings for a fixed-function device, gate array or programmable logic device etc.
As used in this application, the term "circuitry" refers to all of the following:
(a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry);
(b) combinations of circuits and software (and/or firmware), such as (as applicable): (i) a combination of processor(s) or (ii) portions of processor(s)/software (including digital signal processor(s)), software and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions; and
(c) circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present.
This definition of "circuitry" applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term "circuitry" would also cover an implementation of merely a processor (or multiple processors) or a portion of a processor and its (or their) accompanying software and/or firmware. The term "circuitry" would also cover, for example and if applicable to the particular claim element, a baseband integrated circuit or application processor integrated circuit for a mobile phone, or a similar integrated circuit in a server, a cellular network device, or another network device.
The blocks illustrated in Fig. 3 may represent steps in a method and/or sections of code in the computer program 16. The illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks, and the order and arrangement of the blocks may be varied. Furthermore, some blocks may be omitted.
Although embodiments of the present invention have been described in the preceding paragraphs with reference to various examples, it should be appreciated that modifications to the examples given can be made without departing from the scope of the invention as claimed. For example, the apparatus 20 may further comprise a position sensor configured to verify the position of the image sensor 22 and/or the optics 24 and to provide appropriate inputs to the processor 12.
Embodiments of the invention are described above in the context of capturing a still image. In other embodiments of the invention, video images may be captured rather than still images. The techniques described above may be used to mitigate perspective error at at least one region of interest in the video images.
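The per-frame form of this idea can be sketched in a few lines. The sketch below is illustrative only and is not taken from the patent: the function names (`compensation_vector`, `stabilise_video`) and the coordinate representation of a region of interest as an (x, y) position on the sensor are hypothetical. It shows one plausible reading of the scheme, in which each captured frame's region-of-interest drift relative to its initial position yields an opposite-signed actuator command:

```python
def compensation_vector(initial_roi, current_roi):
    """Return the compensatory movement for one frame: the negative of the
    region of interest's apparent drift from its initial sensor position."""
    dx = current_roi[0] - initial_roi[0]
    dy = current_roi[1] - initial_roi[1]
    # Move the sensor/optics opposite to the apparent drift so the region
    # of interest stays at a constant position on the image sensor.
    return (-dx, -dy)


def stabilise_video(roi_positions):
    """Given the region of interest's tracked (x, y) position in each video
    frame, return one compensation vector per frame, measured against the
    position in the first frame."""
    initial = roi_positions[0]
    return [compensation_vector(initial, p) for p in roi_positions]
```

For example, a region of interest that drifts from (10, 10) to (15, 14) over a clip would receive the compensating command (-5, -4) for the final frame.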
In some embodiments of the invention, the apparatus 20 need not comprise a display 27.
In some alternative examples to those described above, the user 60 may provide input designating one or more regions of interest in a manner different from that described above. In some of those examples, the user input circuitry 28 may include a shutter release. The user may half-depress the shutter release to designate a region of interest (for example, at the center of the scene 70 displayed on the display 27). The user 60 may then re-position and/or re-orient the camera aperture, if desired, before capturing the still image. After this re-positioning, the region of interest may or may not be at the center of the captured still image.
In these alternative examples, the processor 12 may keep track of the designated region of interest while the camera aperture is moved, by tracking elements within the designated region of interest as they move across the scene 70 imaged by the image sensor 22 and displayed. Alternatively or additionally, the processor 12 may track the region of interest using inputs provided by the one or more motion detectors 26.
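The claims express the compensation in terms of the magnitude and determined direction of the sum of the changes of the region of interest's x-direction and y-direction vectors over a discrete period of time. A minimal sketch of that computation follows; it is an interpretation, not the patent's own implementation, and the function name `compensation_from_motion` and the per-sample list inputs are assumptions:

```python
import math


def compensation_from_motion(x_changes, y_changes):
    """Sum the per-sample x- and y-direction displacements of the region of
    interest over a discrete time period (e.g. as reported by motion
    detectors), then return the magnitude and direction of the compensatory
    movement, which points opposite to the net drift."""
    net_x = sum(x_changes)
    net_y = sum(y_changes)
    magnitude = math.hypot(net_x, net_y)      # length of the net drift
    direction = math.atan2(-net_y, -net_x)    # opposite direction, radians
    return magnitude, direction
```

With drift samples of (3, 0) and (1, 3) pixels, the net drift is (4, 3), so the compensatory movement has magnitude 5 and points back along (-4, -3).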
Features described in the preceding description may be used in combinations other than the combinations explicitly described. Although functions have been described with reference to certain features, those functions may be performable by other features, whether described or not. Although features have been described with reference to certain embodiments, those features may also be present in other embodiments, whether described or not. Whilst endeavoring in the foregoing specification to draw attention to those features of the invention believed to be of particular importance, it should be understood that the Applicant claims protection in respect of any patentable feature or combination of features hereinbefore referred to and/or shown in the drawings, whether or not particular emphasis has been placed thereon.
Claims (20)
1. An apparatus for correcting image perspective error, comprising:
at least one processor; and
at least one memory storing a computer program comprising computer program instructions that, when executed by the at least one processor, cause at least the following to be performed:
determining at least one region of interest in a scene imaged by an image sensor;
initiating capture of an image, including the at least one region of interest, by exposing the image sensor to light conveyed by optics;
detecting movement of the apparatus during the capture of the image; and
controlling compensatory movement of at least one of the image sensor and the optics, in dependence upon the detected movement of the apparatus and the at least one region of interest, in order to mitigate, at the at least one region of interest in the captured image, perspective error caused by the detected movement,
wherein the compensatory movement is controlled in dependence upon a magnitude and a determined direction of a sum of changes in position of an x-direction vector and a y-direction vector of the at least one region of interest, over a discrete period of time, caused by the detected movement, and
wherein the magnitude of the changes of the x-direction vector and the y-direction vector over the discrete period of time and the determined direction depend upon an initial position of the at least one region of interest prior to the detected movement.
2. The apparatus as claimed in claim 1, wherein the compensatory movement comprises movement of at least one of the image sensor and the optics of the apparatus, the movement being controlled such that at least the position of the at least one region of interest, in the captured image, remains constant at the image sensor of the apparatus during the detected movement.
3. The apparatus as claimed in claim 1 or 2, wherein the at least one region of interest comprises more than one region of interest, and wherein the compensatory movement is controlled in dependence upon a magnitude and a determined direction of a sum of changes in position, at the image sensor of the apparatus, of the x-direction vectors and the y-direction vectors of the more than one region of interest over a discrete period of time, caused by the detected movement.
4. The apparatus as claimed in claim 1 or 2, wherein the at least one region of interest in the scene is determined in dependence upon user input, and wherein the user input is a user touch provided at a touch-sensitive display.
5. The apparatus as claimed in claim 2, wherein the compensatory movement of at least one of the image sensor and the optics of the apparatus is further controlled such that at least the shape of the at least one region of interest, in the captured image, remains constant at the image sensor of the apparatus during the detected movement.
6. The apparatus as claimed in claim 1 or 2, wherein detecting movement of the apparatus during the capture of the image comprises detecting a change in the position and the shape of the at least one region of interest, from the perspective of the image sensor, while the image is being captured.
7. The apparatus as claimed in claim 1 or 2, wherein the scene imaged by the image sensor is displayed on a display, and the at least one region of interest is at least one sub-region of the scene displayed on the display.
8. The apparatus as claimed in claim 1 or 2, wherein multiple regions of interest are determined, and the compensatory movement of at least one of the image sensor and the optics is controlled such that at least the positions of the multiple regions of interest, in the captured image, remain constant at the image sensor of the apparatus during the detected movement.
9. The apparatus as claimed in claim 1 or 2, wherein only a single region of interest is determined, and the compensatory movement of at least one of the image sensor and the optics is controlled such that at least the position of the single region of interest, in the captured image, remains constant at the image sensor of the apparatus during the detected movement.
10. The apparatus as claimed in claim 3, wherein the magnitude of the changes of the x-direction vectors and the y-direction vectors over the discrete period of time and the determined direction depend upon the initial positions of the more than one region of interest prior to the detected movement.
11. The apparatus as claimed in claim 1 or 2, wherein the image is a still image or a video image.
12. A method for correcting image perspective error, comprising:
determining at least one region of interest in a scene imaged by an image sensor of an apparatus;
initiating capture of an image, including the at least one region of interest, by exposing the image sensor to light conveyed by optics;
detecting movement of the apparatus during the capture of the image; and
controlling compensatory movement of at least one of the image sensor and the optics, in dependence upon the detected movement of the apparatus and the at least one region of interest, in order to mitigate, at the at least one region of interest in the captured image, perspective error caused by the detected movement,
wherein the compensatory movement is controlled in dependence upon a magnitude and a determined direction of a sum of changes in position of an x-direction vector and a y-direction vector of the at least one region of interest, over a discrete period of time, caused by the detected movement, and
wherein the magnitude of the changes of the x-direction vector and the y-direction vector over the discrete period of time and the determined direction depend upon an initial position of the at least one region of interest prior to the detected movement.
13. The method as claimed in claim 12, wherein the at least one region of interest comprises more than one region of interest, and wherein the compensatory movement is controlled in dependence upon a magnitude and a determined direction of a sum of changes in position, at the image sensor of the apparatus, of the x-direction vectors and the y-direction vectors of the more than one region of interest over a discrete period of time, caused by the detected movement.
14. The method as claimed in claim 12 or 13, wherein the compensatory movement comprises movement of at least one of the image sensor and the optics, the movement being controlled such that at least the position of the at least one region of interest, in the captured image, remains constant at the image sensor during the detected movement.
15. The method as claimed in claim 12 or 13, wherein the at least one region of interest in the scene is determined in dependence upon user input.
16. The method as claimed in claim 12 or 13, wherein detecting movement of the apparatus during the capture of the image comprises detecting a change in the position and the shape of the at least one region of interest, from the perspective of the image sensor, while the image is being captured.
17. The method as claimed in claim 12 or 13, wherein the scene imaged by the image sensor is displayed on a display, and the at least one region of interest is at least one sub-region of the scene displayed on the display.
18. Equipment for correcting image perspective error, comprising:
means for determining at least one region of interest in a scene imaged by an image sensor of the equipment;
means for initiating capture of an image, including the at least one region of interest, by causing the image sensor to be exposed to light conveyed by optics;
means for detecting movement of the equipment during the capture of the image; and
means for controlling compensatory movement of at least one of the image sensor and the optics, in dependence upon the detected movement of the equipment and the at least one region of interest, in order to mitigate, at the at least one region of interest in the captured image, perspective error caused by the detected movement,
wherein the compensatory movement is controlled in dependence upon a magnitude and a determined direction of a sum of changes in position of an x-direction vector and a y-direction vector of the at least one region of interest, over a discrete period of time, caused by the detected movement, and
wherein the magnitude of the changes of the x-direction vector and the y-direction vector over the discrete period of time and the determined direction depend upon an initial position of the at least one region of interest prior to the detected movement.
19. The equipment as claimed in claim 18, wherein the compensatory movement comprises movement of at least one of the image sensor and the optics, the movement being controlled such that at least the position of the at least one region of interest, in the captured image, remains constant at the image sensor during the detected movement.
20. The equipment as claimed in claim 18 or 19, wherein the means for detecting movement of the equipment during the capture of the image comprises means for detecting a change in the position and the shape of the at least one region of interest, from the perspective of the image sensor, while the image is being captured.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/082,863 | 2011-04-08 | ||
US13/082,863 US9204047B2 (en) | 2011-04-08 | 2011-04-08 | Imaging |
PCT/IB2012/051430 WO2012137096A1 (en) | 2011-04-08 | 2012-03-26 | Image perspective error correcting apparatus and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103503435A CN103503435A (en) | 2014-01-08 |
CN103503435B true CN103503435B (en) | 2017-04-05 |
Family
ID=46965815
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201280021986.XA Active CN103503435B (en) | 2011-04-08 | 2012-03-26 | Image perspective error correcting apparatus and method
Country Status (4)
Country | Link |
---|---|
US (1) | US9204047B2 (en) |
EP (1) | EP2695376A4 (en) |
CN (1) | CN103503435B (en) |
WO (1) | WO2012137096A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102383134B1 (en) * | 2017-11-03 | 2022-04-06 | 삼성전자주식회사 | Electronic device for processing image based on priority and method for operating thefeof |
US10547790B2 (en) * | 2018-06-14 | 2020-01-28 | Google Llc | Camera area locking |
DE102019105275A1 (en) * | 2019-03-01 | 2020-09-03 | Connaught Electronics Ltd. | Method and system for detecting an object on a surface of a vehicle |
CN110225251B (en) * | 2019-05-31 | 2020-10-16 | 维沃移动通信(杭州)有限公司 | Video recording method and terminal |
Family Cites Families (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3572939A (en) | 1968-07-22 | 1971-03-30 | Eastman Kodak Co | Photoelectric lens bench and method for testing optical systems |
GB0116877D0 (en) * | 2001-07-10 | 2001-09-05 | Hewlett Packard Co | Intelligent feature selection and pan zoom control |
US6801717B1 (en) * | 2003-04-02 | 2004-10-05 | Hewlett-Packard Development Company, L.P. | Method and apparatus for controlling the depth of field using multiple user interface markers |
JP2005077886A (en) | 2003-09-02 | 2005-03-24 | Canon Inc | Photographing equipment |
US20050156915A1 (en) | 2004-01-16 | 2005-07-21 | Fisher Edward N. | Handwritten character recording and recognition device |
JP4531484B2 (en) | 2004-08-16 | 2010-08-25 | パナソニック株式会社 | Camera system |
US7990412B2 (en) | 2004-11-01 | 2011-08-02 | Hewlett-Packard Development Company, L.P. | Systems and methods for correcting image perspective |
BRPI0405039C1 (en) | 2004-11-18 | 2005-10-18 | Audaces Automacao E Informatic | Magnetic mold holder |
US7791642B2 (en) * | 2004-12-13 | 2010-09-07 | Fujifilm Corporation | Image-taking apparatus |
JP2006215766A (en) | 2005-02-03 | 2006-08-17 | Victor Co Of Japan Ltd | Image display device, image display method and image display program |
EP1768387B1 (en) | 2005-09-22 | 2014-11-05 | Samsung Electronics Co., Ltd. | Image capturing apparatus with image compensation and method therefor |
EP1938577B1 (en) | 2005-10-21 | 2013-08-14 | Nokia Corporation | A method and a device for reducing motion distortion in digital imaging |
JP4766320B2 (en) * | 2006-02-06 | 2011-09-07 | カシオ計算機株式会社 | Imaging apparatus and program thereof |
JP2008020716A (en) * | 2006-07-13 | 2008-01-31 | Pentax Corp | Image blur correction device |
US7664384B2 (en) * | 2006-11-07 | 2010-02-16 | Sony Ericsson Mobile Communications Ab | User defined autofocus area |
JP4201809B2 (en) | 2006-11-13 | 2008-12-24 | 三洋電機株式会社 | Camera shake correction apparatus and method, and imaging apparatus |
KR100819301B1 (en) | 2006-12-20 | 2008-04-03 | 삼성전자주식회사 | Method and apparatus for optical image stabilizer on mobile camera module |
CN101408709B (en) | 2007-10-10 | 2010-09-29 | 鸿富锦精密工业(深圳)有限公司 | Image viewfinding device and automatic focusing method thereof |
US8259208B2 (en) * | 2008-04-15 | 2012-09-04 | Sony Corporation | Method and apparatus for performing touch-based adjustments within imaging devices |
US8237807B2 (en) * | 2008-07-24 | 2012-08-07 | Apple Inc. | Image capturing device with touch screen for adjusting camera settings |
KR20100013171A (en) | 2008-07-30 | 2010-02-09 | 삼성디지털이미징 주식회사 | Method and apparatus for compensating a motion of the autofocus region, and autofocus method and apparatus using thereof |
US8493454B1 (en) * | 2010-02-17 | 2013-07-23 | Ambarella, Inc. | System for camera motion compensation |
US8558923B2 (en) * | 2010-05-03 | 2013-10-15 | Canon Kabushiki Kaisha | Image capturing apparatus and method for selective real time focus/parameter adjustment |
US8934050B2 (en) * | 2010-05-27 | 2015-01-13 | Canon Kabushiki Kaisha | User interface and method for exposure adjustment in an image capturing device |
US8625021B2 (en) * | 2010-08-30 | 2014-01-07 | Canon Kabushiki Kaisha | Image capture with region-based adjustment of imaging properties |
US8823829B2 (en) * | 2010-09-16 | 2014-09-02 | Canon Kabushiki Kaisha | Image capture with adjustment of imaging properties at transitions between regions |
US8780251B2 (en) * | 2010-09-20 | 2014-07-15 | Canon Kabushiki Kaisha | Image capture with focus adjustment |
US8704929B2 (en) * | 2010-11-30 | 2014-04-22 | Canon Kabushiki Kaisha | System and method for user guidance of photographic composition in image acquisition systems |
US8760561B2 (en) * | 2011-02-23 | 2014-06-24 | Canon Kabushiki Kaisha | Image capture for spectral profiling of objects in a scene |
-
2011
- 2011-04-08 US US13/082,863 patent/US9204047B2/en active Active
-
2012
- 2012-03-26 EP EP12768230.0A patent/EP2695376A4/en not_active Ceased
- 2012-03-26 CN CN201280021986.XA patent/CN103503435B/en active Active
- 2012-03-26 WO PCT/IB2012/051430 patent/WO2012137096A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN103503435A (en) | 2014-01-08 |
US20120257069A1 (en) | 2012-10-11 |
US9204047B2 (en) | 2015-12-01 |
WO2012137096A1 (en) | 2012-10-11 |
EP2695376A4 (en) | 2014-10-08 |
EP2695376A1 (en) | 2014-02-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102502404B1 (en) | Information processing device and method, and program | |
EP3443736B1 (en) | Method and apparatus for video content stabilization | |
JP6877149B2 (en) | Shooting position recommendation method, computer program and shooting position recommendation system | |
CN108881703B (en) | Anti-shake control method and device | |
US20150215532A1 (en) | Panoramic image capture | |
JP7197981B2 (en) | Camera, terminal device, camera control method, terminal device control method, and program | |
US10516823B2 (en) | Camera with movement detection | |
KR102155895B1 (en) | Device and method to receive image by tracking object | |
JP2014160226A (en) | Imaging apparatus | |
CN103503435B (en) | Image perspective error correcting apparatus and method |
CN111093020B (en) | Information processing method, camera module and electronic equipment | |
JP6098873B2 (en) | Imaging apparatus and image processing apparatus | |
KR102474729B1 (en) | The Apparatus For Mornitoring | |
JP6128109B2 (en) | Image capturing apparatus, image capturing direction control method, and program | |
JP2014068335A (en) | Imaging apparatus and image processing method | |
CN113744299B (en) | Camera control method and device, electronic equipment and storage medium | |
US8965045B2 (en) | Image capture | |
EP3040835B1 (en) | Image navigation | |
JP6483661B2 (en) | Imaging control apparatus, imaging control method, and program | |
JP2016111561A (en) | Information processing device, system, information processing method, and program | |
CN111263115B (en) | Method, apparatus, electronic device, and computer-readable medium for presenting images | |
CN110892705A (en) | Method for photographing by determining optimum condition and apparatus for performing the same | |
GB2525175A (en) | Imaging | |
JP2023115703A (en) | Video monitoring device, video monitoring method and program | |
EP3059669A1 (en) | Controlling display of video content |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C41 | Transfer of patent application or patent right or utility model | ||
TA01 | Transfer of patent application right |
Effective date of registration: 20160201 Address after: Espoo, Finland Applicant after: Nokia Technologies Oy Address before: Espoo, Finland Applicant before: Nokia Oyj |
|
GR01 | Patent grant | ||
GR01 | Patent grant |