CN103916659A - System And Method For Depth Of Field Visualization - Google Patents


Info

Publication number
CN103916659A
CN103916659A
Authority
CN
China
Prior art keywords
pixel
field depth
distance value
group
computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410001474.8A
Other languages
Chinese (zh)
Other versions
CN103916659B (en)
Inventor
J.F.凯利
T.西格
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by International Business Machines Corp filed Critical International Business Machines Corp
Publication of CN103916659A publication Critical patent/CN103916659A/en
Application granted granted Critical
Publication of CN103916659B publication Critical patent/CN103916659B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/672Focus control based on electronic image sensor signals based on the phase difference signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635Region indicators; Field of view indicators
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/75Circuitry for compensating brightness variation in the scene by influencing optical camera components
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/958Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
    • H04N23/959Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

In a method for visualizing depth of field, a computing device determines a distance value for each respective pixel of a plurality of pixels of a digital image based on a distance between subject matter depicted in the respective pixel and an imaging device integrated with the computing device. The computing device determines a depth of field of the digital image. The computing device compares the depth of field of the digital image to the distance value for each respective pixel to determine a first set of pixels of the plurality of pixels having distance values outside the depth of field and a second set of pixels of the plurality of pixels having distance values inside the depth of field. The computing device indicates the first set of pixels in a user interface.

Description

System and method for depth of field visualization
Technical field
The present invention relates generally to photography, and more particularly to the visualization of depth of field.
Background art
Depth of field is the portion of a still image or video that is "in focus," as rendered by the optical elements of an imaging system. More precisely, depth of field is the distance between the nearest and farthest objects in a scene that appear acceptably sharp in an image. Professional photographers use a shallow depth of field to isolate a subject for visual effect, or extend the depth of field to bring additional elements, or a large subject, into focus. Depth of field is valuable to photographers because it can be adjusted so that all or most of the image is in focus, or adjusted to blur the foreground or background and isolate the subject (an effect known as bokeh). The aperture (lens opening) of a device controls depth of field: the smaller the opening, the greater the depth of field. The distance to the subject (the subject in focus), the focal length of the lens, the sensor size, and the pixel size also affect the depth of field as calculated from the circle of confusion.
Depth of field varies widely. It is regarded as one of the most valuable creative tools and can be calculated precisely when capturing an image or video. However, finding the correct aperture to meet a creative intent is a trial-and-error process, even for the most experienced photographer or videographer. Many variables affect the depth of field of an image, making accuracy difficult to achieve without time and tools.
Summary of the invention
Aspects of embodiments of the present invention disclose a method, computer program product, and computing system for visualizing depth of field. A computing device determines a distance value for each respective pixel of a plurality of pixels of a digital image, based on the distance between the subject matter depicted in the respective pixel and an imaging device integrated with the computing device. The computing device determines the depth of field of the digital image. The computing device compares the depth of field of the digital image with the distance value for each respective pixel to determine a first set of pixels of the plurality of pixels having distance values outside the depth of field, and a second set of pixels of the plurality of pixels having distance values inside the depth of field. The computing device indicates the first set of pixels in a user interface.
Brief description of the drawings
Fig. 1 is a functional block diagram of an image capture device according to an embodiment of the present invention.
Fig. 2A is an example view of the user interface of an image capture program of the image capture device of Fig. 1, according to an embodiment of the present invention.
Fig. 2B is an example view of the user interface depicting the effect of the depth of field program for visualizing depth of field in an image captured by the image capture program, according to an embodiment of the present invention.
Fig. 3 is a flowchart depicting the steps of the depth of field program, according to an embodiment of the present invention.
Fig. 4 is a block diagram of components of the image capture device, according to an illustrative embodiment of the present invention.
Detailed description
Depth of field is the distance between the nearest and farthest objects in a scene that appear acceptably sharp in an image, and it is regarded as one of the most valuable creative tools when capturing an image or video. Although depth of field can be calculated precisely, determining the appropriate settings to meet a creative intent is often challenging for photographers, making it a trial-and-error process. Embodiments of the present invention recognize that seeing the depth of field of a captured digital image on the small, low-resolution displays integrated with most imaging devices (such as digital cameras) is especially difficult for photographers. A photographer cannot accurately check depth of field on the smaller display of an imaging device without first zooming in, or attaching another display on which to view the photo. The present invention provides direct visual feedback to the photographer by using a computer program to visualize depth of field, masking or blurring the portions of a captured digital image that lie outside the depth of field.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, microcode, etc.), or an embodiment combining software and hardware aspects, which may all generally be referred to herein as a "circuit," "module," or "system." Furthermore, in some embodiments, aspects of the present invention may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied thereon.
Any combination of one or more computer-readable media may be utilized. A computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. A computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, an electromagnetic signal, an optical signal, or any suitable combination thereof. A computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including, but not limited to, wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, C++, and the like, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, when executed via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus, or other devices to produce a computer-implemented process, such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The present invention will now be described in detail with reference to the figures. Fig. 1 is a functional block diagram of image capture device 10 according to an embodiment of the present invention. Fig. 1 depicts an illustration of only one embodiment and does not imply any limitation with regard to the environments in which different embodiments may be implemented.
In the depicted embodiment, image capture device 10 is a computing device integrated with an imaging device. For example, image capture device 10 may be a digital camera, such as a digital single-lens reflex camera (dSLR), that has computer processing capability, or, alternatively, an imaging device in communication with a separate, distinct computing device. In other embodiments, image capture device 10 may be a digital video camera, a computer with an integrated digital camera, a smart phone equipped with a digital camera, or any programmable electronic device capable of capturing and displaying digital images. A digital image may be a still picture or a moving image, such as a video or movie. Image capture device 10 includes optical lens 20, display 30, user interface 40, image sensor 50, image capture program 60, and depth of field program 70.
Optical lens 20 is integrated with image capture device 10. In one embodiment, optical lens 20 is an interchangeable dSLR lens. For example, optical lens 20 may be a 30 mm interchangeable dSLR lens. In another embodiment, optical lens 20 may be permanently fixed to image capture device 10. For example, optical lens 20 is permanently fixed when image capture device 10 is a digital point-and-shoot camera. Optical lens 20 operates to focus light onto image sensor 50.
The aperture (not shown) is the opening through which light passes through optical lens 20 and enters image capture device 10. The aperture may be located at different points within optical lens 20. For example, the aperture may be a ring or other fixture that holds an optical element in place, or it may be a diaphragm placed in the optical path to limit the amount of light that passes through the lens. The aperture may be adjusted to control the amount of light entering image capture device 10.
If the optical lens is not a fixed-focus lens, the focal length of optical lens 20 can be adjusted by the operator of image capture device 10. The focal length is the distance over which the lens focuses light; the focal point is the point at which the light rays converge and come into focus. Adjusting the focal length of optical lens 20 also adjusts the f-number of optical lens 20, which is the ratio of the focal length to the diameter of the aperture. The f-number is set to adjust the aperture diameter in order to control the amount of light entering imaging device 10. When written out, the f-number is typically preceded by "f/". Thus, an f-number of 4 is written as f/4. For example, if optical lens 20 is a 100 mm focal length lens with an f-number setting of f/4, optical lens 20 will have an aperture diameter of 25 mm.
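The relationship between f-number, focal length, and aperture diameter described above can be sketched as follows (a minimal illustration; the function name is my own, not part of the disclosed system):

```python
def aperture_diameter_mm(focal_length_mm: float, f_number: float) -> float:
    """Aperture diameter = focal length / f-number."""
    return focal_length_mm / f_number

# The example from the text: a 100 mm lens set to f/4 has a 25 mm aperture.
print(aperture_diameter_mm(100.0, 4.0))  # → 25.0
```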
Display 30 is connected to image capture device 10. In one embodiment, display 30 is a liquid crystal display (LCD) fixed to image capture device 10. In another embodiment, display 30 is a display monitor connected to a computer with an integrated digital camera. In yet another embodiment, display 30 is a display monitor connected to a network or LAN. In another embodiment, display 30 is a monitor attached to image capture device 10 via a cable. Display 30 operates to display digital images captured by image capture device 10. A digital image is composed of a set of pixels. In one embodiment, the digital image may be a still image. In another embodiment, the digital image may be a digital video.
User interface 40 operates on image capture device 10 and works in conjunction with display 30 to visualize content, such as images captured by image capture device 10. User interface 40 may comprise one or more interfaces, such as an operating system interface and application interfaces. In one embodiment, user interface 40 includes interfaces to image capture program 60 and depth of field program 70. In one embodiment, user interface 40 receives an image captured by image capture program 60 and sends the image to display 30.
Image sensor 50 is integrated with image capture device 10. Image sensor 50 is a detector that converts an optical image into an electronic signal. The electrical signal is quantized by an analog-to-digital (A/D) converter (not shown). In one embodiment, image sensor 50 may be a charge-coupled device (CCD) sensor. In another embodiment, image sensor 50 may be a complementary metal-oxide-semiconductor (CMOS) sensor or another type of sensor. In yet another embodiment, image sensor 50 may be a specialized sensor for medical imaging.
In one embodiment, light passes through optical lens 20 and reaches image sensor 50, which comprises an array of pixel sensors evenly distributed across image sensor 50. A pixel sensor may be made of a semiconductor material that absorbs light photons and generates an electronic signal. In one embodiment, image sensor 50 may also include autofocus pixel sensors. The autofocus pixel sensors may be arranged as an array in various patterns. In another embodiment, the autofocus pixel sensors may be included on a sensor separate from image sensor 50.
Image capture program 60 is a standard image capture program. For example, image capture program 60 is a program operating in a digital camera, such as a scene recognition system. In one embodiment, image capture program 60 receives and processes the electronic signals from image sensor 50. Image capture program 60 sends the processed image to user interface 40 for display on display 30.
In one embodiment, image capture program 60 also manages the autofocus capabilities of image capture device 10. The autofocus capabilities use one or more autofocus pixel sensors to determine whether the image is in focus and, if the image is not in focus, electromechanically adjust the focus of image capture device 10. The user may operate image capture program 60 via user interface 40 to select one or more focus points to set the photographer's focus point within the field of view of image capture device 10. A focus point is a location in the field of view of image capture device 10 that is associated with an autofocus pixel sensor. Image capture program 60 then determines whether the subject matter at a single focus point is in focus. If the subject matter at the single focus point is not in focus, image capture program 60 electromechanically adjusts the focus until the subject matter is in focus.
If the autofocus program utilizes active autofocus, image capture program 60 may determine the distance between the subject and sensor 50 using ultrasonic sound waves or triangulation of infrared light. Active autofocus is a type of autofocus that determines correct focus by measuring the distance to the subject independently of the optical system. In one embodiment, the distance may be determined with an ultrasonic detector (not shown). In another embodiment, the distance may be determined with an infrared light detector (not shown). In yet another embodiment, the distance may be determined by another method. If the autofocus program utilizes passive autofocus, image capture program 60 may determine the focus using phase detection or contrast measurement. Passive autofocus is a type of autofocus that determines correct focus by passively analyzing the image entering the optical system. In one embodiment, image capture program 60 may detect motion of the subject toward or away from the camera while maintaining focus on the subject.
In one embodiment, phase detection may also determine the distance between the subject matter at a focus point and the autofocus pixel sensor associated with that focus point. Phase detection may operate in a manner similar to a rangefinder, which is a focusing mechanism that allows the user to measure the distance to the subject. A rangefinder shows two identical images; one image moves as a calibration wheel on the image capture device is turned, and when the two images overlap and fuse into one, the distance is read from the calibration wheel. For example, when utilizing phase detection, imaging device 10 includes a beam splitter (not shown) that captures light from opposite sides of the lens and diverts the light to autofocus sensors located apart from image sensor 50. This creates two separate images, which are compared for light intensity and separation error to determine whether the image is in or out of focus. During this comparison, phase detection determines the distance between the subject matter at the focus point and the associated autofocus pixel sensor. For example, a digital camera measures the distance to the subject matter electronically.
Depth of field program 70 operates to visualize the depth of field of an image captured by image capture program 60. In one embodiment, depth of field program 70 determines the depth of field of the captured image based on image data, which includes the data required to calculate the depth of field via the autofocus capabilities of the image capture program, such as the aperture opening, the focal length of optical lens 20, and the distance between the subject matter at the focus point and the autofocus pixel sensor associated with the focus point. The image data also includes the distances at multiple focus points. Depth of field program 70 causes the depth of field to be displayed in user interface 40 of image capture device 10. As depicted, depth of field program 70 is a subprogram or routine of image capture program 60. In another embodiment, depth of field program 70 may be a stand-alone program that communicates with image capture program 60.
In one embodiment, depth of field program 70 overlays masks of varying transparency on the image to distinguish the regions inside and outside the depth of field. For example, depth of field program 70 overlays an opaque mask over regions well outside the depth of field and a translucent mask over regions close to the depth of field. Regions inside the depth of field, around the selected focus point, are not masked. In another embodiment, depth of field program 70 may overlay masks of varying color or pattern on the image to distinguish the regions inside and outside the depth of field. For example, depth of field program 70 may overlay a blue mask over regions well outside the depth of field and a red mask over regions close to the depth of field. In this embodiment, depth of field program 70 may overlay a yellow mask on regions inside the depth of field. In yet another embodiment, depth of field program 70 magnifies the bokeh of the image to visualize the depth of field, compensating for the apparent reduction in blur caused by the small display size.
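The variable-transparency masking described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: it assumes an RGB image array and a per-pixel depth map (step 310), and blends a mask color whose opacity grows with distance outside the in-focus range, so regions well outside the depth of field are masked more heavily than regions close to it.

```python
import numpy as np

def overlay_depth_mask(image, depth_map, near, far, max_alpha=0.8,
                       mask_color=(0, 0, 255)):
    """Blend a colored mask over pixels whose depth lies outside [near, far].

    image: H x W x 3 array; depth_map: H x W array of per-pixel distances.
    Pixels inside the depth of field are left untouched.
    """
    # How far each pixel lies outside the in-focus range (0 inside it).
    outside = np.maximum(near - depth_map, 0) + np.maximum(depth_map - far, 0)
    span = max(outside.max(), 1e-6)
    alpha = np.clip(outside / span, 0.0, 1.0) * max_alpha  # per-pixel opacity
    color = np.array(mask_color, dtype=float)
    out = image * (1.0 - alpha[..., None]) + color * alpha[..., None]
    return out.astype(image.dtype)
```

An opaque/translucent two-level mask, or the color-coded variant (blue/red/yellow), would follow the same pattern with discrete alpha or color values per region.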
Image capture device 10 may include internal and external components as depicted and described in further detail with respect to Fig. 4.
Figs. 2A and 2B depict example views of user interfaces for image capture program 60 and depth of field program 70, according to an embodiment of the present invention. User interface 200 of Fig. 2A and user interface 250 of Fig. 2B are each examples of user interface 40 of Fig. 1 and allow a user to see content displayed by image capture program 60 and depth of field program 70.
Fig. 2A depicts a display of user interface 200 before the depth of field is shown. User interface 200 displays an image received from image capture program 60. The image includes subject 220 and objects 230 and 240. Focus points 210 are locations in the field of view of image capture device 10 that are each associated with an autofocus pixel sensor. In one embodiment, the user may select a single focus point via user interface 200. In another embodiment, the user may select multiple focus points via user interface 200. Selected focus point 215 is the focus point chosen by the user to be in focus. Subject 220 is the object at the user-selected focus point. Subject 220 is in focus in the image shown on display 30 of image capture device 10. Foreground object 230 is an out-of-focus object in the foreground of the image. Background object 240 is an out-of-focus object in the background of the image.
Fig. 2B depicts a display of user interface 250 after the depth of field is shown. User interface 250 displays an image received from image capture program 60. The image includes subject 220 and objects 230 and 240. Subject 220 is the subject of the image and is in focus in the image shown on display 30. Foreground object 230 is an out-of-focus object in the foreground of the image. Background object 240 is an out-of-focus object in the background of the image. Focus region 260 is the in-focus region around subject 220, within the depth of field. Foreground mask 270 is a mask over the out-of-focus region in the foreground of the image, outside the depth of field. Background mask 280 is a mask over the out-of-focus region in the background, outside the depth of field.
Fig. 3 is a flowchart depicting the operational steps of depth of field program 70 for visualizing the depth of field of a captured image, according to an embodiment of the present invention.
In one embodiment, the user selects the photographer's focus point within the field of view. Initially, light passes through lens 20. Image sensor 50 absorbs the light, converts it into an electronic signal, and sends the signal to image capture program 60. Image capture program 60 receives the electronic signal from image sensor 50. Image capture program 60 adjusts the autofocus so that the subject matter at the photographer's focus point is in focus. For each focus point of a plurality of focus points, image capture program 60 determines the distance between the subject matter at the focus point and the autofocus pixel sensor associated with the focus point. In one embodiment, image capture program 60 determines image data, which includes the data required to calculate the depth of field via the autofocus capabilities of the image capture program, such as the aperture opening, the focal length of optical lens 20, and the distance between the subject matter at the photographer's focus point and the autofocus pixel sensor associated with the photographer's focus point. The image data also includes, for each focus point of the plurality of focus points, the distance between the subject matter at the focus point and the autofocus pixel sensor associated with the focus point. In one embodiment, image capture program 60 sends some or all of the image data to depth of field program 70.
In one embodiment, image capture program 60 causes the captured image to be displayed in user interface 40. In another embodiment, image capture program 60 sends the captured image to depth of field program 70.
In step 300, depth of field program 70 receives image data. In one embodiment, depth of field program 70 receives the image data from image capture program 60. In another embodiment, depth of field program 70 may access the autofocus capabilities of image capture program 60, and the depth of field program determines the image data. The image data includes the data required to calculate the depth of field, such as the aperture diameter, the focal length of optical lens 20, and the distance between the subject matter at the photographer's focus point and the autofocus pixel sensor associated with the photographer's focus point.
In step 310, depth of field program 70 creates a depth map for the captured image. In one embodiment, the depth map maps the pixels of the captured image to the distance associated with each pixel, measured from the autofocus sensor associated with that pixel. Each focus point of the plurality of focus points is associated with one or more pixels of the captured image, and each distance in the image data is associated with a focus point. In one embodiment, depth of field program 70 determines an average of distances for the pixels of the captured image that are not associated with a focus point. For example, depth of field program 70 determines the average of the distances associated with two adjacent focus points and assigns that average distance value to the pixels between the two adjacent focus points that are not associated with a focus point. In another embodiment, depth of field program 70 determines graduated distances for the pixels not associated with a focus point. For example, depth of field program 70 determines the distances for two adjacent focus points and assigns graduated distance values to the pixels between the two adjacent focus points that are not associated with a focus point.
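The two fill-in strategies of step 310 can be sketched as follows, taking one row of pixels between two adjacent focus points. This is a minimal illustration under my own naming; the graduated variant is a simple linear interpolation between the two measured distances.

```python
def graduated_distances(col_a, dist_a, col_b, dist_b):
    """Assign graduated (linearly interpolated) distance values to the
    pixel columns between two adjacent focus points.

    col_a/col_b are the pixel columns of the two focus points; dist_a/dist_b
    are their measured distances. Returns a distance for every column in
    [col_a, col_b], inclusive.
    """
    span = col_b - col_a
    return [dist_a + (dist_b - dist_a) * (c - col_a) / span
            for c in range(col_a, col_b + 1)]

def average_distance(dist_a, dist_b):
    """The averaging variant assigns one value to all pixels in between."""
    return (dist_a + dist_b) / 2.0

print(graduated_distances(0, 2.0, 4, 4.0))  # → [2.0, 2.5, 3.0, 3.5, 4.0]
```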
In step 320, depth of field program 70 determines the depth of field of the image captured by image capture program 60. In one embodiment, depth of field program 70 determines the depth of field of the captured image based on the image data. The image data includes the aperture diameter, the focal length of optical lens 20, and the distance between the subject matter at the photographer's focus point and the autofocus pixel sensor associated with the photographer's focus point. For example, depth of field program 70 determines the depth of field of the captured image by calculating it with a known algorithm. The depth of field comprises a range of distances around the focus point selected by the photographer.
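The "known algorithm" is typically the standard depth-of-field calculation from the circle of confusion mentioned in the background; a sketch under the usual thin-lens approximations (the patent does not specify which formulation the program uses):

```python
def depth_of_field(focal_mm, f_number, subject_mm, coc_mm=0.03):
    """Near and far limits of acceptable sharpness (all distances in mm).

    Uses the common formulation based on the hyperfocal distance H:
        H    = f^2 / (N * c) + f
        near = s * (H - f) / (H + s - 2f)
        far  = s * (H - f) / (H - s)   (infinite when s >= H)
    where f is the focal length, N the f-number, s the subject distance,
    and c the circle of confusion diameter.
    """
    H = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = subject_mm * (H - focal_mm) / (H + subject_mm - 2 * focal_mm)
    if subject_mm >= H:
        far = float("inf")
    else:
        far = subject_mm * (H - focal_mm) / (H - subject_mm)
    return near, far
```

For a 50 mm lens at f/8 focused at 5 m, this yields a near limit of roughly 3.4 m and a far limit of roughly 9.5 m; focusing at or beyond the hyperfocal distance extends the far limit to infinity.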
At step 330, field depth program 70 visualizes the field depth of the captured image. In one embodiment, field depth program 70 determines which pixels of the captured image are within the field depth by comparing the distance range of the field depth with the distance value assigned to each pixel in the depth map. If a pixel's distance falls within the distance range of the field depth, the pixel is considered to be within the field depth, and field depth program 70 does not mask it. If a pixel is associated with a distance farther than the field depth, the pixel is considered to be beyond the field depth, in the background of the captured image, and field depth program 70 masks it with an overlay. If a pixel is associated with a distance nearer than the field depth, the pixel is considered to be outside the field depth, in the foreground of the captured image, and field depth program 70 likewise masks it with an overlay.
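The per-pixel comparison of step 330 amounts to a three-way classification of each depth-map entry against the field-depth range. A minimal sketch, with the depth map flattened to a list of distances and the label strings chosen for illustration:

```python
def classify_pixels(depth_map, near, far):
    """Label each pixel by comparing its depth-map distance with the
    field-depth range [near, far]: foreground and background pixels are
    candidates for masking; in-field pixels are left unmasked."""
    labels = []
    for d in depth_map:
        if d < near:
            labels.append("foreground")   # nearer than the field depth: mask
        elif d > far:
            labels.append("background")   # farther than the field depth: mask
        else:
            labels.append("in_field")     # within the field depth: unmasked
    return labels
```

With the example range of 3.4–9.5 m, a pixel at 1 m is classified as foreground, one at 4 m as in-field, and one at 12 m as background.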
In one embodiment, elements that are "in focus", i.e., within the field depth, are not masked. Elements outside the focal plane are masked with gradually increasing opacity, with the plane farthest from the point of focus being the most opaque. In another embodiment, elements in the plane farthest from the lens's point of focus are blurred, because the depth map thereby simulates how the final image will appear on a larger display. In yet another embodiment, elements in the background are masked with one pattern or color, and elements in the foreground are masked with a different pattern or color.
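The graduated mask in the first embodiment above — opacity growing with distance from the field depth, maximal at the farthest plane — might be computed per pixel as follows. The linear ramp and the [0, 1] alpha range are assumptions for illustration; the patent does not specify the falloff curve.

```python
def mask_alpha(distance, near, far, max_distance):
    """Return an overlay opacity in [0, 1]: 0 inside the field depth,
    ramping linearly up to 1 at the farthest background distance."""
    if near <= distance <= far:
        return 0.0                          # in focus: no mask
    if distance > far:
        span = max(max_distance - far, 1e-9)
        return min((distance - far) / span, 1.0)   # background ramp
    span = max(near, 1e-9)
    return min((near - distance) / span, 1.0)      # foreground ramp
```

A renderer would multiply this alpha into the overlay color for each pixel, leaving in-field pixels untouched and making the farthest background pixels fully opaque.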
In one embodiment, field depth program 70 causes the field depth of the captured image to be displayed in user interface 40. The mask overlaid on the image shown on display 30 of image capture device 10 helps the user visualize the position of a target relative to the subject of the digital image and preview the digital image, so the user can see clearly which elements are in focus and which are out of focus. Some or all portions of the digital image may be masked. In one embodiment, field depth program 70 sends to user interface 40 an indication of which pixels to cover and the type of mask to use. In another embodiment, field depth program 70 sends the indication of which pixels to cover and the mask type to image capture program 60, and image capture program 60 forwards the indication to user interface 40.
Fig. 4 depicts a block diagram of components of image capture device 10 in accordance with an illustrative embodiment of the present invention. It should be appreciated that Fig. 4 provides only an illustration of one implementation and does not imply any limitations with regard to the environments in which different embodiments may be implemented. Many modifications to the depicted environment may be made.
Image capture device 10 includes communications fabric 402, which provides communications between computer processor(s) 404, memory 406, persistent storage 408, communications unit 410, and input/output (I/O) interface(s) 412. Communications fabric 402 can be implemented with any architecture designed for passing data and/or control information between processors (such as microprocessors, communications and network processors, etc.), system memory, peripheral devices, and any other hardware components within a system. For example, communications fabric 402 can be implemented with one or more buses.
Memory 406 and persistent storage 408 are computer-readable storage media. In this embodiment, memory 406 includes random access memory (RAM) 414 and cache memory 416. In general, memory 406 can include any suitable volatile or non-volatile computer-readable storage media.
User interface 40, image capture program 60, and field depth program 70 are stored in persistent storage 408 for execution by one or more of the respective computer processors 404 via one or more memories of memory 406. In this embodiment, persistent storage 408 includes a magnetic hard disk drive. Alternatively, or in addition to a magnetic hard disk drive, persistent storage 408 can include a solid-state drive, a semiconductor storage device, read-only memory (ROM), erasable programmable read-only memory (EPROM), flash memory, or any other computer-readable storage medium capable of storing program instructions or digital information.
The media used by persistent storage 408 may also be removable. For example, a removable hard drive may be used for persistent storage 408. Other examples include optical and magnetic disks, thumb drives, and smart cards that are inserted into a drive for transfer onto another computer-readable storage medium that is also part of persistent storage 408.
Communications unit 410, in these examples, provides for communications with other servers. In these examples, communications unit 410 includes one or more network interface cards. Communications unit 410 may provide communications through the use of either or both physical and wireless communications links. User interface 40, image capture program 60, and field depth program 70 may be downloaded to persistent storage 408 through communications unit 410.
I/O interface(s) 412 allows for input and output of data with other devices that may be connected to image capture device 10. For example, I/O interface 412 may provide a connection to external devices 418 such as a keyboard, a keypad, a touch screen, and/or some other suitable input device. External devices 418 can also include portable computer-readable storage media such as, for example, thumb drives, portable optical or magnetic disks, and memory cards. Software and data used to practice embodiments of the present invention, e.g., user interface 40, image capture program 60, and field depth program 70, can be stored on such portable computer-readable storage media and loaded onto persistent storage 408 via I/O interface(s) 412. I/O interface(s) 412 also connect to display 420.
Display 420 provides a mechanism to display data to a user and may be, for example, a computer monitor.
The programs described herein are identified based upon the application for which they are implemented in a specific embodiment of the invention. However, it should be appreciated that any particular program nomenclature herein is used merely for convenience, and thus the invention should not be limited to use solely in any specific application identified and/or implied by such nomenclature.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.

Claims (12)

1. A method for visualizing the field depth of a digital image, the method comprising the steps of:
a computing device determining, for each respective pixel of a plurality of pixels of the digital image, a distance value based on a distance between a subject depicted in the respective pixel and an imaging device integrated with the computing device;
the computing device determining the field depth of the digital image;
the computing device comparing the field depth of the digital image with the distance value for each respective pixel to determine a first set of pixels of the plurality of pixels having distance values outside the field depth and a second set of pixels of the plurality of pixels having distance values within the field depth; and
the computing device indicating the first set of pixels in a user interface.
2. The method of claim 1, wherein the step of the computing device indicating the first set of pixels in the user interface comprises: the computing device overlaying, in the user interface, a first mask on the first set of pixels having distance values outside the field depth.
3. The method of claim 2, wherein the first mask varies in transparency.
4. The method of claim 1, wherein the step of the computing device indicating the first set of pixels in the user interface comprises: the computing device blurring, in the user interface, the first set of pixels having distance values outside the field depth.
5. The method of claim 1, further comprising the steps of:
the computing device determining a first subset of pixels of the first set of pixels having distance values less than a minimum distance value of the range of the field depth; and
the computing device determining a second subset of pixels of the first set of pixels having distance values greater than a maximum distance value of the range of the field depth.
6. The method of claim 5, further comprising the steps of:
the computing device overlaying a second mask on the first subset of pixels of the first set of pixels having distance values less than the minimum distance value of the range of the field depth; and
the computing device overlaying a third mask on the second subset of pixels of the first set of pixels having distance values greater than the maximum distance value of the range of the field depth.
7. A computer system for visualizing the field depth of a digital image, the computer system comprising:
one or more computer processors;
one or more computer-readable storage media;
program instructions stored on the computer-readable storage media for execution by at least one of the one or more processors, the program instructions comprising:
program instructions to determine, for each respective pixel of a plurality of pixels of the digital image, a distance value based on a distance between a subject depicted in the respective pixel and an imaging device integrated with the computer system;
program instructions to determine the field depth of the digital image;
program instructions to compare the field depth of the digital image with the distance value for each respective pixel to determine a first set of pixels of the plurality of pixels having distance values outside the field depth and a second set of pixels of the plurality of pixels having distance values within the field depth; and
program instructions to indicate the first set of pixels in a user interface.
8. The computer system of claim 7, wherein the program instructions to indicate the first set of pixels in the user interface comprise: program instructions to overlay, in the user interface, a first mask on the first set of pixels having distance values outside the field depth.
9. The computer system of claim 8, wherein the first mask varies in transparency.
10. The computer system of claim 7, wherein the program instructions to indicate the first set of pixels in the user interface comprise: program instructions to blur, in the user interface, the first set of pixels having distance values outside the field depth.
11. The computer system of claim 7, further comprising:
program instructions, stored on the computer-readable storage media for execution by at least one of the one or more processors, to determine a first subset of pixels of the first set of pixels having distance values less than a minimum distance value of the range of the field depth; and
program instructions, stored on the computer-readable storage media for execution by at least one of the one or more processors, to determine a second subset of pixels of the first set of pixels having distance values greater than a maximum distance value of the range of the field depth.
12. The computer system of claim 11, further comprising:
program instructions, stored on the computer-readable storage media for execution by at least one of the one or more processors, to overlay a second mask on the first subset of pixels of the first set of pixels having distance values less than the minimum distance value of the range of the field depth; and
program instructions, stored on the computer-readable storage media for execution by at least one of the one or more processors, to overlay a third mask on the second subset of pixels of the first set of pixels having distance values greater than the maximum distance value of the range of the field depth.
CN201410001474.8A 2013-01-02 2014-01-02 System and method for depth of field visualization Expired - Fee Related CN103916659B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/732,577 2013-01-02
US13/732,577 US20140184586A1 (en) 2013-01-02 2013-01-02 Depth of field visualization

Publications (2)

Publication Number Publication Date
CN103916659A true CN103916659A (en) 2014-07-09
CN103916659B CN103916659B (en) 2016-11-16

Family

ID=51016662

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410001474.8A Expired - Fee Related CN103916659B (en) System and method for depth of field visualization

Country Status (2)

Country Link
US (1) US20140184586A1 (en)
CN (1) CN103916659B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107438157A (en) * 2016-05-25 2017-12-05 聚晶半导体股份有限公司 Video capturing device and its gradual focusing method

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6245885B2 (en) * 2013-08-02 2017-12-13 キヤノン株式会社 Imaging apparatus and control method thereof
US9734551B1 (en) * 2013-12-01 2017-08-15 Google Inc. Providing depth-of-field renderings
US9449234B2 (en) 2014-03-31 2016-09-20 International Business Machines Corporation Displaying relative motion of objects in an image
US9300857B2 (en) 2014-04-09 2016-03-29 International Business Machines Corporation Real-time sharpening of raw digital images
US9456010B1 (en) 2015-12-30 2016-09-27 International Business Machines Corporation Convergence of social enterprise and digital telephony
US10733706B2 (en) * 2018-01-07 2020-08-04 Htc Corporation Mobile device, and image processing method for mobile device
CN109615648B (en) * 2018-12-07 2023-07-14 深圳前海微众银行股份有限公司 Depth of field data conversion method, device, equipment and computer readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1926851A (en) * 2004-01-16 2007-03-07 索尼电脑娱乐公司 Method and apparatus for optimizing capture device settings through depth information
CN102422630A (en) * 2009-05-12 2012-04-18 佳能株式会社 Image pickup apparatus
CN102842110A (en) * 2011-06-20 2012-12-26 富士胶片株式会社 Image processing device and image processing method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6603485B2 (en) * 2001-04-24 2003-08-05 Hewlett-Packard Development Company, L.P. Computer cursor spotlight
US20130094753A1 (en) * 2011-10-18 2013-04-18 Shane D. Voss Filtering image data
US8805059B2 (en) * 2011-10-24 2014-08-12 Texas Instruments Incorporated Method, system and computer program product for segmenting an image


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107438157A (en) * 2016-05-25 2017-12-05 聚晶半导体股份有限公司 Video capturing device and its gradual focusing method
CN107438157B (en) * 2016-05-25 2020-04-07 聚晶半导体股份有限公司 Image acquisition device and progressive focusing method thereof

Also Published As

Publication number Publication date
US20140184586A1 (en) 2014-07-03
CN103916659B (en) 2016-11-16

Similar Documents

Publication Publication Date Title
CN103916659A (en) System And Method For Depth Of Field Visualization
US9196027B2 (en) Automatic focus stacking of captured images
JP6512810B2 (en) Image pickup apparatus, control method and program
CN102223477B (en) Four-dimensional polynomial model for depth estimation based on two-picture matching
CN107948519A (en) Image processing method, device and equipment
CN108718381B (en) Image capturing apparatus and method for controlling image capturing apparatus
JP5996319B2 (en) Stereo measurement device and method of operating stereo measurement device
JP6497987B2 (en) Image processing apparatus, image processing method, program, and storage medium
JP2007282188A (en) Object tracker, object tracking method, object tracking program and optical equipment
JP2014010400A (en) Imaging device and lens device
CN101183206A (en) Method for calculating distance and actuate size of shot object
JP2015036632A (en) Distance measuring device, imaging apparatus, and distance measuring method
US20200309520A1 (en) Distance measuring camera
JP2015072357A (en) Focus adjustment device
JP2012253462A (en) Imaging apparatus and control method therefor
JP2020088689A (en) Image processing apparatus, imaging apparatus, and image processing method
JP2018097253A (en) Focus adjustment unit and focus adjustment method
JP2015201722A (en) Image processing device, imaging apparatus, image processing method, program and storage medium
JP2019208170A (en) Image processing apparatus, image processing method, program, and storage medium
US9449234B2 (en) Displaying relative motion of objects in an image
JP2014145867A (en) Imaging device and imaging method
JP6346484B2 (en) Image processing apparatus and control method thereof
JP2014154981A (en) Imaging apparatus and control method therefor
CN106686375B (en) A kind of calculation method and system of camera hyperfocal distance
JP2020048034A (en) Electronic device and notification method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20161116

Termination date: 20190102