WO2010128579A1 - Electronic camera, image processing device, and image processing method - Google Patents
Electronic camera, image processing device, and image processing method
- Publication number
- WO2010128579A1 (PCT/JP2010/002916)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- blur
- region
- input image
- locus
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Definitions
- the present invention relates to an electronic camera, an image processing device, and the like, and more particularly to an electronic camera, an image processing device, and the like that track an image of an object to be tracked in an input image.
- the electronic camera has an object tracking function that serves as an alignment means for AF (Auto-Focus), AE (Automatic Exposure), or a backlight correction function.
- the electronic camera is a camera that captures still images or a camera that captures moving images.
- when a touch panel is provided as a means for displaying an image captured, or about to be captured, by the electronic camera, the user designates the image of the object to be tracked by touching the touch panel.
- the electronic camera tracks the image of the object by specifying a target area that is an area including the image of the object in an image taken after the image of the object is specified.
- the electronic camera specifies the target region by searching for a region having a feature amount similar to the feature amount of the image of the designated object (see, for example, Patent Document 1).
- the electronic camera searches for a target area by template matching using color information as a feature amount.
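The color-based template matching mentioned here can be sketched in a few lines. This is an illustrative reconstruction, not the patent's implementation: the coarse RGB histogram as the feature amount and histogram intersection as the similarity score are both assumptions.

```python
import numpy as np

def color_histogram(region, bins=8):
    """Quantize RGB pixels into a normalized color histogram (the 'feature amount')."""
    idx = (region // (256 // bins)).astype(int)            # per-channel bin index, 0..bins-1
    flat = idx[..., 0] * bins * bins + idx[..., 1] * bins + idx[..., 2]
    hist = np.bincount(flat.ravel(), minlength=bins ** 3).astype(float)
    return hist / hist.sum()

def match_score(hist_a, hist_b):
    """Histogram intersection: 1.0 means identical color distributions."""
    return float(np.minimum(hist_a, hist_b).sum())
```

A tracker of this kind would compute `color_histogram` once for the designated object and then score candidate regions with `match_score`, keeping the best-scoring one.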
- however, when the object is a fast-moving object such as a child, when a user who is not skilled at operating the electronic camera shoots while shaking the camera up and down or left and right, or when shooting in a dark environment at a low frame rate, the image of the object blurs, and the feature amount indicating the feature of the image of the object varies. Therefore, there is a problem that the object cannot be tracked correctly when the target region is specified using the feature amount, as in the conventional electronic camera.
- the present invention has been made in view of the above-described conventional problems, and it is an object of the present invention to provide an electronic camera, an image processing apparatus, and the like that can stably track an object even when the feature amount of the image of the object fluctuates in the input image due to a sudden movement of the object.
- an electronic camera according to the present invention tracks an image of an object to be tracked in an input image, and uses the tracking result to execute at least one of autofocus processing, automatic exposure processing, and backlight correction processing. The electronic camera includes: a blur detection unit that detects a blur trajectory, which is a trajectory indicating the blur of a search region in the input image, the search region being a region that is determined using an image taken before the input image and that includes the image of the object; and a tracking processing unit that tracks the image of the object by using an end point of the blur trajectory detected by the blur detection unit to specify a target region, which is a region that includes the image of the object and is smaller than the search region.
- the electronic camera may further include a storage unit that stores an initial feature amount that quantitatively indicates a feature of the image of the object. In that case, the blur detection unit further specifies an end point of the detected blur trajectory, and the tracking processing unit determines, as the search region, a region including the position corresponding to that end point in the next input image temporally continuous with the input image, and specifies the target region in that next input image by searching the determined search region for the region having the feature amount most similar to the initial feature amount stored in the storage unit.
- with this configuration, the vicinity of the position where the object is estimated to appear can be determined as the search region. Therefore, the object can be tracked stably, and the load of the search process can be reduced.
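Determining a search region around an estimated object position can be sketched as follows. This is a hedged illustration; the centering-and-clipping policy at the frame edges is an assumption (the detailed description later notes that the center may shift near the image edge).

```python
def search_region_around(endpoint, region_size, image_size):
    """Center a search window on the estimated position, clipped to the frame.

    endpoint:    (x, y) estimated object position, e.g. a blur-trajectory end point
    region_size: (w, h) of the search window
    image_size:  (img_w, img_h) of the input image
    Returns (x, y, w, h) of the search region.
    """
    ex, ey = endpoint
    w, h = region_size
    img_w, img_h = image_size
    x = min(max(ex - w // 2, 0), img_w - w)   # shift window inward at the edges
    y = min(max(ey - h // 2, 0), img_h - h)
    return (x, y, w, h)
```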
- the blur detection unit may specify, as the end point of the blur trajectory, the one of the trajectory's two end points that is farther from the position corresponding to the target region specified in the immediately preceding input image temporally continuous with the input image.
- the tracking processing unit may specify, as the target region in the input image, the region having the feature amount most similar to the initial feature amount stored in the storage unit, from among the regions including each of the two end points of the blur trajectory.
- when the length of the blur trajectory is larger than a threshold, the tracking processing unit may specify the target region in the input image using the blur trajectory; when the length of the blur trajectory is equal to or less than the threshold, the tracking processing unit may specify the target region in the input image by searching the search region for the region having the feature amount most similar to the initial feature amount stored in the storage unit.
- with this configuration, the target region is specified using the blur trajectory only when the image of the object is blurred and its feature amount is changing, so the object can be tracked more stably.
- the blur detection unit may further specify an end point of the detected blur trajectory, and the tracking processing unit may specify, as the target region, a region including the position corresponding to that end point in the input image. With this configuration, the region including the position of the end point of the blur trajectory is specified as the target region directly, without extracting or comparing feature amounts, so the target region can be specified easily.
- here too, the blur detection unit may specify, as the end point of the blur trajectory, the one of the trajectory's two end points that is farther from the position corresponding to the target region specified in the immediately preceding input image temporally continuous with the input image.
- the electronic camera may further include a storage unit that stores an initial feature amount that quantitatively indicates a feature of the region in which the object appears. The blur detection unit detects the blur trajectory when the shutter speed or the frame rate at which the input image is captured is smaller than a threshold, and the tracking processing unit uses the end point of the blur trajectory specified by the blur detection unit in that case. When the shutter speed or the frame rate is not smaller than the threshold, the tracking processing unit determines, as the search region, a region including the position corresponding to the target region specified in the immediately preceding input image temporally continuous with the input image, and specifies the target region in the input image by searching the determined search region for the region having the feature amount most similar to the initial feature amount stored in the storage unit.
- an image processing apparatus according to the present invention tracks an image of an object to be tracked in an input image. The image processing apparatus includes: a blur detection unit that detects a blur trajectory, which is a trajectory indicating the blur of a search region in the input image, the search region being a region that is determined using an image taken before the input image and that includes the image of the object; and a tracking processing unit that tracks the image of the object by using the blur trajectory detected by the blur detection unit to specify a target region, which is a region that includes the image of the object and is smaller than the search region.
- an integrated circuit according to the present invention tracks an image of an object to be tracked in an input image. The integrated circuit includes: a blur detection unit that detects a blur trajectory, which is a trajectory indicating the blur of a search region in the input image, the search region being a region that is determined using an image taken before the input image and that includes the image of the object; and a tracking processing unit that tracks the image of the object by using an end point of the blur trajectory detected by the blur detection unit to specify a target region, which is a region that includes the image of the object and is smaller than the search region.
- the present invention may be realized not only as such an image processing apparatus, but also as an image processing method having the operations of the characteristic components of the image processing apparatus as steps, or as a program that causes a computer to execute those steps.
- the electronic camera and the like according to the present invention use the blur trajectory, and can therefore track the object stably even when the feature amount of the image of the object fluctuates in the input image due to a sudden movement of the object.
- FIG. 1 is a block diagram showing a functional configuration of an image processing apparatus according to Embodiment 1 or Embodiment 2 of the present invention.
- FIG. 2 is a block diagram showing a configuration of an electronic camera which is a specific example of the image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 3 is a flowchart showing the operation of the image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 4 is a diagram for describing specific examples of various operations in the image processing apparatus when a blurred image is not included in the input image.
- FIG. 5 is a diagram for explaining a specific example of the operation of the image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 6 is a flowchart showing the operation of the image processing apparatus according to the second embodiment of the present invention.
- Embodiment 1. The image processing apparatus according to Embodiment 1 of the present invention is characterized in that, when the length of the blur locus is larger than the threshold, it determines the position of the search region in the next input image temporally continuous with the blurred image using the end point of the blur locus.
- FIG. 1 is a block diagram showing a functional configuration of an image processing apparatus according to Embodiment 1 of the present invention.
- the image processing apparatus 10 includes an initial feature amount extraction unit 11, a shake detection unit 12, a tracking processing unit 13, and a storage unit 14.
- the initial feature quantity extraction unit 11 acquires information (for example, position, shape, size, etc.) regarding an initial area that is an area including the image of the object in the input image when the image of the object is specified.
- the initial feature quantity extraction unit 11 also extracts an initial feature quantity that quantitatively indicates the feature of the image of the object included in the initial area. Further, the initial feature quantity extraction unit 11 writes the extracted initial feature quantity in the storage unit 14.
- the blur detection unit 12 detects the blur locus of the search area in the input image.
- the search area is an area determined by the tracking processing unit 13 using an image taken before the input image and includes an image of the target object.
- the blur detection unit 12 specifies the end point of the detected blur locus. Specifically, of the two end points of the blur locus, it specifies as the end point the one farther from the position corresponding to the target region specified in the immediately preceding input image temporally continuous with the input image.
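This endpoint-selection rule (keep the end point farther from the previous target position, since the object moved away from it) can be sketched as follows; representing endpoints as plain (x, y) tuples is an assumption for illustration.

```python
import math

def select_trajectory_endpoint(endpoints, prev_target_center):
    """Of the two end points of the blur locus, return the one farther from
    the target position specified in the immediately preceding input image."""
    return max(endpoints, key=lambda p: math.dist(p, prev_target_center))
```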
- the tracking processing unit 13 identifies the target area using the end points of the blur locus detected by the blur detection unit 12.
- the target area is an area including an image of the target object and is smaller than the search area.
- when the length of the blur locus detected by the blur detection unit 12 is larger than the threshold, the tracking processing unit 13 determines that the input image is a blurred image and specifies the target region using the end points of the blur locus.
- specifically, the tracking processing unit 13 specifies, as the target region in the input image, the region having the feature amount most similar to the initial feature amount stored in the storage unit 14, from among the regions including each of the two end points of the blur locus.
- further, the tracking processing unit 13 determines, as the search region, a region including the position corresponding to the end point of the blur locus specified by the blur detection unit 12 in the next input image temporally continuous with the input image in which the blur locus was detected. Then, when that next input image is not a blurred image, the tracking processing unit 13 searches the determined search region for the region having the feature amount most similar to the initial feature amount stored in the storage unit 14, and specifies the region found in this way as the target region.
- the storage unit 14 stores the initial feature amount extracted by the initial feature amount extraction unit 11. Further, the storage unit 14 stores information indicating the position of the end point of the blur locus specified by the blur detection unit 12 (hereinafter simply referred to as “blur end point information”). Furthermore, the storage unit 14 stores information indicating the position and size of the target region specified by the tracking processing unit 13 (hereinafter simply “target region information”).
- FIG. 2 is a block diagram showing a configuration of an electronic camera which is a specific example of the image processing apparatus according to Embodiment 1 of the present invention.
- the electronic camera 100 includes a photographing lens 101, a shutter 102, an image sensor 103, an AD converter 104, a timing generation circuit 105, an image processing circuit 106, a memory control circuit 107, an image display memory 108, a DA converter 109, an image display unit 110, a memory 111, a resize circuit 112, a system control circuit 113, an exposure control unit 114, a distance measurement control unit 115, a zoom control unit 116, a power supply 128, interfaces 129 and 130, connectors 131 and 132, a recording medium 133, an optical finder 134, a communication unit 135, an antenna 136, an initial feature amount extraction circuit 137, a tracking processing circuit 138, a shake detection circuit 139, a tracking result drawing circuit 140, and a camera control circuit 141.
- the power supply 128 and the recording medium 133 may be removable.
- the photographing lens 101 is a lens having zooming and focusing functions, and forms an image of the incident light on the image sensor 103.
- the shutter 102 has a diaphragm function and adjusts the amount of light incident on the image sensor 103.
- the image sensor 103 converts an optical image obtained by forming incident light into an electric signal (image data).
- the AD converter 104 converts an analog signal output from the image sensor 103 into a digital signal.
- the AD converter 104 writes the image data converted into the digital signal into the image display memory 108 or the memory 111 via the memory control circuit 107.
- the AD converter 104 outputs the image data converted into the digital signal to the image processing circuit 106.
- the timing generation circuit 105 supplies a clock signal or a control signal to the image sensor 103, the AD converter 104, and the DA converter 109.
- the timing generation circuit 105 is controlled by the memory control circuit 107 and the system control circuit 113.
- the image processing circuit 106 performs predetermined image interpolation processing or color conversion processing on the image data output from the AD converter 104 or the image data output from the memory control circuit 107.
- the image processing circuit 106 performs predetermined calculation processing using the input image data, and the system control circuit 113 controls the exposure control unit 114 and the distance measurement control unit 115 based on the obtained calculation result. I do.
- the memory control circuit 107 controls the AD converter 104, the timing generation circuit 105, the image processing circuit 106, the image display memory 108, the DA converter 109, the memory 111, and the resizing circuit 112.
- the image display memory 108 stores image data for display.
- the DA converter 109 acquires display image data from the image display memory 108 via the memory control circuit 107, and converts the digital signal into an analog signal.
- the image display unit 110 displays the display image data converted into an analog signal by the DA converter 109. Further, the image display unit 110 may receive information for specifying an area (initial area) in which an object to be tracked is shown from the user.
- the image display unit 110 is a display such as a TFT LCD (Thin Film Transistor Liquid Crystal Display), a touch panel, or the like.
- the memory 111 stores image data obtained from the AD converter 104 and image data processed by the image processing circuit 106. Further, the memory 111 stores information necessary for tracking processing such as the initial feature amount extracted by the initial feature amount extraction circuit 137.
- the memory 111 corresponds to the storage unit 14 in FIG.
- the resizing circuit 112 generates a low-resolution image from an image obtained by photographing. Note that the resizing circuit 112 can select a plurality of predetermined resolutions depending on the application.
- the resizing circuit 112 reads image data stored in the memory 111, performs resizing processing on the read image data, and writes the processed data to the memory 111.
- the resizing circuit 112 is used when it is desired to record image data on the recording medium 133 or the like with a pixel number (size) different from the pixel number of the image sensor 103. Further, the number of pixels that can be displayed by the image display unit 110 is considerably smaller than the number of pixels of the image sensor 103. Therefore, the resizing circuit 112 is also used when generating a display image when the captured image data is displayed on the image display unit 110. Similarly, the resizing circuit 112 is also used when generating an image (for example, an image having an image size such as QVGA) that is used when the blur detection circuit 139 detects blur.
- the system control circuit 113 performs photographing processing by controlling each processing unit and each processing circuit of the entire electronic camera 100.
- the photographing process includes an exposure process, a development process, a recording process, and the like.
- the exposure process is a process of writing image data read from the image sensor 103 into the memory 111 via the AD converter 104 and the memory control circuit 107.
- the development processing is arithmetic processing in the image processing circuit 106 and the memory control circuit 107.
- the recording process is a process of reading image data from the memory 111 and writing the image data to the recording medium 133.
- the exposure control unit 114 controls the shutter 102 having an aperture function.
- the exposure control unit 114 also has a flash light control function in cooperation with the flash 118.
- the ranging control unit 115 controls the focusing of the photographing lens 101.
- the zoom control unit 116 controls zooming of the photographing lens 101.
- the barrier control unit 117 controls the operation of the protection unit 119.
- the flash 118 irradiates the subject with flash light.
- the flash 118 further has an AF auxiliary light projecting function and a flash light control function.
- the protection unit 119 is a barrier that prevents the imaging unit from being soiled or damaged by covering the imaging unit including the photographing lens 101, the shutter 102, the imaging element 103, and the like of the electronic camera 100.
- the memory 120 records constants, variables, programs and the like for the operation of the system control circuit 113.
- the display unit 121 is a liquid crystal display device or a speaker that displays an operation state or a message using characters, images, sounds, or the like according to execution of a program in the system control circuit 113.
- the display unit 121 is installed at a single or a plurality of locations near the operation unit of the electronic camera 100 that are easily visible.
- the display unit 121 includes, for example, a combination of an LCD, an LED (Light Emitting Diode), or a sounding element.
- the non-volatile memory 122 is an electrically erasable and recordable memory, and stores operation setting data of the electronic camera 100 or user-specific information.
- the nonvolatile memory 122 is, for example, an EEPROM (Electrically Erasable and Programmable Read Only Memory).
- the mode dial switch 123 can switch and set each function mode such as an automatic shooting mode, a shooting mode, a panoramic shooting mode, and a RAW mode.
- the shutter switch 124 is turned on during the operation of a shutter button (not shown), and instructs to start operations such as AF processing, AE processing, and AWB (Auto White Balance) processing. In addition, the shutter switch 124 instructs the start of a series of processing operations including exposure processing, development processing, and recording processing.
- the power control unit 125 includes a battery detection circuit, a DC-DC converter, and a switch circuit that switches a block to be energized.
- the power control unit 125 detects the presence / absence of a battery, the type of battery, and the remaining battery level.
- the power supply control unit 125 further controls the DC-DC converter based on the detection result and an instruction from the system control circuit 113, and supplies the necessary voltage to each processing unit, including the recording medium 133, via the connectors 126 and 127.
- Connectors 126 and 127 are connectors for connecting the power supply control unit 125 and the power supply 128.
- the power supply 128 is a primary battery such as an alkaline battery or a lithium battery, a secondary battery such as a NiCd battery, a NiMH battery, or a Li battery, or an AC adapter.
- Interfaces 129 and 130 are interfaces for transmitting and receiving image data and the like between the recording medium 133 and the memory 111.
- Connectors 131 and 132 are connectors that connect the recording medium 133 via the interface 129 and the interface 130.
- the recording medium 133 is a recording medium such as a memory card or a hard disk that records image data.
- the optical viewfinder 134 is a viewfinder used by the photographer to check the subject. The photographer can take an image using only the optical viewfinder 134 without using the electronic viewfinder function of the image display unit 110.
- the communication unit 135 has various communication functions such as RS232C, USB, IEEE1394, modem, LAN, or wireless communication.
- the antenna 136 is a connector for connecting the electronic camera 100 to other devices by the communication unit 135 or an antenna for wireless communication.
- the initial feature quantity extraction circuit 137 extracts the initial feature quantity from the image data stored in the memory 111, and writes the extracted initial feature quantity into the memory 111.
- the coordinates of the area from which the initial feature amount is extracted are specified by the position received by the touch panel from the user, the AF area set by the user pressing the shutter switch 124, or the like.
- the initial feature quantity extraction circuit 137 corresponds to the initial feature quantity extraction unit 11 in FIG.
- the tracking processing circuit 138 reads the initial feature value from the memory 111 and performs the tracking process using the read initial feature value. Then, the tracking processing circuit 138 writes the tracking result (coordinate data, evaluation value, etc.) in the memory 111.
- the tracking processing circuit 138 corresponds to the tracking processing unit 13 in FIG.
- the blur detection circuit 139 detects a blur trajectory, and stores the detection result (the time from when the shutter is opened until it is closed and the values of the x coordinate and the y coordinate of the image of the object moving during that time) in the memory 111. Write.
- the shake detection circuit 139 corresponds to the shake detection unit 12 in FIG.
- the tracking result drawing circuit 140 processes the display image data written in the image display memory 108 in order to display the tracking result written in the memory 111 on the display unit 121. Specifically, the tracking result drawing circuit 140 performs processing on the image data such as drawing a tracking frame, mosaicing, superimposing characters, changing the display color, and blurring.
- the camera control circuit 141 controls the AF process based on the position and size of the tracking result (target area) written in the memory 111 so that the target in the target area is in focus. Specifically, the camera control circuit 141 controls the photographing lens 101 so that the contrast becomes high by using, for example, the contrast of the object in the target area.
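A contrast-based AF search of the kind described (drive the lens so that contrast in the target region becomes high) can be sketched as a toy hill climb. The focus measure and the lens-stepping interface `capture_at` are hypothetical names introduced for illustration, not the patent's interfaces.

```python
import numpy as np

def contrast_measure(gray):
    """Simple focus measure: energy of the second differences (higher = sharper)."""
    dxx = gray[:, 2:] - 2 * gray[:, 1:-1] + gray[:, :-2]
    dyy = gray[2:, :] - 2 * gray[1:-1, :] + gray[:-2, :]
    return float((dxx ** 2).mean() + (dyy ** 2).mean())

def hill_climb_focus(capture_at, positions):
    """Step the lens through candidate positions and keep the one with peak
    contrast in the captured target region."""
    best_pos, best_score = None, -1.0
    for pos in positions:
        score = contrast_measure(capture_at(pos))
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos
```

A real implementation would step the photographing lens 101 via the distance measurement control unit 115 rather than exhaustively scanning all positions.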
- the camera control circuit 141 may control the AE process or the backlight correction process so that the exposure of the object in the target area is appropriate. Specifically, the camera control circuit 141 may control the shutter speed or the aperture value of the shutter 102 having the aperture function via the exposure control unit 114, according to a mode such as shutter speed priority or aperture priority.
- the camera control circuit 141 may also control the electronic camera 100 so that the object appears at a certain position or at a certain size in the image (for example, so that the face that is the object appears in the center of the image, or so that the whole body of the person that is the object is photographed).
- note that the system control circuit 113 may perform the tracking processing by software processing.
- FIG. 3 is a flowchart showing the operation of the image processing apparatus according to the first embodiment of the present invention.
- first, the initial feature quantity extraction unit 11 acquires information on the initial region, which is the region in which the object appears in the input image at the time the object is designated, and extracts from the acquired initial region an initial feature amount quantitatively indicating the feature of the image of the object (step S101).
- the initial feature quantity extraction unit 11 registers the extracted initial feature quantity and information indicating the position and size of the initial region in the storage unit 14 (step S102).
- the image processing apparatus 10 repeats the process of specifying the target area that is the area in which the target object is shown in the input image after the input image from which the initial feature value is extracted.
- the process of specifying the target area is as follows.
- the tracking processing unit 13 determines whether or not the previous input image that is temporally continuous was a blurred image (step S103). Specifically, for example, the tracking processing unit 13 determines whether or not flag information indicating whether or not the immediately preceding input image that is temporally continuous is a blurred image is “1”.
- when the immediately preceding input image was a blurred image (Yes in step S103), the tracking processing unit 13 determines a region near the position corresponding to the end point of the blur locus as the search region (step S104). Specifically, for example, the tracking processing unit 13 determines a region centered on the position indicated by the blur end point information as the search region.
- when the immediately preceding input image was not a blurred image (No in step S103), the tracking processing unit 13 determines a region near the target region specified in the previous input image as the search region (step S105). Specifically, for example, the tracking processing unit 13 reads from the storage unit 14 the information indicating the center position and size of the target region specified in the previous input image, and then determines, as the search region, a region whose center matches the read center position and that is larger than the read target region. Note that when the read center position is at the edge of the input image, for example, the center position of the search region does not necessarily match the read center position.
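Step S105 can be sketched as growing the previous target region by a margin and clipping it to the frame. The symmetric margin is an assumption; the patent only requires the search region to share the center and be larger than the target region.

```python
def expand_region(target, margin, image_size):
    """Grow the previous target region by `margin` pixels on each side,
    clipped to the image bounds (so the center may shift at the edges)."""
    x, y, w, h = target
    img_w, img_h = image_size
    nx, ny = max(x - margin, 0), max(y - margin, 0)
    nw = min(x + w + margin, img_w) - nx
    nh = min(y + h + margin, img_h) - ny
    return (nx, ny, nw, nh)
```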
- next, the blur detection unit 12 detects the blur locus in the search region determined by the tracking processing unit 13 (step S106). Since the blur may depend on the movement of the object, it is difficult to obtain the blur locus using sensor information from a gyro sensor or an acceleration sensor attached to the camera. Therefore, the blur detection unit 12 detects the blur locus using only the information of a single input image. Specifically, the blur detection unit 12 detects, as the blur locus, a point spread function (PSF) calculated using the pixel values of the pixels constituting the search region (see, for example, Non-Patent Document 1: "High-Quality Motion Deblurring From a Single Image" (Qi Shan et al., SIGGRAPH 2008)).
- for example, the storage unit 14 stores in advance the distribution of image gradients that appears in general natural images without blur. The blur detection unit 12 then repeatedly compares the image gradient distribution of the search region corrected using a candidate point spread function with the image gradient distribution stored in the storage unit 14, thereby searching for a point spread function for which the two distributions are the same or similar. The blur detection unit 12 detects the point spread function found in this way as the blur locus.
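A toy version of this search may help fix ideas: restrict the candidates to linear motion kernels, "correct" the region with a crude frequency-domain deconvolution, and score each result with a gradient-sparsity statistic standing in for the stored natural-image gradient distribution. The kernel family, the regularized deconvolution, and the sparsity score are all assumptions; the cited paper uses a far more elaborate optimization.

```python
import numpy as np

def motion_kernel(length, angle_deg, size=15):
    """Linear motion-blur PSF: a normalized line segment of given length/angle."""
    k = np.zeros((size, size))
    c = size // 2
    for t in np.linspace(-(length - 1) / 2, (length - 1) / 2, max(length * 4, 2)):
        x = int(round(c + t * np.cos(np.radians(angle_deg))))
        y = int(round(c + t * np.sin(np.radians(angle_deg))))
        k[y, x] = 1.0
    return k / k.sum()

def wiener_deconv(img, psf, eps=1e-2):
    """Crude frequency-domain deconvolution with Tikhonov regularization."""
    H = np.fft.fft2(psf, s=img.shape)
    G = np.fft.fft2(img)
    return np.real(np.fft.ifft2(np.conj(H) * G / (np.abs(H) ** 2 + eps)))

def gradient_sparsity(img):
    """Sharp natural images have heavy-tailed (sparse) gradients; a smaller
    mean |gradient| relative to its spread suggests a better restoration."""
    gx = np.diff(img, axis=1)
    return float(np.abs(gx).mean() / (gx.std() + 1e-9))

def estimate_blur_psf(img, lengths=(1, 5, 9), angles=(0, 45, 90, 135)):
    """Search the small PSF family for the candidate whose deconvolution best
    matches sparse-gradient statistics; return its (length, angle)."""
    best = min((gradient_sparsity(wiener_deconv(img, motion_kernel(l, a))), l, a)
               for l in lengths for a in angles)
    return best[1], best[2]
```

The winning kernel's line segment plays the role of the blur locus; its two ends are the end points used in the following steps.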
- the blur detection unit 12 determines whether or not the length of the detected blur locus is equal to or less than a threshold value (step S107). That is, the blur detection unit 12 determines whether or not the input image is a blur image.
- the threshold value is a predetermined value, for example.
- When the input image is determined not to be a blurred image, the tracking processing unit 13 searches the search region determined in step S104 or S105 for the region whose feature amount is most similar to the initial feature amount stored in the storage unit 14, and specifies the region with the most similar feature amount as the target region (step S108). In other words, when the input image is not a blurred image, the image processing apparatus 10 specifies the target region without using the blur locus. The tracking processing unit 13 further registers the target region information in the storage unit 14 and sets "0" in the flag information indicating whether the image is a blurred image.
- Otherwise, the blur detection unit 12 specifies, of the two end points of the blur locus, the one farther from the position corresponding to the target region specified in the immediately preceding input image as the blur end point (step S109). Subsequently, the blur detection unit 12 registers the blur end point information in the storage unit 14 and sets "1" in the flag information indicating whether the image is a blurred image.
- The tracking processing unit 13 compares the feature amount of the region including each of the two end points of the blur locus with the initial feature amount stored in the storage unit 14, and specifies the region whose feature amount is more similar as the target region in the input image (step S110).
- By repeating the above steps S103 to S110 for each continuously captured input image, the image processing apparatus 10 can track the image of the object.
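The per-frame decision of steps S105 to S110 can be sketched as a small, runnable state machine. The feature search of step S108 and the PSF detection of step S106 are stubbed out (the blur locus is passed in directly as its two end points), so only the control flow, flag handling, and end-point selection mirror the text; every name here is illustrative.

```python
import math

def farther_end_point(locus, prev_center):
    """S109: of the two end points, pick the one farther from the previous
    target position -- i.e., the direction in which the object moved."""
    p, q = locus
    return p if math.dist(p, prev_center) > math.dist(q, prev_center) else q

def locus_length(locus):
    p, q = locus
    return math.dist(p, q)

def process_frame(locus, state, threshold=2.0):
    """locus: the blur locus detected in the search region (S106),
    given as its two end points. state tracks the flag and target center."""
    if locus_length(locus) <= threshold:
        # S107 yes / S108: not a blurred image -- a feature search (stubbed
        # out here) would locate the target; flag is cleared to 0.
        return dict(state, blurred=False)
    # S107 no / S109-S110: blurred image -- the farther end point becomes
    # the blur end point and the next frame's search anchor; flag set to 1.
    end = farther_end_point(locus, state["center"])
    return dict(state, blurred=True, center=end)

state = {"center": (10, 10), "blurred": False}
state = process_frame(((10, 10), (20, 10)), state)  # long locus -> blurred
print(state)  # {'center': (20, 10), 'blurred': True}
```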
- FIG. 4 is a diagram for explaining specific examples of various operations in the image processing apparatus when a blurred image is not included in the input image.
- The initial feature quantity extraction unit 11 acquires the initial region 401, the region of the input image 400 in which the object appears, at the time the object is specified.
- the initial feature amount extraction unit 11 generates an initial color histogram 402 that is a color histogram of the acquired initial region 401.
- the initial feature amount extraction unit 11 registers the side length and center position coordinates of the initial region 401 and the initial color histogram 402 in the storage unit 14.
- the color histogram is information indicating the distribution of the frequency of colors of a plurality of pixels constituting an image or a partial region of the image.
- the horizontal axis of the color histogram indicates the color classification when the hue (H) value (0 to 360) in the HSV (Hue Saturation Value) color space is divided into 20 parts.
- the vertical axis of the color histogram indicates the number of pixels (frequency) belonging to each section. Which color category each pixel belongs to may be determined using an integer part of a value obtained by dividing the hue (H) value of each pixel by the number of categories.
- The number of divisions need not be 20 and may be any number of 1 or more. However, the more colors the initial region contains, the larger the number of divisions should preferably be. When the initial region contains many colors, a similar region can then be searched for using the frequency of each fine division, which improves the accuracy of specifying the target region. Conversely, when the initial region contains few colors, the frequencies of coarser divisions are stored, which reduces the amount of memory used.
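A minimal sketch of the 20-bin hue histogram described above, assuming 8-bit RGB input. The text says the bin is chosen from "the integer part of a value obtained by dividing the hue value by the number of categories"; here we read that as dividing the hue by the bin width (360/20 = 18 degrees), the standard interpretation. The function name is ours.

```python
import colorsys

def hue_histogram(pixels, bins=20):
    """pixels: iterable of (r, g, b) tuples with 0-255 components.
    Returns the per-bin pixel counts over the HSV hue circle."""
    hist = [0] * bins
    width = 360.0 / bins                       # 18 degrees per division
    for r, g, b in pixels:
        h, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        hue = h * 360.0                        # hue value in 0..360
        idx = min(int(hue / width), bins - 1)  # integer part selects the bin
        hist[idx] += 1
    return hist

hist = hue_histogram([(255, 0, 0), (255, 0, 0), (0, 255, 0)])
# red (hue 0) lands in bin 0; green (hue 120) lands in bin 120/18 = 6
```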
- The tracking processing unit 13 determines a search region 411 in the next input image 410, which is temporally continuous with the input image 400. Specifically, since the previous input image 400 is not a blurred image, the tracking processing unit 13 determines as the search region 411 a rectangular region that contains the target region in the input image 400 (here, the initial region 401) and is larger than it. More specifically, the tracking processing unit 13 reads the side length and the center position coordinates of the initial region 401 from the storage unit 14, and determines as the search region a rectangular region centered on the read center position whose sides are longer than the read side length.
- the shape of the search area does not have to be a rectangle, and may be an arbitrary shape such as a circle or a hexagon.
- the size of the search area may be a predetermined size, or may be a size that increases as the frame rate or the shutter speed decreases.
- The tracking processing unit 13 selects a selection region 412, a region within the search region 411 that is smaller than the search region 411, and extracts a selection color histogram 413, the color histogram of the selected selection region 412. At this time, the selection color histogram 413 is preferably normalized using the initial color histogram 402; specifically, the tracking processing unit 13 preferably normalizes it by dividing the frequency of each division of the selection region's color histogram by the frequency of the corresponding division of the initial color histogram 402.
- the tracking processing unit 13 calculates the ratio of the overlapping portion 420 where the initial color histogram 402 and the selection color histogram 413 overlap as the similarity. Specifically, the tracking processing unit 13 calculates the similarity according to (Equation 1).
- Ri is the frequency of the i-th segment in the initial color histogram 402.
- Ii is the frequency of the i-th segment in the selection color histogram 413.
- i is a value from 0 to 19. Note that the larger the proportion of the overlapping portion 420, the higher the similarity, and the smaller the proportion, the lower the similarity.
- The tracking processing unit 13 thus repeats the selection of a selection region 412 and the extraction of its selection color histogram 413 while changing the position and size within the search region 411, and specifies the selection region 412 whose color histogram has the largest proportion of the overlapping portion 420 as the target region. The tracking processing unit 13 then registers the side length and the center position coordinates of the target region in the storage unit 14.
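The similarity and search just described can be sketched as follows. The exact form of (Equation 1) is not reproduced in this text, so the common histogram-intersection ratio is used here as a stand-in for the "ratio of the overlapping portion"; the function names are ours.

```python
def similarity(initial_hist, selected_hist):
    """Ratio of the portion where the two histograms overlap."""
    overlap = sum(min(r, i) for r, i in zip(initial_hist, selected_hist))
    total = sum(initial_hist) or 1
    return overlap / total

def best_candidate(initial_hist, candidate_hists):
    """Index of the candidate region whose histogram overlaps the
    initial histogram the most -- chosen as the target region."""
    scores = [similarity(initial_hist, h) for h in candidate_hists]
    return max(range(len(scores)), key=scores.__getitem__)

init = [5, 3, 0, 2]
cands = [[5, 3, 0, 2], [0, 0, 4, 6], [4, 2, 1, 3]]
print(best_candidate(init, cands))  # 0: the identical histogram wins
```

In the actual search, the candidate histograms would come from selection regions of varying position and size within the search region.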
- the image processing apparatus 10 specifies the target region in each of the input images temporally after the input image 400 without using the blur locus.
- FIG. 5 is a diagram for explaining a specific example of the operation in the image processing apparatus according to the first embodiment of the present invention.
- the second input image 510 is a blurred image.
- the operation of the image processing apparatus 10 in this case is shown.
- In the first input image 500, a target region 501 is specified as the region in which the object to be tracked appears. That is, the storage unit 14 stores the side length and the center position coordinates of the target region 501.
- the image processing apparatus 10 starts image processing of the second input image 510.
- The tracking processing unit 13 reads from the storage unit 14 the side length and the center position coordinates of the target region 501 in the first input image 500. Then, as illustrated in FIG. 5B, the tracking processing unit 13 determines as the search region 511 a rectangular region in the second input image 510 that contains the coordinates of the read center position and whose sides are longer than the read side length.
- the blur detection unit 12 detects the blur locus 512 in the search area 511.
- Since blurring has occurred in the image of the object, a blur locus 512 consisting of a curve with two end points 513 and 514 is detected, as shown in FIG. 5B.
- The blur detection unit 12 specifies the end point 514, the one of the end points 513 and 514 farther from the center position of the target region 501, as the blur end point. That is, the blur detection unit 12 determines that the object has moved in the direction from the end point 513 toward the end point 514. The blur detection unit 12 then registers the coordinates in the second input image 510 corresponding to the blur end point in the storage unit 14 as the blur end point information, and sets "1" in the flag information indicating whether the image is a blurred image.
- the tracking processing unit 13 extracts a color histogram in regions 515 and 516 including the two end points 513 and 514 of the blur locus 512, respectively. Then, the tracking processing unit 13 calculates the similarity between the extracted color histogram and the initial color histogram stored in the storage unit 14. As a result, the tracking processing unit 13 identifies an area having a color histogram with a high degree of similarity (for example, the area 515) as a target area in the second input image 510.
- the image processing apparatus 10 starts image processing of the third input image 520.
- the tracking processing unit 13 determines that the second input image 510 is a blurred image because the flag information is “1”. Therefore, the tracking processing unit 13 reads out the coordinates of the blur end point in the second input image 510 from the storage unit 14. Then, as illustrated in FIG. 5C, the tracking processing unit 13 determines a rectangular area including the read coordinates as the search area 521 in the third input image 520.
- The blur detection unit 12 detects a blur locus in the search region 521.
- Here, the length of the blur locus is equal to or less than the threshold, so the blur detection unit 12 sets "0" in the flag information.
- the tracking processing unit 13 searches the search area 521 for an area having a color histogram having the largest proportion of the portion overlapping the initial color histogram. As a result of the search, the tracking processing unit 13 specifies an area having a color histogram with the largest proportion of overlapping parts as the target area 522.
- As described above, when some of the plurality of input images are blurred images, the image processing apparatus 10 according to this embodiment determines the search region in the input image following a blurred image using the end point of the blur locus. As a result, the image processing apparatus 10 can narrow the search region down to an area centered on the position where the object is most likely to appear, so it can track the image of the object stably while reducing the load of the search processing. In particular, when the feature amount of the object's image fluctuates because of blur caused by sudden movement of the object or of the electronic camera by the user, the image processing apparatus 10 can stably track the object's image in the input images captured after the blurred image.
- the image processing apparatus 10 can specify the position of the end point of the blur locus, which is difficult to estimate from one image, with high accuracy by using the immediately preceding image that is temporally continuous. By using the position of the end point of the blur locus specified in this way, the image processing apparatus 10 can track the object more stably.
- Furthermore, in a blurred image, the image processing apparatus 10 can specify the target region by comparing the feature amount of the region including each of the two end points of the blur locus with the initial feature amount. The image processing apparatus 10 can therefore reduce the load of the search processing compared with searching the entire search region.
- the image processing apparatus 10 specifies the target area using the blur locus only when the image is a blur image. Therefore, the image processing apparatus 10 can track the object more stably.
- The image processing apparatus 20 according to Embodiment 2 is the same as the image processing apparatus 10 according to Embodiment 1 except for part of the operations of the blur detection unit and the tracking processing unit. The following description therefore refers to the block diagram of FIG. 1, which shows the same functional configuration as Embodiment 1.
- The blur detection unit 22 detects a blur locus when the shutter speed or frame rate at which the input image was captured is smaller than a threshold. The blur detection unit 22 then specifies the end point of the detected blur locus.
- the tracking processing unit 23 specifies a region including a position corresponding to the end point of the blur locus identified by the blur detection unit 22 as a target region in the input image.
- FIG. 6 is a flowchart showing the operation of the image processing apparatus according to the second embodiment of the present invention.
- Processes having the same contents as those in the flowchart of Embodiment 1 are given the same step numbers, and their description is omitted here.
- the blur detection unit 22 acquires the shutter speed or frame rate when the input image is taken (step S201). Then, the blur detection unit 22 determines whether or not the acquired shutter speed or frame rate is equal to or higher than a threshold value (step S202).
- The threshold is a predetermined value: roughly, the boundary value of shutter speed or frame rate below which the image of the object becomes likely to blur.
- When the shutter speed or frame rate is equal to or higher than the threshold, the tracking processing unit 23 searches the search region determined in step S105 for the region whose feature amount is most similar to the color histogram of the initial region stored in the storage unit 14, and specifies the region with the most similar feature amount as the target region (step S108). That is, when the shutter speed or frame rate is high, a blurred image is unlikely, so the image processing apparatus 20 specifies the target region without detecting the blur locus.
- Otherwise, the blur detection unit 22 detects a blur locus in the search region determined by the tracking processing unit 23 (step S106). That is, when the shutter speed or frame rate is low, the input image is likely to be a blurred image, so the blur detection unit 22 detects the blur locus.
- The blur detection unit 22 specifies, of the two end points of the blur locus, the one farther from the position corresponding to the target region specified in the temporally continuous immediately preceding image as the end point of the blur locus (step S109).
- the tracking processing unit 23 specifies a region including a position corresponding to the end point of the blur locus identified by the blur detection unit 22 as a target region (step S203).
- As described above, when the plurality of input images includes blurred images, the image processing apparatus 20 according to this embodiment specifies the region including the position of the end point of the blur locus as the target region in each blurred image. As a result, the image processing apparatus 20 can specify the target region easily, without extracting or comparing feature amounts. In particular, for a sequence of input images containing many blurred images, such as one shot while moving around the object in a dark environment, the image processing apparatus 20 can track the object stably even in the blurred images.
- Furthermore, the image processing apparatus 20 specifies the region including the position corresponding to the end point of the blur locus as the target region only when the shutter speed or frame rate is low. That is, the image processing apparatus 20 can switch whether to use the blur locus according to the shutter speed or frame rate, and can therefore reduce the processing load of detecting the blur locus.
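The Embodiment 2 switch can be sketched in a few lines. Frame rate is used here for concreteness; per the text, shutter speed would be handled the same way. Names and the threshold value are illustrative.

```python
def choose_strategy(frame_rate, threshold=15.0):
    """S202: pick the tracking path Embodiment 2 takes for this frame."""
    if frame_rate >= threshold:
        return "feature_search"   # S202 yes -> S108: blur unlikely,
                                  # skip blur-locus detection entirely
    return "blur_end_point"       # S202 no -> S106, S109, S203: detect the
                                  # locus and use its end point directly

print(choose_strategy(30.0))  # feature_search
print(choose_strategy(7.5))   # blur_end_point
```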
- The image processing apparatus and the electronic camera according to the present invention have been described above based on the embodiments.
- The present invention, however, is not limited to these embodiments. Forms obtained by applying various modifications conceivable by those skilled in the art, without departing from the spirit of the invention, are also included within the scope of the present invention.
- the blur detection unit detects the blur locus by calculating a point spread function in the search area of the input image, but may detect the blur locus by other methods.
- For example, the blur detection unit may calculate the blur locus using a plurality of images of the same subject that were captured with the same shutter speed or frame rate as the input image, around the time the input image was captured.
- In this case, the direction of the blur locus is calculated together with the blur locus, so the image processing apparatus or the electronic camera can easily specify the end point of the blur locus.
- In the above embodiments, the initial feature amount extraction unit or the tracking processing unit extracts a color histogram as the feature amount quantitatively indicating the feature of a region in the input image, but a luminance histogram may be extracted instead. In that case, the tracking processing unit calculates the similarity by comparing the luminance histogram obtained from the initial region with the luminance histogram obtained from the selected region.
- the initial feature amount extraction unit or the tracking processing unit may extract the luminance itself as a feature amount and search for a similar region by template matching using the extracted luminance.
- A system LSI is an ultra-multifunctional LSI manufactured by integrating a plurality of components on one chip; specifically, it is a computer system including a microprocessor, a ROM (Read Only Memory), a RAM (Random Access Memory), and so on. For example, as shown in FIG. 1, the initial feature quantity extraction unit 11, the blur detection unit 12 or 22, and the tracking processing unit 13 or 23 may be configured as one system LSI 30.
- The present invention may be realized not only as the image processing apparatus described above but also as an electronic camera including the characteristic components of the image processing apparatus as shown in FIG. Furthermore, the present invention may be realized as an image processing method whose steps are the operations of the characteristic components of the image processing apparatus, or as a program that causes a computer to execute those steps.
- The electronic camera, image processing apparatus, and the like according to the present invention track an image of an object by specifying the region in which the object to be tracked appears, and are useful for digital video cameras, digital still cameras, surveillance cameras, in-vehicle cameras, and mobile phones with a camera function.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
Description
The image processing apparatus according to Embodiment 1 of the present invention is characterized in that, when the length of the blur locus is greater than the threshold, it determines the position of the search region in the next input image temporally continuous with the blurred image using the end point of the blur locus.
Next, an image processing apparatus according to Embodiment 2 of the present invention will be described.
11 Initial feature quantity extraction unit
12, 22 Blur detection unit
13, 23 Tracking processing unit
14 Storage unit
30 System LSI
100 Electronic camera
101 Photographing lens
102 Shutter
103 Image sensor
104 A/D converter
105 Timing generation circuit
106 Image processing circuit
107 Memory control circuit
108 Image display memory
109 D/A converter
110 Image display unit
111, 120 Memory
112 Resize circuit
113 System control circuit
114 Exposure control unit
115 Distance measurement control unit
116 Zoom control unit
117 Barrier control unit
118 Flash
119 Protection unit
121 Display unit
122 Nonvolatile memory
123 Mode dial switch
124 Shutter switch
125 Power supply control unit
126, 127, 131, 132 Connector
128 Power supply
129, 130 Interface
133 Recording medium
134 Optical viewfinder
135 Communication unit
136 Antenna
137 Initial feature quantity extraction circuit
138 Tracking processing circuit
139 Blur detection circuit
140 Tracking result drawing circuit
141 Camera control circuit
400, 410 Input image
401 Initial region
402 Initial color histogram
411, 511, 521 Search region
412 Selection region
413 Selection color histogram
420 Overlapping portion
500 First input image
501, 522 Target region
510 Second input image
512 Blur locus
513, 514 End point
515, 516 Region
520 Third input image
Claims (12)
- An electronic camera that tracks an image of an object to be tracked in an input image and uses the tracking result to execute at least one of autofocus processing, automatic exposure processing, and backlight correction processing, the electronic camera comprising: a blur detection unit that detects, in the input image, a blur locus, which is a locus indicating the blur of a search region, the search region being a region that is determined using an image captured temporally before the input image and that includes the image of the object; and a tracking processing unit that tracks the image of the object by using an end point of the blur locus detected by the blur detection unit to specify a target region, which is a region that includes the image of the object and is smaller than the search region.
- The electronic camera according to claim 1, further comprising a storage unit that stores an initial feature amount quantitatively indicating a feature of the image of the object, wherein the blur detection unit further specifies the end point of the detected blur locus, and the tracking processing unit determines, in the next input image temporally continuous with the input image, a region including the position corresponding to the end point of the blur locus specified by the blur detection unit as the search region, and specifies the target region in the next input image by searching the determined search region for the region having the feature amount most similar to the initial feature amount stored in the storage unit.
- The electronic camera according to claim 2, wherein the blur detection unit specifies, of the two end points of the blur locus, the end point farther from the position corresponding to the target region specified in the immediately preceding input image temporally continuous with the input image as the end point of the blur locus.
- The electronic camera according to claim 2, wherein the tracking processing unit specifies, as the target region in the input image, the region having the feature amount most similar to the initial feature amount stored in the storage unit among the regions each including one of the two end points of the blur locus.
- The electronic camera according to claim 2, wherein the tracking processing unit specifies the target region in the input image using the blur locus when the length of the blur locus is greater than a threshold, and specifies the target region in the input image by searching the search region for the region having the feature amount most similar to the initial feature amount stored in the storage unit when the length of the blur locus is equal to or less than the threshold.
- The electronic camera according to claim 1, wherein the blur detection unit further specifies the end point of the detected blur locus, and the tracking processing unit specifies a region including the position corresponding to the end point of the blur locus specified by the blur detection unit as the target region in the input image.
- The electronic camera according to claim 6, wherein the blur detection unit specifies, of the two end points of the blur locus, the end point farther from the position corresponding to the target region specified in the immediately preceding input image temporally continuous with the input image as the end point of the blur locus.
- The electronic camera according to claim 6, further comprising a storage unit that stores an initial feature amount quantitatively indicating a feature of the region in which the object appears, wherein the blur detection unit detects the blur locus when the shutter speed or frame rate at which the input image was captured is smaller than a threshold, and the tracking processing unit specifies a region including the position corresponding to the end point of the blur locus specified by the blur detection unit as the target region in the input image when the shutter speed or frame rate at which the input image was captured is smaller than the threshold, and, when the shutter speed or frame rate at which the input image was captured is equal to or greater than the threshold, determines a region including the position corresponding to the target region specified in the immediately preceding input image temporally continuous with the input image as the search region and specifies the target region in the input image by searching the determined search region for the region having the feature amount most similar to the initial feature amount stored in the storage unit.
- An image processing apparatus that tracks an image of an object to be tracked in an input image, the image processing apparatus comprising: a blur detection unit that detects, in the input image, a blur locus, which is a locus indicating the blur of a search region, the search region being a region that is determined using an image captured temporally before the input image and that includes the image of the object; and a tracking processing unit that tracks the image of the object by using an end point of the blur locus detected by the blur detection unit to specify a target region, which is a region that includes the image of the object and is smaller than the search region.
- An image processing method for tracking an image of an object to be tracked in an input image, the image processing method including: a blur detection step of detecting, in the input image, a blur locus, which is a locus indicating the blur of a search region, the search region being a region that is determined using an image captured temporally before the input image and that includes the image of the object; and a target region specification step of tracking the image of the object by using an end point of the blur locus detected in the blur detection step to specify a target region, which is a region that includes the image of the object and is smaller than the search region.
- An integrated circuit that tracks an image of an object to be tracked in an input image, the integrated circuit comprising: a blur detection unit that detects, in the input image, a blur locus, which is a locus indicating the blur of a search region, the search region being a region that is determined using an image captured temporally before the input image and that includes the image of the object; and a tracking processing unit that tracks the image of the object by using an end point of the blur locus detected by the blur detection unit to specify a target region, which is a region that includes the image of the object and is smaller than the search region.
- A program for tracking an image of an object to be tracked in an input image, the program causing a computer to execute: a blur detection step of detecting, in the input image, a blur locus, which is a locus indicating the blur of a search region, the search region being a region that is determined using an image captured temporally before the input image and that includes the image of the object; and a target region specification step of tracking the image of the object by using an end point of the blur locus detected in the blur detection step to specify a target region, which is a region that includes the image of the object and is smaller than the search region.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/999,833 US20110090345A1 (en) | 2009-05-07 | 2010-04-22 | Digital camera, image processing apparatus, and image processing method |
EP10772106A EP2429177A1 (en) | 2009-05-07 | 2010-04-22 | Electron camera, image processing device, and image processing method |
CN2010800017799A CN102057665A (zh) | 2009-05-07 | 2010-04-22 | 电子照相机、图象处理装置及图象处理方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009112996A JP5054063B2 (ja) | 2009-05-07 | 2009-05-07 | 電子カメラ、画像処理装置及び画像処理方法 |
JP2009-112996 | 2009-05-07 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010128579A1 true WO2010128579A1 (ja) | 2010-11-11 |
Family
ID=43050090
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/002916 WO2010128579A1 (ja) | 2009-05-07 | 2010-04-22 | 電子カメラ、画像処理装置及び画像処理方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20110090345A1 (ja) |
EP (1) | EP2429177A1 (ja) |
JP (1) | JP5054063B2 (ja) |
KR (1) | KR20120022512A (ja) |
CN (1) | CN102057665A (ja) |
WO (1) | WO2010128579A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103442175A (zh) * | 2013-09-02 | 2013-12-11 | 百度在线网络技术(北京)有限公司 | 移动终端的拍照控制方法、装置和移动终端 |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5720275B2 (ja) * | 2011-02-03 | 2015-05-20 | 株式会社リコー | 撮像装置および撮像方法 |
KR101870729B1 (ko) | 2011-09-01 | 2018-07-20 | 삼성전자주식회사 | 휴대용 단말기의 번역 트리구조를 이용한 번역장치 및 방법 |
KR20140011215A (ko) * | 2012-07-18 | 2014-01-28 | 삼성전자주식회사 | 촬영 장치, 그의 촬영 제어 방법 및 안구 인식 장치 |
US9406143B2 (en) * | 2013-02-21 | 2016-08-02 | Samsung Electronics Co., Ltd. | Electronic device and method of operating electronic device |
US9454827B2 (en) | 2013-08-27 | 2016-09-27 | Qualcomm Incorporated | Systems, devices and methods for tracking objects on a display |
JP6038965B2 (ja) * | 2014-01-14 | 2016-12-07 | 有限会社パパラボ | 着色検査装置および着色検査方法 |
JP6380523B2 (ja) * | 2014-02-26 | 2018-08-29 | 株式会社ソシオネクスト | 画像認識システムおよび半導体集積回路 |
JP2016006408A (ja) * | 2014-05-26 | 2016-01-14 | 有限会社パパラボ | ウェアラブル着色評価装置及びウェアラブルウェアラブル着色評価方法 |
EP3192254A4 (en) * | 2014-09-09 | 2018-04-25 | Hewlett-Packard Development Company, L.P. | Color calibration |
TWI578780B (zh) * | 2014-10-02 | 2017-04-11 | 晶睿通訊股份有限公司 | 模糊影像偵測方法及其相關攝影機和影像處理系統 |
JP6789620B2 (ja) * | 2015-10-08 | 2020-11-25 | キヤノン株式会社 | 画像処理装置及びその制御方法、コンピュータプログラム |
JP6833461B2 (ja) * | 2015-12-08 | 2021-02-24 | キヤノン株式会社 | 制御装置および制御方法、撮像装置 |
KR101906847B1 (ko) * | 2016-06-29 | 2018-10-12 | 주식회사 크리에이츠 | 공 이미지 촬영을 위한 관심 영역을 결정하기 위한 방법, 시스템 및 비일시성의 컴퓨터 판독 가능한 기록 매체 |
US10867491B2 (en) * | 2016-10-24 | 2020-12-15 | Signify Holding B.V. | Presence detection system and method |
CN107124538B (zh) * | 2017-05-27 | 2019-11-22 | 维沃移动通信有限公司 | 一种拍照方法及移动终端 |
CN109615631A (zh) * | 2018-10-17 | 2019-04-12 | 平安普惠企业管理有限公司 | 图片特定区域提取方法、装置、计算机设备及存储介质 |
CN109495626B (zh) * | 2018-11-14 | 2021-03-16 | 高劭源 | 一种用于便携式移动通讯设备的拍摄辅助装置及系统 |
CN111339855B (zh) * | 2020-02-14 | 2023-05-23 | 睿魔智能科技(深圳)有限公司 | 基于视觉的目标跟踪方法、系统、设备及存储介质 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04343314A (ja) * | 1991-05-20 | 1992-11-30 | Olympus Optical Co Ltd | 顕微鏡 |
JPH0883392A (ja) * | 1994-09-14 | 1996-03-26 | Toshiba Corp | 車両検出方法及び車両検出装置 |
JP2001060265A (ja) * | 1999-08-24 | 2001-03-06 | Sony Corp | 画像処理装置および方法、並びに媒体 |
JP2003250804A (ja) * | 2002-03-05 | 2003-09-09 | Toshiba Corp | 画像処理装置及び超音波診断装置 |
JP2005027076A (ja) * | 2003-07-03 | 2005-01-27 | Nikon Corp | 電子カメラ |
JP2005236508A (ja) * | 2004-02-18 | 2005-09-02 | Matsushita Electric Ind Co Ltd | 自動追尾装置及び自動追尾方法 |
JP2009048428A (ja) | 2007-08-20 | 2009-03-05 | Secom Co Ltd | 移動物体追跡装置 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6614914B1 (en) * | 1995-05-08 | 2003-09-02 | Digimarc Corporation | Watermark embedder and reader |
US7710498B2 (en) * | 2004-02-13 | 2010-05-04 | Sony Corporation | Image processing apparatus, image processing method and program |
CN100490505C (zh) * | 2004-02-13 | 2009-05-20 | 索尼株式会社 | 图像处理装置和图像处理方法 |
EP1624672A1 (en) * | 2004-08-07 | 2006-02-08 | STMicroelectronics Limited | A method of determining a measure of edge strength and focus |
US7538813B2 (en) * | 2005-05-11 | 2009-05-26 | Sony Ericsson Mobile Communications Ab | Digital cameras with triangulation autofocus systems and related methods |
US20080278589A1 (en) * | 2007-05-11 | 2008-11-13 | Karl Ola Thorn | Methods for identifying a target subject to automatically focus a digital camera and related systems, and computer program products |
US8964056B2 (en) * | 2011-06-30 | 2015-02-24 | Cisco Technology, Inc. | Encoder-supervised imaging for video cameras |
-
2009
- 2009-05-07 JP JP2009112996A patent/JP5054063B2/ja not_active Expired - Fee Related
-
2010
- 2010-04-22 WO PCT/JP2010/002916 patent/WO2010128579A1/ja active Application Filing
- 2010-04-22 EP EP10772106A patent/EP2429177A1/en not_active Withdrawn
- 2010-04-22 US US12/999,833 patent/US20110090345A1/en not_active Abandoned
- 2010-04-22 CN CN2010800017799A patent/CN102057665A/zh active Pending
- 2010-04-22 KR KR1020107029004A patent/KR20120022512A/ko not_active Application Discontinuation
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04343314A (ja) * | 1991-05-20 | 1992-11-30 | Olympus Optical Co Ltd | 顕微鏡 |
JPH0883392A (ja) * | 1994-09-14 | 1996-03-26 | Toshiba Corp | 車両検出方法及び車両検出装置 |
JP2001060265A (ja) * | 1999-08-24 | 2001-03-06 | Sony Corp | 画像処理装置および方法、並びに媒体 |
JP2003250804A (ja) * | 2002-03-05 | 2003-09-09 | Toshiba Corp | 画像処理装置及び超音波診断装置 |
JP2005027076A (ja) * | 2003-07-03 | 2005-01-27 | Nikon Corp | 電子カメラ |
JP2005236508A (ja) * | 2004-02-18 | 2005-09-02 | Matsushita Electric Ind Co Ltd | 自動追尾装置及び自動追尾方法 |
JP2009048428A (ja) | 2007-08-20 | 2009-03-05 | Secom Co Ltd | 移動物体追跡装置 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103442175A (zh) * | 2013-09-02 | 2013-12-11 | 百度在线网络技术(北京)有限公司 | 移动终端的拍照控制方法、装置和移动终端 |
Also Published As
Publication number | Publication date |
---|---|
EP2429177A1 (en) | 2012-03-14 |
KR20120022512A (ko) | 2012-03-12 |
JP2010263439A (ja) | 2010-11-18 |
JP5054063B2 (ja) | 2012-10-24 |
US20110090345A1 (en) | 2011-04-21 |
CN102057665A (zh) | 2011-05-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5054063B2 (ja) | 電子カメラ、画像処理装置及び画像処理方法 | |
JP5313037B2 (ja) | 電子カメラ、画像処理装置および画像処理方法 | |
US7403710B2 (en) | Image processing apparatus and image processing method | |
TWI549501B (zh) | An imaging device, and a control method thereof | |
JP6106921B2 (ja) | 撮像装置、撮像方法および撮像プログラム | |
JP5090474B2 (ja) | 電子カメラおよび画像処理方法 | |
US8139136B2 (en) | Image pickup apparatus, control method of image pickup apparatus and image pickup apparatus having function to detect specific subject | |
JP5136669B2 (ja) | 画像処理装置、画像処理方法及びプログラム | |
JP6512810B2 (ja) | 撮像装置および制御方法とプログラム | |
JP6157242B2 (ja) | 画像処理装置及び画像処理方法 | |
KR101605771B1 (ko) | 디지털 촬영 장치, 그 제어방법 및 이를 실행하기 위한 프로그램을 저장한 기록매체 | |
US8558935B2 (en) | Scene information displaying method and apparatus and digital photographing apparatus using the scene information displaying method and apparatus | |
JP6671994B2 (ja) | 撮像装置およびその制御方法、プログラム、記憶媒体 | |
JP5246078B2 (ja) | 被写体位置特定用プログラム、およびカメラ | |
US9936139B2 (en) | Image pickup control apparatus, control method therefor, and recording medium | |
US9071760B2 (en) | Image pickup apparatus | |
KR20140096843A (ko) | 디지털 촬영 장치 및 그의 제어 방법 | |
JP2009111716A (ja) | 撮像装置、プログラムおよびテンプレート生成方法 | |
US20130293741A1 (en) | Image processing apparatus, image capturing apparatus, and storage medium storing image processing program | |
JP5370555B2 (ja) | 撮像装置、撮像方法及びプログラム | |
JP4771536B2 (ja) | 撮像装置及び主被写体となる顔の選択方法 | |
CN102082909B (zh) | 数字拍摄设备以及控制该数字拍摄设备的方法 | |
JP2007259004A (ja) | デジタルカメラ、画像処理装置及び画像処理プログラム | |
JP2015114880A (ja) | タッチパネル付表示装置 | |
US20220292692A1 (en) | Image processing apparatus and method of processing image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080001779.9 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12999833 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 20107029004 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010772106 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10772106 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |