JP2009009206A - Extraction method of outline inside image and image processor therefor - Google Patents

Extraction method of outline inside image and image processor therefor

Info

Publication number
JP2009009206A
Authority
JP
Japan
Prior art keywords
contour
image
initial
distance
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
JP2007167581A
Other languages
Japanese (ja)
Inventor
Yuji Kuniyone (國米 祐司)
Hideki Sasaki (佐々木 秀貴)
Original Assignee
Nikon Corp (株式会社ニコン)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp
Priority to JP2007167581A
Publication of JP2009009206A
Application status: Withdrawn

Abstract

PROBLEM TO BE SOLVED: To provide an image processing apparatus, an imaging apparatus, and a contour extraction method with which quick and accurate active contour extraction can be performed easily.
SOLUTION: The imaging apparatus includes: a distance image generation unit that generates a distance image from a plurality of images obtained by imaging a target at a desired distance from different angles with an image sensor; an initial contour generation unit that extracts the contour of the target at the desired distance from the generated distance image and generates an initial contour closed curve; and an active contour extraction unit that, using the initial contour closed curve generated by the initial contour generation unit as the initial position, extracts the contour of the target at the desired distance by an active contour extraction method.
[Selection] Figure 1

Description

  The present invention relates to a method for extracting a contour of an object in a captured image, an imaging apparatus, an image processing apparatus, and the like.

  Various methods for extracting the contour of a person or the like in an image obtained by imaging with an imaging device are known. For example, a method of extracting a specific edge or the like by binarizing the image with an appropriate threshold is one of them.

  In the active contour extraction method proposed by M. Kass et al., a closed curve enclosing the object is given as an initial position in order to obtain a more accurate contour, and an evaluation function (an energy value) is defined using the pixel values on the closed curve. The position of the closed curve is then iteratively corrected so that the sum of the evaluation function over the curve is minimized, and the minimizing solution closest to the target contour is obtained.
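  For reference, the energy functional minimized in the Kass et al. formulation can be written in its standard textbook form (this expression is quoted from the literature, not reproduced from this patent):

    E_{snake} = \int_0^1 \Big[ \tfrac{1}{2}\big(\alpha(s)\,|v_s(s)|^2 + \beta(s)\,|v_{ss}(s)|^2\big) + E_{image}(v(s)) + E_{con}(v(s)) \Big]\, ds

  Here v(s) = (x(s), y(s)) is the closed curve parameterized by s, the first term penalizes stretching and bending of the curve (weights α and β), E_image is usually chosen from the image intensity or its gradient so that edges have low energy, and E_con represents external constraint forces.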

An example of such an active contour extraction method is disclosed in JP-A-10-11588. A method for integrating a distance image from a stereo image is disclosed in Japanese Patent Laid-Open No. 8-313212.
JP-A-10-11588
JP-A-8-313212

  The active contour extraction method has the problem that, depending on how the closed curve serving as the initial position is set, a correct contour cannot be obtained or contour extraction takes considerable time.

  Also, if the periphery of the target object whose contour is to be extracted is specified by manual input or the like and used as the initial position for the active contour extraction method, a complicated manual input operation is required.

  In view of such circumstances, an object of the present invention is to provide an image processing apparatus and the like, and a contour extraction method, with which active contour extraction can be performed quickly, accurately, and with little effort.

  An image processing apparatus according to the present invention includes: a distance image generation unit that generates a distance image from a plurality of images obtained by imaging a target at a desired distance from different angles with an imaging device; an initial contour generation unit that extracts the contour of the target at the desired distance from the generated distance image and generates an initial contour closed curve; and an active contour extraction unit that, using the initial contour closed curve generated by the initial contour generation unit as the initial position, extracts the contour of the target at the desired distance by an active contour extraction method.

  In addition, an imaging apparatus according to the present invention includes: a distance image generation unit that generates a distance image from a plurality of images obtained by imaging a target at a desired distance from different angles with an image sensor; an initial contour generation unit that extracts the contour of the target at the desired distance from the generated distance image and generates an initial contour closed curve; and an active contour extraction unit that, using the initial contour closed curve generated by the initial contour generation unit as the initial position, extracts the contour of the target at the desired distance by an active contour extraction method.

  In addition, a contour extraction method for a target in a captured image according to the present invention extracts the contour of a target at a desired distance from a plurality of images obtained by imaging the target from different angles with an imaging device, and includes the steps of: generating a distance image from the plurality of images; extracting the contour of the target at the desired distance from the generated distance image to generate an initial contour closed curve; and extracting the contour of the target at the desired distance by an active contour extraction method using the generated initial contour closed curve as the initial position.

  An initial contour closed curve setting method according to the present invention is a method of setting the initial contour closed curve used as the initial position of an active contour extraction method, in which the contour of the subject generated from a distance image based on a plurality of images obtained by imaging the subject to be contour-extracted from different angles is set as the initial contour closed curve.
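  As a rough illustration of the structure described above, the following Python sketch mirrors the three units (distance image generation, initial contour generation, active contour extraction). All function names are hypothetical placeholders, and the bodies are stubs standing in for the processing described in the embodiment below; this is not the patent's implementation.

    import numpy as np

    def generate_distance_image(images):
        """Distance image generation unit (14b): estimate per-pixel distance
        from two or more views of the same target (stub)."""
        raise NotImplementedError  # e.g. stereo block matching, see later sketch

    def generate_initial_contour(distance_image, near, far):
        """Initial contour generation unit (14c): keep only pixels in the
        desired distance range and trace the boundary as a closed curve (stub)."""
        raise NotImplementedError

    def extract_contour(image, initial_closed_curve):
        """Active contour extraction unit (14d): refine the initial closed
        curve toward the true contour by energy minimization (stub)."""
        raise NotImplementedError

    def contour_extraction_pipeline(images, near, far):
        # Step 1: distance image from images taken at different angles
        distance_image = generate_distance_image(images)
        # Step 2: initial contour closed curve for the target in [near, far]
        init_curve = generate_initial_contour(distance_image, near, far)
        # Step 3: active contour extraction starting from the initial curve
        return extract_contour(images[0], init_curve)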

  According to the present invention, it is possible to quickly and accurately extract the contour of a desired object in a captured image, and it is possible to eliminate complicated manual work in the extraction process.

  FIG. 1 is a block diagram conceptually illustrating the overall configuration of the imaging apparatus 1 according to the embodiment. The lens 10 of the imaging apparatus 1 includes a zoom lens (not shown) whose focal length can be changed continuously, a focusing lens that adjusts the focus, and a VR (Vibration Reduction) lens that corrects camera shake during imaging.

  The lens 10 is controlled and adjusted by driving a lens driving device (not shown) according to an instruction from the CPU 17. The position of the lens 10 is detected by a lens position detector (not shown), and feedback control is performed by the CPU 17.

  The lens 10 forms a subject image on the imaging surface of the imaging element 11. The image sensor 11 outputs an electrical signal obtained by photoelectric conversion according to the light intensity of the subject image formed on the imaging surface. The image sensor 11 uses a solid-state image sensor such as a CMOS or CCD.

  Further, the image sensor 11 sequentially outputs electrical signals obtained by photoelectric conversion in response to a trigger input from a clock generator such as a timing generator (not shown) included in the CPU 17.

  In addition, a diaphragm (not shown) is provided between the lens 10 and the image sensor 11, and its aperture is adjusted to the appropriate exposure (auto exposure) calculated by the CPU 17.

  A plurality of image pickup systems such as the image pickup element 11 and the lens 10 may be provided at different angles with respect to the subject when acquiring an image with a stereo camera.

  The electrical signal output from the image sensor 11 is input to the CDS unit 12 as an analog signal and subjected to noise reduction processing by CDS (Correlated Double Sampling) processing. After the noise reduction process in the CDS unit 12, the gain is adjusted in an AGC unit (not shown). The A / D converter 13 performs conversion processing from an analog signal to a digital signal.

  The digitally converted image data is input to the image processing unit 14. When CMOS is used for the image sensor 11, the CDS unit 12, the A / D converter 13, and the like may be incorporated in the image sensor 11.

  The image processing unit 14 performs, in accordance with instructions from the CPU 17, which is a central processing unit, various image processing operations on the digital image data obtained from the A/D converter 13, such as white balance processing, high-pass filter processing, noise reduction processing, edge processing, and gradation processing (gamma correction).

  An AWB (Auto White Balance) calculation unit 14a included in the image processing unit 14 performs parameter value calculation and processing for optimal white balance processing.

  Algorithms and parameters necessary for the various image processing operations performed in the image processing unit 14 are recorded in advance in the flash memory 19 and the like, and the optimum ones are read out and applied as necessary according to instructions from the CPU 17.

  In addition, the image processing unit 14 includes a distance image generation unit 14 b that generates a distance image from the captured image captured from the image sensor 11. The CPU 17 causes the distance image generation unit 14b to create a distance image from two or more captured images obtained by capturing the same target at different angles.

  Further, the image processing unit 14 includes an initial contour generation unit 14c that extracts a target at a desired distance from the distance image generated by the distance image generation unit 14b and extracts an initial contour. The CPU 17 causes the initial contour generation unit 14c to extract the contour corresponding to the distance range where the target subject is located from the distance image.

  In addition, the image processing unit 14 includes a dynamic contour extraction unit 14d that performs contour extraction processing by a dynamic contour extraction method using the contour extracted by the initial contour generation unit 14c as a closed curve at an initial position. That is, the CPU 17 causes the dynamic contour extraction unit 14d to perform arithmetic processing by the dynamic contour extraction method using the contour extracted by the initial contour generation unit 14c as the initial contour closed curve, and causes the contour to be generated.

  The image data calculated by the image processing unit 14 can be temporarily recorded in the RAM 16 or the like. The RAM 16 may be configured as an external memory. By recording in the RAM 16, the image data can be read out and processed separately.

  Further, when the processing load on the image processing unit 14 is large, part of the functions of the image processing unit 14 and of the CPU 17 described later may be performed by, for example, a dedicated arithmetic processing unit or a computer, and the imaging apparatus 1 may be configured as a system.

  The image processing unit 14 may use the RAM 16 or a register (not shown) as a buffer memory, and may be used as a frame memory that temporarily stores image data for a plurality of captured frames. The image processing unit 14 performs predetermined image processing by appropriately reading image data from the frame memory and writing it appropriately after image processing.

  For example, when the distance image generation unit 14b uses two images captured from different angles to generate the distance image, the first image is temporarily stored in the RAM 16 or the like according to an instruction from the CPU 17. The distance image generation unit 14b can then generate the distance image from the second image acquired from the image sensor 11 and the first image read from the RAM 16 or the like.

  The imaging operation sequence of the entire imaging apparatus 1 is controlled in response to instructions from the CPU 17, and operation signals from the operation unit 18 (not shown in detail), which handles operations such as the release switch and various settings, are also input to and processed by the CPU 17.

  The operation unit 18 includes a power switch for turning the imaging apparatus 1 on and off, a release button that can be half-pressed or fully pressed, an up/down button for updating a reproduced image, and the like. The up/down button is also used to select a desired person from the persons extracted and displayed on the display unit 15, or to manually drive the zoom lens toward the telephoto or wide-angle side during imaging.

  Furthermore, various electronic information relating to image data and image processing can be stored in the memory card 1b or the like via the card interface 1a. In addition, the image data, processing programs, and the like in the imaging apparatus 1 can be read from or written to an external medium, or processed by an external arithmetic unit, via the external interface 1c. When the imaging apparatus 1 is configured as a system with the image processing unit 14 or the like as a separate body, image data to be processed can be transmitted to that arithmetic processing unit or the like via the card interface 1a, the external interface 1c, a communication port (not shown), or the like.

  The display unit 15 displays the processed image so that the operator of the imaging apparatus 1 can confirm it, and also displays various processing information and guidance such as menus.

  The display unit 15 may also be configured to accept input operations by using a touch panel or the like. In the case of a touch panel, the display unit 15 also functions as part of the operation unit 18.

  The display unit 15 can be composed of various display devices such as an LCD, an organic EL display, or an inorganic EL display. The display unit 15 is also used to reproduce and display image data recorded in the RAM 16 or the memory card 1b, or image data transferred from an external imaging device, arithmetic processing unit, or the like through the external interface 1c. When the imaging apparatus 1 is configured as a system, the display unit 15 may be configured as a monitor-dedicated device or the like separate from the image processing unit 14 and the like.

  The CPU 17 of the imaging apparatus 1 includes an AF calculation unit 17b, performs calculation so as to perform optimum automatic focus adjustment using image data captured by the image sensor 11, and controls the lens 10. In addition, the CPU 17 includes a detection unit 17a that detects whether a human face is present in the captured image.

  The detection unit 17a can detect whether or not there is a portion corresponding to the eyes or nose of a person's face in the captured image. As a detection method, for example, a pattern recognition theory such as the composite similarity method established in character recognition, applied by treating the face as a two-dimensional pattern, or an extraction method using a previously trained neural network may be used.
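  As one concrete possibility for such a detector, a pretrained cascade classifier can flag face regions in the captured frame; the sketch below uses OpenCV's bundled Haar cascade, which is only an illustrative stand-in for the pattern recognition or neural network methods mentioned above (the file name and API are those of the opencv-python package, not of the patent).

    import cv2

    def detect_faces(frame_bgr):
        """Return face rectangles (x, y, w, h) found in a BGR frame."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        # detectMultiScale scans the image at several scales; the parameters
        # below are common defaults, not values taken from the patent.
        return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)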

  Further, the autofocus calculation performed by the AF calculation unit 17b is performed at, for example, five or nine predetermined locations in the captured image. When a human face is detected in the captured image by the detection unit 17a, the autofocus operation can be performed on that face.

  When a plurality of human faces are detected, automatic focusing is performed on the closest face, on the face having the largest area in the captured image, or on the face of the person occupying the center of the captured image. In this case, for example, the imaging apparatus 1 may automatically recognize the person occupying the center position as the extraction target from the distance image, and the initial contour generation unit 14c may generate the contour of that person.
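  A selection rule like the one just described (largest face, or face closest to the image center) reduces to a simple comparison over the detected rectangles; the helper below is a hypothetical sketch operating on the rectangles returned by the detector above.

    def pick_af_face(faces, image_shape, prefer="largest"):
        """Choose one face rectangle (x, y, w, h) for autofocus."""
        if len(faces) == 0:
            return None
        if prefer == "largest":
            return max(faces, key=lambda f: f[2] * f[3])
        # Otherwise: face whose center is nearest the image center.
        cy, cx = image_shape[0] / 2.0, image_shape[1] / 2.0
        return min(faces, key=lambda f: (f[0] + f[2] / 2.0 - cx) ** 2
                                        + (f[1] + f[3] / 2.0 - cy) ** 2)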

  Next, the dynamic contour extraction process performed by the dynamic contour extraction unit 14d will be described with reference to FIG. In FIG. 2, a person image 21 is a target for which an outline is to be extracted.

  In the active contour extraction method using the Snakes method or the like, it is necessary to set an initial contour closed curve 22 around the target person image 21. The initial contour closed curve 22 can be manually given from a display screen such as the display unit 15 by, for example, a touch pen or a mouse input.

  The initial contour closed curve 22 is preferably set as close to the target and as accurately as possible. In the imaging apparatus 1, the initial contour closed curve 22 is automatically given from a distance image or the like, as described later, which is accurate and reduces troublesome input labor.

  The dynamic contour extraction unit 14d can obtain the accurate contour 23 by obtaining the minimum solution or the like that minimizes the energy in each pixel on the initial contour closed curve 22 while narrowing the initial contour closed curve 22. Here, a plurality of plots on the initial contour closed curve 22 are control points when the contour is moved in time.

  Next, the active contour extraction method by the Snakes method will be briefly described. In this method, the contour of an object is extracted using an active contour model.

  This method uses the active contour model proposed by Kass et al. ("Snakes: Active Contour Models", IJCV, vol. 1, no. 3, pp. 321-331, 1988, etc.). In such a method, it is known that whether or not the contour can subsequently be extracted accurately depends largely on whether the initial contour closed curve 22 is set accurately.

  In this regard, the imaging apparatus 1 can provide the initial contour closed curve 22 generated by the initial contour generation unit 14c on the basis of the distance image, so the initial position can be set much faster and more accurately than by manual input or the like.
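  For illustration, a Snakes-style refinement that starts from an automatically supplied closed curve can be sketched with scikit-image's active_contour function; the circle used here merely stands in for the curve produced from the distance image, and the parameter values are ordinary defaults rather than values from the patent (row/column coordinate order is assumed, as in recent scikit-image versions).

    import numpy as np
    from skimage.filters import gaussian
    from skimage.segmentation import active_contour

    def refine_contour(gray_image, init_curve):
        """Refine an (N, 2) array of (row, col) points forming a closed curve."""
        smoothed = gaussian(gray_image, 3)   # suppress noise before snapping to edges
        return active_contour(smoothed, init_curve,
                              alpha=0.015, beta=10.0, gamma=0.001)

    def circle_curve(center_rc, radius, n_points=200):
        """Example initial closed curve: a circle around the expected subject."""
        t = np.linspace(0, 2 * np.pi, n_points)
        rows = center_rc[0] + radius * np.sin(t)
        cols = center_rc[1] + radius * np.cos(t)
        return np.stack([rows, cols], axis=1)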

  Similarly, a growing snake method using an active contour model (Olof Henricsson and Walter Neuenschwander, Proc. ICPR, pp. 68-73, 1994, etc.) may be applied.

  In addition, the active contour extraction unit 14d may generate a potential image from the edge image of the moving image acquired from the image sensor 11, such that edge portions have lower energy than other portions.

  The active contour extraction unit 14d may then make the active contour model set at the initial position (corresponding to the initial contour closed curve 22) track the low-energy portions (edge portions) of the generated potential image. In the case of a moving image, it is preferable that the target subject is always present in the image; however, if the target disappears from the image and later reappears, the initial contour closed curve 22 may simply be set again.

  Further, in the case of the above-described growing snake method or of a moving image, the calculation may be performed using an arbitrary point (or pixel) on the initial contour closed curve 22 created by the initial contour generation unit 14c as the initial position.

  FIG. 3 is a conceptual diagram illustrating acquisition of a plurality of images used when the distance image generation unit 14b generates a distance image. A so-called stereo image is acquired from two angles of the first imaging device 32 and the second imaging device 33 with respect to the target 31 that is the subject.

  The first imaging device 32 and the second imaging device 33 may be the same imaging apparatus 1. That is, in this case, the subject 31 is imaged from two different angles by moving the single imaging apparatus 1. The imaging apparatus 1 is not limited to two different angles; it may acquire a plurality of images sequentially while being moved, or may acquire a moving image while being moved.

  The distance image generation unit 14b can calculate the distance to the subject to be detected from the movement (angle) of the imaging apparatus 1 and the resulting amount of parallax between the images. This will be described with reference to FIGS. 4 and 5.

  FIG. 4 is a conceptual diagram in which the target 31 as the subject is imaged with a normal monocular camera. In this case, the background 42 and the background 43 are mixed into the image 41 in addition to the target 31. The background 42 and the background 43 each have their own contours and edge portions, and it is difficult to distinguish the individual contours.

  For this reason, even if the image processing unit 14 attempts to extract the contour of the target 31 in this state, the extracted contour is likely to contain many errors or to belong to the wrong target, and accurate contour extraction cannot be expected.

  On the other hand, as shown in FIG. 3, if the distance is calculated from the amount of parallax obtained from the acquired stereo images or the like, the target can be specified fairly accurately. Typically, the distance image generation unit 14b creates a gradation image (distance image) in which distance information is superimposed on the image, based on the amount of parallax obtained by pattern matching (template matching) of, for example, geometric patterns, with farther portions rendered darker and nearer portions whiter. At this time, the distance image generation unit 14b may evaluate the validity of the calculated amount of parallax based on the luminance difference between adjacent pixels in the image.
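  One common way to realize this step is block-matching stereo followed by the usual disparity-to-depth conversion Z = f·B/d; the sketch below uses OpenCV's StereoBM as a stand-in for the template matching described above, with a hypothetical focal length and baseline. It assumes a rectified, 8-bit grayscale stereo pair.

    import cv2
    import numpy as np

    def distance_image_from_stereo(left_gray, right_gray,
                                   focal_px=700.0, baseline_m=0.1):
        """Return a per-pixel distance map (metres) from a rectified stereo pair."""
        stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
        # StereoBM returns fixed-point disparities scaled by 16.
        disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
        distance = np.full(disparity.shape, np.inf, dtype=np.float32)
        valid = disparity > 0                      # non-positive values are unmatched
        distance[valid] = focal_px * baseline_m / disparity[valid]
        return distance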

  In this way, only the target 31, that is, a subject existing within a specific distance range, can be extracted. FIG. 5 is an explanatory conceptual diagram of the extraction of the target as the subject and its contour from the distance image.

  The distance image obtained in this way often contains noise or missing parts. In particular, when template matching or the like is used, positional uncertainty depending on the template size tends to become apparent.

  The image 52 is created by the image processing unit 14 by leaving only the target 31 within the desired specific distance range in the distance image and deleting everything else. Typically, this is done by binarization, in which pixels of the target 31 in the desired distance range are set to "1" and all other pixels to "0".

  The target 31 can be selected or set by specifying a distance range from the imaging apparatus 1 through input to the operation unit 18. Alternatively, it may be specified by an input operation on the operation unit 18 or the like while confirming the target 31 on the screen displayed on the display unit 15 or the like.

  Further, a target existing at the center of the image 52, the target subject to autofocus by the AF calculation unit 17b, a target detected by the detection unit 17a, or the like may be set as the target 31 whose contour is to be extracted. The image processing unit 14 then extracts, by binarization, only the subject within a predetermined distance range around the designated target 31.

  Next, the initial contour generation unit 14c extracts the contour of the target 31 from the distance image generated by the distance image generation unit 14b and creates the initial contour closed curve 51. Since the initial contour generation unit 14c extracts the contour of the target 31 from the image 52 in which only the target 31 exists, the contour can be extracted more accurately than from the image 41 in FIG. 4.

  Typically, the boundary pixels of the target 31 extracted by the binarization described above are selected by edge extraction or the like and used as the contour. The edge extraction may be performed using, for example, a differential of the pixel values.
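  Putting the two steps together, the sketch below shows how the binarized distance-range mask can be turned into an initial contour closed curve; the distance bounds are placeholders, and OpenCV 4.x's findContours return convention is assumed.

    import cv2
    import numpy as np

    def initial_contour_from_distance(distance, near=1.0, far=2.0):
        """Binarize the distance image to the desired range and trace the
        largest boundary as the initial contour closed curve."""
        mask = ((distance >= near) & (distance <= far)).astype(np.uint8) * 255
        # Optional clean-up of noise and small holes in the distance image.
        kernel = np.ones((5, 5), np.uint8)
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_NONE)
        if not contours:
            return None
        boundary = max(contours, key=cv2.contourArea)   # (N, 1, 2) array of (x, y)
        return boundary[:, 0, :]                        # -> (N, 2) boundary points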

  Next, the active contour extraction unit 14d performs the above-described active contour extraction method using the initial contour closed curve 51 created by the initial contour generation unit 14c as the initial position, and extracts the contour 61 of the target 31 as shown in FIG. 6. FIG. 6 is an explanatory conceptual diagram of the initial contour closed curve 51 and the contour 61.

  As described above, the initial contour closed curve 51 lies fairly accurately near the contour 61. Accordingly, when the active contour extraction unit 14d subsequently applies the active contour extraction method, the processing can be performed quickly and accurately.

  FIG. 7 conceptually shows the contour extraction flow of the imaging apparatus 1. Since the processing overlaps with what has already been described, only a brief description is given below with reference to FIG. 7.

(Step 71)
The imaging device 1 acquires images from two different angles for the target 31 that is the subject.

(Step 72)
The distance image generation unit 14b calculates the amount of parallax using the images of two different angles acquired in step 71, and creates a distance image. When the distance image generation unit 14b creates a distance image, the image processing unit 14 creates an image 52 in which only a target located at a specific distance is extracted.

(Step 73)
The initial contour generation unit 14c extracts the contour of the target 31 using the image 52 based on the distance image created by the distance image generation unit 14b in step 72.

(Step 74)
The dynamic contour extraction unit 14d executes image processing by the dynamic contour extraction method using the initial contour closed curve 51 extracted by the initial contour generation unit 14c in step 73 as an initial position. This execution program may be stored in the flash memory 19 or the like, read as appropriate according to an instruction from the CPU 17, and executed by the dynamic contour extraction unit 14d.

  The active contour extraction unit 14d searches for the solution that minimizes an evaluation value, using, for example, the sum of a term that depends on the shape of the initial contour closed curve 51 and a term based on the pixel information on the initial contour closed curve 51 as the evaluation value.

  Typically, the shape-dependent term is a linear sum of quantities such as the total perimeter of the initial contour closed curve 51 and the sum of the curvature at each pixel over one circuit of the curve; such a linear sum becomes smaller as the closed curve becomes shorter and smoother. The image term is the sum, over one circuit of the initial contour closed curve 51, of the absolute value of the density gradient at each pixel.
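  As a numerical illustration of such an evaluation value, the sketch below computes a perimeter term, a discrete curvature term, and the (negated) sum of gradient magnitudes along the closed curve; the weights and discretization are illustrative, not taken from the patent.

    import numpy as np

    def snake_energy(curve, grad_mag, w_len=1.0, w_curv=1.0, w_img=1.0):
        """curve: (N, 2) array of (row, col) points on a closed curve.
        grad_mag: 2-D array of gradient magnitudes of the image."""
        nxt = np.roll(curve, -1, axis=0)
        prv = np.roll(curve, 1, axis=0)
        # Shape terms: total perimeter and a second-difference curvature measure.
        perimeter = np.sum(np.linalg.norm(nxt - curve, axis=1))
        curvature = np.sum(np.linalg.norm(nxt - 2 * curve + prv, axis=1))
        # Image term: strong edges under the curve should lower the energy.
        rows = np.clip(curve[:, 0].astype(int), 0, grad_mag.shape[0] - 1)
        cols = np.clip(curve[:, 1].astype(int), 0, grad_mag.shape[1] - 1)
        edge_strength = np.sum(grad_mag[rows, cols])
        return w_len * perimeter + w_curv * curvature - w_img * edge_strength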

(Step 75)
The contour 61 of the target 31 is extracted by the extraction processing of the dynamic contour extraction unit 14d. The extracted contour 61 may be displayed on the display unit 15 according to an instruction from the CPU 17, or may be temporarily stored in the RAM 16 or the like. Further, it may be used for various image processing performed by the image processing unit 14.

  Note that the creation of the image 52 in which only the object located at the specific distance in the above-described step 72 is extracted may be processed by the distance image generation unit 14b according to an instruction from the CPU 17.

  Next, a case where the dynamic contour extraction unit 14d performs processing by a different dynamic contour extraction method will be described. FIG. 8 shows an example in which a so-called level set method is used as the dynamic contour extraction method processed by the dynamic contour extraction unit 14d.

  When the active contour extraction unit 14d executes contour extraction by the level set method, it is possible to obtain the independent contours 83a and 83b simultaneously even when there are two objects 82a and 82b as shown in FIG. 8.

  In particular, when two subjects are close to each other in the same distance range, there is a concern that the distance image generation unit 14b may not be able to distinguish the two objects 82a and 82b by distance. In this case, the initial contour generation unit 14c creates the contour 81 as the initial contour closed curve.

  As shown in FIG. 8(a), the contour 81 cannot separate the object 82a from the object 82b and encloses both. However, the active contour extraction unit 14d, performing contour processing by the level set method, can separate the contours 83a and 83b of the objects 82a and 82b and calculate each of them independently even when the contour 81 is used as the initial contour closed curve, which is preferable.

  Details of the level set method are explained in various papers (e.g., IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 3, pp. 402-407, March 2004), so a detailed description of the method itself is omitted here.
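  For illustration, a level-set style evolution that can split one initial region into separate contours is available in scikit-image as morphological Chan–Vese; the sketch below initializes the level set from the binarized distance mask. The iteration count and smoothing are arbitrary illustrative values, and the exact keyword name of the iteration argument differs slightly between scikit-image versions (it is passed positionally here).

    import numpy as np
    from skimage.segmentation import morphological_chan_vese

    def level_set_contours(gray_image, initial_mask, iterations=100):
        """Evolve a level set from the initial region; the result is a binary
        segmentation that may contain several separate objects."""
        init_ls = initial_mask.astype(np.int8)   # inside = 1, outside = 0
        seg = morphological_chan_vese(gray_image, iterations,
                                      init_level_set=init_ls, smoothing=3)
        # Label the connected components of `seg` to obtain the separate
        # contours (e.g. 83a and 83b) individually.
        return seg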

  In the embodiment, the plurality of images used to create the distance image is not limited to images obtained by individually capturing the target subject from different angles; it may be, for example, images obtained by capturing a moving image while changing the angle. In this case, the distance image may be obtained by a three-dimensional reconstruction method such as the so-called factorization method.
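  The factorization idea mentioned here (Tomasi–Kanade style) recovers structure from feature tracks across the frames of such a moving sequence via a rank-3 SVD of the centered measurement matrix. The sketch below shows only that core linear-algebra step, under an orthographic camera assumption and without the metric upgrade, so it is an outline rather than a full 3-D reconstruction.

    import numpy as np

    def factorization_structure(tracks):
        """tracks: (2*F, P) matrix of P feature points tracked over F frames
        (x-coordinates in rows 0..F-1, y-coordinates in rows F..2F-1)."""
        # Center each row: removes the per-frame translation.
        W = tracks - tracks.mean(axis=1, keepdims=True)
        U, s, Vt = np.linalg.svd(W, full_matrices=False)
        # Under the orthographic model the centered matrix has rank 3;
        # split the singular values between the two factors.
        motion = U[:, :3] * np.sqrt(s[:3])             # (2F, 3), up to an affine ambiguity
        structure = np.sqrt(s[:3])[:, None] * Vt[:3]   # (3, P) points, same ambiguity
        return motion, structure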

  In the description of the embodiment, the distance image is created from two images, but the present invention is not limited to this; the distance image may be created from an arbitrary number of images. In this case, using a larger number of images is preferable because it increases the reliability of the distance image.

  Further, the plurality of images used to create the distance image is not limited to those obtained by varying the angle through movement of the imaging device. For example, the imaging device itself may be fixed at one point, and the distance image may be created from images acquired while the subject including the target moves. Alternatively, only the imaging system of the imaging apparatus 1, such as the lens 10 and the image sensor 11, may be moved, while the CPU 17 and the image processing unit 14 are provided in a fixed arithmetic device or the like.

  Images from different angles may also be acquired by combining rotation of the subject including the target with movement of the imaging device. In creating the distance image, images acquired by a so-called stereo camera may be used, or a plurality of images acquired by a monocular camera may be used. The distance image may also be created by other known methods.

  The active contour extraction method used in the embodiment is not limited to the Snakes method; a so-called level set method or another known active contour extraction method may be used. Typically, any other method may be used that extracts the exact contour of the object by extracting the edges of the object in the image and moving, over time, a contour set around it.

  In addition, the active contour extraction method used by the imaging apparatus 1 can also be configured, by changing the processing program, algorithm, and the like, to search for the optimal contour while expanding the initial contour closed curve outward.

  However, on the assumption that the target contour exists inside the initial contour closed curve, an active contour extraction method that extracts the optimal contour while shrinking the initial contour closed curve inward is further preferable for quick and accurate contour extraction.

  The imaging apparatus 1 can provide an appropriate initial contour position even for an image with a complicated background, such as a snapshot, so that accurate contour extraction can be processed quickly by the active contour extraction processing.

  The configuration and operation of the imaging apparatus 1 described in this embodiment can be modified and used as appropriate within an obvious range.

  The present invention can be used for an imaging device, an in-vehicle image processing device that monitors the contours of objects in captured images, a medical image processing device, a monitoring device, and other image processing devices. It can also be used for vision systems that require object recognition, such as industrial robots.

FIG. 1 is a conceptual diagram of the configuration of an imaging apparatus according to the present invention.
FIG. 2 is a conceptual diagram of the active contour extraction method.
FIG. 3 is a conceptual diagram of acquisition of a plurality of images.
FIG. 4 is an explanatory conceptual diagram of a normal captured image.
FIG. 5 is an explanatory conceptual diagram of extraction of the target as the subject and its contour from the distance image.
FIG. 6 is an explanatory conceptual diagram of the initial contour closed curve and the contour.
FIG. 7 is a conceptual diagram explaining the contour extraction flow of the imaging apparatus.
FIG. 8 is a conceptual diagram of contour extraction by the level set method.

Explanation of symbols

DESCRIPTION OF SYMBOLS: 1...Imaging apparatus, 10...Lens, 11...Image sensor, 12...CDS unit, 13...A/D converter, 14...Image processing unit, 14a...AWB calculation unit, 14b...Distance image generation unit, 14c...Initial contour generation unit, 14d...Active contour extraction unit, 15...Display unit, 16...RAM, 17...CPU, 18...Operation unit, 19...Flash memory, 1a...Card interface, 1b...Memory card, 1c...External interface

Claims (4)

  1. An image processing apparatus comprising:
    a distance image generation unit that generates a distance image from a plurality of images obtained by imaging a target at a desired distance from different angles with an imaging device;
    an initial contour generation unit that extracts a contour of the target at the desired distance from the generated distance image and generates an initial contour closed curve; and
    an active contour extraction unit that extracts the contour of the target at the desired distance by an active contour extraction method using the initial contour closed curve generated by the initial contour generation unit as an initial position.
  2. An imaging apparatus comprising:
    a distance image generation unit that generates a distance image from a plurality of images obtained by imaging a target at a desired distance from different angles with an image sensor;
    an initial contour generation unit that extracts a contour of the target at the desired distance from the generated distance image and generates an initial contour closed curve; and
    an active contour extraction unit that extracts the contour of the target at the desired distance by an active contour extraction method using the initial contour closed curve generated by the initial contour generation unit as an initial position.
  3. A method for extracting a contour of a target at a desired distance from a plurality of images obtained by imaging the target from different angles with an imaging device, comprising the steps of:
    generating a distance image from the plurality of images;
    extracting the contour of the target at the desired distance from the generated distance image to generate an initial contour closed curve; and
    extracting the contour of the target at the desired distance by an active contour extraction method using the generated initial contour closed curve as an initial position.
  4. A method of setting an initial contour closed curve used as an initial position of an active contour extraction method, wherein the contour of a subject generated from a distance image based on a plurality of images obtained by imaging the subject to be contour-extracted from different angles is set as the initial contour closed curve.
JP2007167581A 2007-06-26 2007-06-26 Extraction method of outline inside image and image processor therefor Withdrawn JP2009009206A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007167581A JP2009009206A (en) 2007-06-26 2007-06-26 Extraction method of outline inside image and image processor therefor


Publications (1)

Publication Number Publication Date
JP2009009206A true JP2009009206A (en) 2009-01-15

Family

ID=40324250

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007167581A Withdrawn JP2009009206A (en) 2007-06-26 2007-06-26 Extraction method of outline inside image and image processor therefor

Country Status (1)

Country Link
JP (1) JP2009009206A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9231243B2 (en) 2009-05-27 2016-01-05 Amprius, Inc. Interconnected hollow nanostructures containing high capacity active materials for use in rechargeable batteries
US10461359B2 (en) 2009-05-27 2019-10-29 Amprius, Inc. Interconnected hollow nanostructures containing high capacity active materials for use in rechargeable batteries
JP2010282268A (en) * 2009-06-02 2010-12-16 Konica Minolta Medical & Graphic Inc Contour extraction device and program
US9142864B2 (en) 2010-11-15 2015-09-22 Amprius, Inc. Electrolytes for rechargeable batteries
US10038219B2 (en) 2010-11-15 2018-07-31 Amprius, Inc. Electrolytes for rechargeable batteries
JP2013182330A (en) * 2012-02-29 2013-09-12 Canon Inc Image processor and image processing method


Legal Events

Date Code Title Description
A300 Withdrawal of application because of no request for examination

Free format text: JAPANESE INTERMEDIATE CODE: A300

Effective date: 20100907