WO2009110348A1 - Imaging Apparatus - Google Patents
Imaging Apparatus
- Publication number
- WO2009110348A1 (PCT/JP2009/053241)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tracking
- tracking target
- icon
- color
- display
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
Definitions
- the present invention relates to an imaging apparatus such as a digital still camera, and more particularly to an imaging apparatus having a function of tracking a subject.
- the function of extracting and tracking a moving object is realized in a system using a surveillance camera.
- an alarm is output when a moving object (suspicious person) is detected from within a designated area in an image obtained from a surveillance camera.
- a system having a function of displaying the tracking result has been put into practical use so that monitoring staff can easily confirm it (see Non-Patent Document 1 below).
- tracking processing using images is realized by, for example, a pattern-matching method that sets a tracking pattern and then searches for the position of that pattern in each image, a method that detects the position of a moving object based on optical flow, or a method that tracks subject features such as color.
- the moving object extraction and tracking technology using image processing has been studied for the purpose of use with surveillance cameras and robot vision, but has recently begun to be used in digital cameras for general consumers.
- in Non-Patent Document 2, a method is disclosed in which, when a designated subject is tracked and the shutter button is operated, a still image with a composition centered on the subject is generated by image cutout processing (see Non-Patent Document 2 below).
- Patent Document 1 discloses a method for preventing a subject of interest from leaving the shooting range during moving-image shooting.
- in this method, whether the subject of interest is likely to leave the shooting range is determined based on motion vectors, and a warning is issued when such a possibility is determined to exist.
- when using this type of tracking function, the user performs a predetermined operation on the digital camera to enable it. After this operation, the user basically trusts that the tracking function is working effectively and performs shooting operations to obtain a desired image.
- however, tracking by image processing does not always succeed: the ease of tracking changes with various factors, and the camera often loses the tracking target. For example, when tracking is performed based on the color information of the tracking target, tracking becomes harder when the background color is similar to the color of the tracking target, and the camera may lose the target. When the ease of tracking decreases, the reliability of tracking naturally decreases as well.
- an object of the present invention is to provide an image pickup apparatus having a function for easily informing the user of the tracking state.
- the imaging apparatus includes: an imaging element that outputs a signal representing an image sequence obtained by sequential imaging; a tracking processing unit that tracks the tracking target by detecting the position of the tracking target in the image sequence based on the output signal of the imaging element; a tracking evaluation unit that evaluates the degree of reliability or ease of the tracking; and a display control unit.
- the display control unit causes the display unit to display a tracking target icon corresponding to the tracking target and a level icon representing the evaluated degree.
- the tracking processing unit performs tracking of the tracking target based on the color information of the tracking target specified by the output signal of the imaging element, and the tracking target icon has a color corresponding to that color information.
- after setting a tracking color according to the color of the tracking target, the tracking processing unit performs tracking of the tracking target by tracking an image area having the tracking color in the image sequence.
- the display control means sets the color of the tracking target icon to be colorless or a preset color before the tracking color is set.
- the display control unit displays a plurality of tracking target icons corresponding to the plurality of tracking targets on the display unit.
- the display control means changes the display format of the tracking target icon between the tracking target icon corresponding to the former tracking target and the tracking target icon corresponding to the latter tracking target.
- the display control means makes the display position of the tracking target icon corresponding to the former tracking target closer to the level icon than the display position of the tracking target icon corresponding to the latter tracking target.
- the tracking evaluation unit performs the evaluation of the degree for each tracking target, and the display control unit reflects, among the degrees evaluated for the tracking targets, the maximum degree, the minimum degree, or both in the level icon.
- the display control unit determines the display size of each tracking target icon according to the tracking priority, or determines the display position of each tracking target icon according to the tracking priority.
- the display control unit may display a tracking target icon corresponding to the tracking target on the display unit, and classify and express the evaluated degree in a plurality of stages according to the color used for the tracking target icon.
- the tracking target icon is generated using an image based on an output signal of the image sensor or using an image registered in advance.
- another imaging apparatus includes: an imaging element that outputs a signal representing an image sequence obtained by sequential shooting; and a tracking processing unit that tracks a tracking target by detecting the position of the tracking target in the image sequence based on the output signal of the imaging element.
- an object is to provide an imaging apparatus having a function of informing the user of the tracking state in an easy-to-understand manner.
- FIG. 1 is an overall block diagram of an imaging apparatus according to an embodiment of the present invention. FIG. 2 is an internal block diagram of the imaging unit of FIG. 1. FIG. 3 is an external perspective view of the imaging apparatus of FIG. 1.
- FIG. 4 is a block diagram of a portion of the imaging apparatus of FIG. 1 particularly related to the subject tracking function. FIG. 5 is a flowchart showing the operation flow in the tracking mode of the imaging apparatus of FIG. 1.
- (a) and (b) are diagrams respectively showing an example of a display image in the tracking mode and the frame image from which the display image is generated.
- (a), (b), and (c) are diagrams each showing a tracking mode icon displayed on the display unit of FIG. 1.
- a diagram for describing the tracking reliability evaluation method performed by the tracking reliability evaluation unit of FIG. 4.
- (a), (b), and (c) are diagrams showing the state of the level icon when the evaluated tracking reliability is high, medium, and low, respectively.
- (a), (b), (c), and (d) are examples of display screens when the evaluated tracking reliability is high, medium, and low, and when the tracking target person is lost, respectively. Also, a diagram showing a tracking target icon according to the first applied display example, in which the internal color of the body icon is made up of multiple colors.
- (a) and (b) relate to the third applied display example and are diagrams showing, respectively, an example of a display image when there are three tracking target persons and an enlargement of the three tracking target icons included in that display image.
- (a) and (b) are diagrams showing the state of three tracking target icons according to the fourth applied display example.
- (a) and (b) are diagrams showing the state of the tracking mode icon according to the fifth applied display example.
- (a) and (b) are diagrams showing the state of the tracking mode icon according to the sixth applied display example.
- (a) and (b) are diagrams showing the state of the tracking mode icon according to the seventh applied display example.
- (a) and (b) are diagrams showing the state of three tracking target icons according to the eighth applied display example. Also according to the eighth applied display example, a diagram showing a plurality of tracking target icons contained in the tracking mode icon stacked front to back.
- (a), (b), and (c) are diagrams showing the state of the tracking target icon according to the ninth applied display example. Also, a diagram showing the state of the tracking mode icon according to the tenth applied display example.
- (a) and (b) are diagrams showing how the shape of the body icon forming the tracking target icon changes with the sex of the tracking target person, according to the tenth applied display example.
- (a) and (b) are diagrams showing how the display size differs between tracking target icons, according to the tenth applied display example. Also, a diagram showing the state of the tracking mode icon according to the eleventh applied display example, and a diagram showing the movement amount icon according to the twelfth applied display example.
- (a), (b), (c), and (d) relate to the twelfth applied display example and are diagrams showing the state of the movement amount icon when the amount of movement of the tracking target person on the image is large, medium, small, and extremely small, respectively. Also, a diagram showing how the length of the level icon is changed according to the amount of movement of the tracking target person on the image, according to the twelfth applied display example.
- FIG. 1 is an overall block diagram of an imaging apparatus 1 according to an embodiment of the present invention.
- the imaging device 1 is a digital still camera capable of capturing and recording still images, or a digital video camera capable of capturing and recording still images and moving images.
- the imaging apparatus 1 includes an imaging unit 11, an AFE (Analog Front End) 12, a main control unit 13, an internal memory 14, a display unit 15, a recording medium 16, and an operation unit 17.
- the operation unit 17 is provided with a shutter button 17a.
- FIG. 2 shows an internal configuration diagram of the imaging unit 11.
- the imaging unit 11 includes an optical system 35, a diaphragm 32, an image sensor 33 such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) image sensor, and a driver 34 that drives and controls the optical system 35 and the diaphragm 32.
- the optical system 35 is formed from a plurality of lenses including the zoom lens 30 and the focus lens 31.
- the zoom lens 30 and the focus lens 31 are movable in the optical axis direction.
- the driver 34 drives and controls the positions of the zoom lens 30 and the focus lens 31 and the opening degree of the diaphragm 32 based on control signals from the main control unit 13, thereby controlling the focal length (angle of view) and focus position of the imaging unit 11 and the amount of light incident on the image sensor 33.
- the image sensor 33 photoelectrically converts an optical image representing a subject incident through the optical system 35 and the diaphragm 32, and outputs an electrical signal obtained by the photoelectric conversion to the AFE 12. More specifically, the image sensor 33 includes a plurality of light receiving pixels arranged two-dimensionally in a matrix, and in each photographing, each light receiving pixel stores a signal charge having a charge amount corresponding to the exposure time. An analog signal from each light receiving pixel having a magnitude proportional to the amount of stored signal charge is sequentially output to the AFE 12 in accordance with a drive pulse generated in the imaging device 1. The length of the exposure time is controlled by the main control unit 13.
- the AFE 12 amplifies the analog signal output from the imaging unit 11 (image sensor 33), and converts the amplified analog signal into a digital signal.
- the AFE 12 sequentially outputs this digital signal to the main control unit 13.
- the amplification degree of signal amplification in the AFE 12 is controlled by the main control unit 13.
- the main control unit 13 includes a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like, and functions as a video signal processing unit.
- the main control unit 13 generates a video signal representing an image photographed by the imaging unit 11 based on the output signal of the AFE 12.
- the main control unit 13 also has a function as display control means for controlling the display content of the display unit 15, and performs control necessary for display on the display unit 15.
- the internal memory 14 is formed of SDRAM (Synchronous Dynamic Random Access Memory) or the like, and temporarily stores various data generated in the imaging device 1.
- the display unit 15 is a display device including a liquid crystal display panel and the like, and displays a photographed image, an image recorded on the recording medium 16, and the like under the control of the main control unit 13.
- the recording medium 16 is a non-volatile memory such as an SD (Secure Digital) memory card, and stores captured images and the like under the control of the main control unit 13.
- the operation unit 17 receives an operation from the outside.
- the content of the operation on the operation unit 17 is transmitted to the main control unit 13.
- the shutter button 17a is a button for instructing the shooting and recording of a still image; pressing it issues that instruction.
- the shutter button 17a can be pressed in two stages: when the photographer presses it lightly, the shutter button 17a is half-pressed, and when it is pressed further from this state, the shutter button 17a is fully pressed.
- the operation mode of the imaging apparatus 1 includes a shooting mode in which a still image or a moving image can be shot and a playback mode in which the still image or the moving image recorded on the recording medium 16 can be played on the display unit 15.
- shooting is sequentially performed at a predetermined frame period, and a shot image sequence is acquired from the image sensor 33.
- Each image forming this captured image sequence is called a “frame image”.
- An image sequence (for example, a captured image sequence) means a plurality of images arranged in time series. The frame image sequence is updated and displayed on the display screen of the display unit 15 as a moving image.
- FIG. 3 shows an external perspective view of the imaging apparatus 1 as seen from the photographer side.
- FIG. 3 also shows a person as a subject. The photographer can confirm the photographing range of the imaging apparatus 1 by confirming the state of the subject displayed on the display screen of the display unit 15.
- the imaging apparatus 1 has a subject tracking function using image processing, and performs a characteristic display when the subject tracking function is realized.
- the subject tracking function is realized in the shooting mode.
- An operation mode that realizes the subject tracking function, which is one form of the shooting mode, is referred to as a tracking mode.
- An operation in the tracking mode is executed by performing a predetermined operation on the operation unit 17.
- the operations described below are operations of the imaging apparatus 1 in the tracking mode unless otherwise specified.
- hereinafter, "display screen" refers to the display screen of the display unit 15, and "display" by itself means display on the display screen of the display unit 15.
- Data representing an image is called image data.
- Image data of a certain frame image is generated from an output signal of the AFE 12 that represents an optical image of the frame image.
- FIG. 4 is a block diagram of a part of the imaging apparatus 1 that is particularly related to the subject tracking function.
- the tracking processing unit 51, the tracking reliability evaluation unit 52, and the display control unit 53 are provided in the main control unit 13 of FIG. 1.
- the image data of each frame image forming the frame image sequence is sequentially given to the tracking processing unit 51, the tracking reliability evaluation unit 52, and the display control unit 53.
- the tracking processing unit 51 tracks the position of the specific subject in the frame image sequence by sequentially detecting the position of the specific subject in each frame image based on the image data of the frame image sequence.
- the specific subject is a person. Therefore, hereinafter, the specific subject to be tracked is referred to as a tracking target person.
- the tracking processing unit 51 also has a function as a face detection unit (not shown), detects a human face from the frame image based on the image data of the frame image, and extracts a face region including the detected face. Processing for realizing this is called face detection processing.
- Various methods are known as a method for detecting a face included in an image, and the tracking processing unit 51 can employ any method.
- a face (face area) may be detected by extracting a skin color area from a frame image as in the technique described in Japanese Patent Application Laid-Open No. 2000-105819, or a face (face area) may be detected using the methods described in Japanese Patent Application Laid-Open No. 2006-21111 or Japanese Patent Application Laid-Open No. 2006-72770.
- the tracking reliability evaluation unit 52 evaluates the reliability of tracking (the degree of tracking reliability) or the ease of tracking (the degree of tracking ease) by the tracking processing unit 51 based on the image data of the frame image sequence.
- the reliability of tracking and the ease of tracking have the same or similar significance. Strictly speaking, the reliability of tracking is interpreted as representing how reliable tracking performed in the past is, while the ease of tracking is interpreted as representing how easily tracking can be performed in the future. The higher the ease of tracking, the higher the reliability of tracking; the lower the ease, the lower the reliability. In the following description, for convenience, what the tracking reliability evaluation unit 52 evaluates is treated as the reliability of tracking, but the reliability of tracking can also be read as the ease of tracking.
- the display control unit 53 controls the display content of the display unit 15, generates image data of a display image from the image data of the frame image, and sends this to the display unit 15, thereby displaying the display image on the display screen.
- since frame images are acquired sequentially, the display image is also periodically generated, updated, and displayed on the display screen. This display image is generated by reflecting the tracking result of the tracking processing unit 51 and the evaluation result of the tracking reliability evaluation unit 52 in the frame image. Since the display image is shown on the display screen, the following description treats the display image and the display screen interchangeably. The horizontal direction of the display image and the display screen is regarded as the left-right direction, and the vertical direction as the up-down direction.
- FIG. 5 is a flowchart showing a flow of operations of the imaging apparatus 1 in the tracking mode.
- the display control unit 53 displays the tracking mode icon before initialization superimposed on the current frame image in step S11.
- the display image 200 displayed in step S11 is shown in FIG. 6A, and the frame image 201 from which the display image 200 is generated is shown in FIG. 6B.
- the icon in the broken-line rectangular area 202 on the lower right side of the display image 200 is the tracking mode icon before initialization.
- the tracking mode icon before initialization is blinking on the display screen (however, execution of this blinking process is not essential).
- the state in which the tracking mode icon before initialization is displayed is continued until just before step S15 is reached.
- FIG. 7A shows an enlarged view of the tracking mode icon.
- the tracking mode icon is formed from a tracking target icon and a level icon arranged in the left-right direction.
- the figure shown in the broken line rectangle 210 is the tracking target icon, and the figure shown in the broken line rectangle 220 is the level icon.
- the tracking target icon is formed from a face icon and a body icon arranged in the vertical direction.
- the figure shown in the broken line rectangle 211 represents a face icon
- the figure shown in the broken line rectangle 212 represents a body icon.
- the face icon represents a figure imitating a human face as an image
- the body icon represents a figure imitating a human torso as an image.
- the broken lines in FIGS. 7A and 7B are shown for convenience of explanation, and the broken lines are not constituent elements of the tracking mode icon (the broken lines are not displayed).
- the level icons are formed from first to third bar icons to which reference numerals 221 to 223 are attached.
- the first to third bar icons are displayed side by side in the vertical direction.
- each of the first to third bar icons represents a parallelogram figure as an image and has a design suggestive of the movement of the tracking target.
- the body icon has a generally rectangular outer shape, and its internal color is changed according to the tracking state.
- the internal color of the body icon in the tracking mode icon before initialization is colorless (that is, transparent). Accordingly, as shown in FIG. 6A, an image of a corresponding part of the frame image is displayed inside the body icon in the display image 200.
- Each bar icon has an outline of a parallelogram as described above, and the internal color of the parallelogram is also changed according to the tracking state.
- the internal color of each bar icon in the tracking mode icon before initialization is colorless.
- the internal color of the body icon and/or the internal color of each bar icon in the tracking mode icon before initialization can also be set to a preset color other than colorless (for example, a translucent color).
- the internal color of the face icon is also colorless, but the internal color of the face icon is arbitrary.
- the tracking processing unit 51 in FIG. 4 performs face detection processing on sequentially input frame images.
- in steps S12 and S13 subsequent to step S11 in FIG. 5, the tracking processing unit 51 checks whether a face has been detected from the current frame image; when a face has been detected and the shutter button 17a is half-pressed, the process proceeds to step S14.
- the transition condition to step S14 here is a half-press operation on the shutter button 17a, but the transition condition can be changed to any other condition. For example, when a predetermined operation is performed on the operation unit 17, the process may proceed from step S13 to step S14.
- step S14 the tracking processing unit 51 recognizes the frame image in which the face is detected immediately before reaching step S14 as the initial setting frame image, and sets the tracking color based on the image data of the initial setting frame image.
- a tracking color setting method will be described with reference to FIG. 8.
- An image 230 in FIG. 8 represents an example of an initial setting frame image.
- a broken-line rectangular area 231 is a face area extracted from the initial setting frame image 230 by the face detection process.
- the tracking processing unit 51 detects a body area 232 that is an area including a human body part corresponding to the face area 231.
- the torso region 232 is a rectangular region that exists below the face region 231 (in the direction from the eyebrow to the mouth).
- the position and size of the body region 232 in the initial setting frame image are determined depending on the position and size of the face region 231.
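The dependence of the body region on the face region can be sketched as follows. The specific scale factors (body width equal to the face width, body height 1.5 times the face height, placed directly below the face) are illustrative assumptions; the patent states only that the body region lies below the face region and that its position and size are determined by those of the face region.

```python
def body_region_from_face(face_x, face_y, face_w, face_h,
                          width_scale=1.0, height_scale=1.5):
    """Derive a rectangular body (torso) region from a detected face region.

    The body region sits directly below the face region; its size scales
    with the face size. The scale factors are illustrative assumptions.
    Returns (x, y, w, h) of the body region.
    """
    body_w = int(face_w * width_scale)
    body_h = int(face_h * height_scale)
    body_x = face_x + (face_w - body_w) // 2   # horizontally centered under the face
    body_y = face_y + face_h                   # top edge just below the face
    return body_x, body_y, body_w, body_h
```

For example, a 40x40 face at (100, 50) would yield a 40x60 body region starting just below the face.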
- the tracking processing unit 51 identifies the color in the body region 232 based on the image data of the image in the body region 232, and sets the identified color as the tracking color. For example, a color histogram of the image in the body region 232 is generated based on the color signal (for example, RGB signal) of each pixel forming the image in the body region 232. Then, the dominant color or the most frequent color in the image in the trunk region 232 is obtained based on the color histogram, and the obtained color is set as the tracking color.
- the dominant color of an image refers to the color occupying most of the image area of the image, and the most frequent color of an image is the color having the highest frequency in the color histogram of the image.
- alternatively, the average color of the image in the body region 232 may be obtained by averaging the color signals (for example, RGB signals) of the pixels forming the image in the body region 232, and the average color may be set as the tracking color.
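The most-frequent-color selection described above can be sketched roughly as follows; the coarse 8-bins-per-channel quantization of the color histogram is an illustrative assumption, not something the patent specifies.

```python
import numpy as np

def most_frequent_color(region, bins=8):
    """Pick the most frequent (modal) color of an RGB region by histogramming
    coarsely quantized colors, as in the tracking-color setting step.

    region: H x W x 3 uint8 array (the body region cut out of the frame).
    Returns the center color of the winning histogram bin.
    """
    step = 256 // bins
    q = region.astype(np.uint32) // step                      # quantize each channel
    codes = q[..., 0] * bins * bins + q[..., 1] * bins + q[..., 2]
    counts = np.bincount(codes.ravel(), minlength=bins ** 3)  # color histogram
    code = int(counts.argmax())                               # most frequent bin
    r, g, b = code // (bins * bins), (code // bins) % bins, code % bins
    return (r * step + step // 2, g * step + step // 2, b * step + step // 2)
```

Averaging the region's color signals instead (the alternative mentioned above) would amount to replacing the histogram with `region.reshape(-1, 3).mean(axis=0)`.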
- step S15 the display control unit 53 displays the tracking mode icon after setting the tracking color superimposed on the current frame image.
- the display of the tracking mode icon after setting the tracking color is performed in step S15 and subsequent step S16.
- An example of the display image displayed here is shown in FIG.
- the tracking mode icon after setting the tracking color is superimposed on the current frame image at the lower right of the display image 240 in FIG.
- the internal color of the body icon in the tracking mode icon after setting the tracking color is the same color as the tracking color (or a color similar to the tracking color). Further, the internal color of each bar icon in the tracking mode icon after setting the tracking color is determined by the reliability of tracking. This will be described later.
- a frame 241 surrounding a part or all of the body region of the tracking target person is displayed in a superimposed manner. However, the display of the frame 241 can be erased according to a user instruction.
- step S16 the tracking processing unit 51 performs a tracking process on the frame image sequence obtained after the process in step S15.
- Each frame image forming the frame image sequence to be subjected to the tracking process is particularly referred to as a tracking target frame image.
- the tracking processing unit 51 detects the position of the tracking target person in each tracking target frame image based on the image data of the tracking target frame image sequence.
- the tracking processing unit 51 performs a tracking process based on the color information of the tracking target person.
- a tracking processing method based on color information methods described in JP-A-5-284411, JP-A-2000-48211, JP-A-2001-169169, and the like can be used.
- the color information of the tracking target person is expressed by the tracking color set as described above. Therefore, the tracking processing unit 51 extracts, from the tracking target frame image, an area having a color having high similarity to the tracking color based on the color signal of the tracking target frame image.
- the region extracted here is regarded as the body region of the tracking target person in the tracking target frame image.
- similarity evaluation between the color of the image within the tracking frame and the tracking color is performed while sequentially changing the position of the tracking frame within the search range, and the body region of the tracking target person is determined to exist at the tracking frame position where the maximum similarity is obtained.
- the search range for the current tracking target frame image is set based on the position of the tracking target person detected from the previous tracking target frame image.
- the tracking processing unit 51 detects the position of the tracking target person in each tracking target frame image by executing the tracking processing based on the color information described above for the tracking target frame images that are input one after another.
- the position of the tracking target person is expressed by the center coordinate value of the body area of the tracking target person.
- the size of the tracking target person on the tracking target frame image changes due to a change in the distance (distance in real space) between the tracking target person and the imaging device 1. For this reason, it is necessary to appropriately change the size of the tracking frame according to the size of the tracking target person on the tracking target frame image.
- this change is realized by using a subject size detection method employed in known tracking algorithms. For example, in the tracking target frame image, the background is considered to appear at a point sufficiently far from the point where the body of the tracking target person is expected to exist, and each pixel between the two points is classified as either background or tracking target person based on the image features between the two points.
- the contour of the tracking target person is estimated by this classification. Then, the size of the tracking target person is estimated from the contour, and the size of the tracking frame is set according to the estimated size.
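- as a rough sketch of the classification step (my own illustration, not the patent's exact algorithm): walking outward from the body center toward a point assumed to show the background, each pixel can be labeled subject or background by comparing its color distance to the tracking color and to the background color; the length of the initial subject run then approximates the half-width of the body, from which the tracking frame size can be set.

```python
import math

def estimate_half_width(pixels, tracking_color, background_color):
    """pixels: colors sampled from the body center (index 0) toward a far
    background point. Returns the estimated half-width of the subject in
    pixels (length of the initial run classified as subject)."""
    half_width = 0
    for i, p in enumerate(pixels):
        # a pixel closer to the tracking color than to the background color
        # is classified as part of the tracking target person
        if math.dist(p, tracking_color) < math.dist(p, background_color):
            half_width = i + 1
        else:
            break  # first background-classified pixel ends the body run
    return half_width
```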
- a frame 241 in FIG. 9 is a frame that surrounds part or all of the body region detected from the tracking target frame image that is the basis of the display image 240, and the display control unit 53 uses the frame 241 as tracking result information. Generate based on.
- FIG. 10 corresponds to a detailed flowchart of the tracking process executed in step S16 of FIG.
- the process of step S16 consists of steps S21 to S24.
- in step S21, the current frame image is acquired from the output signal of the AFE 12 for one frame image at the current time.
- the frame image acquired here is the tracking target frame image as described above.
- in step S22, the tracking processing unit 51 detects the position of the tracking target person in the current frame image obtained in step S21 by the tracking process described above. However, this position is not always detected. For example, if the background color and the tracking color are the same, this position detection fails. Naturally, this position also cannot be detected if the tracking target person is outside the shooting range. If the position of the tracking target person can be detected in step S22, the process of step S23 is executed; if it cannot be detected, the process of step S24 is executed.
- in step S23, the tracking reliability evaluation unit 52 in FIG. 4 evaluates the reliability of the tracking performed on the frame image in step S22, in other words, the reliability of the position of the tracking target person detected from the frame image in step S22. This evaluation is performed based on the background color and the tracking color.
- the evaluation value representing the estimated reliability is referred to as the reliability evaluation value and is denoted by EV_R.
- the reliability evaluation value can be calculated for each tracking target frame image.
- EV_R takes a value of 0 or more and 100 or less, and the reliability evaluation value EV_R increases as the reliability is evaluated to be higher.
- m and n are introduced as symbols representing the horizontal and vertical positions of a small block in the operation image (m is an integer satisfying 1 ≤ m ≤ M, and n is an integer satisfying 1 ≤ n ≤ N). It is assumed that the horizontal position moves to the right as m increases, and the vertical position moves down as n increases.
- a small block whose horizontal position is m and whose vertical position is n is denoted as a small block [m, n].
- the tracking reliability evaluation unit 52 recognizes the center of the body region of the tracking target person in the image to be calculated based on the tracking result information, and specifies which small block the position of the center belongs to.
- a point 300 in FIG. 12 represents this center.
- the center 300 belongs to a small block [m_O, n_O] (m_O is an integer satisfying 1 ≤ m_O ≤ M, and n_O is an integer satisfying 1 ≤ n_O ≤ N).
- each small block is classified into a small block in which a tracking target person is drawn or a small block in which a background is drawn.
- the former small block is called a subject block
- the latter small block is called a background block.
- FIG. 12 illustrates that the color of the tracking target person appearing around the center 300 is different from the background color.
- the tracking reliability evaluation unit 52 calculates, for each background block, a color difference evaluation value representing the difference between the set tracking color and the color of the image in that background block. It is assumed that there are Q background blocks, and the color difference evaluation values calculated for the first to Qth background blocks are represented by C_DIS[1] to C_DIS[Q] (Q is an integer satisfying the inequality "2 ≤ Q ≤ (M × N) − 1"). For example, when the color difference evaluation value C_DIS[1] is calculated, the color signals (for example, RGB signals) of the pixels belonging to the first background block are averaged to obtain the average color of the image in the first background block, and the position of that average color in the RGB color space is detected.
- the position of the tracking color set for the tracking target person in the RGB color space is also detected, and the distance between the two positions in the RGB color space is calculated as the color difference evaluation value C DIS [1].
- the color difference evaluation value C DIS [1] increases as the degree of difference between the compared colors increases.
- the RGB color space is normalized so that the range of values that the color difference evaluation value C DIS [1] can take is 0 or more and 1 or less.
- Other color difference evaluation values C DIS [2] to C DIS [Q] are similarly calculated.
- the color space for obtaining the color difference evaluation value may be other than the RGB color space (for example, HSV color space).
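- the computation of C_DIS[q] described above can be sketched in Python as follows: each background block's pixels are averaged, and the Euclidean distance between the average color and the tracking color is normalized. Normalizing by the diagonal of the 8-bit RGB cube is one possible choice consistent with the stated 0-to-1 range; the pixel-list data layout is an assumption made for illustration.

```python
import math

def color_difference_evaluations(blocks_pixels, tracking_color):
    """blocks_pixels: for each background block, a list of (r, g, b) pixels.
    Returns C_DIS[1..Q]: the distance between each block's average color and
    the tracking color, normalized so that values lie in [0, 1]."""
    diagonal = math.dist((0, 0, 0), (255, 255, 255))  # normalization factor
    c_dis = []
    for pixels in blocks_pixels:
        n = len(pixels)
        # average the color signals of the pixels belonging to the block
        avg = tuple(sum(p[c] for p in pixels) / n for c in range(3))
        c_dis.append(math.dist(avg, tracking_color) / diagonal)
    return c_dis
```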
- the tracking reliability evaluation unit 52 calculates a position difference evaluation value representing a spatial position difference between the center 300 and the background block on the operation image for each background block.
- the position difference evaluation values calculated for the first to Qth background blocks are represented by P DIS [1] to P DIS [Q], respectively.
- the position difference evaluation value for a certain background block is the distance between the center 300 and the nearest vertex to the center 300 among the four vertices of the background block.
- suppose that the small block [1,1] is the first background block, that 1 < m_O and 1 < n_O, and that, as shown in FIG. 12, the vertex 301 is the vertex closest to the center 300. In that case, the position difference evaluation value P_DIS[1] is the spatial distance between the center 300 and the vertex 301 on the operation image. It is assumed that the spatial region of the image to be calculated is normalized so that the range of values that P_DIS[1] can take is 0 or more and 1 or less. The other position difference evaluation values P_DIS[2] to P_DIS[Q] are calculated in the same manner.
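- the computation of P_DIS[q] can likewise be sketched. Here the distance from the center 300 to the nearest of a background block's four vertices is normalized by the image diagonal, which is one possible normalization consistent with the 0-to-1 range stated above, not necessarily the one the patent intends.

```python
import math

def position_difference_evaluation(center, block_rect, image_size):
    """Distance from the body-region center to the nearest of the four
    vertices of a background block, normalized to lie in [0, 1].
    center: (cx, cy); block_rect: (x0, y0, x1, y1); image_size: (w, h)."""
    cx, cy = center
    x0, y0, x1, y1 = block_rect
    vertices = [(x0, y0), (x1, y0), (x0, y1), (x1, y1)]
    nearest = min(math.hypot(vx - cx, vy - cy) for vx, vy in vertices)
    w, h = image_size
    return nearest / math.hypot(w, h)  # normalize by the image diagonal
```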
- in step S23 of FIG. 10, the display control unit 53 determines the color of the level icon to be superimposed on the current frame image (the latest frame image) based on the reliability evaluation value EV_R calculated for that frame image. Then, a display image is generated and displayed by superimposing a tracking mode icon including a level icon with the determined color on the current frame image.
- FIGS. 13A to 13C show level icons to be superimposed on the frame image in step S23.
- the reliability evaluation value EV_R is classified into three stages. If the inequality "EV_R ≥ TH_1" is satisfied, it is determined that the tracking reliability is high and, as shown in FIG. 13A, the internal colors of the first to third bar icons forming the level icon to be superimposed are all set to the specified color.
- the specified color used as the internal color of a bar icon means a color (for example, red) other than a preset colorless state. If the inequality "TH_1 > EV_R ≥ TH_2" is satisfied, it is determined that the reliability of tracking is medium, and a level icon as shown in FIG. 13B is superimposed.
- in this case, the internal colors of the first and second bar icons are set to the specified color, while the internal color of the third bar icon is set to colorless. If the inequality "TH_2 > EV_R" is satisfied, it is determined that the reliability of tracking is low and, as shown in FIG. 13C, the internal color of the first bar icon forming the level icon to be superimposed is set to the specified color, while the internal colors of the second and third bar icons are colorless.
- TH_1 and TH_2 are predetermined thresholds satisfying the inequality "100 > TH_1 > TH_2 > 0".
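- the three-stage mapping from EV_R to bar-icon colors can be summarized as follows. The threshold values TH_1 = 70 and TH_2 = 40 are placeholders (the patent only requires 100 > TH_1 > TH_2 > 0), and "red"/"none" stand for the specified color and the colorless state.

```python
def level_icon_colors(ev_r, th1=70, th2=40, specified="red", colorless="none"):
    """Map a reliability evaluation value EV_R (0..100) to the internal
    colors of the first to third bar icons of the level icon."""
    if ev_r >= th1:        # high reliability: all three bars colored
        lit = 3
    elif ev_r >= th2:      # medium reliability: two bars colored
        lit = 2
    else:                  # low reliability: one bar colored
        lit = 1
    return [specified] * lit + [colorless] * (3 - lit)
```

When the tracking target person is lost (step S24), all three bars are colorless; that case is handled outside this function in the sketch above.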
- in step S24, which is reached when the position of the tracking target person cannot be detected, the level icon reflects that the tracking target person has been lost.
- specifically, the display control unit 53 generates and displays a display image by superimposing, on the current frame image, a tracking mode icon in which the internal colors of the first to third bar icons are all colorless.
- after the display in step S23 or S24, the process returns to step S21, and the processing of steps S21 to S24 described above is repeatedly executed.
- FIGS. 14A to 14D show the state of the display screen displayed during the execution of the tracking process.
- FIGS. 14A, 14B, and 14C show the state of the display screen when the tracking reliability is high, medium, and low, respectively, and
- FIG. 14D shows the state of the display screen when the tracking target person is lost.
- when the shutter button 17a is fully pressed during the above-described tracking process, a still image focused on the tracking target person is taken, and the image data of the still image is recorded on the recording medium 16.
- after controlling the position of the image sensor 33 so that the tracking target person is positioned at the center of the imaging surface of the image sensor 33, a still image in which the tracking target person is positioned at the center may be acquired by capturing the output signal of the image sensor 33.
- image trimming may be performed so as to obtain a still image in which the tracking target person is arranged in the center.
- the tracking reliability is classified into a plurality of stages and evaluated, and the evaluation result is displayed. Thereby, the reliability of tracking can be conveyed to the user. If the user can know the tracking reliability, the user recognizes the necessity of composition adjustment or camera operation that does not rely too heavily on the tracking process in order to increase the reliability. As a result, it is possible to avoid a situation in which shooting ends without a desired image being acquired. Further, as shown in FIGS. 7A to 7C, the tracking mode icon includes a tracking target icon imitating a person and a level icon whose design suggests the movement of the tracking target, so that it is possible to intuitively inform the user that the icon is related to tracking.
- the tracking color is set to the internal color of the body icon, so that it is possible to easily indicate to the user which subject the imaging apparatus 1 is tracking.
- until the tracking color is set, the internal color of the body icon is colorless. Thereby, it is possible to clearly notify the user whether or not tracking has started to function.
- for example, assume that the tracking processing unit 51 sets two tracking colors, red and blue, based on the image data of the frame image including the tracking target person.
- in this case, the display control unit 53 displays a body icon having red and blue as its internal colors as shown in FIG.
- in the tracking process, a tracking frame for searching for an image area having red and a tracking frame for searching for an image area having blue are set in the tracking target frame image, and the position of the tracking target person may be detected based on the search results of both image areas.
- the image area including red and blue may be considered as a set, and the position of the tracking target person may be detected by searching for the image area including red and blue in the tracking target frame image.
- search priorities may be assigned to red and blue.
- if the priority of red is higher, first, a tracking frame for searching for an image area having red is set in the tracking target frame image, and the position of the tracking target person is detected based on the search result of that image area. If that search fails, a tracking frame for searching for an image area having blue is set in the tracking target frame image, and the position of the tracking target person is detected based on the search result of that image area.
- the priority color can be set by an instruction from the user using an operation on the operation unit 17. For example, when red is set as the priority color, the tracking processing unit 51 preferentially sets red as the tracking color, and searches and tracks an image area having red from the tracking target frame image.
- in step S11 of FIG. 5, a tracking mode icon including a body icon whose internal color is red is displayed.
- when the first face region and the second face region are extracted from the initial setting frame image, the tracking processing unit 51 captures the person corresponding to the red body region as the tracking target person based on the set priority color.
- specifically, if it is determined that the color of the body region corresponding to the first face region is red or a color similar to red, and that the color of the body region corresponding to the second face region is blue or a color similar to blue, the person corresponding to the first face region is set as the tracking target person, and the color of the body region corresponding to the first face region is set as the tracking color.
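- the priority-color selection just described might look like this in outline. The Euclidean RGB distance and the similarity threshold of 100 are assumptions of this sketch, standing in for the patent's unspecified notion of a "similar color".

```python
import math

def select_tracking_target(body_colors, priority_color, similarity_threshold=100.0):
    """body_colors: RGB colors of the body regions of the candidate persons.
    Returns (index, tracking_color) for the person whose body color is the
    priority color or closest to it within the threshold, else (None, None).
    The selected color is then set as the tracking color."""
    best_i, best_d = None, similarity_threshold
    for i, color in enumerate(body_colors):
        d = math.dist(color, priority_color)  # distance to the priority color
        if d < best_d:
            best_i, best_d = i, d
    if best_i is None:
        return None, None
    return best_i, body_colors[best_i]
```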
- the subsequent tracking operation is as described above.
- a plurality of persons can also be tracked as tracking target persons, and tracking target icons for the number of persons can be displayed.
- the tracking processing unit 51 extracts first to third face areas corresponding to the first to third persons from the initial setting frame image.
- the tracking processing unit 51 extracts the first to third body regions corresponding to the first to third face regions from the initial setting frame image, specifies the color in each body region based on the image data of the image in that body region, and sets the colors specified for the first to third body regions as the first to third tracking colors, respectively.
- the tracking processing unit 51 treats the first to third persons as the first to third tracking target persons, and individually detects the position of each tracking target person in the tracking target frame image sequence, thereby detecting each tracking. Track the target person individually.
- the specific example shown here will be referred to as the "multiple-person specific example α".
- in this example, the first to third persons are included in the shooting range, the first to third face regions and the first to third body regions corresponding to the first to third persons are extracted, the first to third tracking colors are set, and the first to third persons are treated as the first to third tracking target persons, each of whom is tracked individually.
- the operation and display method under the assumption of the multiple-person specific example α will be described.
- FIG. 16A shows a display image 400 that can be generated from a tracking target frame image.
- in the display image 400, a tracking mode icon is drawn in which the first to third tracking target icons corresponding to the first to third tracking target persons and one level icon are arranged side by side in the left-right direction so as not to overlap each other.
- FIG. 16B shows an enlarged view of the first to third tracking target icons in the display image 400.
- the first to third tracking target icons are displayed side by side so as not to overlap each other, and the internal colors of the body icons in the first to third tracking target icons are set to the first to third tracking colors, respectively.
- the tracking processing unit 51 tries to individually detect the positions of the first to third tracking target persons on the tracking target frame image.
- a tracking target person whose position is successfully detected is actually being tracked and is called an actual tracking person.
- a tracking target person whose position cannot be detected is not actually being tracked and is called a tracking lost person.
- the display format of the tracking target icon corresponding to an actual tracking person may differ from that corresponding to a tracking lost person. As a result, whether or not tracking is succeeding for each tracking target person can be conveyed to the user in an easy-to-understand manner.
- in the examples of FIGS. 17A and 17B, the outline of the tracking target icon corresponding to the actual tracking person is made relatively thick, while the outline of the tracking target icon corresponding to the tracking lost person is made relatively thin.
- FIG. 17A corresponds to the case where the first and second persons are actual tracking persons and the third person is a tracking lost person.
- FIG. 17B corresponds to the case where the first and third persons are actual tracking persons and the second person is a tracking lost person. If all of the first to third persons are actual tracking persons, it is preferable that the outlines of the first to third tracking target icons are all relatively thick.
- other display formats can also be considered: different display formats may be realized by the presence or absence of blinking, a change in display size, a change in display color, and the like. That is, for example, the tracking target icon corresponding to an actual tracking person may be displayed constantly on the display screen, while the tracking target icon corresponding to a tracking lost person may blink on the display screen. Alternatively, for example, the display size of the tracking target icon corresponding to an actual tracking person may be made larger than that of a tracking lost person.
- when a tracking mode icon is displayed in which the first to third tracking target icons corresponding to the first to third tracking target persons and one level icon are arranged side by side in the left-right direction so as not to overlap each other, and actual tracking persons and tracking lost persons are mixed among the first to third tracking target persons, the tracking target icons corresponding to the actual tracking persons may be displayed on the level-icon side of those of the tracking lost persons.
- for example, as shown in FIG. 18A, the icons are displayed side by side in the order of the level icon, the tracking target icon corresponding to the first person, the tracking target icon corresponding to the second person, and the tracking target icon corresponding to the third person. Thereafter, if the first and second persons change to tracking lost persons and the third person changes to an actual tracking person, then, as shown in FIG. 18B,
- the icons are displayed side by side in the order of the level icon, the tracking target icon corresponding to the third person, the tracking target icon corresponding to the first person, and the tracking target icon corresponding to the second person.
- the tracking reliability evaluation unit 52 individually calculates a reliability evaluation value for each of the plurality of actual tracking persons, thereby evaluating the tracking reliability for each actual tracking person.
- the display control unit 53 identifies the maximum reliability evaluation value among the reliability evaluation values obtained for the actual tracking persons, arranges the tracking target icon of the actual tracking person corresponding to the maximum reliability evaluation value closest to the level icon, and reflects the maximum reliability evaluation value in the level icon.
- this will be described more specifically on the assumption that a tracking mode icon is displayed in which the first to third tracking target icons corresponding to the first to third tracking target persons and one level icon are arranged side by side in the left-right direction so as not to overlap each other. Now, assume that the second and third persons are actual tracking persons and the first person is a tracking lost person. Further, the reliability evaluation values calculated for the second and third tracking target persons using the image data of one focused tracking target frame image are represented by EV_R2 and EV_R3, respectively. It is assumed that the inequality "EV_R2 > EV_R3" holds.
- a tracking mode icon as shown in FIG. 19A is displayed superimposed on the focused tracking target frame image. That is, from the left side to the right side of the display screen, the level icon, the tracking target icon corresponding to the second person, the tracking target icon corresponding to the third person, and the tracking target icon corresponding to the first person are displayed side by side in this order. Considering that the first person is a tracking lost person, the display position of the tracking target icon corresponding to the first person is farthest from the display position of the level icon (see FIGS. 18A and 18B). Then, the internal color of each bar icon in the level icon is determined according to the reliability evaluation value EV_R2, which is the maximum reliability evaluation value. In this example, since the inequality "EV_R2 ≥ TH_1" is satisfied, the internal colors of the first to third bar icons are all set to a specified color other than colorless.
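- the ordering rule described here, in which actual tracking persons sit nearer the level icon (with the maximum reliability closest) and the maximum reliability evaluation value drives the level icon, can be sketched as follows. The dictionary representation of a person is an assumption of this sketch.

```python
def order_tracking_icons(persons):
    """Order tracking target icons from the level-icon side outward:
    actual tracking persons first (highest reliability closest to the level
    icon), tracking lost persons last. Each person is a dict with keys
    'name', 'tracking' (bool), and 'ev' (reliability value, None when lost).
    Returns (ordered names, reliability value reflected in the level icon)."""
    actual = [p for p in persons if p["tracking"]]
    lost = [p for p in persons if not p["tracking"]]
    actual.sort(key=lambda p: p["ev"], reverse=True)  # max reliability first
    order = [p["name"] for p in actual] + [p["name"] for p in lost]
    level_ev = actual[0]["ev"] if actual else None  # drives the level icon
    return order, level_ev
```

For the variant described next, where the minimum reliability evaluation value is reflected instead, the sort direction and the selected value would simply be reversed.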
- the tracking target icon of the tracking target person corresponding to the minimum reliability evaluation value may be arranged closest to the level icon, and the minimum reliability evaluation value may be reflected on the level icon.
- a tracking mode icon as shown in FIG. 19B is displayed superimposed on the focused tracking target frame image. That is, from the left side to the right side of the display screen, the level icon, the tracking target icon corresponding to the third person, the tracking target icon corresponding to the second person, and the tracking target icon corresponding to the first person are displayed side by side in this order. Then, the internal color of each bar icon in the level icon is determined according to the reliability evaluation value EV_R3, which is the minimum reliability evaluation value.
- when there are a plurality of actual tracking persons, it is possible to display a tracking mode icon 420 having two sets of level icons and tracking target icons as shown in FIG. 20.
- the tracking mode icon 420 is formed by arranging the level icon and tracking target icon belonging to the first set and the level icon and tracking target icon belonging to the second set in the left-right direction so as not to overlap each other.
- a method for generating the tracking mode icon 420 will be described on the assumption that all of the first to third persons are actual tracking persons.
- the reliability evaluation values calculated for the first to third tracking target persons using the image data of one focused tracking target frame image are represented by EV R1 to EV R3, respectively.
- it is assumed that the inequality "EV_R2 > EV_R1 > EV_R3" holds.
- it is also assumed that the inequalities "EV_R2 ≥ TH_1" and "TH_2 > EV_R3" are satisfied (that is, the tracking reliability for the second person is high and the tracking reliability for the third person is low).
- the display control unit 53 specifies the maximum reliability evaluation value and the minimum reliability evaluation value among the reliability evaluation values obtained for the actual tracking persons, reflects the maximum reliability evaluation value and the tracking color of the actual tracking person corresponding to it in the level icon and tracking target icon belonging to the first set, and reflects the minimum reliability evaluation value and the tracking color of the actual tracking person corresponding to it in the level icon and tracking target icon belonging to the second set.
- the tracking mode icon displayed in this case is shown in FIG.
- the maximum reliability evaluation value is EV R2 .
- the first set's level icon is generated according to EV_R2, and the internal color of the body icon of the first set's tracking target icon is set to the tracking color for the second tracking target person. Since the inequality "EV_R2 ≥ TH_1" is satisfied, the internal colors of the first to third bar icons in the first set's level icon are all set to a specified color other than colorless.
- the minimum reliability evaluation value is EV R3 .
- the second set's level icon is generated according to EV_R3, and the internal color of the body icon of the second set's tracking target icon is set to the tracking color for the third tracking target person.
- since the inequality "TH_2 > EV_R3" is satisfied, in the second set's level icon, only the internal color of the first bar icon is set to a specified color other than colorless, while the internal colors of the second and third bar icons are colorless.
- it is also possible to display a tracking mode icon having three or more sets of level icons and tracking target icons, for example, one set of a level icon and a tracking target icon for each actual tracking person.
- the display size of the tracking target icon may be changed according to the tracking priority.
- assume that a tracking mode icon in which the first to third tracking target icons corresponding to the first to third tracking target persons and one level icon are arranged side by side in the left-right direction so as not to overlap each other is generated and displayed. If the tracking priority set for the first tracking target person is the highest, the display size of the tracking target icon corresponding to the first tracking target person is set larger than those corresponding to the second and third tracking target persons.
- further, if the tracking priority set for the second tracking target person is higher than that set for the third tracking target person, as shown in FIG. 21B, the display size of the tracking target icon corresponding to the second tracking target person may be made larger than that corresponding to the third tracking target person.
- the internal color of each bar icon forming the level icon is determined according to the reliability evaluation value of the tracking target person for which the highest tracking priority is set, for example.
- the internal color of each bar icon may be determined according to the method described in the sixth applied display example (see FIGS. 19A and 19B).
- the tracking priority setting method is arbitrary.
- the tracking priority can be set according to the tracking priority instruction by the user.
- the tracking priority instruction is given to the imaging apparatus 1 by a predetermined operation on the operation unit 17 or the like.
- a higher tracking priority may be automatically given to a tracking target person whose body region color is the priority color or a color similar to the priority color than to the other tracking target persons.
- an image of the face of a person who wants to give a high tracking priority may be stored in advance in the imaging apparatus 1 as a registered face image.
- the tracking processing unit 51 evaluates, based on the image data of each face region and the image data of the registered face image, the degree of similarity between the image of each face region and the registered face image, and determines whether or not the plurality of face regions include a face region that gives a similarity higher than a predetermined reference similarity.
- if such a face region is included, a relatively high tracking priority is set for the person corresponding to that face region, and a relatively low tracking priority is set for the persons corresponding to the other face regions.
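- the registered-face comparison can be outlined as below. Normalized correlation over raw pixel lists is used purely as a stand-in similarity measure (the patent does not specify one), and the reference similarity of 0.8 is an assumed value.

```python
def assign_priorities(face_images, registered_face, reference_similarity=0.8):
    """face_images / registered_face: flat lists of pixel intensities.
    Returns a 'high'/'low' tracking priority per detected face."""
    def similarity(a, b):
        # normalized correlation coefficient between two pixel lists
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        da = [x - ma for x in a]
        db = [x - mb for x in b]
        denom = (sum(x * x for x in da) * sum(x * x for x in db)) ** 0.5
        return sum(x * y for x, y in zip(da, db)) / denom if denom else 0.0
    sims = [similarity(f, registered_face) for f in face_images]
    best = max(range(len(sims)), key=sims.__getitem__)
    if sims[best] > reference_similarity:
        # the matching face gets a high priority, all others a low priority
        return ["high" if i == best else "low" for i in range(len(sims))]
    return ["low"] * len(sims)  # no face exceeds the reference similarity
```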
- the tracking mode icon 440 shown in FIG. 22 is formed of one level icon and three tracking target icons 441 to 443 that overlap one another front to back.
- the level icon and the tracking target icon group are arranged side by side in the left-right direction so as not to overlap each other.
- the tracking target icons 441 to 443 are first to third tracking target icons corresponding to the first to third tracking target persons, respectively.
- the front-rear relationship of the display positions of the tracking target icons 441 to 443 may be determined according to the tracking priority.
- here, the tracking priority set for the first tracking target person is higher than those set for the second and third tracking target persons, and according to this tracking priority setting,
- the first tracking target icon 441 is superimposed and displayed in front of the second and third tracking target icons 442 and 443. For this reason, the entire tracking target icon 441 can be visually recognized, while part of the tracking target icon 442 and part of the tracking target icon 443 are shielded by the tracking target icon 441 and are not visible to the observer. In other words, the tracking target icon 441 is displayed in its entirety, whereas only parts of the tracking target icons 442 and 443 are displayed.
- in the examples described above, the tracking reliability is expressed by the level icon, but it may instead be expressed by the color of the body icon.
- the display control unit 53 generates a display image by superimposing a tracking mode icon including only the tracking target icon on the tracking target frame image.
- a reliability evaluation value EV_R for a certain tracking target person is calculated using the image data of the focused tracking target frame image, and the magnitude relation between the reliability evaluation value EV_R and the above-described thresholds TH_1 and TH_2 is assessed. If the inequality "EV_R ≥ TH_1" holds, it is determined that the reliability of tracking is high, and the internal color of the body icon of the tracking target icon to be superimposed on the tracking target frame image is set to the first color as shown in FIG. 23A. If the inequality "TH_1 > EV_R ≥ TH_2" holds, it is determined that the tracking reliability is medium, and the internal color of the body icon is set to the second color as shown in FIG. 23B. If the inequality "TH_2 > EV_R" is satisfied, it is determined that the tracking reliability is low, and the internal color of the body icon is set to the third color as shown in FIG. 23C.
- the first to third colors are predetermined colors different from each other.
- the first, second, and third colors are green, yellow, and red, respectively. Note that when the reliability of tracking is expressed by the color of the body icon, the color of the body icon cannot be the tracking color. Therefore, in this case, another icon expressing the tracking color may be separately prepared and displayed.
- the face icon may be a simple graphic image imitating a human face, but the face icon may be generated using an image obtained from the output signal of the image sensor 33. Specifically, when the face area of the tracking target person is detected from the initial setting frame image, the image itself in the face area is used as the face icon image. Alternatively, an image imitating the face of the person to be tracked is generated based on the image data in the face area, and the generated image is used as a face icon image.
- FIG. 24 shows a tracking mode icon including such a face icon.
- a plurality of registered icons may be recorded in the imaging device 1 in advance, and the user may select an icon to be used as a face icon from among the plurality of registered icons.
- the user can include desired icons among the plurality of registered icons; for example, a face image photographed in advance with the imaging apparatus 1 can also be included in the plurality of registered icons.
- the gender of the person to be tracked may be determined based on the image data of the frame image, and the shape of the body icon may be changed depending on the gender.
- When the tracking target person is a man, the shape of the body icon corresponding to the tracking target person is an approximately square figure with a curved upper portion, as shown in FIG. 25A; when the tracking target person is a woman, the shape of the body icon is a trapezoid whose longer side is at the bottom (a trapezoid suggesting a skirt).
- A known method (for example, the method described in Japanese Patent Application Laid-Open No. 2004-246456) can be used as the gender discrimination method.
- the size of the tracking target person estimated using the above-described subject size detection method may be reflected in the size of the tracking target icon.
- the focused tracking target frame image includes a first tracking target person and a second tracking target person.
- the second tracking target person is larger in size than the first tracking target person.
- In this case, a tracking mode icon including a tracking target icon 461 corresponding to the first tracking target person and a tracking target icon 462 corresponding to the second tracking target person is displayed superimposed on the tracking target frame image.
- the display size of the tracking target icon 462 is larger than the display size of the tracking target icon 461.
- The tracking target icons 461 and 462 are arranged side by side so as not to overlap each other. However, it is also possible to let them overlap and determine their front-to-back order in accordance with the tracking priority (see the eighth applied display example).
- A tracking mode icon may also be formed using a tracked pattern. For example, when the pattern of the upper-body clothing of the tracking target person is a horizontal stripe pattern, the template pattern is also a horizontal stripe pattern.
- a motion amount icon representing the amount of motion of the tracking target person on the image may be added to the arbitrary tracking mode icon described above.
- the tracking mode icon including the motion amount icon is displayed superimposed on each frame image when the tracking process is executed. If the tracking processing unit 51 obtains a change in the position of the tracking target person between a plurality of tracking target frame images, the amount of movement of the tracking target person on the image can be determined. This amount of movement takes a value of 0 or more that increases as the position change increases.
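- One conceivable way to turn the inter-frame position changes into such a motion amount (the averaging formula below is an assumption, not the patent's definition):

```python
import math

def motion_amount(positions):
    """Illustrative motion-amount measure EV_M: the mean displacement of
    the tracking target's detected (x, y) position across consecutive
    tracking target frame images. Returns a value of 0 or more that
    increases as the position change increases."""
    if len(positions) < 2:
        return 0.0
    total = sum(math.hypot(x2 - x1, y2 - y1)
                for (x1, y1), (x2, y2) in zip(positions, positions[1:]))
    return total / (len(positions) - 1)
```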
- FIG. 28 shows a motion amount icon.
- the movement amount icon is formed by three bar icons 501 to 503.
- Bar icons 501 to 503 are displayed side by side in the vertical direction.
- Each of the bar icons 501 to 503 is drawn as a parallelogram-shaped figure.
- The internal colors of the bar icons 501 to 503 can be changed according to the amount of motion. A method for determining the internal colors of the bar icons 501 to 503 will now be described. Let EV M denote the determined motion amount.
- the main control unit 13 compares the motion amount EV M with predetermined threshold values TH M1 to TH M3 .
- The specified color used as the internal color of the bar icons 501 to 503 means a preset color other than colorless (for example, red). When the inequality “EV M ≥ TH M1 ” is satisfied, the internal colors of all the bar icons 501 to 503 are the specified color. When the inequality “TH M1 > EV M ≥ TH M2 ” is satisfied, as shown in FIG. 29B, the internal colors of the bar icons 501 and 502 are the specified color while the internal color of the bar icon 503 is colorless. When the inequality “TH M2 > EV M ≥ TH M3 ” is satisfied, the internal color of the bar icon 501 is the specified color while the internal colors of the bar icons 502 and 503 are colorless. And when the inequality “TH M3 > EV M ” is satisfied, the internal colors of the bar icons 501 to 503 are all colorless.
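- The four-way threshold comparison above can be sketched as follows (function and variable names are illustrative):

```python
def bar_icon_colors(ev_m, th_m1, th_m2, th_m3, specified="red"):
    """Return the internal colors of bar icons 501 to 503 for a motion
    amount EV_M, assuming th_m1 > th_m2 > th_m3. The number of bars
    filled with the specified color grows with EV_M; unfilled bars are
    colorless (None)."""
    filled = sum(ev_m >= th for th in (th_m1, th_m2, th_m3))
    # index 0 -> bar icon 501, 1 -> 502, 2 -> 503
    return [specified] * filled + [None] * (3 - filled)
```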
- a tracking mode icon having only such a motion amount icon and a tracking target icon may be displayed. That is, it is also possible to delete the level icon representing the reliability of tracking described above from the tracking mode icon and display the motion amount icon instead.
- In any of the tracking mode icons described above, the motion amount may also be expressed by a change in the shape of the level icon. That is, for example, as shown in FIG. 30, the horizontal length of each bar icon forming the level icon is set to L 1 when the inequality “EV M ≥ TH M1 ” is satisfied, to L 2 when the inequality “TH M1 > EV M ≥ TH M2 ” is satisfied, to L 3 when the inequality “TH M2 > EV M ≥ TH M3 ” is satisfied, and to L 4 when the inequality “TH M3 > EV M ” is satisfied.
- the length here is the length on the display screen, and L 1 > L 2 > L 3 > L 4 .
- the internal color of each bar icon forming the level icon can be determined according to the tracking reliability.
- In the above, the case where the tracking target is a person (or a specific part of a person) has been illustrated, but the tracking target may be other than a person (or a specific part of a person).
- the tracking target may be a vehicle such as an automobile or a moving robot.
- the imaging apparatus 1 in FIG. 1 can be realized by hardware or a combination of hardware and software.
- each part referred to by reference numerals 51 to 53 in FIG. 4 can be realized by hardware, software, or a combination of hardware and software.
- When the imaging device 1 is configured using software, a block diagram of a part realized by software represents a functional block diagram of that part. In addition, all or part of the arithmetic processing performed by the parts referred to by reference numerals 51 to 53 in FIG. 4 may be described as a program, and all or part of that processing may be realized by executing the program on a program execution device (for example, a computer).
Abstract
Description
15 Display unit
33 Image sensor
51 Tracking processing unit
52 Tracking reliability evaluation unit
53 Display control unit
Next, with reference to FIG. 10, a method for evaluating and displaying the reliability of tracking will be described. FIG. 10 corresponds to a detailed flowchart of the tracking process executed in step S16 of FIG. 5. In this example, the process of step S16 is formed from steps S21 to S24.
When the clothing of the tracking target person is not a single color but includes a plurality of colors, the interior of the body icon may be filled with a plurality of colors.
A priority color can be set by a user instruction given through operation of the operation unit 17. For example, when red is set as the priority color, the tracking processing unit 51 preferentially sets red as the tracking color, searches the tracking target frame image for an image region having red, and tracks it.
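A minimal sketch of such a color-based region search, assuming a simple block scan and Euclidean distance in RGB (the block size, scan strategy, and distance metric are assumptions, not specified by the patent):

```python
import numpy as np

def find_tracking_color_region(frame_rgb, tracking_color, win=16):
    """Slide a win x win window over the frame and return the (row, col)
    of the window whose mean color is closest to the tracking color.
    A stand-in for searching the tracking target frame image for 'an
    image region having the tracking color'."""
    h, w, _ = frame_rgb.shape
    target = np.asarray(tracking_color, dtype=float)
    best, best_pos = float("inf"), (0, 0)
    for r in range(0, h - win + 1, win):
        for c in range(0, w - win + 1, win):
            mean = frame_rgb[r:r + win, c:c + win].reshape(-1, 3).mean(axis=0)
            d = float(np.linalg.norm(mean - target))
            if d < best:
                best, best_pos = d, (r, c)
    return best_pos
```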
When a plurality of persons exist within the shooting range, the plurality of persons can be tracked as tracking target persons, and as many tracking target icons as there are persons can be displayed.
In the tracking process, the tracking processing unit 51 attempts to individually detect the positions of the first to third tracking target persons on the tracking target frame image. However, due to similarity between the background color and the tracking color, occlusion by obstacles, and the like, it may happen that the position of one tracking target person can be detected while the position of another cannot. The former is a tracking target person who is actually being tracked, and is called an actually tracked person. The latter is a tracking target person who is not actually being tracked, and is called a tracking-lost person.
Also, as shown in FIG. 16(a), when displaying a tracking mode icon formed from first to third tracking target icons corresponding to the first to third tracking target persons and one level icon, arranged so as not to overlap one another in the horizontal direction, if actually tracked persons and tracking-lost persons are mixed among the first to third tracking target persons, the tracking target icons corresponding to the actually tracked persons may be displayed closer to the level icon than those of the tracking-lost persons.
Also, when a plurality of actually tracked persons exist, the tracking reliability evaluation unit 52 calculates a reliability evaluation value individually for each of the actually tracked persons, thereby evaluating the reliability of tracking per actually tracked person. The display control unit 53 identifies the maximum reliability evaluation value among those obtained for the actually tracked persons, places the tracking target icon of the actually tracked person corresponding to the maximum reliability evaluation value closest to the level icon, and reflects that maximum reliability evaluation value in the level icon.
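The selection of the maximum reliability evaluation value and the corresponding icon ordering might be sketched as follows (the function name and data layout are illustrative):

```python
def order_icons_by_reliability(ev_by_person):
    """Sort actually tracked persons so that the one with the maximum
    reliability evaluation value comes first (i.e., is placed nearest
    the level icon), and also return the maximum value, which is the
    one reflected in the level icon."""
    ordered = sorted(ev_by_person, key=ev_by_person.get, reverse=True)
    return ordered, ev_by_person[ordered[0]]
```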
Also, when a plurality of actually tracked persons exist, it is possible to display a tracking mode icon 420 having two sets of a level icon and a tracking target icon, as shown in FIG. 20(a). The tracking mode icon 420 is formed by arranging, side by side in the horizontal direction so as not to overlap each other, a level icon and tracking target icon belonging to a first set and a level icon and tracking target icon belonging to a second set. The method of generating the tracking mode icon 420 will be described on the assumption that the first to third persons are all actually tracked persons.
Also, when different tracking priorities are set among the first to third tracking target persons, the display size of the tracking target icons may be changed according to the tracking priority.
Also, in each of the examples described above, the reliability of tracking is expressed by the level icon, but it may instead be expressed by the color of the body icon. In this case, the display control unit 53 generates the display image by superimposing a tracking mode icon containing only the tracking target icon on the tracking target frame image.
The face icon may be a simple graphic image imitating a human face, but the face icon may also be generated using an image obtained from the output signal of the image sensor 33. Specifically, when the face region of the tracking target person is detected from the initial-setting frame image, the image within that face region itself may be used as the face icon image. Alternatively, an image imitating the face of the tracking target person may be generated based on the image data within the face region, and the generated image may be used as the face icon image. FIG. 24 shows a tracking mode icon including such a face icon.
The tracking process can also be performed using pattern matching. For example, when a face region is extracted from the initial-setting frame image and a body region corresponding to that face region is extracted, the image of the body region is recorded as a template. Thereafter, the position of the tracking target person on the tracking target frame image is detected (i.e., tracking is performed) by searching the tracking target frame image for an image region having high similarity to the template.
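A minimal sum-of-squared-differences search is one way to realize the template matching described above (a real implementation would typically normalize the score and restrict the search to a window around the previous position):

```python
import numpy as np

def match_template(frame_gray, template_gray):
    """Exhaustive SSD template search: return the (row, col) at which
    the template best matches the frame (lower SSD = higher similarity)."""
    fh, fw = frame_gray.shape
    th, tw = template_gray.shape
    best, best_pos = float("inf"), (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            patch = frame_gray[r:r + th, c:c + tw]
            ssd = float(((patch - template_gray) ** 2).sum())
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos
```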
Also, a motion amount icon representing the motion amount of the tracking target person on the image may be added to any of the tracking mode icons described above. In this case, the tracking mode icon including the motion amount icon is displayed superimposed on each frame image when the tracking process is executed. If the tracking processing unit 51 obtains the change in position of the tracking target person between a plurality of tracking target frame images, the motion amount of the tracking target person on the image can be determined. This motion amount takes a value of 0 or more that increases as the position change increases.
The specific numerical values shown in the above description are merely examples and can, of course, be changed to various other values. As modifications of or notes on the above embodiment, Notes 1 to 3 are given below. The contents of the notes can be combined arbitrarily as long as no contradiction arises.
In the above embodiment, the case where the tracking target is a person (or a specific part of a person) was illustrated, but the tracking target may be other than a person (or a specific part of a person). For example, the tracking target may be a vehicle such as an automobile, or a moving robot.
In the above embodiment, processing including the face detection process and the tracking process is performed on a frame image sequence in units of frames, but those processes may instead be performed on a field image sequence in units of fields.
The imaging device 1 of FIG. 1 can be realized by hardware or a combination of hardware and software. In particular, each part referred to by reference numerals 51 to 53 in FIG. 4 can be realized by hardware, software, or a combination of hardware and software. When the imaging device 1 is configured using software, a block diagram of a part realized by software represents a functional block diagram of that part. In addition, all or part of the arithmetic processing performed by the parts referred to by reference numerals 51 to 53 in FIG. 4 may be described as a program, and all or part of that processing may be realized by executing the program on a program execution device (for example, a computer).
Claims (12)
1. An imaging device comprising: an image sensor that outputs a signal representing an image sequence obtained by sequential shooting; tracking processing means for tracking a tracking target by detecting the position of the tracking target in the image sequence based on the output signal of the image sensor; tracking evaluation means for evaluating, classified into a plurality of levels, the degree of reliability or ease of the tracking by the tracking processing means, based on the output signal of the image sensor; display means for displaying the image sequence; and display control means for causing the display means to display the evaluation result by the tracking evaluation means.
2. The imaging device according to claim 1, wherein the display control means causes the display means to display a tracking target icon corresponding to the tracking target and a level icon representing the evaluated degree.
3. The imaging device according to claim 2, wherein the tracking processing means performs the tracking of the tracking target based on color information of the tracking target specified by the output signal of the image sensor, and the tracking target icon has a color corresponding to the color information.
4. The imaging device according to claim 3, wherein the tracking processing means performs the tracking of the tracking target by setting a tracking color corresponding to a color of the tracking target and then tracking an image region having the tracking color within the image sequence, and the display control means sets the color of the tracking target icon to colorless or a preset color before the tracking color is set.
5. The imaging device according to any one of claims 2 to 4, wherein, when a plurality of tracking targets exist as the tracking target, the display control means causes the display means to display a plurality of tracking target icons corresponding to the plurality of tracking targets.
6. The imaging device according to claim 5, wherein, when the plurality of tracking targets include a tracking target that is actually being tracked and a tracking target that is not, the display control means changes the display format of the tracking target icon between the tracking target icon corresponding to the former and the tracking target icon corresponding to the latter.
7. The imaging device according to claim 5, wherein, when the plurality of tracking targets include a tracking target that is actually being tracked and a tracking target that is not, the display control means places the display position of the tracking target icon corresponding to the former closer to the level icon than the display position of the tracking target icon corresponding to the latter.
8. The imaging device according to claim 5, wherein, when the plurality of tracking targets exist, the tracking evaluation means performs the evaluation of the degree for each tracking target, and the display control means reflects, in the level icon, the maximum degree, the minimum degree, or both among the degrees evaluated for the tracking targets.
9. The imaging device according to claim 5, wherein, when different tracking priorities are set among the plurality of tracking targets, the display control means determines the display size of each tracking target icon according to the tracking priority, or determines the display position of each tracking target icon according to the tracking priority.
10. The imaging device according to claim 1, wherein the display control means causes the display means to display a tracking target icon corresponding to the tracking target, and expresses the evaluated degree, classified into a plurality of levels, by the color used for the tracking target icon.
11. The imaging device according to any one of claims 1 to 4 and claim 10, wherein the tracking target icon is generated using an image based on the output signal of the image sensor, or using a pre-registered image.
12. An imaging device comprising: an image sensor that outputs a signal representing an image sequence obtained by sequential shooting; tracking processing means for tracking a tracking target by detecting the position of the tracking target in the image sequence based on the output signal of the image sensor; display means for displaying the image sequence; and display control means for causing the display means to display a tracking target icon corresponding to the tracking target, wherein the tracking processing means performs the tracking of the tracking target based on color information of the tracking target specified by the output signal of the image sensor, and the tracking target icon has a color corresponding to the color information.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/920,802 US8432475B2 (en) | 2008-03-03 | 2009-02-24 | Imaging device |
CN200980107707XA CN101960834B (zh) | 2008-03-03 | 2009-02-24 | 摄像装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008051676A JP5047007B2 (ja) | 2008-03-03 | 2008-03-03 | 撮像装置 |
JP2008-051676 | 2008-03-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009110348A1 true WO2009110348A1 (ja) | 2009-09-11 |
Family
ID=41055908
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/053241 WO2009110348A1 (ja) | 2008-03-03 | 2009-02-24 | 撮像装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US8432475B2 (ja) |
JP (1) | JP5047007B2 (ja) |
CN (1) | CN101960834B (ja) |
WO (1) | WO2009110348A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8929598B2 (en) | 2011-06-29 | 2015-01-06 | Olympus Imaging Corp. | Tracking apparatus, tracking method, and storage medium to store tracking program |
Families Citing this family (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5177045B2 (ja) * | 2009-03-25 | 2013-04-03 | 株式会社Jvcケンウッド | 画像表示装置、撮像装置、画像表示システム、画像表示方法および画像合成装置 |
WO2011043060A1 (ja) * | 2009-10-07 | 2011-04-14 | パナソニック株式会社 | 追尾対象選択装置、方法、プログラム及び回路 |
JP5488076B2 (ja) * | 2010-03-15 | 2014-05-14 | オムロン株式会社 | 対象物追跡装置、対象物追跡方法、および制御プログラム |
WO2012021898A2 (en) | 2010-08-13 | 2012-02-16 | Certusview Technologies, Llc | Methods, apparatus and systems for surface type detection in connection with locate and marking operations |
CA2811639A1 (en) * | 2010-09-17 | 2012-03-22 | Jeffrey Farr | Methods and apparatus for tracking motion and/or orientation of a marking device |
JP5809925B2 (ja) * | 2010-11-02 | 2015-11-11 | オリンパス株式会社 | 画像処理装置、それを備えた画像表示装置及び撮像装置、画像処理方法、並びに画像処理プログラム |
US20120182427A1 (en) * | 2011-06-06 | 2012-07-19 | Aaron Marshall | System and method for providing thermal gender recognition |
JP5694097B2 (ja) * | 2011-09-08 | 2015-04-01 | オリンパスイメージング株式会社 | 撮影機器 |
EP2811736A4 (en) * | 2012-01-30 | 2014-12-10 | Panasonic Corp | OPTIMUM CAMERA SETUP DEVICE AND OPTIMUM CAMERA SETTING METHOD |
JP5978639B2 (ja) * | 2012-02-06 | 2016-08-24 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラム、及び記録媒体 |
KR101964861B1 (ko) * | 2012-06-29 | 2019-04-02 | 삼성전자주식회사 | 카메라 장치 및 상기 카메라 장치에서의 물체 추적 방법 |
KR101890137B1 (ko) * | 2012-10-25 | 2018-08-21 | 삼성전자주식회사 | 촬영 장치 및 제어 방법 |
US20150248772A1 (en) * | 2014-02-28 | 2015-09-03 | Semiconductor Components Industries, Llc | Imaging systems and methods for monitoring user surroundings |
JP6331785B2 (ja) * | 2014-07-08 | 2018-05-30 | 日本電気株式会社 | 物体追跡装置、物体追跡方法および物体追跡プログラム |
JP6308106B2 (ja) * | 2014-11-14 | 2018-04-11 | カシオ計算機株式会社 | 画像処理装置、画像処理方法及びプログラム |
US10459615B2 (en) | 2014-12-11 | 2019-10-29 | Rdi Technologies, Inc. | Apparatus and method for analyzing periodic motions in machinery |
US10062411B2 (en) * | 2014-12-11 | 2018-08-28 | Jeffrey R. Hay | Apparatus and method for visualizing periodic motions in mechanical components |
WO2016152316A1 (ja) * | 2015-03-20 | 2016-09-29 | 日本電気株式会社 | 監視システム、監視方法、監視装置および監視装置の制御プログラム |
JP6649864B2 (ja) | 2015-10-23 | 2020-02-19 | 株式会社モルフォ | 画像処理装置、電子機器、画像処理方法及びプログラム |
JP6838310B2 (ja) * | 2016-08-09 | 2021-03-03 | 大日本印刷株式会社 | 撮影画像内の個体の検出装置 |
US20180130444A1 (en) * | 2016-11-07 | 2018-05-10 | Electronics And Telecommunications Research Institute | Information image display apparatus and method |
KR101877232B1 (ko) * | 2017-05-29 | 2018-07-12 | 국방과학연구소 | 영상 유사도 및 추적 지점 변화량 기반의 영상 추적기 시스템 및 그의 추적 안정도 판별 방법 |
KR101961663B1 (ko) * | 2017-05-29 | 2019-03-27 | 국방과학연구소 | 플랫폼 운용시스템 및 그의 표적 포착 방법 |
JP6690622B2 (ja) * | 2017-09-26 | 2020-04-28 | カシオ計算機株式会社 | 情報処理装置、情報処理システム、情報処理方法及びプログラム |
US10917557B2 (en) * | 2018-04-16 | 2021-02-09 | United States Of America As Represented By The Secretary Of The Air Force | Human-automation collaborative tracker of fused object |
JP6744897B2 (ja) * | 2018-09-21 | 2020-08-19 | 株式会社日立製作所 | 超音波診断装置 |
US11423551B1 (en) * | 2018-10-17 | 2022-08-23 | Rdi Technologies, Inc. | Enhanced presentation methods for visualizing motion of physical structures and machinery |
CN111182205B (zh) * | 2019-12-30 | 2021-07-27 | 维沃移动通信有限公司 | 拍摄方法、电子设备及介质 |
US11373317B1 (en) | 2020-01-24 | 2022-06-28 | Rdi Technologies, Inc. | Measuring the speed of rotation or reciprocation of a mechanical component using one or more cameras |
US11282213B1 (en) | 2020-06-24 | 2022-03-22 | Rdi Technologies, Inc. | Enhanced analysis techniques using composite frequency spectrum data |
US11322182B1 (en) | 2020-09-28 | 2022-05-03 | Rdi Technologies, Inc. | Enhanced visualization techniques using reconstructed time waveforms |
JP2022076369A (ja) * | 2020-11-09 | 2022-05-19 | キヤノン株式会社 | 画像処理装置、画像処理方法 |
TWI807598B (zh) * | 2021-02-04 | 2023-07-01 | 仁寶電腦工業股份有限公司 | 會議影像的產生方法及影像會議系統 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0974504A (ja) * | 1995-09-05 | 1997-03-18 | Canon Inc | 撮像装置 |
JPH09322049A (ja) * | 1996-05-28 | 1997-12-12 | Sony Corp | 被写体認識装置および方法 |
JP2007301166A (ja) * | 2006-05-11 | 2007-11-22 | Canon Inc | 撮像装置及びその制御方法及びプログラム及び記憶媒体 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2797830B2 (ja) * | 1992-03-31 | 1998-09-17 | 日本ビクター株式会社 | ビデオカメラにおける被写体追尾方法 |
US5416543A (en) | 1994-04-28 | 1995-05-16 | Eastman Kodak Company | Camera having selective inhibition of operation based on film speed |
JP3994422B2 (ja) * | 1997-11-12 | 2007-10-17 | 富士フイルム株式会社 | デジタルスチルカメラ |
JP4156084B2 (ja) | 1998-07-31 | 2008-09-24 | 松下電器産業株式会社 | 移動物体追跡装置 |
JP3490910B2 (ja) | 1998-09-28 | 2004-01-26 | 三洋電機株式会社 | 顔領域検出装置 |
DE60040051D1 (de) * | 1999-12-03 | 2008-10-09 | Fujinon Corp | Automatische Folgevorrichtung |
JP2001169169A (ja) | 1999-12-10 | 2001-06-22 | Fuji Photo Optical Co Ltd | 自動追尾装置 |
JP4819380B2 (ja) * | 2004-03-23 | 2011-11-24 | キヤノン株式会社 | 監視システム、撮像設定装置、制御方法、及びプログラム |
JP2006072770A (ja) | 2004-09-02 | 2006-03-16 | Sanyo Electric Co Ltd | 顔検出装置および顔向き推定装置 |
JP2006211139A (ja) | 2005-01-26 | 2006-08-10 | Sanyo Electric Co Ltd | 撮像装置 |
JP5061444B2 (ja) * | 2005-09-20 | 2012-10-31 | ソニー株式会社 | 撮像装置及び撮像方法 |
CN101017248A (zh) * | 2007-03-07 | 2007-08-15 | 南京大学 | 回收式背光照明自由立体成像装置及其方法 |
- 2008-03-03: JP application JP2008051676 filed in Japan; resulting patent JP5047007B2 is not active (Expired - Fee Related)
- 2009-02-24: CN application CN200980107707XA filed in China; resulting patent CN101960834B is not active (Expired - Fee Related)
- 2009-02-24: WO application PCT/JP2009/053241 (WO2009110348A1) filed; active Application Filing
- 2009-02-24: US application US12/920,802 filed in the United States; resulting patent US8432475B2 is not active (Expired - Fee Related)
Also Published As
Publication number | Publication date |
---|---|
US8432475B2 (en) | 2013-04-30 |
CN101960834A (zh) | 2011-01-26 |
CN101960834B (zh) | 2013-04-03 |
JP2009212637A (ja) | 2009-09-17 |
JP5047007B2 (ja) | 2012-10-10 |
US20110019027A1 (en) | 2011-01-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5047007B2 (ja) | 撮像装置 | |
JP4264663B2 (ja) | 撮影装置、画像処理装置、および、これらにおける画像処理方法ならびに当該方法をコンピュータに実行させるプログラム | |
EP2467805B1 (en) | Method and system for image analysis | |
KR100911695B1 (ko) | 촬상 장치 및 촬상 방법 | |
KR101537948B1 (ko) | 얼굴 포즈 추정을 이용한 촬영 방법 및 장치 | |
US9143671B2 (en) | Photographing apparatus, and method for photographing moving object with the same | |
JP6049448B2 (ja) | 被写体領域追跡装置、その制御方法及びプログラム | |
JP6564271B2 (ja) | 撮像装置及び画像処理方法、プログラム、並びに記憶媒体 | |
JP5127531B2 (ja) | 画像監視装置 | |
JP5959923B2 (ja) | 検出装置、その制御方法、および制御プログラム、並びに撮像装置および表示装置 | |
JP2004320287A (ja) | デジタルカメラ | |
JP2010011441A (ja) | 撮像装置及び画像再生装置 | |
JP2010226558A (ja) | 画像処理装置、画像処理方法、及び、プログラム | |
JP2011188297A (ja) | 電子ズーム装置、電子ズーム方法、及びプログラム | |
JP2010191793A (ja) | 警告表示装置及び警告表示方法 | |
JP2004320285A (ja) | デジタルカメラ | |
JP2011066809A (ja) | 撮像装置 | |
JP2009244944A (ja) | 画像回復装置および撮影装置 | |
JP6087615B2 (ja) | 画像処理装置およびその制御方法、撮像装置、および表示装置 | |
JP5539565B2 (ja) | 撮像装置及び被写体追跡方法 | |
JP2008167028A (ja) | 撮像装置 | |
US8675958B2 (en) | Subject determination method, computer program product for determining subject, and camera | |
WO2023002776A1 (ja) | 画像処理装置、画像処理システム、画像処理方法、及び画像処理プログラム | |
KR102446832B1 (ko) | 영상내 객체 검출 시스템 및 그 방법 | |
WO2023106103A1 (ja) | 画像処理装置およびその制御方法 |
Legal Events
Code | Title | Description |
---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 200980107707.X; Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09718015; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 12920802; Country of ref document: US |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 09718015; Country of ref document: EP; Kind code of ref document: A1 |