JP2011035765A - Image processing apparatus and method - Google Patents

Image processing apparatus and method

Info

Publication number
JP2011035765A
JP2011035765A (application JP2009181517A)
Authority
JP
Japan
Prior art keywords
tracking
area
detected
setting condition
image
Prior art date
Legal status
Granted
Application number
JP2009181517A
Other languages
Japanese (ja)
Other versions
JP2011035765A5
JP5419585B2
Inventor
Satoshi Aoyama
Original Assignee
Canon Inc
Priority date
Filing date
Publication date
Application filed by Canon Inc
Priority to JP2009181517A
Publication of JP2011035765A
Publication of JP2011035765A5
Application granted
Publication of JP5419585B2
Legal status: Active

Abstract

Object tracking processing, which performs tracking based on correlation between images, and face tracking processing, which performs tracking on a face area detected from an image, can both be executed; an image processing apparatus capable of switching between the two tracking processes is provided.
When initially setting a tracking area, a control unit executes the face tracking process if it determines that a first setting condition is satisfied, and executes the object tracking process if it determines that the first setting condition is not satisfied. If the object tracking process is being executed after this determination and a second setting condition different from the first setting condition is satisfied, the process shifts from the object tracking process to the face tracking process; if the second setting condition is not satisfied, the object tracking process is continued.
[Selected drawing] FIG. 4

Description

  The present invention relates to an image processing apparatus and an image processing method having a subject tracking function.
  In a conventional digital camera, an arbitrary subject is selected as the tracking target from one of the frame images in a moving image, and tracking is performed by detecting, in subsequent frame images, an area whose luminance and color signals are highly correlated with those of the tracking target.
  It is desirable to keep updating the luminance and color signals of the subject to be tracked each time tracking succeeds, so that tracking can continue even if the orientation of the subject or the light source changes. With such a configuration, however, when another object passes in front of the subject being tracked, that other object may end up being tracked instead (see Patent Document 1).
  There are also digital cameras equipped with a face detection function. Such a digital camera provides a comfortable shooting environment for the user by performing autofocus processing, automatic exposure processing, auto white balance processing, and the like based on the automatically recognized face position. By using this face detection function for tracking, a person's face can be tracked even as it moves. For example, when multiple faces are detected in an image, the face of the same person is identified across images from the position and size of each face detected in the previous image, or from the accumulated movement direction, and tracking is then performed (see Patent Document 2).
  With such a configuration, because the face area to be tracked is selected from the face areas detected in the image, the possibility of mistakenly tracking another object can be reduced even if something other than the tracked face crosses in front of it.
  In addition, some digital cameras that employ a touch panel and have a face detection function can take as the subject a person's face designated by the user on the touch panel, and perform shooting based on that face (see Patent Document 3).
  Combining these, one can conceive of a camera that, when the user designates something other than a face on the touch panel, performs object tracking by detecting an area highly correlated with the tracking target in luminance or color signal, and that performs face tracking using the face detection function when a face is designated.
JP 2009-1111716 A; JP 2008-271310 A; JP 2009-10777 A
  However, when the user designates a person as the subject on the touch panel of the digital camera, the digital camera cannot always detect that person's face on the screen. For example, when the designated person is facing sideways or facing away, or when the surroundings are dark, the face may not be correctly detected from the screen.
  Therefore, when the user designates the profile (or the back view, etc.) of the person who is to be the subject on the touch panel, the digital camera may fail to recognize the designated person's profile as a face. In such a case, if another object crosses in front of the person designated for tracking, that other object may be tracked by mistake.
  Thus, even if the user designates a person's face as the tracking target, if the digital camera cannot detect a face at the designated position at that moment, there is a possibility that something other than the face will be tracked. In other words, something other than the face is tracked, contrary to the user's intention to track the face.
  To solve this problem, one could switch to face tracking whenever a face is detected during object tracking. However, even when the user has selected something other than a face as the tracking target, a person may unintentionally enter the frame, and detection of that person's face would then cause a switch from object tracking to face tracking. Thus, when tracking is started without a person's face being detected and a face is then detected during object tracking, it was difficult to determine whether object tracking should be continued or switched to face tracking.
  This problem is not limited to digital cameras; the same problem may occur in digital video cameras and personal computer applications, as long as they have a function for tracking a subject designated by the user and a face detection function.
  Therefore, an object of the present invention is to provide an image processing apparatus that can perform both face tracking, in which the tracking target is limited to a face area, and object tracking, which is not limited to a face area, and that tracks the subject as intended by the user regardless of the face detection result at the time the user designates the tracking target.
  In order to achieve the above object, an image processing apparatus according to the present invention performs tracking by continuously detecting a tracking area, which is the area of the subject to be tracked, from a moving image composed of a plurality of images. The apparatus comprises: display means for displaying an image based on an image signal; designation means for designating the position of a subject in the image displayed on the display means; face detection means for detecting face areas from an image; tracking means for performing a first tracking process, in which an area having a high correlation with the tracking area set in one image is detected in another image and the detected area is set as the tracking area, and a second tracking process, in which a face area of a person that can be regarded as the same as the face area set as the tracking area in one image is selected from the face areas detected in another image and the selected face area is set as the tracking area; and control means for causing the tracking means to select and perform either the first tracking process or the second tracking process. When the position of a subject is designated by the designation means, the control means sets a tracking area at the designated position and determines whether a face area detected by the face detection means satisfies a first setting condition. If the first setting condition is satisfied, the control means sets the tracking area to the face area detected by the face detection means and causes the tracking means to perform the second tracking process; if it is not satisfied, the control means causes the tracking means to perform the first tracking process. After a face area detected by the face detection means has failed to satisfy the first setting condition and the tracking means is performing the first tracking process, the control means determines whether a face area detected by the face detection means satisfies a second setting condition different from the first setting condition. If the second setting condition is satisfied, the control means sets the tracking area to the detected face area and causes the tracking means to perform the second tracking process; if it is not satisfied, the control means causes the tracking means to continue the first tracking process.
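  The control flow claimed above can be summarized in a short sketch. This is a minimal illustration only; the callables passed in (detect_faces, first_condition, second_condition, object_track, face_track) are hypothetical stand-ins for the claimed means, not names from the patent.

```python
# Minimal sketch of the claimed switching logic (hypothetical helper names).

def run_tracking(frames, touch_pos, detect_faces,
                 first_condition, second_condition,
                 object_track, face_track):
    """Track through `frames` starting from the position the user designated.

    detect_faces(frame) returns the detected face areas; object_track and
    face_track update and return the tracking area (first and second
    tracking processes); the two predicates implement the first and
    second setting conditions.
    """
    mode = None
    area = touch_pos  # tracking area initially set at the designated position
    for frame in frames:
        faces = detect_faces(frame)
        if mode is None:  # initial setting of the tracking area
            mode = "face" if first_condition(area, faces) else "object"
        elif mode == "object" and second_condition(area, faces):
            mode = "face"  # shift from object tracking to face tracking
        area = face_track(frame, area, faces) if mode == "face" \
            else object_track(frame, area)
    return area
```

  Note that once the mode becomes "face" it stays there; this mirrors the claim text, which only describes transitions into the second tracking process.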
  According to the present invention, it is possible to provide an image processing apparatus that can perform both face tracking, in which the tracking target is limited to a face area, and object tracking, which is not limited to a face area, and that tracks the subject as intended by the user regardless of the face detection result at the time the user designates the tracking target.
FIG. 1 is a block diagram showing the configuration of the control system of a digital camera according to an embodiment of the present invention.
FIG. 2 is a flowchart showing the procedure of the shooting process performed by the digital camera according to the embodiment of the present invention.
FIG. 3 is a flowchart showing the procedure of the tracking process performed upon user input from the touch panel in the digital camera according to the embodiment of the present invention.
FIG. 4 is a flowchart showing the procedure of the face tracking determination process performed by the digital camera according to the embodiment of the present invention.
FIG. 5 is an explanatory screen-display diagram showing the transition of the display screen when the face tracking process is performed in response to user input.
FIG. 6 is an explanatory screen-display diagram showing the transition of the display screen when shifting from the object tracking process to the face tracking process after a profile is designated by user input.
FIG. 7 is an explanatory screen-display diagram showing the transition of the display screen when the object tracking process is performed in response to user input.
  Hereinafter, embodiments of the present invention will be described with reference to the drawings.
  As described above, the present invention can be applied to any image processing apparatus that has a function for tracking a subject designated by the user in a moving image and a face detection function for detecting a human face area from an image. In the present embodiment, a digital camera will be described as an example of such an image processing apparatus.
  In the digital camera according to the present embodiment shown in FIG. 1, reference numeral 10 denotes a photographing lens, 12 denotes a shutter having a diaphragm function, and 14 denotes an image sensor that converts an optical image into an electrical signal (image signal). The analog signal output from the image sensor 14 at the time of imaging is converted into a digital signal (image signal) by the A / D converter 16.
  In this digital camera, the memory control circuit 22 and the system control circuit 50 that constitute the control unit control the timing generation circuit 18. The timing generation circuit 18 supplies a clock signal and a control signal to the image sensor 14, the A / D converter 16, and the D / A converter 26 for control.
  The data (image signal) output from the A / D converter 16 or the data (image signal) output from the memory control circuit 22 is subjected to predetermined pixel interpolation processing and color conversion processing by the image processing circuit 20.
  In addition, this digital camera performs AF (autofocus) processing, AE (automatic exposure) processing, and EF (flash pre-emission) processing. For this purpose, the image processing circuit 20 of the digital camera performs predetermined calculation processing on the captured image data, and the system control circuit 50 controls the exposure control circuit 40 and the distance measurement control circuit 42 based on the obtained calculation result.
  Further, the image processing circuit 20 performs predetermined calculation processing using the captured image data (image signal), and performs AWB (auto white balance) processing based on the obtained calculation result.
  The memory control circuit 22 constituting the control unit controls the A / D converter 16, the timing generation circuit 18, the image processing circuit 20, the image display memory 24, the D / A converter 26, the memory 30, and the compression / decompression circuit 32.
  The data output from the A / D converter 16 is written into the image display memory 24 or the memory 30, either via the image processing circuit 20 and the memory control circuit 22 or directly via the memory control circuit 22.
  The display image data written in the image display memory 24 is displayed on a display unit 28 including a TFT LCD or the like via a D / A converter 26. The display unit 28 functions as an electronic finder by displaying image data generated from signals continuously output from the image sensor 14 in real time as a moving image.
  The memory 30 is a memory for storing captured still images and moving images, and has a sufficient storage capacity for storing a predetermined number of still images and moving images for a predetermined time. The memory 30 is configured to be usable as a work area for the system control circuit 50.
  The image stored in the memory 30 is read into the compression / decompression circuit 32, and the image data is compressed or expanded by adaptive discrete cosine transform (ADCT) or the like. The data after the compression / decompression process is written into the memory 30.
  The digital camera includes an exposure control circuit 40 that controls the shutter 12 having an aperture function, a distance measurement control circuit 42 that controls focusing of the photographing lens 10, and a zoom control circuit 44 that controls zooming of the photographing lens 10.
  This digital camera includes a system control circuit 50 that controls the entire camera system. The system control circuit 50 executes each process described later according to a program stored in a ROM (not shown).
  The system control circuit 50 is connected to a nonvolatile memory 56 composed of an electrically erasable and recordable EEPROM or the like. Furthermore, this digital camera includes a shutter switch SW1 62, a shutter switch SW2 64, and an operation unit 70 as operation means for inputting various operation instructions. The operation unit 70 includes, for example, one or a combination of switches, a cross key, a dial, a touch panel, pointing based on line-of-sight detection, a voice recognition device, and the like.
  The shutter switch SW1 62 connected to the system control circuit 50 outputs an ON instruction signal to the system control circuit 50 in a state in which a shutter button (not shown) is being operated. Upon receiving this ON signal, the system control circuit 50 starts operations such as AF (autofocus) processing, AE (automatic exposure) processing, AWB (auto white balance) processing, and EF (flash pre-emission) processing.
  Further, the shutter switch SW2 64 outputs an ON instruction signal to the system control circuit 50 when the operation of the shutter button (not shown) is completed. Upon receiving this ON signal, the system control circuit 50 instructs the start of the exposure processing for main photographing, in which the signal read from the image sensor 14 is written into the memory 30 via the A / D converter 16 and the memory control circuit 22. Next, the system control circuit 50 instructs the start of the development processing operation using the calculations in the image processing circuit 20 and the memory control circuit 22. Next, the system control circuit 50 reads the image data from the memory 30, compresses it with the compression / decompression circuit 32, and instructs the start of the recording process for writing the image data to the external recording medium 120.
  The operation unit 70, which includes various buttons and a touch panel connected to the system control circuit 50, includes a power button, a menu button, a shooting mode / playback mode switch, and the like. The touch panel included in the operation unit 70 is configured integrally with the display unit 28; when the user touches the screen of the display unit 28, information indicating the on-screen coordinates of the touched position is transmitted to the system control circuit 50.
  The system control circuit 50 is connected to a card controller 90 that transmits / receives data to / from an external recording medium such as a memory card. Further, an external recording medium 120 such as a memory card is connected to the system control circuit 50 via a card controller 90.
  The system control circuit 50 is connected to a face detection unit 101 that analyzes the image data processed by the image processing circuit 20 and the image data stored in the image display memory 24, and detects areas that appear to be faces in the image data. For example, the face detection unit 101 detects, as a face area, an area in which shapes resembling human eyes, a nose, and a mouth are arranged in a predetermined positional relationship in the image data.
  When the face detection unit 101 detects an area that appears to be a face, it outputs the probability that the area is a human face, its position and size in the input image data, and the like. The face detection unit is also configured to output the feature amount of each feature point of the detected face.
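  For illustration, the detection output described here could be carried in a structure like the following; the field names are assumptions for the sketch, not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FaceResult:
    """Sketch of one detection result from a face detection unit."""
    x: int                      # position in the input image data
    y: int
    width: int                  # size of the detected face area
    height: int
    likelihood: float           # probability that the area is a human face
    features: List[float] = field(default_factory=list)  # feature amounts
```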
  The system control circuit 50 is also connected to an image comparison unit 102 that performs an image difference operation on two images recorded in the image display memory 24 and the memory 30, based on at least one of luminance information and color information, and detects areas with high correlation between the images.
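  One conventional way to realize such a correlation detection is a sum-of-absolute-differences (SAD) search over luminance. The following NumPy sketch is illustrative only; the actual comparison performed by the image comparison unit 102 is not specified at this level of detail.

```python
import numpy as np

def find_best_match(prev_patch, cur_image):
    """Return (top, left) of the region of cur_image most similar to
    prev_patch, using a brute-force SAD search over luminance."""
    ph, pw = prev_patch.shape
    ih, iw = cur_image.shape
    best_sad, best_pos = None, (0, 0)
    for top in range(ih - ph + 1):
        for left in range(iw - pw + 1):
            window = cur_image[top:top + ph, left:left + pw].astype(int)
            sad = np.abs(window - prev_patch.astype(int)).sum()
            if best_sad is None or sad < best_sad:
                best_sad, best_pos = sad, (top, left)
    return best_pos
```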
  Next, photographing processing in the digital camera configured as described above will be described with reference to the flowchart of FIG.
  In this digital camera photographing process, when the digital camera is turned on and ready for photographing, the memory control circuit 22 and the system control circuit 50 constituting the control unit start the photographing process. The system control circuit 50 starts exposure to the image sensor 14 and causes the display unit 28 to display a moving image in real time using continuously generated image data.
  In this shooting process, in step S201, it is determined whether the current shooting mode is a mode in which tracking processing is performed in response to user input from the touch panel included in the operation unit 70 (hereinafter, touch input). In the present embodiment, a mode is provided in which tracking processing is performed when the user touches the screen of the display unit 28 while it is displaying a moving image, and the user can set in advance whether to execute this mode by using the menu button included in the operation unit 70. If the system control circuit 50 determines that the shooting mode is a mode that accepts the user's touch input (YES in step S201), it proceeds to step S202 and executes the touch tracking process.
  In this touch tracking process (step S202), when a touch input from the user is received, a process of converting the touched position into coordinates in the screen is executed, as shown for example in the screen display example of FIG. 5-a. At this time, the digital camera may be configured to indicate that the touch input has been received by superimposing a frame (indicator) on the displayed screen at the position corresponding to the user's touch input.
  Further, in this touch tracking process (step S202), the subject at the touched position is determined based on the touch input, touch tracking processing corresponding to that subject is started, and the process proceeds to step S203. The touch tracking process will be described in detail later with reference to the flowchart of FIG. 3.
  The system control circuit 50 proceeds to step S203, and returns to step S201 if the shutter switch SW1 62 is not pressed.
  If the system control circuit 50 determines that the shooting mode is a mode that does not accept the user's touch input (NO in step S201), it selects the main subject in another mode and waits until the shutter switch SW1 62 is pressed (step S203). Other modes include a face auto mode, which automatically selects the face that appears to be the main subject from the detected faces, and a multi-point automatic mode, which selects as the subject the closest subject within a plurality of preset AF areas.
  Next, when it is determined that the shutter switch SW1 62 has been pressed (YES in step S203), the system control circuit 50 performs AF / AE processing weighted on the set main subject (step S204).
  Then, the system control circuit 50 continues the AF / AE process (step S204) and waits until the shutter switch SW2 64 is pressed (NO in step S205). In this standby state, when the pressing of the shutter switch SW1 62 is released, the process returns to step S201.
  Next, when it is determined that the shutter switch SW2 64 has been pressed, the system control circuit 50 proceeds to step S206, performs shooting, records the shot image file on the external recording medium 120, and ends the shooting process.
  Next, touch tracking processing at the time of touch input, which is executed during the above-described imaging processing, will be described with reference to the flowchart of FIG.
  The touch tracking process performed by this digital camera includes an object tracking process, which is a first tracking process for tracking an object other than a face as the main subject, and a face tracking process, which is a second tracking process for tracking the face of a specific person displayed on the screen as the main subject.
  When the touch tracking process is started, the system control circuit 50 confirms whether or not there is a touch input from the user in step S300. If the system control circuit 50 determines that a new touch input has not been performed, the system control circuit 50 proceeds to step S301 and confirms whether touch tracking processing is in progress.
  If the system control circuit 50 determines that touch tracking is not in progress, the flowchart of FIG. 3 ends. If the system control circuit 50 determines that touch tracking is in progress, the system control circuit 50 proceeds to step S312 and determines whether or not the face tracking process has already been started.
  If the system control circuit 50 determines in step S312 that the face tracking process is not performed, the system control circuit 50 proceeds to step S309, which will be described later, and executes the face tracking determination process. On the other hand, if the system control circuit 50 determines that the face tracking process has already started, the system control circuit 50 proceeds to step S311 to be described later and executes the face tracking process.
  Returning to step S300, when the system control circuit 50 determines that there is a touch input from the user, it proceeds to step S302 and executes a process of acquiring the touch coordinates. In the touch coordinate acquisition process of step S302, the control unit of the touch panel provided in the operation unit 70, having received the touch input, converts the touched position on the screen into on-screen coordinates and transmits the coordinates to the system control circuit 50.
  Next, the system control circuit 50 proceeds to the tracking area setting process of step S303 and sets the tracking area of the subject to be tracked based on the obtained touch coordinates. In this process, the position on the display image data is obtained from the display image data stored in the image display memory 24 and the touched coordinates.
  Further, in the process of setting the tracking area, the size of the subject's tracking area is determined in consideration of the contrast obtained from the luminance information around that position on the display image data, the image feature amount, and the like. That is, temporary tracking areas of a plurality of sizes centered on the touched coordinates are set, and a temporary tracking area whose contrast and feature amount satisfy predetermined conditions is set as the tracking area.
  The contrast and feature amount are compared with predetermined conditions because, unless these values reach a certain level, the area is difficult to distinguish from other areas. The minimum qualifying size is selected because, as the size is set larger, more background other than the subject to be tracked is included in the tracking area, and tracking accuracy may decrease.
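  The size selection just described might look like the following sketch, in which the candidate sizes, the use of the standard deviation as a contrast measure, and CONTRAST_MIN are illustrative assumptions.

```python
import numpy as np

CONTRAST_MIN = 20.0                      # assumed contrast threshold
CANDIDATE_SIZES = (16, 24, 32, 48, 64)   # assumed temporary-area sizes, pixels

def choose_tracking_area(luma, cx, cy):
    """Pick the smallest candidate area around (cx, cy) whose contrast
    clears the threshold; luma is a 2-D luminance array."""
    for size in CANDIDATE_SIZES:         # smallest first
        half = size // 2
        patch = luma[max(cy - half, 0):cy + half, max(cx - half, 0):cx + half]
        if patch.size and patch.std() >= CONTRAST_MIN:
            return (cx - half, cy - half, size, size)  # (x, y, w, h)
    return None  # no candidate was distinguishable from its surroundings
```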
  The system control circuit 50 determines whether or not the subject tracking area has been correctly obtained in step S303, and if it is determined that the subject tracking area has been correctly obtained (YES in step S304), the system control circuit 50 proceeds to step S308.
  The system control circuit 50 may fail to obtain the tracking area correctly when the luminance around the touched coordinates is uniform and no contrast can be detected, or when the screen contains similar patterns. In such a case, the process proceeds to step S305.
  In step S305, the system control circuit 50 performs AF processing based on the AF evaluation value at the touch input position. In this AF process, the resolution of display image data around the touch input position is improved.
  Next, in step S306, the system control circuit 50 detects the tracking area again at the same position from the display image data after the AF processing.
  Next, if the tracking area cannot be detected again in step S307, the system control circuit 50 proceeds to step S322, where, as the tracking-impossible processing, it performs an operation such as displaying to the user that the subject cannot be tracked, and then ends the flowchart of FIG. 3.
  If the tracking area can be correctly determined in step S307, the system control circuit 50 proceeds to step S308. In step S308, the system control circuit 50 stores the color information of the tracking area determined as described above.
  Next, in step S309, the system control circuit 50 determines whether face tracking is possible for the subject. The face tracking determination process will be described later with reference to FIG. 4.
  Next, when it is determined in step S310 that the face tracking process is possible in the face tracking determination process in step S309, the system control circuit 50 proceeds to step S311 and performs the face tracking process.
  In step S311, the system control circuit 50 performs face tracking processing. In this face tracking process, the system control circuit 50 uses the display image data in the image display memory 24 and the face detection unit 101 to obtain a face area existing in the screen being displayed.
  When the tracking area has been set by touch input, or when the tracking area has been updated by the object tracking process and the process shifts to the face tracking process for the first time, the system control circuit 50 sets as the new tracking area the face area determined to be capable of face tracking in the face tracking determination process of step S309.
  If the face tracking process is already in progress, the system control circuit 50 sets the face area to be used as the next tracking area. In this case, for example, a face area that can be regarded as the same as the face area set as the previous tracking area may be selected based on the distance between the position of the previously tracked face area and each of the face areas detected in the current display image data, and on the amount and direction of movement of each face area. Alternatively, the system control circuit 50 may use a control module such as a personal authentication module (not shown) to compare feature points of the previously detected face area with those of each current face area, identify the face area that appears to be the same person from their similarity, and set it as the next tracking area.
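  As a rough illustration of the distance-and-motion association described above (the personal-authentication variant is omitted), assuming face objects like the FaceResult sketch earlier and a hypothetical max_jump threshold:

```python
def same_person(prev_face, cur_faces, max_jump=50):
    """Pick the current face most plausibly identical to prev_face,
    or None when no detected face is close enough."""
    def dist(a, b):
        return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5

    candidates = [f for f in cur_faces if dist(prev_face, f) <= max_jump]
    if not candidates:
        return None
    # Prefer the nearest face of similar size to the previous one.
    return min(candidates,
               key=lambda f: dist(prev_face, f) + abs(f.width - prev_face.width))
```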
  In order to show the user the face of the person being tracked, the system control circuit 50 displays the tracked face area surrounded by a frame (index) on the screen, as shown for example in FIG. 5-b. Further, in the face tracking process, when the face area of the person surrounded by the frame (index) moves within the screen, the system control circuit 50 updates the frame display to follow the movement of the person's face, as shown for example in FIG. 5-c.
  Next, proceeding to step S320, the system control circuit 50 determines whether the user has instructed the end of touch tracking by operating the GUI button or the like on the touch panel. When the system control circuit 50 determines that there is an instruction to end touch tracking, the system control circuit 50 proceeds to step S321 and performs tracking end processing. In the tracking end process, the system control circuit 50 notifies the user of the tracking end, and then ends the display of the frame (index) indicating the tracking area.
  In step S320, when the system control circuit 50 determines that there is no instruction to end touch tracking, the flowchart of FIG. 3 ends and the process proceeds to step S203 of FIG. 2. If the shutter switch SW1 62 is not pressed in step S203, the system control circuit 50 executes the flowchart shown in FIG. 3 again for the new display image data via step S201. In this way, tracking is continued by repeating these processes until the shutter switch SW1 62 is pressed.
  Next, when it is determined in step S310 described above that the face tracking process is impossible, the process proceeds to step S313, where the system control circuit 50 determines whether object tracking based on luminance information is possible.
  This determination examines conditions such as whether the surrounding area including the tracking area has changed significantly since the tracking area was last updated, and whether the luminance around the tracking area is uniform. When these conditions are met, it is determined that the accuracy of object tracking based on the correlation of luminance information would be reduced, and object tracking is judged impossible.
  If the system control circuit 50 determines that none of these conditions applies and that object tracking based on luminance information is possible, it proceeds to step S314 and performs the object tracking process using luminance information.
  The object tracking process based on luminance information is a process of detection by correlation, using the image comparison unit 102, between the current display image data in the image display memory 24 and the image data of the tracking area in the display image data for which the previous tracking area was set.
  That is, in this object tracking process, the destination to which the image data set as the previous tracking area has moved within the current display image data is calculated by correlation calculation using the luminance information of each. The area of the current display image data judged to have the highest correlation with the image data of the previous tracking area is then updated as the new tracking area in the current frame image, and the object is thereby tracked.
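  Reusing find_best_match from the sketch that followed the description of the image comparison unit 102, the update described here can be phrased as follows; this is a sketch, not the circuit's actual procedure.

```python
def object_track_step(prev_luma, cur_luma, area):
    """Find where the previous tracking area moved to in the current frame."""
    x, y, w, h = area
    patch = prev_luma[y:y + h, x:x + w]           # previous tracking area
    top, left = find_best_match(patch, cur_luma)  # highest-correlation region
    return (left, top, w, h)                      # new tracking area
```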
  Next, proceeding to step S315, the system control circuit 50 stores the color information of the area newly set as the tracking area.
  Next, the process proceeds to step S320 described above, where the system control circuit 50 determines whether the user has instructed the end of touch tracking; when it determines that there is no such instruction, the flowchart of FIG. 3 ends and the process proceeds to step S203 of FIG. 2. When the system control circuit 50 determines that there is an instruction to end touch tracking, it proceeds to step S321 and performs the tracking end process. In the tracking end process, the system control circuit 50 notifies the user of the end of tracking and then ends the display of the frame (index) indicating the tracking area.
  Next, if the system control circuit 50 determines in step S313 described above that object tracking based on luminance information is impossible, the process proceeds to step S316.
  In step S316, the system control circuit 50 attempts to track, using color information, the tracking area for which object tracking with luminance information was judged impossible. To this end, the system control circuit 50 checks whether color information was stored in step S308 or step S315 described above. When color information is stored, the system control circuit 50 uses it to search the current display image data for an area having a similar color distribution (step S316).
  Next, the system control circuit 50 proceeds to step S317; if it determines that the search for a new tracking area based on the color information has succeeded, it judges that tracking is possible again and proceeds to step S318. Specifically, the system control circuit 50 generates a hue histogram from the stored color information and detects the area whose histogram differs least from it. If the difference is less than or equal to a predetermined threshold, the search for the tracking area by color information is judged to have succeeded; if no such area can be detected, the search is judged to have failed.
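  A possible reading of this hue-histogram search is sketched below; the bin count, the search stride, and SUCCESS_THRESHOLD are assumptions for illustration.

```python
import numpy as np

SUCCESS_THRESHOLD = 0.5  # assumed maximum acceptable histogram difference

def hue_histogram(hue_patch, bins=32):
    hist, _ = np.histogram(hue_patch, bins=bins, range=(0, 360))
    return hist / max(hist.sum(), 1)  # normalize to a distribution

def search_by_color(stored_hist, hue_image, area_size, stride=8):
    """Return (left, top) of the window whose hue histogram differs least
    from stored_hist, or None if even the best difference is too large."""
    w, h = area_size
    best_diff, best_pos = None, None
    for top in range(0, hue_image.shape[0] - h + 1, stride):
        for left in range(0, hue_image.shape[1] - w + 1, stride):
            hist = hue_histogram(hue_image[top:top + h, left:left + w])
            diff = np.abs(hist - stored_hist).sum()
            if best_diff is None or diff < best_diff:
                best_diff, best_pos = diff, (left, top)
    if best_diff is not None and best_diff <= SUCCESS_THRESHOLD:
        return best_pos
    return None  # the search by color information failed
```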
  In step S318, the system control circuit 50 sets a region that has been successfully searched as a new tracking region. In step S315, the system control circuit 50 stores the color information of the area newly set as the tracking area.
  When the system control circuit 50 determines in step S317 described above that the search for a new tracking area based on the color information has failed, it proceeds to step S321 and ends the tracking process.
  Next, the face tracking determination process related to step S309 described above will be described with reference to FIG.
  In this face tracking determination process, the system control circuit 50 determines, based on the face detection result for the current tracking area, whether to switch to the face tracking process. As described with reference to FIG. 3, when face tracking is judged possible, tracking based on the face detection result is performed. When face tracking is judged impossible, the face detection result of the face detection unit 101 is not applied to the current tracking area, and object tracking based on luminance information and color information is performed.
  When the face tracking determination process is started and the process proceeds to step S400, the system control circuit 50 acquires the coordinates of the current tracking area.
  Next, proceeding to step S401, the system control circuit 50 controls the face detection unit 101 to detect, from the display image data currently displayed from the image display memory 24, all areas that appear to be human faces on the display image data, and obtains the coordinates of each area.
  In step S402, the system control circuit 50 determines whether a face area was detected from the display image data as a result of the face detection. If no face area could be detected (NO in step S402), the system control circuit 50 proceeds to step S420, counts up the face detection count, proceeds to step S421, and determines that face tracking is impossible.
  If the face area is detected (YES in step S402), the system control circuit 50 proceeds to the next step S403. In step S403, the system control circuit 50 calculates the distance between the tracking area and all detected face areas.
  Next, proceeding to step S404, the system control circuit 50 determines the closest face area that is closest to the tracking area from the distances calculated in step S403.
  Next, proceeding to step S405, the system control circuit 50 determines whether the elapsed tracking time since the touch input is less than a predetermined time. This elapsed tracking time is the time during which tracking has continued successfully after the touch input, and it is reset when tracking fails midway. If the system control circuit 50 determines that the elapsed tracking time since the touch input is less than the predetermined time (YES in step S405), it proceeds to step S406.
  In step S406, the system control circuit 50 determines whether the distance from the closest face area determined in step S404 is less than a predetermined threshold. If the system control circuit 50 determines that it is less than the threshold (YES in step S406), the system control circuit 50 proceeds to step S407, determines that face tracking is possible in the closest face region, and ends the face tracking determination processing.
  If the system control circuit 50 determines that the distance to the closest face area is equal to or greater than the threshold (NO in step S406), it determines in step S408 that face tracking is impossible, and ends this face tracking determination process. This is because, if the position touched by the user is far from any face area, the user is considered to have intended to designate something other than a face as the tracking target.
  By controlling as described above in this face tracking determination process, if the touched position and a face area are close and the predetermined time has not yet elapsed since the touch input, the system control circuit 50 reflects the face detection result and can quickly shift to tracking by face tracking.
  Next, a description will be given of a case where it is determined in this face tracking determination process that the elapsed tracking time from the touch input has reached a predetermined time in step S405 described above (NO in step S405).
  In this case, the process proceeds to step S409, where the system control circuit 50 determines whether the distance between the closest face area and the tracking area is less than the threshold; if it determines that the distance is less than the threshold (YES in step S409), it proceeds to step S410.
  In step S410, a distance calculation process obtains the distance between the current closest face area, taken as a first face area, and the previous closest face area, taken as a second face area. Through this distance calculation, the system control circuit 50 obtains the amount of motion of the closest face area.
  Next, the system control circuit 50 proceeds to step S411 and determines whether the distance obtained in step S410 is less than a threshold; if it is, the circuit proceeds to step S412 and counts up the closest-face presence count. That is, when the system control circuit 50 determines that the face area judged to be the closest face area has remained near the tracking area, it counts up the closest-face presence count.
  Conversely, if the system control circuit 50 determines in step S409 or step S411 that the respective distance is equal to or greater than the threshold, it proceeds to step S413 and counts up the closest-face absence count. The system control circuit 50 performs such control in consideration of the following situations.
  One is a situation in which a face area with a large amount of movement moves within the screen and eventually moves out of the screen. If a face area with large movement were set as the tracking target, it would very likely be lost soon after tracking starts; therefore, when such a face is the closest face, the system control circuit 50 counts up the closest-face absence count.
  The other is a situation in which a face area moving on the screen approaches the tracking area only temporarily and is unrelated to the subject originally intended for tracking. If a person not intended as the main subject happens to move near the tracking area, that person could be tracked by mistake; therefore, when such a face is the closest face, the system control circuit 50 counts up the closest-face absence count.
  Next, the system control circuit 50 proceeds to step S414 and counts up the face detection count. It then proceeds to step S415 and determines whether the face detection count is equal to or greater than a predetermined number; if so (YES in step S415), it proceeds to step S416.
  In step S416, the system control circuit 50 determines whether the closest-face presence count is equal to or greater than a predetermined number; if so (YES in step S416), it determines that face tracking is possible and proceeds to step S417. In step S417, the system control circuit 50 sets as the tracking area the closest face area at the time face tracking was judged possible.
  If the system control circuit 50 determines in step S416 that the closest-face presence count is less than the predetermined number (NO in step S416), it determines in step S421 that face tracking is impossible.
  When the system control circuit 50 determines in step S415 described above that the face detection count is less than the predetermined number (NO in step S415), it proceeds to step S418. In step S418, the system control circuit 50 determines whether the closest-face absence count is equal to or greater than a predetermined number.
  If the system control circuit 50 determines that the closest-face absence count is equal to or greater than the predetermined number (YES in step S418), it proceeds to step S419. In step S419, the system control circuit 50 clears the face detection count, the closest-face presence count, and the closest-face absence count, resetting the face tracking determination process. The system control circuit 50 then proceeds to step S421, performs the processing for the case where face tracking is impossible at this time, and ends the face tracking determination process.
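  The counter logic of steps S405 through S421 can be condensed into the following sketch. The counter names follow the text; the thresholds, the representation of areas by their (x, y) centers, and the grace-period flag are assumptions made for illustration.

```python
import math

class FaceTrackingJudge:
    """Sketch of the face tracking determination (FIG. 4, S405-S421)."""

    def __init__(self, near_thresh=40.0, n_detect=10, n_present=6, n_absent=6):
        self.near_thresh = near_thresh
        self.n_detect, self.n_present, self.n_absent = n_detect, n_present, n_absent
        self.detect_count = self.present_count = self.absent_count = 0
        self.prev_closest = None

    @staticmethod
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def judge(self, tracking_area, faces, within_grace_period):
        """Return True when switching to face tracking is allowed."""
        if not faces:
            self.detect_count += 1                                       # S420
            return False                                                 # S421
        closest = min(faces, key=lambda f: self.dist(f, tracking_area))  # S404
        if within_grace_period:  # first setting condition (S405-S408)
            return self.dist(closest, tracking_area) < self.near_thresh
        # Second setting condition: the face must stay near the area.
        near = self.dist(closest, tracking_area) < self.near_thresh      # S409
        steady = (self.prev_closest is not None and
                  self.dist(closest, self.prev_closest) < self.near_thresh)  # S410-S411
        if near and steady:
            self.present_count += 1                                      # S412
        else:
            self.absent_count += 1                                       # S413
        self.detect_count += 1                                           # S414
        self.prev_closest = closest
        if self.detect_count >= self.n_detect:                           # S415
            return self.present_count >= self.n_present                  # S416
        if self.absent_count >= self.n_absent:                           # S418
            # S419: clear the counters and restart the determination.
            self.detect_count = self.present_count = self.absent_count = 0
        return False
```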
  With the control described above, after a predetermined time has elapsed since the touch input, the system control circuit 50 can switch from the object tracking process to the face tracking process according to the relationship between the tracking area during object tracking and the face areas found by face detection.
  For example, as shown in FIG. 6, even when a profile or a backward-facing face from which no face can be detected is selected at the time of touch input, the person may later turn to the front so that the face becomes stably detectable. In such a case, face tracking based on the face detection result can be performed through the control described above.
  That is, in the case of FIG. 6, when a subject image is captured by a digital camera, image data is obtained by the image sensor 14 that is an imaging means. This digital camera is configured to be able to display a captured subject image on a display unit 28 as display means based on image data.
  At this time, the user performs touch input for designating a subject to be tracked from a touch panel as a designation unit.
  Based on the subject specified by the touch input, the system control circuit 50 that also serves as the tracking unit sets a tracking region for tracking the subject specified by the specifying unit.
  In this digital camera, the face detection unit 101 detects the face of a person from the image being displayed on the display unit 28.
  In this digital camera, the system control circuit 50, as the control means, determines by the face tracking determination process whether the subject designated by the designation means is a human face or an object.
  Further, in this face tracking determination process, the tracking area is set in light of a first setting condition, applied when the tracking area is first set, and a second setting condition, applied when the tracking area is subsequently updated.
  In this face tracking determination process, the first setting condition, used when the tracking area is first set, is, for example, that a detected face is located within a predetermined range of the tracking area at least once within a predetermined period.
  The second setting condition is, for example, that the tracking area that has been set and the face position detected by the face detection means fall within a predetermined range of each other a predetermined number of times or more.
  When it is determined that the designated subject is a person's face, the system control circuit 50 as the control means executes a face tracking process that is a second tracking process.
  Further, when the system control circuit 50 determines that the subject is an object, the system control circuit 50 executes an object tracking process that is a first tracking process.
  Then, through the face tracking determination process, the system control circuit 50, serving as the control means, checks whether the face detection unit 101, serving as the face detection means, detects a human face within a predetermined range including the tracking area. When the system control circuit 50 determines that the number of times the face detection unit 101 has detected a human face there is equal to or greater than a threshold, it switches from the object tracking process to the face tracking process.
  Thus, in this digital camera, even when a person's profile is designated as the tracking target, the tracking target can be switched to the person's frontal face and the face tracking process can be executed appropriately.
  Further, for example, as shown in FIG. 7, a face detected during object tracking may approach or cross the tracking area only temporarily. In such a case, the above control allows object tracking to continue without switching from the object tracking process to the face tracking process.
  That is, in the case of FIG. 7, while the system control circuit 50 is executing the object tracking process, the face detection unit 101 detects human faces from the captured image data in the face tracking determination process. At this time, the system control circuit 50 counts the closest-face presence count, which is the number of detections in which the face detection unit 101 detects a human face within a predetermined range including the tracking area. At the same time, the system control circuit 50 counts the closest-face absence count, which is the number of detections in which the face detection unit 101 detects a human face only outside the predetermined range including the tracking area.
  The system control circuit 50 then determines whether the number of times the face detection unit 101 has detected a human face outside the predetermined range including the tracking area is equal to or greater than a predetermined number. If this determination finds that the closest-face absence count is equal to or greater than the predetermined number, the system control circuit 50 controls the camera to continue the object tracking process.
  Thus, in this digital camera, when the user designates object tracking, the object tracking state can be maintained even if a person crosses in front of the object being tracked.
  Here, the reason for making the first setting condition (steps S406 to S408), applied when the tracking area is first set, different from the second setting condition (steps S409 to S421), applied when the tracking area is updated, will be described.
  As described above, when the designated person is facing sideways or facing away, or when the surroundings are dark, the face may not be correctly detected from the screen. However, it is generally hard to imagine that the person the user intends as the main subject would remain in a state where the face cannot be detected for long after tracking starts.
  In other words, if the person is the main subject, face detection should succeed at least once within a relatively short time from the start of tracking. Conversely, if no face can be detected even within a relatively short time from the start of tracking, it is highly likely that something other than a person was selected as the main subject. For this reason, the second setting condition, applied after a predetermined time has elapsed from the start of tracking, imposes stricter requirements for shifting to the face tracking process than the first setting condition.
  The first setting condition and the second setting condition are not limited to those of the above embodiment. For example, a threshold for the number of consecutive detections of the closest face area within a predetermined distance of the tracking area, or a threshold for the detection frequency of the closest face area within that distance, may be added only to the second setting condition. Alternatively, a threshold for the correlation between the movement trajectory of the closest face area and the movement trajectory of the tracking area within a predetermined period may be added only to the second setting condition.
  Further, the threshold of the second setting condition may be set to a stricter value than the threshold of the first setting condition. For example, the threshold for the number of detections of the closest face area within a predetermined distance of the tracking area, or for the number of consecutive detections there, may be set higher in the second setting condition than in the first. Alternatively, the threshold for the detection frequency of the closest face area within that distance, or for the degree of correlation between the movement trajectory of the closest face area and that of the tracking area within a predetermined period, may be set higher in the second setting condition than in the first. These conditions may also be combined.
  That is, it suffices that the second setting condition makes shifting to the face tracking process less likely than the first setting condition does, and various conditions can be applied.
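  As a purely illustrative comparison (every number below is an assumption), the relationship between the two conditions could be expressed as follows, with the second condition simply requiring more evidence than the first before face tracking is allowed.

```python
# Hypothetical parameter sets; the second condition demands more evidence.
FIRST_CONDITION = {"max_distance": 40, "min_detections": 1}
SECOND_CONDITION = {"max_distance": 40, "min_detections": 6,
                    "min_consecutive": 3, "min_frequency": 0.6}

def condition_met(cond, stats):
    """stats holds measurements accumulated for the closest face area."""
    return (stats["distance"] <= cond["max_distance"] and
            stats["detections"] >= cond["min_detections"] and
            stats.get("consecutive", 0) >= cond.get("min_consecutive", 0) and
            stats.get("frequency", 1.0) >= cond.get("min_frequency", 0.0))
```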
  In the above embodiment, the touch panel included in the operation unit 70 was taken as an example of the designation means by which the user designates the subject to be tracked, but the present invention is not limited to this. A dial or cross key included in the operation unit 70 may serve as the designation means. Alternatively, if the tracking process described above is performed in a personal computer application that captures a moving image, the subject to be tracked may be designated with a cursor, or the area of the screen in which that subject exists may be designated with a keyboard.
  In the above embodiment, an example was described in which the face detection unit 101 detects face areas from the display image data; however, image data for face detection, with a resolution lower than that of the display image data, may be generated from the signal produced by the image sensor 14, separately from the display image data.
  In addition, as an example of the object tracking process, a configuration was described in which either luminance information or color information is selected and the tracking area is detected from new image data using the correlation of the selected information; however, the invention is not limited to this. A new tracking area may be obtained from the sum of the correlations of both luminance information and color information, or using only one of them. That is, any configuration may be used as long as the tracking area is updated by continuously detecting areas with high correlation between images.
  Furthermore, the above embodiment is not limited to shooting; the main-subject tracking process described above may be applied, in a personal computer application, to a moving image that has already been shot. By tracking the main subject, luminance adjustment and color adjustment centered on the main subject can be performed on the moving image. The same naturally applies when playing back a moving image shot with a digital camera or digital video camera.
  Note that the present invention is not limited to the specific embodiments described above and can of course take various forms without departing from the gist of the present invention. The present invention can also be realized by the following processing: software (a program) that realizes the functions of the above-described embodiments is supplied to a system or apparatus via a network or various storage media, and the computer (or CPU, MPU, or the like) of the system or apparatus reads out and executes the program.
18 Timing generation circuit
22 Memory control circuit
50 System control circuit
70 Operation unit
101 Face detection unit
102 Image comparison unit

Claims (9)

  1. In an image processing apparatus that performs tracking by continuously detecting a tracking region that is a region of a subject to be tracked from a moving image composed of a plurality of images,
    Display means for displaying an image based on the image signal;
    Designation means for designating a position of a subject to be designated from an image displayed on the display means;
    Face detection means for detecting a face area from an image;
    Tracking means for performing a first tracking process in which a region having a high correlation with the tracking region set in any image is detected from another image and the detected region is set as the tracking region, and a second tracking process in which a face area of a person that can be regarded as the same as the face area set as the tracking region in any image is selected from the face areas detected in another image and the selected face area is set as the tracking region;
    Control means for causing the tracking means to select and perform either the first tracking process or the second tracking process;
    The control means is configured such that:
    when the position of the subject is designated by the designation means, a tracking area is set at the position designated by the designation means, and it is determined whether or not the face area detected by the face detection means satisfies a first setting condition; if the first setting condition is satisfied, the tracking area is set to the face area detected by the face detection means and the tracking means is caused to perform the second tracking process, and if the first setting condition is not satisfied, the tracking means is caused to perform the first tracking process; and
    after the face area detected by the face detection means fails to satisfy the first setting condition and the tracking means performs the first tracking process, it is determined whether or not the face area detected by the face detection means satisfies a second setting condition different from the first setting condition; if the second setting condition is satisfied, the tracking area is set to the face area detected by the face detection means and the tracking means is caused to perform the second tracking process, and if the second setting condition is not satisfied, the tracking means is caused to continue the first tracking process.
  2. The image processing apparatus according to claim 1, wherein the first setting condition and the second setting condition both relate to the position of the face area detected by the face detection means, and the second setting condition is set more strictly than the first setting condition.
  3. The image processing apparatus according to claim 2, wherein the first setting condition relates to the position of the face area detected by the face detection means from when the position of the subject is designated by the designation means until a predetermined time elapses, and the second setting condition relates to the position of the face area detected by the face detection means after the predetermined time has elapsed since the position of the subject was designated by the designation means.
  4. The image processing apparatus according to claim 1, wherein the first setting condition and the second setting condition relate to at least one of the number of detections of the face area detected by the face detection means within a predetermined distance of the tracking area, the number of continuous detections of the face area detected by the face detection means within a predetermined distance of the tracking area, the detection frequency of the face area detected by the face detection means within a predetermined distance of the tracking area, and the correlation between the movement locus of the tracking area and the movement locus of the face area detected by the face detection means, and wherein a threshold value of the second setting condition is set to a value higher than a threshold value of the first setting condition.
  5. The image processing apparatus according to claim 1, wherein the first setting condition and the second setting condition relate to at least one of the number of detections of the face area detected by the face detection means within a predetermined distance of the tracking area, the number of continuous detections of the face area detected by the face detection means within a predetermined distance of the tracking area, the detection frequency of the face area detected by the face detection means within a predetermined distance of the tracking area, and the correlation between the movement locus of the tracking area and the movement locus of the face area detected by the face detection means, and wherein a threshold value is set only for the second setting condition.
  6. The image processing apparatus according to claim 1, wherein the first setting condition is that the position of the set tracking area and the position of the face area detected by the face detection means fall within a predetermined range of each other at least once between when the position of the subject is designated by the designation means and when a predetermined time elapses.
  7. The image processing apparatus according to claim 6, wherein the second setting condition is that, in a state where the position of a first face area detected by the face detection means and the position of a second face area detected by the face detection means before the first face area is detected are within a predetermined range of each other, the number of times that the position of the set tracking area and the position of the first face area fall within a predetermined range exceeds a predetermined number of times.
  8. In an image processing method for performing tracking by continuously detecting a tracking region that is a region of a subject to be tracked from a moving image composed of a plurality of images,
    A display step of displaying an image based on the image signal on the display unit;
    A designation step for designating a position of a subject to be designated from the image displayed on the display unit;
    A face detection step of detecting a face area from the image;
    A first tracking step of detecting a region having a high correlation with the tracking region set in any image from another image, and performing a tracking process for setting the detected region as the tracking region;
    A second tracking step of performing a tracking process of selecting, from the face areas detected in another image, a face area of a person that can be regarded as the same as the face area set as the tracking area in any image, and setting the selected face area as the tracking area;
    A control step of selecting one of the first tracking step and the second tracking step and performing a tracking process;
    In the control step,
    when the position of the subject is designated in the designation step, a tracking area is set at the position designated in the designation step, and it is determined whether or not the face area detected in the face detection step satisfies a first setting condition; if the first setting condition is satisfied, the tracking area is set to the face area detected in the face detection step and the tracking process is performed in the second tracking step, and if the first setting condition is not satisfied, the tracking process is performed in the first tracking step, and
    after the face area detected in the face detection step fails to satisfy the first setting condition and the tracking process is performed in the first tracking step, it is determined whether or not the face area detected in the face detection step satisfies a second setting condition different from the first setting condition; if the second setting condition is satisfied, the tracking area is set to the face area detected in the face detection step and the tracking process is performed in the second tracking step, and if the second setting condition is not satisfied, the tracking process continues in the first tracking step.
  9. A program that, when read out and executed by a computer, causes the computer to perform each step of the method according to claim 8.
JP2009181517A 2009-08-04 2009-08-04 Image processing apparatus, image processing method, and program Active JP5419585B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009181517A JP5419585B2 (en) 2009-08-04 2009-08-04 Image processing apparatus, image processing method, and program

Publications (3)

Publication Number Publication Date
JP2011035765A true JP2011035765A (en) 2011-02-17
JP2011035765A5 JP2011035765A5 (en) 2012-09-13
JP5419585B2 JP5419585B2 (en) 2014-02-19

Family

ID=43764366

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009181517A Active JP5419585B2 (en) 2009-08-04 2009-08-04 Image processing apparatus, image processing method, and program

Country Status (1)

Country Link
JP (1) JP5419585B2 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1013729A (en) * 1996-06-19 1998-01-16 Matsushita Electric Works Ltd Tracking device
JP2009010777A (en) * 2007-06-28 2009-01-15 Sony Corp Imaging device, photography control method, and program
JP2009124565A (en) * 2007-11-16 2009-06-04 Casio Comput Co Ltd Imaging apparatus and program
JP2010072723A (en) * 2008-09-16 2010-04-02 Omron Corp Tracking device and tracking method

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8842213B2 (en) 2011-04-18 2014-09-23 Panasonic Corporation Image capture device, image capture device focus control method, and integrated circuit
WO2012144195A1 (en) * 2011-04-18 2012-10-26 Panasonic Corporation Image capture device, image capture device focus control method, and integrated circuit
JP5829679B2 (en) * 2011-04-18 2015-12-09 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America IMAGING DEVICE, FOCUSING CONTROL METHOD OF IMAGING DEVICE, AND INTEGRATED CIRCUIT
CN103210332A (en) * 2011-04-18 2013-07-17 松下电器产业株式会社 Image capture device, image capture device focus control method, and integrated circuit
US8779341B2 (en) 2011-06-09 2014-07-15 Canon Kabushiki Kaisha Imaging apparatus and exposure control method
CN102821249A (en) * 2011-06-09 2012-12-12 佳能株式会社 Imaging apparatus and exposure control method
US8878940B2 (en) 2011-06-29 2014-11-04 Olympus Imaging Corp. Tracking apparatus for tracking target subject in input image
JP5226903B1 (en) * 2011-06-29 2013-07-03 Olympus Imaging Corp. TRACKING DEVICE AND TRACKING METHOD
WO2013001940A1 (en) * 2011-06-29 2013-01-03 Olympus Imaging Corp. Tracking device, and tracking method
JP2013031167A (en) * 2011-07-19 2013-02-07 Axis Ab Method and camera for determining image adjustment parameter
CN103858043A (en) * 2011-10-12 2014-06-11 佳能株式会社 Imaging device, and method and program for controlling same
WO2013054726A1 (en) * 2011-10-12 2013-04-18 Canon Inc. Imaging device, and method and program for controlling same
JP2014164343A (en) * 2013-02-21 2014-09-08 Ricoh Co Ltd Image processing apparatus, image processing system, and program
JP2015062052A (en) * 2013-08-21 2015-04-02 キヤノン株式会社 Imaging apparatus and imaging method
JP2017525303A (en) * 2014-05-20 2017-08-31 西安中興新軟件有限責任公司 Image processing method, image processing apparatus, and computer recording medium

Legal Events

Date        Code  Title / Description
2012-07-31  A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2012-07-31  A621  Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2013-07-18  A977  Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2013-08-06  A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2013-10-04  A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523)
            TRDD  Decision of grant or rejection written
2013-10-22  A01   Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
2013-11-19  A61   First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
            R151  Written notification of patent or utility model registration (Ref document number: 5419585; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R151)