US10839198B2 - Image processing apparatus, image processing method, and program - Google Patents


Info

Publication number
US10839198B2
US10839198B2 (application US15/935,282; US201815935282A)
Authority
US
United States
Prior art keywords
images
target object
storage device
tracking
image processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US15/935,282
Other versions
US20180285627A1 (en)
Inventor
Daisuke Gunji
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc
Assigned to CANON KABUSHIKI KAISHA. Assignors: GUNJI, DAISUKE (assignment of assignors interest; see document for details)
Publication of US20180285627A1
Application granted
Publication of US10839198B2
Legal status: Active
Adjusted expiration

Classifications

    • G06K9/00261
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06K9/00255
    • G06K9/00275
    • G06K9/00288
    • G06K9/4652
    • G06K9/68
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/166Detection; Localisation; Normalisation using acquisition arrangements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/167Detection; Localisation; Normalisation using comparisons between temporally consecutive images
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • G06V40/169Holistic features and representations, i.e. based on the facial image taken as a whole
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • G06K2009/3291
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/62Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking

Definitions

  • The present invention relates to an image processing apparatus, an image processing method, and a program.
  • An image is recordable only when the face image that has been extracted with use of a face authentication technology satisfies the photographic conditions.
  • According to one aspect of the present invention, there is provided an image processing apparatus comprising: an acquisition unit configured to sequentially acquire images; an object recognition unit configured to recognize a target object in the images sequentially acquired by the acquisition unit; a tracking unit configured to perform tracking on the images sequentially acquired by the acquisition unit for the target object recognized by the object recognition unit; and a storage controller configured to control so as to store in a first storage device the images sequentially acquired by the acquisition unit, wherein the tracking unit further performs tracking for the target object on stored images, that are acquired by the acquisition unit before the target object is recognized by the object recognition unit in the images sequentially acquired by the acquisition unit, and are stored in the first storage device, and wherein the storage controller controls so as to store in a second storage device that is different from the first storage device, images in which the target object is tracked by the tracking unit of the images sequentially acquired by the acquisition unit, and images in which the target object is tracked by the tracking unit of the stored images.
  • According to another aspect of the present invention, there is provided an image processing method comprising: sequentially acquiring images; recognizing a target object in the images sequentially acquired in the sequentially acquiring; performing tracking on the images sequentially acquired in the sequentially acquiring for the target object recognized in the recognizing; controlling so as to store in a first storage device the images sequentially acquired in the sequentially acquiring; performing tracking for the target object on stored images, that are acquired before the target object is recognized in the sequentially acquired images, and have been stored in the first storage device; and storing in a second storage device that is different from the first storage device, images in which the target object is tracked of the sequentially acquired images, and images in which the target object is tracked in the stored images.
  • FIG. 1 is a block diagram for illustrating an image processing apparatus according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram for illustrating the image processing apparatus according to the first embodiment.
  • FIG. 3 is a flow chart for illustrating operation of the image processing apparatus according to the first embodiment.
  • FIG. 4 is a block diagram for illustrating an image processing apparatus according to a second embodiment of the present invention.
  • FIG. 5 is a flow chart for illustrating operation of the image processing apparatus according to the second embodiment.
  • FIG. 6 is a block diagram for illustrating an image processing apparatus according to a third embodiment of the present invention.
  • FIG. 7 is a flow chart for illustrating operation of the image processing apparatus according to the third embodiment.
  • FIG. 8A, FIG. 8B, FIG. 8C, and FIG. 8D are charts for showing images stored in each of a first storage device and a second storage device.
  • FIG. 1 is a block diagram for illustrating the image processing apparatus according to the first embodiment.
  • The image processing apparatus 101 according to the first embodiment is an electronic camera, that is, a digital camera, which is capable of acquiring a moving image and a still image, but the present invention is not limited thereto.
  • The image processing apparatus 101 may be a digital video camera, or a smartphone, which is an electronic device having both the function of a personal digital assistant (PDA) and the function of a mobile phone.
  • The image processing apparatus 101 may also be a tablet terminal or a PDA, for example.
  • The image processing apparatus 101 includes a non-volatile memory 102, an authentication unit 103, a tracking unit 104, a buffer memory 105, and an image pickup device 107.
  • The image processing apparatus 101 also includes a controller (not shown), which is configured to control respective units.
  • The image processing apparatus 101 includes an image pickup optical system 108, that is, a lens unit.
  • The image pickup device 107 includes an image pickup unit, which is configured to take an optical image formed with use of the image pickup optical system 108.
  • As the image pickup unit, a CMOS image sensor is used, for example.
  • The image pickup device (acquisition unit) 107 is configured to convert object light, which is caused to form an image by a lens, into an electrical signal by the image pickup unit, perform noise reduction processing, for example, to sequentially acquire image data, and output the image data.
  • The image pickup optical system 108 may or may not be removable from the image processing apparatus 101.
  • The image processing apparatus 101 includes a storage device 106.
  • The storage device 106 may or may not be removable from the image processing apparatus 101.
  • In the non-volatile memory 102, identification information for identifying an object is stored in advance as a database.
  • A case in which the object is a person, and in which the object is identified by face authentication, is described as an example.
  • As the identification information, a face image of the person is used, for example.
  • A face image of a person on which to focus is registered as the identification information with the non-volatile memory 102.
  • The authentication unit (object recognition unit) 103 is configured to authenticate the object.
  • The authentication unit 103 is configured to authenticate the object by face authentication, for example.
  • The authentication unit 103 is configured to authenticate the object based on the face image that has been registered in advance.
  • The authentication unit 103 extracts, from within an image acquired by the image pickup device 107, a portion that is estimated to be the face, and compares the portion to the face image registered with the non-volatile memory 102 to perform face authentication.
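The comparison step can be pictured as a nearest-match search over registered face features. This is only an illustrative sketch, not the patent's actual algorithm: the feature vectors, the `authenticate` helper, and the similarity threshold are all assumptions introduced for the example.

```python
import math

def cosine_similarity(a, b):
    # Similarity between two face feature vectors (higher = more alike).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def authenticate(candidate, registered, threshold=0.8):
    """Return the identity of the registered face that best matches the
    candidate features, or None when no match clears the threshold."""
    best_id, best_score = None, threshold
    for identity, features in registered.items():
        score = cosine_similarity(candidate, features)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id
```

A real system would extract the feature vectors with a face-recognition model; only the match-against-registered-database structure mirrors the description above.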
  • The case in which the object is a person has been described as an example, but the object may be an object other than a person.
  • The case in which object recognition is performed with use of face authentication is described here as an example.
  • A method of recognizing the object is not limited to face authentication, and any other object recognition method may be used as appropriate.
  • The tracking unit 104 is configured to perform tracking for the object authenticated by the authentication unit 103.
  • The tracking unit 104 is capable of tracking the object included in images that have been acquired by the image pickup device 107 before the time point at which the object is authenticated by the authentication unit 103 and have already been stored in the buffer memory 105.
  • The tracking unit 104 is configured to perform tracking for the object backward in chronological order, that is, going back in time, starting from the time point at which the object is authenticated by the authentication unit 103 in the image that has most recently been acquired by the image pickup device 107.
  • The tracking unit 104 performs object tracking not only in chronological order on images sequentially acquired by the image pickup device 107, but also in reverse chronological order on images that have been acquired in the past and have been recorded in the buffer. Stated differently, the tracking unit 104 performs tracking for the object in order from the most recent image to less recent images among the images that have been acquired by the image pickup device 107 before the time point at which the object is authenticated by the authentication unit 103 and have already been stored in the buffer memory 105. When the image at the time when photographing of the object authenticated by the authentication unit 103 started is present in the buffer memory 105, the tracking unit 104 performs tracking back to that image, that is, that frame. When that image is not present in the buffer memory 105, tracking is performed back to the least recent image, that is, the least recent frame.
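The reverse-chronological pass can be sketched as follows. The frame list, the `object_present` predicate, and the index arithmetic are illustrative assumptions; the patent does not specify the tracker's internals.

```python
def track_backward(frames, auth_index, object_present):
    """Walk from the frame in which the object was authenticated back
    toward the least recent buffered frame, collecting every frame in
    which the target can still be tracked.  Stops at the first frame
    where tracking fails, or at the oldest frame in the ring buffer."""
    tracked = []
    for i in range(auth_index, -1, -1):
        if not object_present(frames[i]):
            break  # beyond this point the target is untrackable
        tracked.append(frames[i])
    tracked.reverse()  # restore chronological order
    return tracked
```

When the target is present all the way back, the loop naturally terminates at index 0, matching the "least recent frame" case described above.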
  • The tracking unit 104 tracks the face of the object authenticated by the authentication unit 103 and a portion other than the face that is estimated based on the face.
  • The tracking unit 104 performs tracking as long as the object authenticated by the authentication unit 103, that is, the person having his/her face registered, is estimated to be included in the image data.
  • The tracking unit 104 detects the face portion of the object authenticated by the authentication unit 103, that is, a movement locus of the face region, and determines a skin-color region located on an extended line of the movement locus as the face of the object to perform tracking.
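The locus-based step can be illustrated with a simple linear extrapolation; the real apparatus would combine this with the skin-color test, which is omitted here. The function name and the one-frame-ahead assumption are illustrative only.

```python
def predict_next_position(locus):
    """Extrapolate the face region's movement locus one frame ahead by
    extending the line through the two most recent positions; a
    skin-color region found near the predicted point would then be
    taken to be the tracked face."""
    (x1, y1), (x2, y2) = locus[-2], locus[-1]
    return (2 * x2 - x1, 2 * y2 - y1)
```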
  • The tracking unit 104 determines, as a target object, an object that is located on the extended line of the locus of the object authenticated by the authentication unit 103 and that satisfies a predetermined condition, and tracks the target object. In this manner, the object can be tracked back to a frame in which face authentication of the object failed because the face region was small, for example. Moreover, the tracking unit 104 can also determine an object that is located at substantially the same point as the object authenticated by the authentication unit 103, and that is similar to the object in size, shape, and other features, to be the target object, and perform tracking for the target object.
  • The tracking unit 104 can determine an object having differences in position and shape that are within a predetermined level from the object authenticated by the authentication unit 103 to be the target object, and track the target object. In this manner, for example, the object can be tracked back to a frame in which face authentication failed because the object was facing backward and the face was hidden.
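The "within a predetermined level" test above can be sketched as a bounding-box comparison. The box representation and both thresholds are assumptions for illustration; the patent leaves the actual predetermined levels unspecified.

```python
def same_object(prev_box, cur_box, max_shift=20, max_area_delta=0.2):
    """Treat cur_box as the same target as prev_box when the differences
    in position and size are within predetermined levels.  Boxes are
    (x, y, w, h); both thresholds are illustrative, not from the patent."""
    dx = abs(cur_box[0] - prev_box[0])
    dy = abs(cur_box[1] - prev_box[1])
    prev_area = prev_box[2] * prev_box[3]
    area_delta = abs(cur_box[2] * cur_box[3] - prev_area) / prev_area
    return dx <= max_shift and dy <= max_shift and area_delta <= max_area_delta
```

A test like this lets tracking continue through frames where the face itself is hidden, since it relies only on position and shape.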
  • When the object authenticated by the authentication unit 103 is included in an image, the tracking unit 104 generates metadata containing information indicating that the object is included, and information indicating that the image is an image extracted in tracking by the tracking unit 104.
  • The thus-generated metadata is stored in the storage device 106 in association with the image when the image is stored in the storage device 106. Tracking of the object in images acquired after the object is authenticated by the authentication unit 103 may be performed by the tracking unit 104 or by the authentication unit 103.
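The metadata can be pictured as a small per-frame record. The field names below are purely illustrative; the patent only requires the two pieces of information described above.

```python
def build_tracking_metadata(frame_index, object_included, extracted_in_tracking):
    # Hypothetical per-frame metadata record, stored in association with
    # the image when the image is written to the storage device 106.
    return {
        "frame": frame_index,
        "object_included": object_included,
        "extracted_in_tracking": extracted_in_tracking,
    }
```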
  • The buffer memory (first storage device) 105 is used as a ring buffer.
  • In the buffer memory 105, each of a plurality of images, that is, each of a plurality of frames, forming a moving image acquired by the image pickup device 107 is stored temporarily.
  • The images, that is, the image data, sequentially stored in the buffer memory 105 may be copied or moved to the storage device 106.
  • While free memory space is available in the buffer memory 105, the image data is sequentially stored in the free memory space.
  • When no more free memory space is available, the buffer memory 105 is used as a ring buffer, and old image data is sequentially overwritten with new image data.
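The overwrite behavior of the buffer memory 105 matches a fixed-capacity ring buffer, which can be sketched with a bounded deque; the class name and capacity are illustrative.

```python
from collections import deque

class FrameRingBuffer:
    """Fixed-capacity frame store: once the buffer is full, each newly
    stored frame overwrites the oldest one, exactly as a bounded deque
    does."""
    def __init__(self, capacity):
        self._frames = deque(maxlen=capacity)

    def store(self, frame):
        self._frames.append(frame)  # oldest frame is dropped when full

    def snapshot(self):
        return list(self._frames)   # least recent -> most recent
```

Frames destined for permanent storage would be copied out of the buffer before they are overwritten.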
  • The storage device (second storage device) 106 is configured to permanently store the image data and other such data.
  • Image data that is required to be permanently stored is stored in the storage device 106.
  • The image data acquired by the image pickup device 107 is recorded in the storage device 106 via the buffer memory 105.
  • FIG. 2 is a block diagram for illustrating the image processing apparatus according to the first embodiment.
  • The image processing apparatus 101 includes a display 201, a video RAM (VRAM) 202, a bit move unit (BMU) 203, an operation device 204, the image pickup device 107, and a central processing unit (CPU) 206.
  • The image processing apparatus 101 further includes a read only memory (ROM) 207, a random access memory (RAM) 208, and a flash memory 209.
  • The image processing apparatus 101 further includes a memory card interface (memory card I/F) 210, a network interface (network I/F, NET I/F) 211, and a bus 212.
  • The image processing apparatus 101 includes the image pickup optical system 108 as described above.
  • The image pickup optical system 108 may or may not be removable from the image processing apparatus 101.
  • The memory card I/F 210 is included in a memory card slot (not shown) included in the image processing apparatus 101. To the memory card slot, a memory card 205 is mountable. The image data and other such data are written to and read from the memory card 205 via the memory card I/F 210.
  • On the display 201, user interface information for operating the image processing apparatus 101 according to the first embodiment is displayed, for example.
  • Examples of the user interface information include an icon, a message, and a menu.
  • Examples of the display 201 include a liquid crystal monitor, but the present invention is not limited thereto.
  • The display 201 may be formed integrally with the image processing apparatus 101, or may be formed separately from the image processing apparatus 101. When the display 201 is formed integrally with the image processing apparatus 101, a display screen of the display 201 corresponds to a display screen of the image processing apparatus 101.
  • In the VRAM 202, image data used in displaying the screen of the display 201 is temporarily stored.
  • The image data stored in the VRAM 202 is transferred to the display 201 in accordance with a predetermined method to display an image on the display 201.
  • The BMU 203 is configured to control data transfer between memories, and data transfer between a memory and each I/O device, for example.
  • Examples of the data transfer between the memories include data transfer between the VRAM 202 and another memory.
  • Examples of the data transfer between the memory and each I/O device include data transfer between the memory and the network I/F 211.
  • The operation device 204 is used to input various operations, and includes a button and a switch, for example.
  • Examples of the button include a power button, an instruction button, a menu display button, and an enter button.
  • Examples of the switch include a change-over switch.
  • All types of operating elements, such as a cursor key, a pointing device, a touch panel, and a dial, may be used as the operation device 204.
  • Operating members of the operation device 204 may be embodied by various functional icons displayed on the display 201 . A user may perform an operation by selecting one of those functional icons as appropriate.
  • The CPU (controller, storage controller) 206 is configured to control the entire image processing apparatus 101, and to control each functional block based on a computer program (control program) for controlling the image processing apparatus 101 according to the first embodiment.
  • The computer program is stored in the ROM 207 or the flash memory 209, for example.
  • The ROM 207 stores various computer programs and data, for example.
  • The RAM 208 may function as a work area for the CPU 206, a data save area at the time of error processing, and an area for loading the computer programs, for example.
  • The flash memory 209 records the various computer programs executed by the image processing apparatus 101, and the data temporarily stored in the RAM 208, for example.
  • The memory card 205 records the data temporarily stored in the RAM 208 or the flash memory 209, for example.
  • The network I/F 211 is used for communication to/from another information processing device, a printer, and other such devices via a network.
  • The bus 212 includes an address bus, a data bus, and a control bus.
  • The computer programs may be provided to the CPU 206 from the ROM 207, the flash memory 209, or the memory card 205, or from another information processing apparatus through a network via the network I/F 211.
  • A solid state drive (SSD) or the like may be provided instead of the flash memory 209.
  • The non-volatile memory 102 described above with reference to FIG. 1 may be embodied by the flash memory 209 illustrated in FIG. 2, for example.
  • The authentication unit 103 and the tracking unit 104 described above with reference to FIG. 1 may be embodied by the CPU 206 illustrated in FIG. 2, for example.
  • The storage device 106 described above with reference to FIG. 1 may be embodied by the memory card 205 illustrated in FIG. 2, for example.
  • The buffer memory 105 described above with reference to FIG. 1 may be embodied by the flash memory 209, for example.
  • FIG. 3 is a flow chart for illustrating operation of the image processing apparatus 101 according to the first embodiment.
  • In Step S301, the CPU 206 controls the respective functional blocks so that the image pickup device 107 starts taking the moving image.
  • The timing to start taking the image may be the time when the image processing apparatus 101 is powered on, for example.
  • The timing to start taking the image is not limited thereto, however.
  • For example, the image may start being taken when the display 201 becomes ready to display an image.
  • The image data, that is, each frame forming the moving image, acquired by the image pickup device 107 is sequentially stored in the buffer memory 105.
  • The buffer memory 105 is used as a ring buffer. Therefore, when no more free memory space is available in the buffer memory 105, old image data is sequentially overwritten with new image data.
  • In Step S302, the CPU 206 executes face authentication processing using the authentication unit 103 on the image data acquired by the image pickup device 107.
  • While no object is authenticated, Step S302 is continued.
  • When the object is authenticated, the processing proceeds to Step S303.
  • In Step S303, the CPU 206 executes tracking using the tracking unit 104.
  • Tracking is performed on the object authenticated by the authentication unit 103.
  • Tracking for the object is executed by the tracking unit 104 on previous images that have been acquired by the image pickup device 107 before the time point at which the object is authenticated by the authentication unit 103 and have already been stored in the buffer memory 105.
  • Tracking for the object is performed by the tracking unit 104 in reverse chronological order starting from the time point at which the object is authenticated by the authentication unit 103 in the image that has most recently been acquired by the image pickup device 107.
  • When the object authenticated by the authentication unit 103 is included in the image, the tracking unit 104 generates metadata containing the information indicating that the object is included, and the information indicating that the image is an image extracted in tracking by the tracking unit 104.
  • The thus-generated metadata is stored in the storage device 106 in association with the image when the image is stored in the storage device 106.
  • The metadata may not be generated at this stage.
  • The metadata may instead be generated in Step S304, which is described below.
  • In Step S304, the CPU 206 stores the image data, that is, each frame forming the moving image, in the storage device 106.
  • The image data to be stored in the storage device 106 is the following image data. Specifically, as shown in FIG. 8A, of image data 802 that has already been acquired by the image pickup device 107 before a time point 801 at which the object is authenticated and has been stored in the buffer memory 105, image data 803 back to a time point 805 beyond which the target object is untrackable in tracking 804 by the tracking unit 104 is stored in the storage device 106.
  • The image data 802 stored in the buffer memory 105 is sequentially overwritten with new image data and erased, but the image data 803 of the image data 802 remains by being moved or copied to the storage device 106.
  • Image data 809 before the time point 805 is erased from the buffer memory 105 without being stored in the storage device 106.
  • Image data 806 acquired by a time point 808 beyond which the object is untrackable in tracking 807 by the tracking unit 104 is also to be stored in the storage device 106.
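The selection shown in FIG. 8A can be sketched as finding the contiguous trackable run that ends at the authentication time point; everything older (image data 809) is simply left in the ring buffer to be overwritten. The list representation and the `trackable` predicate are assumptions introduced for illustration.

```python
def select_recordable_range(buffered, auth_index, trackable):
    """Return the run of buffered frames ending at auth_index (the
    authentication time point 801) in which the target is trackable,
    i.e. image data 803 back to time point 805; the older frames
    correspond to the discarded image data 809."""
    start = auth_index
    while start > 0 and trackable(buffered[start - 1]):
        start -= 1
    return buffered[start:auth_index + 1]
```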
  • The CPU 206 edits those pieces of image data and the above-mentioned metadata so that they satisfy a predetermined file format, and performs control so as to store in the storage device 106 a moving image file obtained by the editing.
  • The image data sequentially acquired by the image pickup device 107 is stored in the buffer memory 105. Then, tracking is executed on the stored image data that has already been stored in the buffer memory 105 at the time point at which the object is authenticated, to thereby extract image data in which the object is included. Therefore, according to the first embodiment, storage control can be performed such that not only the images acquired after the time point at which the object is authenticated but also the images of the object that have been acquired before the object is authenticated are stored in the storage device 106.
  • FIG. 4 is a block diagram for illustrating the image processing apparatus according to the second embodiment.
  • The same components as those of the image processing apparatus according to the first embodiment illustrated in FIG. 1 to FIG. 3 are denoted by the same reference symbols, and a description thereof is omitted or simplified.
  • An image processing apparatus 401 writes the image data including the object authenticated by the authentication unit 103 in the storage device 106 in a case where a record switch 402 is on.
  • The image processing apparatus 401 includes a non-volatile memory 102, an authentication unit 103, a tracking unit 104, a buffer memory 105, an image pickup device 107, and the record switch 402.
  • The image processing apparatus 401 also includes a controller (not shown), which is configured to control respective units.
  • The image processing apparatus 401 includes an image pickup optical system 108, that is, a lens unit.
  • The image pickup optical system 108 may or may not be removable from the image processing apparatus 401.
  • The image processing apparatus 401 includes a storage device 106.
  • The storage device 106 may or may not be removable from the image processing apparatus 401.
  • The record switch 402 is used to set whether to store, that is, to record, image data in the storage device 106. In a case where the record switch 402 is off, even when the object is authenticated by the authentication unit 103, the image data including the object is not written into the storage device 106. It should be noted, however, that even in the case where the record switch 402 is off, storage of the image data acquired by the image pickup device 107 in the buffer memory 105 is not suspended, and authentication processing by the authentication unit 103 and tracking by the tracking unit 104 are also performed as appropriate. In the case where the record switch 402 is on, the image data including the object authenticated by the authentication unit 103 is written into the storage device 106.
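The record-switch behavior can be sketched per frame: buffering, authentication, and tracking always run, while writes to the second storage device are gated on the switch. All container and callback names here are illustrative, not the apparatus's actual interfaces.

```python
def process_frame(frame, ring_buffer, storage, is_target, record_switch_on):
    """Second-embodiment control for one frame: the frame is always
    buffered, but it is written to permanent storage only when the
    target is present AND the record switch 402 is on."""
    ring_buffer.append(frame)  # buffering continues even when switch is off
    if record_switch_on and is_target(frame):
        storage.append(frame)
```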
  • The record switch 402 may be embodied by the operation device 204 illustrated in FIG. 2, for example.
  • FIG. 5 is a flow chart for illustrating operation of the image processing apparatus according to the second embodiment.
  • Step S501 to Step S503 are similar to Step S301 to Step S303 described above with reference to FIG. 3, and hence a description thereof is omitted.
  • In Step S504, in the case where the record switch 402 is on (YES in Step S504), the processing proceeds to Step S505. Meanwhile, in the case where the record switch 402 is off (NO in Step S504), the processing returns to Step S502.
  • Step S505 is similar to Step S304 described above with reference to FIG. 3, and hence a description thereof is omitted.
  • Thus, the image data including the object authenticated by the authentication unit 103 may be written into the storage device 106 in the case where the record switch 402 is on.
  • FIG. 6 is a block diagram for illustrating the image processing apparatus according to the third embodiment.
  • An image processing apparatus 601 includes a non-volatile memory 102, an authentication unit 103, a tracking unit 104, a buffer memory 105, an image pickup device 107, and a storage instruction unit 602.
  • The image processing apparatus 601 also includes a controller (not shown), which is configured to control respective units.
  • The image processing apparatus 601 includes an image pickup optical system 108, that is, a lens unit.
  • The image pickup optical system 108 may or may not be removable from the image processing apparatus 601.
  • The image processing apparatus 601 includes a storage device 106.
  • The storage device 106 may or may not be removable from the image processing apparatus 601.
  • The storage instruction unit (instruction unit) 602 is used by the user to instruct the image processing apparatus 601 whether to keep the image data that has been stored in the storage device 106.
  • When storage is instructed, the CPU 206 performs control so as to maintain a state in which the image data is stored in the storage device 106, without erasing the image data from the storage device 106.
  • When erasure is instructed, the CPU 206 performs control so as to erase the image data from the storage device 106.
  • the storage instruction unit 602 may be implemented by the operation device 204 illustrated in FIG. 2 , for example.
  • FIG. 7 is a flow chart for illustrating operation of the image processing apparatus according to the third embodiment.
  • Step S 701 to Step S 704 are similar to Step S 301 to Step S 304 described above with reference to FIG. 3 , and hence a description thereof is omitted.
  • Step S 705: When the user operates the storage instruction unit 602 so that the image data stored in the storage device 106 is kept (YES in Step S 705), the CPU 206 performs control so as not to erase the image data from the storage device 106. In other words, the CPU 206 performs control so as to maintain the state in which the image data is stored in the storage device 106. Meanwhile, when the user operates the storage instruction unit 602 so that the image data stored in the storage device 106 is erased (NO in Step S 705), the CPU 206 performs control so as to erase the image data from the storage device 106 (Step S 706).
  • The image data including the object authenticated by the authentication unit 103 may thus be stored in the storage device 106 based on the user's instruction to store it.
  • Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
  • The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
  • The computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
  • The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
  • Images in which the object is photographed but in which tracking of the object has failed can also be recorded, and it is possible to prevent a circumstance in which images are not recorded even though they include the object.
  • Images for the predetermined time before the target object is tracked may also be stored in the storage device 106.
  • The present invention is not limited thereto.
  • Image data up to a time point 818, which is a predetermined time T 2 after the time point 808 beyond which the object is untrackable in tracking 807, may be stored in the storage device 106.
  • Images for the predetermined time after the target object is no longer tracked may also be stored in the storage device 106.
  • Image data 803 and 806 in a period in which the target object is tracked are stored in the storage device 106, but the present invention is not limited thereto.
  • Image data 826 acquired after the time point 801 at which the target object is authenticated, including a period in which the target object is not tracked, may be stored in the storage device 106, and the image data 826 may be stored in association with metadata 827 containing information indicating that the object is included for the period in which the target object is tracked.
  • Image data 823 that has been stored in the buffer memory 105 at the time point 801 at which the target object is authenticated may be stored in the storage device 106, and the image data 823 may be stored in association with metadata 824 containing information indicating that the object is included for the period in which the target object is successfully tracked. Then, images for the periods in which the target object is tracked may be extracted from the image data 826 and 823 recorded as described above in accordance with the metadata 827 and 824, and the extracted images may be played, with the result that the user can see the images of the parts including the target object.
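The playback step described in the last bullet can be sketched as interval extraction over the stored frames. The `(start, end)` frame-index encoding of the metadata is a hypothetical format chosen for illustration; the embodiments do not fix a concrete metadata layout.

```python
def extract_tracked_parts(frames, metadata_intervals):
    """Sketch of playing back only the parts that include the target:
    the stored moving image (image data 823 or 826) keeps every frame,
    and the associated metadata (824 or 827) marks the tracked periods."""
    parts = []
    for start, end in metadata_intervals:
        parts.extend(frames[start:end + 1])  # inclusive tracked period
    return parts

frames = list(range(10))
print(extract_tracked_parts(frames, [(2, 4), (7, 8)]))  # [2, 3, 4, 7, 8]
```

A player built this way never needs to re-run tracking: the metadata stored alongside the image data is sufficient to seek to the parts that include the target object.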

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Studio Devices (AREA)

Abstract

There are provided an object recognition unit configured to recognize a target object in sequentially acquired images, a tracking unit configured to perform tracking for the recognized target object on the sequentially acquired images, and a storage controller configured to store the sequentially acquired images in a first storage device. The tracking unit further performs tracking for the target object on stored images that are acquired before the target object is recognized by the object recognition unit and are stored in the first storage device. The storage controller performs control so as to store, in a second storage device different from the first storage device, those of the sequentially acquired images in which the target object is tracked by the tracking unit and those of the stored images in which the target object is tracked by the tracking unit.

Description

BACKGROUND OF THE INVENTION Field of the Invention
The present invention relates to an image processing apparatus, an image processing method, and a program.
Description of the Related Art
In the related art, there has been proposed a technology involving registering a face image in advance, executing face recognition on an image to be recorded that has been acquired by an image pickup apparatus, and recording the image to be recorded in a recording medium when an extracted face image satisfies photographic conditions (orientation of the face, opening or closing of the eyes, line of sight, and the like) (Japanese Patent Application Laid-Open No. 2007-20105).
However, with the related art, the image is recordable only when the face image that has been extracted with use of a face authentication technology satisfies the photographic conditions.
SUMMARY OF THE INVENTION
According to one aspect of the present invention, there is provided an image processing apparatus, comprising: an acquisition unit configured to sequentially acquire images; an object recognition unit configured to recognize a target object in the images sequentially acquired by the acquisition unit; a tracking unit configured to perform tracking on the images sequentially acquired by the acquisition unit for the target object recognized by the object recognition unit; and a storage controller configured to control so as to store in a first storage device the images sequentially acquired by the acquisition unit, wherein the tracking unit further performs tracking for the target object on stored images, that are acquired by the acquisition unit before the target object is recognized by the object recognition unit in the images sequentially acquired by the acquisition unit, and are stored in the first storage device, and wherein the storage controller controls so as to store in a second storage device that is different from the first storage device, images in which the target object is tracked by the tracking unit of the images sequentially acquired by the acquisition unit, and images in which the target object is tracked by the tracking unit of the stored images.
According to another aspect of the present invention, there is provided an image processing method, comprising: sequentially acquiring images; recognizing a target object in the images sequentially acquired in the sequentially acquiring; performing tracking on the images sequentially acquired in the sequentially acquiring for the target object recognized in the recognizing; controlling so as to store in a first storage device the images sequentially acquired in the sequentially acquiring; performing tracking for the target object on stored images, that are acquired before the target object is recognized in the sequentially acquired images, and have been stored in the first storage device; and storing in a second storage device that is different from the first storage device, images in which the target object is tracked of the sequentially acquired images, and images in which the target object is tracked in the stored images.
According to still another aspect of the present invention, there is provided a program for causing a computer to execute: sequentially acquiring images; recognizing a target object in the images sequentially acquired in the sequentially acquiring; performing tracking on the images sequentially acquired in the sequentially acquiring for the target object recognized in the recognizing; controlling so as to store in a first storage device the images sequentially acquired in the sequentially acquiring; performing tracking for the target object on stored images, that are acquired before the target object is recognized in the sequentially acquired images, and have been stored in the first storage device; and storing in a second storage device that is different from the first storage device, images in which the target object is tracked of the sequentially acquired images, and images in which the target object is tracked in the stored images.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram for illustrating an image processing apparatus according to a first embodiment of the present invention.
FIG. 2 is a block diagram for illustrating the image processing apparatus according to the first embodiment.
FIG. 3 is a flow chart for illustrating operation of the image processing apparatus according to the first embodiment.
FIG. 4 is a block diagram for illustrating an image processing apparatus according to a second embodiment of the present invention.
FIG. 5 is a flow chart for illustrating operation of the image processing apparatus according to the second embodiment.
FIG. 6 is a block diagram for illustrating an image processing apparatus according to a third embodiment of the present invention.
FIG. 7 is a flow chart for illustrating operation of the image processing apparatus according to the third embodiment.
FIG. 8A, FIG. 8B, FIG. 8C, and FIG. 8D are charts for showing images stored in each of a first storage device and a second storage device.
DESCRIPTION OF THE EMBODIMENTS
Exemplary embodiments of the present invention are hereinafter described in detail with reference to the attached drawings. The present invention is not limited to the embodiments below.
First Embodiment
An image processing apparatus and an image processing method according to a first embodiment of the present invention are described with reference to the drawings. FIG. 1 is a block diagram for illustrating the image processing apparatus according to the first embodiment. Description is given here of an example of a case in which an image processing apparatus 101 according to the first embodiment is an electronic camera, that is, a digital camera, which is capable of acquiring a moving image and a still image, but the present invention is not limited thereto. For example, the image processing apparatus 101 may be a digital video camera, or a smart phone, that is, an electronic device having both the function of a personal digital assistant and the function of a mobile phone. Moreover, the image processing apparatus 101 may be a tablet terminal or a personal digital assistant (PDA), for example.
As illustrated in FIG. 1, the image processing apparatus 101 according to the first embodiment includes a non-volatile memory 102, an authentication unit 103, a tracking unit 104, a buffer memory 105, and an image pickup device 107. The image processing apparatus 101 also includes a controller (not shown), which is configured to control respective units. The image processing apparatus 101 includes an image pickup optical system 108, that is, a lens unit. The image pickup device 107 includes an image pickup unit, which is configured to take an optical image formed with use of the image pickup optical system 108. As the image pickup unit, a CMOS image sensor is used, for example. The image pickup device (acquisition unit) 107 is configured to convert object light, which is caused to form an image by a lens, into an electrical signal by the image pickup unit, perform noise reduction processing, for example, to sequentially acquire image data, and output the image data. The image pickup optical system 108 may or may not be removable from the image processing apparatus 101. The image processing apparatus 101 includes a storage device 106. The storage device 106 may or may not be removable from the image processing apparatus 101.
In the non-volatile memory 102, identification information for identifying an object is stored in advance as a database. Here, a case in which the object is a person and in which the object is identified by face authentication is described as an example. As the identification information, a face image of the person is used, for example. For example, a face image of a person on whom to focus is registered as the identification information with the non-volatile memory 102.
The authentication unit (object recognition unit) 103 is configured to authenticate the object. The authentication unit 103 is configured to authenticate the object by face authentication, for example. The authentication unit 103 is configured to authenticate the object based on the face image that has been registered in advance. The authentication unit 103 extracts, from within an image acquired by the image pickup device 107, a portion that is estimated to be the face, and compares the portion to the face image registered with the non-volatile memory 102 to perform face authentication. Here, the case in which the object is a person has been described as an example, but the object may be an object other than a person. Moreover, the case in which object recognition is performed with use of face authentication is described here as an example. However, a method of recognizing the object is not limited to face authentication, and any other object recognition method may be used as appropriate.
The tracking unit 104 is configured to perform tracking for the object authenticated by the authentication unit 103. The tracking unit 104 is capable of tracking the object included in images that have been acquired by the image pickup device 107 before a time point at which the object is authenticated by the authentication unit 103 and have already been stored in the buffer memory 105. The tracking unit 104 is configured to perform tracking for the object backward in chronological order, that is, going back in time, starting from a time point at which the object is authenticated by the authentication unit 103 in an image that has most recently been acquired by the image pickup device 107. In other words, the tracking unit 104 performs object tracking not only in chronological order on images sequentially acquired by the image pickup device 107, but also in reverse chronological order on images that have been acquired in the past and have been recorded in the buffer. Stated differently, the tracking unit 104 performs tracking for the object in order from the most recent image to less recent images of the images that have been acquired by the image pickup device 107 before the time point at which the object is authenticated by the authentication unit 103 and have already been stored in the buffer memory 105. When an image at the time when the object authenticated by the authentication unit 103 has started being photographed is present in the buffer memory 105, the tracking unit 104 performs tracking back to the image, that is, the frame. When the image at the time when the object authenticated by the authentication unit 103 has started being photographed is not present in the buffer memory 105, tracking is performed back to the least recent image, that is, the least recent frame.
The tracking unit 104 tracks the face of the object authenticated by the authentication unit 103 and a portion other than the face that is estimated based on the face. The tracking unit 104 performs tracking as long as the object authenticated by the authentication unit 103, that is, the person having his/her face registered, is estimated to be included in image data. For example, the tracking unit 104 detects the face portion of the object authenticated by the authentication unit 103, that is, a movement locus of a face region, and determines a skin-color region located on an extended line of the movement locus as the face of the object to perform tracking. Stated differently, the tracking unit 104 determines, as a target object, an object that is located on the extended line of the locus of the object authenticated by the authentication unit 103 and that satisfies a predetermined condition, and tracks the target object. In this manner, the object can be tracked back to a frame in which face authentication of the object has failed because the face region was small, for example. Moreover, the tracking unit 104 can also determine an object that is located at substantially the same point as the object authenticated by the authentication unit 103 and that is similar to the object in size, shape, and other features to be the object, and perform tracking for the object. Stated differently, the tracking unit 104 can determine an object having differences in position and shape that are within a predetermined level from the object authenticated by the authentication unit 103 to be the target object, and track the target object. In this manner, for example, the object can be tracked back to a frame in which face authentication has failed because the object was facing backward and the face was hidden.
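As a rough illustration of this reverse-chronological tracking, the following sketch walks a list of buffered frames (oldest first) backward from the most recent frame, in which authentication has just succeeded, until the target can no longer be matched. The `contains_object` predicate is a hypothetical stand-in for the actual similarity tests (movement locus, position, and shape) described above, not the patent's matching criterion.

```python
def track_backward(buffered_frames, contains_object):
    """Walk the buffer in reverse chronological order, starting from the
    most recent frame (where authentication succeeded), and return the
    index of the least recent frame in which the target is still tracked."""
    earliest = len(buffered_frames) - 1  # frame in which authentication succeeded
    for idx in range(len(buffered_frames) - 1, -1, -1):
        if not contains_object(buffered_frames[idx]):
            break  # the target is untrackable beyond this frame; stop going back
        earliest = idx
    return earliest

# Frames 0 and 1 predate the target; backward tracking stops when it reaches them.
frames = [0, 0, 1, 1, 1]  # 1 = target estimated present
print(track_backward(frames, lambda f: f == 1))  # 2
```

Frames before the returned index correspond to the image data that is later erased from the buffer without being stored.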
When the object authenticated by the authentication unit 103 is included in an image, the tracking unit 104 generates metadata containing information indicating that the object is included, and information indicating that the image is an image extracted in tracking by the tracking unit 104. The thus-generated metadata is stored in the storage device 106 in association with the image when the image is stored in the storage device 106. Tracking of the object in images acquired after the object is authenticated by the authentication unit 103 may be performed by the tracking unit 104 or by the authentication unit 103.
The buffer memory (first storage device) 105 is used as a ring buffer. In the buffer memory 105, each of a plurality of images, that is, each of a plurality of frames, forming a moving image acquired by the image pickup device 107 is stored temporarily. The images, that is, the image data, sequentially stored in the buffer memory 105 may be copied or moved to the storage device 106. When free memory space is available in the buffer memory 105, the image data is sequentially stored in the free memory space. When no more free memory space is available in the buffer memory 105, the buffer memory 105 is used as a ring buffer, and old image data is sequentially overwritten with new image data.
The storage device (second storage device) 106 is configured to permanently store the image data and other such data. Of the image data stored in the buffer memory 105, image data that is required to be permanently stored is stored in the storage device 106. In this manner, the image data acquired by the image pickup device 107 is recorded in the storage device 106 via the buffer memory 105.
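The buffering behavior described above can be sketched with a fixed-capacity ring buffer. The capacity of four frames and the string frames are arbitrary assumptions for illustration; the embodiments do not specify a buffer size.

```python
from collections import deque

class FrameRingBuffer:
    """Sketch of the buffer memory 105: a fixed-capacity ring buffer in
    which the oldest frame is overwritten once no free space remains."""

    def __init__(self, capacity):
        # A deque with maxlen silently discards the oldest entry when full,
        # which models the sequential overwriting of old image data.
        self._frames = deque(maxlen=capacity)

    def push(self, frame):
        self._frames.append(frame)

    def snapshot(self):
        # Oldest-first view of what is currently buffered.
        return list(self._frames)

buf = FrameRingBuffer(capacity=4)
for i in range(6):
    buf.push(f"frame{i}")
print(buf.snapshot())  # ['frame2', 'frame3', 'frame4', 'frame5']
```

Only frames still present in this buffer at authentication time are candidates for backward tracking and for being moved or copied to the storage device 106.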
FIG. 2 is a block diagram for illustrating the image processing apparatus according to the first embodiment.
The image processing apparatus 101 includes a display 201, a video RAM (VRAM) 202, a bit move unit (BMU) 203, an operation device 204, the image pickup device 107, and a central processing unit (CPU) 206. The image processing apparatus 101 further includes a read only memory (ROM) 207, a random access memory (RAM) 208, and a flash memory 209. The image processing apparatus 101 further includes a memory card interface (memory card I/F) 210, a network interface (network I/F, NET I/F) 211, and a bus 212. The image processing apparatus 101 includes the image pickup optical system 108 as described above. As described above, the image pickup optical system 108 may or may not be removable from the image processing apparatus 101. The memory card I/F 210 is included in a memory card slot (not shown) included in the image processing apparatus 101. To the memory card slot, a memory card 205 is mountable. The image data and other such data are written to and read from the memory card 205 via the memory card I/F 210.
On the display 201, user interface information for operating the image processing apparatus 101 according to the first embodiment is displayed, for example. Examples of the user interface information include an icon, a message, and a menu. Examples of the display 201 include a liquid crystal monitor, but the present invention is not limited thereto. The display 201 may be formed integrally with the image processing apparatus 101, or may be formed separately from the image processing apparatus 101. When the display 201 is formed integrally with the image processing apparatus 101, a display screen of the display 201 corresponds to a display screen of the image processing apparatus 101.
In the VRAM 202, image data used in displaying the screen of the display 201 is temporarily stored. The image data stored in the VRAM 202 is transferred to the display 201 in accordance with a predetermined method to display an image on the display 201.
The BMU 203 is configured to control data transfer between memories, and data transfer between a memory and each I/O device, for example. Examples of the data transfer between the memories include data transfer between the VRAM 202 and another memory. Examples of the data transfer between the memory and each I/O device include data transfer between the memory and the network I/F 211.
The operation device 204 is used to input various operations, and includes a button and a switch, for example. Examples of the button include a power button, an instruction button, a menu display button, and an enter button. Examples of the switch include a change-over switch. Moreover, all types of operating elements such as a cursor key, a pointing device, a touch panel, and a dial may be used as the operation device 204. Operating members of the operation device 204 may be embodied by various functional icons displayed on the display 201. A user may perform an operation by selecting one of those functional icons as appropriate.
The CPU (controller, storage controller) 206 is configured to control the entire image processing apparatus 101, and control each functional block based on a computer program (control program) for controlling the image processing apparatus 101 according to the first embodiment. The computer program is stored in the ROM 207 or the flash memory 209, for example.
The ROM 207 stores various computer programs and data, for example. The RAM 208 may function as a work area for the CPU 206, a data save area at the time of error processing, and an area for loading the computer programs, for example. The flash memory 209 records the various computer programs executed by the image processing apparatus 101, and the data temporarily stored in the RAM 208, for example. The memory card 205 records the data temporarily stored in the RAM 208 or the flash memory 209, for example.
The network I/F 211 is used for communication to/from another information processing device, a printer, and other such devices via a network. The bus 212 includes an address bus, a data bus, and a control bus. The computer programs may be provided to the CPU 206 from the ROM 207, the flash memory 209, or the memory card 205, or from another information processing apparatus through a network via the network I/F 211. A solid state drive (SSD) or the like may be provided instead of the flash memory 209.
The non-volatile memory 102 described above with reference to FIG. 1 may be embodied by the flash memory 209 illustrated in FIG. 2, for example. The authentication unit 103 and the tracking unit 104 described above with reference to FIG. 1 may be embodied by the CPU 206 illustrated in FIG. 2, for example. The storage device 106 described above with reference to FIG. 1 may be embodied by the memory card 205 illustrated in FIG. 2, for example. The buffer memory 105 described above with reference to FIG. 1 may be embodied by the flash memory 209, for example.
FIG. 3 is a flow chart for illustrating operation of the image processing apparatus 101 according to the first embodiment.
In Step S301, the CPU 206 controls the respective functional blocks so that capture of the moving image with use of the image pickup device 107 is started. The timing to start capturing images may be the time when the image processing apparatus 101 is powered on, for example. The timing to start capturing images is not limited thereto. For example, capture may be started when the display 201 becomes ready to display an image. The image data, that is, each frame forming the moving image, acquired by the image pickup device 107 is sequentially stored in the buffer memory 105. As described above, the buffer memory 105 is used as a ring buffer. Therefore, when no more free memory space is available in the buffer memory 105, old image data is sequentially overwritten with new image data.
In Step S302, the CPU 206 executes face authentication processing using the authentication unit 103 on the image data acquired by the image pickup device 107. When the registered face image is not authenticated by the authentication unit 103, that is, when the authentication unit 103 does not succeed in face authentication (NO in Step S302), Step S302 is continued. When the authentication unit 103 succeeds in face authentication (YES in Step S302), the processing proceeds to Step S303.
In Step S303, the CPU 206 executes tracking using the tracking unit 104. Tracking is performed for the object authenticated by the authentication unit 103. Tracking for the object is executed by the tracking unit 104 on previous images that have been acquired by the image pickup device 107 before the time point at which the object is authenticated by the authentication unit 103 and have already been stored in the buffer memory 105. As described above, tracking for the object is performed by the tracking unit 104 in reverse chronological order starting from the time point at which the object is authenticated by the authentication unit 103 in the image that has most recently been acquired by the image pickup device 107. When the object authenticated by the authentication unit 103 is included in the image, the tracking unit 104 generates metadata containing the information indicating that the object is included, and the information indicating that the image is an image extracted in tracking by the tracking unit 104. The thus-generated metadata is stored in the storage device 106 in association with the image when the image is stored in the storage device 106. The metadata does not have to be generated at this stage. For example, the metadata may be generated in Step S304, which is described below.
In Step S304, the CPU 206 stores the image data, that is, each frame forming the moving image, in the storage device 106. The image data to be stored in the storage device 106 is the following image data. Specifically, as shown in FIG. 8A, of image data 802 that has already been acquired by the image pickup device 107 before a time point 801 at which the object is authenticated and has been stored in the buffer memory 105, image data 803 to a time point 805 beyond which the target object is untrackable in tracking 804 by the tracking unit 104 is stored in the storage device 106. In other words, the image data 802 stored in the buffer memory 105 is sequentially overwritten with new image data and erased, but the image data 803 of the image data 802 remains by being moved or copied to the storage device 106. Meanwhile, image data 809 before the time point 805 is erased from the buffer memory 105 without being stored in the storage device 106. Moreover, of the image data sequentially acquired by the image pickup device 107 after the authentication unit 103 succeeds in authentication of the object, image data 806 acquired by a time point 808 beyond which the object is untrackable in tracking 807 by the tracking unit 104 is to be stored in the storage device 106. The CPU 206 edits those pieces of image data and the above-mentioned metadata so that those pieces of image data and the above-mentioned metadata satisfy a predetermined file format, and performs control so as to store in the storage device 106 a moving image file obtained by the editing.
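The selection of image data 803 and 806 in FIG. 8A can be sketched as a range search around the authentication frame. Here `tracked` is a hypothetical per-frame flag standing in for the tracking unit's estimate, and the frame indices are illustrative stand-ins for the time points in the figure.

```python
def frames_to_store(frames, tracked, auth_index):
    """Sketch of the storage control in Step S304: extend backward from
    the authentication frame (time point 801) to the limit of backward
    tracking (time point 805), and forward to where the object is lost
    (time point 808)."""
    start = auth_index
    while start > 0 and tracked[start - 1]:
        start -= 1  # image data 803: backward-trackable frames
    end = auth_index
    while end + 1 < len(frames) and tracked[end + 1]:
        end += 1  # image data 806: forward-tracked frames
    return frames[start:end + 1]

tracked = [False, True, True, True, True, False]  # frame 0 plays the role of image data 809
print(frames_to_store(list(range(6)), tracked, auth_index=3))  # [1, 2, 3, 4]
```

Frames outside the returned range (image data 809 and later untracked frames) remain only in the ring buffer and are eventually overwritten without reaching the storage device 106.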
As described above, according to the first embodiment, the image data sequentially acquired by the image pickup device 107 is stored in the buffer memory 105. Then, tracking is executed on stored image data that has already been stored in the buffer memory 105 at the time point at which the object is authenticated, to thereby extract image data in which the object is included. Therefore, according to the first embodiment, storage control can be performed such that not only the images acquired after the time point at which the object is authenticated but also the images of the object that have been acquired before the object is authenticated are stored in the storage device 106.
Second Embodiment
An image processing apparatus and an image processing method according to a second embodiment of the present invention are described with reference to the drawings. FIG. 4 is a block diagram for illustrating the image processing apparatus according to the second embodiment. The same components as those of the image processing apparatus according to the first embodiment illustrated in FIG. 1 to FIG. 3 are denoted by the same reference symbols, and a description thereof is omitted or simplified.
An image processing apparatus 401 according to the second embodiment writes the image data including the object authenticated by the authentication unit 103 in the storage device 106 in a case where a record switch 402 is on.
As illustrated in FIG. 4, the image processing apparatus 401 according to the second embodiment includes a non-volatile memory 102, an authentication unit 103, a tracking unit 104, a buffer memory 105, an image pickup device 107, and the record switch 402. The image processing apparatus 401 also includes a controller (not shown), which is configured to control respective units. The image processing apparatus 401 includes an image pickup optical system 108, that is, a lens unit. The image pickup optical system 108 may or may not be removable from the image processing apparatus 401. The image processing apparatus 401 includes a storage device 106. The storage device 106 may or may not be removable from the image processing apparatus 401.
The record switch 402 is used to set whether to store, that is, record image data in the storage device 106. In a case where the record switch 402 is off, even when the object is authenticated by the authentication unit 103, the image data including the object is not written into the storage device 106. It should be noted, however, that even in a case where the record switch 402 is off, storage of the image data acquired by the image pickup device 107 in the buffer memory 105 is not suspended, and authentication processing by the authentication unit 103 and tracking by the tracking unit 104 are also performed as appropriate. In the case where the record switch 402 is on, the image data including the object authenticated by the authentication unit 103 is written into the storage device 106. The record switch 402 may be embodied by the operation device 204 illustrated in FIG. 2, for example.
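The gating behavior described above can be sketched as one iteration of the capture loop. This is an illustrative fragment, not the patent's implementation; `process_frame`, `tracked`, and `capacity` are hypothetical names. The key point it demonstrates is that buffering and tracking always run, while the write to the storage device is conditioned on the record switch:

```python
def process_frame(frame, buffer, storage, tracked, record_switch_on,
                  capacity=120):
    """One iteration of the capture loop for the second embodiment.

    Buffering, authentication, and tracking always run; writing to the
    storage device is gated by the record switch (Step S504).
    """
    buffer.append(frame)          # buffering is never suspended
    if len(buffer) > capacity:
        buffer.pop(0)             # ring-buffer behavior: drop the oldest frame
    if tracked(frame) and record_switch_on:
        storage.append(frame)     # written only while the switch is on
```

With the switch off, a tracked frame still enters the buffer but never reaches storage; turning the switch on changes only the final write.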
FIG. 5 is a flow chart for illustrating operation of the image processing apparatus according to the second embodiment.
First, Step S501 to Step S503 are similar to Step S301 to Step S303 described above with reference to FIG. 3, and hence a description thereof is omitted.
In the case where the record switch 402 is on (YES in Step S504), the processing proceeds to Step S505. Meanwhile, in the case where the record switch 402 is off (NO in Step S504), the processing returns to Step S502.
Step S505 is similar to Step S304 described above with reference to FIG. 3, and hence a description thereof is omitted.
As described above, the image data including the object authenticated by the authentication unit 103 may be written into the storage device 106 in the case where the record switch 402 is on.
Third Embodiment
An image processing apparatus and an image processing method according to a third embodiment of the present invention are described with reference to the drawings. FIG. 6 is a block diagram for illustrating the image processing apparatus according to the third embodiment.
As illustrated in FIG. 6, an image processing apparatus 601 according to the third embodiment includes a non-volatile memory 102, an authentication unit 103, a tracking unit 104, a buffer memory 105, an image pickup device 107, and a storage instruction unit 602. The image processing apparatus 601 also includes a controller (not shown), which is configured to control respective units. The image processing apparatus 601 includes an image pickup optical system 108, that is, a lens unit. The image pickup optical system 108 may or may not be removable from the image processing apparatus 601. The image processing apparatus 601 includes a storage device 106. The storage device 106 may or may not be removable from the image processing apparatus 601.
The storage instruction unit (instruction unit) 602 is used by the user to instruct the image processing apparatus 601 whether to retain the image data stored in the storage device 106. When the user operates the storage instruction unit 602 to retain the image data, the CPU 206 performs control so as to maintain the state in which the image data is stored in the storage device 106, without erasing it. When the user operates the storage instruction unit 602 to erase the image data, the CPU 206 performs control so as to erase the image data from the storage device 106. In this manner, the storage capacity of the storage device 106 is prevented from being consumed unnecessarily, and image data that the user does not need is prevented from accumulating in the storage device 106. The storage instruction unit 602 may be implemented by the operation device 204 illustrated in FIG. 2, for example.
FIG. 7 is a flow chart for illustrating operation of the image processing apparatus according to the third embodiment.
Step S701 to Step S704 are similar to Step S301 to Step S304 described above with reference to FIG. 3, and hence a description thereof is omitted.
When the user operates the storage instruction unit 602 so that the image data stored in the storage device 106 is stored (YES in Step S705), the CPU 206 performs control so as not to erase the image data from the storage device 106. In other words, the CPU 206 performs control so as to maintain the state in which the image data is stored in the storage device 106. Meanwhile, when the user operates the storage instruction unit 602 so that the image data stored in the storage device 106 is erased (NO in Step S705), the CPU 206 performs control so as to erase the image data from the storage device 106 (Step S706).
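The retain-or-erase decision of Step S705/S706 reduces to a small sketch. This fragment is illustrative only; `finalize_clip` and the dict-based storage model are hypothetical, standing in for the CPU 206 acting on the storage device 106:

```python
def finalize_clip(storage_device, clip_id, keep):
    """Apply the user's instruction from the storage instruction unit.

    keep=True  -- maintain the stored state (Step S705, YES)
    keep=False -- erase the clip from the storage device (Step S706)
    """
    if not keep:
        storage_device.pop(clip_id, None)  # erase; no-op if already absent
    return storage_device
```

Erasing unwanted clips immediately after review is what keeps the storage capacity from being consumed unnecessarily.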
As described above, the image data including the object authenticated by the authentication unit 103 may be stored in the storage device 106 based on the instruction to store by the user.
Other Embodiments
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions. Parts of the above-mentioned embodiments may be combined as appropriate.
In the above-mentioned embodiments, there has been described as an example the case in which the tracking 804 is performed on the image data 802 stored in the buffer, going back in time from the time point 801 at which the object is authenticated, and the image data 803 that was acquired after the time point 805 beyond which the object is untrackable and stored in the buffer is stored in the storage device 106. However, the present invention is not limited thereto. For example, as shown in FIG. 8B, image data 813 that was acquired after a time point 815, which is a predetermined time T1 before the time point 805 beyond which the object is untrackable, and stored in the buffer may be stored in the storage device 106. In this manner, images in which the object is photographed but for which tracking failed can also be recorded, which prevents a circumstance in which images are not recorded even though they include the object. Thus, of the images already stored in the buffer memory 105 at the time point at which the object is authenticated by the authentication unit 103, images for the predetermined time before the period in which the target object is tracked may also be stored in the storage device 106.
Moreover, in the above-mentioned embodiments, there has been described as an example the case in which, of the image data acquired after the time point 801 at which the authentication unit 103 succeeded in authentication, the image data 806 acquired by the time point 808 beyond which the object is untrackable in the tracking 807 is stored in the storage device 106. However, the present invention is not limited thereto. For example, as shown in FIG. 8C, of the image data acquired after the time point 801 at which the authentication unit 103 succeeded in authentication of the object, image data up to a time point 818, which is a predetermined time T2 after the time point 808 beyond which the object is untrackable in the tracking 807, may be stored in the storage device 106. In this manner, images in which the object is photographed but for which tracking failed can also be recorded, which prevents a circumstance in which images are not recorded even though they include the object. Thus, of the images acquired after the time point at which the object is authenticated by the authentication unit 103, images for the predetermined time after the target object is no longer tracked may also be stored in the storage device 106.
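The two safety margins T1 and T2 described above amount to widening the tracked interval and clamping it to what is actually buffered. The following sketch is illustrative, not from the patent; `retained_interval` and its frame-index arguments are hypothetical names:

```python
def retained_interval(track_start, track_end, t1, t2, buf_start, buf_end):
    """Widen the interval in which the object was tracked by a margin T1
    before and T2 after (FIG. 8B / FIG. 8C), clamped to the frames that
    actually exist in the buffer.

    All arguments are frame indices (or timestamps in the same units).
    """
    start = max(buf_start, track_start - t1)  # extend backward by T1
    end = min(buf_end, track_end + t2)        # extend forward by T2
    return start, end
```

Frames inside the widened interval are stored even when tracking failed for them, which is exactly the circumstance the margins are meant to cover.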
Moreover, in the above-mentioned embodiments, there has been described as an example the case in which the image data 803 and 806 in the periods in which the target object is tracked are stored in the storage device 106, but the present invention is not limited thereto. For example, as shown in FIG. 8D, image data 826 acquired after the time point 801 at which the target object is authenticated, including a period in which the target object is not tracked, may be stored in the storage device 106, and the image data 826 may be stored in association with metadata 827 indicating, for the period in which the target object is tracked, that the object is included. Moreover, image data 823 that was stored in the buffer memory 105 at the time point 801 at which the target object is authenticated may be stored in the storage device 106 in association with metadata 824 indicating, for the period in which the target object is successfully tracked, that the object is included. Images for the periods in which the target object is tracked may then be extracted from the image data 826 and 823 in accordance with the metadata 827 and 824 and played back, with the result that the user can see the parts of the images that include the target object.
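The metadata-driven playback described above can be sketched as a simple extraction over stored frames. This fragment is illustrative only; `extract_tracked_parts` is a hypothetical name, and the metadata is modeled as a list of half-open `[start, end)` frame-index intervals standing in for the metadata 824 and 827:

```python
def extract_tracked_parts(frames, metadata):
    """Extract, from a stored clip, only the spans whose metadata marks
    the target object as present (FIG. 8D playback).

    frames   -- the full stored clip, frame by frame
    metadata -- list of (start, end) half-open intervals, in frame indices,
                during which the object was tracked
    """
    return [frames[start:end] for start, end in metadata]
```

Playing the returned spans back-to-back shows the user only the parts of the recording that include the target object, without discarding the rest of the clip from storage.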
This application claims the benefit of Japanese Patent Application No. 2017-072560, filed Mar. 31, 2017, which is hereby incorporated by reference herein in its entirety.

Claims (17)

What is claimed is:
1. An image processing apparatus, comprising:
an acquisition unit configured to sequentially acquire images;
an object recognition unit configured to recognize a target object in the images sequentially acquired by the acquisition unit;
a tracking unit configured to perform tracking on the images sequentially acquired by the acquisition unit for the target object recognized by the object recognition unit in response to the object recognition unit recognizing the target object; and
a storage controller configured to control so as to store in a first storage device the images sequentially acquired by the acquisition unit,
wherein the tracking unit further performs tracking for the target object on stored images, that are acquired by the acquisition unit before the target object is recognized by the object recognition unit in the images sequentially acquired by the acquisition unit, and are stored in the first storage device,
wherein the storage controller controls so as to store in a second storage device that is different from the first storage device, images in which the target object is tracked by the tracking unit of the images sequentially acquired by the acquisition unit, and images in which the target object is tracked by the tracking unit of the stored images, and
wherein tracking for the target object is performed in reverse chronological order starting from most recently acquired images before a time point at which the target object is recognized by the object recognition unit.
2. An image processing apparatus according to claim 1, wherein the storage controller controls so as to erase, from the first storage device, images of the stored images in which the target object is not tracked by the tracking unit, without storing the images in the second storage device.
3. An image processing apparatus according to claim 1, wherein the storage controller sequentially overwrites images stored in the first storage device with new images that are sequentially acquired.
4. An image processing apparatus according to claim 1, wherein the storage controller sequentially erases images stored in the first storage device.
5. An image processing apparatus according to claim 1, wherein the storage controller controls so as to store in the second storage device, of images that are acquired by the acquisition unit before the target object is recognized by the object recognition unit in the images sequentially acquired by the acquisition unit and are stored in the first storage device, images for a predetermined time before the time point to which the target object is trackable.
6. An image processing apparatus according to claim 1, wherein the storage controller stores in the second storage device, of the images sequentially acquired by the acquisition unit, images for a predetermined time after the time point to which the target object is trackable by the tracking unit.
7. An image processing apparatus according to claim 1, wherein the tracking unit performs tracking for the target object on the stored images that are stored in the first storage device in order from most recently acquired images.
8. An image processing apparatus according to claim 1, wherein the object recognition unit recognizes the target object based on a face image that is registered in advance.
9. An image processing apparatus according to claim 1, wherein the tracking unit performs tracking of an object that is located on an extended line of a locus of the target object recognized by the object recognition unit, and that satisfies a predetermined condition.
10. An image processing apparatus according to claim 1, wherein the tracking unit performs tracking of an object whose differences in position and shape from the target object recognized by the object recognition unit are within a predetermined level.
11. An image processing apparatus according to claim 1, wherein the storage controller stores the images in the second storage device when a predetermined instruction is issued by a user.
12. An image processing apparatus according to claim 1, further comprising an instruction unit configured to instruct whether to store an image,
wherein, when an instruction not to store an image is issued from the instruction unit, the storage controller controls so as to erase the image from the second storage device.
13. An image processing apparatus according to claim 1, wherein the acquisition unit sequentially acquires the images using an image pickup device.
14. An image processing apparatus according to claim 1, further comprising a generation unit configured to generate an image file based on the images in which the target object is tracked by the tracking unit of the images sequentially acquired by the acquisition unit, and the images in which the target object is tracked by the tracking unit of the stored images,
wherein the storage controller controls so as to store in the second storage device the image file generated by the generation unit.
15. An image processing method, comprising:
sequentially acquiring images;
recognizing a target object in the images sequentially acquired in the sequentially acquiring;
performing tracking on the images sequentially acquired in the sequentially acquiring for the target object recognized in the recognizing, in response to the target object being recognized in the recognizing;
controlling so as to store in a first storage device the images sequentially acquired in the sequentially acquiring;
performing tracking for the target object on stored images, that are acquired before the target object is recognized in the sequentially acquired images, and have been stored in the first storage device; and
storing in a second storage device that is different from the first storage device, images in which the target object is tracked of the sequentially acquired images, and images in which the target object is tracked in the stored images;
wherein tracking for the target object is performed in reverse chronological order starting from most recently acquired images before a time point at which the target object is recognized in the recognizing step.
16. A non-transitory computer-readable storage medium storing a computer program which causes a computer to execute an image processing method, the method comprising:
sequentially acquiring images;
recognizing a target object in the images sequentially acquired in the sequentially acquiring;
performing tracking on the images sequentially acquired in the sequentially acquiring for the target object recognized in the recognizing, in response to the target object being recognized in the recognizing;
controlling so as to store in a first storage device the images sequentially acquired in the sequentially acquiring;
performing tracking for the target object on stored images, that are acquired before the target object is recognized in the sequentially acquired images, and have been stored in the first storage device; and
storing in a second storage device that is different from the first storage device, images in which the target object is tracked of the sequentially acquired images, and images in which the target object is tracked in the stored images;
wherein tracking for the target object is performed in reverse chronological order starting from most recently acquired images before a time point at which the target object is recognized in the recognizing step.
17. An image processing apparatus according to claim 1, wherein
tracking for the target object is continued for each of the images sequentially acquired by the acquisition unit after a time point at which the target object is recognized by the object recognition unit.
US15/935,282 2017-03-31 2018-03-26 Image processing apparatus, image processing method, and program Active 2038-07-10 US10839198B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017072560A JP6866210B2 (en) 2017-03-31 2017-03-31 Image processing equipment, image processing methods and programs
JP2017-072560 2017-03-31

Publications (2)

Publication Number Publication Date
US20180285627A1 US20180285627A1 (en) 2018-10-04
US10839198B2 true US10839198B2 (en) 2020-11-17

Family

ID=63669480

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/935,282 Active 2038-07-10 US10839198B2 (en) 2017-03-31 2018-03-26 Image processing apparatus, image processing method, and program

Country Status (2)

Country Link
US (1) US10839198B2 (en)
JP (1) JP6866210B2 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111524159B (en) * 2019-02-01 2024-07-19 北京京东乾石科技有限公司 Image processing method and apparatus, storage medium, and processor
KR20210072504A (en) * 2019-12-09 2021-06-17 삼성전자주식회사 Neural network system and operating method of the same

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5706362A (en) * 1993-03-31 1998-01-06 Mitsubishi Denki Kabushiki Kaisha Image tracking apparatus
US20030179193A1 (en) * 2002-02-19 2003-09-25 Adams William O. Three-dimensional imaging system and methods
JP2007020105A (en) 2005-07-11 2007-01-25 Fujifilm Holdings Corp Imaging apparatus, imaging method, and imaging program
US20080130950A1 (en) * 2006-12-01 2008-06-05 The Boeing Company Eye gaze tracker system and method
US7756296B2 (en) * 2007-03-27 2010-07-13 Mitsubishi Electric Research Laboratories, Inc. Method for tracking objects in videos using forward and backward tracking
US20130057728A1 (en) * 2010-05-10 2013-03-07 Fujitsu Limited Device and method for image processing
US20140320664A1 (en) * 2011-11-29 2014-10-30 Korbi Co., Ltd. Security system for tracking and surveilling an object determined as unrecognizable using a surveillance camera and method for providing security service using the system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4765732B2 (en) * 2006-04-06 2011-09-07 オムロン株式会社 Movie editing device
JP4983558B2 (en) * 2007-11-13 2012-07-25 マツダ株式会社 Vehicle driving support device
JP2017097510A (en) * 2015-11-20 2017-06-01 ソニー株式会社 Image processing apparatus, image processing method, and program


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220156520A1 (en) * 2019-07-24 2022-05-19 Nvidia Corporation Automatic generation of ground truth data for training or retraining machine learning models
US11783230B2 (en) * 2019-07-24 2023-10-10 Nvidia Corporation Automatic generation of ground truth data for training or retraining machine learning models
US12112247B2 (en) 2019-07-24 2024-10-08 Nvidia Corporation Automatic generation of ground truth data for training or retraining machine learning models

Also Published As

Publication number Publication date
US20180285627A1 (en) 2018-10-04
JP6866210B2 (en) 2021-04-28
JP2018174494A (en) 2018-11-08

Similar Documents

Publication Publication Date Title
TWI475429B (en) Image display control apparatus and image display control method
RU2628485C2 (en) Prompting method and device to remove the application
JP5054063B2 (en) Electronic camera, image processing apparatus, and image processing method
US20170032172A1 (en) Electronic device and method for splicing images of electronic device
US10839198B2 (en) Image processing apparatus, image processing method, and program
US8244005B2 (en) Electronic apparatus and image display method
US9225906B2 (en) Electronic device having efficient mechanisms for self-portrait image capturing and method for controlling the same
US10212382B2 (en) Image processing device, method for controlling image processing device, and computer-readable storage medium storing program
US8077221B2 (en) Image capturing apparatus and control method therefor with determination whether storage medium is limited-rewriteable storage medium
CN104639813A (en) Image capturing apparatus and image capturing method
JP6210634B2 (en) Image search system
CN113095163A (en) Video processing method and device, electronic equipment and storage medium
US12174794B2 (en) Processing apparatus and control method thereof
US9756238B2 (en) Image capturing apparatus for performing authentication of a photographer and organizing image data for each photographer and control method thereof
JP5936376B2 (en) Image management device
US20120170868A1 (en) Image management method of digital photography device
US8345124B2 (en) Digital camera controlled by a control circuit
JP5762014B2 (en) REPRODUCTION DEVICE AND REPRODUCTION DEVICE CONTROL METHOD
US10321089B2 (en) Image preproduction apparatus, method for controlling the same, and recording medium
US20200105302A1 (en) Editing apparatus for controlling representative image to appropriate image, method of controlling the same, and storage medium therefor
JP2015231169A (en) Imaging device
JP2014203119A (en) Input device, input processing method, and program
US11937011B2 (en) Recording device, imaging device, recording method, and non-transitory computer readable medium
US11689800B2 (en) Image capturing apparatus and method of controlling the same, and storage medium
US20070101270A1 (en) Method and system for generating a presentation file for an embedded system

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: CANON KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GUNJI, DAISUKE;REEL/FRAME:046288/0640

Effective date: 20180315

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4