WO2020213284A1 - Image processing device, image processing method, and program - Google Patents


Info

Publication number
WO2020213284A1
Authority
WO
WIPO (PCT)
Application number
PCT/JP2020/009561
Other languages
English (en)
Japanese (ja)
Inventor
悠介 篠原
Original Assignee
日本電気株式会社
Application filed by 日本電気株式会社
Publication of WO2020213284A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis

Definitions

  • the present invention relates to an image processing apparatus, an image processing method and a program.
  • Image data (video data) captured by a camera is collected, and image processing (information processing) is performed on the collected image data.
  • a computer performs image processing on a person or an object included in the image data.
  • a computer extracts a photographed face image of a person and uses the face image for a person authentication process.
  • a computer may use a facial image to determine the age and gender of the person.
  • a step of specifying an area in which a face image appears in the image data, a step of cutting out the specified face image, and a step of sequentially applying predetermined processing to the cut-out face image are performed.
  • processing such as face recognition and age/gender determination is then applied.
  • Patent Document 1 discloses a face recognition process.
  • feature amounts that characterize the face image, such as the shape and size of the parts cut out from the face image (extracted parts; for example, the eyes, nose, mouth, and the entire face), are calculated.
  • a face feature vector composed of the plurality of calculated feature amounts is collated with the face feature vector registered in the database.
  • image processing typified by face recognition and age/gender determination
  • the face recognition process or the like may fail.
  • the feature amount is extracted from the face image.
  • in such a case, the feature amounts to be calculated, such as the shape and size of the parts (eyes, nose, mouth, entire face), cannot be extracted.
  • image processing such as face recognition is performed by a server on the network (server in the cloud environment) or a server near the sensor (server in the edge environment). Whether the face recognition process is performed in the cloud environment or the edge environment is determined in consideration of cost and communication delay.
  • When face recognition processing is performed in an edge environment, a computer is placed near the sensor (for example, a camera). The sensor sends data to an application on the nearby computer. Since, in an edge environment, computational resources are placed near each sensor, those resources are costly: if the system includes a plurality of sensors, each sensor requires its own computational resource (computer), and the total cost is high.
  • Whether an application (for example, face recognition processing) is executed in the cloud environment or the edge environment is determined by the requirements of the system. For example, when performing camera image analysis in a small store, it is difficult to transmit data to the cloud environment from the viewpoint of privacy and communication cost. Therefore, in such a case it is suitable to process the face image in the edge environment.
  • Patent Document 2 discloses that the image quality is adjusted by a camera in order to reduce the network load.
  • the image is detected as the best shot image.
  • in Patent Document 2, face image analysis is performed using only the best shot image so that image processing related to a person's face image can be performed accurately with a single face recognition pass. As indices for judging whether a face image is suitable for face recognition, Patent Document 2 uses whether the person is facing front, whether the image is in focus, and whether the person's eyes are open.
  • in Patent Document 2, a high-load process utilizing machine learning, such as face orientation determination, is executed in order to determine whether an image is suitable for authentication processing (the process of determining the best shot image). Therefore, even with the technique of Patent Document 2, it is still difficult to process all the face images in a video in real time.
  • a main object of the present invention is to provide an image processing apparatus, an image processing method, and a program that contribute to executing image processing with a low load.
  • according to a first aspect, an image processing apparatus is provided comprising: a determination unit that determines whether or not to execute image processing on an image processing target, based on a success probability of the image processing on the image processing target for each imaging situation, the imaging situation being the situation when the image processing target is imaged; and an image processing unit that executes the image processing on the image processing target when it is determined that the image processing is to be executed.
  • according to a second aspect, an image processing method is provided that includes: determining whether or not to execute image processing on an image processing target, based on a success probability of the image processing on the image processing target for each imaging situation, the imaging situation being the situation when the image processing target is imaged; and executing the image processing on the image processing target when it is determined that the image processing is to be executed.
  • according to a third aspect, a program is provided that causes a computer mounted on an image processing device to execute: a process of determining whether or not to execute image processing on an image processing target, based on a success probability of the image processing on the image processing target for each imaging situation, the imaging situation being the situation when the image processing target is imaged; and a process of executing the image processing on the image processing target when it is determined that the image processing is to be executed.
  • according to each aspect, an image processing apparatus, an image processing method, and a program that contribute to executing image processing with a low load are provided.
  • other effects may be produced in place of, or in combination with, this effect.
  • FIG. 1 is a diagram for explaining an outline of one embodiment.
  • FIG. 2 is a diagram showing an example of a schematic configuration of an image processing system according to the first embodiment.
  • FIG. 3 is a diagram showing an example of the internal configuration of the image processing apparatus according to the first embodiment.
  • FIG. 4 is a diagram for explaining the operation of the acquisition unit according to the first embodiment.
  • FIG. 5 is a diagram for explaining division of still image data.
  • FIG. 6 is a diagram showing an example of information held by the storage unit according to the first embodiment.
  • FIG. 7 is a flowchart showing an example of the operation of the image processing apparatus according to the first embodiment.
  • FIG. 8 is a diagram showing an example of information held by the storage unit according to the second embodiment.
  • FIG. 9 is a diagram showing an example of information held by the storage unit according to the third embodiment.
  • FIG. 10 is a diagram showing an example of the hardware configuration of the image processing device.
  • FIG. 11 is a diagram showing an example of the internal configuration of the image processing apparatus according to the modified example.
  • the image processing device 100 includes a determination unit 101 and an image processing unit 102 (see FIG. 1).
  • the determination unit 101 determines whether or not to execute image processing on the image processing target based on the success probability of the image processing on the image processing target for each imaging situation, which is the situation when the image processing target is imaged.
  • the image processing unit 102 executes image processing on the image processing target when it is determined that the image processing is executed on the image processing target.
  • the image processing device 100 judges whether or not to execute processing related to the image processing target (for example, a person) according to the success probability of the image processing in the situation (for example, the position of the person) when the target was imaged. For example, for an image of a person far away from the camera device 10, face recognition processing on that image is unlikely to succeed. Therefore, the image processing device 100 calculates the success probability of the image processing in advance for each situation in which an image is acquired, and performs the image processing only when the processing is expected to complete normally. As a result, attempts at image processing that would fail are avoided, and waste of the limited computational resources (particularly those arranged in the edge environment) is prevented. That is, image processing can be realized with a low load, and a large number of targets can be processed with limited computational resources.
  • FIG. 2 is a diagram showing an example of a schematic configuration of an image processing system according to the first embodiment.
  • the image processing system includes a plurality of camera devices 10-1 to 10-n (n is a positive integer; the same applies hereinafter), an image processing device 20, and a result storage device 30.
  • when there is no need to distinguish the camera devices 10-1 to 10-n from each other, the term "camera device 10" is simply used.
  • each of the plurality of camera devices 10 is connected to the image processing device 20. Further, the image processing device 20 and the result storage device 30 are connected.
  • the system configuration shown in FIG. 2 is an example, and is not intended to limit the number of camera devices 10 and the like.
  • the image processing system may include at least one camera device 10.
  • the image processing device 20 acquires video data from each camera device 10.
  • the image processing device 20 performs image processing (data analysis) on the acquired video data, and stores the result in the result storage device 30.
  • the result storage device 30 stores the processing result of the image processing device 20.
  • the image processing device 20 decides whether or not to actually execute image processing on a person included in the still image data SDj based on the probability that the image processing will complete normally (hereinafter, the success probability).
  • the success probability is calculated for each situation (hereinafter, imaging situation) in which an image processing target (in the above example, a person appearing in an image) is imaged, and the calculation results are accumulated inside the image processing device 20.
  • the imaging situation is the position of a person imaged by the camera device 10 (coordinate position of the person in the image).
  • the image processing device 20 counts the results (normal end, abnormal end of processing) when image processing is performed on the face image FPk for each imaging status.
  • the image processing device 20 calculates the success probability in each imaging situation based on the number of trials of the image processing (hereinafter, the number of trials).
  • the image processing device 20 utilizes the accumulated success probability to determine whether or not to perform image processing. Specifically, the image processing device 20 performs threshold processing on the acquired success probability, and does not execute image processing (for example, face recognition processing) if the success probability is low.
  • processing is performed only in situations where executing the image processing is meaningful, so the load on the computer that performs face recognition processing and the like can be reduced.
  • FIG. 3 is a diagram showing an example of the internal configuration of the image processing device 20 according to the first embodiment.
  • the image processing device 20 includes an acquisition unit 201, a determination unit 202, a storage unit 203, an image cutting unit 204, and an image processing unit 205.
  • the acquisition unit 201 extracts the still image data SDj from the video data DM acquired from the camera device 10 at a predetermined timing (predetermined sampling). For example, the acquisition unit 201 extracts (captures) the still image data SDj shown in FIG.
  • the acquisition unit 201 attempts to extract a person from the extracted still image data SDj. For example, in the example of FIG. 4, the acquisition unit 201 extracts the person 301. If the acquisition unit 201 cannot extract a person from the still image data SDj, it targets the next still image data SDj+1 of the video data DM for processing.
  • the acquisition unit 201 calculates the position of the extracted person in the still image data SDj. For example, the acquisition unit 201 sets the lower left of the still image data SDj as the origin, and calculates the center of gravity of the extracted person and the center of the face as the position of the person. More specifically, the acquisition unit 201 converts the number of pixels from the origin to the position of the center of gravity into XY coordinates and calculates it as the position of a person.
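The position calculation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the helper name and the (left, top, right, bottom) bounding-box convention are assumptions; only the lower-left origin and the pixel-to-coordinate conversion come from the text.

```python
def person_position(bbox, image_height):
    """Return the (x, y) center of a detected person's bounding box.

    bbox is (left, top, right, bottom) in pixel coordinates with the usual
    image convention (origin at top-left, y growing downward). The result
    uses the lower-left origin described in the text, so y is flipped.
    """
    left, top, right, bottom = bbox
    cx = (left + right) / 2.0
    cy = image_height - (top + bottom) / 2.0  # flip to lower-left origin
    return (cx, cy)
```

For example, a box spanning rows 50 to 350 in a 480-pixel-high frame has its center 200 pixels from the top, i.e. 280 pixels above the lower-left origin.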
  • the acquisition unit 201 delivers the still image data SDj from which the person has been extracted and the calculated person position PPk to the determination unit 202.
  • the acquisition unit 201 provides the still image data SDj and the person position PPk of the person 301 included in the still image data SDj to the determination unit 202.
  • various methods can be used as a method for extracting a person included in the still image data SDj and a method for calculating the person position PPk.
  • the acquisition unit 201 uses a learning model learned by a CNN (Convolutional Neural Network) to detect a target object (in this case, a person is detected) from the still image data SDj.
  • the acquisition unit 201 may extract a person by using a method such as template matching.
  • the determination unit 202 determines whether or not to execute the image processing on the image processing target based on the success probability when the image processing is executed on the image processing target for each imaging situation. More specifically, when the determination unit 202 acquires the still image data SDj and the person position PPk, the determination unit 202 acquires the success probability in the person position PPk from the storage unit 203.
  • the storage unit 203 stores at least the success probability for each imaging status. More specifically, the storage unit 203 stores information about the imaging status, the success probability, and the number of trials, which is the number of attempts to execute the image processing in the imaging status, in association with each other.
  • the information regarding the imaging status is information regarding the position of a person in an image in which a person to be image-processed is captured.
  • the information regarding the imaging status is information (for example, a coordinate range) that identifies each small area in which the still image data SDj is divided into predetermined areas.
  • the still image data SDj is divided into a plurality of small areas as shown in FIG. Note that FIG. 5 is an example, and does not mean that the division of the still image data SDj is limited to “9”. Further, the still image data SDj may be divided only in the row direction (horizontal direction) or may be divided only in the column direction (vertical direction). Further, the area of each subregion may be equal or different.
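The division into small areas can be sketched as a simple grid lookup. The 3x3 grid below mirrors the nine-area example of FIG. 5, but as noted above the division is not limited to nine equal areas; function and parameter names are illustrative assumptions.

```python
def small_area_index(pos, image_size, grid=(3, 3)):
    """Map a person position (x, y), origin at lower-left, to a small-area index.

    The still image is divided into grid[0] columns by grid[1] rows of
    equal-sized small areas; areas are numbered row by row from 0.
    """
    x, y = pos
    width, height = image_size
    cols, rows = grid
    col = min(int(x * cols / width), cols - 1)   # clamp edge pixels into the grid
    row = min(int(y * rows / height), rows - 1)
    return row * cols + col
```

The returned index can then serve as the key for the per-area entries of the storage unit 203.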
  • FIG. 6 is a diagram showing an example of information held by the storage unit 203 according to the first embodiment.
  • the success probability Pk and the number of trials Tk are stored for each small area (divided area) of the still image data SDj.
  • the storage unit 203 stores, in association with each other, the small area of the still image data SDj, the success probability of the image processing in that small area, and the number of attempts (number of trials) of the image processing in that small area.
  • the number of trials Tk is the number of times that the image processing unit 205 tried image processing (for example, face recognition processing) in each small area of the still image data SDj.
  • the information (database) stored in the storage unit 203 is updated (added) by the image processing unit 205.
  • the determination unit 202 accesses the storage unit 203 to acquire the success probability Pk and the number of trials Tk of the small area corresponding to the person position PPk. Specifically, the determination unit 202 identifies the small area containing the coordinates of the person position PPk, and acquires the success probability Pk and the number of trials Tk from the entry of the identified small area. For example, when the person position PPk of the person 301 in FIG. 4 is included in the area A shown in FIG. 5, the determination unit 202 acquires the success probability P01 and the number of trials T01 (see the first row of FIG. 6).
  • the determination unit 202 executes threshold processing on the acquired number of trials Tk, and judges according to the result whether or not to execute image processing (for example, face recognition processing) on the image processing target (for example, the person appearing in the image). Specifically, when the number of trials Tk is smaller than the trial threshold value, the determination unit 202 determines that the process is to be executed.
  • the information stored in the storage unit 203 is updated with the operation of the system. Therefore, at the start of system operation or the like, a sufficient number of image processes may not be executed for a small area corresponding to the desired person position PPk.
  • the index indicating whether or not a sufficient number of processes have been executed is the "number of trials". If the number of trials is small, the corresponding success probability is judged to be unreliable, and it is judged that the reliability of the success probability needs to be increased.
  • when the number of trials is smaller than the trial threshold value, the determination unit 202 determines that the subsequent processing (image cropping processing, image processing) is to be executed. That is, the image processing unit 205 needs to accumulate results of image processing related to the person position PPk so that the success probability Pk in the corresponding small area of the still image data SDj becomes highly reliable information.
  • when the number of trials Tk is equal to or greater than the trial threshold value, the determination unit 202 regards the success probability Pk stored in the storage unit 203 as highly reliable data. In this case, the determination unit 202 executes threshold processing on the acquired success probability Pk, and determines according to the result whether or not to execute the subsequent processing (image cropping processing, image processing).
  • if the success probability Pk is equal to or greater than the execution threshold value, the determination unit 202 determines that the process is to be executed, because in that case there is a high probability that the processing will complete normally if the image processing is executed at the person position PPk.
  • if the success probability Pk is smaller than the execution threshold value, the determination unit 202 determines that processing is not to be executed: a success probability Pk below the threshold value indicates that even if the image processing were executed at the person position PPk, it would be unlikely to complete normally.
  • when the determination unit 202 determines that processing is not to be executed, it does not execute any special processing; it notifies the acquisition unit 201 to that effect, and the acquisition unit 201 shifts the processing target to the next data (next still image data, next person).
  • when the determination unit 202 determines that processing is to be executed, it notifies the image cutout unit 204 of an image cutout request together with the still image data SDj and the person position PPk.
  • when the image cutout unit 204 receives the image cutout request and accompanying data, it cuts out the face image (face image area) of the person existing at the person position PPk in the still image data SDj (extracts the face image). The image cutout unit 204 delivers the cut-out face image FPk and the corresponding person position PPk to the image processing unit 205.
  • as a method for the image cutout unit 204 to specify the position of the face at the person position PPk, a method of extracting a face image using a CNN can be used, as in the acquisition unit 201.
  • the image processing unit 205 executes image processing on the image processing target when it is determined that the image processing is executed on the image processing target. Specifically, the image processing unit 205 performs a predetermined process (for example, face recognition process) using the cut-out face image FPk and the person position PPk. Since existing techniques can be applied to the calculation of the feature amount (feature vector) required for the face recognition process and the similarity (distance between the feature vectors) required for the collation process, detailed description thereof will be omitted.
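Since the patent defers the feature-vector and similarity details to existing techniques, the collation step can only be sketched generically. The following illustration compares the Euclidean distance between two feature vectors against a match threshold; the function name and the threshold value are assumptions, not part of the patent.

```python
import math

def matches(query_vec, registered_vec, threshold=0.6):
    """Return True if two face feature vectors are close enough to be a match.

    Uses Euclidean distance between vectors of equal length; smaller
    distance means more similar faces (an assumed, generic scheme).
    """
    dist = math.sqrt(sum((q - r) ** 2 for q, r in zip(query_vec, registered_vec)))
    return dist <= threshold
```

Any existing metric (cosine similarity, learned embeddings, etc.) could stand in for the distance here without changing the surrounding flow.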
  • the image processing unit 205 updates and adds the information stored in the storage unit 203 according to the processing result (normal end, abnormal end). Specifically, the image processing unit 205 updates the fields of the success probability Pk and the number of trials Tk of the entry corresponding to the image-processed person position PPk (small area of the still image data SDj).
  • the image processing unit 205 determines the authentication process to be an "abnormal end" when, for example, a predetermined number of feature points (for example, feature points such as the eyes and nose) cannot be extracted during calculation of the feature vector, or when many unreliable feature amounts are calculated.
  • the image processing unit 205 calculates the success probability Pk and the number of trials Tk of the corresponding small area according to the following equations (1) and (2), and updates the information held by the storage unit 203.
  • Res in the above equation (1) indicates the processing result, and "1" is assigned if the processing ends normally, and "0" is assigned if the processing ends abnormally (error end).
  • the success probability regarding the image processing for each small area of the still image data SDj is accumulated in the storage unit 203.
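Equations (1) and (2) are not reproduced in this excerpt. A running-average update consistent with the surrounding description (Res is 1 on normal end, 0 on abnormal end, and Tk counts attempts) would look like the following sketch; the exact form of the patent's equations is an assumption here.

```python
def update_statistics(p_k, t_k, res):
    """Update the success probability Pk and trial count Tk after one attempt.

    p_k: current success probability for the small area
    t_k: current number of trials for the small area
    res: 1 if the image processing ended normally, 0 if it ended abnormally
    """
    t_new = t_k + 1
    p_new = (p_k * t_k + res) / t_new  # running average of successes
    return p_new, t_new
```

For example, an area with Pk = 0.5 over 4 trials that records one more success moves to Pk = 0.6 over 5 trials.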
  • the image processing device 20 extracts the person position PPk from the still image data SDj (step S101).
  • the image processing device 20 reads out the success probability Pk and the number of trials Tk corresponding to the person position PPk from the storage unit 203 (step S102).
  • the image processing device 20 determines whether or not the acquired number of trials Tk is equal to or greater than the trial threshold value (step S103).
  • if the number of trials Tk is smaller than the trial threshold value (step S103, No branch), the image processing apparatus 20 executes the processes from step S106 onward.
  • if the number of trials Tk is equal to or greater than the trial threshold value (step S103, Yes branch), the image processing apparatus 20 determines whether or not the acquired success probability Pk is equal to or greater than the execution threshold value (step S104).
  • if the success probability Pk is equal to or greater than the execution threshold value (step S104, Yes branch), the image processing device 20 executes the processes from step S106 onward.
  • if the success probability Pk is smaller than the execution threshold value (step S104, No branch), the image processing device 20 sets image processing such as face recognition processing to "not executed" (step S105).
  • in step S106, the image processing device 20 sets image processing such as face recognition processing to "executed".
  • the image processing device 20 executes predetermined image processing (for example, face recognition processing and age / gender determination processing) (step S107).
  • the image processing device 20 reflects the processing result in step S107 in the database constructed in the storage unit 203 (step S108).
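The branch structure of steps S103 through S106 condenses into a single decision function. This is a sketch only: the threshold values and names are illustrative assumptions, since the patent does not fix concrete numbers.

```python
TRIAL_THRESHOLD = 30   # assumed trial threshold value
EXEC_THRESHOLD = 0.5   # assumed execution threshold value

def should_execute(p_k, t_k):
    """Steps S103-S106: decide whether to run image processing.

    Run when too few trials have accumulated (to gather data), or when
    the accumulated success probability is high enough to trust.
    """
    if t_k < TRIAL_THRESHOLD:        # S103 No branch: gather more data
        return True
    return p_k >= EXEC_THRESHOLD     # S104: trust the accumulated probability
```

When this returns False, the device skips steps S106 and S107 entirely, which is where the load reduction comes from.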
  • it is known that the success probability (accuracy) of image processing depends on the size of the face, the orientation of the face, and the amount of light falling on the face in the still image data SDj. The factors (parameters) that affect the success probability often depend on the position in the image. For example, in a store, since the display layout is fixed, many people look in a specific direction (for example, toward the products) when passing a specific place, so the face orientation has a certain tendency.
  • the success or failure of the image processing is biased according to the position of the person.
  • by not performing processing related to a person at a position where the image processing tends to fail, waste of resources is prevented.
  • the image processing device 20 determines, by means of the determination unit 202, whether to execute image processing according to the success probability Pk for each imaging situation (person position PPk in the still image data SDj) stored in the storage unit 203. Specifically, the success probability of image processing for the processing target is estimated by light-load threshold processing, and it is then determined whether or not to actually execute the image processing. Further, when sufficient data on the success or failure of the image processing has not yet been accumulated, the image processing is actually executed and its success or failure is reflected in the success probability Pk. As a result, the number of image processing trials per person can be reduced while maintaining the success probability of image processing, so the number of people the entire system can process increases.
  • the person position PPk in the still image data SDj was used as the "imaging situation".
  • in the second embodiment, a case will be described in which the time when the image processing target is photographed, that is, the time when the still image data SDj is acquired (the current time), is used as the imaging situation.
  • FIG. 8 is a diagram showing an example of information held by the storage unit 203 according to the second embodiment. As shown in FIG. 8, the storage unit 203 stores the success probability Pk and the number of trials Tk for each time zone.
  • the acquisition unit 201 delivers the current time CT at which the still image data SDj was acquired to the determination unit 202, together with the still image data SDj and the person position PPk.
  • the determination unit 202 acquires the success probability Pk and the number of trials Tk according to the current time CT from the storage unit 203.
  • the determination unit 202 performs processing related to the success probability Pk and the number of trials Tk in the same manner as the contents described in the first embodiment.
  • if the number of trials Tk is smaller than the trial threshold value, the determination unit 202 determines that processing is to be executed. Likewise, if the number of trials Tk is equal to or greater than the trial threshold value and the acquired success probability Pk is equal to or greater than the execution threshold value, the determination unit 202 determines that the process is to be executed.
  • otherwise (the number of trials Tk is equal to or greater than the trial threshold value and the success probability Pk is smaller than the execution threshold value), the determination unit 202 determines that the process is not to be executed.
  • since the operation of the image processing device 20 after execution or non-execution of the processing is determined can be the same as in the first embodiment, detailed description is omitted.
  • the image processing device 20 updates the corresponding entry (success probability Pk for each time zone, number of trials Tk) of the storage unit 203.
  • the success rate of image processing may depend on the amount of light hitting the face. Since the amount of light changes according to the time zone, there may be a phenomenon that the face is easily exposed to light in a specific time zone.
  • the success probability of the image processing that changes according to the time zone is taken into consideration, and the image processing is not performed in a situation where the image processing is likely to fail, such as at night. This prevents wasting resources.
  • the information regarding the imaging status according to the third embodiment includes the position of the person in the image in which the person to be image-processed is captured, and the time when the person was photographed.
  • FIG. 9 is a diagram showing an example of information held by the storage unit 203 according to the third embodiment. As shown in FIG. 9, the storage unit 203 stores the success probability Pk and the number of trials Tk in association with each other for each time zone of the small area of the still image data SDj.
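The third-embodiment table of FIG. 9 can be sketched as a dictionary keyed by the (small area, time zone) pair rather than by the small area alone. The dictionary layout and the one-hour time zones below are illustrative assumptions.

```python
from collections import defaultdict

# Each entry holds the success probability "p" and trial count "t" for one
# (small area, time zone) pair; unseen pairs start at p = 0.0, t = 0.
stats = defaultdict(lambda: {"p": 0.0, "t": 0})

def lookup(area_index, hour):
    """Return the statistics entry for a small area in a given time zone.

    hour is the hour of the current time CT (0-23), used here as an
    assumed one-hour time zone.
    """
    return stats[(area_index, hour)]

stats[(0, 10)] = {"p": 0.8, "t": 42}  # e.g. area A during the 10:00 hour
```

The determination unit 202 would then apply the same trial-count and success-probability thresholds to the looked-up entry as in the first embodiment.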
  • the acquisition unit 201 delivers the still image data SDj, the current time CT for acquiring the still image data SDj, and the person position PPk to the determination unit 202.
  • the determination unit 202 acquires the success probability Pk and the number of trials Tk according to the person position PPk and the current time CT from the storage unit 203.
  • the determination unit 202 performs processing related to the success probability Pk and the number of trials Tk in the same manner as the contents described in the first embodiment.
  • if the number of trials Tk is smaller than the trial threshold value, the determination unit 202 determines that processing is to be executed. Likewise, if the number of trials Tk is equal to or greater than the trial threshold value and the acquired success probability Pk is equal to or greater than the execution threshold value, the determination unit 202 determines that the process is to be executed.
  • the determination unit 202 determines that the process is not executed.
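The determination rule above can be sketched as follows. This is a minimal illustration, not the claimed implementation: the table layout, the threshold values, and the handling of entries with too few trials (executing so that data accumulates) are assumptions for the sketch.

```python
# Hypothetical sketch of the determination unit (202) decision rule.
# Entries are keyed by (small area, time zone); thresholds are illustrative.
TRIAL_THRESHOLD = 30   # minimum trials before Pk is considered reliable
EXEC_THRESHOLD = 0.5   # minimum success probability to run image processing

# storage unit 203: (small_area, time_zone) -> success probability Pk, trials Tk
storage = {
    ("A", "09-12"): {"P": 0.8, "T": 120},
    ("B", "09-12"): {"P": 0.2, "T": 95},
}

def should_execute(small_area: str, time_zone: str) -> bool:
    # Unknown situations default to zero trials (an assumption).
    entry = storage.get((small_area, time_zone), {"P": 0.0, "T": 0})
    if entry["T"] < TRIAL_THRESHOLD:
        # Too few trials to trust Pk: execute so that trial data accumulates.
        return True
    return entry["P"] >= EXEC_THRESHOLD

print(should_execute("A", "09-12"))  # True: reliable entry, Pk above threshold
print(should_execute("B", "09-12"))  # False: reliable entry, Pk below threshold
```

An unknown key such as `("C", "09-12")` falls into the "too few trials" branch and is executed, which is how the table can be populated at run time under these assumptions.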
  • the image processing device 20 updates the corresponding entry (small area, success probability Pk for each time zone, number of trials Tk) in the storage unit 203 when face recognition processing or the like is performed.
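The entry update performed after each attempt can be sketched as a running average. The incremental formula below is an assumption about how Pk and Tk might be maintained; the embodiment only states that the corresponding entry is updated.

```python
# Hypothetical running-average update of an entry (success probability Pk,
# number of trials Tk) after one image-processing attempt.
def update_entry(entry: dict, succeeded: bool) -> dict:
    p, t = entry["P"], entry["T"]
    # New probability = (past successes + this outcome) / (t + 1).
    entry["P"] = (p * t + (1.0 if succeeded else 0.0)) / (t + 1)
    entry["T"] = t + 1
    return entry

entry = {"P": 0.5, "T": 4}          # e.g. 2 successes out of 4 trials so far
update_entry(entry, succeeded=True)
print(entry)                        # {'P': 0.6, 'T': 5}
```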
  • the image processing device 20 determines whether or not to perform image processing according to the position of the person and the success probability for each time zone. Compared with the first and second embodiments, the third embodiment can therefore characterize the situation in which the person is photographed in finer detail when setting the success probability, making it possible to decide more accurately whether or not to perform image processing.
  • FIG. 10 is a diagram showing an example of the hardware configuration of the image processing device 20.
  • the image processing device 20 can be configured by an information processing device (a so-called computer), and includes the configuration illustrated in FIG. 10.
  • the image processing device 20 includes a processor 311, a memory 312, an input / output interface 313, a communication interface 314, and the like.
  • the components such as the processor 311 are connected by an internal bus or the like so that they can communicate with each other.
  • the configuration shown in FIG. 10 does not mean to limit the hardware configuration of the image processing device 20.
  • the image processing device 20 may include hardware not shown in the figure, and may omit the input / output interface 313 when it is not needed.
  • the number of processors 311 and the like included in the image processing device 20 is not limited to the example of FIG. 10, and for example, a plurality of processors 311 may be included in the image processing device 20.
  • the processor 311 is a programmable device such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor). Alternatively, the processor 311 may be a device such as an FPGA (Field Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit). The processor 311 executes various programs including an operating system (OS; Operating System).
  • the memory 312 is a RAM (Random Access Memory), a ROM (Read Only Memory), an HDD (Hard Disk Drive), an SSD (Solid State Drive), or the like.
  • the memory 312 stores an OS program, an application program, and various data.
  • the input / output interface 313 is an interface of a display device or an input device (not shown).
  • the display device is, for example, a liquid crystal display or the like.
  • the input device is, for example, a device that accepts user operations such as a keyboard and a mouse.
  • the communication interface 314 is a circuit, module, or the like that communicates with another device.
  • the communication interface 314 includes a NIC (Network Interface Card) and the like.
  • the function of the image processing device 20 is realized by various processing modules.
  • the processing module is realized, for example, by the processor 311 executing a program stored in the memory 312.
  • the program can also be recorded on a computer-readable storage medium.
  • the storage medium may be a non-transitory medium such as a semiconductor memory, a hard disk, a magnetic recording medium, or an optical recording medium. That is, the present invention can also be embodied as a computer program product.
  • the program can be downloaded via a network or updated using a storage medium in which the program is stored.
  • the processing module may be realized by a semiconductor chip.
  • the configuration, operation, and the like of the image processing system described in the above embodiment are examples, and are not intended to limit the configuration and the like of the system.
  • the processing result of the image processing device 20 need not be stored in the result storage device 30; it may instead be transmitted directly to a device that uses the result of the image processing.
  • the processing result may be transmitted to a device that controls the opening and closing of the gate according to the result of the face recognition.
  • the image processing system according to the above embodiment may be realized in a cloud environment or an edge environment.
  • In a cloud environment, the image processing device 20 and the result storage device 30 operate as servers on the network.
  • In an edge environment, the image processing device 20 operates as a server on the edge side, and the result storage device 30 operates as a server on the cloud side.
  • In the above embodiment, the image processing device 20 extracts the still image data from the moving image data acquired from the camera device 10, but the camera device 10 may instead periodically (for example, at 1-second intervals) capture still image data and transmit it to the image processing device 20.
  • the camera device 10 is assumed to be a fixed camera device such as a surveillance camera, but the image (moving image) data input to the image processing device 20 may be data acquired from a mobile camera device. That is, the camera device 10 includes a surveillance camera, a digital camera, a mobile phone, a smartphone, and the like; the "camera device" disclosed in the present application can be any electronic device having a photographing function.
  • a person is set as the target of image processing, but the target of image processing can be arbitrary.
  • an animal may be the target of image processing, or a device (object) such as a robot may be the target of image processing.
  • the authentication process using the face image has been described as a predetermined image process, but the process performed by the image processing unit 205 is not limited to the authentication process.
  • the image processing unit 205 may execute the age / gender determination process.
  • the portion to be processed by the image processing unit 205 is not limited to the “face”.
  • the image processing unit 205 may process a portion such as a “hand” or a “foot”. In this case, the image cutting unit 204 cuts out a portion required for processing by the image processing unit 205.
  • In the above embodiment, the same image processing (for example, face recognition processing) is executed on the data from each of the plurality of camera devices 10 included in the image processing system, but the processing to be performed may be changed according to the camera device 10 from which the moving image data is acquired.
  • For example, face recognition processing may be performed on the moving image data (still image data) transmitted by the camera device 10-1, while age / gender determination processing is performed on the moving image data transmitted by the camera device 10-2.
  • each camera device 10 transmits an identifier that identifies itself to the image processing device 20 together with the moving image data.
  • the image processing device 20 may change the content of the image processing according to the attribute (for example, the installation position) of the camera device 10.
  • the position of the person appearing in the image and the time (current time) at the time of image acquisition are set as the "imaging status", but other information may be set as the imaging status.
  • information such as "weather” and "brightness” at the time of image acquisition may be used as the imaging status.
  • the storage unit 203 stores the success probability Pk for each weather (sunny, rain, cloudy) and the success probability Pk for each brightness.
  • the weather at the time of image acquisition may be acquired from an external server, or the weather may be estimated using a brightness sensor or the like.
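A storage layout keyed by such conditions might look like the following minimal sketch; the condition labels, probabilities, and trial counts are illustrative assumptions, not values from the embodiment.

```python
# Hypothetical storage unit 203 entries keyed by weather and by brightness
# band instead of (or in addition to) position and time zone.
storage_by_weather = {
    "sunny":  {"P": 0.75, "T": 210},
    "cloudy": {"P": 0.60, "T": 180},
    "rain":   {"P": 0.35, "T": 90},
}
storage_by_brightness = {
    "bright": {"P": 0.80, "T": 150},
    "dim":    {"P": 0.30, "T": 120},
}

def lookup(weather: str, brightness: str) -> tuple:
    """Fetch the (Pk, Tk) pairs matching the current imaging status."""
    w = storage_by_weather[weather]
    b = storage_by_brightness[brightness]
    return (w["P"], w["T"]), (b["P"], b["T"])

print(lookup("rain", "dim"))  # ((0.35, 90), (0.3, 120))
```

How the two probabilities would be combined into a single execution decision is not specified here; any combination rule (minimum, product, weighted average) would be a further design choice.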
  • In the above embodiment, the results of image processing for each imaging status are accumulated from the start of system operation to improve the reliability of the success probability, but the success probability and the number of trials for each imaging status may instead be measured before the system is operated and stored in the storage unit 203 in advance. By storing the success probability and the number of trials in the storage unit 203 in advance in this way, execution determination of image processing based on highly reliable data (success probability) can be realized from the start of system operation. That is, by setting the initial values of the storage unit 203 to values measured in advance, the success probability can quickly converge to an appropriate value.
  • For example, the image processing device 20 may be provided with a test mode in which a person stands at various places at various times, image processing is performed, and the results are used as the initial values to be stored in the storage unit 203.
  • the image processing unit 205 calculates the success probability and the number of trials, and updates the contents of the storage unit 203.
  • the image processing unit 205 may store only the result of the image processing in the storage unit 203, and the determination unit 202 may calculate the success probability based on the stored result.
  • the image processing device 20 may further include a prediction unit 206.
  • the prediction unit 206 predicts the success probability for the imaging situation to be processed based on the success probabilities of imaging situations different from it.
  • For example, in the figure, consider the case where the success probabilities of the small area A and the small area C are stored in the storage unit 203 as highly reliable data (the number of trials in each area is equal to or greater than the trial threshold value), but the success probability of the small area B is not stored (the success probability is 0).
  • the prediction unit 206 calculates the average value of the success probabilities of the small areas A and C adjacent to the small area B, and provides it to the determination unit 202 as the success probability of the small area B.
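The prediction step can be sketched as an average over adjacent small areas whose entries are reliable. The adjacency map, the reliability check, and the fallback value below are illustrative assumptions.

```python
# Hypothetical sketch of the prediction unit (206): estimate the success
# probability of an area without reliable data from its neighbors.
TRIAL_THRESHOLD = 30

# storage unit 203 entries per small area (single time zone, for brevity)
storage = {
    "A": {"P": 0.8, "T": 100},  # reliable (trials >= threshold)
    "B": {"P": 0.0, "T": 0},    # no data yet
    "C": {"P": 0.4, "T": 60},   # reliable
}
neighbors = {"B": ["A", "C"]}   # assumed adjacency of small areas

def predict_success_probability(area: str) -> float:
    """Average the success probabilities of adjacent areas whose number
    of trials meets the trial threshold; 0.0 if none qualify."""
    reliable = [storage[n]["P"] for n in neighbors.get(area, [])
                if storage[n]["T"] >= TRIAL_THRESHOLD]
    return sum(reliable) / len(reliable) if reliable else 0.0

print(predict_success_probability("B"))  # average of areas A and C
```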
  • By installing the image processing program in the storage unit of the computer, the computer can function as the image processing device. Further, by causing the computer to execute the image processing program, the image processing method can be executed by the computer.
  • [Appendix 6] The image processing apparatus according to Appendix 4 or 5, wherein the image processing unit (102, 205) updates the information stored in the storage unit (203) according to the result of attempting the image processing on the image processing target.
  • [Appendix 7] The image processing apparatus (20, 100) according to any one of Appendixes 4 to 6, wherein the information regarding the imaging status includes the position of the image processing target in the image in which the image processing target is captured.
  • [Appendix 8] The image processing apparatus (20, 100) according to any one of Appendixes 4 to 7, wherein the information regarding the imaging status includes the time when the image processing target was photographed.
  • [Appendix 9] The image processing apparatus (20, 100) according to any one of Appendixes 1 to 8, wherein the image processing target is a human face.
  • [Appendix 10] The image processing apparatus (20, 100) according to Appendix 9, further comprising an image cutting unit (204) that cuts out the face region of the image processing target from the image.
  • [Appendix 11] An image processing method in an image processing apparatus (20, 100), including: determining whether or not to execute image processing on an image processing target based on the success probability of executing the image processing on the image processing target for each imaging situation, which is the situation when the image processing target is imaged; and executing the image processing on the image processing target when it is determined to execute the image processing on the image processing target.
  • [Appendix 12] A program that causes a computer (311) mounted on an image processing apparatus (20, 100) to execute: a process of determining whether or not to execute image processing on an image processing target based on the success probability of executing the image processing on the image processing target for each imaging situation, which is the situation when the image processing target is imaged; and a process of executing the image processing on the image processing target when it is determined to execute the image processing on the image processing target.
  • Note that the form of Appendix 11 and the form of Appendix 12 can be expanded to the forms of Appendix 2 to Appendix 10 in the same manner as the form of Appendix 1.
  • the present invention contributes to executing image processing such as face recognition processing with a low load in an environment where computational resources are limited.

Abstract

The objective of the invention is to provide an image processing device that executes image processing at a low load. To this end, the invention provides an image processing device comprising a determination unit and an image processing unit. The determination unit determines whether to execute image processing on an image processing target based on the success probability of executing the image processing on the image processing target for each imaging situation, which is the situation in which the image processing target is captured. The image processing unit executes the image processing on the image processing target when it is determined that the image processing should be performed on the image processing target.
PCT/JP2020/009561 2019-04-15 2020-03-06 Image processing device, image processing method, and program WO2020213284A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-076908 2019-04-15
JP2019076908 2019-04-15

Publications (1)

Publication Number Publication Date
WO2020213284A1 true WO2020213284A1 (fr) 2020-10-22

Family

ID=72837355

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/009561 WO2020213284A1 (fr) 2019-04-15 2020-03-06 Dispositif de traitement d'image, procédé de traitement d'image et programme

Country Status (1)

Country Link
WO (1) WO2020213284A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016194804A (ja) * 2015-03-31 2016-11-17 Kddi株式会社 Person identification device and program
JP2018148367A (ja) * 2017-03-03 2018-09-20 キヤノン株式会社 Image processing apparatus, image processing system, image processing method, and program


Similar Documents

Publication Publication Date Title
KR101687530B1 Control method in imaging system, control device, and computer-readable storage medium
JP6141079B2 Image processing system, image processing apparatus, control method thereof, and program
US8938092B2 Image processing system, image capture apparatus, image processing apparatus, control method therefor, and program
JP4642128B2 Image processing method, image processing apparatus and system
US8314854B2 Apparatus and method for image recognition of facial areas in photographic images from a digital camera
JP6494253B2 Object detection device, object detection method, image recognition device and computer program
JP6921694B2 Monitoring system
US20110142299A1 Recognition of faces using prior behavior
KR20140013407A Object tracking apparatus and method
WO2019033569A1 (fr) Eyeball movement analysis method, device and storage medium
CN108875507B Pedestrian tracking method, device, system and computer-readable storage medium
US20160217326A1 Fall detection device, fall detection method, fall detection camera and computer program
KR20190118619A Pedestrian tracking method and electronic device
JP2008197904A Person search device and person search method
JP2015082245A Image processing apparatus, image processing method and program
EP2544148A1 (fr) Foreign object assessment device, foreign object assessment method, and foreign object assessment program
EP3846114A1 (fr) Animal information management system and animal information management method
KR102022971B1 Method and apparatus for processing an object in an image
JP2018081402A Image processing apparatus, image processing method, and program
CN110505438B Queuing data acquisition method and camera
CN108875488B Object tracking method, object tracking device, and computer-readable storage medium
JP6798609B2 Video analysis device, video analysis method and program
WO2020213284A1 (fr) Image processing device, image processing method, and program
CN114743264A Photographing behavior detection method, apparatus, device and storage medium
KR20180111150A Apparatus for extracting foreground from image and method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20791808

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20791808

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP