WO2018070032A1 - Image processing apparatus - Google Patents

Image processing apparatus

Info

Publication number
WO2018070032A1
WO2018070032A1 (PCT/JP2016/080489)
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
matching
feature amount
captured image
Prior art date
Application number
PCT/JP2016/080489
Other languages
French (fr)
Japanese (ja)
Inventor
信夫 大石
Original Assignee
富士機械製造株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士機械製造株式会社
Priority to PCT/JP2016/080489
Priority to JP2018544656A
Publication of WO2018070032A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 - Special procedures for taking photographs; Apparatus therefor
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis

Definitions

  • The present invention relates to an image processing apparatus.
  • In a conventional apparatus of this type (see Patent Document 1), for example, pattern matching, a learning function (Ada-Boost), and an image difference table are used to detect a specific pattern as a target object within a detection area, which is one region of an image; detecting the target object at higher speed has been desired.
  • The present invention has been made in view of this problem, and its main object is to provide an image processing apparatus that can more reliably image an object moving at high speed.
  • To achieve the main object described above, the present invention adopts the following means.
  • The image processing apparatus disclosed in this specification comprises: a calculation unit that acquires a signal of a part of a captured image captured by an imaging unit and calculates a feature amount including the gradient direction and gradient strength of the luminance of a local region; a determination unit that matches a standard feature amount, including the luminance gradient direction and gradient strength obtained by imaging a predetermined object, against the calculated feature amount of the local region and determines whether the predetermined object is detected in the captured image; and a control unit that starts recording the captured image when the determination unit determines that the predetermined object has entered the captured image.
  • In this apparatus, a signal of a part of the captured image is acquired, a feature amount including the luminance gradient direction and gradient strength of a local region is calculated, and the calculated feature amount of the local region is matched against the standard feature amount, including the luminance gradient direction and gradient strength obtained by imaging a predetermined object, to determine whether the predetermined object is detected in the captured image.
  • The apparatus starts recording the captured image when it is determined that the predetermined object has entered the captured image.
  • Because feature amounts are calculated in the coarse units of luminance gradient direction and gradient strength, and the predetermined object can be detected at higher speed by matching feature amounts of local regions, an object moving at high speed can be imaged more reliably.
  • For example, an object moving at high speed can be reliably recognized within one frame, and the start and stop of imaging and recording can be controlled accordingly.
  • FIG. 1 is a schematic explanatory diagram illustrating an example of a camera system 10.
  • FIG. 2 is a block diagram of an imaging unit 12, an image processing unit 20, and a recording device 41.
  • FIG. 3 is an explanatory diagram showing an example of image matching processing.
  • FIG. 4 is an explanatory diagram showing an example of a template.
  • FIG. 5 is a flowchart showing an example of a moving image automatic shooting processing routine.
  • FIG. 6 is a flowchart showing an example of an image matching processing routine.
  • FIG. 7 is a timing chart of image input, matching processing, and recording control signals.
  • FIG. 8 is a timing chart of conventional image input, matching processing, and recording control signals.
  • FIG. 1 is a schematic explanatory diagram illustrating an example of a camera system 10 that is an example of the present invention.
  • the camera system 10 includes a video camera 11, a computer (PC) 40, and a recording device 41.
  • the video camera 11 is a device that captures an image such as a moving image or a still image.
  • the computer 40 captures image data captured by the video camera 11 and displays it on a display or the like.
  • the recording device 41 records image data captured by the video camera 11.
  • the video camera 11 includes an imaging unit 12, a control unit 14, a timing control unit 16, and an image processing unit 20.
  • the video camera 11 is configured as a high frame rate camera capable of capturing an image at a maximum of 1000 fps (the cycle of one frame is 1 ms).
  • the imaging unit 12 includes an imaging element 13.
  • the imaging element 13 is an element that generates charges by receiving light and outputs the generated charges.
  • the image sensor 13 may be a CMOS image sensor, for example.
  • the control unit 14 controls the entire apparatus and includes a CPU 15.
  • The video camera 11 executes an automatic moving image shooting mode in which it starts moving image shooting when an object registered in advance enters the imaging range and ends moving image shooting when the object leaves the imaging range or the maximum recording time elapses.
  • The control unit 14 starts continuous storage of the captured image (starts recording) when the image processing unit 20 determines that the predetermined object has entered the captured image, and stops the continuous storage (stops recording) when the image processing unit 20 determines that the predetermined object has left the captured image.
  • the timing control unit 16 generates various clocks used for the operation of the image processing unit 20.
  • FIG. 2 is a block diagram of the imaging unit 12, the image processing unit 20, and the recording device 41.
  • the image processing unit 20 performs an object matching process on the image signal input from the imaging unit 12.
  • FIGS. 3A to 3C are explanatory diagrams illustrating an example of the image matching processing of a captured image 50.
  • FIG. 3A is an explanatory diagram of feature amounts of the cells 51 and the pixels 52
  • FIG. 3B is an explanatory diagram of binary coding.
  • FIG. 3C is an explanatory diagram of the matching process. This matching process will be described in detail later.
  • The image processing unit 20 outputs a recording control signal (Rec Trigger = ON or OFF) to the recording device 41 according to the presence or absence of the object; when this signal is input, the recording device 41 starts or stops recording the image signal output from the imaging unit 12.
  • the image processing unit 20 includes a feature amount calculation unit 21 and an image determination unit 30.
  • The feature amount calculation unit 21 acquires a signal of a part of the captured image captured by the imaging unit 12 and calculates a feature amount including the gradient direction and gradient strength of the luminance of a local region (cell). As shown in FIG. 3A, a cell 51 is an area obtained by dividing the image into blocks of a predetermined number of pixels (for example, 7 × 7 or 3 × 3).
  • The feature amount calculation unit 21 executes the processing described below at the same time as it receives the signal from the imaging unit 12.
  • the feature amount calculation unit 21 includes a function calculation unit 22, a gradient strength calculation unit 23, a gradient direction calculation unit 24, a histogram generation unit 25, a histogram RAM 26, a sort unit 27, and a binary code conversion unit 28.
  • The function calculation unit 22 is a circuit that calculates the functions Gx and Gy of the feature amount including the gradient direction and gradient strength. These functions Gx and Gy are obtained from the luminance values by, for example, equations (1) and (2).
  • The gradient strength calculation unit 23 is a circuit that calculates the gradient strength of the luminance using the functions Gx and Gy. This gradient strength is obtained by equation (3).
  • The gradient direction calculation unit 24 is a circuit that calculates the gradient direction of the luminance using the functions Gx and Gy. This gradient direction is obtained by equation (4).
  • the gradient direction calculation unit 24 outputs the gradient direction divided for each angle.
  • the gradient direction calculation unit 24 outputs a gradient direction obtained by dividing 180 ° into eight BINs.
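  • The gradient computation described above can be illustrated with a short Python sketch. Equations (1) to (4) are not reproduced in this text, so the central-difference formulas, the arctangent, and the numeric details below are assumptions rather than the circuit's exact arithmetic; only the overall flow (Gx, Gy, gradient strength, gradient direction quantized into eight BINs over 180°) follows the description.

        import numpy as np

        def gradient_features(lum):
            """Per-pixel gradient strength and 8-BIN direction (assumed central differences)."""
            lum = lum.astype(np.float32)
            gx = np.zeros_like(lum)
            gy = np.zeros_like(lum)
            gx[:, 1:-1] = lum[:, 2:] - lum[:, :-2]   # horizontal luminance difference, eq. (1) assumed
            gy[1:-1, :] = lum[2:, :] - lum[:-2, :]   # vertical luminance difference, eq. (2) assumed
            strength = np.sqrt(gx * gx + gy * gy)    # gradient strength, eq. (3) assumed
            direction = np.mod(np.arctan2(gy, gx), np.pi)                    # fold into [0, 180°)
            bins = np.minimum((direction / np.pi * 8).astype(np.int32), 7)   # eight BINs of 22.5°
            return strength, bins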
  • the function calculator 22, the gradient strength calculator 23, and the gradient direction calculator 24 each have a plurality of circuits so that a plurality of pixels can be processed simultaneously.
  • the function calculation unit 22 has eight calculation circuits and can process eight pixels simultaneously.
  • the gradient strength calculation unit 23 and the gradient direction calculation unit 24 are the same as the function calculation unit 22.
  • the histogram generation unit 25 is a circuit that generates a histogram of each cell representing the relationship between the gradient direction and the gradient intensity.
  • the histogram generator 25 has, for example, eight circuits and can process eight cells in parallel.
  • the histogram RAM 26 is a circuit that stores the generated histogram, and may be, for example, a dual port RAM (DPRAM) having both a read data output terminal and a write data output terminal.
  • the sorting unit 27 is a circuit that sorts cell histograms in descending order of gradient strength.
  • the binary code conversion unit 28 is an arithmetic circuit that generates a binary code using an angle having the highest gradient strength in a cell histogram as a HOG (Histograms of Oriented Gradients) feature amount of the cell.
  • The binary code of the feature amount is defined as an 8-bit code in which 180° is divided into eight parts corresponding to bits 0 to 7 (see FIG. 3B).
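  • As a rough software analogue of the histogram generation unit 25, the sorting unit 27, and the binary code conversion unit 28, the following sketch (continuing the numpy sketch above) builds an eight-BIN histogram per cell and sets the single bit of the strongest direction, as in FIG. 3B; the bit ordering and the use of gradient strength as histogram weights are assumptions.

        def cell_binary_code(strength, bins, cell=(7, 7)):
            """Per-cell 8-BIN histogram; encode the strongest BIN as one hot bit (HOG-style code)."""
            h, w = bins.shape
            ch, cw = cell
            codes = np.zeros((h // ch, w // cw), dtype=np.uint8)
            for cy in range(h // ch):
                for cx in range(w // cw):
                    sl = np.s_[cy * ch:(cy + 1) * ch, cx * cw:(cx + 1) * cw]
                    hist = np.bincount(bins[sl].ravel(),
                                       weights=strength[sl].ravel(), minlength=8)
                    codes[cy, cx] = 1 << int(np.argmax(hist))   # one direction per bit
            return codes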
  • the feature amount calculation unit 21 includes a plurality of circuits, and can calculate a feature amount in parallel with respect to a plurality of pixels and cells.
  • The image determination unit 30 matches the fixed feature amount, including the gradient direction and gradient strength of the luminance obtained by imaging the predetermined object, against the calculated feature amount of the local region (cell), and determines whether the predetermined object is detected in the captured image.
  • the image determination unit 30 is a determination circuit that performs a matching process with a feature amount of an input image using a binary-coded fixed feature amount (template).
  • the image determination unit 30 includes a binary code register 31, a binary code RAM 32, a register 33, a binary code register 34, a binary code RAM 35, a matching determination unit 36, a matching rate RAM 37, and a sorting unit 38.
  • the binary code register 31, the binary code RAM 32, and the register 33 are circuits that store the feature amount (binary code) of the input image.
  • Each of the binary code register 31 and the binary code RAM 32 has eight storage circuits. One cell line of codes at a time is read from the binary code RAM 32, stored in the register 33, and operated as a shift register to form a sliding window.
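  • A minimal software analogue of this shift-register behaviour is sketched below: one line of cell codes is shifted in one code at a time, and each full window corresponds to one matching position. The deque stands in for the hardware register chain and is only an illustration.

        from collections import deque

        def sliding_windows(cell_line, tmpl_width):
            """Yield sliding windows over one line of cell codes, mimicking the shift register."""
            window = deque(maxlen=tmpl_width)      # acts like a register chain tmpl_width cells long
            for code in cell_line:                 # one new cell code shifted in per step
                window.append(code)
                if len(window) == tmpl_width:
                    yield tuple(window)            # one window position ready for matching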
  • the binary code register 34 and the binary code RAM 35 are storage circuits for inputting and storing the feature amount of the template output from the control unit 14.
  • the matching determination unit 36 is a circuit that performs matching between the feature amount of the template obtained in advance from the image of the object and the feature amount of the input image, and counts the matching rate (see FIG. 3C).
  • the matching determination unit 36 includes a matching check unit 36a and a matching rate counting unit 36b.
  • the matching check unit 36a performs matching between the feature amount of the template and the feature amount of the input image.
  • the matching rate counting unit 36b counts the matching rate between the feature amount of the template and the feature amount of the input image.
  • the matching rate RAM 37 is a storage circuit that stores the matching rate determined by the matching determination unit 36.
  • The sorting unit 38 is a circuit that sorts the matching rates and outputs the top-ranked determination results to the control unit 14.
  • the matching between the feature quantity of the template and the feature quantity of the input image is performed at each position while moving the image one cell at a time.
  • The image determination unit 30 has a plurality of registers (the binary code register 31, the binary code RAM 32, and the register 33), stores a plurality of cell feature amounts in these registers, and performs matching by operating the stored feature amounts cell by cell as a shift register. For this reason, the image determination unit 30 can realize faster matching than reading out one cell at a time.
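  • In software terms, the matching just described might look like the following sketch, continuing the earlier numpy sketches: the template's cell codes are compared with the input cell codes by a bitwise AND while sliding one cell at a time, and the fraction of matching cells is counted as the matching rate. The exact definition of the matching rate and the handling of window borders are assumptions, not taken from the publication.

        def match_rates(codes, tmpl):
            """Slide the template over the cell-code map and count the AND-match rate per position."""
            H, W = codes.shape
            th, tw = tmpl.shape
            rates = np.zeros((H - th + 1, W - tw + 1), dtype=np.float32)
            for y in range(H - th + 1):
                for x in range(W - tw + 1):
                    window = codes[y:y + th, x:x + tw]
                    hits = np.count_nonzero(window & tmpl)   # cells sharing a set BIN bit
                    rates[y, x] = hits / float(th * tw)      # matching rate at this position
            return rates

        # Top three candidate positions, as output to the control unit (step S290):
        # top3 = np.argsort(rates.ravel())[::-1][:3]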
  • FIG. 4 is an explanatory diagram illustrating an example of a template. FIGS. 3 and 4 and the like are described using an automobile for ease of understanding, but the moving object is not limited to an automobile.
  • this template is a reference cell feature amount obtained based on an image obtained by capturing an object.
  • This template may be generated using, for example, a binary code including at least a first feature amount of a cell and a second feature amount at a position shifted by a predetermined amount with respect to the cell.
  • This template includes a first template obtained from a cell divided at the first position.
  • the template also includes a second template obtained from a cell divided at a second position moved by a predetermined number of pixels (for example, ( ⁇ 3, ⁇ 3)) from the first position to the negative side. Furthermore, this template also includes a third template obtained from a cell divided at a third position moved by a predetermined number of pixels (for example, (3, 3)) from the first position to the positive side.
  • The template of FIG. 4 is obtained as one binary code by binary-coding, for each of the first to third templates, the top three of the eight BINs that exceed a predetermined threshold, and combining the three binary codes generated from the three histograms.
  • Since the image processing unit 20 matches the feature amount of the input image using this template, matching the three positions at once improves robustness and reduces the load on the circuit. The purpose of matching with three templates whose cell cut points are shifted is to improve the detection rate.
  • Because the captured image is distorted and appears to change shape relative to the template due to parallax caused by changes in the relative position between the camera and the object, matching simultaneously against the three-viewpoint template improves robustness against this parallax.
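  • A sketch of how one combined template could be built from the three cut positions is shown below, reusing gradient_features and numpy from the earlier sketches. For each of the three positions, up to the top three BINs of each cell histogram that exceed a threshold are binary-coded, and the three codes are OR-combined into a single code per cell. The offsets (0, 0), (-3, -3), and (3, 3) follow the text; the threshold handling is an assumption.

        def combined_template(lum, top_left, size_cells, cell=(7, 7),
                              offsets=((0, 0), (-3, -3), (3, 3)), thresh=0.0):
            """OR-combine binary codes of three templates cut at shifted cell positions."""
            strength, bins = gradient_features(lum)     # from the earlier sketch
            ch, cw = cell
            th, tw = size_cells
            tmpl = np.zeros((th, tw), dtype=np.uint8)
            for dy, dx in offsets:
                y0, x0 = top_left[0] + dy, top_left[1] + dx
                for cy in range(th):
                    for cx in range(tw):
                        sl = np.s_[y0 + cy * ch:y0 + (cy + 1) * ch,
                                   x0 + cx * cw:x0 + (cx + 1) * cw]
                        hist = np.bincount(bins[sl].ravel(),
                                           weights=strength[sl].ravel(), minlength=8)
                        for b in np.argsort(hist)[::-1][:3]:        # top three BINs
                            if hist[b] > thresh:
                                tmpl[cy, cx] |= np.uint8(1 << int(b))   # merge into one code
            return tmpl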
  • the imaging unit 12 may be capable of high-speed imaging at a rate exceeding 240 fps, for example, a maximum of 1000 fps.
  • the image determination unit 30 of the image processing unit 20 may detect the predetermined object within one frame after the object enters the captured image, for example, within 1 ⁇ sec.
  • The image processing unit 20 can perform high-speed matching by calculating binary-coded HOG feature amounts with parallelized circuits and constructing a sliding window from those feature amounts using a shift register.
  • the video camera 11 can be used, for example, when recording an object moving at high speed. It is preferable that the object does not change its direction (direction) or size (distance) during movement.
  • the video camera 11 can be used, for example, in a parts camera that takes a picture while moving a collected component in a mounting apparatus that mounts the component on a substrate.
  • the user can store and save the captured image in the recording device 41 or perform slow motion playback on the computer 40.
  • FIG. 5 is a flowchart illustrating an example of a moving image automatic shooting processing routine executed by the video camera 11.
  • FIG. 6 is a flowchart illustrating an example of an image matching processing routine executed by the image processing unit 20.
  • the moving image automatic shooting processing routine is executed by the CPU 15 of the control unit 14 after the user inputs execution of the moving image automatic shooting processing. It is assumed that the user registers an image of an object before executing the automatic photographing process, and the control unit 14 creates a template for the object.
  • the CPU 15 inputs an image signal from the imaging unit 12 (step S100).
  • the CPU 15 inputs an image matching result from the image processing unit 20 (step S110), and determines whether or not an object is detected in the image (step S120). This determination is performed based on, for example, whether or not the relevance ratio as the image matching result is equal to or higher than a predetermined threshold (for example, 75% or higher).
  • When the object is not detected in the image, the CPU 15 repeats the processing from step S100.
  • When the object is detected in the image, the CPU 15 starts recording (step S130).
  • Next, as in steps S100 to S120, the CPU 15 inputs an image signal from the imaging unit 12 (step S140), inputs the image matching result from the image processing unit 20 (step S150), and determines whether the object is detected in the image (step S160). When the object is detected in the image, that is, when the object remains present in the image, the CPU 15 determines whether the recordable time of the recording device 41 has been exceeded (step S170); when it has not been exceeded, the CPU 15 continues the processing from step S140.
  • When the recordable time is exceeded in step S170, or when the object is no longer detected in step S160, that is, when the object has moved out of the image, the CPU 15 sets that time as the recording end time (step S180). The CPU 15 then ends the recording and executes the processing from step S100 again.
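  • The control flow of steps S100 to S180 can be summarized by the loop below. The camera and recorder interfaces (grab_frame, match_rate, rec_trigger) are hypothetical names used only for illustration, and the 75% threshold is the example value given above.

        MATCH_THRESHOLD = 0.75   # example threshold from the text (75% or higher)

        def auto_record(camera, recorder, max_record_frames):
            """Start recording when the object enters the image; stop when it leaves or time runs out."""
            while True:
                camera.grab_frame()                            # S100/S140: input image signal
                if camera.match_rate() < MATCH_THRESHOLD:      # S110-S120: object not detected
                    continue
                recorder.rec_trigger(True)                     # S130: start recording
                frames = 0
                while camera.match_rate() >= MATCH_THRESHOLD and frames < max_record_frames:
                    camera.grab_frame()                        # S140-S170: object still present
                    frames += 1
                recorder.rec_trigger(False)                    # S180: set end time, stop recording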
  • the output destination of the image may be any one or more of, for example, a storage medium (DVD, HD, etc.) provided in the video camera 11 or the computer 40.
  • the captured moving image is reproduced by the computer 40, for example.
  • Since the imaging unit 12 captures at a high rate such as 1000 fps, slow playback and the like can be performed with high-quality images.
  • This routine is executed by the image processing unit 20 every time an image signal is input from the imaging unit 12. This routine is executed by each circuit included in the image processing unit 20.
  • The feature amount calculation unit 21 of the image processing unit 20 inputs the image signal (step S200), calculates the functions Gx and Gy related to the gradient direction and gradient strength of the luminance (step S210), and uses the calculation results to calculate the gradient strength and gradient direction of the luminance of each cell (step S220, FIG. 3A).
  • the feature amount calculation unit 21 divides 180 ° into eight BINs (step S230), generates a histogram for each cell, and stores the histogram in the histogram RAM 26 (step S240).
  • The feature amount calculation unit 21 sorts the histogram values (step S250), generates a binary code in which each direction is assigned to one bit, with the bit corresponding to the representative value of the cell set (FIG. 3B), and stores it in the registers (the binary code register 31, the binary code RAM 32, and the register 33) of the image determination unit 30 (step S260). These processes are executed in parallel for a plurality of pixels and cells using a plurality of circuits.
  • The image determination unit 30 inputs the template image from the control unit 14, stores it in the binary code register 34 and the binary code RAM 35, and performs the matching process between the template and the feature amount of the input image (step S270). At this time, the image determination unit 30 performs the matching process while shifting the input image feature amounts of one line of cells. Because the image determination unit 30 performs the matching process using binary-coded feature amounts, simple operations such as AND and OR suffice, and the determination can be made at high speed (see FIG. 3C). The image determination unit 30 then stores the matching rates in the matching rate RAM 37 (step S280), sorts them, outputs the top three matching rates to the control unit 14 as matching results (step S290), and ends this routine. This processing is repeated every time an image signal is input.
  • FIG. 7 is a timing chart of image input, matching processing, and recording control signals.
  • FIG. 8 is a timing chart of conventional image input, matching processing, and recording control signals.
  • In a conventional image processing unit, as shown in FIG. 8, the matching process takes several frames, so recording cannot be started at an appropriate timing; because the start of recording is delayed, a large-capacity frame memory is needed to compensate for the delay.
  • In contrast, the image processing unit 20 can perform feature amount calculation and matching simultaneously with image signal input. As shown in FIG. 7, the matching result is obtained immediately after one image transmission by the imaging unit 12, so recording can start from the frame following the one in which the object is detected.
  • In addition, the image processing unit 20 can expand the search range to the entire image, so that not only objects moving at a constant speed in a constant direction but also objects that enter the imaging range at random or move randomly can be detected in real time.
  • the imaging unit 12 of the present embodiment corresponds to an imaging unit
  • the feature amount calculation unit 21 corresponds to a calculation unit
  • the image determination unit 30 corresponds to a determination unit
  • the control unit 14 corresponds to a control unit.
  • The video camera 11 of the camera system 10 of the present embodiment described above acquires a signal of a part of the captured image, calculates a feature amount including the gradient direction and gradient strength of the luminance of a local region (cell), and matches the calculated cell feature amount against a standard (template) feature amount, including the luminance gradient direction and gradient strength obtained by imaging a predetermined object, to determine whether the predetermined object is detected in the captured image.
  • the video camera 11 starts recording the captured image when it is determined that the predetermined object has entered the captured image.
  • Because the video camera 11 calculates feature amounts in the coarse units of luminance gradient direction and gradient strength and can detect the predetermined object at higher speed by matching feature amounts of local regions, an object moving at high speed can be imaged more reliably.
  • the video camera 11 can quickly detect an object and start capturing a moving image.
  • Because the control unit 14 stops recording the captured image when the image determination unit 30 determines that the predetermined object has left the captured image, imaging and recording can be started and stopped at more appropriate timings.
  • the feature amount calculation unit 21 is a calculation circuit that creates a histogram for each gradient direction divided into a plurality of directions, and generates a binary code as a feature amount of a cell whose representative value is a direction having a strong gradient strength.
  • The image determination unit 30 is a determination circuit that performs matching using a binary-coded standard feature amount. In this video camera 11, since each process is performed by dedicated hardware circuits, an object can be detected at higher speed.
  • The image determination unit 30 performs matching using, as the standard feature amount of a cell, a binary code including at least a first feature amount of the cell and a second feature amount at a position shifted by a predetermined amount with respect to the cell. Since the image determination unit 30 detects the object by matching against a fixed feature amount that includes a plurality of feature amounts, matching can be performed at once even when the local region is shifted, so a drop in the detection rate can be suppressed and an increase in the number of matching operations can be avoided, shortening the processing time. In addition, since the image determination unit 30 performs the matching process using binary codes, simple operations such as AND and OR suffice, and even faster arithmetic processing can be realized.
  • the imaging unit 12 can perform high-speed imaging at a rate exceeding 240 fps, and the image determination unit 30 detects a predetermined target within one frame after the target enters the captured image. Since the image processing unit 20 can detect an object at a higher speed, the object can be detected in real time without delay. Further, the image determination unit 30 has a plurality of registers, stores the calculated feature values of the cells in each of the registers, and performs matching by operating the stored feature values for each cell as a shift register. Since the image determination unit 30 is configured to move (shift) the data in the circuit, it is not necessary to read out the feature amount each time, and matching processing can be performed at a higher speed. Furthermore, since the feature amount calculation unit 21 includes a plurality of circuits and calculates the feature amount in parallel by parallelizing a plurality of pixels, it is possible to realize faster calculation of the feature amount.
  • In the embodiment described above, binary-coded feature amounts are used.
  • However, the present invention is not limited to this, and feature amounts other than binary codes may be used.
  • In the embodiment described above, one template is created from three templates.
  • However, one template may instead be created from two templates, or from four or more templates.
  • The template may be created as appropriate in consideration of how quickly the object needs to be detected.
  • the image determination unit 30 includes a plurality of registers and performs a matching process by operating a shift register. However, this may be omitted. In order to speed up the matching process, it is desirable that the image determination unit 30 operates as a shift register.
  • the feature amount calculation unit 21 performs parallel processing using a plurality of circuits, but this may be omitted. In order to speed up the matching process, it is desirable that the feature amount calculation unit 21 performs parallel processing using a plurality of circuits.
  • the present invention can be used in the technical field of an apparatus that captures and plays back moving images.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)

Abstract

A video camera 11 acquires a signal of a part of a captured image, calculates a feature amount including the gradient direction and the gradient intensity of luminance of a local region (cell), and determines whether a prescribed object is detected in the captured image by matching the calculated cell feature amount with a standard feature amount including the gradient direction and the gradient intensity of luminance obtained by capturing the prescribed object. The video camera 11 starts recording the captured image when it is determined that the prescribed object has entered the captured image.

Description

Image processing apparatus

The present invention relates to an image processing apparatus.
Conventionally, as an image processing apparatus of this type, there has been proposed an apparatus that, based on image recognition results, ends high-speed writing of imaging results to a memory with reference to the timing at which images of a specific pattern are detected in a specific order, and records the imaging results written in the memory on a recording medium (see, for example, Patent Document 1). Applied to a video camera with integrated recording means, such an apparatus is said to be able to reliably capture a desired scene in slow motion.
JP 2008-141282 A
However, the apparatus described in Patent Document 1 uses, for example, pattern matching, a learning function (Ada-Boost), and an image difference table to detect a specific pattern as a target object within a detection area that is one region of an image, and detecting the target object at higher speed has been desired.
The present invention has been made in view of this problem, and its main object is to provide an image processing apparatus that can more reliably image an object moving at high speed.
To achieve the main object described above, the present invention adopts the following means.
The image processing apparatus disclosed in this specification comprises: a calculation unit that acquires a signal of a part of a captured image captured by an imaging unit and calculates a feature amount including the gradient direction and gradient strength of the luminance of a local region; a determination unit that matches a standard feature amount, including the luminance gradient direction and gradient strength obtained by imaging a predetermined object, against the calculated feature amount of the local region and determines whether the predetermined object is detected in the captured image; and a control unit that starts recording the captured image when the determination unit determines that the predetermined object has entered the captured image.
In this apparatus, a signal of a part of the captured image is acquired, a feature amount including the luminance gradient direction and gradient strength of a local region is calculated, and the calculated feature amount of the local region is matched against the standard feature amount, including the luminance gradient direction and gradient strength obtained by imaging a predetermined object, to determine whether the predetermined object is detected in the captured image. The apparatus starts recording the captured image when it is determined that the predetermined object has entered the captured image. Because feature amounts are calculated in the coarse units of luminance gradient direction and gradient strength, and the predetermined object can be detected at higher speed by matching feature amounts of local regions, an object moving at high speed can be imaged more reliably. For example, an object moving at high speed can be reliably recognized within one frame, and the start and stop of imaging and recording can be controlled.
FIG. 1 is a schematic explanatory diagram illustrating an example of a camera system 10.
FIG. 2 is a block diagram of an imaging unit 12, an image processing unit 20, and a recording device 41.
FIG. 3 is an explanatory diagram showing an example of image matching processing.
FIG. 4 is an explanatory diagram showing an example of a template.
FIG. 5 is a flowchart showing an example of a moving image automatic shooting processing routine.
FIG. 6 is a flowchart showing an example of an image matching processing routine.
FIG. 7 is a timing chart of image input, matching processing, and recording control signals.
FIG. 8 is a timing chart of conventional image input, matching processing, and recording control signals.
Embodiments of the present invention will be described below with reference to the drawings. FIG. 1 is a schematic explanatory diagram illustrating an example of a camera system 10 that is an example of the present invention. The camera system 10 includes a video camera 11, a computer (PC) 40, and a recording device 41. The video camera 11 is a device that captures images such as moving images and still images. The computer 40 captures image data taken by the video camera 11 and displays it on a display or the like. The recording device 41 records image data captured by the video camera 11.
The video camera 11 includes an imaging unit 12, a control unit 14, a timing control unit 16, and an image processing unit 20. The video camera 11 is configured as a high frame rate camera capable of capturing images at up to 1000 fps (one frame period of 1 ms). The imaging unit 12 includes an imaging element 13, which is an element that generates charges by receiving light and outputs the generated charges; the imaging element 13 may be, for example, a CMOS image sensor. The control unit 14 controls the entire apparatus and includes a CPU 15. The video camera 11 executes an automatic moving image shooting mode in which it starts moving image shooting when an object registered in advance enters the imaging range and ends moving image shooting when the object leaves the imaging range or the maximum recording time elapses. As will be described in detail later, the control unit 14 starts continuous storage of the captured image (starts recording) when the image processing unit 20 determines that the predetermined object has entered the captured image, and stops the continuous storage (stops recording) when the image processing unit 20 determines that the predetermined object has left the captured image. The timing control unit 16 generates various clocks used for the operation of the image processing unit 20.
FIG. 2 is a block diagram of the imaging unit 12, the image processing unit 20, and the recording device 41. In the video camera 11, the image processing unit 20 performs object matching processing on the image signal input from the imaging unit 12. FIGS. 3A to 3C are explanatory diagrams illustrating an example of the image matching processing of a captured image 50: FIG. 3A illustrates the feature amounts of cells 51 and pixels 52, FIG. 3B illustrates binary coding, and FIG. 3C illustrates the matching process, which will be described in detail later. The image processing unit 20 outputs a recording control signal (Rec Trigger = ON or OFF) to the recording device 41 according to the presence or absence of the object; when this signal is input, the recording device 41 starts or stops recording the image signal output from the imaging unit 12.
The image processing unit 20 includes a feature amount calculation unit 21 and an image determination unit 30. The feature amount calculation unit 21 acquires a signal of a part of the captured image captured by the imaging unit 12 and calculates a feature amount including the gradient direction and gradient strength of the luminance of a local region (cell). As shown in FIG. 3A, a cell 51 is an area obtained by dividing the image into blocks of a predetermined number of pixels (for example, 7 × 7 or 3 × 3). The feature amount calculation unit 21 executes the processing described below at the same time as it receives the signal from the imaging unit 12. The feature amount calculation unit 21 includes a function calculation unit 22, a gradient strength calculation unit 23, a gradient direction calculation unit 24, a histogram generation unit 25, a histogram RAM 26, a sorting unit 27, and a binary code conversion unit 28. The function calculation unit 22 is a circuit that calculates the functions Gx and Gy of the feature amount including the gradient direction and gradient strength; these functions are obtained from the luminance values by, for example, equations (1) and (2). The gradient strength calculation unit 23 is a circuit that calculates the gradient strength of the luminance from the functions Gx and Gy; this gradient strength is obtained by equation (3). The gradient direction calculation unit 24 is a circuit that calculates the gradient direction of the luminance from the functions Gx and Gy; this gradient direction is obtained by equation (4). The gradient direction calculation unit 24 outputs the gradient direction divided by angle, into eight BINs covering 180°. The function calculation unit 22, the gradient strength calculation unit 23, and the gradient direction calculation unit 24 each have a plurality of circuits so that a plurality of pixels can be processed simultaneously; for example, the function calculation unit 22 has eight calculation circuits and can process eight pixels at once, and the gradient strength calculation unit 23 and the gradient direction calculation unit 24 are configured similarly.
[Equations (1) to (4) are rendered as images in the original publication.]
The histogram generation unit 25 is a circuit that generates, for each cell, a histogram representing the relationship between gradient direction and gradient strength; it has, for example, eight circuits and can process eight cells in parallel. The histogram RAM 26 is a circuit that stores the generated histograms, and may be, for example, a dual-port RAM (DPRAM) having both a read data terminal and a write data terminal. The sorting unit 27 is a circuit that sorts the cell histograms in descending order of gradient strength. The binary code conversion unit 28 is an arithmetic circuit that generates a binary code using the angle having the highest gradient strength in the cell histogram as the HOG (Histograms of Oriented Gradients) feature amount of the cell. The binary code of the feature amount is defined as an 8-bit code in which 180° is divided into eight parts corresponding to bits 0 to 7 (see FIG. 3B). The feature amount calculation unit 21 is thus made up of a plurality of circuits and can calculate feature amounts in parallel for a plurality of pixels and cells.
The image determination unit 30 matches a standard feature amount, including the luminance gradient direction and gradient strength obtained by imaging a predetermined object, against the calculated feature amount of the local region (cell), and determines whether the predetermined object is detected in the captured image. The image determination unit 30 is a determination circuit that performs the matching process against the feature amount of the input image using a binary-coded standard feature amount (template). It includes a binary code register 31, a binary code RAM 32, a register 33, a binary code register 34, a binary code RAM 35, a matching determination unit 36, a matching rate RAM 37, and a sorting unit 38. The binary code register 31, the binary code RAM 32, and the register 33 are circuits that store the feature amount (binary code) of the input image; the binary code register 31 and the binary code RAM 32 each have eight storage circuits. One cell line of codes at a time is read from the binary code RAM 32, stored in the register 33, and operated as a shift register to form a sliding window.
The binary code register 34 and the binary code RAM 35 are storage circuits that receive and store the feature amount of the template output from the control unit 14. The matching determination unit 36 is a circuit that matches the template feature amount, obtained in advance from an image of the object, against the feature amount of the input image and counts the matching rate (see FIG. 3C). It includes a matching check unit 36a, which performs the matching between the template feature amount and the input image feature amount, and a matching rate counting unit 36b, which counts the matching rate between them. The matching rate RAM 37 is a storage circuit that stores the matching rates determined by the matching determination unit 36. The sorting unit 38 is a circuit that sorts the matching rates and outputs the top-ranked determination results to the control unit 14. The matching between the template feature amount and the input image feature amount is performed at each position while moving over the image one cell at a time; the image determination unit 30 has a plurality of registers (the binary code register 31, the binary code RAM 32, and the register 33), stores a plurality of cell feature amounts in these registers, and performs matching by operating the stored feature amounts cell by cell as a shift register. For this reason, the image determination unit 30 can realize faster matching than reading out one cell at a time.
The template will now be described. FIG. 4 is an explanatory diagram illustrating an example of a template. FIGS. 3 and 4 and the like are described using an automobile for ease of understanding, but the moving object is not limited to an automobile. Like the input image described above, this template is a set of reference cell feature amounts obtained from an image of the object. The template may be generated using, for example, a binary code including at least a first feature amount of a cell and a second feature amount at a position shifted by a predetermined amount with respect to the cell. This template includes a first template obtained from cells divided at a first position, a second template obtained from cells divided at a second position shifted from the first position by a predetermined number of pixels to the negative side (for example, (−3, −3)), and a third template obtained from cells divided at a third position shifted from the first position by a predetermined number of pixels to the positive side (for example, (3, 3)). The template of FIG. 4 is obtained as one binary code by binary-coding, for each of the first to third templates, the top three of the eight BINs that exceed a predetermined threshold, and combining the three resulting binary codes. Since the image processing unit 20 matches the feature amount of the input image using this template, matching the three positions at once improves robustness and reduces the load on the circuit. The purpose of matching with three templates whose cell cut points are shifted is to improve the detection rate: because the captured image is distorted and appears to change shape relative to the template due to parallax caused by changes in the relative position between the camera and the object, matching simultaneously against the three-viewpoint template improves robustness against this parallax.
In the video camera 11, the imaging unit 12 may be capable of high-speed imaging at a rate exceeding 240 fps, for example, up to 1000 fps. In this case, the image determination unit 30 of the image processing unit 20 may detect the predetermined object within one frame after the object enters the captured image, for example, within 1 μsec. The image processing unit 20 can perform high-speed matching by calculating binary-coded HOG feature amounts with parallelized circuits and constructing a sliding window from those feature amounts using a shift register. The video camera 11 can be used, for example, to record an object moving at high speed; the object preferably does not change its orientation (direction) or size (distance) during movement. The video camera 11 can be used, for example, as a parts camera that images a picked-up component while it is being moved in a mounting apparatus that mounts components on a board. The user can store the captured images in the recording device 41 or play them back in slow motion on the computer 40.
Next, the operation of the camera system 10 of the present embodiment configured as described above will be described, starting with the processing for automatically capturing a moving image of a predetermined moving object. FIG. 5 is a flowchart illustrating an example of the moving image automatic shooting processing routine executed by the video camera 11, and FIG. 6 is a flowchart illustrating an example of the image matching processing routine executed by the image processing unit 20. The moving image automatic shooting processing routine is executed by the CPU 15 of the control unit 14 after the user instructs execution of the moving image automatic shooting processing. It is assumed that the user registers an image of the object before executing the automatic shooting processing and that the control unit 14 creates a template of the object.
When this routine starts, the CPU 15 inputs an image signal from the imaging unit 12 (step S100). Next, the CPU 15 inputs the image matching result from the image processing unit 20 (step S110) and determines whether the object is detected in the image (step S120). This determination is made, for example, based on whether the matching rate given as the image matching result is equal to or higher than a predetermined threshold (for example, 75% or higher). When the object is not detected in the image, the CPU 15 repeats the processing from step S100; when the object is detected, it starts recording (step S130). Next, as in steps S100 to S120, the CPU 15 inputs an image signal from the imaging unit 12 (step S140), inputs the image matching result from the image processing unit 20 (step S150), and determines whether the object is detected in the image (step S160). When the object is detected in the image, that is, when the object remains present in the image, the CPU 15 determines whether the recordable time of the recording device 41 has been exceeded (step S170); when it has not been exceeded, the CPU 15 continues the processing from step S140.
On the other hand, when the recordable time is exceeded in step S170, or when the object is no longer detected in step S160, that is, when the object has moved out of the image, the CPU 15 sets that time as the recording end time (step S180). The CPU 15 then ends the recording and executes the processing from step S100 again. The output destination of the image may be, besides the recording device 41, any one or more of a storage medium (DVD, HD, etc.) provided in the video camera 11, the computer 40, and the like. The captured moving image is played back on, for example, the computer 40. Since the imaging unit 12 captures at a high rate such as 1000 fps, slow playback and the like can be performed with high-quality images.
Next, the image matching processing (FIGS. 3 and 6) executed by the image processing unit 20 will be described. This routine is executed by the image processing unit 20 every time an image signal is input from the imaging unit 12, and is carried out by the circuits included in the image processing unit 20. When this routine starts, the feature amount calculation unit 21 of the image processing unit 20 inputs the image signal (step S200), calculates the functions Gx and Gy related to the gradient direction and gradient strength of the luminance (step S210), and uses the calculation results to calculate the gradient strength and gradient direction of the luminance of each cell (step S220, FIG. 3A). Next, the feature amount calculation unit 21 divides 180° into eight BINs (step S230), generates a histogram for each cell, and stores it in the histogram RAM 26 (step S240). The feature amount calculation unit 21 then sorts the histogram values (step S250), generates a binary code in which each direction is assigned to one bit, with the bit corresponding to the representative value of the cell set (FIG. 3B), and stores it in the registers (the binary code register 31, the binary code RAM 32, and the register 33) of the image determination unit 30 (step S260). These processes are executed in parallel for a plurality of pixels and cells using a plurality of circuits.
 Subsequently, the image determination unit 30 receives the template image from the control unit 14, stores it in the binary code register 34 and the binary code RAM 35, and performs matching between the template and the feature amounts of the input image (step S270). At this time, the image determination unit 30 performs the matching while shifting the feature amounts of one line of cells of the input image. Because the image determination unit 30 performs the matching on binary-coded feature amounts, simple operations such as AND and OR suffice, and the determination can be made at high speed (see FIG. 3(c)). The image determination unit 30 then stores the matching rates in the matching rate RAM 37 (step S280), sorts the matching rates, outputs the top three matching rates to the control unit 14 as the matching result (step S290), and ends this routine. This processing is repeated every time an image signal is input.
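 The binary-code matching of step S270 can be sketched as follows, reusing the output of `cell_binary_code` from the previous sketch. The exact definition of the matching rate is not given in the specification; the assumption here is that it is the fraction of template bits reproduced by the corresponding input cell codes, computed with an AND and a bit count.

```python
# Sketch of binary-code matching (step S270), assuming the matching rate is
# the fraction of template bits that also appear in the corresponding input
# cell codes; the patent only states that AND/OR-style operations are used.
def matching_rate(template_codes, input_codes):
    """template_codes, input_codes: lists of 8-bit cell codes of equal length."""
    matched = 0
    total = 0
    for t, c in zip(template_codes, input_codes):
        matched += bin(t & c).count("1")   # bits shared by template and input (AND)
        total += bin(t).count("1")         # bits expected by the template
    return matched / total if total else 0.0

def best_matches(template_codes, line_codes, top_n=3):
    """Slide the template along one line of cell codes and keep the top rates."""
    width = len(template_codes)
    rates = [(matching_rate(template_codes, line_codes[i:i + width]), i)
             for i in range(len(line_codes) - width + 1)]
    return sorted(rates, reverse=True)[:top_n]
```

 Because each comparison reduces to a bitwise AND followed by a population count, the operation maps directly onto the simple logic described above.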
 FIG. 7 is a timing chart of image input, matching processing, and the recording control signal, and FIG. 8 is a timing chart of conventional image input, matching processing, and the recording control signal. In a conventional image processing unit, as shown in FIG. 8, the matching process takes several frames or more, so recording cannot be started at the appropriate timing. In such a video camera, because the start of recording is delayed, a large-capacity frame memory is required to compensate for the delay. In contrast, the image processing unit 20 can calculate the feature amounts and perform the matching at the same time as the image signal is input; as shown in FIG. 7, the matching result is obtained immediately after one image is transmitted from the imaging unit 12, so recording can be started from the frame following the one in which the object is detected. In addition, the image processing unit 20 can extend the search range to the entire image, so that, in addition to objects moving at a constant speed in a constant direction, objects that enter the imaging range at random or move randomly can also be detected in real time.
 Here, the correspondence between the components of the present embodiment and the components of the present invention will be clarified. The imaging unit 12 of the present embodiment corresponds to the imaging unit, the feature amount calculation unit 21 corresponds to the calculation unit, the image determination unit 30 corresponds to the determination unit, and the control unit 14 corresponds to the control unit.
 In the video camera 11 of the camera system 10 of the present embodiment described above, a partial signal of the captured image is acquired, a feature amount including the luminance gradient direction and gradient intensity of a local region (cell) is calculated, and matching is performed between a fixed (template) feature amount, including the luminance gradient direction and gradient intensity obtained by imaging a predetermined object, and the calculated cell feature amount to determine whether the predetermined object is detected in the captured image. The video camera 11 then starts recording the captured image when it is determined that the predetermined object has entered the captured image. Because the video camera 11 calculates feature amounts in the coarse units of luminance gradient direction and gradient intensity and can detect the predetermined object at high speed by matching the feature amounts of local regions, an object moving at high speed can be captured more reliably. In general, in an apparatus that records moving images, the user starts and stops recording, but there is a delay between the start operation and the actual recording, so the decisive moment may be missed. The video camera 11 can detect the object quickly and start capturing the moving image.
 Further, because the control unit 14 stops recording the captured image when the image determination unit 30 determines that the predetermined object has left the captured image, image recording can be started and stopped at more appropriate timings. Furthermore, the feature amount calculation unit 21 is an arithmetic circuit that creates a histogram for each of a plurality of gradient directions and generates a binary code as the cell feature amount, using the directions with strong gradient intensity as representative values, and the image determination unit 30 is a determination circuit that performs the matching using binary-coded fixed feature amounts. Because each process in the video camera 11 is performed by hardware circuits, the object can be detected at higher speed. Furthermore, the image determination unit 30 performs the matching using a binary code that includes, as the fixed feature amount of a cell, at least a first feature amount of the cell and a second feature amount at a position shifted from the cell by a predetermined amount. Because the image determination unit 30 detects the object by matching against a fixed feature amount that includes a plurality of feature amounts, matching can be performed in one pass even when, for example, the local region is displaced, which both limits the drop in the matching rate and limits the increase in the number of matching operations, thereby shortening the processing time. In addition, because the image determination unit 30 performs the matching on binary codes, simple operations such as AND and OR suffice, and faster arithmetic processing can be realized.
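 One way to realize a fixed feature amount containing a first and a shifted second feature amount is sketched below, reusing `cell_binary_code` from the earlier sketch. The specification does not state how the two feature amounts are combined; merging them with an OR so that a single AND-based comparison also tolerates a small displacement is an assumption, and the cell size and shift used here are arbitrary example values.

```python
# Sketch of a shift-tolerant template code for one cell, assuming the first and
# second feature amounts are merged by OR so that one AND comparison also
# accepts a slightly displaced object; the combination rule is not specified
# in the patent.
def shift_tolerant_template_code(template_image, cell_rc, cell_size=8, shift=(0, 4)):
    r, c = cell_rc
    dr, dc = shift
    first = cell_binary_code(template_image[r:r + cell_size, c:c + cell_size])
    second = cell_binary_code(template_image[r + dr:r + dr + cell_size,
                                             c + dc:c + dc + cell_size])
    return first | second          # one code covering both positions
```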
 In addition, the imaging unit 12 is capable of high-speed imaging at rates exceeding 240 fps, and the image determination unit 30 detects the predetermined object within one frame of the object entering the captured image. Because the image processing unit 20 can detect the object at high speed, the object can be detected in real time without delay. Further, the image determination unit 30 has a plurality of registers, stores the calculated cell feature amounts in the respective registers, and performs the matching by shifting the stored feature amounts cell by cell as a shift register. Because the data moves (shifts) through the circuit of the image determination unit 30, the feature amounts do not have to be read out each time, and the matching can be performed at higher speed. Furthermore, because the feature amount calculation unit 21 has a plurality of circuits and calculates the feature amounts of a plurality of pixels in parallel, even faster feature amount calculation can be realized.
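 The shift-register style matching can be sketched in software as a sliding window that is updated as each new cell code arrives, reusing `matching_rate` from the earlier sketch. The `deque` window is only a stand-in for the hardware registers; in the actual circuit the codes would be shifted between physical registers and compared on every clock cycle.

```python
# Sketch of shift-register style matching: each new cell code is pushed into
# a fixed-length window of "registers" and the window is compared against the
# template on every step, so no feature amounts need to be re-read.
from collections import deque

def stream_matching(template_codes, cell_code_stream, top_n=3):
    width = len(template_codes)
    window = deque(maxlen=width)          # stands in for the shift registers
    rates = []
    for pos, code in enumerate(cell_code_stream):
        window.append(code)               # shift the new cell code in
        if len(window) == width:
            rates.append((matching_rate(template_codes, list(window)),
                          pos - width + 1))
    return sorted(rates, reverse=True)[:top_n]
```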
 The present invention is not limited to the embodiment described above in any way, and it goes without saying that it can be implemented in various modes as long as they belong to the technical scope of the present invention.
 For example, in the above-described embodiment, binary-coded feature amounts are used, but the invention is not limited to this, and feature amounts other than binary codes may be used.
 In the above-described embodiment, one template is created from three templates, but this is not a limitation; one template may be created from two templates, or from four or more templates. The template may be created as appropriate in view of the speed required to detect the object.
 In the above-described embodiment, the image determination unit 30 has a plurality of registers and performs the matching as a shift register, but this may be omitted. To speed up the matching process, however, it is preferable that the image determination unit 30 operate as a shift register.
 In the above-described embodiment, the feature amount calculation unit 21 performs parallel processing using a plurality of circuits, but this may be omitted. To speed up the matching process, however, it is preferable that the feature amount calculation unit 21 perform parallel processing using a plurality of circuits.
 The present invention can be used in the technical field of apparatuses that capture and play back moving images.
10 camera system, 11 video camera, 12 imaging unit, 13 imaging element, 14 control unit, 15 CPU, 16 timing control unit, 20 image processing unit, 21 feature amount calculation unit, 22 function calculation unit, 23 gradient intensity calculation unit, 24 gradient direction calculation unit, 25 histogram generation unit, 26 histogram RAM, 27 sorting unit, 28 binary code conversion unit, 30 image determination unit, 31 binary code register, 32 binary code RAM, 33 register, 34 binary code register, 35 binary code RAM, 36 matching determination unit, 36a matching check unit, 36b matching rate counting unit, 37 matching rate RAM, 38 sorting unit, 40 computer, 41 recording device, 50 captured image, 51 cell, 52 pixel.

Claims (7)

  1.  An image processing apparatus comprising:
      a calculation unit that acquires a signal of part of a captured image captured by an imaging unit and calculates a feature amount including a luminance gradient direction and gradient intensity of a local region;
      a determination unit that performs matching between a fixed feature amount, including a luminance gradient direction and gradient intensity obtained by imaging a predetermined object, and the calculated feature amount of the local region, and determines whether the predetermined object is detected in the captured image; and
      a control unit that starts recording the captured image when the determination unit determines that the predetermined object has entered the captured image.
  2.  The image processing apparatus according to claim 1, wherein the control unit stops recording the captured image when the determination unit determines that the predetermined object has left the captured image.
  3.  The image processing apparatus according to claim 1 or 2, wherein the calculation unit is an arithmetic circuit that creates a histogram for each of a plurality of gradient directions and generates a binary code, as the feature amount of the local region, using the directions with strong gradient intensity as representative values, and the determination unit is a determination circuit that performs the matching using a binary-coded fixed feature amount.
  4.  The image processing apparatus according to claim 3, wherein the determination unit performs the matching using a binary code including, as the fixed feature amount of the local region, at least a first feature amount of the local region and a second feature amount at a position shifted from the local region by a predetermined amount.
  5.  The image processing apparatus according to any one of claims 1 to 4, wherein the determination unit detects the predetermined object within one frame of the object entering the captured image.
  6.  The image processing apparatus according to any one of claims 1 to 5, wherein the determination unit has a plurality of registers, stores the calculated feature amounts of the local regions in the respective registers, and performs the matching by shifting the stored feature amounts, local region by local region, as a shift register.
  7.  The image processing apparatus according to any one of claims 1 to 6, wherein there are a plurality of the calculation units, and the feature amounts are calculated in parallel for a plurality of pixels.