WO2015177881A1 - 画像処理装置、及び位置決めシステム - Google Patents
画像処理装置、及び位置決めシステム Download PDFInfo
- Publication number
- WO2015177881A1 (PCT/JP2014/063401)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- sensor
- processing apparatus
- image processing
- positioning system
- Prior art date
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/50—Control of the SSIS exposure
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/685—Vibration or motion blur correction performed by mechanical compensation
- H04N23/687—Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
- H04N25/44—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/62—Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
Definitions
- the present invention relates to an image processing apparatus that connects an image sensor and recognizes an image acquired from the image sensor, and a positioning system.
- in image processing apparatuses, to speed up the image processing necessary for determining a specific object included in an image and for calculating physical quantities such as the position and size of that object, a method is used in which only a necessary partial region of the entire image is subjected to image processing.
- for example, the technique described in Patent Document 1 is disclosed as such a conventional technique.
- in Patent Document 1, a face part of a subject is detected from a plurality of image data, the change amount of the face part and its movement amount in the horizontal / vertical directions are detected, a correction amount is calculated from the detected change amount and movement amount, and the position and size of the facial organs (mouth, nose, etc.) on the image data are corrected based on the correction amount.
- Patent Document 1 does not assume setting of image transfer performance to the image sensor, and it is difficult to speed up image transfer from the image sensor.
- the present invention addresses at least one of the above issues, performing image recognition in consideration of high-speed image transfer and the required performance of image transfer.
- the present invention has at least one of the following aspects, for example.
- the present invention obtains acquisition conditions for the image to be acquired (for example, at least one of its dimensions and the frame rate) in consideration of the required performance.
- the present invention predicts the trajectory of the recognition target from the obtained image, and obtains an image acquisition condition in consideration of the prediction result and the required performance.
- the present invention changes the position, size, and number of gradations of the image transferred from the image sensor by setting the image sensor itself, thereby speeding up the image transfer.
- the present invention provides an image processing apparatus capable of easily changing the position, size, and number of gradations of an image transferred from an image sensor so as to satisfy the required performance of image transfer.
- the present invention has at least one of the following effects. (1) Since the position, size, and number of gradations of the transfer image from the image sensor can be changed and the amount of transfer data from the image sensor can be reduced, high-speed image transfer can be realized. (2) Since the image sensor can be automatically set while satisfying the required performance of image transfer, the image transfer speed can be controlled easily and flexibly.
- FIG. 1 is a diagram showing an application example of the image processing apparatus according to the present embodiment.
- FIG. 2 is a configuration diagram of the image processing apparatus according to the present embodiment.
- FIG. 3 is a flowchart showing the processing operations of the image processing apparatus according to the present embodiment.
- FIG. 4 is a diagram showing maximum-size images continuously transferred to the image processing apparatus according to the present embodiment.
- FIG. 5 is a diagram showing the recognition processing of the image processing apparatus according to the present embodiment.
- FIG. 6 is a diagram illustrating an example of images continuously transferred to the image processing apparatus according to the present embodiment.
- FIG. 7 is a diagram showing a setting screen of the image processing apparatus according to the present embodiment.
- Further diagrams illustrate Equations 1 to 4 and Equations 5 to 10.
- the X-axis and Y-axis directions are parallel to the horizontal direction, and together they form an orthogonal coordinate system on a plane along the horizontal direction. The relationship between the X axis and the Y axis may be interchanged.
- the Z-axis direction is the vertical direction.
- FIG. 1 is a diagram illustrating an application example of the image processing apparatus 100 according to the present embodiment to the positioning device 110.
- FIG. 1A is a top view of the positioning device 110, and FIG. 1B is a cross-sectional view taken along line A-A shown in FIG. 1A.
- the image processing apparatus 100 is connected to an image sensor 101 and a display input device 102.
- the positioning device 110 includes an image sensor 101, a positioning head 111, a beam 112, a pedestal 113, and a base 114.
- the base 114 is equipped with a recognition target.
- the positioning head 111 carries the image sensor 101 and moves in the X-axis direction.
- the beam 112 carries the positioning head 111 and moves in the Y-axis direction.
- the pedestal 113 supports the beam 112.
- the positioning device 110 drives the positioning head 111 in the XY directions and performs a positioning operation on the recognition target 120.
- the recognition target 120 imaged by the image sensor 101 moves in a direction opposite to the driving direction of the positioning operation of the positioning head 111 in a plurality of consecutive images having different imaging times.
- the recognition target 120 imaged by the image sensor 101 moves at a speed equivalent to the driving speed of the positioning head 111 in a plurality of consecutive images having different imaging times.
- FIG. 2 is a configuration diagram of the image processing apparatus 100 according to the present embodiment.
- the image processing apparatus 100 includes an image acquisition unit 200, an image recognition unit 201, an image sensor setting unit 203, an image sensor setting information calculation unit 202, a calculation method designation unit 204, and an input / output control unit 205.
- the image acquisition unit 200 acquires an image captured by the image sensor 101 and transferred from the image sensor 101.
- the image recognition unit 201 is connected to the image acquisition unit 200 and recognizes the recognition target 120, using a predetermined calculation method, from a plurality of consecutive images with different imaging times acquired by the image acquisition unit 200.
- the image sensor setting information calculation unit 202 is connected to the image recognition unit 201 and calculates setting information for the image sensor 101, based on the recognition result of the image recognition unit 201 and the calculation method designated in advance, so as to satisfy the required frame rate performance designated in advance.
- the image sensor setting unit 203 transfers the setting information calculated by the image sensor setting information calculation unit 202 to the image sensor 101 to perform setting.
- the calculation method designating unit 204 designates setting information such as required frame rate performance and a calculation method to the image sensor setting information calculation unit 202.
- the input / output control unit 205 inputs a calculation method or a calculation process execution command to the image recognition unit 201 and the calculation method designation unit 204, and outputs the calculation method set in the image recognition unit 201 and the calculation method designation unit 204 or the calculation result.
- FIG. 3 is a flowchart showing the processing operation of the image processing apparatus 100 according to the present embodiment.
- the image processing apparatus 100 designates a computation method to the computation method designation unit 204 via the display input device 102 connected to the input / output control unit 205 (S300).
- the calculation method specified in the calculation method specifying unit 204 includes the following (1) to (7).
- in step S301, the image processing apparatus 100 determines whether to start image processing.
- for example, when the start of image processing is commanded to the calculation method designation unit 204 via the display input device 102 connected to the input / output control unit 205, the image processing apparatus 100 starts image processing (S301 → Yes). In the case of No in S301, the image processing apparatus 100 waits for an image processing start command.
- a predetermined initial value is set in the image sensor 101 based on the calculation application condition information set in the calculation method designating unit 204 (S302).
- the image transferred from the image sensor 101 is acquired by the image acquisition unit 200 (S303).
- FIG. 4 is a diagram showing a maximum size image continuously transferred to the image processing apparatus 100 according to the present embodiment. Note that the coordinate system of the image transferred from the image sensor 101 is the same as the coordinate system shown in FIG.
- the whole area images 400-1 to 400-4, which are the maximum-size images transferred from the image sensor 101, capture the recognition target 120 and are transferred to the image processing apparatus 100 at a specific frame rate F_max [fps].
- the imaging time of the whole area image 400-1 is t0 [s], that of the whole area image 400-2 is t0 + Tc_max [s], that of the whole area image 400-3 is t0 + 2·Tc_max [s], and that of the whole area image 400-4 is t0 + 3·Tc_max [s], where Tc_max [s] is the capture interval corresponding to F_max.
- the recognition target 120 captured in the entire area images 400-1 to 400-4 moves in a direction opposite to the driving direction of the positioning operation of the positioning head 111.
- the recognition target 120 moves from the lower left of the whole area image 400-1 to the center of the whole area image 400-4 as the imaging time elapses, and then stops.
- the image processing apparatus 100 transfers the image acquired by the image acquisition unit 200 to the image recognition unit 201, and executes image recognition processing in the image recognition unit 201 (S304).
- the frame rate of the image transferred from the image sensor 101 is f [fps]
- An image obtained by superimposing the captured image at time t [s] and the captured image at the previous imaging time t-tc [s] is referred to as a superimposed image 500.
- the captured image at time t-tc [s] is the first image
- the captured image at time t [s] is the second image
- the captured image obtained at a time after time t [s] can be expressed as the third image.
- for ease of explanation, "-1" is appended to the reference numeral of an object recognized in the captured image at time t−tc (for example, recognition target 120-1), and "-2" is appended to the reference numeral of an object recognized in the captured image at time t (for example, recognition target 120-2).
- the image recognition unit 201 recognizes the presence or absence of the recognition targets 120-1 and 120-2.
- when the recognition targets 120-1 and 120-2 exist in the image, the following (1) to (3) are recognized: (1) the center coordinates 510-1 and 510-2, which are the positions of the centers of the recognition targets 120-1 and 120-2 in the image; (2) the X-axis sizes 511-1 and 511-2, which are the sizes of the recognition targets 120-1 and 120-2 in the X-axis direction; and (3) the Y-axis sizes 512-1 and 512-2, which are the sizes in the Y-axis direction.
- the presence / absence of the recognition targets 120-1 and 120-2 and the center coordinates 510-1 and 510-2 are recognized by a general image processing method such as pattern matching.
- the image recognition unit 201 uses the luminance values of the recognition targets 120-1 and 120-2 of the superimposed image 500 and the luminance values of the background image that is a part other than the recognition targets 120-1 and 120-2 of the superimposed image 500. Then, the minimum number of gradations g min that is the number of gradations of luminance in the captured image of the image sensor 101 that is necessary for the recognition process is calculated.
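As a concrete illustration of the idea above, the minimum gradation number could be tied to the luminance contrast between target and background. The patent does not disclose the actual formula for g_min, so the function below is a hedged sketch: the heuristic (enough gray levels that the normalized luminance gap spans at least one quantization step, plus a margin) and the names are illustrative assumptions only.

```python
import math

# Hedged sketch: the patent does not disclose the formula for the
# minimum gradation number g_min, so this heuristic is an assumption.
# It requires enough gray levels that the normalized luminance gap
# between the recognition target and the background spans at least
# one quantization step, plus a small safety margin.
def min_gradations(target_lum, background_lum, margin_levels=1):
    contrast = abs(target_lum - background_lum)  # luminances in [0, 1]
    if contrast == 0:
        raise ValueError("target and background are indistinguishable")
    return math.ceil(1.0 / contrast) + margin_levels
```

Under this heuristic, a strong contrast requires few levels: min_gradations(0.8, 0.3) evaluates to 3.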
- the image recognition unit 201 transfers the center coordinates 510-1 and 510-2, the X-axis sizes 511-1 and 511-2, the Y-axis sizes 512-1 and 512-2, and the minimum gradation number g_min obtained by the above processing to the image sensor setting information calculation unit 202, and the process ends.
- when the recognition target 120 is detected as a result of the image recognition by the image recognition unit 201 (S305 → Yes), the image processing apparatus 100 selects the calculation condition information that matches the calculation application condition information designated in the calculation method designation unit in S300, and calculates setting values for the image sensor 101 in the image sensor setting information calculation unit 202 based on that calculation condition information and the result of the image recognition calculated in S304 (S306). When the result of S305 is No, the image processing apparatus 100 acquires the image at the next time transferred from the image sensor 101 with the image acquisition unit 200 without changing the setting values of the image sensor 101 (S303), and repeats the process.
- based on the center coordinates 510-1 and 510-2 transferred from the image recognition unit 201, the image sensor setting information calculation unit 202 calculates an X-axis movement amount 520, which is the movement amount in the X-axis direction from the recognition target 120-1 to the recognition target 120-2, and a Y-axis movement amount 521, which is the movement amount in the Y-axis direction from the recognition target 120-1 to the recognition target 120-2.
- the center coordinate 510-1 is (x0, y0)
- the center coordinate 510-2 is (x, y)
- the speed in the X-axis direction and the speed in the Y-axis direction of the recognition target 120 may be obtained by other general image processing methods such as an optical flow.
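Dividing the movement amounts by the frame period gives the per-axis speeds. Equation 1 itself is not reproduced in this text, so the following Python sketch shows one plausible reading; the function and variable names are illustrative.

```python
# Per-axis speed [pixel/s] of the recognition target: center (x0, y0)
# in the capture at time t - tc and (x, y) in the capture at time t,
# where tc = 1/f [s] is the frame period. One plausible reading of
# Equation 1, which is not reproduced in this text.
def estimate_velocity(x0, y0, x, y, tc):
    vx = (x - x0) / tc  # speed along the X axis (movement amount 520 per tc)
    vy = (y - y0) / tc  # speed along the Y axis (movement amount 521 per tc)
    return vx, vy
```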
- the X axis size 511-1 is lx0
- the X axis size 511-2 is lx
- the Y axis size 512-1 is ly0
- the speed at which the size changes in the X-axis direction is defined as the X-axis size change rate v_zx [pixel/s], and the speed at which the size changes in the Y-axis direction as the Y-axis size change rate v_zy [pixel/s].
- the image sensor setting information calculation unit 202 calculates the X-axis size change rate v_zx and the Y-axis size change rate v_zy using Equation 2.
- the X-axis size change degree and the Y-axis size change degree may be obtained by other general image processing methods such as stereo vision.
- the image sensor setting information calculation unit 202 calculates a predicted recognition result for the recognition target 120-3 in the captured image at the next time of the image sensor 101, using the values calculated from the recognition targets 120-1 and 120-2 as described above.
- the image sensor setting information calculation unit 202 calculates the following (1) to (3) as predicted values of the recognition result of the captured image at time t + tc ′.
- the image sensor setting information calculation unit 202 calculates the center coordinates 510-3 using Equation 3, where the predicted center coordinates 510-3 of the recognition target 120-3 are (x′, y′).
- the image sensor setting information calculation unit 202 calculates the X-axis size 511-3 and the Y-axis size 512-3 using Equation 4, where the predicted X-axis size 511-3 of the recognition target 120-3 is l_x′ and the predicted Y-axis size 512-3 is l_y′.
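Putting the predicted quantities together, the prediction step can be sketched as follows. This is written in the spirit of Equations 3 and 4, whose exact forms are not reproduced in this text; the constant-rate (linear) model and all names are assumptions of the sketch.

```python
# Constant-velocity prediction of the recognition target 120-3 for the
# capture at time t + tc_next: the center moves with the estimated
# speeds (vx, vy) and the sizes change with the size change rates
# (vzx, vzy). A sketch in the spirit of Equations 3 and 4; the linear
# model is an assumption.
def predict_next(x, y, vx, vy, lx, ly, vzx, vzy, tc_next):
    x_pred = x + vx * tc_next     # predicted center coordinate x'
    y_pred = y + vy * tc_next     # predicted center coordinate y'
    lx_pred = lx + vzx * tc_next  # predicted X-axis size 511-3 (l_x')
    ly_pred = ly + vzy * tc_next  # predicted Y-axis size 512-3 (l_y')
    return x_pred, y_pred, lx_pred, ly_pred
```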
- the image sensor setting information calculation unit 202 performs the following (a) to (c) based on the calculated center coordinates 510-3, X-axis size 511-3, and Y-axis size 512-3 to obtain image sensor setting information (items (1) to (5), which can be expressed as first setting information) satisfying the calculation condition information (which can be expressed as a predetermined condition or required values).
- (a) Required value of the frame rate f_r [fps]
- the X-axis transfer size 531 and the Y-axis transfer size 532 can be expressed as the dimensions of the third image.
- the transfer coordinates 533 can also be expressed as an example of information defining the position of the third image.
- the X-axis transfer size 531 is denoted lp_x′, and the Y-axis transfer size 532 is denoted lp_y′.
- the surplus size ratio in the X-axis direction with respect to the X-axis size 511-3 is defined as the X-axis surplus size ratio α [%], and the surplus size ratio in the Y-axis direction with respect to the Y-axis size 512-3 as the Y-axis surplus size ratio β [%].
- lp_x′ can be expressed as a dimension in a first direction, and lp_y′ as a second dimension in a direction intersecting the first direction.
- α and β can be expressed as predetermined coefficients.
- the image sensor setting information calculation unit 202 calculates the X-axis transfer size 531 and the Y-axis transfer size 532 using Equation 5 respectively.
- the X-axis surplus size ratio α and the Y-axis surplus size ratio β are values satisfying Equation 6.
- the minimum coordinate value that can be set for the image transferred from the image sensor 101 is (x min , y min ), and the maximum coordinate value that can be set for the image transferred from the image sensor 101 is (x max , y max ).
- the transfer coordinate 533 is (xp, yp).
- the image sensor setting information calculation unit 202 calculates the transfer coordinates 533 using Equation 7.
- the variables a and b in Equation 7 are arbitrary values satisfying (l_x′ / 2) ≤ a ≤ lp_x′ − (l_x′ / 2) and (l_y′ / 2) ≤ b ≤ lp_y′ − (l_y′ / 2), respectively.
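A minimal sketch of how the transfer window could be derived from the predicted size and center: it assumes Equation 5 applies the surplus ratios multiplicatively, picks the particular choice a = lp_x′/2, b = lp_y′/2 (a window centered on the predicted center, which satisfies the constraints above), and clamps the origin to the settable coordinate range. The exact equations are not reproduced in this text, so all of this is illustrative.

```python
# Transfer window for the image sensor: the predicted sizes are padded
# by the surplus ratios alpha, beta [%] (a multiplicative form of
# Equation 5 is assumed), and the transfer origin is centered on the
# predicted center (x_c, y_c) -- i.e. the choice a = lp_x'/2,
# b = lp_y'/2 in Equation 7 -- then clamped to the settable
# coordinate range of the sensor.
def transfer_window(x_c, y_c, lx, ly, alpha, beta,
                    x_min, y_min, x_max, y_max):
    lpx = lx * (1.0 + alpha / 100.0)  # X-axis transfer size 531
    lpy = ly * (1.0 + beta / 100.0)   # Y-axis transfer size 532
    xp = min(max(x_c - lpx / 2.0, x_min), x_max - lpx)  # transfer coordinates 533
    yp = min(max(y_c - lpy / 2.0, y_min), y_max - lpy)
    return lpx, lpy, xp, yp
```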
- let the exposure time of the image sensor 101 be Te [s], the transfer time of the header portion in the image transfer of the image sensor 101 be Th [s], the transfer time added per transferred line of the image sensor 101 be Tl [s], the number of bits of the gradation value of the image sensor 101 be d, and the transfer time per bit be Td [s].
- these Te, Th, Tl, d, and Td can be expressed as second setting information.
- the image sensor setting information calculation unit 202 calculates the frame rate f ′ at this time using Equation 8.
- the transfer gradation number g is a value satisfying Equation 9.
- the image sensor setting information calculation unit 202 can calculate the image sensor setting information satisfying the calculation condition information by evaluating Equations 3 to 9 so that Equation 10 is satisfied.
- to do so, a calculation procedure is required in which the image sensor setting information calculation unit 202 adjusts the values of the X-axis surplus size ratio α, the Y-axis surplus size ratio β, and the transfer gradation number g so as to satisfy the condition shown in Equation 9.
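The adjustment procedure can be sketched as a simple loop. The transfer-time model (exposure + header + per-line overhead + pixel payload of d bits at Td seconds per bit) is an assumption in the spirit of Equation 8 — the patent's exact Equations 8 to 10 are not reproduced in this text — and, for brevity, only the bit depth d is reduced here rather than α, β, and g together.

```python
# Achievable frame period under an assumed transfer-time model in the
# spirit of Equation 8: exposure Te + header Th + per-line overhead Tl,
# plus a pixel payload of d bits per pixel at Td seconds per bit.
def frame_period(te, th, tl, td, lines, cols, d):
    return te + th + lines * tl + lines * cols * d * td

# Reduce the gradation bit depth d until the required frame rate
# fr [fps] is met -- a simplified stand-in for the adjustment of the
# surplus ratios and the transfer gradation number described above.
def adjust_bit_depth(fr, te, th, tl, td, lines, cols, d, d_min=1):
    while d > d_min and 1.0 / frame_period(te, th, tl, td, lines, cols, d) < fr:
        d -= 1
    return d
```

For instance, with Te = 1 ms, Th = 0.1 ms, Tl = 1 µs, Td = 1 ns and a 200 × 200 window, a 700 fps requirement forces the depth down from 10 bits to 3 bits under this model.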
- the image sensor setting information calculation unit 202 transfers the calculated image sensor setting information to the image sensor setting unit 203 and completes the process of S306.
- the image sensor setting unit 203 of the image processing apparatus 100 sets the image sensor setting information transferred from the image sensor setting information calculation unit 202 in the image sensor 101 (S307).
- if processing is to end, the image processing apparatus 100 ends the process; otherwise, the image acquisition unit 200 acquires the image at the next time transferred from the image sensor 101 (S303), and the process is repeated.
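The control flow of steps S302 to S307 described above can be sketched as a loop. The three collaborator objects below are illustrative stand-ins, not interfaces defined by the patent.

```python
# Control-flow sketch of the flowchart steps S302-S307. The three
# collaborators are illustrative stand-ins: `sensor` exposes
# apply()/acquire(), `recognizer` exposes recognize() returning None
# when no target is found, and `calc` plays the role of the image
# sensor setting information calculation unit 202.
def run_pipeline(sensor, recognizer, calc, max_frames):
    sensor.apply(calc.initial_settings())         # S302: set initial values
    for _ in range(max_frames):
        image = sensor.acquire()                  # S303: acquire image
        result = recognizer.recognize(image)      # S304: recognition
        if result is not None:                    # S305: target detected?
            settings = calc.settings_for(result)  # S306: compute settings
            sensor.apply(settings)                # S307: set the sensor
        # S305 -> No: keep the current settings and acquire the next image
```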
- FIG. 6 is a diagram illustrating an example of images continuously transferred to the image processing apparatus 100 according to the present embodiment.
- the first partial acquired images 600-1 to 600-7 are images obtained by transferring only the partial areas of the whole area images 400-1 to 400-4 from the image sensor 101.
- in the example of FIG. 6, the frame rate of the first partial acquired images 600-1 to 600-7 is about three times that of the whole area images 400-1 to 400-4.
- the second partial acquired images 610-1 to 610-7 are images obtained by transferring only the partial areas of the entire area images 400-1 to 400-4 from the image sensor 101.
- the frame rate of the second partial acquired images 610-1 to 610-7 is about six times that of the whole area images 400-1 to 400-4.
- the second partial acquired images 610-1 to 610-7 have a smaller transfer image size than the first partial acquired images 600-1 to 600-7.
- in order to search for the recognition target 120 over a wide area, it is desirable to apply the whole area image 400-1. As the distance between the positioning head 111 and the recognition target 120 decreases and the positioning head 111 decelerates, it is desirable to switch the setting of the image sensor 101 so that the transferred images change to the first partial acquired images 600-1 to 600-7 and then to the second partial acquired images 610-1 to 610-7, improving the frame rate in order to recognize the vibration error of the positioning head 111.
- when the positioning device 110 of the present embodiment is a component mounting device that mounts electronic components having a short-side size of several hundred μm on a printed wiring board, the whole area images 400-1 to 400-4 preferably have an image size of about 10 to 20 mm in both the X-axis and Y-axis directions at a frame rate of about 100 to 200 fps, the first partial acquired images 600-1 to 600-7 a frame rate of about 300 to 600 fps, and the second partial acquired images 610-1 to 610-7 an image size of about 1 to 3 mm at a frame rate of about 1000 fps.
- FIG. 7 is a diagram showing a setting screen 700 of the image processing apparatus 100 according to the present embodiment.
- the setting screen 700 includes a parameter setting unit 701, a parameter application condition setting unit 702, an image processing result display unit 703, and a processing content display unit 704.
- the parameter setting unit 701 is an input interface for setting calculation condition information.
- the parameter application condition setting unit 702 is an input interface for setting calculation application condition information for a plurality of pieces of calculation condition information.
- the image processing result display unit 703 is an output interface for showing the results of processing of the image recognition unit 201 and the image sensor setting information calculation unit 202 of the image processing apparatus 100.
- the image processing result display unit 703 displays the latest image acquired from the image sensor 101, the recognized values of the recognition target 120, and the time history of the images transferred from the image sensor 101.
- the processing content display unit 704 is an output interface for indicating the progress of internal processing of the image processing apparatus 100.
- the user of the image processing apparatus 100 first sets the calculation condition information in the parameter setting unit 701 and the calculation application condition information in the parameter application condition setting unit 702, then confirms via the image processing result display unit 703 and the processing content display unit 704 whether the desired recognition process can be executed, and adjusts the calculation condition information and the calculation application condition information based on the confirmed contents.
- FIG. 8 is a diagram illustrating a second example of the image processing apparatus 100 according to the present embodiment.
- the servo control device 800 includes an actuator control unit 801 and an operation information transfer unit 802.
- the servo control device 800 connects the actuator 810 and a sensor 820 for feeding back the position, speed, acceleration and the like of the actuator 810.
- the actuator control unit 801 controls the actuator 810 based on feedback information from the sensor 820.
- the actuator control unit 801 acquires the current position and speed of the movable unit using the actuator 810 based on feedback information from the sensor 820.
- the actuator control unit 801 calculates the position, speed, etc. of the movable unit driven by the actuator 810 that are predicted at the next imaging time of the image sensor 101, based on the command waveforms of position and speed for driving the actuator 810 and the generated trajectory.
- the actuator control unit 801 transfers to the operation information transfer unit 802 the calculated information on the current position and speed of the movable unit driven by the actuator 810, and the information on the position and speed of the movable unit driven by the actuator 810 predicted at the next imaging time of the image sensor 101.
- the operation information transfer unit 802 is connected to the image sensor setting information calculation unit 202 of the image processing apparatus 100.
- the image sensor setting information calculation unit 202 of the image processing apparatus 100 of the present embodiment acquires at least one of the following (1) to (7) from the operation information transfer unit 802 of the servo control device 800 and executes its processing: (1) the speed in the X-axis direction of the recognition target 120-2 in the current captured image, (2) the speed in the Y-axis direction, (3) the X-axis size change rate, (4) the Y-axis size change rate, (5) the center coordinates 510-3 predicted for the captured image at the next time, (6) the X-axis size 511-3, and (7) the Y-axis size 512-3.
- the image sensor setting information calculation unit 202 acquires from the image recognition unit 201 the information that is not acquired from the operation information transfer unit 802 among all information necessary for its own processing, as in the first embodiment. .
- Such a configuration of the image processing apparatus 100 can reduce the calculation load of the image recognition unit 201 and the image sensor setting information calculation unit 202, thereby enabling higher-speed image processing.
- the image processing apparatus 100 is applied to the positioning apparatus 110, and the actuator 810 and the sensor 820 are applied to control the positioning head 111, which is a movable portion of the positioning apparatus 110, and the beam 112.
- the servo control device 800 is applied to the control of the actuator 810 and the sensor 820, a position and speed more accurate than the position and speed calculated by the recognition processing of the image processing apparatus 100 can be obtained.
- the present invention is not limited to the embodiments.
- the contents disclosed in this embodiment can be applied to automobiles and railways. That is, the positioning system is a broad expression that can include component mounting apparatuses, automobiles, railways, and other systems.
- DESCRIPTION OF SYMBOLS 100 ... Image processing apparatus, 101 ... Image sensor, 102 ... Display input device, 110 ... Positioning device, 111 ... Positioning head, 112 ... Beam, 113 ... Pedestal, 114 ... Base, 120, 120-1, 120-2, 120-3 ... Recognition target, 200 ... Image acquisition unit, 201 ... Image recognition unit, 202 ... Image sensor setting information calculation unit, 203 ... Image sensor setting unit, 204 ... Calculation method designation unit, 205 ... Input / output control unit, 400, 400-1 to 400-4 ... Whole area image, 500 ... Superimposed image, 510-1, 510-2, 510-3 ... Center coordinates, 511-1, 511-2, 511-3 ... X-axis size, 512-1, 512-2, 512-3 ... Y-axis size, 520 ... X-axis movement amount, 521 ... Y-axis movement amount, 530 ... Image transfer size, 531 ... X-axis transfer size, 532 ... Y-axis transfer size, 533 ... Transfer coordinates, 600-1 to 600-7 ... First partial acquired image, 610-1 to 610-7 ... Second partial acquired image, 700 ... Setting screen, 701 ... Parameter setting unit, 702 ... Parameter application condition setting unit, 703 ... Image processing result display unit, 704 ... Processing content display unit, 800 ... Servo control device, 801 ... Actuator control unit, 802 ... Operation information transfer unit, 810 ... Actuator, 820 ... Sensor
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Geometry (AREA)
- Studio Devices (AREA)
- Image Input (AREA)
- Image Processing (AREA)
Abstract
Description
(1) The present invention obtains acquisition conditions for an image to be acquired (for example, at least one of the image dimensions and the frame rate) in consideration of the required performance.
(2) The present invention predicts the trajectory of a recognition target from the acquired images, and obtains the image acquisition conditions in consideration of the prediction result and the required performance.
(3) The present invention speeds up image transfer by changing the position, size, and number of gradations of the image transferred from the image sensor through settings applied to the image sensor itself.
(4) The present invention provides an image processing apparatus that can easily change the position, size, and number of gradations of the image transferred from the image sensor so as to satisfy the required image transfer performance.
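The acquisition-condition logic described in points (1) to (4) above trades transferred image size and bit depth against frame rate. A minimal sketch of such a transfer-time model follows; this is an illustrative reconstruction, not code from the application, and the function and parameter names are assumptions:

```python
def estimate_frame_rate(exposure_s, header_s, per_line_s, per_bit_s,
                        x_size, y_size, bits_per_pixel):
    """Estimate the achievable frame rate for a partial image transfer.

    Models one frame period as the exposure time, plus the header
    transfer time, plus a per-line cost that grows with the number of
    bits in each transferred line (x_size * bits_per_pixel * per_bit_s).
    All times are in seconds; the result is in frames per second.
    """
    line_s = per_line_s + x_size * bits_per_pixel * per_bit_s
    frame_s = exposure_s + header_s + y_size * line_s
    return 1.0 / frame_s
```

Under this model, shrinking the transferred region or reducing the number of gradations (bits per pixel) shortens the frame period and therefore raises the achievable frame rate, which is the effect points (3) and (4) rely on.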
The positioning apparatus 110 includes the image sensor 101, the positioning head 111, the beam 112, the mount 113, and the base 114.
Next, in S301, the image processing apparatus 100 determines whether to start image processing. For example, when the start of image processing is commanded to the calculation method designation unit 204 via the display input device 102 connected to the input/output control unit 205, the image processing apparatus 100 starts image processing (S301: Yes). If the determination in S301 is No, the image processing apparatus 100 waits for an image processing start command.
Based on the center coordinates 510-1 and 510-2 transferred from the image recognition unit 201, the image sensor setting information calculation unit 202 calculates the X-axis movement amount 520, which is the amount of movement in the X-axis direction from the recognition target 120-1 to the recognition target 120-2, and the Y-axis movement amount 521, which is the amount of movement in the Y-axis direction from the recognition target 120-1 to the recognition target 120-2.
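The two-frame extrapolation described above can be sketched in a few lines. This is an illustrative reconstruction under a constant-velocity assumption; the function name and tuple layout are not from the application:

```python
def predict_next_center(c1, c2):
    """Predict the next center coordinates of the recognition target.

    c1, c2: (x, y) center coordinates observed in two consecutive
    frames (e.g. 510-1 and 510-2). Assumes the target moves at a
    roughly constant velocity between frames.
    """
    dx = c2[0] - c1[0]  # X-axis movement amount (520)
    dy = c2[1] - c1[1]  # Y-axis movement amount (521)
    return (c2[0] + dx, c2[1] + dy)
```

The predicted center, together with the predicted target size, would determine where and how large a partial image (transfer coordinates 533, image transfer size 530) to request from the sensor for the next frame.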
Claims (28)
- An image processing apparatus comprising: a sensor; and a processing unit, wherein the sensor obtains a first image including a recognition target at a first time, obtains a second image including the recognition target at a second time later than the first time, and obtains a third image including the recognition target at a third time later than the second time; the processing unit determines, from the first image and the second image, first setting information of the sensor for obtaining the third image so as to satisfy a predetermined condition; and the first setting information includes dimensions of the third image and a frame rate for obtaining the third image.
- The image processing apparatus according to claim 1, wherein the processing unit obtains the dimensions of the third image using a predicted dimension value of the recognition target in the third image and a predetermined coefficient.
- The image processing apparatus according to claim 2, wherein the dimensions of the third image include a dimension in a first direction and a second dimension in a direction intersecting the first direction, and the processing unit obtains the frame rate using the second dimension and second setting information of the sensor.
- The image processing apparatus according to claim 3, wherein the second setting information includes an exposure time of the sensor, a transfer time of a header portion of the sensor, a transfer time that increases per line of the sensor, the number of bits of a gradation value of the sensor, and a transfer time per bit of the sensor.
- The image processing apparatus according to claim 4, wherein the predetermined condition includes a required value of the frame rate, and the frame rate is equal to or less than the required value.
- The image processing apparatus according to claim 5, wherein the predetermined condition includes a lower limit value of the predetermined coefficient.
- The image processing apparatus according to claim 6, wherein the first setting information includes information defining a position of the third image.
- The image processing apparatus according to claim 7, wherein the first setting information includes the number of gradations of the third image.
- The image processing apparatus according to claim 1, wherein the dimensions of the third image include a dimension in a first direction and a second dimension in a direction intersecting the first direction, and the processing unit obtains the frame rate using the second dimension and second setting information of the sensor.
- The image processing apparatus according to claim 9, wherein the second setting information includes an exposure time of the sensor, a transfer time of a header portion of the sensor, a transfer time that increases per line of the sensor, the number of bits of a gradation value of the sensor, and a transfer time per bit of the sensor.
- The image processing apparatus according to claim 1, wherein the predetermined condition includes a required value of the frame rate, and the frame rate is equal to or less than the required value.
- The image processing apparatus according to claim 1, wherein the predetermined condition includes a lower limit value of a predetermined coefficient for obtaining the third image.
- The image processing apparatus according to claim 1, wherein the first setting information includes information defining a position of the third image.
- The image processing apparatus according to claim 1, wherein the first setting information includes the number of gradations of the third image.
- A positioning system comprising: a sensor; a moving unit that moves the sensor; and a processing unit, wherein the sensor obtains a first image including a recognition target at a first time, obtains a second image including the recognition target at a second time later than the first time, and obtains a third image including the recognition target at a third time later than the second time; the processing unit determines, from the first image and the second image, first setting information of the sensor for obtaining the third image so as to satisfy a predetermined condition; and the first setting information includes dimensions of the third image and a frame rate for obtaining the third image.
- The positioning system according to claim 15, wherein the processing unit obtains the dimensions of the third image using a predicted dimension value of the recognition target in the third image and a predetermined coefficient.
- The positioning system according to claim 16, wherein the dimensions of the third image include a dimension in a first direction and a second dimension in a direction intersecting the first direction, and the processing unit obtains the frame rate using the second dimension and second setting information of the sensor.
- The positioning system according to claim 17, wherein the second setting information includes an exposure time of the sensor, a transfer time of a header portion of the sensor, a transfer time that increases per line of the sensor, the number of bits of a gradation value of the sensor, and a transfer time per bit of the sensor.
- The positioning system according to claim 18, wherein the predetermined condition includes a required value of the frame rate, and the frame rate is equal to or less than the required value.
- The positioning system according to claim 19, wherein the predetermined condition includes a lower limit value of the predetermined coefficient.
- The positioning system according to claim 20, wherein the first setting information includes information defining a position of the third image.
- The positioning system according to claim 21, wherein the first setting information includes the number of gradations of the third image.
- The positioning system according to claim 15, wherein the dimensions of the third image include a dimension in a first direction and a second dimension in a direction intersecting the first direction, and the processing unit obtains the frame rate using the second dimension and second setting information of the sensor.
- The positioning system according to claim 23, wherein the second setting information includes an exposure time of the sensor, a transfer time of a header portion of the sensor, a transfer time that increases per line of the sensor, the number of bits of a gradation value of the sensor, and a transfer time per bit of the sensor.
- The positioning system according to claim 15, wherein the predetermined condition includes a required value of the frame rate, and the frame rate is equal to or less than the required value.
- The positioning system according to claim 15, wherein the predetermined condition includes a lower limit value of a predetermined coefficient for obtaining the third image.
- The positioning system according to claim 15, wherein the first setting information includes information defining a position of the third image.
- The positioning system according to claim 15, wherein the first setting information includes the number of gradations of the third image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/063401 WO2015177881A1 (ja) | 2014-05-21 | 2014-05-21 | 画像処理装置、及び位置決めシステム |
JP2016520855A JP6258480B2 (ja) | 2014-05-21 | 2014-05-21 | 画像処理装置、及び位置決めシステム |
US15/312,029 US20170094200A1 (en) | 2014-05-21 | 2014-05-21 | Image processing apparatus and positioning system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/063401 WO2015177881A1 (ja) | 2014-05-21 | 2014-05-21 | 画像処理装置、及び位置決めシステム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015177881A1 true WO2015177881A1 (ja) | 2015-11-26 |
Family
ID=54553577
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/063401 WO2015177881A1 (ja) | 2014-05-21 | 2014-05-21 | 画像処理装置、及び位置決めシステム |
Country Status (3)
Country | Link |
---|---|
US (1) | US20170094200A1 (ja) |
JP (1) | JP6258480B2 (ja) |
WO (1) | WO2015177881A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6765820B2 (ja) * | 2016-02-10 | 2020-10-07 | オリンパス株式会社 | Camera |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002010243A * | 2000-06-16 | 2002-01-11 | Mitsubishi Heavy Ind Ltd | Moving image processing camera |
JP2010263581A * | 2009-05-11 | 2010-11-18 | Canon Inc | Object recognition apparatus and object recognition method |
JP2012048476A * | 2010-08-26 | 2012-03-08 | Canon Inc | Image processing apparatus |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3046100B2 (ja) * | 1991-07-22 | 2000-05-29 | 株式会社フォトロン | Image recording apparatus |
US7545434B2 * | 2002-02-04 | 2009-06-09 | Hewlett-Packard Development Company, L.P. | Video camera with variable image capture rate and related methodology |
US20050104958A1 * | 2003-11-13 | 2005-05-19 | Geoffrey Egnal | Active camera video-based surveillance systems and methods |
US7471767B2 * | 2006-05-03 | 2008-12-30 | Siemens Medical Solutions Usa, Inc. | Systems and methods for determining image acquisition parameters |
JP5241335B2 (ja) * | 2008-06-10 | 2013-07-17 | キヤノン株式会社 | X-ray image diagnostic apparatus and image processing method |
JP5645505B2 (ja) * | 2010-06-29 | 2014-12-24 | キヤノン株式会社 | Imaging apparatus and control method thereof |
CN110572586A (zh) * | 2012-05-02 | 2019-12-13 | 株式会社尼康 | Imaging element and electronic device |
GB2503481B * | 2012-06-28 | 2017-06-07 | Bae Systems Plc | Surveillance process and apparatus |
US9947128B2 * | 2013-01-29 | 2018-04-17 | Andrew Robert Korb | Methods for improving accuracy, analyzing change detection, and performing data compression for multiple images |
US9454827B2 * | 2013-08-27 | 2016-09-27 | Qualcomm Incorporated | Systems, devices and methods for tracking objects on a display |
KR20150041239A * | 2013-10-07 | 2015-04-16 | 삼성전자주식회사 | X-ray imaging apparatus and control method thereof |
US9417196B2 * | 2013-10-10 | 2016-08-16 | Bruker Axs Inc. | X-ray diffraction based crystal centering method using an active pixel array sensor in rolling shutter mode |
2014
- 2014-05-21 WO PCT/JP2014/063401 patent/WO2015177881A1/ja active Application Filing
- 2014-05-21 JP JP2016520855A patent/JP6258480B2/ja active Active
- 2014-05-21 US US15/312,029 patent/US20170094200A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
JP6258480B2 (ja) | 2018-01-10 |
JPWO2015177881A1 (ja) | 2017-04-20 |
US20170094200A1 (en) | 2017-03-30 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14892491 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016520855 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15312029 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14892491 Country of ref document: EP Kind code of ref document: A1 |