KR20190033416A - Method and apparatus for recognizing curling sheet - Google Patents
- Publication number
- KR20190033416A (application KR1020180045596A)
- Authority
- KR
- South Korea
- Prior art keywords
- camera
- curling sheet
- point
- curling
- house
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Image Analysis (AREA)
Abstract
Description
The present invention relates to a method and apparatus for recognizing a curling sheet that calculates the image coordinates of an arbitrary point of a curling sheet photographed by a camera.
Curling is a winter sport played on a rectangular ice rink called the curling sheet, in which the winner is decided by how precisely the two teams have placed their stones. The teams alternately throw curling stones toward the house (a set of concentric circles); the team whose stone finishes closest to the tee at the center of the house wins the end, scoring one point for each of its stones that is closer than the opposing team's closest stone. A player throws the curling stone on a suitable path toward the house, while teammates brush the ice with brooms to adjust the stone's course and speed. Before the start of a game, water is sprinkled over the ice, forming many fine grains of ice called "pebble"; in this sport the pebble is removed through brushing to control the path of the stone.
The curling sheet used in competition is 4.2 meters wide and about 42.07 meters long (the exact length can vary). Because of this elongated shape, the part of the sheet far from the camera occupies only a few pixels in the image, depending on the camera position, which lowers the recognition rate. In addition, when a top-view camera is installed to photograph the curling sheet, the straight edges of the rectangular sheet can be distorted into curves, making accurate coordinate detection of the curling stones difficult.
The purpose of the present invention is to provide a method and apparatus for recognizing a curling sheet that extract four feature points from the pattern of a curling sheet image photographed at long range and determine the coordinates of an arbitrary point on the curling sheet by estimating camera attitude values from those feature points.
In order to achieve the above object, a method of recognizing a curling sheet according to an exemplary embodiment of the present invention includes: performing a pre-calibration that obtains the internal parameter of a first camera, the internal parameter of a second camera, and a relational expression between the external parameter of the first camera and the external parameter of the second camera; obtaining a curling sheet image of a near region through the first camera and a curling sheet image of a far region through the second camera; extracting four feature points from the curling sheet image of the near region; estimating a first attitude value through three-dimensional attitude estimation of the first camera using the four feature points; estimating a second attitude value through three-dimensional attitude estimation of the second camera using the first attitude value and the relational expression; and calculating curling sheet coordinates of an arbitrary point of the curling sheet image of the near region and the curling sheet image of the far region using the first attitude value and the second attitude value.
The extracting of the four feature points may include extracting a house, two side lines, and a hog line pattern from the curling sheet image of the near region; obtaining a first vanishing point, which is the intersection of the two (extended) side lines, and extracting the pair of first contact points at which the tangent lines from the first vanishing point touch the house; and obtaining a second vanishing point, which is the intersection of the line connecting the first contact pair and the hog line, and extracting the pair of second contact points at which the tangent lines from the second vanishing point touch the house, so that the first and second contact pairs are extracted as the four feature points.
In the extracting of the four feature points, the world coordinates of the four feature points may be assigned by using the fact that, for a curling sheet manufactured to standardized dimensions, the pair of first contacts corresponds to the intersections of the tee line of the curling sheet with the house and the pair of second contacts corresponds to the intersections of the center line of the curling sheet with the house.
The extracting of the four feature points may include, when the line connecting the first contact pair and the hog line are parallel in the curling sheet image of the near region so that the second vanishing point does not exist, extracting the two points at which a line parallel to the hog line touches the house as the pair of second contact points and using them among the four feature points.
The extracting of the four feature points may include, when the hog line is not included in the curling sheet image of the near region: distinguishing a first ellipse and a second ellipse among the ellipses of the house; drawing a line from the first vanishing point through a candidate point on the line connecting the first contact pair and obtaining its intersections with the first ellipse and the second ellipse; finding the specific point for which the third vanishing point, formed by the intersection of the tangents at the intersections with the first ellipse, coincides with the fourth vanishing point, formed by the intersection of the tangents at the intersections with the second ellipse; and replacing the pair of second contact points with the two points at which the line connecting the first vanishing point and that specific point touches the house, to be used among the four feature points.
Alternatively, when the hog line is not included in the curling sheet image of the near region, the extracting of the four feature points may include: dividing the ellipses of the house into the first ellipse and the second ellipse; obtaining the center of the house as the intersection of the straight line connecting the center of the first ellipse with the center of the second ellipse and the line connecting the first contact pair; and replacing the pair of second contact points with the two points at which the line connecting the first vanishing point and the center of the house meets the second ellipse, to be used among the four feature points.
Here, the first camera and the second camera may be fixedly arranged with a fixed physical relationship, including their relative angle and distance, and the curling sheet image of the near region may be photographed by the first camera so as to include the house and the two side lines.
A curling sheet recognizing apparatus according to another embodiment of the present invention includes: a first camera set to photograph a curling sheet image of a near region including the house and two side lines of a curling sheet; a second camera set to photograph a curling sheet image of a far region partially overlapping the near region; and a control unit that performs a pre-calibration obtaining the internal parameters of the first and second cameras and a relational expression between their external parameters, obtains the curling sheet image of the near region through the first camera and the curling sheet image of the far region through the second camera, extracts the four feature points from the curling sheet image of the near region, estimates the first attitude value through three-dimensional attitude estimation of the first camera using the four feature points, estimates the second attitude value through three-dimensional attitude estimation of the second camera using the first attitude value and the relational expression, and calculates curling sheet coordinates of an arbitrary point of the curling sheet image of the near region and the curling sheet image of the far region using the first attitude value and the second attitude value.
According to the present invention, accuracy can be improved by performing curling sheet recognition with an apparatus equipped with two cameras on a movable mount, and the apparatus can easily be relocated when a competition is held on another curling sheet.
In addition, by estimating the attitude values of the two cameras from the pattern fixed on the curling sheet and calculating the coordinates of points in the image using those attitude values, the coordinates of the curling sheet can be calculated without a separate landmark.
In addition, according to the present invention, a near region and a far region of the curling sheet are photographed with two cameras and then combined into one curling sheet image through image processing, which improves the curling sheet recognition rate and minimizes distortion due to camera position.
FIG. 1 is a flowchart of a method of recognizing a curling sheet according to an embodiment of the present invention.
FIG. 2 is a schematic block diagram of a recognition apparatus for a curling sheet according to an embodiment of the present invention.
FIG. 3 is an external view of a recognition apparatus for a curling sheet according to an embodiment of the present invention.
FIGS. 4 to 5 are views showing the installation position of a recognition apparatus for a curling sheet according to an exemplary embodiment of the present invention and the regions imaged by the two cameras.
FIG. 6 is a view defining the coordinate systems of an image photographed by a recognition apparatus for a curling sheet according to an embodiment of the present invention.
FIGS. 7 to 8 are views for explaining the four feature points used in a method of recognizing a curling sheet according to an embodiment of the present invention.
FIGS. 9 to 12 are views for explaining methods of extracting the four feature points in a method of recognizing a curling sheet according to an embodiment of the present invention.
The present invention is capable of various modifications and various embodiments, and specific embodiments will be described in detail with reference to the drawings. It should be understood, however, that the invention is not intended to be limited to the particular embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the invention. Like reference numerals are used for like elements in describing each drawing.
The terms first, second, A, B, etc. may be used to describe various components, but the components should not be limited by these terms. The terms are used only to distinguish one component from another. For example, without departing from the scope of the present invention, a first component may be referred to as a second component, and similarly, a second component may be referred to as a first component. The term "and/or" includes any combination of a plurality of related listed items or any one of a plurality of related listed items.
When an element is referred to as being "coupled" or "connected" to another element, it may be directly coupled or connected to the other element, but it should be understood that other elements may be present in between. In contrast, when an element is referred to as being "directly coupled" or "directly connected" to another element, it should be understood that no other elements are present in between.
The terminology used in this application is used only to describe specific embodiments and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. In the present application, terms such as "comprises" or "having" are used to specify the presence of the features, numbers, steps, operations, elements, components, or combinations thereof described in the specification, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having meanings consistent with their meaning in the context of the related art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in the present application.
Throughout the specification and claims, it is to be understood that when a component comprises a component, it does not exclude other components, but may include other components, unless specifically stated otherwise.
Hereinafter, preferred embodiments according to the present invention will be described in detail with reference to the accompanying drawings.
1 is a flowchart of a method of recognizing a curling sheet according to an embodiment of the present invention.
Referring to FIG. 1, a method of recognizing a curling sheet according to an embodiment of the present invention may include performing pre-calibration (S110), image acquisition (S120), attitude estimation (S130), and coordinate recognition of the curling sheet (S140).
First, the pre-calibration (S110) is a process of calculating a relational expression of an internal parameter of the first camera, an internal parameter of the second camera, an external parameter of the first camera, and an external parameter of the second camera. The pre-calibration need not be performed in the curling arena, but can be performed in a separate laboratory or the like. At this time, the pre-calibration may be performed in a state where the first camera and the second camera are fixedly arranged apart from each other by a certain distance.
The internal parameters describe internal factors of the camera, such as the focal length and the geometry of the lens and image sensor, while the external parameters are the rotation matrix and translation that determine where a point in the three-dimensional world appears in the two-dimensional image. Hereinafter, obtaining the external parameters is referred to as three-dimensional attitude estimation, and the resulting external parameters of the first and second cameras are referred to as the first and second attitude values, respectively.
Specifically, the internal parameters of the first camera and the second camera can be calculated using a chessboard and the MATLAB calibration tool, and the external parameters of the first camera and of the second camera can be calculated through three-dimensional attitude estimation using an arbitrary image whose three-dimensional world coordinates are known. Although the chessboard and the MATLAB calibration tool can also yield the external parameters of the two cameras directly, those values change whenever the device is moved or the shooting position and angle of a camera are adjusted; therefore only the relational expression between the two cameras' external parameters is calculated.
The pre-calibration process includes: calculating a first relational expression between the two-dimensional image coordinates and the world coordinates of an arbitrary point photographed by the first camera, using the internal and external parameters of the first camera; calculating a second relational expression between the two-dimensional image coordinates and the world coordinates of the same point photographed by the second camera, using the internal and external parameters of the second camera; and, from the first and second relational expressions, calculating a third relational expression relating the external parameters of the two cameras. Here, an arbitrary point is a feature point whose two-dimensional image coordinates and world coordinates are both known; for the experiment, any point for which both coordinate sets are known may be selected.
Hereinafter, the mathematical expression of each process will be described in detail.
The three-dimensional world coordinates can be expressed by Equation (1) using two-dimensional image coordinates, internal parameters, and external parameters.
Here, X1 and X2 are the two-dimensional image coordinates taken by the first and second cameras, W1 and W2 are the three-dimensional world coordinates of the photographed point, λ1 and λ2 are the scale factors of the first and second cameras, R1 and R2 are the rotation matrices of the external parameters of the first and second cameras, t1 and t2 are the translation vectors of the external parameters of the first and second cameras, and K1 and K2 are the internal parameters of the first and second cameras.
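Equation (1) itself is not reproduced in this text. As an illustrative sketch (not the patent's own implementation) of the pinhole relation the symbol definitions imply, λX = K(RW + t), the following NumPy fragment projects a world point into image coordinates; the numeric intrinsic values are assumptions for the example, not parameters from the patent.

```python
import numpy as np

def project_point(K, R, t, W):
    """Pinhole projection: lambda * [u, v, 1]^T = K @ (R @ W + t).

    K : (3, 3) internal-parameter matrix
    R : (3, 3) rotation matrix (external parameter)
    t : (3,)   translation vector (external parameter)
    W : (3,)   three-dimensional world coordinates
    Returns the two-dimensional image coordinates (u, v).
    """
    p = K @ (R @ np.asarray(W, float) + np.asarray(t, float))
    return p[:2] / p[2]  # dividing by p[2] removes the scale factor lambda

# Illustrative camera: 800 px focal length, principal point (640, 360).
K = np.array([[800.,   0., 640.],
              [  0., 800., 360.],
              [  0.,   0.,   1.]])
R, t = np.eye(3), np.zeros(3)
uv = project_point(K, R, t, [0.5, 0.25, 2.0])  # -> (840, 460)
```

A point on the optical axis projects to the principal point, which is a quick sanity check for the chosen K.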
The above two-dimensional image coordinates can be converted into camera coordinates using Equation (2). In addition, the internal parameters can be removed and expressed as in Equation (3).
Here, C1 and C2 represent camera coordinates of the first and second cameras.
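Equations (2) and (3) are likewise absent from this text. The conversion they describe, removing the internal parameters so that image coordinates become camera coordinates, can be sketched as C = K⁻¹[u, v, 1]ᵀ; the matrix K below is an assumed example, not a value from the patent.

```python
import numpy as np

def to_camera_coords(K, uv):
    """Remove the internal parameters: C = K^-1 @ [u, v, 1]^T,
    moving a pixel from image coordinates into (normalized) camera
    coordinates, as in the step from Equation (2) to Equation (3)."""
    return np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])

# Illustrative internal parameters:
K = np.array([[800.,   0., 640.],
              [  0., 800., 360.],
              [  0.,   0.,   1.]])
C = to_camera_coords(K, (840.0, 460.0))  # -> (0.25, 0.125, 1.0)
```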
When both the internal parameters and the external parameters of the first and second cameras are known, the two-dimensional image coordinates can be transformed into coordinates on the two-dimensional curling sheet through the three-dimensional camera coordinate system. That is, after acquiring an image of the curling sheet, the image can be mapped onto the two-dimensional curling sheet so that the coordinates of an arbitrary point on the curling sheet are recognized.
As described above, using the internal and external parameters of the first and second cameras together with the two-dimensional image coordinates, the three-dimensional world coordinates, and the three-dimensional camera coordinates, a relational expression between the external parameters of the first camera and the external parameters of the second camera can be derived as shown in Equation (4). In this case, a relational expression between the rotation matrices and a relational expression between the translation vectors can be derived.
Since the relational expression between the external parameters holds even when the photographing environment changes, the relational expression obtained through the pre-calibration can be applied unchanged after the apparatus is installed in the curling stadium.
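The relation of Equation (4) can be sketched as follows: because the two cameras are rigidly mounted, the transform between their external parameters (R_rel = R₂R₁ᵀ, t_rel = t₂ − R_rel t₁) is fixed, and the second camera's pose can later be recovered from the first camera's pose alone. The particular poses below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def relative_pose(R1, t1, R2, t2):
    """Fixed transform between the rigidly mounted cameras:
    R_rel = R2 @ R1^T,  t_rel = t2 - R_rel @ t1.
    Because the cameras cannot move relative to each other, this
    relation survives a change of venue (the role of Equation (4))."""
    R_rel = R2 @ R1.T
    return R_rel, t2 - R_rel @ t1

def second_pose_from_first(R_rel, t_rel, R1, t1):
    """Recover the second camera's external parameters from the first
    camera's attitude and the pre-calibrated relation."""
    return R_rel @ R1, R_rel @ t1 + t_rel

def rotz(a):
    """Rotation about the z axis (helper for the example)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.], [s, c, 0.], [0., 0., 1.]])

# Laboratory pre-calibration with illustrative poses:
R1, t1 = rotz(0.3), np.array([1., 2., 3.])
R2, t2 = rotz(0.9), np.array([0., 1., 5.])
R_rel, t_rel = relative_pose(R1, t1, R2, t2)
```

At the venue only the first camera's attitude needs to be re-estimated; `second_pose_from_first` then yields the second camera's attitude without any direct measurement.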
After the pre-calibration is performed and the recognition device is installed in the curling stadium, the photographing positions and angles of the cameras can be adjusted so that the desired area of the curling sheet is covered.
Next, the curling sheet image of the near region can be obtained through the first camera and the curling sheet image of the far region through the second camera (S120). The two images may be acquired simultaneously or sequentially according to a control signal. The near-region image photographed by the first camera and the far-region image photographed by the second camera can be taken so that both include the hog line region close to the cameras. Here, the photographing area of the first camera may be set so as to necessarily include the house pattern, while the hog line itself may or may not be included; the photographing area of the second camera may be set so that some area overlaps, depending on how the photographing area of the first camera is set.
Next, the attitude estimation (S130) can be performed using four feature points, whose world coordinates are known, extracted from the curling sheet image of the near region.
Specifically, the four feature points are points whose two-dimensional image coordinates and three-dimensional world coordinates are both available, and they can be extracted by recognizing specific patterns of the curling sheet (e.g., the house, hog line, and side lines). When the hog line is included in the near-region image, the first vanishing point is obtained as the intersection of the two extended side lines, and the pair of first contact points where the tangent lines from the first vanishing point touch the house is extracted; the second vanishing point is then obtained as the intersection of the line connecting the first contact pair and the hog line, and the pair of second contact points where the tangent lines from the second vanishing point touch the house is extracted; these four contact points are the four feature points. Because the curling sheet is manufactured to standardized dimensions, the world coordinates of the first contact pair are the intersections of the house and the tee line, and those of the second contact pair are the intersections of the house and the center line. When the hog line in the near-region image is parallel to the line connecting the first contact pair, so that no second vanishing point exists, the two points where a line parallel to the hog line touches the house are extracted as the second contact pair. The actual house is a circle, but in the image it is photographed as an ellipse. When the hog line is not included in the near-region image, the ellipses of the house are divided into a first ellipse and a second ellipse; a line from the first vanishing point through a candidate point on the line connecting the first contact pair is intersected with the first and second ellipses, and the candidate point is found for which the third vanishing point, formed by the tangents at the intersections with the first ellipse, coincides with the fourth vanishing point, formed by the tangents at the intersections with the second ellipse; the two points where the line through that point touches the house then replace the second contact pair among the four feature points.
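The patent does not spell out its attitude solver. As a hedged sketch of one standard route, the four sheet-to-image correspondences determine the plane-to-image homography by the direct linear transform, from which the camera attitude can subsequently be factored given the internal parameters; only the homography step is shown, with illustrative coordinates.

```python
import numpy as np

def homography_from_points(sheet_pts, image_pts):
    """Direct linear transform: recover the 3x3 homography H mapping
    sheet-plane points (X, Y) to image points (u, v) from four
    correspondences.  Each pair contributes two linear constraints on
    the nine entries of H; the solution is the SVD null vector."""
    A = []
    for (X, Y), (u, v) in zip(sheet_pts, image_pts):
        A.append([-X, -Y, -1., 0., 0., 0., u * X, u * Y, u])
        A.append([0., 0., 0., -X, -Y, -1., v * X, v * Y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, pt):
    """Map a point through a homography (homogeneous divide)."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Illustrative check: recover a known homography from four point pairs.
H_true = np.array([[1.2,   0.1, 30.],
                   [-0.05, 1.1, 40.],
                   [1e-4, 2e-4,  1.]])
sheet_pts = [(0., 0.), (100., 0.), (100., 100.), (0., 100.)]
image_pts = [apply_h(H_true, p) for p in sheet_pts]
H = homography_from_points(sheet_pts, image_pts)
```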
A concrete method of calculating the four feature points will be described in detail with reference to FIGS. 9 to 12 below.
Next, the second attitude value of the second camera can be estimated through three-dimensional attitude estimation using the first attitude value and the relational expression between the external parameters of the first and second cameras. In the near-region curling sheet image photographed by the first camera, the pattern of the house is captured clearly and the feature points are easy to extract, whereas in the far-region curling sheet image photographed by the second camera the pattern of the house is ambiguous, so direct estimation of its attitude value can be difficult. Therefore, the second attitude value can be estimated using the first attitude value of the first camera and the relational expression.
Finally, in the coordinate recognition of the curling sheet, the coordinates of an arbitrary point in the near-region and far-region curling sheet images can be calculated using the first attitude value and the second attitude value (S140).
Specifically, the two-dimensional image coordinates at which an arbitrary point in the three-dimensional world coordinates is photographed by the second camera can be calculated by Equation (5).
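Equation (5) is not reproduced here. As a sketch of the coordinate-recognition step under the assumption that points of interest lie on the planar sheet (Z = 0), the attitude gives the plane-to-image homography H = K[r₁ r₂ t], and a pixel maps back to sheet coordinates through its inverse; the camera values below are assumed for illustration.

```python
import numpy as np

def sheet_coords(K, R, t, uv):
    """Map a pixel back to sheet-plane coordinates (Z = 0).
    For a planar scene the projection reduces to a homography
    H = K @ [r1 r2 t]; inverting it recognizes the curling sheet
    coordinates of an arbitrary image point."""
    H = K @ np.column_stack((R[:, 0], R[:, 1], t))
    w = np.linalg.inv(H) @ np.array([uv[0], uv[1], 1.0])
    return w[:2] / w[2]

# Illustrative attitude: identity rotation, camera 5 units from the
# sheet plane, with an assumed internal-parameter matrix.
K = np.array([[800.,   0., 640.],
              [  0., 800., 360.],
              [  0.,   0.,   1.]])
w = sheet_coords(K, np.eye(3), np.array([0., 0., 5.]), (800., 680.))  # -> (1, 2)
```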
FIG. 2 is a schematic diagram of a recognition apparatus for a curling sheet according to an embodiment of the present invention, and FIG. 3 is an external view of a recognition apparatus for a curling sheet according to an embodiment of the present invention.
Referring to FIG. 2, the
Referring to FIG. 3, the
Although the angles of the
In FIGS. 4 to 10, the case where the first camera photographs a region including the two side lines, the first back line, the house, and the first hog line, and the second camera photographs a region including the two side lines, the first hog line, and the two back lines, will be described as an example. In addition, FIG. 10 illustrates an example in which the imaging region of the first camera is set so as not to include the first hog line. The photographing regions of the first and second cameras in the following drawings are only one embodiment; as described above, the photographing region of the first camera is set to include at least the house, and the photographing region of the second camera is set so as to overlap part of the photographing region of the first camera.
FIGS. 4 to 5 are views showing the installation position of a recognition device for a curling sheet according to an exemplary embodiment of the present invention and regions of an image captured by two cameras.
4 to 5, a recognition device for a curling sheet according to an embodiment of the present invention may be disposed behind the first back line of the curling sheet.
FIG. 6 is a view for defining a coordinate system of an image photographed by a recognition device for a curling sheet according to an embodiment of the present invention.
Referring to FIG. 6, a recognition device of a curling sheet according to an embodiment of the present invention photographs the curling sheet and curling stones in a three-dimensional world coordinate system and converts them into a two-dimensional image coordinate system; the camera coordinate system can be used in this process.
Specifically, when the three-dimensional world is photographed with a camera, it is transformed into a two-dimensional image, and the position at which a point in three-dimensional space appears in the image is determined by the internal and external parameters. In order to obtain the physical relationship between the two cameras, that is, the relationship between their external parameters, the internal parameters must be removed so that the coordinates are moved from the world coordinate system into the camera coordinate system.
FIGS. 7 to 8 are views for explaining the four feature points used in a method of recognizing a curling sheet according to an embodiment of the present invention.
Referring to FIGS. 7 to 8, the apparatus for recognizing a curling sheet according to an embodiment of the present invention extracts the house, the two side lines, and the hog line pattern from the curling sheet image of the near region, and extracts as the four feature points the first contact pair, which consists of the two contact points of the house on the tee line, and the second contact pair, which consists of the two contact points of the house on the center line. The first attitude value can then be estimated through three-dimensional attitude estimation of the first camera using the four feature points selected from the near-region image.
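The contact points where the tangent lines from a vanishing point touch the house can be computed with the polar line of that point with respect to the house conic: the tangency points are the intersections of the polar line with the conic. The sketch below, with a unit circle standing in for the house ellipse and an arbitrary external point, illustrates the idea; it is an assumed construction, not the patent's stated procedure, and it presumes the point lies outside the conic.

```python
import numpy as np

def contact_points(C, p):
    """Contact points of the two tangent lines drawn from point p to the
    conic C (3x3 symmetric matrix, points x with x^T C x = 0): they are
    the intersections of the polar line l = C @ p with the conic.
    Assumes p lies outside the conic (two real tangency points)."""
    ph = np.array([p[0], p[1], 1.0])
    a, b, c = C @ ph                      # polar line of p: ax + by + c = 0
    if abs(b) > abs(a):                   # pick a point and direction on it
        p0 = np.array([0.0, -c / b, 1.0])
        d = np.array([1.0, -a / b, 0.0])
    else:
        p0 = np.array([-c / a, 0.0, 1.0])
        d = np.array([-b / a, 1.0, 0.0])
    # Substitute p0 + s*d into x^T C x = 0 -> quadratic in s.
    A = d @ C @ d
    B = 2.0 * (p0 @ C @ d)
    D = p0 @ C @ p0
    root = np.sqrt(B * B - 4.0 * A * D)
    pts = [p0 + s * d for s in ((-B + root) / (2 * A), (-B - root) / (2 * A))]
    return [pt[:2] / pt[2] for pt in pts]

# Unit circle standing in for the house; tangents drawn from (2, 0).
C = np.diag([1.0, 1.0, -1.0])
pts = contact_points(C, (2.0, 0.0))  # -> (0.5, +sqrt(3)/2), (0.5, -sqrt(3)/2)
```

The same routine applies to an ellipse fitted to the imaged house once its conic matrix is known.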
Specifically, the curling sheet image of the near region photographed by the first camera contains the house as an ellipse and the two side lines photographed with varying separation depending on distance. The two-dimensional image coordinates can be extracted from the captured image and converted into three-dimensional world coordinates as shown in FIG. 8. That is, the house photographed in an elliptical shape appears as a circle in the real world, and the side lines whose apparent separation varies appear as side lines a constant width apart.
These four feature points represent the two points tangent to the house (the first point and the second point) and the two points where the house and the center line meet (the third point and the fourth point), as shown in FIG. 8.
FIGS. 9 to 12 are views for explaining methods of extracting the four feature points in a method of recognizing a curling sheet according to an embodiment of the present invention.
A pair of the first point, the second point, and the
Referring to FIG. 9, in the method of recognizing a curling sheet according to an embodiment of the present invention, four feature points may be extracted using a first vanishing point and a second vanishing point.
Specifically, the house, the two side lines, and the hog line pattern can be extracted from the curling sheet image of the near region. Here, the house may consist of a first ellipse and a second ellipse, and the side lines are photographed so as to meet at the first vanishing point.
The first vanishing point, which is the intersection of the two side lines, is obtained, and the first contact pair (the first point and the second point), the two points where the tangent lines from the first vanishing point touch the house, is extracted. Next, the second vanishing point, which is the intersection of the line connecting the first contact pair and the hog line, is obtained, and the second contact pair (the third point and the fourth point), the two points where the tangent lines from the second vanishing point touch the house, is extracted. The first contact pair extracted using the first vanishing point and the second contact pair extracted using the second vanishing point together form the four feature points.
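In homogeneous coordinates the line through two points is their cross product, and the intersection of two lines is again their cross product, which makes the first vanishing point easy to compute from the two extended side lines. The pixel coordinates below are illustrative, not taken from the patent.

```python
import numpy as np

def line_through(p, q):
    """Homogeneous line through two image points."""
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def intersect(l1, l2):
    """Intersection of two homogeneous lines (e.g. the two extended
    side lines meeting at the first vanishing point)."""
    x = np.cross(l1, l2)
    return x[:2] / x[2]

# Two converging side lines with illustrative pixel coordinates:
left = line_through((100., 700.), (300., 100.))
right = line_through((900., 700.), (700., 100.))
vp = intersect(left, right)  # first vanishing point -> (500, -500)
```

The second vanishing point is obtained the same way, from the line connecting the first contact pair and the imaged hog line.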
Referring to FIG. 10, in the method of recognizing a curling sheet according to an embodiment of the present invention, when the line connecting the first contact pair and the hog line are parallel so that no second vanishing point exists, the four feature points can be extracted using the first vanishing point and a line parallel to the hog line.
Specifically, the house, the two side lines, and the hog line pattern can be extracted from the curling sheet image of the near region. Here, the house may consist of a first ellipse and a second ellipse, and the side lines are photographed so as to meet at the first vanishing point.
The first vanishing point, which is the intersection of the two side lines, is obtained, and the first contact pair (the first point and the second point), the two contact points on the house, is extracted. Next, the second vanishing point, the intersection of the line connecting the first contact pair (the first point and the second point) and the hog line, is sought; if it does not exist, the second contact pair (the third point and the fourth point) is extracted as the two points where a line parallel to the hog line touches the house. The first contact pair extracted using the first vanishing point and the second contact pair extracted using the parallel line of the hog line are then the four feature points.
Referring to FIG. 11, in the method of recognizing a curling sheet according to an embodiment of the present invention, when the hog line is not included in the near region, the four feature points can be extracted using the first vanishing point, the third vanishing point, and the fourth vanishing point.
Specifically, the house and the two side line patterns can be extracted from the curling sheet image of the near region. Here, the house may consist of a first ellipse and a second ellipse, and the side lines are photographed so as to meet at the first vanishing point.
The first vanishing point, which is the intersection of the two side lines, is obtained, and the first contact pair (the first point and the second point), the two contact points on the house, is extracted. Next, the house is divided into the first ellipse and the second ellipse. A line is drawn from the first vanishing point through a specific point on the line connecting the first contact pair (the first point and the second point), and the intersections (a, a', b, b') of this line with the first ellipse and the second ellipse are obtained. The third vanishing point, which is the intersection of the tangents to the first ellipse at a and a', and the fourth vanishing point, which is the intersection of the tangents to the second ellipse at b and b', are then computed, and the specific point is found for which the third vanishing point and the fourth vanishing point coincide. The two contact points on this line are then extracted as the second contact pair. The third and fourth vanishing points coincide exactly when the line through the specific point corresponds to the center line of the sheet, so this procedure finds the center line in the curling sheet image even though the hog line is not visible. In this way, the four feature points can be extracted.
Referring to FIG. 12, when the hog line is not included in the near region in the method of recognizing a curling sheet according to an embodiment of the present invention, four feature points can be extracted using the centers of the first and second ellipses of the house.
Specifically, the house and the two side line patterns can be extracted from the curling sheet image in which the near region is photographed. At this time, the house may be composed of a first ellipse and a second ellipse, and the two side lines may be photographed so as to meet at the first vanishing point.
First, the first vanishing point, which is the intersection of the two side lines, is obtained, and the first contact pair (a first point and a second point), the two points at which lines from the first vanishing point are tangent to the house, is extracted. Next, the house is divided into a first ellipse and a second ellipse. When the center of the first ellipse is set as the first center and the center of the second ellipse is set as the second center, the point where the connecting line between the first center and the second center meets the connecting line of the first contact pair becomes the center of the house. After extracting the center of the house in this way, the two points where the line connecting the first vanishing point and the center of the house meets the second ellipse (i.e., the house) are extracted as the second contact pair (a third point and a fourth point). With this method, the amount of computation for obtaining the center of the house is smaller than that of FIG. 11, so the four feature points can be extracted more efficiently.
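The house-center construction above reduces to two line intersections, which are conveniently expressed in homogeneous coordinates, where the line through two points and the intersection of two lines are both cross products. A minimal sketch with hypothetical ellipse centers and contact-pair coordinates:

```python
import numpy as np

def line_through(p, q):
    """Homogeneous line through two 2D points."""
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def intersect(l1, l2):
    """Intersection of two homogeneous lines, returned as a 2D point."""
    p = np.cross(l1, l2)
    return p[:2] / p[2]

# Hypothetical first and second ellipse centers and first contact pair,
# as they might be measured in a near-region image.
c1, c2 = (0.0, 5.0), (0.0, -5.0)      # ellipse centers
p1, p2 = (-10.0, 0.3), (10.0, -0.3)   # first contact pair

# Center of the house: the connecting line of the two centers meets
# the connecting line of the first contact pair.
house_center = intersect(line_through(c1, c2), line_through(p1, p2))
```

The line joining the house center to the first vanishing point can then be intersected with the second ellipse (e.g., with the quadratic-in-t approach used for tangency points) to obtain the second contact pair.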
According to the present invention, the first posture value of the first camera photographing the near region can be derived using the fixed pattern of the curling sheet, so there is no need to add a separate landmark. Also, by estimating the second posture value of the second camera photographing the far region using the relational expression between the first posture value of the first camera and the external parameters of the two cameras, the curling sheet coordinates of an arbitrary point in the far-region image of the second camera can be easily calculated.
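The chaining of posture values described above can be sketched with 4x4 rigid transforms. In the sketch below, the first-camera pose and the inter-camera relation are hypothetical numeric stand-ins: in practice the former would come from a PnP solver (such as cv2.solvePnP) run on the four feature points, and the latter from the preliminary calibration.

```python
import numpy as np

def to_T(R, t):
    """Assemble a 4x4 rigid transform from a rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical first posture value (world -> camera 1).
R1 = np.eye(3)
t1 = np.array([0.0, 0.0, 10.0])

# Hypothetical fixed extrinsic relation between the two rigidly mounted
# cameras (camera 1 -> camera 2), known from the preliminary calibration.
theta = np.deg2rad(30.0)
R_rel = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                  [0.0, 1.0, 0.0],
                  [-np.sin(theta), 0.0, np.cos(theta)]])
t_rel = np.array([0.0, 0.0, 1.0])

# Second posture value follows from the relational expression: T2 = T_rel @ T1.
T2 = to_T(R_rel, t_rel) @ to_T(R1, t1)
R2, t2 = T2[:3, :3], T2[:3, 3]
```

Because the relation is fixed, the far-region camera never needs its own feature points: its posture value is updated whenever the near-region camera's is.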
The description above is merely illustrative of the technical idea of the present invention and various changes and modifications may be made by those skilled in the art without departing from the essential characteristics of the present invention. Therefore, the embodiments disclosed in the present invention are intended to illustrate rather than limit the scope of the present invention, and the scope of the technical idea of the present invention is not limited by these embodiments. The scope of protection of the present invention should be construed according to the following claims, and all technical ideas within the scope of equivalents should be construed as falling within the scope of the present invention.
100: recognition apparatus for a curling sheet
110: first camera
120: second camera
130: control unit
Claims (8)
Obtaining a curling sheet image of the near region through the first camera and a curling sheet image of the far region through the second camera;
Extracting four feature points having world coordinates from the curling sheet image of the near region;
Estimating a first posture value through three-dimensional posture estimation of the first camera using the four feature points;
Estimating a second posture value through the three-dimensional posture estimation of the second camera using the first posture value and the relational expression; And
Calculating curling sheet coordinates of an arbitrary point of the curling sheet image of the near region and the curling sheet image of the far region using the first attitude value and the second attitude value;
A method of recognizing a curling sheet, comprising the above steps.
Wherein the extracting of the four feature points comprises:
Extracting a house, two side lines, and a hog line pattern from the curling sheet image of the near region, and extracting, as the four feature points, a first contact pair, which are the two contact points of the house seen from the first vanishing point that is the intersection of the two side lines, and a second contact pair, which are the two contact points of the house seen from the second vanishing point that is the intersection of the connecting line of the first contact pair and the hog line.
Wherein the extracting of the four feature points comprises:
Wherein the first contact pair corresponds to the intersections of the tee line of the curling sheet and the house, and the second contact pair corresponds to the intersections of the center line of the curling sheet and the house, whereby world coordinates are assigned to the four feature points.
Wherein the extracting of the four feature points comprises:
Wherein, when the connecting line of the first contact pair and the hog line are parallel to each other so that the second vanishing point does not exist in the curling sheet image of the near region, the second contact pair is extracted as the two points at which lines parallel to the hog line are tangent to the house, and is used together with the first contact pair as the four feature points.
Wherein the extracting of the four feature points comprises:
Wherein, when the hog line is not included in the curling sheet image of the near region, the house is divided into a first ellipse and a second ellipse, a connecting line is drawn from the first vanishing point through a specific point on the connecting line of the first contact pair, a third vanishing point is obtained as the intersection of the tangents to the first ellipse at its two intersections with the connecting line, a fourth vanishing point is obtained as the intersection of the tangents to the second ellipse at its two intersections with the connecting line, and, when the third vanishing point and the fourth vanishing point coincide, the two points where the connecting line meets the house are extracted as the second contact pair and used together with the first contact pair as the four feature points.
Wherein the extracting of the four feature points comprises:
Wherein, when the hog line is not included in the curling sheet image of the near region, the house is divided into a first ellipse and a second ellipse, the center of the house is obtained as the intersection of the straight line connecting the centers of the first ellipse and the second ellipse with the connecting line of the first contact pair, and the two points where the straight line connecting the center of the house and the first vanishing point meets the second ellipse are extracted as the second contact pair and used together with the first contact pair as the four feature points.
Wherein the first camera and the second camera have a fixed physical relationship including an angle and a distance,
Wherein the curling sheet image of the near region is an image captured by the first camera so as to include the house and the two side lines.
A second camera set to capture a curling sheet image of a far region of the curling sheet that partially overlaps the near region; And
A control unit which calculates a relational expression among an internal parameter of the first camera, an internal parameter of the second camera, an external parameter of the first camera, and an external parameter of the second camera through a preliminary calibration, obtains a curling sheet image of the near region through the first camera and a curling sheet image of the far region through the second camera, extracts four feature points whose world coordinates are known from the curling sheet image of the near region, estimates a first posture value through three-dimensional posture estimation of the first camera using the four feature points, estimates a second posture value through three-dimensional posture estimation of the second camera using the first posture value and the relational expression, and calculates curling sheet coordinates of an arbitrary point of the curling sheet image of the near region and the curling sheet image of the far region using the first posture value and the second posture value;
An apparatus for recognizing a curling sheet, comprising the above elements.
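The coordinate calculation recited in the claims, mapping an arbitrary image point back to curling sheet coordinates, reduces to inverting a plane-to-image homography once a camera's posture value is known, because the sheet lies on the z = 0 plane. The intrinsics and pose below are hypothetical values chosen only to make the sketch concrete.

```python
import numpy as np

# Hypothetical intrinsics and posture value of a calibrated camera
# looking at the z = 0 ice plane.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 10.0])

# For points on the sheet plane z = 0, the projection
# s * (u, v, 1) = K (R (X, Y, 0) + t) collapses to the
# homography H = K [r1 r2 t] acting on (X, Y, 1).
H = K @ np.column_stack((R[:, 0], R[:, 1], t))

def image_to_sheet(u, v):
    """Map an image pixel to curling-sheet coordinates on the z = 0 plane."""
    w = np.linalg.solve(H, np.array([u, v, 1.0]))
    return w[:2] / w[2]
```

With this hypothetical camera, a sheet point at (1, 2) projects to pixel (400, 400), and `image_to_sheet(400, 400)` recovers (1, 2), so the same routine applied to both cameras' posture values yields a common curling sheet coordinate for any observed point.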
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020170118972 | 2017-09-15 | ||
KR20170118972 | 2017-09-15 |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20190033416A (en) | 2019-03-29 |
KR102045436B1 (en) | 2019-11-15 |
Family
ID=65898758
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020180045596A KR102045436B1 (en) | 2017-09-15 | 2018-04-19 | Method and apparatus for recognizing curling sheet |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR102045436B1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115049853A (en) * | 2022-04-14 | 2022-09-13 | 鼎云(上海)科技有限公司 | Tobacco leaf curl invariant characteristic feature extraction method and storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3686919B2 (en) * | 2000-12-06 | 2005-08-24 | 株式会社ニコン技術工房 | GAME DEVICE, GAME PROCESSING METHOD, AND READABLE STORAGE MEDIUM |
JP2008298685A (en) * | 2007-06-01 | 2008-12-11 | Toyota Central R&D Labs Inc | Measuring device and program |
JP2013115540A (en) * | 2011-11-28 | 2013-06-10 | Clarion Co Ltd | On-vehicle camera system, and calibration method and program for same |
2018
- 2018-04-19: KR application KR1020180045596A granted as KR102045436B1 (active, IP Right Grant)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3686919B2 (en) * | 2000-12-06 | 2005-08-24 | 株式会社ニコン技術工房 | GAME DEVICE, GAME PROCESSING METHOD, AND READABLE STORAGE MEDIUM |
JP2008298685A (en) * | 2007-06-01 | 2008-12-11 | Toyota Central R&D Labs Inc | Measuring device and program |
JP2013115540A (en) * | 2011-11-28 | 2013-06-10 | Clarion Co Ltd | On-vehicle camera system, and calibration method and program for same |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115049853A (en) * | 2022-04-14 | 2022-09-13 | 鼎云(上海)科技有限公司 | Tobacco leaf curl invariant characteristic feature extraction method and storage medium |
Also Published As
Publication number | Publication date |
---|---|
KR102045436B1 (en) | 2019-11-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021004312A1 (en) | Intelligent vehicle trajectory measurement method based on binocular stereo vision system | |
CN103971378B (en) | A kind of mix the three-dimensional rebuilding method of panoramic picture in visual system | |
CN106960454B (en) | Depth of field obstacle avoidance method and equipment and unmanned aerial vehicle | |
JP4341564B2 (en) | Object judgment device | |
JP5586765B2 (en) | Camera calibration result verification apparatus and method | |
CN110956660B (en) | Positioning method, robot, and computer storage medium | |
US20060078197A1 (en) | Image processing apparatus | |
CN110044374B (en) | Image feature-based monocular vision mileage measurement method and odometer | |
CN105654547B (en) | Three-dimensional rebuilding method | |
US10482615B2 (en) | Image processing device and image processing method | |
CN103795935B (en) | A kind of camera shooting type multi-target orientation method and device based on image rectification | |
JP4906683B2 (en) | Camera parameter estimation apparatus and camera parameter estimation program | |
CN104599281B (en) | A kind of based on the conforming panorama sketch in horizontal linear orientation and remote sensing figure method for registering | |
CN103593641A (en) | Object detecting method and device based on stereoscopic camera | |
CN111724446B (en) | Zoom camera external parameter calibration method for three-dimensional reconstruction of building | |
CN112017238A (en) | Method and device for determining spatial position information of linear object | |
JPH05303629A (en) | Method for synthesizing shape | |
CN115272403A (en) | Fragment scattering characteristic testing method based on image processing technology | |
KR20190033416A (en) | Method and apparatus for recognizing curling sheet | |
WO2021193672A1 (en) | Three-dimensional model generation method and three-dimensional model generation device | |
CN116778094B (en) | Building deformation monitoring method and device based on optimal viewing angle shooting | |
CN110800020A (en) | Image information acquisition method, image processing equipment and computer storage medium | |
CN108090930A (en) | Barrier vision detection system and method based on binocular solid camera | |
JP4886661B2 (en) | Camera parameter estimation apparatus and camera parameter estimation program | |
CN102542563A (en) | Modeling method of forward direction monocular vision of mobile robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |