WO2022075691A1 - Device and method for sensing movement of sphere moving on plane surface using camera, and device and method for sensing golf ball moving on putting mat - Google Patents
Device and method for sensing movement of sphere moving on plane surface using camera, and device and method for sensing golf ball moving on putting mat
- Publication number
- WO2022075691A1 WO2022075691A1 PCT/KR2021/013602 KR2021013602W WO2022075691A1 WO 2022075691 A1 WO2022075691 A1 WO 2022075691A1 KR 2021013602 W KR2021013602 W KR 2021013602W WO 2022075691 A1 WO2022075691 A1 WO 2022075691A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- sphere
- plane
- point
- golf ball
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B69/00—Training appliances or apparatus for special sports
- A63B69/36—Training appliances or apparatus for special sports for golf
- A63B69/3661—Mats for golf practice, e.g. mats having a simulated turf, a practice tee or a green area
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B24/00—Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
- A63B24/0021—Tracking a path or terminating locations
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B69/00—Training appliances or apparatus for special sports
- A63B69/36—Training appliances or apparatus for special sports for golf
- A63B69/3658—Means associated with the ball for indicating or measuring, e.g. speed, direction
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B69/00—Training appliances or apparatus for special sports
- A63B69/36—Training appliances or apparatus for special sports for golf
- A63B69/3676—Training appliances or apparatus for special sports for golf for putting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B24/00—Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
- A63B24/0021—Tracking a path or terminating locations
- A63B2024/0028—Tracking the path of an object, e.g. a ball inside a soccer pitch
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2220/00—Measuring of physical parameters relating to sporting activity
- A63B2220/10—Positions
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2220/00—Measuring of physical parameters relating to sporting activity
- A63B2220/80—Special sensors, transducers or devices therefor
- A63B2220/806—Video cameras
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2220/00—Measuring of physical parameters relating to sporting activity
- A63B2220/80—Special sensors, transducers or devices therefor
- A63B2220/807—Photo cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30221—Sports video; Sports image
- G06T2207/30224—Ball; Puck
Definitions
- the present invention relates to a sensing device and a sensing method for calculating information on the motion of a sphere by acquiring and analyzing an image of a spherical object such as a sports ball moving on a plane, for example, a golf ball or a bowling ball.
- a stereoscopic camera device using two cameras is used as a device for acquiring coordinate information on an object in a three-dimensional space.
- 2D coordinate information about the target object is extracted from the 2D image captured by one camera and from the 2D image of the same target object captured by another camera, and 3D coordinate information of the target object is calculated from the extracted 2D coordinate information using a predefined correlation function.
- Such a stereo camera device is widely used to calculate location information of a certain target object in a three-dimensional space.
- For example, a ceiling camera and a side camera are interlocked in a stereo configuration and used to calculate position change information in three-dimensional space as a golf ball struck by a user's golf swing moves.
- In other words, detection and motion-state calculation of a sports sphere are typically performed using a plurality of cameras: the image captured by each camera is transmitted to a computing device, position information in three-dimensional space is calculated, and the motion state of the object is then calculated based on the coordinates thus obtained in real space.
- In such systems, the position of each of the two cameras must be fixed, and dedicated lighting must be provided to sense the motion of the object in three-dimensional space; because the equipment can be used only when installed accurately by a professional technician in a specific place, it cannot be used conveniently without spatial constraints.
- The present invention therefore aims to provide a motion sensing device and method for a plane-moving sphere using a single low-cost camera, and a motion sensing device and method for a golf ball moving on a putting mat, in which the sports sphere can be detected without separate lighting and with spatial recognition, so that the camera does not need to be installed at a fixed position and the system can be used anywhere with a simple configuration.
- A motion sensing apparatus for a plane-moving sphere using a camera according to the present invention includes: a camera that acquires an image, from a single viewpoint at an arbitrary position, with an angle of view including the motion plane of the sphere; and a sensing processing unit that sets a positional relationship with the camera from the acquired image using the motion plane of the sphere as a reference plane, detects an object corresponding to the sphere in the acquired image, calculates the plane position coordinates of the sphere on the reference plane using the positional relationship with the camera and information about the object, and calculates information on the motion of the sphere from changes in the calculated plane position coordinates.
- A sensing device for a golf ball moving on a putting mat according to the present invention includes: a camera that acquires an image, from a single viewpoint at an arbitrary position, with an angle of view including the putting mat; and a sensing processing unit that recognizes preset and displayed features on the putting mat from the acquired image, sets a positional relationship with the camera using the putting mat as a reference plane, detects a feature point on the outline of an object corresponding to the golf ball in the acquired image, calculates, using the feature point, the position coordinates of the point at which the center point of the golf ball is projected vertically onto the reference plane as the plane position coordinates of the golf ball, and calculates information on the motion of the golf ball on the putting mat from changes in the calculated plane position coordinates.
- A motion sensing method for a plane-moving sphere using a single camera according to the present invention comprises: acquiring, by the camera, an image from a single viewpoint at an arbitrary position with an angle of view including the motion plane of the sphere; setting a positional relationship with the camera from the image acquired by the camera, using the motion plane of the sphere as a reference plane; detecting a feature point on the outline of an object corresponding to the sphere in the acquired image; and calculating, from the feature point and using the positional relationship with the camera, the position coordinates of the point at which the center point of the sphere is projected vertically onto the reference plane as the plane position coordinates of the sphere on the reference plane.
- A sensing method for a golf ball moving on a putting mat according to the present invention comprises: acquiring, by a camera, an image from a single viewpoint at an arbitrary position with an angle of view including the putting mat; setting a positional relationship with the camera from the image acquired by the camera, using the putting mat as a reference plane; preparing for a hit on the golf ball by detecting an object corresponding to the golf ball from the image acquired by the camera; detecting the hit of the golf ball from the image acquired by the camera; when the hit is detected, calculating, using the positional relationship between the set reference plane and the camera, the position coordinates of the point at which the center point of the golf ball is projected vertically onto the reference plane from a point on the outline of the object, as the plane position coordinates of the golf ball; and calculating the plane position coordinates of the golf ball for each frame of the image acquired by the camera and calculating information on the motion of the golf ball from the changes therein.
- The motion sensing device and method for a plane-moving sphere using a camera and the motion sensing device and method for a golf ball moving on a putting mat according to the present invention use a single low-cost camera with a plane-moving sports sphere as the target object, so that the camera does not need to be installed at a fixed position and the motion of the sphere can be sensed with a simple configuration.
- FIG. 1 is a view showing a motion sensing device for a golf ball moving on a putting mat, as a specific example of a motion sensing device for a plane-moving sphere using a camera according to the present invention.
- FIG. 2 is a diagram illustrating an image acquired by a single camera in a positional relationship between a single camera and a putting mat as shown in FIG. 1 .
- FIG. 3 is a flowchart for explaining a motion sensing method of a plane moving sphere using a single camera according to an embodiment of the present invention.
- FIGS. 4 and 5 are diagrams illustrating how an object corresponding to the sphere is detected in the image acquired by the single camera of the sensing device according to an embodiment of the present invention and how feature points for calculating the plane position coordinates of the sphere are obtained from it.
- FIGS. 6 and 7 are diagrams illustrating how the position coordinates of the point at which the center point of the sphere is projected vertically onto the reference plane are calculated based on the top and bottom points, which are the feature points of the object detected by the sensing device according to an embodiment of the present invention.
- FIG. 8 is a flowchart for explaining a motion sensing method for a golf ball moving on a putting mat according to an embodiment of the present invention.
- FIG. 9 is a diagram illustrating an example of a binarized image of an image acquired by a single camera for explaining the method according to the flowchart of FIG. 8 .
- Hereinafter, a motion sensing device and method for a plane-moving sphere using a camera according to the present invention and a motion sensing device and method for a golf ball moving on a putting mat will be described with reference to the drawings.
- According to the present invention, an image can be obtained using a single camera and information about the movement of the sphere can be calculated by analyzing it; for example, the movement of a putted golf ball can be detected and its movement information calculated by using the camera on a putting mat.
- the camera may be implemented as a "single camera" that acquires an image with an angle of view including the motion plane of the sphere with a single view.
- In conventional systems, the camera for detecting the movement of a golf ball must be fixed at a preset position, and a plurality of cameras are arranged as a stereo pair to sense the three-dimensional motion of the golf ball.
- The present invention is characterized in that, breaking away from such constraints, a single camera placed at an arbitrary position can sense the movement of the golf ball.
- FIG. 1 shows a motion sensing device for a golf ball moving on a putting mat as a specific example of a motion sensing device for a plane-moving sphere using a camera according to the present invention.
- the motion sensing apparatus of a plane moving sphere using a camera includes a single camera 200 and a sensing processing unit 220 as shown in FIG. 1 .
- The single camera 200 acquires images at a preset frame rate, from a single viewpoint at an arbitrary location, with an angle of view including the motion plane of the sphere (e.g., the putting mat 100).
- That is, the camera is placed at a position as shown in FIG. 1 and, from a single viewpoint rather than in a stereo configuration, acquires an image with an angle of view including the putting mat 100 on which the golf ball 10 hit by the putter 20 moves in a plane.
- the sensing processing unit is a component that receives the image acquired by the single camera, processes the image, and analyzes the processed image to calculate information on the position of the sphere on the plane.
- In other words, the present invention sets the plane on which the sphere moves as the reference plane using a single camera, and calculates the position coordinates of the point at which the center point of the sphere is projected vertically onto the reference plane, that is, the position coordinates of the sphere on that plane, thereby calculating the motion information of the sphere on the reference plane.
- To this end, the sensing processing unit recognizes the 'reference plane' corresponding to the motion plane of the sphere from the image acquired by the single camera, sets the positional relationship with the single camera, and, using that positional relationship, calculates the plane position coordinates of the sphere on the 'reference plane' from the object corresponding to the sphere in the image acquired by the single camera.
- Specifically, the sensing processing unit 220 recognizes the preset and displayed features 111, 112, 113, 114 on the putting mat 100 from the image acquired by the single camera 200, sets the motion plane of the golf ball 10 as the reference plane together with the position information of the single camera relative to that reference plane, that is, its three-dimensional position information, detects the object corresponding to the golf ball in the acquired image, and calculates the position coordinates of the point at which the center point of the golf ball 10 is projected vertically onto the reference plane, that is, the plane position of the golf ball.
- That is, by recognizing the features 111 to 114 in the acquired image and analyzing those feature portions on the image, the x, y, and z coordinate information of the reference plane and of the single camera can be calculated with respect to the origin of an x-y-z coordinate system, so that the positional relationship between the reference plane and the single camera can be set.
- FIG. 2 shows the image acquired by the single camera in the positional relationship between the single camera and the putting mat shown in FIG. 1; FIG. 2(a) shows the state before the golf ball is hit, and FIG. 2(b) shows the state in which the golf ball moves after being struck.
- i100 represents the image of the putting mat
- i111 to i114 respectively represent the image of the feature
- i10 represents the image of the golf ball in a stationary state
- ir10 represents an image of a golf ball in a moving state.
- As shown in FIG. 1, when an image of the motion plane of the sphere, such as the putting mat 100, is acquired at an angle of view looking obliquely down from the elevated position of the single camera 200, then, as shown in FIGS. 2(a) and (b), the part of the putting mat close to the single camera appears larger and the farther part appears smaller, and the size of the sphere appears different on the image depending on the distance between the sphere and the single camera.
- Since the image acquired by the single camera is a two-dimensional image, it is very difficult to determine where the sphere is actually located on the plane simply by analyzing the image content as shown in FIGS. 2(a) and (b).
- the plane position coordinates of the sphere can be accurately calculated by setting the positional relationship between the reference plane and the single camera using the 'feature part' and analyzing the image using this.
- the sensing device may recognize a feature prepared in advance on a plane as shown in FIG. 1 through the image shown in FIG. 2 .
- The above-described 'features' may take the form of markers 111 to 114 prepared in advance as shown in FIG. 1, or may take the form of the motion plane itself.
- The sensing device stores information on the shape and size of the feature in advance, and by analyzing how the shape and size of the feature recognized in the image have changed with reference to this preset information, it can determine the positions of the reference plane and the single camera on the set coordinate system.
- That is, if each of the features 111, 112, 113, 114 is prepared in advance on the edge of the putting mat 100 and information on the shape, size, and so on of the features 111, 112, 113, 114 is set in advance, the sensing processing unit 220 can compare the changes in the shape and size of the portions i111, i112, i113, i114 corresponding to the features in the acquired image against the preset information, and from this calculate at which position coordinates the single camera is located on the x-y-z coordinate system with respect to the reference plane and store the result as setting information.
- Alternatively, the rectangular shape of the putting mat 100 itself as shown in FIG. 1 may be preset as the feature, and by analyzing the shape and size of the portion i100 corresponding to the putting mat in the image as shown in FIG. 2, the positional relationship between the reference plane and the single camera can be calculated and stored as setting information.
- In other words, the feature may be an artificially placed marker, or, if the shape of the motion plane can be specified, the shape of the motion plane itself may be set as the feature; any marker shape can be used as a feature as long as the change in its shape and size on the image can be easily analyzed.
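- As a concrete illustration (not part of the disclosed embodiment) of how such a positional relationship could be established, the sketch below uses OpenCV's solvePnP with four mat-corner markers whose real-world positions are assumed known; the marker coordinates, the camera intrinsics, and all variable names here are illustrative assumptions.

```python
import numpy as np
import cv2

def set_reference_plane(marker_image_pts, marker_world_pts, camera_matrix, dist_coeffs):
    """Estimate the camera pose relative to the putting-mat reference plane (z = 0).

    marker_image_pts: (N, 2) pixel coordinates of the detected marker portions i111..i114.
    marker_world_pts: (N, 3) known positions of the markers 111..114 on the mat, with z = 0.
    Returns the rotation, translation, and the camera position (x, y, H) in mat coordinates.
    """
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(marker_world_pts, dtype=np.float32),
        np.asarray(marker_image_pts, dtype=np.float32),
        camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("marker-based pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)
    cam_pos = (-R.T @ tvec).ravel()   # camera x, y and height H above the mat
    return R, tvec, cam_pos

# Illustrative values only: a 1 m x 3 m mat with one marker at each corner.
world = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 3.0, 0.0), (0.0, 3.0, 0.0)]
image = [(412, 655), (980, 661), (832, 118), (501, 112)]        # hypothetical detections
K = np.array([[900, 0, 640], [0, 900, 360], [0, 0, 1]], float)   # assumed intrinsics
R, t, cam_pos = set_reference_plane(image, world, K, np.zeros(5))
print("camera position (x, y, H):", cam_pos)
```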
- As described above, the sensing processing unit recognizes the predefined feature displayed on the motion plane through the image of the single camera, calculates the positional relationship between the single camera and the motion plane, and sets the reference plane; it can then calculate at which position on the reference plane the sphere is located. The detailed method of calculating the position of the sphere on the reference plane is described later.
- FIG. 3 shows a motion sensing method of a sphere based on the configuration of the sensing device described with reference to FIGS. 1 and 2 .
- the single camera acquires an image of the motion plane of the sphere within the angle of view at an arbitrary position (S110).
- the single camera is configured to acquire an image with ambient lighting through adjustment of image parameters such as brightness, contrast, and gamma without dedicated lighting.
- the sensing processing unit recognizes a preset feature on the acquired image (S120), and sets a positional relationship between the reference plane and the single camera with respect to the motion plane of the sphere therefrom (S130).
- the sensing processing unit recognizes the sphere from the image acquired by the single camera (S140), and determines whether the movement of the sphere has started (S150).
- the sensing processing unit detects an object on the image corresponding to the real sphere by analyzing the image acquired by the single camera (S160).
- the sensing processing unit may detect a feature point on the contour of the detected object (S170).
- That is, the sensing processing unit detects a point (feature point) on the outline of the object detected in the image acquired by the single camera, and, using the feature point, calculates the position coordinates of the point at which the center point of the sphere is projected vertically onto the reference plane as the plane position coordinates of the sphere (S180).
- By repeating this for successive frames, the motion trajectory of the sphere can be calculated, and from the change of the plane position coordinates of the sphere over time, motion information of the sphere, such as its speed, can be calculated (S190).
- FIGS. 4 and 5 are diagrams illustrating detecting an object corresponding to the sphere in the image acquired by the single camera of the sensing device according to an embodiment of the present invention and obtaining feature points for calculating the plane position coordinates of the sphere from it, and FIGS. 6 and 7 are diagrams illustrating calculating, based on the top and bottom points that are the feature points of the detected object, the position coordinates of the point at which the center point of the sphere is projected vertically onto the reference plane.
- FIG. 4A is an enlarged view of a part of an image acquired by a single camera in the configuration as shown in FIG. 1 .
- The sensing processing unit recognizes the preset features (i114, etc.) provided on the motion plane from the image as shown in FIG. 4(a), calculates the positional relationship between the reference plane and the single camera using them, and stores the calculated information as setting information.
- In the image, the pixels corresponding to the sphere appear with brightness values significantly different from those of the surrounding pixels.
- However, a group of pixels distinguished from its surroundings in this way may correspond not only to the sphere but also to other parts with distinct brightness values, for example, the part corresponding to the putter in the case of putting, or the user's foot.
- Therefore, conditions such as brightness, roundness, and aspect ratio for pixel groups on the image are set in advance, and among the various objects appearing on the image, the one corresponding to the sphere can be detected based on these preset conditions. In FIG. 4, the object corresponding to the sphere is denoted OB.
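- One plausible way to implement this brightness/roundness/aspect-ratio screening with a standard image-processing library is sketched below; the threshold values, the use of OpenCV contour analysis, and the function name are assumptions for illustration only.

```python
import cv2
import numpy as np

def detect_ball_object(gray, min_brightness=200, min_roundness=0.6, max_aspect=1.6):
    """Return the contour most likely to be the ball object OB, or None.

    Pixel groups brighter than their surroundings are extracted by thresholding,
    then filtered by roundness (4*pi*area / perimeter^2) and by the aspect ratio
    of their bounding box; the putter head or the user's foot typically fails
    the roundness test.
    """
    _, mask = cv2.threshold(gray, min_brightness, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    best, best_area = None, 0.0
    for c in contours:
        area = cv2.contourArea(c)
        perim = cv2.arcLength(c, True)
        if area < 10 or perim == 0:
            continue
        roundness = 4.0 * np.pi * area / (perim * perim)
        x, y, w, h = cv2.boundingRect(c)
        aspect = max(w, h) / max(1, min(w, h))
        if roundness >= min_roundness and aspect <= max_aspect and area > best_area:
            best, best_area = c, area
    return best
```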
- The object OB may have an unclear outline and may not be perfectly circular; therefore, in order to calculate the plane position coordinates of the sphere, it is necessary to detect the correct outline of the object OB.
- To this end, the curve of the upper boundary Tb of the object OB may be used; FIG. 4(c) shows the result of detecting the outline of the object by obtaining the circle fc having the curvature of the curve of the upper boundary Tb. The curve of the upper boundary Tb of the object OB on the image can be obtained using the brightness values of the pixels of the object OB: for example, a threshold on the pixel brightness values of the object can be set and the pixels corresponding to that threshold boundary specified, thereby detecting the curve of the upper boundary Tb. The outline of the object OB can then be specified by calculating a circle fc having the curvature of the curve of the upper boundary Tb, as shown in FIG. 4(c).
- From this, the top point TP of the upper boundary Tb of the object may be obtained, and the bottom point BP of the object symmetrical to the top point TP may be detected. Since the plane position coordinates of the sphere are calculated using feature points on the outline of the object, the top point TP and the bottom point BP can serve as those feature points. In other words, the outline of the object can be specified by obtaining the upper boundary Tb of the object OB and constructing a circle from the curvature of its curve, and the feature points, namely the top point TP and the bottom point BP, can be detected from it.
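- A minimal sketch of this circle-fitting step is given below, assuming the upper-boundary pixels of Tb have already been extracted (for example by the brightness thresholding sketched earlier); the algebraic (Kasa) least-squares fit used here is one possible choice, not necessarily the one used in the embodiment.

```python
import numpy as np

def fit_circle(boundary_pts):
    """Least-squares circle through the upper-boundary points Tb (Kasa fit).

    boundary_pts: (N, 2) array of (x, y) pixel coordinates on the upper boundary.
    Returns the center (cx, cy) and the radius of the fitted circle fc.
    """
    pts = np.asarray(boundary_pts, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(pts))])
    b = x * x + y * y
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    radius = np.sqrt(c + cx * cx + cy * cy)
    return (cx, cy), radius

def top_and_bottom_points(boundary_pts):
    """Derive the feature points TP and BP from the fitted circle.

    TP is taken as the highest point of the detected upper boundary (smallest
    image y), and BP as the point diametrically opposite it through the center
    of the fitted circle.
    """
    (cx, cy), _ = fit_circle(boundary_pts)
    pts = np.asarray(boundary_pts, dtype=float)
    tp = pts[np.argmin(pts[:, 1])]          # top point TP on the upper boundary Tb
    bp = 2.0 * np.array([cx, cy]) - tp      # symmetric point BP through the circle center
    return tp, bp
```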
- However, if the shape of the object OB detected on the image is unfavorable for specifying a circular outline, that is, if the upper boundary of the object does not form a well-shaped curve, the circle derived from its curvature may come out excessively large.
- Therefore, in the present invention, alongside the method of detecting the feature points (top and bottom points) on the object outline by circle fitting using the curvature of the curve of the object's upper boundary as described above, a method of detecting the feature points using a detection figure DR as shown in FIG. 4(d) and FIG. 5 can be used in parallel.
- In this method, the upper boundary Tb of the object OB is detected and the top point TP on the upper boundary Tb is detected; a figure of a predetermined type, such as a rectangle, is generated as a detection figure DR matched to the size and shape of the object OB, and using the size d of the matched detection figure DR, the bottom point BP at the position symmetrical to the top point TP by the size d can be detected.
- For the detection figure DR, only the type of figure is predetermined; its shape and size are generated by rotating or deforming it to fit the object OB, so that it may appear, for example, as a rotated rectangle whose two side lengths differ.
- 'Matching' the detection figure DR to the object OB means changing the side lengths of the detection figure DR or rotating it until it encloses the object OB most suitably; for example, the detection figure DR may be judged to 'match the object OB' when the average of the brightness values inside the detection figure DR is largest.
- Using the size d of the matched detection figure, that is, the diameter of the object, the point separated from the top point TP of the boundary by the diameter d may be detected as the bottom point BP.
- FIG. 4D shows an example of detecting the upper end point TP and the lower end point BP, which are feature points of an object in the same manner as described above.
- the method of detecting the feature point of the object using the detection figure DR as described above with reference to FIG. 4(d) may produce a more preferable result.
- the upper boundary Tb of the object OB is detected, and, for example, a rectangular detection figure DR is generated to match the size and shape of the object OB.
- For the detection figure DR, only the type of figure is predetermined, and its size and rotated posture can be deformed according to the state of the object.
- The matched detection figure DR is a rectangular figure having a long side and a short side; since the extent of the object in the spreading direction, that is, the direction perpendicular to the moving direction, can be regarded as the diameter of the object outline, the size d of the short side of the detection figure DR matched to the object OB shown in FIG. 5 is taken as the diameter of the object.
- Then, the top point TP is detected on the upper boundary Tb of the object, and the point at the position symmetrical to the top point TP by the size d can be detected as the bottom point BP.
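- The sketch below illustrates the detection-figure idea using OpenCV's minimum-area rotated rectangle as the rectangle matched to the object; treating its short side as the diameter d follows the description above, but minAreaRect itself and the assumption that the camera's line-of-sight direction projects approximately onto the image vertical are simplifications for illustration (the embodiment only requires a rectangle fitted by rotation and side-length adjustment, e.g. by maximizing the mean brightness inside it).

```python
import cv2
import numpy as np

def bottom_point_from_detection_figure(object_contour, top_point):
    """Detect the bottom point BP using a rectangular detection figure DR.

    The minimum-area rotated rectangle enclosing the object serves as the matched
    detection figure; the length d of its short side (perpendicular to the moving
    direction) is taken as the object diameter, and BP is placed at distance d
    from TP, here approximated as straight down in the image.
    """
    _, (w, h), _ = cv2.minAreaRect(object_contour)
    d = min(w, h)                                # short side = object diameter d
    tp = np.asarray(top_point, dtype=float)
    bp = tp + np.array([0.0, d])                 # shift by d toward the bottom of the image
    return bp, d
```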
- In other words, the circle-fitting method using the curvature of the object's upper boundary, shown in FIGS. 4(b) and (c), detects the outline of the object and obtains the top and bottom points, which are the feature points, from it, whereas the method shown in FIG. 4(d) and FIGS. 5(b) and (c) obtains the size of the object using a detection figure instead of obtaining the outline and uses that size to detect the positions of the top and bottom points; the sensing device according to an embodiment of the present invention can accurately detect the feature points of the object by using these two methods in parallel.
- Once the top point TP and the bottom point BP of the object are obtained as described above, the coordinates of the point at which the center point of the sphere is projected vertically onto the reference plane, that is, the plane position coordinates of the sphere, can be obtained using them.
- FIG. 6 shows that, in the image viewed by the single camera shown in FIG. 1, information on the positional relationship between the reference plane and the single camera can be set by recognizing the portions i111 to i114 corresponding to the features, and that the object OB corresponding to the moving sphere is detected and the top point TP and the bottom point BP, which are the feature points on the object, are detected.
- The direction of the line passing through the detected top point TP and bottom point BP is the line-of-sight direction of the single camera; along this line of sight, the bottom point BP is the point closer to the position of the single camera and the top point TP is the point farther from it.
- Using these, the 'plane position coordinates of the sphere', that is, the coordinates of the point at which the center point of the sphere is projected vertically onto the reference plane, can be found.
- The above 'center point of the sphere' is entirely different from the center point of the outline of the object on the image acquired by the single camera as shown in FIG. 6; even if the coordinates of the center point A of the object outline are obtained, the 'plane position coordinates of the sphere' cannot be found from them, because the point A is not the center point of the actual sphere and differs significantly from it.
- Since the top point TP and the bottom point BP of the object OB shown in FIG. 6 are position coordinates on the mat along the viewing direction from the position of the single camera, not points on the solid sphere, the top point TP and the bottom point BP in FIG. 6 are entirely different from the top and bottom of the sphere in real space. Likewise, since the center point A of the outline of the object OB shown in FIG. 6 is not the center point of the sphere on the actual plane, the plane position coordinates of the sphere cannot be obtained from it; instead, the plane position coordinates of the sphere can be obtained by geometric calculation using the top point TP and the bottom point BP of the object, which will be described with reference to FIG. 7.
- FIG. 7 illustrates the calculation of the position coordinates of the point at which the center point of the sphere is projected vertically onto the reference plane, based on the top and bottom points that are the feature points of the detected object: FIG. 7(a) shows a cross-section cut along the single camera's line of sight connecting the top point TP and the bottom point BP of the object, and FIG. 7(b) shows the x-y coordinate plane of the real space.
- As described above, setting information regarding the positional relationship between the reference plane and the single camera can be stored once the single camera acquires an image and the portion corresponding to the preset feature is recognized in the acquired image.
- That is, the sensing processing unit of the sensing device recognizes the feature from the image acquired by the single camera, recognizes the reference plane sPL corresponding to the motion plane of the sphere as shown in FIG. 7(a), and can thereby set the geometric relationship between the reference plane sPL and the position P200 of the single camera.
- More specifically, the height H of the single camera position P200 with respect to the reference plane sPL can be set as shown in FIG. 7(a), and the x and y coordinate information of the single camera position P200 can also be set as shown in FIG. 7(b); using the set height and position information for the single camera position P200, the angle information of the gaze direction from the single camera position P200 toward the reference plane sPL can be calculated as shown in FIG. 7(a).
- The top point TP and the bottom point BP of the object OB are, as shown in FIGS. 7(a) and (b), the points TP and BP on the reference plane sPL obtained by projecting the upper and lower ends of the sphere CB in the real coordinate system along the gaze direction of the single camera.
- The plane position coordinates of the sphere to be obtained by the sensing device of the present invention are, as shown in FIG. 7, the x and y coordinates of the point Pc at which the center point C of the sphere is projected vertically onto the reference plane sPL, and the coordinates of the point Pc can be obtained using the geometric relationship between the positions of the top point TP and the bottom point BP on the reference plane sPL.
- The bottom point BP and the top point TP on the reference plane sPL each differ from the point Pc at which the center point C of the sphere is projected vertically onto the reference plane sPL; let the difference from the bottom point BP be the error E1 and the difference from the top point TP be the error E2.
- As shown in FIG. 7(a), the errors between the point Pc, projected vertically from the center C of the sphere onto the reference plane sPL, and the bottom point BP and the top point TP can be calculated by geometric operations. That is, for the triangle formed by the center point C of the sphere, the bottom point BP, and the point Pc, E1 can be calculated by a trigonometric function using the angle a information and the radius r of the sphere, and likewise E2 can be calculated by a trigonometric function using the angle b information and the radius r of the sphere.
- In this way, the error E1 from the bottom point and the error E2 from the top point are obtained by geometric calculation as shown in FIG. 7, and by correcting for those errors, the coordinates of the vertical projection of the center point of the sphere onto the reference plane, that is, the coordinates of the point Pc on the x-y plane (reference plane), which are the plane position coordinates of the sphere, can be obtained.
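- A worked sketch of this geometric correction follows; the relations E1 = r / tan(a) and E2 = r / tan(b) hold for the right triangles C-Pc-BP and C-Pc-TP of FIG. 7(a) under the assumption that a and b are the angles of those triangles at BP and TP, and taking those angles here as the elevation angles of the camera sight lines is an additional simplifying assumption; all names are illustrative.

```python
import numpy as np

def plane_position_of_sphere(bp_xy, tp_xy, cam_pos, radius):
    """Plane position coordinates Pc of the sphere on the reference plane.

    bp_xy, tp_xy : (x, y) of the bottom point BP and top point TP on the reference
                   plane (feature points projected along the camera line of sight).
    cam_pos      : camera position (x, y, H) from the reference-plane setup.
    radius       : real radius r of the sphere (about 0.02135 m for a golf ball).
    """
    bp, tp = np.asarray(bp_xy, float), np.asarray(tp_xy, float)
    cam_xy, H = np.asarray(cam_pos[:2], float), float(cam_pos[2])

    # Angles a and b at BP and TP, taken here as the elevation angles of the
    # camera sight lines arriving at those points (simplifying assumption).
    a = np.arctan2(H, np.linalg.norm(bp - cam_xy))
    b = np.arctan2(H, np.linalg.norm(tp - cam_xy))

    # Right triangles C-Pc-BP and C-Pc-TP: the center C sits at height r directly
    # above Pc, so the horizontal offsets are E1 = r/tan(a) and E2 = r/tan(b).
    e1 = radius / np.tan(a)
    e2 = radius / np.tan(b)

    direction = (tp - bp) / np.linalg.norm(tp - bp)   # from BP toward TP, away from the camera
    pc_from_bp = bp + e1 * direction                  # correct BP by the error E1
    pc_from_tp = tp - e2 * direction                  # correct TP by the error E2
    return (pc_from_bp + pc_from_tp) / 2.0            # average of the two corrected estimates
```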
- The flowchart shown in FIG. 3 relates to the general process of obtaining the plane position coordinates of a sphere from the image acquired by the single camera with respect to the plane motion of the sphere, whereas the flowchart shown in FIG. 8 relates to a method for sensing the motion of a golf ball as it moves on a putting mat, which is the motion plane.
- The methods described above with reference to FIGS. 4 to 7 for detecting the object on the image, detecting the contour of the object, and calculating the coordinates of the vertical projection of the sphere's center point onto the reference plane can equally be applied to obtaining the plane position coordinates of the golf ball on the putting mat described with reference to FIG. 8.
- the flowchart shown in FIG. 8 shows a process that is more specific than the sensing method according to the flowchart shown in FIG. 3 and is specialized for golf putting.
- First, an image is acquired by the single camera at an arbitrary position (S210), and the sensing processing unit recognizes the feature on the putting mat from the acquired image and sets the positional relationship between the reference plane and the single camera (S220).
- image parameters such as brightness, contrast, and gamma can be automatically adjusted so that the detection of an object according to ambient lighting can be effectively performed.
- the sensing processing unit sets a region of interest for recognizing a golf ball on an image of a single camera after setting a reference plane ( S232 ).
- Fig. 9 (a) shows a case in which a region of interest (ROI) having a preset size and shape is set to include an object OB corresponding to a golf ball.
- When it is determined that an object OB judged to correspond to the golf ball exists within the region of interest ROI (S234), hitting preparation is deemed complete and the 'ball-ready' state is entered (S236).
- In this way, the position where the golf ball is initially placed is not restricted, so the user can place the golf ball anywhere on the putting mat.
- the sensing processing unit sets a trigger detection area TR for detecting whether the golf ball has been hit (S236).
- the trigger detection area TR may be set as an area within the area of the object OB, as shown in FIG. 9B .
- a trigger detection area TR is set in the object OB, and it is determined whether the change in brightness inside the trigger detection area TR exceeds a preset reference value (S242).
- As shown in FIG. 9(b), while the trigger detection area TR is set inside the object OB and the brightness inside the trigger detection area TR is monitored, a slight movement of the golf ball does not produce a large change in brightness inside the trigger detection area TR, so the process does not proceed to the next step of trigger determination; however, when the golf ball is hit and moves, the brightness inside the trigger detection area TR changes greatly and a first trigger signal may be generated.
- For example, when the golf ball moves as shown in FIG. 9(b), a brightness change of 40 occurs inside the trigger detection area, which exceeds the reference value of 30, and the first trigger signal is therefore generated.
- After the first trigger signal is generated, the sensing processing unit detects the object in each of a plurality of image frames, calculates the amount of movement of the object, and determines whether the calculated amount of movement exceeds a preset reference value (S244).
- The reference value for the movement amount of the golf ball is set in advance, so it is determined that the golf ball has been hit only when the golf ball has moved more than a certain amount.
- That is, after the first trigger signal is generated as described above, the sensing processing unit detects the object, calculates how far it has moved, and generates a ball trigger when it determines that the object has moved enough to exceed the preset reference value (S246).
- In summary, after the ball-ready state (S236), the sensing processing unit detects the change in brightness inside the trigger detection area to perform the first trigger, and then checks the movement amount of the object once more to determine whether to generate the ball trigger.
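- The two-stage trigger could be realized roughly as sketched below; the reference value of 30 for the brightness change inside TR follows the example above, while the pixel movement threshold, the frame interface, and the function names are hypothetical.

```python
import numpy as np

BRIGHTNESS_REF = 30.0    # reference value for the brightness change inside TR (see example above)
MOVEMENT_REF_PX = 15.0   # hypothetical reference value for the object movement, in pixels

def first_trigger(prev_frame, cur_frame, tr_rect):
    """First trigger: mean-brightness change inside the trigger detection area TR."""
    x, y, w, h = tr_rect
    prev_mean = float(np.mean(prev_frame[y:y + h, x:x + w]))
    cur_mean = float(np.mean(cur_frame[y:y + h, x:x + w]))
    return abs(cur_mean - prev_mean) > BRIGHTNESS_REF

def ball_trigger(object_centers):
    """Ball trigger: the object must have moved farther than the preset reference
    value over several frames, so that small disturbances are ignored."""
    centers = np.asarray(object_centers, dtype=float)
    moved = np.linalg.norm(centers[-1] - centers[0])
    return moved > MOVEMENT_REF_PX
```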
- the sensing processing unit retrieves images acquired from a single camera before and after the ball trigger time, analyzes them, and detects an object corresponding to a golf ball for each frame of the image to be analyzed (S250).
- the same method as the method of detecting an object corresponding to a sphere described in FIG. 4 may be used.
- Then, a feature point on the contour of each detected object is detected; as described above, the outline of the object may be obtained by the circle-fitting method using the curvature of the object's upper boundary and the top and bottom points detected from it, or the size information of the object may be calculated using the detection figure and used to detect the top and bottom points.
- Using the feature points on the contour of the object, the position coordinates of the point at which the center point of the golf ball is projected vertically onto the reference plane are calculated as the plane position coordinates of the golf ball (S260).
- The plane position coordinates of the golf ball can be calculated by computing the errors E1 and E2 from the top point and the bottom point of the object, as described above with reference to FIG. 7.
- By calculating the plane position coordinates of the golf ball for each frame, the change in those coordinates can be obtained, and information on the movement of the golf ball can be calculated from the change in its plane position coordinates (S270).
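- As an illustration of this final step, the sketch below derives the trajectory and speed from the per-frame plane position coordinates; the frame-rate value and the straight-line estimate of the direction of travel are assumptions made for the example.

```python
import numpy as np

def putt_motion_info(plane_coords, fps=120.0):
    """Motion information of the golf ball from its per-frame plane coordinates.

    plane_coords : sequence of (x, y) coordinates of Pc on the reference plane,
                   one per frame after the ball trigger (in meters).
    fps          : camera frame rate (illustrative value).
    """
    pts = np.asarray(plane_coords, dtype=float)
    dt = 1.0 / fps
    steps = np.diff(pts, axis=0)
    speeds = np.linalg.norm(steps, axis=1) / dt              # m/s between consecutive frames
    direction = pts[-1] - pts[0]
    norm = np.linalg.norm(direction)
    direction = direction / norm if norm > 0 else direction  # overall direction of travel
    return {
        "trajectory": pts,                                   # motion trajectory on the putting mat
        "initial_speed": float(speeds[0]) if len(speeds) else 0.0,
        "mean_speed": float(np.mean(speeds)) if len(speeds) else 0.0,
        "direction": direction,
    }
```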
- After calculating the information on the motion of the golf ball, the sensing processing unit transmits the motion information to a client 300 (see FIG. 1), for example a simulator of a putting simulation device or a computer of a golf information providing device, so that it can be provided to the user.
- Meanwhile, the sensing processing unit may inspect the feature once again to determine whether the reference plane has deviated from the initially set state (S280); if the reference plane is misaligned, the process returns to step S220 to set the reference plane again, and if there is no change in the reference plane, the motion of the golf ball continues to be sensed based on the previously set reference plane.
- As described above, the apparatus and method for sensing the motion of a plane-moving sphere using a camera and the apparatus and method for sensing the motion of a golf ball moving on a putting mat according to the present invention use a single low-cost camera with a plane-moving sphere as the target object, and have the advantage of being able to calculate the exact position and motion of the moving sphere with a simple configuration that does not require a fixed installation position or dedicated lighting.
- The motion sensing device and method for a plane-moving sphere using a camera and the motion sensing device and method for a golf ball moving on a putting mat according to the present invention can be used in technical fields based on sensing of a plane-moving sports sphere, such as the field of golf analysis based on analyzing the movement of a golf ball during a golf swing and the field of virtual golf simulation systems.
Claims (13)
- A sensing device for a sphere moving on a plane, comprising: a camera that acquires an image, from a single viewpoint at an arbitrary position, with an angle of view including the motion plane of the sphere; and a sensing processing unit that sets a positional relationship with the camera from the image acquired by the camera, using the motion plane of the sphere as a reference plane, detects an object corresponding to the sphere in the acquired image, calculates plane position coordinates of the sphere on the reference plane using the positional relationship with the camera and information about the object, and calculates information on the motion of the sphere from changes in the calculated plane position coordinates.
- The device of claim 1, wherein the camera is configured to be able to acquire an image under ambient lighting, without dedicated lighting, through adjustment of image parameters when the positional relationship between the reference plane and the camera is set.
- The device of claim 1, wherein the sensing processing unit is configured to recognize, through the image, a feature that is preset and displayed on the motion plane of the sphere, and to set the positional relationship between the reference plane and the camera based on the recognized information.
- The device of claim 1, wherein the sensing processing unit is configured to detect the curve of the upper boundary of the object detected in the image, calculate a circle having the curvature of the curve of the upper boundary, detect at least one point on the circle as a feature point, and, using the feature point, calculate the position coordinates of the point at which the center point of the sphere is projected vertically onto the reference plane as the plane position coordinates of the sphere.
- The device of claim 1, wherein the sensing processing unit is configured to detect the upper boundary of the object detected in the image, detect a top point on the upper boundary, generate a detection figure matched to the size and shape of the object, detect a bottom point of the object corresponding to the top point using the size of the generated detection figure, and, using the top point or the bottom point, calculate the position coordinates of the point at which the center point of the sphere is projected vertically onto the reference plane as the plane position coordinates of the sphere.
- The device of claim 1, wherein the sensing processing unit is configured to detect the top point and the bottom point of the outline of the object and to calculate the plane position coordinates of the sphere by correcting, using setting information on the positional relationship between the reference plane and the camera, the error of the point obtained by projecting the top point or the bottom point onto the reference plane in the line-of-sight direction of the camera.
- A sensing device for a golf ball moving on a putting mat, comprising: a camera that acquires an image, from a single viewpoint at an arbitrary position, with an angle of view including the putting mat; and a sensing processing unit that recognizes, from the image acquired by the camera, a feature that is preset and displayed on the putting mat, sets a positional relationship with the camera using the putting mat as a reference plane, detects a feature point on the outline of an object corresponding to the golf ball in the acquired image, calculates, using the feature point, the position coordinates of the point at which the center point of the golf ball is projected vertically onto the reference plane as the plane position coordinates of the golf ball, and calculates information on the motion of the golf ball on the putting mat from changes in the calculated plane position coordinates.
- A method for sensing the motion of a sphere moving on a plane using a camera, comprising: acquiring, by the camera, an image from a single viewpoint at an arbitrary position with an angle of view including the motion plane of the sphere; setting a positional relationship with the camera from the image acquired by the camera, using the motion plane of the sphere as a reference plane; detecting a feature point on the outline of an object corresponding to the sphere in the acquired image; and calculating, from the feature point and using the positional relationship with the camera, the position coordinates of the point at which the center point of the sphere is projected vertically onto the reference plane as the plane position coordinates of the sphere on the reference plane.
- The method of claim 8, wherein setting the positional relationship with the camera comprises: recognizing the motion plane of the sphere as the reference plane, from the image acquired in the image acquisition step, through a feature that is preset and displayed on the motion plane of the sphere; and setting position information of the camera based on the recognition of the reference plane.
- The method of claim 8, wherein calculating the plane position coordinates of the sphere comprises: detecting the outline of the object based on the upper boundary of the object detected in the image; detecting the point at which the top point or the bottom point of the object outline is projected onto the reference plane in the line-of-sight direction of the camera, together with angle information; and calculating the plane position coordinates of the sphere by correcting, using setting information on the positional relationship between the reference plane and the camera, size information of the sphere, and the detected angle information, the error of the point projected from the top point or the bottom point onto the reference plane in the line-of-sight direction of the camera.
- A method for sensing a golf ball moving on a putting mat, comprising: acquiring, by the camera, an image from a single viewpoint at an arbitrary position with an angle of view including the putting mat; setting a positional relationship with the camera from the image acquired by the camera, using the putting mat as a reference plane; preparing for a hit on the golf ball by detecting an object corresponding to the golf ball from the image acquired by the camera; detecting the hit of the golf ball from the image acquired by the camera; when the hit is detected, calculating, using the positional relationship between the set reference plane and the camera, the position coordinates of the point at which the center point of the golf ball is projected vertically onto the reference plane from a point on the outline of the object, as the plane position coordinates of the golf ball; and calculating the plane position coordinates of the golf ball for each frame of the image acquired by the camera and calculating information on the motion of the golf ball from the changes therein.
- The method of claim 11, wherein calculating the plane position coordinates of the golf ball comprises: detecting the curve of the upper boundary of the object detected in the image; calculating a circle having the curvature of the curve of the upper boundary and detecting at least one point on the circle as a feature point; and calculating, using the feature point, the position coordinates of the point at which the center point of the golf ball is projected vertically onto the reference plane as the plane position coordinates of the golf ball.
- The method of claim 11, wherein calculating the plane position coordinates of the golf ball comprises: detecting the upper boundary of the object detected in the image; detecting a top point on the upper boundary; generating a detection figure matched to the size and shape of the object and detecting a bottom point of the object corresponding to the top point using the size of the generated detection figure; and calculating, using the top point or the bottom point, the position coordinates of the point at which the center point of the golf ball is projected vertically onto the reference plane as the plane position coordinates of the golf ball.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202180068933.2A CN116490246A (zh) | 2020-10-07 | 2021-10-05 | 利用摄像头的平面移动球体的运动感测装置及方法、在推杆垫上移动的高尔夫球的运动感测装置及方法 |
JP2023521284A JP2023544793A (ja) | 2020-10-07 | 2021-10-05 | 平面移動球体の運動センシング装置及び方法と、パッティングマットを移動するゴルフボールの運動センシング装置及び方法 |
US18/030,765 US20230405432A1 (en) | 2020-10-07 | 2021-10-05 | Device and method for sensing movement of sphere moving on plane surface using camera, and device and method for sensing golfball moving on putting mat |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2020-0129422 | 2020-10-07 | ||
KR1020200129422A KR20220046244A (ko) | 2020-10-07 | 2020-10-07 | 카메라를 이용한 평면 이동 구체의 운동 센싱장치 및 방법과, 퍼팅매트를 이동하는 골프공의 운동 센싱장치 및 방법 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022075691A1 true WO2022075691A1 (ko) | 2022-04-14 |
Family
ID=81126654
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2021/013602 WO2022075691A1 (ko) | 2020-10-07 | 2021-10-05 | 카메라를 이용한 평면 이동 구체의 운동 센싱장치 및 방법과, 퍼팅매트를 이동하는 골프공의 운동 센싱장치 및 방법 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20230405432A1 (ko) |
JP (1) | JP2023544793A (ko) |
KR (2) | KR20220046244A (ko) |
CN (1) | CN116490246A (ko) |
WO (1) | WO2022075691A1 (ko) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102624716B1 (ko) * | 2022-12-21 | 2024-01-12 | (주)코아텍 | 스마트 모바일 기기를 이용한 퍼팅 연습 시스템 |
JP7319751B1 (ja) * | 2023-04-12 | 2023-08-02 | 株式会社Knowhere | コンピュータ、プログラム、情報処理方法およびモデル生成方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101048090B1 (ko) * | 2011-03-22 | 2011-07-08 | (주) 골프존 | 가상 골프 시뮬레이션 장치와, 이에 이용되는 센싱장치 및 센싱방법 |
KR20120010016A (ko) * | 2010-07-23 | 2012-02-02 | 건국대학교 산학협력단 | 카메라를 이용한 골프 퍼팅 결과 분석 시스템 및 그 방법 |
KR20120084481A (ko) * | 2011-01-20 | 2012-07-30 | 주식회사 웨이브애프터 | 당구대의 평면 영상 생성장치 및 생성방법, 기록매체 |
KR20170020982A (ko) * | 2015-08-17 | 2017-02-27 | 주식회사 골프존 | 퍼팅매트장치와 이를 이용한 타구분석방법 및 퍼팅 시뮬레이션 방법과, 이를 이용한 퍼팅 시뮬레이션 장치 |
KR20190048670A (ko) * | 2017-10-31 | 2019-05-09 | 대구대학교 산학협력단 | 영상 처리 기반의 퍼팅 분석 장치 및 방법 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101385326B1 (ko) * | 2011-09-23 | 2014-04-16 | 주식회사 크리에이츠 | 카메라로 운동하는 피사체를 촬영하고 그 촬영 이미지에 기초하여 피사체의 실제의 운동 궤적을 획득하기 위한 방법 및 시스템 |
KR101231387B1 (ko) * | 2012-01-20 | 2013-02-07 | 정미애 | 카메라 영상을 이용하는 구형물체의 비행속도 추정 방법 |
2020
- 2020-10-07 KR KR1020200129422A patent/KR20220046244A/ko not_active Application Discontinuation

2021
- 2021-10-05 JP JP2023521284A patent/JP2023544793A/ja active Pending
- 2021-10-05 CN CN202180068933.2A patent/CN116490246A/zh active Pending
- 2021-10-05 WO PCT/KR2021/013602 patent/WO2022075691A1/ko active Application Filing
- 2021-10-05 US US18/030,765 patent/US20230405432A1/en active Pending

2022
- 2022-09-14 KR KR1020220115475A patent/KR102642264B1/ko active IP Right Grant
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20120010016A (ko) * | 2010-07-23 | 2012-02-02 | 건국대학교 산학협력단 | 카메라를 이용한 골프 퍼팅 결과 분석 시스템 및 그 방법 |
KR20120084481A (ko) * | 2011-01-20 | 2012-07-30 | 주식회사 웨이브애프터 | 당구대의 평면 영상 생성장치 및 생성방법, 기록매체 |
KR101048090B1 (ko) * | 2011-03-22 | 2011-07-08 | (주) 골프존 | 가상 골프 시뮬레이션 장치와, 이에 이용되는 센싱장치 및 센싱방법 |
KR20170020982A (ko) * | 2015-08-17 | 2017-02-27 | 주식회사 골프존 | 퍼팅매트장치와 이를 이용한 타구분석방법 및 퍼팅 시뮬레이션 방법과, 이를 이용한 퍼팅 시뮬레이션 장치 |
KR20190048670A (ko) * | 2017-10-31 | 2019-05-09 | 대구대학교 산학협력단 | 영상 처리 기반의 퍼팅 분석 장치 및 방법 |
Also Published As
Publication number | Publication date |
---|---|
CN116490246A (zh) | 2023-07-25 |
KR20220133143A (ko) | 2022-10-04 |
US20230405432A1 (en) | 2023-12-21 |
JP2023544793A (ja) | 2023-10-25 |
KR20220046244A (ko) | 2022-04-14 |
KR102642264B1 (ko) | 2024-02-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022075691A1 (ko) | 카메라를 이용한 평면 이동 구체의 운동 센싱장치 및 방법과, 퍼팅매트를 이동하는 골프공의 운동 센싱장치 및 방법 | |
WO2012128574A2 (ko) | 가상 골프 시뮬레이션 장치와, 이에 이용되는 센싱장치 및 센싱방법 | |
WO2017123041A1 (ko) | 야구 연습 장치에 이용되는 센싱장치 및 센싱방법과, 이를 이용한 야구 연습 장치 및 이의 제어방법 | |
WO2016200208A1 (ko) | 운동하는 볼에 대한 센싱장치 및 센싱방법 | |
WO2017010695A1 (en) | Three dimensional content generating apparatus and three dimensional content generating method thereof | |
WO2014058248A1 (ko) | 단일객체에 대한 기울기를 추정하는 영상을 감시하는 장치 및 방법 | |
WO2014109546A1 (ko) | 운동하는 볼에 대한 센싱장치 및 센싱방법 | |
US20070098250A1 (en) | Man-machine interface based on 3-D positions of the human body | |
WO2012128568A2 (ko) | 가상 골프 시뮬레이션 장치와, 이에 이용되는 센싱장치 및 센싱방법 | |
WO2017135690A1 (ko) | 야구 연습 장치에 이용되는 센싱장치 및 센싱방법과, 이를 이용한 야구 연습 장치 및 이의 제어방법 | |
CN110298864B (zh) | 一种高尔夫推杆设备的视觉感测方法及装置 | |
WO2018139810A1 (ko) | 운동하는 객체의 위치 정보를 산출하기 위한 센싱장치 및 이를 이용한 센싱방법 | |
Jeges et al. | Measuring human height using calibrated cameras | |
WO2014189315A1 (ko) | 골프 스윙에 대한 정보제공을 위한 골프 연습 시스템, 서버 및 이를 이용한 골프 스윙에 대한 정보 처리방법 | |
JP6942566B2 (ja) | 情報処理装置、情報処理方法およびコンピュータプログラム | |
WO2019216673A1 (ko) | 무인 이동체용 사물 유도 시스템 및 방법 | |
WO2019135462A1 (ko) | 번들 조정 알고리즘을 이용한 립모션과 hmd 사이의 캘리브레이션 방법 및 장치 | |
KR20090112538A (ko) | 조명 제어를 이용한 골프 영상 획득 장치, 및 그를 이용한 영상처리 기반의 골프 연습 시스템 | |
WO2018097612A1 (ko) | 사용자의 골프샷에 대한 정보를 산출하기 위한 센싱장치 및 이를 이용한 센싱방법 | |
WO2015030534A1 (en) | Golf practice system for providing golf lesson information and information processing method for providing golf lesson information using the same | |
WO2022211377A1 (ko) | 타격되어 이동하는 골프공에 대한 스핀 산출 방법 및 이를 이용한 스핀 산출 장치 | |
WO2021020813A1 (ko) | 골프클럽의 검출 방법 및 이를 이용한 센싱장치 | |
KR20220146137A (ko) | 두 대의 카메라를 이용한 천장형 골프 시뮬레이션 시스템 | |
WO2022055318A1 (ko) | 골프스윙에 대한 센싱장치 및 이를 이용한 클럽헤드의 임팩트 위치 센싱방법 | |
WO2012128572A2 (ko) | 가상 골프 시뮬레이션 장치와, 이에 이용되는 센싱장치 및 센싱방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21877943; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 2023521284; Country of ref document: JP |
| WWE | Wipo information: entry into national phase | Ref document number: 202180068933.2; Country of ref document: CN |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21877943; Country of ref document: EP; Kind code of ref document: A1 |