WO2012002734A2 - Sensing device and sensing processing method for moving object and virtual golf simulation device using the same - Google Patents


Info

Publication number
WO2012002734A2
Authority
WO
WIPO (PCT)
Prior art keywords
ball
image
image region
moving direction
moving
Prior art date
Application number
PCT/KR2011/004762
Other languages
French (fr)
Other versions
WO2012002734A9 (en)
WO2012002734A3 (en)
Inventor
Won Il Kim
Young Chan Kim
Chang Heon Woo
Min Soo Hahn
Tae Wha Lee
Myung Sun Choi
Jae Wook Jung
Shin Kang
Seung Woo Lee
Han Bit Park
Myung Woo Kim
Yong Woo Kim
Seong Kook Heo
Original Assignee
Golfzon Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020100062206A external-priority patent/KR101019782B1/en
Priority claimed from KR1020100062203A external-priority patent/KR101019798B1/en
Priority claimed from KR1020100062202A external-priority patent/KR101019824B1/en
Application filed by Golfzon Co., Ltd. filed Critical Golfzon Co., Ltd.
Priority to JP2013518251A priority Critical patent/JP5572853B2/en
Priority to CN201180041839.4A priority patent/CN103079652B/en
Publication of WO2012002734A2 publication Critical patent/WO2012002734A2/en
Publication of WO2012002734A9 publication Critical patent/WO2012002734A9/en
Publication of WO2012002734A3 publication Critical patent/WO2012002734A3/en


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 Training appliances or apparatus for special sports
    • A63B69/36 Training appliances or apparatus for special sports for golf
    • A63B69/3623 Training appliances or apparatus for special sports for golf for driving
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 Simulators for teaching or training purposes
    • G09B9/006 Simulators for teaching or training purposes for locating or ranging of objects
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003 Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006 Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0021 Tracking a path or terminating locations
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 Training appliances or apparatus for special sports
    • A63B69/0002 Training appliances or apparatus for special sports for baseball
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 Training appliances or apparatus for special sports
    • A63B69/002 Training appliances or apparatus for special sports for football
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/04 Games or sports accessories not covered in groups A63B1/00 - A63B69/00 for small-room or indoor sporting games
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622 Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0669 Score-keepers or score display devices
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 Teaching not covered by other main groups of this subclass
    • G09B19/003 Repetitive work cycles; Sequence of movements
    • G09B19/0038 Sports
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/805 Optical or opto-electronic sensors
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/806 Video cameras
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/807 Photo cameras
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/83 Special sensors, transducers or devices therefor characterised by the position of the sensor
    • A63B2220/833 Sensors arranged on the exercise apparatus or sports implement

Definitions

  • the present invention relates to a sensing device and sensing processing method and a virtual golf simulation device using the same, and, more particularly, to a sensing device and sensing processing method for a moving object that acquires and analyzes an image of a moving object, such as a golf ball, to obtain physical information thereof and a virtual golf simulation device using the same.
  • sensing systems, such as a sensing system using an infrared sensor, a sensing system using a laser sensor, a sensing system using an acoustic sensor and a sensing system using a camera sensor, have come onto the market.
  • it is an object of the present invention to provide a sensing device and sensing processing method for a moving object that, although a multiple exposure image of a moving object is acquired by a low resolution and low speed camera device and strobe flashlight device, is capable of accurately extracting an image of the object from the acquired image and, although several object images are present in an overlapping state as the result of acquiring a multiple exposure image of an object moving in particular at low speed, is capable of accurately extracting the overlapping object images to extract coordinates of a center point thereof, thereby achieving high sensing processing performance and sensing accuracy at low cost, and a virtual golf simulation device using the same.
  • a sensing device including a sensor unit to acquire a multiple exposure image with respect to a moving object under strobe flashlight and a sensing processing unit including a first processing means to extract an object image region, in which images of the moving object overlap each other by multiple exposure, from the multiple exposure image and a second processing means to form at least two center point candidates of the extracted object image region and to adjust the positions of the center point candidates to extract coordinates of at least two center points of the object image region.
  • sensing device including a sensor unit to acquire a multiple exposure image with respect to a moving object under strobe flashlight and a sensing processing unit including a first moving direction estimating means to estimate a moving direction of the object based on initial position information of the object, a second moving direction estimating means to estimate a moving direction of the object based on pixel values of the respective image regions in the multiple exposure image and an object extracting means to extract an image of the object from the image in the moving direction estimated by the first moving direction estimating means or the second moving direction estimating means.
  • sensing device including a sensor unit to acquire a multiple exposure image with respect to a moving ball under strobe flashlight and a sensing processing unit including an object extracting means to extract a ball image region, in which images of the moving ball overlap each other, from the multiple exposure image, a fitting means to fit the ball image region so that half circle curves of a predetermined diameter are formed at opposite ends of the extracted ball image region, respectively, and a coordinate extracting means to extract coordinates of at least two center points of the fitted ball image region.
  • sensing processing method including acquiring a multiple exposure image with respect to a moving object under strobe flashlight, extracting an object image region, in which images of the moving object overlap each other by multiple exposure, from the multiple exposure image, forming at least two center point candidates of the extracted object image region, and adjusting the positions of the center point candidates according to predetermined conditions to extract coordinates of at least two center points of the object image region.
  • sensing processing method including acquiring plural frames of multiple exposure images with respect to a moving object under strobe flashlight, recognizing an initial position of the object in the multiple exposure image, forming lines passing through image regions in the multiple exposure image, estimating a direction of one of the lines nearest the initial position of the object as a moving direction of the object, and extracting and analyzing an image of the object from images in the estimated moving direction.
  • sensing processing method including acquiring plural frames of multiple exposure images with respect to a moving object under strobe flashlight, recognizing an initial position of the object in the multiple exposure image, combining at least two frames of the multiple exposure images into a single image, performing line scanning at a predetermined angle about the initial position of the object with respect to the combined image to obtain a predetermined function value based on pixel values of images on the respective scanned lines, extracting a direction of one of the scanned lines having the maximum function value as a moving direction of the object, and extracting and analyzing an image of the object from images in the estimated moving direction.
  • sensing processing method including acquiring a multiple exposure image with respect to a moving ball under strobe flashlight, extracting a ball image region, in which images of the moving ball overlap each other by multiple exposure, from the multiple exposure image, fitting the ball image region so that half circle curves of a predetermined diameter are formed at opposite ends of the extracted ball image region, and extracting coordinates of at least two center points of the fitted ball image region.
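  • As an illustrative aside (not part of the patent disclosure), the half circle fitting above implies that each fitted half circle's center lies one ball radius inward from the corresponding end of the overlapping ball image region. A minimal Python sketch of that geometry follows; the function name and the endpoint representation are assumptions made for the example:

```python
import math

def end_center_candidates(p_start, p_end, radius):
    """Place one center point candidate a ball radius inward from each
    end of an elongated overlapping-ball image region; these are the
    points at which half circle curves of the predetermined diameter
    would be centered when fitted to the region's opposite ends."""
    dx, dy = p_end[0] - p_start[0], p_end[1] - p_start[1]
    length = math.hypot(dx, dy)
    ux, uy = dx / length, dy / length  # unit vector along the moving axis
    c1 = (p_start[0] + radius * ux, p_start[1] + radius * uy)
    c2 = (p_end[0] - radius * ux, p_end[1] - radius * uy)
    return c1, c2
```

For a region running from (0, 0) to (30, 0) and a ball radius of 5 pixels, the two extreme center point candidates come out at (5, 0) and (25, 0).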
  • virtual golf simulation device including a sensor unit to acquire a multiple exposure image with respect to a moving object under strobe flashlight, a sensing processing unit including a first processing means to extract an object image region, in which images of the moving object overlap each other by multiple exposure, from the multiple exposure image and a second processing means to form at least two center point candidates of the extracted object image region and to adjust the positions of the center point candidates to extract coordinates of at least two center points of the object image region, and a simulator to calculate the change of the coordinates of the center points and to obtain physical information on a moving golf ball, thereby simulating an orbit of the golf ball.
  • with a sensing device and sensing processing method for a moving object according to the present invention as described above and a virtual golf simulation device using the same, although a multiple exposure image of a moving object is acquired by a low resolution and low speed camera device and strobe flashlight device, it is possible to accurately extract an image of the object from the acquired image, and, although several object images are present in an overlapping state as the result of acquiring a multiple exposure image of an object moving in particular at low speed, it is possible to accurately extract the overlapping object images to extract coordinates of a center point thereof. Consequently, the present invention has the effect of achieving high sensing processing performance and sensing accuracy at low cost.
  • FIG. 1 is a block diagram showing a sensing device or a virtual golf simulation device according to an embodiment of the present invention
  • FIG. 2 is a view showing an example of a screen golf system to which the sensing device or the virtual golf simulation device shown in FIG. 1 is applied;
  • FIG. 3 is a view showing an operational signal scheme of a camera device and a strobe flashlight device of the sensing device according to the embodiment of the present invention
  • FIGS. 4(a) and 4(b) are views showing image patterns of an object acquired by the screen golf system shown in FIG. 2;
  • FIG. 5(a) is a view showing an example of an actually acquired image of the image having the pattern shown in FIG. 4(b) and FIG. 5(b) is a view showing a preliminarily processed image of the image shown in FIG. 5(a);
  • FIG. 6 is a view showing an example of a process of estimating a moving direction of a ball performed by the sensing device according to the embodiment of the present invention.
  • FIG. 7 is an enlarged view showing region A of FIG. 6;
  • FIGS. 8 to 10 are views showing another example of a process of estimating a moving direction of a ball performed by the sensing device according to the embodiment of the present invention.
  • FIG. 11(a) is a view showing an example of a ball image region in which ball images overlap each other and FIG. 11(b) is a graph showing the distribution of pixel values checked along an axis of the moving direction with respect to the image shown in FIG. 11(a);
  • FIG. 12(a) is a view showing a process of fitting the ball image region along the axis of the moving direction and FIG. 12(b) is a graph showing the sum of pixel values on a half circle along the axis of the moving direction;
  • FIG. 13 is a view showing an image in which the ball image region is half circle fitted and center point candidates of the ball image region are formed by the process shown in FIGS. 12(a) and 12(b);
  • FIGS. 14(a) and 14(b) are views showing a process of adjusting the position of the axis of the moving direction of an estimated ball so as to pass through the centers of the fitted ball image region;
  • FIG. 15(a) is a view showing a process of adjusting the positions of the center point candidates formed in the fitted ball image region and FIG. 15(b) is a view showing the center points adjusted by the process shown in FIG. 15(a);
  • FIG. 16 is a view showing a process of correcting the center points of the ball images so as to correspond to an interval rate of strobe flashlight.
  • FIGS. 17 and 18 are flow charts showing processing steps of a sensing processing method according to the present invention.
  • a sensing device acquires and analyzes an image of a moving object to obtain physical information thereof.
  • the moving object may be a ball hit by a golfer in sports, such as golf, using a ball.
  • the sensing device is used to sense such a moving ball.
  • the sensing device may be applied to a so-called screen golf system to which a virtual golf simulation device is applied.
  • FIGS. 1 and 2 schematically show the construction of a sensing device according to the present invention and a virtual golf simulation device using the same.
  • the sensing device for a moving object includes a sensor unit, which includes camera devices 310 and 320, a strobe flashlight device 330 and a signal generating unit 210, and a sensing processing unit 220 to process an image acquired by the sensor unit and to process an image of the moving object, thereby extracting coordinates of a center point thereof.
  • two camera devices 310 and 320 are provided, to which, however, the present invention is not limited.
  • a camera device or more than two camera devices may be provided.
  • the camera devices 310 and 320 are provided to respectively capture a plurality of frames on the movement of an object moving from an initial position in a moving direction of the object.
  • the strobe flashlight device 330 is a lighting device using light emitting diodes and is used as a light source for image capturing by the camera devices.
  • the strobe flashlight device 330 generates strobe flashlight at a predetermined time interval (that is, a plurality of flashes is generated at a predetermined time interval) so that the camera devices 310 and 320 can capture a multiple exposure image.
  • the object appears on a frame captured by the camera devices as many times as the number of flashes generated from the strobe flashlight device 330, providing a multiple exposure image thereof.
  • a trigger signal to operate the camera devices 310 and 320 and the strobe flashlight device 330 is generated by the signal generating unit 210.
  • the multiple exposure image captured by the camera devices 310 and 320 and the strobe flashlight device 330 is processed by the sensing processing unit 220 and is transmitted to a simulator 100.
  • the sensing processing unit 220 includes a first processing means 230 to extract an object image region, in which images of a moving object overlap each other by multiple exposure, from the multiple exposure image and a second processing means 240 to form and adjust at least two center point candidates of the extracted object image region to extract at least two center points of the object image region.
  • the first processing means 230 includes a preliminary processing means 231, a moving direction estimating means 232 to estimate a moving direction of the object, and an object extracting means 233 to extract an object image region, in which object images overlap each other, from the image in the estimated moving direction.
  • the moving direction estimating means 232 may include a first moving direction estimating means to estimate a moving direction of the object based on initial position information of the object and a second moving direction estimating means to estimate a moving direction of the object based on pixel values of the respective image regions in the multiple exposure image.
  • the moving direction estimating means 232 may include either the first moving direction estimating means or the second moving direction estimating means. Alternatively, the moving direction estimating means 232 may include both the first moving direction estimating means and the second moving direction estimating means.
  • in the case in which the moving direction estimating means includes both the first moving direction estimating means and the second moving direction estimating means, the first moving direction estimating means is used first to estimate the moving direction of the object, and, if the moving direction of the object estimated by the first moving direction estimating means is not correct, the second moving direction estimating means is used to estimate the moving direction of the object.
  • the object extracting means 233 may be configured to include a pattern analyzing means to analyze patterns of the images in the estimated moving direction to effectively extract the object image region in which the object images overlap each other or a means to check pixel values of the respective images in the estimated moving direction and to analyze patterns based on the checked pixel values to extract the object image region.
  • the second processing means 240 preferably includes a fitting means 241 to fit the extracted object image region based on a value previously set as the size of the object to form at least two center point candidates of the fitted object image region and a coordinate extracting means 242 to adjust the positions of the formed center point candidates according to predetermined conditions to extract at least two center points of the object image region.
  • the simulator 100 preferably includes a controller M, a database 110, an image processing unit 120 and an image output unit 130.
  • the controller M receives coordinate information of an object which has been image processed and acquired by the sensing processing unit 220, converts the coordinate information into three-dimensional coordinates, obtains predetermined physical information for moving orbit simulation of the object and transmits the obtained physical information to the image processing unit 120.
  • Predetermined data for moving orbit simulation of the object may be extracted from the database 110, and the image processing of the moving orbit simulation of the object by the image processing unit 120 may be performed by extracting image data stored in the database 110.
  • the controller M may include a converting means to convert coordinate information of the object into three-dimensional coordinates.
  • the first processing means 230 and the second processing means 240 may be realized as a single program configured to perform the functions of the above means or as separate programs configured to respectively perform the functions of the above means.
  • an example of a screen golf system, to which the sensing device or the virtual golf simulation device with the above-stated construction is applied, is shown in FIG. 2.
  • a hitting mat 30 having a golf tee, on which a golf ball is placed, and a ground (preferably including at least one selected from among a fairway mat, a rough mat and a bunker mat) and a swing plate 20, on which a golfer hits a golf ball, are provided at one side of a golf booth B.
  • the strobe flashlight device 330 is provided at the ceiling of the golf booth B.
  • the camera devices 310 and 320 are provided at the ceiling and wall of the golf booth B, respectively, to which, however, the present invention is not limited.
  • the camera devices 310 and 320 can be installed at any positions of the golf booth B so long as the camera devices 310 and 320 do not disturb the swing of the golfer and are prevented from colliding with a golf ball hit by the golfer while the camera devices 310 and 320 can effectively capture an image of a moving state of the golf ball 10.
  • the strobe flashlight device 330 is installed at the ceiling of the golf booth B to provide strobe flashlight substantially perpendicular to a hitting point, to which, however, the present invention is not limited.
  • the strobe flashlight device 330 can be installed at any position of the golf booth B so long as the strobe flashlight device 330 can effectively provide strobe flashlight.
  • the camera devices 310 and 320, which capture predetermined regions in which the hitting is performed, each capture a plurality of frames.
  • the strobe flashlight device 330 generates several flashes per frame to acquire a multiple exposure image of the moving golf ball.
  • FIG. 3 is a view showing a trigger signal generation scheme of a camera device and a strobe flashlight device by the signal generating unit 210 (see FIG. 1).
  • trigger signals of the camera device have a time interval of tc.
  • trigger signals are generated at a time interval of tc for each frame.
  • exposure time for each frame is te (at this time, preferably tc > te).
  • data on an image acquired for a time of tc - te is preferably transmitted to the sensing processing unit 220 (see FIG. 1).
  • the camera device is triggered at an interval between the trigger signal generating times for each frame, and data on the acquired multiple exposure image is transmitted to the sensing processing unit.
  • strobe flashlight is generated by the strobe flashlight device several times. In FIG. 3, strobe flashlight trigger signals are generated three times at a time interval of ts1.
  • strobe flashlight is generated three times at the same time interval of ts1.
  • the camera device and the strobe flashlight device are preferably synchronized by the signal generating unit 210 (see FIG. 1) so that the first trigger signal of each is generated at the same time.
  • a time interval of ts2 is preferably provided between the trigger signal of the last one of the three strobe flashlights and a first strobe flashlight of the next frame.
  • the time interval of ts1 and the time interval of ts2 may be set to be equal to each other. As shown in FIG. 3, however, the time interval of ts1 and the time interval of ts2 are preferably set to be different from each other. More preferably, the time interval of ts2 is set to be longer than the time interval of ts1.
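  • As an illustrative sketch of this trigger scheme (not part of the patent disclosure), the strobe trigger times can be laid out frame by frame when the first flash of each frame coincides with the camera trigger. The function name, units and default flash count are assumptions made for the example:

```python
def strobe_trigger_times(num_frames, tc, ts1, flashes_per_frame=3):
    """Compute strobe trigger times when camera triggers are spaced tc
    apart and the flashes within a frame are spaced ts1 apart, with the
    first flash of each frame synchronized to the camera trigger.  The
    gap ts2 between the last flash of one frame and the first flash of
    the next then equals tc - (flashes_per_frame - 1) * ts1."""
    times = []
    for f in range(num_frames):
        frame_start = f * tc
        for k in range(flashes_per_frame):
            times.append(frame_start + k * ts1)
    return times
```

With tc = 12.0 and ts1 = 3.0 (arbitrary units) and three flashes per frame, ts2 comes out as 6.0, longer than ts1, which matches the preferred setting described above.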
  • FIGS. 4(a) and 4(b) are views showing two patterns of the multiple exposure image acquired by the camera device and the strobe flashlight device based on the signal generation scheme shown in FIG. 3, respectively.
  • in FIGS. 4(a) and 4(b), a moving state of a ball hit by a golfer using a golf club is shown using two image patterns I1 and I2 acquired based on the trigger signal scheme shown in FIG. 3.
  • an image I1 shown in FIG. 4(a) is acquired through multiple exposure of golf club images C1, C2 and C3 and ball images 11a, 11b and 11c.
  • the ball images 11a, 11b and 11c are spaced apart from each other by a predetermined interval.
  • an image I2 of FIG. 4(b) shows a case in which ball images overlap to provide an image region of a predetermined size.
  • in the image shown in FIG. 4(a), the ball images are spaced apart from each other by the predetermined interval when strobe flashlight is triggered, whereas, in the image shown in FIG. 4(b), the ball moves at low speed, and therefore, strobe flashlight is triggered before the ball moves far away, with the result that the ball images overlap each other.
  • the sensing device and sensing processing method according to the present invention are provided to accurately acquire the coordinates of center points of the ball images when the ball images overlap each other on the multiple exposure image as the ball is moved at a speed equal to or less than a predetermined speed based on the fixed interval of the strobe flashlight as shown in FIG. 4(b).
  • FIG. 5(a) is a view showing the original multiple exposure image including overlapping ball images.
  • the multiple exposure image is acquired using a camera device having a low resolution of 320 x 240 and operated at a low speed of 75 fps and a strobe flashlight device operating at 330 Hz.
  • the preliminary processing means 231 (see FIG. 1) of the first processing means removes a stationary image, i.e. a background image, from the original image shown in FIG. 5(a) through subtraction and performs a predetermined preliminary processing operation such as Gaussian blur. An image acquired through the preliminary processing operation is shown in FIG. 5(b).
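  • As an illustrative sketch of this preliminary processing step (not part of the patent disclosure), background subtraction followed by a Gaussian blur can be expressed with NumPy and SciPy; the function name, sigma and threshold values are assumptions made for the example:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def preprocess(frame, background, sigma=1.0, threshold=20):
    """Remove the stationary background image by subtraction, smooth
    the difference with a Gaussian blur, and binarize so that only
    moving image regions (ball, club, golfer's body) remain."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    blurred = gaussian_filter(diff.astype(np.float32), sigma=sigma)
    return (blurred > threshold).astype(np.uint8)
```

Applied to a 320 x 240 frame such as the one in FIG. 5(a), this yields a binary mask of candidate moving regions like the preliminarily processed image of FIG. 5(b).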
  • the overlapping ball images are displayed in the form of an image region having a predetermined size, which may be confused with various noises, such as an image region having a predetermined size displaying part of a body of a golfer or an image region having a predetermined size displaying a golf club. For this reason, it is very difficult to extract only the image region, in which the ball images overlap each other, in a state in which various noises are included.
  • a method of extracting such overlapping ball images is to estimate the moving direction of the ball using the moving direction estimating means 232 (see FIG. 1) of the sensing device according to the present invention, to primarily extract images in the estimated moving direction and to extract a ball image region from the extracted images.
  • FIGS. 6 to 10 are views showing a process of estimating a moving direction of a ball performed by the sensing device and sensing processing method according to the embodiment of the present invention. First, a process of estimating a moving direction of a ball performed by the first moving direction estimating means will be described with reference to FIGS. 6 and 7.
  • lines L are formed so as to pass through all image regions R present in the multiple exposure image.
  • the camera device senses the initial position of the ball before the ball is moved and the initial position Bl of the ball in the multiple exposure image is set based on the information on the sensed initial position of the ball.
  • FIG. 7 is an enlarged view showing region A of FIG. 6, in which straight distances between a plurality of lines passing through a region R1 and the initial position B1 of the ball and between a plurality of lines passing through a region R2 and the initial position B1 of the ball are measured.
  • a line L1 passing through the region R1 and a line L2 passing through the region R2 are nearest the initial position B1 of the ball.
  • the line L1 is nearer the initial position B1 of the ball than the line L2. Consequently, the line L1 is tentatively estimated as an axis of the moving direction of the ball.
  • the moving direction estimated using the above method may be incorrect if the distance DL between the estimated line and the initial position of the ball is so long that it deviates from a predetermined range. In this case, the result estimated by the first moving direction estimating means as described above is ignored, and the moving direction is preferably estimated by the second moving direction estimating means.
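  • As an illustrative sketch of the first moving direction estimating means (not part of the patent disclosure), each candidate line through an image region can be scored by its perpendicular distance from the ball's initial position, with a cutoff triggering the fallback to the second estimating means. The line representation (point plus direction vector) and names are assumptions made for the example:

```python
import math

def estimate_moving_axis(lines, b1, max_distance):
    """Each candidate line passes through an image region and is given
    as ((px, py), (dx, dy)).  Return the line whose perpendicular
    distance DL from the ball's initial position b1 is smallest, or
    None when even the nearest line exceeds max_distance, in which
    case the caller should fall back to the second estimating means."""
    best, best_dl = None, float("inf")
    for (px, py), (dx, dy) in lines:
        norm = math.hypot(dx, dy)
        # perpendicular distance from b1 to the infinite line
        dl = abs(dx * (b1[1] - py) - dy * (b1[0] - px)) / norm
        if dl < best_dl:
            best, best_dl = ((px, py), (dx, dy)), dl
    return best if best_dl <= max_distance else None
```

A line passing one pixel from the initial position is preferred over one passing five pixels away; if the nearest candidate still exceeds the predetermined range, the function signals that the second estimating means should be used instead.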
  • FIGS. 8 to 10 are views showing a process of estimating the moving direction of the ball performed by the second moving direction estimating means.
  • the moving direction is estimated by the second moving direction estimating means based on the single image as shown in FIG. 8, which will be described with reference to FIG. 9.
  • the second moving direction estimating means includes an image combining means to combine at least two frames of the multiple exposure images into a single image, a line scanning means to perform line scanning at a predetermined angle about the initial position of the object to check pixel values of the images on the respective scanned lines, and an axis extracting means to extract the scanned line having the largest pixel value among the lines scanned by the line scanning means as an axis of the moving direction.
  • a plurality of frames may be combined into a single image by the image combining means of the second moving direction estimating means as shown in FIG. 8, and the axis of the moving direction of the ball may be extracted by the line scanning means and the axis extracting means as shown in FIG. 9.
  • a predetermined angle scanning region SR is designated based on the initial position B1 of the ball to check pixel values of images on respective scanned lines X1, X2, X3, ..., Xi, ..., Xn in the corresponding scanning region SR.
  • scanning region lines s1 and s2 forming the scanning region SR are preferably formed to define a predetermined angle therebetween.
  • the scanning region lines s1 and s2 may be set to define 360 degrees therebetween.
  • here, the scanning region lines s1 and s2 are set to define an angle of 120 degrees toward the front therebetween.
  • pixel values of the images on the respective scanned lines X1 to Xn may be checked as shown in FIG. 10.
  • a first sub scanned line UL and a second sub scanned line LL are formed at one side and the other side of the scanned line Xi based on a value previously set as the radius Rs of the ball (including a value set based on a previously known radius of the ball and a value set based on a radius of the ball previously measured from the image of the ball), and the sum Sum 1 of the pixel values of the image on the scanned line Xi, the sum Sum 2 of the pixel values of the image on the first sub scanned line UL and the sum Sum 3 of the pixel values of the image on the second sub scanned line LL are calculated.
  • the scanned line having the maximum value is estimated as the axis of the moving direction of the ball.
  • the scanned line Xi may be estimated as the axis of the moving direction of the ball.
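The line scanning described above can be sketched as follows (illustrative only; the sampling scheme, the ray length and the function names are assumptions, not the patent's own formulation): each scanned line Xi is accompanied by two sub scanned lines UL and LL offset by the ball radius Rs, and the line with the largest combined sum Sum 1 + Sum 2 + Sum 3 is taken as the axis of the moving direction.

```python
import math
import numpy as np

def line_sum(img, x0, y0, angle, length):
    # sum of pixel values sampled along a ray from (x0, y0) at `angle`
    total = 0.0
    for t in range(length):
        x = int(round(x0 + t * math.cos(angle)))
        y = int(round(y0 + t * math.sin(angle)))
        if 0 <= y < img.shape[0] and 0 <= x < img.shape[1]:
            total += img[y, x]
    return total

def scan_axis(img, ball_pos, radius, fov_deg=120, step_deg=1):
    # scan rays over a front field of view about the initial position B1;
    # for each scanned line Xi also sum two parallel sub scanned lines
    # (UL, LL) offset perpendicular to the ray by the ball radius Rs, and
    # return the angle whose combined sum (Sum 1 + Sum 2 + Sum 3) is largest
    x0, y0 = ball_pos
    length = max(img.shape)
    best_angle, best_val = None, -1.0
    for deg in range(-fov_deg // 2, fov_deg // 2 + 1, step_deg):
        a = math.radians(deg)
        dx, dy = -radius * math.sin(a), radius * math.cos(a)  # perpendicular offset
        s1 = line_sum(img, x0, y0, a, length)
        s2 = line_sum(img, x0 + dx, y0 + dy, a, length)
        s3 = line_sum(img, x0 - dx, y0 - dy, a, length)
        if s1 + s2 + s3 > best_val:
            best_val, best_angle = s1 + s2 + s3, a
    return best_angle
```

A bright horizontal streak of overlapping ball images would make the horizontal ray and its sub-lines dominate the sum, so the returned angle approximates the axis of the moving direction.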
  • the pixel values are checked with respect to all of the images (which may include noise in addition to the ball image region to be extracted) through which the extracted axis of the moving direction of the ball passes, so that noise is effectively removed and only the ball image region is extracted.
  • FIGS. 11(a) and 11(b) show an example of a method of effectively extracting only the ball image from images present on the axis of the moving direction.
  • FIG. 11(a) is a view showing an image having a predetermined pattern which may be present on the axis of the moving direction and
  • FIG. 11(b) is a graph showing the distribution of pixel values checked along an axis 13 of the moving direction with respect to the image shown in FIG. 11(a).
  • the axis 13 of the moving direction passes through a ball image region 12 in which ball images overlap each other and noise Nl around the ball image region 12.
  • the distribution of pixel values of the image along the axis 13 of the moving direction has a pattern shown in FIG. 11(b).
  • the ball image region in which ball images overlap each other must have a section in which pixel values are maintained at a predetermined critical value or more, the length of which corresponds to the sum of the diameters of at least two balls.
  • the section in which the pixel values are maintained at the predetermined critical value or more includes a section a1 to a2 and a section a3 to a4.
  • in a case in which the section in which the pixel values are maintained at the predetermined critical value or more is the section a3 to a4, i.e. the image denoted by reference numeral 12 in FIG. 11(a), the section may be estimated as the ball image region in which the ball images overlap each other.
  • in a case in which the section in which the pixel values are maintained at the predetermined critical value or more is the section a1 to a2, i.e. the image denoted by reference numeral N1 in FIG. 11(a), the section may be estimated as noise (an image of a golf club).
  • the image N1 may be excluded and the image denoted by reference numeral 12 may be extracted as the ball image region in which ball images overlap each other.
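The section classification above can be sketched as follows (illustrative only; the function names and the tolerance are assumptions): along the axis of the moving direction, only a section whose length matches at least two ball diameters is kept as the overlapping ball image region, while mismatched sections such as a club image are treated as noise.

```python
def threshold_sections(profile, critical):
    # find [start, end) index sections of the pixel-value profile along the
    # axis where values stay at or above the critical value
    sections, start = [], None
    for i, v in enumerate(profile):
        if v >= critical and start is None:
            start = i
        elif v < critical and start is not None:
            sections.append((start, i))
            start = None
    if start is not None:
        sections.append((start, len(profile)))
    return sections

def pick_ball_section(profile, critical, diameter, tol=0.25):
    # keep only the section whose length matches an integer number (>= 2)
    # of ball diameters; other sections (e.g. a club image, as N1 in
    # FIG. 11) are rejected as noise
    for a, b in threshold_sections(profile, critical):
        n = round((b - a) / diameter)
        if n >= 2 and abs((b - a) - n * diameter) <= tol * diameter:
            return (a, b)
    return None
```

In the FIG. 11(b) pattern, the short section a1 to a2 fails the length test while the section a3 to a4 (two overlapping balls) passes.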
  • the extracted ball image region is fitted by the fitting means 241 (see FIG. 1).
  • a half circle fitting method to fit one half circle at one end of the ball image region 12 in which the ball images overlap each other and to fit the other half circle at the other end of the ball image region 12 in which the ball images overlap each other as shown in FIGS. 12 and 13 may be used.
  • the ball image region 12 has a higher brightness value than the periphery (here, if the acquired image is a grayscale image, each of the pixel values is a brightness value).
  • the edge portion of the ball image region 12 has a gradually varying brightness value.
  • the variation of the brightness value is terminated at the contour of the ball image region.
  • the contour of the ball image region is fitted.
  • a first half circle CC1 and a second half circle CC2 are moved in opposite directions about the ball image region 12 along the axis 13 of the moving direction to calculate the sum of pixel values of the image on the first half circle CC1 and the sum of pixel values of the image on the second half circle CC2.
  • FIG. 12(b) is a graph showing the distribution of the sum of pixel values on a half circle.
  • a half circle curve HC1 or HC2 is formed at the point V1 or the point V3 to fit the ball image region.
  • the half circle curves HC1 and HC2 are formed at positions at which the sum of the pixel values on the respective half circles CC1 and CC2 is less than the predetermined value to fit the ball image region.
  • the value of the point V1, the point V2 or the point V3 may be previously set as the predetermined value, or an arbitrary value between the point V1 and the point V3 may be previously set as the predetermined value.
  • Setting of the predetermined value may be derived through an experiment to confirm at which value as the predetermined value the most accurate result can be obtained.
  • the diameter of the half circles CC1 and CC2 or the half circle curves HC1 and HC2 may have a value previously set as the diameter of the ball or a value measured from the ball image as the diameter of the ball.
  • the diameter of the ball may be a value which is previously measured and input or a value sensed by the camera device at the initial position of the ball and measured and set on the image as the diameter.
  • the diameter of the ball may be measured from the separate ball images as shown in FIG. 4(a) and may be used to fit the overlapping ball images.
  • any one of the points VI to V3 is selected from the distribution of the sum of the pixel values on the half circles CC1 and CC2 along the axis 13 of the moving direction and the half circle curves HC1 and HC2 having the predetermined diameter of the ball or the measured diameter of the ball are formed at opposite ends of the ball image region 12 to fit the ball image region.
  • the first half circle curve HC1 is formed at one end of the ball image region 12 to fit the ball image region 12 and the second half circle curve HC2 is formed at the other end of the ball image region 12 to fit the ball image region 12.
  • the half circle fitting as described above may be performed by a processing means.
  • Pixel values may be checked by a pixel checking means while the first half circle CC1 and the second half circle CC2 move to opposite ends of the ball image region, and the half circle curves may be formed at the opposite ends of the ball image region by a fitting processing means to fit the ball image region.
  • the half circle fitting of one half circle and the half circle fitting of the other half circle may be separately performed by different processing means.
  • a first fitting processing means may form the first half circle curve HC1 at one end of the ball image region to fit the ball image region.
  • a second fitting processing means may form the second half circle curve HC2 at the other end of the ball image region to fit the ball image region.
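The half circle fitting described above may be sketched as follows (illustrative only, not the patent's implementation; a horizontal axis of the moving direction is assumed, and the names and threshold are assumptions): a half circle of the ball radius slides along the axis while the sum of the pixel values on it is calculated, and the position where that sum drops below the predetermined value fixes the half circle curve HC1 or HC2.

```python
import math
import numpy as np

def half_circle_sum(img, cx, cy, r, forward=1, samples=60):
    # sum of pixel values sampled on a half circle of radius r centered at
    # (cx, cy), opened toward `forward` (+1: leading end, -1: trailing end)
    total = 0.0
    for k in range(samples + 1):
        a = -math.pi / 2 + k * math.pi / samples
        x = int(round(cx + forward * r * math.cos(a)))
        y = int(round(cy + r * math.sin(a)))
        if 0 <= y < img.shape[0] and 0 <= x < img.shape[1]:
            total += img[y, x]
    return total

def fit_end(img, start, r, threshold, forward=1):
    # slide the half circle from inside the ball image region along the
    # (horizontal) axis until the sum of the pixel values on it falls below
    # the predetermined value; that position fixes HC1 or HC2
    cx, cy = start
    while 0 <= cx < img.shape[1]:
        if half_circle_sum(img, cx, cy, r, forward) < threshold:
            return (cx, cy)
        cx += forward
    return (cx - forward, cy)
```

Running `fit_end` with `forward=+1` and `forward=-1` from the middle of the region yields the two opposite ends at which the half circle curves are formed.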
  • center point candidates EP1 and EP2 may be formed at the point V1 or the point V2.
  • a first center point candidate EP1 is formed at the side of the first half circle curve HC1 and a second center point candidate EP2 is formed at the side of the second half circle curve HC2.
  • the center point candidates EP1 and EP2 are formed on the axis 13 of the moving direction.
  • the center point candidates EP1 and EP2 may be selected as arbitrary points on the axis 13 of the moving direction.
  • it is preferable for the axis of the moving direction to pass through the center of the ball image region. Since the axis of the moving direction is not accurately extracted but estimated, however, the estimated axis of the moving direction may not exactly pass through the center of the ball image region.
  • the upper and lower thicknesses of the ball image region 12 through which the axis 13 of the moving direction passes must be substantially equal to each other. Therefore, it is preferable to shift the position of the axis 13 of the moving direction by a distance corresponding to the difference between the thickness t1 of the ball image region 12 under the axis 13 of the moving direction and the thickness t2 of the ball image region 12 above the axis 13 of the moving direction shown in FIG. 14(a).
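A minimal sketch of this axis shift (illustrative only; the patent speaks of a shift corresponding to the difference between t1 and t2, and half that difference is used here so that the thicknesses above and below the axis become exactly equal):

```python
def recenter_axis(axis_y, t1, t2):
    # shift the axis of the moving direction by half the difference between
    # the thickness t1 below the axis and the thickness t2 above it, so the
    # two thicknesses equalize (y grows downward, as in image coordinates)
    return axis_y + (t1 - t2) / 2.0
```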
  • FIG. 15(a) shows a process of adjusting the center point
  • since each of the fitted half circle curves HC1 and HC2 is formed in the shape of a half circle as shown in FIG. 15(a), it is possible to adjust the respective center point candidates EP1 and EP2 into correct center points based on geometrical conditions thereof.
  • the length of the three directional lines r1, r2 and r3 or r4, r5 and r6 may correspond to a value previously set as the diameter of the ball or a diameter value measured through the ball image.
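As one illustration of the geometric adjustment (not the patent's own formulation; names are assumptions): since each fitted curve is a true half circle, a candidate can be replaced by the center of the circle passing through three points sampled on the curve, in the spirit of the three directional lines r1 to r3.

```python
def circumcenter(p1, p2, p3):
    # center of the circle through three points sampled on the fitted half
    # circle curve; this recovers the corrected center point of the ball
    (ax, ay), (bx, by), (cx, cy) = p1, p2, p3
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy)
```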
  • the image region extracted by the above process may contain images different from the image region of the actual ball.
  • the strobe flashlight time interval per frame is ts1 and the time interval between frames is ts2, as shown in FIG. 3, and the time intervals ts1 and ts2 must correspond to the distance intervals between the ball image regions 12-1, 12-2 and 12-3 of the respective frames of FIG. 11 at a predetermined ratio.
  • a ratio of ts2 to 2ts1 must correspond to a ratio of b to a or a ratio of d to c.
  • center points deviating from this ratio can be corrected so that they are located at the corresponding positions, or can be removed.
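The interval-ratio check may be sketched as follows (illustrative only; here `a` and `b` denote the in-frame and inter-frame center distances from FIG. 16, and the tolerance is an assumption):

```python
def check_interval_ratio(a, b, ts1, ts2, tol=0.15):
    # the ratio of ts2 to 2*ts1 must correspond to the ratio of b to a;
    # center points violating this can be corrected or removed
    expected = ts2 / (2.0 * ts1)
    return abs(b / a - expected) <= tol * expected
```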
  • a trigger signal is applied by the signal generating unit so that the camera device and the strobe flashlight device acquire a multiple exposure image of the moving state of an object at a predetermined interval (S10).
  • cost function values are obtained with respect to pixel values of the image regions on the respective scanned lines (S64) and one of the scanned lines which have the maximum cost function value is extracted as the axis of the moving direction (S65).
  • Step S70 of FIG. 17 will be described in more detail with reference to FIG. 18.
  • after a ball image region is extracted from the image through which the extracted axis of the moving direction passes, the sum of pixel values on half circles is calculated while the respective half circles having a predetermined diameter or a measured diameter move from the center of the extracted ball image region to opposite ends of the extracted ball image region along the axis of the moving direction (S71).
  • At least two center point candidates of the ball image region are formed on the axis of the moving direction passing through the ball image region (S73) and the formed center point candidates are adjusted so that the formed center point candidates can be located at the centers of circles having the fitted half circle curves, respectively (S74).
  • in a sensing device and sensing processing method for a moving object according to the present invention as described above and a virtual golf simulation device using the same, although a multiple exposure image of a moving object is acquired by a low resolution and low speed camera device and strobe flashlight device, it is possible to accurately extract an image of the object from the acquired image and, although several object images are present in an overlapping state as the result of acquiring a multiple exposure image of an object moving in particular at low speed, it is possible to accurately extract the overlapping object images to extract coordinates of a center point thereof, thereby achieving high sensing processing performance and sensing accuracy at low cost. Consequently, the present invention can be widely used in industries related to a sensing device and sensing processing method for a moving object and a virtual golf simulation device using the same.

Abstract

Disclosed herein are a sensing device and sensing processing method for a moving object that, although a multiple exposure image of a moving object is acquired by a low resolution and low speed camera device and strobe flashlight device, is capable of accurately extracting an image of the object from the acquired image and, although several object images are present in an overlapping state as the result of acquiring a multiple exposure image of an object moving in particular at low speed, is capable of accurately extracting the overlapping object images to extract coordinates of a center point thereof, thereby achieving high sensing processing performance and sensing accuracy at low cost and a virtual golf simulation device using the same.

Description

Title of Invention: SENSING DEVICE AND SENSING
PROCESSING METHOD FOR MOVING OBJECT AND VIRTUAL GOLF SIMULATION DEVICE USING THE SAME
Technical Field
[1] The present invention relates to a sensing device and sensing processing method and a virtual golf simulation device using the same, and, more particularly, to a sensing device and sensing processing method for a moving object that acquires and analyzes an image of a moving object, such as a golf ball, to obtain physical information thereof and a virtual golf simulation device using the same.
Background Art
[2] In recent years, various devices have been developed which allow users to enjoy popular sports games, such as baseball, soccer, basketball and golf, in rooms or in specific places through simulation in the form of interactive sports games.
[3] In order to simulate sports using balls, such as baseballs, soccer balls, basketballs and golf balls in such interactive sports games, much research has been conducted into various sensing systems to accurately sense physical information on a moving object, i.e. movement of a ball.
[4] For example, various sensing systems, such as a sensing system using an infrared sensor, a sensing system using a laser sensor, a sensing system using an acoustic sensor and a sensing system using a camera sensor, have come onto the market.
Disclosure of Invention
Technical Problem
[5] It is an object of the present invention to provide a sensing device and sensing
processing method for a moving object that, although a multiple exposure image of a moving object is acquired by a low resolution and low speed camera device and strobe flashlight device, is capable of accurately extracting an image of the object from the acquired image and, although several object images are present in an overlapping state as the result of acquiring a multiple exposure image of an object moving in particular at low speed, is capable of accurately extracting the overlapping object images to extract coordinates of a center point thereof, thereby achieving high sensing processing performance and sensing accuracy at low cost and a virtual golf simulation device using the same.
Solution to Problem
[6] In accordance with one aspect of the present invention, the above and other objects can be accomplished by the provision of a sensing device including a sensor unit to acquire a multiple exposure image with respect to a moving object under strobe flashlight and a sensing processing unit including a first processing means to extract an object image region, in which images of the moving object overlap each other by multiple exposure, from the multiple exposure image and a second processing means to form at least two center point candidates of the extracted object image region and to adjust the positions of the center point candidates to extract coordinates of at least two center points of the object image region.
[7] In accordance with another aspect of the present invention, there is provided a
sensing device including a sensor unit to acquire a multiple exposure image with respect to a moving object under strobe flashlight and a sensing processing unit including a first moving direction estimating means to estimate a moving direction of the object based on initial position information of the object, a second moving direction estimating means to estimate a moving direction of the object based on pixel values of the respective image regions in the multiple exposure image and an object extracting means to extract an image of the object from the image in the moving direction estimated by the first moving direction estimating means or the second moving direction estimating means.
[8] In accordance with another aspect of the present invention, there is provided a
sensing device including a sensor unit to acquire a multiple exposure image with respect to a moving ball under strobe flashlight and a sensing processing unit including an object extracting means to extract a ball image region, in which images of the moving ball overlap each other, from the multiple exposure image, a fitting means to fit the ball image region so that half circle curves of a predetermined diameter are formed at opposite ends of the extracted ball image region, respectively, and a coordinate extracting means to extract coordinates of at least two center points of the fitted ball image region.
[9] In accordance with another aspect of the present invention, there is provided a
sensing processing method including acquiring a multiple exposure image with respect to a moving object under strobe flashlight, extracting an object image region, in which images of the moving object overlap each other by multiple exposure, from the multiple exposure image, forming at least two center point candidates of the extracted object image region, and adjusting the positions of the center point candidates according to predetermined conditions to extract coordinates of at least two center points of the object image region.
[10] In accordance with another aspect of the present invention, there is provided a
sensing processing method including acquiring plural frames of multiple exposure images with respect to a moving object under strobe flashlight, recognizing an initial position of the object in the multiple exposure image, forming lines passing through image regions in the multiple exposure image, estimating a direction of one of the lines nearest the initial position of the object as a moving direction of the object, and extracting and analyzing an image of the object from images in the estimated moving direction.
[11] In accordance with another aspect of the present invention, there is provided a
sensing processing method including acquiring plural frames of multiple exposure images with respect to a moving object under strobe flashlight, recognizing an initial position of the object in the multiple exposure image, combining at least two frames of the multiple exposure images into a single image, performing line scanning at a predetermined angle about the initial position of the object with respect to the combined image to obtain a predetermined function value based on pixel values of images on the respective scanned lines, extracting a direction of one of the scanned lines having the maximum function value as a moving direction of the object, and extracting and analyzing an image of the object from images in the estimated moving direction.
[12] In accordance with another aspect of the present invention, there is provided a
sensing processing method including acquiring a multiple exposure image with respect to a moving ball under strobe flashlight, extracting a ball image region, in which images of the moving ball overlap each other by multiple exposure, from the multiple exposure image, fitting the ball image region so that half circle curves of a predetermined diameter are formed at opposite ends of the extracted ball image region, and extracting coordinates of at least two center points of the fitted ball image region.
[13] In accordance with a further aspect of the present invention, there is provided a
virtual golf simulation device including a sensor unit to acquire a multiple exposure image with respect to a moving object under strobe flashlight, a sensing processing unit including a first processing means to extract an object image region, in which images of the moving object overlap each other by multiple exposure, from the multiple exposure image and a second processing means to form at least two center point candidates of the extracted object image region and to adjust the positions of the center point candidates to extract coordinates of at least two center points of the object image region, and a simulator to calculate the change of the coordinates of the center points and to obtain physical information on a moving golf ball, thereby simulating an orbit of the golf ball.
Advantageous Effects of Invention
[14] In a sensing device and sensing processing method for a moving object according to the present invention as described above and a virtual golf simulation device using the same, although a multiple exposure image of a moving object is acquired by a low resolution and low speed camera device and strobe flashlight device, it is possible to accurately extract an image of the object from the acquired image and, although several object images are present in an overlapping state as the result of acquiring a multiple exposure image of an object moving in particular at low speed, it is possible to accurately extract the overlapping object images to extract coordinates of a center point thereof. Consequently, the present invention has the effect of achieving high sensing processing performance and sensing accuracy at low cost.
Brief Description of Drawings
The above and other objects, features and other advantages of the present invention will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a block diagram showing a sensing device or a virtual golf simulation device according to an embodiment of the present invention;
FIG. 2 is a view showing an example of a screen golf system to which the sensing device or the virtual golf simulation device shown in FIG. 1 is applied;
FIG. 3 is a view showing an operational signal scheme of a camera device and a strobe flashlight device of the sensing device according to the embodiment of the present invention;
FIGS. 4(a) and 4(b) are views showing image patterns of an object acquired by the screen golf system shown in FIG. 2;
FIG. 5(a) is a view showing an example of an actually acquired image of the image having the pattern shown in FIG. 4(b) and FIG. 5(b) is a view showing a preliminarily processed image of the image shown in FIG. 5(a);
FIG. 6 is a view showing an example of a process of estimating a moving direction of a ball performed by the sensing device according to the embodiment of the present invention;
FIG. 7 is an enlarged view showing region A of FIG. 6;
FIGS. 8 to 10 are views showing another example of a process of estimating a moving direction of a ball performed by the sensing device according to the embodiment of the present invention;
FIG. 11(a) is a view showing an example of a ball image region in which ball images overlap each other and FIG. 11(b) is a graph showing the distribution of pixel values checked along an axis of the moving direction with respect to the image shown in FIG. 11(a);
FIG. 12(a) is a view showing a process of fitting the ball image region along the axis of the moving direction and FIG. 12(b) is a graph showing the sum of pixel values on a half circle along the axis of the moving direction;
[26] FIG. 13 is a view showing an image in which the ball image region is half circle fitted and center point candidates of the ball image region are formed by the process shown in FIGS. 12(a) and 12(b);
[27] FIGS. 14(a) and 14(b) are views showing a process of adjusting the position of the axis of the moving direction of an estimated ball so as to pass through the centers of the fitted ball image region;
[28] FIG. 15(a) is a view showing a process of adjusting the positions of the center point candidates formed in the fitted ball image region and FIG. 15(b) is a view showing the center points adjusted by the process shown in FIG. 15(a);
[29] FIG. 16 is a view showing a process of correcting the center points of the ball images so as to correspond to an interval rate of strobe flashlight; and
[30] FIGS. 17 and 18 are flow charts showing processing steps of a sensing processing method according to the present invention.
Best Mode for Carrying out the Invention
[31] Now, exemplary embodiments of a sensing device and sensing processing method for a moving object according to the present invention and a virtual golf simulation device using the same will be described in detail with reference to the accompanying drawings.
[32] A sensing device according to the present invention acquires and analyzes an image of a moving object to obtain physical information thereof. Here, the moving object may be a ball hit by a golfer in sports, such as golf, using a ball. The sensing device is used to sense such a moving ball. In an example, the sensing device may be applied to a so-called screen golf system to which a virtual golf simulation device is applied.
[33] FIGS. 1 and 2 schematically show the construction of a sensing device according to the present invention and a virtual golf simulation device using the same.
[34] First, a sensing device according to an embodiment of the present invention and a virtual golf simulation device using the same will be described with reference to FIGS. 1 and 2.
[35] As shown in FIG. 1, the sensing device for a moving object according to the embodiment of the present invention includes a sensor unit including camera devices 310 and 320, a strobe flashlight device 330 and a signal generating unit 210 and a sensing processing unit 220 to process an image acquired by the sensor unit and to process an image of the moving object, thereby extracting coordinates of a center point thereof.
[36] In FIG. 1, two camera devices 310 and 320 are provided, to which, however, the present invention is not limited. For example, a single camera device or more than two camera devices may be provided.
[37] The camera devices 310 and 320 are provided to respectively capture a plurality of frames on the movement of an object moving from an initial position in a moving direction of the object.
[38] The strobe flashlight device 330 is a lighting device using light emitting diodes
(LED). The strobe flashlight device 330 is used as a light source for capturing of the camera devices. The strobe flashlight device 330 generates strobe flashlight at a predetermined time interval (that is, a plurality of flashes is generated at a predetermined time interval) so that the camera devices 310 and 320 can capture a multiple exposure image.
[39] That is, an object is present on a frame captured by the camera devices in correspondence to the number of flashes generated from the strobe flashlight device 330 to provide a multiple exposure image thereof.
[40] The operation of the camera devices 310 and 320 and the strobe flashlight device 330 will be described below in detail.
[41] Meanwhile, a trigger signal to operate the camera devices 310 and 320 and the strobe flashlight device 330 is generated by the signal generating unit 210. The multiple exposure image captured by the camera devices 310 and 320 and the strobe flashlight device 330 is processed by the sensing processing unit 220 and is transmitted to a simulator 100.
[42] The sensing processing unit 220 includes a first processing means 230 to extract an object image region, in which images of a moving object overlap each other by multiple exposure, from the multiple exposure image and a second processing means 240 to form and adjust at least two center point candidates of the extracted object image region to extract at least two center points of the object image region.
[43] Preferably, the first processing means 230 includes a preliminary processing means
231 to remove a background image and noise, a moving direction estimating means
232 to estimate a moving direction of the object in the multiple exposure image and an object extracting means 233 to extract an object image region, in which object images overlap each other, from the image in the estimated moving direction.
[44] The moving direction estimating means 232 may include a first moving direction estimating means to estimate a moving direction of the object based on initial position information of the object and a second moving direction estimating means to estimate a moving direction of the object based on pixel values of the respective image regions in the multiple exposure image. The moving direction estimating means 232 may include either the first moving direction estimating means or the second moving direction estimating means. Alternatively, the moving direction estimating means 232 may include both the first moving direction estimating means and the second moving direction estimating means. In a case in which the moving direction estimating means includes both the first moving direction estimating means and the second moving direction estimating means, the first moving direction estimating means is used first to estimate the moving direction of the object, and, if the moving direction of the object estimated by the first moving direction estimating means is not correct, the second moving direction estimating means is used to estimate the moving direction of the object.
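This two-stage fallback can be sketched as follows (illustrative only; the estimator interfaces are assumptions, with `None` meaning the first result was judged incorrect):

```python
def estimate_moving_direction(image, ball_pos, first_estimator, second_estimator):
    # use the first moving direction estimating means; if its result is
    # judged incorrect (returned as None), fall back to the second means
    axis = first_estimator(image, ball_pos)
    if axis is None:
        axis = second_estimator(image, ball_pos)
    return axis
```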
[45] The object extracting means 233 may be configured to include a pattern analyzing means to analyze patterns of the images in the estimated moving direction to effectively extract the object image region in which the object images overlap each other or a means to check pixel values of the respective images in the estimated moving direction and to analyze patterns based on the checked pixel values to extract the object image region.
[46] On the other hand, the second processing means 240 preferably includes a fitting means 241 to fit the extracted object image region based on a value previously set as the size of the object to form at least two center point candidates of the fitted object image region and a coordinate extracting means 242 to adjust the positions of the formed center point candidates according to predetermined conditions to extract at least two center points of the object image region.
[47] The first processing means 230 and the second processing means 240 will be
described below in more detail.
[48] Meanwhile, the simulator 100 preferably includes a controller M, a database 110, an image processing unit 120 and an image output unit 130.
[49] The controller M receives coordinate information of an object which has been image processed and acquired by the sensing processing unit 220, converts the coordinate information into three-dimensional coordinates, obtains predetermined physical information for moving orbit simulation of the object and transmits the obtained physical information to the image processing unit 120.
[50] Predetermined data for moving orbit simulation of the object may be extracted from the database 110, and the image processing of the moving orbit simulation of the object by the image processing unit 120 may be performed by extracting image data stored in the database 110.
[51] Meanwhile, a converting means to convert coordinate information of the object
transmitted from the sensing processing unit 220 into three-dimensional coordinates may be provided separately from the controller M.
[52] In a hardware aspect, the first processing means 230 and the second processing means 240 may be realized as a single controller configured to perform the functions of the above means or as separate controllers configured to respectively perform those functions. In a software aspect, the first processing means 230 and the second processing means 240 may be realized as a single program configured to perform the functions of the above means or as separate programs configured to respectively perform those functions.
[53] An example of a screen golf system, to which the sensing device or the virtual golf simulation device with the above-stated construction is applied, is shown in FIG. 2.
[54] As shown in FIG. 2, a hitting mat 30 having a golf tee, on which a golf ball is placed, and a ground (preferably including at least one selected from among a fairway mat, a rough mat and a bunker mat) and a swing plate 20, on which a golfer hits a golf ball, are provided at one side of a golf booth B.
[55] A screen 40, on which a virtual golf simulation image realized by the image output unit 130 is displayed, is provided at the front of the golf booth B. The camera devices
310 and 320 and the strobe flashlight device 330 are provided at the ceiling of the golf booth B.
[56] In FIG. 2, the camera devices 310 and 320 are provided at the ceiling and wall of the golf booth B, respectively, to which, however, the present invention is not limited. The camera devices 310 and 320 can be installed at any positions of the golf booth B so long as the camera devices 310 and 320 do not disturb the swing of the golfer and are prevented from colliding with a golf ball hit by the golfer while the camera devices 310 and 320 can effectively capture an image of a moving state of the golf ball 10.
[57] Also, in FIG. 2, the strobe flashlight device 330 is installed at the ceiling of the golf booth B to provide strobe flashlight substantially perpendicular to a hitting point, to which, however, the present invention is not limited. The strobe flashlight device 330 can be installed at any position of the golf booth B so long as the strobe flashlight device 330 can effectively provide strobe flashlight.
[58] When the golfer on the swing plate 20 hits a golf ball 10 on the hitting mat 30 toward the screen 40 in the system with the above-stated construction, as shown in FIG. 2, the camera devices 310 and 320 capturing predetermined regions in which the hitting is performed capture a plurality of frames, respectively. At this time, the strobe flashlight device 330 generates several flashes per frame to acquire a multiple exposure image of the moving golf ball.
[59] FIG. 3 is a view showing a trigger signal generation scheme of a camera device and a strobe flashlight device by the signal generating unit 210 (see FIG. 1).
[60] As shown in FIG. 3, trigger signals of the camera device have a time interval of tc. That is, trigger signals are generated at a time interval of tc for each frame. The exposure time for each frame is te (preferably, tc > te). Also, data on an image acquired for a time of tc - te is preferably transmitted to the sensing processing unit 220 (see FIG. 1).
[61] That is, as shown in FIG. 3, the camera device is triggered at an interval between the trigger signal generating times for each frame, and data on the acquired multiple exposure image is transmitted to the sensing processing unit.
[62] During the exposure time (time of te) of the camera device, strobe flashlight is generated by the strobe flashlight device several times. In FIG. 3, strobe flashlight trigger signals are generated three times at a time interval of ts1.
[63] That is, during the exposure time (time of te) of the camera device, strobe flashlight is generated three times at the same time interval of ts1. At this time, the camera device and the strobe flashlight device are preferably synchronized with the first trigger signal generated by the signal generating unit 210 (see FIG. 1).
[64] Also, as shown in FIG. 3, a time interval of ts2 is preferably provided between the trigger signal of the last of the three strobe flashlights and the first strobe flashlight of the next frame. The time intervals ts1 and ts2 may be set to be equal to each other. As shown in FIG. 3, however, the time intervals ts1 and ts2 are preferably set to be different from each other. More preferably, the time interval ts2 is set to be longer than the time interval ts1.
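The timing relationship above can be sketched numerically. This is a minimal illustration, not the device's actual signal generator; the frame rate and strobe frequency are the example values (75 fps, 330 Hz) mentioned later in this document, and the function name is hypothetical.

```python
# Illustrative sketch only: flash timestamps when each frame's trigger fires
# at k*tc and the three flashes within a frame are ts1 apart.
def strobe_times(num_frames, tc, ts1, flashes_per_frame=3):
    return [k * tc + i * ts1
            for k in range(num_frames)
            for i in range(flashes_per_frame)]

tc, ts1 = 1 / 75, 1 / 330          # example: 75 fps camera, 330 Hz strobe
times = strobe_times(2, tc, ts1)
ts2 = times[3] - times[2]          # gap from the last flash of frame 0 to
                                   # the first flash of frame 1
```

With these example values, ts2 = tc - 2·ts1 (about 7.3 ms), which is indeed longer than ts1 (about 3.0 ms), matching the preferred relationship described above.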
[65] Since the trigger signal intervals of the camera device and the strobe flashlight device are uniform as described above, the interval between the object images on the multiple exposure image is uniform.
[66] When the multiple exposure image is processed by the sensing processing unit, therefore, it is possible to effectively separate the correct object image from the various noises present on the multiple exposure image based on the characteristic of the uniformly fixed trigger signal interval.
[67] Also, since the interval of the strobe flashlight is uniformly fixed for each frame of the camera device, images having various patterns may be generated from the multiple exposure image according to the moving speed of the object.
[68] Hereinafter, a ball (golf ball) hit by a golfer using a golf club, as an example of the moving object, will be described.
[69] FIGS. 4(a) and 4(b) are views showing two patterns of the multiple exposure image acquired by the camera device and the strobe flashlight device based on the signal generation scheme shown in FIG. 3, respectively.
[70] That is, FIGS. 4(a) and 4(b) show a moving state of a ball hit by a golfer using a golf club, using two image patterns (I1 and I2) acquired based on the trigger signal scheme shown in FIG. 3.
[71] An image I1 shown in FIG. 4(a) is acquired through multiple exposure of golf club images C1, C2 and C3 and ball images 11a, 11b and 11c. In this case, the ball images 11a, 11b and 11c are spaced apart from each other by a predetermined interval.
[72] An image I2 of FIG. 4(b) shows a case in which ball images overlap to provide an image region 12 of a predetermined size.
[73] That is, in the image shown in FIG. 4(a), the ball moves at high speed; therefore, the image is formed in a state in which the ball images are spaced apart from each other by the predetermined interval when the strobe flashlight is triggered. In the image shown in FIG. 4(b), the ball moves at low speed; therefore, the strobe flashlight is triggered before the ball has moved far, with the result that the ball images overlap each other.
[74] The sensing device and sensing processing method according to the present invention are provided to accurately acquire the coordinates of the center points of the ball images when the ball images overlap each other on the multiple exposure image as the ball moves at a speed equal to or less than a predetermined speed relative to the fixed interval of the strobe flashlight, as shown in FIG. 4(b).
[75] Hereinafter, processing of the overlapping ball images as shown in FIG. 4(b) by the sensing device for the moving object according to the present invention will be described with reference to FIGS. 5 to 18.
[76] FIG. 5(a) is a view showing the original copy of a multiple exposure image including overlapping ball images. The multiple exposure image is acquired using a camera device having a low resolution of 320 x 240 and operating at a low speed of 75 fps and a strobe flashlight device operating at 330 Hz.
[77] The preliminary processing means 231 (see FIG. 1) of the first processing means removes a stationary image, i.e. a background image, from the original image shown in FIG. 5(a) through subtraction and performs a predetermined preliminary processing operation such as Gaussian blur. An image acquired through the preliminary processing operation is shown in FIG. 5(b).
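The preliminary processing step can be sketched as follows. This is a hedged illustration using plain NumPy; the helper names are hypothetical, and a simple box blur stands in for the Gaussian blur mentioned above.

```python
import numpy as np

def remove_background(frame, background):
    """Subtract the stationary background image so that only the
    moving-object pixels remain (negative differences are clipped)."""
    diff = frame.astype(np.int16) - background.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)

def box_blur(img, k=3):
    """Mean filter used here as a simple stand-in for Gaussian blur."""
    pad = k // 2
    padded = np.pad(img.astype(np.float64), pad, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return (out / (k * k)).astype(np.uint8)

bg = np.zeros((10, 10), np.uint8)
fr = bg.copy()
fr[4:6, 4:6] = 200                 # a bright blob standing in for the ball
fg = box_blur(remove_background(fr, bg))
```

After subtraction the stationary background is zero everywhere, so only the blurred blob survives in `fg`.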
[78] However, since the ball images overlap each other and the resolution of the camera device is too low to clearly confirm that the ball images overlap, the overlapping ball images are displayed as an image region having a predetermined size. Such a region may be confused with various noises, such as an image region displaying part of the golfer's body or an image region displaying a golf club. For this reason, it is very difficult to extract only the image region in which the ball images overlap while such noises are present.
[79] A method of extracting such overlapping ball images is to estimate the moving direction of the ball using the moving direction estimating means 232 (see FIG. 1) of the sensing device according to the present invention, to primarily extract images in the estimated moving direction, and to extract a ball image region from the extracted images.
[80] FIGS. 6 to 10 are views showing a process of estimating a moving direction of a ball performed by the sensing device and sensing processing method according to the embodiment of the present invention. First, a process of estimating a moving direction of a ball performed by the first moving direction estimating means will be described with reference to FIGS. 6 and 7.
[81] First, lines L are formed so as to pass through all image regions R present in an image acquired by properly removing a background image and noise from the initially acquired multiple exposure image containing all frames.
[82] That is, as shown in FIG. 6, a line connecting at least two points of the image regions
R is formed.
[83] At this time, information on the initial position of the ball must be set in advance. That is, the camera device senses the initial position of the ball before the ball is moved, and the initial position B1 of the ball in the multiple exposure image is set based on the information on the sensed initial position of the ball.
[84] Distances between all of the lines L formed as described above and the initial position B1 of the ball are measured.
[85] That is, referring to FIG. 7, which is an enlarged view of region A of FIG. 6, straight distances are measured between the initial position B1 of the ball and a plurality of lines passing through a region R1 and between B1 and a plurality of lines passing through a region R2.
[86] In FIG. 7, a line L1 passing through the region R1 and a line L2 passing through the region R2 are nearest the initial position B1 of the ball. The line L1 is nearer the initial position B1 of the ball than the line L2. Consequently, the line L1 is tentatively estimated as an axis of the moving direction of the ball.
[87] On the assumption that the straight distance between the initial position B1 of the ball and the line L1 is DL, the moving direction estimated using the above method may be incorrect if DL is so long that it deviates from a predetermined range. In that case, the result estimated by the first moving direction estimating means is ignored, and the moving direction is preferably estimated by the second moving direction estimating means.
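The first estimating means can be sketched as below, assuming the candidate lines are given as pairs of points on the image regions; `max_dist` plays the role of the predetermined range, and all names are illustrative assumptions.

```python
import math

def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    num = abs((by - ay) * px - (bx - ax) * py + bx * ay - by * ax)
    return num / math.hypot(bx - ax, by - ay)

def estimate_axis(b1, lines, max_dist):
    """Pick the line nearest the ball's initial position b1; reject the
    estimate (return None) if even that line is farther than max_dist."""
    best = min(lines, key=lambda ab: point_line_distance(b1, *ab))
    if point_line_distance(b1, *best) > max_dist:
        return None    # fall back to the second moving direction estimating means
    return best
```

For example, with b1 = (0, 0) and two candidate lines, the nearer line is accepted as the tentative axis when its distance DL is within `max_dist` and rejected otherwise, triggering the second estimator.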
[88] FIGS. 8 to 10 are views showing a process of estimating the moving direction of the ball performed by the second moving direction estimating means.
[89] First, as shown in FIG. 8, a plurality of frames is combined into an image, in which ball image regions appear as time passes.
[90] That is, in FIG. 8, ball image regions 12-1, 12-2 and 12-3 of the respective frames are present in a single image.
[91] Also, the moving direction is estimated by the second moving direction estimating means based on the single image as shown in FIG. 8, which will be described with reference to FIG. 9.
[92] Preferably, the second moving direction estimating means includes an image combining means to combine at least two frames of the multiple exposure images into a single image, a line scanning means to perform line scanning at a predetermined angle about the initial position of the object and to check pixel values of the images on the respective scanned lines, and an axis extracting means to extract, as an axis of the moving direction, the scanned line having the largest pixel value among the lines scanned by the line scanning means.
[93] That is, a plurality of frames may be combined into a single image by the image combining means of the second moving direction estimating means as shown in FIG. 8, and the axis of the moving direction of the ball may be extracted by the line scanning means and the axis extracting means as shown in FIG. 9.
[94] As shown in FIG. 9, a predetermined angle scanning region SR is designated based on the initial position B1 of the ball, and pixel values of images are checked on respective scanned lines X1, X2, X3 ... Xi ... Xn in the corresponding scanning region SR based on the initial position B1 of the ball.
[95] At this time, scanning region lines s1 and s2 forming the scanning region SR are preferably formed to define a predetermined angle therebetween. The scanning region lines s1 and s2 may be set to define 360 degrees therebetween. Preferably, as shown in FIG. 9, the scanning region lines s1 and s2 are set to define a front 120 degrees therebetween.
[96] The pixel values of the images on the respective scanned lines X1, X2, X3 ... Xi ... Xn may be checked as shown in FIG. 10.
[97] As shown in FIG. 10, a first sub scanned line UL and a second sub scanned line LL are formed at one side and the other side of the scanned line Xi based on a value previously set as a radius Rs of the ball (including a value set based on a previously known radius of the ball and a value set based on a radius of the ball previously measured from the image of the ball), and the sum Sum 1 of the pixel values of the image on the scanned line Xi, the sum Sum 2 of the pixel values of the image on the first sub scanned line UL and the sum Sum 3 of the pixel values of the image on the second sub scanned line LL are calculated.
[98] Subsequently, the following cost function value is obtained.
[99] v = Sum 1 - Sum 2 - Sum 3
[100] The above cost function value is obtained for each of the scanned lines, and the
scanned line having the maximum value is estimated as the axis of the moving direction of the ball.
[101] In this way, by obtaining the cost function value derived from the respective scanned lines and the sub scanned lines based on the radius value of the ball, it is possible to accurately include the ball image region while effectively excluding images which are too small or too large to be the ball image region, and therefore, it is possible to accurately extract the axis of the moving direction passing through the ball image region.
[102] In FIG. 9, therefore, the scanned line Xi may be estimated as the axis of the moving direction of the ball.
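The scan over candidate axes and the cost v = Sum 1 - Sum 2 - Sum 3 can be sketched as below. The discrete sampling scheme (integer steps of length `length` from the initial position `b1`, sub scanned lines offset by the ball radius `rs` along the normal) and all names are illustrative assumptions.

```python
import math
import numpy as np

def line_cost(img, b1, theta, rs, length):
    """Cost v = Sum 1 - Sum 2 - Sum 3 for one scanned line starting at the
    initial ball position b1 with direction angle theta; the two sub
    scanned lines are offset by the ball radius rs along the normal."""
    h, w = img.shape
    dx, dy = math.cos(theta), math.sin(theta)
    nx, ny = -dy, dx                       # unit normal to the scanned line
    sums = [0.0, 0.0, 0.0]
    for k, off in enumerate((0.0, rs, -rs)):
        for t in range(length):
            x = int(round(b1[0] + t * dx + off * nx))
            y = int(round(b1[1] + t * dy + off * ny))
            if 0 <= x < w and 0 <= y < h:
                sums[k] += img[y, x]
    return sums[0] - sums[1] - sums[2]

def best_axis(img, b1, rs, length, angles):
    """Scanned angle with the maximum cost, taken as the moving axis."""
    return max(angles, key=lambda th: line_cost(img, b1, th, rs, length))

img = np.zeros((21, 40))
img[9:12, :] = 255.0                       # bright streak: the ball's path
angles = [0.0, math.pi / 4, math.pi / 2]
axis = best_axis(img, (0, 10), rs=3, length=40, angles=angles)
```

A scanned line lying on the streak collects a large Sum 1 while its sub scanned lines fall outside the streak, so the horizontal angle wins the scan.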
[103] Meanwhile, after the axis of the moving direction of the ball is extracted using the above method, a ball image region, in which ball images overlap each other, is extracted by the object extracting means 233 (see FIG. 1) using the extracted axis of the moving direction of the ball.
[104] That is, as shown in FIG. 11, the pixel values are checked with respect to all of the images (which may include noise in addition to a ball image region to be extracted) through which the extracted axis of the moving direction of the ball passes to effectively remove noise and thus extract only the ball image region.
[105] FIGS. 11(a) and 11(b) show an example of a method of effectively extracting only the ball image from the images present on the axis of the moving direction.
[106] FIG. 11(a) is a view showing an image having a predetermined pattern which may be present on the axis of the moving direction and FIG. 11(b) is a graph showing the distribution of pixel values checked along an axis 13 of the moving direction with respect to the image shown in FIG. 11(a).
[107] As shown in FIG. 11(a), the axis 13 of the moving direction passes through a ball image region 12 in which ball images overlap each other and noise N1 around the ball image region 12. The distribution of pixel values of the image along the axis 13 of the moving direction has the pattern shown in FIG. 11(b).
[108] The ball image region in which ball images overlap each other must have a section in which pixel values are maintained at a predetermined critical value or more, the length of which corresponds to the sum of the diameters of at least two balls. As shown in FIG. 11(b), the sections in which the pixel values are maintained at the predetermined critical value or more include a section a1 to a2 and a section a3 to a4.
[109] If the section in which the pixel values are maintained at the predetermined critical value or more corresponds to the section a3 to a4, i.e. the image denoted by reference numeral 12 in FIG. 11(a), the section may be estimated as the ball image region in which the ball images overlap each other. On the other hand, if the section corresponds to the section a1 to a2, i.e. the image denoted by reference numeral N1 in FIG. 11(a), the section may be estimated as noise (an image of a golf club).
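The run-length test described above can be sketched on a one-dimensional profile of pixel values read along the axis of the moving direction. The helper names and the acceptance window of roughly two to three ball diameters are illustrative assumptions, not the patent's exact criterion.

```python
def runs_above(profile, thresh):
    """(start, end) index pairs of runs whose pixel values stay at or
    above the critical value thresh."""
    runs, start = [], None
    for i, v in enumerate(profile):
        if v >= thresh and start is None:
            start = i
        elif v < thresh and start is not None:
            runs.append((start, i))
            start = None
    if start is not None:
        runs.append((start, len(profile)))
    return runs

def ball_run(profile, thresh, d):
    """Accept a run as overlapping ball images when its length matches
    about two to three ball diameters d; longer runs (e.g. a club shaft)
    and shorter ones are treated as noise."""
    for s, e in runs_above(profile, thresh):
        if 2 * d <= e - s <= 3 * d:
            return (s, e)
    return None
```

For example, with a ball diameter of 5 pixels, a 40-pixel run (club-like noise) is rejected while a 12-pixel run is accepted as two overlapping ball images.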
[110] Consequently, the image N1 may be excluded, and the image denoted by reference numeral 12 may be extracted as the ball image region in which ball images overlap each other.
[111] Meanwhile, after the ball image region in which the ball images overlap each other is extracted from the images present on the axis of the moving direction as described above, a process of properly fitting the extracted image region to extract a center point thereof is performed by the fitting means 241 (see FIG. 1).
[112] As a method of fitting the ball image region in which the ball images overlap each other, a half circle fitting method to fit one half circle at one end of the ball image region 12 in which the ball images overlap each other and to fit the other half circle at the other end of the ball image region 12 in which the ball images overlap each other as shown in FIGS. 12 and 13 may be used.
[113] As shown in FIG. 12(a), the ball image region 12 has a higher brightness value than the periphery (here, if the acquired image is a grayscale image, each of the pixel values is a brightness value).
[114] The edge portion of the ball image region 12 has a gradually varying brightness value. The variation of the brightness value is terminated at the contour of the ball image region. Preferably, therefore, the contour of the ball image region is fitted.
[115] For proper fitting, therefore, as shown in FIG. 12(a), a first half circle CC1 and a second half circle CC2 are moved in opposite directions about the ball image region 12 along the axis 13 of the moving direction to calculate the sum of pixel values of the image on the first half circle CC1 and the sum of pixel values of the image on the second half circle CC2.
[116] The sum of the pixel values calculated as described above has the distribution shown in the graph of FIG. 12(b). FIG. 12(b) is a graph showing the distribution of the sum of pixel values on a half circle.
[117] In the graph of FIG. 12(b), the pixel value varies abruptly between a point V1 or a point V2, at which the pixel value starts to vary abruptly, and a point V3, at which the variation of the brightness value is completed.
[118] Preferably, a half circle curve HC1 or HC2 (see FIG. 13) is formed at the point V1 or the point V3 to fit the ball image region.
[119] That is, preferably, a specific value between the point V1 and the point V3 is previously set as a predetermined value of the sum of the pixel values, and the half circle curves HC1 and HC2 are formed at the positions at which the sum of the pixel values on the respective half circles CC1 and CC2 becomes less than the predetermined value, to fit the ball image region.
[120] Consequently, the value of the point V1 may be previously set as the predetermined value, the value of the point V2 or the point V3 may be previously set as the predetermined value, or an arbitrary value between the point V1 and the point V3 may be previously set as the predetermined value. The predetermined value may be derived through an experiment to confirm which setting gives the most accurate result.
[121] Here, the diameter of the half circles CC1 and CC2 or the half circle curves HC1 and HC2 may have a value previously set as the diameter of the ball or a value measured from the ball image as the diameter of the ball.
[122] That is, the diameter of the ball may be a value which is previously measured and input or a value sensed by the camera device at the initial position of the ball and measured and set on the image as the diameter.
[123] On the other hand, the diameter of the ball may be measured from the separate ball images as shown in FIG. 4(a) and may be used to fit the overlapping ball images.
[124] Meanwhile, in FIG. 13, any one of the points V1 to V3 is selected from the distribution of the sum of the pixel values on the half circles CC1 and CC2 along the axis 13 of the moving direction, and the half circle curves HC1 and HC2, having the predetermined diameter of the ball or the measured diameter of the ball, are formed at opposite ends of the ball image region 12 to fit the ball image region.
[125] That is, preferably, the first half circle curve HC1 is formed at one end of the ball image region 12 to fit the ball image region 12 and the second half circle curve HC2 is formed at the other end of the ball image region 12 to fit the ball image region 12.
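A rough sketch of the half circle movement and the threshold test follows. It assumes a grayscale image whose moving-direction axis lies along an image row; the arc-sampling scheme, the preset value, and the helper names are illustrative assumptions, not the patent's implementation.

```python
import math
import numpy as np

def arc_sum(img, cx, cy, rs, direction, samples=32):
    """Sum of pixel values on the half circle of radius rs centered at
    (cx, cy) and facing the given direction (+1 right, -1 left)."""
    h, w = img.shape
    total = 0.0
    for i in range(samples):
        a = -math.pi / 2 + math.pi * i / (samples - 1)   # -90 .. +90 deg
        x = int(round(cx + direction * rs * math.cos(a)))
        y = int(round(cy + rs * math.sin(a)))
        if 0 <= x < w and 0 <= y < h:
            total += img[y, x]
    return total

def fit_end(img, start_cx, cy, rs, preset, direction):
    """Move the half circle outward from the region center and return the
    first position where the arc sum drops below the preset value; the
    half circle curve is formed there to fit that end of the region."""
    cx = start_cx
    while 0 <= cx < img.shape[1]:
        if arc_sum(img, cx, cy, rs, direction) < preset:
            return cx
        cx += direction
    return None

img = np.zeros((21, 60))
img[7:14, 20:41] = 255.0   # overlapping-ball region: rows 7..13, cols 20..40
right = fit_end(img, 30, 10, 3, preset=1275.0, direction=1)
left = fit_end(img, 30, 10, 3, preset=1275.0, direction=-1)
```

Moving outward from the region center, the arc sum stays high while the half circle overlaps the bright region and collapses once it leaves it, so the two returned positions bracket the region ends.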
[126] The half circle fitting as described above may be performed by a processing means. Pixel values may be checked by a pixel checking means while the first half circle CC1 and the second half circle CC2 move to the opposite ends of the ball image region, and the half circle curves may be formed at the opposite ends of the ball image region by a fitting processing means to fit the ball image region.
[127] Also, the half circle fitting of one half circle and the half circle fitting of the other half circle may be separately performed by different processing means.
[128] That is, the movement of the first half circle CC1 and the checking of the pixel values on the first half circle CC1 may be performed by a first pixel checking means, and the movement of the second half circle CC2 and the checking of the pixel values on the second half circle CC2 may be performed by a second pixel checking means. According to the result of the pixel checking performed by the first pixel checking means, a first fitting processing means may form the first half circle curve HC1 at one end of the ball image region to fit the ball image region. According to the result of the pixel checking performed by the second pixel checking means, a second fitting processing means may form the second half circle curve HC2 at the other end of the ball image region to fit the ball image region.
[129] Meanwhile, if the half circle curves HC1 and HC2 for fitting are formed at the point V3 as shown in the graph of FIG. 12(b), center point candidates EP1 and EP2 may be formed at the point V1 or the point V2.
[130] That is, preferably, a first center point candidate EP1 is formed at the side at which the first half circle curve HC1 is formed, and a second center point candidate EP2 is formed at the side at which the second half circle curve HC2 is formed. Also preferably, the center point candidates EP1 and EP2 are formed on the axis 13 of the moving direction.
[131] Of course, the center point candidates EP1 and EP2 may be selected as arbitrary
points at the half circle fitted sides.
[132] It is necessary to adjust the center point candidates EP1 and EP2 formed as described above using the coordinate extracting means 242 (see FIG. 1) so that the center point candidates EP1 and EP2 can be located at the correct center point positions. Also preferably, the position of the axis of the moving direction is adjusted so that the axis of the moving direction can be located at a more correct position.
[133] First, adjusting the position of the axis of the moving direction will be described with reference to FIG. 14.
[134] It is preferable for the axis of the moving direction to pass through the center of the ball image region. Since the axis of the moving direction is not accurately extracted but estimated, however, the estimated axis of the moving direction may not exactly pass through the center of the ball image region.
[135] In this case, it is preferable to adjust the position of the axis of the moving direction so that the estimated axis of the moving direction can exactly pass through the center of the ball image region.
[136] As shown in FIG. 14(a), the upper and lower thicknesses of the ball image region 12 through which the axis 13 of the moving direction passes must be substantially equal to each other. Therefore, it is preferable to shift the position of the axis 13 of the moving direction by a distance corresponding to the difference between the thickness t1 of the ball image region 12 under the axis 13 of the moving direction and the thickness t2 of the ball image region 12 above the axis 13 of the moving direction, as shown in FIG. 14(a).
[137] A state in which the axis of the moving direction is shifted to a center line CL
passing through the center of the ball image region 12 using the process as described above is shown in FIG. 14(b).
[138] On the other hand, FIG. 15(a) shows a process of adjusting the center point
candidates EP1 and EP2 formed using the method shown in FIGS. 12 and 13.
[139] Since each of the fitted half circle curves HC1 and HC2 is formed in the shape of a half circle as shown in FIG. 15(a), it is possible to adjust the respective center point candidates EP1 and EP2 into correct center points based on geometrical conditions thereof.
[140] That is, three directional lines r1, r2 and r3 are formed to have the same length with respect to the first center point candidate EP1, and three directional lines r4, r5 and r6 are formed to have the same length with respect to the second center point candidate EP2, to adjust the positions of the center point candidates EP1 and EP2 so that the center point candidates EP1 and EP2 can be located at correct positions.
[141] That is, the positions of the center point candidates EP1 and EP2 are adjusted using the above method so that the center point candidates EP1 and EP2 can be located at the centers of the circles having the fitted half circle curves HC1 and HC2, respectively.
[142] At this time, the length of the three directional lines r1, r2 and r3 or r4, r5 and r6 may correspond to a value previously set as the diameter of the ball or a diameter value measured through the ball image.
[143] As shown in FIG. 15(b), coordinates of two center points CP1 and CP2 of the ball image region 12 are extracted by the above process.
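Geometrically, placing a candidate at the center of the circle carrying its fitted half circle curve is equivalent to finding the circumcenter of points on that curve. The sketch below illustrates this geometric condition with the standard three-point circumcenter formula; it is not code from the patent.

```python
def circumcenter(p1, p2, p3):
    """Center of the circle through three non-collinear points."""
    ax, ay = p1
    bx, by = p2
    cx, cy = p3
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax * ax + ay * ay) * (by - cy) + (bx * bx + by * by) * (cy - ay)
          + (cx * cx + cy * cy) * (ay - by)) / d
    uy = ((ax * ax + ay * ay) * (cx - bx) + (bx * bx + by * by) * (ax - cx)
          + (cx * cx + cy * cy) * (bx - ax)) / d
    return ux, uy

# three points on a fitted half circle curve of a ball of radius 5 whose
# (to-be-recovered) center is (2, 3)
center = circumcenter((7, 3), (2, 8), (-3, 3))
```

The recovered center (2, 3) is where the adjusted center point is placed; the three equal-length directional lines in the text then all have the ball-radius length.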
[144] Meanwhile, the image region extracted by the above process may contain images different from the image region of the actual ball.
[145] In order to prevent such an error, as shown in FIG. 16, it is possible to determine whether the extracted center points are correct based on a strobe flashlight cycle per frame and, if the center points have been incorrectly extracted, to correct them.
[146] That is, the strobe flashlight time interval per frame is ts1, and the time interval between the last strobe flashlight of one frame and the first strobe flashlight of the next frame is ts2, as shown in FIG. 3, and the time intervals ts1 and ts2 must correspond, at a predetermined ratio, to the distance intervals between the ball image regions 12-1, 12-2 and 12-3 of the respective frames of FIG. 11.
[147] That is, a ratio of ts2 to 2ts1 must correspond to a ratio of b to a or a ratio of d to c.
[148] If the interval between the extracted center points does not correspond to the strobe flashlight cycle, it is possible to correct the extracted center points so that the interval between the extracted center points can correspond to the strobe flashlight cycle.
[149] For example, if the center points are not located at positions at which the center points must be located according to the strobe flashlight cycle, the center points can be corrected so that the center points can be located at the corresponding positions. On the other hand, if the center points are located at positions at which the center points must not be located according to the strobe flashlight cycle, the center points can be removed.
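The cycle check can be sketched as a constancy-of-speed test: over a few flashes the ball speed is nearly constant, so successive center-point spacings divided by the corresponding strobe intervals (ts1 within a frame, ts2 across frames) should agree. The tolerance value and names are illustrative assumptions.

```python
def intervals_consistent(xs, dts, tol=0.1):
    """xs: center-point coordinates along the moving axis in flash order;
    dts: strobe time gaps between successive flashes (ts1 or ts2).
    Returns True when all implied speeds agree within tol."""
    speeds = [(xs[i + 1] - xs[i]) / dts[i] for i in range(len(xs) - 1)]
    mean = sum(speeds) / len(speeds)
    return all(abs(s - mean) <= tol * abs(mean) for s in speeds)

# flashes ts1 apart inside a frame, ts2 across the frame boundary
dts = [1.0, 1.0, 2.0]                  # illustrative ts1 = 1, ts2 = 2
good = [0.0, 5.0, 10.0, 20.0]          # constant speed: consistent
bad = [0.0, 5.0, 12.0, 20.0]           # one misdetected center point
```

A center point that fails this test can then be repositioned to the location implied by the strobe cycle, or removed, as described above.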
[150] After the center points are extracted and corrected as described above, it is possible to confirm the coordinates of the final center points of the ball image region in which the ball images overlap each other.
[151] Therefore, even when the low quality image acquired by the low resolution, low speed camera devices and the strobe flashlight device is used, with the result that the ball images overlap each other and an image region having a predetermined size is present, it is possible to extract the coordinates of the center points of the moving ball very accurately.
[152] The coordinates of the center points of the balls finally confirmed as described above are converted into three-dimensional coordinates by the converting means, and physical information of the moving orbit of the ball to be simulated is obtained based on the three-dimensional coordinates.
[153] Hereinafter, embodiments of a sensing processing method according to the present invention will be described with reference to FIG. 17.
[154] As shown in FIG. 17, first, a trigger signal is applied by the signal generating unit so that the camera device and the strobe flashlight device acquire a multiple exposure image of the moving state of an object at a predetermined interval (S10).
[155] At this time, the initial position of a ball is sensed and recognized (S20). Various preliminary processing steps are performed to remove a background image and various noises from the acquired multiple exposure image (S30).
[156] A plurality of lines passing through the image regions in the preliminarily processed image is formed (S40), and the line nearest the initial position of the ball is extracted (S50) to estimate an axis of a moving direction of the ball.
[157] At this time, when the distance between the initial position of the ball and the line nearest the initial position of the ball is less than a predetermined distance, the line is extracted as the axis of the moving direction (S61).
[158] When the distance between the initial position of the ball and the line nearest the initial position of the ball is greater than the predetermined distance, it is determined that the axis of the moving direction estimated at Steps S40 and S50 is incorrect, and a second moving direction estimating process is carried out.
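The two-stage estimation of paragraphs [157] and [158] can be sketched as follows; `point_line_distance`, the `(a, b, c)` line representation, and the `fallback` callable are hypothetical names used for illustration, not the patent's implementation:

```python
import math

def point_line_distance(p, line):
    """Distance from point p = (x, y) to a line given as (a, b, c) with ax + by + c = 0."""
    a, b, c = line
    return abs(a * p[0] + b * p[1] + c) / math.hypot(a, b)

def estimate_axis(lines, ball_pos, threshold, fallback):
    """First estimator: pick the candidate line nearest the ball's initial position.

    If even the nearest line is farther than `threshold`, the first estimate is
    judged incorrect and the scan-based second estimator (`fallback`) is used.
    """
    nearest = min(lines, key=lambda ln: point_line_distance(ball_pos, ln))
    if point_line_distance(ball_pos, nearest) < threshold:
        return nearest
    return fallback()
```

For example, with a line passing one pixel from the ball the first estimator wins; with all lines far away, control passes to the second estimator.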
[159] That is, two or more frames of the preliminarily processed image are combined (S62), and line scanning is performed at a predetermined angle about the initial position of the ball (S63).
[160] At this time, cost function values (see FIG. 10) are obtained with respect to pixel values of the image regions on the respective scanned lines (S64), and the scanned line having the maximum cost function value is extracted as the axis of the moving direction (S65).
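A minimal sketch of Steps S63 to S65, assuming the cost of a scanned line is its on-line pixel sum minus the sums on two parallel lines offset to either side (consistent with FIG. 10 and claim 19); the sampling scheme and parameter names here are illustrative assumptions:

```python
import numpy as np

def line_pixel_sum(img, x0, y0, angle, length, offset=0):
    """Sum image pixels sampled along a ray from (x0, y0) at `angle`,
    shifted `offset` pixels perpendicular to the ray."""
    h, w = img.shape
    dx, dy = np.cos(angle), np.sin(angle)
    ox, oy = -dy * offset, dx * offset  # perpendicular shift
    total = 0.0
    for t in range(length):
        x = int(round(x0 + dx * t + ox))
        y = int(round(y0 + dy * t + oy))
        if 0 <= x < w and 0 <= y < h:
            total += img[y, x]
    return total

def best_axis(img, ball_pos, angles, length=60, gap=4):
    """Return the scan angle maximizing (on-line sum) - (both side-line sums)."""
    x0, y0 = ball_pos
    def cost(a):
        return (line_pixel_sum(img, x0, y0, a, length)
                - line_pixel_sum(img, x0, y0, a, length, +gap)
                - line_pixel_sum(img, x0, y0, a, length, -gap))
    return max(angles, key=cost)
```

A bright horizontal streak of ball images through the initial position then yields angle 0 as the winning axis, since only that scan line lies on the streak while its side lines stay dark.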
[161] The image of the ball is extracted and analyzed from the image region on the axis of the moving direction extracted as described above to finally obtain the coordinates of a center point of the ball (S70).
[162] Hereinafter, Step S70 of FIG. 17 will be described in more detail with reference to FIG. 18.
[163] After the axis of the moving direction of the ball is extracted by the moving direction estimating means as described above, a ball image region is extracted from the image through which the extracted axis passes, and the sum of pixel values on half circles is calculated while the respective half circles, each having a predetermined or measured diameter, move from the center of the extracted ball image region to the opposite ends of the ball image region along the axis of the moving direction (S71).
[164] Subsequently, half circle curves are formed at positions at which the calculated sum of the pixel values is less than a predetermined value to fit the ball image region (S72).
[165] At least two center point candidates of the ball image region are formed on the axis of the moving direction passing through the ball image region (S73), and the formed center point candidates are adjusted so that they are located at the centers of the circles defined by the fitted half circle curves, respectively (S74).
[166] Also, the positions of the center points adjusted as described above are corrected so as to correspond to the strobe flashlight cycle (S75) and the coordinates of two center points of the ball image region are finally obtained (S76).
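Steps S71 and S72 can be sketched as follows: a half circle of the ball diameter slides from the region center toward one end along the moving-direction axis, and the end is fixed where the arc's pixel sum falls to a threshold, i.e. where the arc just leaves the bright overlapped-ball blob. The function names and the threshold are assumptions, not the patent's code:

```python
import numpy as np

def half_circle_sum(img, cx, cy, r, direction, n=32):
    """Sum pixels on the half circle of radius r facing `direction`
    (+1: leading half, -1: trailing half) along the moving axis (x)."""
    total = 0.0
    for theta in np.linspace(-np.pi / 2, np.pi / 2, n):
        x = int(round(cx + direction * r * np.cos(theta)))
        y = int(round(cy + r * np.sin(theta)))
        if 0 <= y < img.shape[0] and 0 <= x < img.shape[1]:
            total += img[y, x]
    return total

def fit_end_center(img, region_cx, cy, r, direction, limit, thresh):
    """Slide the half circle from the region center toward one end and
    return the first x where the arc sum is `thresh` or below."""
    for cx in range(int(region_cx), int(region_cx) + direction * limit, direction):
        if half_circle_sum(img, cx, cy, r, direction) <= thresh:
            return cx
    return None
```

Running this in both directions gives the two half-circle positions whose circle centers serve as the center point candidates of Step S73.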
[167] The change of the coordinates of the center points acquired as described above in a three-dimensional space is calculated to obtain physical property information of the moving ball.
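The final conversion from center-point coordinates to physical information can be illustrated with simple finite differences; the patent gives no formulas here, so the quantities and names below (speed, elevation, azimuth) are a plausible sketch under a constant-velocity-between-flashes assumption:

```python
import math

def ball_physics(p1, p2, dt):
    """Estimate speed and launch angles from two 3-D ball center points.

    p1, p2: (x, y, z) positions of consecutive ball centers (meters)
    dt:     strobe flash interval between the two exposures (seconds)
    """
    vx, vy, vz = ((b - a) / dt for a, b in zip(p1, p2))
    speed = math.sqrt(vx**2 + vy**2 + vz**2)
    launch_deg = math.degrees(math.atan2(vz, math.hypot(vx, vy)))  # elevation
    azimuth_deg = math.degrees(math.atan2(vy, vx))                 # left/right deviation
    return speed, launch_deg, azimuth_deg

# e.g. two centers captured 2 ms apart
print(ball_physics((0, 0, 0), (0.06, 0.0, 0.02), 0.002))
```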
Mode for the Invention
[168] Various embodiments of a sensing device, a sensing processing method, and a virtual golf simulation device using the same have been described in the best mode for carrying out the invention.
Industrial Applicability
[169] In a sensing device and sensing processing method for a moving object according to the present invention as described above, and in a virtual golf simulation device using the same, an image of the object can be accurately extracted even when a multiple exposure image of the moving object is acquired by a low resolution, low speed camera device and strobe flashlight device. Furthermore, even when several object images overlap as the result of acquiring a multiple exposure image of an object moving at low speed, the overlapping object images can be accurately separated and the coordinates of their center points extracted, thereby achieving high sensing processing performance and sensing accuracy at low cost. Consequently, the present invention can be widely used in industries related to sensing devices and sensing processing methods for moving objects and virtual golf simulation devices using the same.

Claims
[Claim 1] A sensing device comprising:
a sensor unit to acquire a multiple exposure image with respect to a moving object under strobe flashlight; and
a sensing processing unit comprising a first processing means to extract an object image region, in which images of the moving object overlap each other by multiple exposure, from the multiple exposure image, and a second processing means to form at least two center point candidates of the extracted object image region and to adjust the positions of the center point candidates to extract coordinates of at least two center points of the object image region.
[Claim 2] The sensing device according to claim 1, wherein the first processing means comprises:
a moving direction estimating means to estimate a moving direction of the object in the multiple exposure image; and
an object extracting means to extract the object image region, in which the object images overlap each other, from the images in the estimated moving direction.
[Claim 3] The sensing device according to claim 1, wherein the second processing means comprises:
a fitting means to fit the extracted object image region to form at least two center point candidates of the fitted object image region; and
a coordinate extracting means to adjust the positions of the formed center point candidates according to predetermined conditions to extract coordinates of at least two center points of the object image region.
[Claim 4] The sensing device according to claim 1, further comprising a correcting means to correct the extracted coordinates of the center points so that the extracted coordinates of the center points correspond to a strobe flashlight cycle.
[Claim 5] A sensing device comprising:
a sensor unit to acquire a multiple exposure image with respect to a moving object under strobe flashlight; and
a sensing processing unit comprising a first moving direction estimating means to estimate a moving direction of the object based on initial position information of the object, a second moving direction estimating means to estimate a moving direction of the object based on pixel values of the respective image regions in the multiple exposure image, and an object extracting means to extract an image of the object from the images in the moving direction estimated by the first moving direction estimating means or the second moving direction estimating means.
[Claim 6] The sensing device according to claim 5, wherein the first moving direction estimating means comprises:
a line forming means to form lines connecting at least two points of the respective image regions in the multiple exposure image; and
a line extracting means to extract one of the lines nearest the initial position of the object as an axis of the moving direction of the object.
[Claim 7] The sensing device according to claim 5, wherein the second moving direction estimating means comprises:
an image combining means to combine at least two frames of the multiple exposure images into a single image;
a line scanning means to perform line scanning at a predetermined angle about the initial position of the object to check pixel values of images on the respective scanned lines; and
an axis extracting means to extract the scanned line having the largest pixel value among the lines scanned by the line scanning means as an axis of the moving direction of the object.
[Claim 8] The sensing device according to claim 5, wherein the object extracting means is configured to extract the image of the object based on the moving direction of the object estimated by the second moving direction estimating means if a distance between an axis of the moving direction of the object estimated by the first moving direction estimating means and the initial position of the object is greater than a predetermined value.
[Claim 9] A sensing device comprising:
a sensor unit to acquire a multiple exposure image with respect to a moving ball under strobe flashlight; and
a sensing processing unit comprising an object extracting means to extract a ball image region, in which images of the moving ball overlap each other, from the multiple exposure image, a fitting means to fit the ball image region so that half circle curves of a predetermined diameter are formed at opposite ends of the extracted ball image region, respectively, and a coordinate extracting means to extract coordinates of at least two center points of the fitted ball image region.
[Claim 10] The sensing device according to claim 9, wherein the fitting means comprises:
a pixel checking means to check pixel values on half circles of a predetermined diameter while moving from the center of the ball image region, in which the images of the moving ball overlap each other, to the opposite ends of the ball image region; and
a fitting processing means to fit the ball image region so that the half circle curves of the predetermined diameter are formed at the opposite ends of the ball image region based on the result of the pixel checking.
[Claim 11] The sensing device according to claim 9, wherein the fitting means comprises:
a first half circle fitting means to fit the ball image region so that a first half circle curve of the predetermined diameter is formed at one end of the ball image region, in which the images of the moving ball overlap each other; and
a second half circle fitting means to fit the ball image region so that a second half circle curve of the predetermined diameter is formed at the other end of the ball image region, in which the images of the moving ball overlap each other.
[Claim 12] The sensing device according to claim 11, wherein
the first half circle fitting means comprises a first pixel checking means to calculate the sum of pixel values on the first half circle of the predetermined diameter while moving from the center of the ball image region, in which the images of the moving ball overlap each other, to one end of the ball image region and a first fitting processing means to fit the ball image region so that the first half circle curve is formed at a position at which the calculated sum of the pixel values is equal to or less than a predetermined value, and
the second half circle fitting means comprises a second pixel checking means to calculate the sum of pixel values on the second half circle of the predetermined diameter while moving from the center of the ball image region, in which the images of the moving ball overlap each other, to the other end of the ball image region and a second fitting processing means to fit the ball image region so that the second half circle curve is formed at a position at which the calculated sum of the pixel values is equal to or less than a predetermined value.
[Claim 13] A sensing processing method comprising:
acquiring a multiple exposure image with respect to a moving object under strobe flashlight;
extracting an object image region, in which images of the moving object overlap each other by multiple exposure, from the multiple exposure image;
forming at least two center point candidates of the extracted object image region; and
adjusting the positions of the center point candidates according to predetermined conditions to extract coordinates of at least two center points of the object image region.
[Claim 14] The sensing processing method according to claim 13, wherein the step of extracting the object image region comprises:
estimating a moving direction of the object in the multiple exposure image;
extracting images in the estimated moving direction from the multiple exposure image; and
extracting the object image region from the respective images in the estimated moving direction.
[Claim 15] The sensing processing method according to claim 13, wherein the step of forming the center point candidates comprises:
fitting the object image region so that half circle curves of a predetermined diameter are formed at opposite ends of the extracted object image region; and
forming at least two center points of the fitted object image region as the center point candidates.
[Claim 16] A sensing processing method comprising:
acquiring plural frames of multiple exposure images with respect to a moving object under strobe flashlight;
recognizing an initial position of the object in the multiple exposure image;
forming lines passing through image regions in the multiple exposure image;
estimating a direction of one of the lines nearest the initial position of the object as a moving direction of the object; and
extracting and analyzing an image of the object from images in the estimated moving direction.
[Claim 17] The sensing processing method according to claim 16, wherein the step of estimating the direction of one of the lines nearest the initial position of the object as the moving direction of the object comprises:
combining at least two frames of the multiple exposure images into a single image if a distance between the line nearest the initial position of the object and the initial position of the object is greater than a predetermined value;
performing line scanning at a predetermined angle about the initial position of the object to check the sum of pixel values of images on the respective scanned lines; and
extracting a direction of one of the scanned lines having the largest sum of the pixel values as the moving direction of the object.
[Claim 18] A sensing processing method comprising:
acquiring plural frames of multiple exposure images with respect to a moving object under strobe flashlight;
recognizing an initial position of the object in the multiple exposure image;
combining at least two frames of the multiple exposure images into a single image;
performing line scanning at a predetermined angle about the initial position of the object with respect to the combined image to obtain a predetermined function value based on pixel values of images on the respective scanned lines;
extracting a direction of one of the scanned lines having the maximum function value as a moving direction of the object; and
extracting and analyzing an image of the object from images in the estimated moving direction.
[Claim 19] The sensing processing method according to claim 18, wherein the step of obtaining the predetermined function value comprises:
calculating a first value of the sum of the pixel values of the image on each of the scanned lines;
calculating a second value of the sum of the pixel values of the image on a line formed at one side of the scanned line at a predetermined interval;
calculating a third value of the sum of the pixel values of the image on a line formed at the other side of the scanned line at a predetermined interval; and
subtracting the second value and the third value from the first value.
[Claim 20] A sensing processing method comprising:
acquiring a multiple exposure image with respect to a moving ball under strobe flashlight;
extracting a ball image region, in which images of the moving ball overlap each other by multiple exposure, from the multiple exposure image;
fitting the ball image region so that half circle curves of a predetermined diameter are formed at opposite ends of the extracted ball image region; and
extracting coordinates of at least two center points of the fitted ball image region.
[Claim 21] The sensing processing method according to claim 20, wherein the fitting step comprises:
calculating the sum of pixel values on the half circles while moving the half circles of the predetermined diameter from the center of the ball image region to the opposite ends of the ball image region along an axis of a moving direction of the ball passing through the extracted ball image region; and
fitting the ball image region so that half circle curves of the predetermined diameter are formed at positions at which the calculated sum of the pixel values is equal to or less than a predetermined value.
[Claim 22] A virtual golf simulation device comprising:
a sensor unit to acquire a multiple exposure image with respect to a moving object under strobe flashlight;
a sensing processing unit comprising a first processing means to extract an object image region, in which images of the moving object overlap each other by multiple exposure, from the multiple exposure image and a second processing means to form at least two center point candidates of the extracted object image region and to adjust the positions of the center point candidates to extract coordinates of at least two center points of the object image region; and
a simulator to calculate the change of the coordinates of the center points and to obtain physical information on a moving golf ball, thereby simulating an orbit of the golf ball.
PCT/KR2011/004762 2010-06-29 2011-06-29 Sensing device and sensing processing method for moving object and virtual golf simulation device using the same WO2012002734A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2013518251A JP5572853B2 (en) 2010-06-29 2011-06-29 Sensing device for moving object, sensing processing method, and virtual golf simulation device using the same
CN201180041839.4A CN103079652B (en) 2010-06-29 2011-06-29 Sensing device and sensing processing method for moving object and virtual golf simulation device using the same

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
KR1020100062206A KR101019782B1 (en) 2010-06-29 2010-06-29 Sensing processing device and method for moving object, and virtual golf simulation device using the same
KR10-2010-0062202 2010-06-29
KR1020100062203A KR101019798B1 (en) 2010-06-29 2010-06-29 Apparatus and method for sensing moving object and virtual golf simulation device using the same
KR10-2010-0062203 2010-06-29
KR10-2010-0062206 2010-06-29
KR1020100062202A KR101019824B1 (en) 2010-06-29 2010-06-29 Apparatus and method for sensing moving object and virtual golf simulation device using the same

Publications (3)

Publication Number Publication Date
WO2012002734A2 true WO2012002734A2 (en) 2012-01-05
WO2012002734A9 WO2012002734A9 (en) 2012-04-26
WO2012002734A3 WO2012002734A3 (en) 2012-06-21

Family

ID=45402569

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2011/004762 WO2012002734A2 (en) 2010-06-29 2011-06-29 Sensing device and sensing processing method for moving object and virtual golf simulation device using the same

Country Status (3)

Country Link
JP (1) JP5572853B2 (en)
CN (1) CN103079652B (en)
WO (1) WO2012002734A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10122921B2 (en) 2016-03-04 2018-11-06 Electronics And Telecommunications Research Institute Apparatus and method for automatically recognizing object by using low-speed camera in dual photographing mode

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
WO2016068227A1 (en) * 2014-10-31 2016-05-06 横浜ゴム株式会社 Method for measuring behavior of moving body and behavior measuring device
JP6554950B2 (en) * 2014-10-31 2019-08-07 横浜ゴム株式会社 Method and apparatus for measuring behavior of moving object
CN109475773B (en) * 2017-03-17 2022-08-23 B·瑞奇 Method and apparatus for simulating game events
AU2021208238A1 (en) * 2020-01-16 2022-08-04 Creatz Inc. Method, system, and non-transitory computer-readable recording medium for measuring spin of ball

Citations (6)

Publication number Priority date Publication date Assignee Title
JP2000262659A (en) * 1999-03-19 2000-09-26 Sumitomo Rubber Ind Ltd Measuring instrument of ball trajectory
JP2001264016A (en) * 2000-03-15 2001-09-26 Sumitomo Rubber Ind Ltd Motion-measuring instrument for ball
JP2002365018A (en) * 2001-06-05 2002-12-18 Hamamatsu Photonics Kk Measurement apparatus of moving object
KR20050109552A (en) * 2003-03-13 2005-11-21 소니 픽쳐스 엔터테인먼트, 인크. System and method for capturing facial and body motion
KR100871595B1 (en) * 2007-10-09 2008-12-02 박선의 A system for measuring flying information of globe-shaped object using the high speed camera
KR20090021407A (en) * 2007-08-27 2009-03-04 마이크로 인스펙션 주식회사 Apparatus for measuring motion of golf ball and control method thereof

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
JP3235987B2 (en) * 1998-06-30 2001-12-04 ブリヂストンスポーツ株式会社 Golf ball rotation measurement method
JP3778427B2 (en) * 2001-04-26 2006-05-24 株式会社フォトロン Hitting ball diagnostic system
DK1509781T3 (en) * 2002-06-06 2015-08-03 Wawgd Inc Dba Foresight Sports The flight parameter measurement system
JP2006204359A (en) * 2005-01-25 2006-08-10 Sim:Kk Golf ball trial hitting machine



Also Published As

Publication number Publication date
WO2012002734A9 (en) 2012-04-26
JP2013538067A (en) 2013-10-10
JP5572853B2 (en) 2014-08-20
CN103079652B (en) 2015-01-07
CN103079652A (en) 2013-05-01
WO2012002734A3 (en) 2012-06-21

Similar Documents

Publication Publication Date Title
US11033826B2 (en) Methods and systems for sports simulation
US9514379B2 (en) Sensing device and method used for virtual golf simulation apparatus
JP5858261B2 (en) Virtual golf simulation apparatus, and sensing apparatus and sensing method used therefor
JP5626939B2 (en) Sensing processing device for moving ball, sensing processing method, and virtual golf simulation device using the same
AU2012231925B2 (en) Virtual golf simulation apparatus and sensing device and method used for the same
WO2012002731A2 (en) Sensing device and method for moving object and virtual golf simulation device using the same
US9314683B2 (en) Virtual golf simulation apparatus and method capable of compensation ball flight distance decreasing rate
KR101645693B1 (en) Apparatus and method for simulation golf putting using virtual cross stripes
CN105288982B (en) The motion state measure device of golf
WO2012002734A2 (en) Sensing device and sensing processing method for moving object and virtual golf simulation device using the same
KR101019782B1 (en) Sensing processing device and method for moving object, and virtual golf simulation device using the same
KR101019801B1 (en) Apparatus and method for sensing moving object and virtual golf simulation device using the same
KR101019902B1 (en) Sensing processing device and method for moving object, and virtual golf simulation device using the same
KR101019829B1 (en) Sensing processing device and method for moving object, and virtual golf simulation device using the same
US11229824B2 (en) Determining golf club head location in an image using line detection and contour separation
KR101019824B1 (en) Apparatus and method for sensing moving object and virtual golf simulation device using the same
KR101019798B1 (en) Apparatus and method for sensing moving object and virtual golf simulation device using the same
KR101019847B1 (en) Sensing processing device and method for moving object, and virtual golf simulation device using the same
WO2012002733A2 (en) Sensing device and sensing processing method for moving object and virtual golf simulation device using the same
KR101078954B1 (en) Apparatus for virtual golf simulation, and sensing device and method used to the same

Legal Events

Date Code Title Description
WWE  Wipo information: entry into national phase (Ref document number: 201180041839.4; Country of ref document: CN)
121  Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 11801140; Country of ref document: EP; Kind code of ref document: A2)
ENP  Entry into the national phase (Ref document number: 2013518251; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122  Ep: pct application non-entry in european phase (Ref document number: 11801140; Country of ref document: EP; Kind code of ref document: A2)