Description
Title of Invention: SENSING DEVICE AND SENSING
PROCESSING METHOD FOR MOVING OBJECT AND VIRTUAL GOLF SIMULATION DEVICE USING THE SAME
Technical Field
[1] The present invention relates to a sensing device and sensing processing method and a virtual golf simulation device using the same, and, more particularly, to a sensing device and sensing processing method for a moving object that acquires and analyzes an image of a moving object, such as a golf ball, to obtain physical information thereof and a virtual golf simulation device using the same.
Background Art
[2] In recent years, various devices have been developed which allow users to enjoy popular sports games, such as baseball, soccer, basketball and golf, in rooms or in specific places through simulation in the form of interactive sports games.
[3] In order to simulate sports using balls, such as baseballs, soccer balls, basketballs and golf balls in such interactive sports games, much research has been conducted into various sensing systems to accurately sense physical information on a moving object, i.e. movement of a ball.
[4] For example, various sensing systems, such as a sensing system using an infrared sensor, a sensing system using a laser sensor, a sensing system using an acoustic sensor and a sensing system using a camera sensor, have come onto the market.
Disclosure of Invention
Technical Problem
[5] It is an object of the present invention to provide a sensing device and sensing
processing method for a moving object that, although a multiple exposure image of a moving object is acquired by a low resolution and low speed camera device and strobe flashlight device, is capable of accurately extracting an image of the object from the acquired image and, although several object images are present in an overlapping state as the result of acquiring a multiple exposure image of an object moving in particular at low speed, is capable of accurately extracting the overlapping object images to extract coordinates of a center point thereof, thereby achieving high sensing processing performance and sensing accuracy at low cost and a virtual golf simulation device using the same.
Solution to Problem
[6] In accordance with one aspect of the present invention, the above and other objects
can be accomplished by the provision of a sensing device including a sensor unit to acquire a multiple exposure image with respect to a moving object under strobe flashlight and a sensing processing unit including a first processing means to extract an object image region, in which images of the moving object overlap each other by multiple exposure, from the multiple exposure image and a second processing means to form at least two center point candidates of the extracted object image region and to adjust the positions of the center point candidates to extract coordinates of at least two center points of the object image region.
[7] In accordance with another aspect of the present invention, there is provided a
sensing device including a sensor unit to acquire a multiple exposure image with respect to a moving object under strobe flashlight and a sensing processing unit including a first moving direction estimating means to estimate a moving direction of the object based on initial position information of the object, a second moving direction estimating means to estimate a moving direction of the object based on pixel values of the respective image regions in the multiple exposure image and an object extracting means to extract an image of the object from the image in the moving direction estimated by the first moving direction estimating means or the second moving direction estimating means.
[8] In accordance with another aspect of the present invention, there is provided a
sensing device including a sensor unit to acquire a multiple exposure image with respect to a moving ball under strobe flashlight and a sensing processing unit including an object extracting means to extract a ball image region, in which images of the moving ball overlap each other, from the multiple exposure image, a fitting means to fit the ball image region so that half circle curves of a predetermined diameter are formed at opposite ends of the extracted ball image region, respectively, and a coordinate extracting means to extract coordinates of at least two center points of the fitted ball image region.
[9] In accordance with another aspect of the present invention, there is provided a
sensing processing method including acquiring a multiple exposure image with respect to a moving object under strobe flashlight, extracting an object image region, in which images of the moving object overlap each other by multiple exposure, from the multiple exposure image, forming at least two center point candidates of the extracted object image region, and adjusting the positions of the center point candidates according to predetermined conditions to extract coordinates of at least two center points of the object image region.
[10] In accordance with another aspect of the present invention, there is provided a
sensing processing method including acquiring plural frames of multiple exposure images with respect to a moving object under strobe flashlight, recognizing an initial
position of the object in the multiple exposure image, forming lines passing through image regions in the multiple exposure image, estimating a direction of one of the lines nearest the initial position of the object as a moving direction of the object, and extracting and analyzing an image of the object from images in the estimated moving direction.
[11] In accordance with another aspect of the present invention, there is provided a
sensing processing method including acquiring plural frames of multiple exposure images with respect to a moving object under strobe flashlight, recognizing an initial position of the object in the multiple exposure image, combining at least two frames of the multiple exposure images into a single image, performing line scanning at a predetermined angle about the initial position of the object with respect to the combined image to obtain a predetermined function value based on pixel values of images on the respective scanned lines, extracting a direction of one of the scanned lines having the maximum function value as a moving direction of the object, and extracting and analyzing an image of the object from images in the estimated moving direction.
[12] In accordance with another aspect of the present invention, there is provided a
sensing processing method including acquiring a multiple exposure image with respect to a moving ball under strobe flashlight, extracting a ball image region, in which images of the moving ball overlap each other by multiple exposure, from the multiple exposure image, fitting the ball image region so that half circle curves of a predetermined diameter are formed at opposite ends of the extracted ball image region, and extracting coordinates of at least two center points of the fitted ball image region.
[13] In accordance with a further aspect of the present invention, there is provided a
virtual golf simulation device including a sensor unit to acquire a multiple exposure image with respect to a moving object under strobe flashlight, a sensing processing unit including a first processing means to extract an object image region, in which images of the moving object overlap each other by multiple exposure, from the multiple exposure image and a second processing means to form at least two center point candidates of the extracted object image region and to adjust the positions of the center point candidates to extract coordinates of at least two center points of the object image region, and a simulator to calculate the change of the coordinates of the center points and to obtain physical information on a moving golf ball, thereby simulating an orbit of the golf ball.
Advantageous Effects of Invention
[14] In a sensing device and sensing processing method for a moving object according to the present invention as described above and a virtual golf simulation device using the same, although a multiple exposure image of a moving object is acquired by a low
resolution and low speed camera device and strobe flashlight device, it is possible to accurately extract an image of the object from the acquired image and, although several object images are present in an overlapping state as the result of acquiring a multiple exposure image of an object moving in particular at low speed, it is possible to accurately extract the overlapping object images to extract coordinates of a center point thereof. Consequently, the present invention has the effect of achieving high sensing processing performance and sensing accuracy at low cost.
Brief Description of Drawings
[15] The above and other objects, features and other advantages of the present invention will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings, in which:
[16] FIG. 1 is a block diagram showing a sensing device or a virtual golf simulation device according to an embodiment of the present invention;
[17] FIG. 2 is a view showing an example of a screen golf system to which the sensing device or the virtual golf simulation device shown in FIG. 1 is applied;
[18] FIG. 3 is a view showing an operational signal scheme of a camera device and a strobe flashlight device of the sensing device according to the embodiment of the present invention;
[19] FIGS. 4(a) and 4(b) are views showing image patterns of an object acquired by the screen golf system shown in FIG. 2;
[20] FIG. 5(a) is a view showing an example of an actually acquired image of the image having the pattern shown in FIG. 4(b) and FIG. 5(b) is a view showing a preliminarily processed image of the image shown in FIG. 5(a);
[21] FIG. 6 is a view showing an example of a process of estimating a moving direction of a ball performed by the sensing device according to the embodiment of the present invention;
[22] FIG. 7 is an enlarged view showing region A of FIG. 6;
[23] FIGS. 8 to 10 are views showing another example of a process of estimating a moving direction of a ball performed by the sensing device according to the embodiment of the present invention;
[24] FIG. 11(a) is a view showing an example of a ball image region in which ball images overlap each other and FIG. 11(b) is a graph showing the distribution of pixel values checked along an axis of the moving direction with respect to the image shown in FIG. 11(a);
[25] FIG. 12(a) is a view showing a process of fitting the ball image region along the axis of the moving direction and FIG. 12(b) is a graph showing the sum of pixel values on a half circle along the axis of the moving direction;
[26] FIG. 13 is a view showing an image in which the ball image region is half circle fitted and center point candidates of the ball image region are formed by the process shown in FIGS. 12(a) and 12(b);
[27] FIGS. 14(a) and 14(b) are views showing a process of adjusting the position of the axis of the moving direction of an estimated ball so as to pass through the centers of the fitted ball image region;
[28] FIG. 15(a) is a view showing a process of adjusting the positions of the center point candidates formed in the fitted ball image region and FIG. 15(b) is a view showing the center points adjusted by the process shown in FIG. 15(a);
[29] FIG. 16 is a view showing a process of correcting the center points of the ball images so as to correspond to an interval rate of strobe flashlight; and
[30] FIGS. 17 and 18 are flow charts showing processing steps of a sensing processing method according to the present invention.
Best Mode for Carrying out the Invention
[31] Now, exemplary embodiments of a sensing device and sensing processing method for a moving object according to the present invention and a virtual golf simulation device using the same will be described in detail with reference to the accompanying drawings.
[32] A sensing device according to the present invention acquires and analyzes an image of a moving object to obtain physical information thereof. Here, the moving object may be a ball hit by a golfer in sports, such as golf, using a ball. The sensing device is used to sense such a moving ball. In an example, the sensing device may be applied to a so-called screen golf system to which a virtual golf simulation device is applied.
[33] FIGS. 1 and 2 schematically show the construction of a sensing device according to the present invention and a virtual golf simulation device using the same.
[34] First, a sensing device according to an embodiment of the present invention and a virtual golf simulation device using the same will be described with reference to FIGS. 1 and 2.
[35] As shown in FIG. 1, the sensing device for a moving object according to the embodiment of the present invention includes a sensor unit, which includes camera devices 310 and 320, a strobe flashlight device 330 and a signal generating unit 210, and a sensing processing unit 220 to process an image acquired by the sensor unit and to process an image of the moving object, thereby extracting coordinates of a center point thereof.
[36] In FIG. 1, two camera devices 310 and 320 are provided, to which, however, the present invention is not limited. For example, a single camera device or more than two camera devices may be provided.
[37] The camera devices 310 and 320 are provided to respectively capture a plurality of
frames on the movement of an object moving from an initial position in a moving direction of the object.
[38] The strobe flashlight device 330 is a lighting device using light emitting diodes
(LED). The strobe flashlight device 330 is used as a light source for capturing of the camera devices. The strobe flashlight device 330 generates strobe flashlight at a predetermined time interval (that is, a plurality of flashes is generated at a predetermined time interval) so that the camera devices 310 and 320 can capture a multiple exposure image.
[39] That is, an object is present on a frame captured by the camera devices in correspondence to the number of flashes generated from the strobe flashlight device 330 to provide a multiple exposure image thereof.
[40] The operation of the camera devices 310 and 320 and the strobe flashlight device 330 will be described below in detail.
[41] Meanwhile, a trigger signal to operate the camera devices 310 and 320 and the strobe flashlight device 330 is generated by the signal generating unit 210. The multiple exposure image captured by the camera devices 310 and 320 and the strobe flashlight device 330 is processed by the sensing processing unit 220 and is transmitted to a simulator 100.
[42] The sensing processing unit 220 includes a first processing means 230 to extract an object image region, in which images of a moving object overlap each other by multiple exposure, from the multiple exposure image and a second processing means 240 to form and adjust at least two center point candidates of the extracted object image region to extract at least two center points of the object image region.
[43] Preferably, the first processing means 230 includes a preliminary processing means
231 to remove a background image and noise, a moving direction estimating means
232 to estimate a moving direction of the object in the multiple exposure image and an object extracting means 233 to extract an object image region, in which object images overlap each other, from the image in the estimated moving direction.
[44] The moving direction estimating means 232 may include a first moving direction estimating means to estimate a moving direction of the object based on initial position information of the object and a second moving direction estimating means to estimate a moving direction of the object based on pixel values of the respective image regions in the multiple exposure image. The moving direction estimating means 232 may include either the first moving direction estimating means or the second moving direction estimating means, or both. In a case in which the moving direction estimating means includes both the first moving direction estimating means and the second moving direction estimating means, the first moving direction estimating means is used first to estimate the moving direction of the object, and, if the moving direction of the object estimated by the first moving direction estimating means is not correct, the second moving direction estimating means is used to estimate the moving direction of the object.
[45] The object extracting means 233 may be configured to include a pattern analyzing means to analyze patterns of the images in the estimated moving direction to effectively extract the object image region in which the object images overlap each other or a means to check pixel values of the respective images in the estimated moving direction and to analyze patterns based on the checked pixel values to extract the object image region.
[46] On the other hand, the second processing means 240 preferably includes a fitting means 241 to fit the extracted object image region based on a value previously set as the size of the object to form at least two center point candidates of the fitted object image region and a coordinate extracting means 242 to adjust the positions of the formed center point candidates according to predetermined conditions to extract at least two center points of the object image region.
[47] The first processing means 230 and the second processing means 240 will be
described below in more detail.
[48] Meanwhile, the simulator 100 preferably includes a controller M, a database 110, an image processing unit 120 and an image output unit 130.
[49] The controller M receives coordinate information of an object which has been image processed and acquired by the sensing processing unit 220, converts the coordinate information into three-dimensional coordinates, obtains predetermined physical information for moving orbit simulation of the object and transmits the obtained physical information to the image processing unit 120.
[50] Predetermined data for moving orbit simulation of the object may be extracted from the database 110, and the image processing of the moving orbit simulation of the object by the image processing unit 120 may be performed by extracting image data stored in the database 110.
[51] Meanwhile, a converting means to convert coordinate information of the object
transmitted from the sensing processing unit 220 into three-dimensional coordinates may be provided separately from the controller M.
[52] In a hardware aspect, the first processing means 230 and the second processing means 240 may be realized as a single controller configured to perform the functions of the above means or as separate controllers configured to respectively perform the functions of the above means. In a software aspect, the first processing means 230 and the second processing means 240 may be realized as a single program configured to perform the functions of the above means or as separate programs configured to respectively perform the functions of the above means.
[53] An example of a screen golf system, to which the sensing device or the virtual golf simulation device with the above-stated construction is applied, is shown in FIG. 2.
[54] As shown in FIG. 2, a hitting mat 30 having a golf tee, on which a golf ball is placed, and a ground (preferably including at least one selected from among a fairway mat, a rough mat and a bunker mat) and a swing plate 20, on which a golfer hits a golf ball, are provided at one side of a golf booth B.
[55] A screen 40, on which a virtual golf simulation image realized by the image output unit 130 is displayed, is provided at the front of the golf booth B. The camera devices
310 and 320 and the strobe flashlight device 330 are provided at the ceiling of the golf booth B.
[56] In FIG. 2, the camera devices 310 and 320 are provided at the ceiling and wall of the golf booth B, respectively, to which, however, the present invention is not limited. The camera devices 310 and 320 can be installed at any positions of the golf booth B so long as the camera devices 310 and 320 do not disturb the swing of the golfer and are prevented from colliding with a golf ball hit by the golfer while the camera devices 310 and 320 can effectively capture an image of a moving state of the golf ball 10.
[57] Also, in FIG. 2, the strobe flashlight device 330 is installed at the ceiling of the golf booth B to provide strobe flashlight substantially perpendicular to a hitting point, to which, however, the present invention is not limited. The strobe flashlight device 330 can be installed at any position of the golf booth B so long as the strobe flashlight device 330 can effectively provide strobe flashlight.
[58] When the golfer on the swing plate 20 hits a golf ball 10 on the hitting mat 30 toward the screen 40 in the system with the above-stated construction, as shown in FIG. 2, the camera devices 310 and 320, which capture predetermined regions in which the hitting is performed, each capture a plurality of frames. At this time, the strobe flashlight device 330 generates several flashes per frame to acquire a multiple exposure image of the moving golf ball.
[59] FIG. 3 is a view showing a trigger signal generation scheme of a camera device and a strobe flashlight device by the signal generating unit 210 (see FIG. 1).
[60] As shown in FIG. 3, trigger signals of the camera device have a time interval of tc.
That is, trigger signals are generated at a time interval of tc for each frame. At this time, the exposure time for each frame is te (preferably, tc > te). Also, data on an image acquired for a time of tc - te is preferably transmitted to the sensing processing unit 220 (see FIG. 1).
[61] That is, as shown in FIG. 3, the camera device is exposed during the interval between the trigger signal generation times for each frame, and data on the acquired multiple exposure image is transmitted to the sensing processing unit.
[62] During the exposure time (time of te) of the camera device, strobe flashlight is generated by the strobe flashlight device several times. In FIG. 3, strobe flashlight trigger signals are generated three times at a time interval of ts1.
[63] That is, during the exposure time (time of te) of the camera device, strobe flashlight is generated three times at the same time interval of ts1. At this time, the camera device and the strobe flashlight device are preferably synchronized with the first trigger signal generated by the signal generating unit 210 (see FIG. 1).
[64] Also, as shown in FIG. 3, a time interval of ts2 is preferably provided between the trigger signal of the last one of the three strobe flashlights and a first strobe flashlight of the next frame. The time interval of ts1 and the time interval of ts2 may be set to be equal to each other. As shown in FIG. 3, however, the time interval of ts1 and the time interval of ts2 are preferably set to be different from each other. More preferably, the time interval of ts2 is set to be longer than the time interval of ts1.
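The trigger scheme above can be sketched as follows. This is an illustrative sketch only, not part of the invention: the function and its parameter names are hypothetical, and the numeric values (a 75 fps camera and a 330 Hz strobe) are taken from figures quoted later in the description.

```python
# Each camera frame is triggered every tc seconds; within each frame's
# exposure, three strobe flashes fire at an interval ts1, leaving a longer
# gap ts2 = tc - 2*ts1 before the first flash of the next frame.

def trigger_times(n_frames, tc, ts1, flashes_per_frame=3):
    """Return (camera trigger times, strobe trigger times) in seconds."""
    camera = [i * tc for i in range(n_frames)]
    strobe = [i * tc + j * ts1
              for i in range(n_frames)
              for j in range(flashes_per_frame)]
    return camera, strobe

# With tc ~ 13.3 ms (75 fps) and ts1 ~ 3.0 ms (330 Hz), the inter-frame gap
# ts2 ~ 7.3 ms is indeed longer than ts1, as the description prefers.
cam, strobe = trigger_times(n_frames=2, tc=1 / 75, ts1=1 / 330)
```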
[65] Since the trigger signal intervals of the camera device and the strobe flashlight device are uniform as described above, the interval between the object images on the multiple exposure image is uniform.
[66] When the multiple exposure image is processed by the sensing processing unit,
therefore, it is possible to effectively separate a correct object image from the object image present on the multiple exposure image and various noises based on the characteristics of the uniformly fixed interval of the trigger signal.
[67] Also, since the interval of the strobe flashlight is uniformly fixed for each frame of the camera device, images having various patterns may be generated from the multiple exposure image according to the moving speed of the object.
[68] Hereinafter, a ball (golf ball) hit by a golfer using a golf club, as an example of the moving object, will be described.
[69] FIGS. 4(a) and 4(b) are views showing two patterns of the multiple exposure image acquired by the camera device and the strobe flashlight device based on the signal generation scheme shown in FIG. 3, respectively.
[70] That is, in FIGS. 4(a) and 4(b), a moving state of a ball hit by a golfer using a golf club is shown using two image patterns acquired based on the trigger signal scheme shown in FIG. 3 (I1 and I2).
[71] An image I1 shown in FIG. 4(a) is acquired through multiple exposure of golf club images C1, C2 and C3 and ball images 11a, 11b and 11c. In this case, the ball images 11a, 11b and 11c are spaced apart from each other by a predetermined interval.
[72] An image I2 of FIG. 4(b) shows a case in which ball images overlap to provide an image region 12 of a predetermined size.
[73] That is, in the image shown in FIG. 4(a), the ball is moved at high speed, and
therefore, the image is formed in a state in which the ball images are spaced apart from
each other by the predetermined interval when strobe flashlight is triggered, and, in the image shown in FIG. 4(b), the ball is moved at low speed, and therefore, strobe flashlight is triggered before the ball moves far away with the result that the ball images overlap each other.
[74] The sensing device and sensing processing method according to the present invention are provided to accurately acquire the coordinates of center points of the ball images when the ball images overlap each other on the multiple exposure image as the ball is moved at a speed equal to or less than a predetermined speed based on the fixed interval of the strobe flashlight as shown in FIG. 4(b).
[75] Hereinafter, processing of the overlapping ball images as shown in FIG. 4(b) by the sensing device for the moving object according to the present invention will be described with reference to FIGS. 5 to 18.
[76] FIG. 5(a) is a view showing the original multiple exposure image including overlapping ball images. The multiple exposure image is acquired using a camera device having a low resolution of 320 x 240 and operated at a low speed of 75 fps and a strobe flashlight device operating at 330 Hz.
[77] The preliminary processing means 231 (see FIG. 1) of the first processing means removes a stationary image, i.e. a background image, from the original image shown in FIG. 5(a) through subtraction and performs a predetermined preliminary processing operation such as Gaussian blur. An image acquired through the preliminary processing operation is shown in FIG. 5(b).
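The preliminary processing step above can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the function names are hypothetical, the frames are assumed to be 8-bit grayscale nested lists, and a simple 3x3 box blur stands in for the Gaussian blur mentioned in the description.

```python
def subtract_background(frame, background):
    """Remove the stationary background by absolute per-pixel difference."""
    return [[abs(f - b) for f, b in zip(fr, br)]
            for fr, br in zip(frame, background)]

def box_blur(img):
    """3x3 mean filter as a stand-in for Gaussian blur (edges clamped)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = sum(vals) // 9
    return out

# Toy 5x5 example: a single bright moving-ball pixel on a flat background.
background = [[10] * 5 for _ in range(5)]
frame = [row[:] for row in background]
frame[2][2] = 250
diff = subtract_background(frame, background)   # background becomes 0
blurred = box_blur(diff)                        # noise is smoothed
```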
[78] However, since the ball images overlap each other and the resolution of the camera device is low, i.e. the resolution of the image is too low to clearly confirm that the ball images overlap each other, the overlapping ball images are displayed in the form of an image region having a predetermined size, which may be confused with various noises, such as an image region of a predetermined size displaying part of the body of the golfer or an image region of a predetermined size displaying a golf club. For this reason, it is very difficult to extract only the image region, in which the ball images overlap each other, in a state in which various noises are included.
[79] A method of extracting such overlapping ball images is to estimate the moving direction of the ball using the moving direction estimating means 232 (see FIG. 1) of the sensing device according to the present invention, to primarily extract images in the estimated moving direction and to extract a ball image region from the extracted images.
[80] FIGS. 6 to 10 are views showing a process of estimating a moving direction of a ball performed by the sensing device and sensing processing method according to the embodiment of the present invention. First, a process of estimating a moving direction of a ball performed by the first moving direction estimating means will be described with
reference to FIGS. 6 and 7.
[81] First, lines L are formed so as to pass through all image regions R present in an image acquired by removing the background image and noise from the initially acquired frames of the multiple exposure image.
[82] That is, as shown in FIG. 6, a line connecting at least two points of the image regions
R is formed.
[83] At this time, information on the initial position of a ball must be previously set. That is, the camera device senses the initial position of the ball before the ball is moved and the initial position B1 of the ball in the multiple exposure image is set based on the information on the sensed initial position of the ball.
[84] Distances between all of the lines L formed as described above and the initial position B1 of the ball are measured.
[85] That is, referring to FIG. 7, which is an enlarged view showing region A of FIG. 6, straight distances between a plurality of lines passing through a region R1 and the initial position B1 of the ball and between a plurality of lines passing through a region R2 and the initial position B1 of the ball are measured.
[86] In FIG. 7, a line L1 passing through the region R1 and a line L2 passing through the region R2 are nearest the initial position B1 of the ball. The line L1 is nearer the initial position B1 of the ball than the line L2. Consequently, the line L1 is tentatively estimated as an axis of the moving direction of the ball.
[87] On the assumption that the straight distance between the initial position B1 of the ball and the line L1 is DL, the moving direction estimated using the above method may be incorrect if DL is so long that it deviates from a predetermined range. In that case, the result estimated by the first moving direction estimating means as described above is ignored, and the moving direction is preferably estimated by the second moving direction estimating means.
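The first estimating step above can be sketched as follows. This sketch is only illustrative: the function names and the threshold parameter dl_max are hypothetical, and candidate lines are assumed to be given in the general form a*x + b*y + c = 0.

```python
import math

def point_line_distance(p, line):
    """Perpendicular distance from point p to the line a*x + b*y + c = 0."""
    a, b, c = line
    x, y = p
    return abs(a * x + b * y + c) / math.hypot(a, b)

def estimate_axis(b1, lines, dl_max):
    """Pick the candidate line nearest the initial ball position B1.

    Returns None when even the nearest line deviates beyond dl_max,
    so that the second moving direction estimating means can take over.
    """
    best = min(lines, key=lambda ln: point_line_distance(b1, ln))
    return best if point_line_distance(b1, best) <= dl_max else None

b1 = (0.0, 0.0)            # initial ball position B1
l1 = (0.0, 1.0, -1.0)      # line y = 1, distance 1 from B1
l2 = (1.0, 0.0, -5.0)      # line x = 5, distance 5 from B1
axis = estimate_axis(b1, [l1, l2], dl_max=2.0)   # l1 is the nearer line
```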
[88] FIGS. 8 to 10 are views showing a process of estimating the moving direction of the ball performed by the second moving direction estimating means.
[89] First, as shown in FIG. 8, a plurality of frames is combined into an image, in which ball image regions appear as time passes.
[90] That is, in FIG. 8, ball image regions 12-1, 12-2 and 12-3 of the respective frames are present in a single image.
[91] Also, the moving direction is estimated by the second moving direction estimating means based on the single image as shown in FIG. 8, which will be described with reference to FIG. 9.
[92] Preferably, the second moving direction estimating means includes an image combining means to combine at least two frames of the multiple exposure images into a single image, a line scanning means to perform line scanning at a predetermined angle about the initial position of the object to check pixel values of the images on the respective scanned lines and an axis extracting means to extract the scanned line having the largest pixel value among the lines scanned by the line scanning means as an axis of the moving direction.
[93] That is, a plurality of frames may be combined into a single image by the image combining means of the second moving direction estimating means as shown in FIG. 8, and the axis of the moving direction of the ball may be extracted by the line scanning means and the axis extracting means as shown in FIG. 9.
[94] As shown in FIG. 9, a predetermined angle scanning region SR is designated based on the initial position B1 of the ball to check pixel values of images on respective scanned lines X1, X2, X3 ... Xi ... Xn in the corresponding scanning region SR based on the initial position B1 of the ball.
[95] At this time, scanning region lines s1 and s2 forming the scanning region SR are preferably formed to define a predetermined angle therebetween. The scanning region lines s1 and s2 may be set to define 360 degrees therebetween. Preferably, as shown in FIG. 9, the scanning region lines s1 and s2 are set to define a front angle of 120 degrees therebetween.
[96] The pixel values of the images on the respective scanned lines X1, X2, X3 ... Xi ... Xn may be checked as shown in FIG. 10.
[97] As shown in FIG. 10, a first sub scanned line UL and a second sub scanned line LL are formed at one side and the other side of the scanned line Xi based on a value previously set as a radius Rs of the ball (including a value set based on a previously known radius of the ball and a value set based on a radius of the ball previously measured from the image of the ball), and the sum Sum 1 of the pixel values of the image on the scanned line Xi, the sum Sum 2 of the pixel values of the image on the first sub scanned line UL and the sum Sum 3 of the pixel values of the image on the second sub scanned line LL are calculated.
[98] Subsequently, the following cost function value is obtained.
[99] v = Sum 1 - Sum 2 - Sum 3
[100] The above cost function value is obtained for each of the scanned lines, and the
scanned line having the maximum value is estimated as the axis of the moving direction of the ball.
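The line scanning and cost function described above can be sketched as follows. This is a minimal illustration in Python, assuming a grayscale image stored as a list of rows and a coarse integer sampling of each scanned line; the function names and the sampling scheme are illustrative, not the patent's implementation:

```python
import math

def scan_line_cost(image, origin, angle_deg, ball_radius, length):
    """Compute the cost v = Sum 1 - Sum 2 - Sum 3 for one scanned line.

    Sum 1: pixel sum along the scanned line itself.
    Sum 2 / Sum 3: pixel sums along the two parallel sub scanned lines
    (UL and LL) offset by the ball radius perpendicular to the line.
    """
    h, w = len(image), len(image[0])
    ox, oy = origin
    a = math.radians(angle_deg)
    dx, dy = math.cos(a), math.sin(a)
    px, py = -dy, dx            # perpendicular direction for the sub lines

    def line_sum(offset):
        s = 0
        for t in range(length):
            x = int(round(ox + t * dx + offset * px))
            y = int(round(oy + t * dy + offset * py))
            if 0 <= x < w and 0 <= y < h:
                s += image[y][x]
        return s

    sum1 = line_sum(0.0)
    sum2 = line_sum(+ball_radius)   # first sub scanned line UL
    sum3 = line_sum(-ball_radius)   # second sub scanned line LL
    return sum1 - sum2 - sum3

def estimate_axis(image, origin, ball_radius, length, angles):
    """Return the scan angle whose line maximizes the cost function,
    i.e. the estimated axis of the moving direction."""
    return max(angles, key=lambda a: scan_line_cost(image, origin, a,
                                                    ball_radius, length))
```

A streak of ball-sized thickness scores high because the line through it collects bright pixels while both sub lines, one radius away, fall outside it; wider noise also brightens the sub lines and is penalized.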
[101] By obtaining the cost function value derived from the respective scanned lines and the sub scanned lines, which are spaced apart by the radius value of the ball, it is possible to accurately include the ball image region while effectively excluding images which are too small or too large to be the ball image region, and therefore, it is possible to accurately extract the axis of the moving direction passing through the ball image region.
[102] In FIG. 9, therefore, the scanned line Xi may be estimated as the axis of the moving direction of the ball.
[103] Meanwhile, after the axis of the moving direction of the ball is extracted using the above method, a ball image region, in which ball images overlap each other, is extracted by the object extracting means 233 (see FIG. 1) using the extracted axis of the moving direction of the ball.
[104] That is, as shown in FIG. 11, the pixel values are checked with respect to all of the images (which may include noise in addition to a ball image region to be extracted) through which the extracted axis of the moving direction of the ball passes to effectively remove noise and thus extract only the ball image region.
[105] FIGS. 11(a) and 11(b) show an example of a method of effectively extracting only the ball image from images present on the axis of the moving direction.
[106] FIG. 11(a) is a view showing an image having a predetermined pattern which may be present on the axis of the moving direction and FIG. 11(b) is a graph showing the distribution of pixel values checked along an axis 13 of the moving direction with respect to the image shown in FIG. 11(a).
[107] As shown in FIG. 11(a), the axis 13 of the moving direction passes through a ball image region 12 in which ball images overlap each other and noise N1 around the ball image region 12. The distribution of pixel values of the image along the axis 13 of the moving direction has a pattern shown in FIG. 11(b).
[108] The ball image region in which ball images overlap each other must have a section in which pixel values are maintained at a predetermined critical value or more, which corresponds to a section obtained by adding the diameters of at least two balls. As shown in FIG. 11(b), the section in which the pixel values are maintained at the predetermined critical value or more includes a section a1 to a2 and a section a3 to a4.
[109] If the section in which the pixel values are maintained at the predetermined critical value or more corresponds to the section a3 to a4, i.e. the image denoted by reference numeral 12 in FIG. 11(a), the section may be estimated as the ball image region in which the ball images overlap each other. On the other hand, if the section corresponds to the section a1 to a2, i.e. the image denoted by reference numeral N1 in FIG. 11(a), the section may be estimated as noise (an image of a golf club).
[110] Consequently, the image N1 may be excluded and the image denoted by reference numeral 12 may be extracted as the ball image region in which ball images overlap each other.
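The section-based discrimination above amounts to a run-length test on the pixel values sampled along the axis of the moving direction. A minimal sketch in Python, assuming the pixel values along the axis are already available as a list and that the acceptable run-length range (derived from the ball diameter) is supplied by the caller:

```python
def find_ball_section(values, critical, min_len, max_len):
    """Scan pixel values along the moving-direction axis and return the
    (start, end) indices of the first run of values >= critical whose
    length falls within [min_len, max_len]; runs outside that range
    (e.g. a thin golf club image) are treated as noise.  Returns None
    if no run qualifies."""
    runs = []
    start = None
    for i, v in enumerate(values):
        if v >= critical and start is None:
            start = i                      # a qualifying run begins
        elif v < critical and start is not None:
            runs.append((start, i - 1))    # the run just ended
            start = None
    if start is not None:                  # run reaching the last pixel
        runs.append((start, len(values) - 1))
    for s, e in runs:
        if min_len <= (e - s + 1) <= max_len:
            return (s, e)
    return None
```

For example, a 2-pixel noise run is rejected while a 10-pixel run, consistent with overlapping ball images, is kept.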
[111] Meanwhile, after the ball image region in which the ball images overlap each other is extracted from the images present on the axis of the moving direction as described above, a process of properly fitting the extracted image region to extract a center point thereof is performed by the fitting means 241 (see FIG. 1).
[112] As a method of fitting the ball image region in which the ball images overlap each other, a half circle fitting method to fit one half circle at one end of the ball image region 12 in which the ball images overlap each other and to fit the other half circle at the other end of the ball image region 12 in which the ball images overlap each other as shown in FIGS. 12 and 13 may be used.
[113] As shown in FIG. 12(a), the ball image region 12 has a higher brightness value than the periphery (here, if the acquired image is a grayscale image, each of the pixel values is a brightness value).
[114] The edge portion of the ball image region 12 has a gradually varying brightness value. The variation of the brightness value is terminated at the contour of the ball image region. Preferably, therefore, the contour of the ball image region is fitted.
[115] For proper fitting, therefore, as shown in FIG. 12(a), a first half circle CC1 and a second half circle CC2 are moved in opposite directions about the ball image region 12 along the axis 13 of the moving direction to calculate the sum of pixel values of the image on the first half circle CC1 and the sum of pixel values of the image on the second half circle CC2.
[116] The sum of the pixel values calculated as described above has the distribution shown in the graph of FIG. 12(b). FIG. 12(b) is a graph showing the distribution of the sum of pixel values on a half circle.
[117] In the graph of FIG. 12(b), the pixel value abruptly varies between the point V1 or the point V2, at which the pixel value starts to abruptly vary, and the point V3, at which the variation of the brightness value is completed.
[118] Preferably, a half circle curve HC1 or HC2 (see FIG. 13) is formed at the point V1 or the point V3 to fit the ball image region.
[119] That is, preferably, a specific value between the point V1 and the point V3 is previously set as a predetermined value of the sum of the pixel values, and the half circle curves HC1 and HC2 are formed at positions at which the sum of the pixel values on the respective half circles CC1 and CC2 is less than the predetermined value to fit the ball image region.
[120] Consequently, the value of the point V1, the value of the point V2 or the point V3, or an arbitrary value between the point V1 and the point V3 may be previously set as the predetermined value. The predetermined value may be derived through an experiment to confirm which value yields the most accurate result.
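A minimal sketch of the half circle fitting in Python, assuming a grayscale image stored as a list of rows, a horizontal moving-direction axis, and a coarse 5-degree sampling of the arc; the function names and the stopping threshold are illustrative assumptions, not the patent's implementation:

```python
import math

def half_circle_sum(image, cx, cy, radius, direction):
    """Sum pixel values on a half circle of the given radius centred at
    (cx, cy); direction = +1 samples the forward-facing half of the
    circle, -1 the rearward-facing half, relative to a horizontal axis."""
    h, w = len(image), len(image[0])
    total = 0
    for deg in range(-90, 91, 5):            # sample the half arc
        a = math.radians(deg)
        x = int(round(cx + direction * radius * math.cos(a)))
        y = int(round(cy + radius * math.sin(a)))
        if 0 <= x < w and 0 <= y < h:
            total += image[y][x]
    return total

def fit_half_circle(image, start_x, axis_y, radius, threshold, direction):
    """Slide the half circle outward along the axis and return the first
    x position at which the arc sum drops below the threshold; the half
    circle curve fitting the ball image region would be formed there."""
    w = len(image[0])
    x = start_x
    while 0 <= x < w:
        if half_circle_sum(image, x, axis_y, radius, direction) < threshold:
            return x
        x += direction
    return x - direction
```

Running the two directions from the middle of the extracted region yields the two fitting positions at opposite ends of the overlapping ball images.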
[121] Here, the diameter of the half circles CC1 and CC2 or the half circle curves HC1 and HC2 may have a value previously set as the diameter of the ball or a value measured from the ball image as the diameter of the ball.
[122] That is, the diameter of the ball may be a value which is previously measured and input or a value sensed by the camera device at the initial position of the ball and measured and set on the image as the diameter.
[123] On the other hand, the diameter of the ball may be measured from the separate ball images as shown in FIG. 4(a) and may be used to fit the overlapping ball images.
[124] Meanwhile, in FIG. 13, any one of the points V1 to V3 is selected from the distribution of the sum of the pixel values on the half circles CC1 and CC2 along the axis 13 of the moving direction, and the half circle curves HC1 and HC2 having the predetermined diameter of the ball or the measured diameter of the ball are formed at opposite ends of the ball image region 12 to fit the ball image region.
[125] That is, preferably, the first half circle curve HC1 is formed at one end of the ball image region 12 to fit the ball image region 12 and the second half circle curve HC2 is formed at the other end of the ball image region 12 to fit the ball image region 12.
[126] The half circle fitting as described above may be performed by a processing means. Pixel values may be checked by a pixel checking means while the first half circle CC1 and the second half circle CC2 move to opposite ends of the ball image region, and the half circle curves may be formed at the opposite ends of the ball image region by a fitting processing means to fit the ball image region.
[127] Also, the half circle fitting of one half circle and the half circle fitting of the other half circle may be separately performed by different processing means.
[128] That is, the movement of the first half circle CC1 and the checking of the pixel values on the first half circle CC1 may be performed by a first pixel checking means, and the movement of the second half circle CC2 and the checking of the pixel values on the second half circle CC2 may be performed by a second pixel checking means. According to the result of the pixel checking performed by the first pixel checking means, a first fitting processing means may form the first half circle curve HC1 at one end of the ball image region to fit the ball image region. According to the result of the pixel checking performed by the second pixel checking means, a second fitting processing means may form the second half circle curve HC2 at the other end of the ball image region to fit the ball image region.
[129] Meanwhile, if the half circle curves HC1 and HC2 for fitting are formed at the point V3 as shown in the graph of FIG. 12(b), center point candidates EP1 and EP2 may be formed at the point V1 or the point V2.
[130] That is, preferably, a first center point candidate EP1 is formed at the side at which the first half circle curve HC1 is fitted and a second center point candidate EP2 is formed at the side at which the second half circle curve HC2 is fitted. Also preferably, the center point candidates EP1 and EP2 are formed on the axis 13 of the moving direction.
[131] Of course, the center point candidates EP1 and EP2 may be selected as arbitrary
points at the half circle fitted sides.
[132] It is necessary to adjust the center point candidates EP1 and EP2 formed as described above using the coordinate extracting means 242 (see FIG. 1) so that the center point candidates EP1 and EP2 can be located at the correct center point positions. Also preferably, the position of the axis of the moving direction is adjusted so that the axis of the moving direction can be located at a more correct position.
[133] First, adjusting the position of the axis of the moving direction will be described with reference to FIG. 14.
[134] It is preferable for the axis of the moving direction to pass through the center of the ball image region. Since the axis of the moving direction is not accurately extracted but estimated, however, the estimated axis of the moving direction may not exactly pass through the center of the ball image region.
[135] In this case, it is preferable to adjust the position of the axis of the moving direction so that the estimated axis of the moving direction can exactly pass through the center of the ball image region.
[136] As shown in FIG. 14(a), the upper and lower thicknesses of the ball image region 12 through which the axis 13 of the moving direction passes must be substantially equal to each other. Therefore, it is preferable to shift the position of the axis 13 of the moving direction by a distance corresponding to the difference between the thickness t1 of the ball image region 12 under the axis 13 of the moving direction and the thickness t2 of the ball image region 12 above the axis 13 of the moving direction shown in FIG. 14(a).
[137] A state in which the axis of the moving direction is shifted to a center line CL
passing through the center of the ball image region 12 using the process as described above is shown in FIG. 14(b).
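The thickness-based adjustment can be sketched as follows, assuming a grayscale image stored as a list of rows and a horizontal axis. Shifting by half the average thickness difference is an assumption made here for the sketch (the text speaks only of a distance corresponding to the difference); halving it is what centres a symmetric region:

```python
def recenter_axis(image, axis_y, x_range, critical):
    """Measure the thickness of the ball image region below (t1) and
    above (t2) the estimated axis at each column, then shift the axis by
    half the average difference so that it passes through the centre of
    the region (row index grows downward in image coordinates)."""
    h = len(image)
    diffs = []
    for x in x_range:
        t_below = 0
        y = axis_y + 1
        while y < h and image[y][x] >= critical:
            t_below += 1
            y += 1
        t_above = 0
        y = axis_y - 1
        while y >= 0 and image[y][x] >= critical:
            t_above += 1
            y -= 1
        if t_above or t_below:
            diffs.append(t_below - t_above)
    if not diffs:
        return axis_y                       # nothing to measure against
    shift = sum(diffs) / len(diffs) / 2.0   # half the mean t1 - t2 gap
    return axis_y + shift
```

For a bright band spanning rows 2 to 8 with the axis estimated at row 4, the function returns 5.0, the true centre line CL of the region.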
[138] On the other hand, FIG. 15(a) shows a process of adjusting the center point
candidates EP1 and EP2 formed using the method shown in FIGS. 12 and 13.
[139] Since each of the fitted half circle curves HC1 and HC2 is formed in the shape of a half circle as shown in FIG. 15(a), it is possible to adjust the respective center point candidates EP1 and EP2 into correct center points based on the geometrical conditions thereof.
[140] That is, three directional lines r1, r2 and r3 are formed to have the same length with respect to the first center point candidate EP1 and three directional lines r4, r5 and r6 are formed to have the same length with respect to the second center point candidate EP2 to adjust the positions of the center point candidates EP1 and EP2 so that the center point candidates EP1 and EP2 can be located at correct positions.
[141] That is, the positions of the center point candidates EP1 and EP2 are adjusted using the above method so that the center point candidates EP1 and EP2 can be located at the centers of the circles having the fitted half circle curves HC1 and HC2, respectively.
[142] At this time, the length of the three directional lines r1, r2 and r3 or r4, r5 and r6 may correspond to a value previously set as the diameter of the ball or a diameter value measured through the ball image.
[143] As shown in FIG. 15(b), coordinates of two center points CP1 and CP2 of the ball image region 12 are extracted by the above process.
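One way to realize the geometrical adjustment above is the classical circumcenter construction: any three points sampled on the fitted half circle curve determine the centre of the circle to which the curve belongs. This is a hedged reading of the three-directional-line condition, offered as an illustration rather than the patent's exact procedure:

```python
def circle_center_from_arc(p1, p2, p3):
    """Adjust a center point candidate to the true centre of the circle
    passing through three points sampled on the fitted half circle
    curve, using the perpendicular-bisector (circumcenter) formula."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    d = 2.0 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if d == 0:
        raise ValueError("arc points are collinear")
    ux = ((x1**2 + y1**2) * (y2 - y3) + (x2**2 + y2**2) * (y3 - y1)
          + (x3**2 + y3**2) * (y1 - y2)) / d
    uy = ((x1**2 + y1**2) * (x3 - x2) + (x2**2 + y2**2) * (x1 - x3)
          + (x3**2 + y3**2) * (x2 - x1)) / d
    return (ux, uy)
```

For three points on a circle of radius 3 centred at (4, 5), the function recovers (4.0, 5.0), which would become the corrected center point CP1 or CP2.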
[144] Meanwhile, the image region extracted by the above process may contain images different from the image region of the actual ball.
[145] In order to prevent such an error, as shown in FIG. 16, it is possible to determine whether the extracted center points are correct based on the strobe flashlight cycle per frame and, if the center points have been incorrectly extracted, it is possible to correct the extracted center points.
[146] That is, the strobe flashlight time interval per frame is ts1 and the time interval between the last strobe flashlight of one frame and the first strobe flashlight of the next frame is ts2 as shown in FIG. 3, and the time intervals ts1 and ts2 must correspond to the distance intervals between the ball image regions 12-1, 12-2 and 12-3 of the respective frames of FIG. 11 at a predetermined ratio.
[147] That is, a ratio of ts2 to 2ts1 must correspond to a ratio of b to a or a ratio of d to c.
[148] If the interval between the extracted center points does not correspond to the strobe flashlight cycle, it is possible to correct the extracted center points so that the interval between the extracted center points can correspond to the strobe flashlight cycle.
[149] For example, if the center points are not located at positions at which the center points must be located according to the strobe flashlight cycle, the center points can be corrected so that the center points can be located at the corresponding positions. On the other hand, if the center points are located at positions at which the center points must not be located according to the strobe flashlight cycle, the center points can be removed.
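A minimal consistency check in the spirit of FIG. 16, assuming near-constant ball speed over the exposure so that the distance between consecutive center points is proportional to the strobe time interval (ts1 within a frame, ts2 across a frame boundary); the function name and tolerance are illustrative assumptions:

```python
def validate_centers(positions, times, tolerance=0.15):
    """positions: 1-D center point coordinates along the moving axis.
    times: the strobe flash time of each point (spaced ts1 apart within
    a frame and ts2 across frames).  With near-constant ball speed,
    distance / time must be the same for every interval, so each
    interval's implied speed is compared against the median speed;
    returns one validity flag per interval."""
    speeds = [(positions[i + 1] - positions[i]) / (times[i + 1] - times[i])
              for i in range(len(positions) - 1)]
    ref = sorted(speeds)[len(speeds) // 2]   # median speed as reference
    return [abs(s - ref) <= tolerance * abs(ref) for s in speeds]
```

Intervals flagged False indicate center points that, per the strobe cycle, were incorrectly extracted and should be corrected or removed as described above.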
[150] After the center points are extracted and corrected as described above, it is possible to confirm the coordinates of the final center points of the ball image region in which the ball images overlap each other.
[151] Therefore, even when a low quality image acquired by the low resolution and low speed camera device and strobe flashlight device is used, with the result that ball images overlap each other to form an image region having a predetermined size, it is possible to very accurately extract the coordinates of the center points of the moving ball.
[152] The coordinates of the center points of the balls finally confirmed as described above are converted into three-dimensional coordinates by the converting means, and physical information of the moving orbit of the ball to be simulated is obtained based on the three-dimensional coordinates.
[153] Hereinafter, embodiments of a sensing processing method according to the present invention will be described with reference to FIG. 17.
[154] As shown in FIG. 17, first, a trigger signal is applied by the signal generating unit so that the camera device and the strobe flashlight device acquire a multiple exposure image of the moving state of an object at a predetermined interval (S10).
[155] At this time, the initial position of a ball is sensed and recognized (S20). Various preliminary processing steps are performed to remove a background image and various noises from the acquired multiple exposure image (S30).
[156] A plurality of lines passing through image regions in the image preliminarily
processed as described above is formed (S40) and the line nearest the initial position of the ball is extracted (S50) to estimate an axis of a moving direction of the ball.
[157] At this time, when the distance between the initial position of the ball and the line nearest the initial position of the ball is less than a predetermined distance, the line is extracted as the axis of the moving direction (S61).
[158] When the distance between the initial position of the ball and the line nearest the initial position of the ball is greater than the predetermined distance, it is determined that the axis of the moving direction estimated at Steps S40 and S50 is incorrect, and a second moving direction estimating process is carried out.
[159] That is, two or more frames of the preliminarily processed image are combined (S62) and line scanning is performed at a predetermined angle about the initial position of the ball (S63).
[160] At this time, cost function values (see FIG. 10) are obtained with respect to the pixel values of the image regions on the respective scanned lines (S64), and the scanned line having the maximum cost function value is extracted as the axis of the moving direction (S65).
[161] The image of the ball is extracted and analyzed from the image region on the axis of the moving direction extracted as described above to finally obtain the coordinates of a center point of the ball (S70).
[162] Hereinafter, Step S70 of FIG. 17 will be described in more detail with reference to FIG. 18.
[163] The axis of the moving direction in which the ball is moved is extracted by the moving direction estimating means as described above, a ball image region is extracted from the image through which the extracted axis of the moving direction passes, and the sum of pixel values on the half circles is calculated while the respective half circles, having a predetermined diameter or a measured diameter, move from the center of the extracted ball image region to opposite ends of the extracted ball image region along the axis of the moving direction (S71).
[164] Subsequently, half circle curves are formed at positions at which the calculated sum of the pixel values is less than a predetermined value to fit the ball image region (S72).
[165] At least two center point candidates of the ball image region are formed on the axis of the moving direction passing through the ball image region (S73) and the formed center point candidates are adjusted so that the formed center point candidates can be located at the centers of circles having the fitted half circle curves, respectively (S74).
[166] Also, the positions of the center points adjusted as described above are corrected so as to correspond to the strobe flashlight cycle (S75) and the coordinates of two center points of the ball image region are finally obtained (S76).
[167] The change of the coordinates of the center points acquired as described above in a three-dimensional space is calculated to obtain physical property information of the moving ball.
Mode for the Invention
[168] Various embodiments of a sensing device and sensing processing method and a
virtual golf simulation device using the same have been described in the best mode for carrying out the invention.
Industrial Applicability
[169] In a sensing device and sensing processing method for a moving object according to the present invention as described above and a virtual golf simulation device using the same, although a multiple exposure image of a moving object is acquired by a low resolution and low speed camera device and strobe flashlight device, it is possible to accurately extract an image of the object from the acquired image and, although several object images are present in an overlapping state as the result of acquiring a multiple exposure image of an object moving in particular at low speed, it is possible to accurately extract the overlapping object images to extract coordinates of a center point thereof, thereby achieving high sensing processing performance and sensing accuracy at low cost. Consequently, the present invention can be widely used in industries related to a sensing device and sensing processing method for a moving object and a virtual golf simulation device using the same.