WO2015055047A1 - Game control method and device - Google Patents

Game control method and device

Info

Publication number
WO2015055047A1
WO2015055047A1 PCT/CN2014/084780 CN2014084780W
Authority
WO
WIPO (PCT)
Prior art keywords
interest
image
point
game control
points
Prior art date
Application number
PCT/CN2014/084780
Other languages
English (en)
French (fr)
Inventor
严立超
Original Assignee
智尊应用程序开发有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 智尊应用程序开发有限公司
Publication of WO2015055047A1

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes

Definitions

  • the invention relates to a method for controlling a device, in particular to a game control method.
  • the present invention relates to a manipulation device, and more particularly to a game control device.
  • taking Nintendo's Wii as an example, the Wii includes a host and a handle.
  • the handle includes a plurality of sensors for measuring motion information and a wireless transmitting end.
  • the host includes a wireless receiver and a processor.
  • a plurality of sensors on the handle such as an acceleration sensor, an angular velocity sensor, a gravity sensor, and a displacement sensor, transmit motion information of the handle to the wireless receiving end of the host through the wireless transmitting end.
  • the host establishes a motion trajectory of the handle in the space according to the motion information, and controls the game according to the motion trajectory.
  • these sensors are all electronic devices, so the handle needs a built-in power supply to keep them running.
  • moreover, since the plurality of sensors only record the relative motion parameters of the handle, the host has to calculate the spatial position of the handle from the acceleration and similar quantities; the spatial position is therefore a relative value rather than an absolute position, and after a certain period of operation a large position drift appears, producing systematic errors.
  • at the same time, the host performs a large amount of computation, which makes the device consume more power; the game control methods and devices of the prior art therefore perform poorly.
  • a game control method includes: an extracting step of inputting a sequence of images and obtaining a plurality of candidate points of interest in each frame of the image; a matching step of selecting the images that match image features pre-stored in a pattern library; a calculating step of calculating the spatial position of the manipulation object from the selected images; and an executing step of controlling the game according to that spatial position.
  • the matching step includes: acquiring a reference image, the reference image containing the manipulation object; and extracting a plurality of reference points of interest of the manipulation object in the reference image, the reference points of interest being pre-stored in the pattern library.
  • the matching step further includes: comparing the plurality of candidate points of interest of each input frame with the plurality of reference points of interest; counting how many candidate points of interest in a frame match reference points of interest in the reference image; when the count reaches a preset value, the frame is selected and passed to the calculating step; the preset value is greater than or equal to 50% of the number of reference points of interest.
  • the matching step further includes acquiring the distribution of the plurality of reference points of interest; the calculating step includes: setting the candidate points of interest that are the same as reference points of interest as positioning points of interest, and calculating the spatial position of the manipulation object in the frame according to the distribution of the positioning points of interest.
  • the calculating step further includes: continuously obtaining the spatial position of the manipulation object over multiple frames and establishing a spatial motion trajectory of the manipulation object; the executing step includes: generating a game control instruction according to the spatial motion trajectory and executing the game control instruction.
  • the method further includes: the calculating step further calculating, according to the spatial position of the manipulation object, the plane coordinates of the manipulation object on the display device; and a displaying step of showing the manipulation object on the display device.
  • the method further includes: the matching step further acquiring a candidate feature matrix for each candidate point of interest and a reference feature matrix for each reference point of interest; when the feature matrix of a candidate point of interest is the same as or similar to the reference feature matrix of a reference point of interest, the candidate point of interest is marked as matching that reference point of interest.
  • a game control device includes: an extraction module, configured to input a sequence of images and obtain a plurality of candidate points of interest in each frame of the image; a matching module, configured to select the images that match image features pre-stored in a pattern library; a calculation module, configured to calculate the spatial position of the manipulation object from the selected images; and an execution module, configured to control the game according to that spatial position.
  • the matching module is further configured to acquire a reference image containing the manipulation object, and to extract a plurality of reference points of interest of the manipulation object in the reference image, the plurality of reference points of interest being pre-stored in the pattern library.
  • the matching module is further configured to compare the plurality of candidate points of interest of each input frame with the plurality of reference points of interest, and to count how many candidate points of interest in a frame match reference points of interest in the reference image; when the count reaches a preset value, the frame is selected and passed to the calculation step; the preset value is greater than or equal to 50% of the number of reference points of interest.
  • the matching module is further configured to acquire the distribution of the plurality of reference points of interest; the calculation module is further configured to set the candidate points of interest that are the same as reference points of interest as positioning points of interest, and to calculate the spatial position of the manipulation object in the frame according to the distribution of the positioning points of interest.
  • the calculation module is further configured to continuously obtain the spatial position of the manipulation object over multiple frames, establish a spatial motion trajectory of the manipulation object, and generate game control instructions according to the spatial motion trajectory.
  • the calculation module is further configured to calculate, according to the spatial position of the manipulation object, the plane coordinates of the manipulation object on the display device; a display module shows the manipulation object on the display device.
  • the matching module is further configured to acquire a candidate feature matrix for each candidate point of interest and a reference feature matrix for each reference point of interest; when the feature matrix of a candidate point of interest is the same as or similar to the reference feature matrix of a reference point of interest, the candidate point of interest is marked as matching that reference point of interest.
  • the game control method and device of the present invention select an image containing the manipulation object by comparison against a preset reference image bearing the features of the manipulation object, and confirm the spatial position of the manipulation object from the distribution of the points of interest, so the exact position of the object in space can be obtained.
  • moreover, because the position is obtained through image processing, it is the absolute spatial position of the manipulation object, and the position drift that arises after the method and device run for a period of time does not occur; the game control method and device of the present invention therefore perform better.
  • FIG. 1 is a schematic flow chart of a game control method of the present invention
  • FIG. 2 is a block diagram of a game control device of the present invention.
  • referring to FIG. 1 and FIG. 2, FIG. 1 is a schematic flowchart of the game control method of the present invention and FIG. 2 is a schematic block diagram of the game control device of the present invention.
  • the game control device of the present invention includes an extraction module 11, a matching module 12, a calculation module 13, and an execution module 14 .
  • the game control method of the present invention includes:
  • Step S1, the extracting step: input a sequence of images and obtain a plurality of candidate points of interest in each frame of the image;
  • the extraction module 11 is configured to input a sequence of images and obtain a plurality of candidate points of interest in each frame of the image.
  • the manipulation object moves in a three-dimensional space
  • the game control device of the present invention acquires the motion state of the manipulation object in the three-dimensional space, and controls the game according to the motion state.
  • the motion state includes information such as a motion trajectory, an angle of the manipulation object in the space, and the like.
  • the extraction module 11 includes a camera used to photograph the manipulation object and capture multiple frames of continuous images. Specifically, the camera captures multiple frames per second, and these frames record the motion state of the manipulation object in space.
  • the extraction module 11 also includes a point of interest extraction unit.
  • the point of interest extraction unit is configured to extract points of interest in the image.
  • each time the camera acquires a frame, it transmits the image to the point of interest extraction unit.
  • the point of interest extraction unit extracts the points of interest in the image according to a preset algorithm.
  • the point of interest extraction unit may perform point of interest extraction according to one or more of a corner detection method, an edge detection method, or a threshold detection method.
  • for example, the manipulation object bears a bright triangle feature, and the point of interest extraction unit extracts this triangle feature by threshold detection and defines it as a point of interest.
  • as another example, the manipulation object bears an obvious mosaic pattern, which the point of interest extraction unit defines as another point of interest.
  • as a further example, when the player waving the manipulation object wears a ring on a finger, the point of interest extraction unit defines the ring as a point of interest.
  • likewise, if the background of the image contains a coffee table, the point of interest extraction unit defines each edge of the coffee table as a point of interest.
  • the point of interest extraction unit applies one or more of the aforementioned corner detection, edge detection, or threshold detection methods to the images obtained by the camera, thereby obtaining a plurality of points of interest.
  • these points of interest are referred to as candidate points of interest.
  • the number of candidate points of interest is usually tens to hundreds; the specific number depends on the complexity of the image transmitted by the camera.
  • Step S2, the matching step: select the images that match the image features pre-stored in the pattern library;
  • after step S1 is completed, feature matching must be performed by the matching module 12. Specifically, the image features of a reference image are first pre-stored in the game control device of the present invention through image training.
  • the reference image is an image of the manipulation object and has a plurality of points of interest, which are defined as reference points of interest; the plurality of reference points of interest are then compared one by one with the candidate points of interest in the images captured by the camera to determine which candidate points of interest have the same features as the reference points of interest.
  • At least one reference image having the manipulation object is first obtained by the camera, and each point of interest on the manipulation object in the reference image is extracted by the point of interest extraction unit.
  • the features of the reference points of interest are recorded over a plurality of images.
  • to make the reference points of interest easier to obtain, in a preferred embodiment the camera faces the manipulation object directly and captures a front view of it.
  • with the front view obtained, the distribution of the plurality of reference points of interest in the front view can be determined.
  • the distribution of the plurality of reference points of interest corresponds one to one with the position of the manipulation object in space; that is, once the distribution of the plurality of reference points of interest is determined, the spatial coordinates of the manipulation object (taking the coordinates of one point on the manipulation object to represent its coordinates in space) and the rotation angle of the manipulation object (assuming the manipulation object is rod-shaped, the state of the rod in space, e.g. horizontal, vertical, or inclined) can be obtained.
  • the distribution of the plurality of reference points of interest includes the position of each point of interest, the distance between any two reference points, the relative positional relationship of any two reference points of interest, and so on. For example, suppose there are reference points of interest A, B, C, and D forming a quadrilateral: the distance between any two points is determined, the ratio between any two such distances is determined, and the spatial orientation of any point relative to any other point is determined. E.g. the AB distance is 3 cm, the CD distance is 4 cm, the ratio of the distances is 0.75, and C lies 30 degrees to the upper right of A (for an image, up, down, left, and right are defined by the order of the pixel rows and columns). The distribution of these four points of interest is defined to correspond uniquely to this position of the reference picture, the reference picture having been obtained with the manipulation object located 2 meters directly in front of the camera.
  • Step S3, the calculating step: calculate the spatial position of the manipulation object from the selected image;
  • it is assumed that the selected image also contains the points of interest A, B, C, and D. Suppose that in the selected image the AB distance is 6 cm and the CD distance is 8 cm, the relative positional relationship of each point is the same as in the reference picture, and only the distance between any two points has doubled.
  • it can then be inferred that the manipulation object in the selected image is also located directly in front of the camera, at a distance of less than 2 meters. Since, when a camera photographs an object, the size of the object in the picture is related to the object's distance from the camera by a formula, knowing that the relative relations of the four reference points are unchanged apart from the increased distances makes it possible to deduce how far the manipulation object is from the camera. In the foregoing example, the manipulation object in the selected image was photographed 1 meter directly in front of the camera.
  • in another embodiment, the points of interest A, B, C, and D also appear simultaneously, and their distribution has already been obtained in step S2, so the calculation module 13 can calculate the coordinates of the manipulation object in space according to a preset calculation formula.
  • Step S4, the executing step: control the game according to the spatial position.
  • game control commands correspond to states of the manipulation object in space.
  • for example, the manipulation object being stationary in space corresponds to one game instruction, the manipulation object circling in the air corresponds to another game instruction, and the manipulation object being at a certain position in space corresponds to yet another game instruction.
  • after a frame has been processed, the coordinates in space of the manipulation object corresponding to that frame are obtained.
  • after the calculation module 13 has processed multiple frames in succession, a motion trajectory of the manipulation object can be established from the times and the spatial coordinates of the manipulation object at each time point.
  • the motion trajectories correspond one to one with game control commands, so when step S3 is completed the execution module 14 can control the game based on the spatial position and/or the motion trajectory.
  • in a modified embodiment, in step S2, if the number of candidate points of interest in an image that are the same as reference points of interest in the reference image is less than 50% of the number of reference points of interest, the image is discarded and the next frame is processed, thereby saving computation time.
  • the images containing the candidate points of interest are generally colored and carry RGB information.
  • the color image can be converted into a grayscale image, and the gray value of each pixel can be calculated from its original brightness values.
  • the gradient refers to the second derivative of the gray value, and the direction of the gradient may also be referred to as the vector direction.
  • the direction of the gradient generally refers to the direction in which the change in the rate of change of the gray value is greatest.
  • the gradient direction is generally understood to be a specific value in a 360 degree direction of a plane.
  • how to define the 0-degree starting point of this 360-degree direction is readily understood in the prior art and is not described here.
  • since the vector direction at each pixel may be a very specific value with several decimal places, for example 18.125 degrees, the vector directions and the corresponding second derivatives need to be aggregated statistically.
  • for example, the 360-degree angle range can be divided into 36 equally sized angle intervals, namely (0, 10], (10, 20], (20, 30], (30, 40], ..., (340, 350], (350, 360].
  • when aggregating, the vector directions of all pixels are first assigned to these 36 angle intervals.
  • for example, a vector direction of 18.125 degrees is assigned to the (10, 20] interval,
  • a vector direction of 11.701 degrees is assigned to the (10, 20] interval,
  • and a vector direction of 46.265 degrees is assigned to the (40, 50] interval.
  • after the assignment, taking one angle interval as an example, the gradient values of all pixels whose vector directions fall within that interval are summed. The sums are then arranged in the natural order of the angle intervals, giving a 36-dimensional feature array.
  • for example, for all pixels of a candidate point of interest, the sum of the gradient values (second derivatives) of the pixels whose vector direction lies in the (0, 10] interval is 25.
  • the corresponding sums of the gradient values over the (0, 10], (10, 20], (20, 30], (30, 40], ..., (340, 350], (350, 360] intervals, taken in order, are <25, 10, 55, 70, ..., 35, 20>.
  • the candidate point of interest can thus be represented by the array <25, 10, 55, 70, ..., 35, 20>.
  • the array <25, 10, 55, 70, ..., 35, 20> is referred to as the feature array of the candidate point of interest.
  • All of the reference points of interest in the reference image pre-existing in the system are also subjected to the aforementioned calculations to obtain an array of features for each reference point of interest.
  • the array of features of the reference points of interest constitutes a library of reference feature arrays.
  • in practice, the feature array of a candidate point of interest is rarely exactly identical to the feature array of a reference point of interest; therefore, when the feature array of a candidate point of interest approximates the feature array of a reference point of interest, the candidate point of interest can also be regarded as that reference point of interest, and its feature array is considered to be present in the reference feature array library.
  • the approximation can be expressed as follows: for example, assuming the array of a reference point of interest is <35, 18, 95, 60, ..., 30, 40>, when each value of the feature array of the candidate point of interest differs from the corresponding value in the reference array by no more than a preset ratio, or when the sum of all corresponding errors is within the preset ratio, the two arrays are judged to be similar; the preset ratio may be 10%, 20%, and so on.
  • the aforementioned preset ratio is an empirical value and is not limited to 10% or 20%; the specific value is determined by weighing the clarity of the image, the camera, the computing capability of the device, and the application environment of the device.
  • the feature array of every candidate point of interest is calculated and checked against the reference feature array library; when more than 150 of the candidate feature arrays are found in the reference feature array library, the image is considered found.
  • the spatial position of the corresponding manipulation object can then be calculated from the distribution of those 150 candidate points of interest.
  • each functional unit in each embodiment of the present invention may be integrated into one processing module, or each unit may exist physically separately, or two or more units may be integrated into one module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules.
  • the integrated modules, if implemented in the form of software functional modules and sold or used as stand-alone products, may also be stored in a computer readable storage medium.
  • the above mentioned storage medium may be a read only memory, a magnetic disk or an optical disk or the like.

Abstract

A game control method and device. The game control method includes: an extracting step, inputting a sequence of images and obtaining a plurality of candidate points of interest in each frame of the image; a matching step, selecting the images that match image features pre-stored in a pattern library; a calculating step, calculating the spatial position of a manipulation object from the selected images; and an executing step, controlling the game according to that spatial position. The game control device includes an extraction module, a matching module, a calculation module, and an execution module. The game control method and device provide a better user experience.

Description

Game control method and device
Technical Field
The present invention relates to a method for controlling a device, and in particular to a game control method.
The present invention further relates to a manipulation device, and in particular to a game control device.
Background Art
In the prior art, motion-sensing game devices emerge one after another to meet people's growing demand for game control devices. Taking Nintendo's Wii as an example, the Wii includes a host and a handle. The handle carries a plurality of sensors for measuring motion information and a wireless transmitting end. The host includes a wireless receiving end and a processor. The sensors on the handle, such as an acceleration sensor, an angular velocity sensor, a gravity sensor, and a displacement sensor, send the motion information of the handle through the wireless transmitting end to the wireless receiving end of the host. The host establishes the motion trajectory of the handle in space from this motion information and controls the game according to that trajectory.
However, these sensors are all electronic devices, and the handle needs a built-in power supply to keep them running. Moreover, since the sensors only record the relative motion parameters of the handle, the host has to calculate the spatial position of the handle from the acceleration and similar quantities; the spatial position is therefore a relative value rather than an absolute position, so after a certain period of operation a large position drift appears, forming a systematic error. At the same time, the host performs a large amount of computation, which makes the device consume more power. The game control methods and devices of the prior art therefore perform poorly.
Summary of the Invention
In view of the technical problem that the game control methods of the prior art perform poorly, it is necessary to provide a game control method with better performance.
Likewise, in view of the technical problem that the game control devices of the prior art perform poorly, it is also necessary to provide a game control device with better performance.
The specific technical solutions of the present invention are as follows:
A game control method includes: an extracting step, inputting a sequence of images and obtaining a plurality of candidate points of interest in each frame of the image; a matching step, selecting the images that match image features pre-stored in a pattern library; a calculating step, calculating the spatial position of a manipulation object from the selected images; and an executing step, controlling the game according to that spatial position.
In a further optimized embodiment of the present invention, the matching step includes: acquiring a reference image, the reference image containing the manipulation object; and extracting a plurality of reference points of interest of the manipulation object in the reference image, the plurality of reference points of interest being pre-stored in the pattern library.
In a further optimized embodiment of the present invention, the matching step further includes: comparing the plurality of candidate points of interest of each input frame with the plurality of reference points of interest; counting how many candidate points of interest in a frame match reference points of interest in the reference image; when the count reaches a preset value, the frame is selected and passed to the calculating step; the preset value is greater than or equal to 50% of the number of reference points of interest.
In a further optimized embodiment of the present invention, the matching step further includes acquiring the distribution of the plurality of reference points of interest; the calculating step includes: setting the candidate points of interest that are the same as reference points of interest as positioning points of interest, and calculating the spatial position of the manipulation object in the frame according to the distribution of the positioning points of interest.
In a further optimized embodiment of the present invention, the calculating step further includes: continuously obtaining the spatial position of the manipulation object over multiple frames; and establishing a spatial motion trajectory of the manipulation object; the executing step includes: generating a game control instruction according to the spatial motion trajectory; and executing the game control instruction.
In a further optimized embodiment of the present invention, the method further includes: the calculating step further calculating, according to the spatial position of the manipulation object, the plane coordinates of the manipulation object on the display device; and a displaying step, showing the manipulation object on the display device.
In a further optimized embodiment of the present invention, the method further includes: the matching step further acquiring a candidate feature matrix for each candidate point of interest and a reference feature matrix for each reference point of interest; when the feature matrix of a candidate point of interest is the same as or similar to the reference feature matrix of a reference point of interest, the candidate point of interest is marked as matching that reference point of interest.
A game control device includes: an extraction module, configured to input a sequence of images and obtain a plurality of candidate points of interest in each frame of the image; a matching module, configured to select the images that match image features pre-stored in a pattern library; a calculation module, configured to calculate the spatial position of a manipulation object from the selected images; and an execution module, configured to control the game according to that spatial position.
In a further optimized embodiment of the present invention, the matching module is further configured to acquire a reference image containing the manipulation object, and to extract a plurality of reference points of interest of the manipulation object in the reference image, the plurality of reference points of interest being pre-stored in the pattern library.
In a further optimized embodiment of the present invention, the matching module is further configured to compare the plurality of candidate points of interest of each input frame with the plurality of reference points of interest, and to count how many candidate points of interest in a frame match reference points of interest in the reference image; when the count reaches a preset value, the frame is selected and passed to the calculation step; the preset value is greater than or equal to 50% of the number of reference points of interest.
In a further optimized embodiment of the present invention, the matching module is further configured to acquire the distribution of the plurality of reference points of interest; the calculation module is further configured to set the candidate points of interest that are the same as reference points of interest as positioning points of interest, and to calculate the spatial position of the manipulation object in the frame according to the distribution of the positioning points of interest.
In a further optimized embodiment of the present invention, the calculation module is further configured to continuously obtain the spatial position of the manipulation object over multiple frames, establish a spatial motion trajectory of the manipulation object, and generate game control instructions according to the spatial motion trajectory.
In a further optimized embodiment of the present invention, the calculation module is further configured to calculate, according to the spatial position of the manipulation object, the plane coordinates of the manipulation object on the display device; and a display module shows the manipulation object on the display device.
In a further optimized embodiment of the present invention, the matching module is further configured to acquire a candidate feature matrix for each candidate point of interest and a reference feature matrix for each reference point of interest; when the feature matrix of a candidate point of interest is the same as or similar to the reference feature matrix of a reference point of interest, the candidate point of interest is marked as matching that reference point of interest.
Compared with the prior art, the main beneficial effects of the present invention are as follows:
Because the game control method and device of the present invention select an image containing the manipulation object by comparison against a preset reference image bearing the features of the manipulation object, and confirm the spatial position of the manipulation object from the distribution of the points of interest, the exact position of the object in space can be obtained. Moreover, because the position is obtained through image processing, it is the absolute spatial position of the manipulation object, and the position drift that arises after the method and device run for a period of time does not occur; the game control method and device of the present invention therefore perform better.
Brief Description of the Drawings
FIG. 1 is a schematic flowchart of the game control method of the present invention;
FIG. 2 is a schematic block diagram of the game control device of the present invention.
Detailed Description of the Embodiments
To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention and are not intended to limit it.
Referring to FIG. 1 and FIG. 2, FIG. 1 is a schematic flowchart of the game control method of the present invention and FIG. 2 is a schematic block diagram of the game control device of the present invention. The game control device of the present invention includes an extraction module 11, a matching module 12, a calculation module 13, and an execution module 14. The game control method of the present invention includes:
Step S1, the extracting step: input a sequence of images and obtain a plurality of candidate points of interest in each frame of the image.
The extraction module 11 is configured to input a sequence of images and obtain a plurality of candidate points of interest in each frame of the image.
In the present invention, a manipulation object moves in three-dimensional space; the game control device of the present invention acquires the motion state of the manipulation object in three-dimensional space and controls the game according to that motion state. The motion state includes information such as the motion trajectory and the angle of the manipulation object in space.
The extraction module 11 includes a camera used to photograph the manipulation object and capture multiple frames of continuous images. Specifically, the camera captures multiple frames per second, and these frames record the motion state of the manipulation object in space.
The extraction module 11 also includes a point of interest extraction unit, which is used to extract points of interest from an image. In this embodiment, each time the camera acquires a frame, it transmits the image to the point of interest extraction unit, which extracts the points of interest in the image according to a preset algorithm.
Specifically, the point of interest extraction unit may extract points of interest using one or more of corner detection, edge detection, or threshold detection. For example, the manipulation object bears a bright triangle feature, and the point of interest extraction unit extracts this triangle feature by threshold detection and defines it as a point of interest. As another example, the manipulation object bears an obvious mosaic pattern, which the point of interest extraction unit defines as another point of interest. As a further example, when the player waving the manipulation object wears a ring on a finger, the point of interest extraction unit defines the ring as a point of interest. Likewise, if the background of the image contains a coffee table, the point of interest extraction unit defines each edge of the coffee table as a point of interest.
In a specific embodiment of the present invention, the point of interest extraction unit applies one or more of the aforementioned corner detection, edge detection, or threshold detection methods to the images obtained by the camera, thereby obtaining a plurality of points of interest. In this embodiment these points of interest are called candidate points of interest. Their number is usually tens to hundreds; the specific number depends on the complexity of the image transmitted by the camera.
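A minimal sketch of how such a point of interest extraction unit might combine corner, edge, and threshold detection is given below. It assumes OpenCV and NumPy are available; the thresholds, the edge-pixel subsampling stride, and the helper name extract_candidate_points are illustrative choices for this sketch, not details taken from the patent.
```python
import cv2
import numpy as np

def extract_candidate_points(frame_bgr, max_corners=300):
    """Collect candidate points of interest from one frame by combining
    corner, edge, and threshold detection (illustrative parameters)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    points = []

    # Corner detection: strong corners such as the tip of a bright triangle.
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=max_corners,
                                      qualityLevel=0.01, minDistance=5)
    if corners is not None:
        points += [tuple(p.ravel()) for p in corners]

    # Edge detection: e.g. the edges of a coffee table in the background
    # (edge pixels are subsampled so the list stays manageable).
    edges = cv2.Canny(gray, 100, 200)
    ys, xs = np.nonzero(edges)
    points += list(zip(xs[::50].tolist(), ys[::50].tolist()))

    # Threshold detection: bright blobs such as a highlighted mosaic patch.
    _, bright = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    n_labels, _, _, centroids = cv2.connectedComponentsWithStats(bright)
    points += [tuple(c) for c in centroids[1:]]   # index 0 is the background

    return points
```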
Step S2, the matching step: select the images that match the image features pre-stored in the pattern library.
After step S1 is completed, feature matching must be performed by the matching module 12. Specifically, the image features of a reference image are first pre-stored in the game control device of the present invention through image training. The reference image is an image of the manipulation object and has a plurality of points of interest, which are defined as reference points of interest; the plurality of reference points of interest are then compared one by one with the candidate points of interest in the images captured by the camera to determine which candidate points of interest have the same features as the reference points of interest.
To implement the above steps, at least one reference image containing the manipulation object is first obtained with the camera, and each point of interest on the manipulation object in the reference image is extracted by the point of interest extraction unit. The features of the reference points of interest are recorded over a plurality of images. To make the reference points of interest easier to obtain, in a preferred embodiment the camera faces the manipulation object directly and captures a front view of it. With the front view obtained, the distribution of the plurality of reference points of interest in the front view can be determined. The distribution of the plurality of reference points of interest corresponds one to one with the position of the manipulation object in space; that is, once the distribution of the reference points of interest is determined, the spatial coordinates of the manipulation object (taking the coordinates of one point on the manipulation object to represent its coordinates in space) and the rotation angle of the manipulation object (assuming the manipulation object is rod-shaped, the state of the rod in space, e.g. horizontal, vertical, or inclined) can be obtained.
The distribution of the plurality of reference points of interest includes the position of each point of interest, the distance between any two reference points, the relative positional relationship of any two reference points of interest, and so on. For example, suppose there are reference points of interest A, B, C, and D forming a quadrilateral: the distance between any two points is determined, the ratio between any two such distances is determined, and the spatial orientation of any point relative to any other point is determined. E.g. the AB distance is 3 cm, the CD distance is 4 cm, the ratio of the distances is 0.75, and C lies 30 degrees to the upper right of A (for an image, up, down, left, and right are defined by the order of the pixel rows and columns). The distribution of these four points of interest is defined to correspond uniquely to this position of the reference picture, the reference picture having been obtained with the manipulation object located 2 meters directly in front of the camera.
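As an illustration of the distribution just described, the following sketch computes pairwise distances, distance ratios, and relative bearings for labelled points. The descriptor layout, the coordinates, and the helper name distribution_descriptor are assumptions made for this example rather than structures defined by the patent.
```python
import math
from itertools import combinations

def distribution_descriptor(points):
    """points: dict such as {'A': (x, y), 'B': (x, y), ...} in image coordinates.
    Returns pairwise distances, ratios between pair distances, and bearings
    (degrees, with 'up' positive), mirroring the AB = 3, CD = 4, ratio 0.75,
    'C 30 degrees to the upper right of A' style of description."""
    dist, bearing = {}, {}
    for (n1, p1), (n2, p2) in combinations(sorted(points.items()), 2):
        dx, dy = p2[0] - p1[0], p2[1] - p1[1]
        dist[n1 + n2] = math.hypot(dx, dy)
        # Image rows grow downward, so dy is negated to make "up" positive.
        bearing[n1 + n2] = math.degrees(math.atan2(-dy, dx))
    ratios = {f"{a}/{b}": dist[a] / dist[b]
              for a, b in combinations(sorted(dist), 2) if dist[b] != 0}
    return {"distances": dist, "ratios": ratios, "bearings": bearing}

# Reference distribution for four labelled points (coordinates are made up).
reference = distribution_descriptor({'A': (0, 0), 'B': (3, 0),
                                     'C': (2, -1), 'D': (1, 3)})
```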
Step S3, the calculating step: calculate the spatial position of the manipulation object from the selected image.
It is assumed that the selected image also contains the points of interest A, B, C, and D. Suppose that in the selected image the AB distance is 6 cm and the CD distance is 8 cm, the relative positional relationship of each point is the same as in the reference picture, and only the distance between any two points has doubled. It can then be inferred that the manipulation object in the selected image is also located directly in front of the camera, at a distance of less than 2 meters. Since, when a camera photographs an object, the size of the object in the picture is related to the object's distance from the camera by a formula, knowing that the relative relations of the four reference points are unchanged apart from the increased distances makes it possible to deduce how far the manipulation object is from the camera. In the foregoing example, the manipulation object in the selected image was photographed 1 meter directly in front of the camera.
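Under a simple pinhole-camera assumption, apparent size scales inversely with distance, so every pair length doubling implies the object has moved to half the reference distance (2 meters to 1 meter in the text's example). A minimal sketch, where averaging the scale change over the point pairs is an assumption of this example:
```python
def estimate_distance(ref_distance_m, ref_pair_lengths, obs_pair_lengths):
    """Pinhole-camera assumption: apparent size is inversely proportional to
    distance, so distance = reference distance / mean scale change."""
    scales = [obs / ref for ref, obs in zip(ref_pair_lengths, obs_pair_lengths)
              if ref > 0]
    return ref_distance_m / (sum(scales) / len(scales))

# Reference picture taken 2 m in front of the camera; every pair length has
# doubled in the selected image, so the object is now about 1 m away.
print(estimate_distance(2.0, [3.0, 4.0], [6.0, 8.0]))   # -> 1.0
```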
Similarly to the above, suppose the AB distance is still 3 cm while the CD distance becomes 4.1 cm, and the positional relationship between A and B is unchanged, but the relationships between A and C, A and D, B and C, and B and D have all changed slightly; it can then be deduced that the manipulation object has rotated slightly in space, with the line through A and B as the rotation axis.
In another embodiment, the points of interest A, B, C, and D also appear simultaneously, and their distribution has already been obtained in step S2, so the calculation module 13 can calculate the coordinates of the manipulation object in space according to a preset calculation formula.
Step S4, the executing step: control the game according to the spatial position.
Game control commands correspond to states of the manipulation object in space. For example, the manipulation object being stationary in space corresponds to one game instruction, the manipulation object circling in the air corresponds to another game instruction, and the manipulation object being at a certain position in space corresponds to yet another game instruction. After a frame has been processed, the coordinates in space of the manipulation object corresponding to that frame are obtained. After the calculation module 13 has processed multiple frames in succession, a motion trajectory of the manipulation object can be established from the times and the spatial coordinates of the manipulation object at each time point. The motion trajectories correspond one to one with game control commands, so when step S3 is completed the execution module 14 can control the game based on the spatial position and/or the motion trajectory.
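One way the execution step might turn the per-frame positions into a command is sketched below; the stillness threshold and the command names HOLD and SWING are placeholders for illustration, not commands defined by the patent.
```python
def build_trajectory(samples):
    """samples: list of (time_s, (x, y, z)) pairs, one per processed frame."""
    return sorted(samples)

def trajectory_to_command(trajectory, still_eps=0.02):
    """Map a trajectory to a game command; thresholds and command names are
    placeholders rather than commands defined by the patent."""
    (_, first), (_, last) = trajectory[0], trajectory[-1]
    displacement = sum((b - a) ** 2 for a, b in zip(first, last)) ** 0.5
    if displacement < still_eps:
        return "HOLD"    # manipulation object essentially stationary in space
    return "SWING"       # object moved; a real system would classify the shape
```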
In a modified embodiment of the present invention, in step S2, if the number of candidate points of interest in an image that are the same as reference points of interest in the reference image is less than 50% of the number of reference points of interest, the image is discarded and the next frame is processed, thereby saving computation time.
To aid further understanding of the idea of the present invention, although techniques already exist in the prior art for extracting points of interest and judging whether the features of a candidate point of interest are the same as the features of images pre-stored in the pattern library, a particular example is nevertheless given below of how to judge that a candidate point of interest and a point of interest in the pattern library are the same point.
Specifically:
S21. Convert the image of the point of interest into a grayscale image and determine the gray value of every pixel it contains;
The image around a candidate point of interest is generally colored and carries RGB information. In the present invention, the color image can be converted into a grayscale image, and the gray value of each pixel can be calculated from its original brightness values.
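A common way to perform such a conversion is a weighted sum of the R, G, and B channels; the weights below are the usual luminance coefficients, chosen here as an assumption since the text does not fix a formula.
```python
import numpy as np

def to_grayscale(rgb):
    """Convert an H x W x 3 RGB image to gray values using standard
    luminance weights (one common choice; the patent does not fix a formula)."""
    weights = np.array([0.299, 0.587, 0.114])
    return rgb.astype(np.float64) @ weights
```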
S22. Obtain the second derivative and the vector direction of the gray value at each pixel of the point of interest;
After the gray values of all pixels of the candidate point of interest have been obtained, the gradient of the gray values and the direction of the gradient can be computed. The gradient here refers to the second derivative of the gray value, and the direction of the gradient may also be called the vector direction. The direction of the gradient generally refers to the direction in which the change in the rate of change of the gray value is greatest.
Through mathematical calculation, the gradient value and gradient direction of every pixel involved in the candidate point of interest can be computed. The gradient direction can usually be understood as a specific value within the 360-degree directions of a plane. How to define the 0-degree starting point of this 360-degree direction is readily understood in the prior art and is not described here.
S23. Divide the 360-degree angle range into a number of angle intervals, sum the second derivatives whose vector directions fall in each interval, and form a feature array;
Since the vector direction at each pixel may be a very specific value with several decimal places, for example 18.125 degrees, the vector directions and the corresponding second derivatives need to be aggregated statistically.
For example, the 360-degree angle range can be divided into 36 equally sized angle intervals, namely (0, 10], (10, 20], (20, 30], (30, 40], ..., (340, 350], (350, 360].
When aggregating, the vector directions of all pixels are first assigned to these 36 angle intervals. For example, a vector direction of 18.125 degrees is assigned to the (10, 20] interval, a vector direction of 11.701 degrees is assigned to the (10, 20] interval, and a vector direction of 46.265 degrees is assigned to the (40, 50] interval.
After the assignment, taking one angle interval as an example, the gradient values of all pixels whose vector directions fall within that interval are summed. The sums are then arranged in the natural order of the angle intervals, giving a 36-dimensional feature array.
For example, for all pixels of a candidate point of interest, the sum of the gradient values (second derivatives) of the pixels whose vector direction lies in the (0, 10] interval is 25, and the corresponding sums over the (0, 10], (10, 20], (20, 30], (30, 40], ..., (340, 350], (350, 360] intervals, taken in order, are <25, 10, 55, 70, ..., 35, 20>. The candidate point of interest can then be represented by the array <25, 10, 55, 70, ..., 35, 20>, which is called the feature array of the candidate point of interest.
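The following sketch builds such a 36-dimensional feature array from a grayscale patch around one candidate point of interest. It uses NumPy's numerical gradient as a stand-in for the text's "second derivative" of the gray values, and bins directions into [0, 10)-style intervals rather than the text's (0, 10]; both choices are assumptions of this sketch.
```python
import numpy as np

def feature_array(patch):
    """Build the 36-dimensional feature array of one candidate point of interest
    from a grayscale patch around it, following steps S21-S23."""
    gray = patch.astype(np.float64)
    gy, gx = np.gradient(gray)                          # numerical derivative per pixel
    magnitude = np.hypot(gx, gy)                        # "gradient value"
    direction = np.degrees(np.arctan2(gy, gx)) % 360.0  # vector direction, 0..360

    hist = np.zeros(36)
    bins = np.minimum((direction // 10).astype(int), 35)
    for b, m in zip(bins.ravel(), magnitude.ravel()):
        hist[b] += m          # sum the gradient values falling in each angle interval
    return hist               # e.g. <25, 10, 55, 70, ..., 35, 20>
```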
S24. Judge whether the feature array exists among the pre-stored image features;
All reference points of interest in the reference image pre-stored in the system also undergo the above calculation to obtain a feature array for each reference point of interest. The feature arrays of the reference points of interest constitute a reference feature array library.
At this point, to know whether a candidate point of interest is the same as some reference point of interest in the pre-stored reference image, it is only necessary to check whether the feature array of the candidate point of interest exists in the reference feature array library.
Of course, in practice it is rare for the feature array of a candidate point of interest to be exactly identical to the feature array of a reference point of interest; therefore, when the feature array of a candidate point of interest approximates the feature array of a reference point of interest, the candidate point of interest can also be regarded as that reference point of interest, and its feature array is considered to be present in the reference feature array library.
Specifically, the approximation can be expressed as follows: for example, assuming the array of a reference point of interest is <35, 18, 95, 60, ..., 30, 40>, when each value of the feature array of the candidate point of interest differs from the corresponding value in the reference array by no more than a preset ratio, or when the sum of all corresponding errors is within the preset ratio, the reference array and the candidate array are judged to be similar; the preset ratio may be 10%, 20%, and so on. The aforementioned preset ratio is an empirical value and is not limited to 10% or 20%; the specific value is determined by weighing the clarity of the image, the camera, the computing capability of the device, and the application environment of the device.
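A minimal sketch of the per-component tolerance test described above follows; the 10% default is the example value from the text, while the helper name arrays_match and the choice of the per-component variant (rather than the summed-error variant) are assumptions of this sketch.
```python
def arrays_match(candidate, reference, tolerance=0.10):
    """Per-component tolerance test: two 36-dimensional feature arrays are
    treated as the same point of interest when every candidate value is within
    `tolerance` (10% here, the example value from the text) of the
    corresponding reference value."""
    return all(abs(c - r) <= tolerance * abs(r) if r != 0 else c == 0
               for c, r in zip(candidate, reference))
```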
Suppose a frame of the video contains 400 candidate points of interest and the pre-stored reference feature array library contains 300 reference feature arrays. The feature array of every candidate point of interest is then computed and checked against the reference feature array library; when more than 150 of the candidate feature arrays (a 50% ratio) are found in the reference feature array library, the image is considered found. The spatial position of the corresponding manipulation object can then be calculated from the distribution of those 150 candidate points of interest.
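Building on the previous sketch, frame selection by match count might look like the following; arrays_match is the helper assumed above, and the 0.5 threshold reflects the 50% ratio from the text.
```python
def select_frame(candidate_arrays, reference_library, threshold=0.5):
    """A frame is selected when the number of candidate feature arrays found in
    the reference feature array library reaches `threshold` (50% in the text)
    of the number of reference arrays, e.g. 150 matches against a 300-array
    library out of 400 candidates."""
    matched = sum(
        any(arrays_match(c, r) for r in reference_library)
        for c in candidate_arrays
    )
    return matched >= threshold * len(reference_library)
```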
A person of ordinary skill in the art can understand that all or part of the steps for implementing the methods of the above embodiments may be completed by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, includes one of, or a combination of, the steps of the method embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist physically on its own, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium. The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.
It should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention and not to limit them. Although the present invention has been described in detail with reference to the preferred embodiments, a person of ordinary skill in the art should understand that the technical solutions of the present invention may be modified or equivalently replaced without departing from their spirit and scope, and all such modifications and replacements shall fall within the scope of the claims of the present invention.

Claims (14)

  1. A game control method, comprising:
    an extracting step, inputting a sequence of images and obtaining a plurality of candidate points of interest in each frame of the image;
    a matching step, selecting the images that match image features pre-stored in a pattern library;
    a calculating step, calculating the spatial position of a manipulation object from the selected images;
    an executing step, controlling the game according to that spatial position.
  2. The game control method according to claim 1, wherein the matching step comprises:
    acquiring a reference image, the reference image containing the manipulation object;
    extracting a plurality of reference points of interest of the manipulation object in the reference image, and pre-storing the plurality of reference points of interest in the pattern library.
  3. The game control method according to claim 2, wherein the matching step further comprises:
    comparing the plurality of candidate points of interest of each input frame with the plurality of reference points of interest;
    counting how many candidate points of interest in a frame match reference points of interest in the reference image; when the count reaches a preset value, the frame is selected and passed to the calculating step;
    the preset value being greater than or equal to 50% of the number of reference points of interest.
  4. The game control method according to claim 2 or 3, wherein
    the matching step further comprises acquiring the distribution of the plurality of reference points of interest;
    the calculating step comprises: setting the candidate points of interest that are the same as the reference points of interest as positioning points of interest, and calculating the spatial position of the manipulation object in the frame according to the distribution of the positioning points of interest.
  5. The game control method according to claim 4, wherein
    the calculating step further comprises:
    continuously obtaining the spatial position of the manipulation object over multiple frames;
    establishing a spatial motion trajectory of the manipulation object;
    and the executing step further comprises:
    generating a game control instruction according to the spatial motion trajectory;
    executing the game control instruction.
  6. The game control method according to claim 4, further comprising:
    the calculating step further comprising calculating, according to the spatial position of the manipulation object, the plane coordinates of the manipulation object on a display device;
    a displaying step, showing the manipulation object on the display device.
  7. The game control method according to claim 3, wherein the matching step further comprises:
    acquiring a candidate feature matrix for each candidate point of interest and a reference feature matrix for each reference point of interest;
    when the feature matrix of a candidate point of interest is the same as or similar to the reference feature matrix of a reference point of interest, marking the candidate point of interest as matching that reference point of interest.
  8. A game control device, comprising:
    an extraction module, configured to input a sequence of images and obtain a plurality of candidate points of interest in each frame of the image;
    a matching module, configured to select the images that match image features pre-stored in a pattern library;
    a calculation module, configured to calculate the spatial position of a manipulation object from the selected images;
    an execution module, configured to control the game according to that spatial position.
  9. The game control device according to claim 8, wherein:
    the matching module is further configured to acquire a reference image, the reference image containing the manipulation object; and the matching module is further configured to extract a plurality of reference points of interest of the manipulation object in the reference image and to pre-store the plurality of reference points of interest in the pattern library.
  10. The game control device according to claim 9, wherein:
    the matching module is further configured to compare the plurality of candidate points of interest of each input frame with the plurality of reference points of interest, and to count how many candidate points of interest in a frame match reference points of interest in the reference image; when the count reaches a preset value, the frame is selected and passed to the calculation step;
    the preset value being greater than or equal to 50% of the number of reference points of interest.
  11. The game control device according to claim 9 or 10, wherein
    the matching module is further configured to acquire the distribution of the plurality of reference points of interest;
    the calculation module is further configured to set the candidate points of interest that are the same as the reference points of interest as positioning points of interest, and to calculate the spatial position of the manipulation object in the frame according to the distribution of the positioning points of interest.
  12. The game control device according to claim 11, wherein
    the calculation module is further configured to continuously obtain the spatial position of the manipulation object over multiple frames, establish a spatial motion trajectory of the manipulation object, and generate game control instructions according to the spatial motion trajectory.
  13. The game control device according to claim 11, further comprising:
    the calculation module being further configured to calculate, according to the spatial position of the manipulation object, the plane coordinates of the manipulation object on a display device;
    a display module, showing the manipulation object on the display device.
  14. The game control device according to claim 10, wherein the matching module is further configured to: acquire a candidate feature matrix for each candidate point of interest and a reference feature matrix for each reference point of interest; and, when the feature matrix of a candidate point of interest is the same as or similar to the reference feature matrix of a reference point of interest, mark the candidate point of interest as matching that reference point of interest.
PCT/CN2014/084780 2013-10-17 2014-08-20 Game control method and device WO2015055047A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310486381.4 2013-10-17
CN201310486381.4A CN103520923A (zh) 2013-10-17 2013-10-17 Game control method and device

Publications (1)

Publication Number Publication Date
WO2015055047A1 true WO2015055047A1 (zh) 2015-04-23

Family

ID=49923496

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/084780 WO2015055047A1 (zh) 2013-10-17 2014-08-20 Game control method and device

Country Status (2)

Country Link
CN (1) CN103520923A (zh)
WO (1) WO2015055047A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103520923A (zh) * 2013-10-17 2014-01-22 智尊应用程序开发有限公司 Game control method and device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101393599A (zh) * 2007-09-19 2009-03-25 中国科学院自动化研究所 Game character control method based on facial expressions
CN101479690A (zh) * 2006-06-30 2009-07-08 微软公司 Generating position information using a video camera
US20110111798A1 (en) * 2008-06-24 2011-05-12 Electronics And Telecommunications Research Institute Registration method of reference gesture data, driving method of mobile terminal, and mobile terminal thereof
CN102614662A (zh) * 2011-01-30 2012-08-01 德信互动科技(北京)有限公司 Game implementation system
CN103520923A (zh) * 2013-10-17 2014-01-22 智尊应用程序开发有限公司 Game control method and device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5531616B2 (ja) * 2007-12-07 2014-06-25 ソニー株式会社 Control device, input device, control system, control method, and handheld device
US8487938B2 (en) * 2009-01-30 2013-07-16 Microsoft Corporation Standard Gestures
US9821224B2 (en) * 2010-12-21 2017-11-21 Microsoft Technology Licensing, Llc Driving simulator control with virtual skeleton
CN102179048A (zh) * 2011-02-28 2011-09-14 武汉市高德电气有限公司 Method for realizing a live-action game based on motion decomposition and behavior analysis


Also Published As

Publication number Publication date
CN103520923A (zh) 2014-01-22


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14853341

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14853341

Country of ref document: EP

Kind code of ref document: A1