CN113318424B - Novel game device and control method - Google Patents


Info

Publication number
CN113318424B
CN113318424B (application CN202011538691.2A)
Authority
CN
China
Prior art keywords
information
game
image
virtual
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011538691.2A
Other languages
Chinese (zh)
Other versions
CN113318424A (en)
Inventor
傅峰峰
Current Assignee
Guangzhou Fugang Life Intelligent Technology Co Ltd
Original Assignee
Guangzhou Fugang Life Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Fugang Life Intelligent Technology Co Ltd
Priority to CN202011538691.2A
Publication of CN113318424A
Application granted
Publication of CN113318424B


Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 - Output arrangements for video game devices
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 - Controlling game characters or game objects based on the game progress
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 - Special adaptations for executing a specific game genre or game mode
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 - Methods for processing data by generating or executing the game program
    • A63F2300/6045 - Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082 - Virtual reality
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The invention discloses a novel game device comprising an intelligent interaction system. The intelligent interaction system comprises a game terminal and a game robot. The game terminal comprises: a human motion capture device for collecting the participants' positions and human motion information; and a control device for providing game information and generating interaction control instructions according to the participants' positions, the human motion information, and the game information. The game robot collects the interaction control instructions and executes them. By capturing human motion with the human motion capture device, the invention determines the participant's actions and, by controlling the game robot, realizes a versus or cooperative-play effect between the participant and the robot, improving the realism of the game and achieving an immersive experience.

Description

Novel game device and control method
Technical Field
The invention relates to a novel game device and control method, in particular to a novel game device and control method that acquire human body gestures and realize human-machine cooperation.
Background
With the development of science and technology, electronic games have become widely popular. Game display modes have evolved from traditional CRT, plasma, and liquid-crystal televisions to augmented-reality technologies such as VR, and the realism of game content has gradually improved; yet gameplay still remains a simple interaction between the player and the game machine.
During play, the game machine collects the player's control commands through terminals such as handheld controllers, so the player can only control the game through such terminals. Moreover, the game process is mostly presented with virtual images; lacking the participation of real objects, it still cannot achieve a high sense of realism.
Disclosure of Invention
The invention provides a novel game device and control method that establish a game device capable of interacting with game participants, so that users obtain a unique game experience.
The invention discloses a novel game device, which comprises an intelligent interaction system, wherein the intelligent interaction system comprises: a game terminal and a game robot;
the game terminal comprises:
the human body motion capturing device is used for collecting the position and human body motion information of the participants;
the control device is used for providing game information and generating interaction control instructions according to the positions of the participants, the human body action information and the game information;
the game robot is used for collecting the interaction control instruction and executing according to the interaction control instruction.
Further, the game terminal also comprises an environment information acquisition device and a projection device;
the environment information acquisition device is used for acquiring real environment information, and the control device generates matched image information according to the real environment information, the positions of participants and the human body action information;
the projection device is used for projecting the image information to the real environment so that the image information is matched with the real environment information and the participants.
Still further, the control device is configured to generate image information matched with the real environment information according to the real environment information, and includes:
according to the real environment information, constructing a virtual 3D image, fitting the position and human motion information of the participants with the virtual 3D image, and obtaining virtual interaction image parameters;
and correcting the virtual interactive image parameters to form image information for projection.
Further, the environmental information collection device is further used for collecting spatial information of the game robot, wherein the spatial information at least comprises one or more of position information, angle information and action information of the game robot;
the control device fits the space information of the game robot, the positions of the participants and the human action information with the image information to form a virtual interaction image interacted with the game robot, and virtual interaction image parameters are obtained; the control device corrects the virtual interactive image parameters to form interactive image information for projection;
the projection device projects interactive image information to a real environment.
Still further, the projection device includes N projection units, and the control device is configured to generate image information matching with the real environment information according to the real environment information, including:
constructing a virtual image according to the real environment information, wherein the virtual image comprises N virtual sub-images, and virtual sub-image parameters are obtained; the virtual sub-images are in one-to-one correspondence with the projection units, and the virtual sub-images are partially or completely overlapped;
and correcting the virtual sub-image parameters according to the position of the projection unit corresponding to the virtual sub-image to form sub-image information for projection of the projection unit.
Further, the game robot can be one or more of an AGV trolley and a bionic robot.
The invention also discloses a control method of the novel game device, which comprises a human body motion capturing and collecting process, wherein the human body motion capturing and collecting process comprises the following steps:
locating the participant's position;
collecting the participant's figure;
simplifying the participant's figure to obtain N skeleton points;
and acquiring the position-change information of the skeleton points to obtain the human motion information.
Further, simplifying the participant's figure to obtain N skeleton points comprises:
constructing a 3D human model according to the participant's figure information;
and simplifying the 3D human model to obtain a stick-figure structure consisting of N skeleton points.
Further, acquiring the position-change information of the skeleton points to obtain the human motion information comprises:
collecting a first spatial position of each skeleton point and recording a first time instant;
collecting a second spatial position of each skeleton point and recording a second time instant;
and determining the time interval from the first and second time instants, determining the position-change information of each skeleton point over that interval, and simulating the movement of the stick-figure structure between the two instants to obtain the human motion information.
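As a minimal illustration of this claimed step (an assumed implementation sketch, not code from the patent), the position change of each skeleton point between the two sampled time instants yields a displacement and a velocity for the stick-figure structure:

```python
import numpy as np

def motion_info(first_positions, second_positions, t1, t2):
    """Derive human motion information from two skeleton-point samples.

    first_positions, second_positions: (N, 3) arrays holding the spatial
    positions of the N skeleton points at time instants t1 and t2 (seconds).
    Returns per-point displacement and velocity over the interval.
    """
    p1 = np.asarray(first_positions, dtype=float)
    p2 = np.asarray(second_positions, dtype=float)
    dt = t2 - t1                   # time interval between the two instants
    displacement = p2 - p1         # position change of each skeleton point
    velocity = displacement / dt   # motion of the stick-figure structure
    return displacement, velocity

# Example: one skeleton point moving 0.2 m forward and 0.1 m up in 0.1 s
disp, vel = motion_info([[0, 0, 0]], [[0.2, 0, 0.1]], t1=0.0, t2=0.1)
```

In practice the two samples would come from consecutive frames of the 3D camera, and the velocity field would be matched against stored action templates.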
Further, the control method includes a game running process including:
the control device outputs game information;
the human body motion capturing device collects the positions of participants and human body motion information;
the control device generates a control instruction according to the position of the participant and the human body action information and sends the control instruction to the game robot for execution;
the control device collects the space information in the execution process of the game robot and fits the space information with the game information to obtain the interactive game information.
Compared with the prior art, the invention captures human motion with the human motion capture device to determine the participant's actions and, by controlling the game robot, realizes a versus or cooperative-play effect between the participant and the robot, thereby improving the realism of the game and achieving an immersive experience.
Drawings
FIG. 1 is a block diagram of an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a technical solution of the embodiments of the present invention will be clearly and completely described below, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments.
The terms first, second and the like in the description, in the claims, and in the above-described figures are used to distinguish between different objects, not necessarily to describe a sequential or chronological order. Furthermore, the terms "comprise" and "have", as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, apparatus, or article that comprises a list of steps or elements is not limited to only those listed, but may optionally include other steps or elements not listed or inherent to such process, method, apparatus, or article.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
The embodiment of the invention provides a novel game device, which comprises an intelligent interaction system, wherein the intelligent interaction system is shown in fig. 1 and comprises: a game terminal and a game robot;
the game terminal comprises:
the human body motion capturing device is used for collecting the position and human body motion information of the participants;
the human motion capture device comprises a 3D camera, which may adopt a TOF lens, a Kinect sensor, or the like, and is used to capture the participants' specific spatial positions and human motion information such as body movements;
the control device is used for providing game information and generating interaction control instructions according to the positions of the participants, the human body action information and the game information;
wherein the control device has a built-in control-instruction library and a game-information library; the control-instruction library contains a plurality of control instructions and the corresponding human action information, and the game-information library contains the game information. By collecting the participants' positions and human action information, the control device determines how the participants interact with the game information and generates the corresponding control instruction for the game robot;
the game robot is used for collecting the interaction control instruction and executing according to the interaction control instruction.
The game robot collects the interaction control instruction and operates according to it, so that it can act in coordination with the participant's movements to achieve a cooperative or versus game effect. The game robot has a movement module for changing its position, and includes at least one robot arm for executing pick-up commands issued by the participant.
According to the embodiment of the invention, the human body motion is captured through the human body motion capturing device, so that the effect that a participant controls a game through the human body motion is realized. Meanwhile, the game robot is adopted, and the player command is executed by the game robot, so that the reality of the game is improved, and the immersive experience effect is realized.
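The lookup from a recognized human action to a robot instruction via the built-in control-instruction library can be sketched as follows. The action names, instruction strings, and the `paused` flag are illustrative assumptions, not terms from the patent:

```python
# Hypothetical control-instruction library: recognized human action -> robot instruction
CONTROL_INSTRUCTION_LIBRARY = {
    "raise_right_arm": "robot:wave",
    "step_forward":    "robot:advance",
    "crouch":          "robot:stop",
}

def generate_control_instruction(action, game_state):
    """Map captured human-action information to an interaction control
    instruction, taking the current game information into account."""
    instruction = CONTROL_INSTRUCTION_LIBRARY.get(action)
    if instruction is None:
        return "robot:idle"    # unrecognized action: robot does nothing
    if game_state.get("paused"):
        return "robot:stop"    # game information can override the mapping
    return instruction
```

A richer implementation would classify the captured skeleton motion into one of the library's actions before the lookup.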
Optionally, the game terminal further comprises an environment information acquisition device and a projection device;
the environment information collection device is used for collecting real environment information, and the control device generates matched image information according to the real environment information, the participants' positions, and the human action information;
The real environment information in the embodiment of the invention comprises the spatial environment information of the area where the intelligent interaction system is located, i.e. its 3D spatial structure. The novel game device may be installed indoors, where the environment information collection device collects the size of the indoor space and the size and usage information of the articles present. The control device gathers this information and establishes an indoor virtual 3D model. The control device also contains an image-information model library storing a plurality of theme image packages, each comprising a background-image main unit and background-image subunits: the main unit provides a theme background for the indoor space, and the subunits provide image backgrounds matched to the sizes of indoor articles. At run time, the control device fits the theme background from the main unit to the indoor space so that the fitted background matches its dimensions, and likewise fits the image backgrounds from the subunits to the individual indoor articles. From the fitted backgrounds, the control device establishes image information matched to the real environment information, the participants' positions, and the human action information;
the projection device is used for projecting image information to the real environment so that the image information is matched with the real environment information;
the projection device comprises a projector, wherein the projector projects image information to the surface of the real environment;
the game robot is used for collecting control instructions and executing commands according to the control instructions.
According to the embodiment of the invention, the information acquisition of the real external environment where the game system is located is realized through the environment information acquisition device of the game terminal, and the image information matched with the real external environment is obtained by matching with the projection device, so that the game image scene can be intelligently projected according to the external environment, the positions and the actions of the participants, and the personalized experience effect is achieved.
Optionally, the control device is configured to generate, according to the real environment information, image information matched with the real environment information, including:
according to the real environment information, constructing a virtual 3D image, fitting the position and human motion information of the participants with the virtual 3D image, and obtaining virtual interaction image parameters;
the control device acquires information acquired by the environment information acquisition device, acquires 3D information of a real environment, and establishes an indoor virtual 3D map model; the control device extracts a virtual image matched with the virtual 3D map from the theme image package according to the virtual 3D map, and fits the position and human body action information of the participant with the virtual image to obtain virtual interaction image parameters;
and correcting the virtual interactive image parameters to form image information for projection.
Because of the characteristics of the projector, directly projecting the virtual image into the real environment produces distortion that degrades the actual experience. Therefore, a surface at a certain angle in the real environment is set as the projection surface, the relative position of the projector and the projection surface is collected, and the virtual image parameters are corrected accordingly.
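One standard way to realize such a correction (a sketch under the assumption that the correction is a planar pre-warp; the patent does not specify the algorithm) is to compute a homography from the projector's view of four reference corners on the angled surface, then pre-warp the virtual image with it:

```python
import numpy as np

def homography(src, dst):
    """Solve the 3x3 homography mapping four src corners onto four dst
    corners via the direct linear transform (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of the 8x9 constraint matrix
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

# Ideal image corners vs. where they land on the angled projection surface
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(0, 0), (1.1, 0.05), (1.0, 1.0), (-0.05, 0.95)]
H = homography(src, dst)  # pre-warp virtual-image points with H before projecting
```

Warping the virtual image with the inverse of this mapping before projection makes the projected result appear undistorted on the angled surface.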
Optionally, the environmental information collection device is further configured to collect spatial information of the game robot, where the spatial information at least includes one or more of position information, angle information, and motion information of the game robot;
The environment information collection device collects the spatial information of the game robot, which at least includes one of the robot's horizontal position information, height position information, and action information; in particular, it at least includes one or more of the robot's position information, angle information, and motion information. The height and position information determine the robot's specific location in real space and the space it occupies, while the motion information allows the control device to judge the robot's motion trajectory, realizing interaction between the game robot and the image information.
The control device fits the space information of the game robot, the positions of the participants and the human action information with the image information to form a virtual interaction image interacted with the game robot, and virtual interaction image parameters are obtained; the control device corrects the virtual interactive image parameters to form interactive image information for projection;
the control device collects the game robot's spatial information, the participants' positions, and the human action information, fits the positions and actions of the robot and the participants into the image information to form a virtual interactive image and obtain the virtual interactive image parameters, and through correction obtains interactive image information covering both the game robot and the participants;
the projection device projects interactive image information to a real environment.
The environment information acquisition device provided by the embodiment of the invention can ensure that the control device can determine the specific position and action of the game robot in the virtual 3D map by acquiring the space information of the game robot, so that the fitting process of the game robot, the participants and the image information is realized, and the interaction effect of the game image information and the robot is achieved.
In particular, the projection device includes N projection units, and the control device is configured to generate image information matching with the real environment information based on the real environment information, including:
constructing a virtual image according to the real environment information, wherein the virtual image comprises N virtual sub-images, and virtual sub-image parameters are obtained; the virtual sub-images are in one-to-one correspondence with the projection units, and the virtual sub-images are partially or completely overlapped;
wherein N is a natural number, and in the embodiment of the invention, the projection unit is a projector; the control device divides the virtual image into N virtual sub-images according to the relative positions of the projection unit and the real environment, so that the virtual sub-images correspond to the projection unit one by one;
and correcting the virtual sub-image parameters according to the position of the projection unit corresponding to the virtual sub-image to form sub-image information for projection of the projection unit.
A surface at a certain angle in the real environment is set as the projection surface; the relative position of each projection unit and the projection surface is collected, and the virtual sub-image parameters are corrected to form sub-image information suitable for projection by each projection unit.
A single projector can only project in one direction, so its projection onto a 3D structure is poor and lacks realism. The embodiment of the invention uses N projection units, coordinated by the control device, to superimpose projections onto the real scene, realizing multi-angle projection and improving the realism of the projection.
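The division of the virtual image into N partially overlapping sub-images, one per projection unit, can be sketched as below. The horizontal, evenly spaced division scheme is an assumption for illustration; the patent only requires that sub-images correspond one-to-one with projection units and may partially or completely overlap:

```python
def split_into_subimages(width, n_units, overlap):
    """Return the (left, right) horizontal extent of each of the N virtual
    sub-images. Adjacent sub-images share `overlap` pixels so the projected
    images can be blended seamlessly at the seams."""
    # Each sub-image width: total coverage n*sub_w - (n-1)*overlap == width
    sub_w = (width + (n_units - 1) * overlap) / n_units
    step = sub_w - overlap
    return [(i * step, i * step + sub_w) for i in range(n_units)]

# Example: a 100-pixel-wide virtual image split for 2 projectors with a
# 20-pixel blend zone -> [(0, 60), (40, 100)]
extents = split_into_subimages(100, 2, 20)
```

Each extent would then be corrected independently with its projection unit's pose, as the embodiment describes.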
Optionally, the game robot may be one or more of an AGV cart and a biomimetic robot.
One or more game robots may be used. A game robot may be a robot with a movement function, such as an AGV cart or a bionic robot, or a mechanical structure such as a robot arm that cannot move but can perform multiple actions. In the embodiment of the invention, the game robot is an AGV cart equipped with a robot arm, so that it can both move and perform various actions with the arm, achieving a good interaction effect.
In particular, the environmental information collection device is further configured to collect spatial information of the game robot, including:
collecting first spatial information of the game robot and recording a first time instant;
wherein the environment information collection device has a built-in timing unit that starts timing when the robot's spatial information is collected, and the spatial information at that moment is set as the first spatial information;
collecting second spatial information of the game robot and recording a second time instant;
wherein the spatial information collected after a time interval is set as the second spatial information;
the control device determines the time interval from the first and second time instants, determines the game robot's motion-trend parameter from the change between the first and second spatial information over that interval, fits the motion-trend parameter with the image information to form a virtual interactive image targeted at the robot's motion trend, and obtains the virtual interactive image parameters; the control device corrects these parameters to form interactive image information for projection;
specifically, from the change between the first and second spatial information and the time difference between the two instants, the control device obtains the robot's position change, determines its motion-trend parameter, and predicts its future spatial information; fitting this predicted spatial information with the image information allows the control device to respond intelligently to the robot's motion trend, improving the interaction effect;
the projection device projects interactive image information to a real environment.
To improve the accuracy of judging the game robot's motion trend, the control device collects the robot's spatial-information changes over several time intervals and, from those changes, judges whether the robot is accelerating, decelerating, or stopped.
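The trend judgment from spatial samples over several intervals can be sketched as follows (an assumed implementation; the stop-speed threshold is a placeholder, not a value from the patent):

```python
import numpy as np

def motion_trend(positions, times, stop_speed=1e-3):
    """Classify the robot's motion trend from at least three samples.

    positions: list of (x, y) robot positions; times: matching timestamps.
    Speeds over consecutive intervals are compared to decide whether the
    robot is accelerating, decelerating, stopped, or moving at constant speed.
    """
    p = np.asarray(positions, dtype=float)
    t = np.asarray(times, dtype=float)
    # Speed over each interval = distance moved / elapsed time
    speeds = np.linalg.norm(np.diff(p, axis=0), axis=1) / np.diff(t)
    if speeds[-1] < stop_speed:
        return "stopped"
    if speeds[-1] > speeds[-2]:
        return "accelerating"
    if speeds[-1] < speeds[-2]:
        return "decelerating"
    return "constant"

state = motion_trend([(0, 0), (1, 0), (3, 0)], [0.0, 1.0, 2.0])  # accelerating
```

The predicted state can then drive the fitting of the robot's future spatial information with the image information, as described above.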
According to this embodiment of the invention, the environment information acquisition device collects the spatial information of the game robot at a plurality of moments, thereby determining the motion state of the game robot, further judging its future running trend, realizing an intelligent response to the game robot, and improving the interaction effect of the game system.
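As an illustration, the trend estimation described above can be sketched in Python. All names here are hypothetical, and "spatial information" is reduced to a timestamped planar position; the disclosure's spatial information may also carry angle and action information.

```python
from dataclasses import dataclass

@dataclass
class SpatialSample:
    """One timestamped sample of the game robot's spatial information."""
    t: float  # moment reported by the timing unit, in seconds
    x: float
    y: float

def velocity(first: SpatialSample, second: SpatialSample) -> tuple[float, float]:
    """Running-trend estimate: displacement divided by the time interval."""
    dt = second.t - first.t
    if dt <= 0:
        raise ValueError("the second moment must follow the first")
    return ((second.x - first.x) / dt, (second.y - first.y) / dt)

def predict_position(first: SpatialSample, second: SpatialSample,
                     horizon: float) -> tuple[float, float]:
    """Extrapolate the robot's position `horizon` seconds after the
    second sample, assuming the current trend continues."""
    vx, vy = velocity(first, second)
    return (second.x + vx * horizon, second.y + vy * horizon)

def classify_state(samples: list[SpatialSample], eps: float = 1e-3) -> str:
    """Judge acceleration / deceleration / stop from the speed change
    across several consecutive intervals, as the control device does."""
    speeds = [
        (vx * vx + vy * vy) ** 0.5
        for vx, vy in (velocity(a, b) for a, b in zip(samples, samples[1:]))
    ]
    if all(s < eps for s in speeds):
        return "stopped"
    if speeds[-1] > speeds[0] + eps:
        return "accelerating"
    if speeds[-1] < speeds[0] - eps:
        return "decelerating"
    return "steady"
```

The predicted position is what would then be fitted with the image information to pre-form the virtual interactive image.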
In particular, the intelligent interaction system further comprises a growth module. The growth module collects the participants' instructions, the image information, the spatial information, and the virtual interactive images; it takes the participant instruction as a first parameter, the image information as a second parameter, the spatial information as a third parameter, and the virtual interactive image as an evaluation parameter, and establishes a response relationship among the first, second, and third parameters and the evaluation parameter. The evaluation parameter has a plurality of target parameters; after receiving a participant's instruction and the image information, the game robot determines the spatial information corresponding to each target parameter and selectively executes it.
The game robot spatial information includes the game robot running information. The growth module matches the participant instruction information with the image information provided by the control device and with the game robot spatial information to form a response relationship with the virtual interactive image, uses the virtual interactive image as an evaluation of the game robot running information, establishes a scoring system, and scores different running information to obtain the optimal running information, thereby realizing intelligent development of the game robot.
According to this embodiment, with the growth module, after receiving a participant's instructions many times the game robot can automatically judge the optimal running information; the game robot thus has a growth function, realizing intelligent cultivation.
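A minimal sketch of such a growth module follows. The class name, keys, and the idea of a numeric score per run-information candidate are assumptions for illustration; the disclosure only states that a scoring system evaluates different running information against the virtual interactive image.

```python
from collections import defaultdict

class GrowthModule:
    """Records (instruction, image) -> run-information pairs together
    with a score derived from the resulting virtual interactive image,
    then replays the best-scoring run information for that situation."""

    def __init__(self):
        # (instruction, image_key) -> {run_info: best score seen so far}
        self._scores = defaultdict(dict)

    def record(self, instruction, image_key, run_info, score):
        """Keep only the best score observed for each candidate."""
        prev = self._scores[(instruction, image_key)].get(run_info)
        if prev is None or score > prev:
            self._scores[(instruction, image_key)][run_info] = score

    def best_run_info(self, instruction, image_key):
        """Return the optimal run information, or None if this
        situation has not been experienced yet (fall back to the
        control device's default instruction in that case)."""
        candidates = self._scores.get((instruction, image_key))
        if not candidates:
            return None
        return max(candidates, key=candidates.get)
```

Repeated participant instructions thus accumulate experience, and the robot "grows" toward the optimal response.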
The control method of the novel game device comprises a human body motion capture and collection process, which comprises the following steps:
locating the participant position;
wherein the human body motion capture device determines the specific position of the participant; during the game, other people may pass near the participant, and specifically locating the participant avoids control errors caused by the capture device switching its capture target;
collecting the participant's figure;
wherein the participant's body-shape information is scanned to obtain one or more items of information including at least the head position, trunk position, and limb positions;
simplifying according to the participant's figure to obtain N skeleton points;
wherein the human body is simplified into a structure composed of a plurality of line segments, with the skeleton points located at the endpoints of each segment;
and collecting position change information of the skeleton points to obtain human body motion information.
Human body actions are determined from the change information of each skeleton point, giving the human body action information.
According to this embodiment, the positions and actions of the participants are confirmed through the human body motion capture and collection process, improving the capture precision.
Optionally, the simplifying according to the participant's figure to obtain N skeleton points comprises:
constructing a 3D human body model according to the participant's figure information;
simplifying the 3D human body model to obtain a stick-figure ("matchman") structure composed of N skeleton points.
The simplification proceeds as follows: the human body contour is removed, each joint is simplified into a single skeleton point, the upper arm, forearm, trunk, and the like are each simplified into line-segment structures formed by two or more skeleton points, and the human body as a whole becomes a stick figure composed of a plurality of skeleton points and line segments.
According to this embodiment, simplifying the human body structure avoids the noise that body shape and clothing introduce into motion capture, improving the capture precision and reducing the capture difficulty.
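The simplification step can be illustrated as follows. The joint names and the particular segment list are hypothetical (a real 3D body model exposes many more vertices and a scanner-specific joint set); the sketch only shows discarding everything except N skeleton points and connecting them into line segments.

```python
# Hypothetical stick-figure topology: each pair is one line segment.
STICKMAN_SEGMENTS = [
    ("head", "neck"), ("neck", "pelvis"),        # head and trunk
    ("neck", "l_elbow"), ("l_elbow", "l_hand"),  # left arm
    ("neck", "r_elbow"), ("r_elbow", "r_hand"),  # right arm
    ("pelvis", "l_knee"), ("l_knee", "l_foot"),  # left leg
    ("pelvis", "r_knee"), ("r_knee", "r_foot"),  # right leg
]

def simplify_to_stickman(joints: dict[str, tuple[float, float, float]]):
    """Reduce a scanned 3D body model to a stick-figure structure:
    keep only the skeleton points named in STICKMAN_SEGMENTS and
    connect them with line segments; contour and clothing detail
    (any other entries in `joints`) are discarded."""
    names = {name for segment in STICKMAN_SEGMENTS for name in segment}
    skeleton = {name: joints[name] for name in names}
    segments = [(skeleton[a], skeleton[b]) for a, b in STICKMAN_SEGMENTS]
    return skeleton, segments
```

With this topology the N skeleton points number eleven and the line segments ten; the disclosure leaves N open.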
In particular, the collecting of position change information of the skeleton points to obtain the human body action information comprises:
collecting the first spatial position of each skeleton point and recording the time as the first moment;
wherein the human body motion capture device has a built-in timing unit; the timing unit is controlled to start timing when the skeleton point positions are collected, and the positions at that moment are set as the first spatial positions;
collecting the second spatial position of each skeleton point and recording the time as the second moment;
wherein the skeleton point positions are collected again after a time interval and set as the second spatial positions;
and determining the time interval from the first and second moments, determining the position change information of each skeleton point between the two moments, and simulating the movement of the stick-figure structure from the first moment to the second moment to obtain the human body action information.
According to this embodiment, the human body action information is obtained accurately from the skeleton point position changes, so that the player can control the game smoothly.
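The per-point computation above amounts to a displacement and velocity for every skeleton point between the two moments, which can be sketched as follows (function and key names are illustrative only):

```python
def bone_point_motion(first_pose, second_pose, t1, t2):
    """Given the first and second spatial positions of each skeleton
    point (dicts mapping point name -> (x, y, z)) and the first and
    second moments, return each point's displacement and velocity over
    the interval, from which the stick figure's motion is simulated."""
    dt = t2 - t1
    if dt <= 0:
        raise ValueError("the second moment must follow the first")
    motion = {}
    for name, (x1, y1, z1) in first_pose.items():
        x2, y2, z2 = second_pose[name]
        displacement = (x2 - x1, y2 - y1, z2 - z1)
        motion[name] = {
            "displacement": displacement,
            "velocity": tuple(d / dt for d in displacement),
        }
    return motion
```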
Optionally, the control method includes a game running process, where the game running process includes:
the control device outputs game information;
wherein the control device can send the game information to display equipment such as a television or a projector to form images for the participants to watch;
the human body motion capturing device collects the positions of participants and human body motion information;
wherein the human body motion capture device enables interaction with the game information by collecting the participant's position and human body action information;
the control device generates a control instruction according to the position of the participant and the human body action information and sends the control instruction to the game robot for execution;
wherein the control device makes automatic judgments based on the interaction between the participant and the game information, forms a control instruction for the game robot, and sends it to the game robot for execution (that is, the control device intelligently directs the game robot according to the participant's actions, such as executing a rescue action when the participant is injured in an online combat state);
the control device collects the space information in the execution process of the game robot and fits the space information with the game information to obtain the interactive game information.
wherein the control device collects the changes in spatial information while the game robot executes the instruction, and fits the game robot's spatial information with the game information to obtain the interactive game information after the robot operates (that is, the control device changes the game information according to the game robot's actions to realize the interactive effect).
According to this embodiment, by collecting the game robot's spatial information and the participants' positions and action information, interaction among the game robot, the participants, and the game information is realized, so that the game information displayed by the control device changes with the game robot, and the realism of the game is improved through this interaction.
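One pass of the game running process described above can be sketched as a single tick. All four device objects and their method names are hypothetical stand-ins for the patent's control device, motion capture device, game robot, and environment information acquisition device:

```python
def game_tick(control, motion_capture, robot, env_sensor):
    """One pass of the game running process: output game information,
    capture the participant, drive the robot, and fit the robot's
    spatial information back into the game information."""
    game_info = control.output_game_info()              # step 1: output
    pos, action = motion_capture.collect()              # step 2: capture
    instruction = control.make_instruction(pos, action, game_info)
    robot.execute(instruction)                          # step 3: execute
    spatial = env_sensor.collect_robot_spatial_info()   # step 4: observe
    return control.fit(spatial, game_info)              # interactive game info
```

Any objects exposing these methods (duck typing) can be substituted; the loop simply repeats this tick for the duration of the game.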
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solution of the present invention and not to limit it. Although the present invention has been described in detail with reference to the above embodiments, those skilled in the art will understand that modifications and equivalents may be made to the specific embodiments after reading this specification, and such modifications and variations do not depart from the scope of the invention as defined in the appended claims.

Claims (6)

1. A novel gaming device, wherein the novel gaming device comprises an intelligent interactive system, the intelligent interactive system comprising: a game terminal and a game robot;
the game terminal comprises:
the human body motion capturing device is used for collecting the position and human body motion information of the participants;
the control device is used for providing game information and generating interaction control instructions according to the positions of the participants, the human body action information and the game information;
the game robot is used for collecting interaction control instructions and executing according to the interaction control instructions;
the game terminal also comprises an environment information acquisition device and a projection device;
the environment information acquisition device is used for acquiring real environment information, and the control device generates matched image information according to the real environment information, the positions of participants and the human body action information;
the projection device is used for projecting image information to the real environment so that the image information is matched with the real environment information and the participants;
the control device is used for generating image information matched with the real environment information according to the real environment information, and comprises the following steps:
according to the real environment information, constructing a virtual 3D image, fitting the position and human motion information of the participants with the virtual 3D image, and obtaining virtual interaction image parameters;
the environment information acquisition device is also used for acquiring the space information of the game robot, and the space information at least comprises one or more of position information, angle information and action information of the game robot;
the control device fits the space information of the game robot, the positions of the participants and the human action information with the image information to form a virtual interaction image interacted with the game robot, and virtual interaction image parameters are obtained; the control device corrects the virtual interactive image parameters to form interactive image information for projection;
the projection device projects interactive image information to a real environment;
the control method of the novel game device comprises a human body motion capturing acquisition process and a game running process, wherein the game running process comprises the following steps of:
the control device outputs game information;
the human body motion capturing device collects the positions of participants and human body motion information;
the control device generates a control instruction according to the position of the participant and the human body action information and sends the control instruction to the game robot for execution;
the control device collects space information in the execution process of the game robot and fits the space information with game information to obtain interactive game information;
collecting first space information of the game robot, wherein the first space information is counted as a first moment;
collecting second space information of the game robot, wherein the second space information is calculated as a second moment;
the control device determines a time interval according to the first time and the second time, determines the running trend parameter of the game robot according to the time interval between the first space information and the second space information, fits the running trend parameter with the image information, forms a virtual interactive image aiming at the running trend parameter of the game robot, and obtains the virtual interactive image parameter; the control device corrects the virtual interactive image parameters to form interactive image information for projection.
2. The novel game device according to claim 1, wherein the projection device includes N projection units, and the control device is configured to generate image information matching with the real environment information based on the real environment information, including:
constructing a virtual image according to the real environment information, wherein the virtual image comprises N virtual sub-images, and virtual sub-image parameters are obtained; the virtual sub-images are in one-to-one correspondence with the projection units, and the virtual sub-images are partially or completely overlapped;
and correcting the virtual sub-image parameters according to the position of the projection unit corresponding to the virtual sub-image to form sub-image information for projection of the projection unit.
3. The novel game device according to claim 1, wherein the game robot is one or more of an AGV car and a bionic robot.
4. The novel gaming device of claim 1, wherein the human motion capture acquisition process comprises:
locating the participant position;
collecting the shapes of participants;
simplifying according to the stature of the participator to obtain N bone points;
and acquiring position change information of skeleton points to obtain human motion information.
5. The novel game device of claim 4, wherein said simplifying according to the participant's figure to obtain N skeleton points comprises:
constructing a 3D human model according to the participant figure information;
the 3D human model is simplified, and a matchman structure consisting of N bone points is obtained.
6. The novel game device according to claim 5, wherein the acquiring of the position change information of the skeletal points to obtain the human motion information comprises:
collecting first space positions of all bone points and calculating the first space positions as first moments;
collecting a second spatial position of each bone point and calculating the second spatial position as a second moment;
and determining the time interval according to the first moment and the second moment, determining the position change information of the same skeleton point, and simulating the movement information of the matchman structure from the first moment and the second moment to obtain the human body action information.
CN202011538691.2A 2020-12-23 2020-12-23 Novel game device and control method Active CN113318424B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011538691.2A CN113318424B (en) 2020-12-23 2020-12-23 Novel game device and control method


Publications (2)

Publication Number Publication Date
CN113318424A CN113318424A (en) 2021-08-31
CN113318424B true CN113318424B (en) 2023-07-21

Family

ID=77413193

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011538691.2A Active CN113318424B (en) 2020-12-23 2020-12-23 Novel game device and control method

Country Status (1)

Country Link
CN (1) CN113318424B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201955771U (en) * 2010-11-15 2011-08-31 中国科学院深圳先进技术研究院 Human-computer interaction system
CN105252532A (en) * 2015-11-24 2016-01-20 山东大学 Method of cooperative flexible attitude control for motion capture robot
CN105291114A (en) * 2015-12-04 2016-02-03 北京建筑大学 Home service type robot system based on mobile internet
CN109200576A (en) * 2018-09-05 2019-01-15 深圳市三宝创新智能有限公司 Somatic sensation television game method, apparatus, equipment and the storage medium of robot projection
CN111277808A (en) * 2020-03-24 2020-06-12 欧拓飞科技(珠海)有限公司 Virtual reality enhancement equipment and method




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20211104

Address after: 510663 501-2, Guangzheng science and Technology Industrial Park, No. 11, Nanyun fifth road, Science City, Huangpu District, Guangzhou, Guangdong Province

Applicant after: GUANGZHOU FUGANG LIFE INTELLIGENT TECHNOLOGY Co.,Ltd.

Address before: 510700 501-1, Guangzheng science and Technology Industrial Park, No. 11, Yunwu Road, Science City, Huangpu District, Guangzhou City, Guangdong Province

Applicant before: GUANGZHOU FUGANG WANJIA INTELLIGENT TECHNOLOGY Co.,Ltd.

GR01 Patent grant
GR01 Patent grant