CN113318426B - Novel game system - Google Patents


Info

Publication number
CN113318426B
CN113318426B (application CN202011544368.6A)
Authority
CN
China
Prior art keywords
information
user
image
game
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011544368.6A
Other languages
Chinese (zh)
Other versions
CN113318426A (en)
Inventor
傅峰峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Fugang Life Intelligent Technology Co Ltd
Original Assignee
Guangzhou Fugang Life Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Fugang Life Intelligent Technology Co Ltd filed Critical Guangzhou Fugang Life Intelligent Technology Co Ltd
Priority to CN202011544368.6A priority Critical patent/CN113318426B/en
Publication of CN113318426A publication Critical patent/CN113318426A/en
Application granted granted Critical
Publication of CN113318426B publication Critical patent/CN113318426B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 - Controlling game characters or game objects based on the game progress
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 - Methods for processing data by generating or executing the game program
    • A63F2300/6045 - Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a novel game system comprising an intelligent interaction system. The intelligent interaction system comprises a game terminal and a game robot. The game terminal comprises: an environment information acquisition device for acquiring real environment information and user space parameters; a control device for generating matched image information according to the real environment information and the user space parameters; and a projection device for projecting the image information into the real environment so that it matches the real environment information. The game robot collects user information and image information, sets virtual environment information according to the image information, and executes commands according to the user information and the virtual environment information; the user information is one or more of a user instruction or a user space parameter. The game image scene can thus change intelligently with the environment and respond to the user's actions, achieving a personalized experience.

Description

Novel game system
Technical Field
The present invention relates to a novel game system, and more particularly, to a novel game system that can interact with a user.
Background
With the development of science and technology, electronic games have become widely popular. Game display modes have progressed from old CRT, plasma, and liquid-crystal televisions to augmented reality technologies such as VR, and the realism of game content has gradually improved; nevertheless, games still remain a simple interaction between a player and a game machine.
During play, the game machine collects the player's control commands and displays corresponding image information according to the game software, thereby interacting with the player. Although augmented reality technologies such as VR can provide an immersive experience, most of them rely on head-mounted devices, which restrict the user's movements and degrade the experience; prolonged use can also cause dizziness and reduce enjoyment of the game. In addition, because real objects do not participate in the game process, a high degree of realism still cannot be achieved.
Disclosure of Invention
The invention provides a novel game system that builds a game scene projected and personalized according to the actual environment, collects the user's spatial information, realizes an interaction function with the user, and thereby delivers a unique game experience.
The invention provides a novel game system, which comprises an intelligent interaction system, wherein the intelligent interaction system comprises: a game terminal and a game robot;
the game terminal comprises:
the environment information acquisition device is used for acquiring real environment information and user space parameters;
the control device is used for generating matched image information according to the real environment information and the user space parameters;
the projection device is used for projecting image information to the real environment so as to enable the image information to be matched with the real environment information;
the game robot is used for collecting user information and image information, setting virtual environment information according to the image information and executing commands according to the user information and the virtual environment information; the user information is one or more of a user instruction or a user space parameter.
Further, the novel game system comprises a control end, which collects the user's instructions and sends them to the game terminal and/or the game robot.
Further, the control device is configured to generate image information matched with the real environment information according to the real environment information, and includes:
according to the real environment information, constructing a virtual 3D image, fitting the user space parameters with the virtual 3D image, and obtaining virtual interaction image parameters;
and correcting the virtual interactive image parameters to form image information for projection.
Further, the environment information acquisition device is also used for acquiring the space information of the game robot;
the control device fits the space information of the game robot with the virtual 3D image to form a virtual interaction image interacted with the game robot, and virtual interaction image parameters are obtained; the control device corrects the virtual interactive image parameters to form image information for projection;
the projection device projects image information to a real environment.
Still further, the projection device includes N projection units, and the control device is configured to generate image information matching with the real environment information according to the real environment information, including:
constructing a virtual image according to the real environment information, wherein the virtual image comprises N virtual sub-images, and virtual sub-image parameters are obtained; the virtual sub-images are in one-to-one correspondence with the projection units, and the virtual sub-images are partially or completely overlapped;
and correcting the virtual sub-image parameters according to the position of the projection unit corresponding to the virtual sub-image to form sub-image information for projection of the projection unit.
Still further, the spatial information includes at least one or more of position information, angle information, and motion information of the game robot.
Still further, the environmental information collection device is further configured to collect spatial information of the game robot, including:
collecting first space information of a game robot and/or a user, and calculating a first moment;
collecting second space information of the game robot and/or a user, and calculating a second moment;
the control device determines a time interval according to the first time and the second time, determines running trend parameters of the game robot and/or a user according to the time interval between the first space information and the second space information, fits the running trend parameters with the image information, and forms a virtual interactive image aiming at the running trend parameters of the game robot and/or the user to obtain virtual interactive image parameters; the control device corrects the virtual interactive image parameters to form image information for projection;
the projection device projects image information to a real environment.
Furthermore, the intelligent interaction system further comprises a growth module, wherein the growth module is used for acquiring user information, image information, space information and virtual interaction images, taking the user information as a first parameter, taking the image information as a second parameter, taking the space information as a third parameter and taking the virtual interaction images as evaluation parameters, and establishing the response relation between the first parameter, the second parameter, the third parameter and the evaluation parameters; the evaluation parameters are provided with a plurality of target parameters, and after receiving the user information and the image information, the game robot determines the space information corresponding to each target parameter according to the target parameters and selectively executes the space information.
Further, the game terminal further comprises a user instruction acquisition module, wherein the user instruction acquisition module comprises one or more of a voice recognition unit and a human body action recognition unit, and the user instruction acquisition module is used for converting voice or actions of a user into instructions of the user.
Further, the game robot can be one or more of an AGV trolley and a bionic robot.
Compared with the prior art, the game system of the invention uses the environment information acquisition device of the game terminal to collect information about the real external environment and the user, and, together with the projection device, obtains image information matched with both. The game image scene can thus change intelligently with the external environment and respond to the user's actions, achieving a personalized experience. Meanwhile, by adopting a game robot that reacts according to the user's information, a game-combat effect is realized, the realism of the game is improved, and an immersive experience is achieved.
Drawings
FIG. 1 is a block diagram of an embodiment of the present invention.
Detailed Description
So that those skilled in the art will better understand the present invention, the technical solution of the embodiments is described clearly and completely below. The described embodiments are only some, not all, of the embodiments of the invention.
The terms "first", "second", and the like in the description, claims, and figures are used to distinguish different objects, not necessarily to describe a sequential or chronological order. Furthermore, the terms "comprise" and "have", and any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, apparatus, or article that comprises a list of steps or elements is not limited to those listed, but may optionally include other steps or elements not listed or inherent to such a process, method, apparatus, or article.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
The embodiment of the invention provides a novel game system, which comprises an intelligent interaction system, wherein the intelligent interaction system is shown in fig. 1 and comprises: a game terminal and a game robot;
the game terminal comprises:
the environment information acquisition device is used for acquiring real environment information and user space parameters;
the real environment information in the embodiment of the invention comprises the space environment information of the area where the intelligent interaction system is located and the space parameters of a user, namely the 3D space structure, the position information and the action information of the user;
the control device is used for generating matched image information according to the real environment information and the user space parameters;
the control device collects information collected by the environment information collection device and establishes an indoor virtual 3D model; the control device also comprises an image information model library, wherein a plurality of theme image packages are stored in the image information model library, and each theme image package comprises a background image main unit and a background image sub-unit; the background image main unit is used for providing a theme background according to the indoor space, and the background image sub-unit is used for providing an image background matched with the indoor object in size; in the running process of the control device, the control device collects the theme background in the background image main unit and fits with the indoor space size, so that the fitted theme background is matched with the indoor space size; the control device further collects image backgrounds in the background image subunit and fits the image backgrounds with indoor articles respectively, so that the fitted image backgrounds are matched with the indoor articles; the control device establishes image information matched with real environment information and user space parameters according to the fitted image background;
the projection device is used for projecting image information to the real environment so as to enable the image information to be matched with the real environment information;
the projection device comprises a projector, wherein the projector projects image information to the surface of the real environment;
the game robot is used for collecting user information and image information, setting virtual environment information according to the image information and executing commands according to the user information and the virtual environment information; the user information is one or more of a user instruction or a user space parameter.
The game robot is provided with a moving module for changing the position of the game robot, and at least comprises a mechanical arm for executing a pick-up command of a user or interacting with the user.
According to the embodiment of the invention, the information acquisition of the real external environment where the game system is located and the information acquisition of the user is realized through the environment information acquisition device of the game terminal, and the image information matched with the real external environment and the user is obtained by matching with the projection device, so that the game image scene can be intelligently changed according to the external environment and changed by matching with the action of the user, and the personalized experience effect is achieved. Meanwhile, the embodiment of the invention realizes the game fight effect by adopting the game robot and utilizing the game robot to react according to the information of the user, improves the sense of reality of the game and realizes the immersive experience effect.
Optionally, the novel game system further comprises a control end, wherein the control end is used for collecting instructions of a user and sending the instructions of the user to the game end and/or the game robot.
The control end may be one or more of a remote-control handle, a mobile phone, or a smart sensing device.
According to the embodiment of the invention, the control end is adopted to realize information interaction between a user and the game end and between the user and the game robot.
Optionally, the control device is configured to generate, according to the real environment information, image information matched with the real environment information, including:
according to the real environment information, constructing a virtual 3D image, fitting the user space parameters with the virtual 3D image, and obtaining virtual interaction image parameters;
the control device acquires information acquired by the environment information acquisition device, acquires 3D information of a real environment, and establishes an indoor virtual 3D map model; the control device extracts a virtual image matched with the virtual 3D map from the theme image package according to the virtual 3D map;
and correcting the virtual interactive image parameters to form image information for projection.
A surface at a certain angle in the real environment is set as the projection surface; the relative positions of the projector and the projection surface are collected, and the virtual interactive image parameters are corrected accordingly. Because of the characteristics of the projector, projecting the virtual image directly into the real environment would otherwise produce deviation and impair the actual experience.
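A common way to model this kind of correction (not stated in the patent, but a standard technique) is a planar homography: measure where the projector's image corners actually land on the tilted surface, solve for the 3x3 mapping, and pre-warp the image with its inverse. A minimal direct-linear-transform sketch, with made-up corner measurements:

```python
import numpy as np

def homography(src, dst):
    """3x3 homography mapping 4 src points onto 4 dst points (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.array(A, float))
    H = Vt[-1].reshape(3, 3)       # null-space vector of the 8x9 system
    return H / H[2, 2]

def warp(H, p):
    """Apply homography H to a 2D point p."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# Rendered image corners vs. where the beam lands on the tilted surface
# (the dst values here are illustrative, not measured data).
src = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
dst = [(40, 25), (1880, 0), (1920, 1080), (0, 1050)]
H = homography(src, dst)
```

Pre-warping the virtual image with `np.linalg.inv(H)` before projection would then cancel the keystone deviation on the surface.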
Optionally, the environmental information collection device is further used for collecting spatial information of the game robot;
the environment information acquisition device acquires spatial information of the game robot, wherein the spatial information at least comprises one of horizontal position information of the game robot, height position information of the game robot and action information of the game robot;
the control device fits the space information of the game robot with the virtual 3D image to form a virtual interaction image interacted with the game robot, and virtual interaction image parameters are obtained; the control device corrects the virtual interactive image parameters to form image information for projection;
the control device collects the space information of the game robot, fits the game robot into the image information to form a virtual interaction image, obtains virtual interaction image parameters, and obtains the image information interacted with the game robot through correction;
the projection device projects image information to a real environment.
According to the environment information acquisition device, the control device can determine the specific position and action of the game robot in the virtual 3D map by acquiring the space information of the game robot, so that the fitting process of the game robot and the image information is realized, and the interaction effect of the game image information and the robot is achieved.
In particular, the projection device includes N projection units, and the control device is configured to generate image information matching with the real environment information based on the real environment information, including:
constructing a virtual image according to the real environment information, wherein the virtual image comprises N virtual sub-images, and virtual sub-image parameters are obtained; the virtual sub-images are in one-to-one correspondence with the projection units, and the virtual sub-images are partially or completely overlapped;
wherein N is a natural number, and in the embodiment of the invention, the projection unit is a projector; the control device divides the virtual image into N virtual sub-images according to the relative positions of the projection unit and the real environment, so that the virtual sub-images correspond to the projection unit one by one;
and correcting the virtual sub-image parameters according to the position of the projection unit corresponding to the virtual sub-image to form sub-image information for projection of the projection unit.
The method comprises the steps of setting a plane at a certain angle in a real environment as a projection plane, collecting the relative positions of each projection unit and the projection plane, and correcting virtual sub-image parameters to form sub-image information suitable for projection of each projection unit.
A single projector can only project in one direction, so its projection onto a 3D structure is poor and lacks realism. The embodiment of the invention uses N projection units, coordinated by the control device, to superimpose projections onto the real scene, realizing multi-angle projection and improving the realism of the projection.
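The division of the virtual image into N partially overlapping sub-images can be sketched as follows. This is an illustrative one-dimensional (horizontal strips) simplification with assumed names; the patent does not specify the partitioning geometry.

```python
def split_with_overlap(width, n, overlap):
    """Split a virtual image of `width` px into n horizontal sub-images,
    one per projection unit, with adjacent sub-images overlapping by
    `overlap` px (the overlap region is where projections superimpose).

    Returns a list of (x_start, x_end) pixel ranges.
    """
    # n strips of width w covering `width` px with (n-1) overlaps:
    # n*w - (n-1)*overlap == width
    w = (width + (n - 1) * overlap) / n
    strips = []
    for i in range(n):
        x0 = i * (w - overlap)
        strips.append((round(x0), round(x0 + w)))
    return strips

# Three projectors covering a 3840 px wide virtual image, 120 px overlap:
strips = split_with_overlap(3840, 3, 120)
```

In practice each overlap region would also get a blending ramp so that the superimposed brightness stays uniform, and each strip would then be corrected per projector position as described above.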
In particular, the spatial information at least includes one or more of position information, angle information, and motion information of the game robot.
The control device can determine the specific position of the game robot in the real space and the occupied size of the game robot in the space according to the game robot horizontal position information and the game robot height position information, and the game robot action information is used for the control device to judge the motion trail of the game robot so as to realize the interaction effect of the game robot and the image information.
The embodiment of the invention can determine the specific position and the running state of the game robot in the real space by collecting the position information, the angle information and the action information of the game robot, and realize the interaction of the game robot and the image information.
In particular, the environmental information collection device is further configured to collect spatial information of the game robot, including:
collecting first space information of a game robot and/or a user, and calculating a first moment;
the environment information acquisition device is internally provided with a timing unit, the timing unit is controlled to start timing when the space information of the game robot and the user is acquired, and the space information of the game robot and the user at the moment is set as first space information;
collecting second space information of the game robot and/or a user, and calculating a second moment;
the method comprises the steps of collecting space information of a game robot and a user after a time interval, and setting the space information as second space information;
the control device determines a time interval according to the first time and the second time, determines running trend parameters of the game robot and/or a user according to the time interval between the first space information and the second space information, fits the running trend parameters with the image information, and forms a virtual interactive image aiming at the running trend parameters of the game robot and/or the user to obtain virtual interactive image parameters; the control device corrects the virtual interactive image parameters to form image information for projection;
the control device obtains the position change of the game robot and/or the user according to the change trend of the first space information and the second space information and the time difference of the first moment and the second moment, so that the running trend parameters of the game robot and the user are determined, the future space information of the game robot and the user is judged, and the game robot and the user are fitted with the image information according to the future space information of the game robot and the user, so that the control device can intelligently respond according to the running trend of the game robot and the user, and the interaction effect is improved;
the projection device projects image information to a real environment.
To judge the running trend of the game robot and the user more accurately, the control device collects the changes in their spatial information over several successive moments and determines from these changes whether the robot is in an accelerating, decelerating, or stopped state.
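The trend estimation from two timed spatial samples can be sketched as below. The function name, the state labels, and the linear extrapolation of the future position are illustrative assumptions; the patent only describes the idea of comparing spatial information across a time interval.

```python
def motion_trend(p1, t1, p2, t2, prev_speed=None, eps=1e-3):
    """Estimate speed, motion state, and a predicted future position
    from two timed 2D position samples (t1 < t2, seconds).

    prev_speed: speed from the previous interval, if known, used to
    classify acceleration vs. deceleration.
    """
    dt = t2 - t1
    v = ((p2[0] - p1[0]) / dt, (p2[1] - p1[1]) / dt)   # velocity
    speed = (v[0] ** 2 + v[1] ** 2) ** 0.5
    if speed < eps:
        state = "stopped"
    elif prev_speed is None:
        state = "moving"
    elif speed > prev_speed + eps:
        state = "accelerating"
    elif speed < prev_speed - eps:
        state = "decelerating"
    else:
        state = "steady"
    # linear extrapolation: expected position after another dt seconds
    future = (p2[0] + v[0] * dt, p2[1] + v[1] * dt)
    return speed, state, future

# Robot moves 1 m along x in 0.5 s:
speed, state, future = motion_trend((0, 0), 0.0, (1, 0), 0.5)
```

The control device could fit `future` into the virtual interactive image so the projected scene reacts ahead of the robot's or user's movement.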
According to the embodiment of the invention, the environmental information acquisition device is used for acquiring the spatial information of the game robot and the user at a plurality of moments, so that the movement conditions of the game robot and the user are determined, the future running trend of the game robot and the user is further judged, the intelligent response with the game robot and the user is realized, and the interaction effect of a game system is improved.
Particularly, the intelligent interaction system further comprises a growth module, wherein the growth module is used for acquiring user information, image information, space information and virtual interaction images, taking the user information as a first parameter, taking the image information as a second parameter, taking the space information as a third parameter and taking the virtual interaction images as evaluation parameters, and establishing response relations among the first parameter, the second parameter, the third parameter and the evaluation parameters; the evaluation parameters are provided with a plurality of target parameters, and after receiving the user information and the image information, the game robot determines the space information corresponding to each target parameter according to the target parameters and selectively executes the space information.
The game robot spatial information includes the robot's running information. The growth module combines the image information provided by the control device with the game robot's spatial information, builds a function with the virtual interactive image to form a response relation, and uses the virtual interactive image as an evaluation of the robot's running information. A scoring system is thereby established: different running information is scored, the optimal running information is obtained, and intelligent cultivation of the game robot is achieved.
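The scoring idea can be sketched as a toy lookup table: record how well each candidate action (spatial/running information) scored in a given situation (user information plus image information), then prefer the best-scoring action next time. The class name, the exponential running average, and the example labels are all illustrative assumptions, not the patent's mechanism.

```python
class GrowthModule:
    """Toy sketch: score (situation, action) pairs and pick the best."""

    def __init__(self):
        self.scores = {}  # (situation, action) -> running score

    def record(self, situation, action, score):
        """Blend a new evaluation into the running score for this pair."""
        key = (situation, action)
        old = self.scores.get(key, 0.0)
        self.scores[key] = 0.8 * old + 0.2 * score  # exponential average

    def best_action(self, situation, candidates):
        """Return the candidate action with the highest running score."""
        return max(candidates,
                   key=lambda a: self.scores.get((situation, a), 0.0))

gm = GrowthModule()
gm.record("user_throws_ball", "dodge_left", 0.9)
gm.record("user_throws_ball", "dodge_right", 0.4)
```

After enough recorded rounds, `best_action` lets the robot "grow" toward the responses that were evaluated most favourably, which is the cultivation behaviour the growth module describes.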
According to the embodiment of the invention, the growth module is adopted, so that the game robot can automatically judge the optimal operation information after receiving the user information for operation for a plurality of times, and the game robot has a growth function, so that intelligent cultivation is realized.
Optionally, the game terminal further includes a user instruction acquisition module. The user instruction acquisition module includes one or more of a voice recognition unit and a human body action recognition unit, and is used for converting the voice or actions of the user into user instructions.
The voice recognition unit collects the user's voice, converts it into a control command according to its content, and sends the command to the game robot. The human body action recognition unit recognizes human body actions, converts the user's actions into control commands, and sends them to the game robot.
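The final step of both units, mapping a recognition result to a robot control command, can be sketched as a simple dispatch table. The table entries, command names, and channel labels below are illustrative assumptions; the patent does not specify a command vocabulary.

```python
COMMAND_TABLE = {
    # (channel, recognized input) -> robot control command (illustrative)
    ("voice", "come here"): "MOVE_TO_USER",
    ("voice", "pick it up"): "ARM_PICK",
    ("gesture", "wave"): "GREET",
    ("gesture", "point"): "MOVE_TO_TARGET",
}

def to_command(channel, recognized):
    """Convert a voice- or action-recognition result into a control
    command for the game robot; unknown inputs are ignored."""
    key = (channel, recognized.strip().lower())
    return COMMAND_TABLE.get(key, "IGNORE")
```

Keeping the mapping in data rather than code lets new voice phrases or gestures be added without touching the recognition units themselves.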
According to the embodiment of the invention, the user instruction acquisition module can recognize the user's voice or body actions, realizing intelligent interaction with the user and avoiding dependence on control ends such as a gamepad, a mobile phone or a remote controller.
Optionally, the game robot may be one or more of an AGV cart and a bionic robot.
There may be one or more game robots. A game robot may be a robot with a moving function, such as an AGV cart or a bionic robot, or a mechanical structure that cannot move but can perform multiple actions, such as a mechanical arm. In the embodiment of the invention, the game robot is an AGV cart equipped with a mechanical arm; besides moving, it can use the mechanical arm to perform various actions, realizing a good interaction effect.
Finally, it should be noted that the above-mentioned embodiments are only for illustrating the technical solution of the present invention and not for limiting the same, and although the present invention has been described in detail with reference to the above-mentioned embodiments, it should be understood by those skilled in the art that modifications and equivalents may be made to the specific embodiments of the present invention after reading the present specification, and these modifications and variations do not depart from the scope of the invention as claimed in the pending claims.

Claims (7)

1. A novel game system, wherein the novel game system comprises an intelligent interactive system, the intelligent interactive system comprising: a game terminal and a game robot;
the game terminal comprises:
the environment information acquisition device is used for acquiring real environment information and user space parameters, wherein the user space parameters comprise position information and action information of a user;
the control device is used for generating matched image information according to the real environment information and the user space parameters;
the projection device is used for projecting image information to the real environment so as to enable the image information to be matched with the real environment information;
the game robot is used for collecting user information and image information, setting virtual environment information according to the image information and executing commands according to the user information and the virtual environment information; the user information is one or more of a user instruction or a user space parameter; the game robot is provided with a moving module and at least comprises a mechanical arm, wherein the mechanical arm is used for executing a pick-up command of a user or interacting with the user;
the control device is used for generating image information matched with the real environment information according to the real environment information, and comprises the following steps:
according to the real environment information, constructing a virtual 3D image, fitting the user space parameters with the virtual 3D image, and obtaining virtual interaction image parameters;
correcting the virtual interactive image parameters to form image information for projection;
the environment information acquisition device is also used for acquiring the space information of the game robot;
the control device fits the space information of the game robot and the space parameters of the user with the virtual 3D image to form a virtual interaction image interacted with the user and the game robot, and virtual interaction image parameters are obtained; the control device corrects the virtual interactive image parameters to form image information for projection;
the projection device projects image information to a real environment;
and, the environment information acquisition device is used for collecting the space information of the game robot, which comprises:
collecting first space information of the game robot and/or the user, and recording a first moment;
collecting second space information of the game robot and/or the user, and recording a second moment;
the control device determines a time interval according to the first moment and the second moment, determines running trend parameters of the game robot and/or the user according to the time interval and the first and second space information, fits the running trend parameters with the image information, and forms a virtual interactive image aimed at the running trend parameters of the game robot and/or the user to obtain virtual interactive image parameters; the control device corrects the virtual interactive image parameters to form image information for projection;
the projection device projects image information to a real environment.
2. The novel game system of claim 1, further comprising a control end for capturing instructions of a user and transmitting the instructions of the user to the game terminal and/or the game robot.
3. The novel game system according to claim 1, wherein the projection device includes N projection units, and the control device is configured to generate image information matching with the real environment information based on the real environment information, including:
constructing a virtual image according to the real environment information, wherein the virtual image comprises N virtual sub-images, and virtual sub-image parameters are obtained; the virtual sub-images are in one-to-one correspondence with the projection units, and the virtual sub-images are partially or completely overlapped;
and correcting the virtual sub-image parameters according to the position of the projection unit corresponding to the virtual sub-image to form sub-image information for projection of the projection unit.
4. The novel game system according to claim 3, wherein the spatial information comprises at least one of position information, angle information and motion information of the game robot.
5. The novel game system of claim 4, wherein the intelligent interactive system further comprises a growth module for collecting user information, image information, spatial information and virtual interactive images, the growth module being used for establishing a response relation among a first parameter, a second parameter, a third parameter and an evaluation parameter by taking the user information as the first parameter, the image information as the second parameter, the spatial information as the third parameter and the virtual interactive image as the evaluation parameter; the evaluation parameter is provided with a plurality of target parameters, and after receiving the user information and the image information, the game robot determines the spatial information corresponding to each target parameter according to the target parameters and selectively executes it.
6. The novel game system of claim 1, wherein the game terminal further comprises a user instruction acquisition module, the user instruction acquisition module comprises one or more of a voice recognition unit and a human body action recognition unit, and the user instruction acquisition module is used for converting the voice or actions of the user into user instructions.
7. The novel game system of claim 1, wherein the game robot is selected from one or more of an AGV cart and a biomimetic robot.
CN202011544368.6A 2020-12-23 2020-12-23 Novel game system Active CN113318426B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011544368.6A CN113318426B (en) 2020-12-23 2020-12-23 Novel game system


Publications (2)

Publication Number Publication Date
CN113318426A CN113318426A (en) 2021-08-31
CN113318426B true CN113318426B (en) 2023-05-26

Family

ID=77413242

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011544368.6A Active CN113318426B (en) 2020-12-23 2020-12-23 Novel game system

Country Status (1)

Country Link
CN (1) CN113318426B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201955771U (en) * 2010-11-15 2011-08-31 中国科学院深圳先进技术研究院 Human-computer interaction system
CN105252532A (en) * 2015-11-24 2016-01-20 山东大学 Method of cooperative flexible attitude control for motion capture robot
CN109200576A (en) * 2018-09-05 2019-01-15 深圳市三宝创新智能有限公司 Somatic sensation television game method, apparatus, equipment and the storage medium of robot projection
CN111277808A (en) * 2020-03-24 2020-06-12 欧拓飞科技(珠海)有限公司 Virtual reality enhancement equipment and method


Also Published As

Publication number Publication date
CN113318426A (en) 2021-08-31

Similar Documents

Publication Publication Date Title
CN105344101B (en) Simulated race device that a kind of picture is Tong Bu with mechanical movement and analogy method
CN105373224B (en) A kind of mixed reality games system based on general fit calculation and method
EP2394717B1 (en) Image generation system, image generation method, and information storage medium for video games
EP2044503B1 (en) Apparatus and method of interaction with a data processor
US11049324B2 (en) Method of displaying virtual content based on markers
US9504920B2 (en) Method and system to create three-dimensional mapping in a two-dimensional game
CN110728739B (en) Virtual human control and interaction method based on video stream
US9155967B2 (en) Method for implementing game, storage medium, game device, and computer
US8854304B2 (en) Image generation system, image generation method, and information storage medium
US8655015B2 (en) Image generation system, image generation method, and information storage medium
US9849378B2 (en) Methods, apparatuses, and systems for remote play
CN107065409A (en) Trend projection arrangement and its method of work
EP2395454A2 (en) Image generation system, shape recognition method, and information storage medium
CN202150897U (en) Body feeling control game television set
CN107027014A (en) A kind of intelligent optical projection system of trend and its method
CN104102412A (en) Augmented reality technology-based handheld reading equipment and reading method thereof
EP2394710A2 (en) Image generation system, image generation method, and information storage medium
CN204028887U (en) A kind of reading of the hand-held based on augmented reality equipment
CN206575538U (en) A kind of intelligent projection display system of trend
CN114283229A (en) Method, device and equipment for generating walking animation of virtual character and storage medium
CN114474066B (en) Intelligent humanoid robot control system and method
CN111803904A (en) Dance teaching exercise device and method
CN113318426B (en) Novel game system
CN113797525B (en) Novel game system
CN113318425B (en) Novel game device and control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20211103

Address after: 510663 501-2, Guangzheng science and Technology Industrial Park, No. 11, Nanyun fifth road, Science City, Huangpu District, Guangzhou, Guangdong Province

Applicant after: GUANGZHOU FUGANG LIFE INTELLIGENT TECHNOLOGY Co.,Ltd.

Address before: 510700 501-1, Guangzheng science and Technology Industrial Park, No. 11, Yunwu Road, Science City, Huangpu District, Guangzhou City, Guangdong Province

Applicant before: GUANGZHOU FUGANG WANJIA INTELLIGENT TECHNOLOGY Co.,Ltd.

GR01 Patent grant