CN109453525A - Entertainment interaction system and method based on an immersive robot - Google Patents

Entertainment interaction system and method based on an immersive robot

Info

Publication number
CN109453525A
Authority
CN
China
Prior art keywords
robot
information
user
module
shooting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811323111.0A
Other languages
Chinese (zh)
Other versions
CN109453525B (en)
Inventor
范鹏
曾帅
晋睿
张雷
左惠文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Fangde Technology Co Ltd
Original Assignee
Chengdu Fangde Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Fangde Technology Co Ltd
Priority to CN202210267212.0A (published as CN114618169A)
Priority to CN202210268191.4A (published as CN114632332A)
Priority to CN201811323111.0A (published as CN109453525B)
Publication of CN109453525A
Application granted
Publication of CN109453525B
Legal status: Active
Anticipated expiration


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/219 Input arrangements for video game devices characterised by their sensors, purposes or types for aiming at specific areas on the display, e.g. light-guns
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/56 Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076 Shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

The present invention provides an entertainment interaction system based on an immersive robot. A cloud server comprises at least a shooting control module that exchanges information with a mobile intelligent terminal and with the robot respectively, and a travel path planning module that exchanges information with the shooting control module and with the robot respectively. The shooting control module is configured to: perform information analysis on first simulated-laser emission/reception information collected by the mobile intelligent terminal and/or second simulated-laser emission/reception information collected by the robot, together with the collected real-time firing rate information, generate a firing rate control parameter for the user, and output the firing rate control parameter to the robot to control the robot's firing rate. The travel path planning module is configured to: obtain environment monitoring information for planning an evasion route for the robot, determine one or more evasion paths based on the obtained user grade information and the environment monitoring information, and instruct the robot to evade the user's shooting and update its travel path accordingly.

Description

Entertainment interaction system and method based on an immersive robot
Technical field
The present invention relates to the technical fields of human-computer interaction, shooting training, simulated laser shooting, autonomously controlled shooting-game robots, multi-player shooting strategy training and military training, and more particularly to an entertainment interaction system and method based on an immersive robot.
Background technique
With the continuous fusion of informatization and industrialization, the intelligent industry represented by robotics is developing rapidly and has become an important symbol of modern technological innovation. China has included robotics and intelligent manufacturing among the priority fields of national scientific and technological innovation.
With the continuous development of science and technology and the introduction of information technology, computer technology and artificial intelligence technology, robotics research is no longer limited to the industrial field and has gradually extended to fields such as medical treatment, health care, household use, entertainment and the service industry. Accordingly, people's requirements for robots have risen from simple repetitive mechanical actions to intelligent robots that offer anthropomorphic question answering, autonomy and the ability to interact with other robots. Human-computer interaction has therefore become the key factor that determines the development of intelligent robots; improving the interactive capability of intelligent robots and promoting their intelligence is currently a major problem urgently in need of a solution.
Robots are widely used in the field of entertainment simulation, especially in simulated shooting technology. The basic principle of simulated shooting technology is that a computer generates a simulated shooting scene which is shown on a display device such as a large screen, or projected onto a screen by a projector; the scene contains simulated shooting targets, such as virtual enemies and virtual equipment, that appear according to preset rules. During simulated shooting, the shooter uses a model gun, such as an infrared gun or a laser gun, to shoot at the simulated shooting targets in the scene; a position detection system calculates the point of impact and matches it against the position of the displayed target, and the shooter is deemed to have hit the target when the match succeeds. Because the target has been changed from a traditional fixed target to an intelligent robot controlled by a computer, the existing simulated shooting system is a kind of "man-machine confrontation" system.
On the basis of the traditional simulated shooting system, technicians have further developed a more realistic simulated shooting system. This system produces a scene file from a real shooting scene and plays the file on a display; since the displayed scene is taken from a real shooting scene, the shooter has a strong sense of reality, as if present at the scene.
However, although the simulation effect is increasingly realistic, in existing simulated shooting systems the settings of the simulated shooting target and of the simulated shooting scene cannot change. As a result, the simulated shooting target can only move back and forth mechanically along a preset trajectory to detect the user, stop moving when the user is detected, and attack the player at a preset firing rate. Therefore, as the number of uses increases, the user easily memorizes the appearance pattern of the simulated shooting target; not only can the user easily evade the attacks of the simulated shooting target, but the target must always stop moving when it is about to attack the player and remains exposed in the user's field of fire for a considerable time. Unlike a real game, in which two players attack each other and can each evade the other's sustained attack, this reduces the difficulty and playability of the game and weakens the simulation effect of the simulated shooting target. Moreover, because the simulated shooting scene is usually single and fixed, the tense atmosphere of a real shooting environment cannot be created.
Chinese patent publication CN107121019A discloses a group-confrontation shooting training system composed of multiple intelligent moving targets and used to simulate shooting in actual combat scenes, with functions such as group control, target identification, danger judgment and autonomous confrontation. Group control means that multiple moving targets plan paths or move freely according to tactical problems, and that moving targets avoid each other and avoid shelters; danger judgment means that a moving target can automatically sense, judge and escape from danger; autonomous confrontation means that after a moving target enters the engagement range, it can recognize the target by image processing and fire back. The patent includes interconnected modules such as master control, video positioning, autonomous identification, moving target and laser countermeasure; a training-ground coordinate system is constructed by the video positioning and master control modules, real-time target coordinates are provided, unobstructed group movement is realized, and danger judgment and autonomous confrontation are carried out by program, providing trainees with group tactical confrontation shooting conditions that match actual combat and improving cooperation skills.
Although the group-confrontation shooting training system provided by that patent can, by setting various tactical modes, provide technical support for identification shooting or cooperative shooting under organized combat conditions, so that trainees are truly integrated into shooting training and soldiers' practical combat skills can be improved, the intelligent moving target serving as the simulated shooting target is limited to moving along preset fixed routes in a predetermined tactical formation under its group control. After repeated interactions the user easily works out the appearance pattern of the simulated shooting target; not only can the user easily evade its attacks, but the target must always stop moving when about to attack the player and remains exposed in the user's field of fire for a considerable time. Unlike a real game, in which two players attack each other and can each evade the other's sustained attack, this reduces the difficulty and playability of the game and weakens the simulation effect of the simulated shooting target.
Chinese patent publication CN2793674 discloses a shooting simulation device with a combined virtual-real display effect. The device has a display unit, a computing unit, a camera unit, a gun capable of emitting a light beam, a half-mirror and a constructed real scene; the computing unit is connected to the display unit and the camera unit respectively; the display unit shows the virtual scene drawn by the computing unit, which is projected onto the half-mirror and, after reflection by the half-mirror, appears before the eyes of the player holding the gun; the constructed real scene and the player are located on the two sides of the half-mirror, and the real scene, after projection by the half-mirror, also appears before the player's eyes.
Although the shooting simulation device with a combined virtual-real display effect provided by that patent can display the virtual scene and the actual scene simultaneously before the player's eyes, thereby achieving a more lifelike simulation of the shooting scenario, the device only allows the player to attack the moving simulated shooting target, while the simulated shooting target cannot open fire to strike back at the player. The player can therefore concentrate on hitting the simulated shooting target without worrying about its counterattack; there is no effective interaction with the simulated shooting target, and the tense atmosphere of a real shooting environment cannot be created. Moreover, although the half-mirror and the constructed real scene used in that patent solve the problem of the large floor area of traditional systems, the maintenance cost of such a single-type venue scene is high, and the demand of the general public for novel and changeable shooting scenes is not met, so the sense of reality and the immersive experience brought to the user are insufficient.
Summary of the invention
In view of the deficiencies of the prior art, the present invention provides an entertainment interaction system based on an immersive robot, comprising at least one robot, a cloud server and at least one mobile intelligent terminal worn by a user, wherein the mobile intelligent terminal is used to automatically collect first simulated-laser emission/reception information, and the robot is used to automatically collect environment monitoring information of the surroundings of the robot, real-time firing rate information of the robot and second simulated-laser emission/reception information. The cloud server comprises at least a shooting control module that exchanges information with the mobile intelligent terminal and with the robot respectively, and a travel path planning module that exchanges information with the shooting control module and with the robot respectively.
Preferably, the travel path planning module is configured to: under the condition that the user hits the robot, obtain the environment monitoring information for planning an evasion route for the robot, and determine an evasion path based on the environment monitoring information and the user grade information generated by the shooting control module, so as to instruct the robot to evade the user's shooting and update its travel path accordingly.
According to a preferred embodiment, the shooting control module is configured to: generate user grade information based on the first simulated-laser emission/reception information collected by the mobile intelligent terminal and/or the second simulated-laser emission/reception information collected by the robot, perform information analysis in combination with the collected real-time firing rate information to generate a firing rate control parameter for the user, and output the generated firing rate control parameter to the robot to control the robot's firing rate.
According to a preferred embodiment, under the condition that the user hits the robot so that the robot receives the second simulated-laser emission/reception information, the travel path planning module obtains the current position coordinates of the robot and the environment monitoring information, wherein
the travel path planning module uses the shelter information sensed in the environment monitoring information and the current position coordinates of the user, the shelter information including at least the shape and volume of the shelter, the position coordinates of the shelter and the relative distance between the robot and the shelter, and, combining the current position coordinates of the robot with the environment monitoring information, plans, using the artificial potential field method, an evasion path of the robot to a target point behind the shelter so as to evade the user's shooting.
According to a preferred embodiment, the travel path planning module generates a scene map containing the position coordinate information of the shelters from the video information collected by a video module, generates a number of travel paths of the robot on the scene map based on given starting and target positions, and stores them in a preset database and/or provides them to the robot.
According to a preferred embodiment, the travel path planning module is configured to: identify the boundaries of the shelters and of the scene from the video information collected by the video module, and thereby generate the scene map containing the position coordinate information of the shelters, so that the robot can move along the planned travel path while avoiding the shelters.
According to a preferred embodiment, the steps by which the travel path planning module plans the evasion path of the robot using the artificial potential field method include at least:
under the condition that the user hits the robot so that the robot receives the second simulated-laser emission/reception information, substituting the current position coordinates of the robot and the environment monitoring information into the following attraction function and repulsion function respectively, computing the resultant of the attraction and repulsion, and determining the direction of the robot's evasion path from the resultant, wherein
Attraction function: F_att(X) = k(X_G − X_R),
Repulsion function:
Its resultant force are as follows:
where k denotes the attraction gain, η_X denotes the repulsion gain coefficient, ρ(X_R, X_i) denotes the distance from the shelter to the robot, ρ(X_R, X_G) denotes the distance from the robot to the target point behind the shelter, ρ_0 denotes the maximum distance over which the shelter can influence the robot, X_R denotes the current position coordinates of the robot, X_i denotes the current position coordinates of the shelter, and X_G denotes the position coordinates of the target point behind the shelter.
According to a preferred embodiment, the shooting control module of the cloud server performs information analysis on the first simulated-laser emission/reception information collected by the mobile intelligent terminal and/or the second simulated-laser emission/reception information collected by the robot to obtain the user's current cumulative score and the user grade information corresponding to the current cumulative score stored in the preset database.
The information analysis process includes at least retrieving, based on the obtained user grade information, the firing rate control parameter stored in the preset database in correspondence with that user grade information; the shooting control module outputs the retrieved firing rate control parameter to the robot so that the robot's firing rate corresponds to the current user grade.
According to a preferred embodiment, the information analysis process includes at least retrieving, based on the obtained user grade information, the movement speed control parameter stored in the preset database in correspondence with that user grade information. Under the condition that the user hits the robot so that the robot receives the second simulated-laser emission/reception information, the shooting control module outputs the retrieved movement speed control parameter to the travel path planning module, and the travel path planning module instructs the robot to evade the user's shooting by moving, at the speed specified by the movement speed control parameter, along the direction of the evasion path generated based on the environment monitoring information.
According to a preferred embodiment, the cloud server further comprises a scene control module that exchanges information with the shooting control module and is associated, via scene control information, with a scene display device. Preferably, the scene control module is used to obtain the user grade information generated by the shooting control module and update the user's current grade information accordingly. When the current user grade information changes, the scene control module retrieves the scene information stored in the preset database in correspondence with that user grade information, displays the corresponding scene through the scene display device, and prompts the user with the updated user grade information through the mobile intelligent terminal.
The invention also provides an entertainment interaction method based on an immersive robot, involving at least one robot, a cloud server and at least one mobile intelligent terminal worn by a user, the cloud server comprising at least a shooting control module and a travel path planning module. The entertainment interaction method comprises at least the following steps: under the condition that the user hits the robot, obtaining the environment monitoring information for planning an evasion route for the robot, and determining an evasion path based on the environment monitoring information and the user grade information generated by the shooting control module, so as to instruct the robot to evade the user's shooting and update its travel path accordingly.
According to a preferred embodiment, the entertainment interaction method further comprises the following steps: generating user grade information based on the first simulated-laser emission/reception information collected by the mobile intelligent terminal and/or the second simulated-laser emission/reception information collected by the robot, performing information analysis in combination with the collected real-time firing rate information to generate a firing rate control parameter for the user, and outputting the generated firing rate control parameter to the robot to control the robot's firing rate.
Advantageous effects of the invention:
(1) With the entertainment interaction system provided by the invention, the user and the robot exchange information with the platform space in a manner that allows each to actively attack the other; the platform space controls the robot's firing rate and evasion path through data transmission so that the robot interacts with the user in simulation. When hit by the user's laser, the robot can evade the shooting by itself along the evasion path and, after evading, switch to a different travel path, avoiding the traditional single shooting pattern and fixed movement path. This increases the fidelity of the entertainment system and the playability of the entertainment method, and improves the player's sense of substitution and sense of reality across different entertainment scenes.
(2) By establishing a virtual-repulsion artificial potential field model for the robot, the robot's firing rate and movement speed are varied synchronously with changes in the user's game grade, improving the simulation effect of the system. Because the robot is controlled to switch to another travel path after evading the shooting, even if the user plays more and more often and for longer, the user cannot memorize the appearance pattern of the simulated shooting target, which increases the difficulty and playability of the game and enhances the realistic simulation effect of the entertainment interaction system.
Detailed description of the invention
Fig. 1 is a schematic diagram of the module connections of a preferred embodiment of the entertainment interaction system of the invention;
Fig. 2 is a schematic diagram of the method of one preferred embodiment of the invention;
Fig. 3 is a schematic diagram of the method of another preferred embodiment of the invention; and
Fig. 4 is a schematic diagram of the module connections of another preferred embodiment of the entertainment interaction system of the invention.
Reference signs list
100: robot 200: user 300: mobile intelligent terminal
400: shooting control module 500: travel path planning module 600: scene control module
700: cloud server
Specific embodiment
The invention is described in detail below with reference to the accompanying drawings.
As shown in Fig. 1, the entertainment interaction system based on an immersive robot comprises at least one robot 100, a cloud server 700 and at least one mobile intelligent terminal 300 worn by a user 200. The mobile intelligent terminal 300 is used to automatically collect first simulated-laser emission/reception information, and the robot 100 is used to automatically collect environment monitoring information of the surroundings of the robot 100, real-time firing rate information of the robot 100 and second simulated-laser emission/reception information. The cloud server 700 comprises at least a shooting control module 400 that exchanges information with the mobile intelligent terminal 300 and with the robot 100 respectively, and a travel path planning module 500 that exchanges information with the shooting control module 400 and with the robot 100 respectively. Preferably, the first simulated-laser emission/reception information includes two items: the mobile intelligent terminal 300 emitting a laser, and the mobile intelligent terminal 300 worn by the user 200 being hit; the second simulated-laser emission/reception information includes two items: the robot 100 emitting a laser toward the user 200, and the robot 100 being hit by the user 200.
Preferably, the travel path planning module 500 is configured to: under the condition that the user 200 hits the robot 100, obtain the environment monitoring information for planning an evasion route for the robot 100, and determine a number of evasion paths based on the environment monitoring information and the user grade information generated by the shooting control module 400, so as to instruct the robot 100 to evade the shooting of the user 200 and update the travel path accordingly. Preferably, the environment monitoring information includes at least whether the user 200 is within the observable monitoring range and the position information of the shelters acquired within the observable range. When the robot is hit by the user, it can evade the user's sustained attack along the evasion path provided by the cloud server 700 and, after evading, continue to move along a path different from the original travel path, overcoming the problem of the traditional fixed path, which the user easily memorizes and which reduces the difficulty and playability of the game.
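By way of illustration only (the field names and structure below are assumptions, not taken from the patent), the environment monitoring information described above could be represented roughly as follows:

```python
from dataclasses import dataclass, field

@dataclass
class ShelterInfo:
    """Shelter (cover) information sensed by the robot; field names are illustrative."""
    position: tuple[float, float]   # position coordinates of the shelter
    shape_volume: float             # shape/volume of the shelter, simplified to one number
    distance_to_robot: float        # relative distance between the robot and the shelter

@dataclass
class EnvironmentMonitoringInfo:
    """Minimal sketch of the environment monitoring information collected by the robot."""
    user_in_observable_range: bool
    shelters: list[ShelterInfo] = field(default_factory=list)

# Example: the user is visible and one shelter has been sensed 2.5 m away
info = EnvironmentMonitoringInfo(
    user_in_observable_range=True,
    shelters=[ShelterInfo(position=(3.0, 2.0), shape_volume=1.2, distance_to_robot=2.5)],
)
```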
In the present invention, the user and the robot exchange information with the platform space in a manner that allows each to actively attack the other; the platform space controls the robot's firing rate and evasion path through data transmission so that the robot interacts with the user in simulation, and the robot can evade the shooting by itself, avoiding the traditional single shooting pattern, which increases the fidelity of the entertainment system and the playability of the entertainment method and improves the player's sense of substitution and sense of reality across different entertainment scenes. Furthermore, by establishing a virtual-repulsion artificial potential field model for the robot, the robot's firing rate and movement speed are varied synchronously with changes in the user's game grade, improving the simulation effect of the system; and because the robot is controlled to switch to another travel path after evading the shooting, even if the user plays more and more often and for longer, the user cannot memorize the appearance pattern of the simulated shooting target, which increases the difficulty and playability of the game and enhances the simulation effect of the entertainment interaction system.
Preferably, the entertainment interaction system comprises at least water-bullet gun battle equipment worn by the user 200, the water-bullet gun battle equipment including helmet equipment, vest equipment and a water-bullet gun; the user 200 attacks the robot 100 by firing water bullets. Compared with the common paintball battle equipment in the prior art, which offers a realistic gun feel but carries certain risks because all-metal bullets are used, the water-bullet gun battle equipment both enhances the realistic gun feel and poses no risk of injury. While the water-bullet gun fires water bullets at the helmet equipment or vest equipment, the mobile intelligent terminal 300 collects the first simulated-laser emission/reception information; when the helmet equipment or vest equipment is hit by a water bullet from the water-bullet gun, the robot 100 collects the second simulated-laser emission/reception information.
Preferably, the shooting control module 400 is configured to: perform information analysis on the first simulated-laser emission/reception information collected by the mobile intelligent terminal 300 and/or the second simulated-laser emission/reception information collected by the robot 100 together with the collected real-time firing rate information, generate a firing rate control parameter for the user 200, and output the firing rate control parameter to the robot 100 to control the firing rate of the robot 100. Because the user's current game proficiency varies (a beginner-grade user, for instance, must evade while attacking), a simulated shooting target whose grade matches the user is needed. The game grade of the user 200 is obtained by analyzing the corresponding shooting and hit situations of the user 200 and the robot 100; a firing rate control parameter corresponding to that game grade is then obtained together with the real-time firing rate information fed back by the robot 100, and the real-time firing rate of the robot 100 is updated accordingly. For example, the firing rate control parameter may specify that the interval between consecutive shots is not less than 3 seconds, or not less than 5 seconds, or that the number of shots within 1 min does not exceed 5, or does not exceed 10.
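As a minimal sketch of how such a firing rate control parameter might be represented and enforced on the robot side (class and field names are illustrative assumptions, not the patent's implementation):

```python
import time
from dataclasses import dataclass

@dataclass
class FiringRateControlParameter:
    """Hypothetical representation of the firing rate control parameter."""
    min_shot_interval_s: float   # e.g. not less than 3 seconds between consecutive shots
    max_shots_per_minute: int    # e.g. no more than 5 shots within 1 min

class FiringRateLimiter:
    """Checks whether the robot may fire now without exceeding the parameter."""

    def __init__(self, param: FiringRateControlParameter):
        self.param = param
        self.shot_times: list[float] = []

    def may_fire(self, now: float | None = None) -> bool:
        now = time.time() if now is None else now
        # keep only the shots that fall within the last 60 s sliding window
        self.shot_times = [t for t in self.shot_times if now - t < 60.0]
        if self.shot_times and now - self.shot_times[-1] < self.param.min_shot_interval_s:
            return False
        if len(self.shot_times) >= self.param.max_shots_per_minute:
            return False
        return True

    def record_shot(self, now: float | None = None) -> None:
        self.shot_times.append(time.time() if now is None else now)

# Example: the parameter cited in the description for a low user grade
limiter = FiringRateLimiter(FiringRateControlParameter(min_shot_interval_s=3.0,
                                                       max_shots_per_minute=5))
```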
Preferably, when the user hits the robot, the first simulated-laser emission information collected by the mobile intelligent terminal 300, the second simulated-laser reception information collected by the robot and the real-time firing rate information are each fed back to the shooting control module 400 for information analysis. The information analysis performed by the shooting control module 400 includes adding the corresponding score change to the user's historical cumulative score to obtain the user's current cumulative score, retrieving the user grade information stored in the preset database that corresponds to the current cumulative score, and obtaining the firing rate control parameter and the movement speed control parameter stored in the preset database in correspondence with that user grade information. The shooting control module 400 outputs the retrieved movement speed control parameter to the travel path planning module 500, and the travel path planning module 500 instructs the robot to evade the user's shooting along the evasion path direction generated from the environment monitoring information at the speed specified by the movement speed control parameter. When the collected real-time firing rate information exceeds the firing rate control parameter, that is, when the user's grade has changed, the shooting control module 400 outputs the firing rate control parameter to the robot 100 to control the firing rate of the robot 100 under that user grade. On receiving the information that the robot 100 has been hit, the travel path planning module 500 controls the robot's movement speed according to the information generated by the shooting control module 400; at the same time, upon judging that the user's grade has changed, the shooting control module 400 controls the firing rate of the robot 100, so that the robot's movement speed and firing rate always remain consistent with the user's current grade.
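A minimal sketch of this score-to-grade-to-parameter flow follows; the grade names, thresholds and numeric values are illustrative assumptions and are not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class GradeParameters:
    """Illustrative parameters stored per user grade in the preset database."""
    min_shot_interval_s: float     # firing rate control: minimum interval between shots
    max_shots_per_minute: int      # firing rate control: cap on shots per minute
    movement_speed_m_s: float      # movement speed control parameter (assumed unit: m/s)

# Illustrative grade table: grade name -> (minimum cumulative score, parameters)
PRESET_GRADES = {
    "beginner":     (0,   GradeParameters(3.0, 5,  0.5)),
    "intermediate": (100, GradeParameters(2.0, 8,  0.8)),
    "advanced":     (250, GradeParameters(1.0, 12, 1.2)),
}

def grade_for_score(cumulative_score: int) -> str:
    """Highest grade whose score threshold the user has reached."""
    eligible = [(threshold, name) for name, (threshold, _) in PRESET_GRADES.items()
                if cumulative_score >= threshold]
    return max(eligible)[1]

def update_after_hit(history_score: int, score_delta: int):
    """User hit the robot: add the score change, look up the grade and its parameters."""
    current_score = history_score + score_delta
    grade = grade_for_score(current_score)
    params = PRESET_GRADES[grade][1]
    # The firing rate fields would be output to the robot, and the movement speed
    # parameter to the travel path planning module.
    return current_score, grade, params

print(update_after_hit(history_score=90, score_delta=20))   # crosses into "intermediate"
```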
According to a preferred embodiment, under the condition that the user 200 hits the robot 100 so that the robot 100 receives the second simulated-laser emission/reception information, the travel path planning module 500 obtains the current position coordinates of the robot 100 and the environment monitoring information. Preferably, the travel path planning module 500 uses the shelter information sensed in the environment monitoring information and the current position coordinates of the user 200; the shelter information includes at least the shape and volume of the shelter, the position coordinates of the shelter and the relative distance between the robot 100 and the shelter. Combining the current position coordinates of the robot 100 with the environment monitoring information, the module plans, using the artificial potential field method, an evasion path of the robot 100 to a target point behind the shelter, so as to evade the shooting of the user 200.
According to a preferred embodiment, the travel path planning module 500 generates a scene map containing the position coordinate information of the shelters from the video information collected by a video module, generates a number of travel paths of the robot 100 on the scene map based on given starting and target positions, and stores them in a preset database and/or provides them to the robot 100.
According to a preferred embodiment, the travel path planning module 500 is configured to: identify the boundaries of the shelters and of the scene from the video information collected by the video module, and thereby generate the scene map containing the position coordinate information of the shelters, so that the robot 100 can move along the planned travel path while avoiding the shelters. Preferably, the number and positions of the shelters and the scene boundary in the game scene can be collected by the video module installed in the game scene, and the scene map of the game scene is generated from the collected scene boundary; a number of travel paths can be generated before the game starts, enabling the robot to move while automatically avoiding the shelters.
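A minimal sketch of such a scene map, assuming shelters are approximated by axis-aligned bounding boxes (the representation and helper names are assumptions, not the patent's data model):

```python
from dataclasses import dataclass

@dataclass
class Shelter:
    """Axis-aligned bounding box of a shelter on the scene map."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

@dataclass
class SceneMap:
    width: float
    height: float
    shelters: list[Shelter]

    def point_is_free(self, x: float, y: float) -> bool:
        inside = 0.0 <= x <= self.width and 0.0 <= y <= self.height
        return inside and not any(s.contains(x, y) for s in self.shelters)

def path_avoids_shelters(scene: SceneMap, waypoints: list[tuple[float, float]],
                         step: float = 0.1) -> bool:
    """Sample each segment of a candidate travel path and confirm every point is free."""
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        n = max(1, int(((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / step))
        for i in range(n + 1):
            t = i / n
            if not scene.point_is_free(x0 + t * (x1 - x0), y0 + t * (y1 - y0)):
                return False
    return True
```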
Preferably, the mobile intelligent terminal 300 may include a smartphone or another intelligent wearable device, and the mobile intelligent terminal 300 is communicatively connected to the laser emitter, laser receiver and positioning device carried by the user, respectively. Preferably, the robot includes at least one or more of a laser emitter, a laser receiver, an image acquisition device and a positioning device.
According to a preferred embodiment, the steps by which the travel path planning module 500 plans the evasion path of the robot 100 using the artificial potential field method include at least:
under the condition that the user 200 hits the robot 100 so that the robot 100 receives the second simulated-laser emission/reception information, substituting the current position coordinates of the robot 100 and the environment monitoring information into the following attraction function and repulsion function respectively, computing the resultant of the attraction and repulsion, and determining the direction of the evasion path of the robot 100 from the resultant, wherein
Attraction function: F_att(X) = k(X_G − X_R),
Repulsion function:
Its resultant force are as follows:
where k denotes the attraction gain, η_X denotes the repulsion gain coefficient, ρ(X_R, X_i) denotes the distance from the shelter to the robot 100, ρ(X_R, X_G) denotes the distance from the robot 100 to the target point behind the shelter, ρ_0 denotes the maximum distance over which the shelter can influence the robot 100, X_R denotes the current position coordinates of the robot 100, X_i denotes the current position coordinates of the shelter, and X_G denotes the position coordinates of the target point behind the shelter.
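Because the repulsion expression itself is not reproduced in this text, the following sketch assumes the standard artificial potential field repulsion term, which is zero beyond the influence distance ρ_0; the variable roles follow the definitions above, everything else is illustrative:

```python
import numpy as np

def attraction(x_r: np.ndarray, x_g: np.ndarray, k: float) -> np.ndarray:
    """F_att(X) = k (X_G - X_R): pulls the robot toward the target point behind the shelter."""
    return k * (x_g - x_r)

def repulsion(x_r: np.ndarray, x_i: np.ndarray, eta: float, rho_0: float) -> np.ndarray:
    """Standard APF repulsion (an assumption; the patent's exact expression is not shown):
    zero beyond rho_0, pushing the robot away from the shelter inside rho_0."""
    diff = x_r - x_i
    rho = np.linalg.norm(diff)
    if rho >= rho_0 or rho == 0.0:
        return np.zeros_like(x_r)
    return eta * (1.0 / rho - 1.0 / rho_0) * (1.0 / rho ** 2) * (diff / rho)

def evasion_direction(x_r, x_g, shelters, k=1.0, eta=2.0, rho_0=3.0):
    """Resultant of attraction and repulsion; its direction is the evasion path direction."""
    x_r, x_g = np.asarray(x_r, float), np.asarray(x_g, float)
    total = attraction(x_r, x_g, k)
    for x_i in shelters:
        total += repulsion(x_r, np.asarray(x_i, float), eta, rho_0)
    norm = np.linalg.norm(total)
    return total / norm if norm > 0 else total

# e.g. robot at (2, 1), target point behind the shelter at (5, 4), one shelter at (3, 2)
direction = evasion_direction([2.0, 1.0], [5.0, 4.0], [[3.0, 2.0]])
```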
Preferably, by establishing the virtual-repulsion artificial potential field model of the robot, the movement environment of the robot is abstracted as movement in an artificial potential field model: the target point in the environment is assumed to exert an "attractive force" on the robot, the shelters exert a "repulsive force" on the robot, and the robot's movement is controlled by the resultant direction of the "attractive force" and the "repulsive force", thereby planning a smooth and safe evasion path. Preferably, the target point is chosen behind the shelter and on another travel path different from the current travel path, so that after the robot has evaded the user's attack along the evasion path it can continue to move along a path other than the original one; this avoids the situation in which the robot still moves along the initially set path after evading, which would let the user memorize the robot's travel path and would reduce the simulation effect and the user's sense of reality of the entertainment interaction system. Preferably, the selection of the target point may comprise the following steps: as shown in Fig. 2, suppose the robot is at point b when attacked by the user, and the target point behind the shelter is denoted point c; an extension line tangent to the shelter is drawn through point b, and its intersection with the global path is taken as the target point c; at the same time, at least one other travel path passing through the target point c is obtained on the scene map. Preferably, if another shelter lies at point c, as shown in Fig. 3, the selected point c is abandoned; the artificial potential field method is used again to find another point d where the robot can take cover, a new target point e is chosen by a method similar to the above, and the path is planned using the random roadmap method.
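As a rough illustration of this target point selection, assuming for simplicity a circular shelter and a polyline global path (these simplifications and all names below are assumptions, not the patent's construction):

```python
import math

def tangent_directions(b, center, radius):
    """Unit directions of the two lines through point b tangent to a circular shelter."""
    dx, dy = center[0] - b[0], center[1] - b[1]
    dist = math.hypot(dx, dy)
    if dist <= radius:
        raise ValueError("point b lies inside the shelter")
    theta = math.asin(radius / dist)     # angle between the line b-center and each tangent
    base = math.atan2(dy, dx)
    return [(math.cos(base + s * theta), math.sin(base + s * theta)) for s in (+1, -1)]

def ray_segment_intersection(origin, direction, p0, p1):
    """Intersection point of a ray with the segment p0-p1, or None if there is none."""
    ox, oy = origin
    dx, dy = direction
    sx, sy = p1[0] - p0[0], p1[1] - p0[1]
    denom = dx * sy - dy * sx
    if abs(denom) < 1e-12:
        return None                       # ray and segment are parallel
    qx, qy = p0[0] - ox, p0[1] - oy
    t = (qx * sy - qy * sx) / denom       # distance along the ray
    u = (qx * dy - qy * dx) / denom       # position along the segment
    if t >= 0.0 and 0.0 <= u <= 1.0:
        return (ox + t * dx, oy + t * dy)
    return None

def choose_target_point(b, shelter_center, shelter_radius, global_path):
    """First intersection of a tangent ray from b with the global path: candidate point c."""
    for direction in tangent_directions(b, shelter_center, shelter_radius):
        for p0, p1 in zip(global_path, global_path[1:]):
            hit = ray_segment_intersection(b, direction, p0, p1)
            if hit is not None:
                return hit
    return None

# Robot hit at b = (0, 0), circular shelter at (2, 1) with radius 0.5,
# and the global path given as a polyline of waypoints.
c = choose_target_point((0.0, 0.0), (2.0, 1.0), 0.5,
                        [(4.0, -1.0), (4.0, 3.0), (1.0, 4.0)])
```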
Preferably, the random roadmap method mentioned above comprises at least the following steps. The random roadmap method first needs to build a roadmap: a point v is randomly selected in the configuration space, and if v does not lie on a shelter it is added to the graph node set V; v is then connected to the other points v' in the set V, and if a connection is possible the edge (v, v') is added to the corresponding route subgraph (a route subgraph is the set of a single route), otherwise v is added to a new route subgraph. The above two steps are repeated; when a sampled point can connect several subgraphs, these subgraphs are merged into a new subgraph, and the roadmap is finally obtained. The start point and end point are added to the roadmap, and a path is searched out of the roadmap to serve as the evasion path.
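A minimal probabilistic roadmap sketch along these lines (the sampling count, connection radius and circular shelter model are illustrative assumptions; the start and end points would be appended to the node list before searching):

```python
import math
import random
from collections import deque

def collision_free(p, q, shelters, step=0.1):
    """True if the straight segment p-q stays outside all circular shelters (cx, cy, r)."""
    n = max(1, int(math.dist(p, q) / step))
    for i in range(n + 1):
        t = i / n
        x, y = p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])
        if any(math.hypot(x - cx, y - cy) <= r for cx, cy, r in shelters):
            return False
    return True

def build_roadmap(bounds, shelters, n_samples=200, connect_radius=2.0):
    """Sample free points and connect nearby pairs whose connecting segment is free."""
    x_max, y_max = bounds
    nodes, edges = [], {}
    while len(nodes) < n_samples:
        v = (random.uniform(0, x_max), random.uniform(0, y_max))
        if collision_free(v, v, shelters):          # v itself does not lie on a shelter
            nodes.append(v)
    for i, v in enumerate(nodes):
        edges.setdefault(i, set())
        for j, w in enumerate(nodes[:i]):
            if math.dist(v, w) <= connect_radius and collision_free(v, w, shelters):
                edges[i].add(j)
                edges.setdefault(j, set()).add(i)
    return nodes, edges

def find_path(edges, start_idx, goal_idx):
    """Breadth-first search over the roadmap; returns node indices or None."""
    parent = {start_idx: None}
    queue = deque([start_idx])
    while queue:
        cur = queue.popleft()
        if cur == goal_idx:
            path = []
            while cur is not None:
                path.append(cur)
                cur = parent[cur]
            return path[::-1]
        for nxt in edges.get(cur, ()):
            if nxt not in parent:
                parent[nxt] = cur
                queue.append(nxt)
    return None
```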
According to a preferred embodiment, the shooting control module 400 of the cloud server 700 performs information analysis on the first simulated-laser emission/reception information collected by the mobile intelligent terminal 300 and/or the second simulated-laser emission/reception information collected by the robot 100 to obtain the current cumulative score of the user 200 and the user grade information corresponding to the current cumulative score stored in the preset database.
The information analysis process includes at least retrieving, based on the obtained user grade information, the firing rate control parameter stored in the preset database in correspondence with that user grade information; the shooting control module 400 outputs the retrieved firing rate control parameter to the robot 100 so that the firing rate of the robot 100 corresponds to the current user grade.
According to a preferred embodiment, the information analysis process includes at least retrieving, based on the obtained user grade information, the movement speed control parameter stored in the preset database in correspondence with that user grade information; under the condition that the user 200 hits the robot 100 so that the robot 100 receives the second simulated-laser emission/reception information, the shooting control module 400 outputs the retrieved movement speed control parameter to the travel path planning module 500, and the travel path planning module 500 instructs the robot 100 to evade the shooting of the user 200 by moving, at the speed specified by the movement speed control parameter, along the evasion path direction generated based on the environment monitoring information.
According to a preferred embodiment, the cloud server 700 further comprises a scene control module 600 that exchanges information with the shooting control module 400 and is associated, via scene control information, with a scene display device. Preferably, the scene control module 600 is used to obtain the user grade information generated by the shooting control module 400 and update the current user grade information of the user 200 accordingly; when the current user grade information changes, the scene control module 600 retrieves the scene information stored in the preset database in correspondence with that user grade information and displays the corresponding scene through the scene display device, and the updated user grade information is prompted to the user 200 through the mobile intelligent terminal 300. By changing the simulated scene correspondingly as the user's game grade changes and prompting the user 200 through the mobile intelligent terminal 300, the tense atmosphere of a real shooting environment is created, the demand of the general public for novel and changeable shooting scenes is satisfied, and the maintenance cost remains low.
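As an illustrative sketch of this scene control flow (the grade-to-scene table, the event shape and the display and prompt callables are assumptions, not the patent's interfaces):

```python
# Illustrative preset "database" mapping user grades to scene information
SCENE_BY_GRADE = {
    "beginner": "daytime training ground",
    "intermediate": "urban street at dusk",
    "advanced": "night-time warehouse",
}

class SceneControlModule:
    def __init__(self, display_scene, prompt_terminal):
        # display_scene / prompt_terminal are callables standing in for the
        # scene display device and the mobile intelligent terminal
        self.display_scene = display_scene
        self.prompt_terminal = prompt_terminal
        self.current_grade = None

    def on_user_grade_info(self, grade: str) -> None:
        """Called with the user grade information produced by the shooting control module."""
        if grade == self.current_grade:
            return                          # no change, keep the current scene
        self.current_grade = grade
        scene = SCENE_BY_GRADE.get(grade, SCENE_BY_GRADE["beginner"])
        self.display_scene(scene)           # switch the displayed scene
        self.prompt_terminal(f"Your grade is now: {grade}")

# Usage with simple stand-ins for the display device and the terminal
module = SceneControlModule(display_scene=print, prompt_terminal=print)
module.on_user_grade_info("intermediate")
```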
An entertainment interaction method based on an immersive robot involves at least one robot, a cloud server and at least one mobile intelligent terminal worn by a user, the cloud server 700 comprising at least the shooting control module 400 and the travel path planning module 500. The entertainment interaction method comprises at least the following steps: under the condition that the user 200 hits the robot 100, obtaining the environment monitoring information for planning an evasion route for the robot 100, and determining an evasion path based on the environment monitoring information and the user grade information generated by the shooting control module 400, so as to instruct the robot 100 to evade the shooting of the user 200 and update the travel path accordingly.
According to a preferred embodiment, the entertainment interaction method further comprises the following steps: generating user grade information based on the first simulated-laser emission/reception information collected by the mobile intelligent terminal 300 and/or the second simulated-laser emission/reception information collected by the robot 100, performing information analysis in combination with the collected real-time firing rate information to generate a firing rate control parameter for the user 200, and outputting the generated firing rate control parameter to the robot 100 to control the firing rate of the robot 100.
Preferably, the cloud server 700 is further configured to associate the operation data of multiple robots, to form battle strategies by establishing correspondence relationships between different firing data and robot motion data in the obtained operation data, and to store the battle strategies in the preset database, so that when a robot connects to the cloud server it can both upload its own operation data and download the battle strategies to update its own database. For example, the firing data include at least the shooting battle data between multiple robots and users, firing rate data and user grade data, and the robot motion data include at least the evasion path data, travel path update data and movement speed data of multiple robots.
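A minimal sketch of how a battle strategy record could pair firing data with motion data and be synchronised between robot and cloud server (the record fields and method names are illustrative assumptions, not the patent's data model):

```python
from dataclasses import dataclass, field

@dataclass
class BattleStrategy:
    """One firing-data record paired with the corresponding robot-motion record."""
    user_grade: str
    firing_rate_data: dict       # e.g. {"min_interval_s": 3.0, "max_per_minute": 5}
    motion_data: dict            # e.g. {"evasion_path": [...], "speed_m_s": 0.5}

@dataclass
class CloudServerStore:
    strategies: list = field(default_factory=list)

    def upload(self, operation_data: list) -> None:
        """Robot uploads its own operation data when it connects."""
        self.strategies.extend(operation_data)

    def download(self, user_grade: str) -> list:
        """Robot downloads strategies matching a user grade to update its local database."""
        return [s for s in self.strategies if s.user_grade == user_grade]

# A robot connecting to the cloud server: upload its data, then refresh its local copy
cloud = CloudServerStore()
cloud.upload([BattleStrategy("beginner",
                             {"min_interval_s": 3.0, "max_per_minute": 5},
                             {"evasion_path": [(2, 1), (5, 4)], "speed_m_s": 0.5})])
local_db = cloud.download("beginner")
```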
As shown in Fig. 4, the robot 100 includes at least a shooting control module 400 that exchanges information with the mobile intelligent terminal 300 and a travel path planning module 500 that exchanges information with the shooting control module 400. Under the condition that the user 200 hits the robot 100, the environment monitoring information for planning an evasion route for the robot 100 and the user grade information generated by the shooting control module 400 are obtained, and the battle strategies stored in the robot's database are retrieved based on this information. Because a battle strategy consists of different firing data and robot motion data for which correspondence relationships have been established, the retrieved robot motion data are passed to the travel path planning module 500, which then controls the robot's evasion path and movement speed, and the firing rate data in the retrieved firing data are passed to the shooting control module 400, which then controls the robot's firing rate.
When no battle strategy is retrieved, the shooting control module 400 performs information analysis based on the first simulated-laser emission/reception information and/or the second simulated-laser emission/reception information together with the collected real-time firing rate information, generates the firing rate control parameter of the user 200, and outputs the firing rate control parameter to the robot 100 to control the firing rate of the robot 100; based on the environment monitoring information and the user grade information generated by the shooting control module 400, a number of evasion paths are determined and the robot 100 is instructed to evade the shooting of the user 200 and to update the travel path accordingly. After a battle with the user, the robot connects to the cloud server to upload its own operation data and, at the same time, download the battle strategies to update its own database.
It should be noted that the above specific embodiments are exemplary. Those skilled in the art can devise various solutions under the inspiration of the present disclosure, and such solutions also belong to the scope of the present disclosure and fall within the protection scope of the present invention. Those skilled in the art should understand that the description of the invention and its drawings are illustrative and do not limit the claims. The protection scope of the present invention is defined by the claims and their equivalents.

Claims (10)

1. An entertainment interaction system based on an immersive robot, comprising at least one robot (100), a cloud server (700) and at least one mobile intelligent terminal (300) worn by a user (200), wherein the mobile intelligent terminal (300) is used to automatically collect first simulated-laser emission/reception information, and the robot (100) is used to automatically collect environment monitoring information of the surroundings of the robot (100), real-time firing rate information of the robot (100) and second simulated-laser emission/reception information, characterized in that
the cloud server (700) comprises at least a shooting control module (400) that exchanges information with the mobile intelligent terminal (300) and with the robot (100) respectively, and a travel path planning module (500) that exchanges information with the shooting control module (400) and with the robot (100) respectively, wherein
the travel path planning module (500) is configured to: under the condition that the user (200) hits the robot (100), obtain the environment monitoring information for planning an evasion route for the robot (100), and determine an evasion path based on the environment monitoring information and the user grade information generated by the shooting control module (400), so as to instruct the robot (100) to evade the shooting of the user (200) and update the travel path accordingly.
2. The entertainment interaction system according to claim 1, characterized in that the shooting control module (400) is configured to: generate user grade information based on the first simulated-laser emission/reception information collected by the mobile intelligent terminal (300) and/or the second simulated-laser emission/reception information collected by the robot (100), perform information analysis in combination with the collected real-time firing rate information to generate a firing rate control parameter for the user (200), and output the generated firing rate control parameter to the robot (100) to control the firing rate of the robot (100).
3. The entertainment interaction system according to any one of the preceding claims, characterized in that, under the condition that the user (200) hits the robot (100) so that the robot (100) receives the second simulated-laser emission/reception information, the travel path planning module (500) obtains the current position coordinates of the robot (100) and the environment monitoring information, wherein
the travel path planning module (500) uses the shelter information sensed in the environment monitoring information and the current position coordinates of the user (200), the shelter information including at least the shape and volume of the shelter, the position coordinates of the shelter and the relative distance between the robot (100) and the shelter, and, combining the current position coordinates of the robot (100) with the environment monitoring information, plans, using the artificial potential field method, an evasion path of the robot (100) to a target point behind the shelter so as to evade the shooting of the user (200).
4. The entertainment interaction system according to any one of the preceding claims, characterized in that the travel path planning module (500) generates a scene map containing the position coordinate information of the shelters from the video information collected by a video module, generates, based on given starting and target positions, a number of travel paths of the robot (100) on the scene map, stores them in a preset database, and, under the condition that the user (200) hits the robot (100) so that the robot (100) receives the second simulated-laser emission/reception information, provides at least one other travel path different from the current travel path to the robot (100) as an update, wherein the travel path planning module (500) is further configured to:
identify the boundaries of the shelters and of the scene from the video information collected by the video module, and thereby generate the scene map containing the position coordinate information of the shelters, so that the robot (100) can move along the planned travel path while avoiding the shelters.
5. The entertainment interaction system according to any one of the preceding claims, characterized in that the steps by which the travel path planning module (500) plans the evasion path of the robot (100) using the artificial potential field method include at least:
under the condition that the user (200) hits the robot (100) so that the robot (100) receives the second simulated-laser emission/reception information, obtaining the current position coordinates of the robot (100) and the environment monitoring information, substituting them into the following attraction function and repulsion function respectively, computing the resultant of the attraction and repulsion, and determining the direction of the evasion path of the robot from the resultant, wherein
Attraction function: F_att(X) = k(X_G − X_R),
Repulsion function:
Its resultant force are as follows:
where k denotes the attraction gain, η_X denotes the repulsion gain coefficient, ρ(X_R, X_i) denotes the distance from the shelter to the robot, ρ(X_R, X_G) denotes the distance from the robot to the target point behind the shelter, ρ_0 denotes the maximum distance over which the shelter can influence the robot, X_R denotes the current position coordinates of the robot, X_i denotes the current position coordinates of the shelter, and X_G denotes the position coordinates of the target point behind the shelter.
6. The entertainment interaction system according to any one of the preceding claims, characterized in that the shooting control module (400) of the cloud server (700) performs information analysis on the first simulated-laser emission/reception information collected by the mobile intelligent terminal (300) and/or the second simulated-laser emission/reception information collected by the robot (100) to obtain the current cumulative score of the user (200) and the user grade information corresponding to the current cumulative score stored in the preset database, wherein
the information analysis process includes at least retrieving, based on the obtained user grade information, the firing rate control parameter stored in the preset database in correspondence with that user grade information, and the shooting control module (400) outputs the retrieved firing rate control parameter to the robot (100) so that the firing rate of the robot (100) corresponds to the current user grade.
7. The entertainment interaction system as claimed in any one of the preceding claims, characterized in that the information analysis at least includes retrieving, based on the obtained user grade information, the movement-speed control parameter stored in the preset database in correspondence with that user grade information, and, under the condition that the user (200) hits the robot (100) so that the robot (100) receives the second simulated-laser emission-and-reception information, the movement-speed control parameter retrieved by the shooting control module (400) is output to the driving path planning module (500); based on the environmental monitoring information, the driving path planning module (500) instructs the robot (100) to evade the shooting of the user (200) along the generated evasion path direction at the movement speed given by the movement-speed control parameter.
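The interaction between the shooting control module and the driving path planning module described in claim 7 might look roughly like the following, assuming the potential-field sketch above supplies the evasion direction; the speed table and function names are hypothetical.

```python
# Illustrative sketch of claim 7: on a hit, the movement-speed control
# parameter for the current user grade is retrieved and combined with the
# evasion direction computed by the path planner. Values are placeholders.
SPEED_BY_GRADE = {1: 0.4, 2: 0.8, 3: 1.2}          # metres per second, per grade

def on_hit_plan_evasion(user_grade, robot_position, shelter_position,
                        target_position, evasion_direction):
    """Return (direction, speed) the robot should use to evade the shooter.

    `evasion_direction` is a callable such as the potential-field sketch above.
    """
    speed = SPEED_BY_GRADE.get(user_grade, min(SPEED_BY_GRADE.values()))
    direction = evasion_direction(robot_position, shelter_position, target_position)
    return direction, speed
```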
8. The entertainment interaction system as claimed in any one of the preceding claims, characterized in that the cloud server (700) further comprises a scene control module (600) that exchanges information with the shooting control module (400) and is associated, via scene control information, with a scene display device, wherein
the scene control module (600) is configured to obtain the user grade information generated by the shooting control module (400) and use it to update the current user grade information of the user (200); when the current user grade information changes, the scene control module (600) retrieves the scene information stored in the preset database in correspondence with that user grade information and performs the corresponding scene presentation through the scene display device, and the updated user grade information is prompted to the user (200) through the mobile intelligent terminal (300).
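A rough sketch of the claim 8 behaviour, assuming hypothetical `show` and `prompt` interfaces on the scene display device and the mobile terminal; the grade-to-scene table is a placeholder.

```python
# Illustrative sketch of claim 8: when the user grade changes, the scene
# associated with the new grade is pushed to the scene display device and the
# new grade is prompted on the user's mobile terminal. Names are hypothetical.
SCENE_BY_GRADE = {1: "training_ground", 2: "urban_street", 3: "night_ruins"}

class SceneControlModule:
    def __init__(self, display, terminal) -> None:
        self.display = display          # scene display device (assumed .show() interface)
        self.terminal = terminal        # mobile intelligent terminal (assumed .prompt() interface)
        self.current_grade = None

    def update_grade(self, new_grade: int) -> None:
        if new_grade == self.current_grade:
            return                      # no change, keep the current scene
        self.current_grade = new_grade
        self.display.show(SCENE_BY_GRADE[new_grade])
        self.terminal.prompt(f"You reached grade {new_grade}")
```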
9. An entertainment interaction method based on an immersive robot, involving at least one robot (100), a cloud server (700) and at least one mobile intelligent terminal (300) worn by a user (200), the cloud server (700) comprising at least a shooting control module (400) and a driving path planning module (500), characterized in that the entertainment interaction method comprises at least the following step:
under the condition that the user (200) hits the robot (100), acquiring the environmental monitoring information used for planning the evasion route of the robot (100), determining an evasion path based on the environmental monitoring information and the user grade information generated by the shooting control module (400) so as to instruct the robot (100) to evade the shooting of the user (200), and updating the driving path accordingly.
10. The entertainment interaction method as claimed in claim 9, characterized in that the entertainment interaction method further comprises the following step:
generating user grade information based on the first simulated-laser emission-and-reception information collected by the mobile intelligent terminal (300) and/or the second simulated-laser emission-and-reception information collected by the robot (100), performing information analysis in combination with the collected real-time shooting frequency information to generate a firing-rate control parameter for the user (200), and outputting the generated firing-rate control parameter to the robot (100) to control the firing rate of the robot (100).
CN201811323111.0A 2018-11-07 2018-11-07 Entertainment interaction system and method based on immersive robot Active CN109453525B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202210267212.0A CN114618169A (en) 2018-11-07 2018-11-07 Information capturing system and method for entertainment interactive system
CN202210268191.4A CN114632332A (en) 2018-11-07 2018-11-07 Planning module and application method of entertainment interactive system
CN201811323111.0A CN109453525B (en) 2018-11-07 2018-11-07 Entertainment interaction system and method based on immersive robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811323111.0A CN109453525B (en) 2018-11-07 2018-11-07 Entertainment interaction system and method based on immersive robot

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN202210267212.0A Division CN114618169A (en) 2018-11-07 2018-11-07 Information capturing system and method for entertainment interactive system
CN202210268191.4A Division CN114632332A (en) 2018-11-07 2018-11-07 Planning module and application method of entertainment interactive system

Publications (2)

Publication Number Publication Date
CN109453525A true CN109453525A (en) 2019-03-12
CN109453525B CN109453525B (en) 2022-03-01

Family

ID=65609686

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202210267212.0A Pending CN114618169A (en) 2018-11-07 2018-11-07 Information capturing system and method for entertainment interactive system
CN202210268191.4A Pending CN114632332A (en) 2018-11-07 2018-11-07 Planning module and application method of entertainment interactive system
CN201811323111.0A Active CN109453525B (en) 2018-11-07 2018-11-07 Entertainment interaction system and method based on immersive robot

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN202210267212.0A Pending CN114618169A (en) 2018-11-07 2018-11-07 Information capturing system and method for entertainment interactive system
CN202210268191.4A Pending CN114632332A (en) 2018-11-07 2018-11-07 Planning module and application method of entertainment interactive system

Country Status (1)

Country Link
CN (3) CN114618169A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6821206B1 (en) * 1999-11-25 2004-11-23 Namco Ltd. Game machine, game route selection method, and information storage medium
KR100718334B1 (en) * 2004-08-30 2007-05-14 중앙대학교 산학협력단 Game difficulty control method for user interacting using artificial intelligence technique
CN106730825A (en) * 2017-01-03 2017-05-31 深圳人人创业商学院有限公司 A kind of battle game device
CN106871730A (en) * 2017-03-17 2017-06-20 北京军石科技有限公司 A kind of full landform intelligent mobile target system of shoot training of light weapons
CN107121019A (en) * 2017-05-15 2017-09-01 中国人民解放军73653部队 A kind of group's confrontation fire training system
CN107644273A (en) * 2017-09-27 2018-01-30 上海思岚科技有限公司 A kind of navigation path planning method and equipment
CN108536155A (en) * 2018-05-21 2018-09-14 上海理工大学 Intelligence based on cloud platform, which is practiced shooting, trains multi-robot system

Also Published As

Publication number Publication date
CN114632332A (en) 2022-06-17
CN109453525B (en) 2022-03-01
CN114618169A (en) 2022-06-14

Similar Documents

Publication Publication Date Title
US10843077B2 (en) System and method for creation, presentation and interaction within multiple reality and virtual reality environments
CN207895727U (en) Make exercising system
US20080146302A1 (en) Massive Multiplayer Event Using Physical Skills
CN108694871A (en) A kind of more soldier's military training checking systems based on large space virtual reality
WO2021160108A1 (en) Animation video processing method, device, electronic apparatus, and storage medium
Isokoski et al. Gaze controlled games
US8920172B1 (en) Method and system for tracking hardware in a motion capture environment
CN110288868A (en) Armed forces in real combat interacts countermeasure system
US8052527B2 (en) Calculation control method, storage medium, and game device
CN108398049B (en) Networking mutual-combat type projection antagonism shooting training system
US20130225288A1 (en) Mobile gaming platform system and method
JP2024514752A (en) Method and device for controlling summoned objects in a virtual scene, electronic equipment and computer program
CN101155621A (en) Match game system and game device
CN109029127B (en) Command system and command method based on man-machine live ammunition confrontation training
CN106061571A (en) Interactive virtual reality systems and methods
WO2013111146A2 (en) System and method of providing virtual human on human combat training operations
US8777226B1 (en) Proxy target system
US20200289938A1 (en) Systems and methods for training an artificial intelligence model for competition matches
JP2022540276A (en) Virtual environment display method, apparatus, equipment and program
JP7071823B2 (en) Simulation system and program
CN114432701A (en) Ray display method, device and equipment based on virtual scene and storage medium
JP2023541150A (en) Screen display methods, devices, equipment and computer programs
CN110009960A (en) Virtual implementing helmet formula weaponry simulated training method
KR102361694B1 (en) Drone-based survival shooting game provision system
CN109453525A (en) A kind of recreation interactive system and method based on immersion robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant