CN112198816A - Multi-equipment interaction system and interaction method based on script - Google Patents


Info

Publication number
CN112198816A
CN112198816A (application number CN202010925310.XA)
Authority
CN
China
Prior art keywords
script
terminal
map
equipment
type
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010925310.XA
Other languages
Chinese (zh)
Other versions
CN112198816B (en)
Inventor
Wang Shaoyan (王少燕)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Lefu Smart Technology Co ltd
Original Assignee
Nanjing Lefu Smart Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Lefu Smart Technology Co ltd filed Critical Nanjing Lefu Smart Technology Co ltd
Priority claimed from application CN202010925310.XA
Publication of CN112198816A
Application granted
Publication of CN112198816B
Legal status: Active

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05B — CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 — Programme-control systems
    • G05B19/02 — Programme-control systems electric
    • G05B19/04 — Programme control other than numerical control, i.e. in sequence controllers or logic controllers

Abstract

The invention discloses a script-based multi-device interaction system and interaction method, comprising first-type device terminals and second-type device terminals that are connected and communicate with each other, wherein the second-type device terminal includes a placing platform that provides an activity area for the first-type device terminals. Script data are preset in the first-type device terminal, the script data comprising control instruction data and/or map data corresponding to a script; script data and/or map data corresponding to the placing platform are preset in the second-type device terminal. The interactive system enables one or more first-type device terminals to move freely on the second-type device terminals, to execute designated actions, or to perform the actions of a script; when a script is performed, the first-type device terminals move on the second-type terminals according to the map data corresponding to the script. Script-based interaction can thus be realized among multiple interactive devices; the interactive content is rich, flexible, and variable, and the interactive experience of device users, particularly children, can be improved.

Description

Multi-equipment interaction system and interaction method based on script
Technical Field
The invention relates to the technical field of intelligent communication interaction, in particular to a script-based multi-device interaction system and an interaction method.
Background
Currently, the mainstream children's dolls in the industry generally fall into two types: dolls without a control device, such as blind-box figures and rubber dolls; and dolls with a control device, such as electronic pets and remote-controlled robots. These products generally support only single-person interaction, offer a single play pattern, and cannot be flexibly extended.
Disclosure of Invention
The invention aims to provide a script-based multi-device interaction system and interaction method that realize script-based interaction among multiple interactive devices, with interactive content that is rich, flexible, and variable, improving the interactive experience of device users, particularly children.
The technical scheme adopted by the invention is as follows:
in one aspect, the invention provides a multi-device interaction system based on a script, which comprises a first type device terminal and a second type device terminal which are connected and communicated with each other, wherein the second type device terminal comprises a placing platform for providing an activity area for the first type device terminal;
script data are preset in the first type equipment terminal, and the script data comprise control instruction data and/or map data corresponding to a script; script data and/or map data corresponding to a second type equipment terminal placing platform are preset in the second type equipment terminal;
the first-type device terminal and/or the second-type device terminal responds to the input of an external control instruction, parses the external control instruction, and according to the parsing result controls the corresponding first-type device terminal to execute the action specified by the instruction on the placing platform of the second-type device terminal;
or the first-type device terminals and/or the second-type device terminals respond to the input of external control instructions, parse them to obtain the control instructions corresponding to each first-type device terminal and distribute them accordingly, and each first-type device terminal executes the actions specified by the received instructions on the placing platform of the second-type device terminal;
or the first-type device terminal executes the actions corresponding to a script on the placing platform of the corresponding second-type device terminal according to the script data of the first-type or second-type device terminal;
or the first-type device terminals and/or the second-type device terminals respond to an external control instruction and, according to a preset script, the script specified by the external control instruction, or a newly generated script, control the first-type device terminals to execute the actions corresponding to that script on the placing platform of the second-type device terminals.
According to the above scheme, the interactive system enables one or more first-type device terminals to move freely on the second-type device terminals, to execute designated actions, or to perform a script; when a script is performed, the first-type device terminals move on the second-type terminals according to the map data corresponding to the script. In a specific implementation, when multiple first-type device terminals are combined with multiple second-type device terminals, one second-type device terminal can parse an external control instruction, script, or map and distribute the parsed instruction sequence to the corresponding first-type device terminals, so that multiple first-type device terminals cooperatively complete the designated or script-specified actions.
Optionally, the first-type and second-type device terminals each include multiple models, and the first-type and second-type device terminals of each model are provided with the same or different script data;
when first-type and second-type device terminals of different models and numbers are combined via communication connections, the first-type and/or second-type device terminals in the combination are triggered to select a corresponding script from the preset script data, to generate a new script from the current scripts of all devices in the combination, or to send request information for a new combined script so as to obtain a corresponding script from outside.
Optionally, the system of the present invention further includes a server, where the server stores script data corresponding to various device terminals, and script data corresponding to a combination of various device terminals; the server can adopt a cloud server, when the equipment terminals with different types and numbers are combined, the request information of the newly combined script is transmitted to the cloud server, and the cloud server returns corresponding script data according to the request information.
Optionally, the interactive system of the present invention further includes a third type device terminal, where the third type device terminal receives a user control instruction, and the user control instruction is processed by the third type device terminal or a preset instruction processing device to obtain a control instruction corresponding to the first type device terminal and/or the second type device terminal, and then the control instruction is distributed to the corresponding first type device terminal and/or the second type device terminal;
or after receiving the user control instruction, the third type device terminal transmits the control instruction to the first type device terminal or the second type device terminal for instruction processing to obtain a control instruction corresponding to the first type device terminal and/or the second type device terminal, and then distributes the control instruction to the corresponding first type device terminal and/or the second type device terminal;
or after receiving the user control instruction, the third type device terminal transmits the control instruction to the first type device terminal and the second type device terminal respectively for instruction processing, and each of the first type device terminal and the second type device terminal extracts the control instruction corresponding to the local device terminal in the control instruction.
Optionally, when the user control instruction of the first-class device terminal, the second-class device terminal, the third-class device terminal, or the designated instruction processing device is processed, the control signal model is determined according to the mapping relationship between the preset instruction and the control signal model and the user control instruction, then the motion model including the device terminal and the action data to be executed is determined according to the mapping relationship between the preset control signal model and the motion model, and then the serialization control instruction corresponding to the device terminal is generated according to the motion model and distributed to the corresponding device terminal.
Optionally, the first device terminal and/or the second device terminal include a map management module, and operations performed by the map management module include map loading and map parsing, and also include map updating and/or map generation;
the map loading is to load the local current map or chessboard data of the equipment terminal; the map analysis is to analyze the loaded map or chessboard data to obtain map or chessboard area division data corresponding to the second type of equipment terminal placement platform area; the map is updated to receive map or chessboard updating data synchronized by the server, and the map or chessboard updating data is updated to the local; the map generation is used for generating map serialization data, and comprises automatic generation and generation triggered by a user;
and/or the first equipment terminal and/or the second equipment terminal comprise a script management module, and the operation executed by the script management module comprises script loading and script analysis, and further comprises script updating and/or script generation;
the script loading is the local current script data of the loading equipment terminal or the script data input from the outside; the script analysis comprises the step of obtaining a control instruction sequence facing to a first type equipment terminal according to the loaded script data; updating the script into script updating data synchronized by the receiving server, and updating the script updating data to the local; the generation of the script is to perform the generation of script serialization instruction data, including automatic generation and generation in response to a user trigger.
Optionally, the script includes script header information and a control instruction sequence, where the script header information includes script version data, script scene definition data, and/or the map or chessboard setting data corresponding to the script; the control instruction sequence comprises a plurality of instruction segments, each containing one or more instructions. An instruction segment is either an automatically executed segment or a user-controlled segment: when control instructions are distributed, the instructions of an automatically executed segment are distributed in real time in sequence, while an instruction of a user-controlled segment is selected according to a preset rule in response to a control signal input by the user. The instruction selection rule of the user-controlled segment may be synchronized from the server, preset on the device, or modified by the user.
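As a hedged illustration only, the script structure described above (header information plus automatically executed and user-controlled instruction segments) could be represented as follows; all class and field names here are assumptions for the sketch, not the patent's implementation.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Instruction:
    target_device: str      # e.g. "A-1" (illustrative identifier)
    action: str             # e.g. "move", "play_audio"
    params: dict = field(default_factory=dict)

@dataclass
class Segment:
    mode: str               # "auto" or "user"
    instructions: List[Instruction] = field(default_factory=list)

@dataclass
class Script:
    version: str            # script version data
    scene: str              # script scene definition
    map_id: Optional[str]   # map/chessboard the script assumes
    segments: List[Segment] = field(default_factory=list)

def next_instructions(script: Script, user_choice: Optional[int] = None):
    """Yield instructions: auto segments in order; for a user-controlled
    segment, one instruction is selected by the user's input."""
    for seg in script.segments:
        if seg.mode == "auto":
            yield from seg.instructions
        elif user_choice is not None and user_choice < len(seg.instructions):
            yield seg.instructions[user_choice]
```

In this sketch, distributing an automatically executed segment simply iterates its instructions in order, while a user-controlled segment contributes the instruction picked by the selection rule.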
Optionally, the map or checkerboard data loaded by the map loading is a set of serialized data; the map analysis determines the mapping relation between the digital map coordinates and the physical coordinates of the placing platform according to a preset mapping mode, and further determines the area division of a map or a chessboard on the placing platform of the second type of equipment terminal;
the area division of the map or the chessboard on the second type of equipment terminal placing platform comprises the following steps: one or more of a waiting area, an exiting area, an obstacle area and a specific plot triggering area of the doll;
the mapping mode between the digital map coordinates and the physical coordinates of the placing platform comprises equal-scale scaling mapping and cutting mapping.
Optionally, the second-type device terminal is provided with a connecting mechanism, so that the edges of the placing platforms of different class B devices can be butted against each other;
the connecting mechanism is a magnetic suction mechanism or a buckle mechanism;
and/or the map analysis further comprises the steps of responding to the splicing state of the placing platforms among the plurality of B-type devices, obtaining spliced map or chessboard data, and determining the area division on the spliced placing platforms according to the size relation of the placing platforms before and after splicing. Specifically, maps or chessboard area information before and after splicing can be mapped, so that the A-type device can perform action execution according to mapped position data when a sequential control instruction is subsequently executed.
Optionally, the first type of device terminal further includes a power unit, a storage unit, and one or more of a camera unit, a display unit, a key unit, an audio unit, and a light-emitting unit;
the instruction acquisition units of the first class device terminal and the second class device terminal comprise: one or more of a touch signal acquisition module, a sound signal acquisition module, a motion signal acquisition module, a key signal acquisition module, a vibration signal acquisition module, a light signal acquisition module and an infrared signal acquisition module;
the operations executed by the first type of device include one or more of voice playing, light display, vibration, photographing, video recording, moving, displaying, and joint movement;
the communication unit of the first type equipment terminal and the second type equipment terminal comprises: one or more of a magnetic card, an infrared communication module, a WIFI communication module, a Bluetooth communication module, a mobile communication module and a wired network card.
In a second aspect, the present invention provides a script-based multi-device interaction method, including:
collecting externally input control source information;
processing the control source information and extracting control instruction information;
analyzing the control instruction information, and determining a control signal model according to the control instruction information according to a preset mapping relation between an instruction and the control signal model;
determining a motion model according to the determined control signal model according to the mapping relation between a preset control signal model and the motion model; the motion model comprises equipment terminal data and motion data to be executed;
and generating a sequencing control instruction corresponding to one or more equipment terminals according to the determined motion model, and distributing the sequencing control instruction to the corresponding equipment terminals, so that the equipment terminals execute corresponding actions according to the received control instruction.
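The chain described above (control source → control signal model → motion model → serialized per-device instructions) could be sketched as below. The mapping tables and all names are illustrative assumptions; the sketch also shows the many-to-one case where several raw instructions share one control signal model.

```python
# instruction -> control signal model (several instructions may map to one model)
SIGNAL_MODEL = {
    "shake": "SIG_DANCE",
    "double_tap": "SIG_DANCE",
    "clap": "SIG_GREET",
}

# control signal model -> one or more motion models
# (each motion model names a device terminal and the actions it should execute)
MOTION_MODEL = {
    "SIG_DANCE": [{"device": "A-1", "actions": ["spin", "led_on"]}],
    "SIG_GREET": [{"device": "A-1", "actions": ["wave"]},
                  {"device": "A-2", "actions": ["bow"]}],
}

def process(raw_instruction: str):
    """Return per-device serialized instruction lists for one raw input."""
    signal = SIGNAL_MODEL[raw_instruction]
    plan = {}
    for motion in MOTION_MODEL[signal]:
        # serialize: one ordered action list per target device terminal
        plan.setdefault(motion["device"], []).extend(motion["actions"])
    return plan
```

For example, `process("clap")` would distribute a "wave" action to device A-1 and a "bow" action to device A-2, each of which then executes its received instructions.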
Corresponding to the interactive system of the first aspect, instruction parsing in the interaction method includes parsing of map data, determination of a script, and parsing of script data. The parsing yields the combined instruction sequences to be executed by the multiple first-type device terminals; each instruction can include the target device, the action type, the action content, and information about the action's course on the map. The instruction sequences can be distributed in real time according to the timing set by the script, so that multiple first-type device terminals perform the actions specified by the script on the map or chessboard of the second-type device terminals.
In the method, the external control instruction can be collected directly by the third-type device terminal or by the first- and second-type device terminals, which process it to generate instructions the device terminals can recognize; the parsing and distribution of instructions are preferably executed by the second-type device terminal.
Optionally, in the mapping relationship between the command and the control signal model and the mapping relationship between the control signal model and the motion model:
one control instruction corresponds to one control signal model, which in turn corresponds to one or more motion models;
or multiple control instructions correspond to one control signal model, which in turn corresponds to one or more motion models.
Optionally, the distribution of the serialization control instruction is as follows: at the execution time corresponding to the instruction, the corresponding instruction is sent to the corresponding equipment terminal in real time;
or all control instructions corresponding to each equipment terminal are sent to the corresponding equipment terminal.
Advantageous effects
The interactive system and method realize real-time script-based interaction among multiple devices: the device terminals of multiple users, or the multiple devices of a single user, can perform a specified script through their interaction. Combining different device terminals triggers updates of the script and the map, keeping the scripts fresh; this holds strong appeal for younger users, stimulates their creativity, and presents diversified play patterns and experiences.
Meanwhile, the invention can provide the user with a programmable script updating and editing interface and a control-to-execution model editing interface, allowing the user to customize how the doll responds to controls and bringing a brand-new interaction and entertainment experience.
Drawings
FIG. 1 is a schematic diagram of an interactive system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating a class A device in the interactive system according to an embodiment of the present invention;
FIG. 3 is a schematic diagram showing the splicing of the platform for placing the class B devices in the interactive system according to the present invention;
FIG. 4 is a schematic diagram illustrating the physical coordinate setting of a placement platform on a class B device;
FIG. 5 is a schematic diagram illustrating the physical coordinate setting in the case of splicing of the type B devices;
FIG. 6 is a schematic diagram of the digital coordinate setting of a map or a chessboard on a class B device;
FIG. 7 is a schematic diagram illustrating an isometric mapping of physical coordinates to digitized coordinates;
FIG. 8 is a schematic diagram illustrating a cropping map of physical coordinates and digitized coordinates;
FIG. 9 is a schematic illustration of a scenario of the present invention and its analysis;
FIG. 10 is a schematic diagram showing the script parsing, instruction distribution and execution flow of the interactive system of the present invention;
FIG. 11 is a diagram illustrating an embodiment of instruction processing in the interactive method of the present invention;
FIG. 12 is a flow chart illustrating an interactive method of the interactive system of the present invention working in a free mode by taking a single user as an example;
FIG. 13 is a schematic diagram of a mapping of a control command model to a motion model;
FIG. 14 is a schematic view of an interaction method of the interactive system of the present invention in a performance mode;
fig. 15 is a schematic diagram illustrating a scenario recording principle of the interactive system of the present invention.
Detailed Description
The following further description is made in conjunction with the accompanying drawings and the specific embodiments.
Example 1
This embodiment introduces a scenario-based multi-device interactive system.
The first type of terminal is referred to as a class A device, which may be a portable device such as a doll or a pendant, preferably a doll with movement or joint-motion capability. Further type classification can be made according to the device's appearance (material, color, shape, etc.) or its software and hardware capabilities, e.g., class A device types a, b, c, and so on.
In this embodiment, the class a device is composed as shown in fig. 2, where the identity module, the connection module, the behavior capability execution module, the instruction acquisition module, the data recording and sorting module, and the policy management module are software function modules, and others are hardware function modules. The composition can support the operations of A-class devices such as voice playing, lamplight displaying, vibration, photographing, video recording, moving, displaying, joint movement and the like, so that the A-class devices have richer script deduction capability, and the interactive experience of the executing user is greatly improved.
The second class of device terminals are called class B devices, and the class B devices adopt devices that can be placed in a plane, for example, a placing platform can be designed in the class B devices for placing the class a devices and providing an active area for the class a devices. The placing platform of the B-type device can be in the form of a chessboard or a map and the like.
The communication units of the class a device and the class B device may each include: the system comprises one or more of a magnetic card, an infrared communication module, a WIFI communication module, a Bluetooth communication module, a mobile communication module and a wired network card, and is used for realizing near field communication and far field communication among A-class devices, between the A-class devices and B-class devices and/or among different B-class devices.
Different class B devices can be spliced through physical contact, which triggers the splicing and fusion of the chessboards or maps on their placing platforms in the background software, thereby triggering the script corresponding to the spliced chessboard or map, so that the class A devices execute the corresponding actions of the new script on the spliced platform. The physical splice can be realized by magnetic attraction, proximity sensing, slot-and-clip engagement, and the like; that is, the class B devices are provided with connecting mechanisms so that the edges of different class B device placing platforms can be butted against each other. As shown in fig. 3, a plurality of quadrilateral class B device placing platforms (chessboard or map surfaces) are spliced into a larger quadrilateral surface that can accommodate more class A devices performing a script on the platform, making the script richer and further improving the interest and playability of the system.
Referring to fig. 11, the class A device and/or the class B device includes a map management module for map loading, map parsing, map updating, and map generation. The map loading is responsible for loading the map or chessboard corresponding to the class B device's placing platform area (a map or chessboard is essentially a set of serialized data); the map parsing parses the loaded map or chessboard information and identifies its area-division information, including the remapping of motion instructions required when the map size changes due to the splicing of class B devices.
The map updating synchronizes with the cloud and receives map updates from it. The map generation produces serialized map data, including automatic generation and user-triggered generation.
The map analysis further comprises the steps of responding to the splicing state of the placing platforms among the plurality of B-type devices, obtaining spliced map or chessboard data, and determining the area division on the spliced placing platforms according to the size relation of the placing platforms before and after splicing. Specifically, maps or chessboard area information before and after splicing can be mapped, so that the A-type device can perform action execution according to mapped position data when a sequential control instruction is subsequently executed.
Regarding the maps or chessboards on class B device placing platforms, the invention involves two design aspects: coordinate setting and serialized expression.
1) Coordinate setting: this mainly involves the setting of physical coordinates and the setting of map coordinates.
The physical coordinates are the plane coordinate marks of the actual placing platform of the class B device. The platform may be divided into a grid of cells of physical size a: assuming the platform's width is W and its height H, m = W/a cells can be set in the horizontal direction and n = H/a cells in the vertical direction, and one corner is selected as the origin. As shown in fig. 4, taking the first cell at the upper-left corner as coordinate P0(0,0), the cell in the i-th row and j-th column (counting from the top-left) of the class B device has coordinates (j-1, i-1), and the lower-right cell is (m-1, n-1); these correspond to P1(3,2) and Px(m-1, n-1) in fig. 4.
For the case of class B device splicing, referring to fig. 5, the first cell in the top-left corner of the spliced platform area is still coordinate P0(0,0), and when two platforms are joined horizontally the last cell in the bottom-right corner becomes Px(2m-1, n-1).
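The grid scheme above can be sketched as follows; this is an illustrative assumption about the arithmetic only (cells of side a, m = W/a columns and n = H/a rows, top-left origin, horizontal splicing multiplying the column count), with function names invented for the sketch.

```python
def platform_grid(W: float, H: float, a: float, spliced_cols: int = 1):
    """Return (m, n): horizontal and vertical cell counts of the
    (possibly horizontally spliced) placing platform."""
    m = int(W // a) * spliced_cols   # cells across, scaled by splice count
    n = int(H // a)                  # cells down
    return m, n

def cell(col: int, row: int):
    """Cell in column col, row row (1-based from top-left) -> (x, y)."""
    return (col - 1, row - 1)

def bottom_right(W: float, H: float, a: float, spliced_cols: int = 1):
    """Coordinate of the last cell, e.g. Px(m-1, n-1) in fig. 4."""
    m, n = platform_grid(W, H, a, spliced_cols)
    return (m - 1, n - 1)
```

For instance, a 40×30 platform with cell size 10 yields a 4×3 grid with bottom-right cell (3, 2); splicing two such platforms side by side moves the bottom-right cell to (7, 2), matching the 2m-1 column index above.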
Splicing a plurality of class B devices can trigger a brand-new map: for example, a single device triggers the loading of map A1, splicing two class B devices triggers the loading of map A2, and splicing four class B devices triggers the loading of map A3.
As described above, the default map loaded by a single device and the new maps triggered by splicing different devices are managed by the map management module; the management rule is either preset or pushed by the cloud server.
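A minimal sketch of such a splice-triggered selection rule, assuming the example pairing above (one device → A1, two → A2, four → A3); the table and fallback behavior are illustrative assumptions, and in practice the rule could equally be pushed by the cloud server.

```python
# preset rule: number of spliced class B devices -> map to load
SPLICE_MAPS = {1: "A1", 2: "A2", 4: "A3"}

def map_for_splice(device_count: int) -> str:
    """Pick the map for the current splice, falling back to the largest
    configured splice size not exceeding the actual device count."""
    candidates = [k for k in SPLICE_MAPS if k <= device_count]
    return SPLICE_MAPS[max(candidates)]
```

Under this assumed fallback, three spliced devices would still load map A2, since no three-device map is configured.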
In the invention, a map is a set of serialized data descriptions, and setting map or chessboard coordinates means obtaining a set of serialized map data by describing a set of areas and their distribution in a digital plane. The map or chessboard information can be partitioned and defined through coordinate marking. Referring to fig. 6, the map coordinate setting includes defining the rectangle spanned by P0 and P1 as a departure waiting area, the rectangle spanned by P2 and P3 as an exit area, the rectangle spanned by P5 and P6 as an obstacle area, and P4 as a special trigger-point area. The map may define arbitrary partitions for subsequent use by the script, and the partition definitions are not limited to the following:
[Table of example partition definitions, reproduced only as an image in the original document]
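The rectangle-based partitioning described above could be represented as in the following sketch; the region tags follow fig. 6, but the data layout and coordinates are assumptions for illustration.

```python
# each partition: a tag plus an inclusive rectangle ((x0, y0), (x1, y1))
REGIONS = [
    {"tag": "departure_wait", "rect": ((0, 0), (1, 1))},   # P0-P1
    {"tag": "exit",           "rect": ((2, 0), (3, 1))},   # P2-P3
    {"tag": "obstacle",       "rect": ((5, 2), (6, 3))},   # P5-P6
    {"tag": "trigger_point",  "rect": ((4, 4), (4, 4))},   # P4
]

def region_at(x: int, y: int, regions=REGIONS):
    """Return the tag of the partition containing cell (x, y), or 'free'."""
    for r in regions:
        (x0, y0), (x1, y1) = r["rect"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return r["tag"]
    return "free"
```

A script interpreter could then consult `region_at` before each move instruction, e.g. refusing moves into an obstacle cell or firing a plot event on entering the trigger point.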
the coordinate setting needs to map the physical coordinates and the digital coordinates in advance, and when the subsequent map is analyzed, the map management module can determine the physical coordinates of the actual positions on the B-type device placing platform corresponding to each piece of map data according to the loaded serialized map data and the mapping relation. The mapping mode can adopt equal scaling mapping, clipping mapping and the like.
The isometric mapping is shown with reference to fig. 7, i.e., the mapping is scaled according to the physical size and the digital map size, e.g., when the aspect ratio of the digital map and the aspect ratio of the class B device are the same, the isometric mapping is used.
Referring to fig. 8, when the aspect ratio of the digital map is different from that of the class B device placement platform region, the cutting mapping is adopted, that is, the digital map is scaled to correspond to the actual size of the class B device placement platform (after scaling, the placement platform region should completely include all coordinate regions of the digital map), and the rest regions may be set as invalid regions in the map information, that is, only a part of the region where the class B device is actually used as an available map region.
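The two mapping modes above can be sketched numerically as follows; this is a hedged illustration under the stated assumptions (equal-scale mapping when aspect ratios match, a uniform scale with invalid leftover cells otherwise), with invented function names.

```python
def equal_scale(map_pt, map_size, platform_size):
    """Map a digital-map point to physical coordinates by pure scaling
    (used when the map and platform aspect ratios are the same)."""
    (mx, my), (mw, mh), (pw, ph) = map_pt, map_size, platform_size
    return (mx * pw / mw, my * ph / mh)

def crop_map(map_size, platform_size):
    """Scale the map uniformly so the platform fully contains it; return
    the scale factor and the physical width/height of the valid map
    region. Platform cells outside that region are treated as unavailable."""
    (mw, mh), (pw, ph) = map_size, platform_size
    s = min(pw / mw, ph / mh)        # uniform scale preserves the map's aspect ratio
    return s, (mw * s, mh * s)       # valid region on the platform
```

For a 10×20 map on a 20×20 platform, the uniform scale is 1, leaving a 10×20 valid region and marking the remaining half of the platform as an invalid area, as described above.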
In practical application, the interactive system can have a plurality of application modes, for example, a blank map can be loaded in a mode of free control by a user, and at the moment, the placing platform of the whole B-type device can be used for the A-type device to freely move under the instruction of the user. Of course, a preset corresponding mode map may also be loaded.
The information included in the serialized map data is not limited to the following:
Field — Meaning
Number/version information — Number and version information of the map
Size information — Size information of the map
Applicable device information — Configurations the map applies to (e.g., a single device, or a splice of multiple devices)
Application mode — Game mode the map applies to
Applicable scripts — Restrictions on which scripts may subsequently be loaded
Region partitioning and labeling — Partitioning and labeling information for specific regions
The invention does not limit the specific field names of the serialized map data, nor the concatenation manner or storage format of the serialization. One possible example is the following:
Map001V0.1_ALL_ALL_FREEDON_ALL  // this map is number 001, version 0.1, applicable to all sizes and all device terminal types, to free mode, and to all scripts
(x0, y0) (x1, y1) unused  // set an unavailable area
Mark P (x1, y1)  // mark a special point P for use by subsequent scripts and the like
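A parser for the underscore-joined header in the example above might look as follows; the field order mirrors that one example line only, since the invention fixes neither the field names nor the storage format:

```python
# Hypothetical parser for a map header such as "Map001V0.1_ALL_ALL_FREEDON_ALL".
# The field layout is read off the example above, not a mandated format.
def parse_map_header(line):
    ident, size, devices, mode, scripts = line.split("_")
    return {
        "id": ident,        # map number + version, e.g. "Map001V0.1"
        "size": size,       # applicable sizes ("ALL" = any)
        "devices": devices, # applicable device terminal types
        "mode": mode,       # application mode, e.g. free mode
        "scripts": scripts, # scripts the map may be used with
    }
```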
Referring to fig. 11, the class-A terminal and/or class-B terminal of the invention further includes a scenario management module; the operations performed by the scenario management module include scenario loading and scenario parsing, and may further include scenario updating and/or scenario generation.
The script may be provided by a server, recorded by the user, or generated automatically by collecting a favorite action sequence of the user in the free-control mode.
Referring to fig. 9, the content of a scenario may include scenario version information, scenario scene definition information, the map or chessboard size and area definition information corresponding to the scenario, and the like, as well as serialized control instructions. The control instructions can be segmented to correspond to different scenario deduction modes; one scenario may comprise a single deduction mode or several combined modes. For example, the serialized control instructions of a scenario may comprise an automatic execution segment and a user control segment. The instructions of the automatic execution segment must be realized in cooperation with the detection and sensing capability of the class-A or class-B device for the motion of the class-A device on the class-B device: the real-time detection result serves as the instruction execution result and as the judgment condition for executing the next instruction, e.g. whether the device has moved to the specified position, or whether the motion trajectory matches the trajectory defined by the scenario. The instructions in the user control segment require the user to exercise relevant control over the class-A device, and the control result may cause the execution order, logic, and so on of subsequent instructions to be adjusted. The above division into instruction segments is a virtual division; an instruction segment may contain one or more instructions.
Referring to fig. 10, in the process from scenario parsing to deduction, the device terminal responsible for scenario parsing parses the scenario to obtain the corresponding serialized control instructions and distributes them in sequence order, so that a class-A device executes the action of each instruction upon receipt; or, for a user control segment in the scenario, the class-A device receives and executes the user's control instructions, feeds the execution result back to the scenario parsing device, and the next instruction is then selected or jumped to according to that result.
In application, the invention supports script recording: the class-A device or the class-B device can be controlled to acquire and record actions so as to form a new serialized script.
Recording of a script may be triggered by user voice input, smart-device UI interface options, physical buttons of a class-A or class-B device, or a combination thereof.
In the present invention, the serialized representation of the scenario and the serialized representation of the map may be stored separately or in a single representation file, such as:
map001v01  // map information; may default to a full-screen blank map
Mark BEGIN (X1, Y1)  // the region marked by X1, Y1 is the entry area
Mark END (X2, Y2)  // the region marked by X2, Y2 is the exit area
C01 ALL MOVE BEGIN  // all devices on the class-B device automatically move to the entry area
C02 A1 MOVE P1 (X3, Y3)  // A1 moves to position P1
C03 After C02; A2 MOVE P2 (X4, Y4)  // after C02 completes, A2 moves to position P2
C04 IF (check C03) A1 FLASH
    ELSE A1 MOVE END  // if C03 executed successfully, A1 flashes its light; otherwise A1 moves to the exit area
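The sequencing and IF/ELSE branching in an example like the one above can be sketched as a tiny interpreter; the tuple layout and instruction strings below are illustrative stand-ins, not a format defined by the invention:

```python
# Each entry: (number, prior instruction to check or None,
#              action if the check passed, fallback action otherwise).
EXAMPLE_SCRIPT = [
    ("C01", None, "ALL MOVE BEGIN", None),
    ("C02", None, "A1 MOVE P1", None),
    ("C03", "C02", "A2 MOVE P2", None),
    ("C04", "C03", "A1 FLASH", "A1 MOVE END"),
]

def run_script(script, execute):
    """execute(num, action) performs one instruction and returns True or
    False, standing in for the real-time detection result that the text
    uses as the judgment condition for the next instruction."""
    results = {}
    for num, check, action, alt in script:
        if check is None or results.get(check):
            results[num] = execute(num, action)
        else:
            results[num] = execute(num, alt)
    return results
```

If every instruction reports success, C04 executes the FLASH branch; if the detection result for C03 is negative, the ELSE branch moves A1 to the exit area instead.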
In the invention, different class-B or class-A devices may be provided with different scripts. Placing class-A devices of different models and in different numbers on a class-B device can trigger the unlocking of the script corresponding to that combination, or its download from the cloud. Likewise, splicing different class-B devices together triggers the unlocking or cloud download of the script corresponding to the combination.
The module may reside in an intelligent terminal such as a mobile phone, or in a class-A or class-B device, or distribute data after processing in the cloud. Preferably, it is integrated in a software module of the class-B device.
Furthermore, the interactive system of the invention further includes a third type of device terminal, referred to as control device C; the control device C may be an intelligent terminal such as an electronic watch, bracelet, or mobile phone. The control device C comprises an instruction acquisition unit for receiving user control instructions; for its specific form, refer to the implementation of the instruction acquisition unit of the class-A device.
The control equipment C is used for receiving a user control instruction, the user control instruction can be processed by the control equipment C or preset instruction processing equipment to obtain a control instruction corresponding to the A-type device and/or the B-type device, and then the control instruction is distributed to the corresponding A-type device and/or the B-type device;
or after receiving the user control instruction, the control equipment C transmits the control instruction to the class-A device or the class-B device for instruction processing to obtain a control instruction corresponding to the class-A device and/or the class-B device, and then distributes the control instruction to the corresponding class-A device and/or the class-B device;
or after receiving the user instruction, the control device C transmits the control instruction to the class a device and the class B device respectively for instruction processing, and each class a device and each class B device extract the control instruction corresponding to the local device terminal in the control instruction.
Of course, the interactive system of the invention need not be provided with a control device C; the collection, processing, and related handling of control instructions can be carried out by the class-A or class-B devices. If an independent control device C is provided, it may be configured with one or more of the following functional modules: a sensor unit, a key unit, a human-computer interface, and a voice interaction unit. The human-computer interface can provide the user with a virtual key area and/or an area for drawing a motion trajectory or gesture corresponding to an instruction.
Referring to fig. 13, the processing of an original control command may be based on a preset control instruction model. That is, mutually mapped control instruction models and motion models are preset in the device responsible for instruction analysis. The input of a control instruction model is the originally input control signal, such as a sensor signal, a voice signal input by the user, or other operation data entered by the user through the human-computer interface; its output is the mapped motion model, which may be an overall motion model of a class-A device or the motion model of a particular joint. The motion model is the instruction sequence corresponding to the content the device is to execute.
The control instruction model is not limited to the following form:
[Table rendered as an image in the original: Figure RE-GDA0002819581780000122, example control instruction models]
The contents of the control instruction models and motion models, and the mapping relationship between them, are configured in advance, and can also be reconfigured or extended through device upgrades (such as cloud push) or a local user-side interface.
For the motion model, where the class-B device is placement-type equipment, the motion model only concerns the class-A device. Because different class-A devices may have different motion capabilities, the motion model is not fixed; it can be preset, changed locally, or pushed through the cloud. Suppose a certain class-A device has a "magic step" motion capability, defined as the three steps of advancing, retreating, and going back and forth, performed three times. A new motion model MAGIC can then be set, and MAGIC together with a mapping rule for triggering it can be pushed to the class-A device through the cloud, after which the device can drive the execution of MAGIC through the control instruction model corresponding to that rule.
The motion model represents the actual motion capability, or combination of capabilities, of the device. It may be defined for the class-A device as a whole or for a particular joint of the class-A device, by local setup or cloud push, and is not limited to the following definitions:
[Tables rendered as images in the original: Figures RE-GDA0002819581780000121 and RE-GDA0002819581780000131, example motion model definitions]
Based on the definitions of the control signal model and the motion model, the mapping relationship between the two in this embodiment can be realized by number mapping; the mapping can likewise be set or modified on the device side or by cloud push, and is not limited to the following forms:
[Table rendered as an image in the original: Figure RE-GDA0002819581780000132, example number mappings between control signal models and motion models]
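A number mapping of this kind can be sketched as two lookup tables; the model numbers, the gestures, and the MAGIC expansion below are invented for illustration and are not values defined by the text:

```python
# Illustrative number mapping: control-signal models -> motion models.
CONTROL_TO_MOTION = {
    "G01": ["MOVE_FORWARD"],  # e.g. a forward-swipe gesture
    "G02": ["MOVE_BACK"],
    "G03": ["MAGIC"],         # composite motion, e.g. pushed from the cloud
}

# Motion models expand to primitive actions the device can execute.
MOTION_MODELS = {
    "MAGIC": ["FORWARD", "BACK", "FORWARD", "BACK", "FORWARD", "BACK"],
}

def expand(signal):
    """Resolve a control-signal model number to primitive actions."""
    actions = []
    for motion in CONTROL_TO_MOTION[signal]:
        actions += MOTION_MODELS.get(motion, [motion])
    return actions
```

Pushing a new composite motion from the cloud then amounts to adding one entry to each table, without touching the device-side expansion logic.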
The invention can provide the user with model-programming capability through the user interaction interface, permitting the user to define the correspondence between gesture models and execution models, which is used to generate the correspondence between the user's specific control gestures and execution instructions.
Once the motion model is determined, the device terminals that are to move and the motion data they are to execute are known; at this point, serialized control instruction data can be generated for each device terminal involved, to be distributed to the corresponding terminal or stored. A serialized control instruction may include information such as an instruction number, transmission target, execution time, execution duration, execution action, execution content, execution count, and instruction control command, and is not limited to the following:
[Tables rendered as images in the original: Figures RE-GDA0002819581780000133 and RE-GDA0002819581780000141, example serialized control instruction fields]
A serialized instruction may contain one, some, or all of the above fields, expressed in a particular encoding. A few examples:
C001: A_0s_Move_LEFT  // instruction numbered C001: at 0 s, device A starts moving to the left;
C002: Pause C001 at 2S  // instruction numbered C002: at 2 s, pause instruction C001, i.e. A stops moving;
C003: After C002; A_Flash_3  // instruction numbered C003: after instruction C002 finishes, device A flashes its light 3 times.
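A decoder for the first encoding shown above ("C001: A_0s_Move_LEFT") might be sketched as follows; the field layout (target, start time, action, parameter) is read off that one example and is not a format mandated by the text:

```python
# Hypothetical decoder for serialized instructions of the form
# "C001: A_0s_Move_LEFT" (number, then underscore-joined fields).
def parse_instruction(line):
    num, body = [p.strip() for p in line.split(":", 1)]
    target, start, action, param = body.split("_")
    return {"num": num, "target": target, "start": start,
            "action": action, "param": param}
```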
After determining the device terminals and the control instructions related to the motion model, it is necessary to distribute the serialized control instructions to the corresponding device terminals, and the instruction distribution form may be:
at the execution time corresponding to the instruction, the corresponding instruction is sent to the corresponding equipment terminal in real time;
or all control instructions corresponding to each equipment terminal are sent to the corresponding equipment terminal, namely, one group of control instructions are sent at one time.
The instruction source of the real-time instruction distribution module can be real-time input from a class-A device (generated by the user controlling it), instructions parsed from a script and prefabricated in it, or both.
The control commands include action commands. For the case of spliced class-B devices, in this embodiment the equipment responsible for command processing maps the movement speed and amplitude according to the change in current map size before distributing the command, so as to better match the physical experience of the motion. For example, suppose the default map size for a movement command is the single class-B device map size W x H, and the command content is "move left 20 cm". When splicing of class-B devices changes the actual map size to W2 x H2, the corresponding movement distance can be remapped reasonably, e.g. by the proportional relation between W2 and W.
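The proportional remapping described here reduces to one line; the function name is illustrative:

```python
# Rescale a move distance authored for a single class-B map of width W
# when splicing widens the actual map to width W2 (ratio W2 : W).
def remap_distance(distance_cm, w_single, w_spliced):
    return distance_cm * w_spliced / w_single
```

A "move left 20 cm" command on a 40 cm-wide single map would thus become a 40 cm move once two such devices are spliced into an 80 cm map.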
In the present invention, the analysis processing and the distribution of the user control command are preferably executed by the class B device.
After instruction distribution, the class-A or class-B device that receives a control instruction identifies it, parses the serialized control instruction into the corresponding execution control instruction through the instruction execution interface, and drives the corresponding hardware. For example, the control instruction FLASH3 is converted into the corresponding hardware execution interface call FLASH(3000ms).
The execution of the control instructions represents the instruction execution capabilities of the device and is not limited to the following types:
[Table rendered as an image in the original: Figure RE-GDA0002819581780000151, example instruction execution types]
based on the above settings, the application modes of the multi-device interactive system based on the scenario of the present invention are as follows.
1) Free mode: the class a device is freely operated by a single or multiple users through the control device C to move, play music, etc. on the class B device.
Optionally, as shown in fig. 11, one or more users input instructions through a specific control device C in a specific manner; after instruction processing and analysis, the input instruction is distributed to the corresponding class-A or class-B device, triggering instruction execution on that device.
This mode of interaction may be triggered by voice, smart device UI interface options, physical buttons on class a or class B devices, or a combination of two or more of these, and this mode of interaction may be the default mode for one or more class a devices to be placed on class B devices.
For the example of a single user, an interaction flow in free mode is shown in fig. 12. The processing and generation of instructions converts the original control instruction into standard execution instructions recognizable by the equipment. The original control instruction may be a single instruction or a group of instructions for a single device, or for multiple devices; the execution timing relationships among instructions can be expressed, and extension is supported. After receiving the corresponding standardized control instruction, the class-A equipment converts it into the corresponding interface data of its motion control system and then executes the corresponding operation.
2) Performance mode: the class-A device is placed on the class-B device and, as arranged by the script, automatically completes the relevant motion, music playing, and other actions without user participation.
The operating principle of the interactive system in performance mode is shown in fig. 13: after obtaining the parsing result, the device terminal responsible for instruction analysis sends each instruction in the control instruction sequence to its corresponding class-A device at the corresponding time or on the corresponding condition, so that the class-A devices complete the deduction of the whole scenario in cooperation with the instruction distribution device.
The instruction representation in performance mode is not limited to the following types:
[Table rendered as an image in the original: Figure RE-GDA0002819581780000161, example performance mode instruction representations]
3) Script mode: the user plays according to the script, as shown in fig. 9; that is, some segments require the user to control the class-A device's behavior, while in the other segments class-A device control is executed automatically by system-issued instructions.
4) Recording mode: the system identifies and records class-A device actions in free mode or script mode, and generates a custom script for sharing or for watching again in performance mode.
The recording of the script may be performed together with the performance of the script, i.e. the recording mode may be started synchronously when the user performs a free mode or script mode interaction. As shown in fig. 15, in the free mode or scenario mode, control commands to the class-a devices are detected, data is stored in a serialized manner, and the stored command set (scenario) is available for subsequent loading.
The interactive system can realize free actions or designated actions of one or more first-type equipment terminals on the second-type equipment terminals, or, during scenario deduction, realize movement of the first-type equipment terminals on the second-type terminals according to the map data and performance actions corresponding to the scenario. In a specific implementation, when multiple first-type equipment terminals are combined with multiple second-type equipment terminals, one second-type equipment terminal can analyze and process the external control instruction, scenario, or map, and then distribute the parsed instruction sequence to the corresponding first-type equipment terminals, so that multiple first-type equipment terminals cooperatively complete the designated or scenario-specified actions.
To realize reliable and smooth interaction, the invention arranges for the equipment responsible for instruction analysis to detect the instruction execution results of the class-A device during the interaction between the class-A and class-B devices. If the equipment responsible for analysis is a class-B device, the class-B device is provided with a motion detection module for detecting the motion position, trajectory, and behavior of the class-A device on the class-B device; the detection result can serve as a flag for whether an action was executed successfully. Motion detection may be implemented in the following ways.
1) Motion detection using class A device's own hardware capabilities
The motion detection module records the motion or other behavior results using the hardware capability of the class-A device itself and feeds them back to the instruction analysis equipment. For example, the moving device Z of the class-A device records its moving direction, speed, acceleration, and time, from which the new position P1 relative to the initial position P0 can be computed. If the initial position of the class-A device on the class-B device is (0, 0), the device is instructed to move right and stop after T seconds, and its recorded average speed is V, then its latest position is computed as (VT, 0). The motion detection module compares the computed result with the actual result to judge whether the class-A device executed successfully, and thereby determines the next instruction.
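The dead-reckoning check in this example can be sketched as follows; the tolerance parameter is an added assumption, since the text does not specify how exact the comparison must be:

```python
# Dead reckoning for a class-A device: predict the position after moving
# at average speed v for t seconds along a unit direction vector, then
# compare the prediction with the detected position to decide success.
def expected_position(start, v, t, direction=(1, 0)):
    x0, y0 = start
    return (x0 + v * t * direction[0], y0 + v * t * direction[1])

def executed_ok(start, v, t, detected, tol=1.0):
    ex, ey = expected_position(start, v, t)
    return abs(ex - detected[0]) <= tol and abs(ey - detected[1]) <= tol
```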
2) Motion detection using class B device capabilities
That is, device B needs the capability to sense the specific position and detect the motion behavior of device A on its plane; first of all, device B must be able to position and manage its own coordinates.
For self-coordinate positioning, the whole device B can set up coordinates with a point at one corner of its two-dimensional plane as the origin (0, 0), as shown in the figure. Taking a device B of width W and height H as an example, if the top-left point is the origin (0, 0), then any point Pn(x, y) on the plane of device B means: horizontal distance x and vertical distance y from P0.
Identifying the location of device a in device B may depend on the hardware capabilities of device a and device B, and is not limited to the following:
2-1) Identify the position using a specific known starting position and the motion trajectory data of device A: for example, device A takes a specific starting position P(x0, y0) on device B as its origin, and the position point P at some time T is calculated from the motion trajectory of device A (including speed, direction, acceleration, and motion time).
For instance, with initial position (x, 0), if the device runs for time t at speed v along one axis, the point P shown in the figure is (x, vt); other directions, accelerated motion, and curvilinear motion can be calculated through the physical formulas for real-time position.
2-2) identifying coordinate information of the device A at the device B by using the contact sensing device of the device B. The contact sensing device can be a magnetic force, gravity and other sensing matrix devices.
The sensing matrix divides the device plane into a W x H grid; through sensing between device A and device B, the system identifies which grid cells of device B device A contacts. For example, the first cell in the upper-left corner represents (0, 0), and a contact point in row i, column j is identified as (i-1, j-1).
2-3) if device B is equipped with an LCD device, i.e. device B is a touch screen device that can be touched in the plane, it directly recognizes the touch point of device a on the display screen as its location.
2-4) The physical position of device A on device B is confirmed by multi-point positioning. The positioning device may use positioning means mature in the industry such as infrared or Bluetooth, which the invention does not limit. The figures take three-point positioning as an example; the three point positions on device B are only illustrative, and the invention does not limit the installation positions or the number of positioning points.
The multipoint positioning calculation module may run on device B, or on another device connected to device B by wire or wirelessly, such as a smartphone, which transmits the corresponding calculation results to device B.
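For the three-point positioning case, a standard trilateration solver (not given in the text; the linearization below assumes the anchors are not collinear) could compute device A's position from its distances to three anchor points on device B:

```python
# Trilateration in the plane: locate a point from its distances d1, d2, d3
# to three known anchor points p1, p2, p3 on device B.
def trilaterate(p1, d1, p2, d2, p3, d3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise yields two linear equations.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2**2 - d3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```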
Example 2
This embodiment introduces a scenario-based multi-device interaction method, including:
collecting externally input control source information;
processing the control source information and extracting control instruction information;
analyzing the control instruction information, and determining a control signal model from it according to a preset mapping relation between instructions and control signal models;
determining a motion model from the determined control signal model according to a preset mapping relation between control signal models and motion models, the motion model comprising equipment terminal data and the motion data to be executed;
and generating serialized control instructions corresponding to one or more equipment terminals according to the determined motion model and distributing them to the corresponding equipment terminals, so that the equipment terminals execute the corresponding actions according to the received control instructions.
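The steps above can be strung together in a minimal end-to-end sketch; the lookup tables and the instruction encoding are illustrative assumptions, not structures fixed by the method:

```python
# Illustrative stand-ins for the preset instruction -> control-signal-model
# and control-signal-model -> motion-model mappings.
SIGNAL_MODEL = {"swipe_left": "S_LEFT"}
MOTION_MODEL = {"S_LEFT": [("A1", "Move_LEFT")]}  # (terminal, action) pairs

def interact(raw, send):
    """Collect raw control-source input, extract the instruction, resolve
    it through the two mappings, then serialize and distribute one control
    instruction per affected equipment terminal via send(terminal, code)."""
    instruction = raw.strip().lower()              # collect + extract
    signal = SIGNAL_MODEL[instruction]             # instruction -> signal model
    for i, (terminal, action) in enumerate(MOTION_MODEL[signal], 1):
        send(terminal, f"C{i:03d}_{terminal}_{action}")  # serialize + distribute
```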
Corresponding to the interactive system of the first aspect, instruction analysis in the interactive method includes parsing of map data, determination of a scenario, parsing of scenario data, and the like. Through this analysis, the combined instruction sequences to be executed by the several first-type device terminals can be obtained; each instruction may include the target device, the execution action type, the action content, information on the action's course over the map, and so on. The instruction sequences can be distributed in real time at the times set by the scenario, so that the first-type device terminals perform the scenario-designated actions on the map or chessboard of the second-type device terminals.
In the method, the direct collection of external control instructions, their processing, and the generation of instructions recognizable by the equipment terminals may be carried out by the third type of equipment terminal or by the first- and second-type equipment terminals; the analysis and distribution of instructions are preferably executed by the second-type equipment terminal.
Optionally, in the mapping between instructions and control signal models and between control signal models and motion models:
one control instruction corresponds to one control signal model, which corresponds to one or more motion models;
or several control instructions correspond to one control signal model and to one or more motion models.
Optionally, the distribution of the serialization control instruction is as follows: at the execution time corresponding to the instruction, the corresponding instruction is sent to the corresponding equipment terminal in real time;
or all control instructions corresponding to each equipment terminal are sent to the corresponding equipment terminal.
In summary, the interactive system and method of the invention realize real-time, scenario-based interaction of multiple devices: the device terminals of multiple users, or the multiple devices of a single user, can carry out the deduction of a designated scenario through interaction. Combinations of different equipment terminals can trigger updates of scripts and maps, refreshing the scenario content; this is highly attractive to young users, stimulates their sense of innovation, and presents diversified play and experiences.
Meanwhile, the invention can provide the user with a programmable script updating and editing interface and a control-execution model editing interface, realizing DIY control of the doll's control responses and bringing a brand-new interaction and entertainment experience.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A script-based multi-device interaction system, characterized by comprising a first-type device terminal and a second-type device terminal that are connected and communicate with each other, wherein the second-type device terminal comprises a placement platform providing an activity area for the first-type device terminal;
script data are preset in the first-type device terminal, the script data comprising control instruction data and/or map data corresponding to a script; script data and/or map data corresponding to the placement platform of the second-type device terminal are preset in the second-type device terminal;
the first-type device terminal and/or the second-type device terminal respond to the input of an external control instruction, parse the external control instruction, and, according to the parsing result, control the corresponding first-type device terminal to execute the action specified by the control instruction on the placement platform of the second-type device terminal;
or the first-type device terminals and/or the second-type device terminals respond to the input of external control instructions, parse the external control instructions to obtain the control instructions corresponding to the respective first-type device terminals and distribute them to the corresponding first-type device terminals, and each first-type device terminal executes the action specified by the control instruction it receives on the placement platform of the second-type device terminal;
or the first-type device terminal executes the action corresponding to the script on the placement platform of the corresponding second-type device terminal according to the script data of the first-type device terminal or of the second-type device terminal;
or the first-type device terminals and/or the second-type device terminals respond to an external control instruction and control the first-type device terminals to execute, on the placement platform of the second-type device terminal, the actions corresponding to a preset script, a script specified by the external control instruction, or a regenerated script.
2. The interaction system according to claim 1, wherein the first-type device terminals and the second-type device terminals each comprise a plurality of models, and the first-type and second-type device terminals of each model are provided with the same or different script data;
when first-type and second-type device terminals of different models and different numbers are combined via communication connections, the first-type device terminals and/or second-type device terminals in the combination are triggered to select a corresponding script from the preset script data, or to generate a new script according to the current scripts of all devices in the combination, or to send new-combination script request information so as to obtain a corresponding script from outside.
3. The interaction system according to claim 2, further comprising a server in which script data corresponding to each model of device terminal, and script data corresponding to combinations of multiple models of device terminals, are stored; when device terminals of different models and numbers are combined, the new-combination script request information is transmitted to the server, and the server returns the corresponding script data according to the request information.
4. The interaction system according to claim 1, further comprising a third-type device terminal, wherein the third-type device terminal receives a user control instruction; the user control instruction is processed by the third-type device terminal or by a preset instruction processing device to obtain control instructions corresponding to the first-type and/or second-type device terminals, which are then distributed to the corresponding first-type and/or second-type device terminals;
or, after receiving the user control instruction, the third-type device terminal transmits it to the first-type or second-type device terminal for instruction processing to obtain control instructions corresponding to the first-type and/or second-type device terminals, which are then distributed to the corresponding first-type and/or second-type device terminals;
or, after receiving the user control instruction, the third-type device terminal transmits it to the first-type and second-type device terminals respectively for instruction processing, and each of the first-type and second-type device terminals extracts from the instruction the control instruction corresponding to the local device terminal.
5. The interaction system according to claim 4, wherein when the first-type device terminal, the second-type device terminal, the third-type device terminal, or the designated instruction processing device processes the user control instruction, a control signal model is first determined from the user control instruction according to a preset mapping between instructions and control signal models; a motion model, comprising the device terminal and the action data to be executed, is then determined according to a preset mapping between control signal models and motion models; and a serialized control instruction corresponding to the device terminal is then generated according to the motion model and distributed to the corresponding device terminal.
6. The interaction system according to claim 1, wherein the first-type and/or second-type device terminal comprises a map management module, the operations performed by the map management module comprising map loading and map parsing, and further comprising map updating and/or map generation;
map loading loads the current local map or chessboard data of the device terminal; map parsing parses the loaded map or chessboard data to obtain map or chessboard area-division data corresponding to the placement platform area of the second-type device terminal; map updating receives map or chessboard update data synchronized by the server and applies the update locally; map generation generates serialized map data, and comprises automatic generation and generation triggered by a user;
and/or the first-type and/or second-type device terminal comprises a script management module, the operations performed by the script management module comprising script loading and script parsing, and further comprising script updating and/or script generation;
script loading loads the current local script data of the device terminal, or script data input from outside; script parsing comprises obtaining, from the loaded script data, a control instruction sequence directed at the first-type device terminal; script updating receives script update data synchronized by the server and applies the update locally; script generation generates serialized script instruction data, and comprises automatic generation and generation in response to a user trigger.
7. The interaction system according to claim 6, wherein the script comprises script header information and a control instruction sequence, the script header information comprising script version data, script scene definition data, and/or map or chessboard setting data corresponding to the script; the control instruction sequence comprises a plurality of instruction segments, each instruction segment comprising one or more instructions; an instruction segment is either an automatically executed instruction segment or a user control segment; when control instructions are distributed, the instructions in an automatically executed instruction segment are distributed in sequence in real time, while an instruction in a user control segment is selected according to a preset rule in response to a control instruction signal input by the user.
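The script structure of claim 7 (header information plus instruction segments, with auto segments dispatched in order and user segments resolved by a selection rule) can be sketched as follows. All field names, device names, and actions below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical serialized script: header information plus a control
# instruction sequence made of instruction segments. Every field name
# here is an illustrative assumption.
script = {
    "header": {
        "version": "1.0",
        "scene": "castle_rescue",
        "board": {"rows": 8, "cols": 8},
    },
    "segments": [
        # Automatically executed segment: dispatched in sequence, in real time.
        {"mode": "auto", "instructions": [
            {"device": "doll_1", "action": "move", "to": [2, 3]},
            {"device": "doll_1", "action": "wave"},
        ]},
        # User control segment: one instruction is selected by a preset
        # rule in response to the user's control signal.
        {"mode": "user", "instructions": [
            {"device": "doll_1", "action": "dance"},
            {"device": "doll_1", "action": "bow"},
        ]},
    ],
}

def dispatch(script, user_choice=0):
    """Flatten the script into the instruction stream actually distributed."""
    sent = []
    for seg in script["segments"]:
        if seg["mode"] == "auto":
            sent.extend(seg["instructions"])       # in sequence, in real time
        else:
            # stand-in for the "preset rule": pick the user-chosen instruction
            sent.append(seg["instructions"][user_choice])
    return sent
```

Here `dispatch(script, user_choice=1)` yields three instructions: the two auto instructions followed by the user-selected `bow`.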
8. The interaction system according to claim 6, wherein the map or chessboard data loaded by map loading is a set of serialized data; map parsing determines the mapping between digital map coordinates and the physical coordinates of the placement platform according to a preset mapping mode, and thereby determines the area division of the map or chessboard on the placement platform of the second-type device terminal;
the area division of the map or chessboard on the placement platform of the second-type device terminal comprises one or more of: a doll waiting area, an exit area, an obstacle area, and a specific-plot trigger area;
the mapping modes between the digital map coordinates and the physical coordinates of the placement platform comprise proportional scaling mapping and cropping mapping.
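The two mapping modes named in claim 8 can be sketched as follows. The map and platform dimensions, and the use of millimetres, are illustrative assumptions.

```python
def scale_map(pt, map_size, platform_size):
    """Proportional scaling mapping: a digital map coordinate is scaled
    linearly onto the physical platform coordinate system."""
    mx, my = map_size
    px, py = platform_size
    return (pt[0] * px / mx, pt[1] * py / my)

def crop_map(pt, platform_size):
    """Cropping mapping: map coordinates already inside the platform are
    kept as-is; coordinates outside it are clipped to the platform edge."""
    px, py = platform_size
    return (min(max(pt[0], 0), px), min(max(pt[1], 0), py))

# A 100x100 digital map rendered onto a 400mm x 400mm platform:
print(scale_map((25, 50), (100, 100), (400, 400)))  # (100.0, 200.0)
```

Area division would then follow by applying the chosen mapping to each region's digital-map boundary to obtain its physical footprint on the platform.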
9. The interaction system according to claim 6, wherein the second-type device terminal is provided with a connecting mechanism so that the edges of the placement platforms of different second-type devices can be butted against each other;
the connecting mechanism is a magnetic attraction mechanism or a snap-fit mechanism;
and/or map parsing further comprises, in response to the splicing state of the placement platforms of a plurality of second-type devices, obtaining spliced map or chessboard data, and determining the area division on the spliced placement platform according to the size relationship of the placement platforms before and after splicing.
10. A script-based multi-device interaction method, characterized by comprising the steps of:
collecting externally input control source information;
processing the control source information and extracting control instruction information;
parsing the control instruction information, and determining a control signal model from the control instruction information according to a preset mapping between instructions and control signal models;
determining a motion model from the determined control signal model according to a preset mapping between control signal models and motion models, the motion model comprising device terminal data and the action data to be executed;
and generating serialized control instructions corresponding to one or more device terminals according to the determined motion model, and distributing them to the corresponding device terminals, so that the device terminals execute the corresponding actions according to the received control instructions.
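The pipeline of claim 10 (control source → instruction → control signal model → motion model → serialized per-device instructions) can be sketched as follows. The mapping tables, device names, and JSON serialization are all illustrative assumptions; the patent does not specify a wire format.

```python
import json

# Hypothetical preset mappings: instruction -> control signal model, and
# control signal model -> motion model (device + action data to execute).
SIGNAL_MODELS = {"clap": "signal_advance", "wave": "signal_greet"}
MOTION_MODELS = {
    "signal_advance": [{"device": "doll_1", "action": "step_forward", "steps": 2}],
    "signal_greet":   [{"device": "doll_1", "action": "raise_arm"},
                       {"device": "doll_2", "action": "nod"}],
}

def process(control_source: str):
    """Run the claimed pipeline: extract the instruction, look up the
    control signal model, look up the motion model, and serialize one
    control instruction per target device terminal for distribution."""
    instruction = control_source.strip().lower()   # extract instruction info
    signal = SIGNAL_MODELS[instruction]            # instruction -> signal model
    motions = MOTION_MODELS[signal]                # signal model -> motion model
    # one serialized control instruction per device terminal
    return {m["device"]: json.dumps(m) for m in motions}

for device, payload in process("wave").items():
    print(device, payload)
```

A single user input ("wave") thus fans out into distinct serialized instructions for each device terminal named in the motion model, which is the distribution step of the final claim element.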
CN202010925310.XA 2020-09-06 2020-09-06 Multi-equipment interaction system and interaction method based on script Active CN112198816B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010925310.XA CN112198816B (en) 2020-09-06 2020-09-06 Multi-equipment interaction system and interaction method based on script

Publications (2)

Publication Number Publication Date
CN112198816A true CN112198816A (en) 2021-01-08
CN112198816B CN112198816B (en) 2022-02-18

Family

ID=74005324

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010925310.XA Active CN112198816B (en) 2020-09-06 2020-09-06 Multi-equipment interaction system and interaction method based on script

Country Status (1)

Country Link
CN (1) CN112198816B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101394318A (en) * 2007-09-21 2009-03-25 歌尔声学股份有限公司 Distributed intelligent toy system and communication method thereof
US20140170929A1 (en) * 2012-12-17 2014-06-19 Librae Limited Interacting toys
CN105653250A (en) * 2014-11-14 2016-06-08 中国科学院沈阳计算技术研究所有限公司 Task control system for three dimensional simulation system
CN105844879A (en) * 2016-04-02 2016-08-10 深圳市熙龙玩具有限公司 Story toy system and implementation method thereof
CN107261490A (en) * 2017-07-06 2017-10-20 腾讯科技(深圳)有限公司 Realize intelligent toy interactive method, client and intelligent toy
CN107688802A (en) * 2017-09-29 2018-02-13 深圳市玛塔创想科技有限公司 A kind of easy programming method and device based on image recognition
CN108905188A (en) * 2018-07-20 2018-11-30 北京电鳗科技有限公司 A kind of toy splicing map and the toy comprising the map
CN110119202A (en) * 2019-04-24 2019-08-13 彼乐智慧科技(北京)有限公司 A kind of method and system for realizing scene interactivity

Similar Documents

Publication Publication Date Title
US11948260B1 (en) Streaming mixed-reality environments between multiple devices
US10671239B2 (en) Three dimensional digital content editing in virtual reality
CN107911614B (en) A kind of image capturing method based on gesture, device and storage medium
CN103357177B (en) Portable type game device is used to record or revise game or the application of real time execution in primary games system
US11839816B2 (en) Method and apparatus for controlling movement of virtual object, terminal, and storage medium
CN103093658B (en) Child real object interaction story building method and system
CN108279878B (en) Augmented reality-based real object programming method and system
CN102622774B (en) Living room film creates
US20120108305A1 (en) Data generation device, control method for a data generation device, and non-transitory information storage medium
CN111589167B (en) Event sightseeing method, device, terminal, server and storage medium
CN202150897U (en) Body feeling control game television set
US20140132498A1 (en) Remote control using depth camera
CN113822970A (en) Live broadcast control method and device, storage medium and electronic equipment
KR20210067874A (en) Electronic device for providing target video in sports play video and operating method thereof
CN112198816B (en) Multi-equipment interaction system and interaction method based on script
CN106232194A (en) Game delivery device, game delivering method and game delivery program
CN113727353A (en) Configuration method and device of entertainment equipment and entertainment equipment
CN112809709B (en) Robot, robot operating system, robot control device, robot control method, and storage medium
CN110798722A (en) Playing tool for VR video
CN112717395B (en) Audio binding method, device, equipment and storage medium
CN115591225B (en) Data processing method, device, computer equipment and storage medium
CN112717411B (en) Track recording method, device, equipment and storage medium of virtual vehicle
CN112231220B (en) Game testing method and device
CN116055757A (en) Method, device, electronic equipment and storage medium for controlling and processing anchor avatar
CN108702547B (en) Moving image reproduction device, moving image reproduction method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant