CN112702423B - Robot learning system based on Internet of things interactive entertainment mode - Google Patents

Robot learning system based on Internet of things interactive entertainment mode

Info

Publication number
CN112702423B
CN112702423B (application number CN202011534704.9A)
Authority
CN
China
Prior art keywords
robot
unit
data
learning system
account
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011534704.9A
Other languages
Chinese (zh)
Other versions
CN112702423A (en)
Inventor
陈锴林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Bimai Technology Co ltd
Original Assignee
Hangzhou Bimai Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Bimai Technology Co ltd filed Critical Hangzhou Bimai Technology Co ltd
Priority to CN202011534704.9A priority Critical patent/CN112702423B/en
Publication of CN112702423A publication Critical patent/CN112702423A/en
Application granted granted Critical
Publication of CN112702423B publication Critical patent/CN112702423B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/10Protocols in which an application is distributed across nodes in the network
    • H04L67/104Peer-to-peer [P2P] networks
    • H04L67/1074Peer-to-peer [P2P] networks for supporting data block transmission mechanisms
    • H04L67/1078Resource delivery mechanisms
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16YINFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y40/00IoT characterised by the purpose of the information processing
    • G16Y40/30Control
    • G16Y40/35Management of things, i.e. controlling in accordance with a policy or in order to achieve specified objectives
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/12Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04L67/125Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed

Abstract

The invention discloses a robot learning system based on an Internet of Things interactive entertainment mode, belonging to the technical field of artificial intelligence. The system comprises a human-computer interaction module, a robot and environment interaction module, and a robot learning system module. The human-computer interaction module comprises a first account operation unit, an interaction control unit, a data synchronization fusion unit and a visualization unit; the robot and environment interaction module comprises a second account operation unit, a deployment feedback verification unit, a data acquisition unit and an equipment control unit; and the robot learning system module comprises an account management unit, a data processing unit, a learning training unit and a content distribution unit. The system of the invention can obtain labelled robot learning training data at low cost.

Description

Robot learning system based on Internet of things interactive entertainment mode
Technical Field
The invention belongs to the technical field of artificial intelligence, and particularly relates to a robot learning system based on an internet of things interactive entertainment mode.
Background
In recent years, robot learning has made many breakthroughs alongside the development of deep learning. Among the many learning methods, supervised learning is central to current mainstream techniques and performs well. For many practical scenes, however, obtaining labelled data currently requires deploying acquisition equipment in the actual scene, collecting data, and then labelling it manually through crowdsourcing, which consumes enormous manpower, financial resources and time. The lack of data has therefore become a bottleneck that hinders further progress in this field.
Because prior-art data acquisition methods require large amounts of manpower, material resources, financial resources and time to collect and label data, data for many practical scenes remain scarce.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a robot learning system based on an Internet of Things interactive entertainment mode, which solves the problem that high data-acquisition costs cause a lack of data and thus hinder progress in robot learning.
The purpose of the invention is realized by the following technical scheme: a robot learning system based on an Internet of Things interactive entertainment mode comprises a human-computer interaction module, a robot and environment interaction module and a robot learning system module; the human-computer interaction module, the robot and environment interaction module and the robot learning system module are mutually connected in data communication;
the human-computer interaction module comprises a first account operation unit, an interaction control unit, a data synchronization fusion unit and a visualization unit, wherein the first account operation unit is used for registering an account by a user and submitting information of the registered account to the robot learning system module; the interaction control unit sends a control command to the robot and environment interaction module through a P2P communication channel to control the robot in the scene and other equipment in the scene; the data synchronous fusion unit is used for sending the data stream after synchronous fusion on the time axis to the robot learning system module; the visualization unit is used for receiving a live video stream distributed by the robot learning system module or a data stream sent by the robot and environment interaction module and performing visualization processing;
the robot and environment interaction module comprises a second account operation unit, a deployment feedback verification unit, a data acquisition unit and an equipment control unit; the second account operation unit is used for registering a scene registration account and submitting the information of the scene registration account to the robot learning system module; the deployment feedback verification unit deploys the trained or stored robot model in the training process to the robot in the practical scene according to the specified task, tests the training effect of the robot model, obtains a feedback result, and sends the feedback result to the robot learning system module; the data acquisition unit is used for acquiring data streams acquired by a camera and a sensor in the environment and sending the data streams to the robot learning system module and the human-computer interaction module; the device control unit is used for receiving a control instruction for the robot or other devices from the human-computer interaction module and controlling the robot or other devices in the scene to complete tasks;
the robot learning system module comprises an account management unit, a data processing unit, a learning training unit and a content distribution unit, wherein the account management unit is used for receiving information of a registration account submitted in the man-machine interaction module and information of a scene registration account submitted in the robot and environment interaction module, managing account login and processing payment and transaction information; the data processing unit is used for receiving the data stream which is sent by the man-machine interaction module and synchronously fused on the time axis, and performing filtering and batch combination processing; the learning and training unit is used for training a robot model and sending the trained robot model or the robot model stored in the training process to the robot and environment interaction module for deployment test; the content distribution unit is used for receiving the data stream from the robot and environment interaction module and distributing the data stream to the man-machine interaction module.
Further, the human-computer interaction module operates in two modes: a remote control mode and a viewing mode.
Furthermore, the human-computer interaction module also comprises a peripheral docking unit for connecting interactive hardware equipment. The visualization unit is also used for receiving data streams from the robot and environment interaction module and the robot learning system module, decoding them, displaying the decoded streams directly on a screen or on an external display device through the peripheral docking unit, and transmitting the received data streams to the data synchronization fusion unit. The interaction control unit is also used for receiving control command data from the peripheral docking unit, sending the control command data to the equipment control unit while also sending it to the data synchronization fusion unit for synchronization fusion processing, and receiving authority control commands sent by the first account operation unit. The data synchronization fusion unit receives the data streams and control command data from the visualization unit and the interaction control unit, performs synchronization fusion processing on the time axis, and then sends them to the data processing unit. The first account operation unit receives the equipment information transmitted by the peripheral docking unit, performs equipment registration and account binding, and controls, based on the account authority, the sending authority of the interaction control unit for sending data outward.
Further, the interactive hardware devices include a projector, a display, an XR device, and a gaming peripheral.
Furthermore, the robot learning system module further comprises a central information analysis decision control unit, wherein the central information analysis decision control unit is used for receiving data to be analyzed and decided from each unit of the robot learning system module, performing analysis and decision processing, and returning an analysis decision result to a corresponding unit in the robot learning system module; and the data processing unit transmits the data for training to the learning training unit for training.
Furthermore, the robot and environment interaction module further comprises an equipment docking unit. The equipment docking unit is used for connecting external equipment and acquiring its information; receiving equipment control commands from the equipment control unit and sending them to control the target external equipment; receiving robot model update commands and data sent by the deployment feedback verification unit; and receiving data collected by the connected external equipment and sending the collected data to the data acquisition unit. The second account operation unit is also used for registering the robot test environment account, submitting registration information, receiving the equipment information acquired by the equipment docking unit, receiving from the robot learning system module the common account information of users participating in the entertainment activity, and sending that information to the equipment control unit for permission control. The data acquisition unit receives the data acquired by the equipment docking unit and, after filtering, encoding and encryption, sends it to the robot learning system module and the human-computer interaction module.
Further, the external device includes a robot, a camera, a sensor, and an actuator.
Compared with the prior art, the invention has the following beneficial effects. By establishing interactive entertainment activities in a real scene and connecting them to the Internet, people are attracted to participate spontaneously in remotely controlling robots to complete tasks; the sensor data acquired during participation and the robot control command stream serve as labelled data that can be used as training data for robot learning, which solves the problem that robot learning is currently hindered by a lack of data. During training, the robot model can be deployed directly in the real environment for testing and verification, and the feedback results are returned to adjust the training parameters and process, improving on the authenticity and effectiveness of learning-effect verification in the prior art. Through the system, data for the robot's application field, together with the control command stream generated by users controlling the robot in entertainment activities, can be collected directly as labels, avoiding the high-cost data collection and labelling processes of prior-art methods.
Drawings
FIG. 1 is a block diagram of the robot learning system of the present invention;
FIG. 2 is a structural diagram of the human-computer interaction module of the present invention;
FIG. 3 is a structural diagram of the robot learning system module of the present invention;
FIG. 4 is a structural diagram of the robot and environment interaction module of the present invention.
Detailed Description
The technical solution of the present invention is further explained with reference to the accompanying drawings.
As shown in fig. 1, the present invention provides a robot learning system based on an Internet of Things interactive entertainment mode, the system comprising a human-computer interaction module, a robot and environment interaction module and a robot learning system module, which are mutually connected in data communication.
The human-computer interaction module comprises a first account operation unit, an interaction control unit, a data synchronization fusion unit and a visualization unit. The first account operation unit is used for registering an account by a user and submitting information of the registered account to the robot learning system module. The interaction control unit sends control commands to the robot and environment interaction module through a P2P communication channel to control the robot and other equipment in the scene. The data synchronization fusion unit sends the data stream, synchronously fused on the time axis, to the robot learning system module for learning and training of the robot model. The visualization unit receives the live video stream distributed by the robot learning system module or the data stream sent by the robot and environment interaction module and performs visualization processing; it supports multiple devices synchronously displaying different contents, which separates different information sources and helps the user make quick control decisions based on different information. The human-computer interaction module is divided into a remote control mode and a viewing mode: data in the remote control mode are received from the robot and environment interaction module through the P2P communication channel, while data in the viewing mode are received from the robot learning system module through the communication channel connecting the human-computer interaction module to it.
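The patent does not specify an algorithm for the data synchronization fusion unit's "synchronous fusion on the time axis". One minimal sketch, assuming timestamped sensor frames and control commands arriving as sorted streams, is to pair each frame with the nearest-in-time command within a small tolerance (the `max_skew` threshold is an illustrative parameter, not from the patent):

```python
from bisect import bisect_left

def fuse_streams(sensor_frames, control_events, max_skew=0.05):
    """Pair each sensor frame with the nearest-in-time control event.

    sensor_frames: list of (timestamp, frame), sorted by timestamp
    control_events: list of (timestamp, command), sorted by timestamp
    max_skew: largest allowed time gap (seconds) for a valid pairing
    """
    event_times = [t for t, _ in control_events]
    fused = []
    for t, frame in sensor_frames:
        i = bisect_left(event_times, t)
        # candidate neighbours: the events just before and just after t
        candidates = [j for j in (i - 1, i) if 0 <= j < len(event_times)]
        if not candidates:
            continue
        j = min(candidates, key=lambda k: abs(event_times[k] - t))
        if abs(event_times[j] - t) <= max_skew:
            fused.append((t, frame, control_events[j][1]))
    return fused
```

Frames with no command close enough in time are dropped, so only observation/command pairs that are plausibly causally linked survive into the training stream.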
The robot and environment interaction module comprises a second account operation unit, a deployment feedback verification unit, a data acquisition unit and an equipment control unit. The second account operation unit is used for registering a scene registration account and submitting its information to the robot learning system module. The deployment feedback verification unit deploys the trained robot model, or a robot model stored during training, to the robot in the practical scene according to the specified task, tests the training effect of the robot model, obtains a feedback result and sends it to the robot learning system module. The data acquisition unit acquires the data streams collected by the cameras and sensors in the environment, sends them to the robot learning system module, and, in the remote control state, sends them to the human-computer interaction module through the P2P communication channel. The equipment control unit receives control instructions for the robot or other equipment from the human-computer interaction module and, through rule processing, enables single-person or multi-person cooperative control of the robot or other equipment in the scene to complete tasks.
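The "rule processing" by which the equipment control unit combines multiple users' commands is left open in the patent. One simple cooperative-control rule, shown purely as an illustration (the velocity-tuple command format is an assumption), is to drop commands from users without control rights and average the rest:

```python
def merge_commands(commands, authorized):
    """Combine per-user velocity commands into one robot command.

    commands: dict user_id -> (vx, wz) velocity request
    authorized: set of user_ids currently holding control rights
    Unauthorized commands are dropped; authorized ones are averaged.
    """
    valid = [v for u, v in commands.items() if u in authorized]
    if not valid:
        return (0.0, 0.0)  # safe default: stop the robot
    n = len(valid)
    return (sum(v[0] for v in valid) / n, sum(v[1] for v in valid) / n)
```

With a single authorized user this degenerates to direct remote control; with several, it yields one shared command, matching the single-person or multi-person cooperative modes described above.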
The robot learning system module comprises an account management unit, a data processing unit, a learning training unit and a content distribution unit. The account management unit receives information of the registered account submitted in the human-computer interaction module and information of the scene registration account submitted in the robot and environment interaction module, manages the accounts and processes payment and transaction information. The data processing unit receives the data, synchronously fused on the time axis, sent by the human-computer interaction module, filters it, and performs batch combination processing so that each batch fits within the available computing resources before the robot model is trained on it. The learning training unit trains the robot model and sends the trained robot model, or a robot model stored during training, to the robot and environment interaction module for deployment testing. The content distribution unit receives the data stream from the robot and environment interaction module and distributes it to the human-computer interaction modules in the viewing mode.
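The patent only states that batch combination keeps data within the computing-resource limit. A minimal sketch of such a greedy batcher, assuming each sample carries a known size in bytes (the byte-based budget is an illustrative choice):

```python
def make_batches(samples, max_batch_bytes):
    """Group samples into batches whose total size fits the compute budget.

    samples: iterable of (item, nbytes) pairs
    max_batch_bytes: upper bound on the summed size of one batch
    """
    batches, current, size = [], [], 0
    for item, nbytes in samples:
        if nbytes > max_batch_bytes:
            continue  # filter out samples too large to ever fit
        if size + nbytes > max_batch_bytes and current:
            batches.append(current)  # flush the full batch
            current, size = [], 0
        current.append(item)
        size += nbytes
    if current:
        batches.append(current)
    return batches
```

Each emitted batch is guaranteed not to exceed the budget, which is the property the data processing unit needs before handing batches to the learning training unit.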
In one technical scheme of the present invention, the human-computer interaction module includes not only the first account operation unit, the interaction control unit, the data synchronization fusion unit and the visualization unit, but also a peripheral docking unit, as shown in fig. 2. The peripheral docking unit is used for connecting interactive hardware equipment. The visualization unit is also used for receiving data streams from the robot and environment interaction module and the robot learning system module, decoding them, displaying the decoded streams directly on a screen or on an external display device through the peripheral docking unit, and transmitting the received data streams to the data synchronization fusion unit. The interaction control unit is also used for receiving control command data from the peripheral docking unit, sending it to the equipment control unit while transmitting it to the data synchronization fusion unit for synchronization fusion processing, and for receiving authority control commands sent by the first account operation unit. The data synchronization fusion unit receives the data streams and control command data from the visualization unit and the interaction control unit, performs synchronization fusion processing on the time axis to keep them synchronized, and then sends them to the data processing unit. The first account operation unit is used for registering an account, receiving the equipment information transmitted by the peripheral docking unit, registering the equipment and binding its information to the account, and controlling, based on the account authority, the sending authority of the interaction control unit for sending data outward.
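How the first account operation unit gates the interaction control unit's outbound sending authority is not detailed in the patent. A minimal illustrative gate (the account schema with a "role" field is hypothetical, not from the source):

```python
def can_send(account, channel):
    """Decide whether this account may send data on the given channel.

    account: dict with a "role" field, e.g. {"role": "controller"} (illustrative schema)
    channel: outbound channel name, e.g. "control" or "telemetry"
    Only accounts currently holding control rights may emit control commands.
    """
    if channel == "control":
        return account.get("role") == "controller"
    return True  # non-control traffic (e.g. telemetry upload) is unrestricted
```

A viewer account in the viewing mode would thus be blocked from issuing robot commands while still being able to upload or receive ordinary data.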
In one technical solution of the present invention, the robot and environment interaction module includes not only the second account operation unit, the deployment feedback verification unit, the data acquisition unit and the equipment control unit, but also an equipment docking unit, as shown in fig. 4. The equipment docking unit is used for connecting external equipment and obtaining its information; the external equipment includes a robot, a camera, a sensor and an actuator. The equipment docking unit receives equipment control commands from the equipment control unit and sends them to control the target external equipment; it receives robot model update commands and data sent by the deployment feedback verification unit and updates the robot model; and it receives data collected by the connected external equipment and sends the collected data to the data acquisition unit. The second account operation unit is also used for registering the robot test environment account, submitting registration information, and receiving the equipment information acquired by the equipment docking unit, which is submitted to the robot learning system module for auditing and bound to the account after auditing; it also receives from the robot learning system module the common account information of users participating in the entertainment activity and sends that information to the equipment control unit for permission control. The data acquisition unit receives the data acquired by the equipment docking unit and, after filtering, encoding and encryption, sends it to the robot learning system module and the human-computer interaction module.
In one technical solution of the present invention, the robot learning system module includes not only the account management unit, the data processing unit, the learning training unit and the content distribution unit, but also a central information analysis decision control unit, as shown in fig. 3. The central information analysis decision control unit analyzes the data, transmitted by each unit of the robot learning system module, that requires analysis and decision, and returns the analysis and decision results to the corresponding unit. Specifically, the account management unit receives external account registration, login and transaction information and manages existing accounts; it transmits the information of registered accounts submitted in the human-computer interaction module and of scene registration accounts submitted in the robot and environment interaction module to the central information analysis decision control unit for analysis, and the returned results assist account management, such as screening abnormal accounts, to guarantee account quality.
The data processing unit receives the data prepared for training sent by the human-computer interaction module and the equipment docking unit, transmits it to the central information analysis decision control unit for analysis, and, according to the returned decision result, performs filtering and batch combination; batches that meet the requirements are then transmitted to the learning training unit for training. The learning training unit receives the data from the data processing unit, inputs it into the robot model for training, and sends parameters generated during training to the robot under the actual task for deployment testing; the returned feedback result and other training-related information are sent to the central information analysis decision control unit for analysis, and the training parameters and process are automatically adjusted according to the decision result. The trained result is tested, verified and fed back, iterating repeatedly until the robot model meets the task requirements, thereby completing the training process. The content distribution unit is also used for receiving external video stream data, sampling it, and sending the samples to the central information analysis decision control unit for analysis and content verification; verified external video streams are then distributed to the human-computer interaction modules in the viewing mode.
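The train, deploy-test, feedback, adjust loop described above can be sketched as follows. The concrete callables and the `params` dictionary are illustrative placeholders; the patent specifies the loop structure, not these interfaces:

```python
def train_until_satisfied(model, train_step, deploy_and_test, adjust,
                          target_score, max_rounds=100):
    """Iterate train -> deploy -> feedback -> adjust until the task is met.

    train_step(model, params) -> updated model (learning training unit)
    deploy_and_test(model)    -> score (feedback from the real scene)
    adjust(params, score)     -> params (central analysis/decision step)
    """
    params = {"lr": 0.01}  # illustrative training parameters
    for _ in range(max_rounds):
        model = train_step(model, params)
        score = deploy_and_test(model)
        if score >= target_score:
            return model, score  # task requirement met; stop iterating
        params = adjust(params, score)
    return model, score  # give up after max_rounds iterations
```

The key design point, per the description, is that `deploy_and_test` runs on the real robot in the entertainment scene rather than on a held-out offline dataset, so the stopping criterion reflects real-world task performance.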
The specific implementation process of the scheme provided by the invention is as follows. The robot learning system module is deployed on a cloud server; the human-computer interaction module is deployed in an application program for a PC or mobile device, and the application's installation package is uploaded to the Internet so that ordinary users can easily obtain it. Once connected to the Internet, the application connects to the cloud server on which the robot learning system module is deployed, enabling operations such as account registration, login and browsing of distributed content resources. The robot and environment interaction module is integrated into an application program on a PC or embedded box device, which can use existing communication technology to connect various hardware devices, including robots, cameras, sensors and actuators, to build an entertainment scene. After accessing the Internet, this application also connects to the cloud server; a user building an entertainment scene can register and log in to an account and publish the entertainment scene to the cloud-server robot learning system module as a content resource, which the content distribution unit distributes to the visualization units in the human-computer interaction modules of interested ordinary users for browsing and optional participation.
Ordinary users pay through the account operation unit to purchase bidding permission and obtain control rights after regular bidding and winning the bid; one or more users may hold control rights at a time. A user holding control rights can send control instructions, through the interaction control unit in the human-computer interaction module of the application program, to the equipment control unit in the robot and environment interaction module, thereby controlling the robots or equipment in the entertainment scene and participating in the interactive entertainment activity.
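The bidding mechanism itself is not pinned down in the patent. As a sketch under the assumptions stated in the text (the top bidders win control rights; losing entry fees are not refunded and accumulate toward the prize pool), one could award rights like this:

```python
def award_control(bids, k=1):
    """Award control rights to the k highest bidders.

    bids: dict user_id -> bid amount (the paid entry fee)
    k: number of control rights available in this round
    Returns (winner ids, total of non-refunded losing bids).
    """
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winners = [u for u, _ in ranked[:k]]
    losers_total = sum(b for _, b in ranked[k:])  # accrues to the prize pool
    return winners, losers_total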
The content of the entertainment activity is collected by the data acquisition unit, sent to the content distribution unit of the server-side robot learning system module, sampled and sent to the central information analysis decision control unit for analysis and verification, and then shared as a video stream to existing live broadcast platforms or to the N ordinary users in the viewing mode, attracting more users to watch or participate. So that the robot learning system can obtain a data volume sufficient for the desired task, the entertainment activity is set up in the mode of operating the robot to complete that task.
Using the system of this technical scheme, during the entertainment process the environmental data collected by the robot and the sensors in the entertainment scene, the extra tag data generated under the driving of the game rules, and the control command data sent by users are synchronously fused by the data synchronization fusion unit of the human-computer interaction module and sent to the data processing unit of the server-side robot learning system module; after filtering and preprocessing, the data are combined into batches and sent to the learning training unit for learning and training of the robot model. The learning training unit sends robot model data to the deployment feedback verification unit in the robot and environment interaction module, which deploys the intelligent model directly onto the robot body in the entertainment scene and carries out automatic application testing to obtain field sensor data or audience feedback data returned by the data acquisition unit. The central information analysis decision control unit analyzes this feedback and decides on a training scheme, which is sent to the learning training unit to adjust the training parameters and process; this loop iterates continuously until the task requirements are met.
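The core idea above — the user's control command serves as the supervision label for the sensor observation at the same instant — amounts to building a behaviour-cloning dataset. A minimal sketch, assuming fused records of the form produced by the synchronization step (the idle-command filter is an illustrative cleanup, not from the patent):

```python
def to_training_pairs(fused, idle=("stop",)):
    """Convert fused records into supervised examples for behaviour cloning.

    fused: list of (timestamp, sensor_frame, user_command) records
    idle: commands dropped so the dataset is not dominated by inaction
    The user's command is the label for the observation at that instant.
    """
    return [(frame, cmd) for _, frame, cmd in fused if cmd not in idle]
```

Each resulting (observation, command) pair is a labelled training example obtained as a by-product of play, which is exactly the low-cost labelling mechanism the invention claims.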
As an example, the robot learning system of the invention is used to train a rescue robot as follows. The scene is set as an earthquake disaster-relief scene: tables, chairs, cabinets and sundries in a room have fallen over in a scattered manner, and remotely controllable wheels are installed where some tables and chairs touch the ground. A user who wins a paid bid through the Internet obtains the right to control these wheels and can change the positions of the partially fallen tables and chairs on the ground, increasing or reducing the difficulty of the robot crossing the barriers. The challenge task is to have a quadruped robot traverse this room to another room in the shortest time. A user willing to participate in the challenge pays the bid, obtains control of the robot, and remotely controls it to cross the obstacles and complete the task within the set time in order to advance to the next round. After several rounds of selection within a certain time, the game produces several winners who receive the prize pool. The prize pool may be funded by sponsors, whose advertisements are placed in the scene, or by the accumulated, non-refundable fees of the many users who bid but did not win, as agreed in advance. The game activity can be held continuously, with many users participating. The environmental perception data and control command data generated by the competition can then be used to train the rescue robot, so that it acquires the ability to cross complex obstacles in a disaster-relief scene.
As the robot's learned ability to cross complex obstacles grows stronger, the challenge scene is designed to be correspondingly more complex. By continually challenging these complex scenes in a semi-automatic, remotely human-assisted mode, the robot keeps learning and training and approaches human-level performance; once the disaster relief task requirements are met, the competition for that scene can be retired.
By implementing this scheme, labeled robot training data can be obtained at low cost for the learning and training of robot models, advancing machine intelligence toward further breakthroughs and more practical applications.

Claims (7)

1. A robot learning system based on an Internet of things interactive entertainment mode, characterized by comprising: a human-computer interaction module, a robot and environment interaction module, and a robot learning system module; the human-computer interaction module, the robot and environment interaction module, and the robot learning system module are mutually connected in data communication;
the human-computer interaction module comprises a first account operation unit, an interaction control unit, a data synchronization fusion unit and a visualization unit, wherein the first account operation unit is used by a user to register an account and to submit information of the registered account to the robot learning system module; the interaction control unit sends control commands to the robot and environment interaction module through a P2P communication channel to control the robot and other equipment in the scene; the data synchronization fusion unit is used for sending the data stream, synchronously fused on the time axis, to the robot learning system module; the visualization unit is used for receiving a live video stream distributed by the robot learning system module or a data stream sent by the robot and environment interaction module and performing visualization processing;
the robot and environment interaction module comprises a second account operation unit, a deployment verification feedback unit, a data acquisition unit and an equipment control unit; the second account operation unit is used for registering a scene registration account and submitting the information of the scene registration account to the robot learning system module; the deployment verification feedback unit deploys the trained robot model, or a robot model stored during the training process, onto the robot in the actual scene according to the specified task, tests the training effect of the robot model, obtains a feedback result, and sends the feedback result to the robot learning system module; the data acquisition unit is used for acquiring data streams collected by cameras and sensors in the environment and sending the data streams to the robot learning system module and the human-computer interaction module; the equipment control unit is used for receiving control instructions for the robot or other equipment from the human-computer interaction module and controlling the robot or other equipment in the scene to complete tasks;
the robot learning system module comprises an account management unit, a data processing unit, a learning training unit and a content distribution unit, wherein the account management unit is used for receiving information of a registration account submitted in the man-machine interaction module and information of a scene registration account submitted in the robot and environment interaction module, managing account login and processing payment and transaction information; the data processing unit is used for receiving the data stream which is sent by the man-machine interaction module and synchronously fused on the time axis, and performing filtering and batch combination processing; the learning and training unit is used for training a robot model and sending the trained robot model or the robot model stored in the training process to the robot and environment interaction module for deployment test; the content distribution unit is used for receiving the data stream from the robot and environment interaction module and distributing the data stream to the man-machine interaction module;
the interaction control unit is also used for receiving control command data of the peripheral docking unit, sending the control command data to the equipment control unit and simultaneously transmitting the control command data to the data synchronous fusion unit for synchronous fusion processing;
the data processing unit performs batch combination processing on the data for training and then transmits the data to the learning training unit for training.
2. The Internet of things interactive entertainment mode-based robot learning system of claim 1, wherein the human-computer interaction module provides a remote control mode and a spectator mode.
3. The Internet of things interactive entertainment mode-based robot learning system of claim 1, wherein the human-computer interaction module further comprises a peripheral docking unit for connecting with interactive hardware devices; the visualization unit is further used for receiving data streams from the robot and environment interaction module and the robot learning system module, decoding them, displaying the decoded data streams directly on a screen or on an external display device through the peripheral docking unit, and transmitting the received data streams to the data synchronization fusion unit; and for receiving an authority control command sent by the first account operation unit; the data synchronization fusion unit is used for receiving the data streams and control command data from the visualization unit and the interaction control unit, performing synchronous fusion processing on the time axis, and then sending the result to the data processing unit; the first account operation unit receives the device information transmitted by the peripheral docking unit, registers the device, binds the device with the account, and, based on the account authority, controls the interaction control unit's authority to send data externally.
4. The Internet of things interactive entertainment mode-based robot learning system of claim 3, wherein the interactive hardware devices comprise projectors, displays, XR devices, and game peripherals.
5. The robot learning system based on the internet of things interactive entertainment mode as claimed in claim 1, wherein the robot learning system module further comprises a central information analysis and decision control unit, and the central information analysis and decision control unit is configured to receive data of a decision to be analyzed from each unit of the robot learning system module, perform analysis and decision processing, and return an analysis and decision result to a corresponding unit in the robot learning system module.
6. The robot learning system based on the internet of things interactive entertainment mode as claimed in claim 1, wherein the robot and environment interaction module further comprises an equipment docking unit, the equipment docking unit is used for connecting external equipment, acquiring information of the external equipment, receiving an equipment control command of the equipment control unit, sending the control command to control target external equipment, receiving a robot model update command and data sent by the deployment verification feedback unit, receiving data collected by the connected external equipment, and sending the collected data to the data collection unit; the second account operation unit is also used for registering the robot test environment account, submitting registration information, receiving equipment information acquired by the equipment docking unit, receiving common account information participating in entertainment activities from the robot learning system module, and sending the common account information participating in the entertainment activities to the equipment control unit for authority control; the data acquisition unit is used for receiving the data acquired by the equipment docking unit, filtering, encoding, encrypting and then sending the data to the robot learning system module and the human-computer interaction module.
7. The Internet of things interactive entertainment mode-based robot learning system of claim 6, wherein the external devices comprise robots, cameras, sensors, and actuators.
CN202011534704.9A 2020-12-23 2020-12-23 Robot learning system based on Internet of things interactive entertainment mode Active CN112702423B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011534704.9A CN112702423B (en) 2020-12-23 2020-12-23 Robot learning system based on Internet of things interactive entertainment mode

Publications (2)

Publication Number Publication Date
CN112702423A CN112702423A (en) 2021-04-23
CN112702423B true CN112702423B (en) 2022-05-03

Family

ID=75510894

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011534704.9A Active CN112702423B (en) 2020-12-23 2020-12-23 Robot learning system based on Internet of things interactive entertainment mode

Country Status (1)

Country Link
CN (1) CN112702423B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113630458A (en) * 2021-08-04 2021-11-09 北京电信规划设计院有限公司 Multi-scene multi-terminal robot management system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109407518A (en) * 2018-12-20 2019-03-01 山东大学 The autonomous cognitive approach of home-services robot operating status and system
WO2019113510A1 (en) * 2017-12-07 2019-06-13 Bluhaptics, Inc. Techniques for training machine learning
CN110084307A (en) * 2019-04-30 2019-08-02 东北大学 A kind of mobile robot visual follower method based on deeply study
CN110465947A (en) * 2019-08-20 2019-11-19 苏州博众机器人有限公司 Multi-modal fusion man-machine interaction method, device, storage medium, terminal and system
CN111914069A (en) * 2019-05-10 2020-11-10 京东方科技集团股份有限公司 Training method and device, dialogue processing method and system and medium

Also Published As

Publication number Publication date
CN112702423A (en) 2021-04-23

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant