CN112036819B - Interaction method and device based on task image, medium and electronic equipment - Google Patents


Info

Publication number
CN112036819B
CN112036819B (application CN202010857156.7A; earlier publication CN112036819A)
Authority
CN
China
Prior art keywords
information
terminal
partner
task
actual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010857156.7A
Other languages
Chinese (zh)
Other versions
CN112036819A (en)
Inventor
许巧龄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intelligent Chuanggu Beijing Technology Co ltd
Original Assignee
Intelligent Chuanggu Beijing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intelligent Chuanggu Beijing Technology Co ltd
Priority to CN202010857156.7A
Publication of CN112036819A
Application granted
Publication of CN112036819B
Legal status: Active


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/105Human resources
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/20Education
    • G06Q50/205Education administration or guidance
    • G06Q50/2057Career enhancement or continuing education service
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Abstract

The disclosure provides a task-image-based interaction method, device, medium and electronic equipment. A user logs in to an APP through a terminal, performs interactive operations on the APP interface according to information prompts, and the interaction process is sent to a server. The server automatically generates training task information from the number of actual participant terminals and each terminal's unique feature information, so that the actual participants can produce partner determination information from the task information. The complexity and diversity of the task information create different levels of training difficulty: actual participants communicate through their terminals, convey information accurately, match information precisely based on that communication, and finally obtain successful partner determination information.

Description

Interaction method and device based on task image, medium and electronic equipment
Technical Field
The disclosure relates to the technical field of computers, in particular to an interaction method, device, medium and electronic equipment based on task images.
Background
Improving the efficiency of an enterprise team is no longer a matter of individual skill; how to improve team quality, and thereby performance, has become an urgent question for enterprise operators. Existing methods for improving team quality usually rely on face-to-face communication and team cooperation to complete a task. Such methods require gathering personnel for training, sufficient venues, and centralized teaching, so training efficiency is low.
With the development of internet technology, training can now be conducted online, for example via live video. However, this is still a simple transplant of offline training: training efficiency remains low, the mode of interaction still resembles the offline approach, and the improvement in training effect is not obvious.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
The disclosure aims to provide a task-image-based interaction method, device, medium and electronic device that solve at least one of the above technical problems. The specific scheme is as follows:
According to a specific embodiment of the present disclosure, in a first aspect, the present disclosure provides a task-image-based interaction method, including:
acquiring the number of actual participant terminals and the terminal unique feature information;
acquiring, based on the number of actual participant terminals, a task information set consisting of a corresponding number of task information items from a comprehensive task information set, where each task information item includes image unique feature information, an image type, and a task image corresponding to the image type, and any task image has a partner image of the same image type;
randomly establishing a one-to-one mapping between the terminal unique feature information and the image unique feature information;
sending, based on the one-to-one mapping, the task information corresponding to the image unique feature information to the actual participant terminal corresponding to the terminal unique feature information, and sending the unique feature information of the other actual participant terminals to that terminal;
receiving partner determination information sent by each actual participant terminal, where the partner determination information includes the terminal unique feature information of the actual participant terminal sending it, the image type, and the terminal unique feature information of the partner participant terminal having the same image type.
According to a second aspect of the present disclosure, there is provided a task-image-based interaction device, including:
a basic information unit, configured to acquire the number of actual participant terminals and the terminal unique feature information;
a task information acquisition unit, configured to acquire, based on the number of actual participant terminals, a task information set consisting of a corresponding number of task information items from a comprehensive task information set, where each task information item includes image unique feature information, an image type, and a task image corresponding to the image type, and any task image has a partner image of the same image type;
a mapping relation establishing unit, configured to randomly establish a one-to-one mapping between the terminal unique feature information and the image unique feature information;
a sending unit, configured to send, based on the one-to-one mapping, the task information corresponding to the image unique feature information to the actual participant terminal corresponding to the terminal unique feature information, and to send the unique feature information of the other actual participant terminals to that terminal;
a receiving unit, configured to receive the partner determination information sent by each actual participant terminal, where the partner determination information includes the terminal unique feature information of the actual participant terminal sending it, the image type, and the terminal unique feature information of the partner participant terminal having the same image type.
According to a third aspect of the disclosure, there is provided a computer readable storage medium having stored thereon a computer program which when executed by a processor implements the interaction method according to any of the first aspects.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: one or more processors; storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the interaction method of any of the first aspects.
Compared with the prior art, the scheme of the embodiment of the disclosure has at least the following beneficial effects:
the disclosure provides an interaction method, device, medium and electronic equipment based on task images. The user logs in the APP through the terminal, performs interactive operation according to information prompts on an APP interface, sends an interactive process to a server, and automatically generates task information for training through the number of actual participant terminals and unique characteristic information of the terminals so that the actual participant generates partner judgment information through the task information. Through the complexity and diversity of task information, different levels of training difficulty are created, so that an actual participant can conduct information communication through a terminal, accurately transmit communication information, realize accurate information matching according to the communication information, and finally obtain successful partner judgment information.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale. In the drawings:
FIG. 1 illustrates a flow chart of a task image based interaction method according to an embodiment of the present disclosure;
FIG. 2 illustrates a task image 1 of a task image-based interaction method according to an embodiment of the present disclosure;
FIG. 3 illustrates a task image 2 of a task image-based interaction method according to an embodiment of the present disclosure;
FIG. 4 illustrates a block diagram of a unit of a task image based interaction device, in accordance with an embodiment of the present disclosure;
fig. 5 shows a schematic diagram of an electronic device connection structure according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure have been shown in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but are provided to provide a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are intended to be open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "one" and "a plurality" in this disclosure are intended to be illustrative rather than limiting; those of ordinary skill in the art will appreciate that "one" should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
Alternative embodiments of the present disclosure are described in detail below with reference to the drawings.
The first embodiment provided by the present disclosure is an embodiment of an interaction method based on task images.
Embodiments of the present disclosure are described in detail below with reference to fig. 1 through 3.
Step S101: acquiring the number of actual participant terminals and the terminal unique feature information.
In the embodiment of the disclosure, a task server randomly distributes the task information, the participant terminals execute the tasks, and the execution results are returned to the task server.
The terminal unique feature information guarantees the uniqueness of each actual participant terminal: it corresponds one-to-one with the actual participant terminal, so the terminal can be located through it. The terminal unique feature information includes user identity information or the IP address of the actual participant terminal.
Since the task server distributes task information randomly according to the number of participants, acquiring the number of actual participant terminals and the terminal unique feature information includes the following steps:
Step S101-1: acquiring participant login information from the actual participant terminals within a preset login time.
The preset login time ensures the normal running of the training: a user who fails to log in within the preset login time cannot participate, and the limit also ensures that the task server can generate the task information effectively.
The participant login information includes user identity information and the IP address of the actual participant terminal.
Step S101-2: acquiring the number of actual participant terminals and the terminal unique feature information based on the participant login information.
The number of actual participant terminals can be obtained by counting the user identity information or the IP addresses of the actual participant terminals.
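Steps S101-1 and S101-2 can be sketched as follows. This is a minimal illustration only; the function and field names, the login-stream format, and the monotonic-clock deadline are assumptions, not part of the patent:

```python
import time

def collect_logins(login_stream, preset_login_seconds, now=time.monotonic):
    """Step S101-1 (illustrative): keep only logins that arrive within the
    preset login time. login_stream yields (timestamp, user_id, ip_address)."""
    deadline = now() + preset_login_seconds
    logins = {}
    for ts, user_id, ip in login_stream:
        if ts > deadline:
            break  # logins after the preset time cannot join the training
        logins[user_id] = ip  # one entry per actual participant terminal
    return logins

def terminal_basics(logins):
    """Step S101-2 (illustrative): derive the number of actual participant
    terminals and their unique feature information (user ID plus IP)."""
    unique_features = [(uid, ip) for uid, ip in logins.items()]
    return len(unique_features), unique_features
```

Counting distinct user IDs (or IP addresses) directly gives the terminal count used in step S102.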
Step S102: acquiring, based on the number of actual participant terminals, a task information set consisting of a corresponding number of task information items from the comprehensive task information set.
The comprehensive task information set stores various task information. Each time, the task server generates a task information set from the comprehensive set according to the number of actual participant terminals; the number of task information items equals the number of actual participant terminals, so that each actual participant terminal receives exactly one task information item.
The task information includes: image unique feature information, an image type, and a task image corresponding to the image type.
The image unique feature information distinguishes each task image. The task images in a training session are similar: for example, the task images shown in fig. 2 and 3 are both butterfly-like images, but several different butterfly images exist in the task information set, and any task image has a partner image of the same image type, i.e., a task image showing the same butterfly.
The embodiment of the disclosure aims to let the actual participants communicate back to back through their terminals and then determine which other actual participant terminals hold the same task image as their own.
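One way step S102 might be realized is sketched below, assuming the comprehensive set is grouped by image type and the participant count is even so every drawn image has a partner of the same type. All names and the data layout are illustrative assumptions:

```python
import random

def build_task_set(comprehensive, n, rng=random):
    """Draw a task information set of exactly n items so that every task
    image has at least one partner image of the same image type.
    comprehensive: dict mapping image_type -> list of task items, where each
    item is (image_unique_id, image_type, task_image). Assumes n is even and
    each type offers at least two items."""
    types = list(comprehensive)
    rng.shuffle(types)  # vary which image types appear in each training
    chosen = []
    for image_type in types:
        if len(chosen) >= n:
            break
        # take two items of the same type so each has a partner image
        chosen.extend(comprehensive[image_type][:2])
    return chosen[:n]
```

Drawing items in same-type pairs is what guarantees the "partner image" property the method relies on.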
Step S103: randomly establishing a one-to-one mapping between the terminal unique feature information and the image unique feature information.
Because the task server randomly distributes the task images, each actual participant terminal holds exactly one task image, so there is a one-to-one mapping between the terminal unique feature information and the image unique feature information.
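The random one-to-one assignment of step S103 amounts to a shuffle and zip; this sketch uses illustrative names only:

```python
import random

def random_one_to_one(terminal_ids, image_ids, rng=random):
    """Step S103 (illustrative): randomly pair each terminal's unique
    feature information with one image's unique feature information."""
    assert len(terminal_ids) == len(image_ids)  # one task image per terminal
    shuffled = list(image_ids)
    rng.shuffle(shuffled)
    return dict(zip(terminal_ids, shuffled))
```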
Step S104: sending, based on the one-to-one mapping, the task information corresponding to the image unique feature information to the actual participant terminal corresponding to the terminal unique feature information, and sending the unique feature information of the other actual participant terminals to that terminal.
For example, each actual participant terminal runs a piece of social software that displays the task image distributed to that terminal along with a list of all actual participants. All members of the list can exchange information about the task images they hold in a chat room; any two actual participants can also chat one-to-one; each speaker is identified by the terminal unique feature information.
Accordingly, embodiments of the present disclosure further include the steps of:
step S104-1, acquiring the compliance communication information sent by the practical participant terminal based on the task image in the preset communication time, and sending the compliance communication information to other practical participant terminals.
The preset communication time is generally set according to the training difficulty, and the more the number of actual participant terminals is, the more the image types are, the more the task images are complex, and the longer the preset communication time is. For example, the number of actual participant terminals is 30, the image types are 6, the task images are butterflies, and the preset communication time is set to 20 minutes.
The determination is required to be made immediately after the preset communication time.
The practical participant only has one judgment opportunity to complete the task.
Optionally, the compliance information includes text information and/or voice information.
The step of obtaining the compliance communication information sent by the practical participant terminal based on the task image comprises the following steps:
and step S104-1-1, acquiring communication information sent by the practical participant based on the task image.
Communicating information, comprising: text information, pictures, video information, and/or voice information.
Step S104-1-2, filtering the communication information to obtain text information and/or voice information in the communication information.
After the communication information is filtered, the compliance communication information is sent to other trainees, and the trainees are prevented from transmitting information through images. The embodiment of the disclosure takes the text information and the voice information as compliance communication information, so that an actual participant can accurately transmit the information through language skills, and all team members can realize common understanding of task images.
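The filtering of step S104-1-2 reduces to keeping only the compliant message kinds; the message tuple layout and names below are assumptions for illustration:

```python
# Only text and voice are compliant; pictures and video are dropped so
# participants cannot simply show their task image to each other.
COMPLIANT_KINDS = {"text", "voice"}

def filter_compliant(messages):
    """Step S104-1-2 (illustrative): messages is an iterable of
    (sender_terminal_id, kind, payload) tuples; return only the
    compliant communication information."""
    return [m for m in messages if m[1] in COMPLIANT_KINDS]
```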
Step S105: receiving the partner determination information sent by each actual participant terminal.
The partner determination information includes: the terminal unique feature information of the actual participant terminal sending it, the image type, and the terminal unique feature information of the partner participant terminal having the same image type.
A partner participant terminal is another actual participant terminal that holds the same task image (or image type) as the actual participant terminal sending the partner determination information.
Optionally, after receiving the partner determination information sent by each actual participant terminal, the method further includes the following steps:
Step S106: classifying the task information in the task information set by image type to obtain task information groups of the same image type.
The task information items in each task information group share the same image type.
Step S107: obtaining partner control information based on the task information groups and the one-to-one mapping.
The partner control information includes: the terminal unique feature information of the actual participant terminal, the image type, and the terminal unique feature information of the partner participant terminal.
Step S108: comparing the partner determination information with the partner control information to obtain the successfully compared partner determination information and the partner determination information that failed comparison.
Successfully compared partner determination information is identical to the partner control information, indicating a successful comparison, i.e., the actual participant's training result is qualified.
Partner determination information that failed comparison is not completely identical to the partner control information, indicating a failed comparison, i.e., the actual participant's training result is not qualified.
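Steps S106 through S108 can be sketched together: group task items by image type, derive the partner control information from the step-S103 mapping, then compare each terminal's claim against it. The names and data shapes are illustrative assumptions:

```python
from collections import defaultdict

def control_info(task_items, terminal_to_image):
    """Steps S106-S107 (illustrative). task_items: iterable of
    (image_unique_id, image_type, task_image); terminal_to_image: the
    one-to-one mapping from step S103. Returns, per terminal, the image
    type and the set of partner terminals holding the same type."""
    image_to_terminal = {img: term for term, img in terminal_to_image.items()}
    groups = defaultdict(set)  # image_type -> terminals holding that type
    for image_id, image_type, _ in task_items:
        groups[image_type].add(image_to_terminal[image_id])
    control = {}
    for image_type, terminals in groups.items():
        for term in terminals:
            control[term] = (image_type, terminals - {term})
    return control

def compare(determinations, control):
    """Step S108 (illustrative). determinations: {terminal_id:
    (claimed_image_type, set_of_claimed_partner_ids)}. A claim passes only
    if it is identical to the control entry; otherwise it fails."""
    passed, failed = [], []
    for term, claim in determinations.items():
        (passed if claim == control.get(term) else failed).append(term)
    return passed, failed
```

The failed list is exactly what the optional steps after S108 report back to the corresponding terminals.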
Optionally, after obtaining the partner determination information that failed comparison, the method further includes the following steps:
Step S109: obtaining, from the partner determination information that failed comparison, the terminal unique feature information of the actual participant terminal that sent it and the terminal unique feature information of the mismatched partner participant terminal.
After the terminal unique feature information of the actual participant terminal and the terminal unique feature information of the mismatched partner participant terminal are obtained from the partner determination information that failed comparison, the method further includes the following step:
Step S110: sending the terminal unique feature information of the mismatched partner participant terminal to the actual participant terminal, based on the terminal unique feature information of the actual participant terminal that sent the partner determination information,
so as to prompt that terminal as to which of its determinations were erroneous.
The embodiment of the disclosure provides a task-image-based interaction method. A user logs in to an APP through a terminal, performs interactive operations on the APP interface according to information prompts, and the interaction process is sent to a server. The server automatically generates training task information from the number of actual participant terminals and each terminal's unique feature information, so that the actual participants can produce partner determination information from the task information. The complexity and diversity of the task information create different levels of training difficulty: actual participants communicate through their terminals, convey information accurately, match information precisely based on that communication, and finally obtain successful partner determination information.
Corresponding to the first embodiment provided by the present disclosure, the present disclosure also provides a second embodiment, i.e., an interaction device based on task images. Since the second embodiment is substantially similar to the first embodiment, the description is relatively simple, and the relevant portions will be referred to the corresponding descriptions of the first embodiment. The device embodiments described below are merely illustrative.
Fig. 4 illustrates an embodiment of a task image based interaction device provided by the present disclosure.
As shown in fig. 4, the present disclosure provides an interaction device based on task images, including:
the basic information acquisition unit 401 is used for acquiring the number of actual parameter training person terminals and the unique characteristic information of the terminals;
an acquisition task information set unit 402, configured to acquire a task information set composed of a corresponding number of task information from a comprehensive task information set based on the number of actual participant terminals; the task information includes: the method comprises the steps of image unique feature information, image types and task images corresponding to the image types, wherein any task image has partner images with the same image type;
a mapping relationship establishing unit 403, configured to randomly establish a one-to-one mapping relationship between the unique terminal feature information and the unique image feature information;
a sending unit 404, configured to send task information corresponding to the image unique feature information to an actual participant terminal corresponding to the terminal unique feature information based on the one-to-one mapping relationship, and send the terminal unique feature information of other actual participant terminals to the actual participant terminal;
a receiving unit 405, configured to receive partner determination information sent by each actual participant terminal; the partner determination information includes: and transmitting the terminal unique characteristic information of the actual participant terminal of the partner judging information, the image type and the terminal unique characteristic information of the partner participant terminal with the same image type.
Optionally, the basic information acquisition unit 401 includes:
a participant login information acquisition subunit, configured to acquire the participant login information of the actual participant terminals within a preset login time;
a basic information acquisition subunit, configured to acquire the number of actual participant terminals and the terminal unique feature information based on the participant login information.
Optionally, the interaction device further includes:
and the compliance exchanging unit is used for acquiring compliance exchanging information sent by the practical parameter trainer terminal based on the task image in preset communication time before receiving the partner judging information sent by each practical parameter trainer terminal, and sending the compliance exchanging information to other practical parameter trainer terminals.
Optionally, the compliant communication information includes text information and/or voice information;
the compliant communication unit includes:
a communication information acquisition subunit, configured to acquire the communication information sent by the actual participants based on the task images;
a communication information filtering subunit, configured to filter the communication information and obtain the text information and/or voice information within it.
Optionally, the interaction device further includes:
the classification unit is used for classifying the task information in the task information set based on the image type after receiving the partner judgment information sent by each actual participant terminal, and acquiring a task information group with the same image type;
the partner control information acquisition unit is used for acquiring partner control information based on the task information group and the one-to-one mapping relation; the partner control information includes: the method comprises the steps of (1) terminal unique characteristic information of an actual parameter trainer terminal, an image type and terminal unique characteristic information of a partner parameter trainer terminal;
and the comparison unit is used for comparing the partner judging information with the partner comparison information to acquire successfully-compared partner judging information and failed-compared partner judging information.
Optionally, the interaction device further includes:
and the acquisition failure information unit is used for acquiring the terminal unique characteristic information of the actual participant terminal sending the partner judgment information and the terminal unique characteristic information of the partner participant terminal failing to be compared from the partner judgment information failing to be compared after the partner judgment information failing to be compared is acquired.
Optionally, the interaction device further includes:
and the transmission failure information unit is used for transmitting the terminal unique characteristic information of the partner participant terminal to the actual participant terminal based on the terminal unique characteristic information of the actual participant terminal which transmits the partner determination information after acquiring the terminal unique characteristic information of the actual participant terminal and the terminal unique characteristic information of the partner participant terminal which fails in comparison from the partner determination information which fails in comparison.
The embodiment of the disclosure provides a task-image-based interaction device. A user logs in to the APP through a terminal, performs interaction operations according to the prompts on the APP interface, and the interaction process is sent to the server. Task information for training is generated automatically from the number of actual participant terminals and their terminal unique feature information, so that the actual participants generate partner determination information from the task information. The complexity and diversity of the task information create different levels of training difficulty, so that the actual participants communicate through their terminals, transmit the communicated information accurately, achieve accurate information matching based on that communication, and finally obtain partner determination information that compares successfully.
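The server-side setup described in this paragraph (count the logged-in terminals, draw that many task information items, randomly establish the one-to-one terminal-to-image mapping) can be sketched as below. The function and parameter names are assumptions for illustration, and the task pool is assumed to hold image types in matched pairs:

```python
# Illustrative sketch (assumed names, not the patented implementation) of
# task generation and the random one-to-one mapping.
import random

def generate_tasks(terminal_ids, task_pool):
    """Return a dict mapping each terminal's unique feature information to
    one task information item, assigned uniformly at random."""
    n = len(terminal_ids)
    if n % 2 != 0 or n > len(task_pool):
        raise ValueError("need an even terminal count covered by the pool")
    tasks = task_pool[:n]     # the task information set (image types paired)
    random.shuffle(tasks)     # random one-to-one mapping
    return dict(zip(terminal_ids, tasks))
```

Whatever the shuffle order, every terminal receives exactly one task and every drawn task is assigned exactly once, which is the bijection the one-to-one mapping requires.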
The embodiment of the disclosure provides a third embodiment, namely an electronic device for the task-image-based interaction method, comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the task-image-based interaction method as described in the first embodiment.
The present disclosure provides a fourth embodiment, namely a computer storage medium storing computer-executable instructions that, when executed, perform the task-image-based interaction method as described in the first embodiment.
Referring now to fig. 5, a schematic diagram of an electronic device suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 5 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 5, the electronic device may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 501, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data required for the operation of the electronic device are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
In general, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 507 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 508 including, for example, magnetic tape, hard disk, etc.; and communication means 509. The communication means 509 may allow the electronic device to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 shows an electronic device having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or from the storage means 508, or from the ROM 502. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 501.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients and servers may communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed networks.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by software or by hardware. The name of a unit does not, in some cases, constitute a limitation of the unit itself.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to the specific combinations of features described above, but also covers other embodiments formed by any combination of the features described above, or their equivalents, without departing from the spirit of the disclosure; for example, embodiments formed by substituting the above features with (but not limited to) technical features of similar function disclosed in the present disclosure.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (6)

1. An interaction method based on task images, characterized by comprising the following steps:
acquiring the number of actual participant terminals and their terminal unique feature information;
acquiring, based on the number of actual participant terminals, a task information set consisting of a corresponding number of task information items from a comprehensive task information set; the task information includes: image unique feature information, an image type, and a task image corresponding to the image type, wherein any task image has a partner image of the same image type;
randomly establishing a one-to-one mapping relation between the terminal unique feature information and the image unique feature information;
transmitting, based on the one-to-one mapping relation, the task information corresponding to the image unique feature information to the actual participant terminal corresponding to the terminal unique feature information, and transmitting the terminal unique feature information of the other actual participant terminals to that actual participant terminal;
receiving the partner determination information sent by each actual participant terminal; the partner determination information includes: the terminal unique feature information of the actual participant terminal that sent the partner determination information, the image type, and the terminal unique feature information of the partner participant terminal having the same image type;
before receiving the partner determination information sent by each actual participant terminal, the method further comprises:
acquiring, within a preset communication time, the compliant communication information sent by the actual participant terminal based on the task image, and sending the compliant communication information to the other actual participant terminals;
the compliant communication information comprises text information and/or voice information;
the acquiring of the compliant communication information sent by the actual participant terminal based on the task image comprises:
acquiring the communication information sent by the actual participant based on the task image;
filtering the communication information to obtain the text information and/or voice information in the communication information;
after receiving the partner determination information sent by each actual participant terminal, the method further comprises:
classifying the task information in the task information set by image type to obtain task information groups with the same image type;
acquiring partner comparison information based on the task information groups and the one-to-one mapping relation; the partner comparison information includes: the terminal unique feature information of the actual participant terminal, the image type, and the terminal unique feature information of the partner participant terminal;
and comparing the partner determination information with the partner comparison information to obtain the partner determination information whose comparison succeeded and the partner determination information whose comparison failed.
2. The interaction method according to claim 1, wherein the acquiring of the number of actual participant terminals and the terminal unique feature information includes:
acquiring the participation login information of the actual participant terminals within a preset login time;
and acquiring the number of actual participant terminals and the terminal unique feature information based on the participation login information.
3. The interaction method according to claim 1, further comprising, after the partner determination information whose comparison failed is acquired:
acquiring, from the failed partner determination information, the terminal unique feature information of the actual participant terminal that sent the partner determination information and the terminal unique feature information of the partner participant terminal whose comparison failed.
4. The interaction method according to claim 3, further comprising, after the terminal unique feature information of the actual participant terminal and the terminal unique feature information of the partner participant terminal whose comparison failed are acquired from the failed partner determination information:
transmitting the terminal unique feature information of the partner participant terminal to the actual participant terminal based on the terminal unique feature information of the actual participant terminal that sent the partner determination information.
5. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the interaction method according to any of claims 1 to 4.
6. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the interaction method of any of claims 1 to 4.
CN202010857156.7A 2020-08-24 2020-08-24 Interaction method and device based on task image, medium and electronic equipment Active CN112036819B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010857156.7A CN112036819B (en) 2020-08-24 2020-08-24 Interaction method and device based on task image, medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN112036819A CN112036819A (en) 2020-12-04
CN112036819B true CN112036819B (en) 2024-02-02

Family

ID=73580514

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010857156.7A Active CN112036819B (en) 2020-08-24 2020-08-24 Interaction method and device based on task image, medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112036819B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101677339A (en) * 2008-09-18 2010-03-24 LG Electronics Inc. Mobile terminal and buddy information displaying method thereof
CN102227719A (en) * 2008-11-26 2011-10-26 Microsoft Corp. Online service syndication
CN102946585A (en) * 2012-10-24 2013-02-27 Baidu Online Network Technology (Beijing) Co., Ltd. Method for interaction with people around, interaction system and server
JP2014124239A (en) * 2012-12-25 2014-07-07 Konami Digital Entertainment Co Ltd Game control device, game control method, program, and game system
KR20150037058A (en) * 2013-09-30 2015-04-08 김민철 Learning Management Method Using Learning Partner in Online, and Learning Management Server Used Therein

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10561929B2 (en) * 2015-12-07 2020-02-18 Preston Allen White, SR. Game system for enhanced communication skills
WO2019070351A1 (en) * 2017-10-03 2019-04-11 Fanmountain Llc Systems, devices, and methods employing the same for enhancing audience engagement in a competition or performance

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Distributed Virtual Training Systems; Yang Yingxia; China Doctoral Dissertations Full-text Database, Social Sciences II; full text *

Similar Documents

Publication Publication Date Title
CN112311841B (en) Information pushing method and device, electronic equipment and computer readable medium
US20220300243A1 (en) Screen sharing method and device and electronic equipment
CN110781373B (en) List updating method and device, readable medium and electronic equipment
CN109982134B (en) Video teaching method based on diagnosis equipment, diagnosis equipment and system
EP4113985A1 (en) Multimedia conference data processing method and apparatus, and electronic device
CN112634102A (en) Remote classroom system, method for joining remote classroom, electronic device and medium
CN111460049A (en) Content sharing method and device, electronic equipment and computer readable storage medium
CN110765752A (en) Test question generation method and device, electronic equipment and computer readable storage medium
CN113571162B (en) Method, device and system for realizing multi-user collaborative operation medical image
WO2015096678A1 (en) Information interaction method, device and server
CN113038197A (en) Grouping method, device, medium and electronic equipment for live broadcast teaching classroom
CN102891851A (en) Access control method, equipment and system of virtual desktop
CN112486825B (en) Multi-lane environment architecture system, message consumption method, device, equipment and medium
CN113747247B (en) Live broadcast method, live broadcast device, computer equipment and storage medium
CN113144620A (en) Detection method, device, platform, readable medium and equipment for frame synchronization game
CN112036819B (en) Interaction method and device based on task image, medium and electronic equipment
CN109995543B (en) Method and apparatus for adding group members
CN110738882A (en) method, device, equipment and storage medium for on-line teaching display control
US20230418794A1 (en) Data processing method, and non-transitory medium and electronic device
CN113766178B (en) Video control method, device, terminal and storage medium
CN113947166A (en) Questionnaire statistics real-time processing method, system, electronic equipment and storage medium
CN112330996A (en) Control method, device, medium and electronic equipment for live broadcast teaching
CN112804539A (en) Method, device, medium and electronic equipment for packet information interaction
CN112036822B (en) Interaction method and device based on color ropes, medium and electronic equipment
CN111680754A (en) Image classification method and device, electronic equipment and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant