CN111767898A - Service data processing method, device, equipment and storage medium - Google Patents


Info

Publication number
CN111767898A
Authority
CN
China
Prior art keywords
user
service
data
service data
users
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010727981.5A
Other languages
Chinese (zh)
Other versions
CN111767898B (en)
Inventor
贺思颖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010727981.5A priority Critical patent/CN111767898B/en
Publication of CN111767898A publication Critical patent/CN111767898A/en
Application granted granted Critical
Publication of CN111767898B publication Critical patent/CN111767898B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161: Detection; Localisation; Normalisation
    • G06V40/166: Detection; Localisation; Normalisation using acquisition arrangements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10: Services
    • G06Q50/20: Education

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Tourism & Hospitality (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Educational Technology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The embodiment of the application discloses a service data processing method, apparatus, device, and storage medium. The method includes: in response to a trigger operation on an application client, outputting a service display interface corresponding to a virtual room in the application client; if the user i performing the trigger operation belongs to a first type of user in the virtual room, invoking a camera corresponding to the application client to acquire first image data of the user i at a first image acquisition timestamp; determining the service state of the user i based on the first image data, and constructing a service data group corresponding to the user i according to the service state of the user i and the first image acquisition timestamp; and sending the service data group to a server corresponding to the application client, so that the server determines a service state diagram for performing anomaly analysis on the service states of the N users belonging to the first type of user. By adopting the embodiment of the application, the bandwidth pressure during data transmission can be relieved, and the efficiency of centralized management can be improved.

Description

Service data processing method, device, equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for processing service data.
Background
Online education is an increasingly popular form of education, and learning users can learn online in a virtual room through an online client.
For example, take three learning users (user A, user B, and user C) performing online learning in a virtual room. The server may receive the video data uploaded by the terminals corresponding to the three learning users: the video data uploaded by terminal 1 corresponding to user A may be video data 1, the video data uploaded by terminal 2 corresponding to user B may be video data 2, and the video data uploaded by terminal 3 corresponding to user C may be video data 3. Each of the three terminals then needs to display, in addition to the video data it collects itself, the video data of the other learning users in the virtual room delivered by the server. For example, terminal 1 running the online client may display its own video data 1 on its display interface, together with the video data of user B and user C in the virtual room synchronously delivered by the server. Thus each learning user in a virtual room receives multiple channels of video data synchronously delivered by the server; when a large number of learning users are present in the room, the received video streams may stall because of network delay. In other words, in the existing online learning manner, the synchronous transmission of multiple channels of video data causes bandwidth pressure and reduces the efficiency of data transmission.
In addition, for the server, on the one hand, the multiple channels of video data uploaded by a large number of learning users in the virtual room need to be received and synchronously forwarded. On the other hand, data processing (for example, attendance analysis) needs to be performed on the video data of each learning user; for instance, face recognition needs to be performed on each frame of image data in the video data uploaded by every learning user in order to manage that user's attendance. When the computing capability of the server is limited, it is therefore difficult to obtain an attendance result quickly, which reduces the efficiency of centralized management of the learning users in the virtual room.
Disclosure of Invention
The embodiment of the application provides a service data processing method, apparatus, device, and storage medium, which can relieve bandwidth pressure during data transmission and improve the efficiency of centralized management.
An aspect of the present application provides a method for processing service data, including:
responding to the trigger operation aiming at the application client, and outputting a service display interface corresponding to a virtual room in the application client;
if the user i executing the trigger operation belongs to a first type user in the virtual room, calling a camera corresponding to the application client to acquire first image data of the user i at a first image acquisition time stamp; i is a positive integer less than or equal to N, and N is the number of users belonging to the first type of users in the virtual room;
determining the service state of the user i based on the first image data, and constructing a service data group corresponding to the user i according to the service state of the user i and the first image acquisition timestamp;
sending the service data group to a server corresponding to the application client so that the server updates the service data group to a service data group set associated with the first type of user, and determining a service state diagram for performing exception analysis on service states of N users belonging to the first type of user based on the updated service data group set; the updated service data group set comprises service data groups of N users belonging to the first type of users.
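To make the client-side flow concrete, here is a minimal sketch of constructing the per-user service data group described above. The field names and state values (`1` for online, `0` for absent) are assumptions for illustration, not names fixed by the application.

```python
# Hypothetical state values mirroring the description:
# non-zero when the user's face is detected, zero otherwise.
STATE_ONLINE = 1
STATE_ABSENT = 0

def build_service_data_group(user_id: str, face_detected: bool,
                             capture_timestamp: float) -> dict:
    """Construct the service data group for one user: the service state
    determined from the captured image plus the image acquisition
    timestamp. Only these few bytes are uploaded instead of the raw
    image data, which is what relieves the bandwidth pressure."""
    state = STATE_ONLINE if face_detected else STATE_ABSENT
    return {
        "user_id": user_id,
        "service_state": state,
        "image_acquisition_timestamp": capture_timestamp,
    }

group = build_service_data_group("user_3", True, 1690000000.0)
print(group["service_state"])  # 1
```

The group is then sent to the server, which merges it into the set associated with the first type of user.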
An aspect of the present application provides a method for processing service data, including:
acquiring a service data group sent by a user terminal through an application client; the user terminal is a terminal corresponding to a user i in a virtual room corresponding to the application client; user i belongs to a first type of user in the virtual room; i is a positive integer less than or equal to N, and N is the number of users belonging to the first type of users in the virtual room; the service data group comprises a service state of a user i and a first image acquisition timestamp of first image data corresponding to the user i; the first image data is acquired by calling a camera corresponding to the application client by the user terminal;
updating the service data group to a service data group set associated with the first type of user to obtain an updated service data group set; the updated service data group set comprises service data groups of N users belonging to the first type of users;
and determining a service state diagram for performing abnormal analysis on the service states of the N users belonging to the first type of users based on the service data groups of the N users in the updated service data group set.
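The server-side update step can be sketched as follows; the nested mapping (room, then user, then timestamp) is one plausible layout for the "service data group set", chosen here for illustration only.

```python
from collections import defaultdict

# room_data_sets[room_id][user_id][timestamp] -> state value
room_data_sets = defaultdict(lambda: defaultdict(dict))

def update_service_data_set(room_id: str, group: dict):
    """Merge one uploaded service data group into the set associated
    with the first-type users of the given virtual room, and return
    the updated set for that room."""
    user = group["user_id"]
    ts = group["image_acquisition_timestamp"]
    room_data_sets[room_id][user][ts] = group["service_state"]
    return room_data_sets[room_id]

updated = update_service_data_set("room_1", {
    "user_id": "user_3",
    "service_state": 1,
    "image_acquisition_timestamp": 1690000000.0,
})
print(updated["user_3"])  # {1690000000.0: 1}
```

Once groups for all N users at the K timestamps are present, the server can derive the service state diagram from this set.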
An aspect of the present application provides a service data processing apparatus, including:
the display interface output module is used for responding to the trigger operation aiming at the application client and outputting a service display interface corresponding to a virtual room in the application client;
the first acquisition module is used for calling a camera corresponding to the application client to acquire first image data of the user i at a first image acquisition timestamp if the user i executing the trigger operation belongs to a first type of user in the virtual room; i is a positive integer less than or equal to N, and N is the number of users belonging to the first type of users in the virtual room;
the data group construction module is used for determining the service state of the user i based on the first image data and constructing a service data group corresponding to the user i according to the service state of the user i and the first image acquisition timestamp;
the data group sending module is used for sending the service data group to a server corresponding to the application client so that the server updates the service data group to a service data group set associated with the first type of user, and determines a service state diagram for performing abnormal analysis on service states of N users belonging to the first type of user based on the updated service data group set; the updated service data group set comprises service data groups of N users belonging to the first type of users.
Wherein, the apparatus further includes:
the image data output module is used for outputting the first image data to an image display sub-interface independent of the service display interface; the image display sub-interface is an interface superposed on the service display interface, and the size of the image display sub-interface is smaller than that of the service display interface.
The image data in the image display sub-interface is used for indicating a user i to adjust the camera shooting angle of the camera;
the device also includes:
the second acquisition module is used for acquiring image data of the user i at a second image acquisition timestamp based on the camera with the adjusted camera angle; the second image acquisition timestamp is a next acquisition timestamp of the first image acquisition timestamp;
and the image data refreshing module is used for taking the image data of the user i at the second image acquisition time stamp as second image data and refreshing the image data in the image display sub-interface based on the second image data.
Wherein, the apparatus further includes:
and the display sub-interface closing module is used for recording the superimposed display duration of the image display sub-interface superimposed on the service display interface, and closing the image display sub-interface on the service display interface when the superimposed display duration reaches the display duration threshold corresponding to the image display sub-interface.
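The closing condition above is a simple elapsed-time check; a minimal sketch, assuming the client records a monotonic timestamp when the sub-interface is superimposed (the function and parameter names are hypothetical):

```python
import time
from typing import Optional

def overlay_should_close(shown_at: float, display_threshold: float,
                         now: Optional[float] = None) -> bool:
    """Return True once the image display sub-interface has been
    superimposed on the service display interface for at least
    `display_threshold` seconds, at which point the client closes it."""
    now = time.monotonic() if now is None else now
    return (now - shown_at) >= display_threshold

print(overlay_should_close(0.0, 5.0, now=10.0))  # True
print(overlay_should_close(0.0, 5.0, now=1.0))   # False
```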
The service display interface is a data sharing interface for sharing data among N users belonging to the first type of users;
the device also includes:
the time stamp obtaining module is used for obtaining a sharing time stamp corresponding to the data sharing interface, and taking the image data of the user i collected by the camera when the user i shares the time stamp as third image data; the sharing time stamp is the maximum time stamp in the sharing time length corresponding to the data sharing interface;
and the display sub-interface recovery module is used for recovering and displaying the image display sub-interface on the data sharing interface and outputting the third image data to the image display sub-interface if the sharing time stamp reaches the sharing period corresponding to the data sharing interface.
Wherein, the data group construction module comprises:
the identification unit is used for carrying out face identification on the first image data to obtain a face identification result;
a first conversion unit, configured to convert, if the face recognition result indicates that the face of the user i exists in the first image data, the first image data including the face of the user i into a first state value; the first state value is used for representing that the user state of the user i is an online state;
a second conversion unit, configured to convert the first image data that does not include the face of the user i into a second state value if the face recognition result indicates that the face of the user i does not exist in the first image data; the second state value is used for representing that the user state of the user i is an absent state;
and the data group building unit is used for determining the online state or the absence state as the service state of the user i, and building a service data group corresponding to the user i according to the service state of the user i and the first image acquisition timestamp corresponding to the first image data.
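The conversion performed by the first and second conversion units can be sketched as a single function. The actual face detector is abstracted away here: `face_recognizer` is any callable returning whether the user's face is present (e.g. a wrapper around a real face-detection library), and the concrete state values `1`/`0` are illustrative assumptions.

```python
def image_to_state_value(image_data: bytes, face_recognizer) -> int:
    """Convert one captured frame into a service state value:
    a non-zero first state value when the user's face is found
    (online), and a zero second state value otherwise (absent)."""
    FIRST_STATE_VALUE = 1   # online: face of user i present
    SECOND_STATE_VALUE = 0  # absent: face of user i not found
    return FIRST_STATE_VALUE if face_recognizer(image_data) else SECOND_STATE_VALUE

# Usage with stub recognizers standing in for real face recognition:
print(image_to_state_value(b"...", lambda img: True))   # 1
print(image_to_state_value(b"...", lambda img: False))  # 0
```

Because only this state value (not the frame itself) leaves the terminal, face recognition runs on the client rather than on the server, which is what offloads the server-side computation described in the background section.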
Wherein, if the user i belongs to a second type of user in the virtual room, the user i has the authority to manage the service states of the N users; the updated service data group set is used for instructing the server to perform data statistics on the service data groups of the N users at K image acquisition timestamps to obtain K pieces of statistical service data associated with the N users, where one image acquisition timestamp corresponds to one piece of statistical service data; the service state diagram is determined by the server based on the K image acquisition timestamps and the statistical service data at the corresponding image acquisition timestamps; the K image acquisition timestamps include the first image acquisition timestamp, and the image acquisition duration formed by the K image acquisition timestamps is equal to the sharing duration of the service display interface;
the apparatus further includes:
the state diagram analysis module is used for receiving a service state diagram returned by the server based on the authority of the user i, and performing anomaly analysis on the statistical service data on each image acquisition timestamp in the service state diagram to obtain an anomaly analysis result;
the target timestamp determining module is used for taking the statistical service data meeting the abnormal detection condition as target statistical service data and taking an image acquisition timestamp corresponding to the target statistical service data as a target image acquisition timestamp if the abnormal analysis result indicates that the statistical service data meeting the abnormal detection condition exists in the K statistical service data;
the request sending module is used for acquiring service data groups of N users on a target image acquisition timestamp based on target statistical service data, determining the user with the user state of the absence state as a candidate abnormal user in the N service data groups, and sending an abnormal data extraction request to the server; the abnormal data extraction request is used for indicating the server to obtain business auxiliary data corresponding to the candidate abnormal user;
and the secondary verification module is used for receiving the business auxiliary data returned by the server, performing secondary verification on the candidate abnormal users based on the business auxiliary data, performing abnormity filtering on the candidate abnormal users based on a secondary verification result, and determining the candidate abnormal users after abnormity filtering as abnormal users.
The number of the candidate abnormal users is L; l is a positive integer less than or equal to N; the service auxiliary data comprises user identifications of L candidate abnormal users and abnormal duration corresponding to the candidate abnormal users; the abnormal duration of each candidate abnormal user comprises a target image acquisition timestamp;
the secondary verification module includes:
the exception receiving unit is used for receiving the user identifications of the L candidate exception users returned by the server and the exception duration corresponding to the candidate exception users;
the secondary verification unit is used for carrying out secondary verification on the candidate abnormal users based on the L user identifications and the L abnormal durations to obtain a secondary verification result;
the to-be-filtered user determining unit is used for taking the user identifier of the candidate abnormal user corresponding to the abnormal duration smaller than the abnormal duration threshold as the to-be-filtered user identifier and taking the candidate abnormal user corresponding to the to-be-filtered user identifier as the to-be-filtered user if the secondary verification result indicates that the abnormal duration smaller than the abnormal duration threshold exists in the L abnormal durations;
and the abnormal user determining unit is used for removing the to-be-filtered user identifiers from the L user identifiers, taking the remaining user identifiers as target user identifiers, and determining, from the L candidate abnormal users, the candidate abnormal users corresponding to the target user identifiers as abnormal users.
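The secondary verification described by these units amounts to filtering candidates by their abnormal duration. A minimal sketch, assuming the business auxiliary data is represented as a mapping from user identifier to abnormal duration in seconds (the representation and names are illustrative):

```python
def filter_abnormal_users(candidates: dict, duration_threshold: float) -> dict:
    """Secondary verification: drop candidate abnormal users whose
    abnormal duration is below the threshold (e.g. a user who was out
    of frame only briefly), and keep the rest as confirmed abnormal
    users together with their durations."""
    to_filter = {uid for uid, dur in candidates.items()
                 if dur < duration_threshold}
    return {uid: dur for uid, dur in candidates.items()
            if uid not in to_filter}

confirmed = filter_abnormal_users(
    {"u1": 5.0, "u2": 120.0, "u3": 40.0},  # L = 3 candidates
    duration_threshold=30.0,
)
print(sorted(confirmed))  # ['u2', 'u3']
```

This false-positive filtering is why a brief camera glitch or glance away does not immediately mark a user as abnormal.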
Wherein, the apparatus further includes:
the information generation module is used for generating abnormal prompt information based on the target user identification of the abnormal user and the abnormal duration of the abnormal user, and outputting a page switching control on the service display interface based on the abnormal prompt information;
and the interface switching module is used for responding to the triggering operation aiming at the page switching control, switching the display interface of the application client from the service display interface to a state display interface, and displaying the service states of N users in the virtual room on the state display interface.
One aspect of the present application provides a computer device, comprising: a processor, a memory, a network interface;
the processor is connected to a memory and a network interface, wherein the network interface is used for providing a data communication function, the memory is used for storing a computer program, and the processor is used for calling the computer program to execute the method in the above aspect in the embodiment of the present application.
An aspect of the present application provides a computer-readable storage medium storing a computer program comprising program instructions that, when executed by a processor, perform the method of the above-mentioned aspect of the embodiments of the present application.
An aspect of the application provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method of the above-described aspect.
An aspect of the present application provides a service data processing apparatus, including:
the data group acquisition module is used for acquiring a service data group sent by the user terminal through the application client; the user terminal is a terminal corresponding to a user i in a virtual room corresponding to the application client; user i belongs to a first type of user in the virtual room; i is a positive integer less than or equal to N, and N is the number of users belonging to the first type of users in the virtual room; the service data group comprises a service state of a user i and a first image acquisition timestamp of first image data corresponding to the user i; the first image data is acquired by calling a camera corresponding to the application client by the user terminal;
the data set updating module is used for updating the service data set to a service data set associated with the first type of user to obtain an updated service data set; the updated service data group set comprises service data groups of N users belonging to the first type of users;
and the state diagram determining module is used for determining a service state diagram for carrying out abnormal analysis on the service states of the N users belonging to the first type of users based on the service data groups of the N users in the updated service data group set.
Wherein the state diagram determining module comprises:
the data set determining unit is used for acquiring service data groups of N users in the updated service data set on K image acquisition time stamps respectively, and taking the acquired service data groups of each user on the K image acquisition time stamps as user data sets to be processed respectively; the number of user data sets to be processed is equal to N;
an integration processing unit, configured to integrate, based on the K image acquisition timestamps in each to-be-processed user data set, the state values of the service states of the N users at the corresponding image acquisition timestamps, so as to obtain a service data matrix R_{m×n} associated with the N users; the number of matrix rows m in the service data matrix R_{m×n} is N, and the number of matrix columns n in the service data matrix R_{m×n} is K;
a service state determination unit, configured to determine, in the service data matrix R_{m×n}, the service states whose matrix elements on the same column have a non-zero state value as a first service state, and the service states whose matrix elements on the same column have a zero state value as a second service state;
the state diagram determining unit is used for determining the number of users in a first service state on the same column as a first user number, determining the number of users in a second service state as a second user number, taking the ratio of the first user number to the sum of the first user number and the second user number as K statistical service data associated with N users, and determining a service state diagram corresponding to a first type of user based on K image acquisition timestamps and the statistical service data on the corresponding image acquisition timestamps; the service state diagram is used for carrying out abnormity analysis on the user states of the N users.
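The column-wise statistic described above (the ratio of first-state users to the total at each timestamp) can be sketched directly on the N×K matrix; the function name is hypothetical and the matrix is shown as a plain list of rows.

```python
def attendance_ratios(R):
    """Given the service data matrix R (N rows = users, K columns =
    image acquisition timestamps, entries are state values with
    non-zero meaning online), return K ratios: for each column, the
    number of first-state (online) users divided by the total number
    of users. These K values are the statistical service data plotted
    against the timestamps in the service state diagram."""
    n_users = len(R)
    n_timestamps = len(R[0])
    return [sum(1 for row in R if row[col] != 0) / n_users
            for col in range(n_timestamps)]

R = [
    [1, 1, 0, 1],  # user 1's states at K = 4 timestamps
    [1, 0, 0, 1],  # user 2
    [1, 1, 1, 1],  # user 3
]
ratios = attendance_ratios(R)
print(ratios[0], ratios[3])  # columns where all N users are online -> 1.0
```

A dip in these ratios at some timestamp is exactly the anomaly the second-type user's terminal later inspects to find candidate abnormal users.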
One aspect of the present application provides a computer device, comprising: a processor, a memory, a network interface;
the processor is connected to a memory and a network interface, wherein the network interface is used for providing a data communication function, the memory is used for storing a computer program, and the processor is used for calling the computer program to execute the method in the above aspect in the embodiment of the present application.
An aspect of the present application provides a computer-readable storage medium storing a computer program comprising program instructions that, when executed by a processor, perform the method of the above-mentioned aspect of the embodiments of the present application.
An aspect of the application provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method of the above-described aspect.
In an embodiment of the present application, a computer device (e.g., a user terminal) running an application client (e.g., a live curriculum client) may output a service display interface corresponding to a virtual room in the application client in response to a trigger operation for the application client. If the user i performing the triggering operation belongs to a first type of user (e.g., a learning user) in the virtual room, the user terminal may invoke a camera corresponding to the application client to acquire first image data of the user i at the first image acquisition time stamp. Wherein i is a positive integer less than or equal to N, and N is the number of users belonging to the first type of user in the virtual room. Further, the user terminal may determine a service state of the user i based on the first image data, and construct a service data group (e.g., an attendance group) corresponding to the user i according to the service state of the user i and the first image acquisition timestamp. It should be understood that, since the number of bytes occupied by the service data group is much smaller than the number of bytes occupied by the image data, the user terminal can improve the efficiency of data transmission when uploading the service data group to the server. It should be understood that, at this time, since the server corresponding to the application client does not need to acquire the multiple channels of video data of the N users in the virtual room, it is naturally not necessary to synchronously issue the multiple channels of video data, and thus the bandwidth pressure during data transmission can be effectively relieved. In addition, it can be understood that, when the server obtains the service data group uploaded by the user i in the virtual room, the server may update the service data group to a service data group set associated with the first type user, so as to obtain an updated service data group set. 
It can be understood that the updated service data group set can be used to store the service data groups uploaded by the N users in the virtual room. Based on this, the server may determine, based on the updated set of service data groups, a service state diagram for performing anomaly analysis on service states of N users belonging to the first type of user (for example, a real-time attendance graph associated with the N users may be determined), and thus may improve efficiency of performing centralized management on the N users in the virtual room.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can also be obtained from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of a network architecture according to an embodiment of the present application;
fig. 2 is a schematic view of a scenario for performing data interaction according to an embodiment of the present application;
fig. 3 is a schematic flowchart of a service data processing method according to an embodiment of the present application;
fig. 4 is a schematic view of a scene of image data in an update image display sub-interface according to an embodiment of the present disclosure;
fig. 5 is a schematic view of a scenario of an updated service data set provided in an embodiment of the present application;
FIG. 6 is a schematic view of a scenario for determining candidate abnormal users according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a scenario for determining an abnormal user according to an embodiment of the present application;
fig. 8 is a scene schematic diagram for reminding a second type of user of paying attention to an abnormal situation according to an embodiment of the present application;
fig. 9 is a scene schematic diagram of a switching display interface provided in an embodiment of the present application;
fig. 10 is a schematic flowchart of a service data processing method according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a service data processing apparatus according to an embodiment of the present application;
FIG. 12 is a schematic diagram of a computer device provided by an embodiment of the present application;
fig. 13 is a schematic structural diagram of a service data processing apparatus according to an embodiment of the present application;
FIG. 14 is a schematic diagram of a computer device provided by an embodiment of the present application;
fig. 15 is a schematic structural diagram of a data processing system according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a network architecture according to an embodiment of the present disclosure. As shown in fig. 1, the network architecture may include a server 10 and a user terminal cluster. The user terminal cluster may comprise one or more user terminals, and the number of user terminals is not limited here. As shown in fig. 1, the cluster may specifically include a user terminal 100a, a user terminal 100b, a user terminal 100c, …, and a user terminal 100n. As shown in fig. 1, the user terminal 100a, the user terminal 100b, the user terminal 100c, …, and the user terminal 100n may each be connected to the server 10 via a network, so that each user terminal may perform data interaction with the server 10 through the network.
Each user terminal in the user terminal cluster may be an intelligent terminal with a service data processing function, such as a smart phone, a tablet computer, a notebook computer, a desktop computer, a wearable device, a smart home device, or a head-mounted device. It should be understood that each user terminal in the user terminal cluster shown in fig. 1 may be installed with the application client, and when the application client runs in each user terminal, data interaction may be performed with the server 10 shown in fig. 1. The application client may be an independent client, or may be an embedded sub-client integrated in a certain client (for example, a social client, an educational client, a multimedia client, and the like), which is not limited herein.
As shown in fig. 1, the server 10 in the embodiment of the present application may be a server corresponding to the application client. The server 10 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a network service, cloud communication, a middleware service, a domain name service, a security service, a CDN, a big data and artificial intelligence platform, and the like.
For convenience of understanding, in the embodiment of the present application, one user terminal may be selected from the plurality of user terminals shown in fig. 1 as the target user terminal. For example, the user terminal 100a shown in fig. 1 may be used as a target user terminal in the embodiment of the present application, and a target application (i.e., an application client) having the service data processing function may be integrated in the target user terminal. At this time, the target user terminal may implement data interaction with the server 10 through the service data platform corresponding to the application client.
It should be understood that the first type of user and the second type of user and the third type of user may be contained in a virtual room in the application client. The second type user may be a user having management authority (also referred to as supervision authority) on the service status of the first type user. A third type of user here may be an initiating user in the virtual room (e.g., a main user who is live in video or voice).
It can be understood that the virtual room in the application client may be a virtual room corresponding to an online conference in a conference scene. The first type of user may be N enterprise employees participating in the online meeting, the second type of user may be an enterprise employee responsible for recording the attendance status of the attendees, and the third type of user may be an enterprise employee (e.g., a business leader) who organizes the attendees to conduct the online meeting. It should be understood that the enterprise employee responsible for recording attendance may be one of the N attendees specified by the third type of user; optionally, the enterprise employee responsible for recording attendance may also be another person, independent of the N attendees, who can be responsible for attendance work. Optionally, the virtual room in the application client may also be a virtual room corresponding to online learning in a learning scene. The first type of user may be N learning users (e.g., students) participating in the online learning, the second type of user may be a teaching assistant user for the online learning, and the third type of user may be a teaching person (e.g., a teacher user) who organizes the online learning of the learning users. It should be understood that the teaching assistant user may be a learning user specified by the third type of user (e.g., the teacher user) among the learning users who participate in online learning; optionally, the teaching assistant user may also be another person (e.g., a supervising person), independent of the N learning users, who can be responsible for attendance work. The teaching assistant user can assist the teacher user's teaching work, for example, by checking the students' in-class performance and monitoring the students' attendance. The virtual room in the application client may also be a virtual room corresponding to another scene, which is not limited herein.
Taking online learning as an example, the virtual room in the application client may contain N first type users, where N may be a positive integer. It should be understood that the second type user in the embodiment of the present application may be a certain user (e.g., user i) of the N first type users. For example, the user i in the embodiment of the present application may log in to the application client through the first account information (e.g., account information 1) on the user terminal 100a (i.e., the target user terminal) to enter the virtual room for online learning. In addition, the user i may also have the right to manage the service states of the N first type users (including the user i) in the virtual room, that is, the user i may also be referred to as a teaching assistant user.
Optionally, taking online learning as an example, the virtual room in the application client may include N users of the first type and one user of the second type. It should be understood that the embodiment of the present application may select another user terminal (e.g., the user terminal 100b) from the plurality of user terminals shown in fig. 1 as the teaching assistant user terminal. For example, the user (e.g., user j) in the embodiment of the present application may log in to the application client through the second account information (e.g., account information 2) on the user terminal 100b. The user j may have the right to manage the service states of the N first type users in the virtual room.
For easy understanding, please refer to fig. 2, and fig. 2 is a schematic diagram of a scenario for performing data interaction according to an embodiment of the present application. As shown in fig. 2, the application client in the embodiment of the present application may be used for online learning in an online learning scenario. The virtual room in the application client may contain N first type users (e.g., learning users participating in online learning). The embodiment of the application takes 3 learning users as an example, and specifically includes a user a, a user b, and a user c. The server corresponding to the application client may be the server 20 shown in fig. 2, and the server 20 may be the server 10 shown in fig. 1.
It should be understood that the embodiments of the present application may collectively refer to user a, user b, and user c as the first type of users in the virtual room. The user terminal corresponding to the user a may be a terminal 30, where the terminal 30 may be any one of the user terminals (for example, the user terminal 100a) in the user terminal cluster shown in fig. 1; the user terminal corresponding to the user b may be a terminal 31, where the terminal 31 may be another user terminal (for example, the user terminal 100b) in the user terminal cluster shown in fig. 1; and the user terminal corresponding to the user c may be a terminal 32, where the terminal 32 may be another user terminal (for example, the user terminal 100c) in the user terminal cluster shown in fig. 1. It is understood that the terminal 30, the terminal 31 and the terminal 32 may each run an application client (e.g., a live lesson client) having a service data processing function.
It is understood that, in the embodiment of the present application, the user a may perform a trigger operation on the application client to enter a virtual room associated with the trigger operation to perform the online learning of the shared task. The triggering operation may include a contact operation such as a click or a long press, or may also include a non-contact operation such as a voice or a gesture, which is not limited herein. At this time, the terminal 30 may output a service display interface (such as the service display interface 400a or the service display interface 400b shown in fig. 4 below) corresponding to the virtual room in response to the trigger operation. The service display interface may be a data sharing interface for data sharing among users of the first type (e.g., user a, user b, and user c) in the virtual room, that is, the display data in the data sharing interface observed by the user a, the user b, and the user c on the respective corresponding user terminals are the same. The display data in the data sharing interface may include image data of a third type of user (e.g., teacher user) entering the virtual room, a tutorial or blackboard writing, and the like.
Further, the terminal 30 may invoke the camera a corresponding to the application client, and capture image data (e.g., image data a) of the user a at any image capture timestamp (e.g., image capture timestamp t1) in the sharing duration corresponding to the sharing task. The image acquisition time stamp t1 may be referred to as a first image acquisition time stamp, and the image data acquired at the first image acquisition time stamp may be determined as the first image data. It is understood that the terminal 30 may determine the service status (i.e., attendance status) of the user a based on the image data a, in other words, the terminal 30 may perform face recognition on the image data a, so that a face recognition result may be obtained. It should be understood that if the face recognition result indicates that the face of the user a exists in the image data a, the terminal 30 may determine that the service state of the user a is an online state; if the face recognition result indicates that the face of the user a does not exist in the image data a (for example, due to the angle problem of the camera, the captured image data a is the shoulder and other parts of the user a), the terminal 30 may determine that the service status of the user a is the absent status.
Further, the terminal 30 may construct a service data group (i.e., attendance data, for example, the service data group D1) corresponding to the user a through a data set construction module in the terminal 30 according to the service state of the user a and the image capturing timestamp t1. The service data group D1 may contain a state value for characterizing the user state of the user a and the image acquisition timestamp t1. It will be appreciated that the state value may be represented in binary form: the user state may be represented by a state value "1" when the user state is the online state, and by a state value "0" when the user state is the absent state. For example, when the terminal 30 recognizes that the face of the user a exists in the image data a, the service data group D1 may be (1, t1).
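The construction of a service data group described above can be sketched as follows. This is a minimal illustration, assuming the face-recognition step has already produced a boolean result; the function name and the timestamp value are hypothetical, not part of the embodiment:

```python
def build_service_data_group(face_detected: bool, capture_timestamp: float) -> tuple:
    """Construct an attendance data group (state value, image capture timestamp).

    The state value is binary: 1 for the online state (a face was
    recognized in the captured frame), 0 for the absent state.
    """
    state_value = 1 if face_detected else 0
    return (state_value, capture_timestamp)


# Example: terminal 30 recognizes user a's face at timestamp t1.
t1 = 1_595_900_000.0  # illustrative capture timestamp
d1 = build_service_data_group(True, t1)
print(d1)  # (1, 1595900000.0)
```

Because each group is just a small state value plus a timestamp, it occupies far fewer bytes than the image frame it summarizes, which is the basis of the bandwidth saving discussed below.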
It should be understood that the terminal 30 may transmit the service data group D1 to the server 20 shown in fig. 2, and so on, and the terminal 31 shown in fig. 2 may construct a service data group (e.g., the service data group D2) corresponding to the user b and transmit the service data group D2 to the server 20. The terminal 32 shown in fig. 2 may construct a service data group (e.g., service data group D3) corresponding to the user c, and transmit the service data group D3 to the server 20. For the specific implementation of the terminal 31 for constructing the service data group D2 and the specific implementation of the terminal 32 for constructing the service data group D3, reference may be made to the implementation of the terminal 30 for constructing the service data group D1, and details are not described here again.
It should be understood that the data synchronization module in the server 20 may compare and synchronize the image capture timestamps in the service data sets obtained by the application clients, so that the service status of each first type user can be synchronized. It can be understood that, when the server 20 acquires the service data group (for example, the service data group D1, the service data group D2, and the service data group D3) sent by the application client, the data synchronization module may update the service data group to the service data group set associated with the first type of user, so that the updated service data group set may be obtained. The updated service data group set may include service data groups of user a, user b, and user c. Further, the server 20 may determine statistical service data (e.g., attendance rate) corresponding to each image capturing timestamp through the statistical analysis module based on the service data sets of the 3 users in the updated service data set, and may further determine a service state diagram (e.g., real-time attendance graph). Wherein, the service state diagram can be used to perform anomaly analysis on the service states of the 3 users belonging to the learning user.
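The server-side update and statistical analysis described above might look like the following sketch; the dictionary layout and function names are illustrative assumptions, not the patented implementation:

```python
from collections import defaultdict


def update_group_set(group_set: dict, user_id: str, data_group: tuple) -> dict:
    """Update the service data group set with a newly received data group."""
    state_value, capture_ts = data_group
    group_set.setdefault(user_id, {})[capture_ts] = state_value
    return group_set


def attendance_per_timestamp(group_set: dict) -> dict:
    """Statistical service data: attendance rate for each capture timestamp."""
    online = defaultdict(int)
    total = defaultdict(int)
    for states in group_set.values():
        for ts, value in states.items():
            total[ts] += 1
            online[ts] += value  # state value 1 = online, 0 = absent
    return {ts: online[ts] / total[ts] for ts in total}


# Groups D1, D2, D3 received from terminals 30, 31, 32 (values illustrative).
groups = {}
update_group_set(groups, "user_a", (1, "t1"))
update_group_set(groups, "user_b", (0, "t1"))
update_group_set(groups, "user_c", (1, "t1"))
rates = attendance_per_timestamp(groups)
```

The resulting per-timestamp attendance rates are the data points from which a real-time attendance curve (the service state diagram) could be plotted.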
Therefore, the terminal 30, the terminal 31 and the terminal 32 can perform data processing on the acquired image data of the learning user, and further can construct a service data group for representing the user state of the learning user. Since the number of bytes occupied by the service data group is much smaller than the number of bytes occupied by the image data, the server 20 does not need to receive the multi-channel image data of the 3 users in the virtual room, and thus the bandwidth pressure during data transmission can be effectively relieved. In addition, the server 20 may determine a service status map based on the acquired service data group, so that efficiency of centralized management of the first type users in the virtual room may be improved.
For a specific implementation in which a computer device (e.g., a user terminal) running an application client constructs a service data group corresponding to a user from the image data of the user captured by the camera of the application client and sends the service data group to the server corresponding to the application client, and for a specific implementation in which the server determines the service state diagram, reference may be made to the embodiments corresponding to fig. 3 to fig. 10 described below.
Further, please refer to fig. 3, where fig. 3 is a schematic flow chart of a service data processing method according to an embodiment of the present application. As shown in fig. 3, the method may be performed by a computer device integrated with an application client, where the computer device may be a user terminal, and the user terminal may be any one of the user terminals in the user terminal cluster shown in fig. 1, for example, the user terminal 100 a. The method may comprise at least the following steps S101-S104:
Step S101, in response to a trigger operation for the application client, outputting a service display interface corresponding to the virtual room in the application client.
Specifically, a user (e.g., user i) corresponding to a user terminal running an application client (e.g., live lesson client) may perform a trigger operation with respect to the application client to enter a virtual room associated with the trigger operation to perform a sharing task. The shared task may be a shared task associated with online learning. The triggering operation may include a contact operation such as a click or a long press, or may also include a non-contact operation such as a voice or a gesture, which is not limited herein. At this time, the user terminal may respond to the trigger operation, so as to output a service display interface corresponding to the virtual room.
It will be appreciated that the user i, when entering the virtual room, may become a first type of user participating in the online learning of this shared task, for example, a learning user; in other words, the user i can perform learning tasks such as attending lectures and self-study in the virtual room. Optionally, the user i may also become a second type of user in the virtual room, for example, a teaching assistant user, when entering the virtual room. In other words, the user i can assist the teacher's teaching work in the virtual room, for example, by checking the in-class performance of the learning users and monitoring their attendance status. Optionally, the user i may also become a third type of user initiating the sharing task in the virtual room when entering the virtual room, for example, a teacher user. In other words, the user i can perform a live teaching task in the virtual room to guide the learning users to improve their learning performance.
Step S102, if the user i executing the trigger operation belongs to a first type user in the virtual room, a camera corresponding to the application client is called to acquire first image data of the user i at a first image acquisition time stamp.
Specifically, if the user i performing the trigger operation belongs to a first type of user in the virtual room, the user terminal may invoke a camera corresponding to the application client to acquire image data of the user i at the first image acquisition time stamp. The first image acquisition timestamp may be a timestamp within a shared duration (e.g., 2 hours) corresponding to the virtual room. The image data acquired at the first image acquisition time stamp can be determined as the first image data. It is to be understood that i may be a positive integer less than or equal to N, which may be the number of users in the virtual room belonging to the first type of user.
It should be appreciated that the user terminal may output the first image data to an image display sub-interface separate from the business display interface. The image display sub-interface may be an interface (e.g., a floating window or a pop-up window) superimposed on the service display interface, and the size of the image display sub-interface may be smaller than that of the service display interface, so that occlusion of data displayed in the service display interface may be reduced. It is understood that the data displayed on the image display sub-interface and the data displayed on the service display interface are independent of each other. The image data in the image display sub-interface can be used for indicating the user i to adjust the camera angle of the camera, so that the user terminal can more completely display the face image of the user i in the image sub-interface, and the service state of the user i in the virtual room can be accurately determined to be an online state by the user terminal.
It is understood that the user terminal may acquire the image data of the user i at the second image acquisition time stamp based on the adjusted camera. Wherein the second image acquisition timestamp may be a next acquisition timestamp to the first image acquisition timestamp. At this time, the user terminal may take the image data of the user i at the second image capturing time stamp as the second image data, and may further refresh the image data in the image display sub-interface based on the second image data.
For easy understanding, please refer to fig. 4, and fig. 4 is a schematic view of a scene of image data in a refresh image display sub-interface according to an embodiment of the present disclosure. As shown in fig. 4, the user terminal in this embodiment may be a user terminal running an application client, and the user terminal may be any one of the user terminals in the user terminal cluster shown in fig. 1, for example, the user terminal 100 a.
It should be understood that the user (e.g., user a) corresponding to the user terminal in the embodiment of the present application may be a first type of user (e.g., a learning user) in the virtual room of the application client (e.g., live curriculum client). The service display interface 400a and the service display interface 400b in this embodiment may be collectively referred to as a service display interface corresponding to the virtual room. The business display interface can be used for displaying image data, teaching courseware, blackboard writing and the like of the teacher user.
It is understood that the user terminal may invoke the corresponding camera of the application client to capture the image data (i.e., the first image data, e.g., image data 1) of user a at time stamp t1 (i.e., the first image capture time stamp). At this time, the user terminal may output the image data 1 to the image display sub-interface 410a independent of the service display interface 400 a. The image display sub-interface 410a may be an interface (e.g., a floating window or a pop-up window) superimposed on the service display interface 400a, and the size of the image display sub-interface 410a may be smaller than that of the service display interface 400a, so that the occlusion of data displayed in the service display interface 400a may be reduced. It is understood that the data displayed on the image display sub-interface 410a and the data displayed on the service display interface 400a are independent of each other. The image data in the image display sub-interface 410a can be used to instruct the user a to adjust the camera angle of the camera.
It is understood that when the user a observes that the image data 1 displayed on the image display sub-interface 410a contains a small area of the face image of the user a or does not contain the face image of the user a, the user a may adjust the angle of the camera. At this time, the user terminal may capture image data (e.g., image data 2) of the user a at the time stamp t2 (i.e., the second image capture time stamp) based on the adjusted camera. Wherein the timestamp t2 may be the next acquisition timestamp of the timestamp t 1. The user terminal may use image data 2 of user a at time stamp t2 as the second image data, and may refresh image data 1 in image display sub-interface 410a based on the second image data, so that image data 2 is displayed on the image sub-interface of the user terminal. At this time, the image display sub-interface of the user terminal may be as shown in the image display sub-interface 410b shown in fig. 4.
Further, in order to reduce the occlusion of the data (for example, teaching courseware) displayed on the service display interface by the image display sub-interface, the user terminal may periodically display the image display sub-interface on the service display interface. It can be understood that the user terminal may record the superimposed display duration for which the image display sub-interface is superimposed on the service display interface, and when the superimposed display duration reaches a display duration threshold (for example, 10 seconds) corresponding to the image display sub-interface, the image display sub-interface may be closed on the service display interface.
The service display interface may be a data sharing interface for performing data sharing among the N users belonging to the first type of user. The user terminal can acquire a sharing timestamp corresponding to the data sharing interface, and may then take the image data of the user i acquired by the camera at the sharing timestamp as third image data; the sharing timestamp may be the maximum timestamp in the sharing duration corresponding to the data sharing interface. If the sharing timestamp reaches the sharing period (for example, 5 minutes) corresponding to the data sharing interface, the user terminal may resume displaying the image display sub-interface on the data sharing interface, and may output the third image data to the image display sub-interface.
For example, the user terminal may display an image display sub-interface displaying image data of a user i corresponding to the user terminal on the service display interface in an overlapping manner at the time of T1 timestamp (e.g., 00: 00: 00). When the superimposed display duration of the image display sub-interface reaches the display duration threshold (for example, 10 seconds) corresponding to the image display sub-interface, the image display sub-interface may be closed on the service display interface. In other words, the user terminal may close the image display sub-interface on the service display interface at the time of T2 timestamp (e.g., 00: 00: 10).
Further, the user terminal may resume displaying the image display sub-interface on the data sharing interface when the sharing timestamp (i.e., the T3 timestamp) corresponding to the data sharing interface is acquired to reach the sharing period (e.g., 5 minutes) corresponding to the data sharing interface. In other words, the user terminal may resume displaying the image display sub-interface on the data sharing interface at the time of the T3 timestamp (e.g., 00: 05: 10). The image data displayed by the image display sub-interface may be the image data of the user i acquired at the time of the T3 time stamp. By analogy, the user terminal may close the image display sub-interface at the time of the T4 timestamp (e.g., 00: 05: 20) and restore the image display sub-interface at the time of the T5 timestamp (e.g., 00: 10: 20).
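The T1–T5 walkthrough above can be reproduced with a small helper; this is an illustrative sketch (the function and its parameters are assumptions), using a 10-second display duration threshold and a 5-minute sharing period:

```python
def overlay_schedule(start: float, display_threshold: float,
                     sharing_period: float, count: int) -> list:
    """Compute (show_time, hide_time) pairs, in seconds, for the image
    display sub-interface.

    The sub-interface is shown, closed after `display_threshold` seconds,
    and shown again once `sharing_period` seconds have elapsed since the
    previous hide instant, matching the T1..T5 walkthrough above.
    """
    schedule = []
    show = start
    for _ in range(count):
        hide = show + display_threshold
        schedule.append((show, hide))
        show = hide + sharing_period
    return schedule


# T1 = 00:00:00, threshold 10 s, period 300 s (5 minutes)
print(overlay_schedule(0.0, 10.0, 300.0, 3))
# [(0.0, 10.0), (310.0, 320.0), (620.0, 630.0)]
```

The three pairs correspond to (T1, T2), (T3, T4), and (T5, …) in the example, i.e., hide at 00:00:10, re-show at 00:05:10, hide at 00:05:20, and re-show at 00:10:20.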
Step S103, determining the service state of the user i based on the first image data, and constructing a service data group corresponding to the user i according to the service state of the user i and the first image acquisition timestamp.
Specifically, the user terminal may perform face recognition on the first image data, so that a face recognition result may be obtained. If the face recognition result indicates that the face of the user i exists in the first image data, the user terminal may convert the first image data including the face of the user i into a first state value (e.g., 1). The first state value may be used to characterize the user state of the user i as the online state. If the face recognition result indicates that the face of the user i does not exist in the first image data, the user terminal may convert the first image data that does not include the face of the user i into a second state value (e.g., 0). The second state value may be used to characterize the user state of the user i as an absent state. Further, the user terminal may determine the online state or the absence state as a service state of the user i, and may construct a service data group (e.g., an attendance data group) corresponding to the user i according to the service state of the user i and the first image acquisition timestamp corresponding to the first image data. In other words, the user terminal may construct the state value corresponding to the service state of the user i and the first image capturing timestamp corresponding to the first image data into the service data group corresponding to the user i.
It can be understood that the user terminal may also acquire the facial images of the first type users in the virtual room in advance, and store the facial images of the first type users in the database. For example, when the user terminal acquires the first image data of the user i, the user terminal may perform face comparison on the first image data and a face image of the user i in the database, so as to prevent other users from performing attendance check instead of the user i.
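The face comparison mentioned here is not specified further in this passage; one common approach, shown purely as an assumed sketch, is to compare a face embedding extracted from the captured frame against an embedding of the stored face image for user i using cosine similarity (the threshold value and all names are hypothetical):

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def is_same_user(captured_embedding, stored_embedding, threshold=0.8):
    """Accept attendance only if the captured face matches the stored one,
    which helps prevent another user from checking in for user i."""
    return cosine_similarity(captured_embedding, stored_embedding) >= threshold


print(is_same_user([0.9, 0.1, 0.2], [0.88, 0.12, 0.18]))  # True
```

In practice the embeddings would come from a face-recognition model; the comparison step itself is independent of which model produces them.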
Step S104, sending the service data group to a server corresponding to the application client, so that the server updates the service data group to a service data group set associated with the first type of user, and determines, based on the updated service data group set, a service state diagram for performing anomaly analysis on the service states of the N users belonging to the first type of user.
Specifically, the user terminal may send the service data group to a server corresponding to the application client. At this time, the server may obtain the service data group sent by the user terminal through the application client, and may further update the service data group to a service data group set associated with the first type of user, so that the updated service data set may be obtained. The updated service data set may include service data groups of N users belonging to the first type of user. Further, the server may determine a service state diagram for performing anomaly analysis on the service states of the N users belonging to the first type of user based on the service data groups of the N users in the updated service data group set.
Based on the updated service data group set, the server can perform data statistics on the service data groups of the N users at the K image acquisition timestamps to obtain K statistical service data associated with the N users; one image acquisition timestamp may correspond to one statistical service data.
It can be understood that the server may obtain service data groups of N users in the updated service data set on the K image acquisition timestamps, respectively, and may further use the obtained service data groups of each user on the K image acquisition timestamps as the user data sets to be processed, respectively. Wherein the number of the sets of user data to be processed may be equal to N.
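Splitting the updated service data group set into N to-be-processed user data sets, each ordered by the K image acquisition timestamps, can be sketched as follows (the dictionary layout and names are illustrative assumptions):

```python
def to_pending_user_sets(group_set: dict, timestamps: list) -> dict:
    """Build one to-be-processed user data set per user, each a list of
    (state value, capture timestamp) pairs ordered by the K timestamps.

    A capture timestamp with no reported data group is treated as the
    absent state (state value 0) in this sketch.
    """
    return {
        user: [(states.get(ts, 0), ts) for ts in timestamps]
        for user, states in group_set.items()
    }


group_set = {"user_a": {"t1": 1, "t2": 1}, "user_b": {"t1": 0, "t2": 1}}
pending = to_pending_user_sets(group_set, ["t1", "t2"])
print(pending["user_a"])  # [(1, 't1'), (1, 't2')]
```

The number of resulting sets equals the number of users N, as stated above.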
For ease of understanding, please refer to fig. 5, where fig. 5 is a schematic view of a scenario of an updated service data set according to an embodiment of the present application. As shown in fig. 5, the server 50 in the embodiment of the present application may be a server corresponding to an application client, and the server may be the server 10 shown in fig. 1.
The virtual room of the application client may include N first-type users; the embodiment of the present application takes N = 2 as an example, specifically including a user a and a user b. As shown in fig. 5, the updated service data group set obtained by the server 50 may include: a user data set 1 to be processed corresponding to the user a and a user data set 2 to be processed corresponding to the user b. It should be understood that the user data set 1 to be processed may be formed by the service data groups of the user a at K (for example, 9) image capturing timestamps, and the user data set 2 to be processed may be formed by the service data groups of the user b at K (for example, 9) image capturing timestamps. The 9 image capturing timestamps herein may specifically include: a t1 timestamp, a t2 timestamp, a t3 timestamp, a t4 timestamp, a t5 timestamp, a t6 timestamp, a t7 timestamp, a t8 timestamp, and a t9 timestamp.
It can be understood that a service data group in the embodiment of the present application may be an attendance data group constructed by the data set construction module of a user terminal running the application client. Each service data group may include a state value used for characterizing a user state and the corresponding image acquisition timestamp; for example, the state value may be represented in binary form.
It should be understood that the user terminal A corresponding to the user a may acquire the image data 11 at the t1 timestamp, and may perform face recognition on the image data 11 to obtain a corresponding face recognition result. At this time, the face recognition result obtained by the user terminal A may indicate that the face of the user a exists in the image data 11, and the service data group D11 constructed by the user terminal A may be (1, t1), that is, the service data group D11 includes the state value 1 for representing the online state of the user a and the timestamp t1. By analogy, the service data groups of the user a between the t1 timestamp and the t9 timestamp can be seen in the user data set 1 to be processed shown in fig. 5.
In addition, the user terminal B corresponding to the user b may acquire the image data 21 at the t1 timestamp, and may further perform face recognition on the image data 21 to obtain a corresponding face recognition result. At this time, the face recognition result obtained by the user terminal B may indicate that the face of the user b does not exist in the image data 21, and the service data group D21 constructed by the user terminal B may be (0, t1), that is, the service data group D21 includes the state value 0 for representing the absence state of the user b and the timestamp t1. By analogy, the service data groups of the user b between the t1 timestamp and the t9 timestamp can be seen in the user data set 2 to be processed shown in fig. 5.
Further, the server may integrate the state values of the service states of the N users on the corresponding image acquisition timestamps based on the K image acquisition timestamps in each user data set to be processed, so that a service data matrix R(m×n) associated with the N users may be obtained. In the service data matrix R(m×n), the number of matrix rows m may be N, and the number of matrix columns n may be K.
It should be understood that, in the service data matrix R(m×n), the server may determine the service state of a matrix element on the same column whose state value is non-zero as a first service state (e.g., an online state), and determine the service state of a matrix element on the same column whose state value is zero as a second service state (e.g., an absent state). Further, the server may determine the number of users in the first service state on the same column as a first number of users (i.e., the number of online users), determine the number of users in the second service state as a second number of users (i.e., the number of absent users), and determine the ratio of the first number of users to the sum of the first number of users and the second number of users (i.e., the total number of first type users in the virtual room) as the statistical service data (i.e., the attendance rate), thereby obtaining K pieces of statistical service data associated with the N users.
The formula of the statistical service data corresponding to a certain image acquisition timestamp determined by the server may be shown as the following formula (1):
statistical service data = first number of users / total number of users (1)
The statistical service data may be the attendance rate corresponding to the image acquisition timestamp, the first number of users may be the number of users in the first service state (for example, the online state), and the total number of users may be the number of users belonging to the first type of user in the virtual room.
For example, in the service data matrix R(m×n) obtained by the server, the number of matrix rows m may be the number of users belonging to the first type of user in the application client (e.g., 100), and the number of matrix columns n may be the number of image acquisition timestamps (e.g., 10). When the server determines the statistical service data corresponding to a certain image acquisition timestamp (e.g., the image acquisition timestamp 3 corresponding to the matrix elements in column 3 of the service data matrix), the server may determine the number of users in the first service state on column 3 as the first number of users, and determine the number of users in the second service state on column 3 as the second number of users. In other words, the server may determine the number of users with a state value of 1 on column 3 as the first number of users (e.g., 95), and determine the number of users with a state value of 0 on column 3 as the second number of users (e.g., 5). At this time, the server may determine by the above formula (1) that the statistical service data of column 3 is 95%; that is, the statistical service data (e.g., attendance rate) of the first type of user in the virtual room at the image acquisition timestamp 3 is 95%.
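Under the description above, formula (1) can be evaluated column by column over the N×K state matrix. The sketch below uses a small illustrative 4-user, 3-timestamp matrix rather than the 100×10 example:

```python
def column_attendance(matrix):
    """Apply formula (1) to every column of the service data matrix.

    Rows are first type users, columns are image acquisition timestamps;
    each matrix element is the binary state value (1 = online, 0 = absent).
    """
    m = len(matrix)      # number of matrix rows = number of users N
    n = len(matrix[0])   # number of matrix columns = number of timestamps K
    rates = []
    for j in range(n):
        first_number = sum(matrix[i][j] for i in range(m))  # users with state 1
        rates.append(first_number / m)  # statistical service data for column j
    return rates

state_matrix = [
    [1, 1, 0],   # user a
    [1, 0, 0],   # user b
    [1, 1, 1],   # user c
    [1, 1, 1],   # user d
]
print(column_attendance(state_matrix))  # -> [1.0, 0.75, 0.5]
```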
Further, the server may determine a traffic state diagram (e.g., a real-time attendance graph) corresponding to the first type of user based on the K image capture timestamps and the statistical traffic data on the corresponding image capture timestamps. The service state diagram can be used for performing anomaly analysis on the user states of the N users. For example, the server may determine the service state diagram corresponding to the first type of user by using the image acquisition timestamp as a horizontal axis of the service state diagram and using the statistical service data as a vertical axis of the service state diagram. Optionally, the server may further determine the service state diagram corresponding to the first type of user by using the image acquisition timestamp as a vertical axis of the service state diagram and using the statistical service data as a horizontal axis of the service state diagram. The determination method of the service state diagram is not limited herein.
It should be understood that the server may send the service state diagram to a user terminal (e.g., a teaching assistant terminal) corresponding to the user belonging to the second type in the virtual room. If the second type user is a user i in the virtual room, the user i may have a right to manage the service states of the N users. It can be understood that the user terminal corresponding to the user i may receive the service state diagram returned by the server based on the authority of the user i. The service state diagram is determined by the server based on the K image acquisition time stamps and the statistical service data on the corresponding image acquisition time stamps; the K image acquisition timestamps may include a first image acquisition timestamp, and an image acquisition duration formed by the K image acquisition timestamps is equal to a shared duration of the service display interface.
It can be understood that, in the service state diagram, the user terminal may perform anomaly analysis on the statistical service data on each image acquisition timestamp through the anomaly analysis module, so as to obtain an anomaly analysis result. It should be understood that, if the anomaly analysis result indicates that there is statistical service data meeting the anomaly detection condition in the K pieces of statistical service data, the user terminal may use the statistical service data meeting the anomaly detection condition as target statistical service data, and may use an image acquisition timestamp corresponding to the target statistical service data as a target image acquisition timestamp.
Wherein, the anomaly detection condition may refer to the case where the statistical service data on an image acquisition timestamp (e.g., timestamp t) is less than a statistical threshold (e.g., 95%), and the difference between the statistical service data on the previous image acquisition timestamp (e.g., timestamp t-1) and the statistical service data on the image acquisition timestamp is greater than a first difference threshold (e.g., 5%). For example, if the statistical service data 3 on a certain image acquisition timestamp (e.g., timestamp 3) is 90%, that is, less than the statistical threshold in the anomaly detection condition, the user terminal may obtain the statistical service data 2 (e.g., 96%) corresponding to the previous timestamp (e.g., timestamp 2). Further, the user terminal may determine that the difference between the statistical service data 2 and the statistical service data 3 is 6%, that is, greater than the first difference threshold; at this time, the user terminal may determine that the statistical service data 3 is the target statistical service data, and that timestamp 3 is the target image acquisition timestamp.
Alternatively, the anomaly detection condition may refer to the case where the statistical service data on the image acquisition timestamp (e.g., timestamp t) is less than a statistical threshold (e.g., 95%) and the difference between the average statistical service data and the statistical service data on the image acquisition timestamp is greater than a second difference threshold (e.g., 5%). The average statistical service data may be the average value of the statistical service data corresponding to a plurality of image acquisition timestamps (e.g., timestamp t-1, timestamp t-2, and timestamp t-3) before the image acquisition timestamp. For example, if the statistical service data 5 on a certain image acquisition timestamp (e.g., timestamp 5) is 90%, that is, less than the statistical threshold in the anomaly detection condition, the user terminal may obtain the statistical service data 4 (e.g., 96%) corresponding to timestamp 4, the statistical service data 3 (e.g., 97%) corresponding to timestamp 3, and the statistical service data 2 (e.g., 95%) corresponding to timestamp 2, and may further determine the average value of the statistical service data corresponding to these 3 timestamps and use it as the average statistical service data (e.g., 96%). Further, the user terminal may determine that the difference between the average statistical service data and the statistical service data 5 is 6%, that is, greater than the second difference threshold; at this time, the user terminal may determine that the statistical service data 5 is the target statistical service data, and that timestamp 5 is the target image acquisition timestamp.
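Both anomaly detection conditions amount to a drop test against either the previous sample or a short moving average. The following sketch (threshold names are illustrative) mirrors the 95% statistical threshold and 5% difference threshold used in the examples:

```python
STAT_THRESHOLD = 0.95   # statistical threshold from the examples above
DIFF_THRESHOLD = 0.05   # first/second difference threshold

def is_anomalous_vs_previous(curr, prev):
    """Condition 1: below the statistical threshold AND a large drop
    from the previous image acquisition timestamp."""
    return curr < STAT_THRESHOLD and (prev - curr) > DIFF_THRESHOLD

def is_anomalous_vs_average(curr, window):
    """Condition 2: below the statistical threshold AND a large drop
    from the average of the preceding timestamps."""
    avg = sum(window) / len(window)
    return curr < STAT_THRESHOLD and (avg - curr) > DIFF_THRESHOLD

# Timestamp 3 example: 96% -> 90% is a 6% drop, so it is flagged.
flagged_1 = is_anomalous_vs_previous(0.90, 0.96)
# Timestamp 5 example: the average of 95%, 97% and 96% is 96%, so 90% is flagged.
flagged_2 = is_anomalous_vs_average(0.90, [0.95, 0.97, 0.96])
```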
It should be understood that the user terminal may obtain the service data groups of the N users on the target image acquisition timestamp based on the target statistical service data, and may determine, from the N service data groups, the users whose user state is the absent state as candidate abnormal users.
For easy understanding, please refer to fig. 6, and fig. 6 is a schematic view of a scenario for determining candidate abnormal users according to an embodiment of the present application. The user terminal in the embodiment of the application may be a user terminal running an application client. The user terminal may be any one of the user terminals in the user terminal cluster shown in fig. 1, for example, the user terminal 100a.
It should be understood that the user corresponding to the user terminal (i.e., the second type of user, e.g., the teaching assistant user) may have the right to manage the first type users in the application client. In this embodiment of the application, the number of first type users may be N; taking N as 8 for example, the first type users may include a user a, a user b, a user c, a user d, a user e, a user f, a user g, and a user h.
It can be understood that the user terminal may obtain the service state diagram sent by the server corresponding to the application client. As shown in fig. 6, the service state diagram in the embodiment of the present application is determined by the server corresponding to the application client based on the K image capturing timestamps and the statistical service data on the corresponding image capturing timestamps. The server may be the server 10 corresponding to fig. 1. It can be understood that the image capturing duration formed by the K image capturing timestamps is equal to the sharing duration of the service display interface. The embodiment of the present application may take 10 image capturing timestamps as an example, and specifically may include timestamp 1, timestamp 2, timestamp 3, timestamp 4, timestamp 5, timestamp 6, timestamp 7, timestamp 8, timestamp 9, and timestamp 10. Each of the 10 image capture timestamps may correspond to a piece of statistical traffic data.
Further, the user terminal may perform anomaly analysis on the statistical service data on each image acquisition timestamp in the service state diagram shown in fig. 6, so as to obtain an anomaly analysis result. As shown in fig. 6, in the 10 statistical service data indicated by the anomaly analysis result, the user terminal may determine that the statistical service data corresponding to the timestamp 5 satisfies the anomaly detection condition, and further may use the statistical service data corresponding to the timestamp 5 as the target statistical service data, and may use the timestamp 5 as the target image acquisition timestamp.
It should be understood that the user terminal may obtain a service data group of 8 users at the time stamp 5 based on the target statistical service data. As shown in fig. 6, the service data groups of 8 users may include a service data group D15(0, t5) of user a, a service data group D25(1, t5) of user b, a service data group D35(0, t5) of user c, a service data group D45(1, t5) of user D, a service data group D55(1, t5) of user e, a service data group D65(0, t5) of user f, a service data group D75(1, t5) of user g, and a service data group D85(1, t5) of user h.
Further, the user terminal may obtain the user with the user state being the absent state from the 8 service data groups, and determine the user with the user state being the absent state as the candidate abnormal user. In other words, the user terminal may obtain the user with the state value of 0 in the service data group, and determine the user with the state value of 0 as the candidate abnormal user. For example, the candidate abnormal users determined by the user terminal may be user a, user c, and user f.
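Given the service data groups at the target image acquisition timestamp, the candidate set is simply the users whose state value is 0. A sketch using the fig. 6 values (the dictionary layout is illustrative, not from the patent):

```python
# Service data groups of the 8 users at the target timestamp t5 (fig. 6).
groups_at_t5 = {
    "user a": (0, "t5"), "user b": (1, "t5"), "user c": (0, "t5"),
    "user d": (1, "t5"), "user e": (1, "t5"), "user f": (0, "t5"),
    "user g": (1, "t5"), "user h": (1, "t5"),
}

# A candidate abnormal user is any user whose user state is the absent state.
candidates = [user for user, (state, _) in groups_at_t5.items() if state == 0]
print(candidates)  # -> ['user a', 'user c', 'user f']
```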
It should be appreciated that the user terminal may send an abnormal data extraction request to the server. The abnormal data extraction request may be used to instruct the server to obtain service assistance data corresponding to the candidate abnormal users. Further, the user terminal may receive the service assistance data returned by the server, perform secondary verification on the candidate abnormal users based on the service assistance data, perform abnormal filtering on the candidate abnormal users based on the secondary verification result, and determine the candidate abnormal users remaining after the abnormal filtering as the abnormal users.
Wherein, it can be understood that the number of candidate abnormal users may be L; l may be a positive integer less than or equal to N. The service assistance data may include the user identifications of the L candidate abnormal users and the abnormal durations of the corresponding candidate abnormal users. The abnormal duration of each candidate abnormal user can include a target image acquisition timestamp.
It should be understood that the server may return the user identifiers of the L candidate abnormal users and the abnormal durations of the corresponding candidate abnormal users to the user terminal. At this time, the user terminal may perform secondary verification on the candidate abnormal users based on the L user identifiers and the L abnormal durations to obtain a secondary verification result. If the secondary verification result indicates that an abnormal duration smaller than the abnormal duration threshold (for example, 10 seconds) exists in the L abnormal durations, the user terminal may use the user identifier of the candidate abnormal user corresponding to that abnormal duration as a user identifier to be processed, and use the candidate abnormal user corresponding to the user identifier to be processed as a user to be filtered. Further, the user terminal may remove the user identifier to be processed from the L user identifiers, use the remaining user identifiers as target user identifiers, and determine, from the L candidate abnormal users, the candidate abnormal users corresponding to the target user identifiers as the abnormal users. It can be understood that the embodiment of the present application may allow the first type of user a certain error tolerance range (i.e., the abnormal duration threshold, for example, 10 seconds); that is, if the abnormal duration of a candidate abnormal user is within the error tolerance range, the candidate abnormal user may be determined to be a normal user.
For easy understanding, please refer to fig. 7, and fig. 7 is a schematic view of a scenario for determining an abnormal user according to an embodiment of the present application. As shown in fig. 7, the terminal 700 in the embodiment of the present application may be a user terminal running an application client. The terminal 700 may be a user terminal corresponding to a second type of user in a virtual room in an application client, and the terminal 700 may be any one of the user terminals in the user terminal cluster shown in fig. 1, for example, the user terminal 100 a. The server 70 in the embodiment of the present application may be the server 10 corresponding to fig. 1.
It should be understood that the terminal 700 may send an abnormal data extraction request to the server 70. The abnormal data extraction request may be used to instruct the server 70 to obtain the service assistance data corresponding to the candidate abnormal users. The number of candidate abnormal users in the embodiment of the present application may be L, where L is a positive integer less than or equal to N. The embodiment of the application takes 3 candidate abnormal users as an example, specifically a user a, a user b, and a user c. The service assistance data may include a user identifier 1 and an abnormal duration 1 (e.g., 20 seconds) for the user a, a user identifier 2 and an abnormal duration 2 (e.g., 8 seconds) for the user b, and a user identifier 3 and an abnormal duration 3 (e.g., 15 seconds) for the user c. The abnormal duration of each candidate abnormal user may include the target image acquisition timestamp.
It should be understood that the server 70 may return the user identifications of the 3 candidate abnormal users and the abnormal durations of the corresponding candidate abnormal users to the terminal 700. At this time, the terminal 700 may perform secondary verification on the candidate abnormal user based on the 3 user identifiers and the 3 abnormal durations, so as to obtain a secondary verification result.
It can be understood that, in the secondary verification result, the abnormal duration 2 of the user b is smaller than the abnormal duration threshold (for example, 10 seconds). At this time, the terminal 700 may use the user identifier 2 of the user b, which corresponds to the abnormal duration 2 smaller than the abnormal duration threshold, as the user identifier to be processed, and use the user b corresponding to that identifier as the user to be filtered. Further, the terminal 700 may remove the user identifier to be processed from the 3 user identifiers, and use the remaining user identifiers as the target user identifiers. For example, the terminal 700 may determine the user identifier 1 and the user identifier 3 as the target user identifiers.
Further, the terminal 700 may determine, as the abnormal user, a candidate abnormal user corresponding to the target user identifier among the 3 candidate abnormal users. In other words, the user terminal may determine the user a corresponding to the user identifier 1 and the user c corresponding to the user identifier 3 as abnormal users.
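The secondary verification then amounts to dropping any candidate whose abnormal duration lies inside the error tolerance range. A sketch with the fig. 7 values (identifier strings are illustrative):

```python
DURATION_THRESHOLD = 10  # abnormal duration threshold, in seconds

# (user identifier, abnormal duration) pairs returned as service assistance data.
service_assistance = [
    ("user identifier 1", 20),  # user a
    ("user identifier 2", 8),   # user b: inside the tolerance, filtered out
    ("user identifier 3", 15),  # user c
]

# Keep only the candidates whose abnormal duration reaches the threshold.
abnormal_users = [uid for uid, duration in service_assistance
                  if duration >= DURATION_THRESHOLD]
print(abnormal_users)  # -> ['user identifier 1', 'user identifier 3']
```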
It can be understood that, when the user terminal acquires the service state diagram, the anomaly analysis module may perform anomaly analysis on the service state diagram to remind the second type user corresponding to the user terminal to pay attention to the abnormal situation. For easy understanding, please refer to fig. 8, and fig. 8 is a schematic view of a scenario for reminding a second type of user of an abnormal situation according to an embodiment of the present application. The user terminal in the embodiment of the application may be a user terminal running an application client. The user terminal may be a user terminal corresponding to a second type of user (e.g., a teaching assistant user) in a virtual room in the application client, and may be any one of the user terminals in the user terminal cluster shown in fig. 1, for example, the user terminal 100a.
As shown in fig. 8, when the user terminal in the embodiment of the application acquires the service state diagram, the service state diagram may be subjected to anomaly analysis by the anomaly analysis module, so that an abnormal user of the virtual room of the application client under the target image acquisition timestamp may be determined. It can be understood that the user terminal may generate the abnormal prompt information based on the target user identifier of the abnormal user and the abnormal duration of the abnormal user, so as to remind the second type user to pay attention to the abnormal situation.
It should be understood that, if the user i is a second type user of the virtual room in the application client, in order to facilitate the user i to observe the service state (i.e., attendance condition) of the abnormal user, the display interface of the user terminal corresponding to the user i may be switched. It can be understood that the user terminal may generate the abnormal prompt information based on the target user identifier of the abnormal user and the abnormal duration of the abnormal user, and further may output the page switching control on the service display interface based on the abnormal prompt information. At this time, the user terminal may respond to a trigger operation of the user i for the page switching control, switch the display interface of the application client from the service display interface to the state display interface, and display the service states of the N users in the virtual room on the state display interface.
For easy understanding, please refer to fig. 9, and fig. 9 is a schematic view of a scene of a switching display interface according to an embodiment of the present application. As shown in fig. 9, the user terminal in the embodiment of the present application may be a user terminal running an application client (e.g., live curriculum client). The user terminal may be a user terminal corresponding to a second type of user (e.g., an assistant user) in a virtual room in the application client, and the user terminal may be any one of the user terminals in the user terminal cluster shown in fig. 1, for example, the user terminal 100 a.
It should be understood that the virtual room of the application client in the embodiment of the present application may include N first type users (e.g., learning users), and the user (e.g., user i) corresponding to the user terminal may be any one of the N users. It is understood that the user i can participate in the online learning sharing task in the virtual room through the user terminal; that is, the user i can observe, on the service display interface 900 shown in fig. 9, the shared data shared by the teacher user in the virtual room. The service display interface 900 may output image data, courseware, blackboard writing, etc. of the teacher user.
It is understood that the user i may also be a second type user (e.g., an assistant user) of the virtual room in the application client, and in order to facilitate the user i to observe the service state (i.e., attendance condition) of the abnormal user, the display interface of the user terminal corresponding to the user i may be switched. It should be understood that the user terminal may generate the exception prompting information based on the target user identifier of the exception user and the exception duration of the exception user. Further, the user terminal may output a page switching control 90 as shown in fig. 9 on the service display interface 900 based on the abnormal prompt information. At this time, the user i may execute a trigger operation for the page switching control 90, so that the user terminal may respond to the trigger operation, switch the display interface of the application client from the service display interface 900 to the state display interface 910, and may display the service states of the N users in the virtual room on the state display interface 910.
As shown in fig. 9, a user identifier and a state value for each of the N users may be displayed in the status display interface 910. For example, if the state value of the user a is 1, it may indicate that the user a is in the online state; if the state value of the user c is 0, it may indicate that the user c is in the absent state, and so on. Of course, due to the limitation of the display size of the status display interface 910, all the first type users in the virtual room cannot be displayed at once, and the user i may perform a trigger operation (e.g., a left-slide operation or a slide-up operation) in the status display interface 910 so that the other users of the N users may be displayed. It can be understood that the user i may perform a trigger operation on the users with a state value of 0 (i.e., the absent users, for example, the user c, the user g, and the user l); further, the user terminal may send, through the server of the application client, an exception reminder to the user terminals respectively corresponding to the absent users associated with the trigger operation, so as to remind the 3 absent users to adjust the positions of their cameras or to participate in the sharing task. The exception reminder may be "Your live lesson is in progress, please enter the study room or adjust the camera position."
Optionally, the second type of user (e.g., a teaching assistant user) in this embodiment of the application may also be another person (e.g., a supervising person) who is independent of the N learning users and is responsible for attendance work. During the sharing duration in which the virtual room executes the online learning sharing task, the state display interface 910 shown in fig. 9 can be directly displayed on the display interface of the teaching assistant terminal corresponding to the teaching assistant user, so that the teaching assistant user can pay attention to the attendance condition of the first type users in the virtual room at any time.
In an embodiment of the present application, a computer device (e.g., a user terminal) running an application client (e.g., a live curriculum client) may output a service display interface corresponding to a virtual room in the application client in response to a trigger operation for the application client. If the user i performing the triggering operation belongs to a first type of user (e.g., a learning user) in the virtual room, the user terminal may invoke a camera corresponding to the application client to acquire first image data of the user i at the first image acquisition time stamp. Wherein i is a positive integer less than or equal to N, and N is the number of users belonging to the first type of user in the virtual room. Further, the user terminal may determine a service state of the user i based on the first image data, and construct a service data group (e.g., an attendance group) corresponding to the user i according to the service state of the user i and the first image acquisition timestamp. It should be understood that, since the number of bytes occupied by the service data group is much smaller than the number of bytes occupied by the image data, the user terminal can improve the efficiency of data transmission when uploading the service data group to the server. It should be understood that, at this time, since the server corresponding to the application client does not need to acquire the multiple channels of video data of the N users in the virtual room, it is naturally not necessary to synchronously issue the multiple channels of video data, and thus the bandwidth pressure during data transmission can be effectively relieved. In addition, it can be understood that, when the server obtains the service data group uploaded by the user i in the virtual room, the server may update the service data group to a service data group set associated with the first type user, so as to obtain an updated service data group set. 
It can be understood that the updated service data group set can be used to store the service data groups uploaded by the N users in the virtual room. Based on this, the server may determine, based on the updated set of service data groups, a service state diagram for performing anomaly analysis on service states of N users belonging to the first type of user (for example, a real-time attendance graph associated with the N users may be determined), and thus may improve efficiency of performing centralized management on the N users in the virtual room.
Further, please refer to fig. 10, where fig. 10 is a schematic flowchart of a service data processing method according to an embodiment of the present application. As shown in fig. 10, the method may be performed by a user terminal and a server, where the user terminal may be any one of the user terminals in the user terminal cluster shown in fig. 1, for example, the user terminal 100 a. The server may be the server 10 shown in fig. 1 described above. The method may comprise at least the following steps S201-S206:
step S201, a user terminal responds to a trigger operation aiming at an application client and outputs a service display interface corresponding to a virtual room in the application client;
step S202, if the user i executing the trigger operation belongs to a first type user in a virtual room, the user terminal calls a camera corresponding to the application client to acquire first image data of the user i at a first image acquisition time stamp;
step S203, the user terminal determines the service state of the user i based on the first image data, and constructs a service data group corresponding to the user i according to the service state of the user i and the first image acquisition timestamp;
step S204, the user terminal sends the service data group to a server corresponding to the application client;
step S205, the server updates the service data group to a service data group set associated with the first type user to obtain an updated service data group set;
step S206, the server determines a service state diagram for performing anomaly analysis on the service states of the N users belonging to the first type of user based on the service data groups of the N users in the updated service data group set.
For specific implementation of steps S201 to S206, reference may be made to the description of steps S101 to S104 in the embodiment corresponding to fig. 3, and details will not be described here.
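As a condensed, hypothetical sketch of steps S204-S206 (the data structures and function names are illustrative, not from the patent), the server side might maintain the service data group set and derive per-timestamp statistical service data as follows:

```python
from collections import defaultdict

# Step S205: the service data group set, one list of groups per user.
data_group_set = defaultdict(list)   # user id -> [(state value, timestamp), ...]

def upload_service_data_group(user_id, state_value, timestamp):
    """Steps S204/S205: receive a group from a terminal and update the set."""
    data_group_set[user_id].append((state_value, timestamp))

def statistical_service_data(timestamp):
    """Part of step S206: attendance rate for one image acquisition timestamp."""
    states = [s for groups in data_group_set.values()
              for s, t in groups if t == timestamp]
    return sum(states) / len(states)

upload_service_data_group("user a", 1, "t1")
upload_service_data_group("user b", 0, "t1")
print(statistical_service_data("t1"))  # -> 0.5
```

The key design point the patent relies on is that only these small groups, not the video streams, cross the network, so the server can recompute the state diagram cheaply.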
Further, please refer to fig. 11, where fig. 11 is a schematic structural diagram of a service data processing apparatus according to an embodiment of the present application. The service data processing apparatus may be a computer program (including program code) running in a computer device; for example, the service data processing apparatus is application software. The service data processing apparatus can be used to execute the corresponding steps in the method provided by the embodiments of the present application. As shown in fig. 11, the service data processing apparatus 1 may operate in a user terminal, which may be any one of the user terminals in the user terminal cluster in the embodiment corresponding to fig. 1, for example, the terminal 100 a. The service data processing apparatus 1 may include: a display interface output module 11, a first acquisition module 12, a data group construction module 13, a data group sending module 14, an image data output module 15, a second acquisition module 16, an image data refreshing module 17, a display sub-interface closing module 18, a timestamp acquisition module 19, a display sub-interface recovery module 20, a state diagram analysis module 21, a target timestamp determination module 22, a request sending module 23, a secondary verification module 24, an information generation module 25 and an interface switching module 26.
The display interface output module 11 is configured to respond to a trigger operation for an application client, and output a service display interface corresponding to a virtual room in the application client;
the first acquisition module 12 is configured to, if a user i performing the trigger operation belongs to a first type of user in the virtual room, invoke a camera corresponding to the application client to acquire first image data of the user i at a first image acquisition timestamp; i is a positive integer less than or equal to N, and N is the number of users belonging to the first type of user in the virtual room;
the data group construction module 13 is configured to determine a service state of the user i based on the first image data, and construct a service data group corresponding to the user i according to the service state of the user i and the first image acquisition timestamp.
Wherein, the data group construction module 13 includes: a recognition unit 131, a first conversion unit 132, a second conversion unit 133 and a data group construction unit 134.
The recognition unit 131 is configured to perform face recognition on the first image data to obtain a face recognition result;
the first conversion unit 132 is configured to, if the face recognition result indicates that the face of the user i exists in the first image data, convert the first image data including the face of the user i into a first state value; the first state value is used for representing that the user state of the user i is an online state;
the second converting unit 133 is configured to, if the face recognition result indicates that the face of the user i does not exist in the first image data, convert the first image data that does not include the face of the user i into a second state value; the second state value is used for representing that the user state of the user i is an absent state;
the data group establishing unit 134 is configured to determine an online state or an absent state as a service state of the user i, and establish a service data group corresponding to the user i according to the service state of the user i and the first image acquisition timestamp corresponding to the first image data.
For specific implementation of the recognition unit 131, the first conversion unit 132, the second conversion unit 133 and the data group construction unit 134, reference may be made to the description of step S103 in the embodiment corresponding to fig. 3, and details will not be further described here.
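As a non-authoritative sketch of what units 131 to 134 do, the following maps the output of a face recognition step (stubbed here, since no concrete recognizer is specified) to the first or second state value and pairs it with the acquisition timestamp; the stub detector and data shapes are assumptions for illustration only:

```python
FIRST_STATE_VALUE = 1   # face of user i present in the first image data -> online state
SECOND_STATE_VALUE = 0  # face of user i absent from the first image data -> absent state

def fake_face_present(image_data: bytes) -> bool:
    """Placeholder for face recognition on the first image data; a real
    implementation would run an actual detector on the decoded frame."""
    return len(image_data) > 0  # stand-in condition only

def to_service_data_group(image_data: bytes, ts: int) -> tuple:
    """Convert image data plus its acquisition timestamp into a service data group."""
    state = FIRST_STATE_VALUE if fake_face_present(image_data) else SECOND_STATE_VALUE
    return (state, ts)

# Image data with a detected face yields the online state; empty data yields absent.
assert to_service_data_group(b"\x01\x02", 100) == (1, 100)
assert to_service_data_group(b"", 101) == (0, 101)
```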
The data group sending module 14 is configured to send the service data group to a server corresponding to the application client, so that the server updates the service data group to a service data group set associated with the first type of user, and determines a service state diagram for performing anomaly analysis on service states of N users belonging to the first type of user based on the updated service data group set; the updated service data group set comprises service data groups of N users belonging to the first type of users.
The image data output module 15 is configured to output the first image data to an image display sub-interface independent of the service display interface; the image display sub-interface is an interface superposed on the service display interface, and the size of the image display sub-interface is smaller than that of the service display interface.
The image data in the image display sub-interface is used to instruct the user i to adjust the shooting angle of the camera;
the second acquisition module 16 is configured to acquire image data of the user i at a second image acquisition timestamp based on the camera with the camera angle adjusted; the second image acquisition timestamp is a next acquisition timestamp of the first image acquisition timestamp;
the image data refreshing module 17 is configured to use the image data of the user i at the second image acquisition timestamp as second image data, and refresh the image data in the image display sub-interface based on the second image data.
The display sub-interface closing module 18 is configured to record a superimposed display duration during which the image display sub-interface is superimposed on the service display interface, and close the image display sub-interface on the service data display interface when the superimposed display duration reaches a display duration threshold corresponding to the image display sub-interface.
The service display interface is a data sharing interface for sharing data among N users belonging to the first type of users;
the timestamp acquisition module 19 is configured to acquire a sharing timestamp corresponding to the data sharing interface, and use the image data of the user i acquired by the camera at the sharing timestamp as third image data; the sharing timestamp is the maximum timestamp in the sharing duration corresponding to the data sharing interface;
the display sub-interface recovery module 20 is configured to recover and display the image display sub-interface on the data sharing interface and output the third image data to the image display sub-interface if the sharing timestamp reaches the sharing period corresponding to the data sharing interface.
Wherein, if the user i belongs to a second type of user in the virtual room, the user i has the authority to manage the service states of the N users; the updated service data group set is used to instruct the server to perform data statistics on the service data groups of the N users on K image acquisition timestamps to obtain K statistical service data associated with the N users; one image acquisition timestamp corresponds to one statistical service data; the service state diagram is determined by the server based on the K image acquisition timestamps and the statistical service data on the corresponding image acquisition timestamps; the K image acquisition timestamps include the first image acquisition timestamp, and the image acquisition duration formed by the K image acquisition timestamps is equal to the sharing duration of the service display interface;
the state diagram analysis module 21 is configured to receive a service state diagram returned by the server based on the authority of the user i, and perform anomaly analysis on the statistical service data on each image acquisition timestamp in the service state diagram to obtain an anomaly analysis result;
the target timestamp determining module 22 is configured to, if the anomaly analysis result indicates that there is statistical service data meeting the anomaly detection condition in the K pieces of statistical service data, take the statistical service data meeting the anomaly detection condition as target statistical service data, and take an image acquisition timestamp corresponding to the target statistical service data as a target image acquisition timestamp;
the request sending module 23 is configured to obtain the service data groups of the N users on the target image acquisition timestamp based on the target statistical service data, determine, in the N service data groups, users whose service state is the absent state as candidate abnormal users, and send an abnormal data extraction request to the server; the abnormal data extraction request is used to instruct the server to obtain service auxiliary data corresponding to the candidate abnormal users;
the secondary verification module 24 is configured to receive the service auxiliary data returned by the server, perform secondary verification on the candidate abnormal user based on the service auxiliary data, perform exception filtering on the candidate abnormal user based on a secondary verification result, and determine the candidate abnormal user after exception filtering as an abnormal user.
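A minimal sketch of the anomaly analysis performed by the state diagram analysis module 21 and the target timestamp determination module 22 might look as follows; the concrete anomaly detection condition (here, a statistical service data value falling below an assumed 0.8 threshold) is an illustrative choice, since the embodiments do not fix one:

```python
ANOMALY_THRESHOLD = 0.8  # assumed: an attendance ratio below this meets the anomaly detection condition

def find_target_timestamps(state_diagram: dict) -> list:
    """state_diagram maps each image acquisition timestamp to its statistical
    service data (ratio of online users); return the target image acquisition
    timestamps whose value meets the anomaly detection condition."""
    return sorted(ts for ts, ratio in state_diagram.items()
                  if ratio < ANOMALY_THRESHOLD)

# Four timestamps with their statistical service data; two meet the condition.
diagram = {100: 1.0, 101: 0.75, 102: 0.9, 103: 0.5}
assert find_target_timestamps(diagram) == [101, 103]
```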
The number of the candidate abnormal users is L; l is a positive integer less than or equal to N; the service auxiliary data comprises user identifications of L candidate abnormal users and abnormal duration corresponding to the candidate abnormal users; the abnormal duration of each candidate abnormal user comprises a target image acquisition timestamp;
the secondary verification module 24 includes: an anomaly receiving unit 241, a secondary verification unit 242, a to-be-filtered user determination unit 243, and an anomalous user determination unit 244.
The exception receiving unit 241 is configured to receive the user identifiers of the L candidate abnormal users and the abnormal durations of the corresponding candidate abnormal users returned by the server;
the secondary verification unit 242 is configured to perform secondary verification on the candidate abnormal user based on the L user identifiers and the L abnormal durations to obtain a secondary verification result;
the to-be-filtered user determining unit 243 is configured to, if the secondary verification result indicates that there is an abnormal duration smaller than the abnormal duration threshold in the L abnormal durations, take the user identifier of the candidate abnormal user corresponding to the abnormal duration smaller than the abnormal duration threshold as the to-be-filtered user identifier, and take the candidate abnormal user corresponding to the to-be-filtered user identifier as the to-be-filtered user;
the abnormal user determination unit 244 is configured to remove the to-be-filtered user identifier from the L user identifiers, use the user identifiers remaining after the to-be-filtered user identifier has been removed as target user identifiers, and determine, among the L candidate abnormal users, the candidate abnormal users corresponding to the target user identifiers as abnormal users.
For specific implementation manners of the exception receiving unit 241, the secondary verification unit 242, the to-be-filtered user determination unit 243, and the abnormal user determination unit 244, reference may be made to the description of the abnormal user in the embodiment corresponding to fig. 3, and details will not be further described here.
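The secondary verification performed by units 241 to 244 can be sketched as a simple duration filter; the 60-second abnormal duration threshold and the dictionary data shape are assumptions for illustration:

```python
DURATION_THRESHOLD = 60  # seconds; assumed abnormal duration threshold

def secondary_verify(candidates: dict) -> list:
    """candidates maps a candidate abnormal user's identifier to its abnormal
    duration in seconds. Identifiers whose abnormal duration is below the
    threshold become to-be-filtered user identifiers and are removed; the
    remaining target user identifiers denote the confirmed abnormal users."""
    to_filter = {uid for uid, duration in candidates.items()
                 if duration < DURATION_THRESHOLD}
    return sorted(uid for uid in candidates if uid not in to_filter)

# u1's brief absence is filtered out; u2 and u3 are confirmed as abnormal users.
assert secondary_verify({"u1": 30, "u2": 120, "u3": 61}) == ["u2", "u3"]
```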
The information generation module 25 is configured to generate abnormality prompt information based on the target user identifier of the abnormal user and the abnormal duration of the abnormal user, and output a page switching control on the service display interface based on the abnormality prompt information;
the interface switching module 26 is configured to switch the display interface of the application client from the service display interface to a state display interface in response to a trigger operation for the page switching control, and display service states of N users in the virtual room on the state display interface.
The specific implementation manners of the display interface output module 11, the first acquisition module 12, the data group construction module 13, the data group sending module 14, the image data output module 15, the second acquisition module 16, the image data refreshing module 17, the display sub-interface closing module 18, the timestamp acquisition module 19, the display sub-interface recovery module 20, the state diagram analysis module 21, the target timestamp determination module 22, the request sending module 23, the secondary verification module 24, the information generation module 25, and the interface switching module 26 may refer to the descriptions of steps S101 to S104 in the embodiment corresponding to fig. 3, and will not be further described here. In addition, the beneficial effects of the same method are not described in detail.
Further, please refer to fig. 12, fig. 12 is a schematic diagram of a computer device according to an embodiment of the present application. As shown in fig. 12, the computer device 1000 may be the user terminal in the embodiment corresponding to fig. 1, and the computer device 1000 may include: at least one processor 1001, such as a CPU, at least one network interface 1004, a user interface 1003, a memory 1005, and at least one communication bus 1002. The communication bus 1002 is used to enable connection communication between these components. The user interface 1003 may include a Display (Display) and a Keyboard (Keyboard), and the network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory, such as at least one disk memory. The memory 1005 may optionally also be at least one storage device located remotely from the aforementioned processor 1001. As shown in fig. 12, the memory 1005, which is a kind of computer storage medium, may include an operating system, a network communication module, a user interface module, and a device control application program.
In the computer apparatus 1000 shown in fig. 12, the network interface 1004 is mainly used for network communication with a server; the user interface 1003 is an interface for providing a user with input; and the processor 1001 may be used to invoke a device control application stored in the memory 1005 to implement:
responding to the trigger operation aiming at the application client, and outputting a service display interface corresponding to a virtual room in the application client;
if the user i executing the trigger operation belongs to a first type user in the virtual room, calling a camera corresponding to the application client to acquire first image data of the user i at a first image acquisition time stamp; i is a positive integer less than or equal to N, and N is the number of users belonging to the first type of users in the virtual room;
determining the service state of the user i based on the first image data, and constructing a service data group corresponding to the user i according to the service state of the user i and the first image acquisition timestamp;
sending the service data group to a server corresponding to the application client so that the server updates the service data group to a service data group set associated with the first type of user, and determining a service state diagram for performing exception analysis on service states of N users belonging to the first type of user based on the updated service data group set; the updated service data group set comprises service data groups of N users belonging to the first type of users.
It should be understood that the computer device 1000 described in this embodiment may perform the description of the service data processing method in the embodiment corresponding to fig. 3 and fig. 10, and may also perform the description of the service data processing apparatus 1 in the embodiment corresponding to fig. 11, which is not described herein again. In addition, the beneficial effects of the same method are not described in detail.
Further, here, it is to be noted that: an embodiment of the present application further provides a computer-readable storage medium, where the computer program executed by the aforementioned service data processing apparatus 1 is stored in the computer-readable storage medium, and the computer program includes program instructions; when a processor executes the program instructions, the description of the service data processing method in the embodiment corresponding to fig. 3 or fig. 10 can be performed, so details are not repeated here. In addition, the beneficial effects of the same method are not described in detail. For technical details not disclosed in the embodiments of the computer-readable storage medium referred to in the present application, reference is made to the description of the method embodiments of the present application. As an example, the program instructions may be deployed to be executed on one computing device, or on multiple computing devices at one site, or distributed across multiple sites and interconnected by a communication network, which may comprise a blockchain system.
An aspect of the application provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instruction from the computer-readable storage medium, and the processor executes the computer instruction, so that the computer device can execute the description of the service data processing method in the embodiment corresponding to fig. 3 or fig. 10, which is not described herein again. In addition, the beneficial effects of the same method are not described in detail.
Further, please refer to fig. 13, where fig. 13 is a schematic structural diagram of a service data processing apparatus according to an embodiment of the present application. The service data processing apparatus may be a computer program (including program code) running in a computer device, for example, the service data processing apparatus is an application software; the service data processing device can be used for executing corresponding steps in the method provided by the embodiment of the application. As shown in fig. 13, the service data processing apparatus 2 may operate on a server, which may be the server 10 in the embodiment corresponding to fig. 1. The service data processing apparatus 2 may include: a data group acquisition module 100, a data set update module 200, and a state diagram determination module 300.
The data set obtaining module 100 is configured to obtain a service data set sent by a user terminal through an application client; the user terminal is a terminal corresponding to a user i in a virtual room corresponding to the application client; user i belongs to a first type of user in the virtual room; i is a positive integer less than or equal to N, and N is the number of users belonging to the first type of users in the virtual room; the service data group comprises a service state of a user i and a first image acquisition timestamp of first image data corresponding to the user i; the first image data is acquired by calling a camera corresponding to the application client by the user terminal;
the data set updating module 200 is configured to update a service data set to a service data set associated with a first type of user, so as to obtain an updated service data set; the updated service data group set comprises service data groups of N users belonging to the first type of users;
the state diagram determining module 300 is configured to determine a service state diagram for performing anomaly analysis on service states of N users belonging to a first type of user based on service data groups of N users in the updated service data group set.
Wherein the state diagram determining module 300 comprises: a data set determination unit 3010, an integration processing unit 3020, a traffic state determination unit 3030, and a state diagram determination unit 3040.
The data set determination unit 3010 is configured to obtain the service data groups of the N users in the updated service data group set on the K image acquisition timestamps, respectively, and use the obtained service data groups of each user on the K image acquisition timestamps as a user data set to be processed, respectively; the number of user data sets to be processed is equal to N;
the integration processing unit 3020 is configured to, based on the K image acquisition timestamps in each user data set to be processed, integrate the state values of the service states of the N users in the corresponding image acquisition timestamps to obtain the service data matrix R associated with the N usersm×n(ii) a Service data matrix Rm×nThe value of the matrix row number m is N, and the service data matrix Rm×nThe number n of matrix columns in (1) is K;
the service status determination unit 3030 is configured to determine the service data matrix Rm×nDetermining a service state with a non-zero state value of matrix elements on the same column as a first service state, and determining a service state with a zero state value of matrix elements on the same column as a second service state;
the state diagram determination unit 3040 is configured to determine the number of users in the first service state on the same column as a first user number, determine the number of users in the second service state on that column as a second user number, determine the ratio of the first user number to the sum of the first user number and the second user number as one of the K statistical service data associated with the N users, and determine the service state diagram corresponding to the first type of user based on the K image acquisition timestamps and the statistical service data on the corresponding image acquisition timestamps; the service state diagram is used for performing anomaly analysis on the user states of the N users.
For specific implementation of the data set determination unit 3010, the integration processing unit 3020, the service state determination unit 3030, and the state diagram determination unit 3040, reference may be made to the description of step S206 in the embodiment corresponding to fig. 10, which will not be described again here.
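The column-wise statistics performed by units 3010 to 3040 can be sketched as follows, assuming state value 1 for the online state and 0 for the absent state; plain nested lists stand in for the service data matrix Rm×n (N rows of users, K columns of image acquisition timestamps), and all names are illustrative:

```python
def statistical_service_data(matrix: list) -> list:
    """matrix: N rows (users) x K columns (image acquisition timestamps) of
    0/1 state values. For each column, count users in the first service state
    (non-zero state value) and in the second service state (zero state value),
    and return the K ratios first / (first + second)."""
    n = len(matrix)
    k = len(matrix[0])
    ratios = []
    for col in range(k):
        first = sum(1 for row in matrix if row[col] != 0)  # first service state
        second = n - first                                  # second service state
        ratios.append(first / (first + second))
    return ratios

# 3 users over 4 timestamps: the second user is absent at column indices 1 and 3.
R = [
    [1, 1, 1, 1],
    [1, 0, 1, 0],
    [1, 1, 1, 1],
]
assert statistical_service_data(R) == [1.0, 2 / 3, 1.0, 2 / 3]
```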
For specific implementation manners of the data group obtaining module 100, the data set updating module 200, and the state diagram determining module 300, reference may be made to the description of step S201 to step S206 in the embodiment corresponding to fig. 10, and details will not be further described here. In addition, the beneficial effects of the same method are not described in detail.
Further, please refer to fig. 14, fig. 14 is a schematic diagram of a computer device according to an embodiment of the present application. As shown in fig. 14, the computer device 3000 may be the server in the embodiment corresponding to fig. 1, and the computer device 3000 may include: at least one processor 3001, such as a CPU, at least one network interface 3004, a user interface 3003, a memory 3005, and at least one communication bus 3002. The communication bus 3002 is used to realize connection communication between these components. The user interface 3003 may include a Display (Display) and a Keyboard (Keyboard), and the network interface 3004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 3005 may be a high-speed RAM memory or a non-volatile memory (e.g., at least one disk memory). The memory 3005 may optionally also be at least one storage device located remotely from the aforementioned processor 3001. As shown in fig. 14, the memory 3005, which is one type of computer storage medium, may include an operating system, a network communication module, a user interface module, and a device control application program.
In the computer device 3000 shown in fig. 14, the network interface 3004 is mainly used for network communication with the user terminal; the user interface 3003 is an interface for providing a user with input; and the processor 3001 may be configured to invoke a device control application stored in the memory 3005 to implement:
acquiring a service data group sent by a user terminal through an application client; the user terminal is a terminal corresponding to a user i in a virtual room corresponding to the application client; user i belongs to a first type of user in the virtual room; i is a positive integer less than or equal to N, and N is the number of users belonging to the first type of users in the virtual room; the service data group comprises a service state of a user i and a first image acquisition timestamp of first image data corresponding to the user i; the first image data is acquired by calling a camera corresponding to the application client by the user terminal;
updating the service data group to a service data group set associated with the first type of user to obtain an updated service data group set; the updated service data group set comprises service data groups of N users belonging to the first type of users;
and determining a service state diagram for performing abnormal analysis on the service states of the N users belonging to the first type of users based on the service data groups of the N users in the updated service data group set.
It should be understood that the computer device 3000 described in this embodiment may perform the description of the service data processing method in the embodiment corresponding to fig. 10, and may also perform the description of the service data processing apparatus 2 in the embodiment corresponding to fig. 13, which is not described herein again. In addition, the beneficial effects of the same method are not described in detail.
Further, here, it is to be noted that: an embodiment of the present application further provides a computer-readable storage medium, where the computer program executed by the aforementioned service data processing apparatus 2 is stored in the computer-readable storage medium, and the computer program includes program instructions; when a processor executes the program instructions, the description of the service data processing method in the embodiment corresponding to fig. 10 can be performed, so details are not repeated here. In addition, the beneficial effects of the same method are not described in detail. For technical details not disclosed in the embodiments of the computer-readable storage medium referred to in the present application, reference is made to the description of the method embodiments of the present application. As an example, the program instructions may be deployed to be executed on one computing device, or on multiple computing devices at one site, or distributed across multiple sites and interconnected by a communication network, which may comprise a blockchain system.
An aspect of the application provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instruction from the computer-readable storage medium, and the processor executes the computer instruction, so that the computer device can execute the description of the service data processing method in the embodiment corresponding to fig. 3 or fig. 10, which is not described herein again. In addition, the beneficial effects of the same method are not described in detail.
Further, please refer to fig. 15, where fig. 15 is a schematic structural diagram of a data processing system according to an embodiment of the present application. The data processing system 3 may comprise a data processing device 1a and a data processing device 2 a. The data processing apparatus 1a may be the service data processing apparatus 1 in the embodiment corresponding to fig. 11, and it is understood that the data processing apparatus 1a may be integrated in the user terminal in the embodiment corresponding to fig. 1, and therefore, details will not be described here. The data processing apparatus 2a may be the service data processing apparatus 2 in the embodiment corresponding to fig. 13, and it is understood that the data processing apparatus 2a may be integrated in the server 10 in the embodiment corresponding to fig. 1, and therefore, the details will not be described here. In addition, the beneficial effects of the same method are not described in detail. For technical details not disclosed in the embodiments of the data processing system to which the present application relates, reference is made to the description of the embodiments of the method of the present application.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above disclosure is only for the purpose of illustrating the preferred embodiments of the present application and is not to be construed as limiting the scope of the present application, so that the present application is not limited thereto, and all equivalent variations and modifications can be made to the present application.

Claims (15)

1. A method for processing service data is characterized by comprising the following steps:
responding to a trigger operation aiming at an application client, and outputting a service display interface corresponding to a virtual room in the application client;
if the user i executing the triggering operation belongs to a first type user in the virtual room, calling a camera corresponding to the application client to acquire first image data of the user i at a first image acquisition time stamp; i is a positive integer less than or equal to N, wherein N is the number of users belonging to the first type of user in the virtual room;
determining the service state of the user i based on the first image data, and constructing a service data group corresponding to the user i according to the service state of the user i and the first image acquisition timestamp;
sending the service data group to a server corresponding to the application client, so that the server updates the service data group to a service data group set associated with the first type of user, and determining a service state diagram for performing abnormal analysis on service states of N users belonging to the first type of user based on the updated service data group set; the updated service data group set comprises the service data groups of the N users belonging to the first type of users.
2. The method of claim 1, further comprising:
outputting the first image data to an image display sub-interface independent of the service display interface; the image display sub-interface is an interface superposed on the service display interface, and the size of the image display sub-interface is smaller than that of the service display interface.
3. The method of claim 2, wherein image data in the image display sub-interface is used to instruct the user i to adjust a camera angle of the camera;
the method further comprises the following steps:
acquiring image data of the user i at a second image acquisition time stamp based on the camera with the adjusted camera angle; the second image acquisition timestamp is a next acquisition timestamp of the first image acquisition timestamp;
and taking the image data of the user i at the second image acquisition time stamp as second image data, and refreshing the image data in the image display sub-interface based on the second image data.
4. The method of claim 2, further comprising:
recording the overlapping display time length for which the image display sub-interface is superimposed on the service display interface, and closing the image display sub-interface on the service display interface when the overlapping display time length reaches a display time length threshold corresponding to the image display sub-interface.
5. The method according to claim 4, wherein the service display interface is a data sharing interface for data sharing among N users belonging to the first type of user;
the method further comprises the following steps:
acquiring a sharing timestamp corresponding to the data sharing interface, and taking the image data of the user i acquired by the camera at the sharing timestamp as third image data; the sharing timestamp is the maximum timestamp in the sharing duration corresponding to the data sharing interface;
and if the sharing timestamp reaches the sharing period corresponding to the data sharing interface, restoring display of the image display sub-interface on the data sharing interface, and outputting the third image data to the image display sub-interface.
6. The method according to any one of claims 1 to 5, wherein the determining the service state of the user i based on the first image data, and constructing the service data group corresponding to the user i according to the service state of the user i and the first image acquisition timestamp comprises:
carrying out face recognition on the first image data to obtain a face recognition result;
if the face recognition result indicates that the face of the user i exists in the first image data, converting the first image data containing the face of the user i into a first state value; the first state value is used for representing that the user state of the user i is an online state;
if the face recognition result indicates that the face of the user i does not exist in the first image data, converting the first image data which does not contain the face of the user i into a second state value; the second state value is used for representing that the user state of the user i is an absent state;
and determining the online state or the absence state as the service state of the user i, and constructing a service data group corresponding to the user i according to the service state of the user i and the first image acquisition timestamp corresponding to the first image data.
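Claim 6's conversion of a face-recognition result into a state value and then into a service data group can be sketched as below. The recognizer itself is deliberately abstracted away (any cascade or CNN detector would do); the function names and the 1/0 encoding are assumptions of this sketch.

```python
# Hypothetical encoding of claim 6's two state values.
FIRST_STATE_VALUE = 1   # online state: the face of user i is present
SECOND_STATE_VALUE = 0  # absent state: the face of user i is not present

def service_state_from_recognition(face_present: bool) -> int:
    """Convert a face-recognition result into the claimed state value."""
    return FIRST_STATE_VALUE if face_present else SECOND_STATE_VALUE

def build_data_group(user_id: int, face_present: bool, capture_ts: float) -> dict:
    """Claim 6's final step: pair the determined service state with the
    first image acquisition timestamp of the first image data."""
    return {
        "user_id": user_id,
        "service_state": service_state_from_recognition(face_present),
        "image_capture_timestamp": capture_ts,
    }
```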
7. The method according to any one of claims 1 to 5, wherein if the user i belongs to a second type of user in the virtual room, the user i has the right to manage the service states of the N users; the updated service data group set is used for instructing the server to perform data statistics on the service data groups of the N users on K image acquisition timestamps to obtain K statistical service data associated with the N users; one image acquisition timestamp corresponds to one statistical service data; the service state diagram is determined by the server based on the statistical service data on the K image acquisition timestamps and the corresponding image acquisition timestamps; the K image acquisition timestamps comprise the first image acquisition timestamp, and the image acquisition time length formed by the K image acquisition timestamps is equal to the sharing time length of the service display interface;
the method further comprises the following steps:
receiving the service state diagram returned by the server based on the authority of the user i, and performing anomaly analysis on the statistical service data on each image acquisition timestamp in the service state diagram to obtain an anomaly analysis result;
if the abnormal analysis result indicates that statistical service data meeting abnormal detection conditions exist in the K statistical service data, taking the statistical service data meeting the abnormal detection conditions as target statistical service data, and taking an image acquisition timestamp corresponding to the target statistical service data as a target image acquisition timestamp;
acquiring service data groups of N users on the target image acquisition timestamp based on the target statistical service data, determining users with user states of absence as candidate abnormal users in the N service data groups, and sending abnormal data extraction requests to the server; the abnormal data extraction request is used for indicating the server to obtain business auxiliary data corresponding to the candidate abnormal user;
and receiving the business auxiliary data returned by the server, performing secondary verification on the candidate abnormal user based on the business auxiliary data, performing abnormal filtering on the candidate abnormal user based on a secondary verification result, and determining the candidate abnormal user after abnormal filtering as an abnormal user.
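The anomaly analysis of claim 7 scans the K statistical service data for values satisfying an abnormal detection condition. The claim leaves that condition open; the sketch below assumes one plausible choice — the online ratio on a timestamp falling below a threshold — and returns the target timestamps together with their target statistical service data.

```python
def find_target_timestamps(stat_series, threshold=0.8):
    """Return (timestamp, statistic) pairs satisfying the assumed abnormal
    detection condition: statistical service data below the threshold.
    stat_series holds one statistic per image acquisition timestamp."""
    return [(ts, ratio) for ts, ratio in stat_series if ratio < threshold]

# Example service state diagram data: 4 image acquisition timestamps.
stat_series = [(1, 1.0), (2, 0.75), (3, 0.9), (4, 0.5)]
targets = find_target_timestamps(stat_series)  # → [(2, 0.75), (4, 0.5)]
```

Each returned timestamp would then be used as a target image acquisition timestamp when requesting the service auxiliary data from the server.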
8. The method of claim 7, wherein the number of candidate abnormal users is L; l is a positive integer less than or equal to N; the service auxiliary data comprises user identifications of L candidate abnormal users and abnormal duration corresponding to the candidate abnormal users; the abnormal duration of each candidate abnormal user comprises the target image acquisition timestamp;
the receiving the service auxiliary data returned by the server, performing secondary verification on the candidate abnormal user based on the service auxiliary data, performing abnormal filtering on the candidate abnormal user based on a secondary verification result, and determining the candidate abnormal user after abnormal filtering as an abnormal user includes:
receiving the user identifications of the L candidate abnormal users and the abnormal duration of the corresponding candidate abnormal users returned by the server;
performing secondary verification on the candidate abnormal user based on the L user identifications and the L abnormal durations to obtain a secondary verification result;
if the secondary verification result indicates that an abnormal time length smaller than an abnormal time length threshold exists in the L abnormal time lengths, taking the user identification of the candidate abnormal user corresponding to the abnormal time length smaller than the abnormal time length threshold as a user identification to be processed, and taking the candidate abnormal user corresponding to the user identification to be processed as a user to be filtered;
and removing the user identifier to be processed from the L user identifiers, taking the user identifier from which the user identifier to be processed is removed as a target user identifier, and determining a candidate abnormal user corresponding to the target user identifier as an abnormal user from the L candidate abnormal users.
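Claim 8's secondary verification amounts to partitioning the L candidates by their abnormal duration: candidates below the threshold are filtered out, the remainder are confirmed as abnormal users. A minimal sketch, with illustrative tuple fields standing in for the user identifications and abnormal durations:

```python
def filter_abnormal_users(candidates, duration_threshold):
    """candidates: list of (user_id, abnormal_duration_seconds) tuples.
    Returns (confirmed abnormal users, user ids filtered out)."""
    to_filter = [uid for uid, dur in candidates if dur < duration_threshold]
    confirmed = [(uid, dur) for uid, dur in candidates
                 if dur >= duration_threshold]
    return confirmed, to_filter

# A brief absence ("u1", 30 s) is treated as noise and filtered;
# longer absences are confirmed as abnormal users.
candidates = [("u1", 30), ("u2", 300), ("u3", 120)]
confirmed, filtered = filter_abnormal_users(candidates, duration_threshold=60)
# confirmed → [("u2", 300), ("u3", 120)], filtered → ["u1"]
```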
9. The method of claim 8, further comprising:
generating abnormal prompt information based on the target user identification of the abnormal user and the abnormal duration of the abnormal user, and outputting a page switching control on the service display interface based on the abnormal prompt information;
and responding to the triggering operation aiming at the page switching control, switching the display interface of the application client from the service display interface to a state display interface, and displaying the service states of the N users in the virtual room on the state display interface.
10. A method for processing service data is characterized by comprising the following steps:
acquiring a service data group sent by a user terminal through an application client; the user terminal is a terminal corresponding to a user i in a virtual room corresponding to the application client; the user i belongs to a first type of user in the virtual room; i is a positive integer less than or equal to N, and N is the number of users belonging to the first type of user in the virtual room; the service data group comprises a service state of the user i and a first image acquisition timestamp of first image data corresponding to the user i; the first image data is acquired by the user terminal calling a camera corresponding to the application client;
updating the service data group to a service data group set associated with the first type of user to obtain an updated service data group set; the updated service data group set comprises service data groups of N users belonging to the first type of users;
and determining a service state diagram for performing anomaly analysis on the service states of the N users belonging to the first type of user based on the service data groups of the N users in the updated service data group set.
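On the server side, the update step of claim 10 can be modeled minimally as keyed storage: modeling the service data group set as a mapping keyed by user and timestamp means a newly received group overwrites any stale entry for the same user and capture time. The data structure and key choice are assumptions of this sketch.

```python
def update_group_set(group_set, group):
    """Insert user i's service data group into the set associated with the
    first type of user; returns the updated service data group set."""
    key = (group["user_id"], group["image_capture_timestamp"])
    group_set[key] = group["service_state"]
    return group_set

group_set = {}
update_group_set(group_set, {"user_id": 1, "image_capture_timestamp": 10,
                             "service_state": 1})
update_group_set(group_set, {"user_id": 2, "image_capture_timestamp": 10,
                             "service_state": 0})
# group_set now holds the groups of both users on timestamp 10
```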
11. The method according to claim 10, wherein said determining a traffic state diagram for performing anomaly analysis on the traffic states of the N users belonging to the first type of user based on the traffic data groups of the N users in the updated traffic data group set comprises:
acquiring service data groups of the N users in the updated service data group set on K image acquisition timestamps respectively, and taking the acquired service data groups of each user on the K image acquisition timestamps as user data sets to be processed respectively; the number of the user data sets to be processed is equal to N;
integrating the state values of the service states of the N users on the corresponding image acquisition timestamps based on the K image acquisition timestamps in each user data set to be processed to obtain a service data matrix Rm×n associated with the N users; the value of the matrix row number m in the service data matrix Rm×n is N, and the value of the matrix column number n in the service data matrix Rm×n is K;
in the service data matrix Rm×n, determining a service state whose matrix-element state value on the same column is non-zero as a first service state, and determining a service state whose matrix-element state value on the same column is zero as a second service state;
determining the number of users in the first service state on the same column as a first user number, determining the number of users in the second service state as a second user number, taking the ratio of the first user number to the sum of the first user number and the second user number as K statistical service data associated with the N users, and determining a service state diagram corresponding to the first type of user based on the K image acquisition timestamps and the statistical service data on the corresponding image acquisition timestamps; and the service state diagram is used for carrying out abnormity analysis on the user states of the N users.
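The statistics of claim 11 reduce to a per-column ratio over the service data matrix Rm×n: rows are the N users, columns are the K image acquisition timestamps, entries are state values (non-zero for the first service state, zero for the second). A sketch in plain Python, with the 1/0 state encoding assumed:

```python
def column_online_ratios(R):
    """For each of the K columns of R, return the ratio of the first user
    number (non-zero state values) to the sum of the first and second
    user numbers -- the K statistical service data of claim 11."""
    n_users = len(R)       # matrix row number m = N
    n_stamps = len(R[0])   # matrix column number n = K
    ratios = []
    for k in range(n_stamps):
        first = sum(1 for i in range(n_users) if R[i][k] != 0)
        second = n_users - first
        ratios.append(first / (first + second))
    return ratios

# 3 users (rows) × 4 image acquisition timestamps (columns)
R = [[1, 1, 0, 1],
     [1, 0, 0, 1],
     [1, 1, 1, 1]]
ratios = column_online_ratios(R)  # → [1.0, 2/3, 1/3, 1.0]
```

Pairing each ratio with its timestamp yields the service state diagram used for the anomaly analysis of claim 7.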
12. A service data processing apparatus, comprising:
the display interface output module is used for responding to the trigger operation aiming at the application client and outputting a service display interface corresponding to a virtual room in the application client;
the first acquisition module is used for calling a camera corresponding to the application client to acquire first image data of the user i at a first image acquisition timestamp if the user i executing the trigger operation belongs to a first type of user in the virtual room; i is a positive integer less than or equal to N, wherein N is the number of users belonging to the first type of user in the virtual room;
the data group construction module is used for determining the service state of the user i based on the first image data and constructing a service data group corresponding to the user i according to the service state of the user i and the first image acquisition timestamp;
a data group sending module, configured to send the service data group to a server corresponding to the application client, so that the server updates the service data group to a service data group set associated with the first type of user, and determines, based on the updated service data group set, a service state diagram for performing anomaly analysis on service states of N users belonging to the first type of user; the updated service data group set comprises service data groups of N users belonging to the first type of users.
13. A service data processing apparatus, comprising:
the data group acquisition module is used for acquiring a service data group sent by the user terminal through the application client; the user terminal is a terminal corresponding to a user i in a virtual room corresponding to the application client; the user i belongs to a first type of user in the virtual room; i is a positive integer less than or equal to N, and N is the number of users belonging to the first type of user in the virtual room; the service data group comprises a service state of the user i and a first image acquisition timestamp of first image data corresponding to the user i; the first image data is acquired by the user terminal calling a camera corresponding to the application client;
a data set updating module, configured to update the service data set to a service data set associated with the first type of user, so as to obtain an updated service data set; the updated service data group set comprises service data groups of N users belonging to the first type of users;
and a state diagram determining module, configured to determine, based on the service data groups of the N users in the updated service data group set, a service state diagram used for performing anomaly analysis on the service states of the N users belonging to the first type user.
14. A computer device, comprising: a processor, a memory, a network interface;
the processor is connected to the memory and the network interface, wherein the network interface is configured to provide data communication functions, the memory is configured to store a computer program, and the processor is configured to call the computer program to perform the method of any one of claims 1 to 11.
15. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program comprising program instructions which, when executed by a processor, perform the method of any of claims 1-11.
CN202010727981.5A 2020-07-23 2020-07-23 Service data processing method, device, equipment and storage medium Active CN111767898B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010727981.5A CN111767898B (en) 2020-07-23 2020-07-23 Service data processing method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111767898A true CN111767898A (en) 2020-10-13
CN111767898B CN111767898B (en) 2023-11-24

Family

ID=72727424

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010727981.5A Active CN111767898B (en) 2020-07-23 2020-07-23 Service data processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111767898B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106982224A (en) * 2017-04-28 2017-07-25 南京网博计算机软件系统有限公司 The method and system of real time identity checking identification
CN107340867A (en) * 2017-07-05 2017-11-10 广东小天才科技有限公司 One kind uses data statistical approach, device, terminal device and storage medium
WO2019062620A1 (en) * 2017-09-28 2019-04-04 钉钉控股(开曼)有限公司 Attendance check method and apparatus, and attendance check device
CN110458069A (en) * 2019-08-02 2019-11-15 深圳市华方信息产业有限公司 A kind of method and system based on face recognition Added Management user's on-line study state
CN111047481A (en) * 2019-09-29 2020-04-21 云知声智能科技股份有限公司 Online learning system with supervision function

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112910880A (en) * 2021-01-28 2021-06-04 腾讯科技(深圳)有限公司 Virtual room creating method, system, device, equipment and medium
CN112910880B (en) * 2021-01-28 2022-04-12 腾讯科技(深圳)有限公司 Virtual room creating method, system, device, equipment and medium
CN114625298A (en) * 2022-03-18 2022-06-14 北京有竹居网络技术有限公司 Online learning method and device, computer equipment and storage medium


Similar Documents

Publication Publication Date Title
CN109766859B (en) Campus monitoring method, device, equipment and storage medium based on micro-expressions
CN111767898B (en) Service data processing method, device, equipment and storage medium
CN107292441B (en) Operation and maintenance cooperation system
CN106789593B (en) A kind of instant message processing method, server and system merging sign language
CN112565914B (en) Video display method, device and system for online classroom and storage medium
CN113055697A (en) Online education live broadcast system and method based on cloud computing and big data technology
KR102187741B1 (en) Metadata crowd sourcing system and method
CN110609970B (en) User identity identification method and device, storage medium and electronic equipment
CN111970471B (en) Conference participant scoring method, device, equipment and medium based on video conference
CN112866619A (en) Teleconference control method and device, electronic equipment and storage medium
CN103646314A (en) Team safety activity management system control method based on web site
CN111131757A (en) Video conference display method, device and storage medium
KR101562012B1 (en) System and method providing military training mode using smart device
CN113573025A (en) Monitoring video viewing method and device, terminal equipment and storage medium
CN106992971B (en) Interactive terminal switching method and device and interactive recording and broadcasting system
CN112040277B (en) Video-based data processing method and device, computer and readable storage medium
CN114401386A (en) Two-to-many remote e-interrogation system and method for intelligent public security
CN111327943B (en) Information management method, device, system, computer equipment and storage medium
CN111901351A (en) Remote teaching system, method and device and voice gateway router
CN112437244A (en) Service recovery method, device, terminal equipment and storage medium
CN112132079A (en) Method, device and system for monitoring students in online teaching
CN111414838A (en) Attention detection method, device, system, terminal and storage medium
CN111695459B (en) State information prompting method and related equipment
CN220273729U (en) Wireless desktop delivery system for video conference place
CN115733998B (en) Live content transmission method based on online course live broadcast system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40031422

Country of ref document: HK

GR01 Patent grant