CN112861591A - Interactive identification method, interactive identification system, computer equipment and storage medium - Google Patents

Interactive identification method, interactive identification system, computer equipment and storage medium Download PDF

Info

Publication number
CN112861591A
CN112861591A (Application CN201911191660.1A)
Authority
CN
China
Prior art keywords
preset
posture
participant
gesture
current
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911191660.1A
Other languages
Chinese (zh)
Inventor
刘鹏宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN201911191660.1A
Publication of CN112861591A
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 - Movements or behaviour, e.g. gesture recognition
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 - Services
    • G06Q50/20 - Education
    • G06Q50/205 - Education administration or guidance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/35 - Categorising the entire scene, e.g. birthday party or wedding scene
    • G06V20/36 - Indoor scenes

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Tourism & Hospitality (AREA)
  • Strategic Management (AREA)
  • Health & Medical Sciences (AREA)
  • Educational Technology (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Educational Administration (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Human Computer Interaction (AREA)
  • Social Psychology (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an interactive identification method, an interactive identification system, computer equipment and a storage medium. The interactive identification method comprises the following steps: controlling an image acquisition device to acquire scene images of the activity room in real time according to a preset frequency within a preset time; carrying out posture recognition on each participant in the scene image, judging whether each participant makes at least one of a plurality of preset postures, and if so, storing the posture information of that posture; and if the posture information meets a preset condition, judging whether the participant making the current posture has a preset interaction behavior, and if so, recording the interaction behavior. The embodiments provided by the invention can identify the interaction behavior of each participant in a classroom or a conference in real time, so that the activeness of each participant and the interactivity of the classroom or conference can be effectively evaluated.

Description

Interactive identification method, interactive identification system, computer equipment and storage medium
Technical Field
The invention relates to the technical field of gesture recognition, in particular to an interactive recognition method, an interactive recognition system, computer equipment and a storage medium based on gesture recognition.
Background
In traditional classroom teaching, the analysis and observation of students' classroom interaction behavior relies mostly on the instructor's subjective evaluation, so the results cannot accurately reflect the real classroom interaction, and compiling interaction statistics consumes a great deal of the instructor's attention. To address these problems, education whiteboard systems aim to count and analyze student attention, interactivity and other classroom behaviors; however, most existing education whiteboard products detect student interaction behavior through simple action detection, and the diversity of students' classroom actions, as well as false actions, can greatly affect the accuracy and reliability of the statistical results.
Disclosure of Invention
In order to solve at least one of the above problems, a first embodiment of the present invention provides an interactive recognition method based on gesture recognition, including:
controlling an image acquisition device to acquire scene images of the activity room in real time according to a preset frequency within a preset time;
carrying out gesture recognition on each participant in the scene image, judging whether each participant makes at least one of a plurality of preset gestures, and if so, storing gesture information of the gesture;
and if the posture information meets the preset condition, judging whether the participant making the current posture has a preset interaction behavior, and if so, recording the interaction behavior.
Further, if the gesture satisfies a preset condition, determining whether a preset interaction behavior exists among the participants making the current gesture, and if so, recording the interaction behavior further includes:
and if the current posture is the first preset posture, controlling a display device to display a preset manual posture request, responding to the operation of a first preset participant to determine whether the participant making the current posture has a preset interaction behavior, and if so, recording the interaction behavior.
Further, if the gesture satisfies a preset condition, determining whether a preset interaction behavior exists among the participants making the current gesture, and if so, recording the interaction behavior further includes:
if the current posture is the first preset posture, judging, according to the posture information stored before the occurrence time of the current posture, whether the participant making the current posture has a preset interaction behavior, and if so, recording the interaction behavior.
Further, the attitude information includes: the gesture type, the gesture occurrence time, and the position information of the participant who made the gesture;
if the current posture is the first preset posture, judging, according to the posture information stored before the occurrence time of the current posture, whether the participant making the current posture has a preset interaction behavior, and if so, recording the interaction behavior further comprises:
if the current posture is a first preset posture, judging whether a previous posture exists before the posture occurrence time of the participant who makes the current posture and whether the previous posture is a second preset posture,
if the previous gesture is a second preset gesture, judging whether the time difference between the gesture occurrence time of the previous gesture and the gesture occurrence time of the current gesture meets a first time threshold value, if so, judging that the participant in the current gesture has a preset interaction behavior, and recording the interaction behavior;
otherwise, whether the first preset participant makes a third preset posture is judged.
Further, the posture information of the first preset participant further includes a posture direction;
otherwise, judging whether the first preset participant makes a third preset posture further comprises:
judging whether the first preset participant has a previous gesture before the gesture occurrence time of the current gesture and whether the previous gesture of the first preset participant is a third preset gesture,
if the previous posture of the first preset participant is a third preset posture, judging whether the time difference between the posture occurrence time of the previous posture of the first preset participant and the posture occurrence time of the current posture meets a second time threshold, if so, judging whether the position information of the participant making the current posture accords with the posture direction of the previous posture of the first preset participant, and if so, judging that the participant in the current posture has a preset interaction behavior and recording the interaction behavior;
otherwise, controlling a display device to display a preset manual confirmation request, responding to the operation of the first preset participant to determine whether the participant making the current posture has a preset interaction behavior, and if so, recording the interaction behavior.
Further, before the controlling of the image acquisition device to acquire scene images of the activity room in real time according to a preset frequency within a preset time, the interactive recognition method further includes:
and judging whether position binding is finished, if not, controlling a display device to display a preset manual binding request, binding the identity information and the position information of each participant in response to the operation of a first preset participant, and confirming that each participant is located at a preset position.
Further, before the judging whether position binding is completed and, if not, controlling the display device to display a preset manual binding request, binding the identity information and the position information of each participant in response to the operation of the first preset participant, and confirming that each participant is located at the preset position, the interactive recognition method further includes:
and judging whether the position setting is finished or not, if not, controlling a display device to display a preset manual setting request, and setting the position of the activity room in response to the operation of a first preset participant.
Further, the activity room is a classroom, the first preset participant is a teacher, the other participants are students, the first preset posture is a student standing up, the second preset posture is a student raising a hand, the third preset posture is the teacher reaching out a hand to call on a student, and the interaction behavior is answering a question;
or
The activity room is a conference room, the first preset participant is a host, the other participants are attendees, the first preset posture is an attendee standing up, the second preset posture is an attendee raising a hand, the third preset posture is the host reaching out a hand to call on an attendee, and the interaction behavior is discussing a question.
The second embodiment of the invention provides an interactive recognition system based on gesture recognition, which comprises a controller and an image acquisition device, wherein
The image acquisition device is used for acquiring scene images of the activity rooms;
the controller is used for carrying out gesture recognition on each participant in the scene image, recording gesture information, judging whether the current gesture meets a preset condition, if so, judging whether the participant in the current gesture has a preset interaction behavior, and if so, recording the interaction behavior.
Further, the system also comprises a display device which is used for displaying a plurality of preset manual requests and displaying the posture information of each posture.
A third embodiment of the invention provides a computer-readable storage medium, on which a computer program is stored which, when executed by a processor, implements the method of the first embodiment.
A fourth embodiment of the present invention provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes the computer program to implement the method according to the first embodiment.
The invention has the following beneficial effects:
aiming at the existing problems, the interactive recognition method, the interactive recognition system, the computer equipment and the storage medium are formulated, one embodiment of the application identifies the interactive behavior in the scene by carrying out gesture recognition on the scene image in the activity room acquired by the image acquisition device in real time through the controller so as to record the interactive behavior in the scene, so that the activity of each participant in the scene and the interactivity in the scene can be counted, analyzed and evaluated conveniently, and meanwhile, the display function and manual operation are provided through the education whiteboard so as to further improve the accuracy and reliability of the interactive recognition method, so that the problems in the prior art are solved, and the interactive recognition system has a wide application prospect.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flow chart illustrating an interactive recognition method according to an embodiment of the present invention;
FIG. 2 shows a schematic view of a teaching whiteboard according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an interactive recognition system according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a computer device according to another embodiment of the present invention.
Detailed Description
In order to more clearly illustrate the invention, the invention is further described below with reference to preferred embodiments and the accompanying drawings. Similar parts in the figures are denoted by the same reference numerals. It is to be understood by persons skilled in the art that the following detailed description is illustrative and not restrictive, and is not to be taken as limiting the scope of the invention.
As shown in fig. 1, an embodiment of the present invention provides an interactive recognition method based on gesture recognition, including: controlling an image acquisition device to acquire scene images of the activity room in real time according to a preset frequency within a preset time; carrying out gesture recognition on each participant in the scene image, judging whether each participant makes at least one of a plurality of preset gestures, and if so, storing gesture information of the gesture; and if the posture information meets the preset condition, judging whether the participant making the current posture has a preset interaction behavior, and if so, recording the interaction behavior.
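Purely for orientation, the three steps above can be pictured as the following pipeline sketch; it is not an implementation taken from the disclosure, and every class and function name in it is an assumed placeholder.

```python
def interaction_recognition_loop(camera, recognizer, store):
    """Top-level sketch of the three steps: capture, posture recognition, interaction judgment."""
    for frame in camera.frames():                              # step 1: real-time scene images of the activity room
        for participant, posture in recognizer.detect(frame):  # step 2: posture recognition per participant
            if posture is not None:                            # the participant made one of the preset postures
                store.save(participant, posture)               # store posture type, time, position (and direction)
                if store.meets_preset_condition(posture):      # step 3: e.g. a student stood up
                    if store.has_interaction(participant, posture):
                        store.record_interaction(participant, posture)
```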
In a specific example, the interaction behavior between a teacher and students during a lesson in a classroom is used for explanation: after the teacher asks a question, postures such as a student raising a hand, the teacher reaching out a hand to call on a student, and a student standing up are recognized, and it is judged whether a question-answering interaction has taken place between that student and the teacher.
Specifically, the method comprises the following steps:
first, in response to the control of the controller, the image capturing device captures scene images in the classroom in real time at a predetermined frequency and transmits the scene images to the controller.
In the classroom interaction scene of this embodiment, the interval between the class start time and the class end time is taken as the interaction identification period, and within this preset time the controller controls the image acquisition device to acquire scene images in the classroom in real time at a frequency of one image every 0.1 s. In this embodiment, the controller and the image acquisition device may be an integrated structure or separate structures; the image acquisition device may be any device with an image capture function, such as a camera, an infrared camera, or a depth camera; and the present application does not limit the number of image acquisition devices, provided that all students and the teacher in the classroom can be captured.
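A minimal sketch of this timed acquisition loop, assuming hypothetical camera and controller objects (none of these names appear in the disclosure), could be:

```python
import time

CAPTURE_INTERVAL_S = 0.1  # one image every 0.1 s, as described in this embodiment

def run_capture(camera, controller, class_start, class_end):
    """Capture classroom scene images between class start and class end and pass them to the controller."""
    # Wait for the start of the preset interaction-identification period (the class period).
    while time.time() < class_start:
        time.sleep(0.5)
    # Acquire frames at the preset frequency until the class ends.
    while time.time() < class_end:
        frame = camera.read()             # the camera may be an ordinary, infrared, or depth camera
        controller.process_frame(frame)   # the controller performs portrait and posture recognition
        time.sleep(CAPTURE_INTERVAL_S)
```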
Secondly, the controller performs image processing on the scene images, identifies the students and the teacher in the images based on a portrait recognition technology, performs posture recognition on them according to the preset postures, and stores the recognized postures.
In the classroom interaction scene of this embodiment, the student hand-raising posture, the teacher reach-out roll-call posture and the student standing-up posture are preset according to the question-answering interaction behavior. For example, "hand-raising" posture recognition is performed on the students, and when a student is recognized as raising a hand, the posture is recorded, specifically including the posture type (hand raising), the posture occurrence time, and the seat of the student making the posture. For another example, "reach-out roll-call" posture recognition is performed on the teacher, and when the teacher is recognized as reaching out a hand to call on a student, the posture is recorded; here the posture information further includes a posture direction, i.e. the posture type (reach-out roll call), the posture occurrence time, the position of the teacher, and the posture direction (the direction in which the teacher's arm points in the "reach-out roll-call" posture). "Standing-up" posture recognition is also performed on the students, and when a student is recognized as standing up, the posture is recorded, specifically including the posture type (standing up), the posture occurrence time, and the seat of the student making the posture.
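A minimal sketch of such a posture record, assuming a Python dataclass representation (the field names are illustrative and not taken from the disclosure), could look as follows:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class PostureRecord:
    posture_type: str                  # e.g. "raise_hand", "reach_out_roll_call", "stand_up"
    occurrence_time: float             # time at which the posture was recognized
    position: Tuple[int, int]          # seat of the student (row, column), or the teacher's location
    direction: Optional[float] = None  # pointing direction, only for the teacher's roll-call posture
```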
In order to improve the accuracy with which the controller recognizes the postures of the students and the teacher, in an optional embodiment, before the lesson begins, the controller judges whether the position setting of the activity room has been completed, and if not, controls the display device to display a manual setting request and sets the positions of the activity room in response to an operation by the first preset participant.
In this embodiment, before the lesson, the controller judges whether the positions in the classroom, for example the position of the lecture platform and the positions of the student seats, have been set in advance, and if not, displays a manual setting request through the display device to prompt the teacher to set the position information of the classroom. In this embodiment, the display device may be a device installed in the classroom with a display function, such as a teaching whiteboard, which can display courseware information and prompt information and can also display the posture information of each recognized posture; for example, when the hand-raising postures of several students are recognized, the position information of those students is displayed on the teaching whiteboard. As shown in fig. 2, the teaching whiteboard 1 may further integrate an image capturing device 2 and a controller (not shown in the figure), as well as a courseware display area 3 and a posture information display area 4. In this implementation, the controller prompts the teacher, by displaying prompt information in the courseware display area, to set the platform position, the student seats and the like in the classroom, so that the controller can subsequently recognize the scene images in the classroom more easily, thereby improving the accuracy of posture recognition for each student and the teacher.
To further improve the accuracy of the controller's posture recognition for each student and the teacher, in another alternative embodiment, before the course starts, the interactive recognition method further includes: judging whether position binding has been completed, and if not, controlling the display device to display a preset manual binding request, binding the identity information and the position information of each participant in response to an operation by the first preset participant, and confirming that each participant is located at the preset position.
In this embodiment, before the course starts, the controller further determines whether position binding has been completed, that is, whether the student information, the teacher information and the specific position information have been bound, and if not, displays a manual binding request through the display device to prompt the teacher to input the teacher information and the student information for the course. Binding the student information with the specific position information means binding the identity information of a student, such as the student's name and student number, with the seat the student occupies; for example, a student surnamed Wang, with student number 2, sits in row three, column two.
Before the course starts, the controller judges whether the positions have been bound. If the binding has not been completed, the teacher is prompted to perform it, for example by binding the students attending the course to the seats they occupy, or by modifying the original binding information, so that the students in attendance are confirmed to be seated; students who are on leave or absent do not need to be bound. This further strengthens the controller's posture recognition for the attending students based on valid positions, and thus further improves recognition accuracy; at the same time, the position binding result of this step can serve as the attendance record for the students.
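As an illustration only, the seat binding and the attendance record it yields could be represented as follows; the class and field names are assumed for the sketch and are not part of the disclosure:

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class SeatBinding:
    seat: Tuple[int, int]  # (row, column) of the seat, e.g. (3, 2)
    name: str              # student's name, e.g. "Wang"
    student_id: str        # student number, e.g. "2"

def bind_positions(bindings: List[SeatBinding]) -> Tuple[Dict[Tuple[int, int], SeatBinding], List[str]]:
    """Build the seat-to-student map; seats of absent or on-leave students are simply not bound."""
    seat_map = {b.seat: b for b in bindings}
    attendance = [b.student_id for b in bindings]  # the binding result can double as the attendance record
    return seat_map, attendance
```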
Finally, the controller judges the current posture; when the current posture is a student standing up, it judges whether the standing student has completed a question-answering interaction, and if so, records the interaction behavior.
This embodiment performs posture recognition on the scene images according to the preset postures and records the posture information of the teacher and of each student in the classroom respectively. After a student is recognized as making the "standing up" posture, it is judged whether this standing student has completed a question-answering interaction, and if so, the interaction behavior is recorded, so as to facilitate counting, analyzing and evaluating the activeness of each participant and the interactivity of the classroom or conference.
In the classroom interaction scene of this embodiment, considering that many students may raise their hands after the teacher asks a question, and that the teacher may then reach out to call on one of the hand-raising students or on a student who did not raise a hand, the student "standing up" posture is taken as the basis for judging that a student has completed one question-answering interaction.
Specifically, in an optional embodiment, if the current posture is a first preset posture, the display device is controlled to display a preset manual posture request, and whether a preset interaction behavior exists in the participant who makes the current posture is determined in response to the operation of the first preset participant, and if the preset interaction behavior exists, the interaction behavior is recorded.
In this embodiment, in order to avoid recognition errors, after a student stands up to answer a question, the controller controls the teaching whiteboard to display a posture confirmation request to prompt the teacher, and the teacher manually confirms whether the standing student has completed a question-answering interaction.
Although manual confirmation by the teacher can guarantee the accuracy of interaction recognition, it frequently interrupts the teacher's lecture. Therefore, in an optional embodiment, if the current posture is the first preset posture, whether the participant making the current posture has a preset interaction behavior is judged according to the posture information stored before the occurrence time of the current posture, and if so, the interaction behavior is recorded.
In this embodiment, since the image acquisition device collects the scene images in the classroom in real time, the controller recognizes the students' "hand-raising" postures, the teacher's "reach-out roll-call" postures and the students' "standing-up" postures from the scene images. The controller therefore judges comprehensively whether the standing student has completed a question-answering interaction according to whether that student made a "hand-raising" posture before standing up and whether the teacher made a "reach-out roll-call" posture before the student stood up.
In an optional embodiment, the method specifically includes: the posture information includes the posture type, the posture occurrence time, and the position information of the participant who made the posture. If the current posture is the first preset posture, judging, according to the posture information stored before the occurrence time of the current posture, whether the participant making the current posture has a preset interaction behavior, and if so, recording the interaction behavior, further includes: if the current posture is the first preset posture, judging whether the participant making the current posture has a previous posture before the occurrence time of the current posture and whether that previous posture is the second preset posture; if the previous posture is the second preset posture, judging whether the time difference between the occurrence time of the previous posture and the occurrence time of the current posture meets a first time threshold, and if so, judging that the participant making the current posture has a preset interaction behavior and recording the interaction behavior; otherwise, judging whether the first preset participant has made the third preset posture.
In this embodiment, the controller queries the identity information of the standing student according to the student's position information, and queries, according to that identity information, whether the student made a hand-raising posture before standing up. If a hand-raising posture exists, the controller determines whether the difference between the occurrence time of the hand-raising posture and the occurrence time of the current standing-up posture is less than 30 seconds; if it is, the hand-raising posture and the standing-up posture are judged to be successive postures of one interaction behavior, so the student is judged to have completed a question-answering interaction, and the controller records the interaction behavior in order to evaluate the student's participation in the course and the interaction situation of the course.
Otherwise, if the time difference between the earlier hand-raising posture and the later standing-up posture is greater than 30 seconds, the two postures are judged to be independent of each other and cannot be used to decide whether the student has completed a question-answering interaction, and the teacher's posture before the student stood up needs to be examined instead.
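The check described in this embodiment could be sketched as follows, assuming the PostureRecord representation from the earlier sketch and a per-student posture history; the 30-second value corresponds to the first time threshold of this embodiment, and all names are illustrative:

```python
FIRST_TIME_THRESHOLD_S = 30  # the "first time threshold" used in this embodiment

def raised_hand_before_standing(stand, history):
    """Return True if the standing student's most recent earlier posture is a hand raise
    made no more than FIRST_TIME_THRESHOLD_S seconds before the standing-up posture."""
    # history: this student's earlier PostureRecord objects, ordered oldest to newest (assumed)
    for prev in reversed(history):
        if prev.occurrence_time >= stand.occurrence_time:
            continue  # skip records that are not strictly earlier than the standing-up posture
        if prev.posture_type == "raise_hand":
            return (stand.occurrence_time - prev.occurrence_time) <= FIRST_TIME_THRESHOLD_S
        return False  # the most recent earlier posture is not a hand raise
    return False
```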
In an optional embodiment, the posture information of the first preset participant further includes a posture direction. It is judged whether the first preset participant has a previous posture before the occurrence time of the current posture and whether that previous posture is the third preset posture. If the previous posture of the first preset participant is the third preset posture, it is judged whether the time difference between the occurrence time of that previous posture and the occurrence time of the current posture meets a second time threshold; if so, it is judged whether the position information of the participant making the current posture matches the posture direction of the previous posture of the first preset participant, and if it matches, the participant making the current posture is judged to have a preset interaction behavior and the interaction behavior is recorded; otherwise, the display device is controlled to display a preset manual confirmation request, whether the participant making the current posture has a preset interaction behavior is determined in response to the operation of the first preset participant, and if so, the interaction behavior is recorded.
In this embodiment, it is first determined whether the teacher made a "reach-out roll-call" posture before the student stood up. If so, the controller judges whether the difference between the occurrence time of the "reach-out roll-call" posture and the occurrence time of the current "standing-up" posture is less than 10 seconds. If it is, the teacher's "reach-out roll-call" posture and the student's current "standing-up" posture are regarded as successive postures of one interaction behavior, and at the same time the posture direction in the posture information of the teacher's "reach-out roll-call" posture is examined, i.e. whether the direction in which the teacher reached out to call on someone matches the position of the standing student. If it matches, the student is judged to have completed a question-answering interaction, and the controller records the interaction behavior in order to evaluate the student's participation in the course and the interaction situation of the course.
Otherwise, if more than 10 seconds elapsed between the teacher's earlier "reach-out roll-call" posture and the student's later "standing-up" posture, or if the posture direction of the teacher's "reach-out roll-call" posture does not match the position of the standing student, whether the currently standing student has completed a question-answering interaction cannot be judged from these two postures. The controller then controls the teaching whiteboard to display a manual confirmation request, prompting the teacher to manually confirm whether the currently standing student has completed a question-answering interaction.
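A corresponding sketch of the teacher-side check, again under the assumed PostureRecord representation, with the 10-second second time threshold and a hypothetical direction_matches helper, could be:

```python
SECOND_TIME_THRESHOLD_S = 10  # the "second time threshold" used in this embodiment

def roll_call_matches_standing(stand, teacher_history, direction_matches):
    """Return "interaction" if the teacher's most recent earlier posture is a reach-out roll call
    within the threshold whose direction matches the standing student's seat; otherwise "manual"."""
    # direction_matches(direction, seat) is an assumed helper that compares the teacher's
    # pointing direction with the standing student's position.
    for prev in reversed(teacher_history):
        if prev.occurrence_time >= stand.occurrence_time:
            continue
        if prev.posture_type == "reach_out_roll_call":
            in_time = (stand.occurrence_time - prev.occurrence_time) <= SECOND_TIME_THRESHOLD_S
            if in_time and direction_matches(prev.direction, stand.position):
                return "interaction"  # record one question-answering interaction
        break
    # No usable roll-call posture: fall back to a manual confirmation request on the whiteboard.
    return "manual"
```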
It should be noted that this embodiment is only intended to explain one specific implementation of the present application and does not limit its application scenarios. The present application can also be applied to other scenarios with interaction behavior; for example, the activity room may be a conference room, the first preset participant a host, the other participants attendees, the first preset posture an attendee standing up, the second preset posture an attendee raising a hand, the third preset posture the host reaching out a hand to call on an attendee, and the interaction behavior discussing a question. The details are similar to the above embodiments and are not repeated here.
Corresponding to the interactive recognition method provided in the foregoing embodiments, an embodiment of the present application further provides an interactive recognition system based on posture recognition. Since the interactive recognition system provided in this embodiment corresponds to the interactive recognition methods provided in the foregoing embodiments, the foregoing implementations also apply to the interactive recognition system provided in this embodiment, and they are not described in detail here.
As shown in fig. 3, an embodiment of the present application further provides an interactive recognition system based on gesture recognition, which includes a controller and an image capturing device, wherein the image capturing device is configured to capture a scene image of an activity room to which the image capturing device belongs; the controller is used for carrying out gesture recognition on each participant in the scene image, recording gesture information, judging whether the current gesture meets a preset condition, if so, judging whether the participant in the current gesture has a preset interaction behavior, and if so, recording the interaction behavior.
In this embodiment, the controller identifies each participant in the scene images acquired by the image acquisition device and performs posture recognition on each participant, so as to determine, from the postures meeting the preset condition, whether each participant has an interaction behavior. If so, the interaction behavior is recorded to facilitate counting, analyzing and evaluating the interactivity and the participation of each participant in the application scene; otherwise, the controller continues recognition until the activity in the application scene ends.
In an optional embodiment, the interactive recognition system further includes a display device for displaying a plurality of preset manual requests and displaying the posture information of each posture.
In the present embodiment, the display device is a teaching whiteboard. As shown in fig. 2, the teaching whiteboard 1 may further integrate an image capturing device 2 and a controller (not shown in the figure), as well as a courseware display area 3 and a posture information display area 4. The teaching whiteboard can be used to display courseware information and the preset manual requests, such as the "manual confirmation request", "manual binding request" and "manual setting request" prompt information in the above embodiments, and it can also display the posture information of each recognized posture; for example, if a student's hand-raising posture is recognized, the position information and identity information of that student are displayed on the teaching whiteboard.
Another embodiment of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements: controlling an image acquisition device to acquire scene images of the activity room in real time according to a preset frequency within a preset time; carrying out gesture recognition on each participant in the scene image, judging whether each participant makes at least one of a plurality of preset gestures, and if so, storing gesture information of the gesture; and if the posture information meets the preset condition, judging whether the participant making the current posture has a preset interaction behavior, and if so, recording the interaction behavior.
In practice, the computer-readable storage medium may take any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present embodiment, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
As shown in fig. 4, another embodiment of the present invention provides a schematic structural diagram of a computer device. The computer device 12 shown in FIG. 4 is only one example and should not bring any limitations to the functionality or scope of use of embodiments of the present invention.
As shown in FIG. 4, computer device 12 is in the form of a general purpose computing device. The components of computer device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Computer device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM)30 and/or cache memory 32. Computer device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 4, and commonly referred to as a "hard drive"). Although not shown in FIG. 4, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Computer device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with computer device 12, and/or with any devices (e.g., network card, modem, etc.) that enable computer device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, computer device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via network adapter 20. As shown in FIG. 4, the network adapter 20 communicates with the other modules of the computer device 12 via the bus 18. It should be appreciated that although not shown in FIG. 4, other hardware and/or software modules may be used in conjunction with computer device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processor unit 16 executes various functional applications and data processing by running programs stored in the system memory 28, for example, implementing an interactive recognition method based on gesture recognition provided by the embodiment of the present invention.
In view of the existing problems, the present invention provides an interactive recognition method, an interactive recognition system, computer equipment and a storage medium. In one embodiment of the present application, the controller performs posture recognition in real time on the scene images of the activity room acquired by the image acquisition device so as to identify and record the interaction behavior in the scene, which makes it convenient to count, analyze and evaluate the activeness of each participant in the scene and the interactivity of the classroom or conference. At the same time, the education whiteboard provides a display function and manual operation to further improve the accuracy and reliability of the interactive recognition method, thereby remedying the problems in the prior art; the method therefore has broad application prospects.
It should be understood that the above-mentioned embodiments of the present invention are only examples for clearly illustrating the present invention and are not intended to limit its embodiments. Other variations or modifications will be apparent to those skilled in the art on the basis of the above description; it is not possible to exhaust all embodiments here, and all obvious variations or modifications derived therefrom are intended to fall within the scope of the present invention.

Claims (12)

1. An interactive recognition method based on gesture recognition is characterized by comprising the following steps:
controlling an image acquisition device to acquire scene images of the activity room in real time according to a preset frequency within a preset time;
carrying out gesture recognition on each participant in the scene image, judging whether each participant makes at least one of a plurality of preset gestures, and if so, storing gesture information of the gesture;
and if the posture information meets the preset condition, judging whether the participant making the current posture has a preset interaction behavior, and if so, recording the interaction behavior.
2. The interactive recognition method of claim 1, wherein if the gesture satisfies a preset condition, determining whether a participant making the current gesture has a preset interactive behavior, and if so, recording the interactive behavior further comprises:
and if the current posture is the first preset posture, controlling a display device to display a preset manual posture request, responding to the operation of a first preset participant to determine whether the participant making the current posture has a preset interaction behavior, and if so, recording the interaction behavior.
3. The interactive recognition method of claim 1, wherein if the gesture satisfies a preset condition, determining whether a participant making the current gesture has a preset interactive behavior, and if so, recording the interactive behavior further comprises:
if the current posture is the first preset posture, judging, according to the posture information stored before the occurrence time of the current posture, whether the participant making the current posture has a preset interaction behavior, and if so, recording the interaction behavior.
4. The interactive recognition method of claim 3, wherein the pose information comprises: the gesture type, the gesture occurrence time, and the position information of the participant who made the gesture;
if the current posture is the first preset posture, judging, according to the posture information stored before the occurrence time of the current posture, whether the participant making the current posture has a preset interaction behavior, and if so, recording the interaction behavior further comprises:
if the current posture is a first preset posture, judging whether a previous posture exists before the posture occurrence time of the participant who makes the current posture and whether the previous posture is a second preset posture,
if the previous gesture is a second preset gesture, judging whether the time difference between the gesture occurrence time of the previous gesture and the gesture occurrence time of the current gesture meets a first time threshold value, if so, judging that the participant in the current gesture has a preset interaction behavior, and recording the interaction behavior;
otherwise, whether the first preset participant makes a third preset posture is judged.
5. The interactive recognition method of claim 4, wherein the posture information of the first preset participant further includes a posture direction;
otherwise, judging whether the first preset participant makes a third preset posture further comprises:
judging whether the first preset participant has a previous gesture before the gesture occurrence time of the current gesture and whether the previous gesture of the first preset participant is a third preset gesture,
if the previous posture of the first preset participant is a third preset posture, judging whether the time difference between the posture occurrence time of the previous posture of the first preset participant and the posture occurrence time of the current posture meets a second time threshold, if so, judging whether the position information of the participant making the current posture accords with the posture direction of the previous posture of the first preset participant, and if so, judging that the participant in the current posture has a preset interaction behavior and recording the interaction behavior;
otherwise, controlling a display device to display a preset manual confirmation request, responding to the operation of the first preset participant to determine whether the participant making the current posture has a preset interaction behavior, and if so, recording the interaction behavior.
6. The interactive recognition method according to any one of claims 1 to 5, wherein, before the controlling of the image acquisition device to acquire scene images of the activity room in real time according to a preset frequency within a preset time, the interactive recognition method further comprises:
and judging whether position binding is finished, if not, controlling a display device to display a preset manual binding request, binding the identity information and the position information of each participant in response to the operation of a first preset participant, and confirming that each participant is located at a preset position.
7. The interactive recognition method of claim 6, wherein, before the judging whether position binding is completed and, if not, controlling a display device to display a preset manual binding request, binding the identity information and the position information of each participant in response to an operation of a first preset participant, and confirming that each participant is located at a preset position, the interactive recognition method further comprises:
and judging whether the position setting is finished or not, if not, controlling a display device to display a preset manual setting request, and setting the position of the activity room in response to the operation of a first preset participant.
8. The interactive recognition method of claim 4,
the activity room is a classroom, the first preset participant is a teacher, the other participants are students, the first preset posture is a student standing up, the second preset posture is a student raising a hand, the third preset posture is the teacher reaching out a hand to call on a student, and the interaction behavior is answering a question;
or
The activity room is a conference room, the first preset participant is a host, the other participants are attendees, the first preset posture is an attendee standing up, the second preset posture is an attendee raising a hand, the third preset posture is the host reaching out a hand to call on an attendee, and the interaction behavior is discussing a question.
9. An interactive recognition system based on gesture recognition is characterized by comprising a controller and an image acquisition device, wherein
The image acquisition device is used for acquiring scene images of the activity rooms;
the controller is used for carrying out gesture recognition on each participant in the scene image, recording gesture information, judging whether the current gesture meets a preset condition, if so, judging whether the participant in the current gesture has a preset interaction behavior, and if so, recording the interaction behavior.
10. The interactive recognition system of claim 9, further comprising a display device for displaying a plurality of preset manual requests and displaying pose information for each of the poses.
11. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-8.
12. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1-8 when executing the program.
CN201911191660.1A 2019-11-28 2019-11-28 Interactive identification method, interactive identification system, computer equipment and storage medium Pending CN112861591A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911191660.1A CN112861591A (en) 2019-11-28 2019-11-28 Interactive identification method, interactive identification system, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911191660.1A CN112861591A (en) 2019-11-28 2019-11-28 Interactive identification method, interactive identification system, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112861591A true CN112861591A (en) 2021-05-28

Family

ID=75995601

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911191660.1A Pending CN112861591A (en) 2019-11-28 2019-11-28 Interactive identification method, interactive identification system, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112861591A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113591642A (en) * 2021-07-20 2021-11-02 广州市奥威亚电子科技有限公司 Classroom personnel posture judgment method and device
CN114067451A (en) * 2021-11-30 2022-02-18 浙江大华技术股份有限公司 Roll call method, roll call device, storage medium and electronic device
CN114397959A (en) * 2021-12-13 2022-04-26 北京大麦文化传播有限公司 Interactive prompting method, device and equipment

Similar Documents

Publication Publication Date Title
CN111079113B (en) Teaching system with artificial intelligent control and use method thereof
US20190340944A1 (en) Multimedia Interactive Teaching System and Method
CN112861591A (en) Interactive identification method, interactive identification system, computer equipment and storage medium
CN107909022B (en) Video processing method and device, terminal equipment and storage medium
CN111368808A (en) Method, device and system for acquiring answer data and teaching equipment
CN112652200A (en) Man-machine interaction system, man-machine interaction method, server, interaction control device and storage medium
CN114885216B (en) Problem pushing method, system, electronic equipment and storage medium
CN112287767A (en) Interaction control method, device, storage medium and electronic equipment
CN112055257B (en) Video classroom interaction method, device, equipment and storage medium
US20150301726A1 (en) Systems and Methods for Displaying Free-Form Drawing on a Contact-Sensitive Display
CN109934150B (en) Conference participation degree identification method, device, server and storage medium
KR20220032460A (en) Remote lecturing method and system
CN111489595B (en) Method and device for feeding back test information in live broadcast teaching process
CN110378261B (en) Student identification method and device
CN110675669A (en) Lesson recording method
CN111050111A (en) Online interactive learning communication platform and learning device thereof
CN110796577A (en) Information display method based on intelligent class board
CN114095747B (en) Live broadcast interaction system and method
CN115631501A (en) Intelligent blackboard writing cloud calling method based on circular screen teaching space
US20220013025A1 (en) Systems and methods for providing a dialog assessment platform
KR20160020924A (en) Estimation system for multimedia learning
CN112270264A (en) Multi-party interactive teaching system
CN113570227A (en) Online education quality evaluation method, system, terminal and storage medium
KR20140110557A (en) E-Learning system using image feedback
CN114120729B (en) Live teaching system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination