CN109240492A - Method for controlling a studio packaging and commentary system through gesture recognition - Google Patents

Method for controlling a studio packaging and commentary system through gesture recognition

Info

Publication number
CN109240492A
CN109240492A (application CN201810956270.8A)
Authority
CN
China
Prior art keywords
gesture
studio
packaging
client
identification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810956270.8A
Other languages
Chinese (zh)
Inventor
朱祝华
王骏
朱俊
李天明
李青
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ANHUI LTECH Co Ltd
Original Assignee
ANHUI LTECH Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ANHUI LTECH Co Ltd filed Critical ANHUI LTECH Co Ltd
Priority to CN201810956270.8A priority Critical patent/CN109240492A/en
Publication of CN109240492A publication Critical patent/CN109240492A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20: Movements or behaviour, e.g. gesture recognition
    • G06V40/28: Recognition of hand or arm movements, e.g. recognition of deaf sign language

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method of controlling a studio packaging and commentary system through gesture recognition, which addresses the technical problems in current studio and commentary systems that the host is constrained, the operation looks awkward on camera, and quasi-live broadcasting is not supported. The method includes the following steps: collecting the palm-center joint data of the human body in real time to obtain the real-time motion trajectory of the palm center; fitting a straight line or conic curve to the points of the motion trajectory to recognize the gesture behavior; sending the recognized gesture behavior to the studio packaging client and the commentary system client in a standard gesture protocol format; and having each client independently recognize and respond to the different gestures according to its own needs. The invention allows the host to control the studio packaging, commentary, or third-party system directly through gestures, which increases the immersion and impact of a program, gives the host greater freedom of movement and a better on-screen appearance, and raises the audience's viewing experience to a new level.

Description

Method for controlling a studio packaging and commentary system through gesture recognition
Technical field
The present invention relates to the field of gesture recognition, and in particular to a method for controlling a studio packaging and commentary system through gesture recognition.
Background art
In the broadcasting industry, a host frequently needs to interact with the broadcast system during a live program to control its flow, but cannot step away from the camera to do so. The traditional solution is to control the program flow with a remote control. However, holding a remote control restricts the host's body movements, looks awkward on camera, and to some extent degrades the audience's viewing experience. The traditional approach is therefore not very friendly, and a solution that lets the program flow be controlled remotely through the air is needed.
This is especially true now that AR/VR is spreading widely; military-affairs programs and weather forecasts are the clearest examples, where background pictures and text alone no longer satisfy the audience's sense of visual beauty and immersion, so an integrated, professional solution is needed. For example, in a military-affairs program, when the host introduces the equipment operated by both sides and mentions that a J-15 has been destroyed, the host points into the air and a three-dimensional animation of the destroyed J-15 immediately roars past; in a weather forecast, when severe convective weather is expected somewhere, the host snaps his fingers and a three-dimensional animation of torrential rain and hail follows. It is easy to imagine that such effects not only make the program format novel, but also delight the audience and make them prefer this kind of program.
At present, two approaches are mainly used on the market to achieve these effects, but each has its limitations:
1. The host carries a dedicated remote control and, when mentioning a given point, presses a key on it.
As described above, holding a remote control restricts the host's body movements, looks awkward, and to some extent degrades the audience's viewing experience; this form is also unsuitable for AR/VR programs.
2. Non-linear editing/studio software is used to edit animation effects so that a specified animation appears in a specified time period. This approach suits recorded programs, but during live or quasi-live broadcasting its fault tolerance is very poor: if some other picture happens to be on air at the specified time the result is very awkward, and it offers none of the advantages of motion-based control.
Summary of the invention
A method for controlling a studio packaging and commentary system through gesture recognition proposed by the present invention solves the technical problems in current studio and commentary systems that the host is constrained, the operation looks awkward, and quasi-live broadcasting is not supported.
To achieve the above objective, the invention adopts the following technical solution:
A method for controlling a studio packaging and commentary system through gesture recognition includes the following steps:
S10: collect the palm-center joint data of the human body in real time and obtain the real-time motion trajectory of the palm center;
S20: using kinematic algorithms, fit a straight line or conic curve to the points of the motion trajectory, and complete the recognition of the gesture behavior according to the fitted straight line or curve;
S30: send the recognized gesture behavior to the studio packaging client and the commentary system client in a standard gesture protocol format;
S40: the studio packaging client and the commentary system client each independently recognize the received standard gesture protocol format and respond to the different gestures according to their own needs.
In another aspect, a device for controlling a studio packaging and commentary system through gesture recognition specifically includes an acquisition device for collecting the palm-center joint data of the human body in real time and obtaining the real-time motion trajectory of the palm center;
a recognition and sending device for using kinematic algorithms to fit a straight line or conic curve to the points of the motion trajectory, completing the recognition of the gesture behavior according to the fitted straight line or curve, and sending the recognized gesture behavior to the studio packaging client and the commentary system client in a standard gesture protocol format;
and a client device, specifically including the studio packaging client and the commentary system client, for receiving the standard gesture protocol format and independently recognizing and responding to the different gestures according to their own needs.
Further, the acquisition device uses a Microsoft Kinect.
As can be seen from the above technical solution, the invention discloses a method for controlling a studio packaging and commentary system through gesture recognition. A Microsoft Kinect is used as the data acquisition terminal to collect the joint data of the human body (mainly the palm-center position) in real time, and the user's gesture behavior is judged from the real-time motion trajectory of the palm center. Using the relevant theory of kinematics, a straight line or conic curve is fitted to the trajectory points, and the gesture is recognized from the fitted straight line or curve. The server side receives the gesture behavior recognized by the recognition side and sends it out in a standardized, unified gesture message format. Each client receives the standard gesture protocol format sent by the server side, independently recognizes it, and responds to the different gestures according to its own needs. The invention frees the host from the restrictions of a remote control or tablet computer and allows the studio packaging, commentary, or third-party system to be controlled directly by gestures. It is particularly suitable for AR/VR systems: combined with corresponding three-dimensional animation effects, it can increase the immersion and impact of a program, give the host greater freedom of movement and a better on-screen appearance, and raise the audience's viewing experience to a new level.
Brief description of the drawings
Fig. 1 is a schematic flow diagram of the method of the invention;
Fig. 2 is a flow chart of the method of the invention.
Specific embodiment
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, rather than all of them.
In the method described in this embodiment for controlling a studio packaging and commentary system through gesture recognition, the gesture trajectory points obtained from the sensor device in every frame are analyzed to recognize a specific gesture. The method first enters a "start gesture" recognition state. While in this state, if the "start gesture" is recognized, the subsequent trajectory points are stored immediately, and once the number of stored trajectory points reaches the required amount, gesture analysis of the stored points begins. The analysis lasts for a period of time (for example 2 seconds); if a specific gesture behavior is recognized during this period, or the recognition time expires, the method immediately enters a waiting state. The waiting state also lasts for a period of time (for example 1 second, rather than 0 seconds); no gesture trajectory analysis is performed while waiting, and when the wait ends the method returns to the "start gesture" recognition state.
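A minimal sketch of this recognition state machine is given below; the state names, timing constants, and the two helper functions are hypothetical placeholders for the recognizers described in this embodiment, not definitions taken from the patent:

```python
import time
from enum import Enum

def recognize_start_gesture(point):
    """Placeholder: detects the 'start gesture' from incoming samples."""
    return False

def classify_gesture(points):
    """Placeholder for the straight-line/conic fitting classifier of step S20;
    returns a gesture name, or None if nothing has been recognized yet."""
    return None

class State(Enum):
    WAIT_START = 1   # waiting for the "start gesture"
    COLLECT = 2      # storing subsequent trajectory points
    ANALYZE = 3      # analyzing the stored points for a specific gesture
    COOLDOWN = 4     # waiting period in which no trajectory analysis is done

class GestureStateMachine:
    def __init__(self, points_needed=30, analyze_timeout=2.0, cooldown=1.0):
        self.state = State.WAIT_START
        self.points = []
        self.points_needed = points_needed
        self.analyze_timeout = analyze_timeout   # e.g. 2 seconds of analysis
        self.cooldown = cooldown                 # e.g. 1 second of waiting
        self.t0 = 0.0

    def on_frame(self, palm_point):
        """Feed one per-frame palm-center sample; returns a gesture or None."""
        now = time.time()
        if self.state == State.WAIT_START:
            if recognize_start_gesture(palm_point):
                self.points = []
                self.state = State.COLLECT
        elif self.state == State.COLLECT:
            self.points.append(palm_point)
            if len(self.points) >= self.points_needed:
                self.state, self.t0 = State.ANALYZE, now
        elif self.state == State.ANALYZE:
            self.points.append(palm_point)
            gesture = classify_gesture(self.points)
            if gesture is not None or now - self.t0 > self.analyze_timeout:
                self.state, self.t0 = State.COOLDOWN, now
                return gesture       # may be None if the analysis timed out
        elif self.state == State.COOLDOWN:
            if now - self.t0 > self.cooldown:
                self.state = State.WAIT_START
        return None
```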
As shown in Figs. 1 and 2, the method described in this embodiment for controlling a studio packaging and commentary system through gesture recognition includes the following steps:
S10: collect the palm-center joint data of the human body in real time and obtain the real-time motion trajectory of the palm center;
This specific embodiment uses a Microsoft Kinect as the data acquisition terminal to collect the joint data of the human body (mainly the palm-center position) in real time, and judges the user's gesture behavior from the real-time motion trajectory of the palm center.
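A minimal sketch of this acquisition step, in which per-frame palm-center samples are buffered into a trajectory, is shown below; `read_palm_center` is a hypothetical stand-in for whatever Kinect SDK call returns the tracked hand/palm joint position and is not an actual API of the SDK:

```python
from collections import deque

TRAJECTORY_LEN = 60  # keep roughly the last two seconds of samples at 30 fps

def read_palm_center():
    """Hypothetical placeholder for the Kinect SDK call that returns the
    tracked palm-center joint as an (x, y, z) tuple in meters, or None
    when no body is currently tracked."""
    return None

trajectory = deque(maxlen=TRAJECTORY_LEN)

def acquisition_step():
    """Called once per sensor frame: append the newest palm-center sample."""
    sample = read_palm_center()
    if sample is not None:
        trajectory.append(sample)
```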
The Kinect is a motion-sensing peripheral formally announced by Microsoft at the E3 exhibition on June 2, 2009.
Having been on the market for many years, the Microsoft Kinect has long been the first device mentioned in the field of skeleton detection, and its stability, availability, and reliability have been extensively verified.
The Kinect device uses TOF technology to measure depth. TOF is the abbreviation of Time of Flight: the sensor emits modulated near-infrared light, which is reflected back when it strikes an object, and the sensor converts the time difference or phase difference between emission and reflection into the distance of the photographed scene, thereby producing depth information. Combined with images from a conventional camera, the three-dimensional contour of an object can then be displayed as a topographic map in which different colors represent different distances. The measurement range reaches several meters with a precision better than 2 cm, and the update rate is 30-60 fps.
S20: using kinematic algorithms, fit a straight line or conic curve to the points of the motion trajectory, and complete the recognition of the gesture behavior according to the fitted straight line or curve;
Specifically, the relevant algorithms are as follows:
Least-squares straight-line fitting
Straight-line fitting is the most basic and most commonly used case of curve fitting. Suppose the functional relation between x and y is y = a + bx, with two undetermined parameters: a represents the intercept and b the slope. For N groups of data (x_i, y_i), i = 1, 2, ..., N, obtained from measurements of equal precision, the values x_i are regarded as exact and all errors are attributed to y_i. Below, the least-squares method is used to fit the observed data to a straight line.
When the parameters are estimated by least squares, the weighted sum of squared deviations of the observations y_i is required to be minimal. For a straight-line fit to observations of equal precision, the quantity to be minimized is
$$S(a,b)=\sum_{i=1}^{N}\left(y_i-a-bx_i\right)^2 .$$
Taking the partial derivatives of the above expression with respect to a and b and setting them to zero,
$$\frac{\partial S}{\partial a}=-2\sum_{i=1}^{N}\left(y_i-a-bx_i\right)=0,\qquad \frac{\partial S}{\partial b}=-2\sum_{i=1}^{N}x_i\left(y_i-a-bx_i\right)=0,$$
gives, after rearrangement, the system of normal equations
$$\begin{cases}\;Na+b\sum_{i=1}^{N}x_i=\sum_{i=1}^{N}y_i\\[4pt]\;a\sum_{i=1}^{N}x_i+b\sum_{i=1}^{N}x_i^{2}=\sum_{i=1}^{N}x_iy_i\end{cases}$$
Solving this system yields the best estimates of the straight-line parameters a and b.
Correlation coefficient r: in addition to a and b, least-squares processing usually also gives the correlation coefficient r, defined as
$$r=\frac{\sum_{i=1}^{N}(x_i-\bar{x})(y_i-\bar{y})}{\sqrt{\sum_{i=1}^{N}(x_i-\bar{x})^{2}\,\sum_{i=1}^{N}(y_i-\bar{y})^{2}}},$$
where $\bar{x}$ and $\bar{y}$ are the mean values of x_i and y_i.
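A minimal numerical sketch of this straight-line fit, applied for example to the x and y coordinates of the stored palm-trajectory points, follows; the sample points are fabricated purely for illustration, and thresholding the fitted slope b to distinguish horizontal from vertical swipes is one possible use, not something prescribed by the patent:

```python
import numpy as np

def fit_line(x, y):
    """Least-squares fit of y = a + b*x via the normal equations above;
    returns the intercept a, the slope b, and the correlation coefficient r."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n = len(x)
    sx, sy = x.sum(), y.sum()
    sxx, sxy = (x * x).sum(), (x * y).sum()
    a, b = np.linalg.solve([[n, sx], [sx, sxx]], [sy, sxy])
    r = np.corrcoef(x, y)[0, 1]
    return a, b, r

# Fabricated palm-center samples of a roughly horizontal swipe (meters).
x = [0.00, 0.05, 0.11, 0.16, 0.22, 0.27]
y = [0.50, 0.51, 0.49, 0.50, 0.52, 0.51]
a, b, r = fit_line(x, y)
print(a, b, r)   # a near 0.5, b near 0: the motion is almost purely horizontal
```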
Least-squares polynomial curve fitting
Given m points, the fitted curve is not required to pass exactly through them; instead, an approximating curve y = φ(x) to the curve y = f(x) is sought.
Given the data points p_i(x_i, y_i), i = 1, 2, ..., m, find the approximating curve y = φ(x) such that its deviation from y = f(x) is minimal, where the deviation of the approximating curve at the point p_i is δ_i = φ(x_i) − y_i, i = 1, 2, ..., m.
Common curve-fitting criteria are:
minimizing the sum of the absolute values of the deviations;
minimizing the maximum absolute deviation;
minimizing the sum of the squared deviations.
Choosing the fitted curve by the principle of the smallest sum of squared deviations, with a polynomial taken as the fitted curve, is the method referred to as the least-squares method.
Derivation:
Let the fitting polynomial be
$$y=a_0+a_1x+\cdots+a_kx^{k}.$$
The sum of the distances from each point to this curve, i.e. the sum of squared deviations, is
$$R^{2}=\sum_{i=1}^{m}\left[y_i-\left(a_0+a_1x_i+\cdots+a_kx_i^{k}\right)\right]^{2}.$$
1. To obtain the coefficient values a_j that satisfy the condition, take the partial derivative of the right-hand side with respect to each a_j and set it to zero:
$$\frac{\partial R^{2}}{\partial a_j}=-2\sum_{i=1}^{m}x_i^{\,j}\left[y_i-\left(a_0+a_1x_i+\cdots+a_kx_i^{k}\right)\right]=0,\qquad j=0,1,\ldots,k.$$
2. Simplifying the left-hand side of each equation gives the normal equations
$$a_0\sum_{i=1}^{m}x_i^{\,j}+a_1\sum_{i=1}^{m}x_i^{\,j+1}+\cdots+a_k\sum_{i=1}^{m}x_i^{\,j+k}=\sum_{i=1}^{m}x_i^{\,j}y_i,\qquad j=0,1,\ldots,k.$$
3. Writing these equations in matrix form gives
$$\begin{pmatrix} m & \sum x_i & \cdots & \sum x_i^{k}\\ \sum x_i & \sum x_i^{2} & \cdots & \sum x_i^{k+1}\\ \vdots & \vdots & & \vdots \\ \sum x_i^{k} & \sum x_i^{k+1} & \cdots & \sum x_i^{2k}\end{pmatrix}\begin{pmatrix}a_0\\a_1\\\vdots\\a_k\end{pmatrix}=\begin{pmatrix}\sum y_i\\\sum x_iy_i\\\vdots\\\sum x_i^{k}y_i\end{pmatrix}.$$
4. After simplification this can be written with the Vandermonde matrix
$$X=\begin{pmatrix}1&x_1&\cdots&x_1^{k}\\1&x_2&\cdots&x_2^{k}\\\vdots&\vdots&&\vdots\\1&x_m&\cdots&x_m^{k}\end{pmatrix},\quad A=\begin{pmatrix}a_0\\a_1\\\vdots\\a_k\end{pmatrix},\quad Y=\begin{pmatrix}y_1\\y_2\\\vdots\\y_m\end{pmatrix}$$
as $X^{\mathsf T}XA=X^{\mathsf T}Y$.
5. That is, for the system $XA=Y$ the least-squares solution is $A=(X^{\mathsf T}X)^{-1}X^{\mathsf T}Y$, which gives the coefficient matrix A and, with it, the fitted curve.
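A minimal sketch of this polynomial fit using the Vandermonde matrix and the expression A = (X^T X)^(-1) X^T Y derived above, solving the normal equations directly rather than forming an explicit inverse (in production code numpy.polyfit would normally be preferred for numerical stability):

```python
import numpy as np

def polyfit_normal_equations(x, y, k):
    """Fit y ≈ a0 + a1*x + ... + ak*x^k by solving (X^T X) A = X^T Y,
    where X is the Vandermonde matrix of the sample points."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    X = np.vander(x, k + 1, increasing=True)   # columns: 1, x, x^2, ..., x^k
    A = np.linalg.solve(X.T @ X, X.T @ y)
    return A

# Check against a known quadratic y = 1 + 2x + 3x^2 (fabricated sample points).
x = np.linspace(-1.0, 1.0, 7)
y = 1 + 2 * x + 3 * x ** 2
print(polyfit_normal_equations(x, y, 2))   # approximately [1. 2. 3.]
```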
S30: send the recognized gesture behavior to the studio packaging client and the commentary system client in a standard gesture protocol format;
S40: the studio packaging client and the commentary system client each independently recognize the received standard gesture protocol format and respond to the different gestures according to their own needs.
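The patent does not specify the wire format of the "standard gesture protocol". The sketch below assumes, purely for illustration, a small JSON message sent over UDP to both clients; all field names, addresses, and port numbers are hypothetical:

```python
import json
import socket
import time

def send_gesture(gesture_name, clients):
    """Send one recognized gesture to the studio-packaging and commentary
    clients in a single shared (assumed JSON-over-UDP) message format."""
    message = json.dumps({
        "type": "gesture",          # hypothetical field names
        "name": gesture_name,       # e.g. "swipe_left", "swipe_up", "snap"
        "timestamp": time.time(),
    }).encode("utf-8")
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for host, port in clients:
            sock.sendto(message, (host, port))
    finally:
        sock.close()

# Each client parses the same message and maps it to its own action, e.g. the
# commentary client might advance to the next on-screen item on "swipe_left".
send_gesture("swipe_left", [("192.0.2.10", 9100), ("192.0.2.11", 9100)])
```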
In another aspect, an embodiment of the present invention also discloses a device for controlling a studio packaging and commentary system through gesture recognition, including:
an acquisition device for collecting the palm-center joint data of the human body in real time and obtaining the real-time motion trajectory of the palm center;
a recognition device for using kinematic algorithms to fit a straight line or conic curve to the points of the motion trajectory and completing the recognition of the gesture behavior according to the fitted straight line or curve;
a sending device for sending the recognized gesture behavior to the studio packaging client and the commentary system client in a standard gesture protocol format;
and a client device, specifically including the studio packaging client and the commentary system client, for receiving the standard gesture protocol format and independently recognizing and responding to the different gestures according to their own needs.
Specifically, the acquisition device uses a Microsoft Kinect.
In conclusion by adopting the above-described technical solution, the beneficial effect of the embodiment of the present invention is:
1, the animation effect of studio packaging can enrich form of programs, also improve according to different definition of gesture Technology sense and feeling of immersion;
2, in program comment, host can throw away remote controler, by upper stroke/lower stroke/it is left draw/right draw wait gesture operations, control Single appearance/disappearance/next/switching in large screen processed etc., rich in expressive force, the following sense;
3, meet AR/VR form of programs, because this is following trend;
4, media Policies for development is melted in the acceleration for meeting General Bureau of Radio, Film and Television's requirement.
The above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical solutions recorded in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (3)

1. A method for controlling a studio packaging and commentary system through gesture recognition, characterized by comprising the following steps:
S10: collecting the palm-center joint data of the human body in real time and obtaining the real-time motion trajectory of the palm center;
S20: using kinematic algorithms, fitting a straight line or conic curve to the points of the motion trajectory, and completing the recognition of the gesture behavior according to the fitted straight line or curve;
S30: sending the recognized gesture behavior to a studio packaging client and a commentary system client in a standard gesture protocol format;
S40: the studio packaging client and the commentary system client each independently recognizing the received standard gesture protocol format and responding to the different gestures according to their own needs.
2. A device for controlling a studio packaging and commentary system through gesture recognition, characterized by comprising:
an acquisition device for collecting the palm-center joint data of the human body in real time and obtaining the real-time motion trajectory of the palm center;
a recognition device for using kinematic algorithms to fit a straight line or conic curve to the points of the motion trajectory and completing the recognition of the gesture behavior according to the fitted straight line or curve;
a sending device for sending the recognized gesture behavior to a studio packaging client and a commentary system client in a standard gesture protocol format;
and a client device, specifically including the studio packaging client and the commentary system client, for receiving the standard gesture protocol format and independently recognizing and responding to the different gestures according to their own needs.
3. The device for controlling a studio packaging and commentary system through gesture recognition according to claim 2, characterized in that the acquisition device uses a Microsoft Kinect.
CN201810956270.8A 2018-08-21 2018-08-21 Method for controlling a studio packaging and commentary system through gesture recognition Pending CN109240492A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810956270.8A CN109240492A (en) 2018-08-21 2018-08-21 Method for controlling a studio packaging and commentary system through gesture recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810956270.8A CN109240492A (en) 2018-08-21 2018-08-21 Method for controlling a studio packaging and commentary system through gesture recognition

Publications (1)

Publication Number Publication Date
CN109240492A true CN109240492A (en) 2019-01-18

Family

ID=65070253

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810956270.8A Pending CN109240492A (en) Method for controlling a studio packaging and commentary system through gesture recognition

Country Status (1)

Country Link
CN (1) CN109240492A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202150897U (en) * 2011-06-10 2012-02-22 苏州美娱网络科技有限公司 Motion-sensing controlled game television set
US20150228200A1 (en) * 2012-03-21 2015-08-13 Gaijin Entertainment Corporation System and method for simulated aircraft control through desired direction of flight
CN103020648A (en) * 2013-01-09 2013-04-03 北京东方艾迪普科技发展有限公司 Method and device for identifying action types, and method and device for broadcasting programs
CN106909216A (en) * 2017-01-05 2017-06-30 华南理工大学 Humanoid manipulator control method based on a Kinect sensor
CN107229921A (en) * 2017-06-09 2017-10-03 济南大学 Dynamic gesture identification method based on Hausdorff distances

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109831706A (en) * 2019-02-13 2019-05-31 安徽励图信息科技股份有限公司 A method for controlling TV subtitle broadcasting based on Microsoft Excel software
CN110123258A (en) * 2019-03-29 2019-08-16 深圳和而泰家居在线网络科技有限公司 Optotype recognition method and apparatus, vision testing device, and computer storage medium
CN110880719A (en) * 2019-11-30 2020-03-13 国网河南省电力公司孟州市供电公司 Handle device capable of simulating multiple cable head manufacturing tools
CN114845148A (en) * 2022-04-29 2022-08-02 深圳迪乐普数码科技有限公司 Interaction control method and device for host to virtual object in virtual studio
CN114845148B (en) * 2022-04-29 2024-05-03 深圳迪乐普数码科技有限公司 Interaction control method and device for host in virtual studio to virtual object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20190118)