CN108549484A - Man-machine interaction method and device based on human body dynamic posture - Google Patents
- Publication number: CN108549484A (application CN201810301243.7A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
Abstract
This disclosure relates to the field of artificial intelligence. To address the low detection rate of human body dynamic postures in existing human-computer interaction, the embodiments of the disclosure provide a human-computer interaction method and device based on human body dynamic posture. The device includes a receiving module for receiving the human body dynamic posture detected in a detection zone, and an interactive module for judging, according to the user's performance of the human body dynamic posture, whether the human-computer interaction succeeds, while obtaining picture interaction content displayed and imaged with a preset effect.
Description
Technical field
This disclosure relates to the field of artificial intelligence, and in particular to a human-computer interaction method and device based on human body dynamic posture.
Background technology
In existing electronic mobile terminals, the detection involved in human-computer interaction suffers from latency and errors, so interaction information cannot be detected in a timely and effective manner, which degrades the user experience. Human body dynamic postures are particularly difficult to detect during interaction, because both the spatial attitude of the posture and its completion rate must be considered; the interaction terminal must detect and compute both attributes accurately for the whole interaction process to be completed efficiently.
Summary of the invention
The embodiment of the present disclosure provides a kind of man-machine interaction method and device based on human body dynamic posture.
In a first aspect, an embodiment of the disclosure provides a human-computer interaction method based on human body dynamic posture, including the following steps: receiving the human body dynamic posture detected in a detection zone; and judging, according to the user's performance of the human body dynamic posture, whether the human-computer interaction succeeds, while obtaining picture interaction content displayed and imaged with a preset effect.
In a second aspect, an embodiment of the disclosure provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the steps of the above method are implemented.
In a third aspect, an embodiment of the disclosure provides a computer device including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the steps of the above method are implemented.
In a fourth aspect, an embodiment of the disclosure provides a human-computer interaction device based on human body dynamic posture, including: a receiving module for receiving the human body dynamic posture detected in a detection zone; and an interactive module for judging, according to the user's performance of the human body dynamic posture, whether the human-computer interaction succeeds, while obtaining picture interaction content displayed and imaged with a preset effect.
It is to be understood that both the foregoing general description and the following detailed description are illustrative and are intended to provide further explanation of the claimed technology.
Description of the drawings
To illustrate the technical solutions of the embodiments of the disclosure more clearly, the drawings needed in the description of the embodiments are briefly introduced below:
Fig. 1 is a hardware structure diagram of a terminal device according to an embodiment of the disclosure;
Fig. 2 is a structural schematic diagram of the human-computer interaction device based on human body dynamic posture according to the first embodiment of the disclosure;
Fig. 3 is a workflow diagram of the device shown in Fig. 2;
Fig. 4 is a structural schematic diagram of the human-computer interaction device based on human body dynamic posture according to the second embodiment of the disclosure;
Fig. 5 is a workflow diagram of the device shown in Fig. 4;
Fig. 6 is a structural schematic diagram of the human-computer interaction device based on human body dynamic posture according to the third embodiment of the disclosure;
Fig. 7 is a workflow diagram of the device shown in Fig. 6;
Fig. 8 is a hardware block diagram of the human-computer interaction device based on human body dynamic posture according to an embodiment of the disclosure;
Fig. 9 is a schematic diagram of a computer-readable storage medium according to an embodiment of the disclosure.
Detailed description of the embodiments
The application is discussed in further detail below with reference to the accompanying drawings and embodiments.
In the following description, the terms "first" and "second" are used for descriptive purposes only and should not be understood as indicating or implying relative importance. The following description provides multiple embodiments of the disclosure; different embodiments may be substituted for or combined with one another, so the application should also be regarded as containing all possible combinations of the same and/or different embodiments described. Thus, if one embodiment includes features A, B, and C and another embodiment includes features B and D, the application should also be regarded as including embodiments containing every other possible combination of one or more of A, B, C, and D, even if that combination is not explicitly described in the following content.
As shown in Fig. 1, the terminal device may be implemented in various forms. Terminal devices in the disclosure may include, but are not limited to, mobile terminal devices such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players), navigation devices, vehicle-mounted terminal devices, vehicle-mounted display terminals, and vehicle-mounted electronic rear-view mirrors, as well as fixed terminal devices such as digital TVs and desktop computers.
In one embodiment of the disclosure, the terminal device may include a wireless communication unit 1, an A/V (audio/video) input unit 2, a user input unit 3, a sensing unit 4, an output unit 5, a memory 6, an interface unit 7, a controller 8, a power supply unit 9, and the like. The A/V (audio/video) input unit 2 includes, but is not limited to, cameras such as a front camera and a rear camera and all kinds of audio and video input devices. Those skilled in the art should appreciate that the terminal device may include fewer or more components than those listed in the above embodiment.
Those skilled in the art should appreciate that the various embodiments described herein may be implemented with a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such an embodiment may be implemented in the controller. For a software implementation, an embodiment such as a process or function may be implemented with a separate software module that performs at least one function or operation. The software code may be implemented as a software application (or program) written in any suitable programming language, and may be stored in the memory and executed by the controller.
Specifically, an embodiment of the disclosure provides a human-computer interaction device based on human body dynamic posture, including: an imaging device; a receiving module for receiving the human body dynamic posture detected in a detection zone; and an interactive module for judging, according to the user's performance of the human body dynamic posture, whether the human-computer interaction succeeds, while obtaining picture interaction content displayed and imaged with a preset effect.
In this embodiment, the human body dynamic posture detected in the detection zone is received; whether the human-computer interaction succeeds is then judged according to the user's performance of the posture, while picture interaction content displayed and imaged with a preset effect is obtained. By labeling each feature point position of the human body dynamic posture in various ways, for example through manual annotation, the embodiments of the disclosure improve the detection rate of human body dynamic postures during interaction, and achieve strong interactivity and a good user experience for human-computer interaction based on human body dynamic posture.
Embodiment one
As shown in Fig. 2, the human-computer interaction device based on human body dynamic posture of this embodiment includes a receiving module 200 and an interactive module 400. The receiving module 200 is configured to receive the human body dynamic posture detected in a detection zone; the interactive module 400 is configured to judge, according to the user's performance of the human body dynamic posture, whether the human-computer interaction succeeds, while obtaining picture interaction content displayed and imaged with a preset effect.
In this embodiment, receiving the human body dynamic posture detected in a detection zone can be understood as receiving the human body dynamic posture detected in at least one detection zone; whether the human-computer interaction succeeds is then judged according to the user's performance of the posture, while picture interaction content displayed and imaged with a preset effect is obtained. By labeling each feature point position of the human body dynamic posture in various ways, for example through manual annotation, the detection rate of human body dynamic postures during interaction is improved, and strong interactivity and a good user experience are achieved for human-computer interaction based on human body dynamic posture.
Fig. 3 is a workflow diagram of the device shown in Fig. 2. The specific steps are as follows:
Step 202: receive the human body dynamic posture detected in a detection zone, where the detection zone is at least one display area of the display region of the human-computer interaction device.
Step 204: judge whether the human-computer interaction succeeds according to the user's performance of the human body dynamic posture, while obtaining picture interaction content displayed and imaged with a preset effect.
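Purely as an illustrative sketch (not part of the patent disclosure), steps 202 and 204 can be read as a receive-then-judge loop. All names here (`InteractionResult`, `judge_interaction`, the `"highlight"` effect string) are hypothetical placeholders, and the posture comparison is stubbed as a simple equality check:

```python
from dataclasses import dataclass

@dataclass
class InteractionResult:
    success: bool
    display_effect: str  # e.g. "highlight" on success, "prompt" on failure

def receive_posture(detection_zone):
    """Step 202: return the dynamic posture detected in the zone (stubbed)."""
    return detection_zone.get("posture")

def judge_interaction(posture, expected):
    """Step 204: compare the user's posture performance against the expected
    picture element and choose the preset display effect accordingly."""
    if posture == expected:
        return InteractionResult(True, "highlight")
    return InteractionResult(False, "prompt")

zone = {"posture": "raise_left_hand"}
result = judge_interaction(receive_posture(zone), "raise_left_hand")
```

In a real device the equality check would be replaced by the template comparison and keypoint-based judgement described in the later embodiments.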
The human-computer interaction method based on human body dynamic posture disclosed in this embodiment receives the human body dynamic posture detected in a detection zone, judges whether the interaction succeeds according to the user's performance of the posture, and obtains picture interaction content displayed and imaged with a preset effect. By labeling each feature point position of the posture in various ways, for example through manual annotation, the method improves the detection rate of human body dynamic postures during interaction, and achieves strong interactivity and a good user experience for human-computer interaction based on human body dynamic posture.
Embodiment two
As shown in Fig. 4, the device of this embodiment differs from the first embodiment in that a generation module 500, a labeling module 600, a computing module 700, a first acquisition module 800, and a second acquisition module 900 are added. Specifically, the receiving module 200 receives the human body dynamic posture detected in a detection zone; the labeling module 600 labels each feature point position of the posture through manual annotation; the computing module 700 locates the annotated user and calculates the local coordinates of each feature point within the human body dynamic posture frame obtained; the first acquisition module 800 averages the coordinates of all feature points to obtain the feature point positions of an average human body dynamic posture, which serve as the initial configuration of the feature point positions; the second acquisition module 900 iterates from this initial configuration with a human body dynamic posture alignment model to obtain the final feature point positions, thereby completing key point localization; the generation module 500 receives the human body dynamic posture the user completes within a preset time and generates a central control point from the key point localization and the posture estimation; and the interactive module 400 judges whether the human-computer interaction succeeds according to the user's performance of the posture, while obtaining picture interaction content displayed and imaged with a preset effect.
It should be noted that each feature point position of the human body dynamic posture is labeled through manual annotation; the annotated user is then located, and the local coordinates of each feature point within the human body dynamic posture frame are calculated from the frame obtained, where the frame is computed by a preset human body dynamic posture detection algorithm. This provides accurate technical support for the data of the postures detected in the detection zone. It should also be noted that the coordinates of all feature points are averaged to obtain the feature point positions of an average human body dynamic posture, which serve as the initial configuration of the feature point positions; iterating from this configuration with a human body dynamic posture alignment model yields the final feature point positions, completing key point localization. This in turn provides accurate data support for the subsequent judgement of the user's posture performance.
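The averaging-then-alignment initialisation above can be sketched as follows. This is only an illustrative reading, not the patented algorithm: the alignment model itself is not specified in the text, so it is stubbed here as a caller-supplied `model_step` function, and all other names are hypothetical:

```python
def mean_shape(annotated_shapes):
    """Average the local (x, y) coordinates of each feature point over all
    annotated samples; the averages serve as the initial configuration."""
    n = len(annotated_shapes)
    points = len(annotated_shapes[0])
    return [
        (sum(s[i][0] for s in annotated_shapes) / n,
         sum(s[i][1] for s in annotated_shapes) / n)
        for i in range(points)
    ]

def align(initial, model_step, iterations=5):
    """Iterate an alignment-model update (stubbed as `model_step`) starting
    from the mean-shape initialisation to obtain the final positions."""
    shape = list(initial)
    for _ in range(iterations):
        shape = model_step(shape)
    return shape

# Two annotated samples, each with two feature points in frame coordinates.
annotated = [[(0.0, 0.0), (2.0, 2.0)],
             [(2.0, 2.0), (4.0, 4.0)]]
init = mean_shape(annotated)          # [(1.0, 1.0), (3.0, 3.0)]
final = align(init, lambda s: s)      # identity model: shape is unchanged
```

A real alignment model would refine the shape toward the image evidence at each iteration rather than returning it unchanged.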
In addition, the receiving module 200 may receive the human body dynamic posture detected in the detection zone as follows: a human body dynamic posture template is established; an optical camera device, such as the camera of a mobile terminal or a Kinect, captures the human body dynamic posture in the detection zone; the captured posture is compared with the established template; and the posture that passes the comparison is taken as the effective detected posture.
It should be noted that establishing the human body dynamic posture template involves two operations: estimating the human body dynamic posture, and binding the posture to the human body key points. The estimation mainly obtains each human body part, i.e. each component of the body, from the input picture, for example the position, size, and orientation of the head and of the left and right upper arms. To detect the human body posture from the input picture, the picture must be scanned; because the sizes and positions of human body parts in the picture are not fixed, each part must be scanned at different positions, scales, and orientations. The features obtained by scanning are then fed to a binary classifier to determine whether they belong to a human body. Understandably, the binary classifier must be trained beforehand to obtain its parameters. Note that, at detection time, the same person in the input picture may be detected as multiple different but very similar postures, so the classification results must be merged to exclude the repeated postures.
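The duplicate-exclusion step at the end of the paragraph above can be sketched as a greedy merge of detections whose reference points are close together. This is an illustrative simplification (the text does not specify the merging rule); the tolerance value and function names are assumptions:

```python
def merge_detections(poses, tol=10.0):
    """Greedily keep one representative per cluster of pose detections whose
    reference points lie within `tol` pixels of an already-kept detection,
    excluding the repeated postures produced by multi-scale scanning."""
    kept = []
    for x, y in poses:
        if all(abs(x - kx) > tol or abs(y - ky) > tol for kx, ky in kept):
            kept.append((x, y))
    return kept

# The first two detections are the same person found at nearby positions.
detections = [(100, 200), (103, 198), (400, 220)]
merged = merge_detections(detections)
```

Production systems typically use overlap-based non-maximum suppression rather than a fixed pixel tolerance, but the effect on near-duplicate detections is the same.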
In addition, the generation of the pre-stored template based on human body dynamic posture may also rely on the graph structure model used in part-based human posture estimation. The graph structure model is divided into three parts: the graph model, the part observation model, and graph inference. The graph model represents the human body skeleton and describes the overall constraint relations among all human body parts; it is generally a tree model, in which the constraint between each pair of adjacent parts is modeled by a deformation model between the parts. The part observation model models the appearance of each human body part, and the quality of the feature selection determines the quality of the part appearance model. Graph inference estimates the human body posture in the picture to be detected using the established graph model and part observation models. Before graph inference is performed, the parameters of the human body model are obtained by classifier training.
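As an illustrative sketch of the tree-structured graph model just described (not the patent's own formulation), a pose can be scored as the sum of per-part appearance scores minus a quadratic deformation penalty on each parent-child edge; inference then seeks the highest-scoring placement. The `spring` weight and all names are assumptions:

```python
def pose_score(appearance_scores, edges, positions, spring=0.1):
    """Sum the appearance scores of all parts (observation model) and subtract
    a quadratic deformation penalty for each parent-child edge of the tree,
    where `rest` is the ideal offset of the child from the parent."""
    score = sum(appearance_scores.values())
    for parent, child, (rx, ry) in edges:
        dx = positions[child][0] - positions[parent][0] - rx
        dy = positions[child][1] - positions[parent][1] - ry
        score -= spring * (dx * dx + dy * dy)  # deformation cost
    return score

parts = {"torso": 2.0, "head": 1.5}
edges = [("torso", "head", (0, -50))]       # head expected 50 px above torso
positions = {"torso": (100, 200), "head": (100, 150)}
s = pose_score(parts, edges, positions)     # perfect fit: no penalty, s == 3.5
```

With a tree-shaped graph this score can be maximised efficiently by dynamic programming over the parts, which is why tree models are the common choice noted above.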
It should be noted that the part-based human body model generally uses a skeleton model or a hinged shape model. The skeleton model, i.e. a stick-figure model, consists of the axis line segments of the human body parts, which are generally connected to one another; it is simple and intuitive. The hinged shape model generally represents human body parts with rectangles and carries more information than the skeleton model: it can describe not only the position of each part but also its width. This richer description lays the foundation for the subsequent comparison.
Further, after the human body dynamic posture estimation is completed, the human body key points are chosen. Seventeen skeletal key points are selected: the head, right shoulder, right elbow, right wrist, right hand, left shoulder, left elbow, left wrist, left hand, right knee, right ankle, right foot, left knee, left ankle, left foot, right hip, and left hip. These 17 skeletal key points are bound to the action event estimated from the human body posture. This provides accurate data support for the action events the user subsequently performs against the posture-based template, with good ease of use.
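The 17-keypoint selection and its binding to an action event can be sketched directly from the list above. Only the keypoint names come from the text; the binding representation (a name-to-event mapping) and the event name are illustrative assumptions:

```python
# The 17 skeletal key points named in the text, in the order listed.
KEYPOINTS = [
    "head",
    "right_shoulder", "right_elbow", "right_wrist", "right_hand",
    "left_shoulder", "left_elbow", "left_wrist", "left_hand",
    "right_knee", "right_ankle", "right_foot",
    "left_knee", "left_ankle", "left_foot",
    "right_hip", "left_hip",
]

def bind_keypoints(event_name):
    """Bind each skeletal key point to the given posture-estimation action
    event (sketched as a simple name-to-event mapping)."""
    return {name: event_name for name in KEYPOINTS}

binding = bind_keypoints("wave_gesture")
```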
Further, the human body dynamic posture detected in the detection zone may also be received through an action-information acquisition module, which includes but is not limited to: a displacement sensor, such as a displacement gyroscope, which acquires the angular velocity and displacement on the three spatial axes so that distance can be calculated; an acceleration sensor, such as a gravity accelerometer, which acquires the acceleration parameters on the three spatial axes so that the motion acceleration and motion direction can be calculated from multiple acceleration parameters; an optical tracking sensor; and a position definition module. The position definition module sets the position of each sensor and can also serve as an information data terminal that processes the raw data of the action signals from the sensors at the different locations.
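The accelerometer step above, deriving a motion acceleration and a motion direction from three-axis parameters, can be sketched as a magnitude plus a unit direction vector. This is a minimal reading of the text, not the device's actual signal processing, and the function name is an assumption:

```python
import math

def motion_from_axes(ax, ay, az):
    """Combine three-axis acceleration parameters into a motion-acceleration
    magnitude and a unit direction vector."""
    mag = math.sqrt(ax * ax + ay * ay + az * az)
    if mag == 0:
        return 0.0, (0.0, 0.0, 0.0)
    return mag, (ax / mag, ay / mag, az / mag)

mag, direction = motion_from_axes(3.0, 0.0, 4.0)  # magnitude 5.0
```

A real pipeline would first subtract the gravity component and filter the samples over time before deriving motion direction.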
In this embodiment, receiving the human body dynamic posture detected in a detection zone can be understood as receiving the human body dynamic posture detected in at least one detection zone; whether the human-computer interaction succeeds is then judged according to the user's performance of the posture, while picture interaction content displayed and imaged with a preset effect is obtained. By labeling each feature point position of the human body dynamic posture in various ways, for example through manual annotation, the detection rate of human body dynamic postures during interaction is improved, and strong interactivity and a good user experience are achieved.
In this embodiment, the multiple ways of detecting the human body dynamic posture improve the accuracy of the interaction; meanwhile, the addition of the labeling module, computing module, first acquisition module, second acquisition module, and generation module provides accurate and fast data support for the subsequent interactive module.
Fig. 5 is a workflow diagram of the device shown in Fig. 4. The specific steps are as follows:
Step 401: receive the human body dynamic posture detected in a detection zone, where the detection zone is at least one display area of the display region of the human-computer interaction device.
Step 402: label each feature point position of the human body dynamic posture through manual annotation.
Step 403: locate the annotated user and calculate the local coordinates of each feature point within the human body dynamic posture frame obtained.
Step 404: average the coordinates of all feature points to obtain the feature point positions of an average human body dynamic posture, which serve as the initial configuration of the feature point positions.
Step 405: starting from this initial configuration, iterate with a human body dynamic posture alignment model to obtain the feature point positions of the human body dynamic posture, completing key point localization.
Step 406: receive the human body dynamic posture the user completes within a preset time, and generate a central control point from the key point localization and the human body dynamic posture estimation.
Step 407: judge whether the human-computer interaction succeeds according to the user's performance of the human body dynamic posture, while obtaining picture interaction content displayed and imaged with a preset effect.
The human-computer interaction method based on human body dynamic posture disclosed in this embodiment receives the human body dynamic posture detected in a detection zone, judges whether the interaction succeeds according to the user's performance of the posture, and obtains picture interaction content displayed and imaged with a preset effect. By labeling each feature point position of the posture in various ways, for example through manual annotation, the method improves the detection rate of human body dynamic postures during interaction, and achieves strong interactivity and a good user experience.
In this embodiment, the multiple ways of detecting the human body dynamic posture improve the accuracy of the interaction. Meanwhile, the operations of labeling each feature point position through manual annotation; locating the annotated user and calculating the local coordinates of each feature point within the posture frame obtained by the preset human body dynamic posture detection algorithm; averaging the coordinates of all feature points to obtain the initial configuration of the feature point positions; and iterating with the alignment model to complete key point localization together provide accurate and fast data support for the subsequent interactive module.
Embodiment three
As shown in Fig. 6, the device of this embodiment differs from the first embodiment in that, besides the added generation module 500, labeling module 600, computing module 700, first acquisition module 800, and second acquisition module 900, the interactive module 400 further includes a first judging unit 401 and a second judging unit 402. Specifically, the receiving module 200 receives the human body dynamic posture detected in a detection zone; the labeling module 600 labels each feature point position of the posture through manual annotation; the computing module 700 locates the annotated user and calculates the local coordinates of each feature point within the human body dynamic posture frame obtained; the first acquisition module 800 averages the coordinates of all feature points to obtain the feature point positions of an average human body dynamic posture as the initial configuration; the second acquisition module 900 iterates from this configuration with a human body dynamic posture alignment model to obtain the feature point positions of the posture, completing key point localization; the generation module 500 receives the posture the user completes within a preset time and generates a central control point from the key point localization and the posture estimation; and the interactive module 400 judges whether the human-computer interaction succeeds according to the user's performance of the posture, while obtaining picture interaction content displayed and imaged with a preset effect.
In this embodiment, the interactive module 400 includes: a first judging unit 401, which judges the human body dynamic posture to be an effective posture when the user's performance of the posture compares successfully with the picture element in the application program during interaction; and a second judging unit 402, which judges the posture to be an invalid posture when the comparison fails, and displays a preset prompt message on the picture interaction content. Presenting different interaction pictures according to different interaction results thus improves usability.
It should be noted that, when the comparison succeeds, the interactive action is judged to be an effective posture of the user, and the picture content is displayed and imaged with the preset effect. The preset effect may be a highlight effect on the picture content: when the parsed data of the picture element compares successfully with the received human-computer interaction action data, the picture content is rendered and displayed with the highlight effect. Alternatively, the preset effect may make the picture content appear with a screen blink effect repeated a preset number of times: when the comparison succeeds, the picture content is rendered and displayed with that blink effect.
It should be noted that the picture content may be rendered with photorealistic rendering or non-photorealistic rendering. For photorealistic rendering, to obtain a realistic image of the objects in the scene, a perspective projection must be applied, hidden surfaces must be removed, and the lighting and shading of the visible surfaces must then be computed to display a photorealistic image of the scene. Removing hidden surfaces alone, however, is not enough to make the image realistic; the lighting and shading of object surfaces, rendered with different color gray levels, increases the realism of the graphic image and is the main source of the realism of the scene image.
A computer-generated realistic image is a raster image composed of pixels. When generating a photo-realistic image, the color of the scene surface region corresponding to each pixel of the picture must be calculated. Clearly, when calculating the color of a visible surface region, one must consider not only the incident light from the light source on that region, together with its brightness and spectral composition, but also the orientation of the surface region relative to the light source, the material of the surface, its reflectance properties, and so on. This calculation must be based on an optical-physics model, that is, an illumination model. The process of generating a photo-realistic image from the scene geometry and the illumination model is called rendering. Common photo-realistic rendering algorithms include the scan-line algorithm, ray tracing, and the radiosity method. As for non-photorealistic image rendering of the image content: in non-photorealistic rendering, active choices must be made about the content and style of what is rendered. Non-photorealistic rendering is often realized by a preset application program, which takes an image or a three-dimensional entity as input and outputs an image with specific attributes.
It can be understood that rendering and imaging the image content includes: rendering and imaging the image content by calling a preset programming interface. Using such an interface improves the compatibility of the imaging files and the efficiency of imaging.
Further, when the first judging unit 401 in the interactive module 400 determines that the completion of the user's human body dynamic posture compares successfully with the picture element of the application program in the interactive process, the human body dynamic posture is judged to be an effective posture, and the interactive module 400 completes the interactive operation while displaying and imaging the picture interaction content with the preset effect. Specifically, the interactive module 400 causes a first-layer picture and a second-layer picture to be displayed simultaneously on the mobile terminal. The display module of the mobile terminal can be regarded as holding two layers of display blocks: within the preset display period of the first-layer picture, the display blocks of that layer are presented according to the first layer's filling order, loading order, rendering order, and imaging order; within the preset display period of the second-layer picture, the display blocks of that layer are presented according to the second layer's filling order, loading order, rendering order, and imaging order.
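The four-stage per-layer pipeline described above can be sketched as follows. The stage names, the block representation, and the interleaving for the same-period case are assumptions made for illustration; the patent does not specify these details.

```python
# Each layer's display blocks pass through fill -> load -> render -> image
# within that layer's preset display period.
STAGES = ("fill", "load", "render", "image")

def run_layer(blocks):
    """Apply the four stages to every display block of one layer, in stage
    order; return the resulting (stage, block) event log."""
    log = []
    for stage in STAGES:
        for block in blocks:
            log.append((stage, block))
    return log

def run_periods(layer1_blocks, layer2_blocks, same_period=True):
    """Identical display periods: both layers are processed within one
    period (kept in sync). Differing periods: each layer is shown in its
    own time segment."""
    if same_period:
        return [("period", run_layer(layer1_blocks) + run_layer(layer2_blocks))]
    return [("period1", run_layer(layer1_blocks)),
            ("period2", run_layer(layer2_blocks))]
```

With `same_period=True` the two layers share one period, matching the synchronized display described below; with `same_period=False` they occupy separate time segments.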
It should be noted that in the human-computer interaction device based on human body dynamic posture of the disclosure, the first-layer picture and the second-layer picture in the interactive module may be displayed in identical preset display periods, so that the filling, loading, rendering, and display of the first-layer picture and the second-layer picture are synchronized. Alternatively, when the preset display periods of the first-layer picture and the second-layer picture differ, the two pictures may be displayed in separate time segments, providing more display choices and greater flexibility. Further, when the preset display periods of the first-layer picture and the second-layer picture differ, the display order may also be adjusted: the display block that is filled, loaded, rendered, and imaged last within the preset display period of the first-layer picture may be adjusted to be filled first within the preset display period of the second-layer picture, and each remaining display block is filled, within the preset display period of the second-layer picture, one position later than its filling position within the preset display period of the first-layer picture. This enables the first-layer picture and the second-layer picture to alternate asynchronously and hierarchically.
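The reordering just described amounts to rotating the filling order by one position: the block filled last in the first layer's period becomes the first block filled in the second layer's period, and every other block shifts back by one. A minimal sketch (block names are illustrative):

```python
def second_layer_fill_order(first_layer_fill_order):
    """Rotate the first layer's filling order by one position: the last
    block moves to the front, all other blocks shift back by one."""
    if not first_layer_fill_order:
        return []
    return [first_layer_fill_order[-1]] + first_layer_fill_order[:-1]

order = second_layer_fill_order(["b1", "b2", "b3"])
```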
It should also be noted that the application of the interactive module in the human-computer interaction device based on human body dynamic posture of the disclosure significantly improves the appearance and speed of the human-computer interaction display.
Fig. 7 is a work flow diagram of the human-computer interaction device based on human body dynamic posture shown in Fig. 6. The specific steps are as follows:
Step 601: receive the human body dynamic posture detected in the detection zone, where the detection zone is at least one of the display areas of the human-computer interaction device.
Step 602: annotate each feature point position of the human body dynamic posture by manual annotation.
Step 603: locate the annotated user, and calculate the local coordinates of each feature point within the human body dynamic posture frame according to the obtained human body dynamic posture frame.
Step 604: calculate the average of the coordinates of all feature points to obtain the feature point positions of the average human body dynamic posture, which serve as the initial position configuration of the feature point positions of the human body dynamic posture.
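The averaging in step 604 can be sketched as follows. This is a minimal illustration: it assumes each annotated sample is a list of (x, y) feature points already expressed in the local coordinates of its posture frame (step 603); the data layout is an assumption, not the patent's actual format.

```python
def mean_shape(annotated_samples):
    """Average the manually annotated feature-point coordinates across all
    samples to obtain the mean-shape initial position configuration.
    annotated_samples: list of samples, each a list of (x, y) points."""
    n = len(annotated_samples)
    k = len(annotated_samples[0])
    return [
        (sum(s[i][0] for s in annotated_samples) / n,
         sum(s[i][1] for s in annotated_samples) / n)
        for i in range(k)
    ]

# Two annotated samples, each with two feature points:
samples = [[(0.0, 0.0), (1.0, 1.0)],
           [(0.2, 0.0), (0.8, 1.0)]]
init = mean_shape(samples)
```

The resulting mean positions are what step 605 starts its iterative alignment from.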
Step 605: iterate from the initial position configuration of the feature point positions of the human body dynamic posture, in combination with the human body dynamic posture alignment model, to obtain the feature point positions of the human body dynamic posture, thereby completing key point localization.
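The iterative refinement of step 605 can be sketched as a fixed-point loop. The patent does not specify the alignment model, so `toy_update` below is a stand-in (it moves each point a fixed fraction toward a hypothetical target) purely to show the structure of the iteration.

```python
def align(points, model_update, iterations=5):
    """Repeatedly apply the alignment model's update to refine the
    feature-point positions, starting from the mean-shape initialization."""
    for _ in range(iterations):
        points = model_update(points)
    return points

def toy_update(points, targets=((0.5, 0.5),), step=0.5):
    """Illustrative stand-in for the alignment model: move each point
    `step` of the way toward its (hypothetical) target position."""
    return [((1 - step) * x + step * tx, (1 - step) * y + step * ty)
            for (x, y), (tx, ty) in zip(points, targets)]

result = align([(0.0, 0.0)], toy_update, iterations=3)
```

In a real system, the update would be driven by image evidence around each current point rather than by fixed targets.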
Step 606: receive the human body dynamic posture completed by the user within a preset time, and generate a center control point from the key point localization and the human body dynamic posture estimation of the human body dynamic posture.
Step 607: judge whether the user's performance of the human body dynamic posture compares successfully with the picture element of the application program in the interactive process.
Step 608: when the user's performance of the human body dynamic posture compares successfully with the picture element of the application program in the interactive process, judge the human body dynamic posture to be an effective posture, and at the same time obtain the picture interaction content displayed and imaged with the preset effect.
Step 609: when the user's performance of the human body dynamic posture fails to compare with the picture element of the application program in the interactive process, judge the human body dynamic posture to be an invalid posture, and display a preset prompt message on the picture interaction content.
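The branching of steps 607-609 can be sketched as follows. The pose encoding (a flat list of keypoint values), the error metric, and the threshold are assumptions made for illustration; the patent leaves the comparison criterion unspecified.

```python
def judge_pose(user_pose, target_pose, threshold=0.2):
    """Step 607's comparison: return ('valid', content) on success (step 608)
    so the interaction content is shown with the preset effect, otherwise
    ('invalid', prompt) so a preset prompt message is shown (step 609)."""
    error = max(abs(u - t) for u, t in zip(user_pose, target_pose))
    if error <= threshold:
        return "valid", "interaction content shown with preset effect"
    return "invalid", "preset prompt message"

status, message = judge_pose([0.1, 0.9], [0.15, 0.85])
```

A close match yields the effective-posture branch; a large deviation yields the invalid-posture branch with its prompt.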
The man-machine interaction method based on human body dynamic posture disclosed in the embodiment of the present disclosure receives the human body dynamic posture detected in the detection zone; it judges whether the interactive process succeeds according to the user's performance of the human body dynamic posture, while obtaining the picture interaction content displayed and imaged with the preset effect. By annotating each feature point position of the human body dynamic posture through manual annotation, among other means, the above method improves the detection rate of human body dynamic posture in the interactive process, and at the same time realizes strong interactivity and a strong user experience for human-computer interaction based on human body dynamic posture.
Fig. 8 is the hardware cell for illustrating the human-computer interaction device according to an embodiment of the present disclosure based on human body dynamic posture
Figure.As shown in figure 8, including memory 801 according to the human-computer interaction device 80 based on human body dynamic posture of the embodiment of the present disclosure
With processor 802.Each component in human-computer interaction device 80 based on human body dynamic posture passes through bus system and/or other shapes
Bindiny mechanism's (not shown) interconnection of formula.
The memory 801 is used to store non-transitory computer-readable instructions. Specifically, the memory 801 may include one or more computer program products, and the computer program products may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random access memory (RAM) and/or cache memory. The non-volatile memory may include, for example, read-only memory (ROM), a hard disk, flash memory, and the like.
The processor 802 may be a central processing unit (CPU) or another form of processing unit with data-processing capability and/or instruction-execution capability, and may control the other components in the human-computer interaction device 80 based on human body dynamic posture to perform desired functions. In one embodiment of the disclosure, the processor 802 is used to run the computer-readable instructions stored in the memory 801, so that the human-computer interaction device 80 based on human body dynamic posture executes the above man-machine interaction method based on human body dynamic posture. The embodiments of the human-computer interaction device based on human body dynamic posture are identical to those described for the above man-machine interaction method based on human body dynamic posture, and their repeated description is omitted here.
Fig. 9 is a schematic diagram illustrating a computer-readable storage medium according to an embodiment of the present disclosure. As shown in Fig. 9, the computer-readable storage medium 900 according to the embodiment of the present disclosure stores non-transitory computer-readable instructions 901 thereon. When the non-transitory computer-readable instructions 901 are run by a processor, the man-machine interaction method based on human body dynamic posture according to the embodiment of the present disclosure described above is executed.
The above describes the man-machine interaction method and device based on human body dynamic posture, and the computer-readable storage medium, according to the embodiments of the present disclosure. By annotating each feature point position of the human body dynamic posture through manual annotation, among other means, the detection rate of human body dynamic posture in the interactive process is improved, and at the same time strong interactivity and a strong user experience are realized for human-computer interaction based on human body dynamic posture.
The basic principles of the present disclosure have been described above in connection with specific embodiments. However, it should be pointed out that the merits, advantages, effects, and the like mentioned in the present disclosure are merely examples and not limitations, and must not be regarded as prerequisites of every embodiment of the present disclosure. In addition, the specific details disclosed above are provided only for the purpose of example and ease of understanding, and are not limiting; the above details do not limit the disclosure to being implemented using those specific details.
The block diagrams of the devices, apparatuses, equipment, and systems involved in the present disclosure are merely illustrative examples, and are not intended to require or imply that they must be connected, arranged, or configured in the manner shown in the block diagrams. As those skilled in the art will appreciate, these devices, apparatuses, equipment, and systems may be connected, arranged, and configured in any manner. Words such as "include", "comprise", and "have" are open-ended vocabulary meaning "including but not limited to", and may be used interchangeably with it. The words "or" and "and" as used herein refer to the word "and/or" and may be used interchangeably with it, unless the context clearly indicates otherwise. The words "such as" used herein refer to the phrase "such as, but not limited to", and may be used interchangeably with it.
In addition, as used herein, the "or" used in an enumeration of items beginning with "at least one" indicates a disjunctive enumeration, so that an enumeration such as "at least one of A, B or C" means A or B or C, or AB or AC or BC, or ABC (i.e. A and B and C). Furthermore, the word "exemplary" does not mean that the described example is preferred or better than other examples.
It should also be noted that in the system and method of the present disclosure, each component or each step can be decomposed and/or recombined. Such decompositions and/or recombinations shall be regarded as equivalent schemes of the present disclosure.
Various changes, substitutions, and alterations to the technology described herein may be made without departing from the technology taught by the appended claims. Furthermore, the scope of the claims of the present disclosure is not limited to the specific aspects of the processes, machines, manufactures, compositions of matter, means, methods, and actions described above. Processes, machines, manufactures, compositions of matter, means, methods, or actions that exist now or are later developed and that perform substantially the same function or achieve substantially the same result as the corresponding aspects described herein may be utilized. Thus, the appended claims include such processes, machines, manufactures, compositions of matter, means, methods, or actions within their scope.
The above description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these aspects will be readily apparent to those skilled in the art, and the general principles defined herein can be applied to other aspects without departing from the scope of the present disclosure. Therefore, the present disclosure is not intended to be limited to the aspects shown herein, but accords with the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to restrict the embodiments of the present disclosure to the forms disclosed herein. Although a number of exemplary aspects and embodiments have been discussed above, those skilled in the art will recognize certain variations, modifications, changes, additions, and sub-combinations thereof.
Claims (16)
1. A man-machine interaction method based on human body dynamic posture, characterized by comprising the following steps:
receiving a human body dynamic posture detected in a detection zone;
judging whether the interactive process succeeds according to the user's performance of the human body dynamic posture, while obtaining picture interaction content displayed and imaged with a preset effect.
2. The man-machine interaction method based on human body dynamic posture according to claim 1, characterized by further comprising: receiving the human body dynamic posture completed by the user within a preset time, and generating a center control point from the key point localization of the human body dynamic posture and the human body dynamic posture estimation.
3. The man-machine interaction method based on human body dynamic posture according to claim 2, characterized by further comprising: annotating each feature point position of the human body dynamic posture by manual annotation;
locating the annotated user, and calculating the local coordinates of each feature point within the human body dynamic posture frame according to the obtained human body dynamic posture frame.
4. The man-machine interaction method based on human body dynamic posture according to claim 3, characterized in that the human body dynamic posture frame is obtained by calculation using a preset human body dynamic posture detection algorithm.
5. The man-machine interaction method based on human body dynamic posture according to claim 3, characterized by further comprising: calculating the average of the coordinates of all feature points to obtain the feature point positions of the average human body dynamic posture, which serve as the initial position configuration of the feature point positions of the human body dynamic posture;
iterating from the initial position configuration of the feature point positions of the human body dynamic posture, in combination with the human body dynamic posture alignment model, to obtain the feature point positions of the human body dynamic posture, thereby completing key point localization.
6. The man-machine interaction method based on human body dynamic posture according to claim 1, characterized in that judging whether the interactive process succeeds according to the user's performance of the human body dynamic posture comprises: when the user's performance of the human body dynamic posture compares successfully with the picture element of the application program in the interactive process, judging the human body dynamic posture to be an effective posture.
7. The man-machine interaction method based on human body dynamic posture according to claim 1, characterized in that judging whether the interactive process succeeds according to the user's performance of the human body dynamic posture further comprises: when the user's performance of the human body dynamic posture fails to compare with the picture element of the application program in the interactive process, judging the human body dynamic posture to be an invalid posture, and displaying a preset prompt message on the picture interaction content.
8. A computer-readable storage medium on which a computer program is stored, characterized in that when the program is executed by a processor, the steps of the method of any one of claims 1-7 are realized.
9. A computer device comprising a memory, a processor, and a computer program stored on the memory and runnable on the processor, characterized in that when the processor executes the program, the steps of the method of any one of claims 1-7 are realized.
10. A human-computer interaction device based on human body dynamic posture, characterized by comprising:
a receiving module for receiving a human body dynamic posture detected in a detection zone;
an interactive module for judging whether the interactive process succeeds according to the user's performance of the human body dynamic posture, while obtaining picture interaction content displayed and imaged with a preset effect.
11. The human-computer interaction device based on human body dynamic posture according to claim 10, characterized by further comprising:
a generation module for receiving the human body dynamic posture completed by the user within a preset time, and generating a center control point from the key point localization of the human body dynamic posture and the human body dynamic posture estimation.
12. The human-computer interaction device based on human body dynamic posture according to claim 11, characterized by further comprising:
an annotation module for annotating each feature point position of the human body dynamic posture by manual annotation;
a computing module for locating the annotated user, and calculating the local coordinates of each feature point within the human body dynamic posture frame according to the obtained human body dynamic posture frame.
13. The human-computer interaction device based on human body dynamic posture according to claim 12, characterized in that the human body dynamic posture frame is obtained by calculation using a preset human body dynamic posture detection algorithm.
14. The human-computer interaction device based on human body dynamic posture according to claim 12, characterized by further comprising:
a first acquisition module for calculating the average of the coordinates of all feature points to obtain the feature point positions of the average human body dynamic posture, which serve as the initial position configuration of the feature point positions of the human body dynamic posture;
a second acquisition module for iterating from the initial position configuration of the feature point positions of the human body dynamic posture, in combination with the human body dynamic posture alignment model, to obtain the feature point positions of the human body dynamic posture, thereby completing key point localization.
15. The human-computer interaction device based on human body dynamic posture according to claim 10, characterized in that the interactive module comprises: a first judging unit for judging the human body dynamic posture to be an effective posture when the user's performance of the human body dynamic posture compares successfully with the picture element of the application program in the interactive process.
16. The human-computer interaction device based on human body dynamic posture according to claim 10, characterized in that the interactive module comprises: a second judging unit for judging the human body dynamic posture to be an invalid posture when the user's performance of the human body dynamic posture fails to compare with the picture element of the application program in the interactive process, and displaying a preset prompt message on the picture interaction content.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810274487 | 2018-03-29 | ||
CN2018102744870 | 2018-03-29 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108549484A true CN108549484A (en) | 2018-09-18 |
CN108549484B CN108549484B (en) | 2019-09-20 |
Family
ID=63514021
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810301243.7A Active CN108549484B (en) | 2018-03-29 | 2018-04-04 | Man-machine interaction method and device based on human body dynamic posture |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108549484B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109697446A (en) * | 2018-12-04 | 2019-04-30 | 北京字节跳动网络技术有限公司 | Image key point extraction method and device, readable storage medium and electronic equipment |
CN109753152A (en) * | 2018-12-21 | 2019-05-14 | 北京市商汤科技开发有限公司 | Exchange method and device based on human body attitude, computer equipment |
CN111685772A (en) * | 2020-05-29 | 2020-09-22 | 清华大学 | Exoskeleton robot measurement system, walking gait modeling analysis method and equipment |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102486816A (en) * | 2010-12-02 | 2012-06-06 | 三星电子株式会社 | Device and method for calculating human body shape parameters |
CN103501868A (en) * | 2011-04-28 | 2014-01-08 | 微软公司 | Control of separate computer game elements |
CN103646425A (en) * | 2013-11-20 | 2014-03-19 | 深圳先进技术研究院 | A method and a system for body feeling interaction |
CN105144196A (en) * | 2013-02-22 | 2015-12-09 | 微软技术许可有限责任公司 | Method and device for calculating a camera or object pose |
CN105404392A (en) * | 2015-11-03 | 2016-03-16 | 北京英梅吉科技有限公司 | Monocular camera based virtual wearing method and system |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102486816A (en) * | 2010-12-02 | 2012-06-06 | 三星电子株式会社 | Device and method for calculating human body shape parameters |
CN103501868A (en) * | 2011-04-28 | 2014-01-08 | 微软公司 | Control of separate computer game elements |
CN105144196A (en) * | 2013-02-22 | 2015-12-09 | 微软技术许可有限责任公司 | Method and device for calculating a camera or object pose |
CN103646425A (en) * | 2013-11-20 | 2014-03-19 | 深圳先进技术研究院 | A method and a system for body feeling interaction |
CN105404392A (en) * | 2015-11-03 | 2016-03-16 | 北京英梅吉科技有限公司 | Monocular camera based virtual wearing method and system |
Non-Patent Citations (2)
Title |
---|
YE QING: "Research on Markerless Human Motion Capture Technology", China Doctoral Dissertations Full-text Database * |
HU YING: "Development of a Video Somatosensory Interactive Fitting System", China Master's Theses Full-text Database, Information Science and Technology * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109697446A (en) * | 2018-12-04 | 2019-04-30 | 北京字节跳动网络技术有限公司 | Image key point extraction method and device, readable storage medium and electronic equipment |
CN109697446B (en) * | 2018-12-04 | 2021-12-07 | 北京字节跳动网络技术有限公司 | Image key point extraction method and device, readable storage medium and electronic equipment |
CN109753152A (en) * | 2018-12-21 | 2019-05-14 | 北京市商汤科技开发有限公司 | Exchange method and device based on human body attitude, computer equipment |
CN111685772A (en) * | 2020-05-29 | 2020-09-22 | 清华大学 | Exoskeleton robot measurement system, walking gait modeling analysis method and equipment |
CN111685772B (en) * | 2020-05-29 | 2021-07-09 | 清华大学 | Exoskeleton robot measurement system, walking gait modeling analysis method and equipment |
Also Published As
Publication number | Publication date |
---|---|
CN108549484B (en) | 2019-09-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20200387698A1 (en) | Hand key point recognition model training method, hand key point recognition method and device | |
CN110599605B (en) | Image processing method and device, electronic equipment and computer readable storage medium | |
CN108475439B (en) | Three-dimensional model generation system, three-dimensional model generation method, and recording medium | |
CN104732585B (en) | A kind of method and device of human somatotype reconstruct | |
US11494915B2 (en) | Image processing system, image processing method, and program | |
CN109598798B (en) | Virtual object fitting method and virtual object fitting service system | |
JP5936155B2 (en) | 3D user interface device and 3D operation method | |
JP4473754B2 (en) | Virtual fitting device | |
CN107484428B (en) | Method for displaying objects | |
KR101556992B1 (en) | 3d scanning system using facial plastic surgery simulation | |
CN104346612B (en) | Information processing unit and display methods | |
US20080165187A1 (en) | Face Image Synthesis Method and Face Image Synthesis Apparatus | |
EP3992919B1 (en) | Three-dimensional facial model generation method and apparatus, device, and medium | |
CN112819947A (en) | Three-dimensional face reconstruction method and device, electronic equipment and storage medium | |
US10909744B1 (en) | Simulating garment with wrinkles based on physics based cloth simulator and machine learning model | |
JPWO2006049147A1 (en) | Three-dimensional shape estimation system and image generation system | |
CN108549484B (en) | Man-machine interaction method and device based on human body dynamic posture | |
CN106797458A (en) | The virtual change of real object | |
CN108537162A (en) | The determination method and apparatus of human body attitude | |
IL299465A (en) | Object recognition neural network for amodal center prediction | |
Treepong et al. | Makeup creativity enhancement with an augmented reality face makeup system | |
CN105814604B (en) | For providing location information or mobile message with the method and system of at least one function for controlling vehicle | |
JP2010211732A (en) | Object recognition device and method | |
CN110533761B (en) | Image display method, electronic device and non-transient computer readable recording medium | |
KR20180083608A (en) | Single image-based 3D modeling apparatus for 3D printing and method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||