CN107831890A - Man-machine interaction method, device and equipment based on AR - Google Patents

Man-machine interaction method, device and equipment based on AR

Info

Publication number
CN107831890A
Authority
CN
China
Prior art keywords
data
sample operations
action
operations data
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710943334.6A
Other languages
Chinese (zh)
Inventor
王行
盛赞
李骊
周晓军
李朔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing HJIMI Technology Co Ltd
Original Assignee
Beijing HJIMI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing HJIMI Technology Co Ltd filed Critical Beijing HJIMI Technology Co Ltd
Priority to CN201710943334.6A priority Critical patent/CN107831890A/en
Publication of CN107831890A publication Critical patent/CN107831890A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders, dials
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06K - RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/00335 - Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
    • G06K 9/00342 - Recognition of whole body movements, e.g. for sport training

Abstract

The present invention provides an AR-based man-machine interaction method, device and equipment. The method includes: collecting a set of action data in a current augmented reality (AR) scene; performing feature extraction on the action data to obtain a first feature of the action data; comparing the first feature with the second features of multiple groups of sample operation data, respectively, to obtain similarity comparison results; determining one group of sample operation data according to the similarity comparison results; and executing a preset interactive operation according to the associated action corresponding to that group of sample operation data. The invention can support multiple human-machine interaction modes on the basis of a single interface interaction program, which greatly facilitates both development by related developers and use by users, raises the intelligence level of human-machine interaction in AR scenes, and meets user demand.

Description

Man-machine interaction method, device and equipment based on AR
Technical field
The present invention relates to the field of interaction technology, and in particular to an AR-based man-machine interaction method, device and equipment.
Background art
AR (Augmented Reality) is a technology that "seamlessly" integrates real-world information with virtual-world information. Entity information that is ordinarily difficult to experience within a certain time and space range of the real world (such as visual information, sound, taste and touch) is simulated by computer and related technologies and then superimposed onto the real world, where it is perceived by the human senses, producing a sensory experience beyond reality. In this way a real environment and virtual objects can be superimposed into the same picture or space in real time.
In existing AR interaction techniques, a separate interface interaction program must be developed for each interaction mode (such as gesture, motion sensing and eye movement), which causes great inconvenience both to the development work of related developers and to users.
Summary of the invention
In view of this, the present invention provides an AR-based man-machine interaction method, device and equipment, which raise the intelligence level of human-machine interaction in AR scenes and meet user demand.
In a first aspect, an embodiment of the present invention provides an AR-based man-machine interaction method, including:
collecting a set of action data in a current augmented reality (AR) scene;
performing feature extraction on the action data to obtain a first feature of the action data;
comparing the first feature with the second features of multiple groups of sample operation data, respectively, to obtain similarity comparison results;
determining one group of sample operation data according to the similarity comparison results;
executing a preset interactive operation according to the associated action corresponding to the group of sample operation data.
In one embodiment, before collecting the set of action data in the current augmented reality AR scene, the method further includes:
receiving an interaction-mode control instruction, the instruction containing the currently selected interaction mode;
querying, from a pre-saved first correspondence, the multiple groups of sample operation data corresponding to the interaction mode.
In one embodiment, the method further includes pre-saving the first correspondence according to the following steps:
determining multiple groups of sample operation data to be saved and the interaction mode corresponding to those groups;
building a first correspondence between the groups of sample operation data and the interaction mode;
saving the first correspondence.
In one embodiment, after the similarity comparison results are obtained, the method further includes:
determining a similarity threshold according to a sensitivity parameter;
comparing the similarity comparison results with the similarity threshold;
if it is determined that a similarity comparison result reaches the similarity threshold, performing the operation of determining one group of sample operation data according to the similarity comparison results.
In one embodiment, before executing the preset interactive operation according to the associated action corresponding to the group of sample operation data, the method further includes:
querying, from a pre-saved second correspondence, the associated action corresponding to the group of sample operation data.
In one embodiment, the method further includes pre-saving the second correspondence according to the following steps:
determining an associated action to be associated and the group of sample operation data corresponding to it, the associated action being used to make the associated terminal device execute the corresponding interactive operation;
building a second correspondence between the sample operation data and the associated action;
saving the second correspondence.
In one embodiment, the method further includes:
normalizing the action data before feature extraction is performed on it.
In a second aspect, an embodiment of the present invention provides an AR-based human-machine interaction device, including:
a data acquisition module, configured to collect a set of action data in a current augmented reality AR scene;
a feature extraction module, configured to perform feature extraction on the action data to obtain a first feature of the action data;
a feature comparing module, configured to compare the first feature with the second features of multiple groups of sample operation data, respectively, to obtain similarity comparison results;
an operation determining module, configured to determine one group of sample operation data according to the similarity comparison results;
an interaction execution module, configured to execute a preset interactive operation according to the associated action corresponding to the group of sample operation data.
In a third aspect, an embodiment of the present invention provides an electronic equipment, including:
a processor; and
a memory configured to store processor-executable instructions;
wherein the processor is configured to:
collect a set of action data in a current augmented reality AR scene;
perform feature extraction on the action data to obtain a first feature of the action data;
compare the first feature with the second features of multiple groups of sample operation data, respectively, to obtain similarity comparison results;
determine one group of sample operation data according to the similarity comparison results;
execute a preset interactive operation according to the associated action corresponding to the group of sample operation data.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements:
collecting a set of action data in a current augmented reality AR scene;
performing feature extraction on the action data to obtain a first feature of the action data;
comparing the first feature with the second features of multiple groups of sample operation data, respectively, to obtain similarity comparison results;
determining one group of sample operation data according to the similarity comparison results;
executing a preset interactive operation according to the associated action corresponding to the group of sample operation data.
As can be seen from the above technical solutions, the AR-based man-machine interaction method, device and equipment provided by the present invention collect a set of action data in a current augmented reality AR scene, perform feature extraction on the action data, compare the extracted first feature with the second features of multiple groups of sample operation data, determine one group of sample operation data according to the similarity comparison results, and execute a preset interactive operation according to the associated action corresponding to that group. Multiple human-machine interaction modes can thus be supported on the basis of a single interface interaction program, which greatly facilitates both development by related developers and use by users, raises the intelligence level of human-machine interaction in AR scenes, and meets user demand.
Brief description of the drawings
Fig. 1 is a flowchart of an AR-based man-machine interaction method according to embodiment one of the present invention;
Fig. 2 is a flowchart of an AR-based man-machine interaction method according to embodiment two of the present invention;
Fig. 3 is a flowchart of an AR-based man-machine interaction method according to embodiment three of the present invention;
Fig. 4 is a flowchart of an AR-based man-machine interaction method according to embodiment four of the present invention;
Fig. 5 is a flowchart of an AR-based man-machine interaction method according to embodiment five of the present invention;
Fig. 6 is a flowchart of an AR-based man-machine interaction method according to embodiment six of the present invention;
Fig. 7 is a structural block diagram of an AR-based human-machine interaction device according to embodiment eight of the present invention;
Fig. 8 is a structural block diagram of an AR-based human-machine interaction device according to embodiment nine of the present invention;
Fig. 9 is a structural block diagram of an electronic equipment according to embodiment ten of the present invention.
Embodiments
In order to make the above objects, features and advantages of the present invention easier to understand, the present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
The terms used in this application are for the purpose of describing specific embodiments only and are not intended to limit the application. The singular forms "a", "said" and "the" used in this application and the appended claims are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in this application to describe various information, the information should not be limited by these terms. These terms are only used to distinguish information of the same type from each other. For example, without departing from the scope of this application, first information may also be referred to as second information and, similarly, second information may also be referred to as first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "as soon as" or "in response to determining".
Fig. 1 is a flowchart of an AR-based man-machine interaction method according to embodiment one of the present invention. This embodiment can be applied to terminal devices that support multiple AR interaction techniques (such as mobile phones, tablet computers and personal computers). As shown in Fig. 1, the method includes the following steps S101-S105:
S101: Collect a set of action data in the current augmented reality (AR) scene.
In an optional embodiment, while human-machine interaction is being carried out in the current AR scene, the terminal device can collect the user's action data through an AR application or a camera device (such as a camera module or video camera).
In an optional embodiment, the action data can include a frame-image sequence in two-dimensional or three-dimensional form (such as the video stream obtained during image collection), and the frame-image sequence can contain several frames of images.
S102: Perform feature extraction on the action data to obtain a first feature of the action data.
In an optional embodiment, the feature extracted from the action data can be one that highlights the characteristics of the action data well and shows its differences from, and connections with, other action data, so as to enhance feature recognition.
In an optional embodiment, a single feature can be extracted from the action data, or multiple features can be extracted at the same time (such as one or more of face key points or hand-gesture joint points), so as to characterize the action data fully and comprehensively.
In an optional embodiment, before feature extraction is performed on the action data, the action data can first be normalized, so that data collected by different devices or different AR applications is standardized and the accuracy of the algorithm is improved; this embodiment places no limit on the normalization method.
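By way of illustration only (the disclosure gives no concrete implementation), a minimal normalization step might look like the following Python sketch; the min-max scheme and the (frames, key points, coordinates) array layout are assumptions:

```python
import numpy as np

def normalize_action_data(frames):
    """Hypothetical normalization: rescale a sequence of key-point frames
    so that data collected by different devices or AR applications becomes
    comparable. Min-max scaling per run is only one possible choice."""
    frames = np.asarray(frames, dtype=np.float64)  # (n_frames, n_points, n_dims)
    lo = frames.min(axis=(0, 1), keepdims=True)
    hi = frames.max(axis=(0, 1), keepdims=True)
    span = np.where(hi - lo > 0, hi - lo, 1.0)     # guard against zero range
    return (frames - lo) / span                    # every coordinate in [0, 1]
```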
S103: Compare the first feature with the second features of multiple groups of sample operation data, respectively, to obtain similarity comparison results.
In an optional embodiment, the feature extracted from the sample operation data can be one that highlights the characteristics of that sample operation data well.
In an optional embodiment, a single feature can be extracted from the sample operation data, or multiple features can be extracted at the same time (corresponding to the features extracted from the action data), so as to characterize the sample operation data fully and comprehensively.
In an optional embodiment, the second feature includes a feature obtained by feature extraction from sample operation data previously entered by the user and/or preset sample operation data.
In an optional embodiment, the extracted first feature of the action data is compared for similarity with the second feature of each group among the multiple groups of sample data, so as to obtain the similarity comparison results.
S104: Determine one group of sample operation data according to the similarity comparison results.
In an optional embodiment, one group of sample operation data can be determined from the multiple groups of sample operation data on the basis of the similarity comparison results and a preset similarity comparison rule.
In an optional embodiment, the group of sample operation data most similar to the action data can be determined.
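The disclosure does not fix a similarity measure, so the following sketch of steps S103-S104 is illustrative only; flattened feature vectors and cosine similarity are assumptions:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two flattened feature vectors."""
    a, b = np.ravel(a), np.ravel(b)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def most_similar_group(first_feature, sample_groups):
    """Compare the first feature against the second feature of every group
    of sample operation data and return the best-matching group name, its
    score, and all similarity comparison results."""
    scores = {name: cosine_similarity(first_feature, second)
              for name, second in sample_groups.items()}
    best = max(scores, key=scores.get)
    return best, scores[best], scores
```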
S105: Execute a preset interactive operation according to the associated action corresponding to the group of sample operation data.
In an optional embodiment, once the group of sample operation data most similar to the action data has been determined, the associated action corresponding to that group can be determined; according to a correspondence built in advance, the associated action is used to make the associated terminal device execute the corresponding interactive operation (for example, "thumb movement" corresponds to "volume adjustment").
In an optional embodiment, the interactive operation can include, but is not limited to, touch-screen operations (such as press, long press, slide, drag or two-finger slide), and can also include operations on the buttons of the terminal device; this embodiment places no limit on this.
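For illustration, the dispatch from an associated action to its preset interactive operation might look like the sketch below; the action names, the stub device and its methods are all hypothetical:

```python
class TerminalDevice:
    """Stub standing in for the associated terminal device."""
    def adjust_volume(self, step):
        print(f"volume {step:+d}")
    def toggle_fullscreen(self):
        print("toggle fullscreen")

# Hypothetical mapping: associated action -> preset interactive operation.
ASSOCIATED_ACTIONS = {
    "thumb_movement": lambda dev: dev.adjust_volume(+1),
    "palm_swipe_left": lambda dev: dev.toggle_fullscreen(),
}

def execute_interaction(action_name, device):
    """Run the preset interactive operation for the associated action
    (step S105); unrecognized actions are simply ignored."""
    operation = ASSOCIATED_ACTIONS.get(action_name)
    if operation is not None:
        operation(device)

execute_interaction("thumb_movement", TerminalDevice())  # volume +1
```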
As can be seen from the above technical solution, through steps S101-S105 this embodiment collects a set of action data in the current augmented reality AR scene, performs feature extraction on the action data, compares the extracted first feature with the second features of multiple groups of sample operation data, determines one group of sample operation data according to the similarity comparison results, and executes a preset interactive operation according to the associated action corresponding to that group. Multiple human-machine interaction modes can thus be supported on the basis of a single interface interaction program, which greatly facilitates both development by related developers and use by users, raises the intelligence level of human-machine interaction in AR scenes, and meets user demand.
Fig. 2 is a flowchart of an AR-based man-machine interaction method according to embodiment two of the present invention. As shown in Fig. 2, the method includes steps S201-S207:
S201: Receive an interaction-mode control instruction; the instruction contains the currently selected interaction mode.
In an optional embodiment, the user can set the mode in which human-machine interaction is currently carried out (such as gesture interaction, expression interaction, eyeball interaction, skeleton interaction or lip-reading interaction) by sending the corresponding control instruction to the terminal device, which simplifies the recognition of the action data and in turn improves the efficiency of the subsequent human-machine interaction.
In an optional embodiment, once the terminal device has received the interaction mode currently selected by the user, it can proceed with the following steps.
S202: From the pre-saved first correspondence, query the multiple groups of sample operation data corresponding to the interaction mode.
In an optional embodiment, the multiple groups of sample operation data corresponding to the currently selected interaction mode are queried according to the first correspondence, built in advance, between the interaction mode and the groups of sample operation data.
S203: Collect a set of action data in the current augmented reality (AR) scene.
S204: Perform feature extraction on the action data to obtain a first feature of the action data.
S205: Compare the first feature with the second features of the multiple groups of sample operation data, respectively, to obtain similarity comparison results.
S206: Determine one group of sample operation data according to the similarity comparison results.
S207: Execute a preset interactive operation according to the associated action corresponding to the group of sample operation data.
Steps S203-S207 are identical to steps S101-S105 of the embodiment shown in Fig. 1; for the related explanation, refer to that embodiment, which is not repeated here.
As can be seen from the above technical solution, this embodiment receives an interaction-mode control instruction containing the currently selected interaction mode and queries, from the pre-saved first correspondence, the multiple groups of sample operation data corresponding to that mode. The type of action data to be collected can thus be determined quickly, and the sample data against which the collected action data is compared can be determined exactly, which simplifies the determination of the type of the collected action data and in turn improves the efficiency of the subsequent human-machine interaction.
Fig. 3 is a flowchart of an AR-based man-machine interaction method according to embodiment three of the present invention. As shown in Fig. 3, the method includes the following steps S301-S310:
S301: Determine the multiple groups of sample operation data to be saved and the interaction mode corresponding to them.
In an optional embodiment, multiple groups of sample operation data are collected in advance, such as multiple groups of gesture operation data, expression operation data, eyeball operation data, skeleton operation data or lip-reading operation data, where each group of sample operation data can correspond to one associated action.
In an optional embodiment, the groups of sample operation data to be saved and their corresponding interaction mode are determined; for example, the interaction mode corresponding to groups of gesture operation data is gesture interaction, the interaction mode corresponding to groups of expression operation data is expression interaction, and so on.
S302: Build a first correspondence between the groups of sample operation data and the interaction mode.
In an optional embodiment, once the groups of sample operation data to be saved and their corresponding interaction mode have been determined, a first correspondence between them can be built, for example as a correspondence table.
S303: Save the first correspondence.
In an optional embodiment, once the first correspondence between the groups of sample operation data and the interaction mode has been built, it can be saved; for example, the correspondence table can be stored in the terminal device.
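For illustration, the first correspondence could be held as a simple lookup table queried by the selected interaction mode; the mode names, group names and feature values below are hypothetical:

```python
# Hypothetical first correspondence (S302): interaction mode ->
# groups of sample operation data (dummy second-feature vectors).
first_correspondence = {
    "gesture": {
        "thumb_movement": [0.12, 0.80, 0.31],
        "palm_swipe_left": [0.77, 0.05, 0.44],
    },
    "expression": {
        "smile": [0.91, 0.22, 0.10],
    },
}

def query_sample_groups(interaction_mode):
    """Query the groups of sample operation data for the currently selected
    interaction mode (S202/S305); an unknown mode yields an empty dict."""
    return first_correspondence.get(interaction_mode, {})

print(list(query_sample_groups("gesture")))  # ['thumb_movement', 'palm_swipe_left']
```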
S304: Receive an interaction-mode control instruction; the instruction contains the currently selected interaction mode.
S305: From the pre-saved first correspondence, query the multiple groups of sample operation data corresponding to the interaction mode.
S306: Collect a set of action data in the current augmented reality (AR) scene.
S307: Perform feature extraction on the action data to obtain a first feature of the action data.
S308: Compare the first feature with the second features of the multiple groups of sample operation data, respectively, to obtain similarity comparison results.
S309: Determine one group of sample operation data according to the similarity comparison results.
S310: Execute a preset interactive operation according to the associated action corresponding to the group of sample operation data.
Steps S304-S310 are identical to steps S201-S207 of the embodiment shown in Fig. 2; for the related explanation, refer to that embodiment, which is not repeated here.
As can be seen from the above technical solution, this embodiment determines the multiple groups of sample operation data to be saved and the interaction mode corresponding to them, builds a first correspondence between the groups of sample operation data and the interaction mode, and saves that correspondence. The multiple groups of sample data can subsequently be determined quickly and accurately according to the correspondence and the interaction mode selected by the user, which greatly simplifies the determination of the type of the collected action data and in turn improves the efficiency of the subsequent human-machine interaction.
Fig. 4 is a flowchart of an AR-based man-machine interaction method according to embodiment four of the present invention. As shown in Fig. 4, the method includes the following steps S401-S407:
S401: Collect a set of action data in the current augmented reality (AR) scene.
S402: Perform feature extraction on the action data to obtain a first feature of the action data.
S403: Compare the first feature with the second features of multiple groups of sample operation data, respectively, to obtain similarity comparison results.
S404: Determine a similarity threshold according to a sensitivity parameter.
In an optional embodiment, the user can set the sensitivity parameter of the human-machine interaction according to personal preference; alternatively, developers can set it according to experience.
In an optional embodiment, the similarity threshold can be determined according to the sensitivity parameter set by the user or the developer; the similarity threshold can be used to judge whether the group of sample operation data determined to be most similar to the action data meets the requirement.
S405: Compare the similarity comparison results with the similarity threshold.
In an optional embodiment, suppose the similarity comparison results between the first feature and the second features of the multiple groups of sample operation data obtained in step S403 are 85%, 40% and 20% (corresponding respectively to the features of the first, second and third groups of sample operation data), and the similarity threshold is 60%. Comparing each result against the threshold, it can be determined that one similarity comparison result reaches the similarity threshold (85% > 60%).
S406: If it is determined that a similarity comparison result reaches the similarity threshold, determine one group of sample operation data according to the similarity comparison results.
In an optional embodiment, if it is determined that a similarity comparison result reaches the similarity threshold, one group of sample operation data can be determined according to the result that reaches the threshold; in the example above, the first group of sample operation data would be determined as the selected group.
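A minimal sketch of steps S404-S406, reusing the 85%/40%/20% scores and the 60% threshold from the example above, is shown below; the mapping from the sensitivity parameter to the threshold is an assumption, since the disclosure leaves it open:

```python
def similarity_threshold(sensitivity):
    """Hypothetical mapping from a sensitivity parameter in [0, 1] to a
    similarity threshold: higher sensitivity accepts weaker matches."""
    return 1.0 - 0.5 * sensitivity          # e.g. sensitivity 0.8 -> threshold 0.6

def determine_group(scores, threshold):
    """Keep only groups whose similarity comparison result reaches the
    threshold (S405) and pick the best of them (S406)."""
    candidates = {g: s for g, s in scores.items() if s >= threshold}
    return max(candidates, key=candidates.get) if candidates else None

scores = {"group1": 0.85, "group2": 0.40, "group3": 0.20}
print(determine_group(scores, similarity_threshold(0.8)))  # group1 (0.85 > 0.60)
```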
S407: Execute a preset interactive operation according to the associated action corresponding to the group of sample operation data.
Steps S401-S403 and S406-S407 are identical to steps S101-S105 of the embodiment shown in Fig. 1; for the related explanation, refer to that embodiment, which is not repeated here.
As can be seen from the above technical solution, this embodiment determines a similarity threshold according to a sensitivity parameter, compares the similarity comparison results with the similarity threshold, and, when a similarity comparison result is determined to reach the threshold, determines one group of sample operation data according to that result. The sensitivity of the human-machine interaction can thus be set by the user or the developer, which improves the flexibility and intelligence level of the interaction mode and enhances the user experience.
Fig. 5 is a flowchart of an AR-based man-machine interaction method according to embodiment five of the present invention. As shown in Fig. 5, the method includes the following steps S501-S506:
S501: Collect a set of action data in the current augmented reality (AR) scene.
S502: Perform feature extraction on the action data to obtain a first feature of the action data.
S503: Compare the first feature with the second features of multiple groups of sample operation data, respectively, to obtain similarity comparison results.
S504: Determine one group of sample operation data according to the similarity comparison results.
S505: From the pre-saved second correspondence, query the associated action corresponding to the group of sample operation data.
In an optional embodiment, the associated action corresponding to the group of sample operation data can be queried from the pre-saved second correspondence between each group of sample operation data and its associated action.
S506: Execute a preset interactive operation according to the associated action corresponding to the group of sample operation data.
Steps S501-S504 and S506 are identical to steps S101-S105 of the embodiment shown in Fig. 1; for the related explanation, refer to that embodiment, which is not repeated here.
As can be seen from the above technical solution, by querying the associated action corresponding to the group of sample operation data from the pre-saved second correspondence, this embodiment can improve the speed and accuracy of determining the associated action, which simplifies that determination and in turn improves the efficiency of the subsequent human-machine interaction.
Fig. 6 is a flowchart of an AR-based man-machine interaction method according to embodiment six of the present invention. As shown in Fig. 6, the method includes the following steps S601-S609:
S601: Determine an associated action to be associated and the group of sample operation data corresponding to it; the associated action is used to make the associated terminal device execute the corresponding interactive operation.
In an optional embodiment, multiple associated actions to be associated are determined in advance, such as terminal-device volume adjustment, screen-brightness adjustment, and full-screen/windowed switching.
In an optional embodiment, the group of sample operation data corresponding to each of the associated actions is determined, such as a group of gesture operation data, a group of expression operation data, a group of eyeball operation data, a group of skeleton operation data or a group of lip-reading operation data.
S602: Build a second correspondence between the sample operation data and the associated action.
In an optional embodiment, once the associated action to be associated and its corresponding group of sample operation data have been determined, a second correspondence between them can be built, for example as a correspondence table.
S603: Save the second correspondence.
In an optional embodiment, once the second correspondence between the sample operation data and the associated action has been built, it can be saved; for example, the correspondence table can be stored in the terminal device.
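For illustration, saving and reloading such a correspondence table (steps S602-S603) could be done as follows; the JSON format and the file name are assumptions, since the disclosure only states that the table is stored in the terminal device:

```python
import json

def save_correspondence(table, path="second_correspondence.json"):
    """Persist a correspondence table on the terminal device (S603)."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(table, f, ensure_ascii=False, indent=2)

def load_correspondence(path="second_correspondence.json"):
    """Reload the table for later queries (S505/S608)."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)

# e.g. each group of sample operation data mapped to its associated action
save_correspondence({"thumb_movement_samples": "adjust_volume"})
print(load_correspondence())
```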
S604: Collect a set of action data in the current augmented reality (AR) scene.
S605: Perform feature extraction on the action data to obtain a first feature of the action data.
S606: Compare the first feature with the second features of multiple groups of sample operation data, respectively, to obtain similarity comparison results.
S607: Determine one group of sample operation data according to the similarity comparison results.
S608: From the pre-saved second correspondence, query the associated action corresponding to the group of sample operation data.
S609: Execute a preset interactive operation according to the associated action corresponding to the group of sample operation data.
Steps S604-S609 are identical to steps S501-S506 of the embodiment shown in Fig. 5; for the related explanation, refer to that embodiment, which is not repeated here.
As can be seen from the above technical solution, this embodiment determines an associated action to be associated and the group of sample operation data corresponding to it, builds a second correspondence between the sample operation data and the associated action, and saves that correspondence. The associated action can subsequently be determined quickly and accurately according to the correspondence and the determined group of sample operation data, which greatly simplifies the determination of the associated action and in turn improves the efficiency of the subsequent human-machine interaction.
Fig. 7 is a structural block diagram of an AR-based human-machine interaction device according to embodiment eight of the present invention. As shown in Fig. 7, the device includes a data acquisition module 110, a feature extraction module 120, a feature comparing module 130, an operation determining module 140 and an interaction execution module 150, wherein:
the data acquisition module 110 is configured to collect a set of action data in a current augmented reality AR scene;
the feature extraction module 120 is configured to perform feature extraction on the action data to obtain a first feature of the action data;
the feature comparing module 130 is configured to compare the first feature with the second features of multiple groups of sample operation data, respectively, to obtain similarity comparison results;
the operation determining module 140 is configured to determine one group of sample operation data according to the similarity comparison results;
the interaction execution module 150 is configured to execute a preset interactive operation according to the associated action corresponding to the group of sample operation data.
As can be seen from the above technical solution, by collecting a set of action data in a current augmented reality AR scene, performing feature extraction on the action data, comparing the extracted first feature with the second features of multiple groups of sample operation data, determining one group of sample operation data according to the similarity comparison results, and executing a preset interactive operation according to the associated action corresponding to that group, the AR-based human-machine interaction device of this embodiment can support multiple human-machine interaction modes on the basis of a single interface interaction program, which greatly facilitates both development by related developers and use by users, raises the intelligence level of human-machine interaction in AR scenes, and meets user demand.
Fig. 8 is a structural block diagram of an AR-based human-machine interaction device according to embodiment nine of the present invention. The data acquisition module 240, feature extraction module 260, feature comparing module 270, operation determining module 280 and interaction execution module 330 are identical to the data acquisition module 110, feature extraction module 120, feature comparing module 130, operation determining module 140 and interaction execution module 150 of the embodiment shown in Fig. 7 and are not described again here. As shown in Fig. 8, on the basis of the above embodiment, the device can further include:
a command reception module 210, configured to receive an interaction-mode control instruction, the instruction containing the currently selected interaction mode;
an action query module 230, configured to query, from a pre-saved first correspondence, the multiple groups of sample operation data corresponding to the interaction mode.
In an optional embodiment, the device can further include a first preserving module 220 configured to pre-save the first correspondence; the first preserving module 220 can include:
a first determining unit 221, configured to determine multiple groups of sample operation data to be saved and the interaction mode corresponding to those groups;
a first construction unit 222, configured to build a first correspondence between the groups of sample operation data and the interaction mode;
a first storage unit 223, configured to save the first correspondence.
In an optional embodiment, the device can further include:
a threshold determination module 290, configured to determine a similarity threshold according to a sensitivity parameter;
a threshold comparison module 300, configured to compare the similarity comparison results with the similarity threshold.
On this basis, the interaction execution module 330 can further be configured to, when it is determined that a similarity comparison result reaches the similarity threshold, perform the operation of determining one group of sample operation data according to the similarity comparison results.
In an optional embodiment, the device can further include:
an action query module 320, configured to query, from a pre-saved second correspondence, the associated action corresponding to the group of sample operation data.
In an optional embodiment, the device can further include a second preserving module 310 configured to pre-save the second correspondence; the second preserving module 310 can include:
a second determining unit 311, configured to determine an associated action to be associated and the group of sample operation data corresponding to it, the associated action being used to make the associated terminal device execute the corresponding interactive operation;
a second construction unit 312, configured to build a second correspondence between the sample operation data and the associated action;
a second storage unit 313, configured to save the second correspondence.
In an optional embodiment, the device can further include:
a normalization module 250, configured to normalize the action data before feature extraction is performed on it.
It should be noted that, since the device embodiments essentially correspond to the method embodiments, the relevant parts of the description of the method embodiments apply to them and are not repeated here.
The embodiments of the AR-based human-machine interaction device of the present invention can be applied to network devices. The device embodiments can be implemented in software, or in hardware or a combination of software and hardware. Taking software implementation as an example, the device, as a logical entity, is formed by the processor of the equipment in which it resides reading the corresponding computer program instructions into memory and running them. In hardware terms, Fig. 9 shows a hardware-structure diagram of the equipment in which the AR-based human-machine interaction device of the present invention resides; besides the processor, network interface, memory and non-volatile storage shown in Fig. 9, the equipment in the embodiments can generally include other hardware, such as a forwarding chip responsible for processing messages. In terms of hardware structure, the equipment may also be a distributed device, possibly comprising multiple interface cards, so that message processing can be extended at the hardware level.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements:
collecting a set of action data in a current augmented reality AR scene;
performing feature extraction on the action data to obtain a first feature of the action data;
comparing the first feature with the second features of multiple groups of sample operation data, respectively, to obtain similarity comparison results;
determining one group of sample operation data according to the similarity comparison results;
executing a preset interactive operation according to the associated action corresponding to the group of sample operation data.
The embodiments of the present invention collect a set of action data in a current augmented reality AR scene, perform feature extraction on the action data, compare the extracted first feature with the second features of multiple groups of sample operation data, determine one group of sample operation data according to the similarity comparison results, and execute a preset interactive operation according to the associated action corresponding to that group. Multiple human-machine interaction modes can thus be supported on the basis of a single interface interaction program. For developers, only the trained algorithm model and the associated configuration file need to be loaded, which greatly facilitates development; for users, each user can flexibly bind a given action to a given interactive operation according to personal character and habit, which greatly facilitates use, raises the intelligence level of human-machine interaction in AR scenes, and meets user demand.
Each embodiment in this specification is described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the identical or similar parts of the embodiments can be cross-referenced. Since the device embodiments are substantially similar to the method embodiments, their description is relatively brief; for the relevant parts, refer to the description of the method embodiments.
The above are merely preferred embodiments of the present invention and are not intended to limit it. Any modification, equivalent substitution, improvement and the like made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.

Claims (10)

  1. An AR-based man-machine interaction method, characterised by including:
    collecting a set of action data in a current augmented reality AR scene;
    performing feature extraction on the action data to obtain a first feature of the action data;
    comparing the first feature with the second features of multiple groups of sample operation data, respectively, to obtain similarity comparison results;
    determining one group of sample operation data according to the similarity comparison results;
    executing a preset interactive operation according to the associated action corresponding to the group of sample operation data.
  2. The method according to claim 1, characterised in that, before collecting the set of action data in the current augmented reality AR scene, the method further includes:
    receiving an interaction-mode control instruction, the instruction containing the currently selected interaction mode;
    querying, from a pre-saved first correspondence, the multiple groups of sample operation data corresponding to the interaction mode.
  3. The method according to claim 2, characterised by further including pre-saving the first correspondence according to the following steps:
    determining multiple groups of sample operation data to be saved and the interaction mode corresponding to those groups;
    building a first correspondence between the groups of sample operation data and the interaction mode;
    saving the first correspondence.
  4. The method according to claim 1, characterised in that, after the similarity comparison results are obtained, the method further includes:
    determining a similarity threshold according to a sensitivity parameter;
    comparing the similarity comparison results with the similarity threshold;
    if it is determined that a similarity comparison result reaches the similarity threshold, performing the operation of determining one group of sample operation data according to the similarity comparison results.
  5. The method according to claim 1, characterised in that, before executing the preset interactive operation according to the associated action corresponding to the group of sample operation data, the method further includes:
    querying, from a pre-saved second correspondence, the associated action corresponding to the group of sample operation data.
  6. The method according to claim 5, characterised by further including pre-saving the second correspondence according to the following steps:
    determining an associated action to be associated and the group of sample operation data corresponding to it, the associated action being used to make the associated terminal device execute the corresponding interactive operation;
    building a second correspondence between the sample operation data and the associated action;
    saving the second correspondence.
  7. The method according to claim 1, characterised in that the method further includes:
    normalizing the action data before feature extraction is performed on it.
  8. An AR-based human-machine interaction device, characterised by including:
    a data acquisition module, configured to collect a set of action data in a current augmented reality AR scene;
    a feature extraction module, configured to perform feature extraction on the action data to obtain a first feature of the action data;
    a feature comparing module, configured to compare the first feature with the second features of multiple groups of sample operation data, respectively, to obtain similarity comparison results;
    an operation determining module, configured to determine one group of sample operation data according to the similarity comparison results;
    an interaction execution module, configured to execute a preset interactive operation according to the associated action corresponding to the group of sample operation data.
  9. An electronic equipment, characterised in that the electronic equipment includes:
    a processor; and
    a memory configured to store processor-executable instructions;
    wherein the processor is configured to:
    collect a set of action data in a current augmented reality AR scene;
    perform feature extraction on the action data to obtain a first feature of the action data;
    compare the first feature with the second features of multiple groups of sample operation data, respectively, to obtain similarity comparison results;
    determine one group of sample operation data according to the similarity comparison results;
    execute a preset interactive operation according to the associated action corresponding to the group of sample operation data.
  10. A computer-readable storage medium storing a computer program, characterised in that the program, when executed by a processor, implements:
    collecting a set of action data in a current augmented reality AR scene;
    performing feature extraction on the action data to obtain a first feature of the action data;
    comparing the first feature with the second features of multiple groups of sample operation data, respectively, to obtain similarity comparison results;
    determining one group of sample operation data according to the similarity comparison results;
    executing a preset interactive operation according to the associated action corresponding to the group of sample operation data.
CN201710943334.6A 2017-10-11 2017-10-11 Man-machine interaction method, device and equipment based on AR Pending CN107831890A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710943334.6A CN107831890A (en) 2017-10-11 2017-10-11 Man-machine interaction method, device and equipment based on AR

Publications (1)

Publication Number Publication Date
CN107831890A true CN107831890A (en) 2018-03-23

Family

ID=61647799

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710943334.6A Pending CN107831890A (en) 2017-10-11 2017-10-11 Man-machine interaction method, device and equipment based on AR

Country Status (1)

Country Link
CN (1) CN107831890A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105045398A (en) * 2015-09-07 2015-11-11 哈尔滨市一舍科技有限公司 Virtual reality interaction device based on gesture recognition
CN105653037A (en) * 2015-12-31 2016-06-08 张小花 Interactive system and method based on behavior analysis
CN106997236A (en) * 2016-01-25 2017-08-01 亮风台(上海)信息科技有限公司 Based on the multi-modal method and apparatus for inputting and interacting
CN106095111A (en) * 2016-06-24 2016-11-09 北京奇思信息技术有限公司 The method that virtual reality is mutual is controlled according to user's eye motion
CN107066081A (en) * 2016-12-23 2017-08-18 歌尔科技有限公司 The interaction control method and device and virtual reality device of a kind of virtual reality system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110314344A (en) * 2018-03-30 2019-10-11 杭州海康威视数字技术股份有限公司 Move based reminding method, apparatus and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination