CN103513906B - Command recognition method, device and electronic device - Google Patents

Command recognition method, device and electronic device

Info

Publication number
CN103513906B
CN103513906B (application CN201210223896.0A)
Authority
CN
China
Prior art keywords
user
information
electronic equipment
image
target object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201210223896.0A
Other languages
Chinese (zh)
Other versions
CN103513906A (en)
Inventor
Yang Guang (阳光)
He Zhiqiang (贺志强)
Chai Haixin (柴海新)
Fu Rongyao (付荣耀)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201210223896.0A
Publication of CN103513906A
Application granted
Publication of CN103513906B
Legal status: Active


Abstract

The invention discloses a command recognition method, a command recognition device and an electronic device. The method is applied to an electronic device provided with an image capture unit and includes: collecting first image information captured by the image capture unit, the first image information including at least information about objects on the display interface of the electronic device; analyzing the first image information and, when user image data is identified in the first image information, triggering the image capture unit to capture action image information of the user; determining, from the action image information of the user and the object information in the first image information, the target object on the display interface that the user intends to operate; and determining an operation instruction according to the action image information of the user, the target object and the current running state of the electronic device, and executing the operation instruction. The method makes operating the electronic device more convenient.

Description

Command recognition method, device and electronic device
Technical field
The present invention relates to the technical field of data processing, and in particular to a command recognition method, a command recognition device and an electronic device.
Background
Electronic devices with touch screens are becoming increasingly common, such as mobile phones, pads, notebook computers and tablet computers equipped with touch screens.
An electronic device provided with a touch screen can capture a user's touch operation on the touch screen, generate a corresponding command, and perform the corresponding operation according to that command. In some cases, however, if the electronic device can only recognize user operations on the touch screen, the user can issue an instruction only by touching or clicking the touch screen. This makes certain processing operations inconvenient, complicates the user's interaction with the electronic device, and degrades the user experience.
Summary of the invention
In view of this, the present invention provides a command recognition method, a command recognition device and an electronic device. The method allows the electronic device to be operated more conveniently, simplifies the operation process and improves the user experience.
To achieve the above object, the present invention provides the following technical solution. A command recognition method is applied to an electronic device provided with an image capture unit, and the method includes:
collecting first image information captured by the image capture unit, the first image information including at least information about objects placed on the electronic device;
analyzing the first image information and, when user image data is identified in the first image information, triggering the image capture unit to capture action image information of the user;
determining, from the action image information of the user and the object information in the first image information, the target object placed on the electronic device that the user intends to operate;
determining an operation instruction according to the action image information of the user, the target object and the current running state of the electronic device, and executing the operation instruction.
In another aspect, the present invention further provides a command recognition device applied to an electronic device, including:
a first image capture unit, configured to capture first image information including at least information about objects placed on the electronic device;
an image analysis unit, configured to analyze the first image information and, when user image data is identified in the first image information, trigger operation of a second image capture unit;
the second image capture unit, configured to capture action image information of the user;
an operation object determining unit, configured to determine, from the action image information of the user and the object information in the first image information, the target object placed on the electronic device that the user intends to operate;
a command executing unit, configured to determine an operation instruction according to the action image information of the user, the target object and the current running state of the electronic device, and to execute the operation instruction.
In another aspect, the present invention further provides an electronic device, including:
a first camera, a second camera and a processor;
the first camera captures first image information including at least information about objects placed on the electronic device, and transfers the captured first image data to the processor;
the second camera captures action image information of the user, and transfers the action image information to the processor;
the processor analyzes the first image information captured by the first camera; when user image data is identified in the first image information, it triggers the second camera to capture action image information of the user; it determines, from the action image information of the user and the object information in the first image information, the target object placed on the electronic device that the user intends to operate; and it determines an operation instruction according to the action image information of the user, the target object and the current running state of the electronic device, and executes the operation instruction.
Compared with the prior art, the above technical solution shows that the present disclosure provides a command recognition method, a command recognition device and an electronic device. The method is applied to an electronic device provided with an image capture unit and includes: collecting first image information captured by the image capture unit, the first image information including at least information about objects on the display interface of the electronic device; analyzing the first image information and, when user image data is identified in the first image information, triggering the image capture unit to capture action image information of the user; determining, from the action image information of the user and the object information in the first image information, the target object on the display interface that the user intends to operate; and determining an operation instruction according to the action image information of the user, the target object and the current running state of the electronic device, and executing the operation instruction. With this method, the user can issue an action instruction without touching the display interface; the electronic device determines and executes the operation instruction from the captured action image information of the user combined with the object information on the display interface, which makes operating the electronic device more convenient.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the accompanying drawings described below show only embodiments of the present invention, and a person of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of an embodiment of a command recognition method according to the present invention;
Fig. 2 is a schematic diagram of an electronic device capturing images with two cameras according to the present invention;
Fig. 3 is a flowchart of another embodiment of a command recognition method according to the present invention;
Fig. 4 is a flowchart of a further embodiment of a command recognition method according to the present invention;
Fig. 5 is a structural diagram of an embodiment of a command recognition device according to the present invention;
Fig. 6 is a structural diagram of another embodiment of a command recognition device according to the present invention;
Fig. 7 is a structural diagram of a further embodiment of a command recognition device according to the present invention;
Fig. 8 is a structural diagram of an electronic device according to the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The embodiments of the present invention disclose a command recognition method applied to an electronic device provided with an image capture unit. The method includes: collecting first image information captured by the image capture unit, the first image information including at least information about objects placed on the electronic device; analyzing the first image information and, when user image data is identified in the first image information, triggering the image capture unit to capture action image information of the user; determining, from the action image information of the user and the object information in the first image information, the target object placed on the electronic device that the user intends to operate; and determining an operation instruction according to the action image information of the user, the target object and the current running state of the electronic device, and executing the operation instruction. The method makes it easy to operate the electronic device, simplifies the operation process and improves the user experience.
The command recognition method of the present invention is described in detail below with reference to the accompanying drawings. Fig. 1 is a flowchart of an embodiment of a command recognition method according to the present invention. The method is applied to an electronic device, which may be a pad, a mobile phone, a tablet computer or the like. In this embodiment the electronic device includes an image capture unit, which can be understood as an image acquisition device such as a camera. The method includes:
Step 101: collect first image information captured by the image capture unit, the first image information including at least information about objects placed on the electronic device.
The image capture unit captures images of the electronic device. Its capture range in this step can be set as needed, but the first image captured by the image capture unit should include at least information about the objects placed on the electronic device, specifically whether objects are placed on the electronic device, which objects are placed on it, the number of objects, and so on.
Depending on the capture range of the image capture unit, the object information contained in the first image information can be obtained in different ways. For example, when the electronic device has a display interface, i.e. a display function, the image capture unit can be controlled to monitor the display interface of the electronic device, and the first image information captured by the image capture unit from the display interface is collected; this first image information includes at least information about the objects placed on the display interface of the electronic device. If the electronic device has no display interface, a capture region can be set instead, with a range chosen as needed, and the image capture unit is controlled to acquire images of that capture region to obtain information about the objects placed in that region of the electronic device, such as the objects placed on a certain fixed plane of the electronic device.
Step 102: analyze the first image information and, when user image data is identified in the first image information, trigger the image capture unit to capture action image information of the user.
If the analysis of the first image information shows that it contains user image data, i.e. it contains image information of part of the user, such as the user's hand, face or other trunk information, it is determined that user image data has been captured within the capture range, and capture of the user's action image information is then triggered.
The action image information of the user includes facial expression information of the user, such as the user's eye movements and mouth deformation, and may also include trunk movement information of the user, such as body twisting and the user's gesture movements.
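As a minimal sketch of steps 101 and 102 — not taken from the patent; the camera indices, the face detector standing in for "identify user image data", and the frame count are all assumptions — the following Python code monitors the first camera and triggers the second camera only when a user appears in the first image:

```python
import cv2

# Assumption: camera indices 0 and 1 correspond to the first and second image capture units.
FIRST_CAM, SECOND_CAM = 0, 1

# A stock Haar cascade stands in for "identify user image data"; the patent does
# not prescribe a particular detector.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def first_image_contains_user(frame) -> bool:
    """Step 102: return True when user image data (here, a face) is present."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return len(face_detector.detectMultiScale(gray, 1.1, 5)) > 0

def capture_action_images(num_frames: int = 30):
    """Triggered capture of the user's action image information (step 102)."""
    cam = cv2.VideoCapture(SECOND_CAM)
    frames = []
    for _ in range(num_frames):
        ok, frame = cam.read()
        if ok:
            frames.append(frame)
    cam.release()
    return frames

def monitor_first_camera():
    cam = cv2.VideoCapture(FIRST_CAM)
    ok, first_image = cam.read()          # step 101: collect the first image
    cam.release()
    if ok and first_image_contains_user(first_image):
        return first_image, capture_action_images()
    return first_image, []
```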
Step 103: determine, from the action image information of the user and the object information in the first image information, the target object placed on the electronic device that the user intends to operate.
From the object information obtained from the first image information and the user's action image information, it can be determined which object placed on the electronic device the user wants to operate, and that object is taken as the target object that the user intends to operate.
Step 104: determine an operation instruction according to the action image information of the user, the information about the target object and the current running state of the electronic device, and execute the operation instruction.
After the target object that the user intends to operate has been determined, the operation instruction is determined according to the action image information of the user and the current running state of the electronic device. Depending on the target object and on the current running state of the electronic device, the same user action may trigger different operation instructions.
For example, depending on the running state of the electronic device and on the target object, the determined operation instruction may be to move the target object, with the specific moving direction related to the specific action in the captured action image information of the user; or to show information related to the target object; or to show a corresponding menu interface. For instance, electromagnets can be arranged below the region of the electronic device on which objects are placed, so that a corresponding magnetic field is produced by controlling the current through the electromagnets, thereby controlling the direction in which a target object placed on the electronic device is moved; naturally, in that case the target object should be an object that responds to changes in the induced magnetic field.
Specifically, according to the current running state of the electronic device, the action image information of the user is analyzed, and the operation instruction related to the target object is determined according to the captured user action and a preset correspondence between user actions and object operation instructions; control of the target object is then performed according to the operation instruction, where the operation instruction includes moving the target object in a specified direction and/or showing information about the target object.
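The "preset correspondence between user actions and object operation instructions" could be as simple as a lookup table keyed on the running state and the recognized action; the sketch below uses invented action labels, running-state names and instruction kinds purely for illustration:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical encodings; the patent only requires that a correspondence between
# user actions and object operation instructions be preset.
@dataclass
class Operation:
    kind: str                      # "move", "show_info", "show_menu", ...
    direction: Optional[str] = None

PRESET_CORRESPONDENCE = {
    # (running_state, user_action) -> Operation
    ("board_control", "gaze_left"):  Operation("move", "left"),
    ("board_control", "gaze_right"): Operation("move", "right"),
    ("board_control", "point_hold"): Operation("show_info"),
    ("idle",          "point_hold"): Operation("show_menu"),
}

def determine_operation(running_state: str, user_action: str) -> Optional[Operation]:
    """Step 104: look up the operation instruction for the captured action."""
    return PRESET_CORRESPONDENCE.get((running_state, user_action))
```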
In addition, the objects placed on the electronic device may also include a second electronic device, i.e. another controlled electronic device other than the present electronic device; the term "second electronic device" is used here only to distinguish the electronic device of the present invention from other electronic devices placed on it. For example, when the electronic device is a pad, the second electronic device may be a mobile phone. When the target object placed on the electronic device is a second electronic device, the generated operation instruction may be a control instruction for the second electronic device, or an instruction to show information about the second electronic device.
Specifically, when the target object is a second electronic device, a control instruction for controlling the second electronic device is generated according to the action image information of the user and the running state of the electronic device, and the control instruction is sent to the second electronic device. For example, when the target object is determined to be a second electronic device and, in the current running state of the electronic device, the user's action image information indicates that data should be transmitted, the captured user action image information is analyzed to determine the direction of data transmission. When the transmission direction is from the second electronic device to the electronic device, an instruction is generated that tells the second electronic device to transfer its data to the electronic device, and this instruction is sent to the second electronic device, so that the second electronic device executes the instruction and transmits the data. Of course, the user's action information may also indicate that data on the electronic device should be transferred to the second electronic device; in that case an instruction telling the second electronic device to receive data is generated, the electronic device then executes the user instruction and transmits the data to the second electronic device, and the second electronic device receives the data.
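As one hedged reading of how such a control instruction might be delivered to the second electronic device, the sketch below sends a small JSON command over TCP; the address, port, message fields and gesture labels are assumptions, since the patent does not specify a transport or message format:

```python
import json
import socket

# Hypothetical address of the second electronic device (e.g. the phone on the pad).
SECOND_DEVICE_ADDR = ("192.168.0.42", 5005)

def build_transfer_instruction(gesture_direction: str) -> dict:
    """Map the analyzed transfer direction onto a control instruction."""
    if gesture_direction == "toward_pad":       # second device -> electronic device
        return {"cmd": "send_data", "target": "pad"}
    else:                                       # electronic device -> second device
        return {"cmd": "receive_data", "source": "pad"}

def send_control_instruction(instruction: dict) -> None:
    """Send the generated control instruction to the second electronic device."""
    with socket.create_connection(SECOND_DEVICE_ADDR, timeout=2.0) as conn:
        conn.sendall(json.dumps(instruction).encode("utf-8"))
```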
In this embodiment, when the user needs to operate an object placed on the electronic device, or to input an instruction related to such an object, the user does not need to perform a complicated input operation; the user can issue the corresponding instruction through his or her action information. The electronic device determines the target object that the user intends to operate from the captured user action information and the objects placed on the electronic device, determines the operation instruction in combination with the running state of the electronic device, and performs the corresponding operation, which simplifies the operation process and improves operating efficiency.
It should be noted that, in order to capture the user's operating behaviour towards the objects placed on the electronic device more completely, two image capture units are usually provided on the electronic device, for example two cameras. The capture ranges of the two image capture units differ: one image capture unit captures images of a small region close to the electronic device, while the other captures images over a larger range. Specifically, the image capture unit of the electronic device includes a first image capture unit and a second image capture unit, which can be arranged on two mutually parallel sides of the electronic device. The first image capture unit captures a plane of the electronic device and the space extending perpendicular to that plane (the plane may be the display interface or some other plane of the electronic device). The first image capture unit captures first image information that includes at least information about the objects placed on the electronic device; the electronic device collects the first image information captured by the first image capture unit, and when the analysis shows that the first image information contains user image data, the second image capture unit is triggered to capture the action image information of the user. The spatial range captured by the second image capture unit is larger than that of the first image capture unit, so the second image capture unit can capture the action information of the various parts of the user more completely.
To show the relationship between the two cameras more clearly, Fig. 2 is a schematic diagram of the images captured by the two cameras of the electronic device of the present invention. Two objects are placed on one plane of the electronic device 1, and the electronic device 1 is provided with a first camera 11 and a second camera 12. The first camera captures images within the plane on which objects are placed on the electronic device, together with a partial image of the user; in Fig. 2, the two dashed lines extending from the first camera indicate its capture range. The second camera 12 is positioned higher than the first camera 11 and covers a larger spatial range, so it can capture a more complete action image of the user. Taking the case where object 21 and object 22 are placed on the display interface of the electronic device as an example, the first camera mainly captures the first image information in the designated region of the display interface, in order to determine whether objects are placed on the display interface of the electronic device and whether user action information is present in the designated region.
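As a small illustration of restricting the first camera's frame to such a designated region of the display interface, the following sketch crops the captured image to an assumed region of interest before any detection runs; the pixel coordinates are invented for the example and would in practice come from a calibration step:

```python
import numpy as np

# Hypothetical region of the first camera's frame that corresponds to the
# display interface (x0, y0, x1, y1 in pixels).
DISPLAY_ROI = (80, 60, 560, 420)

def crop_to_display_region(frame: np.ndarray) -> np.ndarray:
    """Keep only the part of the first image that shows the display interface."""
    x0, y0, x1, y1 = DISPLAY_ROI
    return frame[y0:y1, x0:x1]
```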
Since the action image information of the user may include facial image information of the user, gesture action image information of the user, and so on, and different action image information may trigger different operation actions when the operation instruction is determined, Fig. 3 shows a flowchart of another embodiment of the command recognition method of the present invention. This embodiment is a specific implementation of the embodiment shown in Fig. 1, described by taking the case where the captured user action information includes at least the facial information of the user as an example. The method of this embodiment is applied to an electronic device, which may be a pad, a tablet computer or the like. The method includes:
Step 301: collect first image information captured by the image capture unit, the first image information including at least information about objects placed on the electronic device.
Step 302: analyze the first image information and, when the first image information is identified as containing user image data, trigger the image capture unit to capture action image information of the user, the action image information including facial image information of the user.
Step 303: determine the user's eyeball gaze direction from the facial image information of the user, determine the intersection point of the line along the gaze direction with the electronic device, and obtain the coordinates of the intersection point.
After the facial image information of the user has been obtained, the user's eyeball gaze direction is analyzed; existing image recognition techniques can be used for this, and they are not described here. From the intersection of the line along the gaze direction with the electronic device, the intersection coordinates can be determined. The intersection coordinates can be understood as relative coordinates in the plane of the electronic device on which the objects are placed.
Taking the case where the objects are placed on the display interface as an example, the first image captured by the first image capture unit and collected in step 301 includes at least the information about the objects placed on the display interface. When the facial image information of the user is captured, the intersection point of the line along the user's gaze direction with the display interface is determined, and the coordinates of that intersection point on the display interface are obtained.
Step 304: analyze the object information in the first image information and determine the position coordinates of the objects on the electronic device.
The position coordinates of an object should likewise be coordinates relative to the plane of the electronic device on which the object is placed; for example, when an object is placed on the display interface of the electronic device, the coordinates of the object on the display interface are determined.
Step 305: take the object whose position coordinates are the same as the intersection coordinates as the target object that the user intends to operate.
From the captured first image information, the object that is placed on the electronic device at the position of the above intersection coordinates can be determined; that object is the target object that the user intends to operate.
When the objects are placed on the display interface of the electronic device, the determined target object is accordingly the target object placed on the display interface that the user intends to operate.
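The patent states the geometric idea of steps 303 to 305 without prescribing an implementation; the following Python sketch (the eye position, gaze vector and object list are invented inputs) intersects the gaze line with the display plane z = 0 and picks the detected object closest to that intersection point:

```python
import numpy as np

def gaze_intersection(eye_pos, gaze_dir):
    """Step 303: intersect the gaze line with the display plane z = 0.

    eye_pos, gaze_dir: 3-vectors in the device coordinate system; both are
    assumed to come from an upstream face/eye analysis stage.
    """
    eye_pos = np.asarray(eye_pos, dtype=float)
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    if abs(gaze_dir[2]) < 1e-9:            # gaze parallel to the display plane
        return None
    t = -eye_pos[2] / gaze_dir[2]
    if t < 0:                              # looking away from the display
        return None
    return (eye_pos + t * gaze_dir)[:2]    # (x, y) intersection coordinates

def pick_target_object(intersection, objects, tolerance=20.0):
    """Steps 304-305: match the intersection against detected object positions.

    objects: list of (object_id, (x, y)) pairs from the first image analysis.
    """
    if intersection is None:
        return None
    best_id, best_dist = None, tolerance
    for obj_id, pos in objects:
        dist = float(np.linalg.norm(np.asarray(pos, dtype=float) - intersection))
        if dist < best_dist:
            best_id, best_dist = obj_id, dist
    return best_id
```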
Step 306: determine an operation instruction according to the action image information of the user, the information about the target object and the current running state of the electronic device, and execute the operation instruction.
For this step, reference may be made to step 104 of the embodiment of Fig. 1, which is not repeated here.
Fig. 4 is a flowchart of a further embodiment of the command recognition method of the present invention. The method of this embodiment is a specific implementation of the embodiment shown in Fig. 1, described by taking the case where the captured user action image information includes at least gesture action image information of the user as an example. This embodiment includes:
Step 401: collect first image information captured by the image capture unit, the first image information including at least information about objects placed on the electronic device.
Step 402: analyze the first image information and, when the first image information is identified as containing user image data, trigger the image capture unit to capture action image information of the user, the action image information including gesture action image information of the user.
Step 403: analyze the gesture action image information of the user and determine the pointing direction of the user's gesture.
Step 404: according to the pointing direction of the user's gesture, determine the target object placed on the electronic device that matches the pointing direction of the user's gesture, and take that target object as the target object the user intends to operate.
In this embodiment, image analysis is performed on the captured gesture action image information of the user, the pointing direction of the user's gesture is determined from the gesture action image, and the target object to be operated that is placed on the electronic device is then determined from that pointing direction. The way the target object is determined in this step is similar to the way it is determined from the gaze direction in the embodiment of Fig. 3: the intersection point of the line along the pointing direction of the user's gesture with the electronic device is determined, the intersection coordinates are obtained, the object placed at the position of the intersection coordinates is determined, and that object is taken as the target object.
Of course, when the objects are placed on the display interface of the electronic device, the intersection coordinates of the line along the pointing direction of the user's gesture with the display interface are determined, the object placed at the position of those intersection coordinates on the display interface is determined, and that object is taken as the target object.
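The gesture case can reuse the same ray-plane helper sketched above; only the ray origin and direction change. In the sketch below, invented fingertip and hand-base keypoints stand in for the output of a gesture analysis stage:

```python
import numpy as np

def pointing_ray(hand_base, fingertip):
    """Step 403: turn two hand keypoints into a ray origin and unit direction."""
    hand_base = np.asarray(hand_base, dtype=float)
    fingertip = np.asarray(fingertip, dtype=float)
    direction = fingertip - hand_base
    norm = np.linalg.norm(direction)
    if norm > 0:
        direction = direction / norm
    return fingertip, direction

# Example (step 404): reuse gaze_intersection / pick_target_object from the
# previous sketch with the gesture ray instead of the gaze ray.
# origin, direction = pointing_ray(hand_base_3d, fingertip_3d)
# target = pick_target_object(gaze_intersection(origin, direction), detected_objects)
```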
Step 405: determine an operation instruction according to the action image information of the user, the information about the target object and the current running state of the electronic device, and execute the operation instruction.
For this step, reference may be made to the description of step 104 of the embodiment of Fig. 1, which is not repeated here.
For each of the above embodiments of the present invention, the action image information of the user may be captured in real time, or one frame of the user's action image information may be obtained at preset time intervals; likewise, the first image information may be obtained by monitoring the electronic device and acquiring images in real time, or by triggering the image capture unit to capture the first image information at preset time intervals. From the action image information of the user and the captured first image information, the target object that the user intends to operate can be determined; after the target object has been determined, the input action of the user can be determined from the action image information of the user and the first image information, the input instruction corresponding to that input action in the current running state of the electronic device is determined according to that running state, the corresponding operation instruction is generated, and the operation instruction is then executed.
Accordingly, when the operation instruction is determined, several frames of the user's action image information may need to be analyzed in order to determine the operation command corresponding to the user's action behaviour in the current running state.
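One way to read "several frames" is a short rolling buffer that is only classified into an action label once enough frames have accumulated; a sketch under that assumption, with the classifier itself left abstract:

```python
from collections import deque
from typing import Callable, Optional

class ActionRecognizer:
    """Accumulate a few action frames before deciding on a user action label."""

    def __init__(self, classify: Callable[[list], Optional[str]], window: int = 8):
        self._frames = deque(maxlen=window)   # rolling buffer of recent frames
        self._classify = classify             # assumed upstream action classifier

    def add_frame(self, frame) -> Optional[str]:
        self._frames.append(frame)
        if len(self._frames) < self._frames.maxlen:
            return None                        # not enough evidence yet
        return self._classify(list(self._frames))
```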
To make the solution of the present invention clearer, consider the case where the electronic device is a pad with two cameras and several chess pieces (for example for Chinese checkers or Chinese chess) are placed on the pad's display interface. The first camera on the pad captures images of a region that includes at least the display interface, and the second camera captures images of the display interface of the electronic device and of a preset region around the electronic device (for the capture ranges, see the schematic in Fig. 2). Assume that electromagnets are arranged below the pad's display interface; by controlling factors such as the magnitude of the current through the electromagnets, the pad controls the moving direction and moving distance of the objects placed on the display interface. Accordingly, the chess pieces placed on the pad's display interface are objects that can be moved by the induced magnetic field. With the method of the present invention, when user A places a chess piece on the pad's display interface, the first camera captures the chess piece placed on the electronic device together with part of the image information of the user, and it is thereby determined that the chess piece was placed by user A. At the same time, the second camera is triggered to capture the action image information of the user, to determine whether the user needs to issue an instruction.
During play, the current running state of the electronic device is the state of controlling the chessboard. When the user needs to move a chess piece, the user does not have to move the piece by hand; for example, the user can issue an instruction by eye movement. The electronic device determines the target to be operated from the captured action image information of the user, continues to capture the user's action image information (for example the direction of the user's eye movement), determines from the captured action image information the direction in which the user wants to move the target, generates the operation command for controlling the moving direction of the target chess piece, and controls the current supplied to the electromagnets, thereby moving the target.
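A hedged sketch of how such a move command might be mapped onto coil currents; the coil grid, driver interface and current values are all invented for illustration and not specified by the patent:

```python
# Hypothetical grid of electromagnet coils under the display, addressed by
# (column, row); set_coil_current() stands in for whatever driver the pad uses.
def set_coil_current(col: int, row: int, amps: float) -> None:
    ...  # hardware-specific; not specified by the patent

DIRECTION_OFFSETS = {"left": (-1, 0), "right": (1, 0), "up": (0, -1), "down": (0, 1)}

def move_piece(piece_cell: tuple, direction: str, pull_current: float = 0.5) -> tuple:
    """Energize the neighbouring coil in the commanded direction to attract the piece."""
    dc, dr = DIRECTION_OFFSETS[direction]
    target_cell = (piece_cell[0] + dc, piece_cell[1] + dr)
    set_coil_current(piece_cell[0], piece_cell[1], 0.0)   # release the coil under the piece
    set_coil_current(target_cell[0], target_cell[1], pull_current)
    return target_cell
```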
Of course, this example only takes moving an object as an example for ease of understanding. It should be understood that, depending on the target object and the current running state of the electronic device, the operation instruction corresponding to the same user action may differ. For example, the generated operation instruction may also be to obtain preset information about the target object and show it on the display interface, or to show a control menu related to the target object, and so on.
Corresponding to the command recognition method of the present invention, the present invention further provides a command recognition device applied to an electronic device. Fig. 5 is a structural diagram of an embodiment of the command recognition device of the present invention. In this embodiment the command recognition device includes: a first image capture unit 510, an image analysis unit 520, a second image capture unit 530, an operation object determining unit 540 and a command executing unit 550.
The first image capture unit 510 is configured to capture first image information including at least information about the objects placed on the electronic device.
The image capture unit captures images of the electronic device; its capture range can be set as needed, but the first image it captures should include at least information about the objects placed on the electronic device, specifically whether objects are placed on the electronic device, which objects are placed on it, the number of objects, and so on.
The image analysis unit 520 is configured to analyze the first image information and, when user image data is identified in the first image information, trigger operation of the second image capture unit.
The image analysis unit may analyze the first image information using existing image recognition techniques; when the analysis shows that a user image is present in the first image information captured by the first image capture unit, the operation of capturing the action image information of the user is triggered.
The second image capture unit 530 is configured to capture the action image information of the user.
The second image capture unit may capture the action image information of the user in real time, or capture it at preset time intervals. The action image information of the user captured by the second image capture unit includes the facial image information of the user, the gesture action image information of the user, other limb action information of the user, and so on.
The operation object determining unit 540 is configured to determine, from the action image information of the user and the object information in the first image information, the target object placed on the electronic device that the user intends to operate.
The command executing unit 550 is configured to determine an operation instruction according to the action image information of the user, the target object and the current running state of the electronic device, and to execute the operation instruction.
The target objects placed on the electronic device can be of various types; for example, an object placed on the electronic device may be an object that can be moved on the electronic device under the control of the electronic device (for example by an induced magnetic field), or another electronic device.
Accordingly, the command executing unit is specifically configured to: analyze the action image information of the user according to the current running state of the electronic device, determine the operation instruction related to the target object according to the captured user action and a preset correspondence between user actions and object operation instructions, and perform the control of the target object according to the operation instruction, where the operation instruction includes moving the target object in a specified direction and/or showing information about the target object.
When the target object is identified as another electronic device, the command executing unit is specifically configured to: when the target object is a second electronic device, generate a control instruction for controlling the second electronic device according to the action image information of the user and the running state of the electronic device, and send the control instruction to the second electronic device.
In addition, the first image information captured by the first image capture unit may be an image of a preset region of the electronic device. Specifically, the first image capture unit 510 captures at least the information about the objects placed on the display interface of the electronic device.
Correspondingly, the operation object determining unit is specifically configured to determine, from the action image information of the user and the object information in the first image information, the target object placed on the display interface that the user intends to operate.
It should be noted that, in the present invention, the first image capture unit and the second image capture unit may be the same unit or two different units.
In this embodiment the command recognition device can collect the first image information including at least the information about the objects placed on the electronic device; when the analysis shows that user image data is present in the first image information, it obtains the action image information of the user; it then determines, from the action image information of the user and the object information in the first image information, the target object placed on the electronic device that the user intends to operate; and it generates an operation instruction according to the current running state of the electronic device, the action information of the user and the target object information, and executes the corresponding operation instruction. This spares the user from manually inputting a complicated operation instruction related to the target object and simplifies the operation process.
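A compact sketch of how the five units of Fig. 5 could be wired together in code; the unit internals are injected as callables like those sketched earlier, and all names are illustrative only:

```python
class CommandRecognitionDevice:
    """Wiring of the Fig. 5 units: first capture -> image analysis -> second
    capture -> operation object determination -> command execution."""

    def __init__(self, first_capture, contains_user, second_capture,
                 determine_target, execute_command):
        self.first_capture = first_capture        # first image capture unit (510)
        self.contains_user = contains_user        # image analysis unit (520)
        self.second_capture = second_capture      # second image capture unit (530)
        self.determine_target = determine_target  # operation object determining unit (540)
        self.execute_command = execute_command    # command executing unit (550)

    def run_once(self, running_state):
        first_image = self.first_capture()
        if not self.contains_user(first_image):
            return None                           # no user image data: nothing to do
        action_info = self.second_capture()
        target = self.determine_target(action_info, first_image)
        if target is None:
            return None
        return self.execute_command(action_info, target, running_state)
```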
Since there are various kinds of captured action image information of the user — for example the action image information may include facial image information of the user and gesture action image information of the user, where the facial image information may in turn include eyeball movement, movements of the mouth and nose, facial expressions and so on — the process of determining the target object may differ depending on the action image information obtained. Fig. 6 is a structural diagram of another embodiment of the command recognition device of the present invention; it differs from the embodiment shown in Fig. 5 as follows:
Taking the case where the obtained action image information of the user includes the gesture action image information of the user as an example, the operation object determining unit 540 specifically includes:
a first analysis unit 541, configured to analyze the gesture action image information of the user and determine the pointing direction of the user's gesture;
a first object determining unit 542, configured to determine, according to the pointing direction of the user's gesture, the target object placed on the electronic device that matches the pointing direction of the user's gesture, and to take that target object as the target object the user intends to operate.
For the operation of the other units, reference may be made to the description of the embodiment shown in Fig. 5.
Further, when the action image information of the user includes facial image information of the user, Fig. 7 is a structural diagram of a further embodiment of the command recognition device of the present invention; this embodiment differs from the embodiment shown in Fig. 5 in that:
the operation object determining unit 540 may include:
a second analysis unit 543, configured to determine the user's eyeball gaze direction from the facial image information of the user, determine the intersection point of the line along the gaze direction with the electronic device, and obtain the intersection coordinates;
a coordinate analysis unit 544, configured to analyze the object information in the first image information and determine the position coordinates of the objects on the electronic device;
a second object determining unit 545, configured to take the object whose position coordinates are the same as the intersection coordinates as the target object that the user intends to operate.
Corresponding to the command recognition device of the present invention, the present invention further provides an electronic device. Fig. 8 is a structural diagram of an electronic device of the present invention. The electronic device includes: a first camera 801, a second camera 802 and a processor 803;
the first camera 801 captures first image information including at least information about the objects placed on the electronic device, and transfers the captured first image data to the processor;
the second camera 802 captures the action image information of the user, and transfers the action image information to the processor;
the processor 803 analyzes the first image information captured by the first camera; when user image data is identified in the first image information, it triggers the second camera to capture the action image information of the user; it determines, from the action image information of the user and the object information in the first image information, the target object placed on the electronic device that the user intends to operate; and it determines an operation instruction according to the action image information of the user, the target object and the current running state of the electronic device, and executes the operation instruction.
The first camera and the second camera can be understood as two different cameras, or as the same camera.
The processor has built into it a command recognition device according to any one of the above embodiments.
The embodiments in this specification are described in a progressive manner, each embodiment focusing on its differences from the other embodiments; for identical or similar parts of the embodiments, reference may be made to each other. Since the device and the electronic device disclosed in the embodiments correspond to the methods disclosed in the embodiments, their description is relatively brief, and for relevant details reference may be made to the description of the methods.
The above description of the disclosed embodiments enables those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not intended to be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (15)

1. A command recognition method, characterized in that it is applied to an electronic device provided with an image capture unit, the method comprising:
collecting first image information captured by the image capture unit, the first image information including at least information about objects placed on the electronic device;
analyzing the first image information and, when user image data is identified in the first image information, triggering the image capture unit to capture action image information of the user;
determining, from the action image information of the user and the object information in the first image information, the target object placed on the electronic device that the user intends to operate;
determining an operation instruction according to the action image information of the user, the target object and the current running state of the electronic device, and executing the operation instruction;
wherein the image capture unit comprises at least a first image capture unit and a second image capture unit, the first image capture unit being used to capture images of a small region close to the electronic device and the second image capture unit being used to capture images over a larger range;
the collecting of the first image information captured by the image capture unit specifically comprises:
collecting the first image information captured by the first image capture unit;
the triggering of the image capture unit to capture action image information of the user specifically comprises:
triggering the second image capture unit to capture the action image information of the user, the range covered by the action image information being larger than the range covered by the first image information.
2. The method according to claim 1, characterized in that the action image information of the user includes any one of the following:
facial image information of the user, gesture action image information of the user.
3. The method according to claim 2, characterized in that determining, from the action image information of the user and the object information in the first image information, the target object placed on the electronic device that the user intends to operate comprises:
analyzing the gesture action image information of the user and determining the pointing direction of the user's gesture;
determining, according to the pointing direction of the user's gesture, the target object placed on the electronic device that matches the pointing direction of the user's gesture, and taking that target object as the target object the user intends to operate.
4. The method according to claim 2, characterized in that determining, from the action image information of the user and the object information in the first image information, the target object placed on the electronic device that the user intends to operate comprises:
determining the user's eyeball gaze direction from the facial image information of the user, determining the intersection point of the line along the gaze direction with the electronic device, and obtaining the intersection coordinates;
analyzing the object information in the first image information and determining the position coordinates of the objects on the electronic device;
taking the object whose position coordinates are the same as the intersection coordinates as the target object that the user intends to operate.
5. The method according to claim 1, characterized in that the first image information captured by the image capture unit includes at least information about the objects on the display interface of the electronic device;
determining the target object placed on the electronic device that the user intends to operate is specifically:
determining the target object placed on the display interface that the user intends to operate.
6. The method according to claim 1 or 2, characterized in that determining an operation instruction according to the action image information of the user, the target object and the current running state of the electronic device, and executing the operation instruction, comprises:
when the target object is a second electronic device, generating, according to the action image information of the user and the running state of the electronic device, a control instruction for controlling the second electronic device, and sending the control instruction to the second electronic device.
7. The method according to claim 1 or 2, characterized in that determining an operation instruction according to the action image information of the user, the target object and the current running state of the electronic device, and executing the operation instruction, comprises:
analyzing the action image information of the user according to the current running state of the electronic device, determining the operation instruction related to the target object according to the captured user action and a preset correspondence between user actions and object operation instructions, and performing the control of the target object according to the operation instruction, wherein the operation instruction includes moving the target object in a specified direction and/or showing information about the target object.
8. A command recognition device, characterized in that it is applied to an electronic device and comprises:
a first image capture unit, configured to capture first image information including at least information about objects placed on the electronic device, the first image capture unit being specifically configured to capture images of a small region close to the electronic device;
an image analysis unit, configured to analyze the first image information and, when user image data is identified in the first image information, trigger operation of a second image capture unit;
the second image capture unit, configured to capture action image information of the user, the range covered by the action image information being larger than the range covered by the first image information, the second image capture unit being specifically configured to capture images over a larger range;
an operation object determining unit, configured to determine, from the action image information of the user and the object information in the first image information, the target object placed on the electronic device that the user intends to operate;
a command executing unit, configured to determine an operation instruction according to the action image information of the user, the target object and the current running state of the electronic device, and to execute the operation instruction.
9. The device according to claim 8, characterized in that the action image information of the user includes any one of the following:
facial image information of the user, gesture action image information of the user.
10. The device according to claim 9, characterized in that the operation object determining unit comprises:
a first analysis unit, configured to analyze the gesture action image information of the user and determine the pointing direction of the user's gesture;
a first object determining unit, configured to determine, according to the pointing direction of the user's gesture, the target object placed on the electronic device that matches the pointing direction of the user's gesture, and to take that target object as the target object the user intends to operate.
11. The device according to claim 9, characterized in that the operation object determining unit comprises:
a second analysis unit, configured to determine the user's eyeball gaze direction from the facial image information of the user, determine the intersection point of the line along the gaze direction with the electronic device, and obtain the intersection coordinates;
a coordinate analysis unit, configured to analyze the object information in the first image information and determine the position coordinates of the objects on the electronic device;
a second object determining unit, configured to take the object whose position coordinates are the same as the intersection coordinates as the target object that the user intends to operate.
12. The device according to claim 8, characterized in that the first image capture unit specifically captures at least the information about the objects placed on the display interface of the electronic device;
the operation object determining unit is specifically configured to determine, from the action image information of the user and the object information in the first image information, the target object placed on the display interface that the user intends to operate.
13. The device according to claim 8 or 9, characterized in that the command executing unit is specifically configured to: when the target object is a second electronic device, generate, according to the action image information of the user and the running state of the electronic device, a control instruction for controlling the second electronic device, and send the control instruction to the second electronic device.
14. The device according to claim 8 or 9, characterized in that the command executing unit is specifically configured to: analyze the action image information of the user according to the current running state of the electronic device, determine the operation instruction related to the target object according to the captured user action and a preset correspondence between user actions and object operation instructions, and perform the control of the target object according to the operation instruction, wherein the operation instruction includes moving the target object in a specified direction and/or showing information about the target object.
15. An electronic device, characterized in that it comprises:
a first camera, a second camera and a processor, the first camera being used to capture images of a small region close to the electronic device and the second camera being used to capture images over a larger range;
the first camera captures first image information including at least information about objects placed on the electronic device, and transfers the captured first image data to the processor;
the second camera captures action image information of the user and transfers the action image information to the processor, the range covered by the action image information being larger than the range covered by the first image information;
the processor analyzes the first image information captured by the first camera; when user image data is identified in the first image information, it triggers the second camera to capture the action image information of the user; it determines, from the action image information of the user and the object information in the first image information, the target object placed on the electronic device that the user intends to operate; and it determines an operation instruction according to the action image information of the user, the target object and the current running state of the electronic device, and executes the operation instruction.
CN201210223896.0A 2012-06-28 2012-06-28 Command recognition method, device and electronic device Active CN103513906B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210223896.0A CN103513906B (en) 2012-06-28 2012-06-28 Command recognition method, device and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210223896.0A CN103513906B (en) 2012-06-28 2012-06-28 Command recognition method, device and electronic device

Publications (2)

Publication Number Publication Date
CN103513906A CN103513906A (en) 2014-01-15
CN103513906B 2018-01-16

Family

ID=49896722

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210223896.0A Active CN103513906B (en) 2012-06-28 2012-06-28 Command recognition method, device and electronic device

Country Status (1)

Country Link
CN (1) CN103513906B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104866070A (en) * 2014-02-20 2015-08-26 联想(北京)有限公司 Method for information processing and electronic equipment
CN104866081A (en) * 2014-02-25 2015-08-26 中兴通讯股份有限公司 Terminal operation method and device as well as terminal
CN105334940B (en) * 2014-07-14 2019-04-26 联想(北京)有限公司 A kind of control method and electronic equipment
CN105717982B (en) * 2014-12-03 2019-03-08 联想(北京)有限公司 Information processing method and electronic equipment
CN106095178B (en) * 2016-06-14 2019-06-11 广州视睿电子科技有限公司 Input equipment recognition methods and system, input instruction identification method and system
DE102016221829A1 (en) * 2016-11-08 2018-05-09 Audi Ag Energy supply vehicle for supplying an electrically driven motor vehicle with electrical energy
CN110197171A (en) * 2019-06-06 2019-09-03 深圳市汇顶科技股份有限公司 Exchange method, device and the electronic equipment of action message based on user

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101344816A (en) * 2008-08-15 2009-01-14 华南理工大学 Human-machine interaction method and device based on sight tracing and gesture discriminating

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7170492B2 (en) * 2002-05-28 2007-01-30 Reactrix Systems, Inc. Interactive video display system
US20080147488A1 (en) * 2006-10-20 2008-06-19 Tunick James A System and method for monitoring viewer attention with respect to a display and determining associated charges
US20110239114A1 (en) * 2010-03-24 2011-09-29 David Robbins Falkenburg Apparatus and Method for Unified Experience Across Different Devices
US8643625B2 (en) * 2010-06-10 2014-02-04 Empire Technology Development Llc Communication between touch-panel devices

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101344816A (en) * 2008-08-15 2009-01-14 华南理工大学 Human-machine interaction method and device based on sight tracing and gesture discriminating

Also Published As

Publication number Publication date
CN103513906A (en) 2014-01-15


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant