CN104317398B - Gesture control method, wearable device and electronic device - Google Patents

Gesture control method, wearable device and electronic device

Info

Publication number
CN104317398B
Authority
CN
China
Prior art keywords
electronic device
display unit
coordinate information
Prior art date
Legal status
Expired - Fee Related
Application number
CN201410545362.9A
Other languages
Chinese (zh)
Other versions
CN104317398A (en)
Inventor
王峰
Current Assignee
Tianjin Samsung Electronics Co Ltd
Samsung Electronics Co Ltd
Original Assignee
Tianjin Samsung Electronics Co Ltd
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Tianjin Samsung Electronics Co Ltd, Samsung Electronics Co Ltd filed Critical Tianjin Samsung Electronics Co Ltd
Priority to CN201410545362.9A priority Critical patent/CN104317398B/en
Publication of CN104317398A publication Critical patent/CN104317398A/en
Application granted granted Critical
Publication of CN104317398B publication Critical patent/CN104317398B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 - Selection of displayed objects or displayed text elements

Abstract

The present invention relates to the field of human-computer interaction and discloses a gesture control method, a wearable device and an electronic device. The gesture control method includes: when the display unit of the electronic device is within the acquisition range of the image acquisition device of the wearable device, detecting, by the wearable device, whether a gesture operation produced by an operating body is present; when the wearable device determines that a gesture operation is present, capturing, through the image acquisition device, an occlusion image of the operating body blocking the surface of the display unit; sending, by the wearable device, related information corresponding to the occlusion image to the electronic device; and determining, by the electronic device, parameter information of the gesture operation based on the related information.

Description

Gesture control method, wearable device and electronic device
Technical field
The present invention relates to the field of human-computer interaction, and in particular to a gesture control method, a wearable device and an electronic device.
Background technology
With the rapid development of electronic devices, the ways of controlling an electronic device that has a display unit keep multiplying, for example remote control, air-mouse control and gesture control. Among these, gesture control, as a new control mode, has become increasingly popular with users.
Taking a television set as an example of the electronic device, an existing gesture recognition technique requires a camera to be mounted on the television set. The user must face the display unit of the television, hold the palm still for a moment, and then slowly wave the palm from side to side three or four times. After the television recognizes this gesture operation, a pointer is shown at the center of the screen; the pointer then moves with the direction of the user's palm, and the item the pointer indicates is selected by a further gesture operation.
It can be seen that the prior-art scheme has at least the following technical problems:
(1) Low efficiency: the user must perform a specific gesture towards the television (holding the palm still for a moment and then slowly waving it three or four times) before the television displays the pointer, and only afterwards can gestures steer the pointer to the desired position. Calling up the television's pointer is thus rather cumbersome, and because the travel time and distance of the user's palm are long, the gesture operation is time-consuming. In short, because gesture operation in the prior art is cumbersome and slow, it suffers from the technical problem of low efficiency;
(2) Limited application scenarios: the user must be positioned centrally in front of the display device such as a television, neither too close nor too high or too low, so the prior-art gesture control scheme is rather restricted.
Summary of the invention
The present invention provides a gesture control method, a wearable device and an electronic device, to solve the prior-art technical problem of low efficiency when controlling an electronic device that includes a display unit by gestures.
In a first aspect, an embodiment of the present invention provides a gesture control method, including:
detecting whether a gesture operation produced by an operating body is present;
when the gesture operation is present, capturing an occlusion image of the operating body blocking the surface of the display unit;
determining parameter information of the gesture operation based on the occlusion image.
Optionally, determining the parameter information of the gesture operation based on the occlusion image specifically includes:
determining first coordinate information of the operating body in the occlusion image;
performing coordinate calibration on the first coordinate information to obtain second coordinate information of the operating body, the second coordinate information being the coordinate information of the operating body on the display unit as seen from the viewing angle of the user of the electronic device, the second coordinate information being the parameter information.
Optionally, before performing coordinate calibration on the first coordinate information, the method further includes:
dividing the display unit into four quadrants with its center as the origin;
determining, for each quadrant, an abscissa offset Δx, where Δx is the offset, relative to the origin, of the point in the respective quadrant that is farthest from the origin in the x-axis direction;
determining, for each quadrant, an ordinate offset Δy, where Δy is the offset, relative to the origin, of the point in the respective quadrant that is farthest from the origin in the y-axis direction.
Optionally, the first coordinate information is calibrated by the following formulas:
x2 = x1 + (x1/a1)·Δx
where x2 represents the abscissa in the second coordinate information, x1 represents the abscissa in the first coordinate information, and a1 represents the actual abscissa on the display unit of the point in the respective quadrant that is farthest from the origin in the x-axis direction;
y2 = y1 + (y1/b1)·Δy
where y2 represents the ordinate in the second coordinate information, y1 represents the ordinate in the first coordinate information, and b1 represents the actual ordinate on the display unit of the point in the respective quadrant that is farthest from the origin in the y-axis direction.
Optionally, the first coordinate information is calibrated by the following formulas:
x2 = (l1/l0)·(x1 + (x1/a1)·Δx)
where l0 represents the width of the display unit as detected by the image acquisition device, l1 represents the actual width of the display unit, x2 represents the abscissa in the second coordinate information, x1 represents the abscissa in the first coordinate information, and a1 represents the actual abscissa on the display unit of the point in the respective quadrant that is farthest from the origin in the x-axis direction;
y2 = (h1/h0)·(y1 + (y1/b1)·Δy)
where h0 represents the height of the display unit as detected by the image acquisition device, h1 represents the actual height of the display unit, y2 represents the ordinate in the second coordinate information, y1 represents the ordinate in the first coordinate information, and b1 represents the actual ordinate on the display unit of the point in the respective quadrant that is farthest from the origin in the y-axis direction.
Optionally, determining the parameter information of the gesture operation based on the occlusion image specifically includes:
determining a first region blocked by the operating body in the occlusion image;
performing regional calibration on the first region to obtain a second region, the second region being the region of the display unit that is blocked as seen from the viewing angle of the user of the electronic device;
determining a first target object from the second region;
sending the first target object to the electronic device.
Optionally, before determining the first region blocked by the operating body in the occlusion image, the method further includes:
judging whether the duration of the gesture operation is greater than a preset duration;
determining the first region blocked by the operating body in the occlusion image being specifically:
when the duration is greater than the preset duration, determining the first region blocked by the operating body in the occlusion image.
Optionally, determining the first target object from the second region specifically includes:
performing image recognition on the second region to obtain N target objects, N being an integer greater than or equal to 2;
determining, from the N target objects, M target objects whose blocked area is greater than a first threshold and less than a second threshold, M being a positive integer;
determining the first target object from the M target objects.
Optionally, determining the first target object from the M target objects is specifically:
determining, from the M target objects, the target object farthest from the bottom of the display unit as the first target object.
In a second aspect, an embodiment of the present invention provides a wearable device in communication with an electronic device that has a display unit, the device including:
a detection module, configured to detect whether a gesture operation produced by an operating body is present;
an acquisition module, configured to, when the gesture operation is present, capture an occlusion image of the operating body blocking the surface of the display unit;
a sending module, configured to send the occlusion image to the electronic device, so that the electronic device obtains the parameter information of the gesture operation from the occlusion image and is controlled through the parameter information; or
the sending module, configured to send the parameter information of the gesture operation directly to the electronic device so that the electronic device is controlled through the parameter information, the parameter information being obtained based on the occlusion image.
Optionally, when the sending module is configured to send the parameter information to the electronic device, the device further includes:
a first determining module, configured to determine first coordinate information of the operating body in the occlusion image;
a first calibration module, configured to perform coordinate calibration on the first coordinate information to obtain second coordinate information of the operating body, the second coordinate information being the coordinate information of the operating body on the display unit as seen from the viewing angle of the user of the electronic device, the second coordinate information being the parameter information.
Optionally, the device further includes:
a first division module, configured to divide the display unit into four quadrants with its center as the origin before coordinate calibration is performed on the first coordinate information;
a second determining module, configured to determine, for each quadrant, an abscissa offset Δx, where Δx is the offset, relative to the origin, of the point in the respective quadrant that is farthest from the origin in the x-axis direction;
a third determining module, configured to determine, for each quadrant, an ordinate offset Δy, where Δy is the offset, relative to the origin, of the point in the respective quadrant that is farthest from the origin in the y-axis direction.
Optionally, the first calibration module calibrates the first coordinate information by the following formulas:
x2 = x1 + (x1/a1)·Δx
where x2 represents the abscissa in the second coordinate information, x1 represents the abscissa in the first coordinate information, and a1 represents the actual abscissa on the display unit of the point in the respective quadrant that is farthest from the origin in the x-axis direction;
y2 = y1 + (y1/b1)·Δy
where y2 represents the ordinate in the second coordinate information, y1 represents the ordinate in the first coordinate information, and b1 represents the actual ordinate on the display unit of the point in the respective quadrant that is farthest from the origin in the y-axis direction.
Optionally, the first calibration module calibrates the first coordinate information by the following formulas:
x2 = (l1/l0)·(x1 + (x1/a1)·Δx)
where l0 represents the width of the display unit as detected by the image acquisition device, l1 represents the actual width of the display unit, x2 represents the abscissa in the second coordinate information, x1 represents the abscissa in the first coordinate information, and a1 represents the actual abscissa on the display unit of the point in the respective quadrant that is farthest from the origin in the x-axis direction;
y2 = (h1/h0)·(y1 + (y1/b1)·Δy)
where h0 represents the height of the display unit as detected by the image acquisition device, h1 represents the actual height of the display unit, y2 represents the ordinate in the second coordinate information, y1 represents the ordinate in the first coordinate information, and b1 represents the actual ordinate on the display unit of the point in the respective quadrant that is farthest from the origin in the y-axis direction.
Optionally, when the sending module is configured to send the parameter information to the electronic device, the device further includes:
a fourth determining module, configured to determine a first region blocked by the operating body in the occlusion image;
a second calibration module, configured to perform regional calibration on the first region to obtain a second region, the second region being the region of the display unit that is blocked as seen from the viewing angle of the user of the electronic device;
a fifth determining module, configured to determine a first target object from the second region, the first target object being the parameter information.
Optionally, the device further includes:
a first judging module, configured to judge whether the duration of the gesture operation is greater than a preset duration;
the fourth determining module being configured to determine the first region blocked by the operating body in the occlusion image when the duration is greater than the preset duration.
Optionally, the fifth determining module specifically includes:
a first recognition unit, configured to perform image recognition on the second region to obtain N target objects, N being an integer greater than or equal to 2;
a first determining unit, configured to determine, from the N target objects, M target objects whose blocked area is greater than a first threshold and less than a second threshold, M being a positive integer;
a second determining unit, configured to determine the first target object from the M target objects.
Optionally, the second determining unit is specifically configured to:
determine, from the M target objects, the target object farthest from the bottom of the display unit as the first target object.
In a third aspect, an embodiment of the present invention provides an electronic device in communication with a wearable device that has an external or built-in image acquisition device, the electronic device including:
a receiving module, configured to receive the occlusion image sent by the wearable device, or to receive the parameter information of the gesture operation sent by the wearable device, the occlusion image specifically being the image, obtained by the image acquisition device when a gesture operation produced by an operating body is present, of the operating body blocking the surface of the display unit, and the parameter information of the gesture operation specifically being obtained by the wearable device by analyzing the occlusion image;
a control module, configured to obtain the occlusion image, obtain the parameter information of the gesture operation from the occlusion image, and control the electronic device through the parameter information; or
the control module, configured to directly obtain the parameter information of the gesture operation received by the receiving module and control the electronic device through the parameter information of the gesture operation.
Optionally, the control module specifically includes:
a third determining unit, configured to determine first coordinate information of the operating body in the occlusion image;
a first calibration unit, configured to perform coordinate calibration on the first coordinate information to obtain second coordinate information of the operating body, the second coordinate information being the coordinate information of the operating body on the display unit as seen from the viewing angle of the user of the electronic device, the second coordinate information being the parameter information.
Optionally, the electronic device further includes:
a second division module, configured to divide the display unit into four quadrants with its center as the origin before coordinate calibration is performed on the first coordinate information;
a sixth determining module, configured to determine, for each quadrant, an abscissa offset Δx, where Δx is the offset, relative to the origin, of the point in the respective quadrant that is farthest from the origin in the x-axis direction;
a seventh determining module, configured to determine, for each quadrant, an ordinate offset Δy, where Δy is the offset, relative to the origin, of the point in the respective quadrant that is farthest from the origin in the y-axis direction.
Optionally, the first calibration unit calibrates the first coordinate information by the following formulas:
x2 = x1 + (x1/a1)·Δx
where x2 represents the abscissa in the second coordinate information, x1 represents the abscissa in the first coordinate information, and a1 represents the actual abscissa on the display unit of the point in the respective quadrant that is farthest from the origin in the x-axis direction;
y2 = y1 + (y1/b1)·Δy
where y2 represents the ordinate in the second coordinate information, y1 represents the ordinate in the first coordinate information, and b1 represents the actual ordinate on the display unit of the point in the respective quadrant that is farthest from the origin in the y-axis direction.
Optionally, the first calibration unit calibrates the first coordinate information by the following formulas:
x2 = (l1/l0)·(x1 + (x1/a1)·Δx)
where l0 represents the width of the display unit as detected by the image acquisition device, l1 represents the actual width of the display unit, x2 represents the abscissa in the second coordinate information, x1 represents the abscissa in the first coordinate information, and a1 represents the actual abscissa on the display unit of the point in the respective quadrant that is farthest from the origin in the x-axis direction;
y2 = (h1/h0)·(y1 + (y1/b1)·Δy)
where h0 represents the height of the display unit as detected by the image acquisition device, h1 represents the actual height of the display unit, y2 represents the ordinate in the second coordinate information, y1 represents the ordinate in the first coordinate information, and b1 represents the actual ordinate on the display unit of the point in the respective quadrant that is farthest from the origin in the y-axis direction.
Optionally, the control module specifically includes:
a fourth determining unit, configured to determine a first region blocked by the operating body in the occlusion image;
a second calibration unit, configured to perform regional calibration on the first region to obtain a second region, the second region being the region of the display unit that is blocked as seen from the viewing angle of the user of the electronic device;
a fifth determining unit, configured to determine a first target object from the second region, the first target object being the parameter information.
Optionally, the electronic device further includes:
a second judging module, configured to judge whether the duration of the gesture operation is greater than a preset duration;
the fourth determining unit being configured to determine the first region blocked by the operating body in the occlusion image when the duration is greater than the preset duration.
Optionally, the fifth determining unit specifically includes:
a recognition subunit, configured to perform image recognition on the second region to obtain N target objects, N being an integer greater than or equal to 2;
a first determination subunit, configured to determine, from the N target objects, M target objects whose blocked area is greater than a first threshold and less than a second threshold, M being a positive integer;
a second determination subunit, configured to determine the first target object from the M target objects.
Optionally, the second determination subunit is specifically configured to determine, from the M target objects, the target object farthest from the bottom of the display unit as the first target object.
The present invention has the following beneficial effects:
In the embodiments of the present invention, whether a gesture operation produced by an operating body is present is detected first; when it is determined that the gesture operation is present, an occlusion image of the operating body blocking the surface of the display unit is captured; finally, the parameter information of the gesture operation is determined based on the occlusion image, so that the electronic device including the display unit is controlled through the parameter information. In other words, the parameter information of the gesture operation can be determined directly from the captured occlusion image of the blocking body on the surface of the display unit, and the electronic device can then be controlled through the parameter information, without calling up a corresponding control interface by cumbersome gestures, and without restricting the travel time and distance of the operating body producing the gesture operation. Efficiency in controlling the electronic device including the display unit is therefore improved. Moreover, since the position of the operating body producing the gesture operation is unrestricted, the application scenarios of gesture control are broadened.
Brief description of the drawings
Fig. 1 is a flowchart of the gesture control method in the first aspect of the embodiments of the present invention;
Fig. 2 is a structural diagram of smart glasses including a camera in the embodiments of the present invention;
Fig. 3 is a schematic diagram of an occlusion image in the embodiments of the present invention;
Fig. 4 is a flowchart of a first way in which the wearable device sends related information corresponding to the occlusion image to the electronic device in the gesture control method of the embodiments of the present invention;
Fig. 5 is a flowchart of the wearable device determining the abscissa offset and the ordinate offset of each quadrant in the gesture control method of the embodiments of the present invention;
Fig. 6 is a schematic diagram of the five calibration positions determined by the wearable device in the gesture control method of the embodiments of the present invention;
Fig. 7 is a schematic diagram, in the gesture control method of the embodiments of the present invention, of the case where the angle between the horizontal axis of the plane of the image acquisition device and the horizontal axis of the plane of the screen is θ;
Fig. 8 is a flowchart of a second way in which the wearable device sends related information corresponding to the occlusion image to the electronic device in the gesture control method of the embodiments of the present invention;
Fig. 9 is a flowchart of the wearable device determining the first target object from the second region in the gesture control method of the embodiments of the present invention;
Fig. 10 is a schematic diagram of four target objects determined by the wearable device from the occlusion image in the gesture control method of the embodiments of the present invention;
Fig. 11 is a structural diagram of the wearable device of the second aspect of the embodiments of the present invention;
Fig. 12 is a structural diagram of the electronic device of the third aspect of the embodiments of the present invention.
Detailed description of the embodiments
The present invention provides a gesture control method, a wearable device and an electronic device, to solve the prior-art technical problem of low efficiency when controlling an electronic device that includes a display unit by gestures.
The general idea of the technical solution in the embodiments of the present application for solving the above technical problem is as follows:
First, whether a gesture operation produced by an operating body is present is detected; when it is determined that the gesture operation is present, an occlusion image of the operating body blocking the surface of the display unit is captured; finally, the parameter information of the gesture operation is determined based on the occlusion image, so that the electronic device including the display unit is controlled through the parameter information. In other words, the parameter information of the gesture operation can be determined directly from the captured occlusion image of the blocking body on the surface of the display unit, and the electronic device can then be controlled through the parameter information, without calling up a corresponding control interface by cumbersome gestures, and without restricting the travel time and distance of the operating body producing the gesture operation. Efficiency in controlling the electronic device including the display unit is therefore improved. Moreover, since the position of the operating body producing the gesture operation is unrestricted, the application scenarios of gesture control are broadened.
In order to better understand the above technical solution, the technical solution of the present invention is described in detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific features in the embodiments of the present invention are a detailed explanation of the technical solution of the present invention rather than a limitation of it, and the technical features in the embodiments may be combined with each other where no conflict arises.
In the first aspect, an embodiment of the present invention provides a gesture control method; referring to Fig. 1, the method includes:
Step S101: when the display unit of the electronic device is within the acquisition range of the image acquisition device of the wearable device, the wearable device detects whether a gesture operation produced by an operating body is present;
Step S102: when the wearable device determines that the gesture operation is present, the wearable device captures, through the image acquisition device, an occlusion image of the operating body blocking the surface of the display unit;
Step S103: the wearable device sends related information corresponding to the occlusion image to the electronic device;
Step S104: the electronic device determines the parameter information of the gesture operation based on the related information.
After the electronic device determines the parameter information of the gesture operation based on the related information, the electronic device can be controlled based on the parameter information.
In a specific implementation, the related information may be the occlusion image itself; in this case the wearable device sends the occlusion image directly to the electronic device, and the electronic device determines the parameter information of the gesture operation based on the occlusion image. The related information may also be the parameter information of the gesture operation; in this case the wearable device determines the parameter information by analyzing the occlusion image and then sends it to the electronic device. That is, the parameter information of the gesture operation may be determined from the occlusion image either on the wearable device side or on the electronic device side.
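As an illustration of this split, the following minimal Python sketch shows the two places where the occlusion image can be turned into parameter information; the names (analyze, wearable_send, electronic_receive) and the message format are illustrative assumptions, not part of the patent:

```python
def analyze(occlusion_image):
    """Stand-in for the coordinate/region analysis detailed below; returns (x, y)."""
    return occlusion_image["touch_point"]

def wearable_send(occlusion_image, analyze_on_wearable):
    """The wearable device sends either the raw occlusion image or the parameters."""
    if analyze_on_wearable:
        return {"type": "parameters", "data": analyze(occlusion_image)}
    return {"type": "image", "data": occlusion_image}

def electronic_receive(message):
    """The electronic device analyzes the image itself if needed, then controls."""
    if message["type"] == "parameters":
        params = message["data"]
    else:
        params = analyze(message["data"])
    print("controlling the electronic device with:", params)

electronic_receive(wearable_send({"touch_point": (0.3, -0.1)}, analyze_on_wearable=True))
electronic_receive(wearable_send({"touch_point": (0.3, -0.1)}, analyze_on_wearable=False))
```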
In step S101, the wearable device is, for example, smart glasses or a smart watch; Fig. 2 is a schematic diagram of smart glasses including a camera 20. The electronic device is, for example, a television set, a notebook computer or a display, and the wearable device and the electronic device are in communication with each other.
In step S101, the operating body may be any of a variety of operating bodies, such as a hand or a stylus. The image acquisition device of the wearable device may scan at preset time intervals (for example, 1 min or 2 min) and, upon scanning an operating body located in front of the display unit, determine that a gesture operation is present. Alternatively, when the user of the wearable device needs to produce a gesture operation, the user clicks a preset button of the wearable device, and the wearable device determines that a gesture operation is present when it detects the operation of clicking the preset button. The embodiments of the present invention neither enumerate nor limit which way is used to detect whether a gesture operation is present.
In step S102, when the image acquisition device of the wearable device captures the occlusion image, the display unit of the electronic device may display any content, such as a document editing interface, an image, or multiple operable target objects; the embodiments of the present invention are not limited in this respect. Fig. 3 is a schematic diagram of an occlusion image in which the operating body is a finger and the display unit displays multiple operable target objects (namely, multiple buttons).
In step S103, the way the wearable device sends the related information to the electronic device differs with the related information. Three of these ways are introduced below; of course, specific implementations are not limited to the following three cases.
In the first way, the wearable device sends related information corresponding to the occlusion image to the electronic device; referring to Fig. 4, this specifically includes:
Step S401: the wearable device determines first coordinate information of the operating body in the occlusion image;
Step S402: the wearable device performs coordinate calibration on the first coordinate information to obtain second coordinate information of the operating body, the second coordinate information being the coordinate information of the operating body on the display unit as seen from the viewing angle of the user of the electronic device;
Step S403: the wearable device sends the second coordinate information to the electronic device.
In step S401, the central position of the display unit in the occlusion image may be taken as the coordinate origin (0,0), and the display unit region is then divided into four quadrants. Assuming that the image acquisition device detects the width of the display unit as l0 and its height as h0, the abscissa of the left border of the display unit can be defined as -l0/2, the abscissa of the right border as l0/2, the ordinate of the top of the display unit as h0/2, and the ordinate of the bottom as -h0/2. Given the distances of the region corresponding to the gesture operation from the top and the left side of the display unit in the occlusion image, the abscissa and the ordinate in the first coordinate information can then be determined from the distance between the position of the gesture operation and the origin. Of course, the first coordinate information differs with the position corresponding to the gesture operation in the occlusion image; the embodiments of the present invention neither enumerate nor limit these cases.
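A minimal sketch of this coordinate convention, assuming the operating body's position is measured from the top-left corner of the display region detected in the occlusion image (an assumption for illustration; the patent only fixes the origin at the center):

```python
def first_coordinates(l0, h0, dx_from_left, dy_from_top):
    """Convert a position inside the detected display region to coordinates
    relative to the display center (the coordinate origin)."""
    x1 = dx_from_left - l0 / 2.0   # left border sits at -l0/2
    y1 = h0 / 2.0 - dy_from_top    # top border sits at +h0/2
    return x1, y1

print(first_coordinates(l0=40.0, h0=24.0, dx_from_left=10.0, dy_from_top=4.0))
# -> (-10.0, 8.0): the operating body lies in the second quadrant
```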
As a further preferred embodiment, before the wearable device performs coordinate calibration on the first coordinate information, and referring to Fig. 5, the method further includes:
Step S501: the wearable device divides the display unit into four quadrants with its center as the origin;
Step S502: the wearable device determines, for each quadrant, an abscissa offset Δx, where Δx is the offset, relative to the origin, of the point in the respective quadrant that is farthest from the origin in the x-axis direction;
Step S503: the wearable device determines, for each quadrant, an ordinate offset Δy, where Δy is the offset, relative to the origin, of the point in the respective quadrant that is farthest from the origin in the y-axis direction.
In step S501, the wearable device divides the display unit into four regions according to the actual size of the display unit, with the center of the display unit as the coordinate origin, and then chooses five calibration positions. Assuming that the actual width of the display unit is l1 and its actual height is h1, the five calibration positions determined, as shown in Fig. 6, are the center (0,0) and the four corner points (l1/2, h1/2), (-l1/2, h1/2), (-l1/2, -h1/2) and (l1/2, -h1/2). A compensation error is then determined for each of these five calibration positions, as follows:
Following prompts on the display unit of the electronic device, the user points at the five calibration positions in turn with the operating body; once the user believes the relevant position is being pointed at exactly, the pointing is confirmed with a click gesture. The coordinates of the five calibration positions in the occlusion image can thus be determined, and the compensation error of each calibration position is determined according to the compensation formula: actual coordinate of the calibration position minus its coordinate in the occlusion image.
Assuming that the coordinate of the central position determined by the user in the occlusion image is (x0, y0), while the actual coordinate of the coordinate origin is (0,0), the compensation error of the coordinate origin is:
(0 - x0, 0 - y0) = (-x0, -y0) ………………………………[1]
In step S502, taking the first quadrant as an example, assume that the abscissa of the point in the first quadrant farthest from the origin in the x-axis direction is x1 in the occlusion image, while the actual abscissa of that point is l1/2; the compensation error determined for it is l1/2 - x1. Subtracting the compensation error of the origin from the compensation error of that point, Δx can be determined as follows:
Δx = (l1/2 - x1) - (-x0) = l1/2 - x1 + x0 ………………………………[2]
For the other quadrants, the method of calculating Δx is similar to that for the first quadrant, so it is not repeated here.
In step S503, assume that the ordinate of the point in the first quadrant farthest from the origin in the y-axis direction is y1 in the occlusion image, while the actual ordinate of that point is h1/2; the compensation error determined for it is h1/2 - y1. Subtracting the compensation error of the origin from the compensation error of that point, Δy can be determined as follows:
Δy = (h1/2 - y1) - (-y0) = h1/2 - y1 + y0 ………………………………[3]
For the other quadrants, the method of calculating Δy is similar to that for the first quadrant, so it is not repeated here.
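The first-quadrant offset computation can be sketched as follows, under the reconstruction of formulas [1] to [3] above; the assumption that each quadrant's farthest point is the display corner, and all example values, are illustrative:

```python
def quadrant_offsets(l1, h1, detected_center, detected_corner_q1):
    """Compensation error = actual coordinate - detected coordinate (formula [1]);
    dx, dy = corner compensation error minus origin compensation error ([2], [3])."""
    x0, y0 = detected_center          # detected image coordinates of the center
    xq, yq = detected_corner_q1       # detected coordinates of the Q1 corner
    err_origin = (-x0, -y0)
    err_corner = (l1 / 2.0 - xq, h1 / 2.0 - yq)
    dx = err_corner[0] - err_origin[0]   # = l1/2 - xq + x0
    dy = err_corner[1] - err_origin[1]   # = h1/2 - yq + y0
    return dx, dy

print(quadrant_offsets(l1=100.0, h1=60.0, detected_center=(1.0, -0.5),
                       detected_corner_q1=(47.0, 28.0)))
# -> (4.0, 1.5)
```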
In step S402, the wearable device may calibrate the first coordinate information in a variety of ways; two of them are given below, and of course specific implementations are not limited to these two ways.
1. The wearable device calibrates the first coordinate information by the following formulas:
x2 = x1 + (x1/a1)·Δx ………………………………[4]
where x2 represents the abscissa in the second coordinate information, x1 represents the abscissa in the first coordinate information, and a1 represents the actual abscissa on the display unit of the point in the respective quadrant that is farthest from the origin in the x-axis direction;
y2 = y1 + (y1/b1)·Δy ………………………………[5]
where y2 represents the ordinate in the second coordinate information, y1 represents the ordinate in the first coordinate information, and b1 represents the actual ordinate on the display unit of the point in the respective quadrant that is farthest from the origin in the y-axis direction.
In general, the above formulas are used to correct the first coordinate information when the user is located directly in front of the display unit of the electronic device. Because fewer parameters are used, the number of parameters the wearable device must detect is reduced and its computation speed can be improved, which has the technical effect of improving the efficiency of correcting the first coordinate information.
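A sketch of this frontal-viewing calibration, under the reconstructed formulas [4] and [5]: the quadrant offset is scaled by the point's relative distance from the origin (x1/a1, y1/b1). Values are illustrative:

```python
def calibrate_frontal(x1, y1, a1, b1, dx, dy):
    """Reconstructed formulas [4] and [5]: the correction grows with the
    distance from the origin and approaches the full quadrant offset near
    the farthest point of the quadrant."""
    x2 = x1 + (x1 / a1) * dx
    y2 = y1 + (y1 / b1) * dy
    return x2, y2

# Halfway to the farthest point, half of each offset is applied.
print(calibrate_frontal(x1=25.0, y1=15.0, a1=50.0, b1=30.0, dx=4.0, dy=1.5))
# -> (27.0, 15.75)
```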
2. The wearable device calibrates the first coordinate information by the following formulas:
x2 = (l1/l0)·(x1 + (x1/a1)·Δx) ………………………………[6]
where l0 represents the width of the display unit as detected by the image acquisition device, l1 represents the actual width of the display unit, x2 represents the abscissa in the second coordinate information, x1 represents the abscissa in the first coordinate information, and a1 represents the actual abscissa on the display unit of the point in the respective quadrant that is farthest from the origin in the x-axis direction;
y2 = (h1/h0)·(y1 + (y1/b1)·Δy) ………………………………[7]
where h0 represents the height of the display unit as detected by the image acquisition device, h1 represents the actual height of the display unit, y2 represents the ordinate in the second coordinate information, y1 represents the ordinate in the first coordinate information, and b1 represents the actual ordinate on the display unit of the point in the respective quadrant that is farthest from the origin in the y-axis direction.
The above scheme is typically used when the user is not directly in front of the display unit of the electronic device. Referring to Fig. 7, assume the angle between the horizontal axis of the plane of the image acquisition device and the horizontal axis of the plane of the screen is θ; the actual position on the first-quadrant horizontal axis is point A, and its projected position in the camera is An.
Since cos θ = xn/xm = l0/l1 ………………………………[8]
it follows that xm = (l1/l0)·xn ………………………………[9]
It can be seen that if the angle between the horizontal axis of the plane of the image acquisition device and the horizontal axis of the plane of the screen is θ, the result obtained on the non-tilted basis needs to be multiplied by the proportionality coefficient l1/l0, which finally yields formula [6]. For the case where there is a tilt between the vertical axis of the plane of the image acquisition device and the vertical axis of the plane of the screen, the method of determining y2 is similar to that of determining x2, so it is not repeated here.
Because the above scheme takes into account the case where the plane of the image acquisition device is tilted relative to the plane of the display unit, it has the technical effect of correcting the first coordinate information more accurately.
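A sketch of the tilted-viewing calibration, following relation [8] and the reconstructed formulas [6] and [7]; parameter names and values are illustrative:

```python
def calibrate_tilted(x1, y1, a1, b1, dx, dy, l0, l1, h0, h1):
    """Reconstructed formulas [6] and [7]: the frontal correction is
    additionally scaled by l1/l0 = 1/cos(theta) horizontally, and by
    h1/h0 vertically for a vertical tilt."""
    x2 = (l1 / l0) * (x1 + (x1 / a1) * dx)
    y2 = (h1 / h0) * (y1 + (y1 / b1) * dy)
    return x2, y2

# With no tilt (l0 == l1 and h0 == h1) this reduces to the frontal formulas.
print(calibrate_tilted(x1=25.0, y1=15.0, a1=50.0, b1=30.0, dx=4.0, dy=1.5,
                       l0=40.0, l1=50.0, h0=30.0, h1=30.0))
# -> (33.75, 15.75)
```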
In the second way, the wearable device sends related information corresponding to the occlusion image to the electronic device; referring to Fig. 8, this specifically includes:
Step S801: the wearable device determines a first region blocked by the operating body in the occlusion image;
Step S802: the wearable device performs regional calibration on the first region to obtain a second region, the second region being the region of the display unit that is blocked as seen from the viewing angle of the user of the electronic device;
Step S803: the wearable device determines a first target object from the second region;
Step S804: the wearable device sends the first target object to the electronic device.
Optionally, before the wearable device determines the first region blocked by the operating body in the occlusion image, the method further includes:
the wearable device judging whether the duration of the gesture operation is greater than a preset duration;
and then, in step S801, the wearable device determining the first region blocked by the operating body in the occlusion image is specifically:
when the duration is greater than the preset duration, the wearable device determines the first region blocked by the operating body in the occlusion image.
In a specific implementation, the preset duration may be any preset duration, such as 2 s or 5 s; the embodiments of the present invention are not limited in this respect. When the duration is greater than the preset duration, it shows that the user of the wearable device really needs to produce a gesture operation, which has the technical effect of preventing misoperation. In addition, the preset duration may be set by the user, enabling accurate control of the duration and improving the user experience.
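The anti-misoperation check can be sketched as a small gate that only passes a gesture once it has persisted longer than the preset duration; the class name and default value are illustrative assumptions:

```python
import time

class GestureGate:
    """Returns True only after the gesture has persisted past the preset duration."""

    def __init__(self, preset_duration=2.0):
        self.preset_duration = preset_duration
        self.started_at = None

    def update(self, gesture_present):
        now = time.monotonic()
        if not gesture_present:
            self.started_at = None   # gesture ended: reset the timer
            return False
        if self.started_at is None:
            self.started_at = now    # gesture just started
        return now - self.started_at > self.preset_duration

gate = GestureGate(preset_duration=2.0)
print(gate.update(gesture_present=True))   # False until 2 s have elapsed
```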
In step S802, the way the wearable device performs regional calibration on the first region is similar to the way it performs coordinate calibration on the first coordinate information in step S402: the five calibration positions shown in Fig. 6 may first be determined, and the compensation errors of these five calibration positions determined respectively;
then the Δx and Δy of each quadrant are determined;
then the coordinate information of each pixel in the first region is determined;
then coordinate correction is performed for each pixel by formulas [4] and [5], or formulas [6] and [7], thereby determining the new coordinate of each pixel;
the second region can then be determined from the new coordinates of the pixels.
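A sketch of this pixel-wise regional calibration, assuming a point-calibration function such as the ones above is already bound to the offsets of each pixel's quadrant (an illustrative assumption):

```python
def calibrate_region(first_region, calibrate_point):
    """first_region: iterable of (x1, y1) pixel coordinates of the blocked area;
    calibrate_point: a per-point correction, e.g. calibrate_frontal or
    calibrate_tilted with the quadrant's offsets already filled in."""
    return [calibrate_point(x1, y1) for (x1, y1) in first_region]

# Toy correction standing in for formulas [4]/[5] with fixed offsets:
second_region = calibrate_region([(10.0, 5.0), (11.0, 5.0)],
                                 lambda x, y: (x + 0.4, y + 0.1))
print(second_region)   # -> [(10.4, 5.1), (11.4, 5.1)]
```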
Optionally, in step S803 the wearable device determines the first target object from the second region; referring to Fig. 9, this specifically includes:
Step S901: the wearable device performs image recognition on the second region to obtain N target objects, N being an integer greater than or equal to 2;
Step S902: the wearable device determines, from the N target objects, M target objects whose blocked area is greater than a first threshold and less than a second threshold, M being a positive integer;
Step S903: the wearable device determines the first target object from the M target objects.
In step S901, if the second region is not blocked in the occlusion image, the wearable device can perform image recognition directly on the second region in the occlusion image to obtain the N target objects. If the second region is blocked, the wearable device can obtain the current screen capture of the electronic device from the electronic device and perform image recognition on the second region in the current screen capture. Assume four target objects are determined, as shown in Fig. 10 (labelled 1, 2, 3 and 4 in the figure), namely N is 4; of course, N = 4 is only an example and not a limitation.
In step S902, the first threshold and the second threshold may be any preset values, provided that the first threshold is not greater than the second threshold, for example a first threshold of 0.5 cm² and a second threshold of 0.8 cm². The blocked area is the area of a target object that the operating body blocks after regional calibration has been performed on the operating body; the way the operating body is regionally calibrated is similar to the way the first region is regionally calibrated, so it is not repeated here.
The wearable device can detect the blocked areas of the four target objects in the figure respectively. Assuming the blocked area of target object 1 is 0.6 cm², that of target object 2 is 0.7 cm², that of target object 3 is 0.4 cm², and that of target object 4 is 0.9 cm², two target objects can be determined from them, namely target object 1 and target object 2, i.e. M equals 2. Of course, the above first threshold, second threshold and M are only examples and are not limited to these values. In addition, the first threshold and the second threshold may also be set by the user: if the cross-sectional area of the operating body is large, the thresholds can be set higher, and if it is small, they can be set lower, so that the determined thresholds are more accurate and the user experience is improved.
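The filtering step of S902, using the example values from this paragraph (identifiers are illustrative):

```python
# target id -> blocked area in cm^2, as in the example above
targets = {1: 0.6, 2: 0.7, 3: 0.4, 4: 0.9}

def filter_targets(areas, first_threshold=0.5, second_threshold=0.8):
    """Keep targets whose blocked area lies strictly between the thresholds."""
    return [tid for tid, area in areas.items()
            if first_threshold < area < second_threshold]

print(filter_targets(targets))   # -> [1, 2], i.e. M == 2
```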
In a specific implementation, the wearable device may determine the first target object from the M target objects in step S903 in a variety of ways; two of them are given below.
1. The wearable device determines the first target object from the M target objects specifically as follows:
the wearable device determines, from the M target objects, the target object farthest from the bottom of the display unit as the first target object; taking the target objects shown in Fig. 10 as an example, target object 1 is determined as the first target object.
In general, the target object at the tip of the operating body is usually the one the user wishes to select, so the first target object determined in this way is more accurate and better meets the user's needs.
2. The wearable device determines the first target object from the M target objects specifically as follows:
the wearable device selects, from the M target objects, the target object with the largest blocked area as the first target object; taking the target objects shown in Fig. 10 as an example, target object 2 is determined as the first target object.
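Both selection strategies can be sketched over the two remaining candidates, with illustrative geometry for the targets of Fig. 10 (top_y here is the assumed distance from the display bottom to each target's upper edge):

```python
candidates = {1: {"top_y": 9.0, "area": 0.6},
              2: {"top_y": 6.0, "area": 0.7}}

def pick_farthest_from_bottom(cands):
    """Way 1: the target farthest above the display bottom (target 1 here)."""
    return max(cands, key=lambda tid: cands[tid]["top_y"])

def pick_largest_area(cands):
    """Way 2: the target with the largest blocked area (target 2 here)."""
    return max(cands, key=lambda tid: cands[tid]["area"])

print(pick_farthest_from_bottom(candidates), pick_largest_area(candidates))  # 1 2
```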
In a specific implementation, after the first target object is determined, the wearable device may also highlight the first target object. If the first target object is not the target object the user wishes to select, the user can adjust it by moving the operating body up, down, left or right, thereby achieving accurate control over the determination of the first target object.
In the above scheme, the first target object is determined directly from the blocked region and then controlled, without needing complex gestures. The difficulty of memorizing gestures is thus reduced, which lowers the difficulty of use, can increase the popularity of this kind of gesture control, and improves the user experience.
In the third way, the wearable device sends related information corresponding to the occlusion image to the electronic device specifically as follows:
the wearable device sends the occlusion image to the electronic device.
Specifically, the wearable device only captures the occlusion image; the process of processing the occlusion image to obtain the parameter information is then completed by the electronic device.
In step S104, the way the electronic device obtains the parameter information of the gesture operation differs with the related information sent by the wearable device. Three of these ways are introduced below; of course, specific implementations are not limited to the following three cases.
In the first case, the related information sent by the wearable device to the electronic device is the second coordinate information; the electronic device directly takes the second coordinate information as the parameter information of the gesture operation, and the electronic device can then be controlled through the second coordinate information.
In the second case, the related information sent by the wearable device to the electronic device is the first target object; the electronic device directly takes the first target object as the parameter information of the gesture operation, and the electronic device can then be controlled through the first target object. Taking Fig. 10 as an example, assuming the first target object is target object 1, the control instruction corresponding to target object 1 is triggered.
In the third case, the related information sent by the wearable device to the electronic device is the occlusion image. In this case the electronic device needs to process the occlusion image to obtain the parameter information of the gesture operation. There can be a variety of processing ways; two of them are introduced below, and of course specific implementations are not limited to the following two processing ways.
1. The electronic device determines the parameter information of the gesture operation based on the related information specifically as follows:
the electronic device determines first coordinate information of the operating body in the occlusion image;
the electronic device performs coordinate calibration on the first coordinate information to obtain second coordinate information of the operating body, the second coordinate information being the coordinate information of the operating body on the display unit as seen from the viewing angle of the user of the electronic device, the second coordinate information being the parameter information.
The way the electronic device determines the second coordinate information based on the occlusion image is similar to the way the wearable device does; every way in which the electronic device determines the second coordinate information based on the occlusion image can also be applied to the wearable device, so it is not repeated here.
Before the electronic device performs coordinate calibration on the first coordinate information, the method further includes:
the electronic device dividing the display unit into four quadrants with its center as the origin;
the electronic device determining, for each quadrant, an abscissa offset Δx, where Δx is the offset, relative to the origin, of the point in the respective quadrant that is farthest from the origin in the x-axis direction;
the electronic device determining, for each quadrant, an ordinate offset Δy, where Δy is the offset, relative to the origin, of the point in the respective quadrant that is farthest from the origin in the y-axis direction.
The way the electronic device determines Δx and Δy is similar to that of the wearable device; every way in which the electronic device determines Δx and Δy can also be applied to the wearable device, so it is not repeated here.
As a further preferred embodiment, the electronic device calibrates the first coordinate information by the following formulas:
x2 = x1 + (x1/a1)·Δx
where x2 represents the abscissa in the second coordinate information, x1 represents the abscissa in the first coordinate information, and a1 represents the actual abscissa on the display unit of the point in the respective quadrant that is farthest from the origin in the x-axis direction;
y2 = y1 + (y1/b1)·Δy
where y2 represents the ordinate in the second coordinate information, y1 represents the ordinate in the first coordinate information, and b1 represents the actual ordinate on the display unit of the point in the respective quadrant that is farthest from the origin in the y-axis direction.
As a further preferred embodiment, the electronic device calibrates the first coordinate information by the following formulas:
x2 = (l1/l0)·(x1 + (x1/a1)·Δx)
where l0 represents the width of the display unit as detected by the image acquisition device, l1 represents the actual width of the display unit, x2 represents the abscissa in the second coordinate information, x1 represents the abscissa in the first coordinate information, and a1 represents the actual abscissa on the display unit of the point in the respective quadrant that is farthest from the origin in the x-axis direction;
y2 = (h1/h0)·(y1 + (y1/b1)·Δy)
where h0 represents the height of the display unit as detected by the image acquisition device, h1 represents the actual height of the display unit, y2 represents the ordinate in the second coordinate information, y1 represents the ordinate in the first coordinate information, and b1 represents the actual ordinate on the display unit of the point in the respective quadrant that is farthest from the origin in the y-axis direction.
The way the electronic device calibrates the first coordinate information is similar to that of the wearable device; every way in which the electronic device calibrates the first coordinate information can also be applied to the wearable device, so it is not repeated here.
2. The electronic device determines the parameter information of the gesture operation based on the related information, specifically including:
the electronic device determining a first region blocked by the operating body in the occlusion image;
the electronic device performing regional calibration on the first region to obtain a second region, the second region being the region of the display unit that is blocked as seen from the viewing angle of the user of the electronic device;
the electronic device determining a first target object from the second region, the first target object being the parameter information.
The way the electronic device determines the first target object based on the occlusion image is similar to the way the wearable device does; every way in which the electronic device determines the first target object based on the occlusion image can also be applied to the wearable device, so it is not repeated here.
As a further preferred embodiment, the related information further includes the duration of the gesture operation, and before the electronic device determines the first region blocked by the operating body in the occlusion image, the method further includes:
the electronic device judging whether the duration of the gesture operation is greater than a preset duration;
the electronic device determining the first region blocked by the operating body in the occlusion image being specifically:
when the duration is greater than the preset duration, the electronic device determines the first region blocked by the operating body in the occlusion image.
The way the electronic device determines the first region is similar to that of the wearable device; every way in which the electronic device determines the first region can also be applied to the wearable device, so it is not repeated here.
As a further preferred embodiment, the electronic device determines the first target object from the second region, specifically including:
the electronic device performing image recognition on the second region to obtain N target objects, N being an integer greater than or equal to 2;
the electronic device determining, from the N target objects, M target objects whose blocked area is greater than a first threshold and less than a second threshold, M being a positive integer;
the electronic device determining the first target object from the M target objects.
The way the electronic device determines the first target object from the second region is similar to that of the wearable device; every way in which the electronic device determines the first target object from the second region can also be applied to the wearable device, so it is not repeated here.
As a further preferred embodiment, the electronic device determines the first target object from the M target objects specifically as follows:
the electronic device determines, from the M target objects, the target object farthest from the bottom of the display unit as the first target object.
The way the electronic device determines the first target object from the M target objects is similar to that of the wearable device; every way in which the electronic device determines the first target object from the M target objects can also be applied to the wearable device, so it is not repeated here.
In the second aspect, an embodiment of the present invention provides a wearable device in communication with an electronic device that has a display unit; referring to Fig. 11, the device includes:
a detection module 1101, configured to detect, when the display unit is within the acquisition range of the image acquisition device, whether a gesture operation produced by an operating body is present;
an acquisition module 1102, configured to, when the gesture operation is present, capture, through the external or built-in image acquisition device of the wearable device, an occlusion image of the operating body blocking the surface of the display unit;
a sending module 1103, configured to send related information corresponding to the occlusion image to the electronic device, so that the electronic device determines the parameter information of the gesture operation based on the related information.
Optionally, the sending module 1103 specifically includes:
a first determining unit, configured to determine first coordinate information of the operating body in the shielded image;
a first calibration unit, configured to perform coordinate calibration on the first coordinate information to obtain second coordinate information of the operating body, the second coordinate information being the coordinate information of the operating body on the display unit as seen from the viewing angle of the user of the electronic device;
a first sending unit, configured to send the second coordinate information to the electronic device.
Optionally, the device further includes:
a first division module, configured to, before the coordinate calibration is performed on the first coordinate information, divide the display unit into four quadrants with its center as the origin;
a first determining module, configured to determine, for each quadrant, an abscissa offset Δx, Δx being the offset, relative to the origin, of the point in the corresponding quadrant that is farthest from the origin along the horizontal axis;
a second determining module, configured to determine, for each quadrant, an ordinate offset Δy, Δy being the offset, relative to the origin, of the point in the corresponding quadrant that is farthest from the origin along the vertical axis.
Optionally, the first calibration unit is specifically configured to calibrate the first coordinate information by the following equations:
x2 = (x1 / a1) · Δx;
y2 = (y1 / b1) · Δy;
where x2 denotes the abscissa in the second coordinate information, x1 denotes the abscissa in the first coordinate information, and a1 denotes the actual abscissa of the point on the display unit in the corresponding quadrant that is farthest from the origin along the horizontal axis; and
where y2 denotes the ordinate in the second coordinate information, y1 denotes the ordinate in the first coordinate information, and b1 denotes the actual ordinate of the point on the display unit in the corresponding quadrant that is farthest from the origin along the vertical axis.
Optionally, the first calibration unit is specifically configured to calibrate the first coordinate information by the following equations:
x2 = (l0 / l1) · (x1 / a1) · Δx;
y2 = (h0 / h1) · (y1 / b1) · Δy;
where l0 denotes the length of the display unit as detected by the image acquisition device, l1 denotes the actual length of the display unit, x2 denotes the abscissa in the second coordinate information, x1 denotes the abscissa in the first coordinate information, and a1 denotes the actual abscissa of the point on the display unit in the corresponding quadrant that is farthest from the origin along the horizontal axis; and
where h0 denotes the height of the display unit as detected by the image acquisition device, h1 denotes the actual height of the display unit, y2 denotes the ordinate in the second coordinate information, y1 denotes the ordinate in the first coordinate information, and b1 denotes the actual ordinate of the point on the display unit in the corresponding quadrant that is farthest from the origin along the vertical axis.
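As a hedged illustration of these calibration equations, the Python sketch below maps first coordinate information (x1, y1) to second coordinate information (x2, y2) using per-quadrant parameters. The quadrant-lookup convention, the quadrant_params structure, and all names are assumptions of this sketch; the optional l0/l1 and h0/h1 factors correspond to the second pair of equations.

```python
def calibrate(x1, y1, quadrant_params, l0=None, l1=None, h0=None, h1=None):
    """Map a point measured in the shielded image to its position on the
    display unit, using the quadrant it falls in (origin at the display
    center). quadrant_params maps a quadrant index (1-4) to a dict with
    keys 'a1', 'b1', 'dx', 'dy' as defined in the text above."""
    # Standard quadrant numbering from the signs of the coordinates.
    quadrant = (1 if x1 >= 0 else 2) if y1 >= 0 else (4 if x1 >= 0 else 3)
    p = quadrant_params[quadrant]

    x2 = (x1 / p['a1']) * p['dx']
    y2 = (y1 / p['b1']) * p['dy']

    # Second form of the equations: correct for the difference between
    # the detected and the actual size of the display unit.
    if None not in (l0, l1, h0, h1):
        x2 *= l0 / l1
        y2 *= h0 / h1
    return x2, y2
```

For instance, with a1 = 320 and Δx = 960 in the relevant quadrant, a detected abscissa x1 = 160 maps to x2 = (160 / 320) · 960 = 480.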
Optionally, the sending module 1103 specifically includes:
a second determining unit, configured to determine the first region blocked by the operating body in the shielded image;
a second calibration unit, configured to perform region calibration on the first region to obtain a second region, the second region being the region of the display unit that appears blocked from the viewing angle of the user of the electronic device;
a third determining unit, configured to determine the first target object from the second region;
a second sending unit, configured to send the first target object to the electronic device.
Optionally, the device further includes:
a judging module, configured to judge, before the first region blocked by the operating body in the shielded image is determined, whether the duration of the gesture operation exceeds the preset duration;
the second determining unit being specifically configured to:
determine, when the duration exceeds the preset duration, the first region blocked by the operating body in the shielded image.
Optionally, the third determining unit specifically includes:
a first recognition subunit, configured to perform image recognition on the second region to obtain N target objects, N being an integer greater than or equal to 2;
a first determination subunit, configured to determine, from the N target objects, M target objects whose blocked area is greater than the first threshold and less than the second threshold, M being a positive integer;
a second determination subunit, configured to determine the first target object from the M target objects.
Optionally, the second determination subunit is specifically configured to:
determine, from the M target objects, the target object farthest from the bottom of the display unit as the first target object.
Optionally, the sending module 1103 is specifically configured to:
send the shielded image to the electronic device, so that the electronic device determines the parameter information based on the shielded image.
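Putting the wearable-side modules together, a minimal Python sketch might look as follows. The camera and link interfaces and the optional local analyze step stand in for the image acquisition device, the communication channel to the electronic device, and the two sending behaviours (raw shielded image versus locally derived relevant information); all of them are assumptions of this sketch, not an API defined by the patent.

```python
class WearableDevice:
    """Illustrative skeleton of the modules of Figure 11: detection
    module, acquisition module, and sending module."""

    def __init__(self, camera, link, analyze=None):
        self.camera = camera    # hypothetical image acquisition device
        self.link = link        # hypothetical channel to the electronic device
        self.analyze = analyze  # optional local analysis of the shielded image

    def run_once(self):
        # Detection module: is a gesture operation present while the
        # display unit is within the camera's acquisition range?
        if not self.camera.gesture_present():
            return
        # Acquisition module: capture the shielded image of the operating
        # body blocking the surface of the display unit.
        shielded_image = self.camera.capture()
        # Sending module: send the raw shielded image, or analyze it
        # locally and send only the derived relevant information.
        payload = (shielded_image if self.analyze is None
                   else self.analyze(shielded_image))
        self.link.send(payload)
```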
In a third aspect, an embodiment of the present invention provides an electronic device that communicates with a wearable device having an external or built-in image acquisition device. Referring to Figure 12, the device includes:
a receiving module 1201, configured to receive the relevant information corresponding to the shielded image sent by the wearable device, the shielded image being specifically: when the display unit of the electronic device is within the acquisition range of the image acquisition device and a gesture operation produced by an operating body exists, the image, acquired by the image acquisition device, of the operating body blocking the surface of the display unit;
a parameter determining module 1202, configured to determine the parameter information of the gesture operation from the relevant information.
Optionally, the relevant information is specifically the shielded image, and the parameter determining module 1202 specifically includes:
a fourth determining unit, configured to determine the first coordinate information of the operating body in the shielded image;
a third calibration unit, configured to perform coordinate calibration on the first coordinate information to obtain the second coordinate information of the operating body, the second coordinate information being the coordinate information of the operating body on the display unit as seen from the viewing angle of the user of the electronic device, and the second coordinate information being the parameter information.
Optionally, the device further includes:
a second division module, configured to, before the coordinate calibration is performed on the first coordinate information, divide the display unit into four quadrants with its center as the origin;
a third determining module, configured to determine, for each quadrant, the abscissa offset Δx, Δx being the offset, relative to the origin, of the point in the corresponding quadrant that is farthest from the origin along the horizontal axis;
a fourth determining module, configured to determine, for each quadrant, the ordinate offset Δy, Δy being the offset, relative to the origin, of the point in the corresponding quadrant that is farthest from the origin along the vertical axis.
Optionally, the third calibration unit is specifically configured to calibrate the first coordinate information by the following equations:
x2 = (x1 / a1) · Δx;
y2 = (y1 / b1) · Δy;
where x2 denotes the abscissa in the second coordinate information, x1 denotes the abscissa in the first coordinate information, and a1 denotes the actual abscissa of the point on the display unit in the corresponding quadrant that is farthest from the origin along the horizontal axis; and
where y2 denotes the ordinate in the second coordinate information, y1 denotes the ordinate in the first coordinate information, and b1 denotes the actual ordinate of the point on the display unit in the corresponding quadrant that is farthest from the origin along the vertical axis.
Optionally, the third calibration unit is specifically configured to calibrate the first coordinate information by the following equations:
x2 = (l0 / l1) · (x1 / a1) · Δx;
y2 = (h0 / h1) · (y1 / b1) · Δy;
where l0 denotes the length of the display unit as detected by the image acquisition device, l1 denotes the actual length of the display unit, x2 denotes the abscissa in the second coordinate information, x1 denotes the abscissa in the first coordinate information, and a1 denotes the actual abscissa of the point on the display unit in the corresponding quadrant that is farthest from the origin along the horizontal axis; and
where h0 denotes the height of the display unit as detected by the image acquisition device, h1 denotes the actual height of the display unit, y2 denotes the ordinate in the second coordinate information, y1 denotes the ordinate in the first coordinate information, and b1 denotes the actual ordinate of the point on the display unit in the corresponding quadrant that is farthest from the origin along the vertical axis.
Optionally, the relevant information is specifically the shielded image, and the parameter determining module 1202 specifically includes:
a fifth determining unit, configured to determine the first region blocked by the operating body in the shielded image;
a fourth calibration unit, configured to perform region calibration on the first region to obtain the second region, the second region being the region of the display unit that appears blocked from the viewing angle of the user of the electronic device;
a sixth determining unit, configured to determine the first target object from the second region, the first target object being the parameter information.
Optionally, the relevant information further includes the duration of the gesture operation, and the device further includes:
a second judging module, configured to judge, before the first region blocked by the operating body in the shielded image is determined, whether the duration of the gesture operation exceeds the preset duration;
the fifth determining unit being specifically configured to:
determine, when the duration exceeds the preset duration, the first region blocked by the operating body in the shielded image.
Optionally, the sixth determining unit specifically includes:
a second recognition subunit, configured to perform image recognition on the second region to obtain N target objects, N being an integer greater than or equal to 2;
a third determination subunit, configured to determine, from the N target objects, M target objects whose blocked area is greater than the first threshold and less than the second threshold, M being a positive integer;
a fourth determination subunit, configured to determine the first target object from the M target objects.
Optionally, the fourth determination subunit is specifically configured to:
determine, from the M target objects, the target object farthest from the bottom of the display unit as the first target object.
Optionally, the relevant information is specifically the second coordinate information, the second coordinate information being the coordinate information of the operating body on the display unit as seen from the viewing angle of the user of the electronic device; or
the relevant information is specifically the first target object, the first target object being a target object contained in the region of the display unit that appears blocked from the viewing angle of the user of the electronic device.
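Mirroring the wearable-side sketch, a hedged Python sketch of the electronic-device side shows how the receiving module could dispatch on the form of the relevant information (a raw shielded image versus already-derived parameter information such as the second coordinate information or the first target object). The type-based dispatch and every name here are illustrative assumptions, not structures defined by the patent.

```python
from dataclasses import dataclass

@dataclass
class ShieldedImage:
    """Placeholder for the image of the operating body blocking the
    surface of the display unit."""
    pixels: bytes

class ElectronicDevice:
    """Illustrative skeleton of the modules of Figure 12: a receiving
    module plus a parameter determining / control path."""

    def __init__(self, analyzer, controller):
        self.analyzer = analyzer      # turns a shielded image into parameter info
        self.controller = controller  # applies parameter info as a control action

    def on_receive(self, relevant_info):
        # Receiving module: the wearable device may send either the raw
        # shielded image or the parameter information it derived itself.
        if isinstance(relevant_info, ShieldedImage):
            # Parameter determining module: derive the parameter
            # information (coordinates or first target object) locally.
            params = self.analyzer(relevant_info)
        else:
            params = relevant_info
        self.controller(params)
```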
One or more embodiments of the present invention have at least the following beneficial effects:
In the embodiments of the present invention, whether a gesture operation produced by an operating body exists is detected first; when the gesture operation is determined to exist, a shielded image of the operating body blocking the surface of the display unit is acquired; finally, the parameter information of the gesture operation is determined based on the shielded image, so that the electronic device containing the display unit is controlled through the parameter information. That is, the parameter information of the gesture operation can be determined directly from the acquired shielded image of the operating body blocking the display unit surface, and the electronic device can then be controlled through that parameter information, without invoking a dedicated control interface through cumbersome gestures and without restricting the travel time or distance of the operating body producing the gesture operation. This improves the efficiency of controlling an electronic device containing a display unit; furthermore, since the position of the operating body producing the gesture operation is unrestricted, the application scenarios of gesture control are broadened.
Those skilled in the art will appreciate that embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical memory) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to an embedded controller of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, such that the instructions executed by the computer or by the embedded controller of the other programmable data processing device produce an apparatus for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to work in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, such that a series of operational steps is performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the present invention have been described, those skilled in the art, once aware of the basic inventive concept, may make further changes and modifications to these embodiments. The appended claims are therefore intended to be construed as covering the preferred embodiments and all changes and modifications falling within the scope of the present invention.
Obviously, those skilled in the art can make various changes and modifications to the embodiments of the present invention without departing from the spirit and scope of the embodiments of the present invention. If such modifications and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is intended to encompass them as well.

Claims (10)

1. A gesture control method, characterized by comprising:
    detecting whether a gesture operation produced by an operating body exists;
    when the gesture operation exists, acquiring a shielded image of the operating body blocking a surface of a display unit;
    determining parameter information of the gesture operation based on the shielded image; and
    controlling an electronic device containing the display unit through the parameter information; wherein
    the determining of the parameter information of the gesture operation based on the shielded image comprises:
    determining a first region blocked by the operating body in the shielded image;
    performing region calibration on the first region to obtain a second region, the second region being a region of the display unit that appears blocked from a viewing angle of a user of the electronic device;
    determining a first target object from the second region; and
    sending the first target object to the electronic device.
2. The method of claim 1, characterized in that the determining of the parameter information of the gesture operation based on the shielded image specifically comprises:
    determining first coordinate information of the operating body in the shielded image; and
    performing coordinate calibration on the first coordinate information to obtain second coordinate information of the operating body, the second coordinate information being coordinate information of the operating body on the display unit as seen from the viewing angle of the user of the electronic device, the second coordinate information being the parameter information.
3. The method of claim 2, characterized in that, before the coordinate calibration is performed on the first coordinate information, the method further comprises:
    dividing the display unit into four quadrants with its center as an origin;
    determining, for each quadrant, an abscissa offset Δx, Δx being the offset, relative to the origin, of the point in the corresponding quadrant that is farthest from the origin along the horizontal axis; and
    determining, for each quadrant, an ordinate offset Δy, Δy being the offset, relative to the origin, of the point in the corresponding quadrant that is farthest from the origin along the vertical axis.
4. The method of claim 3, characterized in that the first coordinate information is calibrated by the following equations:
    x2 = (x1 / a1) · Δx;
    y2 = (y1 / b1) · Δy;
    wherein x2 denotes the abscissa in the second coordinate information, x1 denotes the abscissa in the first coordinate information, and a1 denotes the actual abscissa of the point on the display unit in the corresponding quadrant that is farthest from the origin along the horizontal axis; and
    wherein y2 denotes the ordinate in the second coordinate information, y1 denotes the ordinate in the first coordinate information, and b1 denotes the actual ordinate of the point on the display unit in the corresponding quadrant that is farthest from the origin along the vertical axis.
5. The method of claim 3, characterized in that the first coordinate information is calibrated by the following equations:
    x2 = (l0 / l1) · (x1 / a1) · Δx;
    y2 = (h0 / h1) · (y1 / b1) · Δy;
    wherein l0 denotes the length of the display unit as detected by the image acquisition device, l1 denotes the actual length of the display unit, x2 denotes the abscissa in the second coordinate information, x1 denotes the abscissa in the first coordinate information, and a1 denotes the actual abscissa of the point on the display unit in the corresponding quadrant that is farthest from the origin along the horizontal axis; and
    wherein h0 denotes the height of the display unit as detected by the image acquisition device, h1 denotes the actual height of the display unit, y2 denotes the ordinate in the second coordinate information, y1 denotes the ordinate in the first coordinate information, and b1 denotes the actual ordinate of the point on the display unit in the corresponding quadrant that is farthest from the origin along the vertical axis.
6. The method of claim 1, characterized in that, before the determining of the first region blocked by the operating body in the shielded image, the method further comprises:
    judging whether a duration of the gesture operation exceeds a preset duration;
    wherein the determining of the first region blocked by the operating body in the shielded image is specifically:
    determining, when the duration exceeds the preset duration, the first region blocked by the operating body in the shielded image.
7. The method of claim 6, characterized in that the determining of the first target object from the second region specifically comprises:
    performing image recognition on the second region to obtain N target objects, N being an integer greater than or equal to 2;
    determining, from the N target objects, M target objects whose blocked area is greater than a first threshold and less than a second threshold, M being a positive integer; and
    determining the first target object from the M target objects.
8. The method of claim 7, characterized in that the determining of the first target object from the M target objects is specifically:
    determining, from the M target objects, the target object farthest from a bottom of the display unit as the first target object.
9. A wearable device communicating with an electronic device having a display unit, characterized in that the device comprises:
    a detection module, configured to detect whether a gesture operation produced by an operating body exists;
    an acquisition module, configured to, when the gesture operation exists, acquire a shielded image of the operating body blocking a surface of the display unit; and
    a sending module, configured to send the shielded image to the electronic device, so that the electronic device obtains parameter information of the gesture operation from the shielded image and is controlled through the parameter information; or
    a sending module, configured to send the parameter information of the gesture operation directly to the electronic device, so that the electronic device is controlled through the parameter information, the parameter information being obtained based on the shielded image; wherein the determining of the parameter information of the gesture operation based on the shielded image comprises:
    determining a first region blocked by the operating body in the shielded image;
    performing region calibration on the first region to obtain a second region, the second region being a region of the display unit that appears blocked from a viewing angle of a user of the electronic device; and
    determining a first target object from the second region, the first target object being the parameter information.
10. An electronic device communicating with a wearable device having an external or built-in image acquisition device, the device comprising:
    a receiving module, configured to receive the shielded image sent by the wearable device or to receive parameter information of a gesture operation sent by the wearable device, the shielded image being specifically: when a gesture operation produced by an operating body exists, an image, acquired by the image acquisition device, of the operating body blocking a surface of a display unit, the parameter information of the gesture operation being specifically obtained by the wearable device analyzing the shielded image; and
    a control module, configured to obtain the shielded image, obtain the parameter information of the gesture operation from the shielded image, and control the electronic device through the parameter information; or
    a control module, configured to directly obtain the parameter information of the gesture operation received by the receiving module and control the electronic device through the parameter information of the gesture operation; wherein the obtaining of the parameter information by the wearable device analyzing the shielded image comprises:
    determining a first region blocked by the operating body in the shielded image;
    performing region calibration on the first region to obtain a second region, the second region being a region of the display unit that appears blocked from a viewing angle of a user of the electronic device;
    determining a first target object from the second region; and
    sending the first target object to the electronic device.