CN105260022A - Gesture-based screen capture control method and apparatus - Google Patents
- Publication number
- CN105260022A (application number CN201510670573.XA)
- Authority
- CN
- China
- Prior art keywords
- mobile terminal
- display screen
- gesture
- gesture motion
- graphics field
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
The present invention discloses a gesture-based screen capture control method and apparatus. The method comprises: identifying a gesture action of a user by means of an ultrasonic sensor in a mobile terminal; determining a pattern region indicated by the gesture action; and determining a projection area of the pattern region on a display screen of the mobile terminal, capturing the display content corresponding to the projection area, and storing it as a picture. This solves the prior-art problems that the screen capture mode is inflexible and that the display screen of the mobile terminal must have a touch function, so that a screen capture can be performed in the air without touching the mobile terminal, making the screenshot operation more engaging and improving the user experience.
Description
Technical field
Embodiments of the present invention relate to mobile terminal technology, and in particular to a gesture-based screen capture control method and apparatus.
Background art
With the advance of intelligent electronic products, mobile terminals with human-computer interaction functions are used ever more widely in daily life and production, and users' demands on the convenience of human-computer interaction keep rising.
At present, content displayed on a mobile terminal is usually captured by pressing a preset button for longer than a preset duration, which captures the currently displayed picture. This approach can only capture the full screen; it cannot capture a selected part of the displayed picture, is prone to accidental triggering, and cannot meet users' diverse screenshot requirements. Some mobile terminals instead determine the screenshot region from the trace of a finger sliding on the display screen, which allows selective capture according to the user's needs. However, this mode is still inflexible: because the finger's sliding trace must be detected on the display screen, the display screen of the mobile terminal must have a touch function, and the approach is difficult to extend to intelligent terminals without a touch display screen.
Summary of the invention
The present invention provides a gesture-based screen capture control method and apparatus, so that a screen capture can be performed in the air without touching the mobile terminal, making the operation more engaging and improving the user experience.
In a first aspect, an embodiment of the present invention provides a gesture-based screen capture control method, comprising:
identifying a gesture action of a user by means of an ultrasonic sensor in a mobile terminal;
determining the pattern region indicated by the gesture action; and
determining the projection area of the pattern region on the display screen of the mobile terminal, capturing the display content corresponding to the projection area and saving it as a picture.
In a second aspect, an embodiment of the present invention further provides a gesture-based screen capture control apparatus, comprising:
a gesture action recognition unit, configured to identify a gesture action of a user by means of an ultrasonic sensor in a mobile terminal;
a pattern region determining unit, configured to determine the pattern region indicated by the gesture action; and
a picture capture unit, configured to determine the projection area of the pattern region on the display screen of the mobile terminal, capture the display content corresponding to the projection area, and save it as a picture.
The present invention identifies a user's gesture action by means of an ultrasonic sensor in a mobile terminal, determines the pattern region indicated by the gesture action, determines the projection area of that region on the display screen of the mobile terminal, and captures the display content corresponding to the projection area as a picture. Ultrasound has good directivity and is reflected when it meets an obstacle, which allows the user's gesture action to be identified accurately. Because the captured screenshot has the same shape as the projection area of the gesture-indicated region, the screen capture mode is more flexible and more engaging. The invention thus solves the prior-art problems that the screen capture mode is inflexible and that the display screen of the mobile terminal must have a touch function, achieves in-air screen capture without touching the mobile terminal, and improves both the enjoyment of the screenshot operation and the user experience.
Brief description of the drawings
Fig. 1 is a flowchart of a gesture-based screen capture control method according to Embodiment 1 of the present invention;
Fig. 2a is a flowchart of the screenshot procedure in the gesture-based screen capture control method according to Embodiment 2 of the present invention;
Fig. 2b is a schematic diagram of the screenshot process in the gesture-based screen capture control method according to Embodiment 2 of the present invention;
Fig. 3 is a structural diagram of a gesture-based screen capture control apparatus according to Embodiment 3 of the present invention;
Fig. 4 is a structural diagram of the ultrasonic sensor in an embodiment of the present invention.
Detailed description
The present invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described herein are intended only to explain the present invention and do not limit it. It should also be noted that, for ease of description, the drawings show only the parts related to the present invention rather than the entire structure.
Embodiment 1
Fig. 1 is a flowchart of the gesture-based screen capture control method provided by Embodiment 1 of the present invention. This embodiment is applicable to capturing, in the air, a picture displayed by a terminal. The method may be performed by a gesture-based screen capture control apparatus configured in a terminal with a picture display function. The method specifically comprises the following steps:
Step 110: identify a gesture action of the user by means of an ultrasonic sensor in the mobile terminal.
The ultrasonic sensor comprises at least one receiving end and at least one transmitting end. The transmitting end emits an ultrasonic signal, and the receiving end receives the ultrasonic signal reflected back by the user's hand. For example, in order to obtain the coordinates of the gesture action in three-dimensional space, receiving ends may be arranged along the X, Y and Z axes of the terminal body. The time needed for the ultrasonic signal to propagate to the user's hand is calculated from the emitted ultrasonic signal and the received reflection, the gesture position coordinates corresponding to the user's gesture action are determined from that time, and the coordinates are matched against the coordinates of standard gestures in a preset gesture library to identify the gesture action. In particular, the loudspeaker of the mobile terminal may serve as the transmitting end and the microphone of the mobile terminal as the receiving end.
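The time-of-flight localization described above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the speed of sound, the receiver layout at (0, 0, 0), (d, 0, 0) and (0, d, 0), and the simplification that each receiver measures its own distance to the hand are all assumptions made for the sketch.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed)

def echo_distance(round_trip_s: float) -> float:
    """Convert a round-trip echo time into a one-way distance,
    assuming transmitter and receiver are co-located."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

def locate_hand(r1: float, r2: float, r3: float, d: float):
    """Trilaterate a 3-D point from its distances to three receivers
    placed at (0,0,0), (d,0,0) and (0,d,0) on the terminal body.
    Returns (x, y, z) with z >= 0, i.e. the hand above the screen."""
    x = (r1**2 - r2**2 + d**2) / (2.0 * d)
    y = (r1**2 - r3**2 + d**2) / (2.0 * d)
    z2 = r1**2 - x**2 - y**2
    z = math.sqrt(max(z2, 0.0))  # clamp tiny negative values from noise
    return (x, y, z)
```

A real terminal would repeat this per echo sample to trace the hand's trajectory over time.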
Step 120: determine the pattern region indicated by the gesture action.
The gesture action may be performed with one hand, with both hands, or by the hands of several users. The pattern indicated by the action may be a regular figure such as a square or a circle, or an irregular figure. Based on the identified gesture action, the terminal either determines the enclosed region with the largest area formed by the gesture action as the pattern region indicated by the gesture action, or performs a completion operation on the missing part of a non-closed region, formed by the gesture action, that meets a preset condition, and determines the enclosed region obtained after completion as the pattern region. The preset condition is that the central angle between the two adjacent endpoints of the missing part exceeds a preset angle threshold, or that the distance between the two adjacent endpoints of the missing part does not exceed a preset distance threshold.
For example, two users each make a gesture, so that the gesture actions form a closed rectangle and the closed circumscribed circle of that rectangle. The mobile terminal identifies the rectangle and the circumscribed circle and compares their areas. Based on the comparison result, the mobile terminal determines the enclosed region corresponding to the circumscribed circle as the pattern region indicated by the gesture action.
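When the recognized regions are represented as vertex polygons, the area comparison in the example above can be done with the shoelace formula. The helper names below are hypothetical, chosen for the sketch:

```python
def polygon_area(points):
    """Shoelace formula: absolute area of a closed polygon given its
    vertices in order (the closing edge last -> first is implicit)."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def largest_region(regions):
    """Pick the enclosed region with the largest area, as the terminal
    does when one gesture outlines several closed figures at once."""
    return max(regions, key=polygon_area)
```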
As another example, suppose it is preset that a non-closed arc whose central angle is greater than or equal to 180 degrees may be completed. If the terminal recognizes that the user has formed, by a gesture action, a non-closed circular arc with a central angle of 275 degrees, the arc is completed into a circle according to the central symmetry of a circle, and the region corresponding to the closed circle obtained after completion is determined as the pattern region indicated by the gesture action.
As yet another example, suppose the preset distance threshold is 5 cm, so that a missing part whose adjacent endpoints are no more than 5 cm apart may be completed. If the terminal recognizes that the user has formed a non-closed rectangle by a gesture action and calculates that the two adjacent endpoints of the missing part are 3 cm apart, rays are drawn outward from the two endpoints along the length and width directions of the rectangle until they intersect. The region corresponding to the closed rectangle obtained after completion is determined as the pattern region indicated by the gesture action.
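A minimal sketch of the distance-threshold completion described above. It simplifies the patent's construction as an assumption for the sketch: instead of extending rays along the rectangle's sides until they intersect, it joins the two endpoints directly when the gap is within the threshold.

```python
import math

def close_small_gap(points, max_gap):
    """Close an open contour by joining its two endpoints, but only when
    the gap between them does not exceed max_gap (the preset distance
    threshold). Returns the closed vertex list, or None when the gap is
    too large to complete."""
    (x1, y1), (x2, y2) = points[0], points[-1]
    gap = math.hypot(x2 - x1, y2 - y1)
    if gap > max_gap:
        return None
    return points + [points[0]]
```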
Step 130: determine the projection area of the pattern region on the display screen of the mobile terminal, capture the display content corresponding to the projection area and save it as a picture.
From the gesture position coordinates of the user's gesture action in three-dimensional space, the mobile terminal calculates the projection coordinates, on its display screen, of the pattern region indicated by the gesture action, and determines the projection area from those coordinates. The mobile terminal then captures the display content corresponding to that projection area and saves it as a picture, obtaining a screenshot with the same shape as the projection area falling on the display screen.
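This projection step can be sketched as an orthographic drop of the z component (the hand's height above the screen) followed by a conversion to pixel coordinates. The calibration values here (screen origin and pixel density) are stand-ins for quantities a real terminal would know from its hardware specification:

```python
def project_to_screen(points_3d, screen_origin, px_per_m, resolution):
    """Orthographically project 3-D gesture coordinates onto the display
    plane by dropping z, then convert to pixel coordinates.
    screen_origin is the physical (x, y) of the screen's corner and
    px_per_m its pixel density; both are assumed calibration values."""
    ox, oy = screen_origin
    pixels = []
    for x, y, _z in points_3d:
        pixels.append(((x - ox) * px_per_m, (y - oy) * px_per_m))
    return pixels

def exceeds_screen(pixels, resolution):
    """True when any projected vertex falls outside the screen, i.e. the
    check that decides between step 250 and step 260 below."""
    w, h = resolution
    return any(not (0 <= px <= w and 0 <= py <= h) for px, py in pixels)
```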
In the technical solution of this embodiment, a user's gesture action is identified by means of an ultrasonic sensor in a mobile terminal, the pattern region indicated by the gesture action is determined, the projection area of that region on the display screen is determined, and the display content corresponding to the projection area is captured and saved as a picture. This solves the prior-art problems that the screen capture mode is inflexible and that the display screen of the mobile terminal must have a touch function, achieves in-air screen capture without touching the mobile terminal, and improves both the enjoyment of the screenshot operation and the user experience.
Embodiment 2
Fig. 2a is a flowchart of the screenshot procedure in the gesture-based screen capture control method according to Embodiment 2 of the present invention. On the basis of the above embodiment, this embodiment further elaborates how the projection area of the pattern region on the display screen of the mobile terminal is determined.
The method specifically comprises the following steps:
Step 210: obtain, by means of the ultrasonic sensor, the gesture position coordinates corresponding to the user's gesture action, and match them against the coordinates of standard gestures in a preset gesture library to identify the gesture action.
The mobile terminal emits an ultrasonic signal through its loudspeaker acting as the transmitting end and receives the reflected ultrasonic signal through its microphone acting as the receiving end. From the emitted signal and the received reflection, it calculates the time needed for the ultrasonic signal to propagate to the user's hand, determines the gesture position coordinates from that time, matches these coordinates against the standard gestures in the preset gesture library, and determines the gesture action from the matching result. For example, the matching may determine that the gesture action is a non-closed arc with a central angle of 275 degrees, with a closed rectangle inscribed in the arc.
Step 220: determine the pattern region indicated by the gesture action.
The user presets that a non-closed arc whose central angle is greater than or equal to 180 degrees may be completed. The mobile terminal compares the central angle of the non-closed arc (275 degrees) with the preset angle threshold (180 degrees); since the central angle exceeds the threshold, the completion operation may be performed on the arc. The mobile terminal completes the non-closed arc into a closed circular region according to the central symmetry of a circle. Since the area of the closed rectangle inscribed in this circle is smaller than the area of the circle, the completed circle is determined as the pattern region indicated by the gesture action.
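The arc completion can be illustrated by recovering the full circle from three sampled arc points, since any three distinct points on the arc determine its circumcircle. This is a sketch only; the method described above would first check the arc's central angle against the 180-degree threshold before completing.

```python
import math

def circle_through(p1, p2, p3):
    """Center and radius of the circle through three non-collinear
    points, i.e. the full circle a non-closed arc is completed into."""
    ax, ay = p1
    bx, by = p2
    cx, cy = p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), math.hypot(ax - ux, ay - uy)
```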
Step 230: determine, from the gesture position coordinates, the projection position coordinates of the pattern region in the plane of the display screen of the mobile terminal.
According to projection theory, the mobile terminal determines, from the gesture position coordinates forming the circle, the projection position coordinates of the circle in the plane of its display screen.
Step 240: determine whether the projection position coordinates exceed the display screen of the mobile terminal. If they do, perform step 260; if they do not, perform step 250.
The mobile terminal compares the determined projection position coordinates with its own pre-recorded dimensional data to determine whether the projection position coordinates exceed the display screen. As shown in Fig. 2b, 2b-1 denotes the display screen of the mobile terminal, 2b-2 denotes the projection line segments whose projection position coordinates fall within the display screen, 2b-3 denotes the projection line segments whose projection position coordinates exceed the display screen, and 2b-4 denotes the display screen border of the mobile terminal. When the projection position coordinates exceed the display screen, step 260 is performed; otherwise, step 250 is performed.
Step 250: determine, from the projection position coordinates, the projection area of the pattern region on the display screen of the mobile terminal.
When the projection position coordinates do not exceed the display screen, the closed graphic region formed by connecting the projection position coordinates is the projection area on the display screen of the mobile terminal.
Step 260: determine the projection area of the pattern region on the display screen from the enclosed region formed by the projection line segments falling within the display screen and the display screen border.
Shown in Fig. 2 b, when described projected position coordinate exceeds the display screen of described mobile terminal, mobile terminal determines the Projection Line Segment 2b-2 fallen in the display screen of described view field, and is positioned at the display screen frame 2b-4 of described view field.The enclosed region that described Projection Line Segment 2b-2 and display screen frame 2b-4 is formed is defined as the view field of described graphics field on the display screen of described mobile terminal.
In the technical solution of this embodiment, the gesture position coordinates corresponding to the user's gesture action are obtained by the ultrasonic sensor to identify the gesture action, the pattern region determined from the gesture action is projected onto the display screen of the mobile terminal to determine the projection area, and the display content corresponding to the projection area is captured and saved as a picture. This solves the problem that existing screenshot modes are inflexible and require a touch display screen, achieves in-air screen capture according to the user's gesture, adaptively captures the currently displayed picture according to the projection line segments falling on the display screen, and improves both the enjoyment of the screenshot operation and the user experience.
Embodiment 3
Fig. 3 is a structural diagram of the gesture-based screen capture control apparatus according to Embodiment 3 of the present invention. The apparatus comprises:
a gesture action recognition unit 310, configured to identify a gesture action of a user by means of an ultrasonic sensor in a mobile terminal;
a pattern region determining unit 320, configured to determine the pattern region indicated by the gesture action; and
a picture capture unit 330, configured to determine the projection area of the pattern region on the display screen of the mobile terminal, capture the display content corresponding to the projection area, and save it as a picture.
In the technical solution of this embodiment, the gesture action recognition unit 310 identifies the user's gesture action by means of the ultrasonic sensor in the mobile terminal, the pattern region determining unit 320 determines the pattern region indicated by the gesture action, and the picture capture unit 330 determines the projection area of the pattern region on the display screen and captures the display content corresponding to the projection area as a picture. This solves the prior-art problems that the screen capture mode is inflexible and that the display screen must have a touch function, achieves in-air screen capture without touching the mobile terminal, and improves both the enjoyment of the screenshot operation and the user experience.
Further, the gesture action recognition unit 310 is specifically configured to:
obtain, by means of the ultrasonic sensor, the gesture position coordinates corresponding to the user's gesture action, and match them against the coordinates of standard gestures in a preset gesture library to identify the gesture action.
Further, the pattern region determining unit 320 is specifically configured to:
determine the enclosed region with the largest area formed by the gesture action as the pattern region indicated by the gesture action; or
perform a completion operation on the missing part of a non-closed region, formed by the gesture action, that meets a preset condition, and determine the enclosed region obtained after completion as the pattern region indicated by the gesture action.
Further, the preset condition comprises that the central angle between the two adjacent endpoints of the missing part exceeds a preset angle threshold, or that the distance between the two adjacent endpoints of the missing part does not exceed a preset distance threshold.
Further, the picture capture unit 330 is specifically configured to:
determine, from the gesture position coordinates, the projection position coordinates of the pattern region in the plane of the display screen of the mobile terminal, and determine whether the projection position coordinates exceed the display screen;
if the projection position coordinates do not exceed the display screen, determine, from the projection position coordinates, the projection area of the pattern region on the display screen; and
if the projection position coordinates exceed the display screen, determine the projection area of the pattern region on the display screen from the enclosed region formed by the projection line segments falling within the display screen and the display screen border.
Further, the ultrasonic sensor comprises at least one receiving end and at least one transmitting end. As shown in Fig. 4, the transmitting end is the loudspeaker 410 of the mobile terminal and the receiving end is the microphone 420 of the mobile terminal. The ultrasonic signal emitted by the loudspeaker 410 passes out of the mobile terminal through the glass cover plate and, upon meeting an obstacle, is reflected by it; the microphone 420 receives the reflected ultrasonic signal.
The gesture-based screen capture control apparatus described above can perform the gesture-based screen capture control method provided by any embodiment of the present invention, and has the functional modules and beneficial effects corresponding to the method performed.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will appreciate that the present invention is not limited to the specific embodiments described here; various obvious changes, readjustments and substitutions can be made without departing from the protection scope of the present invention. Therefore, although the present invention has been described in further detail through the above embodiments, it is not limited to them and may include other equivalent embodiments without departing from the concept of the present invention; its scope is determined by the appended claims.
Claims (12)
1. A gesture-based screen capture control method, characterized by comprising:
identifying a gesture action of a user by means of an ultrasonic sensor in a mobile terminal;
determining a pattern region indicated by the gesture action; and
determining a projection area of the pattern region on a display screen of the mobile terminal, capturing the display content corresponding to the projection area and saving it as a picture.
2. The method according to claim 1, characterized in that identifying the gesture action of the user by means of the ultrasonic sensor in the mobile terminal comprises:
obtaining, by means of the ultrasonic sensor, gesture position coordinates corresponding to the gesture action of the user, and matching the gesture position coordinates against coordinates of standard gestures in a preset gesture library to identify the gesture action.
3. The method according to claim 1, characterized in that determining the pattern region indicated by the gesture action comprises:
determining the enclosed region with the largest area formed by the gesture action as the pattern region indicated by the gesture action; or
performing a completion operation on the missing part of a non-closed region, formed by the gesture action, that meets a preset condition, and determining the enclosed region obtained after completion as the pattern region indicated by the gesture action.
4. The method according to claim 3, characterized in that the preset condition comprises that the central angle between the two adjacent endpoints of the missing part exceeds a preset angle threshold, or that the distance between the two adjacent endpoints of the missing part does not exceed a preset distance threshold.
5. The method according to claim 2, characterized in that determining the projection area of the pattern region on the display screen of the mobile terminal comprises:
determining, from the gesture position coordinates, projection position coordinates of the pattern region in the plane of the display screen of the mobile terminal, and determining whether the projection position coordinates exceed the display screen;
if the projection position coordinates do not exceed the display screen, determining, from the projection position coordinates, the projection area of the pattern region on the display screen; and
if the projection position coordinates exceed the display screen, determining the projection area of the pattern region on the display screen from the enclosed region formed by the projection line segments falling within the display screen and the display screen border.
6. The method according to any one of claims 1-5, characterized in that the ultrasonic sensor comprises at least one receiving end and at least one transmitting end, wherein the transmitting end is a loudspeaker of the mobile terminal and the receiving end is a microphone of the mobile terminal.
7. A gesture-based screen capture control apparatus, characterized by comprising:
a gesture action recognition unit, configured to identify a gesture action of a user by means of an ultrasonic sensor in a mobile terminal;
a pattern region determining unit, configured to determine a pattern region indicated by the gesture action; and
a picture capture unit, configured to determine a projection area of the pattern region on a display screen of the mobile terminal, capture the display content corresponding to the projection area and save it as a picture.
8. The apparatus according to claim 7, characterized in that the gesture action recognition unit is specifically configured to:
obtain, by means of the ultrasonic sensor, gesture position coordinates corresponding to the gesture action of the user, and match the gesture position coordinates against coordinates of standard gestures in a preset gesture library to identify the gesture action.
9. The apparatus according to claim 7, characterized in that the pattern region determining unit is specifically configured to:
determine the enclosed region with the largest area formed by the gesture action as the pattern region indicated by the gesture action; or
perform a completion operation on the missing part of a non-closed region, formed by the gesture action, that meets a preset condition, and determine the enclosed region obtained after completion as the pattern region indicated by the gesture action.
10. The apparatus according to claim 9, characterized in that the preset condition comprises that the central angle between the two adjacent endpoints of the missing part exceeds a preset angle threshold, or that the distance between the two adjacent endpoints of the missing part does not exceed a preset distance threshold.
11. The apparatus according to claim 8, characterized in that the picture capture unit is specifically configured to:
determine, from the gesture position coordinates, projection position coordinates of the pattern region in the plane of the display screen of the mobile terminal, and determine whether the projection position coordinates exceed the display screen;
if the projection position coordinates do not exceed the display screen, determine, from the projection position coordinates, the projection area of the pattern region on the display screen; and
if the projection position coordinates exceed the display screen, determine the projection area of the pattern region on the display screen from the enclosed region formed by the projection line segments falling within the display screen and the display screen border.
12. The apparatus according to any one of claims 7-11, characterized in that the ultrasonic sensor comprises at least one receiving end and at least one transmitting end, wherein the transmitting end is a loudspeaker of the mobile terminal and the receiving end is a microphone of the mobile terminal.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510670573.XA CN105260022B (en) | 2015-10-15 | 2015-10-15 | A kind of method and device based on gesture control sectional drawing |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510670573.XA CN105260022B (en) | 2015-10-15 | 2015-10-15 | A kind of method and device based on gesture control sectional drawing |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105260022A true CN105260022A (en) | 2016-01-20 |
CN105260022B CN105260022B (en) | 2018-03-23 |
Family
ID=55099743
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510670573.XA Expired - Fee Related CN105260022B (en) | 2015-10-15 | 2015-10-15 | A kind of method and device based on gesture control sectional drawing |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105260022B (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105824537A (en) * | 2016-03-30 | 2016-08-03 | 努比亚技术有限公司 | Screenshot capturing method and screenshot capturing device |
CN106502567A (en) * | 2016-10-31 | 2017-03-15 | 维沃移动通信有限公司 | A kind of screenshot method and mobile terminal |
CN109814787A (en) * | 2019-01-29 | 2019-05-28 | 广州视源电子科技股份有限公司 | Key message determines method, apparatus, equipment and storage medium |
CN109891910A (en) * | 2016-08-01 | 2019-06-14 | 高通股份有限公司 | Device control based on audio |
CN110286830A (en) * | 2019-06-28 | 2019-09-27 | Oppo广东移动通信有限公司 | Screenshotss method, apparatus, storage medium and electronic equipment |
CN112882563A (en) * | 2019-11-29 | 2021-06-01 | 中强光电股份有限公司 | Touch projection system and method thereof |
CN114911397A (en) * | 2022-05-18 | 2022-08-16 | 北京五八信息技术有限公司 | Data processing method and device, electronic equipment and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102317899A (en) * | 2011-08-02 | 2012-01-11 | 华为终端有限公司 | Method, device and terminal equipment for message generation and processing |
US20120176401A1 (en) * | 2011-01-11 | 2012-07-12 | Apple Inc. | Gesture Mapping for Image Filter Input Parameters |
CN103037102A (en) * | 2012-12-21 | 2013-04-10 | 广东欧珀移动通信有限公司 | Free screen shot method of touch screen cellphone and cellphone |
CN104978133A (en) * | 2014-04-04 | 2015-10-14 | 阿里巴巴集团控股有限公司 | Screen capturing method and screen capturing device for intelligent terminal |
2015-10-15: Application CN201510670573.XA filed in China; granted as patent CN105260022B; current status: not active (Expired - Fee Related)
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120176401A1 (en) * | 2011-01-11 | 2012-07-12 | Apple Inc. | Gesture Mapping for Image Filter Input Parameters |
CN102317899A (en) * | 2011-08-02 | 2012-01-11 | 华为终端有限公司 | Method, device and terminal equipment for message generation and processing |
CN103037102A (en) * | 2012-12-21 | 2013-04-10 | 广东欧珀移动通信有限公司 | Free screen shot method of touch screen cellphone and cellphone |
CN104978133A (en) * | 2014-04-04 | 2015-10-14 | 阿里巴巴集团控股有限公司 | Screen capturing method and screen capturing device for intelligent terminal |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105824537A (en) * | 2016-03-30 | 2016-08-03 | 努比亚技术有限公司 | Screenshot capturing method and screenshot capturing device |
WO2017166959A1 (en) * | 2016-03-30 | 2017-10-05 | 努比亚技术有限公司 | Screenshot method, apparatus, mobile terminal, and computer storage medium |
CN105824537B (en) * | 2016-03-30 | 2019-03-01 | 努比亚技术有限公司 | Screenshot method and device |
CN109891910A (en) * | 2016-08-01 | 2019-06-14 | 高通股份有限公司 | Device control based on audio |
CN109891910B (en) * | 2016-08-01 | 2020-06-16 | 高通股份有限公司 | Audio-based device control |
US11184699B2 (en) | 2016-08-01 | 2021-11-23 | Qualcomm Incorporated | Audio-based device control |
CN106502567A (en) * | 2016-10-31 | 2017-03-15 | 维沃移动通信有限公司 | A kind of screenshot method and mobile terminal |
CN109814787A (en) * | 2019-01-29 | 2019-05-28 | 广州视源电子科技股份有限公司 | Key message determines method, apparatus, equipment and storage medium |
CN110286830A (en) * | 2019-06-28 | 2019-09-27 | Oppo广东移动通信有限公司 | Screenshotss method, apparatus, storage medium and electronic equipment |
CN112882563A (en) * | 2019-11-29 | 2021-06-01 | 中强光电股份有限公司 | Touch projection system and method thereof |
CN114911397A (en) * | 2022-05-18 | 2022-08-16 | 北京五八信息技术有限公司 | Data processing method and device, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN105260022B (en) | 2018-03-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105260022A (en) | Gesture-based screen capture control method and apparatus | |
US9069386B2 (en) | Gesture recognition device, method, program, and computer-readable medium upon which program is stored | |
US10101874B2 (en) | Apparatus and method for controlling user interface to select object within image and image input device | |
US9733752B2 (en) | Mobile terminal and control method thereof | |
US10055064B2 (en) | Controlling multiple devices with a wearable input device | |
US9104239B2 (en) | Display device and method for controlling gesture functions using different depth ranges | |
US20110191707A1 (en) | User interface using hologram and method thereof | |
US20150029402A1 (en) | Remote controller, system, and method for controlling remote controller | |
KR20150130379A (en) | Extending interactive inputs via sensor fusion | |
CN102880405A (en) | Information processing apparatus, information processing method, program and remote control system | |
JP2014164755A (en) | Apparatus for recognizing proximity motion using sensors, and method using the same | |
US20190272040A1 (en) | Manipulation determination apparatus, manipulation determination method, and, program | |
US10282087B2 (en) | Multi-touch based drawing input method and apparatus | |
US10372223B2 (en) | Method for providing user commands to an electronic processor and related processor program and electronic circuit | |
CN105260024A (en) | Method and apparatus for stimulating gesture motion trajectory on screen | |
US20140282283A1 (en) | Semantic Gesture Processing Device and Method Providing Novel User Interface Experience | |
CN110114749A (en) | Panel input device, touch gestures decision maker, touch gestures determination method and touch gestures decision procedure | |
CN105138136A (en) | Hand gesture recognition device, hand gesture recognition method and hand gesture recognition system | |
CN105306819A (en) | Gesture-based photographing control method and device | |
US9678608B2 (en) | Apparatus and method for controlling an interface based on bending | |
CN105320398A (en) | Method of controlling display device and remote controller thereof | |
KR101233793B1 (en) | Virtual mouse driving method using hand motion recognition | |
WO2014084634A1 (en) | Mouse apparatus for eye-glass type display device, and method for driving same | |
JP6075255B2 (en) | Input device, operation identification method | |
KR101558284B1 (en) | Touch sensing device utilizing assist device and method for operating the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CP01 | Change in the name or title of a patent holder |
Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860. Patentee after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd. Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860. Patentee before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd. |
|
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20180323 |
|