CN107918490B - Control method and electronic equipment - Google Patents

Control method and electronic equipment

Info

Publication number
CN107918490B
CN107918490B
Authority
CN
China
Prior art keywords
gesture
input interface
outline
gesture outline
content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711171956.8A
Other languages
Chinese (zh)
Other versions
CN107918490A (en)
Inventor
王海洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201711171956.8A priority Critical patent/CN107918490B/en
Publication of CN107918490A publication Critical patent/CN107918490A/en
Application granted granted Critical
Publication of CN107918490B publication Critical patent/CN107918490B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a control method and an electronic device. The electronic device includes an input device that projects light onto the desktop on which it is located to form an input interface. The method includes the following steps: obtaining the return light produced when the projected light encounters an operation body on the input interface; obtaining at least a partial gesture outline of the operation body based on the return light; and judging whether the at least partial gesture outline satisfies a preset operation condition, and if so, executing an operation instruction on the content on the input interface corresponding to the at least partial gesture outline. Because content is operated on by recognizing the gesture outline rather than by clicking a corresponding function icon, user operations are saved, operation complexity is reduced, and efficiency is improved.

Description

Control method and electronic equipment
Technical Field
The present disclosure relates to the field of device control technologies, and in particular, to a control method and an electronic device.
Background
With the development of technology, touch-based drawing devices such as digitizer tablets and projection tablets are used more and more widely. When a user makes an input error and needs to delete it, the user usually has to first click the eraser function icon before erasing the wrong content, which makes the operation complex and inefficient.
Disclosure of Invention
The purpose of the application is to provide a control method and an electronic device that solve the technical problems of high operation complexity and low working efficiency of input devices in the prior art.
The application provides a control method applied to an input device, where the input device projects light onto the desktop on which it is located to form an input interface. The method includes the following steps:
obtaining the return light produced when the projected light encounters an operation body on the input interface;
obtaining at least a partial gesture outline of the operation body based on the return light;
and judging whether the at least partial gesture outline satisfies a preset operation condition, and if so, executing an operation instruction on the content on the input interface corresponding to the at least partial gesture outline.
Preferably, in the above method, judging whether the at least partial gesture outline satisfies a preset operation condition includes:
comparing the at least partial gesture outline with a preset target gesture outline to obtain a similarity value between the two;
and judging whether the similarity value is greater than a preset first threshold.
Preferably, in the above method, executing an operation instruction on the content on the input interface corresponding to the at least partial gesture outline includes:
deleting the content corresponding to the at least partial gesture outline from the input interface.
Preferably, the determining whether the at least part of the gesture outline satisfies a preset operation condition includes:
identifying an operation gesture of the operation body based on the at least part of the gesture outline;
and judging whether the operation gesture meets a preset operation condition.
Preferably, in the above method, identifying an operation gesture of the operation body based on the at least partial gesture outline includes:
comparing the at least partial gesture outline one by one with a plurality of preset gesture outlines to obtain a similarity value between the at least partial gesture outline and each preset gesture outline;
and taking the operation gesture corresponding to the preset gesture outline with the highest similarity value as the operation gesture of the operation body.
Preferably, the method performs an operation instruction on the content corresponding to the at least part of the gesture outline on the input interface, and includes:
determining a target position on the input interface that is occluded by the at least partial gesture outline;
and executing an operation instruction on the content correspondingly displayed at the target position on the input interface.
Preferably, the method performs an operation instruction on the content corresponding to the at least part of the gesture outline on the input interface, and includes:
obtaining a movement track of the at least part of the gesture outline;
obtaining display content occluded by the at least part of the gesture outline on the input interface under the movement track;
and executing an operation instruction on the display content.
The application further provides an electronic device including an input device, where the input device projects light onto the desktop on which it is located to form an input interface. The electronic device further includes:
a processor configured to obtain the return light produced when the projected light encounters an operation body on the input interface, obtain at least a partial gesture outline of the operation body based on the return light, judge whether the at least partial gesture outline satisfies a preset operation condition, and if so, execute an operation instruction on the content on the input interface corresponding to the at least partial gesture outline.
Preferably, in the above electronic device, the input device is an ultra-short-focus infrared projection device.
According to the above scheme, the control method and the electronic device provided by the application obtain the gesture outline of the operation body from the return light produced when the projected light of the input device encounters the operation body on the input interface, and execute an operation instruction on the corresponding content on the input interface based on that gesture outline. Content can thus be operated on without clicking a corresponding function icon, which saves user operations, reduces operation complexity, and improves efficiency.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are obviously only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a flowchart of a control method according to an embodiment of the present application;
FIGS. 2 to 9 are diagrams illustrating examples of applications of embodiments of the present application;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, a flowchart of a control method provided in an embodiment of the present application is shown. The control method may be applied to an input device, such as an ultra-short-focus infrared projection device in an electronic device, which projects light onto a desktop to form an input interface, as shown in fig. 2. As shown in fig. 3, when the user's operation body performs an input operation on the input interface on the desktop, the input device determines the operation position by identifying the direction and distance of the light blocked by the operation body, thereby implementing content input.
In this embodiment, the control method may include the steps of:
step 101: and obtaining the return light after the projection light meets the operation body on the input interface.
As shown in fig. 4, the operation body performs various input operations on the input interface, and projected light forming the input interface hits the operation body and reflects light outward, in this embodiment, return light after the projected light meets the operation body is acquired.
Step 102: based on the return light, at least a partial gesture contour of the operation body is obtained.
As shown in fig. 4, after the return light is obtained, the direction and distance at which the projected light met the operation body can be determined from the time and angle at which the return light arrives, yielding at least a partial gesture outline of the operation body.
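As a concrete illustration of this step, the sketch below converts return-light samples into contour points on the desktop plane. The (round-trip time, angle) sample format and the time-of-flight distance conversion are assumptions made for illustration, not details taken from the patent.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second (vacuum value, close enough in air)

def contour_points_from_return_light(samples):
    """Convert return-light samples into 2D points on the input interface.

    Each sample is assumed to be a (round_trip_time_s, angle_rad) pair
    measured by the projector's infrared sensor; this sample format is an
    assumption for illustration, not something the patent specifies.
    """
    points = []
    for round_trip_time, angle in samples:
        # Distance to the occluding operation body: half the round trip.
        distance = SPEED_OF_LIGHT * round_trip_time / 2.0
        # Project the polar measurement onto the desktop plane,
        # with the projector at the origin.
        points.append((distance * math.cos(angle),
                       distance * math.sin(angle)))
    return points
```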
Step 103: and judging whether the at least part of the gesture outline meets a preset operation condition, and if so, executing step 104.
The operating condition here may be that at least part of the gesture outline obtained in this embodiment is consistent with or similar to a specific gesture outline; or the gesture area corresponding to at least part of the gesture outline reaches a certain threshold value.
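For the area-based condition, the gesture area can be estimated directly from the contour points. A minimal sketch follows, assuming the outline is available as an ordered list of (x, y) points; it uses the standard shoelace formula.

```python
def contour_area(points):
    """Area enclosed by an ordered list of (x, y) points (shoelace formula)."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def area_condition_met(points, area_threshold):
    # The condition holds once the gesture area reaches the preset threshold
    # (e.g. large enough to be a palm rather than a fingertip).
    return contour_area(points) >= area_threshold
```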
Step 104: and executing an operation instruction on the content corresponding to at least part of the gesture outline on the input interface.
The operation instruction may be an editing operation instruction for modifying, deleting, adding, adjusting a display effect, and the like of the content.
The content corresponding to the at least part of the gesture outline on the input interface can be understood as: the content projected by the occluded projected rays is shown in fig. 5. In the embodiment, the gesture outline of the operation body is recognized, and the corresponding editing function icon is not required to be clicked, so that the content is edited.
According to the above scheme, the control method obtains the gesture outline of the operation body from the return light produced when the projected light of the input device encounters the operation body on the input interface, and executes an operation instruction on the corresponding content on the input interface based on that gesture outline. Content can thus be operated on without clicking a corresponding function icon, which saves user operations, reduces operation complexity, and improves efficiency.
In one implementation manner, when determining whether the at least part of the gesture outline meets a preset operation condition, the following may be implemented:
Firstly, the at least partial gesture outline is compared with a preset target gesture outline to obtain a similarity value between the two, such as 50% or 40%. It is then judged whether the similarity value is greater than or equal to a preset first threshold; if so, an operation instruction is executed on the content on the input interface corresponding to the at least partial gesture outline.
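A minimal sketch of such a similarity check, assuming both outlines have been rasterized as boolean masks on a common grid and using intersection-over-union as the similarity value (the patent does not name a particular similarity measure):

```python
import numpy as np

def similarity(mask_a, mask_b):
    """Similarity value of two gesture outlines given as boolean occupancy
    masks on a common grid: intersection over union, in [0, 1]."""
    union = np.logical_or(mask_a, mask_b).sum()
    if union == 0:
        return 0.0
    return float(np.logical_and(mask_a, mask_b).sum()) / float(union)

def similarity_condition_met(partial_outline, target_outline, first_threshold=0.4):
    # Per the text above, the operation instruction fires when the similarity
    # value reaches the preset first threshold (0.4 mirrors the 40% example).
    return similarity(partial_outline, target_outline) >= first_threshold
```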
Alternatively, in this embodiment, judging whether the at least partial gesture outline satisfies the preset operation condition may be implemented as follows:
Firstly, an operation gesture of the operation body, such as a palm operation or a single-finger operation, is recognized based on the at least partial gesture outline, as shown in fig. 6. It is then judged whether the operation gesture satisfies a preset operation condition, for example whether it is consistent with a preset target gesture; if so, an operation instruction is executed on the content on the input interface corresponding to the at least partial gesture outline.
Furthermore, when recognizing the operation gesture of the operation body, the at least partial gesture outline may be compared one by one with a plurality of preset gesture outlines, such as the outline of a palm, of a single-finger operation, or of a double-finger operation, to obtain a similarity value between the at least partial gesture outline and each preset outline; the operation gesture corresponding to the preset outline with the highest similarity value is then taken as the operation gesture of the operation body. Finally, it is judged whether that operation gesture satisfies the preset operation condition; if it is consistent with the preset target gesture, an operation instruction is executed on the content on the input interface corresponding to the at least partial gesture outline.
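A sketch of this one-by-one comparison, reusing the similarity function from the previous sketch; the gesture names and the template-mask representation are illustrative assumptions:

```python
def classify_gesture(partial_outline, preset_outlines):
    """Compare the outline one by one with every preset gesture outline and
    return the operation gesture with the highest similarity value.

    `preset_outlines` maps gesture names ("palm", "single_finger",
    "double_finger", ...) to boolean template masks; the names and the
    mask representation are assumed for illustration.
    """
    best_gesture, best_score = None, -1.0
    for gesture, template in preset_outlines.items():
        score = similarity(partial_outline, template)  # from the sketch above
        if score > best_score:
            best_gesture, best_score = gesture, score
    return best_gesture, best_score
```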
In one implementation, before an operation instruction can be executed on the content on the input interface corresponding to the at least partial gesture outline, the content locked by the at least partial gesture outline on the input interface must first be determined. Specifically, this may be done in the following way:
Firstly, a target position occluded by the at least partial gesture outline on the input interface is determined; as shown in fig. 7, the target position X on the input interface is occluded by the at least partial gesture outline A. An operation instruction, such as deleting characters or lines, is then executed on the content correspondingly displayed at that target position.
It should be noted that the region in which the operation body blocks the projected light may be larger than the region of the input interface directly facing the operation body, as shown by region B in fig. 7; the content displayed at the target position is that in the directly facing region C of the operation body on the input interface.
For example, after determining the target gesture outline or operation gesture matching the at least partial gesture outline, this embodiment maps it onto the gesture of the operation body on the input interface, as shown in fig. 8, determines the target position occluded by the operation body, that is, the region of the input interface directly facing the operation body, and thereby determines the content displayed at that target position.
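A sketch of this mapping from the target region to the displayed content, assuming display content is tracked as items with axis-aligned bounding boxes (a simplified model, not the patent's representation):

```python
from dataclasses import dataclass

@dataclass
class ContentItem:
    """A displayed item (stroke, character, image, ...) with an axis-aligned
    bounding box on the input interface; a simplified, assumed model."""
    item_id: int
    x0: float
    y0: float
    x1: float
    y1: float

def items_at_target_position(items, region):
    """Return the content items whose bounding boxes intersect the target
    region (x0, y0, x1, y1) directly facing the operation body."""
    rx0, ry0, rx1, ry1 = region
    return [it for it in items
            if it.x0 < rx1 and it.x1 > rx0 and it.y0 < ry1 and it.y1 > ry0]
```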
Alternatively, in another implementation, executing an operation instruction on the content on the input interface corresponding to the at least partial gesture outline may be done in the following way:
Firstly, a movement track of the at least partial gesture outline is obtained, such as track X in fig. 9. Then, the display content occluded on the input interface by the at least partial gesture outline along the movement track is obtained, such as content Y in fig. 9. Finally, an operation instruction is executed on that display content, such as deleting it.
In this embodiment, the movement track of the operation body may be determined from the return light of the projected light blocked by the operation body, and the display content occluded on the input interface along the movement track may be determined from the target gesture outline or operation gesture corresponding to the at least partial gesture outline. Finally, an operation instruction, such as deletion, is executed on that display content.
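A sketch of the track-based erase, reusing items_at_target_position from the previous sketch; the sampled track_regions sequence and the delete callback are assumed interfaces:

```python
def erase_along_track(items, track_regions, delete):
    """Delete every content item occluded at some point along the movement
    track. `track_regions` is the sequence of occluded regions sampled while
    the gesture moves; `delete` is a callback that removes an item from the
    input interface. Both interfaces are assumptions for illustration.
    """
    erased_ids = set()
    for region in track_regions:
        for item in items_at_target_position(items, region):  # previous sketch
            if item.item_id not in erased_ids:
                erased_ids.add(item.item_id)
                delete(item)
    return erased_ids
```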
Referring to fig. 10, a schematic structural diagram of an electronic device provided in this embodiment of the present disclosure is shown. The electronic device may include an input device 1001, such as an ultra-short-focus infrared projection device, which projects light onto the desktop on which it is located to form an input interface. The electronic device further includes a processor 1002 configured to obtain the return light produced when the projected light encounters an operation body on the input interface, obtain at least a partial gesture outline of the operation body based on the return light, judge whether the at least partial gesture outline satisfies a preset operation condition, and if so, execute an operation instruction on the content on the input interface corresponding to the at least partial gesture outline.
To judge whether the at least partial gesture outline satisfies a preset operation condition, the processor 1002 may compare it with a preset target gesture outline to obtain a similarity value and then judge whether that value is greater than a preset first threshold. Alternatively, the processor 1002 may identify an operation gesture of the operation body based on the at least partial gesture outline, for example by comparing it one by one with preset gesture outlines to obtain similarity values and taking the operation gesture corresponding to the outline with the highest similarity value as the operation gesture of the operation body, and then judge whether that operation gesture satisfies the preset operation condition.
Correspondingly, when executing an operation instruction on the content on the input interface corresponding to the at least partial gesture outline, the processor 1002 may determine a target position occluded by the at least partial gesture outline and execute the instruction on the content correspondingly displayed at that position; or it may obtain a movement track of the at least partial gesture outline, obtain the display content occluded on the input interface along that track, and execute the instruction on that content, such as deleting it.
Take a projector as an example of the electronic device: it includes an ultra-short-focus infrared projection lens and a processor. The projection lens projects the input interface, that is, a projection screen, onto the desktop, and the user can perform various input operations on the projection screen by hand. For example, when the palm is held parallel to the screen, the projection lens identifies at least a partial outline of the palm through infrared light and recognizes the palm as an eraser; as the palm moves parallel to the screen, the content along its movement track is deleted. In this process, the user can directly erase wrong content, such as images or characters, on the projected desktop without clicking an eraser function icon, which saves user operations and improves working efficiency.
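Tying the earlier fragments together, the sketch below shows one sensing frame of this palm-eraser flow. The rasterize and bounding_box helpers, the 0.4 threshold, and the "palm" gesture name are illustrative assumptions carried over from the previous sketches; this is a sketch of the described behavior, not the patented implementation.

```python
def handle_frame(samples, preset_outlines, items, delete,
                 rasterize, bounding_box, first_threshold=0.4):
    """One sensing frame of the palm-eraser flow, combining the sketches above.

    `rasterize` (points -> boolean mask) and `bounding_box`
    (points -> (x0, y0, x1, y1)) are assumed helpers supplied by the caller.
    """
    points = contour_points_from_return_light(samples)           # steps 101-102
    outline = rasterize(points)
    gesture, score = classify_gesture(outline, preset_outlines)  # step 103
    if gesture == "palm" and score >= first_threshold:
        region = bounding_box(points)                            # step 104:
        for item in items_at_target_position(items, region):     # erase the
            delete(item)                                         # occluded content
```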
The specific implementation manner of the processor 1002 in this embodiment may refer to the technical solutions described or illustrated in fig. 1 to 9, and will not be described in detail here.
It should be noted that, in the present specification, the embodiments are all described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments may be referred to each other.
Finally, it should also be noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between them. Moreover, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to it. Without further limitation, an element introduced by the phrase "comprising a/an ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises it.
The control method and electronic device provided by the present application have been described in detail above. Specific examples are used herein to explain the principles and embodiments of the application, and the descriptions of the above embodiments are only meant to help in understanding the method and its core idea. Meanwhile, for those skilled in the art, the specific embodiments and the scope of application may vary according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (6)

1. A control method is applied to an input device, the input device projects projection rays onto a desktop on which the input device is located to form an input interface, and the method comprises the following steps:
obtaining return light of the projection light after encountering the operation body on the input interface;
obtaining at least a partial gesture outline of the operation body based on the return light;
judging whether the at least part of the gesture outline meets a preset operation condition, if so, identifying a corresponding function, and executing an operation instruction on the projection content shielded by the at least part of the gesture outline on the input interface;
the executing the operation instruction on the projection content blocked by the at least part of the gesture outline on the input interface comprises:
determining a target position blocked by the at least part of gesture outline on the input interface, and executing an operation instruction on content correspondingly displayed at the target position on the input interface;
or obtaining a movement track of the at least part of gesture outline, obtaining display content blocked by the at least part of gesture outline on the input interface under the movement track, and executing an operation instruction on the display content;
or deleting the projection content blocked by the at least part of the gesture outline on the input interface.
2. The method of claim 1, wherein determining whether the at least partial gesture outline satisfies a preset operation condition comprises:
comparing the at least partial gesture outline with a preset target gesture outline to obtain a similarity value of the at least partial gesture outline and the target gesture outline;
and judging whether the similarity value is larger than a preset first threshold value or not.
3. The method according to claim 1, wherein the determining whether the at least partial gesture outline satisfies a preset operation condition comprises:
identifying an operation gesture of the operation body based on the at least partial gesture outline;
and judging whether the operation gesture meets a preset operation condition.
4. The method according to claim 3, wherein identifying an operation gesture of the operation body based on the at least partial gesture outline includes:
comparing the at least part of gesture outline with preset gesture outlines one by one to obtain similarity values of the at least part of gesture outline and the gesture outlines;
and taking the operation gesture corresponding to the gesture outline with the highest similarity value as the operation gesture of the operation body.
5. An electronic device comprising an input device that projects projected light onto a desktop on which it is located to form an input interface, the electronic device further comprising:
the processor is used for obtaining return light after the projection light meets an operation body on the input interface, obtaining at least part of gesture outlines of the operation body based on the return light, judging whether the at least part of gesture outlines meet preset operation conditions, if so, identifying corresponding functions, and executing operation instructions on projection contents blocked by the at least part of gesture outlines on the input interface;
the executing the operation instruction on the projection content blocked by the at least part of the gesture outline on the input interface comprises:
determining a target position blocked by the at least part of gesture outline on the input interface, and executing an operation instruction on content correspondingly displayed at the target position on the input interface;
or obtaining a movement track of the at least part of gesture outline, obtaining display content blocked by the at least part of gesture outline on the input interface under the movement track, and executing an operation instruction on the display content;
or deleting the projection content blocked by the at least part of the gesture outline on the input interface.
6. The electronic device of claim 5, wherein the input device is: ultra-short-focus infrared projection equipment.
CN201711171956.8A 2017-11-22 2017-11-22 Control method and electronic equipment Active CN107918490B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711171956.8A CN107918490B (en) 2017-11-22 2017-11-22 Control method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711171956.8A CN107918490B (en) 2017-11-22 2017-11-22 Control method and electronic equipment

Publications (2)

Publication Number Publication Date
CN107918490A CN107918490A (en) 2018-04-17
CN107918490B true CN107918490B (en) 2022-05-31

Family

ID=61897617

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711171956.8A Active CN107918490B (en) 2017-11-22 2017-11-22 Control method and electronic equipment

Country Status (1)

Country Link
CN (1) CN107918490B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110275611B (en) * 2019-05-27 2023-02-17 联想(上海)信息技术有限公司 Parameter adjusting method and device and electronic equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104216642A (en) * 2014-08-21 2014-12-17 深圳市金立通信设备有限公司 Terminal control method
CN105389113A (en) * 2015-11-03 2016-03-09 小米科技有限责任公司 Gesture-based application control method and apparatus and terminal
CN105786393A (en) * 2016-03-31 2016-07-20 联想(北京)有限公司 Information processing method and electronic equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8682030B2 (en) * 2010-09-24 2014-03-25 Microsoft Corporation Interactive display
JP2015095210A (en) * 2013-11-14 2015-05-18 アルプス電気株式会社 Operation input device
CN103809756B (en) * 2014-02-24 2018-08-31 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN105208196A (en) * 2015-08-17 2015-12-30 美国掌赢信息科技有限公司 Call initiation method and electronic equipment
US9674931B1 (en) * 2016-01-08 2017-06-06 Osram Sylvania Inc. Techniques for gesture-based control of color-tunable lighting

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104216642A (en) * 2014-08-21 2014-12-17 深圳市金立通信设备有限公司 Terminal control method
CN105389113A (en) * 2015-11-03 2016-03-09 小米科技有限责任公司 Gesture-based application control method and apparatus and terminal
CN105786393A (en) * 2016-03-31 2016-07-20 联想(北京)有限公司 Information processing method and electronic equipment

Also Published As

Publication number Publication date
CN107918490A (en) 2018-04-17

Similar Documents

Publication Publication Date Title
US10001909B2 (en) Touch optimizations for range slider controls
US8427440B2 (en) Contact grouping and gesture recognition for surface computing
TWI397840B (en) A trajectory-based control method and apparatus thereof
US20090090567A1 (en) Gesture determination apparatus and method
US9047001B2 (en) Information processing apparatus, information processing method, and program
US20150363037A1 (en) Control method of touch panel
US8417026B2 (en) Gesture recognition methods and systems
CN102768595B (en) A kind of method and device identifying touch control operation instruction on touch-screen
CN105117056A (en) Method and equipment for operating touch screen
JP2014220720A (en) Electronic apparatus, information processing method, and program
CN109345558B (en) Image processing method, image processing apparatus, image processing medium, and electronic device
CN105892847A (en) Method and device for carrying out target program by multi-finger slide
CN110928449A (en) Touch screen point reporting method and device, electronic equipment and storage medium
CN107918490B (en) Control method and electronic equipment
JP6352801B2 (en) Information processing apparatus, information processing program, and information processing method
CN105843414A (en) Input correction method for input method and input method device
Sharma et al. Interactive projector screen with hand detection using gestures
CN104516559A (en) Multi-point touch method of touch input device
CN113031817B (en) Multi-touch gesture recognition method and false touch prevention method
CN111008080A (en) Information processing method, device, terminal equipment and storage medium
CN114996346A (en) Visual data stream processing method and device, electronic equipment and storage medium
US9799103B2 (en) Image processing method, non-transitory computer-readable storage medium and electrical device
CN112578961B (en) Application identifier display method and device
CN104536678A (en) Display effect regulating method and electronic device
CN104615342A (en) Information processing method and electronic equipment

Legal Events

Code  Title
PB01  Publication
SE01  Entry into force of request for substantive examination
GR01  Patent grant
TG01  Patent term adjustment