CN107390922B - Virtual touch method, device, storage medium and terminal - Google Patents


Info

Publication number
CN107390922B
CN107390922B (application CN201710526167.5A)
Authority
CN
China
Prior art keywords
screen
virtual
virtual touch
touch
distance
Prior art date
Legal status (assumed by Google; not a legal conclusion)
Expired - Fee Related
Application number
CN201710526167.5A
Other languages
Chinese (zh)
Other versions
CN107390922A (en)
Inventor
梁昆 (Liang Kun)
Current Assignee (listed by Google; may be inaccurate)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (assumed by Google; not a legal conclusion)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201710526167.5A priority Critical patent/CN107390922B/en
Publication of CN107390922A publication Critical patent/CN107390922A/en
Application granted granted Critical
Publication of CN107390922B publication Critical patent/CN107390922B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface, and also measuring the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup

Abstract

The invention provides a virtual touch method, a virtual touch device, a storage medium and a terminal. The virtual touch method comprises the following steps: acquiring a first distance from a finger to a screen when a trigger signal is detected; establishing a virtual touch screen at a second distance from the screen, wherein each position on the virtual touch screen corresponds one-to-one, through a first mapping relation, to a position on the screen, and the second distance is smaller than the first distance; when the finger is detected to move to a first position on the virtual touch screen, acquiring a second position, located on the screen, that corresponds to the first position according to the first mapping relation; and generating a touch operation instruction according to the second position. The invention facilitates the user's control of the terminal.

Description

Virtual touch method, device, storage medium and terminal
Technical Field
The present invention relates to the field of communications, and in particular, to a virtual touch method, an apparatus, a storage medium, and a terminal.
Background
With the development of terminal technology, mobile terminals have evolved from simple telephony devices into platforms for running general-purpose software. Such a platform is no longer aimed mainly at call management; it provides an operating environment for a wide range of applications, such as call management, games and entertainment, office work, and mobile payment.
Most mobile terminals use a touch screen: touching the screen generates a touch operation instruction that operates the terminal. However, when the user's hands are wet or dirty, touch operation is inconvenient; even if a touch is forced on the screen, the operation may fail or be sensed incorrectly, or the screen may be soiled.
Disclosure of Invention
The embodiment of the invention provides a virtual touch method, a virtual touch device, a storage medium and a terminal, which make it easier for a user to control the terminal when a finger cannot conveniently touch the screen.
The embodiment of the invention provides a virtual touch method, which is applied to a terminal, wherein the terminal comprises a screen, and the method comprises the following steps:
when a trigger signal is detected, acquiring a first distance from a finger to the screen;
establishing a virtual touch screen at a second distance from the screen, wherein each position on the virtual touch screen corresponds to each position on the screen in a one-to-one manner through a first mapping relation, and the second distance is smaller than the first distance;
when the finger is detected to move to a first position on the virtual touch screen, acquiring a second position corresponding to the first position according to the first mapping relation, wherein the second position is located on the screen;
and generating a touch operation instruction according to the second position.
The embodiment of the invention provides a virtual touch device, which is applied to a terminal, wherein the terminal comprises a screen, and the device comprises:
the first acquisition module is used for acquiring a first distance between a finger and the screen when the trigger signal is detected;
the first establishing module is used for establishing a virtual touch screen at a second distance from the screen, wherein each position on the virtual touch screen corresponds to each position on the screen in a one-to-one mode through a first mapping relation, and the second distance is smaller than the first distance;
a second obtaining module, configured to, when it is detected that the finger moves to a first position on the virtual touch screen, obtain a second position corresponding to the first position according to the first mapping relationship, where the second position is located on the screen;
and the first generating module is used for generating a touch operation instruction according to the second position.
The embodiment of the invention provides a storage medium storing a plurality of instructions, wherein the instructions are loaded by a processor to execute any one of the methods described above.
An embodiment of the present invention provides a terminal, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to implement the method described in any one of the above.
Drawings
Fig. 1 is a schematic view of a first scenario of a virtual touch method according to an embodiment of the present invention.
Fig. 2 is a first flowchart of a virtual touch method according to an embodiment of the invention.
Fig. 3 is a schematic view of a second scenario of the virtual touch method according to an embodiment of the invention.
Fig. 4 is a second flowchart of a virtual touch method according to an embodiment of the invention.
Fig. 5 is a first structural diagram of a virtual touch device according to an embodiment of the invention.
Fig. 6 is a second structural diagram of a virtual touch device according to an embodiment of the invention.
Fig. 7 is a third structural diagram of a virtual touch device according to an embodiment of the invention.
Fig. 8 is a first structural diagram of a terminal according to an embodiment of the present invention.
Fig. 9 is a second structural diagram of a terminal according to an embodiment of the invention.
Detailed Description
Referring to the drawings, wherein like reference numbers refer to like elements, the principles of the present invention are illustrated as being implemented in a suitable computing environment. The following description is based on illustrated embodiments of the invention and should not be taken as limiting the invention with regard to other embodiments that are not detailed herein.
In the description that follows, embodiments of the invention are described with reference to steps and symbols of operations performed by one or more computers, unless otherwise indicated. It will thus be appreciated that those steps and operations, which are at times referred to herein as computer-executed, include manipulation by a computer processing unit of electronic signals representing data in a structured form. This manipulation transforms the data or maintains it at locations in the computer's memory system, which may reconfigure or otherwise alter the computer's operation in a manner well known to those skilled in the art. The data is maintained in a data structure, a physical location in memory with particular characteristics defined by the data format. However, while the principles of the invention are described in the specific language above, they are not intended to be limited to the specific details shown, since one skilled in the art will recognize that the various steps and operations described below may also be implemented in hardware.
Referring to fig. 1, fig. 1 is a scene schematic diagram of a virtual touch method and a virtual touch device in an embodiment of the invention. The virtual touch method and device are mainly applied to terminals such as mobile phones and iPads. In this scenario, a virtual touch screen 20 is established above the screen 11 of the terminal 10; the virtual touch screen 20 is parallel to and directly faces the screen 11. When the finger of the user moves to the first position 21 on the virtual touch screen 20, the terminal 10 generates a touch operation instruction acting at the second position 12 on the screen 11, wherein the first position 21 corresponds to the second position 12.
Referring to fig. 2, fig. 2 is a flowchart of a virtual touch method in an embodiment of the present invention, in which the virtual touch method includes the following steps:
s101, when the trigger signal is detected, acquiring a first distance between the finger and the screen.
In this step, when the user's finger carries oil, water, or dirt that makes it inconvenient to contact the screen, the user shakes the mobile terminal to generate a trigger signal and thereby start the virtual touch function. Of course, the virtual touch function can also be started through voice control. After the virtual touch function is started, the finger moves to a position a certain distance directly in front of the screen, and the mobile terminal detects the first distance from the finger to the screen using a distance sensor or a depth camera.
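A minimal sketch of this step, with `read_depth_mm` standing in for the distance sensor or depth camera; the event names and the sensor callback are illustrative assumptions, not part of the patent:

```python
# Hypothetical triggers for the virtual touch function (shake or voice).
TRIGGER_EVENTS = {"shake", "voice_command"}

def on_event(event, read_depth_mm):
    """Return the first distance (finger to screen, in mm) when a trigger
    signal is detected; return None for any non-trigger event."""
    if event not in TRIGGER_EVENTS:
        return None  # no trigger signal: stay in ordinary touch mode
    return read_depth_mm()  # depth camera / distance sensor reading

# usage: a fake sensor that reports the finger 30 mm in front of the screen
first_distance = on_event("shake", lambda: 30.0)
```

The callback keeps the sketch independent of any concrete sensor API.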
S102, establishing a virtual touch screen at a second distance from the screen, wherein each position on the virtual touch screen corresponds to each position on the screen in a one-to-one mode through a first mapping relation, and the second distance is smaller than the first distance.
In this step, the virtual touch screen is a planar area that has the same size as the screen and directly faces it. The virtual touch screen has a plurality of coordinate points, and so does the screen; each coordinate point on the virtual touch screen corresponds to a coordinate point on the screen through the mapping relation. In this embodiment, any position on the virtual touch screen corresponds to its vertical projection onto the screen. Image information of the finger is acquired through the camera, and a coordinate point is set at each position of the established virtual touch screen.
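Because the virtual plane is congruent with the screen and directly faces it, the vertical-projection mapping reduces to the identity on (x, y); a sketch under that assumption (the screen dimensions are illustrative):

```python
class VirtualTouchScreen:
    """A plane congruent with the screen, held `distance_mm` in front of it.

    Under the vertical-projection mapping described above, a point (x, y)
    on the virtual plane corresponds to the same (x, y) on the screen.
    """

    def __init__(self, width, height, distance_mm):
        self.width, self.height = width, height
        self.distance_mm = distance_mm  # the second distance

    def map_to_screen(self, x, y):
        """First mapping relation: identity on (x, y) within bounds."""
        if not (0 <= x <= self.width and 0 <= y <= self.height):
            raise ValueError("position lies outside the virtual touch screen")
        return (x, y)

# a 1080 x 1920 screen with the virtual plane 29 mm in front of it
vts = VirtualTouchScreen(1080, 1920, distance_mm=29.0)
```

A non-congruent plane would instead need a scale factor per axis; the patent's one-to-one correspondence makes the identity sufficient here.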
S103, when the finger is detected to be located at the first position on the virtual touch screen, obtaining a second position corresponding to the first position according to the first mapping relation, wherein the second position is located on the screen.
In this step, the first position of the finger on the virtual touch screen is obtained through the camera, that is, the coordinate points of all positions in the area covered by the finger are obtained. The corresponding second position on the screen is then acquired according to the first mapping relation. In this embodiment, the first mapping relation means that a position on the virtual touch screen and its corresponding position on the screen directly face each other.
And S104, generating a touch operation instruction according to the second position.
In this step, the touch operation instruction is associated with the second position. For example, when a QQ shortcut icon is located at the second position, the touch operation instruction corresponds to starting the QQ application.
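The association can be pictured as a hit test of the mapped screen position against whatever icons the screen currently shows; the icon layout below is a hypothetical example, not from the patent:

```python
def generate_touch_instruction(second_position, icon_layout):
    """Hit-test the mapped screen position against on-screen icons.

    `icon_layout` maps an app name to its icon rectangle (x0, y0, x1, y1).
    """
    x, y = second_position
    for app, (x0, y0, x1, y1) in icon_layout.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return ("launch", app)  # e.g. a QQ icon here would start QQ
    return ("tap", second_position)  # nothing at this position: a plain tap

layout = {"QQ": (0, 0, 100, 100)}
```

A real terminal would dispatch through its windowing system rather than a dictionary; the sketch only shows the position-to-instruction association.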
In view of the above, in the embodiment of the present invention, the first distance from the finger to the screen is obtained when the trigger signal is detected; a virtual touch screen is established at a second distance from the screen, wherein each position on the virtual touch screen corresponds one-to-one, through a first mapping relation, to a position on the screen, and the second distance is smaller than the first distance; when the finger is detected to move to a first position on the virtual touch screen, a second position, located on the screen, corresponding to the first position is acquired according to the first mapping relation; and a touch operation instruction is generated according to the second position, thereby completing the touch operation on the screen.
Referring to fig. 3 and fig. 4, fig. 3 is a scene diagram of a virtual touch method in an embodiment of the invention. Fig. 4 is a flowchart of a virtual touch method according to an embodiment of the invention. In this embodiment, the virtual touch method includes the following steps:
s201, when the trigger signal is detected, acquiring a first distance between the finger and the screen.
In this step, when the user's finger carries oil, water, or dirt that makes it inconvenient to contact the screen, the user shakes the mobile terminal to generate a trigger signal and thereby start the virtual touch function. Of course, the terminal may also start the virtual touch function through voice control. After the virtual touch function is started, the finger moves to a position a certain distance directly in front of the screen, and the terminal detects the first distance from the finger to the screen using a distance sensor or a depth camera.
Before the trigger signal is generated, the terminal may learn the scenarios in which the virtual touch function should be started. For example, when the user is detected making a predetermined gesture or emitting a predetermined sound, that gesture or sound is recorded, stored, and set as a trigger of the virtual touch function. Thereafter, while the terminal is in use, detecting the predetermined gesture or sound generates a trigger signal and starts the virtual touch function.
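The learning step above can be reduced to recording patterns and matching later observations against them; this is a deliberately simplified stand-in (real detection would involve gesture or speech recognition, and the gesture name is invented):

```python
class TriggerRegistry:
    """Record a user's predetermined gesture or sound as a trigger signal."""

    def __init__(self):
        self._patterns = set()

    def learn(self, pattern):
        self._patterns.add(pattern)  # store the predetermined gesture/sound

    def is_trigger(self, observed):
        # a later observation matching a learned pattern fires the signal
        return observed in self._patterns

registry = TriggerRegistry()
registry.learn("double_wave")  # a hypothetical learned gesture
```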
S202, establishing a virtual touch screen at a second distance from the screen, wherein each position on the virtual touch screen corresponds to each position on the screen in a one-to-one mode through a first mapping relation, and the second distance is smaller than the first distance.
In this step, the virtual touch screen 20 is a planar area with the same size as the screen 11, directly facing it. The virtual touch screen 20 has a plurality of coordinate points, and so does the screen 11; each coordinate point on the virtual touch screen 20 corresponds to a coordinate point on the screen 11 through the first mapping relation. In this embodiment, any position on the virtual touch screen 20 corresponds to its vertical projection on the screen 11. Image information of the finger is acquired through the camera, and a coordinate point is set at each position of the established virtual touch screen 20.
S203, establishing a virtual positioning screen at a first distance from the screen, wherein the virtual positioning screen is parallel to and opposite to the screen, and each position on the virtual positioning screen corresponds to each position on the screen in a one-to-one manner through a second mapping relation.
In the present invention, the virtual touch screen 20 is located between the virtual positioning screen 30 and the screen 11; the virtual positioning screen 30, the virtual touch screen 20, and the screen 11 are sequentially parallel and directly face one another, and all three have the same shape and size. Each position on the virtual positioning screen 30 corresponds to a position on the screen 11 through the second mapping relation; in this embodiment, a position on the virtual positioning screen 30 corresponds to the directly opposite position on the virtual touch screen 20 and to the directly opposite position on the screen 11. The virtual positioning screen 30 is mainly used for locating and selecting the area the user intends to touch. In this embodiment, the second mapping relation means that a position on the virtual positioning screen and its corresponding position on the screen directly face each other.
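With two parallel planes in play, the terminal must decide from the measured finger depth which plane the finger occupies; a sketch, where the tolerance value is an assumption (the patent gives only the 1 mm gap, not a sensing tolerance):

```python
def classify_plane(finger_depth_mm, first_distance_mm, second_distance_mm,
                   tol_mm=0.4):
    """Decide which virtual plane the finger currently occupies.

    The positioning screen sits at the first distance and the touch
    screen at the (smaller) second distance.
    """
    if abs(finger_depth_mm - first_distance_mm) <= tol_mm:
        return "positioning"  # finger hovering on the positioning screen
    if abs(finger_depth_mm - second_distance_mm) <= tol_mm:
        return "touch"  # finger pushed forward onto the touch screen
    return None  # between or beyond the planes: no event

# e.g. first distance 30 mm, second distance 29 mm (the 1 mm gap)
```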
And S204, when the finger is detected to be positioned at the third position on the virtual positioning screen, acquiring a second position corresponding to the third position on the screen according to the second mapping relation.
In this step, a distance sensor or depth camera obtains the distance of the finger, and a camera captures image information of the finger, so as to detect that the finger is located at the third position on the virtual positioning screen.
When the user needs to touch a certain second position 12 on the screen 11, the user first moves the finger over the virtual positioning screen 30; when the finger reaches the third position 31 on the virtual positioning screen 30, the terminal gives a prompt on the screen, and the user then moves the finger vertically toward the screen. The specific operation is shown in step S205.
And S205, displaying a second prompt sign at a second position on the screen.
In this step, a second prompt sign is generated at the second position 12 on the screen when the user's finger moves to the third position 31. For example, the second prompt sign may be a light-colored cursor, although it is not limited thereto.
S206, when the finger is detected to move to the first position on the virtual touch screen, acquiring the second position corresponding to the first position on the screen according to the first mapping relation.
In this step, the first position of the finger on the virtual touch screen is obtained through the camera, that is, the coordinate points of all positions in the area covered by the finger are obtained. In this embodiment, the difference between the first distance and the second distance is small; for example, it may be set to 1 mm. Once the user has located a position on the virtual positioning screen 30, such as the third position 31, the user can reach the first position 21 on the virtual touch screen 20 by moving the finger from the third position 31 vertically 1 mm toward the screen 11. In practical applications, when the finger is detected moving to the first position on the virtual touch screen, a first prompt sign is also displayed at the second position on the screen 11; for example, the first prompt sign may be a dark-colored cursor that shows the user the position being touched.
And S207, generating a touch operation instruction according to the second position.
In this step, the touch operation instruction is associated with the second position. For example, when a QQ shortcut icon is located at the second position, the touch operation instruction corresponds to starting the QQ application.
In some embodiments, this step S207 includes:
and S2071, acquiring the stay time of the finger at the first position.
And S2072, if the staying time is longer than the preset time, generating a first touch operation instruction.
And S2073, if the staying time is less than the preset time, generating a second touch operation instruction.
S2074, after the first touch operation instruction is generated, if it is detected that the finger moves from the first position to the sixth position of the virtual touch screen, a third touch operation instruction is generated.
The preset time may be 1 second. The first touch operation instruction is a select-and-prepare-to-move instruction: after it is generated, if the finger moves on the virtual touch screen, the selected object moves with it. For example, when the finger moves from the first position to the sixth position, the selected object moves from the second position on the screen to a fifth position on the screen, where the fifth position corresponds to the sixth position. The second touch operation instruction is a selection instruction: if an icon or object exists at the second position corresponding to the first position, that icon or object is selected.
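The dwell-time branching of steps S2071 to S2074 can be sketched as a single decision function; the instruction tuples are illustrative names, not the patent's terminology:

```python
def dwell_instruction(dwell_s, moved_to=None, preset_s=1.0):
    """Map the dwell time at the first position to a touch instruction.

    Longer than the preset time selects and holds (first instruction); a
    subsequent move while held becomes the drag (third instruction);
    shorter than the preset time is a plain selection (second instruction).
    """
    if dwell_s > preset_s:
        if moved_to is not None:
            return ("move_selected_to", moved_to)  # third instruction
        return ("select_and_hold",)  # first instruction
    return ("select",)  # second instruction
```

In practice the drag would be detected as a later event after the hold, rather than passed in as an argument; folding it in keeps the sketch to one function.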
As can be seen from the above, in the embodiment of the present invention, when the trigger signal is detected, the first distance from the finger to the screen is obtained; a virtual touch screen is established at a second distance from the screen, wherein each position on the virtual touch screen corresponds one-to-one, through a first mapping relation, to a position on the screen, and the second distance is smaller than the first distance; when the finger is detected to move to a first position on the virtual touch screen, a second position, located on the screen, corresponding to the first position is acquired according to the first mapping relation; and a touch operation instruction is generated according to the second position. The touch operation on the screen is thus completed, making it easier for the user to control the terminal.
Referring to fig. 5, fig. 5 is a virtual touch device applied to a terminal, where the terminal includes a screen, and the device includes: a first obtaining module 301, a first establishing module 302, a second obtaining module 303, and a first generating module 304.
The first obtaining module 301 is configured to obtain a first distance from the finger to the screen when the trigger signal is detected.
The first establishing module 302 is configured to establish a virtual touch screen at a second distance from the screen, where each position on the virtual touch screen corresponds to each position on the screen in a one-to-one manner through a first mapping relationship, and the second distance is smaller than the first distance.
The second obtaining module 303 is configured to, when it is detected that the finger moves to a first position on the virtual touch screen, obtain a second position corresponding to the first position according to the first mapping relationship, where the second position is located on the screen.
The first generating module 304 is configured to generate a touch operation instruction according to the second position.
In some embodiments, referring to fig. 6, the first generating module 304 includes an obtaining unit 3041, a first generating unit 3042, a second generating unit 3043, and a third generating unit 3044.
The acquiring unit 3041 is configured to acquire a staying time of the finger at the first position.
The first generating unit 3042 is configured to generate a first touch operation instruction if the staying time is longer than a preset time.
The second generating unit 3043 is configured to generate a second touch operation instruction if the staying time is less than the preset time.
The third generating unit 3044 is configured to generate a third touch operation instruction when the staying time is longer than the preset time and the finger is detected moving from the first position to a sixth position on the virtual touch screen.
Referring to fig. 7, in some embodiments, the virtual touch device further includes: a first prompting module 305, a second generating module 306, a second prompting module 307, a second establishing module 308, a third obtaining module 309 and a third prompting module 310.
The first prompting module 305 is configured to display a first prompting mark at the second position on the screen.
The second generating module 306 is configured to generate a dragging operation instruction when it is detected that the finger slides from the second position to the third position on the virtual touch screen.
The second prompt module 307 is configured to generate a prompt message after the virtual touch screen is established.
The second establishing module 308 is configured to establish a virtual positioning screen at a first distance from the screen, where the virtual positioning screen, the virtual touch screen, and the screen are sequentially parallel and directly opposite to each other, and each position on the virtual positioning screen corresponds to each position on the screen in a one-to-one manner through a second mapping relationship;
the third obtaining module 309 is configured to, when it is detected that the finger is located at a third position on the virtual positioning screen, obtain, according to the second mapping relationship, a second position on the screen corresponding to the third position;
the third prompt module 310 is configured to display a second prompt sign at the second position on the screen.
Referring to fig. 8, fig. 8 is a structural diagram of a terminal according to an embodiment of the present invention. The terminal 400 comprises a processor 401, a memory 402, said processor 401 being adapted to perform the method in the above described embodiments by invoking a computer program in said memory 402. For example, it may perform the following steps: when a trigger signal is detected, acquiring a first distance from a finger to the screen; establishing a virtual touch screen at a second distance from the screen, wherein each position on the virtual touch screen corresponds to each position on the screen in a one-to-one manner through a first mapping relation, and the second distance is smaller than the first distance; when the finger is detected to move to a first position on the virtual touch screen, acquiring a second position corresponding to the first position according to the first mapping relation, wherein the second position is located on the screen; and generating a touch operation instruction according to the second position.
Referring to fig. 9, fig. 9 is another structural diagram of a terminal according to an embodiment of the present invention. The terminal 500 may be a mobile phone, a tablet computer, etc. The terminal 500 may include Radio Frequency (RF) circuitry 501, memory 502 including one or more computer-readable storage media, input unit 503, display unit 504, sensor 505, audio circuitry 506, Wireless Fidelity (WiFi) module 507, processor 508 including one or more processing cores, and power supply 509. Those skilled in the art will appreciate that the terminal structure shown in fig. 9 does not constitute a limitation of the terminal, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the RF circuit 501 may be used for receiving and transmitting signals during information transmission and reception or during a call; in particular, it receives downlink information from a base station and passes it to the one or more processors 508 for processing, and transmits uplink data to the base station.
The memory 502 may be used to store software programs and modules, and the processor 508 executes various functional applications and data processing by operating the software programs and modules stored in the memory 502.
The input unit 503 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, in one particular embodiment, the input unit 503 may include a touch-sensitive surface as well as other input devices.
The display unit 504 may be used to display information input by or provided to the user and the various graphical user interfaces of the terminal, which may be made up of graphics, text, icons, video, and any combination thereof. The display unit 504 may include a display panel, which may optionally be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. Further, the touch-sensitive surface may overlay the display panel; when a touch operation is detected on or near the touch-sensitive surface, it is communicated to the processor 508 to determine the type of touch event, and the processor 508 then provides a corresponding visual output on the display panel according to that type. Although in fig. 9 the touch-sensitive surface and the display panel are two separate components implementing input and output functions, in some embodiments they may be integrated to implement both.
The terminal may also include at least one sensor 505, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor, which may adjust the brightness of the display panel according to the brightness of ambient light, and a proximity sensor, which may turn off the display panel and/or the backlight when the terminal is moved to the ear. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally along three axes) and, when the mobile phone is stationary, the magnitude and direction of gravity; it can be used for applications that recognize the posture of the mobile phone (such as switching between horizontal and vertical screens, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as a pedometer or tap detection). Other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor may also be configured in the terminal and are not described in detail here.
Audio circuitry 506, a speaker, and a microphone may provide an audio interface between the user and the terminal. On one hand, the audio circuit 506 may transmit an electrical signal, converted from received audio data, to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which the audio circuit 506 receives and converts into audio data; the audio data is then output to the processor 508 for processing and afterwards transmitted, for example, to another terminal via the RF circuit 501, or output to the memory 502 for further processing. The audio circuit 506 may also include an earbud jack to allow peripheral headphones to communicate with the terminal.
WiFi is a short-range wireless transmission technology. Through the WiFi module 507, the terminal can help the user receive and send e-mail, browse web pages, access streaming media, and the like, providing wireless broadband Internet access. Although fig. 9 shows the WiFi module 507, it is understood that it is not an essential part of the terminal and may be omitted as needed without changing the essence of the invention.
The processor 508 is the control center of the terminal. It connects the various parts of the entire mobile phone using various interfaces and lines, and performs the various functions of the terminal and processes data by running or executing the software programs and/or modules stored in the memory 502 and calling the data stored in the memory 502, thereby monitoring the mobile phone as a whole.
The terminal also includes a power supply 509 (e.g., a battery) for powering the various components. Preferably, the power supply may be logically connected to the processor 508 via a power management system, which may be used to manage charging, discharging, and power consumption. The power supply 509 may also include one or more DC or AC power sources, a recharging system, power-failure detection circuitry, a power converter or inverter, power status indicators, and the like.
Although not shown, the terminal may further include a camera, a bluetooth module, and the like, which will not be described herein. Specifically, in this embodiment, the processor 508 in the terminal loads the executable file corresponding to the process of one or more application programs into the memory 502 according to the following instructions, and the processor 508 runs the application program stored in the memory 502, so as to implement the following functions: when a trigger signal is detected, acquiring a first distance from a finger to the screen; establishing a virtual touch screen at a second distance from the screen, wherein each position on the virtual touch screen corresponds to each position on the screen in a one-to-one manner through a first mapping relation, and the second distance is smaller than the first distance; when the finger is detected to move to a first position on the virtual touch screen, acquiring a second position corresponding to the first position according to the first mapping relation, wherein the second position is located on the screen; and generating a touch operation instruction according to the second position.
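As an illustration of the processor functions described above, the construction of the virtual planes and the position lookup might be sketched as follows. The perpendicular ("vertically opposite") mapping, under which the finger's (x, y) carries over unchanged to the screen, follows the description; the half-distance ratio for the second distance and all names in the sketch are assumptions, since the patent only requires the second distance to be smaller than the first.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class VirtualPlane:
    """A virtual plane held parallel to the screen at a given distance.

    Under a vertically opposite (perpendicular-projection) mapping,
    a finger at (x, y) on the plane corresponds to (x, y) on the
    screen; the depth coordinate is discarded.
    """
    distance_from_screen: float

    def map_to_screen(self, x: float, y: float) -> tuple[float, float]:
        # Perpendicular projection: the screen position directly
        # opposite the finger on this plane.
        return (x, y)


def build_planes(first_distance: float) -> tuple[VirtualPlane, VirtualPlane]:
    """Build the virtual positioning plane at the measured finger-to-screen
    distance (the first distance) and the virtual touch plane at a smaller
    second distance. The 1/2 ratio here is an assumed placeholder; the
    patent only requires second_distance < first_distance."""
    positioning = VirtualPlane(first_distance)
    touch = VirtualPlane(first_distance / 2)  # assumed ratio
    return positioning, touch
```

For example, with a finger measured 10 cm from the screen, the positioning plane sits at 10 cm, the touch plane at 5 cm, and a finger at (3, 4) on either plane selects screen position (3, 4).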
Various operations of embodiments are provided herein. In one embodiment, one or more of the operations may constitute computer-readable instructions stored on one or more computer-readable media, which, when executed by an electronic device, will cause the device to perform the operations. The order in which some or all of the operations are described should not be construed to imply that these operations are necessarily order-dependent; those skilled in the art will appreciate alternative orderings having the benefit of this description. Moreover, it should be understood that not all operations are necessarily present in each embodiment provided herein.
Also, as used herein, the word "preferred" is intended to serve as an example, instance, or illustration. Any aspect or design described herein as "preferred" is not necessarily to be construed as advantageous over other aspects or designs; rather, use of the word "preferred" is intended to present concepts in a concrete fashion. The term "or" as used in this application is intended to mean an inclusive "or" rather than an exclusive "or". That is, unless specified otherwise or clear from context, "X employs A or B" is intended to include any of the natural permutations: if X employs A, X employs B, or X employs both A and B, then "X employs A or B" is satisfied in any of the foregoing cases.
Also, although the disclosure has been shown and described with respect to one or more implementations, equivalent alterations and modifications will occur to others skilled in the art based upon a reading and understanding of this specification and the annexed drawings. The present disclosure includes all such modifications and alterations and is limited only by the scope of the appended claims. In particular regard to the various functions performed by the above-described components (e.g., elements, resources, etc.), the terms used to describe such components are intended to correspond, unless otherwise indicated, to any component that performs the specified function of the described component (e.g., that is functionally equivalent), even though not structurally equivalent to the disclosed structure that performs the function in the herein-illustrated exemplary implementations of the disclosure. In addition, while a particular feature of the disclosure may have been disclosed with respect to only one of several implementations, such a feature may be combined with one or more other features of the other implementations as may be desired and advantageous for a given or particular application. Furthermore, to the extent that the terms "includes," "has," "contains," or variants thereof are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term "comprising."
Each functional unit in the embodiments of the present invention may be integrated into one processing module, each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module can be implemented in hardware or as a software functional module. If implemented as a software functional module and sold or used as a stand-alone product, the integrated module may also be stored in a computer-readable storage medium. The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, etc. Each apparatus or system described above may perform the method in the corresponding method embodiment.
In summary, although the present invention has been described with reference to the preferred embodiments, the above-described preferred embodiments are not intended to limit the present invention. Those skilled in the art can make various changes and modifications without departing from the spirit and scope of the present invention; therefore, the scope of the present invention shall be determined by the appended claims.

Claims (8)

1. A virtual touch method is applied to a terminal, the terminal comprises a screen, and the method is characterized by comprising the following steps:
when receiving a shaking operation of a user on the terminal, generating a trigger signal; or learning in advance the scenarios in which a trigger signal needs to be generated, so as to obtain a predetermined gesture or predetermined sound for generating the trigger signal, and generating a trigger signal when the predetermined gesture or the predetermined sound is detected;
when a trigger signal is detected, acquiring a first distance from a finger to the screen;
establishing a virtual touch screen at a second distance from the screen, and establishing a virtual positioning screen at the first distance from the screen, wherein the virtual positioning screen, the virtual touch screen, and the screen are sequentially parallel and opposite to one another; each position on the virtual touch screen corresponds one-to-one to a position on the screen through a first mapping relation, and each position on the virtual positioning screen corresponds one-to-one to a position on the screen through a second mapping relation; the first mapping relation and the second mapping relation are vertically opposite relations, so that a position on the virtual positioning screen corresponds to the opposite position on the virtual touch screen; the virtual positioning screen is used for positioning and selecting an area needing touch; and the second distance is smaller than the first distance;
when the finger is detected to be located at a third position on the virtual positioning screen, acquiring a second position corresponding to the third position according to the second mapping relation;
displaying a second cue at the second location on the screen;
when the finger is detected to vertically move to a first position on the virtual touch screen from the third position towards the direction of the screen, acquiring a second position corresponding to the first position according to the first mapping relation, wherein the second position is located on the screen, and the first position corresponds to the third position;
acquiring a dwell time of the finger at the first position;
if the dwell time is greater than a preset time, generating a first touch operation instruction, wherein the first touch operation instruction is a selection instruction together with a pending-move instruction;
if it is detected that the finger moves from the first position to a sixth position on the virtual touch screen, generating a third touch operation instruction, wherein the third touch operation instruction is used for moving an object selected by the first touch operation instruction from the second position to a fifth position on the screen, and the fifth position corresponds to the sixth position;
and if the dwell time is less than the preset time, generating a second touch operation instruction, wherein the second touch operation instruction is a selection instruction.
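The dwell-time branching in claim 1 above can be sketched as a small classifier. The 0.5-second threshold and the returned instruction names are placeholders for this illustration; the claim leaves the "preset time" and instruction encoding unspecified.

```python
def classify_touch(dwell_seconds: float,
                   moved_to_new_position: bool,
                   threshold_seconds: float = 0.5) -> str:
    """Sketch of the dwell-time branching in claim 1.

    The 0.5 s threshold is an assumed placeholder for the claim's
    "preset time"; the patent does not specify a value.
    """
    if dwell_seconds > threshold_seconds:
        # First touch operation instruction: select and mark pending move;
        # a subsequent move on the virtual touch screen drags the object
        # to the corresponding screen position (third instruction).
        return "move-object" if moved_to_new_position else "select-and-await-move"
    # Second touch operation instruction: a plain selection.
    return "select"
```

So a long hover selects an object and arms a move, a long hover followed by lateral motion drags it, and a brief pass through the virtual touch screen is a plain tap-style selection.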
2. The virtual touch method according to claim 1, wherein after the step of obtaining the second location corresponding to the first location according to the first mapping relationship, the method further comprises:
displaying a first cue marker at the second location on the screen.
3. The virtual touch method of claim 1, wherein after the step of establishing a virtual touch screen at a second distance from the screen, further comprising:
and generating prompt information after the establishment of the virtual touch screen is completed.
4. A virtual touch device is applied to a terminal, the terminal comprises a screen, and the device is characterized by comprising:
a third generation module for generating a trigger signal when receiving a shaking operation of a user on the terminal, or for learning in advance the scenarios in which a trigger signal needs to be generated so as to obtain a predetermined gesture or predetermined sound for generating the trigger signal, and generating a trigger signal when the predetermined gesture or the predetermined sound is detected;
the first acquisition module is used for acquiring a first distance between a finger and the screen when the trigger signal is detected;
a first establishing module for establishing a virtual touch screen at a second distance from the screen and establishing a virtual positioning screen at the first distance from the screen, wherein the virtual positioning screen, the virtual touch screen, and the screen are sequentially parallel and opposite to one another; each position on the virtual touch screen corresponds one-to-one to a position on the screen through a first mapping relation, and each position on the virtual positioning screen corresponds one-to-one to a position on the screen through a second mapping relation; the first mapping relation and the second mapping relation are vertically opposite relations, so that a position on the virtual positioning screen corresponds to the opposite position on the virtual touch screen; the virtual positioning screen is used for positioning and selecting an area needing touch; and the second distance is smaller than the first distance;
a third obtaining module, configured to, when it is detected that the finger is located at a third position on the virtual positioning screen, obtain, according to the second mapping relationship, a second position corresponding to the third position;
a third prompt module for displaying a second prompt indicia at the second location on the screen;
a second obtaining module, configured to, when it is detected that the finger vertically moves from the third position to a first position on the virtual touch screen in a direction toward the screen, obtain a second position corresponding to the first position according to the first mapping relationship, where the second position is located on the screen, and the first position corresponds to the third position;
a first generation module, the first generation module comprising:
an acquisition unit for acquiring a dwell time of the finger at the first position;
a first generating unit for generating a first touch operation instruction if the dwell time is greater than a preset time, wherein the first touch operation instruction is a selection instruction together with a pending-move instruction;
a fourth generating unit for generating a third touch operation instruction if it is detected that the finger moves from the first position to a sixth position on the virtual touch screen, wherein the third touch operation instruction is used for moving an object selected by the first touch operation instruction from the second position to a fifth position on the screen, and the fifth position corresponds to the sixth position;
and a second generating unit for generating a second touch operation instruction if the dwell time is less than the preset time, wherein the second touch operation instruction is a selection instruction.
5. The virtual touch device of claim 4, further comprising:
a first prompt module for displaying a first prompt sign at the second position on the screen.
6. The virtual touch device of claim 4, further comprising:
and the second prompt module is used for generating prompt information after the virtual touch screen is established.
7. A storage medium storing a plurality of instructions, wherein the instructions are loaded by a processor and perform the method of any of claims 1-3.
8. A terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method according to any of claims 1-3 when executing the computer program.
CN201710526167.5A 2017-06-30 2017-06-30 Virtual touch method, device, storage medium and terminal Expired - Fee Related CN107390922B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710526167.5A CN107390922B (en) 2017-06-30 2017-06-30 Virtual touch method, device, storage medium and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710526167.5A CN107390922B (en) 2017-06-30 2017-06-30 Virtual touch method, device, storage medium and terminal

Publications (2)

Publication Number Publication Date
CN107390922A CN107390922A (en) 2017-11-24
CN107390922B true CN107390922B (en) 2020-11-13

Family

ID=60334627

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710526167.5A Expired - Fee Related CN107390922B (en) 2017-06-30 2017-06-30 Virtual touch method, device, storage medium and terminal

Country Status (1)

Country Link
CN (1) CN107390922B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109240571A (en) * 2018-07-11 2019-01-18 维沃移动通信有限公司 A kind of control device, terminal and control method
CN109358909A (en) * 2018-08-28 2019-02-19 努比亚技术有限公司 Show page control method, terminal and computer readable storage medium
CN110460713B (en) * 2018-11-21 2022-03-22 网易(杭州)网络有限公司 Terminal operation method and device, storage medium and electronic device
CN111045566B (en) * 2019-12-11 2022-02-08 上海传英信息技术有限公司 Stylus pen, terminal, control method thereof, and computer-readable storage medium
CN113569635B (en) * 2021-06-22 2024-07-16 深圳玩智商科技有限公司 Gesture recognition method and system
CN113821137B (en) * 2021-09-22 2024-07-16 携程计算机技术(上海)有限公司 Control display method, system, equipment and storage medium based on touch position
CN115268751A (en) * 2022-03-17 2022-11-01 绍兴埃瓦科技有限公司 Control method and device based on virtual display plane

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103823550A (en) * 2012-11-16 2014-05-28 广达电脑股份有限公司 Virtual touch method
CN104065949A (en) * 2014-06-26 2014-09-24 深圳奥比中光科技有限公司 Television virtual touch method and system
CN104731313A (en) * 2013-12-24 2015-06-24 施耐德电器工业公司 Command execution method and device employing single-point touch gestures
CN104978018A (en) * 2014-04-11 2015-10-14 广达电脑股份有限公司 Touch system and touch method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101436608B1 (en) * 2008-07-28 2014-09-01 삼성전자 주식회사 Mobile terminal having touch screen and method for displaying cursor thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103823550A (en) * 2012-11-16 2014-05-28 广达电脑股份有限公司 Virtual touch method
CN104731313A (en) * 2013-12-24 2015-06-24 施耐德电器工业公司 Command execution method and device employing single-point touch gestures
CN104978018A (en) * 2014-04-11 2015-10-14 广达电脑股份有限公司 Touch system and touch method
CN104065949A (en) * 2014-06-26 2014-09-24 深圳奥比中光科技有限公司 Television virtual touch method and system

Also Published As

Publication number Publication date
CN107390922A (en) 2017-11-24

Similar Documents

Publication Publication Date Title
CN107390922B (en) Virtual touch method, device, storage medium and terminal
CN110752980B (en) Message sending method and electronic equipment
CN107038112B (en) Application interface debugging method and device
AU2014200068B2 (en) Method and apparatus for providing mouse function using touch device
CN106406712B (en) Information display method and device
CN108549519B (en) Split screen processing method and device, storage medium and electronic equipment
EP2851779A1 (en) Method, device, storage medium and terminal for displaying a virtual keyboard
CN109871164B (en) Message sending method and terminal equipment
CN104915091B (en) A kind of method and apparatus for the prompt information that Shows Status Bar
CN109933252B (en) Icon moving method and terminal equipment
CN108513671B (en) Display method and terminal for 2D application in VR equipment
CN109085968B (en) Screen capturing method and terminal equipment
CN111026299A (en) Information sharing method and electronic equipment
CN110752981B (en) Information control method and electronic equipment
CN111163224B (en) Voice message playing method and electronic equipment
KR102004986B1 (en) Method and system for executing application, device and computer readable recording medium thereof
CN109614061A (en) Display methods and terminal
CN108681427B (en) Access right control method and terminal equipment
EP3962049A1 (en) Content input method and terminal device
WO2020181954A1 (en) Application program control method and terminal device
CN106371749A (en) Method and device for terminal control
CN107479799B (en) Method and device for displaying window
CN110058686B (en) Control method and terminal equipment
CN109067975B (en) Contact person information management method and terminal equipment
CN111090529A (en) Method for sharing information and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Changan town in Guangdong province Dongguan 523860 usha Beach Road No. 18

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

Address before: Changan town in Guangdong province Dongguan 523860 usha Beach Road No. 18

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20201113
