CN112905136A - Screen projection control method and device and storage medium

Screen projection control method and device and storage medium

Info

Publication number
CN112905136A
Authority
CN
China
Prior art keywords: touch, screen, touch screen, touch operation, equipment
Legal status: Pending
Application number
CN202110266998.XA
Other languages
Chinese (zh)
Inventor
黄海宁
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202110266998.XA
Publication of CN112905136A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454: Digital output to display device involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on GUI using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUI using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The present disclosure relates to a screen projection control method, device, and storage medium. The screen projection control method is applied to a first device and includes: sending a screen projection picture to a second device based on an acquired screen projection instruction; acquiring touch position information if a touch operation acting within a preset distance of the touch screen is detected; generating an indication icon based on the touch position information; and sending the indication icon to the second device, where the position of the indication icon on the screen projection picture displayed by the second device corresponds to the position at which the touch operation acts on the touch screen of the first device. By sending the indication icon to the second device, the user can operate the first device during screen projection while watching only the indication icon on the screen projection picture displayed by the second device, without shifting the line of sight back and forth between the first device and the second device, which improves the convenience of screen projection display.

Description

Screen projection control method and device and storage medium
Technical Field
The present disclosure relates to the field of display control technologies, and in particular, to a screen projection control method and apparatus, and a storage medium.
Background
With the development of electronic technology and the continuous improvement of living standards, intelligent terminals such as smart televisions and smartphones have become increasingly popular and are now indispensable tools in daily life. With the spread of screen projection technology on intelligent terminals, users whose device screens are small often prefer to project the display content of the intelligent terminal onto a larger-screen device for viewing, and in sharing scenarios such as conferences, projecting a screen to reach a wider audience has become more and more common. However, in current screen projection scenarios, the convenience of screen projection operation is limited.
Disclosure of Invention
The present disclosure provides a screen projection control method, a screen projection control device, and a storage medium.
According to a first aspect of the embodiments of the present disclosure, a screen projection control method applied to a first device is provided, including:
sending a screen projection picture to a second device based on an acquired screen projection instruction;
acquiring touch position information if a touch operation acting within a preset distance of the touch screen is detected;
generating an indication icon based on the touch position information; and
sending the indication icon to the second device, where the position of the indication icon on the screen projection picture displayed by the second device corresponds to the position at which the touch operation acts on the touch screen of the first device.
Optionally, acquiring the touch position information if a touch operation acting within a preset distance of the touch screen is detected includes:
when the touch screen is in a screen-off state, acquiring the touch position information if a touch operation acting within a preset distance of the touch screen is detected.
Optionally, the touch operation includes a hover touch operation that does not contact the touch screen and/or a contact touch operation that contacts the touch screen;
the method further includes:
enabling an air-gap touch function, where the air-gap touch function is capable of detecting a hover touch operation within a preset distance of the touch screen of the first device.
Optionally, acquiring the touch position information if a touch operation acting within a preset distance of the touch screen is detected includes:
based on the enabled air-gap touch function, acquiring the touch position information if a hover touch operation acting within a preset distance of the touch screen is detected.
Optionally, the touch position information of the hover touch operation at least includes touch coordinate information indicating a first position at which the hover touch operation is projected onto the touch screen; the first position is used to determine the coordinates of the indication icon on the screen projection picture.
Optionally, the method further includes:
determining a display transparency of the indication icon to be generated based on a second position of the hover touch operation relative to the touch screen, where the second position indicates the distance between the acting position of the hover touch operation and the touch screen, and the distance is positively correlated with the display transparency.
Optionally, the method further includes:
generating movement information of the indication icon based on touch operations detected at consecutive detection times; and
sending the movement information to the second device, where the movement information indicates a movement path of the indication icon displayed on the screen projection picture displayed by the second device.
According to a second aspect of the embodiments of the present disclosure, a screen projection control device applied to a first device is provided, including:
a first sending module, configured to send a screen projection picture to a second device based on an acquired screen projection instruction;
an acquiring module, configured to acquire touch position information if a touch operation acting within a preset distance of the touch screen is detected;
a generating module, configured to generate an indication icon based on the touch position information; and
a second sending module, configured to send the indication icon to the second device, where the position of the indication icon on the screen projection picture displayed by the second device corresponds to the position at which the touch operation acts on the touch screen of the first device.
Optionally, the acquiring module is further configured to:
when the touch screen is in a screen-off state, acquire the touch position information if a touch operation acting within a preset distance of the touch screen is detected.
Optionally, the touch operation includes a hover touch operation that does not contact the touch screen and/or a contact touch operation that contacts the touch screen;
the device further includes:
an enabling module, configured to enable an air-gap touch function, where the air-gap touch function is capable of detecting a hover touch operation within a preset distance of the touch screen of the first device.
Optionally, the acquiring module includes:
a position acquiring submodule, configured to acquire the touch position information if, based on the enabled air-gap touch function, a hover touch operation acting within a preset distance of the touch screen is detected.
Optionally, the touch position information of the hover touch operation at least includes touch coordinate information indicating a first position at which the hover touch operation is projected onto the touch screen; the first position is used to determine the coordinates of the indication icon on the screen projection picture.
Optionally, the device includes:
a transparency determining module, configured to determine a display transparency of the indication icon to be generated based on a second position of the hover touch operation relative to the touch screen, where the second position indicates the distance between the acting position of the hover touch operation and the touch screen, and the distance is positively correlated with the display transparency.
Optionally, the device further includes:
a movement information generating module, configured to generate movement information of the indication icon based on touch operations detected at consecutive detection times; and
a third sending module, configured to send the movement information to the second device, where the movement information indicates a movement path of the indication icon displayed on the screen projection picture displayed by the second device.
According to a third aspect of the embodiments of the present disclosure, a screen projection control apparatus is provided, including:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to implement the method of any one of the above first aspects when executing the executable instructions stored in the memory.
According to a fourth aspect of the embodiments of the present disclosure, a non-transitory computer-readable storage medium is provided, in which computer-executable instructions are stored; when executed by a processor, the instructions implement the steps of the method provided by any one of the above first aspects.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects:
According to the screen projection control method provided by the embodiments of the present disclosure, during screen projection the second device can receive and display both the screen projection picture and the indication icon. The indication icon is generated from the touch position information of the touch operation acting on the touch screen of the first device, so that after the second device receives the indication icon, the icon is displayed on its display interface at a position corresponding to the position at which the touch operation acts on the touch screen of the first device. Therefore, after the screen is projected, if a presentation needs to be given along with the content displayed on the second device, the presenter can directly determine, from the indication icon displayed on the display interface of the second device, where their touch acts on the first device. The presenter does not need to shift their line of sight back and forth between the first device and the second device during the presentation, nor lower their head repeatedly to confirm where their finger touches the first device; paying attention only to the indication icon on the second device is enough to complete the operation, which improves the convenience of presentation during screen projection.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a first flowchart illustrating a screen projection control method according to an exemplary embodiment.
Fig. 2 is a schematic diagram illustrating display content of a first device according to an exemplary embodiment.
Fig. 3 is a schematic diagram illustrating display content of a second device according to an exemplary embodiment.
Fig. 4 is a second flowchart illustrating a screen projection control method according to an exemplary embodiment.
Fig. 5 is a third flowchart illustrating a screen projection control method according to an exemplary embodiment.
Fig. 6 is a schematic diagram illustrating a configuration of a screen projection control apparatus according to an exemplary embodiment.
Fig. 7 is a block diagram illustrating a screen projection control apparatus according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
An embodiment of the present disclosure provides a screen projection control method. Fig. 1 is a first flowchart illustrating a screen projection control method according to an exemplary embodiment. As shown in fig. 1, the screen projection control method includes the following steps:
Step 101, sending a screen projection picture to a second device based on an acquired screen projection instruction;
Step 102, acquiring touch position information if a touch operation acting within a preset distance of the touch screen is detected;
Step 103, generating an indication icon based on the touch position information;
Step 104, sending the indication icon to the second device; the position of the indication icon on the screen projection picture displayed by the second device corresponds to the position at which the touch operation acts on the touch screen of the first device.
Illustratively, the first device to which the screen projection control method is applied may be any electronic device, such as a smartphone, a tablet computer, or a wearable electronic device. The second device that receives the screen projection picture may also be any electronic device with a display function, such as a smart television, a projection device, or a smartphone.
In a screen projection scenario, the devices involved include a screen projecting device, which provides the display content, and a projected device, which displays the content provided by the screen projecting device.
In the embodiments of the present disclosure, the first device is the screen projecting device and the second device is the projected device.
The first device is provided with a touch screen and can trigger corresponding functions based on touch operations acting on the touch screen. Moreover, in order to detect different types of touch operations and make the first device more convenient to use, the touch screen of the first device in the present disclosure can detect touch operations within a preset distance of the touch screen. That is, the touch operation in the present disclosure may include a hover touch operation that does not contact the touch screen and/or a contact touch operation that contacts the touch screen. In other words, the touch screen of the first device can detect both contact touch operations and hover touch operations, which means the first device has an air-gap touch function.
The preset distance is determined by the detection parameters of the touch screen. In some embodiments, the detection parameters at least include the detection voltage. The detection voltage is positively correlated with the preset distance: the larger the detection voltage, the higher the detection sensitivity of the touch screen and the larger the detectable distance. For example, when the detection voltage of the touch screen is increased to a certain extent, a hover touch operation at a certain distance from the touch screen can be detected.
The second device is also provided with a display screen, which may be a touch screen or a non-touch screen. When the second device is a projector, its projection curtain serves as its display screen; in some embodiments, a wall may also serve as the projection curtain. The present disclosure is not limited in this respect.
In some embodiments, the display screen area of the second device is larger than that of the first device, so that after screen projection the information can be shown on the large screen of the second device, which suits scenarios where multiple people watch together, such as a conference.
In other embodiments, the display screen area of the second device is smaller than or equal to that of the first device, which still allows sharing among multiple users. The present disclosure does not limit the display screen area of the second device.
When the first device detects a screen projection instruction, it sends the screen projection picture to the second device so that the second device can display it. The screen projection instruction at least includes the display content to be projected and the device parameters of the projected device; the device parameters are used to identify the projected device. In some possible embodiments, the device parameters may include an Internet Protocol address (IP address).
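For illustration only, the following Kotlin sketch shows the kind of data a screen projection instruction might carry (the display content to be projected plus device parameters such as an IP address). The class names, the port field, and the field types are assumptions made for this sketch, not names defined by the patent.

```kotlin
// Minimal sketch of the data carried by a screen projection instruction as described above.
data class DeviceParams(
    val deviceIp: String,          // IP address identifying the projected (second) device
    val port: Int = 7236           // assumed transport port for the projection stream
)

data class ProjectionInstruction(
    val displayContent: ByteArray, // encoded screen projection picture to be sent
    val target: DeviceParams       // device parameters of the projected device
)

fun main() {
    val instruction = ProjectionInstruction(
        displayContent = ByteArray(0),                 // placeholder frame data
        target = DeviceParams(deviceIp = "192.168.1.20")
    )
    println("Project to ${instruction.target.deviceIp}:${instruction.target.port}")
}
```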
Here, after the screen projection picture is sent, although it is displayed on the second device, the operation control of the screen projection picture still takes place on the first device; when the screen projection picture needs to be operated, a touch operation has to be performed on the touch screen of the first device to trigger the function of the corresponding control.
In the embodiments of the present disclosure, after sending the screen projection picture, the first device detects whether a touch operation acts on its touch screen. If such a touch operation is detected, the first device acquires the touch position information, generates an indication icon based on the touch position information, and sends the indication icon to the second device. In this way, the second device displays the indication icon in addition to the screen projection picture, and the position at which the touch operation acts on the touch screen of the first device can be read from the indication icon displayed on the second device. The operator's line of sight therefore does not have to switch between the two devices and can focus only on the display of the second device. Here, the operator is the person performing the screen projection operation.
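The Kotlin sketch below illustrates the order of operations on the first device in steps 101 to 104. The type names, the callbacks, and the preset distance value are illustrative assumptions rather than an actual device API; only the sequence of steps follows the text above.

```kotlin
// Hedged sketch of the first-device flow in steps 101-104.
data class TouchSample(val x: Float, val y: Float, val distanceMm: Float) // distanceMm == 0f means contact

data class IndicationIcon(val x: Float, val y: Float)

const val PRESET_DISTANCE_MM = 100f // assumed preset detection distance (on the order of 10 cm)

class FirstDevice(
    private val sendPicture: (ByteArray) -> Unit,
    private val sendIcon: (IndicationIcon) -> Unit
) {
    // Step 101: send the screen projection picture when the projection instruction arrives.
    fun onProjectionInstruction(picture: ByteArray) = sendPicture(picture)

    // Steps 102-104: acquire the touch position, generate the indication icon, send it.
    fun onTouchDetected(sample: TouchSample) {
        if (sample.distanceMm > PRESET_DISTANCE_MM) return // outside the preset distance
        val icon = IndicationIcon(x = sample.x, y = sample.y)
        sendIcon(icon)
    }
}

fun main() {
    val device = FirstDevice(
        sendPicture = { println("sent picture of ${it.size} bytes") },
        sendIcon = { println("sent icon at (${it.x}, ${it.y})") }
    )
    device.onProjectionInstruction(ByteArray(1024))
    device.onTouchDetected(TouchSample(x = 320f, y = 540f, distanceMm = 30f))
}
```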
The touch position information indicates the touch position at which a touch operation acts on the touch screen of the first device.
In some embodiments, the touch position at least includes a first position at which the touch operation is projected onto the touch screen. For example, when a user's finger acts on the touch screen of the first device, the projection position of the finger on the touch screen is determined; during the touch, the finger may hover above the touch screen or be in contact with it.
In some embodiments, when the touch operation is a hover touch operation, the touch position further includes a second position of the touch operation relative to the touch screen, where the second position indicates the distance between the hover touch operation and the touch screen.
The indication icon may be any reference mark used to indicate the touch position, such as a graphic icon, a character icon, or a letter icon; for example, a mouse arrow, a circle, or the letter A may be used as the indication icon.
In the embodiments of the present disclosure, different indication icons may also be generated for different types of touch operations. In some embodiments, when a hover touch operation acting on the touch screen of the first device is detected, different indication icons may be generated according to the distance between the hover touch operation and the touch screen, for example indication icons with different transparencies or different forms. The transparency of the indication icon is positively correlated with the distance between the hover touch operation and the touch screen. The form of the indication icon may also be set in correspondence with the distance between the hover touch operation and the touch screen; for example, taking a mouse arrow as the icon, the direction of the arrow may be adjusted according to that distance. The present disclosure does not limit the form of the indication icon.
Since the indication icon is generated from the touch position information of the touch operation acting on the touch screen, the position at which the touch operation acts on the touch screen of the first device can be read from the indication icon displayed on the display interface of the second device. Therefore, after the screen is projected, if a presentation needs to be given along with the content displayed on the second device, the presenter can directly determine, from the indication icon displayed on the display interface of the second device, where their touch acts on the first device. The presenter does not need to shift their line of sight back and forth between the first and second devices during the presentation, nor lower their head repeatedly to confirm where their finger touches the first device; paying attention only to the indication icon on the second device is enough to complete the operation, which improves the convenience of presentation during screen projection.
In some embodiments, acquiring the touch position information if a touch operation acting within a preset distance of the touch screen is detected includes:
when the touch screen is in a screen-off state, acquiring the touch position information if a touch operation acting within a preset distance of the touch screen is detected.
In the embodiments of the present disclosure, after the screen projection picture is sent to the second device, the touch screen of the first device can be turned off, which reduces power consumption and saves the battery of the first device.
It should be noted that, in the embodiments of the present disclosure, the touch detection function is retained while the touch screen of the first device is off, so that touch detection can still be performed.
Here, the touch detection function of the touch screen can be kept active while the touch screen of the first device remains off, and the original screen-off touch behavior is restored when screen projection ends. In this way, scenarios that require screen projection and scenarios that do not are distinguished: in scenarios that do not require screen projection, touch detection is also disabled after the screen is turned off, so as to prevent accidental touches.
It should be further noted that, in some embodiments, when the touch screen is in a screen-on state, the touch position information can also be acquired if a touch operation acting within a preset distance of the touch screen is detected. Fig. 2 is a schematic diagram illustrating display content of the first device according to an exemplary embodiment. When the touch screen of the first device 201 is in a screen-on state, a touch operation acting within a preset distance of the touch screen can also be detected, and after the indication icon is generated, the indication icon 200 is also displayed on the touch screen of the first device. As shown in fig. 2, the circle above the four rows of application icons on the home page of the displayed interface is the indication icon 200.
In this way, after the indication icon is sent to the second device, the indication icon displayed on the first device and the indication icon displayed on the second device correspond in position. Fig. 3 is a schematic diagram illustrating display content of the second device according to an exemplary embodiment. As shown in fig. 2 and fig. 3, the screen projection picture is the display interface containing four rows of application icons; the indication icon displayed on the first device is located above those four rows of application icons, and the indication icon 300 displayed on the second device 301 is likewise above the four rows of application icons. In this way, the indication icon indicates the position of the operator's finger on the touch screen of the first device.
In some embodiments, the touch operation includes a hover touch operation that does not contact the touch screen and/or a contact touch operation that contacts the touch screen;
accordingly, the method further includes:
enabling an air-gap touch function, where the air-gap touch function can detect a hover touch operation within a preset distance of the touch screen of the first device.
In the embodiments of the present disclosure, the touch screen of the first device can detect a contact touch operation that contacts the touch screen as well as a hover touch operation that does not.
Here, the touch screen includes a touch detection assembly and a control chip. The touch detection assembly is mounted on the display screen and is used to detect a user's touch position information and send it to the control chip once it is detected. The control chip mainly receives the touch position information from the touch detection assembly, converts it into touch point coordinates, and sends the coordinates to the processor (CPU) of the first device, while also receiving and executing commands issued by the CPU.
To enable contactless detection, at least two capacitor plates, each forming a capacitance with the ground, are arranged on the left and right sides of the display screen. These capacitors act as self-capacitance sensors and can sense the position and motion of an operator within a preset distance above the display screen. Illustratively, when a finger approaches the display screen, the capacitance of these capacitors changes; the touch detection assembly of the first device's touch screen detects multiple groups of capacitance values and sends them to the control chip, which processes them to obtain the variation of the capacitance values, infers the position coordinates and motion of the finger, and generates the operation instruction corresponding to that motion, thereby realizing the air-gap touch operation of the first device.
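The text above describes deriving the finger position and distance from the capacitance changes of at least two self-capacitance plates, but does not give the conversion. The sketch below shows one plausible way to do it under stated assumptions (two plates, linear mappings, assumed maximum values); it is not the patent's algorithm.

```kotlin
// Rough sketch: deriving a hover position from the capacitance changes of two plates.
// The linear formulas below are illustrative assumptions, not the patent's algorithm.
data class HoverEstimate(val xNormalized: Float, val distanceMm: Float)

fun estimateHover(
    leftDelta: Float,            // capacitance change measured by the left plate
    rightDelta: Float,           // capacitance change measured by the right plate
    maxDelta: Float = 100f,      // assumed per-plate delta when the finger touches the screen
    maxDistanceMm: Float = 100f  // assumed maximum detectable hover distance
): HoverEstimate? {
    val total = leftDelta + rightDelta
    if (total <= 0f) return null                      // nothing detected
    // Horizontal position: 0.0 = left edge, 1.0 = right edge (ratio of the two plates).
    val x = rightDelta / total
    // Distance: the smaller the total change, the farther the finger is assumed to be.
    val distance = (1f - total / (2f * maxDelta)).coerceIn(0f, 1f) * maxDistanceMm
    return HoverEstimate(xNormalized = x, distanceMm = distance)
}

fun main() {
    val estimate = estimateHover(leftDelta = 30f, rightDelta = 50f)
    println(estimate) // roughly x = 0.625 (closer to the right plate), distance = 60 mm
}
```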
In some embodiments, the method further includes:
generating movement information of the indication icon based on touch operations detected at consecutive detection times; and
sending the movement information to the second device, where the movement information indicates a movement path of the indication icon displayed on the screen projection picture displayed by the second device.
Here, the capacitance values detected by the at least two capacitor plates on the touch screen differ, so it can be determined which plate the finger is closer to, and the change in the finger's distance from the touch screen can be determined from the difference between capacitance values detected at two successive times. By detecting multiple groups of capacitance values, the movement trajectory and direction of the finger above the display screen, as well as whether there is a press or release action, can be analyzed.
The maximum detection distance that contactless detection can reach is determined by the detection voltage of the touch screen: the larger the detection voltage, the higher the detection sensitivity of the touch screen and the larger the detectable distance. Therefore, once a touch screen whose sensitivity supports the air-gap touch function is selected, a hover touch operation that does not contact the touch screen and/or a contact touch operation that contacts the touch screen can be detected on it.
At each touch, the first device determines the touch position information of the touch operation from the capacitance values detected by the at least two capacitor plates on the touch screen. Then, from the touch operations detected at consecutive detection times, movement information of the indication icon can be generated and sent to the second device. In this way, the second device can move the indication icon correspondingly on the displayed screen projection picture according to the movement information and display the movement path during the movement.
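A minimal sketch of accumulating touch positions detected at consecutive detection times into movement information for the indication icon. The MovementInfo type and the callback used to send it to the second device are assumptions made for illustration.

```kotlin
// Sketch: building movement information from touch operations detected at consecutive times.
data class Point(val x: Float, val y: Float)

data class MovementInfo(val path: List<Point>)

class MovementTracker(private val sendToSecondDevice: (MovementInfo) -> Unit) {
    private val path = mutableListOf<Point>()

    // Called once per detection time with the touch position detected at that time.
    fun onTouchPosition(p: Point) {
        path.add(p)
        if (path.size >= 2) {
            // Send the accumulated path so the second device can move the icon along it.
            sendToSecondDevice(MovementInfo(path.toList()))
        }
    }
}

fun main() {
    val tracker = MovementTracker { info -> println("movement path: ${info.path}") }
    tracker.onTouchPosition(Point(100f, 200f))
    tracker.onTouchPosition(Point(120f, 210f))
    tracker.onTouchPosition(Point(150f, 230f))
}
```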
In this way, when the first device is operated, the position at which the operation acts on the first device can be seen synchronously on the display screen of the second device, and the presenter does not have to switch their line of sight back and forth during the presentation, which improves convenience. Moreover, after the display screen of the first device is turned off, the corresponding touch operation can still be performed because the position acting on the first device is displayed on the second device, which broadens the application scenarios.
In some embodiments, acquiring the touch position information if a touch operation acting within a preset distance of the touch screen is detected includes:
based on the enabled air-gap touch function, acquiring the touch position information if a hover touch operation acting within a preset distance of the touch screen is detected.
Here, the touch screen of the present disclosure has an air-gap touch function that can be turned on or off. The function is enabled when air-gap touch is needed and disabled when it is not, which reduces accidental touches that would be caused by leaving the air-gap touch function permanently on in daily use.
In some embodiments, the touch position information of the hover touch operation at least includes touch coordinate information indicating a first position at which the hover touch operation is projected onto the touch screen; the first position is used to determine the coordinates of the indication icon on the screen projection picture.
Here, the touch position information indicates the touch position at which the touch operation acts on the touch screen of the first device, and the touch coordinate information indicates the first position at which the hover touch operation is projected onto the touch screen.
Since the purpose of a touch operation on the touch screen of the first device is to trigger the corresponding function, the user's touch coordinates on the touch screen can be obtained from the touch position information, and if the coordinate position indicated by the touch coordinate information corresponds to a control on the display interface, the function corresponding to that control can be triggered.
For a hover touch operation, which does not contact the touch screen, the touch coordinate information may be the coordinate information of the first position at which the hover touch operation is projected onto the touch screen.
To make the operation more convenient, the indication icon generated from the touch position information is sent to the second device. Thus, when the touch operation is a hover touch operation, the indication icon generated from the touch position information carries the first position, and the first position is used to determine the coordinates of the indication icon on the screen projection picture. In this way, after the screen projection picture is displayed on the display interface of the second device, the indication icon is displayed at the first position on the screen projection picture, so that the indication icon displayed on the screen projection picture displayed by the second device corresponds to the position at which the touch operation acts on the touch screen of the first device. Then, after the hover touch operation moves on the first device, the position of the indication icon moves correspondingly on the screen projection picture displayed by the second device, which provides a basis for accurate touch later on.
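One way the first position could be mapped to icon coordinates on the screen projection picture is simple proportional scaling, which is consistent with the later statement that the projected picture keeps the same scale as the first device's display. The sketch below assumes this scaling and uses illustrative resolutions; the mapping is not specified by the patent.

```kotlin
// Sketch: mapping the first position on the first device's touch screen to icon
// coordinates on the projected picture by proportional scaling (an assumption).
data class ScreenSize(val width: Int, val height: Int)

fun mapToProjectedPicture(
    touchX: Float, touchY: Float,
    source: ScreenSize,            // resolution of the first device's touch screen
    target: ScreenSize             // resolution of the projected picture on the second device
): Pair<Float, Float> {
    val x = touchX / source.width * target.width
    val y = touchY / source.height * target.height
    return x to y
}

fun main() {
    val (x, y) = mapToProjectedPicture(
        touchX = 540f, touchY = 1200f,
        source = ScreenSize(1080, 2400),
        target = ScreenSize(1920, 1080) // in practice aspect-ratio letterboxing may be needed; omitted here
    )
    println("icon at ($x, $y) on the projected picture") // (960.0, 540.0)
}
```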
Here, a hover touch operation is a contactless operation, so besides the touch coordinate information it also carries touch distance information indicating its distance from the touch screen. The touch distance information of the hover touch operation is described below.
In some embodiments, fig. 4 is a second flowchart illustrating a screen projection control method according to an exemplary embodiment. As shown in fig. 4, the method further includes:
Step 105, determining the display transparency of the indication icon to be generated based on the second position of the hover touch operation relative to the touch screen.
The second position indicates the distance between the acting position of the hover touch operation and the touch screen, and this distance is positively correlated with the display transparency.
Here, the touch position information of the hover touch operation further includes touch distance information, which indicates the second position of the hover touch operation relative to the touch screen, that is, the distance between the acting position of the hover touch operation and the touch screen.
In the embodiments of the present disclosure, the display effect of the indication icon to be generated can be determined according to the distance between the acting position of the hover touch operation and the touch screen.
For example, in the embodiments of the present disclosure, the display effect may be embodied in transparency. In some embodiments, the transparency of the indication icon is positively correlated with the distance between the hover touch operation and the touch screen. Illustratively, the closer the hover touch operation is to the touch screen, the smaller the transparency value and the more opaque and solid the indication icon appears; conversely, the farther the hover touch operation is from the touch screen, the larger the transparency value and the more transparent and faint the indication icon appears. Here, assuming the transparency value ranges from 0 to 100, 0 corresponds to opaque and 100 corresponds to fully transparent.
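A minimal sketch of the transparency rule just described, assuming a linear mapping and an assumed maximum detectable hover distance; as in the text, 0 means opaque and 100 means fully transparent.

```kotlin
// Sketch of the transparency rule: transparency grows with the hover distance.
const val MAX_HOVER_DISTANCE_MM = 100f  // assumed maximum detectable hover distance

fun displayTransparency(distanceMm: Float): Int {
    // Contact (distance 0) gives an opaque icon; the farther the finger, the more
    // transparent the icon, capped at the maximum detectable distance.
    val ratio = (distanceMm / MAX_HOVER_DISTANCE_MM).coerceIn(0f, 1f)
    return (ratio * 100).toInt()
}

fun main() {
    listOf(0f, 25f, 80f, 150f).forEach { d ->
        println("distance ${d} mm -> transparency ${displayTransparency(d)}")
    }
    // Prints transparency 0, 25, 80 and 100 respectively.
}
```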
In this way, the distance between the hover touch operation and the touch screen can be roughly judged from the different transparencies of the indication icon, which makes the indication icon more intuitive and more informative.
In other embodiments, the display effect may also be embodied in the form of the icon. The form of the indication icon may be set in correspondence with the distance between the hover touch operation and the touch screen; for example, taking a mouse arrow as the icon, the direction of the arrow may be adjusted according to that distance. The present disclosure does not limit the form of the indication icon.
The present disclosure also provides the following embodiment:
Fig. 5 is a third flowchart illustrating a screen projection control method according to an exemplary embodiment. As shown in fig. 5, the screen projection control method includes the following steps:
Step 401, the first device acquires a screen projection instruction and sends a screen projection picture to the second device.
Here, when the user touches the screen projection function control on the first device, the screen projection picture currently displayed by the first device is sent to the second device. The screen projection instruction at least includes the display content to be projected and the device parameters of the projected device; the device parameters are used to identify the projected device.
Step 402, the second device receives the screen projection picture and displays it.
Here, the screen projection picture displayed on the second device is the same as on the first device, or has the same scale.
Step 403, the touch screen of the first device is turned off while its touch function is retained.
Step 404, the air-gap touch function of the first device is enabled.
Here, the air-gap touch function can be realized by increasing the detection voltage of the touch screen; increasing the detection voltage improves the sensitivity and thereby achieves contactless detection.
Step 405, determining whether a hover touch operation acting within a preset distance of the touch screen of the first device is detected.
If yes, step 406 is executed; if not, step 412 is executed.
Step 406, if a hover touch operation acting within a preset distance of the touch screen of the first device is detected, acquiring touch position information.
Here, as the operator's finger approaches the screen of the first device while the touch screen is in the contactless detection state, the touch screen can detect the finger once it comes within a certain distance (for example, 10 cm), at which point the first device acquires the touch position information of the finger.
Step 407, the first device generates an indication icon based on the touch position information.
Here, the indication icon may be a cursor (mouse arrow) or a graphic icon of a preset shape.
Step 408, the first device sends the indication icon to the second device.
Here, as shown in fig. 3, which is a schematic diagram of the display content of the second device according to an exemplary embodiment, an indication icon is displayed in addition to the screen projection picture.
The indication icon is used to indicate the position of the operator's finger on the touch screen of the first device. Guided by the mouse arrow on the second device, the operator can locate the position on the first device that is to be touched. When the finger approaches the touch screen of the first device within a certain distance (without touching it), a prompt is given on the display content of the second device (the indication icon in the schematic diagram of fig. 3 is represented by a circle), so that by watching the second device the operator knows which position is about to be clicked, and a precise touch can be performed without looking down at the touch screen of the first device.
Depending on the type of touch operation, the generated indication icons may also differ. In some embodiments, when a hover touch operation acting on the touch screen of the first device is detected, different indication icons may be generated according to the distance between the hover touch operation and the touch screen. Illustratively, the transparency of the mouse arrow indicates the distance between the finger and the touch screen of the first device and can be determined from that distance.
The distance between the finger and the touch screen can be judged by detecting the variation of the capacitance of the touch screen.
Step 409, determining whether a touch operation acting on the touch screen of the first device is detected.
If yes, step 410 is executed; if not, step 412 is executed.
Step 410, if a touch operation acting on the touch screen of the first device is detected, responding to the touch operation, obtaining a response result, and sending the response result to the second device.
After the hovering finger has been indicated for the touch screen of the first device, when the finger continues to approach the touch screen of the first device until it touches the screen, touch operations such as a single click, a double click, a long press, or a slide can be performed, and the corresponding function can be triggered based on the touch operation to obtain the corresponding response result.
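For illustration, the sketch below shows one simple way such contact gestures (single click, double click, long press) could be distinguished from touch-down and touch-up timing. The thresholds and the event model are assumptions, and a real input stack handles more cases, for example suppressing the single tap that precedes a double tap.

```kotlin
// Sketch: classifying contact gestures from touch-down/touch-up timestamps (assumed thresholds).
enum class Gesture { SINGLE_TAP, DOUBLE_TAP, LONG_PRESS }

const val LONG_PRESS_MS = 500L     // assumed long-press threshold
const val DOUBLE_TAP_GAP_MS = 300L // assumed maximum gap between two taps

class GestureClassifier {
    private var lastTapUpTime = -1L

    // Called when a finger that touched the screen at downTime is lifted at upTime (milliseconds).
    fun onTouchUp(downTime: Long, upTime: Long): Gesture {
        if (upTime - downTime >= LONG_PRESS_MS) return Gesture.LONG_PRESS
        val gesture = if (lastTapUpTime >= 0 && downTime - lastTapUpTime <= DOUBLE_TAP_GAP_MS) {
            Gesture.DOUBLE_TAP
        } else {
            Gesture.SINGLE_TAP
        }
        lastTapUpTime = upTime
        return gesture
    }
}

fun main() {
    val classifier = GestureClassifier()
    println(classifier.onTouchUp(downTime = 0, upTime = 100))     // SINGLE_TAP
    println(classifier.onTouchUp(downTime = 250, upTime = 320))   // DOUBLE_TAP (within 300 ms of previous tap)
    println(classifier.onTouchUp(downTime = 2000, upTime = 2700)) // LONG_PRESS
}
```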
Step 411, the second device displays the response result.
Step 412, end.
Thus, according to the screen projection control method provided by the present disclosure, during screen projection display the touch screen of the first device can be turned off while the touch function of the first device is retained, and by sending the screen projection picture and the indication icon to the second device, the position of the indication icon on the screen projection picture displayed by the second device corresponds to the position at which the touch operation acts on the touch screen of the first device. Illustratively, while the screen projection picture is displayed on the projection device (the second device), when a finger hovers within a certain distance above the display screen of the mobile device (the first device), the position at which the finger would act on the mobile device is displayed on the projection device; the user can be prompted based on the finger position displayed on the projection device, which prevents accidental touches on the dark screen after the touch screen of the first device has been turned off.
In this way, after the screen projection picture and the indication icon are sent to the second device, in subsequent touch operations the user can, based on the accurate positioning of the finger, continue to perform touch operations such as a click, a double click, a long press, or a slide on the touch screen of the first device, and after the corresponding function is triggered, the corresponding response result is displayed on the second device.
This combination of screen projection, the air-gap touch function, and the displayed prompt enriches the screen projection application and improves convenience during screen projection presentations: the presenter can complete operations accurately without looking down at the first device, and turning off the touch screen of the first device after projection also saves power.
The present disclosure also provides a screen projection control device. Fig. 6 is a schematic structural diagram of a screen projection control device according to an exemplary embodiment. As shown in fig. 6, the screen projection control device 600 includes:
a first sending module 601, configured to send a screen projection picture to a second device based on an acquired screen projection instruction;
an acquiring module 602, configured to acquire touch position information if a touch operation acting within a preset distance of the touch screen is detected;
a generating module 603, configured to generate an indication icon based on the touch position information; and
a second sending module 604, configured to send the indication icon to the second device; the position of the indication icon on the screen projection picture displayed by the second device corresponds to the position at which the touch operation acts on the touch screen of the first device.
In some embodiments, the acquiring module is further configured to:
when the touch screen is in a screen-off state, acquire the touch position information if a touch operation acting within a preset distance of the touch screen is detected.
In some embodiments, the touch operation includes a hover touch operation that does not contact the touch screen and/or a contact touch operation that contacts the touch screen;
the device further includes:
an enabling module, configured to enable an air-gap touch function, where the air-gap touch function can detect a hover touch operation within a preset distance of the touch screen of the first device.
In some embodiments, the acquiring module includes:
a position acquiring submodule, configured to acquire the touch position information if, based on the enabled air-gap touch function, a hover touch operation acting within a preset distance of the touch screen is detected.
In some embodiments, the touch position information of the hover touch operation at least includes touch coordinate information indicating a first position at which the hover touch operation is projected onto the touch screen; the first position is used to determine the coordinates of the indication icon on the screen projection picture.
In some embodiments, the device includes:
a transparency determining module, configured to determine the display transparency of the indication icon to be generated based on the second position of the hover touch operation relative to the touch screen, where the second position indicates the distance between the acting position of the hover touch operation and the touch screen, and the distance is positively correlated with the display transparency.
In some embodiments, the device further includes:
a movement information generating module, configured to generate movement information of the indication icon based on touch operations detected at consecutive detection times; and
a third sending module, configured to send the movement information to the second device, where the movement information indicates the movement path of the indication icon displayed on the screen projection picture displayed by the second device.
With regard to the device in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the method and will not be elaborated here.
Fig. 7 is a block diagram illustrating a screen projection control device 1800 according to an exemplary embodiment. For example, the apparatus 1800 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, fitness device, personal digital assistant, and so forth.
Referring to fig. 7, apparatus 1800 may include one or more of the following components: a processing component 1802, a memory 1804, a power component 1806, a multimedia component 1808, an audio component 1810, an input/output (I/O) interface 1812, a sensor component 1814, and a communications component 1816.
The processing component 1802 generally controls the overall operation of the device 1800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1802 may include one or more processors 1820 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 1802 may also include one or more modules that facilitate interaction between the processing component 1802 and other components. For example, the processing component 1802 can include a multimedia module to facilitate interaction between the multimedia component 1808 and the processing component 1802.
The memory 1804 is configured to store various types of data to support operation at the apparatus 1800. Examples of such data include instructions for any application or method operating on the device 1800, contact data, phonebook data, messages, images, videos, and so forth. The memory 1804 may be implemented by any type or combination of volatile or non-volatile storage devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 1806 provide power to various components of device 1800. The power components 1806 may include: a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the apparatus 1800.
The multimedia component 1808 includes a screen that provides an output interface between the device 1800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1808 includes a front facing camera and/or a rear facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the device 1800 is in an operating mode, such as a shooting mode or a video mode. Each front camera and/or rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
Audio component 1810 is configured to output and/or input audio signals. For example, the audio component 1810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 1800 is in operating modes, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 1804 or transmitted via the communication component 1816. In some embodiments, audio component 1810 also includes a speaker for outputting audio signals.
I/O interface 1812 provides an interface between processing component 1802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 1814 includes one or more sensors for providing various aspects of state assessment for the apparatus 1800. For example, the sensor component 1814 can detect the open/closed state of the apparatus 1800 and the relative positioning of components such as the display and keypad of the apparatus 1800; it can also detect a change in position of the apparatus 1800 or of a component of the apparatus 1800, the presence or absence of user contact with the apparatus 1800, the orientation or acceleration/deceleration of the apparatus 1800, and a change in temperature of the apparatus 1800. The sensor component 1814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor component 1814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 1814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1816 is configured to facilitate communications between the apparatus 1800 and other devices in a wired or wireless manner. The device 1800 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1816 receives a broadcast signal or broadcast associated information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 1816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, or other technologies.
In an exemplary embodiment, the apparatus 1800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as the memory 1804 including instructions that are executable by the processor 1820 of the apparatus 1800 to perform the above-described method. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Also provided is a non-transitory computer-readable storage medium having stored thereon instructions that, when executed by a processor, enable the processor to perform the above-described method.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (16)

1. A screen projection control method, applied to a first device, the method comprising:
sending a screen projection picture to a second device based on an acquired screen projection instruction;
acquiring touch position information if a touch operation acting on a touch screen of the first device or within a preset distance from the touch screen is detected;
generating an indication icon based on the touch position information; and
sending the indication icon to the second device, wherein the position of the indication icon on the screen projection picture displayed by the second device corresponds to the position at which the touch operation acts on the touch screen of the first device.
2. The method of claim 1, wherein acquiring the touch position information if the touch operation acting on the touch screen or within a preset distance from the touch screen is detected comprises:
when the touch screen is in a screen-off state, acquiring the touch position information if the touch operation acting on the touch screen or within the preset distance from the touch screen is detected.
3. The method according to claim 1 or 2, wherein the touch operation comprises: a floating touch operation that does not contact the touch screen, and/or a contact touch operation that contacts the touch screen;
the method further comprising:
enabling a contactless touch function, wherein the contactless touch function is capable of detecting a floating touch operation within a preset distance from the touch screen of the first device.
4. The method of claim 3, wherein acquiring the touch position information if the touch operation acting on the touch screen or within a preset distance from the touch screen is detected comprises:
based on the enabled contactless touch function, acquiring the touch position information if a floating touch operation acting within the preset distance from the touch screen is detected.
5. The method of claim 3, wherein the touch position information of the floating touch operation at least comprises: touch coordinate information indicating a first position at which the floating touch operation is projected onto the touch screen; wherein the first position is used for determining the coordinates of the indication icon on the screen projection picture.
6. The method of claim 3, further comprising:
determining a display transparency of the indication icon to be generated based on a second position of the floating touch operation relative to the touch screen, wherein the second position indicates the distance between the action position of the floating touch operation and the touch screen, and the distance is positively correlated with the display transparency.
7. The method of claim 1, further comprising:
generating movement information of the indication icon based on touch operations detected at consecutive detection moments; and
sending the movement information to the second device, wherein the movement information indicates a movement path of the indication icon on the screen projection picture displayed by the second device.
8. A screen projection control apparatus, applied to a first device, the apparatus comprising:
a first sending module, configured to send a screen projection picture to a second device based on an acquired screen projection instruction;
an acquisition module, configured to acquire touch position information if a touch operation acting on a touch screen of the first device or within a preset distance from the touch screen is detected;
a generating module, configured to generate an indication icon based on the touch position information; and
a second sending module, configured to send the indication icon to the second device, wherein the position of the indication icon on the screen projection picture displayed by the second device corresponds to the position at which the touch operation acts on the touch screen of the first device.
9. The apparatus of claim 8, wherein the acquisition module is further configured to:
when the touch screen is in a screen-off state, acquire the touch position information if the touch operation acting on the touch screen or within the preset distance from the touch screen is detected.
10. The apparatus according to claim 8 or 9, wherein the touch operation comprises: a floating touch operation that does not contact the touch screen, and/or a contact touch operation that contacts the touch screen;
the apparatus further comprising:
an enabling module, configured to enable a contactless touch function, wherein the contactless touch function is capable of detecting a floating touch operation within a preset distance from the touch screen of the first device.
11. The apparatus of claim 10, wherein the acquisition module comprises:
a position acquisition submodule, configured to acquire the touch position information if, based on the enabled contactless touch function, a floating touch operation acting within the preset distance from the touch screen is detected.
12. The apparatus of claim 10, wherein the touch position information of the floating touch operation at least comprises: touch coordinate information indicating a first position at which the floating touch operation is projected onto the touch screen; wherein the first position is used for determining the coordinates of the indication icon on the screen projection picture.
13. The apparatus of claim 12, further comprising:
a transparency determination module, configured to determine a display transparency of the indication icon to be generated based on a second position of the floating touch operation relative to the touch screen, wherein the second position indicates the distance between the action position of the floating touch operation and the touch screen, and the distance is positively correlated with the display transparency.
14. The apparatus of claim 8, further comprising:
a movement information generating module, configured to generate movement information of the indication icon based on touch operations detected at consecutive detection moments; and
a third sending module, configured to send the movement information to the second device, wherein the movement information indicates a movement path of the indication icon on the screen projection picture displayed by the second device.
15. A screen projection control device, comprising:
a processor and a memory for storing executable instructions operable on the processor, wherein:
the processor is configured to execute the executable instructions to perform the steps of the method according to any one of claims 1 to 7.
16. A non-transitory computer-readable storage medium having stored thereon computer-executable instructions that, when executed by a processor, cause the processor to perform the steps of the method according to any one of claims 1 to 7.
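To make the claimed flow easier to follow, the following is a minimal, non-normative sketch of claims 1 and 5 to 7 in Java. It assumes a hover/touch callback that supplies the projected (x, y) coordinates and the finger-to-screen distance, maps that distance to display transparency (larger distance, higher transparency, i.e. positively correlated), and accumulates consecutive detections into a movement path. The IconSender interface, the preset distance value, and all names are illustrative, not part of the disclosure.

```java
import java.util.ArrayList;
import java.util.List;

public class ProjectionPointerController {

    /** Abstraction over the link to the second device (for example, a socket-based channel). */
    public interface IconSender {
        void sendIcon(float x, float y, float alpha);
        void sendPath(List<float[]> path);
    }

    private static final float PRESET_DISTANCE = 30f; // assumed preset hover distance, in millimeters

    private final IconSender sender;
    private final List<float[]> path = new ArrayList<>();

    public ProjectionPointerController(IconSender sender) {
        this.sender = sender;
    }

    /**
     * Called for each detected touch operation.
     * x, y     - first position: projection of the operation onto the touch screen
     * distance - second position: spacing between the finger and the screen (0 for a contact touch)
     */
    public void onTouchDetected(float x, float y, float distance) {
        if (distance > PRESET_DISTANCE) {
            return; // outside the preset distance: not treated as a touch operation
        }

        // Claim 6: display transparency is positively correlated with the hover distance.
        float transparency = Math.min(1f, distance / PRESET_DISTANCE);
        float alpha = 1f - transparency; // fully opaque when the screen is actually touched

        // Claim 1: the indication icon mirrors the touch position on the projected picture.
        sender.sendIcon(x, y, alpha);

        // Claim 7: consecutive detections build the icon's movement path.
        path.add(new float[] {x, y});
        if (path.size() > 1) {
            sender.sendPath(new ArrayList<>(path));
        }
    }
}
```

In practice the (x, y) coordinates would also be rescaled from the touch screen resolution to the resolution of the projected picture before being sent.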
CN202110266998.XA 2021-03-11 2021-03-11 Screen projection control method and device and storage medium Pending CN112905136A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110266998.XA CN112905136A (en) 2021-03-11 2021-03-11 Screen projection control method and device and storage medium

Publications (1)

Publication Number Publication Date
CN112905136A true CN112905136A (en) 2021-06-04

Family

ID=76104943

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110266998.XA Pending CN112905136A (en) 2021-03-11 2021-03-11 Screen projection control method and device and storage medium

Country Status (1)

Country Link
CN (1) CN112905136A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115243082A (en) * 2022-07-18 2022-10-25 海信视像科技股份有限公司 Display device and terminal control method
WO2023029526A1 (en) * 2021-08-30 2023-03-09 荣耀终端有限公司 Display control method and apparatus for pointer in window, device, and storage medium
WO2024060890A1 (en) * 2022-09-21 2024-03-28 北京字跳网络技术有限公司 Information prompting method and apparatus for virtual terminal device, device, medium, and product

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622182A (en) * 2012-04-16 2012-08-01 李波 Off-screen touch interactive system with nearly-joint sensing projection point coordinate indication
US20140298271A1 (en) * 2013-03-28 2014-10-02 Samsung Electronics Co., Ltd. Electronic device including projector and method for controlling the electronic device
CN105204764A (en) * 2015-09-06 2015-12-30 惠州Tcl移动通信有限公司 Hand-held terminal provided with suspension screen, display equipment and remote control method
CN106681632A (en) * 2016-12-09 2017-05-17 北京小米移动软件有限公司 Projection control method, device and system, terminal device and display device
CN106980456A (en) * 2016-01-15 2017-07-25 中兴通讯股份有限公司 The control method and projector equipment of projector equipment
CN109960449A (en) * 2019-03-22 2019-07-02 深圳前海达闼云端智能科技有限公司 A kind of throwing screen display methods and relevant apparatus
CN111061445A (en) * 2019-04-26 2020-04-24 华为技术有限公司 Screen projection method and computing equipment
US20200371666A1 (en) * 2017-06-28 2020-11-26 Huawei Technologies Co., Ltd. Icon Display Method, and Apparatus
CN112015508A (en) * 2020-08-29 2020-12-01 努比亚技术有限公司 Screen projection interaction control method and device and computer readable storage medium

Similar Documents

Publication Publication Date Title
CN107908351B (en) Application interface display method and device and storage medium
EP3121701A1 (en) Method and apparatus for single-hand operation on full screen
US20160202834A1 (en) Unlocking method and terminal device using the same
KR101903261B1 (en) Method and device for preventing accidental touch of terminal with touch screen
EP3182716A1 (en) Method and device for video display
RU2630189C1 (en) Method of controlling button functions in one-hand operation mode, device and electronic device
CN104317402B (en) Description information display method and device and electronic equipment
EP3232301B1 (en) Mobile terminal and virtual key processing method
CN107390932B (en) Edge false touch prevention method and device and computer readable storage medium
CN107992257B (en) Screen splitting method and device
JP2017510915A (en) Method and apparatus for switching display modes
CN112905136A (en) Screen projection control method and device and storage medium
EP3299946B1 (en) Method and device for switching environment picture
EP3232314A1 (en) Method and device for processing an operation
CN105487805B (en) Object operation method and device
EP3109741B1 (en) Method and device for determining character
US10061497B2 (en) Method, device and storage medium for interchanging icon positions
JP2017534087A (en) Method and apparatus for setting a threshold
CN109862169B (en) Electronic equipment control method, device and storage medium
CN106990893B (en) Touch screen operation processing method and device
CN114296587A (en) Cursor control method and device, electronic equipment and storage medium
CN107168631B (en) Application program closing method and device and terminal electronic equipment
US20160195992A1 (en) Mobile terminal and method for processing signals generated from touching virtual keys
CN112115947A (en) Text processing method and device, electronic equipment and storage medium
CN108804009B (en) Gesture recognition method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination