CN113986106A - Two-handed operation method and device for a touch screen, electronic device, and storage medium

Info

Publication number
CN113986106A (application CN202111204723.XA; granted as CN113986106B)
Authority
CN
China
Prior art keywords
area
page
control area
manipulation
gesture
Prior art date
Legal status
Granted
Application number
CN202111204723.XA
Other languages
Chinese (zh)
Other versions
CN113986106B (en)
Inventor
董子尧
Current Assignee
Shenzhen Xumi Yuntu Space Technology Co Ltd
Original Assignee
Shenzhen Jizhi Digital Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Jizhi Digital Technology Co Ltd
Priority to CN202111204723.XA
Publication of CN113986106A
Application granted
Publication of CN113986106B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/0481: Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides a two-handed operation method and device for a touch screen, an electronic device, and a storage medium. The method comprises the following steps: determining a first control area and a second control area corresponding to the current page displayed on a touch screen; when a first gesture operation on the first control area is detected, generating a first operation event for the page content in the first control area; when a second gesture operation on the second control area is detected, generating a second operation event for the page content in the second control area; and executing a two-handed multi-point operation task on the current page based on the first operation event and the second operation event, where the first gesture operation and the second gesture operation are performed by different hands. The method and device can reduce the number and complexity of touch operations, improve the scene applicability of touch operation, and give users a more diverse touch experience.

Description

Two-handed operation method and device for a touch screen, electronic device, and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular to a two-handed operation method and device for a touch screen, an electronic device, and a storage medium.
Background
With the development of intelligent terminals and touch technology, more and more terminal display screens and applications use touch interaction, for example single-point touch to select content on the display screen, improving the interaction experience between the user and the touch screen device and letting the user operate applications more conveniently and quickly.
Currently, touch operation is mainly single-point operation. Faced with a complex operation scene, a user must repeatedly click and slide on the page to reach the final operation target, which increases the number and complexity of operations and degrades the user experience. Although some multi-touch schemes exist in the prior art, current multi-touch is still limited to one-handed operation, for example simple zoom-in and zoom-out of a page; complex and varied operation scenes therefore still cannot be handled, which reduces the applicability of touch operation and fails to give users a diverse touch experience.
Disclosure of Invention
In view of this, embodiments of the present disclosure provide a two-handed operation method and device for a touch screen, an electronic device, and a storage medium, to solve the prior-art problems that touch operations are numerous and complex, cannot be applied to complex operation scenes, and therefore reduce both the applicability of touch operation and the user experience.
In a first aspect of the embodiments of the present disclosure, a two-handed operation method for a touch screen is provided, including: determining a first control area and a second control area corresponding to the current page displayed on a touch screen; when a first gesture operation on the first control area is detected, generating a first operation event for the page content in the first control area; when a second gesture operation on the second control area is detected, generating a second operation event for the page content in the second control area; and executing a two-handed multi-point operation task on the current page based on the first operation event and the second operation event, where the first gesture operation and the second gesture operation are performed by different hands.
In a second aspect of the embodiments of the present disclosure, a two-handed operation device for a touch screen is provided, including: a determining module configured to determine a first control area and a second control area corresponding to the current page displayed on the touch screen; a first operation module configured to generate a first operation event for the page content in the first control area when a first gesture operation on the first control area is detected; a second operation module configured to generate a second operation event for the page content in the second control area when a second gesture operation on the second control area is detected; and an execution module configured to execute a two-handed multi-point operation task on the current page based on the first operation event and the second operation event, where the first gesture operation and the second gesture operation are performed by different hands.
In a third aspect of the embodiments of the present disclosure, an electronic device is provided, including a memory, a processor, and a computer program stored in the memory and executable on the processor; the processor implements the steps of the above method when executing the program.
In a fourth aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided, storing a computer program which, when executed by a processor, implements the steps of the above method.
The embodiments of the present disclosure adopt at least one technical scheme that can achieve the following beneficial effects:
a first control area and a second control area corresponding to the current page displayed on the touch screen are determined; when a first gesture operation on the first control area is detected, a first operation event for the page content in the first control area is generated; when a second gesture operation on the second control area is detected, a second operation event for the page content in the second control area is generated; and a two-handed multi-point operation task is executed on the current page based on the first and second operation events, where the two gesture operations are performed by different hands. By operating the touch screen with both hands, the number and complexity of touch operations can be reduced, complex and varied operation scenes can be handled, the scene applicability of touch operation is improved, and users are given a more diverse touch experience.
Drawings
To illustrate the technical solutions in the embodiments of the present disclosure more clearly, the drawings needed for the embodiments or the prior-art descriptions are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present disclosure; those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flowchart of a two-handed operation method of a touch screen provided in an embodiment of the present disclosure;
Fig. 2 is a schematic diagram of a page including a floating window according to a first embodiment of the present disclosure;
Fig. 3 is a schematic diagram of two-handed multi-point manipulation according to a first embodiment of the present disclosure;
fig. 4 is a schematic diagram of two-handed multi-point manipulation provided by a second embodiment of the present disclosure;
fig. 5 is a schematic diagram of two-handed multi-point manipulation provided by a third embodiment of the present disclosure;
fig. 6 is a schematic page diagram when a selection scene exists in a current page according to a fourth embodiment of the present disclosure;
fig. 7 is a schematic diagram of two-handed multipoint manipulation provided by a fourth embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a two-handed operation device of a touch screen provided in an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of an electronic device provided in an embodiment of the present disclosure.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the disclosed embodiments. However, it will be apparent to one skilled in the art that the present disclosure may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present disclosure with unnecessary detail.
As mentioned above, with the development of the internet and intelligent terminals, users increasingly expect different forms of interaction with terminal devices, such as touch and voice. With the development of touch technology and touch screens, many terminal devices and applications already implement user interaction by touch, for example single-point touch to select content on the display screen, which improves the interaction experience between the user and the touch screen device and lets the user operate applications more conveniently and quickly.
However, current touch operation is still mainly single-point operation, and faced with a complicated operation scene a user must repeatedly click and slide the page to reach the final operation target. For example, in a scene displaying list content, if the user wants to select items from the list, the following touch operations are needed: select, slide, select, slide; if the list is long, this must be repeated many times until all the required list items are selected. The existing single-point operation mode therefore not only increases the number and complexity of operations but also fails to reach the operation target in some complicated and demanding scenes, greatly degrading the user experience.
Although some multi-touch schemes exist in the prior art, current multi-touch is still limited to one-handed operation, for example simple zoom-in and zoom-out of a page; complex and varied operation scenes therefore still cannot be handled, which reduces the applicability of touch operation and fails to give users a diverse touch experience.
It should be noted that, in the application scenario of the embodiments of the present disclosure, the method may be executed by a system, client, application, web page, or applet installed or running on a mobile terminal. Mobile terminals include, but are not limited to, terminals supporting touch screen operation such as smartphones, tablet computers, and notebook computers. The embodiments of the present disclosure are described using the system and application interface of a mobile phone as an example, but in practical application they are not limited to mobile phone systems.
Fig. 1 is a schematic flowchart of a two-hand operation method of a touch screen according to an embodiment of the present disclosure. The two-handed operation method of the touch screen of fig. 1 may be performed by a system program installed on the mobile terminal. As shown in fig. 1, the two-hand operation method of the touch screen may specifically include:
s101, determining a first control area and a second control area corresponding to a current page displayed in a touch screen;
s102, when a first gesture operation aiming at a first control area is detected, generating a first operation event corresponding to the page content in the first control area;
s103, when a second gesture operation aiming at a second control area is detected, a second operation event corresponding to the page content in the second control area is generated;
and S104, executing a double-hand multi-point operation task on the current page based on a first operation event and a second operation event, wherein the first gesture operation and the second gesture operation are gesture operations generated by different hands.
Specifically, the current page displayed on the touch screen is the interface of the currently running application on the mobile terminal; different applications have different display interfaces, and different display interfaces of the same application show different content. A control area can be regarded as the area of the current application page whose content can be touch-operated by the user. The position and content of a control area depend on the current operation scene and state of the application; the position is not fixed and changes both when the current page is switched and as the user performs touch operations.
Further, a gesture operation is an operation formed by the user clicking or sliding on the touch screen of the mobile terminal. After detecting a sliding operation on an application page, the mobile terminal generates a corresponding operation instruction, and the operation instruction corresponds to a predetermined operation event; different gesture operations therefore generate different operation events. The bindings between gesture operations and operation instructions, and between operation instructions and operation events, can be configured in advance and stored in the server corresponding to the application client. In practice, after the touch screen detects a gesture made with a finger or stylus, the instruction issued by the user is determined from the preconfigured correspondence between gestures and operation instructions; for example, when the user slides the page up or down, the page follows the direction of the finger.
Further, an operation event can be regarded as a message generated from an operation instruction. After the operation event is sent to the operating system of the mobile terminal, the operating system generates and runs the task corresponding to that event; a task is therefore the program that needs to run when the operation event occurs. In practice, the first operation event and the second operation event may be generated simultaneously or one after the other, so the multi-point operation task can be regarded as one task comprising both the task generated by the first operation event and the task generated by the second operation event. The specific content and number of multi-point operation tasks do not limit the technical solution of the present disclosure.
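The event flow of S101-S104 can be pictured with a small sketch. The Kotlin below is a minimal, hypothetical illustration only: the names (Region, Gesture, OperationEvent, TwoHandController) and the hit-test callbacks are invented for this example and are not taken from the patent.

```kotlin
enum class Region { FIRST, SECOND }

data class Gesture(val pointerId: Int, val x: Float, val y: Float, val kind: String)
data class OperationEvent(val region: Region, val action: String)

class TwoHandController(
    private val inFirstArea: (Float, Float) -> Boolean,   // S101: hit test for the first control area
    private val inSecondArea: (Float, Float) -> Boolean   // S101: hit test for the second control area
) {
    private val pending = mutableMapOf<Region, OperationEvent>()

    // S102/S103: classify the gesture by control area and generate an operation event.
    fun onGesture(g: Gesture) {
        val region = when {
            inFirstArea(g.x, g.y) -> Region.FIRST
            inSecondArea(g.x, g.y) -> Region.SECOND
            else -> return
        }
        pending[region] = OperationEvent(region, g.kind)
        dispatchIfReady()
    }

    // S104: once both areas have produced an event, run the combined two-hand task.
    private fun dispatchIfReady() {
        val first = pending[Region.FIRST] ?: return
        val second = pending[Region.SECOND] ?: return
        runTask(first, second)
        pending.clear()
    }

    private fun runTask(first: OperationEvent, second: OperationEvent) {
        println("two-hand multi-point task: ${first.action} + ${second.action}")
    }
}
```

A caller holding the region geometry of the current page would construct the controller with the two hit tests and feed it classified gestures; the combined task fires only once both control areas have produced an event, matching the two-hand semantics described above.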
According to the technical scheme provided by the embodiments of the present disclosure, a first control area and a second control area corresponding to the current page displayed on the touch screen are determined; when a first gesture operation on the first control area is detected, a first operation event for the page content in the first control area is generated; when a second gesture operation on the second control area is detected, a second operation event for the page content in the second control area is generated; and a two-handed multi-point operation task is executed on the current page based on the first and second operation events, where the two gesture operations are performed by different hands. By operating the touch screen with both hands, the number and complexity of touch operations can be reduced, complex and varied operation scenes can be handled, the scene applicability of touch operation is improved, and users are given a more diverse touch experience.
In some embodiments, before determining the first and second control areas corresponding to the current page displayed on the touch screen, the method further includes: receiving a control switching instruction for the current page displayed on the touch screen, and switching the current page from a one-handed control mode to a two-handed control mode based on the instruction. The control switching instruction is generated by the user performing a preset operation on the touch screen, and the preset operation includes a long-press gesture.
Specifically, the control modes of the mobile terminal's display screen may include a one-handed mode and a two-handed mode. To let the user choose different modes for different scene pages, the embodiments of the present disclosure provide a way to switch the control mode of the current page. For example, on a page with simple interaction, the user can use the system-default one-handed mode, holding the mobile terminal and operating the page with one hand. When the application enters an interactive page requiring complex operation, a long-press gesture on the current page is received and the page is switched from the one-handed mode to the two-handed mode; the user can then issue multiple instructions in parallel and perform single-point or multi-point touch operations on different control areas, achieving simultaneous multi-point operation with both hands.
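As a rough sketch of this mode switch, assuming an invented ModeSwitcher type and a 500 ms long-press threshold (the patent does not specify one):

```kotlin
enum class Mode { ONE_HANDED, TWO_HANDED }

class ModeSwitcher(private val longPressMillis: Long = 500) {  // threshold is an assumption
    var mode = Mode.ONE_HANDED
        private set
    private var downAt = 0L

    fun onTouchDown(now: Long) { downAt = now }

    // A press held past the threshold acts as the "control switching instruction".
    fun onTouchUp(now: Long) {
        if (now - downAt >= longPressMillis && mode == Mode.ONE_HANDED) {
            mode = Mode.TWO_HANDED
        }
    }
}
```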
In some embodiments, determining the first and second control areas corresponding to the current page displayed on the touch screen includes: when the current page is switched to the two-handed control mode, dividing the content of the current page into the content of the first control area and the content of the second control area according to the state of the current page. The first control area corresponds to a left-hand control area and the second to a right-hand control area, or vice versa; the positions of the two control areas change with the gesture operations.
Specifically, the first and second control areas in the two-handed mode are determined by the scene and state of the current page; in other words, the control areas of the current page may differ between scenes. For example, when the application page contains an upper display window and a lower display window, the upper window may correspond to the first control area and the lower window to the second control area, and a touch operation can be performed on the second control area (the lower window) while another is performed on the first control area (the upper window).
Further, different control areas correspond to different hands, and both hands can manipulate the two control areas simultaneously, with the contacts of the two hands distributed across the different areas. For example, while the left area of the touch screen is slid up and down, other operations (such as delete, select, or close) can be performed on the browsed content in the right area. That is, while the left-hand finger slides the content on the left side of the screen, the right-hand finger can simultaneously operate on the content on the right side, and likewise the left hand can keep sliding while the right hand operates.
Further, the first gesture operation may be performed by the left hand and the second by the right hand, or the reverse. The touch operations of the two hands may each be single-point or multi-point; for example, the left hand performs a single-point operation on the first control area while the right hand performs a multi-point operation on the second control area, or both sides perform multi-point operations at the same time.
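One hedged way to picture the per-hand routing is a hit test that assigns each active contact to a control area. The vertical 50/50 split below is an assumption made for illustration; as noted above, the real areas depend on the page state and can move.

```kotlin
data class Pointer(val id: Int, val x: Float, val y: Float)

// Group every active contact by which half of the screen it falls in,
// so each hand's pointers are handled independently.
fun routePointers(pointers: List<Pointer>, screenWidth: Float): Map<String, List<Pointer>> =
    pointers.groupBy { if (it.x < screenWidth / 2) "left-area" else "right-area" }

fun main() {
    // Left hand scrolls with one finger while the right hand pinches with two.
    val touches = listOf(Pointer(0, 100f, 900f), Pointer(1, 800f, 300f), Pointer(2, 850f, 500f))
    println(routePointers(touches, screenWidth = 1080f))
}
```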
According to the technical scheme provided by the embodiments of the present disclosure, a user can touch-operate the first and second control areas with both hands at the same time: while the left hand controls the display screen, the right hand can also control it, and the two hands correspond to different control areas, forming simultaneous multi-point control of the touch screen. This solves the problem that, in complex control scenes, one hand must continuously repeat operation steps, which increases operation complexity; it helps the user reach the operation target quickly and conveniently and shortens operation time.
The application of the technical solution of the present disclosure in various scenarios is described in detail below with reference to different embodiments. The following embodiments are only examples of some of the scenarios covered by the present disclosure and therefore do not limit its technical solution.
Example one
The first embodiment of the present disclosure addresses an operation scene in which a floating window exists in an application page; the floating window serves as the first control area and the page window below it as the second control area. The two-handed multi-point control scene of the first embodiment is described in detail below with reference to the drawings. Fig. 2 is a schematic diagram of a page containing a floating window according to the first embodiment, and fig. 3 is a schematic diagram of the two-handed multi-point operation of the first embodiment. As shown in figs. 2 and 3, the two-handed multi-point control process involving a floating window mainly includes the following steps:
in some embodiments, when the current page includes a floating window, when a first gesture operation for a first manipulation region is detected, a first operation event corresponding to page contents in the first manipulation region is generated, including: and taking the floating window as a first control area, receiving a click operation aiming at the floating window, wherein the floating window is in a selected state at the moment, and generating a first operation event that the floating window is dragged to a finger sliding position.
Specifically, as shown in fig. 2, when a floating window exists in the page, a click operation on the floating window is received; a dashed frame then appears around the floating-window content in the page, indicating that it is selected. The click on the floating window may be performed with a finger of the user's left hand.
Here, the application page containing a floating window may be, for example, a video application page, a news application page, or a live-streaming application page; the content in the floating window therefore includes, but is not limited to, text content, video content, live content, and the like.
In some embodiments, generating the second operation event for the page content in the second control area when the second gesture operation on the second control area is detected includes: using the page window below the floating window as the second control area and receiving a finger-gathering (pinch-in) operation on the page window, at which point the page window shrinks toward the bottom of the current page; the generated second operation event is closing the page window and returning to the desktop.
Specifically, as shown in fig. 3, while the first gesture operation (clicking the floating window with the left finger) is performed on the first control area, the second gesture operation can be performed on the second control area at the same time: the left finger clicks the floating-window content while the right fingers pinch and slide down the page below the floating window. The second operation event generated here closes the page below the floating window, leaving the current program's page and returning to the desktop. Note that while the right hand pinches the page below the floating window until the application page closes, the left finger can simultaneously drag the floating window.
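A minimal sketch of this first example, using invented helper types: the left finger drags the selected floating window while a two-finger pinch-in over the lower page window closes it. Detecting the "gathering" by a shrinking pointer span is an assumed heuristic, not the patent's stated algorithm.

```kotlin
import kotlin.math.hypot

class FloatingWindowSession(var windowX: Float, var windowY: Float) {
    var pageClosed = false
        private set
    private var lastPinchSpan = Float.NaN

    // First control area: the selected floating window follows the left finger.
    fun onDrag(x: Float, y: Float) { windowX = x; windowY = y }

    // Second control area: a pinch-in on the lower page window closes the page.
    fun onTwoFingerMove(x1: Float, y1: Float, x2: Float, y2: Float) {
        val span = hypot(x1 - x2, y1 - y2)
        if (!lastPinchSpan.isNaN() && span < lastPinchSpan * 0.5f) {
            pageClosed = true   // shrink toward the page bottom, return to the desktop
        }
        lastPinchSpan = span
    }
}
```

Both callbacks can fire within the same frame, which is what lets the drag and the close proceed in parallel as the example describes.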
Example two
The second embodiment of the present disclosure addresses an operation scene in which a floating window exists in an application page; here the floating window is the first control area and an application icon at the bottom of the current page is the second control area. The two-handed multi-point control scene of the second embodiment is described in detail below with reference to the drawings. Fig. 4 is a schematic diagram of the two-handed multi-point operation of the second embodiment. As shown in fig. 4, the two-handed multi-point control process involving a floating window mainly includes the following steps:
specifically, as shown in fig. 4, when a floating window exists in a page, a second gesture operation is received for an application icon at the bottom of the current page, for example, the application icon may be clicked, and an operation event for opening the application icon is generated, at this time, the content of the floating window and the application opened by the second gesture operation do not belong to the same program, and the content of the floating window may also be browsed or operated while the second gesture operation is performed on the application icon.
Example three
The third embodiment of the present disclosure addresses an operation scene in which a floating window exists in an application page; the floating window serves as the first control area and the page window below it as the second control area. The two-handed multi-point control scene of the third embodiment is described in detail below with reference to the drawings. Fig. 5 is a schematic diagram of the two-handed multi-point operation of the third embodiment. As shown in fig. 5, the two-handed multi-point control process involving a floating window mainly includes the following steps:
specifically, as shown in fig. 5, when there is a floating window in the page, a click operation for the floating window is received, at this time, a dashed-line frame appears around the content of the floating window in the page, which indicates that the content of the floating window is in a selected state, and the click operation for the floating window may be an operation performed by a user using a left finger.
Here, while performing the first gesture operation (the click) on the floating window, the user can perform a second gesture operation on the page window below it with the other hand, for example pulling the page below the floating window apart with two fingers in opposite directions; as the fingers slide apart, the application page gradually dims until the page of the program to which the floating content belongs gradually appears.
Further, while the user performs this reverse-sliding operation on the application page in the second control area, the other hand can keep clicking the floating-window content. As the reverse slide proceeds, the application page below the floating window is gradually hidden, and as it hides, the page of the program to which the floating window belongs is gradually displayed; once the two sliding fingers are released, the page switches to the program page to which the floating content belongs. Note that the floating-window content may correspond to the program's home page, in which case releasing the two fingers switches the page to that home page.
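The following sketch of this third example is illustrative only; the RevealTransition name, the progress model, and the 0.6 commit threshold are all assumptions rather than details given in the patent.

```kotlin
class RevealTransition(private val commitThreshold: Float = 0.6f) {
    var progress = 0f   // 0 = current page fully shown, 1 = floating content's app page shown
        private set

    // Pulling the two fingers apart advances the cross-fade: the current page
    // dims as the page of the floating content's program is revealed.
    fun onReversePull(spanGrowth: Float) {
        progress = (progress + spanGrowth).coerceIn(0f, 1f)
    }

    // Releasing past the threshold commits the switch; otherwise stay on the page.
    fun onRelease(): String =
        if (progress >= commitThreshold) "switch-to-floating-app-page" else "stay"
}
```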
Example four
The fourth embodiment of the present disclosure addresses a scene in which a selection operation exists in an application page; the right area of the page serves as the first control area and the left area as the second control area. The two-handed multi-point control scene of the fourth embodiment is described in detail below with reference to the drawings. Fig. 6 is a schematic diagram of a page containing a selection scene according to the fourth embodiment, and fig. 7 is a schematic diagram of the two-handed multi-point operation of the fourth embodiment. As shown in figs. 6 and 7, the two-handed multi-point control process in a selection scene mainly includes the following steps:
in some embodiments, when the current page includes a selection operation scene, when a first gesture operation for a first manipulation area is detected, generating a first operation event corresponding to page content in the first manipulation area, including: and taking the right area of the current page as a first control area, receiving a sliding operation aiming at the right area, wherein the content in the right area moves along the finger sliding direction at the moment, and generating a first operation event that the content in the right area slides along the finger.
Specifically, as shown in fig. 6, the application page is the page of a chat program, and the current page contains selection items, i.e., a selection scene. For example, selection can be performed by clicking the circular button to the left of each message; in practice, clicking these buttons may be done with a finger of the user's left hand.
Further, as shown in fig. 7, the right area of the current page is used as the first control area, and a right-hand finger performs a sliding gesture on it; the page corresponding to the first control area then moves along the sliding direction, with the movement step equal to the sliding distance. The operation event generated by the right finger's slide makes the page content in the right area follow the finger. While the right finger slides the first control area, the other hand can perform selection operations on the second control area (the left area of the page).
In some embodiments, generating the second operation event for the page content in the second control area when the second gesture operation on the second control area is detected includes: using the left area of the current page as the second control area and receiving a click operation on the left area, at which point the clicked content in the left area is in a selected state; the generated second operation event is selecting the clicked content in the left area.
Specifically, as shown in fig. 7, while the user slides the right area of the page up and down with a right-hand finger, the left hand can click the circular buttons in the left area. The user can thus scroll the right display area while selecting messages in the left display area one after another along the scroll direction, selecting multiple target messages: while the right finger slides up and down, the left finger performs selection at the same time.
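A small sketch of this fourth example under assumed types: the right-hand finger scrolls the list while the left-hand finger toggles selection, so both proceed in parallel.

```kotlin
class SelectableList(private val items: List<String>) {
    var scrollOffset = 0f
        private set
    val selected = linkedSetOf<Int>()

    // First control area (right): the list content follows the sliding finger.
    fun onRightAreaScroll(dy: Float) { scrollOffset += dy }

    // Second control area (left): tapping a message's circular button toggles it.
    fun onLeftAreaTap(index: Int) {
        if (!selected.add(index)) selected.remove(index)
    }
}

fun main() {
    val list = SelectableList(listOf("msg0", "msg1", "msg2", "msg3"))
    list.onLeftAreaTap(0); list.onRightAreaScroll(-120f); list.onLeftAreaTap(2)
    println("offset=${list.scrollOffset}, selected=${list.selected}")
}
```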
Note that the positions and sizes of the left and right display areas can be adjusted to the actual application scenario; in the fourth embodiment of the present disclosure, the area inside the dashed box on the right of the current page (see fig. 7) is used as the right display area, and the display to its left as the left display area.
In the embodiments of the present disclosure, embodiments one to three provide touch schemes for simultaneous multi-point operation with a single left-hand finger and two right-hand fingers, and embodiment four provides a scheme with a single finger on each hand; these schemes are given for different touch scenes. It should be understood that the disclosed technique of operating the display screen by simultaneous multi-point touch is not limited to single-finger plus two-finger or single-finger plus single-finger combinations: the two-handed touch pattern can be configured to the needs of the scene, for example two-finger operation on both control areas, or operation with more than two fingers (such as three-finger or other multi-finger gestures). Single-finger and two-finger operation of the control areas therefore do not limit the technical solution of the present disclosure, and any other form of simultaneous two-handed multi-point touch on the display screen also applies to this scheme.
Note that during two-handed multi-point operation of the screen, the fingers of the two hands cannot operate in the same area; the operation areas of the two hands must be separated into clearly distinct distance or position regions, such as left and right screen areas, upper and lower screen areas, or user-defined areas.
According to the technical scheme provided by the embodiments of the present disclosure, while one hand performs a touch operation on the display screen the other hand can operate at the same time, with the two hands controlling different areas, turning the original one-handed operation into a two-handed one. In complex page-operation scenes, simultaneous two-handed multi-point touch greatly reduces the number and complexity of operations, simplifies the touch workflow, extends to complex and varied scenes, satisfies users' need to operate in such scenes, and improves the scene applicability of touch operation.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods. For details not disclosed in the embodiments of the apparatus of the present disclosure, refer to the embodiments of the method of the present disclosure.
Fig. 8 is a schematic structural diagram of a two-handed operation device of a touch screen provided in an embodiment of the present disclosure. As shown in fig. 8, the two-hand operation device of the touch screen includes:
a determining module 801 configured to determine a first manipulation area and a second manipulation area corresponding to a current page displayed in a touch screen;
the first operation module 802 is configured to generate a first operation event corresponding to page content in a first manipulation area when a first gesture operation for the first manipulation area is detected;
the second operation module 803 is configured to generate a second operation event corresponding to the page content in the second manipulation region when a second gesture operation for the second manipulation region is detected;
and the executing module 804 is configured to execute a two-hand multi-point operation task on the current page based on a first operation event and a second operation event, wherein the first gesture operation and the second gesture operation are gesture operations generated by different hands.
In some embodiments, before the first and second control areas corresponding to the current page displayed on the touch screen are determined, the switching module 805 in fig. 8 receives a control switching instruction for the current page and switches the page from the one-handed control mode to the two-handed control mode based on that instruction; the control switching instruction is generated by the user performing a preset operation on the touch screen, the preset operation including a long-press gesture.
In some embodiments, when the current page is switched to the two-handed control mode, the determining module 801 in fig. 8 divides the content of the current page into the content of the first control area and the content of the second control area according to the state of the page; the first control area corresponds to a left-hand control area and the second to a right-hand control area, or vice versa, and the positions of both areas change with the gesture operations.
In some embodiments, when the current page contains a floating window, the first operation module 802 in fig. 8 uses the floating window as the first control area and receives a click operation on it, at which point the floating window is selected, and generates a first operation event that drags the floating window to the finger's sliding position.
In some embodiments, the second operation module 803 in fig. 8 uses the page window below the floating window as the second control area and receives a pinch-in operation on it, at which point the page window shrinks toward the bottom of the current page, and generates a second operation event that closes the page window and returns to the desktop.
In some embodiments, when the current page contains a selection scene, the first operation module 802 in fig. 8 uses the right area of the current page as the first control area and receives a sliding operation on it, at which point the content in the right area moves along the finger's sliding direction, and generates a first operation event that makes the content in the right area slide with the finger.
In some embodiments, the second operation module 803 in fig. 8 uses the left area of the current page as the second control area and receives a click operation on it, at which point the clicked content in the left area is selected, and generates a second operation event that selects the clicked content in the left area.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present disclosure.
Fig. 9 is a schematic structural diagram of an electronic device 9 provided in the embodiment of the present disclosure. As shown in fig. 9, the electronic apparatus 9 of this embodiment includes: a processor 901, a memory 902 and a computer program 903 stored in the memory 902 and operable on the processor 901. The steps in the various method embodiments described above are implemented when the processor 901 executes the computer program 903. Alternatively, the processor 901 implements the functions of each module/unit in each apparatus embodiment described above when executing the computer program 903.
Illustratively, the computer program 903 may be divided into one or more modules/units, which are stored in the memory 902 and executed by the processor 901 to complete the present disclosure. One or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 903 in the electronic device 9.
The electronic device 9 may be a desktop computer, a notebook, a palm computer, a cloud server, or another electronic device. The electronic device 9 may include, but is not limited to, a processor 901 and a memory 902. Those skilled in the art will appreciate that fig. 9 is merely an example of the electronic device 9 and does not limit it; the device may include more or fewer components than shown, combine certain components, or use different components; for example, the electronic device may also include input-output devices, network access devices, buses, and so on.
The Processor 901 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The storage 902 may be an internal storage unit of the electronic device 9, for example, a hard disk or a memory of the electronic device 9. The memory 902 may also be an external storage device of the electronic device 9, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) and the like provided on the electronic device 9. Further, the memory 902 may also include both an internal storage unit of the electronic device 9 and an external storage device. The memory 902 is used for storing computer programs and other programs and data required by the electronic device. The memory 902 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
In the embodiments provided by the present disclosure, it should be understood that the disclosed apparatus/computer device and method may be implemented in other ways. For example, the apparatus/computer device embodiments described above are merely illustrative: the division into modules or units is only a division by logical function, and other divisions are possible in actual implementation; multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the mutual or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through interfaces, devices, or units, and may be electrical, mechanical, or of another form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented as software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on this understanding, the present disclosure may implement all or part of the flow of the above method embodiments by instructing the relevant hardware through a computer program; the computer program may be stored in a computer-readable storage medium, and when executed by a processor it implements the steps of the above method embodiments. The computer program may comprise computer program code in source-code, object-code, executable-file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, computer memory, read-only memory (ROM), random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. Note that what the computer-readable medium may contain can be adjusted as required by legislation and patent practice in each jurisdiction; for example, in some jurisdictions the computer-readable medium does not include electrical carrier signals or telecommunications signals.
The above examples are only intended to illustrate the technical solutions of the present disclosure, not to limit them; although the present disclosure has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present disclosure, and are intended to be included within the scope of the present disclosure.

Claims (10)

1. A two-handed operation method for a touch screen, characterized by comprising the following steps:
determining a first control area and a second control area corresponding to the current page displayed on a touch screen;
when a first gesture operation on the first control area is detected, generating a first operation event for the page content in the first control area;
when a second gesture operation on the second control area is detected, generating a second operation event for the page content in the second control area;
and executing a two-handed multi-point operation task on the current page based on the first operation event and the second operation event, wherein the first gesture operation and the second gesture operation are gesture operations performed by different hands.
2. The method of claim 1, further comprising, before determining the first control area and the second control area corresponding to the current page displayed on the touch screen:
receiving a control switching instruction for the current page displayed on the touch screen, and switching the current page from a one-handed control mode to a two-handed control mode based on the control switching instruction;
wherein the control switching instruction is an instruction generated by a user performing a preset operation on the touch screen, the preset operation comprising a long-press gesture operation.
3. The method of claim 1, wherein determining the first control area and the second control area corresponding to the current page displayed on the touch screen comprises:
when the current page is switched to a two-hand control mode, dividing the content of the current page into content of the first control area and content of the second control area according to the state of the current page;
wherein the first control area corresponds to a left-hand control area and the second control area corresponds to a right-hand control area, or the first control area corresponds to a right-hand control area and the second control area corresponds to a left-hand control area; and the positions of the first control area and the second control area change with the gesture operations.
4. The method of claim 1, wherein, when the current page comprises a floating window, generating the first operation event corresponding to the page content in the first control area when the first gesture operation on the first control area is detected comprises:
taking the floating window as the first control area and receiving a click operation on the floating window, whereupon the floating window enters a selected state, and the generated first operation event is dragging the floating window to the position to which the finger slides.
5. The method of claim 4, wherein generating the second operation event corresponding to the page content in the second control area when the second gesture operation on the second control area is detected comprises:
taking the page window below the floating window as the second control area and receiving a finger-gathering (pinch) operation on the page window, whereupon the page window shrinks toward the bottom of the current page, and the generated second operation event is closing the page window and returning to the desktop.
6. The method of claim 1, wherein, when the current page comprises a selection operation scenario, generating the first operation event corresponding to the page content in the first control area when the first gesture operation on the first control area is detected comprises:
taking the right area of the current page as the first control area and receiving a sliding operation on the right area, whereupon the content in the right area moves in the direction of the finger slide, and the generated first operation event is scrolling the content in the right area with the finger.
7. The method of claim 6, wherein generating the second operation event corresponding to the page content in the second control area when the second gesture operation on the second control area is detected comprises:
taking the left area of the current page as the second control area and receiving a click operation on the left area, whereupon the clicked content in the left area enters a selected state, and the generated second operation event is selecting the clicked content in the left area.
8. A two-handed operation device for a touch screen, comprising:
a determining module configured to determine a first control area and a second control area corresponding to a current page displayed on the touch screen;
a first operation module configured to generate a first operation event corresponding to the page content in the first control area when a first gesture operation on the first control area is detected;
a second operation module configured to generate a second operation event corresponding to the page content in the second control area when a second gesture operation on the second control area is detected; and
an execution module configured to execute a two-handed multi-point operation task on the current page based on the first operation event and the second operation event, wherein the first gesture operation and the second gesture operation are gesture operations performed by different hands.
9. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 7 when executing the program.
10. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the method of any one of claims 1 to 7.
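
The claim groups above read more concretely as code, so a minimal plain-Kotlin sketch per group follows; all identifiers, thresholds, and coordinates in these sketches are illustrative assumptions, not taken from the patent. First, the event flow of claim 1: one gesture per control area yields one operation event per area, and the combined task runs only when both hands contributed an event.

```kotlin
// Sketch of the event flow in claim 1. TwoHandDispatcher, the mid-screen
// split, and the sample coordinates are assumptions for illustration.

data class Touch(val pointerId: Int, val x: Float, val y: Float)

enum class ControlArea { FIRST, SECOND }

data class OperationEvent(val area: ControlArea, val touch: Touch)

class TwoHandDispatcher(private val areaOf: (Touch) -> ControlArea) {
    /** Map each hand's gesture to an operation event for its control area. */
    fun toEvents(touches: List<Touch>): List<OperationEvent> =
        touches.map { OperationEvent(areaOf(it), it) }

    /** Claim 1's final step: the two-hand multi-point task executes only
     *  when both areas produced an event, i.e. both hands are active. */
    fun canExecute(events: List<OperationEvent>): Boolean =
        events.map { it.area }.toSet() == setOf(ControlArea.FIRST, ControlArea.SECOND)
}

fun main() {
    val dispatcher = TwoHandDispatcher { t ->
        if (t.x < 540f) ControlArea.FIRST else ControlArea.SECOND // assumed split
    }
    val events = dispatcher.toEvents(listOf(Touch(0, 200f, 900f), Touch(1, 880f, 400f)))
    println(dispatcher.canExecute(events)) // true: run the two-hand task
}
```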
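
Claim 2's mode switch can be sketched the same way: a preset long-press gesture generates the control switching instruction. The 500 ms threshold below is an assumed value, not one stated in the patent.

```kotlin
// Sketch of the one-hand-to-two-hand mode switch in claim 2.

enum class ControlMode { ONE_HAND, TWO_HAND }

class ModeSwitcher(private val longPressMillis: Long = 500L) { // assumed threshold
    var mode = ControlMode.ONE_HAND
        private set

    /** Called when a finger lifts; a press held long enough counts as the
     *  preset long-press gesture and applies the switching instruction. */
    fun onPress(downMillis: Long, upMillis: Long) {
        if (upMillis - downMillis >= longPressMillis) mode = ControlMode.TWO_HAND
    }
}

fun main() {
    val switcher = ModeSwitcher()
    switcher.onPress(downMillis = 0L, upMillis = 650L) // a 650 ms press
    println(switcher.mode) // TWO_HAND
}
```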
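
Claim 3 adds that the two areas are left/right divisions whose positions follow the gestures. One plausible reading, sketched below under stated assumptions (initial mid-screen split, areas recentered on the latest gesture), is:

```kotlin
// Sketch of the area division in claim 3; the recentering rule is an
// interpretive assumption, not a rule given in the patent text.

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

class AreaDivider(private val pageWidth: Float, private val pageHeight: Float) {
    var firstArea = Rect(0f, 0f, pageWidth / 2f, pageHeight)        // left hand
        private set
    var secondArea = Rect(pageWidth / 2f, 0f, pageWidth, pageHeight) // right hand
        private set

    /** Recenter an area on the latest gesture position so that the control
     *  areas move with the hands rather than staying fixed. */
    fun onGesture(x: Float, y: Float) {
        val half = pageWidth / 4f
        val moved = Rect(
            (x - half).coerceAtLeast(0f), 0f,
            (x + half).coerceAtMost(pageWidth), pageHeight
        )
        if (firstArea.contains(x, y)) firstArea = moved else secondArea = moved
    }
}

fun main() {
    val divider = AreaDivider(pageWidth = 1080f, pageHeight = 2340f)
    divider.onGesture(x = 300f, y = 1200f) // left-hand gesture shifts the first area
    println(divider.firstArea)
}
```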
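
The floating-window scenario of claims 4 and 5 combines a click-and-drag with one hand and a pinch-to-close with the other. In the sketch, FloatingWindow, PageWindow, and the 0.5 close threshold are all assumptions.

```kotlin
// Sketch of the floating-window scenario in claims 4 and 5.

data class Point(val x: Float, val y: Float)

class FloatingWindow(var origin: Point, var selected: Boolean = false) {
    /** Claim 4: a click selects the window; it then follows the sliding finger. */
    fun onClick() { selected = true }
    fun onDrag(fingerAt: Point) { if (selected) origin = fingerAt }
}

class PageWindow(var scale: Float = 1f, var closed: Boolean = false) {
    /** Claim 5: a finger-gathering (pinch) operation shrinks the window toward
     *  the bottom of the page; past an assumed 0.5 ratio it closes to the desktop. */
    fun onPinch(ratio: Float) {
        scale = ratio.coerceIn(0f, 1f)
        if (scale < 0.5f) closed = true // second operation event: close + desktop
    }
}

fun main() {
    val floating = FloatingWindow(origin = Point(100f, 100f))
    val page = PageWindow()

    floating.onClick()                 // first hand: select the floating window
    floating.onDrag(Point(700f, 300f)) // ...and drag it to where the finger slides
    page.onPinch(0.4f)                 // second hand: pinch the page window below

    println("floating at ${floating.origin}, page closed = ${page.closed}")
}
```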
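
Finally, the selection scenario of claims 6 and 7: one hand scrolls the right area while the other selects items in the left area. Item names and the row height are assumptions.

```kotlin
// Sketch of the selection scenario in claims 6 and 7.

class RightArea(private val items: List<String>) {
    var scrollOffset = 0f
        private set

    /** Claim 6: content in the right area follows the finger slide. */
    fun onSlide(deltaY: Float) { scrollOffset += deltaY }

    fun visibleItem(rowHeight: Float): String {
        val index = (scrollOffset / rowHeight).toInt().coerceIn(0, items.lastIndex)
        return items[index]
    }
}

class LeftArea {
    val selected = mutableSetOf<String>()

    /** Claim 7: a click in the left area puts the clicked content in a selected state. */
    fun onClick(content: String) { selected += content }
}

fun main() {
    val right = RightArea(listOf("photo-1", "photo-2", "photo-3"))
    val left = LeftArea()

    right.onSlide(deltaY = 120f) // one hand scrolls the right area
    left.onClick("photo-2")      // the other hand selects in the left area
    println("showing ${right.visibleItem(rowHeight = 100f)}; selected = ${left.selected}")
}
```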
CN202111204723.XA 2021-10-15 2021-10-15 Double-hand operation method and device of touch screen, electronic equipment and storage medium Active CN113986106B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111204723.XA CN113986106B (en) 2021-10-15 2021-10-15 Double-hand operation method and device of touch screen, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111204723.XA CN113986106B (en) 2021-10-15 2021-10-15 Double-hand operation method and device of touch screen, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113986106A true CN113986106A (en) 2022-01-28
CN113986106B CN113986106B (en) 2024-08-30

Family

ID=79738860

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111204723.XA Active CN113986106B (en) 2021-10-15 2021-10-15 Double-hand operation method and device of touch screen, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113986106B (en)


Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130047125A1 (en) * 2011-08-19 2013-02-21 International Business Machines Corporation Touchscreen gestures for virtual bookmarking of pages
JP2015106256A (en) * 2013-11-29 2015-06-08 コニカミノルタ株式会社 Information processor, method for controlling information processor, and program for allowing computer to execute the same method
CN104035716A (en) * 2014-06-26 2014-09-10 苏宁云商集团股份有限公司 Touch panel operation method and device and terminal
WO2016123893A1 (en) * 2015-02-03 2016-08-11 中兴通讯股份有限公司 Photographing method, device and terminal
WO2016145832A1 (en) * 2015-08-04 2016-09-22 中兴通讯股份有限公司 Method of operating terminal and device utilizing same
CN105677215A (en) * 2015-12-30 2016-06-15 小米科技有限责任公司 Application control method and apparatus
WO2017167123A1 (en) * 2016-03-29 2017-10-05 北京金山安全软件有限公司 Method and apparatus for displaying resource entrance on mobile device, and mobile device
CN107124508A (en) * 2017-04-18 2017-09-01 北京小米移动软件有限公司 Location regulation method, device and the terminal of suspension control, readable storage medium storing program for executing
CN111149086A (en) * 2017-09-30 2020-05-12 华为技术有限公司 Method for editing main screen, graphical user interface and electronic equipment
US20200233568A1 (en) * 2017-09-30 2020-07-23 Huawei Technologies Co., Ltd. Home screen editing method, graphical user interface, and electronic device
CN110275658A (en) * 2019-06-03 2019-09-24 Oppo广东移动通信有限公司 Display control method, device, mobile terminal and storage medium
WO2021043223A1 (en) * 2019-09-06 2021-03-11 华为技术有限公司 Split-screen display method and electronic device
WO2021057337A1 (en) * 2019-09-27 2021-04-01 维沃移动通信有限公司 Operation method and electronic device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
吴金铎; 李宏汀; 王春慧; 徐源; 庄梦迪; 陈瑞: "Usability study of touch-screen gesture operation on mobile devices for middle-aged and elderly users", Chinese Journal of Ergonomics, no. 02, 20 April 2016 (2016-04-20) *
廖虎雄; 老松杨; 邵宏韬; 刘钢: "Applied research on two-handed interactive command techniques based on touch screens", Journal of National University of Defense Technology, no. 04, 28 August 2011 (2011-08-28) *
涂俊; 鲍海; 李晓娟: "Research on web interface design trends based on multi-touch operation", Public Communication of Science & Technology, no. 02, 23 January 2014 (2014-01-23) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114840298A (en) * 2022-07-04 2022-08-02 荣耀终端有限公司 Suspension window opening method and electronic equipment
CN114840298B (en) * 2022-07-04 2023-04-18 荣耀终端有限公司 Suspended window opening method and electronic equipment

Also Published As

Publication number Publication date
CN113986106B (en) 2024-08-30

Similar Documents

Publication Publication Date Title
US9733815B2 (en) Split-screen display method and apparatus, and electronic device thereof
CN103067569B (en) Method and device of multi-window displaying of smart phone
CN112148170B (en) Desktop element adjusting method and device and electronic equipment
CN104238949A (en) Split-screen displaying method and device
EP2521025B1 (en) Component display processing method and user device
CN107704157B (en) Multi-screen interface operation method and device and storage medium
CN107479818B (en) Information interaction method and mobile terminal
CN112099707A (en) Display method and device and electronic equipment
CN113703624A (en) Screen splitting method and device and electronic equipment
WO2022156666A1 (en) Icon arrangement method and apparatus, and electronic device and readable storage medium
CN106843735A (en) A kind of terminal control method and mobile terminal
WO2023045927A1 (en) Object moving method and electronic device
CN113342232A (en) Icon generation method and device, electronic equipment and readable storage medium
CN111580905A (en) Negative one-screen card management method, terminal and computer readable storage medium
CN113986106B (en) Double-hand operation method and device of touch screen, electronic equipment and storage medium
CN103383621A (en) Method and device for view switching
CN114415886A (en) Application icon management method and electronic equipment
CN114116098A (en) Application icon management method and device, electronic equipment and storage medium
CN111796746B (en) Volume adjusting method, volume adjusting device and electronic equipment
CN113485625A (en) Electronic equipment response method and device and electronic equipment
CN109739422B (en) Window control method, device and equipment
US9552132B2 (en) Application program preview interface and operation method thereof
CN115981531A (en) Page control method and device and electronic equipment
CN112230817B (en) Link page display method and device and electronic equipment
CN114442881A (en) Information display method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
Effective date of registration: 20230117
Address after: 518054 cable information transmission building 25f2504, no.3369 Binhai Avenue, Haizhu community, Yuehai street, Nanshan District, Shenzhen City, Guangdong Province
Applicant after: Shenzhen Xumi yuntu Space Technology Co.,Ltd.
Address before: No.103, no.1003, Nanxin Road, Nanshan community, Nanshan street, Nanshan District, Shenzhen City, Guangdong Province
Applicant before: Shenzhen Jizhi Digital Technology Co.,Ltd.
GR01 Patent grant