WO2022141286A1 - Application switching method and apparatus, electronic device, and machine-readable storage medium - Google Patents

Application switching method and apparatus, electronic device, and machine-readable storage medium

Info

Publication number
WO2022141286A1
Authority
WO
WIPO (PCT)
Prior art keywords
operating system
gesture
information
target gesture
touch
Prior art date
Application number
PCT/CN2020/141770
Other languages
English (en)
French (fr)
Inventor
李俊峰
董遇生
Original Assignee
安徽鸿程光电有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 安徽鸿程光电有限公司
Publication of WO2022141286A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/48Program initiating; Program switching, e.g. by interrupt

Definitions

  • the present disclosure relates to the technical field of electronic devices, and in particular, to an application switching method, an apparatus, an electronic device, and a machine-readable storage medium.
  • Application switching is a relatively high-frequency operation for users, yet on electronic devices with dual operating systems the operation is cumbersome. For example, when a PPT presentation is playing in full screen and the user wishes to switch to another application, the full-screen application blocks the taskbar, so the user cannot switch by clicking a thumbnail window in the taskbar and can only switch with keyboard shortcuts; without an external keyboard, the user has to exit full screen before switching applications, which is very cumbersome and inconvenient.
  • the purpose of the embodiments of the present disclosure is to provide an application switching method, apparatus, electronic device, and machine-readable storage medium, so as to at least solve the problem that switching applications on a dual-operating-system electronic device displayed in full screen is very cumbersome and inconvenient.
  • an application switching method is provided, which is applied to an electronic device.
  • the electronic device is provided with a first operating system and a second operating system.
  • the method may include:
  • when the first application is in an open state, the first operating system receives a first input and recognizes gesture information in the first input;
  • when the gesture information is a target gesture, the first operating system sends operation information corresponding to the target gesture to the second operating system;
  • when receiving the operation information, the second operating system switches the first application based on the operation information.
  • before the first operating system sends the operation information corresponding to the target gesture to the second operating system, the method further includes:
  • the first operating system acquires touch point information, and determines whether the touch point information satisfies the target gesture condition;
  • when the touch point information satisfies the target gesture condition, the first operating system determines the gesture information in the first input to be the target gesture.
  • judging whether the touch point information satisfies the target gesture condition includes: judging whether the number of touch points is equal to a preset number; judging whether a first time interval between initial touch moments of the preset number of touch points is less than a first preset time; and judging whether distances between the preset number of touch points are less than a preset distance.
  • when the touch point information satisfies the target gesture condition, the first operating system determining the gesture information in the first input to be the target gesture is specifically:
  • when the number of touch points is equal to the preset number, the time interval is less than the preset time, and the distance is less than the preset distance, the first operating system determines the gesture information to be the target gesture.
  • judging whether the touch point information satisfies the target gesture condition also includes: judging whether a second time interval between the initial touch moment and the end moment of each of the preset number of touch points is less than a second preset time; and judging whether the movement direction of the preset number of touch points is movement in a gathering direction.
  • when the gesture information is the target gesture, the first operating system sending the operation information corresponding to the target gesture to the second operating system includes:
  • the first operating system matches the application switching shortcut key instruction corresponding to the target gesture;
  • the application switching shortcut key instruction is sent to the second operating system.
  • the method also includes: deleting handwriting information, where the handwriting information is formed by the second operating system based on the gesture information in a writing mode.
  • an application switching apparatus may include:
  • the first operating system is configured to receive the first input and recognize the gesture information in the first input when the first application is in an open state, and, when the gesture information is a target gesture, to send the operation information corresponding to the target gesture to the second operating system;
  • the second operating system is configured to switch the first application based on the operation information when the operation information is received.
  • the first operating system includes:
  • the touch point acquisition module is used to acquire touch point information
  • a judgment module for judging whether the touch point information satisfies the target gesture condition
  • the target gesture determination module is configured to determine the gesture information in the first input as the target gesture when the touch point information satisfies the target gesture condition.
  • the judgment module includes:
  • a first judging unit for judging whether the number of touch points is equal to a preset number
  • a second judging unit configured to judge whether the first time interval between the initial touch points of the preset number of touch points is less than the first preset time
  • the third determination unit is configured to determine whether the distance between the preset number of touch points is less than the preset distance.
  • the target gesture determination module is specifically configured to: when the number of touch points is equal to the preset number, the time interval is less than the preset time, and the distance is less than the preset distance, determine the gesture information to be the target gesture.
  • the judgment module also includes:
  • a fourth judging unit configured to judge whether the second time interval between the initial touch time and the cut-off time of each touch point in the preset number of touch points is less than the second preset time
  • the fifth judgment unit is used for judging whether the movement direction of the preset number of touch points is the movement in the gathering direction.
  • the first operating system also includes:
  • a matching module configured to match the application switching shortcut key instruction corresponding to the target gesture when the gesture information is the target gesture
  • the sending module is used for sending the application switching shortcut key instruction to the second operating system.
  • the first operating system also includes:
  • the deletion module is used for deleting the handwriting information, and the handwriting information is formed by the second operating system based on the gesture information in the writing mode.
  • an electronic device is provided, and the electronic device may include:
  • a processor;
  • a memory for storing instructions executable by the processor;
  • the processor is configured to execute the instruction to implement the application switching method in any one of the above technical solutions.
  • a machine-readable storage medium is provided; when instructions in the machine-readable storage medium are executed by a processor of an information processing apparatus or a server, the information processing apparatus or the server is enabled to implement the application switching method of any one of the foregoing technical solutions.
  • the gesture information in the first input is recognized by the first operating system, and when the gesture information is the target gesture, the first operating system sends the operation information corresponding to the target gesture to the second operating system; when receiving the operation information, the second operating system switches the first application based on the operation information. In this way, the application displayed on a dual-operating-system electronic device can be switched using gesture information, so that even without an external input device the user can switch applications quickly, which improves the user's experience of switching applications on a dual-operating-system electronic device.
  • FIG. 1 is a flowchart of an application switching method according to an exemplary embodiment
  • FIG. 2 is a schematic structural diagram of an application switching device according to an exemplary embodiment
  • FIG. 3 is a schematic structural diagram of an electronic device according to an exemplary embodiment
  • FIG. 4 is a schematic diagram of a hardware structure of an electronic device according to an exemplary embodiment.
  • the present disclosure provides an application switching method for an electronic device with dual systems; the electronic device may be a computer, a tablet device, an interactive display device, or the like, and is provided with a first operating system and a second operating system.
  • for a dual-system electronic device, the prior art currently offers no convenient and quick way to switch applications, especially when the device is displaying in full screen and has no external input device.
  • the method of the present disclosure can switch the applications displayed by a dual-operating-system electronic device using gesture information alone.
  • an application switching method is provided, which is applied to an electronic device.
  • the method may include the following steps:
  • Step 100: when the first application is in an open state, the first operating system receives the first input and recognizes the gesture information in the first input;
  • Step 300: when the gesture information is the target gesture, the first operating system sends the operation information corresponding to the target gesture to the second operating system;
  • Step 400: when receiving the operation information, the second operating system switches the first application based on the operation information.
  • in the above embodiment, on a dual-system device, the first operating system recognizes the gesture information in the first input, and when the gesture information is the target gesture, the first operating system sends the operation information corresponding to the target gesture to the second operating system; when receiving the operation information, the second operating system switches the application.
  • in this way, the application displayed on a dual-operating-system electronic device can be switched using gesture information, so that even without an external input device the user can switch applications quickly, which improves the user's experience of switching applications on a dual-operating-system electronic device.
  • Step 100 is first introduced.
  • the first operating system receives the first input and recognizes gesture information in the first input.
  • in this step, when the first application is in an open or running state, the user provides the first input to the electronic device; the first operating system of the electronic device receives the first input through an input module and recognizes the gesture information in the first input.
  • for example, the first input may be a touch input or a contactless input.
  • when it is a touch input, Step 100 may specifically be: after receiving the first input, the touch screen recognizes the touch information of the first input detected by the touch component, so as to obtain the gesture information in the first input.
  • the touch-type input is not limited to the above; it may also be an easily performed input such as shaking, flipping, or rotating the electronic device.
  • when it is a contactless input, Step 100 may specifically be: a camera on the electronic device captures a gesture image and recognizes the gesture information in the image, so as to obtain the gesture information in the first input.
  • for example, the first application may be an application running under the second operating system; whether the first application is displayed in full screen or blocks the task switching control of the second operating system, the first operating system recognizes the first input and the gesture information in it.
  • the first operating system may be a Linux system, a Windows operating system, a UNIX operating system, an Apple operating system, or an Android operating system;
  • the second operating system may be a Linux system, a Windows operating system, a UNIX operating system, an Apple operating system, or an Android operating system.
  • Step 300 is introduced next.
  • in Step 300, when the gesture information is the target gesture, the first operating system sends the operation information corresponding to the target gesture to the second operating system.
  • in this step, the gesture information recognized in Step 100 is evaluated; when it is determined to be the target gesture, the first operating system sends the operation information corresponding to the target gesture to the second operating system.
  • the target gesture may be a grab gesture, a flip gesture, a swing gesture, a flick gesture, or another gesture that the electronic device can recognize easily and unambiguously.
  • the relationship between the target gesture and the operation information corresponding to the electronic device is pre-mapped in the first operating system.
  • the operation information may be instruction manipulation information of an input device such as keyboard instruction information, mouse instruction information, and the like.
  • when the target gesture is a grab gesture, Step 300 may specifically be: the mapping between the grab gesture and the corresponding operation information of the electronic device is established in advance; when the gesture information is a grab gesture, the first operating system matches the operation information corresponding to the grab gesture and sends it to the second operating system through the sending module.
  • finally, Step 400 is introduced.
  • in Step 400, when the second operating system receives the operation information, it switches the application based on the operation information.
  • in this step, the second operating system is mainly used to support the development and running of application software; when it receives the operation information, it correspondingly performs the application switching operation.
  • for a more intuitive understanding of the application switching method of the present disclosure, consider an interactive large screen with dual systems, where one system may be a Windows system and the other an Android system.
  • when the full-screen application blocks the taskbar of the Windows system so that the user cannot switch by clicking a thumbnail window in the taskbar, applications can be switched using the method of the above embodiment.
  • because of the system design, the Android system always receives the user's first input on the interactive large screen first; by recognizing the gesture information in the first input, the Android system can simulate, through the touch frame, the keyboard message corresponding to the gesture information and send that keyboard message to the Windows system, thereby switching the application displayed in full screen under the Windows system.
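  • The following is a minimal, illustrative sketch (in Python) of the Android-side flow described above. The gesture name, the shortcut encoding, and the transport callable used to reach the Windows system are assumptions introduced only for illustration; the embodiment above states only that a keyboard message corresponding to the gesture is simulated through the touch frame and sent to the Windows system.

```python
from typing import Callable

# Hypothetical mapping from recognized target gestures to simulated keyboard messages.
# Neither the gesture names nor the wire format are specified by the embodiment above;
# they stand in for whatever the touch-frame firmware actually expects.
GESTURE_TO_SHORTCUT = {
    "grab": "KEY_ALT+KEY_TAB",  # an assumed application-switching shortcut
}

def forward_gesture(gesture: str, send_to_second_os: Callable[[bytes], None]) -> bool:
    """If `gesture` is a target gesture, hand its simulated keyboard message to the
    transport that reaches the second operating system; otherwise do nothing."""
    shortcut = GESTURE_TO_SHORTCUT.get(gesture)
    if shortcut is None:
        return False  # not a target gesture, so nothing is forwarded
    send_to_second_os(shortcut.encode("ascii") + b"\n")
    return True

# Example usage with a hypothetical serial link exposed by the touch frame:
# forward_gesture("grab", send_to_second_os=serial_link.write)
```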
  • in some optional embodiments of the present disclosure, before Step 300 (in which, when the gesture information is the target gesture, the first operating system sends the operation information corresponding to the target gesture to the second operating system), the method may include:
  • Step 200: the first operating system evaluates the gesture information and determines whether it is the target gesture.
  • step 200 may specifically include step 210 and step 220 .
  • Step 210: the first operating system acquires touch point information and judges whether the touch point information satisfies the target gesture condition;
  • Step 220: when the touch point information satisfies the target gesture condition, the first operating system determines the gesture information in the first input to be the target gesture.
  • in this embodiment, the first input is a touch input; the first operating system recognizes the gesture information by recognizing the touch point information of the touch input, and when the gesture information is the target gesture, the first operating system can determine that the first input is the target gesture.
  • in some optional embodiments of the present disclosure, judging in Step 210 whether the touch point information satisfies the target gesture condition may include Step 221, Step 222, and Step 223.
  • Step 221: judge whether the number of touch points is equal to a preset number;
  • Step 222: judge whether the time interval between the initial touch moments of the preset number of touch points is less than the first preset time;
  • Step 223: judge whether the distances between the preset number of touch points are less than the preset distance.
  • in this embodiment, Steps 221, 222, and 223 are executed to judge whether the gesture information matches the target gesture; the gesture information can be determined to be the target gesture only when all three judgments meet the preset conditions. In other words, the first operating system analyzes the number of touch points, the time interval between the initial touch moments of the touch points, and the distances between the touch points to judge whether the touch point information satisfies the target gesture condition, because each gesture action has a corresponding number of touch points, a corresponding time interval between initial touch moments, and corresponding distances between touch points.
  • if the number of touch points is within the preset number range, the time interval between the initial touch moments is within the preset time range, and the distances between the touch points are within the preset distance range, the target gesture represented by the touch point information can be determined.
  • it should be noted that Step 221, Step 222, and Step 223 can be executed sequentially or in parallel; the present application does not limit their execution order, which can be set according to the actual situation in practical applications and is not enumerated one by one here.
  • in some optional embodiments of the present disclosure, Step 220, in which the first operating system determines that the first input is the target gesture when the touch point information satisfies the target gesture condition, may specifically be Step 230.
  • Step 230: when the number of touch points is equal to the preset number, the time interval is less than the preset time, and the distance is less than the preset distance, the first operating system determines the gesture information to be the target gesture.
  • in this embodiment, whether the target gesture is formed is determined by analyzing the initial contact time interval of the multiple touch points and the distances between the touch points. For example, when the target gesture is a grab gesture, the preset number can be set to 3-5, the first preset time can be set to 200 ms, and the preset distance can be set to 100 pixels; when all of these preset conditions are met, the operation performed by the fingers can be regarded as a grab gesture. Judging the gesture information in this way is relatively accurate and not prone to misjudging gestures.
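  • As a rough illustration only, the three judgments of Steps 221-223 could be implemented as sketched below in Python. The data structure and the concrete thresholds (five touch points, 200 ms, 100 pixels) are assumptions drawn from the example values above, not a definitive implementation of the embodiment.

```python
from dataclasses import dataclass
from itertools import combinations
from math import hypot

@dataclass
class TouchPoint:
    x: float        # contact position, in pixels
    y: float
    down_ms: float  # timestamp of the initial contact, in milliseconds

# Example thresholds taken from the values discussed above (design choices, not fixed).
PRESET_COUNT = 5
FIRST_PRESET_TIME_MS = 200
PRESET_DISTANCE_PX = 100

def satisfies_target_gesture(points: list) -> bool:
    """Steps 221-223: point count, spread of initial contact times, pairwise distances."""
    if len(points) != PRESET_COUNT:                                  # Step 221
        return False
    down_times = [p.down_ms for p in points]
    if max(down_times) - min(down_times) >= FIRST_PRESET_TIME_MS:    # Step 222
        return False
    return all(hypot(a.x - b.x, a.y - b.y) < PRESET_DISTANCE_PX      # Step 223
               for a, b in combinations(points, 2))
```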
  • in some optional embodiments of the present disclosure, judging in Step 210 whether the touch point information satisfies the target gesture condition may further include Step 224 and Step 225.
  • Step 224: determine whether the second time interval between the initial touch moment and the end moment of each of the preset number of touch points is less than the second preset time;
  • Step 225: determine whether the movement direction of the preset number of touch points is movement in a gathering direction.
  • in this embodiment, the judgment of the gesture information is further refined by acquiring the touch duration of each touch point and the movement direction of the touch points.
  • the second preset time and the movement direction of the touch points can be preset according to the target gesture. For example, when the second preset time is 500 ms and the touch points converge toward the center, it can be accurately determined that the gesture at the moment of operation is a grab operation; when the second preset time is 550 ms and the touch points scatter outward, it can be accurately determined that the gesture at the moment of operation is a zoom-in gesture.
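  • A sketch of the additional judgments of Steps 224-225 is given below. The 500 ms threshold and the centroid-based converge/diverge test are assumptions based on the examples above; a real implementation may measure the movement direction differently.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TouchTrack:
    start_x: float
    start_y: float
    end_x: float
    end_y: float
    down_ms: float  # initial touch moment
    up_ms: float    # end (lift-off) moment

SECOND_PRESET_TIME_MS = 500  # example value used above for a grab gesture

def classify_movement(tracks: list) -> Optional[str]:
    """Steps 224-225: per-point duration check plus a converge/diverge test."""
    if any(t.up_ms - t.down_ms >= SECOND_PRESET_TIME_MS for t in tracks):  # Step 224
        return None
    # Centroid of the initial contact positions.
    cx = sum(t.start_x for t in tracks) / len(tracks)
    cy = sum(t.start_y for t in tracks) / len(tracks)

    def dist(x: float, y: float) -> float:
        return ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5

    inward = sum(dist(t.end_x, t.end_y) < dist(t.start_x, t.start_y) for t in tracks)
    if inward == len(tracks):
        return "gathering"   # all points move toward the centre (Step 225), e.g. a grab
    if inward == 0:
        return "scattering"  # all points move outward, as in the zoom-in example
    return None
```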
  • in some optional embodiments of the present disclosure, Step 300, in which the first operating system sends the operation information corresponding to the target gesture to the second operating system when the gesture information is the target gesture, may include Step 310 and Step 320.
  • Step 310: when the gesture information is the target gesture, the first operating system matches the application switching shortcut key instruction corresponding to the target gesture;
  • Step 320: send the application switching shortcut key instruction to the second operating system.
  • in this embodiment, the target gesture is mapped to an application switching shortcut key instruction in advance. When the first operating system recognizes that the gesture information is the target gesture, the corresponding application switching shortcut key instruction can be matched immediately; the first operating system then sends the application switching shortcut key instruction to the second operating system, and the second operating system can switch applications according to that instruction. For a dual-system device, this involves relatively little information processing, which reduces the amount of computation and improves the processing speed.
  • the method may further include:
  • Step 400: delete the handwriting information, which is formed by the second operating system based on the gesture information in a writing mode.
  • for example, under the Windows system, applications fall into two categories. One category is writing applications, such as whiteboard software; this type of software has two modes, a writing mode and an operation mode. In the writing mode the user can write strokes, but in the operation mode the user does not need to write strokes. The other category is other applications, such as the system desktop, in which the user should not write strokes.
  • however, because recognizing the first input takes some time, the user also leaves some strokes while performing the first input. In this embodiment, the user information of the strokes (for example, an id) and the touch point information included in the strokes are looked up in the stroke collection and deleted.
  • clearing the handwriting information formed from the gesture information ensures normal writing on the writing interface, and the cache can also be cleared in time to keep the system running smoothly.
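  • A minimal sketch of such a deletion step is shown below. The `Stroke` structure, its fields, and the id-based lookup are hypothetical; the embodiment states only that the user information of the strokes (for example, an id) and the touch point information they include are looked up in the stroke collection and deleted.

```python
from dataclasses import dataclass, field

@dataclass
class Stroke:
    user_id: int                                 # writer id recorded with the stroke
    point_ids: set = field(default_factory=set)  # ids of the touch points that drew it

def delete_gesture_strokes(strokes: list, gesture_point_ids: set) -> list:
    """Return the stroke collection with strokes produced by the recognized gesture removed."""
    return [s for s in strokes if not (s.point_ids & gesture_point_ids)]

# Example: strokes left behind while a five-finger grab was being recognized are dropped.
# remaining = delete_gesture_strokes(all_strokes, gesture_point_ids={101, 102, 103, 104, 105})
```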
  • an application switching apparatus is provided, and the apparatus may include:
  • the first operating system 100 is configured to receive the first input and recognize the gesture information in the first input when the first application is in an open state; when the gesture information is the target gesture, the first operating system 100 sends the operation information corresponding to the target gesture to the second operating system;
  • the second operating system 200 is configured to switch the first application based on the operation information when the operation information is received.
  • with the apparatus of the above embodiment, on a dual-system device the first operating system 100 recognizes the gesture information in the first input; when the gesture information is the target gesture, the first operating system 100 sends the operation information corresponding to the target gesture to the second operating system 200, and the second operating system 200 switches applications based on the operation information.
  • this enables the electronic device to quickly switch the applications on the display device without an external keyboard, thereby improving the user's experience of switching applications.
  • the first operating system 100 includes:
  • the touch point acquisition module is used to acquire touch point information
  • a judgment module for judging whether the touch point information satisfies the target gesture condition
  • the target gesture determination module is configured to determine that the first input is the target gesture when the touch point information satisfies the target gesture condition.
  • in this embodiment, the first input is a touch input: the touch point acquisition module acquires the touch point information, and the judgment module performs conditional judgment on the touch point information; when the target gesture condition is satisfied, the target gesture determination module determines the gesture information in the first input to be the target gesture.
  • the judgment module includes:
  • a first judging unit for judging whether the number of touch points is equal to a preset number
  • a second judging unit configured to judge whether the first time interval between the initial touch points of the preset number of touch points is less than the first preset time
  • the third determination unit is configured to determine whether the distance between the preset number of touch points is less than the preset distance.
  • in this embodiment, the judgment module uses the first judging unit, the second judging unit, and the third judging unit to analyze the number of touch points, the time interval between the initial touch moments of the touch points, and the distances between the touch points, so as to determine whether the touch point information satisfies the target gesture condition, because each gesture action has a corresponding number of touch points, a corresponding time interval between initial touch moments, and corresponding distances between touch points. If the number of touch points is within the preset number range, the time interval between the initial touch moments is within the preset time range, and the distances between the touch points are within the preset distance range, the target gesture represented by the touch point information can be determined.
  • the target gesture determination module is specifically configured to: when the number of touch points is equal to the preset number, the time interval is less than the preset time, and the distance is less than the preset distance, determine the gesture information to be the target gesture.
  • in this embodiment, the target gesture determination module determines whether the target gesture is formed by analyzing the initial contact time interval of the multiple touch points and the distances between the touch points. For example, when the target gesture is a grab gesture, the preset number can be set to 3-5, the first preset time can be set to 200 ms, and the preset distance can be set to 100 pixels; when all of these preset conditions are met, the operation performed by the fingers can be regarded as a grab gesture. Judging the gesture information with this apparatus is relatively accurate and not prone to misjudging gestures.
  • the judgment module also includes:
  • a fourth judging unit configured to judge whether the second time interval between the initial touch time and the cut-off time of each touch point in the preset number of touch points is less than the second preset time
  • the fifth judgment unit is used for judging whether the movement direction of the preset number of touch points is the movement in the gathering direction.
  • in this embodiment, the judgment module uses the fourth judging unit and the fifth judging unit to evaluate the touch duration of each touch point and the movement direction of the touch points, further refining the judgment of the gesture information.
  • the second preset time and the movement direction of the touch points can be preset according to the target gesture. For example, when the second preset time is 500 ms and the touch points converge toward the center, it can be accurately determined that the gesture at the moment of operation is a grab operation; when the second preset time is 550 ms and the touch points scatter outward, it can be accurately determined that the gesture at the moment of operation is a zoom-in gesture.
  • the first operating system 100 further includes:
  • a matching module configured to match the application switching shortcut key instruction corresponding to the target gesture when the gesture information is the target gesture
  • the sending module is configured to send the application switching shortcut key instruction to the second operating system 200 .
  • in this embodiment, the target gesture is mapped to an application switching shortcut key instruction in advance. When the target gesture determination module recognizes that the gesture information is the target gesture, the matching module can immediately match the corresponding application switching shortcut key instruction; the sending module then sends the instruction to the second operating system, and the second operating system can switch applications according to it. For a dual-system device, this involves relatively little information processing, which reduces the amount of computation and improves the processing speed.
  • the first operating system 100 further includes:
  • the deletion module is used for deleting the handwriting information, and the handwriting information is formed by the second operating system based on the gesture information in the writing mode.
  • however, because recognizing the first input takes some time, the user also leaves some strokes while performing the first input; in this embodiment, the deletion module looks up the user information of the strokes (for example, an id) and the touch point information included in the strokes in the stroke collection and deletes them.
  • the handwriting information formed based on the gesture information is cleared to ensure the normal writing of the writing interface, and the cache can be cleared in time to ensure the smooth operation of the system.
  • an embodiment of the present disclosure further provides an electronic device 300, including a processor 301, a memory 302, and a program or instruction stored in the memory 302 and executable on the processor 301; when the program or instruction is executed by the processor 301, each process of the foregoing application switching method embodiment is implemented and the same technical effect can be achieved, which is not repeated here to avoid redundancy.
  • examples of electronic devices in the embodiments of the present disclosure include mobile electronic devices and non-mobile electronic devices.
  • FIG. 4 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present disclosure.
  • the electronic device 400 includes, but is not limited to, a radio frequency unit 401, a network module 402, an audio output unit 403, an input unit 404, a sensor 405, a display unit 406, a user input unit 407, an interface unit 408, a memory 409, and a processor 410, among other components.
  • the electronic device 400 may also include a power source (such as a battery) for supplying power to the various components; the power source may be logically connected to the processor 410 through a power management system, so that functions such as charging, discharging, and power consumption management are handled by the power management system.
  • the structure of the electronic device shown in FIG. 4 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or arrange the components differently, which is not repeated here.
  • the input unit 404 may include a graphics processing unit (GPU) 4041 and a microphone 4042; the graphics processing unit 4041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the display unit 406 may include a display panel 4061, which may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the user input unit 407 includes a touch panel 4071 and other input devices 4072 .
  • the touch panel 4071 is also called a touch screen.
  • the touch panel 4071 may include two parts, a touch detection device and a touch controller.
  • Other input devices 4072 may include, but are not limited to, physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, and joysticks, which are not described herein again.
  • Memory 409 may be used to store software programs as well as various data including, but not limited to, application programs and operating systems.
  • the processor 410 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, and application programs, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 410.
  • Embodiments of the present disclosure further provide a machine-readable storage medium on which a program or an instruction is stored; when the program or instruction is executed by a processor, each process of the foregoing application switching method embodiment is implemented and the same technical effect can be achieved, which is not repeated here to avoid redundancy.
  • the processor is the processor in the electronic device described in the foregoing embodiments.
  • the machine-readable storage medium includes a non-transitory computer-readable storage medium, such as an electronic circuit, a semiconductor memory device, a computer read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), Flash memory, erasable ROM (EROM), floppy disk, CD-ROM, magnetic disk or optical disk, etc.
  • An embodiment of the present disclosure further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the foregoing application switching method embodiments, with the same technical effect; details are not repeated here to avoid redundancy.
  • the chip mentioned in the embodiments of the present disclosure may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip, among other names.

Abstract

An application switching method and apparatus, an electronic device (300, 400), and a machine-readable storage medium. The application switching method is applied to an electronic device (300, 400) provided with a first operating system (100) and a second operating system (200), and may include: when a first application is in an open state, the first operating system (100) receives a first input and recognizes gesture information in the first input (S100); when the gesture information is a target gesture, the first operating system (100) sends operation information corresponding to the target gesture to the second operating system (200) (S300); and when receiving the operation information, the second operating system (200) switches the application (S400). This enables a dual-system device to switch applications quickly without an external keyboard, improving the user's experience of switching applications.

Description

Application switching method and apparatus, electronic device, and machine-readable storage medium
Cross-reference to related applications
This application claims priority to Chinese patent application No. 202011587264.3, filed on December 28, 2020 and entitled "Application switching method, apparatus, electronic device, and storage medium", the entire contents of which are incorporated herein by reference.
Technical field
The present disclosure relates to the technical field of electronic devices, and in particular to an application switching method and apparatus, an electronic device, and a machine-readable storage medium.
Background
Switching applications is a relatively frequent operation for users, yet on dual-operating-system electronic devices the operation is cumbersome. For example, when a PPT presentation is playing in full screen and the user wishes to switch to another application, the full-screen application blocks the taskbar, so the user cannot switch by clicking a thumbnail window in the taskbar and can only switch with keyboard shortcuts; without an external keyboard, the user has to exit full screen before switching applications, which is very cumbersome and inconvenient.
Summary
The purpose of the embodiments of the present disclosure is to provide an application switching method and apparatus, an electronic device, and a machine-readable storage medium, so as to at least solve the problem that switching applications on existing dual-operating-system electronic devices in full-screen display is very cumbersome and inconvenient.
The technical solutions of the present disclosure are as follows:
According to a first aspect of the embodiments of the present disclosure, an application switching method is provided, applied to an electronic device, where the electronic device is provided with a first operating system and a second operating system. The method may include:
when a first application is in an open state, the first operating system receives a first input and recognizes gesture information in the first input;
when the gesture information is a target gesture, the first operating system sends operation information corresponding to the target gesture to the second operating system;
when receiving the operation information, the second operating system switches the first application based on the operation information.
Further, before the first operating system sends the operation information corresponding to the target gesture to the second operating system, the method further includes:
the first operating system acquires touch point information and judges whether the touch point information satisfies a target gesture condition;
when the touch point information satisfies the target gesture condition, the first operating system determines the gesture information in the first input to be the target gesture.
Further, judging whether the touch point information satisfies the target gesture condition includes:
judging whether the number of touch points is equal to a preset number;
judging whether a first time interval between initial touch moments of the preset number of touch points is less than a first preset time;
judging whether distances between the preset number of touch points are less than a preset distance.
Further, when the touch point information satisfies the target gesture condition, the first operating system determining the gesture information in the first input to be the target gesture is specifically:
when the number of touch points is equal to the preset number, the time interval is less than the preset time, and the distance is less than the preset distance, the first operating system determines the gesture information to be the target gesture.
Further, judging whether the touch point information satisfies the target gesture condition further includes:
judging whether a second time interval between the initial touch moment and the end moment of each of the preset number of touch points is less than a second preset time;
judging whether the movement direction of the preset number of touch points is movement in a gathering direction.
Further, when the gesture information is the target gesture, the first operating system sending the operation information corresponding to the target gesture to the second operating system includes:
when the gesture information is the target gesture, the first operating system matches an application switching shortcut key instruction corresponding to the target gesture;
the application switching shortcut key instruction is sent to the second operating system.
Further, the method further includes:
deleting handwriting information, where the handwriting information is formed by the second operating system based on the gesture information in a writing mode.
According to a second aspect of the embodiments of the present disclosure, an application switching apparatus is provided, which may include:
a first operating system, configured to receive a first input and recognize gesture information in the first input when a first application is in an open state, and, when the gesture information is a target gesture, send operation information corresponding to the target gesture to a second operating system;
the second operating system, configured to switch the first application based on the operation information when receiving the operation information.
Further, the first operating system includes:
a touch point acquisition module, configured to acquire touch point information;
a judgment module, configured to judge whether the touch point information satisfies a target gesture condition;
a target gesture determination module, configured to determine the gesture information in the first input to be the target gesture when the touch point information satisfies the target gesture condition.
Further, the judgment module includes:
a first judging unit, configured to judge whether the number of touch points is equal to a preset number;
a second judging unit, configured to judge whether a first time interval between initial touch moments of the preset number of touch points is less than a first preset time;
a third judging unit, configured to judge whether distances between the preset number of touch points are less than a preset distance.
Further, the target gesture determination module is specifically configured to:
when the number of touch points is equal to the preset number, the time interval is less than the preset time, and the distance is less than the preset distance, determine the gesture information to be the target gesture.
Further, the judgment module further includes:
a fourth judging unit, configured to judge whether a second time interval between the initial touch moment and the end moment of each of the preset number of touch points is less than a second preset time;
a fifth judging unit, configured to judge whether the movement direction of the preset number of touch points is movement in a gathering direction.
Further, the first operating system further includes:
a matching module, configured to match an application switching shortcut key instruction corresponding to the target gesture when the gesture information is the target gesture;
a sending module, configured to send the application switching shortcut key instruction to the second operating system.
Further, the first operating system further includes:
a deletion module, configured to delete handwriting information, where the handwriting information is formed by the second operating system based on the gesture information in a writing mode.
According to a third aspect of the embodiments of the present disclosure, an electronic device is provided, which may include:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to execute the instructions to implement the application switching method of any one of the above technical solutions.
According to a fourth aspect of the embodiments of the present disclosure, a machine-readable storage medium is provided; when instructions in the machine-readable storage medium are executed by a processor of an information processing apparatus or a server, the information processing apparatus or the server is enabled to implement the application switching method of any one of the above technical solutions.
The technical solutions provided by the embodiments of the present disclosure bring at least the following beneficial effects:
In the embodiments of the present disclosure, the first operating system recognizes the gesture information in the first input, and when the gesture information is the target gesture, the first operating system sends the operation information corresponding to the target gesture to the second operating system; when receiving the operation information, the second operating system switches the first application based on the operation information. In this way, the application displayed on a dual-operating-system electronic device can be switched using gesture information, so that even without an external input device the user can switch applications quickly, which improves the user's experience of switching applications on a dual-operating-system electronic device.
It should be understood that the above general description and the following detailed description are exemplary and explanatory only, and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the specification, serve to explain the principles of the present disclosure; they do not unduly limit the present disclosure.
FIG. 1 is a flowchart of an application switching method according to an exemplary embodiment;
FIG. 2 is a schematic structural diagram of an application switching apparatus according to an exemplary embodiment;
FIG. 3 is a schematic structural diagram of an electronic device according to an exemplary embodiment;
FIG. 4 is a schematic diagram of a hardware structure of an electronic device according to an exemplary embodiment.
Detailed description of the embodiments
To enable those of ordinary skill in the art to better understand the technical solutions of the present disclosure, the technical solutions in the embodiments of the present disclosure are described clearly and completely below with reference to the accompanying drawings.
It should be noted that the terms "first", "second", and the like in the specification, the claims, and the above drawings of the present disclosure are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments of the present disclosure described herein can be implemented in orders other than those illustrated or described herein. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
In order to solve the technical problem raised in the background section, the present disclosure provides an application switching method for an electronic device with dual systems. The electronic device may be a computer, a tablet device, an interactive display device, or the like, and is provided with a first operating system and a second operating system. For dual-system electronic devices, the prior art currently offers no convenient and quick way to switch applications, especially when the device is displaying in full screen and has no external input device. The method of the present disclosure can switch the applications displayed by a dual-operating-system electronic device using gesture information alone.
As shown in FIG. 1, according to a first aspect of the embodiments of the present disclosure, an application switching method is provided, applied to an electronic device. The method may include the following steps:
Step 100: when a first application is in an open state, the first operating system receives a first input and recognizes gesture information in the first input;
Step 300: when the gesture information is a target gesture, the first operating system sends operation information corresponding to the target gesture to the second operating system;
Step 400: when receiving the operation information, the second operating system switches the first application based on the operation information.
In the above embodiment, on a dual-system device, the first operating system recognizes the gesture information in the first input, and when the gesture information is the target gesture, the first operating system sends the operation information corresponding to the target gesture to the second operating system; when receiving the operation information, the second operating system switches the application. In this way, the application displayed on a dual-operating-system electronic device can be switched using gesture information, so that even without an external input device the user can switch applications quickly, which improves the user's experience of switching applications on a dual-operating-system electronic device.
The specific implementation of each of the above steps is described in detail below.
Step 100 is introduced first: when the first application is in an open state, the first operating system receives the first input and recognizes the gesture information in the first input.
In this step, when the first application is in an open or running state, the user provides the first input to the electronic device; the first operating system of the electronic device receives the first input through an input module and recognizes the gesture information in the first input.
For example, the first input may be a touch input or a contactless input.
When it is a touch input, Step 100 may specifically be: after receiving the first input, the touch screen recognizes the touch information of the first input detected by the touch component, so as to obtain the gesture information in the first input.
The touch-type input is not limited to the above; it may also be an easily performed input such as shaking, flipping, or rotating the electronic device.
When it is a contactless input, Step 100 may specifically be: a camera on the electronic device captures a gesture image and recognizes the gesture information in the image, so as to obtain the gesture information in the first input.
For example, the first application may be an application running under the second operating system; whether the first application is displayed in full screen or blocks the task switching control of the second operating system, the first operating system recognizes the first input and the gesture information in it.
For example, the first operating system may be a Linux system, a Windows operating system, a UNIX operating system, an Apple operating system, or an Android operating system; the second operating system may be a Linux system, a Windows operating system, a UNIX operating system, an Apple operating system, or an Android operating system.
Step 300 is introduced next: when the gesture information is the target gesture, the first operating system sends the operation information corresponding to the target gesture to the second operating system.
In this step, the gesture information recognized in Step 100 is evaluated; when it is determined to be the target gesture, the first operating system sends the operation information corresponding to the target gesture to the second operating system.
For example, the target gesture may be a grab gesture, a flip gesture, a swing gesture, a flick gesture, or another gesture that the electronic device can recognize easily and unambiguously. The mapping between the target gesture and the corresponding operation information of the electronic device is established in advance in the first operating system. The operation information may be instruction control information of an input device, such as keyboard instruction information or mouse instruction information.
For example, when the target gesture is a grab gesture, Step 300 may specifically be: the mapping between the grab gesture and the corresponding operation information of the electronic device is established in advance; when the gesture information is a grab gesture, the first operating system matches the operation information corresponding to the grab gesture and sends it to the second operating system through the sending module.
Finally, Step 400 is introduced: when the second operating system receives the operation information, it switches the application based on the operation information.
In this step, the second operating system is mainly used to support the development and running of application software; when it receives the operation information, it correspondingly performs the application switching operation.
For a more intuitive understanding of the application switching method of the present disclosure, a more concrete example follows. An interactive large screen with dual systems is provided, where one system may be a Windows system and the other an Android system. When the full-screen application blocks the taskbar of the Windows system so that the user cannot switch by clicking a thumbnail window in the taskbar, applications can be switched using the method of the above embodiment. Because of the system design, the Android system always receives the user's first input on the interactive large screen first; by recognizing the gesture information in the first input, the Android system can simulate, through the touch frame, the keyboard message corresponding to the gesture information and send that keyboard message to the Windows system, thereby switching the application displayed in full screen under the Windows system.
In some optional embodiments of the present disclosure, before Step 300 (in which, when the gesture information is the target gesture, the first operating system sends the operation information corresponding to the target gesture to the second operating system), the method may include:
Step 200: the first operating system evaluates the gesture information and determines whether it is the target gesture.
In some optional embodiments of the present disclosure, Step 200 may specifically include Step 210 and Step 220.
Step 210: the first operating system acquires touch point information and judges whether the touch point information satisfies the target gesture condition;
Step 220: when the touch point information satisfies the target gesture condition, the first operating system determines the gesture information in the first input to be the target gesture.
In this embodiment, the first input is a touch input; the first operating system recognizes the gesture information by recognizing the touch point information of the touch input, and when the gesture information is the target gesture, the first operating system can determine that the first input is the target gesture.
In some optional embodiments of the present disclosure, judging in Step 210 whether the touch point information satisfies the target gesture condition may include Step 221, Step 222, and Step 223.
Step 221: judge whether the number of touch points is equal to a preset number;
Step 222: judge whether the time interval between the initial touch moments of the preset number of touch points is less than the first preset time;
Step 223: judge whether the distances between the preset number of touch points are less than the preset distance.
In this embodiment, Steps 221, 222, and 223 are executed to judge whether the gesture information matches the target gesture; the gesture information can be determined to be the target gesture only when all three judgments meet the preset conditions. In other words, the first operating system analyzes the number of touch points, the time interval between the initial touch moments of the touch points, and the distances between the touch points to judge whether the touch point information satisfies the target gesture condition, because each gesture action has a corresponding number of touch points, a corresponding time interval between initial touch moments, and corresponding distances between touch points. If the number of touch points is within the preset number range, the time interval between the initial touch moments is within the preset time range, and the distances between the touch points are within the preset distance range, the target gesture represented by the touch point information can be determined.
It should be noted that Step 221, Step 222, and Step 223 can be executed sequentially or in parallel; the present application does not limit their execution order, which can be set according to the actual situation in practical applications and is not enumerated one by one here.
In some optional embodiments of the present disclosure, Step 220, in which the first operating system determines that the first input is the target gesture when the touch point information satisfies the target gesture condition, may specifically be Step 230.
Step 230: when the number of touch points is equal to the preset number, the time interval is less than the preset time, and the distance is less than the preset distance, the first operating system determines the gesture information to be the target gesture.
In this embodiment, whether the target gesture is formed is determined by analyzing the initial contact time interval of the multiple touch points and the distances between the touch points. For example, when the target gesture is a grab gesture, the preset number can be set to 3-5, the first preset time can be set to 200 ms, and the preset distance can be set to 100 pixels; when all of these preset conditions are met, the operation performed by the fingers can be regarded as a grab gesture. Judging the gesture information with the method of this embodiment is relatively accurate and not prone to misjudging gestures.
In some optional embodiments of the present disclosure, judging in Step 210 whether the touch point information satisfies the target gesture condition may further include Step 224 and Step 225.
Step 224: judge whether the second time interval between the initial touch moment and the end moment of each of the preset number of touch points is less than the second preset time;
Step 225: judge whether the movement direction of the preset number of touch points is movement in a gathering direction.
In this embodiment, the judgment of the gesture information is further refined by acquiring the touch duration of each touch point and the movement direction of the touch points. The second preset time and the movement direction of the touch points can be preset according to the target gesture. For example, when the second preset time is 500 ms and the touch points converge toward the center, it can be accurately determined that the gesture at the moment of operation is a grab operation; when the second preset time is 550 ms and the touch points scatter outward, it can be accurately determined that the gesture at the moment of operation is a zoom-in gesture.
In some optional embodiments of the present disclosure, Step 300, in which the first operating system sends the operation information corresponding to the target gesture to the second operating system when the gesture information is the target gesture, may include Step 310 and Step 320.
Step 310: when the gesture information is the target gesture, the first operating system matches the application switching shortcut key instruction corresponding to the target gesture;
Step 320: send the application switching shortcut key instruction to the second operating system.
In this embodiment, the target gesture is mapped to an application switching shortcut key instruction in advance. When the first operating system recognizes that the gesture information is the target gesture, the corresponding application switching shortcut key instruction can be matched immediately; the first operating system then sends the application switching shortcut key instruction to the second operating system, and the second operating system can switch applications according to that instruction. For a dual-system device, this method involves relatively little information processing, which reduces the amount of computation and improves the processing speed.
In some optional embodiments of the present disclosure, the method may further include:
Step 400: delete the handwriting information, where the handwriting information is formed by the second operating system based on the gesture information in a writing mode.
For example, under the Windows system, applications fall into two categories. One category is writing applications, such as whiteboard software; this type of software has two modes, a writing mode and an operation mode. In the writing mode the user can write strokes, but in the operation mode the user does not need to write strokes. The other category is other applications, such as the system desktop, in which the user should not write strokes. However, because recognizing the first input takes some time, the user also leaves some strokes while performing the first input. In this embodiment, the user information of the strokes (for example, an id) and the touch point information included in the strokes are looked up in the stroke collection and deleted. Clearing the handwriting information formed from the gesture information ensures normal writing on the writing interface, and the cache can also be cleared in time to keep the system running smoothly.
As shown in FIG. 2, according to a second aspect of the embodiments of the present disclosure, an application switching apparatus is provided, which may include:
a first operating system 100, configured to receive a first input and recognize gesture information in the first input when a first application is in an open state, and, when the gesture information is a target gesture, send operation information corresponding to the target gesture to a second operating system;
a second operating system 200, configured to switch the first application based on the operation information when receiving the operation information.
With the apparatus of the above embodiment, on a dual-system device the first operating system 100 recognizes the gesture information in the first input; when the gesture information is the target gesture, the first operating system 100 sends the operation information corresponding to the target gesture to the second operating system 200, and the second operating system 200 switches applications based on the operation information. This enables the electronic device to quickly switch the applications on the display device without an external keyboard, thereby improving the user's experience of switching applications.
Optionally, the first operating system 100 includes:
a touch point acquisition module, configured to acquire touch point information;
a judgment module, configured to judge whether the touch point information satisfies a target gesture condition;
a target gesture determination module, configured to determine the first input to be the target gesture when the touch point information satisfies the target gesture condition.
In this embodiment, the first input is a touch input: the touch point acquisition module acquires the touch point information, and the judgment module performs conditional judgment on the touch point information; when the target gesture condition is satisfied, the target gesture determination module determines the gesture information in the first input to be the target gesture.
Optionally, the judgment module includes:
a first judging unit, configured to judge whether the number of touch points is equal to a preset number;
a second judging unit, configured to judge whether a first time interval between initial touch moments of the preset number of touch points is less than a first preset time;
a third judging unit, configured to judge whether distances between the preset number of touch points are less than a preset distance.
In this embodiment, the judgment module uses the first judging unit, the second judging unit, and the third judging unit to analyze the number of touch points, the time interval between the initial touch moments of the touch points, and the distances between the touch points, so as to determine whether the touch point information satisfies the target gesture condition, because each gesture action has a corresponding number of touch points, a corresponding time interval between initial touch moments, and corresponding distances between touch points. If the number of touch points is within the preset number range, the time interval between the initial touch moments is within the preset time range, and the distances between the touch points are within the preset distance range, the target gesture represented by the touch point information can be determined.
Optionally, the target gesture determination module is specifically configured to:
when the number of touch points is equal to the preset number, the time interval is less than the preset time, and the distance is less than the preset distance, determine the gesture information to be the target gesture.
In this embodiment, the target gesture determination module determines whether the target gesture is formed by analyzing the initial contact time interval of the multiple touch points and the distances between the touch points. For example, when the target gesture is a grab gesture, the preset number can be set to 3-5, the first preset time can be set to 200 ms, and the preset distance can be set to 100 pixels; when all of these preset conditions are met, the operation performed by the fingers can be regarded as a grab gesture. Judging the gesture information with the apparatus of this embodiment is relatively accurate and not prone to misjudging gestures.
Optionally, the judgment module further includes:
a fourth judging unit, configured to judge whether a second time interval between the initial touch moment and the end moment of each of the preset number of touch points is less than a second preset time;
a fifth judging unit, configured to judge whether the movement direction of the preset number of touch points is movement in a gathering direction.
In this embodiment, the judgment module uses the fourth judging unit and the fifth judging unit to evaluate the touch duration of each touch point and the movement direction of the touch points, further refining the judgment of the gesture information. The second preset time and the movement direction of the touch points can be preset according to the target gesture. For example, when the second preset time is 500 ms and the touch points converge toward the center, it can be accurately determined that the gesture at the moment of operation is a grab operation; when the second preset time is 550 ms and the touch points scatter outward, it can be accurately determined that the gesture at the moment of operation is a zoom-in gesture.
Optionally, the first operating system 100 further includes:
a matching module, configured to match an application switching shortcut key instruction corresponding to the target gesture when the gesture information is the target gesture;
a sending module, configured to send the application switching shortcut key instruction to the second operating system 200.
In this embodiment, the target gesture is mapped to an application switching shortcut key instruction in advance. When the target gesture determination module recognizes that the gesture information is the target gesture, the matching module can immediately match the corresponding application switching shortcut key instruction; the sending module then sends the instruction to the second operating system, and the second operating system can switch applications according to it. For a dual-system device, this involves relatively little information processing, which reduces the amount of computation and improves the processing speed.
Optionally, the first operating system 100 further includes:
a deletion module, configured to delete handwriting information, where the handwriting information is formed by the second operating system based on the gesture information in a writing mode.
For example, under the Windows system, applications fall into two categories. One category is writing applications, such as whiteboard software; this type of software has two modes, a writing mode and an operation mode. In the writing mode the user can write strokes, but in the operation mode the user does not need to write strokes. The other category is other applications, such as the system desktop, in which the user should not write strokes. However, because recognizing the first input takes some time, the user also leaves some strokes while performing the first input. In this embodiment, the deletion module looks up the user information of the strokes (for example, an id) and the touch point information included in the strokes in the stroke collection and deletes them. Clearing the handwriting information formed from the gesture information ensures normal writing on the writing interface, and the cache can also be cleared in time to keep the system running smoothly.
As shown in FIG. 3, an embodiment of the present disclosure further provides an electronic device 300, including a processor 301, a memory 302, and a program or instruction stored in the memory 302 and executable on the processor 301; when the program or instruction is executed by the processor 301, each process of the foregoing application switching method embodiment is implemented and the same technical effect can be achieved, which is not repeated here to avoid redundancy.
It should be noted that examples of the electronic device in the embodiments of the present disclosure include mobile electronic devices and non-mobile electronic devices.
FIG. 4 is a schematic diagram of the hardware structure of an electronic device implementing an embodiment of the present disclosure.
The electronic device 400 includes, but is not limited to, a radio frequency unit 401, a network module 402, an audio output unit 403, an input unit 404, a sensor 405, a display unit 406, a user input unit 407, an interface unit 408, a memory 409, and a processor 410, among other components.
Those skilled in the art will understand that the electronic device 400 may also include a power source (such as a battery) for supplying power to the various components; the power source may be logically connected to the processor 410 through a power management system, so that functions such as charging, discharging, and power consumption management are handled by the power management system. The structure of the electronic device shown in FIG. 4 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or arrange the components differently, which is not repeated here.
It should be understood that, in the embodiments of the present disclosure, the input unit 404 may include a graphics processing unit (GPU) 4041 and a microphone 4042; the graphics processing unit 4041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 406 may include a display panel 4061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode display, or the like. The user input unit 407 includes a touch panel 4071 and other input devices 4072. The touch panel 4071 is also called a touch screen and may include two parts, a touch detection device and a touch controller. The other input devices 4072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described here again. The memory 409 may be used to store software programs as well as various data, including but not limited to application programs and an operating system. The processor 410 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, and application programs, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 410.
Embodiments of the present disclosure further provide a machine-readable storage medium on which a program or an instruction is stored; when the program or instruction is executed by a processor, each process of the foregoing application switching method embodiment is implemented and the same technical effect can be achieved, which is not repeated here to avoid redundancy.
The processor is the processor in the electronic device described in the foregoing embodiments. The machine-readable storage medium includes a non-transitory computer-readable storage medium, such as an electronic circuit, a semiconductor memory device, a computer read-only memory (ROM), a random access memory (RAM), a flash memory, an erasable ROM (EROM), a floppy disk, a CD-ROM, a magnetic disk, or an optical disk.
An embodiment of the present disclosure further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the foregoing application switching method embodiments, with the same technical effect; details are not repeated here to avoid redundancy.
It should be understood that the chip mentioned in the embodiments of the present disclosure may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip, among other names.
It should be noted that, as used herein, the terms "comprise", "include", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that includes that element. In addition, it should be pointed out that the scope of the methods and apparatuses in the embodiments of the present disclosure is not limited to performing the functions in the order shown or discussed; the functions may also be performed substantially simultaneously or in the reverse order depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, or by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solutions of the present disclosure, in essence or in the part contributing to the prior art, can be embodied in the form of a software product; the computer software product is stored in a machine-readable storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present disclosure.
The embodiments of the present disclosure have been described above with reference to the accompanying drawings, but the present disclosure is not limited to the specific implementations described above, which are merely illustrative rather than restrictive. Under the inspiration of the present disclosure, those of ordinary skill in the art can make many other forms without departing from the spirit of the present disclosure and the scope protected by the claims, all of which fall within the protection of the present disclosure.

Claims (15)

  1. An application switching method, applied to an electronic device, wherein the electronic device is provided with a first operating system and a second operating system, the method comprising:
    when a first application is in an open state, the first operating system receiving a first input and recognizing gesture information in the first input;
    when the gesture information is a target gesture, the first operating system sending operation information corresponding to the target gesture to the second operating system;
    when receiving the operation information, the second operating system switching the first application based on the operation information.
  2. The method according to claim 1, wherein before the first operating system sends the operation information corresponding to the target gesture to the second operating system, the method further comprises:
    the first operating system acquiring touch point information and judging whether the touch point information satisfies a target gesture condition;
    when the touch point information satisfies the target gesture condition, the first operating system determining the gesture information in the first input to be the target gesture.
  3. The method according to claim 2, wherein the judging whether the touch point information satisfies the target gesture condition comprises:
    judging whether the number of touch points is equal to a preset number;
    judging whether a first time interval between initial touch moments of the preset number of touch points is less than a first preset time;
    judging whether distances between the preset number of touch points are less than a preset distance.
  4. The method according to claim 3, wherein the first operating system determining the gesture information in the first input to be the target gesture when the touch point information satisfies the target gesture condition is specifically:
    when the number of touch points is equal to the preset number, the time interval is less than the preset time, and the distance is less than the preset distance, the first operating system determining the gesture information to be the target gesture.
  5. The method according to claim 3, wherein the judging whether the touch point information satisfies the target gesture condition further comprises:
    judging whether a second time interval between the initial touch moment and the end moment of each of the preset number of touch points is less than a second preset time;
    judging whether the movement direction of the preset number of touch points is movement in a gathering direction.
  6. The method according to claim 1, wherein the first operating system sending the operation information corresponding to the target gesture to the second operating system when the gesture information is the target gesture comprises:
    when the gesture information is the target gesture, the first operating system matching an application switching shortcut key instruction corresponding to the target gesture;
    sending the application switching shortcut key instruction to the second operating system.
  7. The method according to any one of claims 1-6, further comprising:
    deleting handwriting information, wherein the handwriting information is formed by the second operating system based on the gesture information in a writing mode.
  8. An application switching apparatus, comprising:
    a first operating system, configured to receive a first input and recognize gesture information in the first input, and, when the gesture information is a target gesture, send operation information corresponding to the target gesture to a second operating system;
    the second operating system, configured to switch an application based on the operation information when receiving the operation information.
  9. The apparatus according to claim 8, wherein the first operating system comprises:
    a touch point acquisition module, configured to acquire touch point information;
    a judgment module, configured to judge whether the touch point information satisfies a target gesture condition;
    a target gesture determination module, configured to determine the gesture information in the first input to be the target gesture when the touch point information satisfies the target gesture condition.
  10. The apparatus according to claim 9, wherein the judgment module comprises:
    a first judging unit, configured to judge whether the number of touch points is equal to a preset number;
    a second judging unit, configured to judge whether a first time interval between initial touch moments of the preset number of touch points is less than a first preset time;
    a third judging unit, configured to judge whether distances between the preset number of touch points are less than a preset distance.
  11. The apparatus according to claim 10, wherein the target gesture determination module is specifically configured to: when the number of touch points is equal to the preset number, the time interval is less than the preset time, and the distance is less than the preset distance, determine the gesture information to be the target gesture.
  12. The apparatus according to claim 10, wherein the judgment module further comprises:
    a fourth judging unit, configured to judge whether a second time interval between the initial touch moment and the end moment of each of the preset number of touch points is less than a second preset time;
    a fifth judging unit, configured to judge whether the movement direction of the preset number of touch points is movement in a gathering direction.
  13. The apparatus according to claim 8, wherein the first operating system further comprises:
    a matching module, configured to match an application switching shortcut key instruction corresponding to the target gesture when the gesture information is the target gesture;
    a sending module, configured to send the application switching shortcut key instruction to the second operating system.
  14. An electronic device, comprising:
    a processor;
    a memory for storing instructions executable by the processor;
    wherein the processor is configured to execute the instructions to implement the application switching method according to any one of claims 1-7.
  15. A machine-readable storage medium, wherein when instructions in the machine-readable storage medium are executed by a processor of an information processing apparatus or a server, the information processing apparatus or the server is enabled to implement the application switching method according to any one of claims 1-7.
PCT/CN2020/141770 2020-12-28 2020-12-30 Application switching method and apparatus, electronic device, and machine-readable storage medium WO2022141286A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011587264 2020-12-28
CN202011587264.3 2020-12-28

Publications (1)

Publication Number Publication Date
WO2022141286A1 (zh)

Family

ID=82260055

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/141770 WO2022141286A1 (zh) 2020-12-30 Application switching method and apparatus, electronic device, and machine-readable storage medium

Country Status (1)

Country Link
WO (1) WO2022141286A1 (zh)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103425525A (zh) * 2012-05-16 2013-12-04 联想(北京)有限公司 Switching method and electronic device
CN102810049A (zh) * 2012-07-17 2012-12-05 华为终端有限公司 Application program switching method and apparatus, and touch-screen electronic device
CN102830858A (zh) * 2012-08-20 2012-12-19 深圳市真多点科技有限公司 Gesture recognition method and apparatus, and touch-screen terminal
US20140078081A1 (en) * 2012-09-14 2014-03-20 Asustek Computer Inc. Operation method of operating system
CN105094555A (zh) * 2015-07-24 2015-11-25 努比亚技术有限公司 Method and apparatus for switching application programs through a sliding gesture
CN108874288A (zh) * 2018-06-05 2018-11-23 Oppo广东移动通信有限公司 Application program switching method and apparatus, terminal, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20967626

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20967626

Country of ref document: EP

Kind code of ref document: A1