CN112328155B - Input device control method and device and electronic device - Google Patents


Info

Publication number
CN112328155B
CN112328155B (application CN202011262707.1A)
Authority
CN
China
Prior art keywords
application
operation mode
target
input device
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011262707.1A
Other languages
Chinese (zh)
Other versions
CN112328155A (en)
Inventor
汪铭扬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202011262707.1A priority Critical patent/CN112328155B/en
Publication of CN112328155A publication Critical patent/CN112328155A/en
Application granted granted Critical
Publication of CN112328155B publication Critical patent/CN112328155B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 — Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 — Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/0354 — Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543 — Mice or pucks

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The application discloses an input device control method and apparatus and an electronic device, belonging to the field of communication technology. It addresses the problem that when a user needs to perform other operations on an electronic device, the user can only attach additional external devices, making the user's operation steps cumbersome. The method is applied to an electronic device connected to an input device that supports a planar operation mode and a stereoscopic operation mode, and includes: when the electronic device runs a target application, determining a target operation mode of the input device that matches the application type of the target application, where the application type includes a two-dimensional application and/or a three-dimensional application; and sending, by the electronic device, a control instruction to the input device, where the control instruction is used to control the input device to run in the target operation mode, the target operation mode being the planar operation mode or the stereoscopic operation mode.

Description

Input device control method and device and electronic device
Technical Field
The present application belongs to the field of communication technology, and in particular relates to an input device control method and apparatus and an electronic device.
Background
With the development of electronic device technology, people have access to more and more capable electronic devices (e.g., mobile phones, computers, and Augmented Reality (AR) devices). The development of external device technology has greatly enriched the scenarios and user experience of electronic devices.
Among the many external devices, the mouse is one that users employ frequently: the user drags the mouse on a plane, so that the electronic device receives the positioning and input operations the user requires.
However, a mouse provides only positioning and input functions. Because its function is limited, when the user needs to perform other operations on the electronic device, the user can only attach additional external devices, which makes the user's operation steps cumbersome.
Disclosure of Invention
Embodiments of the present application aim to provide an input device control method and apparatus and an electronic device, which can solve the problem that a user who needs to perform other operations on an electronic device can only attach additional external devices, making the user's operation steps cumbersome.
In order to solve the technical problem, the present application is implemented as follows:
In a first aspect, an embodiment of the present application provides an input device control method applied to an electronic device, where the electronic device is connected to an input device that supports a planar operation mode and a stereoscopic operation mode. The method includes: when the electronic device runs a target application, determining a target operation mode of the input device that matches the application type of the target application, where the application type includes a two-dimensional application and/or a three-dimensional application; and sending, by the electronic device, a control instruction to the input device, where the control instruction is used to control the input device to run in the target operation mode, the target operation mode being the planar operation mode or the stereoscopic operation mode.
In a second aspect, an embodiment of the present application provides an input device control apparatus that includes a determining module and a sending module. The determining module is configured to determine, when a target application is run, a target operation mode of the input device that matches the application type of the target application, where the application type includes a two-dimensional application and/or a three-dimensional application. The sending module is configured to send a control instruction to the input device, where the control instruction is used to control the input device to run in the target operation mode determined by the determining module, the target operation mode being the planar operation mode or the stereoscopic operation mode.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor, implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiments of the present application, an electronic device is connected to an input device that supports a planar operation mode and a stereoscopic operation mode. When a target application is run, the electronic device may determine a target operation mode of the input device that matches the application type of the target application, where the application type includes a two-dimensional application and/or a three-dimensional application. After determining the target operation mode, the electronic device sends a control instruction to the input device, where the control instruction is used to control the input device to run in the target operation mode, and the target operation mode may be the planar operation mode or the stereoscopic operation mode. In this way, the electronic device can directly determine the application type of the target application and the matching target operation mode, and then send a control instruction to control the input device to run in that mode. This enriches the usage scenarios and functions of the input device, spares the user from needing multiple input devices, and improves the efficiency with which the user uses the input device.
Drawings
Fig. 1 is a schematic flowchart of an input device control method according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an input device control apparatus according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 4 is a second schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein. Moreover, the terms "first", "second", and the like do not limit the number of the objects they qualify; for example, a first object may be one object or more than one. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and following objects.
The term "mouse handle" appearing in the embodiments of the present application is explained below:
the mouse handle is an input device which integrates a mouse function and a handle function at the same time, and belongs to an external device of electronic equipment. The electronic device may be a mobile phone, a computer, Augmented Reality (AR) glasses, and the like. The mouse handle and the electronic device can be connected in a wireless connection mode (for example, Bluetooth connection) or in a wired connection mode.
Because the mouse handle has multiple functions and modes, it may be provided with a hardware switch for changing between them: the user can switch the functions and modes of the mouse handle by operating this switch, or can switch them through the electronic device connected to the mouse handle. It should be noted that a mouse handle may support both switching methods, and the user may choose either one to switch functions and modes while using it.
It can be understood that when the mouse handle is switched to handle mode, it can enable the handle function alone, or enable the handle function and the mouse function at the same time; that is, the user can use the mouse function while using the handle function of the mouse handle.
Different functions of the mouse handle generally suit different application scenarios. For example, in a planar operation scenario the mouse mode, i.e., the planar operation mode, is generally used; in a stereoscopic operation scenario, for example when the user waves the mouse handle in the air, the handle mode, i.e., the stereoscopic operation mode, is generally turned on.
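The mode behavior described above can be sketched in code. This is a hedged illustration only: the patent specifies no API, so the class and method names below are hypothetical, and the rule that handle mode may keep the mouse function enabled mirrors the simultaneous-use case in the preceding paragraphs.

```python
# Illustrative sketch of the mouse-handle modes described above.
# All names are hypothetical; the patent does not define an API.

class MouseHandle:
    PLANE = "plane"    # mouse mode: planar operation
    STEREO = "stereo"  # handle mode: stereoscopic operation

    def __init__(self):
        # A mouse handle starts in mouse (planar) mode in this sketch.
        self.mode = self.PLANE
        self.mouse_enabled = True
        self.handle_enabled = False

    def switch_mode(self, mode, keep_mouse=False):
        """Switch between modes. In handle mode the mouse function may
        stay on, matching the simultaneous-use case in the text."""
        self.mode = mode
        if mode == self.PLANE:
            self.mouse_enabled, self.handle_enabled = True, False
        else:
            self.handle_enabled = True
            self.mouse_enabled = keep_mouse
```

Either the hardware switch or the connected electronic device could drive `switch_mode`; the patent allows both paths.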
The following describes in detail a control method of an input device according to an embodiment of the present application with reference to the accompanying drawings.
In the related art, an electronic device has various external input devices, and different external input devices have different functions, so a specific external input device connected to the electronic device can control it to complete different operations. Among the many external input devices, the mouse and the handle are two that users often use. With a mouse, the user drags it on a plane so that the electronic device receives the positioning and input operations the user requires; with a handle, the user controls the electronic device to complete the corresponding control operations through touch input on the handle. However, when the user needs to switch from a mouse scenario to a handle scenario, the user has to connect the handle externally and complete input with it. As this process shows, whenever the user wants to use a different input device, the user needs to connect a new device and replace the one currently in use. The user's operation procedure is therefore cumbersome.
To solve the above problem, the present application uses an input device with multiple operation modes. If the user needs to switch between the different operation modes of such an input device, then in an embodiment of the present application, when a target application is run, the electronic device connected to the input device may determine a target operation mode of the input device that matches the application type of the target application, where the application type may include a two-dimensional application and/or a three-dimensional application. After the electronic device determines the target operation mode, it may send a control instruction to the input device to instruct it to run in the target operation mode, where the target operation mode may be the planar operation mode or the stereoscopic operation mode. In this way, the electronic device can directly determine the application type of the target application and the matching target operation mode, and then send a control instruction to control the input device to run in that mode. This enriches the usage scenarios and functions of the input device, spares the user from needing multiple input devices, and improves the efficiency with which the user uses the input device.
An embodiment of the present application provides an input device control method. As shown in fig. 1, the method is applied to an electronic device connected to an input device, the input device supports a planar operation mode and a stereoscopic operation mode, and the method includes the following steps 301 and 302:
step 301: and under the condition that the electronic equipment runs the target application, determining a target operation mode of the input equipment matched with the application type of the target application.
In this embodiment of the application, the connection between the input device and the electronic device may be wireless, for example a Bluetooth or WiFi connection, or wired; this is not limited in the embodiments of the present application.
In the embodiments of the present application, the input device may be an electronic device having multiple modes and/or functions. It can serve as an external input device of another electronic device: performing a control operation on the input device controls the other electronic device to perform the corresponding operation.
In the embodiments of the present application, the electronic device may be any type of electronic device having a system capable of determining application attributes, such as a mobile phone, a computer, or AR glasses.
In this embodiment of the present application, the target application may be any application, and the application may be an application in an electronic device, and may also be an application in another electronic device connected to the electronic device, which is not limited in this embodiment of the present application.
In the embodiment of the present application, the application types of the target application include a two-dimensional application and/or a three-dimensional application.
In one example, two-dimensional and three-dimensional applications may be distinguished by whether the content an application displays while running appears two-dimensional or three-dimensional to the user. For example, the content a document editing application presents on an electronic device appears to the user as a two-dimensional plane, while the content a three-dimensional modeling application presents appears as a three-dimensional solid.
In another example, they may be distinguished by whether the displayed content gives the user an immersive experience while the application runs: if it does, the application is a three-dimensional application; if not, it is a two-dimensional application.
In the embodiments of the present application, whether a target application is two-dimensional or three-dimensional is set when the application is created, and a single application may be both a two-dimensional application and a three-dimensional application. For example, a video application may have a two-dimensional mode and a three-dimensional mode: when the two-dimensional mode is on, the user watches video no differently than usual; when the three-dimensional mode is on, the video application triggers the electronic device to apply preset three-dimensional processing to the video, so that the user has a surround, three-dimensional look and feel while watching.
In the embodiment of the present application, the target operation mode is a planar operation mode or a stereoscopic operation mode.
Illustratively, the planar operation mode means that the user operates the input device in a planar space. Taking a three-dimensional space formed by the three axes X-Y-Z as an example, when the input device is in the planar operation mode it operates only on a plane formed by two of the three axes, i.e., the operation directions of the input device include only the directions of those two axes; for example, the input device operates on the X-Y plane or on the Y-Z plane.
Illustratively, the stereoscopic operation mode means that the user operates the input device in a three-dimensional space. Taking a three-dimensional space formed by the three axes X-Y-Z as an example, when the input device is in the stereoscopic operation mode it operates in the space formed by all three axes, i.e., the operation directions of the input device include the directions of the three axes.
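The axis behavior of the two modes can be sketched as a small function that constrains a three-dimensional displacement to the active plane. This is an illustrative sketch only; the function and the `plane` parameter naming the two active axes are assumptions, not part of the patent.

```python
# Hedged sketch: constrain a 3D displacement to the active plane,
# following the X-Y-Z axis description above. Names are illustrative.

def constrain_motion(dx, dy, dz, mode, plane="xy"):
    """In planar mode, zero out the axis that is not in the active
    plane; in stereoscopic mode, pass all three axes through."""
    if mode == "stereo":
        return (dx, dy, dz)
    if plane == "xy":
        return (dx, dy, 0.0)
    if plane == "yz":
        return (0.0, dy, dz)
    return (dx, 0.0, dz)  # X-Z plane
```

Under this sketch, the same raw sensor deltas yield planar motion in mouse mode and full 3D motion in handle mode.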
In one example, the input device may be a device having both a mouse function and a handle function, for example a mouse handle; for details of the mouse handle, refer to the foregoing description, which is not repeated here.
In the embodiments of the present application, when the application type of the target application is a three-dimensional application, the target operation mode that matches it is the stereoscopic operation mode; when the application type is a two-dimensional application, the target operation mode that matches it is the planar operation mode.
It can be understood that the electronic device may determine the target operation mode of the input device in several ways. First, when the electronic device running the target application switches to a different target application, or switches between modes within the target application (for example, between a three-dimensional mode and a two-dimensional mode), an application switching instruction triggers the electronic device to re-determine the application type of the target application and then the target operation mode of the input device. Second, the electronic device may determine the application type periodically, i.e., once per interval, and then determine the target operation mode. Third, the electronic device may determine the target operation mode continuously, i.e., keep monitoring the application type of the target application. The user may customize which determination method is used, and when periodic determination is used, the period can be user-defined; for example, the user may set the electronic device to determine the application type every 15 seconds.
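The determination strategies above (triggered by an application switch, periodic, or continuous) can be sketched as a single predicate deciding when to re-determine the application type. This is a hedged illustration; the strategy names and the 15-second default are taken from the example in the text, while the function itself is an assumption.

```python
# Sketch of the three determination strategies described above.
# Function and strategy names are hypothetical.

def should_redetermine(strategy, event=None, elapsed_s=0, period_s=15):
    """Decide whether the electronic device should re-determine the
    target application's type under the chosen strategy."""
    if strategy == "on_switch":
        # Only an application/mode switch instruction triggers it.
        return event == "app_switched"
    if strategy == "periodic":
        # Re-determine once per user-defined interval.
        return elapsed_s >= period_s
    # "continuous": re-check on every cycle.
    return strategy == "continuous"
```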
Step 302: and the electronic equipment sends a control instruction to the input equipment, wherein the control instruction is used for controlling the input equipment to operate in the target operation mode.
In this embodiment, the control instruction may be data information that the input device can read and execute.
Example 1: the electronic equipment is used as AR glasses, and the input equipment is a mouse handle. When the user wears the AR glasses, in a case where an immersive chat application (i.e., the target application) is run in the AR glasses, the operation mode in which the mouse handle matches the application type of the immersive chat application is determined, where the application type of the immersive chat application is a three-dimensional application, and the AR glasses may determine that the operation mode of the mouse handle is a stereoscopic operation mode (i.e., the target operation mode). After the operation mode of the mouse handle is determined to be the three-dimensional operation mode, the AR glasses send a control instruction to the mouse handle, and the control instruction is used for controlling the mouse handle to operate in the three-dimensional operation mode.
In the input device control method provided in this embodiment of the present application, an electronic device is connected to an input device that supports a planar operation mode and a stereoscopic operation mode. When a target application is run, the electronic device may determine a target operation mode of the input device that matches the application type of the target application, where the application type includes a two-dimensional application and/or a three-dimensional application. After determining the target operation mode, the electronic device sends a control instruction to the input device, where the control instruction is used to control the input device to run in the target operation mode, which may be the planar operation mode or the stereoscopic operation mode. In this way, the electronic device can directly determine the application type of the target application and the matching target operation mode, and then send a control instruction to control the input device to run in that mode. This enriches the usage scenarios and functions of the input device, spares the user from needing multiple input devices, and improves the efficiency with which the user uses the input device.
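Steps 301 and 302 can be sketched end to end: determine the mode from the application type, then send a control instruction. Note the instruction format below is invented for illustration; the patent only says the instruction is data the input device can read and execute, and the `send` callback stands in for whatever Bluetooth/WiFi/wired channel connects the devices.

```python
# End-to-end sketch of steps 301-302. The instruction dictionary and
# the 'send' transport callback are hypothetical.

def control_input_device(app_type, send):
    """Determine the target operation mode for the running application
    (step 301) and send a control instruction to the input device
    (step 302). Returns the instruction that was sent."""
    mode = "stereo" if app_type == "3d" else "plane"
    instruction = {"cmd": "set_mode", "mode": mode}
    send(instruction)
    return instruction
```

In Example 1, the AR glasses would call this with `app_type="3d"` for the immersive chat application, so the mouse handle receives an instruction to enter the stereoscopic operation mode.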
Optionally, in this embodiment of the present application, before step 302, the input device control method further includes the following step 303:
Step 303: the electronic device acquires application identification information of the target application when running the target application.
Illustratively, the application identification information includes: category information of the creation application that created the target application, and/or format information of files created by the target application, and/or application space information of the target application. The application identification information matches the application type.
Illustratively, the application identification information is written into the target application during its serialization process, i.e., it is marked in the target application. In other words, the target application already contains the application identification information when it is first generated.
In one example, different creation applications create target applications of different application types, i.e., each creation application corresponds to target applications of a particular type: two-dimensional applications are created with creation applications dedicated to two-dimensional applications, and three-dimensional applications with creation applications dedicated to three-dimensional applications. During the serialization of a target application, its creation application stores its own category information in the target application. For example, a three-dimensional application may be created with three-dimensional development software such as UNITY, UNREAL, C4D, or SKETCH UP.
In one example, the electronic device creates a target application file through the target application, and the file may contain format information, which may include file type information and three-dimensional or two-dimensional file format information. Three-dimensional file format information indicates that the file was created by a three-dimensional application, and two-dimensional file format information indicates that it was created by a two-dimensional application. The format information of files generated by the target application is stored in the target application during serialization. For example, three-dimensional file formats may include FBX, OBJ, and SKP.
In one example, during development, a target application is created according to the coordinate information it will use. If the target application is a three-dimensional application, it is created using X-Y-Z three-axis coordinate information; if it is a two-dimensional application, it is created using X-Y or Y-Z two-axis coordinate information, or without coordinate information, in which case it defaults to a two-dimensional application. The coordinate information used during creation is stored in the target application during serialization, and different coordinate information corresponds to different application types; for example, X-Y-Z three-axis coordinate information corresponds to a three-dimensional application.
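The three kinds of identification information above can be sketched as one lookup. The sets of 3D creation tools and file formats come from the examples in the text (UNITY, UNREAL, C4D, SKETCH UP; FBX, OBJ, SKP); the function itself, its parameters, and the priority order among the three signals are assumptions made for illustration.

```python
# Sketch: infer the application type from the identification
# information described above. Names and priority order are
# illustrative, not specified by the patent.

THREE_D_CREATORS = {"UNITY", "UNREAL", "C4D", "SKETCH UP"}
THREE_D_FORMATS = {"FBX", "OBJ", "SKP"}

def application_type(creator=None, file_format=None, axes=None):
    """Return '3d' if any identification signal marks the application
    as three-dimensional, otherwise '2d'."""
    if creator and creator.upper() in THREE_D_CREATORS:
        return "3d"
    if file_format and file_format.upper() in THREE_D_FORMATS:
        return "3d"
    if axes and set(axes) == {"x", "y", "z"}:
        return "3d"
    # Two axes, no axes, or no 3D marker: default to two-dimensional.
    return "2d"
```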
For example, the electronic device may automatically read the application type of the target application indicated in the application identification information of the target application.
Therefore, the electronic device can read the application type of the target application by acquiring the application identification information and thereby quickly determine the target operation mode of the input device. This improves the accuracy and efficiency with which the electronic device determines the target operation mode, the user does not need to change the mode of the input device manually, and the efficiency of using the input device is improved.
Optionally, in this embodiment of the application, in a case that the input device is in a plane operation mode, the input device is configured to control a target object displayed when the electronic device runs the target application to move in a two-dimensional space; and under the condition that the input device is in a three-dimensional operation mode, the input device is used for controlling the target object displayed when the electronic device runs the target application to move in a three-dimensional space.
Illustratively, the electronic device is an electronic device capable of displaying the target object.
In one example, the electronic device may be an electronic device having a display screen.
For example, the planar operation mode and the stereoscopic operation mode may refer to the foregoing description, and are not described herein again.
For example, the target object may be the content through which the input device provides input to the electronic device while the target application runs. For example, when the input device is a mouse, the target object is a cursor; when the input device is a handle, the target object may be a direction indicator or an operation indicator corresponding to the handle.
Illustratively, the two-dimensional space refers to a two-dimensional space in terms of visual effect or usage scenario; correspondingly, the three-dimensional space refers to a three-dimensional space in terms of visual effect or usage scenario.
It is to be understood that, in the related art, different applications may be designed around a reference visual scene or usage scene, according to actual requirements, at the time of creation. For example, an immersive video application requires the user to have an immersive view when watching a video through the application, so at creation time it may be designed as an application that presents video with a visual three-dimensional effect. As another example, a document editing application is viewed by the user on a two-dimensional plane, so at creation time it may be designed as an application that is visually presented with a two-dimensional plane effect.
It should be noted that the target objects corresponding to different operation modes may have the same form or different forms; in each operation mode, the form is adapted to the two-dimensional or three-dimensional space of that mode.
Example 2: taking the input device as a mouse handle and assuming the electronic device is a pair of AR glasses, when the mouse handle is in the planar operation mode, the mouse function of the mouse handle is enabled, and the mouse handle is used to control a cursor (i.e., the target object) displayed in the display area of the AR glasses to move in a two-dimensional plane (i.e., the two-dimensional space); when the mouse handle is in the stereoscopic operation mode, the handle function of the mouse handle is enabled, and the mouse handle is used to control a direction control and an operation control (i.e., the target object) displayed in the display area of the AR glasses to move in three-dimensional space.
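The flow of Example 2 can be sketched as follows. The mode names, instruction fields, and function labels are hypothetical; the embodiment does not specify a wire format for the control instruction sent to the mouse handle.

```python
def target_mode(app_type):
    """Pick the operation mode matching the application type, as in Example 2."""
    return "planar" if app_type == "two-dimensional" else "stereoscopic"

def control_instruction(mode):
    """Build the control instruction sent to the mouse handle.

    In the planar operation mode the instruction enables the mouse function
    (cursor moving in a two-dimensional plane); in the stereoscopic operation
    mode it enables the handle function (direction/operation controls moving
    in three-dimensional space).
    """
    return {"mode": mode, "enable": "mouse" if mode == "planar" else "handle"}
```

On receiving such an instruction, the mouse handle would switch which of its two functions is active, which is all the planar/stereoscopic distinction requires here.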
Therefore, the input device can control its corresponding target object to be displayed and moved in different types of spaces according to different usage modes, which makes the input device convenient to use in different operation modes and improves the efficiency and experience of using the input device.
Optionally, in this embodiment of the application, the input device is a mouse handle; under the plane operation mode, the control instruction is used for controlling the mouse handle to execute a mouse function; and under the three-dimensional operation mode, the control instruction is used for controlling the mouse handle to execute a handle function.
For example, the description of the mouse handle may refer to the foregoing description, and will not be repeated here.
It should be noted that, in the control method of the input device provided in the embodiments of the present application, the execution subject may be a control apparatus of the input device, or a control module in the control apparatus for executing the control method. In the embodiments of the present application, a control apparatus of an input device executing the control method is taken as an example to describe the control apparatus provided in the embodiments of the present application.
Fig. 2 is a schematic diagram of a possible structure of a control apparatus for an input device according to an embodiment of the present application. As shown in fig. 2, the apparatus 600 includes a determining module 601 and a sending module 602. The determining module 601 is configured to determine, in a case where a target application is running, a target operation mode of the input device that matches the application type of the target application, where the application type includes a two-dimensional application and/or a three-dimensional application. The sending module 602 is configured to send the control instruction determined by the determining module 601 to the input device, where the control instruction is used to control the input device to operate in the target operation mode; the target operation mode is a planar operation mode or a stereoscopic operation mode.
According to the control apparatus of the input device, the electronic device is connected to the input device, and the input device supports a planar operation mode and a stereoscopic operation mode. When a target application is run, the electronic device can determine the target operation mode matching the application type of the target application, where the application type includes a two-dimensional application and/or a three-dimensional application. After the target operation mode is determined, the electronic device sends a control instruction to the input device, where the control instruction is used to control the input device to operate in the target operation mode, and the target operation mode may be the planar operation mode or the stereoscopic operation mode. Therefore, the electronic device can directly determine the type of the target application and the matching target operation mode, and then send a control instruction to the input device to control it to operate in the target operation mode, which enriches the usage scenarios and functions of the input device, reduces the number of input devices the user needs, and improves the efficiency of using the input device.
Optionally, in this embodiment of the present application, the apparatus 600 further includes: an acquisition module 603; the obtaining module 603 is configured to obtain application identification information of a target application when the target application is running; wherein, the application identification information includes: creating application category information of the target application, and/or format information of a target application creation file, and/or application space information of the target application; the application identification information matches the application type.
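The three modules of apparatus 600 (acquisition module 603, determining module 601, and sending module 602) can be sketched as a single class. All names below are illustrative, and the transport callback stands in for whatever channel (e.g., a radio frequency unit) actually reaches the input device.

```python
class InputDeviceControlApparatus:
    """Sketch of apparatus 600 with its acquisition, determining, and
    sending modules. Not an implementation prescribed by the document."""

    def __init__(self, send_fn):
        self._send = send_fn  # stands in for the transmit channel

    def acquire_identification(self, app):
        # Acquisition module 603: read the application identification
        # information stored in the running target application.
        return app.get("identification", {})

    def determine_mode(self, app_type):
        # Determining module 601: match the application type to a mode.
        return "planar" if app_type == "two-dimensional" else "stereoscopic"

    def send_instruction(self, mode):
        # Sending module 602: send the control instruction to the input device.
        self._send({"mode": mode})
```

A caller would chain the three modules: acquire the identification information, determine the mode from the application type, then send the resulting instruction.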
Optionally, in this embodiment of the application, in a case that the input device is in a plane operation mode, the input device is configured to control a target object displayed when the electronic device runs the target application to move in a two-dimensional space; and under the condition that the input device is in a three-dimensional operation mode, the input device is used for controlling the target object displayed when the electronic device runs the target application to move in a three-dimensional space.
Optionally, in this embodiment of the application, the input device is a mouse handle; under the plane operation mode, the control instruction is used for controlling the mouse handle to execute a mouse function; and under the three-dimensional operation mode, the control instruction is used for controlling the mouse handle to execute a handle function.
The control device of the input device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The control apparatus of the input device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or other possible operating systems, and embodiments of the present application are not specifically limited.
The control apparatus of the input device provided in the embodiment of the present application can implement each process implemented in the method embodiment of fig. 1, and is not described here again to avoid repetition.
It should be noted that, as shown in fig. 2, the modules necessarily included in the apparatus 600 are illustrated with solid-line boxes, such as the determining module 601.
Optionally, as shown in fig. 3, an electronic device 800 is further provided in the embodiment of the present application, and includes a processor 801, a memory 802, and a program or an instruction that is stored in the memory 802 and is executable on the processor 801, where the program or the instruction is executed by the processor 801 to implement each process of the embodiment of the control method for an input device, and can achieve the same technical effect, and is not described herein again to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 4 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, and a processor 110. The user input unit 107 includes a touch panel 1071 and other input devices 1072; the display unit 106 includes a display panel 1061; the input unit 104 includes a graphics processing unit 1041 and a microphone 1042; and the memory 109 may be used to store software programs (e.g., an operating system and application programs needed for at least one function) and various data.
Those skilled in the art will appreciate that the electronic device 100 may further comprise a power source (e.g., a battery) for supplying power to various components, and the power source may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 4 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than those shown, combine some components, or arrange different components, which is not described here again.
When the electronic device 100 is connected to an input device, the input device supports a planar operation mode and a stereoscopic operation mode, and the processor 110 is configured to determine, in a case where a target application is running, a target operation mode of the input device that matches the application type of the target application, where the application type includes a two-dimensional application and/or a three-dimensional application; the radio frequency unit 101 is configured to send a control instruction to the input device, where the control instruction is used to control the input device to operate in the target operation mode; and the target operation mode is the planar operation mode or the stereoscopic operation mode.
The electronic device provided by the embodiment of the application is connected to an input device, and the input device supports a planar operation mode and a stereoscopic operation mode. When a target application is run, the electronic device can determine the target operation mode of the input device matching the application type of the target application, where the application type includes a two-dimensional application and/or a three-dimensional application. After the target operation mode is determined, the electronic device sends a control instruction to the input device, where the control instruction is used to control the input device to run in the target operation mode, and the target operation mode may be the planar operation mode or the stereoscopic operation mode. Therefore, the electronic device can directly determine the type of the target application and the matching target operation mode, and then send a control instruction to the input device to control it to operate in the target operation mode, which enriches the usage scenarios and functions of the input device, reduces the number of input devices the user needs, and improves the efficiency of using the input device.
Optionally, the processor 110 is further configured to obtain application identification information of the target application when the target application is run; wherein, the application identification information includes: creating application category information of the target application, and/or format information of a target application creation file, and/or application space information of the target application; the application identification information matches the application type.
Optionally, when the input device is in a plane operation mode, the input device is configured to control a target object displayed when the electronic device runs the target application to move in a two-dimensional space; and under the condition that the input device is in a three-dimensional operation mode, the input device is used for controlling the target object displayed when the electronic device runs the target application to move in a three-dimensional space.
Optionally, the input device is a mouse handle; under the plane operation mode, the control instruction is used for controlling the mouse handle to execute a mouse function; and under the three-dimensional operation mode, the control instruction is used for controlling the mouse handle to execute a handle function.
It should be understood that, in the embodiment of the present application, the input Unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the Graphics Processing Unit 1041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071 is also referred to as a touch screen. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 109 may be used to store software programs as well as various data including, but not limited to, application programs and an operating system. The processor 110 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above-mentioned control method for the input device, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the control method embodiment of the input device, and can achieve the same technical effect, and in order to avoid repetition, the description is omitted here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved; for example, the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A control method of an input device is applied to an electronic device, the electronic device is connected with the input device, the input device supports a plane operation mode and a stereo operation mode, and the method comprises the following steps:
in a case of running a target application, automatically acquiring application identification information of the target application, determining an application type of the target application, and determining a target operation mode of the input device that matches the application type of the target application, wherein the application type comprises a two-dimensional application and/or a three-dimensional application;
sending a control instruction to the input device, wherein the control instruction is used for controlling the input device to operate in the target operation mode;
wherein the target operation mode is a planar operation mode or a stereoscopic operation mode.
2. The method of claim 1,
the application identification information includes: creating application category information of the target application, and/or format information of a target application creation file, and/or application space information of the target application; the application identification information matches the application type.
3. The method of claim 1,
under the condition that the input device is in a plane operation mode, the input device is used for controlling the target object displayed when the electronic device runs the target application to move in a two-dimensional space;
and under the condition that the input device is in a stereoscopic operation mode, the input device is used for controlling the target object displayed when the electronic device runs the target application to move in a three-dimensional space.
4. The method of claim 1, wherein the input device is a mouse handle; in the plane operation mode, the control instruction is used for controlling the mouse handle to execute a mouse function; and under the three-dimensional operation mode, the control instruction is used for controlling the mouse handle to execute a handle function.
5. The control device of the input equipment is characterized by comprising an acquisition module, a determination module and a sending module:
the acquisition module is used for automatically acquiring the application identification information of the target application under the condition of running the target application;
the determining module is used for determining the application type of a target application in a case of running the target application, and determining a target operation mode of the input device that matches the application type of the target application, wherein the application type comprises a two-dimensional application and/or a three-dimensional application;
the sending module is configured to send the control instruction determined by the determining module to the input device, where the control instruction is used to control the input device to operate in the target operation mode;
wherein the target operation mode is a planar operation mode or a stereoscopic operation mode.
6. The apparatus of claim 5,
the application identification information includes: creating application category information of the target application, and/or format information of a target application creation file, and/or application space information of the target application; the application identification information matches the application type.
7. The apparatus of claim 5,
under the condition that the input device is in a plane operation mode, the input device is used for controlling the target object displayed when the electronic device runs the target application to move in a two-dimensional space;
and under the condition that the input device is in a stereoscopic operation mode, the input device is used for controlling the target object displayed when the electronic device runs the target application to move in a three-dimensional space.
8. The apparatus of claim 5, wherein the input device is a mouse handle; in the plane operation mode, the control instruction is used for controlling the mouse handle to execute a mouse function; and under the three-dimensional operation mode, the control instruction is used for controlling the mouse handle to execute a handle function.
9. An electronic device comprising a processor, a memory and a program or instructions stored on the memory and executable on the processor, the program or instructions, when executed by the processor, implementing the steps of the method of controlling an input device according to any one of claims 1-4.
10. A readable storage medium, characterized in that the readable storage medium stores thereon a program or instructions which, when executed by a processor, implement the steps of the control method of an input device according to any one of claims 1-4.
CN202011262707.1A 2020-11-12 2020-11-12 Input device control method and device and electronic device Active CN112328155B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011262707.1A CN112328155B (en) 2020-11-12 2020-11-12 Input device control method and device and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011262707.1A CN112328155B (en) 2020-11-12 2020-11-12 Input device control method and device and electronic device

Publications (2)

Publication Number Publication Date
CN112328155A CN112328155A (en) 2021-02-05
CN112328155B true CN112328155B (en) 2022-05-17

Family

ID=74318069

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011262707.1A Active CN112328155B (en) 2020-11-12 2020-11-12 Input device control method and device and electronic device

Country Status (1)

Country Link
CN (1) CN112328155B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115033119A (en) * 2022-08-04 2022-09-09 荣耀终端有限公司 Method and device for switching configuration modes of stylus pen and electronic equipment

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2011203158A1 (en) * 2010-06-29 2012-01-19 X6D Limited Universal 3D glasses
CN102141891A (en) * 2011-03-23 2011-08-03 友达光电股份有限公司 Method for operating touch panel
CN108595009B (en) * 2012-02-29 2020-12-18 联想(北京)有限公司 Man-machine interaction control method and electronic terminal
US10216355B2 (en) * 2012-06-17 2019-02-26 Atheer, Inc. Method for providing scale to align 3D objects in 2D environment
CN106020518B (en) * 2016-07-06 2019-02-19 童宗伟 Finger ring wireless mouse and its control method
EP3561667B1 (en) * 2017-01-26 2022-02-23 Huawei Technologies Co., Ltd. Method for displaying 2d application in vr device, and terminal
CN107688348B (en) * 2017-08-10 2020-10-20 中国科学院半导体研究所 Wireless man-machine interaction equipment and method for realizing pointing control function
US11550421B2 (en) * 2017-08-21 2023-01-10 Huawei Technologies Co., Ltd. Electronic device control method and input device
CN111090342A (en) * 2018-10-24 2020-05-01 宏景科技股份有限公司 Method for switching plane operation mode and 3D space operation mode by mouse
CN111324224A (en) * 2020-04-01 2020-06-23 国微集团(深圳)有限公司 Mouse based on pressure induction and control method thereof

Also Published As

Publication number Publication date
CN112328155A (en) 2021-02-05

Similar Documents

Publication Publication Date Title
CN113596555B (en) Video playing method and device and electronic equipment
CN113209601B (en) Interface display method and device, electronic equipment and storage medium
CN114296595A (en) Display method and device and electronic equipment
CN112416485A (en) Information guiding method, device, terminal and storage medium
CN112486444A (en) Screen projection method, device, equipment and readable storage medium
CN111651106A (en) Unread message prompting method, unread message prompting device, unread message prompting equipment and readable storage medium
CN112230907B (en) Program generation method, device, terminal and storage medium
CN112328155B (en) Input device control method and device and electronic device
CN112698762B (en) Icon display method and device and electronic equipment
CN113963108A (en) Medical image cooperation method and device based on mixed reality and electronic equipment
CN112416199B (en) Control method and device and electronic equipment
CN113485625A (en) Electronic equipment response method and device and electronic equipment
CN112965773A (en) Method, apparatus, device and storage medium for information display
CN112911052A (en) Information sharing method and device
CN115421631A (en) Interface display method and device
CN115268817A (en) Screen-projected content display method, device, equipment and storage medium
CN111857474B (en) Application program control method and device and electronic equipment
CN115037874A (en) Photographing method and device and electronic equipment
CN114461022A (en) Separable module management method and device of terminal and terminal
CN114416269A (en) Interface display method and display device
CN111898353A (en) Table display method, device and medium
CN111991801A (en) Display method and device and electronic equipment
CN112328156B (en) Input device control method and device and electronic device
CN110941389A (en) Method and device for triggering AR information points by focus
CN113157180B (en) Touch operation method and device for application and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant