CN112181582A - Method, apparatus, device and storage medium for device control - Google Patents
Method, apparatus, device and storage medium for device control
- Publication number
- CN112181582A CN112181582A CN202011202114.6A CN202011202114A CN112181582A CN 112181582 A CN112181582 A CN 112181582A CN 202011202114 A CN202011202114 A CN 202011202114A CN 112181582 A CN112181582 A CN 112181582A
- Authority
- CN
- China
- Prior art keywords
- gesture
- instruction
- function
- menu bar
- track
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims abstract description 43
- 230000006870 function Effects 0.000 claims abstract description 109
- 230000015654 memory Effects 0.000 claims description 20
- 230000008569 process Effects 0.000 claims description 6
- 230000006399 behavior Effects 0.000 claims description 4
- 230000001427 coherent effect Effects 0.000 description 6
- 230000009471 action Effects 0.000 description 5
- 238000004891 communication Methods 0.000 description 4
- 238000004590 computer program Methods 0.000 description 4
- 238000010586 diagram Methods 0.000 description 4
- 230000003993 interaction Effects 0.000 description 4
- 230000004048 modification Effects 0.000 description 3
- 238000012986 modification Methods 0.000 description 3
- 230000007547 defect Effects 0.000 description 2
- 238000001514 detection method Methods 0.000 description 2
- 239000004973 liquid crystal related substance Substances 0.000 description 2
- 238000012545 processing Methods 0.000 description 2
- 230000001413 cellular effect Effects 0.000 description 1
- 238000010276 construction Methods 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 230000001788 irregular Effects 0.000 description 1
- 238000010295 mobile communication Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000001953 sensory effect Effects 0.000 description 1
- 238000000926 separation method Methods 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses a method, an apparatus, a device and a storage medium for device control, and relates to the fields of device control, application program control, browsers and the like. The specific implementation scheme is as follows: displaying a menu bar according to an instruction corresponding to a first gesture, wherein the menu bar comprises at least one candidate function entry; selecting a target function entry from the candidate function entries according to an instruction corresponding to a second gesture, wherein the distance between the starting position of the second gesture and the ending position of the first gesture is within an allowable range; and executing the function corresponding to the target function entry. This solves the problems in the prior art that the entry path of a target function is too deep and that the learning and operation costs are high for new users.
Description
Technical Field
The present application relates to the field of computer technologies, and in particular, to the fields of device control, application control, and browsers.
Background
A native application program (native APP) is customized and optimized cooperatively by two parts, 'cloud server data + APP client'; all UI elements, data content and the logic framework of the APP can be installed on a smart mobile device such as a mobile phone.
Native APPs integrate functions such as information pushing, local resource access (camera, Bluetooth, GPS, gyroscope and the like) and user interaction, and therefore have broad prospects. Based on this, more and more enterprises tend to build their own native APPs. This surplus of functions results in a native APP carrying too many tasks. Because resources on the mobile device are limited, some functions cannot be used directly in the corresponding scene, or the usage path of some functions is too deep. These defects impose high learning and operation costs on new users.
Disclosure of Invention
The application provides a method, a device, equipment and a storage medium for equipment control.
According to an aspect of the present application, there is provided a method of device control, which may include the steps of:
displaying a menu bar according to an instruction corresponding to the first gesture, wherein the menu bar comprises at least one candidate function entry;
selecting a target function entry from the candidate function entries according to an instruction corresponding to the second gesture; the distance between the starting position of the second gesture and the ending position of the first gesture is within an allowable range;
and executing the function corresponding to the target function entry.
According to another aspect of the present application, there is provided an apparatus for device control, which may include:
the menu bar display module is used for displaying a menu bar according to the instruction corresponding to the first gesture, wherein the menu bar comprises at least one candidate function entry;
the target function entry selection module is used for selecting a target function entry from the candidate function entries according to the instruction corresponding to the second gesture; the distance between the starting position of the second gesture and the ending position of the first gesture is within an allowable range;
and the function execution module is used for executing the function corresponding to the target function entry.
In a third aspect, an embodiment of the present application provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to cause the at least one processor to perform a method provided by any one of the embodiments of the present application.
In a fourth aspect, embodiments of the present application provide a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform a method provided by any one of the embodiments of the present application.
According to the application, the target function entry can be reached via a shorter operation path. This solves the problems in the prior art that the entry path of a target function is too deep and that the learning and operation costs are high for new users. For the user, the function corresponding to the target function entry can be found and triggered quickly with two simple coherent gestures, which improves user experience and satisfaction.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present application, nor do they limit the scope of the present application. Other features of the present application will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
FIG. 1 is a flow chart of a method of device control according to the present application;
FIG. 2 is a schematic diagram of touch gesture control according to the present application;
FIG. 3 is a flow chart of a manner of determining an instruction corresponding to a second gesture according to the present application;
FIG. 4 is a flow chart illustrating a manner of determining an instruction corresponding to a first gesture according to the present application;
FIG. 5 is a schematic diagram of an apparatus for device control according to the present application;
fig. 6 is a block diagram of an electronic device for implementing a method for device control according to an embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of those embodiments to aid understanding; these details are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present application. Likewise, descriptions of well-known functions and constructions are omitted below for clarity and conciseness.
As shown in fig. 1, the present application provides a method of device control, which may include the steps of:
s101: displaying a menu bar according to an instruction corresponding to the first gesture, wherein the menu bar comprises at least one candidate function entry;
s102: selecting a target function entry from the candidate function entries according to an instruction corresponding to the second gesture; the distance between the starting position of the second gesture and the ending position of the first gesture is within an allowable range;
s103: and executing the function corresponding to the target function entry.
The execution subject of the embodiments of the present application may be a device such as a smartphone or a tablet computer. In the embodiments of the present application, such a device may be referred to as a controlled device.
A gesture may be a touch, a slide or a similar operation applied to the screen of the controlled device. For example, gestures may include taps, left-right swipes, up-down swipes, and arc-shaped or other irregular swipes.
The description takes as an example a user controlling a browser-type application program through gestures. In the related art, due to the limited screen size of the controlled device, the display interface of a browser application can accommodate only a limited number of operation entries. Generally, it includes only a few function entries such as "back", "menu", "voice search", "share" and "home page". The controlled device receives the user's taps, up-down swipes, left-right swipes and the like, and performs operations such as "select function entry" or "confirm (execute) function entry". Since each action of the user corresponds to only one operation, the user, on the one hand, needs several interactions to reach the finally desired function entry. On the other hand, if the finally desired function entry is not placed on the display interface at all, this causes trouble for the user.
In the embodiment of the application, the function entry that the user finally desires can be reached directly on the operation interface through one continuous operation. For example, the user may in advance add all function entries, or only the commonly used ones, as candidate function entries in the toolbar of the browser-type application.
As shown in fig. 2, the first gesture and the second gesture are, illustratively, touch gestures, and the distance between the starting position of the second gesture and the ending position of the first gesture is within an allowable range. For example, the controlled device may record the position information of the first and second gestures in real time; the position information may be spatial coordinates (for a mid-air gesture) or coordinates on the touch screen. From the distance between the ending position of the first gesture and the starting position of the second gesture, it can be determined whether the second gesture and the first gesture form one coherent operation. How the ending position of the first gesture and the starting position of the second gesture are determined is described in detail below.
For example, the first gesture is a left-to-right swipe and the second gesture is a top-to-bottom swipe. When the ending position of the first gesture and the starting position of the second gesture are not the same coordinates, whether to perform the operation of selecting a target function entry among the candidate function entries may be decided according to the distance between the two coordinates. When that distance is within the allowable range, the selection operation is performed.
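As a minimal sketch of this coherence test (the function name, coordinate representation and 40-pixel threshold below are illustrative assumptions; the application only requires the distance to fall within an allowable range), the check could look like:

```python
import math

# Illustrative threshold; the application leaves the allowable range open.
ALLOWED_DISTANCE_PX = 40.0

def is_coherent(first_end: tuple[float, float],
                second_start: tuple[float, float]) -> bool:
    """Return True if the second gesture starts close enough to where the
    first gesture ended, i.e. the two gestures form one coherent operation."""
    dx = second_start[0] - first_end[0]
    dy = second_start[1] - first_end[1]
    return math.hypot(dx, dy) <= ALLOWED_DISTANCE_PX
```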
The first gesture may be a left swipe or a right swipe (shown on the left in fig. 2 as "left-right swipe"). The controlled device detects the left- or right-sliding track and displays the menu bar of the browser according to the preset correspondence between the first gesture and an instruction. The menu bar of the browser contains the candidate function entries added by the user in advance. In addition, if it is detected that the finger is released after the left or right swipe of the first gesture (shown on the left in fig. 2 as "release"), this indicates that the user's intention is merely to turn the page with the swipe, not to display the menu bar of the browser. In this case, the page-turning action corresponding to the left or right swipe may be performed directly (shown in fig. 2 as "perform other action").
While the menu bar of the browser is displayed, the user performs the second gesture. The second gesture is coherent with the first gesture, so the starting position of the second gesture is the same as the ending position of the first gesture. The second gesture may be an up-down swipe, a left-right swipe, and so on.
When the second gesture is a downward swipe (shown as "down" in fig. 2), the corresponding instruction may enter selection among the candidate function entries and highlight the first-ranked candidate (shown as "add selection" in fig. 2). If the second gesture then finally leaves the touch screen of the controlled device (shown as "release" on the right in fig. 2), the first-ranked function entry is determined as the selected target function entry (shown as "perform action, cancel other events" in fig. 2).
When the second gesture is first a downward swipe and then a leftward or rightward swipe (shown on the right in fig. 2 as "left-right swipe"), the corresponding instruction may switch left and right among the multiple function entries (shown as "switch" in fig. 2). The entry at which the left or right swipe stops is the candidate function entry selected by the user. When a candidate function entry is selected and the second gesture leaves the touch screen of the controlled device, that candidate is determined as the target function entry.
In the case where the second gesture is first sliding down and then sliding up (shown as "up" in fig. 2), the corresponding instruction may be to cancel the display of the menu bar (shown as "deselect" in fig. 2).
The controlled device then executes the function corresponding to the selected target function entry.
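The walkthrough above amounts to a small state machine. The following is a hedged sketch of it (the class and method names, the direction strings and the print placeholders are all assumptions for illustration; the application does not prescribe any implementation):

```python
class MenuGestureController:
    """Toy model of the fig. 2 flow: a left/right swipe (first gesture)
    shows the menu bar; a coherent second gesture selects an entry,
    switches between entries, cancels, or executes on release."""

    def __init__(self, entries: list[str]):
        self.entries = entries        # candidate function entries
        self.menu_visible = False
        self.selected = None          # index of the highlighted entry

    def on_first_gesture_end(self, finger_released: bool) -> None:
        if finger_released:
            print("page turn")        # plain swipe: perform other action
        else:
            self.menu_visible = True  # finger still down: show menu bar

    def on_second_gesture(self, direction: str) -> None:
        if not self.menu_visible:
            return
        if direction == "down":                      # enter selection
            self.selected = 0                        # first-ranked entry
        elif direction in ("left", "right") and self.selected is not None:
            step = -1 if direction == "left" else 1  # switch entries
            self.selected = (self.selected + step) % len(self.entries)
        elif direction == "up":                      # cancel the menu bar
            self.menu_visible = False
            self.selected = None

    def on_release(self) -> None:
        if self.menu_visible and self.selected is not None:
            print("executing:", self.entries[self.selected])
        self.menu_visible = False
        self.selected = None
```

For instance, `MenuGestureController(["back", "refresh", "share"])` models a three-entry menu bar; a real implementation would drive UI updates and dispatch the selected entry's function instead of printing.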
In the above example, the technical solution is described by taking touch gestures as an example. Alternatively, the gesture may be a mid-air operation performed by the user without touching the controlled device; the controlled device captures the user's gesture through a camera, an infrared sensor or another component, and performs the corresponding control according to the instruction corresponding to that gesture. The execution principle is the same as for touch gesture control.
On the other hand, the above example is explained by taking the control of a browser-type application as an example. In practice, it is not limited thereto; the above method may be applied to any type of application. Moreover, the method may also be applied to the operating system of the controlled device. For example, a menu bar of the operating system (such as Android or iOS) is displayed by the instruction corresponding to the first gesture, and a target function entry is selected in that menu bar by the instruction corresponding to the second gesture.
In yet another aspect, the above examples describe the first gesture and the second gesture as taps, left-right swipes and up-down swipes. The gestures are in fact not limited to these; any gesture set by the user may be supported.
Through the above scheme, the target function entry can be reached via a shorter operation path. This solves the problems in the prior art that the entry path of a target function is too deep and that the learning and operation costs are high for new users. For the user, the function corresponding to the target function entry can be found and triggered quickly with two simple coherent gestures, which improves user experience and satisfaction.
As shown in fig. 3, in an embodiment, the determining manner of the instruction corresponding to the second gesture in step S102 includes:
s1021: taking the ending position of the first gesture as the starting position of the second gesture;
s1022: acquiring a track of a second gesture;
s1023: and determining the instruction corresponding to the second gesture according to the track of the second gesture and the corresponding relation between the track and the instruction.
Since the second gesture and the first gesture are coherent, the starting position of the second gesture may be the same as the ending position of the first gesture. Alternatively, once the ending position of the first gesture is determined, the gesture position acquired after a preset time interval may be used as the starting position of the second gesture. The manner of determining the ending position of the first gesture is described in detail later.
The preset instructions may include a determination instruction, a (left-right) movement instruction, a cancellation instruction and the like. The user may establish in advance the correspondence between different gesture tracks and the preset instructions. For example, increasing the touch force of the finger, sliding the finger down, or lifting the finger off the touch screen may correspond to the determination instruction; sliding the finger left or right may correspond to the movement instruction; and sliding the finger up may correspond to the cancellation instruction.
By means of the coherent first and second gestures, the path to the target function entry is shortened, so the target function entry can be selected quickly.
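One simple way to realize the track-to-instruction correspondence above (the track labels and instruction names below are illustrative assumptions mirroring the examples just given) is a lookup table from classified gesture tracks to preset instructions:

```python
from typing import Optional

# Assumed user-configured correspondence between gesture tracks and
# preset instructions; labels are illustrative, not fixed by the application.
TRACK_TO_INSTRUCTION = {
    "press_harder": "determine",   # finger presses with more force
    "swipe_down":   "determine",   # finger slides down
    "lift_finger":  "determine",   # finger leaves the touch screen
    "swipe_left":   "move_left",
    "swipe_right":  "move_right",
    "swipe_up":     "cancel",
}

def instruction_for(track: str) -> Optional[str]:
    """Map a classified gesture track to its preset instruction, if any."""
    return TRACK_TO_INSTRUCTION.get(track)
```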
As shown in fig. 4, in an embodiment, the determining manner of the instruction corresponding to the first gesture in step S101 includes:
s1011: acquiring a starting position of a first gesture;
s1012: during the movement of the first gesture, when a gesture ending condition is met, determining the position at which the ending condition is met as the ending position of the first gesture;
s1013: determining a trajectory between a starting position and an ending position of the first gesture as a trajectory of the first gesture;
s1014: and determining the instruction corresponding to the first gesture according to the track of the first gesture and the corresponding relation between the track and the instruction.
The controlled device can detect the gesture of the user in real time. For example, the detection may be performed by a camera, an infrared detection device, a touch screen, or the like.
When a user gesture is detected, the detected gesture may be treated as the first gesture. First, the starting position of the first gesture can be determined from the start of the gesture's movement. While the first gesture moves, it can be monitored in real time to determine whether it meets the ending condition.
Illustratively, the ending condition may be that the user's finger stops moving, or that the pressing force on the display screen increases, or the like. When the first gesture satisfies the ending condition, the position at which the condition is satisfied may be determined as the ending position of the first gesture.
Based on this, the trajectory between the starting position and the ending position of the first gesture may be determined as the trajectory of the first gesture, and the instruction corresponding to the first gesture is then determined according to the correspondence between the trajectory of the first gesture and the preset instructions.
With this scheme, the ending condition is used to determine the ending position of the first gesture, i.e. the starting position of the second gesture, thereby distinguishing the first gesture from the second gesture and making it convenient to call up the menu bar and select the target function entry.
In one embodiment, the ending condition includes at least one of: the gesture dwell time exceeding a corresponding threshold, the pressing force on the controlled device exceeding a corresponding threshold, and the gesture moving speed being lower than a corresponding threshold.
For a mid-air control gesture over the controlled device, the gesture dwell time exceeding its threshold and/or the gesture moving speed falling below its threshold may serve as the ending condition of the first gesture.
For a touch control gesture on the controlled device, at least one of the gesture dwell time exceeding its threshold, the pressing force on the controlled device exceeding its threshold, and the gesture moving speed falling below its threshold may serve as the ending condition of the first gesture.
Through this scheme, the control gestures for the controlled device can include both mid-air control and touch control, so the method has better adaptability and applies to a wider range of scenarios.
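A minimal sketch of such an ending-condition test follows (the sample fields and all threshold values are assumptions; the application does not fix them, and `prev` is assumed to be the last sample at which the finger meaningfully moved):

```python
from dataclasses import dataclass

@dataclass
class GestureSample:
    x: float
    y: float
    t_ms: float       # timestamp in milliseconds
    pressure: float   # 0.0 for mid-air gestures

# Illustrative thresholds only.
DWELL_MS = 300.0              # minimum dwell time
PRESSURE_MAX = 0.8            # normalized pressing force
SPEED_MIN_PX_PER_MS = 0.05    # movement speed floor

def first_gesture_ended(prev: GestureSample, cur: GestureSample) -> bool:
    """Ending condition: dwell time, pressing force or movement speed
    crosses its threshold; any single condition suffices."""
    dt = cur.t_ms - prev.t_ms
    dist = ((cur.x - prev.x) ** 2 + (cur.y - prev.y) ** 2) ** 0.5
    dwelled = dt >= DWELL_MS and dist < 1.0   # finger held still
    pressed = cur.pressure >= PRESSURE_MAX    # touch gestures only
    slowed = dt > 0 and (dist / dt) <= SPEED_MIN_PX_PER_MS
    return dwelled or pressed or slowed
```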
In one embodiment, the second gesture and the first gesture are touch gestures corresponding to the same touch behavior in which the hand does not leave the screen of the controlled device.
For touch control gestures on the controlled device, besides the criterion that the starting position of the second gesture coincides with the ending position of the first gesture, the fact that the finger does not leave the screen of the controlled device can also serve as a criterion for judging a coherent gesture.
Taking fig. 2 as an example again, when the first gesture is a left or right swipe, if the user's finger leaves the screen of the controlled device after the first gesture ends, this indicates that the user has performed only a single left or right swipe. It follows that the user's intention is not to display the menu bar but only to issue a conventional page-turning instruction by swiping left or right. In this case, the menu bar need not be displayed; only the page-turning instruction corresponding to the left or right swipe is executed.
Conversely, if the user's finger does not leave the screen of the controlled device after the first gesture ends (for example, the gesture dwell time exceeds its threshold, the pressing force on the controlled device exceeds its threshold, or the gesture moving speed falls below its threshold), the menu bar needs to be displayed. Then, as the user continues with the second gesture, the control instruction of the second gesture is executed accordingly.
Through this scheme, whether the finger leaves the screen serves as an auxiliary criterion for judging whether the gestures are coherent. This endows the controlled device with richer functions for touch control gestures.
In one embodiment, the menu bar comprises a menu bar of a browser;
the candidate function entry is a preset function entry, and the candidate function entries include at least one of: back, menu, voice search, share, back to home page, refresh, forward, add multiple windows, delete current window, and collection.
As mentioned above, a preferred embodiment of the present application applies to browser-type applications. The function entries added to the menu bar may be preset, and may include all of the browser's function entries or only those the user uses frequently. For example, adding multiple windows and deleting the current window can serve as frequently used function entries. In this way, commands such as adding or deleting a browser window can be issued conveniently and quickly with a single coherent gesture.
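Purely for illustration (the ordering and the data structure are assumptions, not fixed by the application), the preset candidates could be kept as a simple ordered list read by the menu bar display module:

```python
# Preset candidate function entries for the browser menu bar; the first
# entry is the one highlighted when the second gesture swipes down.
CANDIDATE_ENTRIES = [
    "add multiple windows",
    "delete current window",
    "back",
    "forward",
    "refresh",
    "back to home page",
    "menu",
    "voice search",
    "share",
    "collection",
]
```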
As shown in fig. 5, in one embodiment, the present application also provides an apparatus for device control, which may include:
a menu bar display module 501, configured to display a menu bar according to an instruction corresponding to the first gesture, where the menu bar includes at least one candidate function entry;
a target function entry selection module 502, configured to select a target function entry from the candidate function entries according to an instruction corresponding to the second gesture; the distance between the starting position of the second gesture and the ending position of the first gesture is within an allowable range;
and the function executing module 503 is configured to execute a function corresponding to the target function entry.
In one embodiment, the target function entry selection module 502 may further include:
the starting position determining submodule of the second gesture is used for taking the ending position of the first gesture as the starting position of the second gesture;
the track acquisition submodule of the second gesture is used for acquiring a track of the second gesture;
and the instruction determining submodule corresponding to the second gesture is used for determining the instruction corresponding to the second gesture according to the track of the second gesture and the corresponding relation between the track and the instruction.
In one embodiment, the menu bar display module 501 may further include:
the starting position acquisition submodule of the first gesture is used for acquiring the starting position of the first gesture;
the ending position determining submodule of the first gesture is used for determining, during the movement of the first gesture and when a gesture ending condition is met, the position at which the ending condition is met as the ending position of the first gesture;
the trajectory determination submodule of the first gesture is used for determining a trajectory between the starting position and the ending position of the first gesture as the trajectory of the first gesture;
and the instruction determining submodule corresponding to the first gesture is used for determining an instruction corresponding to the first gesture according to the track of the first gesture and the corresponding relation between the track and the instruction.
In one embodiment, the ending condition includes at least one of: the gesture dwell time exceeding a corresponding threshold, the pressing force on the controlled device exceeding a corresponding threshold, and the gesture moving speed being lower than a corresponding threshold.
In one embodiment, the second gesture and the first gesture are touch gestures corresponding to the same touch behavior in which the finger does not leave the screen of the controlled device;
the first part of the touch gesture corresponding to the same touch behavior is the first gesture, and the latter part is the second gesture.
In one embodiment,
the menu bar comprises a menu bar of the browser;
the candidate function entry is a preset function entry, and the candidate function entries include at least one of: back, menu, voice search, share, back to home page, refresh, forward, add multiple windows, delete current window, and collection.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
Fig. 6 is a block diagram of an electronic device for the method of device control according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown here, their connections and relationships, and their functions are meant as examples only and are not meant to limit the implementations of the present application described and/or claimed herein.
As shown in fig. 6, the electronic device includes: one or more processors 610, a memory 620, and interfaces for connecting the various components, including a high-speed interface and a low-speed interface. The components are interconnected by different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executed within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to an interface). In other embodiments, multiple processors and/or multiple buses may be used together with multiple memories, as desired. Likewise, multiple electronic devices may be connected, each providing part of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). One processor 610 is taken as an example in fig. 6.
The memory 620, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs and modules, such as the program instructions/modules corresponding to the method of device control in the embodiments of the present application (for example, the menu bar display module 501, the target function entry selection module 502 and the function execution module 503 shown in fig. 5). The processor 610 executes the various functional applications and data processing of the server by running the non-transitory software programs, instructions and modules stored in the memory 620, that is, implements the method of device control in the above method embodiments.
The memory 620 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of the electronic device according to a method of device control, and the like. Further, the memory 620 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 620 optionally includes memory located remotely from the processor 610, and these remote memories may be connected over a network to the electronics of the method of device control. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device of the device control method may further include: an input device 630 and an output device 640. The processor 610, the memory 620, the input device 630, and the output device 640 may be connected by a bus or other means, such as the bus connection in fig. 6.
The input device 630 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device of the method of device control, such as a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, or other input devices. The output device 640 may include a display device, an auxiliary lighting device (e.g., an LED), a haptic feedback device (e.g., a vibration motor), and the like. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application specific ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, a host product in the cloud computing service system that overcomes the drawbacks of difficult management and weak service scalability found in traditional physical hosts and Virtual Private Server (VPS) services. The server may also be a server of a distributed system, or a server combined with a blockchain.
It should be understood that the various forms of flows shown above may be used, with steps reordered, added or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present application can be achieved; no limitation is imposed herein.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.
Claims (14)
1. A method of device control, comprising:
displaying a menu bar according to an instruction corresponding to the first gesture, wherein the menu bar comprises at least one candidate function entry;
selecting a target function entry from the candidate function entries according to an instruction corresponding to the second gesture; the distance between the starting position of the second gesture and the ending position of the first gesture is within an allowed range;
and executing the function corresponding to the target function entry.
2. The method of claim 1, wherein the instruction corresponding to the second gesture is determined in a manner comprising:
taking the end position of the first gesture as the starting position of the second gesture;
acquiring a track of the second gesture;
and determining the instruction corresponding to the second gesture according to the track of the second gesture and the corresponding relation between the track and the instruction.
3. The method of claim 1, wherein the instruction corresponding to the first gesture is determined in a manner comprising:
acquiring a starting position of the first gesture;
during the movement of the first gesture, when a gesture ending condition is met, determining the position at which the ending condition is met as the ending position of the first gesture;
determining a trajectory between a starting position and an ending position of the first gesture as a trajectory of the first gesture;
and determining the instruction corresponding to the first gesture according to the track of the first gesture and the corresponding relation between the track and the instruction.
4. The method of claim 3, wherein the ending condition comprises at least one of: the gesture dwell time exceeding a corresponding threshold, the pressing force on the controlled device exceeding a corresponding threshold, and the gesture moving speed being lower than a corresponding threshold.
5. The method according to any one of claims 1 to 4, wherein the second gesture and the first gesture are touch gestures corresponding to the same touch behavior in which the hand does not leave the screen of the controlled device.
6. The method of claim 5, wherein:
the menu bar comprises a menu bar of the browser;
the candidate function entry is a preset function entry, and the candidate function entries include at least one of: back, menu, voice search, share, back to home page, refresh, forward, add multiple windows, delete current window, and collection.
7. An apparatus for device control, comprising:
the menu bar display module is used for displaying a menu bar according to an instruction corresponding to the first gesture, wherein the menu bar comprises at least one candidate function entry;
the target function entry selection module is used for selecting a target function entry from the candidate function entries according to the instruction corresponding to the second gesture; the distance between the starting position of the second gesture and the ending position of the first gesture is within an allowed range;
and the function execution module is used for executing the function corresponding to the target function entry.
8. The apparatus of claim 7, wherein the target function portal selection module comprises:
a starting position determining submodule of a second gesture, which is used for taking the ending position of the first gesture as the starting position of the second gesture;
the track acquisition submodule of the second gesture is used for acquiring the track of the second gesture;
and the instruction determining submodule corresponding to the second gesture is used for determining the instruction corresponding to the second gesture according to the track of the second gesture and the corresponding relation between the track and the instruction.
9. The apparatus of claim 7, wherein the menu bar display module comprises:
the starting position acquisition submodule of the first gesture is used for acquiring the starting position of the first gesture;
the ending position determining submodule of the first gesture is used for determining, during the movement of the first gesture and when a gesture ending condition is met, the position at which the ending condition is met as the ending position of the first gesture;
the trajectory determination submodule of the first gesture is used for determining a trajectory between the starting position and the ending position of the first gesture as the trajectory of the first gesture;
and the instruction determining submodule corresponding to the first gesture is used for determining the instruction corresponding to the first gesture according to the track of the first gesture and the corresponding relation between the track and the instruction.
10. The apparatus of claim 9, wherein the ending condition comprises at least one of: the gesture dwell time exceeding a corresponding threshold, the pressing force on the controlled device exceeding a corresponding threshold, and the gesture moving speed being lower than a corresponding threshold.
11. The apparatus according to any one of claims 7 to 10, wherein the second gesture and the first gesture are touch gestures corresponding to the same touch behavior in which the hand does not leave the screen of the controlled device.
12. The apparatus of claim 11, wherein,
the menu bar comprises a menu bar of the browser;
the candidate function portal is a preset function portal, and the candidate function portal includes: at least one of back, menu, voice search, share, home page back, refresh, forward, add multiple windows, delete current window and collection.
13. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1 to 6.
14. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011202114.6A CN112181582A (en) | 2020-11-02 | 2020-11-02 | Method, apparatus, device and storage medium for device control |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011202114.6A CN112181582A (en) | 2020-11-02 | 2020-11-02 | Method, apparatus, device and storage medium for device control |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112181582A true CN112181582A (en) | 2021-01-05 |
Family
ID=73916935
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011202114.6A Pending CN112181582A (en) | 2020-11-02 | 2020-11-02 | Method, apparatus, device and storage medium for device control |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112181582A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113190107A (en) * | 2021-03-16 | 2021-07-30 | 青岛小鸟看看科技有限公司 | Gesture recognition method and device and electronic equipment |
WO2022007541A1 (en) * | 2020-07-09 | 2022-01-13 | Oppo广东移动通信有限公司 | Device control method and apparatus, storage medium, and electronic device |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102855073A (en) * | 2011-06-30 | 2013-01-02 | 联想(北京)有限公司 | Electronic device and information processing method using same |
US20130179781A1 (en) * | 2012-01-06 | 2013-07-11 | Microsoft Corporation | Edge-based hooking gestures for invoking user interfaces |
CN103777850A (en) * | 2014-01-17 | 2014-05-07 | 广州华多网络科技有限公司 | Menu display method, device and terminal |
CN104102441A (en) * | 2013-04-09 | 2014-10-15 | 腾讯科技(深圳)有限公司 | Menuitem executing method and device |
CN108536273A (en) * | 2017-03-01 | 2018-09-14 | 天津锋时互动科技有限公司深圳分公司 | Man-machine menu mutual method and system based on gesture |
WO2020000276A1 (en) * | 2018-06-27 | 2020-01-02 | 华为技术有限公司 | Method and terminal for controlling shortcut button |
CN111190520A (en) * | 2020-01-02 | 2020-05-22 | 北京字节跳动网络技术有限公司 | Menu item selection method and device, readable medium and electronic equipment |
- 2020-11-02 CN CN202011202114.6A patent/CN112181582A/en active Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102855073A (en) * | 2011-06-30 | 2013-01-02 | 联想(北京)有限公司 | Electronic device and information processing method using same |
US20130179781A1 (en) * | 2012-01-06 | 2013-07-11 | Microsoft Corporation | Edge-based hooking gestures for invoking user interfaces |
CN104102441A (en) * | 2013-04-09 | 2014-10-15 | 腾讯科技(深圳)有限公司 | Menuitem executing method and device |
CN103777850A (en) * | 2014-01-17 | 2014-05-07 | 广州华多网络科技有限公司 | Menu display method, device and terminal |
CN108536273A (en) * | 2017-03-01 | 2018-09-14 | 天津锋时互动科技有限公司深圳分公司 | Man-machine menu mutual method and system based on gesture |
WO2020000276A1 (en) * | 2018-06-27 | 2020-01-02 | 华为技术有限公司 | Method and terminal for controlling shortcut button |
CN111190520A (en) * | 2020-01-02 | 2020-05-22 | 北京字节跳动网络技术有限公司 | Menu item selection method and device, readable medium and electronic equipment |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022007541A1 (en) * | 2020-07-09 | 2022-01-13 | Oppo广东移动通信有限公司 | Device control method and apparatus, storage medium, and electronic device |
CN113190107A (en) * | 2021-03-16 | 2021-07-30 | 青岛小鸟看看科技有限公司 | Gesture recognition method and device and electronic equipment |
CN113190107B (en) * | 2021-03-16 | 2023-04-14 | 青岛小鸟看看科技有限公司 | Gesture recognition method and device and electronic equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105335048B (en) | Electronic equipment with hidden application icon and method for hiding application icon | |
US9632618B2 (en) | Expanding touch zones of graphical user interface widgets displayed on a screen of a device without programming changes | |
US8212785B2 (en) | Object search method and terminal having object search function | |
CN110505245B (en) | Application login method, device and equipment | |
CN113055525A (en) | File sharing method, device, equipment and storage medium | |
CN104360805A (en) | Application icon management method and application icon management device | |
CN104571852A (en) | Icon moving method and device | |
CN110620844B (en) | Program starting method, device, equipment and storage medium | |
US9870122B2 (en) | Graphical user interface for rearranging icons | |
CN105302458A (en) | Message display method and apparatus | |
CN112181582A (en) | Method, apparatus, device and storage medium for device control | |
US11169652B2 (en) | GUI configuration | |
CN104267867A (en) | Content input method and device | |
US9587956B2 (en) | Route stabilization scrolling mode | |
CN103870120A (en) | Information processing method and electronic equipment | |
US10795569B2 (en) | Touchscreen device | |
CN110427138A (en) | Translation information processing method, device, electronic equipment and storage medium | |
CN112527110B (en) | Non-contact interaction method, non-contact interaction device, electronic equipment and medium | |
CN104750401A (en) | Touch method and related device as well as terminal equipment | |
CN104866210A (en) | Touch screen control method and device and electronic equipment | |
US10175774B1 (en) | Keyboard having a spacebar with touchpad | |
CN104407763A (en) | Content input method and system | |
CN114518819A (en) | Icon management method and device and electronic equipment | |
US10235890B1 (en) | System for navigating an aircraft display with a mobile device | |
CN111966432A (en) | Verification code processing method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20210105 |