CN106527954B - Equipment control method and device and mobile terminal - Google Patents


Info

Publication number
CN106527954B
CN106527954B (application CN201611070505.0A)
Authority
CN
China
Prior art keywords
gesture
display
display bar
preset
triggering request
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611070505.0A
Other languages
Chinese (zh)
Other versions
CN106527954A (en)
Inventor
郝思涵
侯恩星
李适
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201611070505.0A
Publication of CN106527954A
Application granted
Publication of CN106527954B
Legal status: Active

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/04883 — using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G06F 3/0482 — Interaction with lists of selectable items, e.g. menus
    • G06F 3/04847 — Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Abstract

The disclosure relates to a device control method and apparatus and a mobile terminal. The device control method includes the following steps: receiving an input gesture, and recognizing at least one of a generation position and a trajectory of the gesture; when the gesture is recognized as a preset gesture, determining the selected control area according to at least one of the generation position and the trajectory of the gesture; and controlling the display bars included in the selected control area according to the operation corresponding to the preset gesture. According to the embodiments of the disclosure, when the gesture input by the user is recognized as the preset gesture, the selected control area is determined, and the display bars included in the selected control area are controlled according to the operation corresponding to the preset gesture, so that the user can operate multiple devices simultaneously with a simple operation.

Description

Equipment control method and device and mobile terminal
Technical Field
The present disclosure relates to the field of mobile terminal technologies, and in particular, to a device control method and apparatus, and a mobile terminal.
Background
With the rapid development of mobile terminal technology, various mobile terminals such as mobile phones have become very popular and increasingly powerful. At present, users can install various Applications (APP) on mobile phones to meet their needs. For example, a user may install a smart home APP on a mobile phone and then add various smart devices to the smart home APP to control the smart devices.
However, such smart devices can only be controlled one by one; if the number of devices to be controlled is large, the user must repeat the same operation many times, which is cumbersome.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides a device control method and apparatus, and a mobile terminal.
According to a first aspect of embodiments of the present disclosure, there is provided an apparatus control method including:
receiving an input gesture, and recognizing at least one of a generation position and a trajectory of the gesture;
when the gesture is recognized to be a preset gesture, determining the selected control area according to at least one of the generation position and the track of the gesture;
controlling a display bar included in the selected control area according to the operation corresponding to the preset gesture;
wherein the control area comprises at least one reversible display bar; the display bar has at least two display surfaces.
In one embodiment, the controlling the display bar included in the selected control area includes:
turning over the display surface of the display bar.
In one embodiment, the controlling the display bar included in the selected control area includes:
moving the display bar.
In one embodiment, the display bar is displayed on the interface in a preset style, and the preset style comprises a grid style.
In one embodiment, the display surface has displayed thereon:
description information of the device and its icon; or
Description information of a device and status information thereof, wherein the status information comprises at least one of status and parameter information.
In an embodiment, the method further comprises:
receiving an icon triggering request, and turning on the corresponding device according to the icon triggering request; or
receiving a parameter information triggering request, and turning off the corresponding device according to the parameter information triggering request; or
receiving a description information triggering request, and displaying a detail page of the corresponding device according to the description information triggering request.
According to a second aspect of the embodiments of the present disclosure, there is provided an apparatus control device including:
a receiving recognition module configured to receive an input gesture and recognize at least one of a generation position and a trajectory of the gesture;
a determination module configured to determine the selected control area according to at least one of a generation position and a trajectory of the gesture when the gesture is recognized as a preset gesture by the reception recognition module;
the control module is configured to control the display bars included in the selected control area determined by the determination module according to the operation corresponding to the preset gesture;
wherein the control area comprises at least one reversible display bar; the display bar has at least two display surfaces.
In an embodiment, the control module is configured to:
turn over the display surface of the display bar.
In an embodiment, the control module is configured to:
move the display bar.
In one embodiment, the display bar is displayed on the interface in a preset style, and the preset style comprises a grid style.
In one embodiment, the display surface has displayed thereon:
description information of the device and its icon; or
Description information of a device and status information thereof, wherein the status information comprises at least one of status and parameter information.
In one embodiment, the apparatus further comprises:
the receiving and starting module is configured to receive an icon triggering request and start corresponding equipment according to the icon triggering request; or
The receiving and closing module is configured to receive a parameter information triggering request and close corresponding equipment according to the parameter information triggering request; or
And the receiving and displaying module is configured to receive the description information triggering request and display the detail page of the corresponding equipment according to the description information triggering request.
According to a third aspect of the embodiments of the present disclosure, there is provided a mobile terminal including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
receiving an input gesture, and recognizing at least one of a generation position and a trajectory of the gesture;
when the gesture is recognized to be a preset gesture, determining the selected control area according to at least one of the generation position and the track of the gesture;
controlling a display bar included in the selected control area according to the operation corresponding to the preset gesture;
wherein the control area comprises at least one reversible display bar; the display bar has at least two display surfaces.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects: when the gesture input by the user is recognized as the preset gesture, the selected control area is determined, and the display bars included in the selected control area are controlled according to the operation corresponding to the preset gesture, so that the user can operate multiple devices simultaneously with a simple operation.
By turning over the display surface of the display bar, switching between the display surfaces can be realized, so that a user can view more information.
When the gesture input by the user is recognized as the preset gesture, the selected control area is determined, and the display bars included in the selected control area are moved according to the operation corresponding to the preset gesture, so that the user can adjust the positions of the display bars as required and clearly see the information of the device of interest.
By displaying the display bar in a preset style, the current interface can display information of more devices.
The scheme is made clearer by describing the content of the display surface.
The corresponding device can be turned off according to the parameter information triggering request, achieving the purpose of turning off the device; the corresponding device can be turned on according to the icon triggering request, achieving the purpose of turning on the device; and the detail page of the corresponding device can be displayed according to the description information triggering request, achieving the purpose of entering the device detail page.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a flow chart illustrating a device control method according to an exemplary embodiment.
FIG. 2A is a diagram illustrating a display bar according to an example embodiment.
FIG. 2B is a schematic diagram illustrating another display bar, according to an example embodiment.
Fig. 3 is a flow chart illustrating another method of device control according to an example embodiment.
Fig. 4 is a flow chart illustrating another method of device control according to an example embodiment.
Fig. 5 is a flow chart illustrating another method of device control according to an example embodiment.
FIG. 6 is a flow chart illustrating another method of device control according to an exemplary embodiment.
Fig. 7 is a block diagram illustrating an appliance control device according to an exemplary embodiment.
Fig. 8 is a block diagram illustrating another device control apparatus according to an example embodiment.
Fig. 9 is a block diagram illustrating an apparatus suitable for use in a mobile terminal according to an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
Fig. 1 is a flowchart illustrating a device control method according to an exemplary embodiment. As shown in fig. 1, the method is applicable to a mobile terminal on which a preset APP is installed, and includes the following steps S101 to S103:
in step S101, an input gesture is received, and at least one of a generation position and a trajectory of the gesture is recognized.
The gesture input by the user may include, but is not limited to, the user's finger sliding left, right, up, or down on the touch screen of a mobile terminal such as a mobile phone.
In this embodiment, the gesture may be recognized by analyzing the generation position and the sliding track of the gesture, or the gesture may be recognized according to the generation position of the gesture.
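As an illustration of step S101, a minimal Python sketch of recognizing a swipe gesture from its trajectory is given below. The function name, the point format, and the distance threshold are assumptions made for illustration and do not appear in the patent:

```python
def classify_gesture(points, min_distance=50):
    """Classify a touch trajectory as a swipe and report its generation position.

    points -- list of (x, y) touch samples in screen coordinates (assumed format)
    Returns (direction, start_point), where direction is one of
    'left', 'right', 'up', 'down', or None if the movement is too short.
    """
    if len(points) < 2:
        return None, points[0] if points else None
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if max(abs(dx), abs(dy)) < min_distance:      # too short to count as a swipe
        return None, (x0, y0)
    if abs(dx) >= abs(dy):                        # horizontal movement dominates
        return ("right" if dx > 0 else "left"), (x0, y0)
    return ("down" if dy > 0 else "up"), (x0, y0)
```

A recognizer like this would be fed the raw touch samples; the returned direction can then be compared with the preset gesture, and the start point gives the generation position.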
In step S102, when the gesture is recognized as a preset gesture, the selected control area is determined according to at least one of the generation position and the trajectory of the gesture.
The control area includes at least one reversible display bar, and the display bars are displayed on the interface in a preset style. The preset style may include, but is not limited to, a grid style, such as a 16-grid style or a 25-grid style. Each display bar has at least two display surfaces. A display surface may display the description information of a device and its icon, or the description information of a device and its status information. The description information may include, but is not limited to, the name of the device. The status information may include at least one of a status and parameter information, where the status may include, but is not limited to, on, off, online, and offline, and the parameter information may be a parameter value of the corresponding device, for example, the brightness value of a color lamp.
In this embodiment, assuming that the preset gesture is a rightward sliding gesture, after the gesture input by the user is recognized as the preset gesture, the control area may be determined according to the generation position and the sliding trajectory of the gesture. Assume that the control area determined in this embodiment is the current interface area in fig. 2A.
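One possible way to determine the selected control area, sketched below under illustrative assumptions, is to select every display bar whose on-screen rectangle intersects the bounding box of the gesture trajectory. The rectangle format (x0, y0, x1, y1) and the function name are not from the patent:

```python
def select_control_area(bar_rects, trajectory):
    """Return the indices of display bars touched by the gesture.

    bar_rects  -- list of (x0, y0, x1, y1) rectangles, one per display bar
    trajectory -- list of (x, y) touch samples of the gesture
    """
    xs = [p[0] for p in trajectory]
    ys = [p[1] for p in trajectory]
    # bounding box of the gesture trajectory
    gx0, gy0, gx1, gy1 = min(xs), min(ys), max(xs), max(ys)
    selected = []
    for i, (x0, y0, x1, y1) in enumerate(bar_rects):
        # standard axis-aligned rectangle intersection test
        if x0 <= gx1 and gx0 <= x1 and y0 <= gy1 and gy0 <= y1:
            selected.append(i)
    return selected
```

With this sketch, a swipe that crosses the whole interface selects every bar, while a short swipe over one row selects only the bars in that row.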
In step S103, a display bar included in the selected control area is controlled according to an operation corresponding to the preset gesture.
In this embodiment, after the control area is determined, the display surfaces of the display bars may be flipped according to the operation corresponding to the preset gesture, that is, the rightward sliding operation. If each display bar in the control area includes a first display surface and a second display surface, each display bar may be flipped from the first display surface to the second display surface, i.e., from fig. 2A to fig. 2B. The display bars may be switched sequentially in the sliding order, for example from left to right and from top to bottom, or switched simultaneously; the switching manner is not specifically limited in this embodiment.
In addition, if the current interface is as shown in fig. 2B, after the user performs the operation of inputting the preset gesture again, the second display surface of the display bar may be flipped to the first display surface, that is, fig. 2B is switched to fig. 2A.
Further, if the display bar includes a plurality of display surfaces, switching between the plurality of display surfaces may be performed according to a preset gesture, such as turning from the first display surface to the second display surface, turning from the second display surface to the third display surface, turning from the third display surface to the first display surface, and so on.
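The cyclic flipping described above can be sketched as follows; the class and field names are illustrative assumptions, and the flip order simply follows the left-to-right, top-to-bottom order in which the bars are stored:

```python
class DisplayBar:
    """A display bar with at least two display surfaces (illustrative model)."""

    def __init__(self, surfaces):
        self.surfaces = surfaces   # e.g. ["icon surface", "status surface", ...]
        self.current = 0           # index of the surface currently shown

    def flip(self):
        # advance cyclically: first -> second -> ... -> last -> first
        self.current = (self.current + 1) % len(self.surfaces)
        return self.surfaces[self.current]


def flip_area(bars):
    """Flip every bar in the control area, sequentially in stored order."""
    return [bar.flip() for bar in bars]
```

Inputting the preset gesture again simply calls `flip` once more, which yields the fig. 2B to fig. 2A transition for two-surface bars and the first-to-second-to-third cycle for bars with more surfaces.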
Alternatively, the description information of the device may indicate the state of the corresponding device through different colors, for example, if the description information is gray, the device is in an off state or an off-line state, and if the description information is color, the device is in an on state or an on-line state. That is, the on-off state and the online-offline state of the device can be known by the color of the device name in fig. 2A and 2B.
Similarly, the status of the device can be indicated by text and its color; for example, a gray "off" label indicates that the television is in the off state, and a colored "on" label indicates that it is in the on state. The parameter information of the device can likewise represent different states of the corresponding index through different colors. For example, when the parameter value of an air purifier is shown in red, the current air is heavily polluted; when it is shown in yellow, the air is lightly polluted; and when it is shown in blue, the air quality is excellent.
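A minimal sketch of such color coding is given below; the numeric thresholds are invented for illustration (the patent does not specify any), and only the red/yellow/blue mapping for the air purifier and the gray/color convention for device names come from the description:

```python
# illustrative thresholds: (upper limit of reading, display color)
AIR_QUALITY_COLORS = [
    (75, "blue"),     # excellent air quality
    (150, "yellow"),  # light pollution
]


def air_quality_color(value):
    """Map an air-purifier reading to the color used for its parameter value."""
    for limit, color in AIR_QUALITY_COLORS:
        if value <= limit:
            return color
    return "red"      # anything above the last threshold: heavy pollution


def name_color(device_on):
    """Gray device name = off/offline; colored name = on/online."""
    return "color" if device_on else "gray"
```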
Note that, the device names and the status information in fig. 2A and 2B may be different colors such as gray, green, and red, but fig. 2A and 2B are grayscale diagrams, and thus cannot display colors.
According to the device control method, when the gesture input by the user is recognized as the preset gesture, the selected control area is determined, and the display bars included in the selected control area are controlled according to the operation corresponding to the preset gesture, so that the user can operate multiple devices simultaneously with a simple operation.
Fig. 3 is a flow chart illustrating another apparatus control method according to an exemplary embodiment, which may further include the steps of, as shown in fig. 3:
in step S301, an input gesture is received, and at least one of a generation position and a trajectory of the gesture is recognized.
In step S302, when the gesture is recognized as a preset gesture, the selected control area is determined according to at least one of a generation position and a trajectory of the gesture.
In step S303, the display bar included in the selected control area is moved according to the operation corresponding to the preset gesture.
In this embodiment, assuming that the user long-presses the last display bar in fig. 2A, drags it to the position of the first display bar, and then lifts the finger from the interface, the last display bar is moved to the position of the first display bar.
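Treating the grid of display bars as an ordered list, the long-press-and-drag move in step S303 amounts to a remove-and-insert reorder, as in this illustrative sketch (function name assumed):

```python
def move_display_bar(bars, src, dst):
    """Move the display bar at index src to index dst, shifting the
    bars in between; mirrors a long-press drag that ends at dst."""
    bar = bars.pop(src)
    bars.insert(dst, bar)
    return bars
```

For the example in the text, dragging the last bar onto the first position is `move_display_bar(bars, len(bars) - 1, 0)`.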
According to the device control method, when the gesture input by the user is recognized as the preset gesture, the selected control area is determined, and the display bar included in the selected control area is moved according to the operation corresponding to the preset gesture, so that the user can adjust the position of the display bar as required and clearly see the information of the device of interest.
Fig. 4 is a flow chart illustrating another apparatus control method according to an exemplary embodiment, which may further include the steps of, as shown in fig. 4:
in step S401, an input gesture is received, and at least one of a generation position and a trajectory of the gesture is recognized.
In step S402, when the gesture is recognized as a preset gesture, the selected control area is determined according to at least one of a generation position and a trajectory of the gesture.
In step S403, according to an operation corresponding to the preset gesture, the first display surface of the display bars in the control area is turned to the second display surface, where the second display surface displays the description information of the device and its status information, and the status information includes at least one of the status of the device and parameter information.
In step S404, a parameter information trigger request is received, and the corresponding device is turned off according to the parameter information trigger request.
In this embodiment, when the user clicks the parameter information in fig. 2B, for example, the current brightness "80%" of the bedroom color lamp, the color lamp may be turned off.
According to the embodiment of the device control method, the corresponding device can be turned off according to the parameter information triggering request, achieving the purpose of turning off the device.
Fig. 5 is a flow chart illustrating another apparatus control method according to an exemplary embodiment, which may further include the steps of, as shown in fig. 5:
in step S501, an input gesture is received, and at least one of a generation position and a trajectory of the gesture is recognized.
In step S502, when the gesture is recognized as the preset gesture, the selected control area is determined according to at least one of a generation position and a trajectory of the gesture.
In step S503, according to an operation corresponding to the preset gesture, the second display surface of the display bars in the control area is turned to the first display surface, where the first display surface displays the description information of the device and its icon.
In step S504, an icon trigger request is received, and a corresponding device is turned on according to the icon trigger request.
In this embodiment, the fan may be turned on when the user clicks a device icon in fig. 2A, for example, the living room fan icon.
According to the embodiment of the device control method, the corresponding device can be turned on according to the icon triggering request, achieving the purpose of turning on the device.
Fig. 6 is a flowchart illustrating another apparatus control method according to an exemplary embodiment, and as shown in fig. 6, after the step S403, the method may further include the steps of:
in step S405, a description information trigger request is received, and a detail page of the corresponding device is displayed according to the description information trigger request.
Further, step S405, which is not shown, may be included after step S503.
In this embodiment, when the user clicks on the device name in fig. 2A or fig. 2B, for example, the user clicks on "living room | fan", a detail page for the fan may be displayed.
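The three trigger requests described in steps S404, S504, and S405 can be sketched as a single dispatch over the element that was tapped. The request/device dictionaries and field names below are illustrative assumptions, not part of the patent:

```python
def handle_trigger(request, devices):
    """Dispatch a trigger request from the display surface.

    request -- {"device_id": ..., "kind": "icon" | "parameter" | "description"}
    devices -- mapping from device id to a mutable device record
    """
    device = devices[request["device_id"]]
    kind = request["kind"]
    if kind == "icon":            # icon triggering request -> turn the device on
        device["on"] = True
        return "turned on"
    if kind == "parameter":       # parameter triggering request -> turn it off
        device["on"] = False
        return "turned off"
    if kind == "description":     # description triggering request -> detail page
        return f"detail page of {device['name']}"
    raise ValueError(f"unknown trigger kind: {kind}")
```

For the examples in the text: tapping the living room fan icon turns the fan on, tapping the "80%" brightness value turns the color lamp off, and tapping the device name opens its detail page.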
According to the embodiment of the device control method, the detail page of the corresponding device can be displayed according to the description information triggering request, achieving the purpose of entering the device detail page.
Corresponding to the embodiment of the equipment control method, the disclosure also provides an embodiment of an equipment control device.
Fig. 7 is a block diagram illustrating an appliance control apparatus according to an exemplary embodiment, the appliance control apparatus including, as shown in fig. 7: a reception recognition module 71, a determination module 72 and a control module 73.
The reception recognition module 71 is configured to receive an input gesture and recognize at least one of a generation position and a trajectory of the gesture.
The gesture input by the user may include, but is not limited to, the user's finger sliding left, right, up, or down on the touch screen of a mobile terminal such as a mobile phone.
In this embodiment, the gesture may be recognized by analyzing the generation position and the sliding track of the gesture, or the gesture may be recognized according to the generation position of the gesture.
The determination module 72 is configured to determine the selected control area according to at least one of a generation position and a trajectory of the gesture when the reception recognition module 71 recognizes that the gesture is a preset gesture.
The control area includes at least one reversible display bar, and the display bars are displayed on the interface in a preset style. The preset style may include, but is not limited to, a grid style, such as a 16-grid style or a 25-grid style. Each display bar has at least two display surfaces. A display surface may display the description information of a device and its icon, or the description information of a device and its status information. The description information may include, but is not limited to, the name of the device. The status information may include at least one of a status and parameter information, where the status may include, but is not limited to, on, off, online, and offline, and the parameter information may be a parameter value of the corresponding device, for example, the brightness value of a color lamp.
In this embodiment, assuming that the preset gesture is a rightward sliding gesture, after the gesture input by the user is recognized as the preset gesture, the control area may be determined according to the generation position and the sliding trajectory of the gesture, and it is assumed that the control area determined in this embodiment is the current interface area in fig. 2A.
The control module 73 is configured to control the display bars included in the selected control area determined by the determination module 72 according to the operation corresponding to the preset gesture.
In this embodiment, after the control area is determined, the display surfaces of the display bars may be flipped according to the operation corresponding to the preset gesture, that is, the rightward sliding operation. If each display bar in the control area includes a first display surface and a second display surface, each display bar may be flipped from the first display surface to the second display surface, i.e., from fig. 2A to fig. 2B. The display bars may be switched sequentially in the sliding order, for example from left to right and from top to bottom, or switched simultaneously; the switching manner is not specifically limited in this embodiment.
In addition, if the current interface is as shown in fig. 2B, after the user performs the operation of inputting the preset gesture again, the second display surface of the display bar may be flipped to the first display surface, that is, fig. 2B is switched to fig. 2A.
Further, if the display bar includes a plurality of display surfaces, switching between the plurality of display surfaces may be performed according to a preset gesture, such as turning from the first display surface to the second display surface, turning from the second display surface to the third display surface, turning from the third display surface to the first display surface, and so on.
Alternatively, the description information of the device may indicate the state of the corresponding device through different colors, for example, if the description information is gray, the device is in an off state or an off-line state, and if the description information is color, the device is in an on state or an on-line state. That is, the on-off state and the online-offline state of the device can be known by the color of the device name in fig. 2A and 2B.
Similarly, the status of the device can be indicated by text and its color; for example, a gray "off" label indicates that the television is in the off state, and a colored "on" label indicates that it is in the on state. The parameter information of the device can likewise represent different states of the corresponding index through different colors. For example, when the parameter value of an air purifier is shown in red, the current air is heavily polluted; when it is shown in yellow, the air is lightly polluted; and when it is shown in blue, the air quality is excellent.
Note that, the device names and the status information in fig. 2A and 2B may be different colors such as gray, green, and red, but fig. 2A and 2B are grayscale diagrams, and thus cannot display colors.
In this embodiment, assuming that the user long-presses the last display bar in fig. 2A, drags it to the position of the first display bar, and then lifts the finger from the interface, the control module 73 may move the last display bar to the position of the first display bar, so that the user can adjust the positions of the display bars as required and clearly see the information of the device of interest.
The apparatus shown in fig. 7 is used to implement the method flow shown in fig. 1; the related content is the same as described above and is not repeated here.
With this device control apparatus, the selected control area is determined when the gesture input by the user is recognized as a preset gesture, and the display bars included in the selected control area are controlled according to the operation corresponding to the preset gesture, so that the user can operate a plurality of devices at the same time with a simple operation.
Fig. 8 is a block diagram of another device control apparatus according to an exemplary embodiment. As shown in fig. 8, on the basis of the embodiment shown in fig. 7, the apparatus may further include at least one of a receive start module 74, a receive shutdown module 75, and a receive display module 76.
The receive start module 74 is configured to receive an icon trigger request and start the corresponding device according to the icon trigger request.
In this embodiment, when the user clicks a device icon in fig. 2A, for example the living room fan icon, the fan may be turned on.
The receive shutdown module 75 is configured to receive a parameter information trigger request and shut down the corresponding device according to the parameter information trigger request.
In this embodiment, when the user clicks parameter information in fig. 2B, for example the current brightness "80%" of the bedroom color light lamp, the color light lamp may be turned off.
The receive display module 76 is configured to receive a description information trigger request and display the detail page of the corresponding device according to the description information trigger request.
In this embodiment, when the user clicks a device name in fig. 2A or fig. 2B, for example "living room | fan", a detail page for the fan may be displayed.
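The three modules above map three trigger-request types to three actions. A hedged sketch of such a dispatcher is shown below; the function signature, request-type strings, and device dictionary keys are assumptions for illustration, not part of the disclosure.

```python
# Sketch of dispatching the three trigger requests handled by the
# receive start module 74, receive shutdown module 75, and receive
# display module 76 described above.
def handle_trigger(request_type, device):
    if request_type == "icon":          # icon trigger -> start the device
        device["on"] = True
        return "device started"
    if request_type == "parameter":     # parameter trigger -> shut it down
        device["on"] = False
        return "device shut down"
    if request_type == "description":   # description trigger -> detail page
        return f"detail page for {device['name']}"
    raise ValueError(f"unknown request type: {request_type}")


fan = {"name": "living room fan", "on": False}
print(handle_trigger("icon", fan))         # device started
print(handle_trigger("description", fan))  # detail page for living room fan
```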
The apparatus shown in fig. 8 is used to implement the method flow shown in fig. 4, fig. 5, or fig. 6; the related content is the same as described above and is not repeated here.
With this embodiment of the device control apparatus, the corresponding device can be started according to an icon trigger request, achieving the purpose of starting the device; the corresponding device can be shut down according to a parameter information trigger request, achieving the purpose of shutting down the device; and the detail page of the corresponding device can be displayed according to a description information trigger request, achieving the purpose of entering the device detail page.
With regard to the apparatus in the above embodiment, the specific manner in which each module and sub-module performs operations has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 9 is a block diagram illustrating a device control apparatus according to an exemplary embodiment. For example, the apparatus 900 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 9, apparatus 900 may include one or more of the following components: processing component 902, memory 904, power component 906, multimedia component 908, audio component 910, input/output (I/O) interface 912, sensor component 914, and communication component 916.
The processing component 902 generally controls the overall operation of the device 900, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 902 may include one or more processors 920 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 902 can include one or more modules that facilitate interaction between the processing component 902 and other components. For example, the processing component 902 can include a multimedia module to facilitate interaction between the multimedia component 908 and the processing component 902.
The memory 904 is configured to store various types of data to support operation at the device 900. Examples of such data include instructions for any application or method operating on device 900, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 904 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 906 provides power to the various components of the device 900. The power components 906 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 900.
The multimedia component 908 comprises a screen providing an output interface between the device 900 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 908 includes a front facing camera and/or a rear facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the device 900 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 910 is configured to output and/or input audio signals. For example, audio component 910 includes a Microphone (MIC) configured to receive external audio signals when apparatus 900 is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 904 or transmitted via the communication component 916. In some embodiments, audio component 910 also includes a speaker for outputting audio signals.
I/O interface 912 provides an interface between processing component 902 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 914 includes one or more sensors for providing status assessments of various aspects of the apparatus 900. For example, the sensor component 914 may detect an open/closed state of the device 900 and the relative positioning of components, such as the display and keypad of the apparatus 900; it may also detect a change in the position of the apparatus 900 or of a component of the apparatus 900, the presence or absence of user contact with the apparatus 900, the orientation or acceleration/deceleration of the apparatus 900, and a change in the temperature of the apparatus 900. The sensor component 914 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor component 914 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 914 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 916 is configured to facilitate wired or wireless communication between the apparatus 900 and other devices. The apparatus 900 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 916 receives a broadcast signal or broadcast-associated information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 916 further includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 900 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 904 comprising instructions, executable by the processor 920 of the apparatus 900 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (11)

1. An apparatus control method, characterized in that the method comprises:
receiving an input gesture, and recognizing at least one of a generation position and a trajectory of the gesture;
when the gesture is recognized to be a preset gesture, determining the selected control area according to at least one of the generation position and the track of the gesture;
controlling a display bar included in the selected control area according to the operation corresponding to the preset gesture; wherein the control area comprises at least one reversible display bar; the display bar is provided with at least two display surfaces, and the at least two display surfaces of the same display bar represent different display information interfaces of the same device;
the control the display column that selected control area includes:
and turning over the display surface of the display bar.
2. The device control method according to claim 1, wherein the controlling of the display bar included in the selected control area includes:
and moving the display bar.
3. The device control method according to claim 1, wherein the display bar is displayed on an interface in a preset style, and the preset style comprises a grid style.
4. The device control method according to any one of claims 1 to 3, wherein the display bar displays:
description information of a device and an icon thereof; or
description information of a device and status information thereof, wherein the status information comprises at least one of status and parameter information.
5. The apparatus control method according to claim 4, characterized in that the method further comprises:
receiving an icon triggering request, and starting the corresponding device according to the icon triggering request; or
receiving a parameter information triggering request, and closing the corresponding device according to the parameter information triggering request; or
receiving a description information triggering request, and displaying a detail page of the corresponding device according to the description information triggering request.
6. A device control apparatus, characterized in that the apparatus comprises:
a receiving recognition module configured to receive an input gesture and recognize at least one of a generation position and a trajectory of the gesture;
a determination module configured to determine the selected control area according to at least one of the generation position and the trajectory of the gesture when the gesture is recognized as a preset gesture by the receiving recognition module;
the control module is configured to control the display columns included in the selected control areas determined by the determination module according to the operation corresponding to the preset gesture; wherein the control area comprises at least one reversible display bar; the display bar is provided with at least two display surfaces, and the at least two display surfaces of the same display bar represent different display information interfaces of the same device;
the control module configured to: and turning over the display surface of the display bar.
7. The device control apparatus of claim 6, wherein the control module is configured to:
and moving the display bar.
8. The device control apparatus according to claim 6, wherein the display bar is displayed on the interface in a preset style, the preset style comprising a grid style.
9. The device control apparatus according to any one of claims 6 to 8, wherein the display bar displays:
description information of a device and an icon thereof; or
description information of a device and status information thereof, wherein the status information comprises at least one of status and parameter information.
10. The device control apparatus according to claim 9, characterized in that the apparatus further comprises:
the receiving and starting module is configured to receive an icon triggering request and start corresponding equipment according to the icon triggering request; or
The receiving and closing module is configured to receive a parameter information triggering request and close corresponding equipment according to the parameter information triggering request; or
And the receiving and displaying module is configured to receive the description information triggering request and display the detail page of the corresponding equipment according to the description information triggering request.
11. A mobile terminal, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the method of any of claims 1 to 5.
CN201611070505.0A 2016-11-28 2016-11-28 Equipment control method and device and mobile terminal Active CN106527954B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611070505.0A CN106527954B (en) 2016-11-28 2016-11-28 Equipment control method and device and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611070505.0A CN106527954B (en) 2016-11-28 2016-11-28 Equipment control method and device and mobile terminal

Publications (2)

Publication Number Publication Date
CN106527954A CN106527954A (en) 2017-03-22
CN106527954B true CN106527954B (en) 2020-07-03

Family

ID=58355069

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611070505.0A Active CN106527954B (en) 2016-11-28 2016-11-28 Equipment control method and device and mobile terminal

Country Status (1)

Country Link
CN (1) CN106527954B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113126870A (en) * 2019-12-30 2021-07-16 佛山市云米电器科技有限公司 Parameter setting method, intelligent refrigerator and computer readable storage medium

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9619143B2 (en) * 2008-01-06 2017-04-11 Apple Inc. Device, method, and graphical user interface for viewing application launch icons
US8266550B1 (en) * 2008-05-28 2012-09-11 Google Inc. Parallax panning of mobile device desktop
JP5761934B2 (en) * 2010-06-30 2015-08-12 キヤノン株式会社 Information processing apparatus, information processing apparatus control method, and program
CN102135853A (en) * 2011-03-09 2011-07-27 苏州佳世达电通有限公司 Information processing method for touch control display device
JP5889005B2 (en) * 2012-01-30 2016-03-22 キヤノン株式会社 Display control apparatus and control method thereof
TWI528235B (en) * 2012-02-08 2016-04-01 緯創資通股份有限公司 Touch display device and touch method
CN103455128A (en) * 2012-06-04 2013-12-18 联想(北京)有限公司 Display method and electronic device
CN103677509B (en) * 2012-09-24 2017-08-29 联想(北京)有限公司 A kind of display methods and electronic equipment
CN104247445B (en) * 2013-02-20 2018-10-12 松下电器(美国)知识产权公司 The control method of portable information terminal
CN105706395B (en) * 2014-01-06 2020-01-14 三星电子株式会社 Control apparatus and control method thereof
CN105786944B (en) * 2015-12-08 2020-03-17 小米科技有限责任公司 Method and device for processing automatic page turning of browser
CN105388779A (en) * 2015-12-25 2016-03-09 小米科技有限责任公司 Control method and device for intelligent equipment

Also Published As

Publication number Publication date
CN106527954A (en) 2017-03-22

Similar Documents

Publication Publication Date Title
CN105955607B (en) Content sharing method and device
CN106572299B (en) Camera opening method and device
US20170322709A1 (en) Split-screen display method, apparatus and medium
JP6488375B2 (en) Device control method and apparatus
US20170344192A1 (en) Method and device for playing live videos
WO2016134591A1 (en) Manipulation method and apparatus of intelligent device
RU2630189C1 (en) Method of controlling button functions in one-hand operation mode, device and electronic device
CN108710306B (en) Control method and device of intelligent equipment and computer readable storage medium
CN114201133A (en) Split screen display method and device
CN107102772B (en) Touch control method and device
CN111381746B (en) Parameter adjusting method, device and storage medium
CN105487805B (en) Object operation method and device
CN104317402B (en) Description information display method and device and electronic equipment
CN104850432A (en) Method and device for adjusting color
CN104484111A (en) Content display method and device for touch screen
WO2017008400A1 (en) Method and device for controlling intelligent device
WO2018000710A1 (en) Method and device for displaying wifi signal icon and mobile terminal
CN106775377B (en) Gesture recognition device, equipment and control method of gesture recognition device
CN111611034A (en) Screen display adjusting method and device and storage medium
CN107396166A (en) The method and device of live middle display video
CN107132983B (en) Split-screen window operation method and device
CN105094626A (en) Method and device for selecting text contents
CN107566878B (en) Method and device for displaying pictures in live broadcast
CN107272427B (en) Control method and device of intelligent equipment
CN106775210B (en) Wallpaper changing method and device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant