CN115225757A - Object control method, intelligent terminal and storage medium


Info

Publication number: CN115225757A (application CN202210540638.9A)
Authority: CN (China)
Prior art keywords: control, adjusting, parameter, adjustment, window
Legal status: Pending
Application number: CN202210540638.9A
Other languages: Chinese (zh)
Inventors: 王尊伟 (Wang Zunwei), 党进 (Dang Jin)
Current Assignee: Shenzhen Transsion Holdings Co Ltd
Original Assignee: Shenzhen Transsion Holdings Co Ltd
Application filed by Shenzhen Transsion Holdings Co Ltd
Priority to CN202210540638.9A

Classifications

    • H04M 1/72448: User interfaces specially adapted for cordless or mobile telephones, with means for adapting the functionality of the device according to specific conditions
    • G06F 3/04817: Interaction techniques based on graphical user interfaces (GUI) using icons
    • G06F 3/0488: GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/1407: Digital output to display device; general aspects irrespective of display type
    • G06F 9/30076: Arrangements for executing specific machine instructions to perform miscellaneous control operations, e.g. NOP
    • H04M 1/72463: User interfaces with means for adapting the functionality of the device according to specific conditions, to restrict the functionality of the device

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)

Abstract

The application provides an object control method, an intelligent terminal and a storage medium. The object control method includes the following steps: outputting a control window at a control occasion; and controlling at least one function and/or at least one parameter of a first object in response to a control instruction for the control window. The method can improve the control effect in multiple respects and improve the user's experience during object control.

Description

Object control method, intelligent terminal and storage medium
Technical Field
The application relates to the technical field of intelligent terminals, in particular to an object control method, an intelligent terminal and a storage medium.
Background
Intelligent terminals such as mobile phones, and the various applications they run, generally provide object control functions, for example controlling parameters such as the brightness and volume of the terminal and/or its applications, or controlling the terminal's Bluetooth switch. By controlling objects such as the intelligent terminal and/or the various applications it runs, the operating characteristics of those objects can be matched to user requirements when they respond to the user.
In the course of conceiving and implementing the present application, the inventors found at least the following problem: when controlling objects such as an intelligent terminal and/or an application, the corresponding functions and/or parameters can often be controlled only by first entering a dedicated control interface, which makes the control process inconvenient.
The foregoing description is provided for general background information and does not necessarily constitute prior art.
Disclosure of Invention
In view of the foregoing technical problems, the present application provides an object control method, an intelligent terminal, and a storage medium, which can control objects such as an intelligent terminal and/or an application more conveniently.
In order to solve the above technical problem, the present application provides an object control method, which can be applied to an intelligent terminal such as a mobile phone, and includes the following steps:
outputting a control window at a control occasion;
at least one function and/or at least one parameter of the first object is controlled in response to a control instruction for the control window.
Optionally, the control occasion comprises at least one of: when the second object is started; when the first object provides a specified function; and when the first object and/or the second object receives a control wake-up operation.
Optionally, the control wake-up operation comprises at least one of: a first trigger operation received by a designated icon; a second trigger operation received by a first area; and a voice wake-up operation.
Optionally, the control window comprises at least one of: at least one preset control; a second area for receiving control instructions; and a voice acquisition window.
Optionally, the control instructions include parameter adjustment instructions and/or function switch instructions.
Optionally, controlling at least one function and/or at least one parameter of the first object comprises at least one of:
switching a response mode of at least one function associated with the first object;
adjusting a degree of response of at least one parameter of the first object.
Optionally, the object control method further includes: controlling at least one function and/or at least one parameter of the first object according to the operating characteristics of the first object.
The application also provides an object control method, which can be applied to intelligent terminals such as mobile phones and the like, and comprises the following steps:
s22, outputting an adjusting window at the adjusting time;
and S23, responding to the adjusting instruction aiming at the adjusting window, and adjusting at least one parameter of the first object.
Optionally, the adjustment window comprises at least one of: at least one preset control; at least one adjustment zone for receiving adjustment instructions; and a voice acquisition window.
Optionally, the adjustment instructions include at least one of: a brightness adjustment instruction, a volume adjustment instruction, a resolution adjustment instruction, a definition adjustment instruction, and a size adjustment instruction of at least one display object.
Optionally, the preset controls comprise a first type of adjustment control characterizing a continuous variation of the parameter and/or a second type of adjustment control comprising at least one parameter step.
Optionally, the preset control comprises at least one of: a brightness adjustment control, a volume adjustment control, a resolution adjustment control, a definition adjustment control, and a size adjustment control for at least one display object.
Optionally, adjusting at least one parameter of the first object comprises at least one of:
adjusting the brightness of the first object;
adjusting a volume of the first object;
adjusting a resolution of the first object;
adjusting the sharpness of the first object;
adjusting the size of at least one display object.
Optionally, the object control method further includes the step of: adjusting at least one parameter of the first object according to an operating characteristic of the first object.
Optionally, the operating characteristics include environmental parameters.
Optionally, adjusting at least one parameter of the first object according to its operating characteristics includes: acquiring an environmental parameter of the first object; outputting adjustment confirmation information for at least one parameter corresponding to the environmental parameter; and adjusting the at least one parameter according to the adjustment confirmation information.
Optionally, after acquiring the environmental parameter of the first object, the object control method further includes: when the environmental parameter meets the adjusting condition, outputting mode selection information comprising a first adjusting mode and/or a second adjusting mode; the adjustment window is output in response to a mode selection command for a first adjustment mode, and/or adjustment confirmation information is output in response to a mode selection command for a second adjustment mode.
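As a minimal Kotlin sketch of this optional flow, assuming hypothetical names throughout (EnvironmentParams, AdjustmentMode, and the 50-lux threshold are all illustrative, not defined by the patent):

```kotlin
// Hypothetical sketch of the environment-driven adjustment flow described above.
data class EnvironmentParams(val ambientLux: Float)

enum class AdjustmentMode { ADJUSTMENT_WINDOW, CONFIRMATION }

class EnvironmentAdjuster(
    private val showAdjustmentWindow: () -> Unit,     // first adjustment mode
    private val confirmWithUser: (String) -> Boolean, // second adjustment mode
    private val applyBrightness: (Int) -> Unit
) {
    fun onEnvironmentSampled(env: EnvironmentParams, chosenMode: AdjustmentMode) {
        // React only when the environment meets the adjusting condition.
        if (env.ambientLux >= 50f) return
        when (chosenMode) {
            AdjustmentMode.ADJUSTMENT_WINDOW -> showAdjustmentWindow()
            AdjustmentMode.CONFIRMATION -> {
                val proposed = 30 // proposed brightness for a dark environment
                if (confirmWithUser("Dim screen brightness to $proposed%?")) {
                    applyBrightness(proposed)
                }
            }
        }
    }
}
```

Here chosenMode stands in for the user's mode-selection command; a real implementation would obtain it from the mode-selection information output to the user.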
The application also provides an intelligent terminal, including a memory and a processor, wherein the memory stores an object control program, and the object control program, when executed by the processor, implements the steps of any of the object control methods above.
Optionally, the intelligent terminal further comprises an environment sensing device.
Optionally, the environment sensing device is used for acquiring an environmental parameter of the first object.
Optionally, the environment sensing device comprises a light sensor and/or a volume detector.
The present application also provides a computer-readable storage medium, which stores a computer program that, when executed by a processor, implements the steps of the object control method as described in any one of the above.
As described herein, in the object control method of the present application, the first object can output a control window at a control occasion, so that the user can directly input a control instruction for the first object in the control window, improving the convenience of inputting control instructions. The first object can control at least one function and/or at least one parameter in response to a control instruction for the control window, which can improve response efficiency, simplify the control process, and improve the user experience associated with the first object.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and, together with the description, serve to explain the principles of the application. In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below; it is obvious that those skilled in the art can also obtain other drawings from these drawings without inventive effort.
Fig. 1 is a schematic diagram of a hardware structure of an intelligent terminal implementing various embodiments of the present application;
Fig. 2 is a communication network system architecture diagram according to an embodiment of the present application;
Fig. 3 is a flow diagram illustrating an object control method according to one embodiment;
Figs. 4a, 4b, 4c, and 4d are schematic diagrams of designated icons according to an embodiment of the present application;
Figs. 5a and 5b are schematic views of a first area according to an embodiment of the present application;
Figs. 6a, 6b, and 6c are schematic diagrams of a control window according to an embodiment of the present application;
Fig. 7 is a flow diagram illustrating an object control method according to one embodiment;
Figs. 8a, 8b, 8c, and 8d are schematic views of an adjustment window according to an embodiment of the present application;
Figs. 9a, 9b, 9c, and 9d are schematic diagrams of display objects according to an embodiment of the present application;
Figs. 10a and 10b are schematic diagrams illustrating adjustment confirmation information according to an embodiment of the present application.
The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings. Specific embodiments of the present application have been shown by way of example in the drawings and will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that includes the element. Furthermore, similarly named components, features, or elements in different embodiments of the application may have the same meaning or may have different meanings; the specific meaning should be determined from the particular embodiment, or further in combination with the context of that embodiment.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope herein. The word "if" as used herein may be interpreted as "when", "upon", or "in response to a determination", depending on the context. Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, species, and/or groups thereof. The terms "or", "and/or", and "including at least one of the following", as used herein, are to be construed as inclusive, meaning any one or any combination. For example, "includes at least one of: A, B, C" means "any of the following: A; B; C; A and B; A and C; B and C; A and B and C", and likewise "A, B or C" or "A, B and/or C" means the same. An exception to this definition occurs only when a combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
It should be understood that, although the steps in the flowcharts in the embodiments of the present application are shown in the sequence indicated by the arrows, these steps are not necessarily performed strictly in that sequence. Unless explicitly stated herein, there is no strict ordering constraint on their execution, and they may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages that are not necessarily performed at the same time; they may be performed at different times and in different orders, and may be performed in alternation with other steps or with at least some of the sub-steps or stages of other steps.
The words "if", as used herein may be interpreted as "at \8230; \8230whenor" when 8230; \8230when or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It should be noted that step numbers such as S12 and S13 are used herein for the purpose of more clearly and briefly describing the corresponding contents, and do not constitute a substantial limitation on the sequence, and those skilled in the art may perform S13 first and then S12 in the specific implementation, but these should be within the protection scope of the present application.
It should be understood that the specific embodiments described herein are merely illustrative of and not restrictive on the broad application.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for convenience of description and have no specific meaning in themselves. Thus, "module", "component", and "unit" may be used interchangeably.
The smart terminal may be implemented in various forms. For example, the smart terminal described in the present application may include mobile terminals such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and fixed terminals such as a Digital TV, a desktop computer, and the like.
While the following description will be given by way of example of a smart terminal, those skilled in the art will appreciate that the configuration according to the embodiments of the present application can be applied to a fixed type terminal in addition to elements particularly used for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of an intelligent terminal for implementing various embodiments of the present application, the intelligent terminal 100 may include: an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will appreciate that the intelligent terminal architecture shown in fig. 1 does not constitute a limitation of the intelligent terminal, and that the intelligent terminal may include more or fewer components than shown, some combined components, or a different arrangement of components.
The following specifically introduces each component of the intelligent terminal with reference to fig. 1:
the radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call, and specifically, receive downlink information of a base station and then process the downlink information to the processor 110; in addition, uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000 ), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division duplex-Long Term Evolution), TDD-LTE (Time Division duplex-Long Term Evolution, time Division Long Term Evolution), 5G, and so on.
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the intelligent terminal can help the user receive and send e-mails, browse web pages, access streaming media, and the like, providing wireless broadband Internet access. Although fig. 1 shows the WiFi module 102, it is understood that it is not an essential part of the intelligent terminal and may be omitted as needed without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the smart terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the smart terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is for receiving audio or video signals. The A/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. In the case of a phone call mode, the processed audio (voice) data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The smart terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Optionally, the light sensor includes an ambient light sensor and a proximity sensor, the ambient light sensor may adjust the brightness of the display panel 1061 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 1061 and/or the backlight when the smart terminal 100 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing gestures of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometers and taps), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
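As a concrete illustration of the ambient light sensor behavior mentioned above, the following Kotlin sketch registers a listener through Android's standard SensorManager API; the lux-to-brightness mapping is an assumption for illustration only, not something the patent specifies.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Minimal sketch: listen to the ambient light sensor and derive a brightness level.
class AmbientBrightnessHelper(
    context: Context,
    private val onBrightness: (Int) -> Unit
) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val lightSensor: Sensor? = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT)

    fun start() {
        lightSensor?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        val lux = event.values[0]
        // Illustrative mapping from lux to a 0..255 backlight value.
        onBrightness((lux / 10_000f * 255).toInt().coerceIn(10, 255))
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```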
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the intelligent terminal. Optionally, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations performed by the user on or near it (e.g., operations performed on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory) and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts: a touch detection device and a touch controller. Optionally, the touch detection device detects the user's touch position and the signal caused by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends them to the processor 110, and can receive and execute commands sent by the processor 110. The touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072, which may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, without limitation.
Alternatively, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 1, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the intelligent terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the intelligent terminal, which is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the smart terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the smart terminal 100 or may be used to transmit data between the smart terminal 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area. Optionally, the program storage area may store an operating system, application programs required by at least one function (such as a sound playing function and an image playing function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book), and the like. Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is a control center of the intelligent terminal, connects various parts of the entire intelligent terminal using various interfaces and lines, and performs various functions of the intelligent terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the intelligent terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, optionally the application processor primarily handles operating systems, user interfaces, application programs, etc., and the modem processor primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The intelligent terminal 100 may further include a power supply 111 (such as a battery) for supplying power to each component, and preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
Although not shown in fig. 1, the smart terminal 100 may further include a bluetooth module or the like, which is not described herein.
In order to facilitate understanding of the embodiments of the present application, a communication network system on which the intelligent terminal of the present application is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication network system provided in an embodiment of the present application. The communication network system is an LTE system of universal mobile telecommunications technology, and includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an operator's IP services 204, which are communicatively connected in sequence.
Optionally, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN202 includes eNodeB2021 and other eNodeBs 2022, among others. Alternatively, the eNodeB2021 may be connected with other enodebs 2022 through a backhaul (e.g., X2 interface), the eNodeB2021 is connected to the EPC203, and the eNodeB2021 may provide the UE201 access to the EPC 203.
The EPC203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and the like. Optionally, the MME2031 is a control node that handles signaling between the UE201 and the EPC203, providing bearer and connection management. The HSS2032 provides registers for managing functions such as the home location register (not shown) and holds user-specific information about service characteristics, data rates, and the like. All user data may be sent through the SGW2034; the PGW2035 may provide IP address assignment for the UE201 and other functions; and the PCRF2036 is the policy and charging control decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
The IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem), or other IP services, among others.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present application is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems (e.g. 5G), and the like.
Based on the above intelligent terminal hardware structure and communication network system, various embodiments of the present application are provided.
First embodiment
In a first embodiment, there is provided an object control method including the steps of:
outputting a control window at a control occasion;
at least one function and/or at least one parameter of the first object is controlled in response to a control instruction for the control window.
Optionally, the first object may include an intelligent terminal such as a mobile phone, a tablet computer, a smart band, and/or a personal computer. Optionally, the execution body of the object control method of this embodiment may be the first object (i.e., the intelligent terminal), so that the intelligent terminal may output a control window corresponding to itself, receive a control instruction, and control at least one of its own functions and/or parameters.
Optionally, the first object may instead include at least one application run by the intelligent terminal, such as a video playing program, an instant messaging program, and/or a traditional communication program such as the phone app. In one example, the execution body of the object control method may include the intelligent terminal corresponding to the first object, so that the terminal may output a control window corresponding to the first object, receive a control instruction, and control at least one function and/or at least one parameter of the first object running on it. In another example, the execution body may include the first object itself, so that the application serving as the first object may output its corresponding control window, receive a control instruction, and control at least one of its own functions and/or parameters.
Referring to fig. 3, the object control method may include steps S12 to S13, which are described in detail as follows:
Step S12: outputting a control window at a control occasion;
optionally, the control timing includes a timing at which the user needs to control the first object, optionally, an adjustment timing of the at least one parameter and/or a switch timing of the at least one function, and the like. The control window comprises a window for controlling the first object, optionally an adjustment window for adjusting at least one parameter of the first object and/or a switch window for opening at least one function of the first object, etc. The first object can detect the control opportunity to automatically output the control window when the control opportunity is detected, so that a user can control the first object, and the convenience of the control process is improved. The first object can also determine a control opportunity according to the received control wake-up operation, so that the control window is output after the user inputs the control wake-up operation aiming at the first object, and the matching degree of the output process of the control window and the user requirement is improved.
Optionally, the control timing comprises at least one of: when the second object is started; when the first object provides a specified function; the first object and/or the second object receive a control wake-up operation.
Optionally, the control occasion includes when the second object is started. The second object may or may not be identical to the first object. In one example, the first object and the second object may both be the intelligent terminal; in that case, if the execution body of the object control method includes the intelligent terminal, the intelligent terminal may output a control window for controlling itself when it is started. In another example, the first object and the second object may both be an application; in that case, if the execution body includes the intelligent terminal, the intelligent terminal may output a control window for controlling the application when the application is started, and if the execution body includes the application, the application may output its corresponding control window when it is started.
In another example, the first object may be at least one application run by the intelligent terminal, and the second object may be the intelligent terminal on which the first object is located; if the execution body of the object control method includes the intelligent terminal, the intelligent terminal can output a control window for controlling the first object when the terminal itself is started; if the execution body includes the application, the application can output a control window for controlling the first object when the intelligent terminal is started. In yet another example, the first object may be the intelligent terminal, and the second object may be at least one application run by the intelligent terminal; in that case, if the execution body includes the intelligent terminal, the intelligent terminal may output a control window for controlling itself when the second object is started. Determining the start of the second object as a control occasion means a control window is output when the second object starts, so the user can control the functions and/or parameters of the first object through the control window; the first object can then respond to the user's various needs with the functions and/or parameters the user has set, which can improve the response effect.
Optionally, the control occasion includes when the first object provides a specified function. The specified function may be determined according to the functional characteristics of the first object and may include functions that require the relevant control parameters to be readjusted in response to a user request. Optionally, when the first object includes an intelligent terminal such as a mobile phone, the specified function may include a video call function, a video playback function, a data download function, and/or a picture editing function; when the first object includes an instant messaging program, the specified function may include a voice call function and/or a video call function; when the first object includes a picture processing program, the specified function may include a picture editing function. Outputting the control window when the first object provides the specified function lets the user reset the functions and/or parameters used by the first object at that moment, improving how well the control occasion fits the user's needs; on top of making the control scheme more convenient, it also avoids outputting the control window at times when the first object is not providing the specified function, improving the effectiveness of the output control window.
Optionally, the control occasion includes when the first object and/or the second object receives a control wake-up operation. Outputting the control window upon receiving the control wake-up operation greatly improves how well the control occasion matches the user's needs, improving the user experience provided by the corresponding control scheme.
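For illustration, the three kinds of control occasion listed above could be modeled as follows; the type names and the set of specified functions are assumptions, not definitions from the patent:

```kotlin
// Illustrative only: one way to model the three control occasions described above.
sealed interface ControlOccasion {
    data class ObjectStarted(val objectId: String) : ControlOccasion
    data class SpecifiedFunction(val function: String) : ControlOccasion
    object WakeUpReceived : ControlOccasion
}

class ControlWindowTrigger(private val outputControlWindow: () -> Unit) {
    // Functions for which the control window should appear (an assumption).
    private val specified = setOf("videoCall", "videoPlayback", "download")

    fun onOccasion(occasion: ControlOccasion) {
        when (occasion) {
            is ControlOccasion.ObjectStarted -> outputControlWindow()
            is ControlOccasion.SpecifiedFunction ->
                if (occasion.function in specified) outputControlWindow()
            ControlOccasion.WakeUpReceived -> outputControlWindow()
        }
    }
}
```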
Optionally, the control wake-up operation comprises at least one of: a first trigger operation received by a designated icon; a second trigger operation received by a first area; and a voice wake-up operation.
Optionally, the designated icon may include an icon corresponding to the first object and/or the second object; the designated icon is determined according to the type of the first object and/or the relative relationship between the first object and the second object. In one example, the designated icon may include at least one icon on the interface corresponding to the first object; for example, referring to the mobile phone desktop shown in fig. 4a, the designated icon is a newly added icon on the desktop, and referring to the chat interface of the social program shown in fig. 4b, the designated icon is an icon displayed and/or floating on the chat interface. In another example, illustrated with reference to figs. 4c and 4d, the first object includes a first application, a second application, and a third application, and the second object includes a mobile phone running the first object; optionally, as shown in fig. 4c, the designated icon may include at least one icon corresponding to the first object, such as the icon of the first, second, and/or third application; optionally, as shown in fig. 4d, the designated icon may also include a newly added icon on the desktop. Optionally, the first trigger operation may include a single click, a double click, and/or a long press. Optionally, the first trigger operation differs from any operation already assigned to the designated icon: taking the desktop shown in fig. 4c as an example, if the designated icon is the icon of the first application and clicking that icon launches the application, the first trigger operation may be a different operation, such as a long press, so that the various operations input on the designated icon do not conflict.
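As a minimal sketch of this, a long press can be bound to the designated icon with Android's standard View API, leaving a single click free for the icon's existing launch behavior; the helper below is an illustrative assumption, not the patent's implementation.

```kotlin
import android.view.View

// Sketch: bind a long press (first trigger operation) to a designated icon,
// leaving single-click free for the icon's existing "launch app" semantics.
fun bindDesignatedIcon(icon: View, showControlWindow: () -> Unit) {
    icon.setOnLongClickListener {
        showControlWindow()
        true // consume the event so it is not also treated as a click
    }
}
```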
Optionally, the first area may include at least a partial area of the display interface corresponding to the first object and/or the second object, such as the lower right corner area, the left side area, and/or the right side area of the display interface. For example, referring to fig. 5a, the first object includes a first application, a second application, and a third application, and the second object includes a mobile phone running the first object; when the first object is not started, its icon is displayed on the desktop, and the first area may then be the lower right corner area, the left side area, and/or the right side area of the desktop. If the first application is a video playing program, then after entering the video playing interface shown in fig. 5b, the first area may be the left side area and/or the right side area of that interface. Optionally, when the first area is not receiving a related operation, it is not displayed, to avoid interfering with other operations on the corresponding display interface; when the second trigger operation is received, the first area may be displayed by increasing its brightness, adjusting its display color, and/or showing its outline, to prompt the user that these areas are the first area and that the second trigger operation input by the user will be responded to. The second trigger operation may include a single click, a double click, and/or a long press.
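A hypothetical sketch of the first area, assuming it is a dedicated edge view that stays invisible until touched; the alpha values and gesture handling are illustrative choices, not requirements of the patent:

```kotlin
import android.view.MotionEvent
import android.view.View

// Sketch: a normally hidden edge strip acts as the "first area". On touch it is
// revealed (here via alpha) and the second trigger operation is reported.
fun bindFirstArea(edgeStrip: View, onSecondTrigger: () -> Unit) {
    edgeStrip.alpha = 0f // invisible until touched, so it does not disturb the UI
    edgeStrip.setOnTouchListener { v, event ->
        when (event.action) {
            MotionEvent.ACTION_DOWN -> { v.alpha = 0.6f; true } // reveal the area
            MotionEvent.ACTION_UP -> { v.alpha = 0f; onSecondTrigger(); true }
            else -> false
        }
    }
}
```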
Optionally, the voice wake-up operation includes inputting a preset voice to the first object and/or the second object. The preset voice may include language characterizing a control instruction associated with the first object, such as controlling the first object, controlling at least one function associated with the first object, and/or controlling at least one parameter associated with the first object. Using voice input for the control wake-up operation improves the user's flexibility when inputting it, improving the corresponding user experience.
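A minimal sketch of the voice wake-up path, assuming the recognized text is already available (for example from Android's SpeechRecognizer, omitted here); the wake phrases are illustrative assumptions:

```kotlin
// Sketch: match recognized speech against preset wake phrases. How the text is
// obtained is left out; the phrases below are illustrative assumptions.
class VoiceWake(private val showControlWindow: () -> Unit) {
    private val wakePhrases =
        listOf("control the first object", "adjust brightness", "adjust volume")

    fun onSpeechRecognized(text: String) {
        if (wakePhrases.any { text.contains(it, ignoreCase = true) }) {
            showControlWindow()
        }
    }
}
```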
Optionally, the control window comprises at least one of: at least one preset control; a second area for receiving control instructions; and a voice acquisition window.
Optionally, the control window may include at least one preset control. Optionally, the preset control may include a function switch control such as a WiFi switch, a Bluetooth switch, an NFC (Near Field Communication) switch, and/or a mobile data traffic switch. For example, as shown in fig. 6a, the corresponding control window includes a WiFi switch and a mobile data traffic switch; the user can control the WiFi function of the intelligent terminal through the WiFi switch, and the mobile data traffic function through the mobile data traffic switch. Optionally, the preset control may include a parameter adjustment control such as a volume adjustment control, a brightness adjustment control, a resolution adjustment control, and/or a definition adjustment control. For example, referring to fig. 6b, the control window shown includes a definition adjustment control for the video being played; the control includes four definition steps (a first definition 270P, a second definition 480P, a third definition 720P, and a fourth definition 1020P), and the user can adjust the definition of the played video by clicking the corresponding step.
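A control with discrete parameter steps could be modeled as below, mirroring the 270P/480P/720P/1020P steps of fig. 6b; the class and callback names are assumptions, not from the patent:

```kotlin
// Sketch of an adjustment control with discrete parameter steps.
enum class Definition(val label: String) {
    P270("270P"), P480("480P"), P720("720P"), P1020("1020P")
}

class DefinitionControl(private val applyDefinition: (Definition) -> Unit) {
    var current: Definition = Definition.P480
        private set

    // Called when the user taps one of the definition steps in the window.
    fun onStepClicked(step: Definition) {
        current = step
        applyDefinition(step)
    }
}
```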
Optionally, the control window may include a second area for receiving control instructions. The second area may include at least a partial area of the display interface corresponding to the first object and/or the second object, such as the left side area and/or the right side area of the display interface. For example, referring to fig. 6c, the first object includes a video playing program and the control instruction includes a definition adjustment instruction; the second area may include the area on the right side of the video playing interface shown in fig. 6c, and the user slides up in the second area to raise the definition of the played video and slides down to lower it. Optionally, when the second area is not receiving a related operation, it is not displayed, to avoid interfering with other operations on the corresponding display interface; when an operation corresponding to a control instruction is received, the second area may be displayed by increasing its brightness, adjusting its display color, and/or showing its outline, to prompt the user that these areas are the second area for receiving control instructions. Optionally, if the control wake-up operation of the first object corresponds to the first area, the second area may be a different area from the first area.
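A sketch of how the second area's up/down slides could be turned into definition adjustment instructions using Android's GestureDetector; the callback names are illustrative assumptions:

```kotlin
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent

// Sketch: interpret vertical flings in the second area as "definition up" /
// "definition down" control instructions.
class SecondAreaGestures(
    context: Context,
    private val onDefinitionUp: () -> Unit,
    private val onDefinitionDown: () -> Unit
) {
    private val detector = GestureDetector(context,
        object : GestureDetector.SimpleOnGestureListener() {
            override fun onFling(
                e1: MotionEvent?, e2: MotionEvent, vx: Float, vy: Float
            ): Boolean {
                // An upward swipe has negative y velocity.
                if (vy < 0) onDefinitionUp() else onDefinitionDown()
                return true
            }
        })

    // Forward touch events from the second-area view to the detector.
    fun onTouchEvent(event: MotionEvent): Boolean = detector.onTouchEvent(event)
}
```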
Optionally, the control window may comprise a voice capture window. The voice acquisition window can be characterized in a form of displaying static or dynamic microphone icons, or in a form of voice broadcasting and the like, so that a user can determine the voice acquisition opportunity corresponding to the current control instruction, and can acquire the control instruction input by the user in a voice form.
Optionally, the control window may exit when the user triggers an object outside it, such as another icon and/or control, or may exit automatically when no control instruction has been received for longer than a set duration, so as to avoid interfering with the operation of the first object. Optionally, the control window includes a close control and/or an exit control, so that it can receive a corresponding close and/or exit instruction and exit promptly according to the user's needs.
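The timeout-based exit described above might be implemented with a main-thread Handler, as in this sketch; the 5-second duration is an illustrative assumption:

```kotlin
import android.os.Handler
import android.os.Looper

// Sketch: dismiss the control window when no instruction arrives within a set
// duration; any received instruction restarts the countdown.
class ControlWindowTimeout(private val dismiss: () -> Unit) {
    private val handler = Handler(Looper.getMainLooper())
    private val dismissRunnable = Runnable { dismiss() }

    fun onWindowShown() = restart()
    fun onInstructionReceived() = restart()

    private fun restart() {
        handler.removeCallbacks(dismissRunnable)
        handler.postDelayed(dismissRunnable, 5_000L)
    }

    fun cancel() = handler.removeCallbacks(dismissRunnable)
}
```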
Optionally, the control instruction may include a parameter adjustment instruction, such as a volume adjustment instruction, a brightness adjustment instruction, a resolution adjustment instruction, and/or a definition adjustment instruction. Optionally, the control instruction may include a function switch instruction, such as a WiFi switch instruction, a Bluetooth switch instruction, an NFC switch instruction, and/or a mobile data traffic switch instruction.
Step S13, controlling at least one function and/or at least one parameter of the first object in response to the control instruction for the control window.
Optionally, the control window may include an adjustment window, and the control instruction may include an adjustment instruction corresponding to the adjustment window; optionally, the control window may include an open/close window, and the control instruction may include an open instruction or a close instruction corresponding to the open/close window. In the step S13, the control instruction input by the user may be received through the control window, so that the user may conveniently control at least one function and/or at least one parameter of the first object, and the rapidity and flexibility of the corresponding control process may be improved.
Optionally, in step S13, controlling at least one function and/or at least one parameter of the first object includes at least one of: switching a response mode of at least one function; and adjusting a degree of response of at least one parameter. The functions may include functions provided by the first object, such as WiFi, Bluetooth, and/or NFC; the response mode may include the state of the corresponding function relative to the first object, such as on or off.
Optionally, switching the response mode of at least one function includes: switching the response mode of the WiFi function from a first mode to a second mode, switching the response mode of the Bluetooth function from the first mode to the second mode, and/or switching the response mode of the NFC function from the first mode to the second mode. Optionally, the first mode is one of an on mode and an off mode, and the second mode is the other.
The parameters may include parameters that affect the response effect of the first object, such as volume, brightness, and/or resolution; the degree of response corresponds to the parameter type and may include, for example, loudness and/or brightness level. Optionally, adjusting the degree of response of at least one parameter includes: adjusting the volume, adjusting the brightness, and/or adjusting the resolution.
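As one concrete reading of "switching a response mode" and "adjusting a degree of response", the following Kotlin sketch toggles a function between on and off and sets the media volume through Android's AudioManager; the FunctionState wrapper is a hypothetical illustration, not an API from the patent.

```kotlin
import android.content.Context
import android.media.AudioManager

// Sketch: a response mode as an on/off state, and a degree of response as a
// volume level applied through the platform volume API.
enum class Mode { ON, OFF }

class FunctionState(var mode: Mode = Mode.OFF) {
    fun switchMode() { mode = if (mode == Mode.ON) Mode.OFF else Mode.ON }
}

fun adjustVolume(context: Context, level: Int) {
    val audio = context.getSystemService(Context.AUDIO_SERVICE) as AudioManager
    val max = audio.getStreamMaxVolume(AudioManager.STREAM_MUSIC)
    audio.setStreamVolume(AudioManager.STREAM_MUSIC, level.coerceIn(0, max), 0)
}
```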
Optionally, the object control method further includes: and controlling at least one function and/or at least one parameter of the first object according to the operation characteristics of the first object.
Optionally, the operating characteristics include parameters such as state characteristics describing the operation of the first object, and may also include environmental characteristics and/or environmental parameters describing the operating environment of the first object. Optionally, the state characteristics of the first object include its current state, such as its current battery level and/or available storage space. Optionally, the environmental characteristics of the first object include the external environment, such as the natural environment and/or network environment in which the first object is currently located. Optionally, the environmental characteristics of the first object include the state of the intelligent terminal on which the first object runs; if the first object is an application, its environmental characteristics include, for example, the battery level of the intelligent terminal running the application. Optionally, controlling at least one function and/or at least one parameter of the first object according to its operating characteristics includes: if the battery level corresponding to the first object is lower than a first preset level, enabling a power saving function (or power saving mode); if the available storage space of the first object is smaller than a preset capacity, disabling the automatic save function; if the network environment of the first object is a wireless local area network, enabling the WiFi function; if the volume of the environment of the first object is higher than a preset volume, raising the corresponding playback volume; if the battery level of the intelligent terminal on which the first object runs is lower than a second preset level, lowering the corresponding brightness and/or volume; and/or, if the network environment of the first object is a traffic-limited environment, lowering the definition of the corresponding video. Optionally, the traffic-limited environment may include a communication environment in which the WiFi connection is disconnected and/or a network environment in which traffic charges are relatively high.
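The rule-based behavior described above can be pictured with a small sketch. The thresholds (20% battery, 500 MB, 70 dB) and the OperatingState fields below are illustrative assumptions; the patent names these conditions only qualitatively.

```kotlin
// Sketch: adaptive control from operating characteristics, encoding the
// example rules above with assumed thresholds.
data class OperatingState(
    val batteryPercent: Int,
    val freeStorageMb: Long,
    val onWlan: Boolean,
    val ambientVolumeDb: Float,
    val trafficLimited: Boolean
)

class AdaptiveController(private val actions: Actions) {
    interface Actions {
        fun enablePowerSaving()
        fun disableAutoSave()
        fun enableWifi()
        fun raisePlaybackVolume()
        fun lowerVideoDefinition()
    }

    fun apply(state: OperatingState) {
        if (state.batteryPercent < 20) actions.enablePowerSaving()
        if (state.freeStorageMb < 500) actions.disableAutoSave()
        if (state.onWlan) actions.enableWifi()
        if (state.ambientVolumeDb > 70f) actions.raisePlaybackVolume()
        if (state.trafficLimited) actions.lowerVideoDefinition()
    }
}
```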
In this way, at least one function and/or parameter of the first object can be controlled adaptively according to the operation characteristics of the first object, and brought into a state matching those characteristics without the user participating in the control process, which further simplifies the control process and improves the control effect.
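A sketch of the adaptive rules just described may clarify the idea; all thresholds, field names and the actions printed below are assumptions for illustration, since the disclosure does not fix concrete values.

```kotlin
// Sketch only: every threshold and printed action below is an assumption.
data class OperationCharacteristics(
    val batteryPercent: Int,       // current battery level of the first object or its terminal
    val freeStorageMb: Int,        // available storage space
    val onWirelessLan: Boolean,    // whether a wireless local area network is available
    val ambientVolumeDb: Double,   // volume of the surrounding environment
    val dataLimited: Boolean       // whether the network environment is data-limited
)

class AdaptiveController(
    private val firstPresetBattery: Int = 20,     // assumed "first preset battery level"
    private val presetCapacityMb: Int = 500,      // assumed "preset capacity"
    private val presetVolumeDb: Double = 70.0     // assumed "preset volume"
) {
    fun apply(c: OperationCharacteristics) {
        if (c.batteryPercent < firstPresetBattery) println("enable power saving mode")
        if (c.freeStorageMb < presetCapacityMb) println("disable automatic storage")
        if (c.onWirelessLan) println("enable WiFi function")
        if (c.ambientVolumeDb > presetVolumeDb) println("increase playback volume")
        if (c.dataLimited) println("reduce video definition")
    }
}
```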
According to the object control method, the first object can output the control window at the control timing, so that the user can input the control instruction corresponding to the first object directly in the control window, which improves the convenience of inputting control instructions. The first object can respond to the control instruction for the control window and control at least one function and/or at least one parameter, which improves response efficiency, simplifies the control process, and improves the user experience corresponding to the first object. The control window provides instruction acquisition means in various forms, including a preset control, at least one second area for receiving control instructions, and/or a voice acquisition window, so that the user can input various control instructions more conveniently. The first object can also adaptively control at least one function and/or at least one parameter according to its operation characteristics, bringing them into a state matching those characteristics without user participation, which further simplifies the control process and improves the control effect. The object control method can therefore improve the control effect in multiple respects and improve the user experience during object control.
Second embodiment
On the basis of the above embodiments, a second embodiment of the present application provides an object control system including:
the first output module is used for outputting a control window at the control time;
a control module for controlling at least one function and/or at least one parameter of the first object in response to a control instruction for the control window.
For specific limitations of the object control system, reference may be made to the above limitations of the object control method, which are not repeated here. The modules in the object control system may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded in, or independent of, a processor of the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke them and execute the operations corresponding to each module.
The division of the modules in the object control system is only used for illustration, and in other embodiments, the object control system may be divided into different modules as needed to complete all or part of the functions of the object control system.
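As a sketch only, the module split of this second embodiment could be expressed as two roles; the interface and method names here are illustrative, not mandated by the embodiment.

```kotlin
// Sketch only: interface and method names are illustrative.
interface FirstOutputModule {
    // Output the control window when the control timing is detected.
    fun outputControlWindow()
}

interface ControlModule {
    // Respond to a control instruction received through the control window by
    // controlling at least one function and/or parameter of the first object.
    fun onControlInstruction(instruction: String)
}
```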
Third embodiment
In a third embodiment, an object control method is provided, where the object control method may include a parameter adjustment method, and may be applied to a smart terminal and/or various applications run by the smart terminal. The object control method includes the steps of:
outputting an adjusting window at the adjusting time;
at least one parameter of the first object is adjusted in response to an adjustment instruction for the adjustment window.
Optionally, the first object comprises an intelligent terminal such as a mobile phone, a tablet computer, a smart band and/or a personal computer. Optionally, the execution subject of the object control method may be the first object (i.e., the intelligent terminal), so that the intelligent terminal can output an adjustment window for itself, receive an adjustment instruction, and adjust at least one of its own parameters. Optionally, the first object includes at least one application run by the intelligent terminal, such as a video playing program, an instant messaging program and/or a conventional communication program such as a phone application. In one example, the execution subject of the object control method may be the intelligent terminal corresponding to the first object, so that the intelligent terminal can output an adjustment window corresponding to the first object, receive an adjustment instruction, and adjust at least one parameter of the locally running first object; in another example, the execution subject may be the first object itself, so that the application serving as the first object can output its own adjustment window, receive an adjustment instruction, and adjust at least one of its own parameters.
Referring to fig. 7, the object control method includes steps S22 to S23, which are described in detail as follows:
step S22, outputting an adjusting window at the adjusting time;
optionally, the adjustment timing includes an occasion at which the user needs to adjust a parameter of the first object, and the adjustment window is a window for adjusting the first object. The first object can detect the adjustment timing, so that the adjustment window is output automatically when the timing is detected, allowing the user to adjust the parameters of the first object and improving the convenience of the corresponding parameter adjustment process. The first object can also determine the adjustment timing from a received adjustment wake-up operation, so that the adjustment window is output after the user inputs the wake-up operation for the first object, improving the match between the window output and the user's needs.
Optionally, the adjustment timing comprises at least one of: when the second object is started; when the first object provides a specified function; and when the first object and/or the second object receives an adjustment wake-up operation.
Optionally, the adjustment timing comprises the start-up of the second object. The second object may be the same as or different from the first object. In one example, the first object and the second object may both be intelligent terminals; in that case, if the execution subject of the object control method is the intelligent terminal, the terminal can output an adjustment window for adjusting itself when it is started.

In one example, the first object and the second object may both be an application. In that case, if the execution subject of the object control method is the intelligent terminal, the terminal may output an adjustment window for adjusting the application when the application is started; if the execution subject is the application, the application may output its own adjustment window when it is started.

In another example, the first object may be at least one application run by the intelligent terminal, and the second object may be the intelligent terminal where the first object is located. If the execution subject of the object control method is the intelligent terminal, the terminal can output the adjustment window for adjusting the first object when the terminal itself is started; if the execution subject is the application, the application can output the adjustment window for adjusting the first object when the intelligent terminal is started.

In another example, the first object may be the intelligent terminal, and the second object may be at least one application run by the terminal; in that case, if the execution subject of the object control method is the intelligent terminal, the terminal may output an adjustment window for adjusting itself when the second object is started. Taking the start-up of the second object as the adjustment timing means the adjustment window is output at start-up, so the user can adjust at least one parameter of the first object through the window, and the first object can subsequently respond to the user's various requirements with the parameters the user has set, improving the response effect.
Optionally, the adjustment timing comprises when the first object provides a specified function. The specified function may be determined according to the functional characteristics of the first object, and may include a function for which the relevant adjustment parameters are re-adjusted in response to the user's request. Optionally, when the first object includes an intelligent terminal such as a mobile phone, the specified function may include a video call function, a video playing function and/or a picture editing function; when the first object includes an instant messaging program, the specified function may include a voice call function and/or a video call function; when the first object includes a picture processing program, the specified function may include a picture editing function, and the like. By outputting the adjustment window when the specified function is provided, the user can reset the parameters the first object uses for that function, which improves the match between the adjustment timing and the user's needs. On the basis of improving the convenience of the corresponding adjustment scheme, it also prevents the adjustment window from being output outside the period in which the first object provides the specified function, improving the effectiveness of the output adjustment window.
Optionally, the adjustment opportunity comprises when the first object and/or the second object receive an adjustment wake-up operation; the first object and/or the second object output the adjusting window when receiving the adjusting awakening operation, so that the degree of fit between the corresponding adjusting opportunity and the user requirement is greatly improved, and the user experience provided by the corresponding adjusting scheme is improved.
Optionally, the adjustment wake-up operation comprises at least one of: a first trigger operation received by a designated icon; a second trigger operation received by a first area; and a voice wake-up operation.
Optionally, the designated icon may include an icon corresponding to the first object and/or the second object; the designated icon is determined according to the type of the first object and/or the relative relationship between the first object and the second object. In one example, the designated icon may include at least one icon of the interface corresponding to the first object. Optionally, referring to the mobile phone desktop shown in fig. 4a, the designated icon is a newly added icon on the mobile phone desktop; optionally, as shown in the chat interface corresponding to the social program in fig. 4b, the designated icon is an icon displayed and/or floating on the chat interface, and so on. In another example, illustrated with reference to figs. 4c and 4d, the first object includes a first application, a second application and a third application, and the second object includes a mobile phone running the first object. Optionally, as shown in fig. 4c, the designated icon may include at least one icon corresponding to the first object, such as the icon corresponding to the first application, the second application and/or the third application; optionally, as shown in fig. 4d, the designated icon may also be a newly added icon on the mobile phone desktop. Optionally, the first trigger operation may include an operation commonly used by the user, such as a single click, a double click and/or a long press. Optionally, the first trigger operation differs from the operations already assigned to the designated icon. Taking the mobile phone desktop shown in fig. 4c as an example, if the designated icon is the icon corresponding to the first application and clicking that icon starts the first application, the first trigger operation may be another operation different from clicking, such as a long press, so as to keep the different operations input to the designated icon unambiguous.
Optionally, the first area may include at least a partial area of the display interface corresponding to the first object and/or the second object, such as the lower right corner area, the left side area and/or the right side area of the corresponding display interface. For example, referring to fig. 5a, the first object includes a first application, a second application and a third application, and the second object includes a mobile phone running the first object; when the first object is not started, the corresponding icons are displayed on the mobile phone desktop, and the first area may then be the lower right corner area, the left side area and/or the right side area of the desktop. If the first application is a video playing program, then after entering the video playing interface shown in fig. 5b, the first area may be the left side area and/or the right side area of that interface, and so on. Optionally, while no related operation is received, the first area is not displayed, so as to avoid interfering with other operations on the corresponding display interface; when the second trigger operation is received, the first area may be displayed by increasing its brightness, adjusting its display color and/or showing its outline, so as to prompt the user that these areas are first areas and that the second trigger operation input by the user will be responded to. The second trigger operation may include an operation commonly used by the user, such as a single click, a double click and/or a long press.
Optionally, the voice wake-up operation comprises inputting a preset voice to the first object and/or the second object. The preset voice may include language characterizing an adjustment instruction associated with the first object, such as "adjust the first object" and/or "adjust at least one parameter associated with the first object". Inputting the adjustment wake-up operation in voice form improves the user's flexibility in the wake-up process and the corresponding user experience.
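For illustration, a sketch of how the three adjustment wake-up operations might be detected and dispatched follows; the event model, gesture names and preset phrase are assumptions, not the patent's API.

```kotlin
// Sketch only: the event model, gesture names and preset phrase are assumed.
sealed class WakeEvent {
    data class IconTrigger(val iconId: String, val gesture: String) : WakeEvent()
    data class AreaTrigger(val areaId: String, val gesture: String) : WakeEvent()
    data class VoiceInput(val utterance: String) : WakeEvent()
}

class WakeDetector(private val onWake: () -> Unit) {
    private val presetPhrases = listOf("adjust the first object")   // assumed preset voice

    fun handle(event: WakeEvent) {
        when (event) {
            is WakeEvent.IconTrigger ->
                // First trigger operation: chosen to differ from the icon's existing gestures.
                if (event.gesture == "longPress") onWake()
            is WakeEvent.AreaTrigger ->
                // Second trigger operation received by the first area.
                if (event.gesture == "doubleTap") onWake()
            is WakeEvent.VoiceInput ->
                if (presetPhrases.any { event.utterance.contains(it) }) onWake()
        }
    }
}
```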
Optionally, the adjustment window comprises at least one of: at least one preset control; at least one adjustment area for receiving adjustment instructions; and a voice acquisition window.
Optionally, the adjustment window may comprise at least one preset control. Optionally, the preset control may include a parameter adjustment control such as a volume adjustment control, a brightness adjustment control, a resolution adjustment control and/or a definition adjustment control. Referring to fig. 8a, a volume adjustment control of an online conference interface corresponding to an online conference program is shown, by which the volume of the online conference can be adjusted; as shown in fig. 8a, the volume adjustment control includes a volume indication bar and a volume adjustment block, and the user drags the adjustment block along the indication bar to adjust the conference volume, for example dragging it upwards to increase the volume and downwards to decrease it. Referring to fig. 8b, a brightness adjustment control of a commodity live broadcast interface corresponding to a live broadcast program is shown, by which the brightness of the live broadcast interface can be adjusted; as shown in fig. 8b, the brightness adjustment control includes a brightness indication bar and a brightness adjustment block, and the user drags the adjustment block along the indication bar to adjust the interface brightness, for example dragging it upwards to increase the brightness and downwards to decrease it. Referring to fig. 8c, a definition adjustment control of a video playing interface corresponding to a video playing program is shown; the definition adjustment control includes four definition gears, namely a first definition 270P, a second definition 480P, a third definition 720P and a fourth definition 1020P, and the user can adjust the definition of the played video by clicking the corresponding gear.
Optionally, the adjustment window may comprise at least one adjustment area for receiving adjustment instructions. The adjustment area may include at least a partial area of the display interface corresponding to the first object and/or the second object, such as the left side area and/or the right side area of the corresponding display interface. For example, referring to fig. 8d, the first object includes a video playing program and the adjustment instruction includes a brightness adjustment instruction; the adjustment area may be the area on the right side of the video playing interface shown in fig. 8d, where the user slides upwards to increase the brightness of the played video and downwards to decrease it. Optionally, while no related operation is received, the adjustment area is not displayed, so as to avoid interfering with other operations on the corresponding display interface; when an operation corresponding to an adjustment instruction is received, the adjustment area may be displayed by increasing its brightness, adjusting its display color and/or showing its outline, so as to prompt the user that these areas are adjustment areas used for receiving the adjustment instructions input by the user. Optionally, if the adjustment wake-up operation corresponding to the first object is bound to the first area, the adjustment area may include an area different from the first area.
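A minimal sketch of the slide-to-adjust behavior of fig. 8d follows, assuming a vertical-swipe callback and an arbitrary sensitivity; both are illustrative choices.

```kotlin
// Sketch only: the swipe callback and sensitivity are assumed.
class BrightnessSwipeArea(private var brightness: Int = 50) {
    // deltaY < 0 means the finger moved up (increase brightness), > 0 means down (decrease).
    fun onVerticalSwipe(deltaY: Float) {
        val step = (-deltaY / 10f).toInt()          // assumed sensitivity: 10 px per unit
        brightness = (brightness + step).coerceIn(0, 100)
    }

    fun current(): Int = brightness
}
```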
Optionally, the adjustment window may include a voice acquisition window. The voice acquisition window may take the form of a static or dynamic microphone icon, or of a voice broadcast, so that the user can identify the voice acquisition timing for the current adjustment instruction, and the adjustment instruction the user inputs in voice form can be acquired.
Optionally, the adjustment window may exit when the user triggers an object outside the window, such as another icon and/or control, or may exit automatically after no corresponding adjustment instruction has been received for longer than a set time length, so as to avoid interfering with the operation of the first object. Optionally, the adjustment window includes a close control and/or an exit control for receiving a corresponding close and/or exit instruction, so that it can be dismissed promptly according to the user's needs.
Optionally, the adjustment instruction includes at least one of: a brightness adjustment instruction, a volume adjustment instruction, a resolution adjustment instruction, a definition adjustment instruction, and a size adjustment instruction for at least one display object, so that the corresponding parameter can be adjusted conveniently and rapidly, further improving the adjustment efficiency. Optionally, the display object may include at least one object of the corresponding display interface, such as at least one window displayed on the desktop of the intelligent terminal, and/or at least one window and/or object displayed on the display interface of an application. Referring to fig. 9a, the online conference interface corresponding to a conference program may include three objects, namely a conference window, a recording window and a participant user window; the preset control included in the adjustment window includes a size adjustment control corresponding to each window, as shown in fig. 9b, and the user clicks the size gear corresponding to each window so as to adjust the sizes of the windows at the same time. For example, if the user inputs the adjustment operation shown in fig. 9c in the adjustment window (the conference window set to medium, the recording window to large, and the participant user window to medium), the sizes of the windows are adjusted as shown in fig. 9d. Optionally, each display object may correspond to a plurality of size gears, and the size associated with each gear may be preset.
Optionally, the preset controls comprise a first type of adjustment control characterizing continuous variation of a parameter and/or a second type of adjustment control comprising at least one parameter gear. Optionally, the first type of adjustment control provides an adjustment path over the full range of the parameter from its minimum value to its maximum value. As shown in fig. 8a, the lowest end of the volume bar indicates the minimum volume and the top end the maximum volume, and the user can set any volume in between as required; as shown in fig. 8b, the lowest end of the brightness bar represents the minimum brightness and the topmost end the maximum brightness, and the user can set any brightness in between as required. The first type of adjustment control thus provides all adjustment options from the minimum to the maximum value, with greater flexibility. The second type of adjustment control comprises at least one parameter gear: for example, the definition adjustment control shown in fig. 8c comprises four definition gears (a first definition 270P, a second definition 480P, a third definition 720P and a fourth definition 1020P), and each size adjustment control shown in fig. 9b comprises three size gears (large, medium and small), so that by triggering the corresponding gear the user can rapidly set the parameter to the target gear, simplifying the adjustment process and improving the adjustment efficiency.
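The two control types can be sketched as follows; the ranges, gear labels and method names are illustrative (the definition gears mirror fig. 8c as described above).

```kotlin
// Sketch only: ranges, gear labels and method names are illustrative.
sealed class AdjustmentControl {
    // First type: continuous adjustment over the full range (cf. the sliders of figs. 8a/8b).
    data class Continuous(val min: Int, val max: Int) : AdjustmentControl() {
        fun valueAt(fraction: Float): Int =          // fraction in [0, 1] along the bar
            (min + fraction.coerceIn(0f, 1f) * (max - min)).toInt()
    }

    // Second type: discrete parameter gears (cf. the definition gears of fig. 8c).
    data class Stepped(val gears: List<String>) : AdjustmentControl() {
        fun select(index: Int): String = gears[index.coerceIn(0, gears.lastIndex)]
    }
}

// Usage, mirroring the figures:
val volume = AdjustmentControl.Continuous(0, 100)
val definition = AdjustmentControl.Stepped(listOf("270P", "480P", "720P", "1020P"))
```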
Optionally, the preset control comprises at least one of: a brightness adjusting control, a volume adjusting control, a resolution adjusting control, a definition adjusting control and at least one size adjusting control of a display object; so that the user can more conveniently adjust the at least one parameter.
Step S23, adjusting at least one parameter of the first object in response to the adjustment instruction for the adjustment window.
Step S23 may receive an adjustment instruction input by the user through the adjustment window, so that the user may conveniently adjust at least one parameter of the first object, and the rapidity and flexibility of the corresponding parameter adjustment process may be improved.
Optionally, in step S23, adjusting at least one parameter of the first object comprises at least one of:
adjusting the brightness of the first object, such as the brightness of the intelligent terminal and/or the brightness of various applications;
adjusting the volume of the first object, such as the volume of the intelligent terminal, the volume of the social program, the volume of the playing program and/or the volume of the live program, and the like;
adjusting the resolution of the first object, such as the resolution of various shooting programs, the resolution of a picture editing program and/or the resolution of an image processing program, and the like;
adjusting the sharpness of the first object;
and adjusting the size of at least one display object, wherein the display object can comprise an object such as a window and/or an element of a corresponding display interface.
In this way, at least one parameter of the first object, such as brightness, volume, resolution and/or definition, can be adjusted rapidly, simplifying the corresponding adjustment process and improving the adjustment efficiency.
Optionally, the object control method further includes:
step S24, at least one parameter of the first object is adjusted according to the operation characteristic of the first object.
Optionally, the operation characteristics include state characteristics and/or state parameters characterizing the operation state of the first object, and may also include environmental characteristics and/or environmental parameters characterizing the operation environment of the first object. Optionally, the state characteristics of the first object include a current state of the first object, such as its current battery level and/or available storage space. Optionally, the environmental characteristics of the first object include the external environment in which the first object is currently located, such as the natural environment and/or the network environment. Optionally, the environmental characteristics of the first object include the state of the intelligent terminal where the first object is located; for example, if the first object is an application, its environmental characteristics include the battery level of the intelligent terminal running the application, and the like. Optionally, adjusting at least one parameter of the first object according to its operation characteristics comprises: if the battery level corresponding to the first object is lower than a first preset level, reducing the brightness and/or volume of the first object; if the volume of the environment where the first object is located is greater than a preset volume, increasing the playback volume of the first object; and/or, if the difference between the ambient brightness around the first object and the brightness of the first object is greater than a set difference, adjusting the brightness of the first object according to the ambient brightness, and the like.
In this way, at least one parameter of the first object can be adjusted adaptively according to the operation characteristics of the first object, and the parameters the first object uses can be brought to values matching those characteristics without the user participating in the adjustment process, which further simplifies the adjustment process and improves the adjustment effect.
Optionally, the operation characteristics include environmental parameters, such as ambient brightness and/or ambient volume.
Optionally, step S24 includes:
step S241, obtaining the environmental parameters of the first object;
step S242, outputting adjustment confirmation information of at least one parameter corresponding to the environmental parameter;
in step S243, at least one parameter is adjusted according to the adjustment confirmation information, for example, at least one parameter is adjusted in response to a confirmation command corresponding to the adjustment confirmation information.
Before at least one parameter is adjusted according to the environmental parameters of the first object, corresponding adjustment confirmation information is output, so that a user can select whether to adjust the parameters according to requirements, the matching degree of the parameter adjustment process and the requirements of the user can be improved, and the user experience is improved. Optionally, at least one parameter may be adjusted according to a parameter type and/or a preset adjustment rule, so as to adjust the parameter to match the environmental parameter of the first object; for example, if the environmental parameter comprises an environmental brightness, adjusting the at least one parameter comprises: adjust the brightness of the first object to match the above-mentioned ambient brightness (e.g., adjust to be equal to the ambient brightness or slightly higher than the ambient brightness), and so on.
Optionally, the first object includes a mobile phone, the environmental parameter includes ambient brightness, and the parameter to be adjusted includes the phone's brightness. If the ambient brightness is greater than the current phone brightness, the phone may output the adjustment confirmation information shown in fig. 10a for the user to choose whether to adjust the brightness: if the user triggers "yes", a confirmation instruction is input and the phone adjusts its brightness to match the ambient brightness; if the user triggers "no", a no-adjustment instruction is input, and the phone exits the current brightness adjustment process and continues with its other operations. Optionally, the first object includes a video playing program, the environmental parameter includes the battery level of the intelligent terminal running the program, and the parameters to be adjusted include the display brightness and the playback volume. If the terminal's battery level is lower than a second preset level, the video playing program may output the adjustment confirmation information shown in fig. 10b, so that the user can choose whether to lower the display brightness and the playback volume; the adjustment of each parameter thus matches the user's current needs, improving the user experience as far as possible on the basis of improving the adjustment effect.
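For illustration, steps S241 to S243 can be sketched as a confirm-then-adjust flow; the dialog callback and the brightness comparison stand in for the confirmation interfaces of figs. 10a and 10b and are assumptions.

```kotlin
// Sketch only: the dialog callback stands in for the dialogs of figs. 10a/10b.
class ConfirmedAdjuster(
    private val askUser: (message: String) -> Boolean,   // true = confirmation instruction
    private val adjust: () -> Unit                       // step S243: perform the adjustment
) {
    // Step S241 supplies the environmental parameter; here, ambient brightness.
    fun onAmbientBrightness(ambientLux: Float, screenLux: Float) {
        if (ambientLux > screenLux) {
            // Step S242: output the adjustment confirmation information.
            if (askUser("Ambient light exceeds screen brightness. Adjust brightness?")) {
                adjust()                                  // confirmed: adjust the parameter
            }                                             // otherwise exit the adjustment flow
        }
    }
}
```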
Optionally, after the step S241, the following steps are further included:
s244, when the environmental parameter meets the adjusting condition, outputting mode selection information comprising a first adjusting mode and/or a second adjusting mode;
and S245, responding to the mode selection instruction aiming at the first adjusting mode, and outputting an adjusting window for a user to input a corresponding adjusting instruction through the adjusting window.
Optionally, after step S244, the method further includes the following step:
in response to the mode selection instruction for the second adjustment mode, step S242 is executed to output adjustment confirmation information of at least one parameter corresponding to the environmental parameter, so that the user further selects whether to automatically adjust the corresponding parameter.
Optionally, the adjustment condition may be set according to factors such as the type of the parameter to be adjusted, for example as the condition that the environmental parameter exceeds the corresponding parameter range. For a parameter to be adjusted such as brightness, the adjustment condition includes that the difference between the ambient brightness around the first object and the brightness of the first object is greater than a set difference; for a parameter such as volume, the adjustment condition includes that the volume of the environment where the first object is located is greater than a volume threshold, and so on. The first adjustment mode may include manual adjustment, which requires user involvement, and the second adjustment mode may include automatic adjustment, which does not. After the environmental parameter meets the adjustment condition, the first object outputs the mode selection information so that the user can select the corresponding adjustment mode for parameter adjustment, which improves the flexibility of the parameter adjustment process.
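Steps S244 and S245 can be sketched as a mode-selection branch; the callback names below are illustrative.

```kotlin
// Sketch only: mode names mirror the text; the callbacks are illustrative.
enum class AdjustmentMode { FIRST_MANUAL, SECOND_AUTOMATIC }

fun onAdjustmentCondition(
    conditionMet: Boolean,
    chooseMode: () -> AdjustmentMode,      // step S244: output mode selection information
    outputAdjustmentWindow: () -> Unit,    // step S245: first mode, user adjusts via the window
    outputConfirmation: () -> Unit         // second mode: proceed to step S242
) {
    if (!conditionMet) return
    when (chooseMode()) {
        AdjustmentMode.FIRST_MANUAL -> outputAdjustmentWindow()
        AdjustmentMode.SECOND_AUTOMATIC -> outputConfirmation()
    }
}
```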
According to the object control method, the first object can output the adjustment window at the adjustment timing, so that the user can input the adjustment instruction corresponding to the first object directly in the window, improving the convenience of inputting adjustment instructions. The first object can respond to the adjustment instruction for the adjustment window and adjust at least one parameter, which improves adjustment efficiency and simplifies the adjustment process. The adjustment window provides instruction acquisition means in various forms, including a preset control, at least one adjustment area for receiving adjustment instructions, and/or a voice acquisition window, so that the user can input various adjustment instructions more conveniently. The first object can also adaptively adjust at least one parameter according to the relevant operation characteristics, bringing the parameters it uses to values matching those characteristics without user participation, which further simplifies the adjustment process and improves the adjustment effect. In addition, the first object can output mode selection information after the environmental parameter meets the adjustment condition, so that the user can select the corresponding adjustment mode, improving the flexibility of the parameter adjustment process. The object control method can therefore improve the parameter adjustment effect in multiple respects and improve the user experience corresponding to the first object.
Fourth embodiment
On the basis of the above embodiments, a fourth embodiment of the present application provides an object control system including:
the second output module is used for outputting the adjusting window at the adjusting time;
and the adjusting module is used for responding to the adjusting instruction aiming at the adjusting window and adjusting at least one parameter of the first object.
For specific limitations of the object control system, reference may be made to the above limitations of the object control method, which are not repeated here. The modules in the object control system may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded in, or independent of, a processor of the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke them and execute the operations corresponding to each module.
The division of the modules in the object control system is only used for illustration, and in other embodiments, the object control system may be divided into different modules as needed to complete all or part of the functions of the object control system.
The embodiment of the present application further provides an intelligent terminal, where the intelligent terminal includes a memory and a processor, and the memory stores an object control program, and the object control program implements the steps of the object control method according to any one of the embodiments when executed by the processor.
Optionally, the intelligent terminal further comprises an environment sensing device; the environment sensing device is used for acquiring the environment parameters of the first object and sending the environment parameters to the processor, so that the processor automatically controls at least one function and/or at least one parameter according to the environment parameters.
Optionally, the environment sensing device includes a light sensor and/or a volume detector for measuring various types of environment parameters in the environment where the first object is located.
Taking the brightness control of the first object as an example: the environment sensing device comprises a light sensor, the light sensor measures the ambient brightness and sends it to the processor, and the processor adjusts the backlight duty cycle of the entire device according to the ambient brightness, so that the brightness of the terminal matches the ambient brightness. For example, if the ambient brightness is greater than the terminal's own brightness, the corresponding backlight duty cycle may be increased so that the brightness equals the ambient brightness, and so on.
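As an illustration of this brightness control loop, the following sketch maps a light-sensor reading to a backlight duty cycle; the linear lux-to-duty mapping and its limits are assumptions, not values from the disclosure.

```kotlin
// Sketch only: the linear lux-to-duty mapping and its limits are assumed.
class BacklightController(var dutyCycle: Float = 0.5f) {
    // Called by the processor when the light sensor reports ambient brightness.
    fun onAmbientLux(lux: Float) {
        // Assumed mapping: full duty cycle at 1000 lx and above, floor of 5%.
        dutyCycle = (lux / 1000f).coerceIn(0.05f, 1.0f)
    }
}
```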
An embodiment of the present application further provides a computer-readable storage medium, where an object control program is stored on the computer-readable storage medium, and when the object control program is executed by a processor, the steps of the object control method in any of the above embodiments are implemented.
In the embodiments of the intelligent terminal and the computer-readable storage medium provided in the present application, all technical features of any one of the embodiments of the object control method may be included, and the expanding and explaining contents of the specification are basically the same as those of the embodiments of the method, and are not described herein again.
Embodiments of the present application further provide a computer program, where the computer program includes computer program code, and when the computer program code runs on a computer, the computer is caused to execute the method in the above various possible embodiments.
Embodiments of the present application further provide a chip, which includes a memory and a processor, where the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that a device in which the chip is installed executes the method in the above various possible embodiments.
It is to be understood that the foregoing scenarios are only examples, and do not constitute a limitation on application scenarios of the technical solutions provided in the embodiments of the present application, and the technical solutions of the present application may also be applied to other scenarios. For example, as can be known by those skilled in the art, with the evolution of system architecture and the emergence of new service scenarios, the technical solution provided in the embodiments of the present application is also applicable to similar technical problems.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
The steps in the method of the embodiment of the application can be sequentially adjusted, combined and deleted according to actual needs.
The units in the device in the embodiment of the application can be merged, divided and deleted according to actual needs.
In the present application, the same or similar term concepts, technical solutions and/or application scenario descriptions are generally described in detail only at their first occurrence; for brevity, the detailed description is generally not repeated later. For term concepts, technical solutions and/or application scenario descriptions that are not described in detail later, reference may be made to the earlier related detailed description when understanding the technical solutions of the present application.
In the present application, each embodiment is described with an emphasis on the description, and reference may be made to the description of other embodiments for parts that are not described or recited in any embodiment.
The technical features of the technical solution of the present application may be arbitrarily combined, and for brevity of description, all possible combinations of the technical features in the embodiments are not described, however, as long as there is no contradiction between the combinations of the technical features, the scope of the present application should be considered as being described in the present application.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases, the former is a better implementation. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, a controlled terminal, or a network device) to execute the method of each embodiment of the present application.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium can be any available medium that a computer can access, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.

Claims (10)

1. An object control method, comprising:
outputting a control window at a control time;
controlling at least one function and/or at least one parameter of the first object in response to the control instruction for the control window.
2. The method of claim 1, comprising at least one of:
the control opportunities include at least one of: when a second object is started, the first object provides a specified function, and the first object and/or the second object receive a control wakeup operation;
the control wake-up operation comprises at least one of: a first trigger operation received by a designated icon, a second trigger operation received by a first area, and a voice wake-up operation;
the control window includes at least one of: at least one preset control, a second area for receiving a control instruction, and a voice acquisition window;
the control instruction comprises a parameter adjusting instruction and/or a function switch instruction;
the controlling at least one function and/or at least one parameter of the first object comprises switching a response mode of the at least one function associated with the first object and/or adjusting a degree of response of the at least one parameter of the first object.
3. An object control method, characterized by comprising:
outputting an adjusting window at the adjusting time;
adjusting at least one parameter of the first object in response to the adjustment instruction for the adjustment window.
4. The method of claim 3, wherein the adjustment window comprises at least one of: at least one preset control; at least one adjustment area for receiving adjustment instructions; and a voice acquisition window.
5. The method of claim 3, wherein said adjusting at least one parameter of the first object comprises at least one of:
adjusting the brightness of the first object;
adjusting a volume of the first object;
adjusting a resolution of the first object;
adjusting the sharpness of the first object;
the size of at least one display object is adjusted.
6. The method according to any one of claims 3 to 5, further comprising:
and adjusting at least one parameter of the first object according to the operation characteristics of the first object.
7. The method of claim 6, wherein said adjusting at least one parameter of the first object according to the operation characteristic of the first object comprises:
acquiring an environmental parameter of the first object;
outputting adjustment confirmation information of at least one parameter corresponding to the environment parameter;
and adjusting the at least one parameter according to the adjustment confirmation information.
8. The method of claim 7, wherein after acquiring the environmental parameters of the first object, the method further comprises:
when the environmental parameter meets the adjusting condition, outputting mode selection information comprising a first adjusting mode and/or a second adjusting mode;
the adjustment window is output in response to a mode selection command for the first adjustment mode, and/or the adjustment confirmation information is output in response to a mode selection command for the second adjustment mode.
9. An intelligent terminal, characterized in that the intelligent terminal comprises: a memory and a processor, wherein the memory has stored thereon an object control program which, when executed by the processor, implements the steps of the object control method according to any one of claims 1 to 8.
10. A computer-readable storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the object control method according to any one of claims 1 to 8.
CN202210540638.9A 2022-05-17 2022-05-17 Object control method, intelligent terminal and storage medium Pending CN115225757A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210540638.9A CN115225757A (en) 2022-05-17 2022-05-17 Object control method, intelligent terminal and storage medium


Publications (1)

Publication Number Publication Date
CN115225757A true CN115225757A (en) 2022-10-21

Family

ID=83607105

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210540638.9A Pending CN115225757A (en) 2022-05-17 2022-05-17 Object control method, intelligent terminal and storage medium

Country Status (1)

Country Link
CN (1) CN115225757A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105611047A (en) * 2015-12-16 2016-05-25 芜湖美智空调设备有限公司 Shortcut control method and device based on mobile terminal
CN105892822A (en) * 2016-04-25 2016-08-24 乐视控股(北京)有限公司 Mobile terminal and rapid setting method and device thereof
CN105955608A (en) * 2016-04-22 2016-09-21 北京金山安全软件有限公司 Shortcut control method and device and electronic equipment
CN107995353A (en) * 2017-10-27 2018-05-04 努比亚技术有限公司 The quick control method of smart machine, terminal and computer-readable recording medium
CN109842725A (en) * 2019-01-25 2019-06-04 努比亚技术有限公司 A kind of method for controlling volume of mobile terminal, mobile terminal and storage medium


Similar Documents

Publication Publication Date Title
CN107832032B (en) Screen locking display method and mobile terminal
CN110187808B (en) Dynamic wallpaper setting method and device and computer-readable storage medium
CN112822538A (en) Screen projection display method, screen projection device, terminal and storage medium
CN112399215A (en) Screen projection parameter regulation and control method and device and computer readable storage medium
CN114126015A (en) Power consumption control method, intelligent terminal and storage medium
CN109308147B (en) Application icon display method and device and computer readable storage medium
CN115086479B (en) Terminal control method, intelligent terminal and storage medium
CN107360599B (en) Intelligent wifi networking method and mobile terminal
CN114630406A (en) Power consumption control method, intelligent terminal and storage medium
CN113282205A (en) Starting method, terminal device and storage medium
CN113453326A (en) Terminal equipment power consumption optimization method and device and computer readable storage medium
CN115225757A (en) Object control method, intelligent terminal and storage medium
CN115802455B (en) Control method, intelligent terminal and storage medium
CN109669594B (en) Interaction control method, equipment and computer readable storage medium
CN114090202A (en) Terminal control method, intelligent terminal and storage medium
CN115793922A (en) Display method, intelligent terminal and storage medium
CN113873334A (en) Data refreshing method, intelligent terminal and storage medium
CN117221438A (en) Volume adjusting method, intelligent terminal and storage medium
CN114116104A (en) Information control method, intelligent terminal and storage medium
CN114089897A (en) Wallpaper switching method, intelligent terminal and storage medium
CN113489842A (en) Incoming call interception method, mobile terminal and storage medium
CN115174731A (en) Control method, intelligent terminal and storage medium
CN113315873A (en) Processing method, mobile terminal and storage medium
CN116866471A (en) Control method, intelligent terminal and storage medium
CN113448674A (en) Interface display method, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20221021