CN111049970B - Method and device for operating equipment, electronic equipment and medium


Info

Publication number
CN111049970B
Authority
CN
China
Prior art keywords
user
operation instruction
function
determining
display screen
Prior art date
Legal status
Active
Application number
CN201911015213.0A
Other languages
Chinese (zh)
Other versions
CN111049970A (en)
Inventor
邱肯
Current Assignee
Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Original Assignee
Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Priority to CN201911015213.0A
Publication of CN111049970A
Application granted
Publication of CN111049970B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Environmental & Geological Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a method and an apparatus for operating a device, an electronic device, and a medium. When the display screen of a mobile terminal is detected to be off, an operation instruction that a first user performs on the display screen is obtained, a function corresponding to the operation instruction is then determined based on a preset operation policy, and that function is carried out while the display screen of the mobile terminal remains off. With this technical solution, the first operation performed by the user on the touch screen can be received while the mobile terminal is in the screen-off state, and the function corresponding to that operation can be identified through the preset operation policy, so that the mobile terminal can carry out the function without being unlocked. This avoids the drawback in the related art that the user must go through a lengthy sequence of operations to reach the desired function.

Description

Method and device for operating equipment, electronic equipment and medium
Technical Field
The present application relates to data processing technologies, and in particular, to a method and an apparatus for operating a device, an electronic device, and a medium.
Background
With the rise of the communication era, smart devices have been continuously developed and are used by more and more users.
Further, with the rapid development of the Internet, multiple functions are often deployed in a smart device so that the user has a better experience, for example a wireless local area network (WIFI) function, a Bluetooth function, a payment function, a flashlight function, a music playing function, and the like. Before using one of these functions, the user needs to light up the phone screen, unlock it, find the corresponding icon on the corresponding display page, and then click the icon to start and use the function. It can be understood that when the user needs several functions at once, each icon must be clicked in turn to activate the corresponding function before it can be used.
However, the existing methods in the related art require many operations from the user, which degrades the user experience.
Disclosure of Invention
The embodiment of the application provides a method and a device for operating equipment, electronic equipment and a medium.
According to an aspect of an embodiment of the present application, there is provided a method of operating a device, including:
when a display screen of the mobile terminal is in a turned-off state, acquiring an operation instruction generated by a first user, wherein the operation instruction is an operation performed on the display screen;
determining a function corresponding to the operation instruction based on a preset operation strategy;
and when the display screen of the mobile terminal is in the off state, implementing the function corresponding to the operation instruction.
Optionally, in another embodiment based on the foregoing method of the present application, before determining, based on a preset operation policy, a function corresponding to the operation instruction, the method further includes:
acquiring operation data generated by a second user in a historical time period, wherein the operation data is used for representing a combined function realized by the second user on the mobile terminal;
determining the operating policy based on the operating data.
Optionally, in another embodiment based on the method of the present application, the determining the operation policy based on the operation data includes:
when detecting that the second user starts at least two functions within a first time range based on the operation data, determining that the at least two functions are a first combined function corresponding to the second user;
and recording the first combined function, and generating the operation strategy based on the first combined function.
Optionally, in another embodiment based on the method of the present application, the determining the operation policy based on the operation data includes:
when detecting that the second user starts at least two functions in a preset area range based on the operation data, determining that the at least two functions are second combined functions corresponding to the second user;
and recording the second combination function and the preset area range, and generating the operation strategy based on the second combination function and the preset area range.
Optionally, in another embodiment based on the foregoing method of the present application, after the obtaining of the operation instruction generated by the first user, the method further includes:
detecting a pressing force value received by a display screen of the mobile terminal, wherein the pressing force value is a numerical value generated based on the operation instruction;
and when the pressing force value is confirmed to be larger than the force threshold, determining a function corresponding to the operation instruction based on the operation strategy.
Optionally, in another embodiment based on the foregoing method of the present application, after the obtaining of the operation instruction generated by the first user, the method further includes:
acquiring an image of the target user;
extracting biological information characteristics of the target user by utilizing a convolutional neural network model based on the image of the target user;
and determining a function corresponding to the operation instruction based on the biological information characteristics and the operation strategy.
Optionally, in another embodiment based on the foregoing method of the present application, after the display screen of the mobile terminal is in the off state and the function corresponding to the operation instruction is implemented, the method further includes:
and generating a vibration message, wherein the vibration message is used for prompting the user that the function corresponding to the operation instruction is realized.
According to another aspect of the embodiments of the present application, there is provided an apparatus for operating a device, including:
an acquisition module, configured to acquire an operation instruction generated by a first user when the display screen of the mobile terminal is in a turned-off state, wherein the operation instruction is an operation performed on the display screen;
a determining module, configured to determine a function corresponding to the operation instruction based on a preset operation policy;
and an implementation module, configured to implement the function corresponding to the operation instruction when the display screen of the mobile terminal is in the off state.
According to another aspect of the embodiments of the present application, there is provided an electronic device including:
a memory for storing executable instructions; and
a display configured to cooperate with the memory to execute the executable instructions so as to perform the operations of any one of the above methods of operating a device.
According to a further aspect of the embodiments of the present application, there is provided a computer-readable storage medium for storing computer-readable instructions, which, when executed, perform the operations of any one of the above-described methods for operating a device.
According to the method and the apparatus of the present application, when the display screen of the mobile terminal is detected to be off, an operation instruction that the first user performs on the display screen is obtained, a function corresponding to the operation instruction is then determined based on the preset operation policy, and that function is carried out while the display screen of the mobile terminal remains off. With this technical solution, the first operation performed by the user on the touch screen can be received while the mobile terminal is in the screen-off state, and the function corresponding to that operation can be identified through the preset operation policy, so that the mobile terminal can carry out the function without being unlocked. This avoids the drawback in the related art that the user must go through a lengthy sequence of operations to reach the desired function.
The technical solution of the present application is further described in detail by the accompanying drawings and examples.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description, serve to explain the principles of the application.
The present application may be more clearly understood from the following detailed description with reference to the accompanying drawings, in which:
FIG. 1 is a system architecture diagram of an operating device according to the present application;
FIG. 2 is a schematic diagram of a method of operating an apparatus as set forth herein;
FIG. 3 is a schematic diagram of yet another method of operating a device as set forth herein;
FIG. 4 is a schematic diagram of the structure of the apparatus for operating the device of the present application;
FIG. 5 is a schematic structural diagram of an electronic device according to the present application.
Detailed Description
Various exemplary embodiments of the present application will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present application unless specifically stated otherwise.
Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the application, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be discussed further in subsequent figures.
In addition, the technical solutions of the various embodiments of the present application may be combined with each other, provided that the combination can be realized by a person skilled in the art; where the technical solutions are contradictory or cannot be realized, the combination should be considered not to exist and not to fall within the protection scope of the present application.
It should be noted that all directional indications in the embodiments of the present application (such as upper, lower, left, right, front and rear) are only used to explain the relative positional relationship, movement and the like of the components in a specific posture (as shown in the drawings); if that specific posture changes, the directional indication changes accordingly.
A method for operating a device according to an exemplary embodiment of the present application is described below in conjunction with fig. 1-3. It should be noted that the following application scenarios are merely illustrated for the convenience of understanding the spirit and principles of the present application, and the embodiments of the present application are not limited in this respect. Rather, embodiments of the present application may be applied to any scenario where applicable.
Fig. 1 shows a schematic diagram of an exemplary system architecture 100 to which the method or apparatus for operating a device of an embodiment of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include one or more of terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, server 105 may be a server cluster comprised of multiple servers, or the like.
A user may use terminal devices 101, 102, 103 to interact with a server 105 over a network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 may be various electronic devices having a display screen, including but not limited to smart phones, tablet computers, portable computers, desktop computers, and the like.
The terminal apparatuses 101, 102, 103 in the present application may be terminal apparatuses that provide various services. For example, the user implements, via the terminal device 103 (or terminal device 101 or 102): when a display screen of the mobile terminal is in a turned-off state, acquiring an operation instruction generated by a first user, wherein the operation instruction is an operation performed on the display screen; determining a function corresponding to the operation instruction based on a preset operation strategy; and when the display screen of the mobile terminal is in the off state, realizing the function corresponding to the operation instruction.
It should be noted that the method of operating a device provided in the embodiments of the present application may be executed by one or more of the terminal devices 101, 102 and 103, and/or the server 105. Accordingly, the apparatus for operating a device provided in the embodiments of the present application is generally disposed in the corresponding terminal device and/or the server 105, but the present application is not limited thereto.
The application also provides a method, a device, a target terminal and a medium for operating the equipment.
Fig. 2 schematically shows a flow diagram of a method of operating a device according to an embodiment of the present application. As shown in fig. 2, the method includes:
s101, when a display screen of the mobile terminal is in a turned-off state, an operation instruction generated by a first user is obtained, and the operation instruction is an operation performed on the display screen.
It should be noted that the mobile terminal is not specifically limited in the present application; it may be, for example, a smart device or a server. The smart device may be a PC (Personal Computer), a smart phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a portable computer, or another mobile terminal device having a display function.
It should be noted that, in order to protect the security of the mobile terminal, the mobile terminal usually locks the screen when it detects that the user has performed no operation within a preset time period. When the user needs to use the terminal again, the screen must be unlocked by entering a password, pressing a fingerprint, or the like, and only after the screen is successfully unlocked can the corresponding function be started by clicking an icon or the like. As can be appreciated, such an approach is cumbersome. The present application therefore provides a method for implementing a corresponding function while the display screen of the mobile terminal is off.
Specifically, it is required to first acquire an operation instruction generated by a first user for a display screen of the mobile terminal when the display screen of the mobile terminal is in an off state. It should be noted that, the present application does not specifically limit the operation instruction, and for example, the operation may be an operation of clicking a terminal screen, an operation of sliding the terminal screen, or an operation of pressing the terminal screen.
And S102, determining a function corresponding to the operation instruction based on a preset operation strategy.
Further, after receiving the operation instruction generated by the first user, the present application may query, based on the preset operation policy, whether the operation instruction has a corresponding function. It can be understood that if the operation instruction is an accidental touch on the terminal screen, no corresponding function will be found in the operation policy, and the terminal will not react to this type of operation.
It should be noted that the operation policy is not specifically limited in the present application; for example, it may be a policy generated by the mobile terminal according to the user's historical usage habits, a policy entered in advance by the user on the mobile terminal, or a policy pre-stored in a database.
Optionally, in the present application, after the operation instruction generated by the first user is obtained, the pre-stored operation policy may be retrieved from the mobile terminal. The operation policy may be a policy preset by the user. For example, when the terminal receives an operation, generated by the first user, of clicking the terminal screen three times in succession, that operation is recorded and matched against the operation policy. When it is detected that the function corresponding to this operation is the Bluetooth function, it is determined that the function corresponding to the first user's operation of clicking the screen three times in succession is the Bluetooth function.
Alternatively, the operation policy may be a policy generated based on the user's historical habits. For example, when the terminal receives an operation, generated by the first user, of touching the terminal screen for more than 3 seconds, that operation is recorded and matched against the operation policy. When it is detected that the operation policy stores historical events matching the current time, namely that the user has opened the video application in the terminal many times at this time of day, the function corresponding to the first user's operation of touching the terminal screen for more than 3 seconds can be determined to be opening the video application.
Still alternatively, the operation policy may be a policy pre-stored in a database. For example, when the terminal receives an operation, generated by the first user, of sliding on the terminal screen, that operation is recorded and matched against the operation policy. When it is detected that the function corresponding to the sliding operation stored in the operation policy is opening the music application, it is determined that the function corresponding to the first user's sliding operation is opening the music application.
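As a concrete illustration of this matching step, the following is a minimal sketch of how an operation policy could be represented and queried while the screen is off. It is only a sketch under assumed names: the gesture labels, the function identifiers, and the OperationPolicy class are illustrative and are not prescribed by the application.

```python
# Minimal sketch of an operation policy lookup; gesture labels and function
# identifiers are hypothetical.
from typing import Dict, List, Optional


class OperationPolicy:
    def __init__(self) -> None:
        # Maps a recognized screen-off gesture to the function(s) it should trigger.
        self._mapping: Dict[str, List[str]] = {}

    def register(self, gesture: str, functions: List[str]) -> None:
        self._mapping[gesture] = functions

    def match(self, gesture: str) -> Optional[List[str]]:
        # Unrecognized gestures (e.g. accidental touches) yield None.
        return self._mapping.get(gesture)


def handle_screen_off_gesture(policy: OperationPolicy, screen_is_off: bool,
                              gesture: str) -> List[str]:
    if not screen_is_off:
        return []                       # only active while the display is off
    return policy.match(gesture) or []  # empty list: terminal does not react


# Example: a user-preset policy like the ones described above.
policy = OperationPolicy()
policy.register("triple_tap", ["enable_bluetooth"])
policy.register("long_press_3s", ["open_video_app"])
policy.register("swipe", ["open_music_app"])

print(handle_screen_off_gesture(policy, screen_is_off=True, gesture="triple_tap"))
# ['enable_bluetooth']
```

In this sketch an unrecognized gesture simply yields no function, which mirrors the behaviour described above for accidental touches.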
And S103, realizing the function corresponding to the operation instruction when the display screen of the mobile terminal is in a turned-off state.
It can be understood that, in the present application, after the function corresponding to the operation instruction is determined based on the preset operation policy, that function can be implemented automatically. It should be noted that the mobile terminal implements the function corresponding to the operation instruction while the display screen is off, which mitigates the drawback in the related art that the corresponding function can be started only after the display screen has been unlocked.
For example, when riding public transportation, a user usually connects a Bluetooth headset through the Bluetooth function and opens a music application so as to play games and enjoy music. To spare the user from having to unlock the phone first and then open the Bluetooth function and the application in sequence, the present application matches the operation instruction against the operation policy as soon as the user generates it while the phone screen is still locked. When the instruction is matched to opening the Bluetooth function and the music application, the Bluetooth function and the music APP are automatically opened for the user and the corresponding music is played while the phone screen remains off.
For another example, when the user is going to sleep, the mobile phone usually needs to be switched to silent mode and power saving mode. To spare the user from having to unlock the phone first and then set the silent mode and the power saving mode in sequence, the present application matches the operation instruction against the operation policy as soon as the user generates it while the phone screen is still locked. When the instruction is matched to setting the silent mode and the power saving mode, the silent mode and the power saving mode are automatically set for the user while the phone screen remains off.
According to the method and the apparatus of the present application, when the display screen of the mobile terminal is detected to be off, an operation instruction that the first user performs on the display screen is obtained, a function corresponding to the operation instruction is then determined based on the preset operation policy, and that function is carried out while the display screen of the mobile terminal remains off. With this technical solution, the first operation performed by the user on the touch screen can be received while the mobile terminal is in the screen-off state, and the function corresponding to that operation can be identified through the preset operation policy, so that the mobile terminal can carry out the function without being unlocked. This avoids the drawback in the related art that the user must go through a lengthy sequence of operations to reach the desired function.
Optionally, in another embodiment of the present application, after S101 (obtaining the operation instruction generated by the first user when the display screen of the mobile terminal is in the off state), the method may further include:
acquiring a target user image;
extracting biological information characteristics of the target user by utilizing a convolutional neural network model based on the target user image;
and determining the function corresponding to the operation instruction based on the biological information characteristics and the operation strategy.
In the present application, after the operation instruction is acquired, the function that the user wants to realize can be further confirmed by identifying the user's biometric information, so as to determine the function corresponding to the operation instruction. The correspondence between biometric features and operation instructions is not specifically limited in the present application; for example, a plurality of correspondences may be set in advance by the user. For instance, blinking at the phone's camera while generating the first operation instruction may mean turning on the Bluetooth function, and waving at the phone's camera while generating the first operation instruction may mean turning on the flashlight function, and so on.
On the device side, after the image of the target user is captured by the camera, the biometric features of the target user can be extracted using a convolutional neural network model. The biometric features may be of various kinds, such as facial features, iris features, and limb features, which is not limited in the present application.
It should be noted that a convolutional neural network (CNN) is a class of feed-forward neural networks that involve convolution operations and have a deep structure, and it is one of the representative algorithms of deep learning. A convolutional neural network has representation learning capability and can perform shift-invariant classification of input information according to its hierarchical structure. Owing to its powerful ability to characterize image features, CNN has achieved remarkable results in fields such as image classification, object detection, and semantic segmentation.
Further, the present application may extract the biometric information in the user image with the CNN model: at least one user image is input into a preset convolutional neural network model, and the output of the last fully connected layer (FC) of the model is used as the feature data corresponding to the user image, so that the user's operation intent can be derived from the feature data.
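As one possible realization of this step (a sketch only), the code below feeds a user image through a pretrained CNN and takes the output of the network's last fully connected layer as the feature data. PyTorch, torchvision's ResNet-18 backbone, and the preprocessing pipeline are assumptions chosen for illustration; the application itself does not name a framework or architecture.

```python
# Sketch: use the output of a CNN's last fully connected layer as the feature data
# for a user image. PyTorch and the ResNet-18 backbone are illustrative assumptions.
import torch
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()


def extract_features(image_path: str) -> torch.Tensor:
    """Return the last-FC-layer output of the network for one user image."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # shape (1, 3, 224, 224)
    with torch.no_grad():
        features = model(batch)             # output of the final fully connected layer
    return features.squeeze(0)

# The resulting feature vector could then be compared with the user's enrolled
# biometric features to decide which function the operation instruction maps to.
```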
It should be further noted that, before determining the feature data corresponding to each user image by using the convolutional neural network model, the convolutional neural network model needs to be obtained in the following manner:
obtaining a sample image, wherein the sample image includes at least one sample feature;
and training a preset neural network image classification model by using the sample image to obtain a convolutional neural network model meeting a preset condition.
Further, the present application may identify, through a neural network image classification model, the sample features (for example, iris features, facial features, limb features, or the like) of at least one object contained in the sample image. Furthermore, the neural network image classification model may classify each sample feature in the sample image, grouping features that belong to the same category into the same type, so that the sample features obtained after semantic segmentation of the sample image may consist of several different types.
It should be noted that, when the neural network image classification model performs semantic segmentation processing on the sample image, the more accurate the classification of the pixel points in the sample image is, the higher the accuracy rate of identifying the labeled object in the sample image is. It should be noted that the preset condition may be set by a user.
For example, the preset condition may be that the classification accuracy for the pixel points reaches 70% or more. The sample images are then used to train the neural network image classification model repeatedly, and once the model's classification accuracy for the pixel points reaches 70% or more, the model can be applied in the embodiments of the present application to perform image segmentation on user images.
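The training criterion described here can be sketched as a loop that keeps training until the model's pixel classification accuracy on held-out samples reaches the preset 70% threshold. The helper below is a generic outline under assumed inputs (model, train_loader, val_loader with per-pixel labels); it is not the application's own training procedure.

```python
# Sketch: train a segmentation/classification model until its pixel classification
# accuracy reaches the preset condition (here 70%). `model`, `train_loader`, and
# `val_loader` are assumed to be provided elsewhere.
import torch
import torch.nn as nn


def train_until_threshold(model: nn.Module, train_loader, val_loader,
                          threshold: float = 0.70, max_epochs: int = 50,
                          lr: float = 1e-3) -> nn.Module:
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)

    for epoch in range(max_epochs):
        model.train()
        for images, labels in train_loader:          # labels: per-pixel class indices
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()

        # Validation: fraction of correctly classified pixels.
        model.eval()
        correct, total = 0, 0
        with torch.no_grad():
            for images, labels in val_loader:
                preds = model(images).argmax(dim=1)
                correct += (preds == labels).sum().item()
                total += labels.numel()
        accuracy = correct / max(total, 1)

        if accuracy >= threshold:                    # preset condition met
            print(f"epoch {epoch}: pixel accuracy {accuracy:.2%} >= {threshold:.0%}, stop")
            break

    return model
```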
Further optionally, in another embodiment of the present application, before S102 (determining a function corresponding to the operation instruction based on the preset operation policy), an embodiment is further included, specifically as shown in fig. 3, where:
s201, collecting operation data generated by a second user in a historical time period, wherein the operation data is used for representing a combined function realized by the second user on the mobile terminal;
it should be noted that, in the process of determining the operation policy, the corresponding operation policy may be automatically generated according to the historical habits of the user. For example, the application may collect operational data generated by the second user over a historical period of time for implementing the combined function on the mobile terminal. The second user is not specifically limited in the present application, for example, the second user may be the first user, and the second user may not be the same user as the first user.
In addition, the historical time period is not specifically limited, and for example, the historical time period may be 30 days, or 60 days, and the like.
Furthermore, in order to help the user realize the corresponding functions quickly, one entry in the operation policy of the present application may also correspond to several functions at the same time. The number of functions in a combined function is not specifically limited in the present application; it may be, for example, 2 or 5.
S202, determining an operation strategy based on the operation data.
Optionally, the following manners may be included in the present application to determine the operation policy:
the first mode is as follows:
when detecting that the second user starts at least two functions within a first time range based on the operation data, determining that the at least two functions are first combined functions corresponding to the second user;
the first combined function is recorded, and an operation strategy is generated based on the first combined function.
First, the present application may traverse the operation data and determine whether there are operations that start at least two functions within the first time range; if so, the at least two functions are determined to be a first combined function corresponding to the second user, so that the first combined function can subsequently be added to the operation policy. The operation instruction corresponding to the first combined function may be an operation instruction assigned by the terminal, or a user-defined operation instruction. Further, a mapping relationship is established between the operation instruction and the first combined function and recorded in the operation policy.
For example, taking the mobile terminal being a mobile phone, the historical time period being 3 days, the first time range being 5 seconds, and the two functions being opening a Bluetooth connection and playing music as an example:
The embodiment of the present application can determine whether, within the last 3 days of the phone's history, the Bluetooth connection was opened and music was played within 5 seconds of each other. Specifically, on 2019.10.1, 2 seconds after opening the phone's Bluetooth connection, the user opened the music APP on the phone and started playing music; on 2019.10.2, the user did the same 3 seconds after opening the Bluetooth connection; and on 2019.10.3, the user did the same 1 second after opening the Bluetooth connection. Based on these recorded combinations of started functions, the two functions are determined to be the first combined function corresponding to the second user. At the same time, an operation instruction of clicking the screen three times in succession is generated for the first combined function, and a first mapping relationship is established between this operation instruction and simultaneously opening the Bluetooth connection and the music application. It can be understood that this first mapping relationship is added to the operation policy, so that after the phone subsequently detects, with the display screen off, the operation instruction of the user clicking the screen three times in succession, it automatically opens the Bluetooth connection and the music application and plays music.
For a further example, still taking the mobile terminal being a mobile phone, the historical time period being 3 days, the first time range being 5 seconds, and the two functions being opening a Bluetooth connection and opening shared-bicycle software as an example:
The embodiment of the present application can determine whether, within the last 3 days of the phone's history, the Bluetooth connection was opened and the shared-bicycle software was started within 5 seconds of each other. Specifically, on 2019.9.1, 2 seconds after opening the phone's Bluetooth connection, the user also started the shared-bicycle APP on the phone and opened its scan function; on 2019.9.2, the user did the same 3 seconds after opening the Bluetooth connection; and on 2019.9.3, the user did the same 5 seconds after opening the Bluetooth connection. Based on these recorded combinations of started functions, opening the Bluetooth connection and starting the scan function of the shared-bicycle APP are determined to be another combined function corresponding to the second user. At the same time, an operation instruction of clicking the screen twice in succession is generated for this combined function, and a second mapping relationship is established between that operation instruction and simultaneously opening the Bluetooth connection and the scan function. It can be understood that this second mapping relationship is added to the operation policy, so that after the phone detects, with the display screen off, the operation instruction of the user clicking the screen twice in succession, it automatically opens the Bluetooth connection and the scan function of the shared-bicycle APP. The user can then point the phone's camera directly at the shared bicycle's QR code and scan it quickly without unlocking the phone.
The second mode is as follows:
when detecting that the second user starts at least two functions in the preset area range based on the operation data, determining that the at least two functions are second combined functions corresponding to the second user;
and recording the second combined function and the preset area range, and generating an operation strategy based on the second combined function and the preset area range.
Further, the present application may also traverse the operation data and determine whether there are operations that start at least two functions within the preset area range; if so, the at least two functions are determined to be a second combined function corresponding to the second user, so that the second combined function can subsequently be added to the operation policy. It should also be noted that the operation instruction corresponding to the second combined function may be an operation instruction allocated by the terminal, or a user-defined operation instruction. Further, a mapping relationship is established between the operation instruction and the second combined function and recorded in the operation policy.
It should be noted that the preset area range is not specifically limited in the present application, that is, the preset area range may be any area.
For example, taking the mobile terminal being a mobile phone, the historical time period being 3 days, and the two functions being turning on the flashlight and playing music as an example:
The embodiment of the present application can determine whether, within the last 3 days of the phone's history, the flashlight was turned on and music was played within a preset area. Specifically, on 2019.10.1, when the phone was located in park A of the Toyobo district in Beijing, the user turned on the phone's flashlight and then also opened the music APP on the phone and started playing a song; on 2019.10.2 and 2019.10.3, the user did the same when in park A of the Toyobo district in Beijing. Based on these recorded combinations of started functions, the two functions are determined to be the second combined function corresponding to the second user. At the same time, an operation instruction of sliding left on the screen for 2 seconds is generated for the second combined function, and a second mapping relationship is established between this operation instruction and simultaneously turning on the flashlight and the music application. It can be understood that this second mapping relationship is added to the operation policy, so that after the phone detects, with the display screen off and while located in park A of the Toyobo district in Beijing, the operation instruction of sliding left on the screen for 2 seconds, it automatically turns on the flashlight and the music application and plays a song.
S203, when the display screen of the mobile terminal is in a turning-off state, acquiring an operation instruction generated by the first user.
Optionally, after the operation instruction generated by the first user is obtained, a pressing force value received by a display screen of the mobile terminal may be further detected, where the pressing force value is a numerical value generated based on the operation instruction;
and when the pressing force value is confirmed to be larger than the force threshold, determining a function corresponding to the operation instruction based on the operation strategy.
In order to prevent the operation instruction generated by the first user from being an accidental operation made unintentionally, the pressing force value received by the display screen of the mobile terminal can be further detected. When the pressing force value is detected to be greater than the force threshold, the operation is judged to be a deliberate operation by the user rather than an accidental touch, and the function corresponding to the operation instruction can then be determined based on the operation policy.
The pressing force value is not specifically limited in the present application; it may be, for example, 5 Pa or 1 Pa.
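The accidental-touch filter can be sketched as a simple gate: the gesture is only passed on to the policy matcher when the pressure reported for it exceeds the force threshold. The threshold value and the reading format below are placeholders, not values taken from the application.

```python
# Sketch: only forward a screen-off gesture to the policy matcher when the pressing
# force exceeds the threshold; otherwise treat it as an accidental touch.
from typing import List, Optional

FORCE_THRESHOLD = 1.0  # same unit as the sensor readings; placeholder value


def filter_accidental_touch(gesture: str, pressure_samples: List[float]) -> Optional[str]:
    """Return the gesture if it was pressed deliberately, otherwise None."""
    if max(pressure_samples, default=0.0) > FORCE_THRESHOLD:
        return gesture   # deliberate press: look up its function in the operation policy
    return None          # likely an accidental touch: ignore it


print(filter_accidental_touch("triple_tap", [0.2, 1.4, 0.9]))  # triple_tap
print(filter_accidental_touch("triple_tap", [0.1, 0.3]))       # None
```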
And S204, determining a function corresponding to the operation instruction based on a preset operation strategy.
And S205, when the display screen of the mobile terminal is in a turned-off state, realizing a function corresponding to the operation instruction.
Further, in a possible implementation, after the function corresponding to the operation instruction has been implemented, the mobile terminal may also generate a vibration message, which is used to prompt the user that the function corresponding to the operation instruction has been carried out.
According to the method and the apparatus of the present application, when the display screen of the mobile terminal is detected to be off, an operation instruction that the first user performs on the display screen is obtained, a function corresponding to the operation instruction is then determined based on the preset operation policy, and that function is carried out while the display screen of the mobile terminal remains off. With this technical solution, the first operation performed by the user on the touch screen can be received while the mobile terminal is in the screen-off state, and the function corresponding to that operation can be identified through the preset operation policy, so that the mobile terminal can carry out the function without being unlocked. This avoids the drawback in the related art that the user must go through a lengthy sequence of operations to reach the desired function.
In another embodiment of the present application, as shown in FIG. 4, the present application further provides an apparatus for operating a device, where the apparatus includes an obtaining module 301, a determining module 302, and an implementing module 303, wherein:
the obtaining module 301 is configured to obtain an operation instruction generated by a first user when a display screen of the mobile terminal is in a turned-off state, where the operation instruction is an operation performed on the display screen;
a determining module 302 configured to determine a function corresponding to the operation instruction based on a preset operation policy;
and the implementation module 303 is configured to implement a function corresponding to the operation instruction when the display screen of the mobile terminal is in the off state.
In another embodiment of the present application, the determining module 302 further includes:
a determining module 302 configured to collect operation data generated by a second user in a historical time period, wherein the operation data is used for characterizing a combined function implemented by the second user on the mobile terminal;
a determination module 302 configured to determine the operation policy based on the operation data.
In another embodiment of the present application, the determining module 302 further includes:
a determining module 302 configured to determine, when it is detected that at least two functions are started by the second user within a first time range based on the operation data, that the at least two functions are a first combined function corresponding to the second user;
a determining module 302 configured to record the first combined function and generate the operation policy based on the first combined function.
In another embodiment of the present application, the determining module 302 further includes:
a determining module 302, configured to determine, when it is detected that the second user starts at least two functions within a preset area range based on the operation data, that the at least two functions are second combined functions corresponding to the second user;
a determining module 302 configured to record the second combined function and the preset area range, and generate the operation policy based on the second combined function and the preset area range.
In another embodiment of the present application, the determining module 302 further includes:
a determining module 302 configured to detect a pressing force value received by a display screen of the mobile terminal, where the pressing force value is a numerical value generated based on the operation instruction;
a determining module 302 configured to determine, based on the operation policy, a function corresponding to the operation instruction when it is determined that the pressing force value is greater than the force threshold.
In another embodiment of the present application, the determining module 302 further includes:
a determination module 302 configured to obtain an image of the target user;
extracting biological information characteristics of the target user by utilizing a convolutional neural network model based on the image of the target user;
a determining module 302 configured to determine a function corresponding to the operation instruction based on the biological information feature and the operation policy.
In another embodiment of the present application, the apparatus further includes a generating module 304, wherein:
a generating module 304 configured to generate a vibration message for prompting a user that a function corresponding to the operation instruction has been implemented.
Fig. 5 is a block diagram illustrating a logical structure of an electronic device in accordance with an exemplary embodiment. For example, the electronic device 400 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 5, electronic device 400 may include one or more of the following components: a processor 401 and a memory 402.
Processor 401 may include one or more processing cores, such as a 4-core processor, an 8-core processor, or the like. The processor 401 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 401 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 401 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed by the display screen. In some embodiments, the processor 401 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 402 may include one or more computer-readable storage media, which may be non-transitory. Memory 402 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 402 is used to store at least one instruction for execution by the processor 401 to implement the method of operating a device provided by the method embodiments of the present application.
In some embodiments, the electronic device 400 may further optionally include: a peripheral interface 403 and at least one peripheral. The processor 401, memory 402 and peripheral interface 403 may be connected by bus or signal lines. Each peripheral may be connected to the peripheral interface 403 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 404, touch screen display 405, camera 406, audio circuitry 407, positioning components 408, and power supply 409.
The peripheral interface 403 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 401 and the memory 402. In some embodiments, processor 401, memory 402, and peripheral interface 403 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 401, the memory 402 and the peripheral interface 403 may be implemented on a separate chip or circuit board, which is not limited by this embodiment.
The Radio Frequency circuit 404 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 404 communicates with a communication network and other communication devices via electromagnetic signals. The rf circuit 404 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 404 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 404 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 404 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 405 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 405 is a touch display screen, the display screen 405 also has the ability to capture touch signals on or over the surface of the display screen 405. The touch signal may be input to the processor 401 as a control signal for processing. At this point, the display screen 405 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display screen 405 may be one, providing the front panel of the electronic device 400; in other embodiments, the display screen 405 may be at least two, respectively disposed on different surfaces of the electronic device 400 or in a folded design; in still other embodiments, the display screen 405 may be a flexible display screen disposed on a curved surface or a folded surface of the electronic device 400. Even further, the display screen 405 may be arranged in a non-rectangular irregular pattern, i.e. a shaped screen. The Display screen 405 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and other materials.
The camera assembly 406 is used to capture images or video. Optionally, camera assembly 406 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, the main camera and the wide-angle camera are fused to realize panoramic shooting and a VR (Virtual Reality) shooting function or other fusion shooting functions. In some embodiments, camera assembly 406 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 407 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 401 for processing, or inputting the electric signals to the radio frequency circuit 404 for realizing voice communication. For stereo capture or noise reduction purposes, the microphones may be multiple and disposed at different locations of the electronic device 400. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 401 or the radio frequency circuit 404 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuitry 407 may also include a headphone jack.
The positioning component 408 is used to locate the current geographic location of the electronic device 400 for navigation or LBS (Location Based Service). The positioning component 408 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 409 is used to supply power to the various components of the electronic device 400. The power supply 409 may use alternating current, direct current, disposable batteries, or rechargeable batteries. When the power supply 409 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging, and may also support fast-charging technology.
In some embodiments, the electronic device 400 also includes one or more sensors 410. The one or more sensors 410 include, but are not limited to: acceleration sensor 411, gyro sensor 412, pressure sensor 413, fingerprint sensor 414, optical sensor 415, and proximity sensor 416.
The acceleration sensor 411 may detect the magnitude of acceleration on the three coordinate axes of a coordinate system established with respect to the electronic device 400. For example, the acceleration sensor 411 may be used to detect the components of gravitational acceleration on the three coordinate axes. The processor 401 may control the touch display screen 405 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 411. The acceleration sensor 411 may also be used to collect motion data for games or for the user.
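As a rough illustration of how the gravity components collected by the acceleration sensor 411 could drive the landscape/portrait choice described above, the following Kotlin sketch compares the gravity readings on two assumed axes; the axis names, the comparison rule, and the sample values are illustrative assumptions, not the patent's implementation.

```kotlin
import kotlin.math.abs

enum class Orientation { PORTRAIT, LANDSCAPE }

// Decide the layout orientation from the gravity components on the device's
// x and y axes: gravity acting mostly along y means the device is held upright.
fun orientationFromGravity(gx: Float, gy: Float): Orientation =
    if (abs(gy) >= abs(gx)) Orientation.PORTRAIT else Orientation.LANDSCAPE

fun main() {
    println(orientationFromGravity(gx = 0.3f, gy = 9.7f))  // PORTRAIT
    println(orientationFromGravity(gx = 9.6f, gy = 0.5f))  // LANDSCAPE
}
```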
The gyro sensor 412 may detect the body orientation and rotation angle of the electronic device 400, and may cooperate with the acceleration sensor 411 to capture the user's 3D motion with the electronic device 400. Based on the data collected by the gyro sensor 412, the processor 401 may implement functions such as motion sensing (for example, changing the UI according to the user's tilting operation), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 413 may be disposed on a side bezel of the electronic device 400 and/or in a layer beneath the touch display screen 405. When the pressure sensor 413 is disposed on the side bezel of the electronic device 400, it can detect the user's grip signal on the electronic device 400, and the processor 401 performs left-hand/right-hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 413. When the pressure sensor 413 is disposed in a layer beneath the touch display screen 405, the processor 401 controls the operable controls on the UI according to the pressure applied by the user on the touch display screen 405. The operable controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
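The force-threshold gating that the pressure sensor 413 enables (and that claim 3 below recites for screen-off operation instructions) can be sketched as follows; the gesture identifiers, the threshold value, and the gesture-to-function mapping are hypothetical stand-ins for the preset operation strategy, not the patent's code.

```kotlin
// A press performed on the dark screen, with the force value reported for it.
data class PressOperation(val gestureId: String, val forceValue: Float)

// Return the function to start only when the pressing force exceeds the threshold.
fun resolveShortcut(
    operation: PressOperation,
    forceThreshold: Float,
    strategy: Map<String, String>  // gestureId -> function to start (assumed layout)
): String? =
    if (operation.forceValue > forceThreshold) strategy[operation.gestureId] else null

fun main() {
    val strategy = mapOf("double_tap" to "flashlight", "draw_circle" to "camera")
    println(resolveShortcut(PressOperation("draw_circle", 2.4f), 1.5f, strategy))  // camera
    println(resolveShortcut(PressOperation("draw_circle", 0.8f), 1.5f, strategy))  // null (too light)
}
```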
The fingerprint sensor 414 is used to collect the user's fingerprint. The processor 401 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 414, or the fingerprint sensor 414 itself identifies the user's identity from the collected fingerprint. Upon recognizing the user's identity as a trusted identity, the processor 401 authorizes the user to perform sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings. The fingerprint sensor 414 may be disposed on the front, back, or side of the electronic device 400. When a physical button or vendor logo is provided on the electronic device 400, the fingerprint sensor 414 may be integrated with the physical button or vendor logo.
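A minimal sketch of this authorization step, assuming fingerprint matching is reduced to comparing template identifiers: all of the listed sensitive operations are allowed only when the collected fingerprint matches an enrolled trusted template.

```kotlin
// Template identifiers and operation names below are illustrative assumptions.
data class Fingerprint(val templateId: String)

val sensitiveOperations = setOf(
    "unlock_screen", "view_encrypted_info", "download_software", "pay", "change_settings"
)

// All listed operations share the same trust gate in this simplified model.
fun authorizedOperations(collected: Fingerprint, trusted: Set<Fingerprint>): Set<String> =
    if (collected in trusted) sensitiveOperations else emptySet()

fun main() {
    val trusted = setOf(Fingerprint("owner_right_thumb"))
    println(authorizedOperations(Fingerprint("owner_right_thumb"), trusted))  // full set
    println(authorizedOperations(Fingerprint("unknown"), trusted))            // []
}
```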
The optical sensor 415 is used to collect the ambient light intensity. In one embodiment, the processor 401 may control the display brightness of the touch display screen 405 based on the ambient light intensity collected by the optical sensor 415. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 405 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 405 is decreased. In another embodiment, the processor 401 may also dynamically adjust the shooting parameters of the camera assembly 406 according to the ambient light intensity collected by the optical sensor 415.
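This brightness adjustment can be sketched as a simple monotonic mapping from the ambient light reading to a display brightness level; the 0 to 1000 lux working range, the 10 to 255 level range, and the linear interpolation are assumptions made only for illustration.

```kotlin
// Brighter surroundings map to a higher display brightness level, and vice versa.
fun brightnessForAmbientLight(lux: Float, minLevel: Int = 10, maxLevel: Int = 255): Int {
    // Clamp the ambient reading to an assumed working range, then interpolate linearly.
    val clamped = lux.coerceIn(0f, 1000f)
    return (minLevel + (maxLevel - minLevel) * (clamped / 1000f)).toInt()
}

fun main() {
    println(brightnessForAmbientLight(5f))    // dim room -> low brightness level
    println(brightnessForAmbientLight(900f))  // bright daylight -> high brightness level
}
```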
The proximity sensor 416, also known as a distance sensor, is typically disposed on the front panel of the electronic device 400. The proximity sensor 416 is used to measure the distance between the user and the front of the electronic device 400. In one embodiment, when the proximity sensor 416 detects that the distance between the user and the front of the electronic device 400 is gradually decreasing, the processor 401 controls the touch display screen 405 to switch from the bright-screen state to the off-screen state; when the proximity sensor 416 detects that the distance between the user and the front of the electronic device 400 is gradually increasing, the processor 401 controls the touch display screen 405 to switch from the off-screen state to the bright-screen state.
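A minimal sketch of this proximity-driven screen switching, assuming the sensor reports a distance in centimetres and that only the direction of change matters:

```kotlin
enum class ScreenState { BRIGHT, OFF }

// Switch to the off-screen state while the user moves closer to the front of
// the device, and back to the bright-screen state as the distance grows again.
fun nextScreenState(current: ScreenState, previousCm: Float, currentCm: Float): ScreenState = when {
    currentCm < previousCm && current == ScreenState.BRIGHT -> ScreenState.OFF
    currentCm > previousCm && current == ScreenState.OFF -> ScreenState.BRIGHT
    else -> current
}

fun main() {
    println(nextScreenState(ScreenState.BRIGHT, previousCm = 20f, currentCm = 3f))  // OFF
    println(nextScreenState(ScreenState.OFF, previousCm = 3f, currentCm = 25f))     // BRIGHT
}
```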
Those skilled in the art will appreciate that the configuration shown in fig. 5 does not constitute a limitation of the electronic device 400, which may include more or fewer components than shown, combine certain components, or adopt a different arrangement of components.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium, such as the memory 404, comprising instructions executable by the processor 401 of the electronic device 400 to perform a method of operating a device, the method comprising: when a display screen of the mobile terminal is in a turned-off state, acquiring an operation instruction generated by a first user, wherein the operation instruction is an operation performed on the display screen; determining a function corresponding to the operation instruction based on a preset operation strategy; and when the display screen of the mobile terminal is in the turned-off state, implementing the function corresponding to the operation instruction. Optionally, the instructions may also be executed by the processor 401 of the electronic device 400 to perform the other steps involved in the exemplary embodiments described above. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
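To make the stored method concrete, the following Kotlin sketch models the screen-off flow restated above: an operation instruction performed on the dark display is resolved, through a preset operation strategy, into a combined function of at least two functions for the recognized user. The gesture identifiers, user identifiers, function names, and the map-based strategy layout are illustrative assumptions; in particular, the recognized user ID merely stands in for the biometric-feature match performed with the convolutional neural network model.

```kotlin
// An operation instruction generated by a gesture performed on the turned-off screen.
data class OperationInstruction(val gestureId: String)

// Per recognized user, each gesture maps to a combined function: a list of at
// least two functions to be started together while the screen stays off.
typealias OperationStrategy = Map<String, Map<String, List<String>>>

fun resolveCombinedFunction(
    screenIsOff: Boolean,
    recognizedUserId: String?,          // stands in for the biometric-feature match
    instruction: OperationInstruction,
    strategy: OperationStrategy
): List<String> {
    if (!screenIsOff || recognizedUserId == null) return emptyList()
    return strategy[recognizedUserId]?.get(instruction.gestureId).orEmpty()
}

fun main() {
    val strategy: OperationStrategy = mapOf(
        "user_a" to mapOf("double_tap" to listOf("flashlight", "camera"))
    )
    // Screen off, recognized user, known gesture -> the combined function is returned.
    println(resolveCombinedFunction(true, "user_a", OperationInstruction("double_tap"), strategy))
    // Unrecognized user -> nothing is started.
    println(resolveCombinedFunction(true, null, OperationInstruction("double_tap"), strategy))
}
```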
In an exemplary embodiment, there is also provided an application/computer program product comprising one or more instructions executable by the processor 401 of the electronic device 400 to perform the above method of operating a device, the method comprising: when a display screen of the mobile terminal is in a turned-off state, acquiring an operation instruction generated by a first user, wherein the operation instruction is an operation performed on the display screen; determining a function corresponding to the operation instruction based on a preset operation strategy; and when the display screen of the mobile terminal is in the turned-off state, implementing the function corresponding to the operation instruction. Optionally, the instructions may also be executed by the processor 401 of the electronic device 400 to perform the other steps involved in the exemplary embodiments described above. Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention that follow the general principles of the application, including such departures from the present disclosure as come within known or customary practice in the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (7)

1. A method of operating a device, comprising:
acquiring operation data generated by a second user in a historical time period, wherein the operation data is used for representing a combined function realized by the second user on the mobile terminal;
determining an operation strategy based on the operation data, wherein determining the operation strategy based on the operation data comprises: when it is detected, based on the operation data, that the second user starts at least two different functions within a preset area range, determining that the at least two different functions are a second combined function corresponding to the second user; and recording the second combined function and the preset area range, and generating the operation strategy based on the second combined function and the preset area range;
when a display screen of the mobile terminal is in a turned-off state, acquiring an operation instruction generated by a first user, wherein the operation instruction is an operation performed on the display screen;
acquiring a target user image, and extracting biological information characteristics of a target user by using a convolutional neural network model based on the target user image;
determining a function corresponding to the operation instruction based on a preset operation strategy, wherein the function is a combined function corresponding to the operation strategy and comprising at least two different functions; the determining the function corresponding to the operation instruction based on the preset operation strategy comprises the following steps: determining a combined function corresponding to the operation instruction based on the biological information characteristics and a preset operation strategy;
and when the display screen of the mobile terminal is in the turned-off state, implementing the combined function corresponding to the operation instruction and comprising at least two different functions.
2. The method of claim 1, wherein the determining the operation strategy based on the operation data comprises:
when it is detected, based on the operation data, that the second user starts at least two functions within a first time range, determining that the at least two functions are a first combined function corresponding to the second user;
and recording the first combined function, and generating the operation strategy based on the first combined function.
3. The method of claim 1, further comprising, after said acquiring the operation instruction generated by the first user:
detecting a pressing force value received by a display screen of the mobile terminal, wherein the pressing force value is a numerical value generated based on the operation instruction;
and when it is confirmed that the pressing force value is greater than a force threshold, determining the function corresponding to the operation instruction based on the operation strategy.
4. The method of claim 1, wherein after the function corresponding to the operation instruction is implemented while the display screen of the mobile terminal is in the turned-off state, the method further comprises:
generating a vibration message, wherein the vibration message is used for prompting the user that the function corresponding to the operation instruction has been implemented.
5. An apparatus for operating a device, comprising:
a determining module configured to acquire operation data generated by a second user in a historical time period, wherein the operation data is used for representing a combined function realized by the second user on the mobile terminal;
the determining module is further configured to determine an operation strategy based on the operation data, wherein determining the operation strategy based on the operation data comprises: when it is detected, based on the operation data, that the second user starts at least two different functions within a preset area range, determining that the at least two different functions are a second combined function corresponding to the second user; and recording the second combined function and the preset area range, and generating the operation strategy based on the second combined function and the preset area range;
the mobile terminal comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring an operation instruction generated by a first user when the display screen of the mobile terminal is in a turned-off state, and the operation instruction is an operation performed on the display screen; acquiring a target user image, and extracting biological information characteristics of the target user by using a convolutional neural network model based on the target user image;
the determining module is further configured to determine a function corresponding to the operation instruction based on a preset operation strategy, wherein the function is a combined function corresponding to the operation strategy and comprising at least two different functions; the determining the function corresponding to the operation instruction based on the preset operation strategy comprises: determining a combined function which comprises at least two different functions and corresponds to the operation instruction based on the biological information characteristics and the preset operation strategy;
and an implementation module configured to implement the function corresponding to the operation instruction when the display screen of the mobile terminal is in the turned-off state.
6. An electronic device, comprising:
a memory for storing executable instructions; and
a processor configured to communicate with the memory to execute the executable instructions so as to perform the operations of the method of operating a device according to any one of claims 1-4.
7. A computer-readable storage medium storing computer-readable instructions that, when executed, perform the operations of the method of operating a device of any of claims 1-4.
CN201911015213.0A 2019-10-24 2019-10-24 Method and device for operating equipment, electronic equipment and medium Active CN111049970B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911015213.0A CN111049970B (en) 2019-10-24 2019-10-24 Method and device for operating equipment, electronic equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911015213.0A CN111049970B (en) 2019-10-24 2019-10-24 Method and device for operating equipment, electronic equipment and medium

Publications (2)

Publication Number Publication Date
CN111049970A CN111049970A (en) 2020-04-21
CN111049970B true CN111049970B (en) 2022-06-07

Family

ID=70231773

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911015213.0A Active CN111049970B (en) 2019-10-24 2019-10-24 Method and device for operating equipment, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN111049970B (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9037277B1 (en) * 2012-06-13 2015-05-19 Audible, Inc. Systems and methods for initiating action based on audio output device
CN105159592B (en) * 2015-09-09 2018-10-23 魅族科技(中国)有限公司 Using startup method and terminal
CN106528238A (en) * 2016-11-14 2017-03-22 珠海市魅族科技有限公司 Application startup method and device
CN106708402A (en) * 2016-11-25 2017-05-24 努比亚技术有限公司 Method and apparatus for controlling terminal application
CN107360370A (en) * 2017-07-27 2017-11-17 深圳市泰衡诺科技有限公司 A kind of method, photo taking and photo camera for smart machine
CN109240581A (en) * 2018-08-06 2019-01-18 Oppo(重庆)智能科技有限公司 terminal control method, device, terminal device and computer readable storage medium

Also Published As

Publication number Publication date
CN111049970A (en) 2020-04-21

Similar Documents

Publication Publication Date Title
CN110572711B (en) Video cover generation method and device, computer equipment and storage medium
CN110933468A (en) Playing method, playing device, electronic equipment and medium
CN111048111B (en) Method, device, equipment and readable storage medium for detecting rhythm point of audio
CN109922356B (en) Video recommendation method and device and computer-readable storage medium
CN109068008B (en) Ringtone setting method, device, terminal and storage medium
CN110341627B (en) Method and device for controlling behavior in vehicle
CN110139143B (en) Virtual article display method, device, computer equipment and storage medium
CN110109608B (en) Text display method, text display device, text display terminal and storage medium
CN112788359B (en) Live broadcast processing method and device, electronic equipment and storage medium
CN111062248A (en) Image detection method, device, electronic equipment and medium
CN110853124B (en) Method, device, electronic equipment and medium for generating GIF dynamic diagram
CN110675473B (en) Method, device, electronic equipment and medium for generating GIF dynamic diagram
CN111327819A (en) Method, device, electronic equipment and medium for selecting image
CN109547847B (en) Method and device for adding video information and computer readable storage medium
CN111402844A (en) Song chorusing method, device and system
CN108495183B (en) Method and device for displaying album information
CN112860046B (en) Method, device, electronic equipment and medium for selecting operation mode
CN110191236B (en) Song playing queue management method and device, terminal equipment and storage medium
CN111341317A (en) Method and device for evaluating awakening audio data, electronic equipment and medium
CN111008083A (en) Page communication method and device, electronic equipment and storage medium
CN112866470A (en) Incoming call processing method and device, electronic equipment and medium
CN111049970B (en) Method and device for operating equipment, electronic equipment and medium
CN111369434B (en) Method, device, equipment and storage medium for generating spliced video covers
CN114595019A (en) Theme setting method, device and equipment of application program and storage medium
CN112732133B (en) Message processing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant