CN113093980A - Terminal control method and device, terminal and storage medium

Terminal control method and device, terminal and storage medium

Info

Publication number
CN113093980A
CN113093980A
Authority
CN
China
Prior art keywords
terminal
data
vibration
user
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110500916.3A
Other languages
Chinese (zh)
Other versions
CN113093980B (en)
Inventor
杨司烨
魏新勇
谢昂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Douyin Vision Co Ltd
Douyin Vision Beijing Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd
Priority to CN202110500916.3A
Publication of CN113093980A
Application granted
Publication of CN113093980B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides a terminal control method and apparatus, a terminal, and a storage medium. The control method of a terminal includes: determining user identity information of a current user; determining a target control parameter based on usage habit data of the user according to the user identity information, wherein the usage habit data comprises first vibration data and/or second vibration data, the first vibration data being associated with a state in which the terminal does not receive a control operation and the second vibration data being associated with a state in which the terminal receives a control operation; and controlling the terminal according to the target control parameter. The disclosed method can improve the user experience and reduce false responses to vibration.

Description

Terminal control method and device, terminal and storage medium
Technical Field
The present disclosure relates to the field of intelligent terminal technologies, and in particular, to a terminal control method, an apparatus, a terminal, and a storage medium.
Background
A television is usually controlled by a remote controller. With the development of technology, intelligent remote controllers have come into wide use; an intelligent remote controller controls the television by manipulating a control identifier displayed on the television.
Disclosure of Invention
The present disclosure provides a terminal control method and apparatus, a terminal, and a storage medium.
The present disclosure adopts the following technical solutions.
In some embodiments, the present disclosure provides a control method of a terminal, including:
determining user identity information of a current user;
determining a target control parameter based on usage habit data of the user according to the user identity information, wherein the usage habit data comprises: first vibration data and/or second vibration data, wherein the first vibration data is associated with a state in which the terminal does not receive a control operation, and the second vibration data is associated with a state in which the terminal receives a control operation;
and controlling the terminal according to the target control parameter.
In some embodiments, the present disclosure provides a control apparatus of a terminal, including:
the identity recognition unit is used for determining the user identity information of the current user;
a parameter determining unit, configured to determine a target control parameter based on usage habit data of the user according to the user identity information, wherein the usage habit data comprises: first vibration data and/or second vibration data, wherein the first vibration data is associated with a state in which the terminal does not receive a control operation, and the second vibration data is associated with a state in which the terminal receives a control operation;
and the control unit is used for controlling the terminal according to the target control parameter.
In some embodiments, the present disclosure provides a terminal comprising: at least one memory and at least one processor;
the memory is used for storing program codes, and the processor is used for calling the program codes stored in the memory to execute the method.
In some embodiments, the present disclosure provides a storage medium for storing program code for performing the above-described method.
In the terminal control method of the present disclosure, the target control parameter is determined based on the user's usage habit data, so the target control parameter matches the user's usage habits, thereby improving the user experience.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
Fig. 1 is a flowchart of a control method of a terminal according to an embodiment of the present disclosure.
Fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in different orders and/or in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a" or "an" in this disclosure are intended to be illustrative rather than limiting, and those skilled in the art will understand that they should be read as "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The embodiments of the present application will be described in detail below with reference to the accompanying drawings.
With the development of technology, remote controllers used with televisions have gradually become intelligent: a control identifier associated with the remote controller is displayed on the television, the control identifier is manipulated through operation of the remote controller, and the control identifier in turn controls the content displayed on the television. An embodiment of the present disclosure provides a terminal control method. The terminal may be a television remote controller, but is not limited to one; it may also be another device such as a mobile phone or a tablet.
As shown in fig. 1, fig. 1 is a flowchart of a control method of a terminal according to an embodiment of the present disclosure, including the following steps.
S11: and determining the user identity information of the current user.
In some embodiments, the user identity information may be determined when the user starts using the terminal. For example, a fingerprint identification area may be provided on the terminal, and the user identity information is determined from a fingerprint image collected in that area, for instance when the current user unlocks the terminal with a fingerprint. In other embodiments, considering that the terminal may be shared by several different users, the identity of the current user may be identified whenever the terminal detects a control operation. For example, an image may be collected by a camera on the terminal and the user identity recognized from the collected image each time a control operation is detected, which handles the case where the user changes while the terminal is in use.
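By way of illustration only (the following sketch is not part of the patent disclosure), the two identification strategies above might be combined as shown below; the event structure and the fingerprint_lookup, face_lookup, and camera helpers are hypothetical.

```python
def identify_current_user(event, fingerprint_lookup, face_lookup, camera):
    """Return a user id for the current user, or None if identification fails.

    - At unlock time, identity can come from the fingerprint image.
    - At every detected control operation, identity is re-checked from a
      camera image, so a change of user during use is picked up.
    """
    if event["type"] == "fingerprint_unlock":
        return fingerprint_lookup(event["fingerprint_image"])
    if event["type"] == "control_operation":
        image = camera.capture()   # capture an image when the operation is detected
        return face_lookup(image)  # recognize the user from the captured image
    return None
```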
S12: and determining target control parameters based on the use habit data of the user according to the identity information of the user.
In some embodiments, the usage habit data is looked up according to the user identity information. The usage habit data includes one or both of first vibration data and second vibration data: the first vibration data is associated with the state in which the terminal receives no control operation, and the second vibration data is associated with the state in which the terminal receives a control operation. In some embodiments, the first vibration data characterizes the shaking of the terminal while it is powered on but not being operated. Some users may, for physiological reasons (e.g. illness), have a slight hand tremor while holding the terminal, and a gyroscope or similar sensor of the terminal detects this shake; such shaking is not intended as motion control of the terminal, and responding to it would be a false response. In other words, the first vibration data can be regarded as background vibration, i.e. vibration the terminal still detects when the user performs no operation. The second vibration data is the vibration observed when a control operation is performed. The control operation may be, for example, a press or a click, on a touch area or a physical key; performing it requires contact with the terminal, which usually shifts the terminal slightly and produces a shake that a gyroscope or similar sensor detects. Because this shake is caused by the control operation itself, it should not be responded to either. In some embodiments, the control parameters include a vibration response strategy determined based on the first vibration data and/or the second vibration data, which distinguishes vibration that should be responded to from vibration that should not.
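As a minimal, non-limiting sketch of how such a strategy might be derived from recorded habit data (the field names and the 95th-percentile choice are assumptions, not taken from the patent):

```python
def amplitude_threshold(samples, quantile=0.95):
    """Return the amplitude below which `quantile` of the recorded samples fall."""
    if not samples:
        return None
    ordered = sorted(samples)
    index = min(int(quantile * len(ordered)), len(ordered) - 1)
    return ordered[index]


def derive_vibration_thresholds(habit_data):
    """Derive the first/second vibration amplitudes from a user's habit data.

    habit_data is assumed to hold two lists of logged amplitudes:
    - "first_vibration":  shake recorded while no control operation was received
    - "second_vibration": shake recorded while a control operation was received
    """
    first_amplitude = amplitude_threshold(habit_data.get("first_vibration", []))
    second_amplitude = amplitude_threshold(habit_data.get("second_vibration", []))
    return first_amplitude, second_amplitude
```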
S13: and controlling the terminal according to the target control parameter.
In some embodiments, the target control parameter is set as a control parameter of the terminal. For example, the target control parameter may include the vibration response strategy, and the terminal's response to vibration is controlled according to that strategy. Because the target control parameter is determined from the user's usage habit data, it matches the user's habits and improves the user experience; and because the usage habit data includes one or both of the first vibration data and the second vibration data, controlling the terminal with the target control parameter reduces false responses of the terminal to vibration.
In some embodiments of the present disclosure, controlling the terminal according to the target control parameter includes: controlling the terminal according to the first vibration data so that, when no control operation is received, the terminal does not respond to vibration whose amplitude is smaller than a first vibration amplitude, the first vibration amplitude being associated with the first vibration data. In some embodiments, the first vibration data indicates the vibration state of the terminal when it is held by the user but not being operated, i.e. a vibration state that requires no response. The first vibration amplitude can be determined from the first vibration data and represents the involuntary shake of the terminal in the user's hand; when the detected vibration amplitude is smaller than the first vibration amplitude, the vibration is likely involuntary and therefore need not be responded to.
In some embodiments of the present disclosure, controlling the terminal according to the target control parameter includes: controlling the terminal according to the second vibration data so that, when a control operation is received, the terminal does not respond to vibration below a second vibration amplitude, the second vibration amplitude being associated with the second vibration data. In some embodiments, the second vibration data indicates the vibration of the terminal caused by the user's contact with it during a control operation, and such vibration should not be responded to. The second vibration amplitude may therefore be determined from the second vibration data, for example as the vibration amplitude associated with the user's control operations. When the terminal detects vibration whose amplitude is smaller than the second vibration amplitude and also detects a control operation (for example, when the time difference between detecting the control operation and detecting the vibration is smaller than a time threshold), the detected vibration was most likely caused by the control operation and need not be responded to.
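Continuing the sketch above, a possible response decision combining the two thresholds might look as follows; the time threshold value is an assumed example.

```python
def should_respond_to_vibration(amplitude, first_amplitude, second_amplitude,
                                control_op_time=None, vibration_time=0.0,
                                time_threshold=0.3):
    """Decide whether a detected vibration should trigger a motion response.

    Sketch of the vibration response strategy: ignore shake below the user's
    background level, and ignore shake that coincides with a control operation.
    `time_threshold` (seconds) is an illustrative value.
    """
    # No control operation received: filter out involuntary hand tremor.
    if control_op_time is None:
        return not (first_amplitude is not None and amplitude < first_amplitude)
    # A control operation was received close in time to the vibration:
    # treat small shake as a side effect of pressing/clicking.
    if abs(vibration_time - control_op_time) < time_threshold:
        if second_amplitude is not None and amplitude < second_amplitude:
            return False
    return True
```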
In some embodiments of the present disclosure, the usage habit data further includes: false touch data. The false touch data may include, for example, the number of false touches, the position of each false touch, and its type (click or slide). In some embodiments, controlling the terminal according to the target control parameter includes: adjusting, according to the false touch data, the detection sensitivity with which the terminal detects touch operations. For example, the number of false touches by the user over a recent period (say, the last three days or the last week) may be determined; if it exceeds a threshold number, recent false touches are frequent, so the detection sensitivity for touch operations can be reduced, which reduces responses to slight touches and hence responses to false touches.
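A possible sketch of this sensitivity adjustment, assuming the false touch records carry timestamps and using illustrative window and threshold values:

```python
from datetime import datetime, timedelta

def adjusted_touch_sensitivity(base_sensitivity, false_touches,
                               window_days=7, count_threshold=20,
                               reduction_factor=0.7):
    """Lower the touch detection sensitivity if recent false touches are frequent.

    `false_touches` is assumed to be a list of dicts with a "time" datetime;
    the window, count threshold and reduction factor are illustrative values.
    """
    cutoff = datetime.now() - timedelta(days=window_days)
    recent = [t for t in false_touches if t["time"] >= cutoff]
    if len(recent) > count_threshold:
        # Many recent false touches: respond less readily to light touches.
        return base_sensitivity * reduction_factor
    return base_sensitivity
```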
In some embodiments of the present disclosure, the false touch data includes false touch data for different directions or for different areas of the terminal's touch area, and the detection sensitivity of the terminal in those directions or areas is adjusted according to the false touch data. If false touches are frequent in a certain area, the detection sensitivity in that area is reduced; if they are frequent in a certain direction, the sensitivity in that direction is reduced. That is, in some embodiments of the present disclosure the detection sensitivities in different directions or different areas of the touch area may differ. After a touch operation is detected, the area it falls in or its operation direction (sliding up and down, sliding left and right, etc.) is determined, the detection sensitivity for that area or direction is obtained, and whether to respond to the touch operation is decided based on that sensitivity, thereby reducing false responses. Because the sensitivities may differ across areas or directions, this matches the user's habits better than a single uniform sensitivity for the whole terminal; in particular, where a user never operates in certain directions or areas, false responses are reduced as much as possible without affecting actual use, so the terminal adapts to the user's habits rather than the user adapting to the terminal.
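One way such per-area and per-direction sensitivities could be applied is sketched below; the touch record fields and the sensitivity semantics (lower sensitivity requires a stronger touch) are assumptions.

```python
def should_respond_to_touch(touch, area_sensitivity, direction_sensitivity,
                            default_sensitivity=1.0):
    """Decide whether to respond to a touch using area/direction-specific sensitivity.

    `touch` is assumed to carry an "area" key, a "direction" key (e.g. "left-right"),
    and a "strength" normalized to [0, 1]; the lower of the two applicable
    sensitivities governs how strong a touch must be before the terminal responds.
    """
    sensitivity = min(
        area_sensitivity.get(touch.get("area"), default_sensitivity),
        direction_sensitivity.get(touch.get("direction"), default_sensitivity),
    )
    # With sensitivity 1.0 any touch passes; with 0.5 only touches with
    # strength above 0.5 are treated as intentional.
    return touch["strength"] >= 1.0 - sensitivity
```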
In some embodiments of the present disclosure, the usage habit data includes: wake-up operation data, used to identify the operational characteristics of the wake-up operation performed when the terminal is shaken awake. In some embodiments, the terminal has a shake-to-wake function: when the terminal is in a sleep state, shaking it performs a wake-up operation. The wake-up operation data may be determined from previously collected characteristics of the user's shaking, for example one or more of the shaking gesture, shaking strength, shaking accuracy, and shaking acceleration, so the wake-up operation the user habitually performs can be learned from it. In some embodiments, controlling the terminal according to the target control parameter includes: adjusting the wake-up judgment condition of the terminal according to the wake-up operation data. Once the user's habitual wake-up operation is known, the wake-up judgment condition is adjusted accordingly; for example, it may be set to match the wake-up operation the user performs most often, so the user can wake the terminal more easily and no longer has to shake it repeatedly before it wakes.
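A hedged sketch of how a per-user wake-up judgment condition might be derived from past wake-up shakes; the 80% margin and the record fields are illustrative assumptions.

```python
def build_wake_condition(shake_history, margin=0.8):
    """Derive a per-user shake-to-wake judgment condition from past wake-up shakes.

    `shake_history` is assumed to be a list of dicts with "strength" and
    "acceleration" recorded for successful wake-up shakes; the returned condition
    accepts shakes at `margin` (80%) of the user's typical values.
    """
    if not shake_history:
        return lambda shake: shake["strength"] > 1.0  # fall back to a default rule
    typical_strength = sum(s["strength"] for s in shake_history) / len(shake_history)
    typical_accel = sum(s["acceleration"] for s in shake_history) / len(shake_history)

    def is_wake_shake(shake):
        return (shake["strength"] >= margin * typical_strength
                and shake["acceleration"] >= margin * typical_accel)

    return is_wake_shake
```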
In some embodiments of the present disclosure, the usage habit data includes: a first sliding distance associated with click operations and/or a second sliding distance associated with slide operations. The first sliding distance may be, for example, the distance the user's finger slides on the touch area while performing a click; collecting it reveals how far the user tends to slide when clicking. Similarly, the second sliding distance is the distance the user slides when performing a slide operation. Controlling the terminal according to the target control parameter includes: adjusting the terminal's judgment conditions for click and slide operations according to the first sliding distance and/or the second sliding distance. Concretely, the maximum distance a click is likely to slide may be determined from the first sliding distance and/or the second sliding distance; a touch operation whose sliding distance is below that maximum is identified as a click, and a touch operation whose sliding distance exceeds the minimum distance a slide operation is likely to cover is identified as a slide. Determining the judgment conditions for click and slide operations from the user's usage habits in this way reduces the terminal's misjudgment of touch operations during use.
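An illustrative classification rule built from these habit distances might look as follows; the midpoint fallback is an assumption, since the patent only requires that the judgment conditions be derived from the collected distances.

```python
def classify_touch(slide_distance, first_sliding_distance, second_sliding_distance):
    """Classify a touch operation as "click" or "slide" from its sliding distance.

    `first_sliding_distance`: how far this user typically drifts while clicking.
    `second_sliding_distance`: how far this user typically moves when sliding.
    """
    max_click_drift = first_sliding_distance
    min_slide_move = second_sliding_distance
    if slide_distance <= max_click_drift:
        return "click"
    if slide_distance >= min_slide_move:
        return "slide"
    # In between: fall back to whichever habitual value is closer.
    midpoint = (max_click_drift + min_slide_move) / 2
    return "click" if slide_distance < midpoint else "slide"
```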
In some embodiments of the present disclosure, the usage habit data includes: a false touch area in the terminal's touch area, i.e. an area where false touch actions occur. Controlling the terminal according to the target control parameter includes: determining, according to the false touch area, whether a detected touch operation is a false touch operation. When a user holds the terminal, some fingers may rest on the touch area and produce false touches; the area where these occur is the false touch area, which represents the part of the touch area the user is likely to contact while merely holding the terminal. Once the false touch area is determined, the terminal can be controlled not to respond to touch operations inside it, i.e. touch operations in the false touch area are treated as false touch operations.
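A minimal sketch of the false touch area check, assuming the learned areas are stored as axis-aligned rectangles:

```python
def is_false_touch(touch_point, false_touch_areas):
    """Return True if the touch falls inside an area the user tends to touch
    unintentionally while merely holding the terminal.

    `false_touch_areas` is assumed to be a list of rectangles
    (x_min, y_min, x_max, y_max) learned from the user's habit data.
    """
    x, y = touch_point
    return any(x_min <= x <= x_max and y_min <= y <= y_max
               for (x_min, y_min, x_max, y_max) in false_touch_areas)
```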
In some embodiments of the present disclosure, the usage habit data includes: first posture data and/or second posture data describing how the terminal is held, the first posture data and the second posture data corresponding to different holding hands. The terminal is used to control a control identifier displayed on a target device, and controlling the terminal according to the target control parameter includes: determining the display orientation of the control identifier on the target device according to the first posture data and/or the second posture data. The control identifier has a display orientation, i.e. the direction its shape points when shown on the target device, which is related to how the terminal is tilted. The first posture data and the second posture data may describe the tilt or deflection angle at which the user habitually holds the terminal. For example, a person holding a remote-controller-like terminal in the right hand may unconsciously rotate it slightly clockwise or counterclockwise rather than holding it level; the control identifier would then be rotated on the target device according to the angle between the terminal and the horizontal plane, even though the user does not actually intend to change its display orientation. Therefore, in some embodiments of the present application, the display orientation of the control identifier on the target device is determined from the first posture data and/or the second posture data, so that the involuntary rotation introduced by the user's grip is not responded to and the display of the control identifier better matches the user's expectation; the posture data effectively provides a correction angle, and the display orientation of the terminal's control identifier on the target device is determined according to that correction angle.
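For illustration, the correction could be as simple as subtracting the habitual tilt recorded in the posture data; the angle convention and field names below are assumptions.

```python
def corrected_display_orientation(measured_tilt_deg, posture_data, holding_hand):
    """Correct the control identifier's display orientation for the user's grip.

    `posture_data` is assumed to map "left"/"right" to the tilt angle (degrees)
    at which that hand habitually holds the terminal; subtracting it removes the
    involuntary rotation so only deliberate rotation changes the orientation.
    """
    habitual_tilt = posture_data.get(holding_hand, 0.0)
    return measured_tilt_deg - habitual_tilt
```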
In some embodiments of the present disclosure, the usage habit data includes: first sensitivity data and/or second sensitivity data, the first sensitivity data and the second sensitivity data corresponding to different holding hands. The terminal is used to control a control identifier displayed on a target device, and controlling the terminal according to the target control parameter includes: determining, according to the first sensitivity data and/or the second sensitivity data, the sensitivity with which the terminal controls the control identifier based on the hand holding the terminal.
In some embodiments, the first sensitivity data and the second sensitivity data correspond to the user's left and right hands respectively, and characterize the sensitivity with which the user controls the control identifier on the target device when operating the terminal with each hand. For example, when the terminal is a remote controller, the control identifier may be a pointer (mouse cursor) on a television; the pointer sensitivity differs depending on whether the user uses the left or the right hand, so matching it to the hand in use improves the user experience. In some embodiments, the sensitivity may be the correspondence between the movement distance of the terminal and the movement distance of the control identifier.
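A minimal sketch of hand-dependent pointer sensitivity, assuming the sensitivity data stores a movement gain per holding hand:

```python
def pointer_movement(terminal_movement_mm, holding_hand, sensitivity_data):
    """Map terminal movement to control-identifier movement on the target device.

    `sensitivity_data` is assumed to map "left"/"right" to a pixels-per-millimetre
    gain learned from that hand's habit data, i.e. the correspondence between the
    terminal's movement distance and the identifier's movement distance.
    """
    gain = sensitivity_data.get(holding_hand, 1.0)
    return terminal_movement_mm * gain
```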
In some embodiments of the present disclosure, determining the target control parameter based on the usage habit data of the user according to the user identity information includes: building a usage preference model of the user from the usage habit data based on a machine learning method; and determining the target control parameter according to the user identity information and the usage preference model. In some embodiments, the preference model may be a neural network model, for example a convolutional neural network or a residual neural network. Continuously optimizing the model with newly collected usage habit data allows the terminal to be adjusted dynamically so that it keeps matching the user's usage habits.
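As a deliberately tiny stand-in for such a model (the patent mentions convolutional or residual neural networks; the linear model below only illustrates the fit/predict workflow from habit features to control parameters, and all names are assumptions):

```python
import numpy as np

class UsagePreferenceModel:
    """Toy usage preference model: linear regression trained by gradient descent
    from habit features (e.g. background vibration level, false touch rate) to
    control parameters (e.g. thresholds, sensitivities)."""

    def __init__(self, n_features, n_params, lr=0.01):
        self.w = np.zeros((n_features, n_params))
        self.b = np.zeros(n_params)
        self.lr = lr

    def fit(self, habit_features, control_params, epochs=200):
        X = np.asarray(habit_features, dtype=float)
        Y = np.asarray(control_params, dtype=float)
        for _ in range(epochs):
            pred = X @ self.w + self.b
            grad = pred - Y  # gradient of 0.5 * squared error
            self.w -= self.lr * X.T @ grad / len(X)
            self.b -= self.lr * grad.mean(axis=0)
        return self

    def predict(self, habit_features):
        # Return the control parameters predicted for the given habit features.
        return np.asarray(habit_features, dtype=float) @ self.w + self.b
```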
Some embodiments of the present disclosure also provide a control apparatus of a terminal, including:
the identity recognition unit is used for determining the user identity information of the current user;
the parameter determining unit is used for determining a target control parameter based on the usage habit data of the user according to the user identity information, wherein the usage habit data comprises: first vibration data and/or second vibration data, wherein the first vibration data is associated with a state in which the terminal does not receive a control operation, and the second vibration data is associated with a state in which the terminal receives a control operation;
and the control unit is used for controlling the terminal according to the target control parameter.
In some embodiments, the controlling unit controls the terminal according to the target control parameter, including:
controlling the terminal according to the first vibration data so that the terminal does not respond to vibration with a vibration amplitude smaller than the first vibration amplitude under the condition that the terminal does not receive the control operation, wherein the first vibration amplitude is associated with the first vibration data;
and/or controlling the terminal according to the second vibration data so that the terminal does not respond to the vibration with the vibration amplitude smaller than the second vibration amplitude under the condition of receiving the control operation, wherein the second vibration amplitude is associated with the second vibration data.
In some embodiments, the usage habit data further comprises: false touch data;
the control unit controls the terminal according to the target control parameter by: adjusting, according to the false touch data, the detection sensitivity with which the terminal detects touch operations.
In some embodiments, the false touch data includes false touch data for different directions or for different areas of the terminal's touch area, and the control unit adjusts the detection sensitivity of the terminal in those directions or areas according to the false touch data.
In some embodiments, the usage habit data comprises: wake-up operation data, used to identify operational characteristics of the wake-up operation performed when the terminal is shaken awake;
the control unit controls the terminal according to the target control parameter by: adjusting the wake-up judgment condition of the terminal according to the wake-up operation data.
In some embodiments, the usage habit data comprises: a first sliding distance associated with click operations and/or a second sliding distance associated with slide operations;
the control unit controls the terminal according to the target control parameter by: adjusting the terminal's judgment conditions for click and slide operations according to the first sliding distance and/or the second sliding distance.
In some embodiments, the usage habit data comprises: a false touch area in the terminal's touch area, namely an area where false touch actions occur; the control unit controls the terminal according to the target control parameter by: determining, according to the false touch area, whether a detected touch operation is a false touch operation.
In some embodiments, the usage habit data comprises: first posture data and/or second posture data describing how the terminal is held, the first posture data and the second posture data corresponding to different holding hands; the terminal is used for controlling a control identifier displayed on a target device, and the control unit controls the terminal according to the target control parameter by: determining the display orientation of the control identifier on the target device according to the first posture data and/or the second posture data.
In some embodiments, the usage habit data comprises: first sensitivity data and/or second sensitivity data, the first sensitivity data and the second sensitivity data corresponding to different holding hands; the terminal is used for controlling a control identifier displayed on a target device, and the control unit controls the terminal according to the target control parameter by: determining, according to the first sensitivity data and/or the second sensitivity data, the sensitivity with which the terminal controls the control identifier based on the hand holding the terminal.
In some embodiments, the parameter determining unit determines the target control parameter based on the usage habit data of the user according to the user identity information by: building a usage preference model of the user from the usage habit data based on a machine learning method; and determining the target control parameter according to the user identity information and the usage preference model.
For the embodiments of the apparatus, since they correspond substantially to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described apparatus embodiments are merely illustrative, wherein the modules described as separate modules may or may not be separate. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The method and apparatus of the present disclosure have been described above based on the embodiments and application examples. In addition, the present disclosure also provides a terminal and a storage medium, which are described below.
Referring now to fig. 2, a schematic diagram of an electronic device (e.g., a terminal device or server) 800 suitable for use in implementing embodiments of the present disclosure is shown. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in the drawings is only an example and should not bring any limitation to the functions and use range of the embodiments of the present disclosure.
The electronic device 800 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 801 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 802 or a program loaded from a storage means 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data necessary for the operation of the electronic apparatus 800 are also stored. The processing apparatus 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to bus 804.
Generally, the following devices may be connected to the I/O interface 805: input devices 806 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 807 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, and the like; storage 808 including, for example, magnetic tape, hard disk, etc.; and a communication device 809. The communication means 809 may allow the electronic device 800 to communicate wirelessly or by wire with other devices to exchange data. While the figure illustrates an electronic device 800 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication means 809, or installed from the storage means 808, or installed from the ROM 802. The computer program, when executed by the processing apparatus 801, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may interconnect with any form or medium of digital data communication (e.g., a communications network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the methods of the present disclosure as described above.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The name of a unit does not, in some cases, constitute a limitation on the unit itself.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, there is provided a control method of a terminal, including:
determining user identity information of a current user;
determining a target control parameter based on usage habit data of the user according to the user identity information, wherein the usage habit data comprises: first vibration data and/or second vibration data, wherein the first vibration data is associated with a state in which the terminal does not receive a control operation, and the second vibration data is associated with a state in which the terminal receives a control operation;
and controlling the terminal according to the target control parameter.
According to one or more embodiments of the present disclosure, there is provided a control method of a terminal, wherein controlling the terminal according to the target control parameter includes:
controlling the terminal according to the first vibration data so that the terminal does not respond to vibration with vibration amplitude smaller than a first vibration amplitude under the condition that the terminal does not receive a control operation, wherein the first vibration amplitude is associated with the first vibration data;
and/or controlling the terminal according to the second vibration data so that the terminal does not respond to vibration with vibration amplitude smaller than second vibration amplitude under the condition of receiving control operation, wherein the second vibration amplitude is associated with the second vibration data.
According to one or more embodiments of the present disclosure, there is provided a control method of a terminal, the usage habit data further including: false touch data;
controlling the terminal according to the target control parameter comprises: and adjusting the detection sensitivity of the terminal for detecting the touch operation according to the false touch data.
According to one or more embodiments of the present disclosure, there is provided a control method of a terminal, where the mis-touch data includes mis-touch data in different directions or in different areas of a touch area of the terminal;
and adjusting the detection sensitivity of the terminal in different directions or different areas according to the false touch data.
According to one or more embodiments of the present disclosure, there is provided a control method of a terminal, the usage habit data including: awakening operation data, wherein the awakening operation data is used for identifying operational characteristics of the awakening operation performed when the terminal is shaken awake;
controlling the terminal according to the target control parameter comprises: and adjusting the awakening judgment condition of the terminal according to the awakening operation data.
According to one or more embodiments of the present disclosure, there is provided a control method of a terminal, the usage habit data including: the first sliding distance associated with the click operation and/or the second sliding distance associated with the sliding operation;
controlling the terminal according to the target control parameter comprises: and adjusting the judgment conditions of the terminal for the clicking operation and the sliding operation according to the first sliding distance and/or the second sliding distance.
According to one or more embodiments of the present disclosure, there is provided a control method of a terminal, the usage habit data including: a false touch area in the terminal touch area, namely an area where a false touch action occurs;
controlling the terminal according to the target control parameter comprises: and determining whether the detected touch operation is the false touch operation according to the false touch area.
According to one or more embodiments of the present disclosure, there is provided a control method of a terminal, the usage habit data including: first posture data and/or second posture data when the terminal is held, wherein the first posture data and the second posture data correspond to different holding hands;
the terminal is used for controlling the control identifier displayed on the target equipment, and the control of the terminal according to the target control parameter comprises the following steps: and determining the display orientation of the control mark on the target equipment according to the first posture data and/or the second posture data.
According to one or more embodiments of the present disclosure, there is provided a control method of a terminal, the usage habit data including: first sensitivity data and/or second sensitivity data, the first sensitivity data and the second sensitivity data corresponding to different holding hands;
the terminal is used for controlling the control identifier displayed on the target equipment, and the control of the terminal according to the target control parameter comprises the following steps: and determining the sensitivity of the terminal for controlling the control identifier based on a holding hand when the terminal is held according to the first sensitivity data and/or the second sensitivity data.
According to one or more embodiments of the present disclosure, there is provided a control method of a terminal, wherein determining the target control parameter based on the usage habit data of the user according to the user identity information includes:
establishing a use preference model of the user based on a machine learning method according to the use habit data of the user;
determining the target control parameter according to the user identity information and the usage preference model.
According to one or more embodiments of the present disclosure, there is provided a control apparatus of a terminal, including:
the identity recognition unit is used for determining the user identity information of the current user;
a parameter determining unit, configured to determine a target control parameter based on usage habit data of the user according to the user identity information, wherein the usage habit data comprises: first vibration data and/or second vibration data, wherein the first vibration data is associated with a state in which the terminal does not receive a control operation, and the second vibration data is associated with a state in which the terminal receives a control operation;
and the control unit is used for controlling the terminal according to the target control parameter.
According to one or more embodiments of the present disclosure, there is provided a terminal including: at least one memory and at least one processor;
wherein the at least one memory is configured to store program code, and the at least one processor is configured to call the program code stored in the at least one memory to perform the method of any one of the above.
According to one or more embodiments of the present disclosure, there is provided a storage medium for storing program code for performing the above-described method.
The foregoing description is only exemplary of the preferred embodiments of the disclosure and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure herein is not limited to the particular combination of features described above, but also encompasses other embodiments in which any combination of the features described above or their equivalents does not depart from the spirit of the disclosure. For example, a technical solution may be formed by replacing the above features with (but not limited to) features disclosed in this disclosure that have similar functions.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (13)

1. A control method of a terminal, comprising:
determining user identity information of a current user;
determining a target control parameter based on usage habit data of the user according to the user identity information, wherein the usage habit data comprises: first vibration data and/or second vibration data, wherein the first vibration data is associated with a state in which the terminal does not receive a control operation, and the second vibration data is associated with a state in which the terminal receives a control operation;
and controlling the terminal according to the target control parameter.
2. The method of claim 1, wherein controlling the terminal according to the target control parameter comprises:
controlling the terminal according to the first vibration data so that the terminal does not respond to vibration with vibration amplitude smaller than a first vibration amplitude under the condition that the terminal does not receive a control operation, wherein the first vibration amplitude is associated with the first vibration data;
and/or,
and controlling the terminal according to the second vibration data so that the terminal does not respond to the vibration with the vibration amplitude smaller than the second vibration amplitude under the condition of receiving the control operation, wherein the second vibration amplitude is associated with the second vibration data.
3. The method of claim 1,
the usage habit data further comprises: false touch data;
controlling the terminal according to the target control parameter comprises: adjusting, according to the false touch data, the detection sensitivity with which the terminal detects a touch operation.
4. The method of claim 3,
the false touch data comprises false touch data for different directions or different areas of a touch area of the terminal;
and adjusting the detection sensitivity of the terminal in the different directions or the different areas according to the false touch data.
5. The method of claim 1,
the usage habit data comprises: wake-up operation data, wherein the wake-up operation data identifies operation characteristics of a wake-up operation that wakes up the terminal by shaking;
controlling the terminal according to the target control parameter comprises: adjusting a wake-up judgment condition of the terminal according to the wake-up operation data.
6. The method of claim 1,
the usage habit data comprises: a first sliding distance associated with a click operation and/or a second sliding distance associated with a sliding operation;
controlling the terminal according to the target control parameter comprises: adjusting the judgment conditions of the terminal for the click operation and the sliding operation according to the first sliding distance and/or the second sliding distance.
7. The method of claim 1,
the usage habit data comprises: a false touch area within a touch area of the terminal, the false touch area being an area in which false touch actions occur;
controlling the terminal according to the target control parameter comprises: determining, according to the false touch area, whether a detected touch operation is a false touch operation.
8. The method of claim 1,
the usage habit data comprises: first posture data and/or second posture data of the terminal when held, the first posture data and the second posture data corresponding to different holding hands;
the terminal is used to control a control identifier displayed on a target device, and controlling the terminal according to the target control parameter comprises: determining a display orientation of the control identifier on the target device according to the first posture data and/or the second posture data.
9. The method of claim 1,
the usage habit data comprises: first sensitivity data and/or second sensitivity data, the first sensitivity data and the second sensitivity data corresponding to different holding hands;
the terminal is used to control a control identifier displayed on a target device, and controlling the terminal according to the target control parameter comprises: determining, according to the first sensitivity data and/or the second sensitivity data, the sensitivity with which the terminal controls the control identifier based on the hand holding the terminal.
10. The method of claim 1, wherein determining a target control parameter based on usage habit data of the user according to the user identity information comprises:
building a usage preference model of the user by a machine learning method according to the usage habit data of the user;
determining the target control parameter according to the user identity information and the usage preference model.
11. A control apparatus of a terminal, comprising:
the identity recognition unit is used for determining the user identity information of the current user;
a parameter determining unit, configured to determine a target control parameter based on usage habit data of a user according to the user identity information, wherein the usage habit data comprises: first vibration data and/or second vibration data, the first vibration data being associated with a state in which the terminal is not receiving a control operation, and the second vibration data being associated with a state in which the terminal is receiving a control operation;
and the control unit is used for controlling the terminal according to the target control parameter.
12. A terminal, comprising:
at least one memory and at least one processor;
wherein the at least one memory is configured to store program code and the at least one processor is configured to invoke the program code stored in the at least one memory to perform the method of any of claims 1 to 10.
13. A storage medium for storing program code for performing the method of any one of claims 1 to 10.
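
As an illustrative aid to the claims above, the following Python sketch shows one possible reading of the flow in claims 1, 2 and 10: look up the identified user's usage habit data, derive per-user vibration thresholds from it (a deliberately simple stand-in for the machine-learned usage preference model of claim 10), and decide whether the terminal should respond to a given vibration. All identifiers, the threshold rule, and the 1.5 margin are hypothetical choices for illustration and are not taken from the patent.

from dataclasses import dataclass
from statistics import mean
from typing import Dict, List, Optional

@dataclass
class UsageHabitData:
    # Vibration amplitudes observed while the terminal was not receiving a control operation.
    idle_vibration_samples: List[float]
    # Vibration amplitudes observed while the user was actively operating the terminal.
    active_vibration_samples: List[float]

@dataclass
class TargetControlParameters:
    first_vibration_threshold: float   # ignore weaker vibration when idle (claim 2, first branch)
    second_vibration_threshold: float  # ignore weaker vibration during a control operation (second branch)

def determine_target_parameters(user_id: str,
                                habit_store: Dict[str, UsageHabitData]) -> Optional[TargetControlParameters]:
    """Derive per-user thresholds from stored habit data (stand-in for the claim-10 preference model)."""
    habits = habit_store.get(user_id)
    if habits is None:
        return None  # unknown user: the terminal would fall back to its default parameters
    margin = 1.5  # hypothetical safety factor; a trained model would learn this per user
    return TargetControlParameters(
        first_vibration_threshold=margin * mean(habits.idle_vibration_samples),
        second_vibration_threshold=margin * mean(habits.active_vibration_samples),
    )

def should_respond_to_vibration(amplitude: float,
                                receiving_control_operation: bool,
                                params: TargetControlParameters) -> bool:
    """Claim 2: ignore vibration below the threshold for the terminal's current state."""
    threshold = (params.second_vibration_threshold if receiving_control_operation
                 else params.first_vibration_threshold)
    return amplitude >= threshold

# Example: a user whose idle terminal sits on a vibrating surface gets a higher idle threshold.
store = {"user-42": UsageHabitData(idle_vibration_samples=[0.20, 0.30, 0.25],
                                   active_vibration_samples=[0.05, 0.08, 0.06])}
params = determine_target_parameters("user-42", store)
if params is not None:
    print(should_respond_to_vibration(0.30, receiving_control_operation=False, params=params))  # False
    print(should_respond_to_vibration(0.50, receiving_control_operation=False, params=params))  # True

In a real terminal the thresholds would be produced by the trained usage preference model and applied by the control unit of claim 11, rather than computed from hard-coded samples.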
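
In the same spirit, a minimal sketch for claim 6, assuming a simple midpoint rule (not stated in the patent) for placing the boundary between a click operation and a sliding operation based on the user's habitual distances; the function name and values are illustrative only.

def classify_touch(displacement: float,
                   first_sliding_distance: float,
                   second_sliding_distance: float) -> str:
    """Claim 6 (illustrative): use the user's habitual distances to set the click/slide boundary.

    first_sliding_distance:  typical finger drift the user produces during an intended click
    second_sliding_distance: typical distance of the user's intentional slide gestures
    """
    # Hypothetical rule: place the decision boundary midway between the two habitual distances.
    boundary = (first_sliding_distance + second_sliding_distance) / 2.0
    return "slide" if displacement >= boundary else "click"

# A user who drifts ~3 px on clicks and slides ~40 px gets a ~21.5 px boundary.
print(classify_touch(10.0, first_sliding_distance=3.0, second_sliding_distance=40.0))  # click
print(classify_touch(30.0, first_sliding_distance=3.0, second_sliding_distance=40.0))  # slide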
CN202110500916.3A 2021-05-08 2021-05-08 Terminal control method and device, terminal and storage medium Active CN113093980B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110500916.3A CN113093980B (en) 2021-05-08 2021-05-08 Terminal control method and device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110500916.3A CN113093980B (en) 2021-05-08 2021-05-08 Terminal control method and device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN113093980A true CN113093980A (en) 2021-07-09
CN113093980B CN113093980B (en) 2023-02-21

Family

ID=76664714

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110500916.3A Active CN113093980B (en) 2021-05-08 2021-05-08 Terminal control method and device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN113093980B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130013607A1 (en) * 2011-07-07 2013-01-10 Daniel Allan Mooney Systems, methods, and media for correlating objects according to relationships
CN105607741A (en) * 2015-12-31 2016-05-25 联想(北京)有限公司 Control method and electronic equipment
CN107357465A (en) * 2017-07-26 2017-11-17 深圳天珑无线科技有限公司 Information processing method, device and nonvolatile computer storage media
CN107506634A (en) * 2017-07-31 2017-12-22 广东欧珀移动通信有限公司 Display methods, device, storage medium and the terminal of data

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113741749A (en) * 2021-08-27 2021-12-03 北京字节跳动网络技术有限公司 Cursor position updating method and device and electronic equipment
CN117153163A (en) * 2023-10-08 2023-12-01 桂林航天工业学院 Hand rehabilitation method, system, storage medium and terminal based on voice interaction

Also Published As

Publication number Publication date
CN113093980B (en) 2023-02-21

Similar Documents

Publication Publication Date Title
WO2019174611A1 (en) Application configuration method and mobile terminal
US10866706B2 (en) Electronic device for displaying application and operating method thereof
WO2019105227A1 (en) Application icon display method, terminal, and computer readable storage medium
CN113093980B (en) Terminal control method and device, terminal and storage medium
CN109428969A (en) Edge touch control method, device and the computer readable storage medium of double screen terminal
CN105549878A (en) Electronic book page turning control method and device
CN111104980B (en) Method, device, equipment and storage medium for determining classification result
CN108984066B (en) Application icon display method and mobile terminal
CN111738365B (en) Image classification model training method and device, computer equipment and storage medium
CN111835916B (en) Training method and device of attitude detection model and detection method and device of terminal attitude
CN110890969B (en) Method and device for mass-sending message, electronic equipment and storage medium
CN109302563B (en) Anti-shake processing method and device, storage medium and mobile terminal
CN107967086B (en) Icon arrangement method and device for mobile terminal and mobile terminal
CN113032172B (en) Abnormality detection method and device and electronic equipment
CN113342170A (en) Gesture control method, device, terminal and storage medium
CN112770003A (en) Method, device, terminal and storage medium for controlling electronic equipment
US9189151B2 (en) Pre-emptive CPU activation from touch input
CN111324247A (en) Information display method and electronic equipment
CN111026955A (en) Application program recommendation method and electronic equipment
CN107930126B (en) Game reservation data processing method and device and mobile terminal
CN113253847B (en) Terminal control method, device, terminal and storage medium
CN113766293B (en) Information display method, device, terminal and storage medium
CN111263084B (en) Video-based gesture jitter detection method, device, terminal and medium
CN110908732B (en) Application task deleting method and electronic equipment
CN113055524A (en) Terminal control method and device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee after: Douyin Vision Co.,Ltd.

Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee before: Tiktok vision (Beijing) Co.,Ltd.

Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee after: Tiktok vision (Beijing) Co.,Ltd.

Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee before: BEIJING BYTEDANCE NETWORK TECHNOLOGY Co.,Ltd.