CN107291242B - Intelligent terminal control method and intelligent terminal - Google Patents

Intelligent terminal control method and intelligent terminal

Info

Publication number
CN107291242B
CN107291242B CN201710523540.1A CN201710523540A
Authority
CN
China
Prior art keywords
user
data
motion
behavior event
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710523540.1A
Other languages
Chinese (zh)
Other versions
CN107291242A (en)
Inventor
彭兰华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201710523540.1A
Publication of CN107291242A
Application granted
Publication of CN107291242B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 Sound input; Sound output
    • G06F 3/165 Management of the audio stream, e.g. setting of volume, audio stream path
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/02 Constructional features of telephone sets
    • H04M 1/21 Combinations with auxiliary equipment, e.g. with clocks or memoranda pads
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72409 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M 1/72412 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

An embodiment of the invention provides a control method for an intelligent terminal, and an intelligent terminal. The method comprises the following steps: receiving sensing data sent by an intelligent wearable device, the intelligent terminal having established a communication connection with the intelligent wearable device in advance; identifying a user behavior event of the intelligent wearable device user according to the sensing data; and, if the user behavior event is a target behavior event, executing the operation associated with the target behavior event. The embodiment of the invention thus enables intelligent control of the intelligent terminal in combination with the intelligent wearable device, and makes the combined functionality of the intelligent wearable device and the intelligent terminal richer.

Description

Intelligent terminal control method and intelligent terminal
Technical Field
The present invention relates to the field of communications technologies, and in particular to a control method for an intelligent terminal, an intelligent terminal, and a computer-readable storage medium.
Background
With the development of mobile communication technology, the intelligent terminal has become an indispensable tool in people's daily life and work, and the combined use of intelligent terminals and intelligent wearable devices has brought further convenience. At present, the scenarios in which an intelligent terminal and an intelligent wearable device are used together are mostly conventional ones, for example: a smart bracelet tracks the user's daily activity, sleep and eating habits, synchronizes the data to the intelligent terminal, and helps the user understand and improve his or her health. However, in the prior art, control of the operating state and operating mode of the intelligent terminal is usually performed manually by the user, for example: when a user wants to listen to the radio while driving, the radio application of the intelligent terminal must be started manually before the intelligent terminal can play the broadcast signal received through that application. The existing intelligent terminal therefore suffers from a control method that is not intelligent enough and from limited functionality when combined with intelligent wearable devices.
Disclosure of Invention
Embodiments of the invention provide a control method for an intelligent terminal, and an intelligent terminal, so as to solve the problems that the control method of the existing intelligent terminal is not intelligent enough and that its functionality when combined with an intelligent wearable device is limited.
In order to solve the above technical problem, the invention is implemented as follows: a control method for an intelligent terminal, applied to the intelligent terminal, the method comprising the following steps:
receiving sensing data sent by intelligent wearable equipment, wherein the intelligent terminal is in communication connection with the intelligent wearable equipment in advance;
identifying a user behavior event of the intelligent wearable device user according to the sensing data;
and if the user behavior event is a target behavior event, executing the operation associated with the target behavior event.
An embodiment of the present invention further provides an intelligent terminal, including:
a receiving module, configured to receive sensing data sent by an intelligent wearable device, wherein the intelligent terminal has established a communication connection with the intelligent wearable device in advance;
an identification module, configured to identify a user behavior event of the intelligent wearable device user according to the sensing data; and
an execution module, configured to execute an operation associated with a target behavior event if the user behavior event is the target behavior event.
An embodiment of the present invention further provides an intelligent terminal, including: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the control method of the intelligent terminal in the embodiments of the invention.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps in the control method of the intelligent terminal in the embodiment of the present invention are implemented.
In the embodiment of the invention, the intelligent terminal can determine the user's behavior event from the sensing data sent by the intelligent wearable device and execute the operation associated with that behavior event, so that intelligent control of the intelligent terminal is achieved in combination with the intelligent wearable device, and the combined functionality of the intelligent wearable device and the intelligent terminal is richer.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flowchart of a control method of an intelligent terminal according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of another control method for an intelligent terminal according to an embodiment of the present invention;
fig. 3 is a schematic flowchart of another control method for an intelligent terminal according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an intelligent terminal according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of another intelligent terminal provided in the embodiment of the present invention;
fig. 6 is a schematic structural diagram of another intelligent terminal provided in the embodiment of the present invention;
fig. 7 is a schematic structural diagram of another intelligent terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a schematic flowchart of a control method of an intelligent terminal according to an embodiment of the present invention, applied to an intelligent terminal. As shown in fig. 1, the method includes the following steps:
Step 101, receiving sensing data sent by an intelligent wearable device, wherein the intelligent terminal has established a communication connection with the intelligent wearable device in advance.
An intelligent wearable device is the general term for a wearable device developed by applying wearable technology to the intelligent design of everyday accessories; it may be, for example, a smart watch, a smart bracelet or smart glasses. The sensing data may be data such as the user's daily behavior or physiological state detected by the intelligent wearable device through built-in sensors, for example: the user's motion state and motion data, or the user's heart rate and blood pressure.
The intelligent wearable device can send the collected sensing data to the intelligent terminal, where the intelligent terminal and the intelligent wearable device have established a communication connection in advance, for example: the intelligent terminal and the intelligent wearable device establish the communication connection through Bluetooth, a Wireless Local Area Network (WLAN) or Near Field Communication (NFC). In this way, the intelligent terminal can receive the sensing data collected by the intelligent wearable device in real time, so as to judge the user's behavior event according to the sensing data.
It should be noted that the intelligent terminal may set a time period for monitoring the intelligent wearable device, that is, the intelligent terminal receives, within a preset monitoring time period, the sensing data collected and sent by the intelligent wearable device during that period. This can meet the diversified requirements of users and save system resources of the intelligent terminal.
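A minimal sketch of how such a monitoring window might be enforced is shown below; the window bounds, the on_sensing_data callback and the handle_sensing_data stub are all hypothetical, since the text does not prescribe a concrete implementation.

```python
from datetime import datetime, time
from typing import Optional

# Hypothetical monitoring window; the text leaves the concrete period to the
# terminal's configuration, so 07:00-08:00 is only a placeholder.
MONITORING_START = time(7, 0)
MONITORING_END = time(8, 0)

def within_monitoring_period(now: Optional[datetime] = None) -> bool:
    """Return True if data arriving now falls inside the preset monitoring time period."""
    now = now or datetime.now()
    return MONITORING_START <= now.time() <= MONITORING_END

def handle_sensing_data(payload: dict) -> None:
    """Stand-in for the behavior-event recognition step (step 102)."""
    print("processing sensing data:", payload)

def on_sensing_data(payload: dict) -> None:
    """Callback invoked when the wearable pushes sensing data over the existing connection."""
    if not within_monitoring_period():
        return  # outside the preset window: ignore, saving the terminal's system resources
    handle_sensing_data(payload)
```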
Step 102, identifying a user behavior event of the intelligent wearable device user according to the sensing data.
The intelligent wearable device user is the user wearing the intelligent wearable device, and the user behavior event is the activity the user is currently performing, for example: brushing teeth, running, riding or sleeping. In this step, the user behavior event is identified according to the sensing data received in step 101, for example: the received sensing data includes motion frequency and motion amplitude data, and if the motion frequency and motion amplitude fall within certain specific ranges that match the characteristics of tooth brushing, the user behavior event can be judged to be a brushing event. In this way, by analysing the received sensing data, the user behavior event of the intelligent wearable device user can be determined, so that the corresponding associated operation can be executed according to that event.
It should be noted that the intelligent terminal may preset user behavior identification rules, for example: a motion data characteristic may be set for the brushing event, where the motion data of the brushing event comprises a motion frequency and a motion amplitude, so that whether the user behavior event is a brushing event can be determined by judging whether the received sensing data falls within the preset motion frequency and motion amplitude ranges of the brushing event. The intelligent terminal can set identification rules for various user behavior events, such as a brushing event, a running event or a riding event, so that different behavior events of the user can be recognized and different associated operations performed accordingly.
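The identification rules described above could be held in a simple lookup table; the sketch below assumes the brushing ranges given later in this description (150-240 movements per minute, 1-10 cm amplitude), while the running and riding rows and all field names are made-up placeholders.

```python
from typing import Optional

# Preset identification rules: each event lists the motion-data ranges it must satisfy.
# Only the brushing ranges come from this description; the other rows are illustrative.
BEHAVIOR_RULES = {
    "brushing": {"frequency_per_min": (150, 240), "amplitude_cm": (1, 10)},
    "running":  {"frequency_per_min": (140, 200), "speed_m_per_s": (2.0, 6.0)},
    "riding":   {"frequency_per_min": (50, 110),  "speed_m_per_s": (3.0, 12.0)},
}

def identify_behavior(sensing_data: dict) -> Optional[str]:
    """Return the first preset event whose every range covers the received sensing data."""
    for event, ranges in BEHAVIOR_RULES.items():
        if all(key in sensing_data and low <= sensing_data[key] <= high
               for key, (low, high) in ranges.items()):
            return event
    return None
```

Under these assumed ranges, identify_behavior({"frequency_per_min": 200, "amplitude_cm": 4}) would return "brushing".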
Step 103, if the user behavior event is a target behavior event, executing an operation associated with the target behavior event.
The target behavior event may be a user behavior event preset by the intelligent terminal, and the intelligent terminal may preset a plurality of such events, for example: a brushing event, a running event and a riding event; as long as the identified user behavior event is one of these preset events, it is regarded as the target behavior event. The operation associated with the target behavior event may be broadcasting news, announcing the day's schedule, playing music, and so on. Thus, the intelligent terminal can identify the user behavior event from the sensing data collected by the intelligent wearable device and, when the user behavior event is determined to be the target behavior event, execute the operation associated with it, thereby achieving intelligent control of the intelligent terminal in combination with the intelligent wearable device.
It should be noted that the intelligent terminal may preset the execution conditions of the operations associated with target behavior events, for example: broadcast weather and news if the target behavior event is a brushing event, and play music if the target behavior event is a running event. In this way, the intelligent terminal can perform the associated operations intelligently according to the user's behavior, without manual operation by the user, which makes the intelligent terminal more intelligent; the combination of the intelligent wearable device and the intelligent terminal is then no longer limited to monitoring data such as the user's step count, heart rate or blood pressure, but enables intelligent control of the intelligent terminal by monitoring the user's behavior.
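A minimal sketch of such an association table follows; the brushing-to-news and running-to-music pairings come from the examples above, and the operation functions are placeholders rather than any real playback API.

```python
# Hypothetical association between preset target behavior events and their operations.
def broadcast_weather_and_news() -> None:
    print("Broadcasting today's weather and news ...")

def play_music() -> None:
    print("Starting music playback ...")

TARGET_EVENT_OPERATIONS = {
    "brushing": broadcast_weather_and_news,
    "running": play_music,
}

def execute_associated_operation(user_behavior_event: str) -> None:
    """Step 103: act only when the recognized event is one of the preset target events."""
    operation = TARGET_EVENT_OPERATIONS.get(user_behavior_event)
    if operation is not None:   # the event is a target behavior event
        operation()
```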
In the embodiment of the present invention, the intelligent terminal includes, but is not limited to, a mobile phone, a tablet personal computer, a notebook computer, a palmtop computer, a navigation device, a personal digital assistant (PDA), a mobile Internet device (MID), or the like.
In the embodiment of the invention, the intelligent terminal can determine the user's behavior event from the sensing data sent by the intelligent wearable device and execute the operation associated with that behavior event, so that intelligent control of the intelligent terminal is achieved in combination with the intelligent wearable device, and the combined functionality of the intelligent wearable device and the intelligent terminal is richer.
Referring to fig. 2, fig. 2 is a schematic flowchart of another control method of an intelligent terminal according to an embodiment of the present invention, applied to an intelligent terminal. This embodiment builds on the embodiment shown in fig. 1: the sensing data is further specified, and the step of identifying the user behavior event of the intelligent wearable device user according to the sensing data is explained. As shown in fig. 2, the method comprises the following steps:
step 201, receiving sensing data sent by an intelligent wearable device, wherein the intelligent terminal and the intelligent wearable device establish communication connection in advance, and the sensing data comprises user motion behavior data.
In this step, the sensing data includes user motion behavior data, where the user motion behavior data may include one or more of motion direction data, motion frequency data, motion amplitude data, motion speed data and motion angular speed data. The motion direction data may indicate up, down, left, right, forward, backward, clockwise rotation in the horizontal plane, counterclockwise rotation in the horizontal plane, clockwise rotation in the vertical plane, counterclockwise rotation in the vertical plane, and so on. The motion frequency data may be the number of times a certain identical or similar action is performed within a certain time, the motion amplitude data may be the range of motion in a certain direction, the motion speed data may be the distance moved within a certain time, and the motion angular speed data may be the angle swept within a certain time. In this way, the intelligent terminal can receive the user motion behavior data sent by the intelligent wearable device, so as to identify the user behavior event of the user according to that data.
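The kinds of user motion behavior data listed above might be grouped into a single record, as in the sketch below; the field names and units are assumptions, since the description only enumerates the categories of data.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UserMotionBehaviorData:
    """One sample of user motion behavior data pushed by the wearable.

    The field names and units are illustrative, not taken from the description.
    """
    direction: Optional[str] = None                  # e.g. "up", "down", "horizontal_clockwise"
    frequency_per_min: Optional[float] = None        # repetitions of the same action per minute
    amplitude_cm: Optional[float] = None             # range of the movement in a given direction
    speed_m_per_s: Optional[float] = None            # distance moved per unit time
    angular_speed_rad_per_s: Optional[float] = None  # angle swept per unit time
```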
The specific implementation of the step of receiving the sensing data sent by the intelligent wearable device may refer to the implementation of step 101 in the method embodiment shown in fig. 1, and is not described here again to avoid repetition.
Step 202, judging whether the user motion behavior data is within a preset motion data range, and if the user motion behavior data is within the preset motion data range, determining that the user behavior event is the target behavior event.
The preset motion data range may be the motion data range of a user behavior event preset by the intelligent terminal, for example: the motion frequency and motion amplitude ranges of a brushing event, or the motion speed and motion direction ranges of a running event; the intelligent terminal can set motion data ranges for different behavior events. In this step, it is determined whether the received user motion behavior data falls within the preset motion data range of some user behavior event; the user behavior event corresponding to the preset motion data range that the data matches is then determined, and that user behavior event is determined to be the target behavior event.
Optionally, the determining whether the user motion behavior data is within a preset motion data range, and if the user motion behavior data is within the preset motion data range, determining that the user behavior event of the user of the intelligent wearable device is the target behavior event, includes:
judging whether the motion frequency of the user of the intelligent wearable device is within a preset motion frequency range or not, and if the motion frequency is within the preset motion frequency range, determining that the user behavior event is a first target behavior event; or
Judging whether the motion amplitude of the user of the intelligent wearable device is within a preset motion amplitude range or not, and if the motion amplitude is within the preset motion amplitude range, determining that the user behavior event is a second target behavior event; or
Judging whether the motion frequency of the intelligent wearable device user is within a preset motion frequency range or not and whether the motion amplitude of the intelligent wearable device user is within a preset motion amplitude range or not, and if the motion frequency is within the preset motion frequency range and the motion amplitude is within the preset motion amplitude range, determining that the user behavior event is a third target behavior event.
The preset motion frequency range may be the preset motion frequency range of a certain user behavior event, the motion frequency may be the frequency of a certain motion characteristic, and the first target behavior event may be understood as a certain user behavior event preset by the intelligent terminal. For example: if the preset first target behavior event is a rope-skipping exercise event, and the motion frequency range of the rope-skipping event is 100-250 times per minute, then judging whether the motion frequency of the intelligent wearable device user is within the preset motion frequency range may be judging whether the user's rope-skipping frequency is 100-250 times per minute. If the motion frequency is within the preset motion frequency range, the user behavior event is determined to be the first target behavior event. In this way, the intelligent terminal can determine whether the user behavior event is the first target behavior event by recognizing the motion frequency of the user.
The preset motion amplitude range may be the preset motion amplitude range of a certain user behavior event, the motion amplitude may be the amplitude of a certain motion characteristic, and the second target behavior event may be understood as another user behavior event preset by the intelligent terminal. For example: if the preset second target behavior event is a slow-walking event, and the motion amplitude range of the slow-walking event is 40-70 centimeters, then judging whether the motion amplitude of the intelligent wearable device user is within the preset motion amplitude range may be judging whether the length of one slow step taken by the user is 40-70 centimeters. If the motion amplitude is within the preset motion amplitude range, the user behavior event is determined to be the second target behavior event. In this way, the intelligent terminal can determine whether the user behavior event is the second target behavior event by recognizing the motion amplitude of the user.
In this embodiment, the intelligent terminal may preset that the third target behavior event is characterized by both a motion frequency and a motion amplitude. For example: if the preset third target behavior event is a brushing event, with a motion frequency range of 150-240 times per minute and a motion amplitude range of 1-10 centimeters, then judging whether the motion frequency of the intelligent wearable device user is within the preset motion frequency range and whether the motion amplitude is within the preset motion amplitude range may be judging whether the user's arm moves up and down 150-240 times per minute and whether the amplitude of the up-and-down arm movement is 1-10 centimeters. If the motion frequency is within the preset motion frequency range and the motion amplitude is within the preset motion amplitude range, the user behavior event is determined to be the third target behavior event. In this way, the intelligent terminal can determine whether the user behavior event is the third target behavior event by recognizing both the motion frequency and the motion amplitude of the user.
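The three judgment branches above could be sketched as follows, using the illustrative figures from this description (rope skipping 100-250 times per minute, slow walking 40-70 cm per step, brushing 150-240 times per minute with 1-10 cm amplitude); none of these thresholds are normative, and the return labels are only placeholders.

```python
from typing import Optional

def classify_target_event(frequency_per_min: Optional[float] = None,
                          amplitude_cm: Optional[float] = None) -> Optional[str]:
    """Sketch of the three judgment branches, with the illustrative figures from the text."""
    # Third target behavior event: frequency AND amplitude must match (brushing example:
    # 150-240 arm movements per minute, 1-10 cm amplitude). Checked first because its
    # frequency range overlaps the rope-skipping range below.
    if (frequency_per_min is not None and 150 <= frequency_per_min <= 240
            and amplitude_cm is not None and 1 <= amplitude_cm <= 10):
        return "third target behavior event (brushing)"
    # First target behavior event: frequency alone (rope-skipping example: 100-250 per minute).
    if frequency_per_min is not None and 100 <= frequency_per_min <= 250:
        return "first target behavior event (rope skipping)"
    # Second target behavior event: amplitude alone (slow-walking example: 40-70 cm per step).
    if amplitude_cm is not None and 40 <= amplitude_cm <= 70:
        return "second target behavior event (slow walking)"
    return None
```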
It should be noted that the specific characteristics of the target behavior event may be specifically set by the intelligent terminal according to the attributes of the target behavior event, so that the intelligent terminal may determine, according to the specific motion characteristic data of the user, the specific target behavior event to which the user behavior event belongs.
Optionally, the third target behavior event is a brushing event.
The intelligent terminal can preset that the brushing event must satisfy two characteristics: the motion frequency of the intelligent wearable device user is within the preset motion frequency range, and the motion amplitude of the intelligent wearable device user is within the preset motion amplitude range. In this embodiment, if the user behavior event matches both characteristics, the user behavior event can be determined to be the brushing event. In this way, the intelligent terminal can execute the operation associated with the brushing event according to the judgment result, for example: broadcasting the weather or the news when the user is detected to be brushing his or her teeth.
Optionally, the sensing data further includes external environment data;
the judging whether the user motion behavior data is within a preset motion data range includes:
and judging whether the external environment data is predefined data representing a preset environment or not, and if the external environment data is the predefined data representing the preset environment, judging whether the user motion behavior data is within the range of the preset motion data or not.
The external environment data may include one or more of light intensity data, temperature data and sound data. The light intensity data may be the ambient light intensity collected by the intelligent wearable device, the temperature data may be the ambient temperature collected by the intelligent wearable device, and the sound data may be the ambient sound level, in decibels, collected by the intelligent wearable device. In this embodiment, before the user motion behavior data is judged, it is first determined whether the external environment data is predefined data representing a preset environment; the predefined data representing a preset environment may be, for example: a temperature below 10 ℃ predefined as a cold environment, or a sound level below 50 decibels predefined as a quiet environment.
When the external environment data is judged to be the predefined data representing the preset environment, the step of judging whether the user motion behavior data is within the preset motion data range is executed, so that the intelligent terminal can also decide, according to the characteristics of the external environment, whether to execute the operation associated with the user behavior event, thereby meeting the diversified requirements of users. For example: if the external environment data is judged to match the predefined data for cold weather, it is then judged whether the user motion behavior data is within the preset motion data range, and if the user behavior event is judged to be a brushing event, a voice reminder is output to remind the user that the weather is cold and to dress warmly.
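A minimal sketch of this environment pre-check, assuming the cold-weather (below 10 ℃) and quiet (below 50 dB) examples above; the field names and the reminder text are illustrative, and the motion judgment is passed in as a plain boolean.

```python
# Predefined environment data from the examples above: "cold" means below 10 °C and
# "quiet" means below 50 dB. Field names and the reminder text are placeholders.
PRESET_ENVIRONMENTS = {
    "cold":  lambda env: env.get("temperature_c", float("inf")) < 10,
    "quiet": lambda env: env.get("sound_db", float("inf")) < 50,
}

def judge_with_environment(environment: dict, motion_matches_brushing: bool) -> None:
    """Judge the external environment first; only then act on the motion-data judgment."""
    if not PRESET_ENVIRONMENTS["cold"](environment):
        return                        # environment pre-check failed: skip the motion step
    if motion_matches_brushing:       # e.g. the result of the brushing-range check above
        print("Voice reminder: it is cold today, remember to dress warmly.")
```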
Step 203, if the user behavior event is a target behavior event, executing an operation associated with the target behavior event.
The specific implementation of this step may refer to the implementation of step 103 in the method embodiment shown in fig. 1, and is not described here again to avoid repetition.
Optionally, if the user behavior event is a target behavior event, executing an operation associated with the target behavior event, including:
and if the user behavior event is the target behavior event, starting an application program associated with the target behavior event, and playing the multimedia content of the application program.
The application program may be an application program preset by the intelligent terminal and associated with the target behavior event, for example: the application program related to the tooth brushing event is preset to be a news broadcasting application program, and the application program related to the running event is preset to be a music playing application program. The multimedia content may be one or both of audio data and video data, so that if it is determined that the user behavior event is the target behavior event, an application program associated with the target behavior event may be started, and the multimedia content of the application program may be played.
In practical applications, when a user is engaged in certain sports, it is inconvenient to carry the intelligent terminal on his body or operate and view the displayed content, so that some specific user behavior events can be preset to be associated with some applications capable of playing audio or video, so that the user can still enjoy the information services provided by the intelligent terminal while performing some specific activities.
The foregoing embodiments are described below by way of an example in which a news broadcast operation is performed according to the user's brushing behavior event:
example 1:
the method comprises the steps that a preset monitoring time period is obtained, sensing data sent by the intelligent wearable device are received at the starting time of the monitoring time period, wherein the intelligent terminal is in communication connection with the intelligent wearable device in advance, and the sensing data can comprise motion frequency data and motion amplitude data of a user.
It is then judged whether the sensing data meets the preset motion data range of the preset brushing event, that is, whether the user's motion frequency data is within the preset motion frequency range and whether the user's motion amplitude data is within the preset motion amplitude range.
If the motion frequency data or the motion amplitude data of the user is not within the preset motion data range, it is judged that the action the user is currently performing is not a brushing event, and the intelligent terminal does not need to execute any associated operation.
If both the motion frequency data and the motion amplitude data of the user are within the preset motion data range, it is judged that the action the user is currently performing is a brushing event, and by acquiring the preset operation associated with the brushing event, the target operation to be executed is determined to be broadcasting news. Finally, the news broadcast application associated with the brushing event is started and its news content is broadcast. The specific flow of this example can be as shown in fig. 3.
In this embodiment, on the basis of the embodiment shown in fig. 1, the sensing data is further specified and the step of identifying the user behavior event of the intelligent wearable device user according to the sensing data is explained, so that the way in which the user behavior event is identified is clarified. In addition, this embodiment adds several optional implementations to the embodiment shown in fig. 1; these optional implementations can be combined with one another or implemented separately, and each of them achieves the technical effect of enabling intelligent control of the intelligent terminal in combination with the intelligent wearable device and making the combined functionality of the intelligent wearable device and the intelligent terminal richer.
Referring to fig. 4, fig. 4 is a schematic structural diagram of an intelligent terminal according to an embodiment of the present invention, and as shown in fig. 4, the intelligent terminal 400 includes:
the receiving module 401 is configured to receive sensing data sent by an intelligent wearable device, where the intelligent terminal and the intelligent wearable device establish a communication connection in advance;
an identifying module 402, configured to identify a user behavior event of the smart wearable device user according to the sensing data;
an executing module 403, configured to execute an operation associated with the target behavior event if the user behavior event is the target behavior event.
Optionally, the sensing data includes user motion behavior data;
the identification module 402 is configured to determine whether the user motion behavior data is within a preset motion data range, and if the user motion behavior data is within the preset motion data range, determine that the user behavior event is the target behavior event.
Optionally, as shown in fig. 5, the identifying module 402 includes:
the first judging unit 4021 is configured to judge whether a motion frequency of the user of the smart wearable device is within a preset motion frequency range, and if the motion frequency is within the preset motion frequency range, determine that the user behavior event is a first target behavior event; or
A second determining unit 4022, configured to determine whether a motion amplitude of the user of the smart wearable device is within a preset motion amplitude range, and if the motion amplitude is within the preset motion amplitude range, determine that the user behavior event is a second target behavior event; or
A third determining unit 4023, configured to determine whether the motion frequency of the smart wearable device user is within a preset motion frequency range, and whether the motion amplitude of the smart wearable device user is within a preset motion amplitude range, and if the motion frequency is within the preset motion frequency range and the motion amplitude is within the preset motion amplitude range, determine that the user behavior event is a third target behavior event.
Optionally, the third target behavior event is a brushing event.
Optionally, the sensing data further includes external environment data;
the identification module 402 is configured to determine whether the external environment data is predefined data representing a preset environment, and if the external environment data is the predefined data representing the preset environment, determine whether the user motion behavior data is within the preset motion data range.
Optionally, the executing module 403 is configured to start an application program associated with the target behavior event and play the multimedia content of the application program if the user behavior event is the target behavior event.
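The module structure of the intelligent terminal 400 might be sketched as follows; the class names, sample data and threshold values are illustrative stand-ins, not the patented implementation.

```python
from typing import Optional

class ReceivingModule:
    """Receives sensing data over the pre-established connection (stubbed with a sample)."""
    def receive(self) -> dict:
        return {"frequency_per_min": 180, "amplitude_cm": 5}

class IdentificationModule:
    """Maps sensing data to a user behavior event using preset ranges (illustrative values)."""
    def identify(self, data: dict) -> Optional[str]:
        if 150 <= data.get("frequency_per_min", 0) <= 240 and 1 <= data.get("amplitude_cm", 0) <= 10:
            return "brushing"
        return None

class ExecutionModule:
    """Executes the operation associated with a target behavior event."""
    def execute(self, event: str) -> None:
        if event == "brushing":       # one of the preset target behavior events
            print("Broadcasting weather and news ...")

class IntelligentTerminal:
    """Wiring that mirrors fig. 4: receiving -> identification -> execution."""
    def __init__(self) -> None:
        self.receiving = ReceivingModule()
        self.identification = IdentificationModule()
        self.execution = ExecutionModule()

    def run_once(self) -> None:
        event = self.identification.identify(self.receiving.receive())
        if event is not None:
            self.execution.execute(event)
```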
The intelligent terminal 400 provided in the embodiment of the present invention can implement each process implemented by the intelligent terminal in the method embodiments of fig. 1 to fig. 3, and details are not described herein again to avoid repetition. The intelligent terminal 400 can be combined with the intelligent wearable device to achieve intelligent control of the intelligent terminal, and the functionality obtained from combining it with the intelligent wearable device is richer.
Referring to fig. 6, fig. 6 is a structural diagram of another intelligent terminal according to the present invention, and as shown in fig. 6, an intelligent terminal 600 includes: at least one processor 601, memory 602, at least one network interface 604, and other user interfaces 603. The various components in the intelligent terminal 600 are coupled together by a bus system 605. It is understood that the bus system 605 is used to enable communications among the components. The bus system 605 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 605 in fig. 6.
The user interface 603 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, trackball, touch pad, or touch screen, etc.).
It will be appreciated that the memory 602 in embodiments of the invention may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM) or a flash memory. The volatile memory may be a Random Access Memory (RAM), which acts as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM) and Direct Rambus DRAM (DRDRAM). The memory 602 of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some embodiments, memory 602 stores the following elements, executable modules or data structures, or a subset thereof, or an expanded set thereof: an operating system 6021 and application programs 6022.
The operating system 6021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, and is used for implementing various basic services and processing hardware-based tasks. The application program 6022 includes various application programs such as a Media Player (Media Player), a Browser (Browser), and the like, and is used to implement various application services. A program implementing the method of an embodiment of the invention can be included in the application program 6022.
In this embodiment of the present invention, the intelligent terminal 600 further includes: a computer program stored in the memory 602 and executable on the processor 601, which may specifically be a computer program in the application programs 6022, and which, when executed by the processor 601, implements the following steps:
receiving sensing data sent by intelligent wearable equipment, wherein the intelligent terminal is in communication connection with the intelligent wearable equipment in advance;
identifying a user behavior event of the intelligent wearable device user according to the sensing data;
and if the user behavior event is a target behavior event, executing the operation associated with the target behavior event.
The method disclosed in the foregoing embodiments of the present invention may be applied to the processor 601 or implemented by the processor 601. The processor 601 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 601 or by instructions in the form of software. The processor 601 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The various methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as a RAM, a flash memory, a ROM, a PROM, an EPROM or registers. The storage medium is located in the memory 602, and the processor 601 reads the information in the memory 602 and completes the steps of the above method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the Processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Optionally, the sensing data includes user motion behavior data;
the processor 601 executes the step of identifying the user behavior event of the smart wearable device user according to the sensing data, which includes:
and judging whether the user motion behavior data is within a preset motion data range, and if the user motion behavior data is within the preset motion data range, determining that the user behavior event is the target behavior event.
Optionally, the step, executed by the processor 601, of determining whether the user motion behavior data is within a preset motion data range and, if the user motion behavior data is within the preset motion data range, determining that the user behavior event of the intelligent wearable device user is the target behavior event includes:
judging whether the motion frequency of the user of the intelligent wearable device is within a preset motion frequency range or not, and if the motion frequency is within the preset motion frequency range, determining that the user behavior event is a first target behavior event; or
Judging whether the motion amplitude of the user of the intelligent wearable device is within a preset motion amplitude range or not, and if the motion amplitude is within the preset motion amplitude range, determining that the user behavior event is a second target behavior event; or
Judging whether the motion frequency of the intelligent wearable device user is within a preset motion frequency range or not and whether the motion amplitude of the intelligent wearable device user is within a preset motion amplitude range or not, and if the motion frequency is within the preset motion frequency range and the motion amplitude is within the preset motion amplitude range, determining that the user behavior event is a third target behavior event.
Optionally, the third target behavior event is a brushing event.
Optionally, the sensing data further includes external environment data;
the processor 601 performs the step of determining whether the user motion behavior data is within a preset motion data range, which includes:
and judging whether the external environment data is predefined data representing a preset environment or not, and if the external environment data is the predefined data representing the preset environment, judging whether the user motion behavior data is within the range of the preset motion data or not.
Optionally, the step of executing, by the processor 601, the operation associated with the target behavior event if the user behavior event is the target behavior event includes:
and if the user behavior event is the target behavior event, starting an application program associated with the target behavior event, and playing the multimedia content of the application program.
The intelligent terminal 600 provided in the embodiment of the present invention can implement each process implemented by the intelligent terminal in the method embodiments of fig. 1 to fig. 3, and details are not described herein again to avoid repetition. The intelligent terminal 600 can be combined with the intelligent wearable device to achieve intelligent control of the intelligent terminal, and the functionality obtained from combining it with the intelligent wearable device is richer.
Referring to fig. 7, fig. 7 is a schematic diagram of a hardware structure of an intelligent terminal for implementing various embodiments of the present invention, where the intelligent terminal 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an (audio/video) input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, and a power supply 711. Those skilled in the art will appreciate that the intelligent terminal architecture shown in fig. 7 does not constitute a limitation of the intelligent terminal, and that the intelligent terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components.
The following specifically describes each component of the intelligent terminal with reference to fig. 7:
the radio frequency unit 701 may be configured to receive and transmit signals during information transmission and reception or during a call, and specifically, receive downlink information of a base station and then process the downlink information to the processor 710; in addition, the uplink data is transmitted to the base station. In general, radio frequency unit 701 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 701 may also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000(Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division duplex-Long Term Evolution), TDD-LTE (Time Division duplex-Long Term Evolution), Wi-Fi module, NFC module, and bluetooth module.
The intelligent terminal may assist the user in sending and receiving e-mail, browsing web pages, accessing streaming media, etc. through the network module 702, which provides the user with wireless broadband internet access, and the network module 702 includes but is not limited to RJ45 port module, etc.
The audio output unit 703 may convert audio data received by the radio frequency unit 701 or the network module 702 or stored in the memory 709 into an audio signal and output as sound when the smart terminal 700 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 703 may also provide audio output related to a specific function performed by the smart terminal 700 (e.g., a call signal reception sound or a message reception sound, etc.). The audio output unit 703 includes a speaker, a buzzer, a receiver, and the like.
The input unit 704 is used to receive audio or video signals. The input unit 704 may include a Graphics Processing Unit (GPU) 7041 and a microphone 7042. The graphics processor 7041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 706. The image frames processed by the graphics processor 7041 may be stored in the memory 709 (or another storage medium) or transmitted via the radio frequency unit 701 or the network module 702. The microphone 7042 can receive sound (audio data) in a phone call mode, a recording mode, a voice recognition mode or the like, and can process the sound into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 701.
The smart terminal 700 also includes at least one sensor 705, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 7061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 7061 and/or backlight when the smart terminal 700 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), detect the magnitude and direction of gravity when stationary, and can be used for applications (such as horizontal and vertical screen switching, related game or magnetometer attitude calibration) for recognizing the attitude of the intelligent terminal, vibration recognition related functions (such as pedometer or tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer or an infrared sensor, which can be configured in the intelligent terminal, further description is omitted here.
The display unit 706 is used to display information input by the user or information provided to the user. The Display unit 706 may include a Display panel 7061, and the Display panel 7061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 707 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the smart terminal. Specifically, the user input unit 707 may include a touch panel 7071 and other input devices 7072. The touch panel 7071, also referred to as a touch screen, can collect touch operations performed by a user on or near the touch panel 7071 (e.g., operations performed by the user on or near the touch panel 7071 using any suitable object or accessory such as a finger, a stylus, etc.), and drive corresponding connection devices according to a preset program. The touch panel 7071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 710, and can receive and execute commands sent by the processor 710. In addition, the touch panel 7071 can be implemented by various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 707 may include other input devices 7072 in addition to the touch panel 7071. In particular, the other input devices 7072 may include, but are not limited to, one or more of a physical keyboard, a function key (such as a volume control key or a switch key), a trackball, a mouse, a joystick, and the like, which are not limited to these specific examples.
Further, the touch panel 7071 may cover the display panel 7061, and when the touch panel 7071 detects a touch operation on or near the touch panel 7071, the touch operation is transmitted to the processor 710 to determine the type of the touch event, and then the processor 710 provides a corresponding visual output on the display panel 7061 according to the type of the touch event. Although in fig. 7, the touch panel 7071 and the display panel 7061 are implemented as two independent components to implement the input and output functions of the smart terminal, in some embodiments, the touch panel 7071 and the display panel 7061 may be integrated to implement the input and output functions of the smart terminal, which is not limited herein.
The interface unit 708 serves as an interface through which at least one external device is connected to the smart terminal 700. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 708 may be used to receive input (e.g., data information or power, etc.) from an external device and transmit the received input to one or more elements within the smart terminal 700 or may be used to transmit data between the smart terminal 700 and the external device.
The memory 709 may be used to store software programs as well as various data. The memory 709 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and an application program required by at least one function (such as a sound playing function or an image playing function), and the data storage area may store data created according to the use of the intelligent terminal (such as audio data or a phonebook). Further, the memory 709 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 710 is a control center of the intelligent terminal, connects various parts of the entire intelligent terminal by using various interfaces and lines, and performs various functions of the intelligent terminal and processes data by operating or executing software programs and/or modules stored in the memory 709 and calling data stored in the memory 709, thereby performing overall monitoring of the intelligent terminal. Processor 710 may include one or more processing units; preferably, the processor 710 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 710.
The intelligent terminal 700 may further comprise a power supply 711 (such as a battery) for supplying power to each component, and preferably, the power supply 711 may be logically connected to the processor 710 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the intelligent terminal 700 includes some functional modules that are not shown, and are not described in detail herein.
The intelligent terminal 700 provided in the embodiment of the present invention can implement each process implemented by the intelligent terminal in the method embodiments of fig. 1 to fig. 3, which are not repeated here. Working in combination with the intelligent wearable device, the intelligent terminal 700 can be controlled intelligently, and the combination offers richer functions than the intelligent terminal alone.
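To make the combination concrete, here is a minimal, non-authoritative sketch of the control flow the terminal could run: receive sensing data from the wearable device within a preset monitoring window, identify the user behavior event, and execute the associated operation. The WearableLink interface, the SensingData fields, the window, and the threshold values are all assumptions introduced only for illustration, not the patented implementation.

import java.time.LocalTime

// Assumed shape of the sensing data pushed by the intelligent wearable device.
data class SensingData(
    val motionFrequencyHz: Double,
    val motionAmplitude: Double,
    val ambientHumidity: Double   // stand-in for the "external environment data"
)

// Hypothetical transport abstraction; in practice this could be Bluetooth or Wi-Fi.
interface WearableLink {
    fun nextSensingData(): SensingData?
}

class TerminalController(
    private val link: WearableLink,
    private val monitoringWindow: ClosedRange<LocalTime> =
        LocalTime.of(7, 0)..LocalTime.of(7, 30),            // assumed preset monitoring period
    private val onTargetEvent: () -> Unit                   // e.g. launch an app and play media
) {
    fun poll(now: LocalTime = LocalTime.now()) {
        if (now !in monitoringWindow) return                 // only listen inside the window
        val data = link.nextSensingData() ?: return
        if (isTargetBehaviorEvent(data)) onTargetEvent()
    }

    private fun isTargetBehaviorEvent(data: SensingData): Boolean {
        val inPresetEnvironment = data.ambientHumidity > 60.0      // assumed threshold
        val inMotionRange = data.motionFrequencyHz in 1.5..4.0 &&  // assumed preset ranges
                data.motionAmplitude in 0.2..1.0
        return inPresetEnvironment && inMotionRange
    }
}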
The embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, it implements each process of the above embodiment of the control method of the intelligent terminal and can achieve the same technical effect; to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A control method of an intelligent terminal is applied to the intelligent terminal, and is characterized by comprising the following steps:
receiving sensing data sent by an intelligent wearable device, wherein the intelligent terminal is in communication connection with the intelligent wearable device in advance;
identifying a user behavior event of the intelligent wearable device user according to the sensing data;
if the user behavior event is a target behavior event, executing an operation associated with the target behavior event;
the sensing data comprises user motion behavior data;
the step of identifying the user behavior event of the intelligent wearable device user according to the sensing data specifically comprises:
judging whether the user motion behavior data is within a preset motion data range or not, and if the user motion behavior data is within the preset motion data range, determining that the user behavior event is the target behavior event;
the sensing data further comprises external environment data;
the step of judging whether the user motion behavior data is within a preset motion data range specifically comprises:
judging whether the external environment data is predefined data representing a preset environment or not, and if the external environment data is the predefined data representing the preset environment, judging whether the user motion behavior data is within the range of the preset motion data or not;
before the step of receiving the sensing data sent by the smart wearable device, the method further comprises:
setting a time for monitoring the intelligent wearable device;
and receiving, within a preset monitoring time period, the sensing data sent by the intelligent wearable device and collected within the monitoring time period.
2. The method according to claim 1, wherein the step of judging whether the user motion behavior data is within a preset motion data range or not, and if the user motion behavior data is within the preset motion data range, determining that the user behavior event of the intelligent wearable device user is the target behavior event specifically includes:
judging whether the motion frequency of the user of the intelligent wearable device is within a preset motion frequency range or not, and if the motion frequency is within the preset motion frequency range, determining that the user behavior event is a first target behavior event; or
judging whether the motion amplitude of the user of the intelligent wearable device is within a preset motion amplitude range or not, and if the motion amplitude is within the preset motion amplitude range, determining that the user behavior event is a second target behavior event; or
judging whether the motion frequency of the intelligent wearable device user is within a preset motion frequency range or not and whether the motion amplitude of the intelligent wearable device user is within a preset motion amplitude range or not, and if the motion frequency is within the preset motion frequency range and the motion amplitude is within the preset motion amplitude range, determining that the user behavior event is a third target behavior event.
3. The method according to claim 2, wherein the third target behavior event is a tooth-brushing event.
4. The method according to any one of claims 1 to 3, wherein, if the user behavior event is a target behavior event, the step of executing the operation associated with the target behavior event specifically includes:
and if the user behavior event is the target behavior event, starting an application program associated with the target behavior event, and playing the multimedia content of the application program.
5. An intelligent terminal, comprising:
a receiving module, used for receiving sensing data sent by an intelligent wearable device, wherein the intelligent terminal is in communication connection with the intelligent wearable device in advance;
the identification module is used for identifying the user behavior event of the intelligent wearable device user according to the sensing data;
the execution module is used for executing the operation associated with the target behavior event if the user behavior event is the target behavior event;
the sensing data comprises user motion behavior data;
the identification module is used for judging whether the user motion behavior data is in a preset motion data range, and if the user motion behavior data is in the preset motion data range, determining that the user behavior event is the target behavior event;
the sensing data further comprises external environment data;
the identification module is used for judging whether the external environment data is predefined data representing a preset environment or not, and if the external environment data is the predefined data representing the preset environment, judging whether the user motion behavior data is in the preset motion data range or not;
the receiving module includes:
the setting subunit is used for setting a time for monitoring the intelligent wearable device;
and the receiving subunit is used for receiving, within a preset monitoring time period, the sensing data sent by the intelligent wearable device and collected within the monitoring time period.
6. The intelligent terminal of claim 5, wherein the identification module comprises:
the first judging unit is used for judging whether the motion frequency of the intelligent wearable device user is within a preset motion frequency range or not, and if the motion frequency is within the preset motion frequency range, determining that the user behavior event is a first target behavior event; or
the second judging unit is used for judging whether the motion amplitude of the user of the intelligent wearable device is within a preset motion amplitude range, and if the motion amplitude is within the preset motion amplitude range, determining that the user behavior event is a second target behavior event; or
the third judging unit is used for judging whether the motion frequency of the intelligent wearable device user is within a preset motion frequency range or not and whether the motion amplitude of the intelligent wearable device user is within a preset motion amplitude range or not, and if the motion frequency is within the preset motion frequency range and the motion amplitude is within the preset motion amplitude range, determining that the user behavior event is a third target behavior event.
7. The intelligent terminal according to claim 6, wherein the third target behavior event is a tooth-brushing event.
8. The intelligent terminal according to any one of claims 5 to 7, wherein the execution module is configured to start an application associated with the target behavior event and play multimedia content of the application if the user behavior event is the target behavior event.
9. An intelligent terminal, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the control method of an intelligent terminal according to any one of claims 1 to 4.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the control method of an intelligent terminal according to any one of claims 1 to 4.
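Purely as an illustrative sketch of the three-way recognition described in claims 2 and 3, and not the patented implementation, the classification of the first, second, and third target behavior events could be expressed as follows; the enum names and numeric ranges are assumptions.

// Possible outcomes of the behavior-event recognition described in claims 2 and 3.
enum class TargetBehaviorEvent { NONE, FIRST, SECOND, THIRD_TOOTH_BRUSHING }

// Assumed preset ranges; real values would come from configuration or calibration.
val presetFrequencyRange = 1.5..4.0   // motion frequency in Hz
val presetAmplitudeRange = 0.2..1.0   // normalized motion amplitude

fun classify(motionFrequencyHz: Double, motionAmplitude: Double): TargetBehaviorEvent {
    val frequencyOk = motionFrequencyHz in presetFrequencyRange
    val amplitudeOk = motionAmplitude in presetAmplitudeRange
    return when {
        frequencyOk && amplitudeOk -> TargetBehaviorEvent.THIRD_TOOTH_BRUSHING
        amplitudeOk -> TargetBehaviorEvent.SECOND
        frequencyOk -> TargetBehaviorEvent.FIRST
        else -> TargetBehaviorEvent.NONE
    }
}

fun main() {
    // A motion frequency and amplitude both inside their preset ranges would be
    // recognized as the third target behavior event (tooth brushing).
    println(classify(motionFrequencyHz = 2.5, motionAmplitude = 0.5))
}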
CN201710523540.1A 2017-06-30 2017-06-30 Intelligent terminal control method and intelligent terminal Active CN107291242B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710523540.1A CN107291242B (en) 2017-06-30 2017-06-30 Intelligent terminal control method and intelligent terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710523540.1A CN107291242B (en) 2017-06-30 2017-06-30 Intelligent terminal control method and intelligent terminal

Publications (2)

Publication Number Publication Date
CN107291242A CN107291242A (en) 2017-10-24
CN107291242B (en) 2020-06-26

Family

ID=60098564

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710523540.1A Active CN107291242B (en) 2017-06-30 2017-06-30 Intelligent terminal control method and intelligent terminal

Country Status (1)

Country Link
CN (1) CN107291242B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110045994B (en) * 2018-01-12 2022-09-16 Oppo广东移动通信有限公司 Application program processing method and device, electronic equipment and computer readable storage medium
CN108681546A (en) * 2018-03-26 2018-10-19 四川斐讯信息技术有限公司 One kind moving synchronously method and system
CN110580557A (en) * 2018-06-08 2019-12-17 上海博泰悦臻网络技术服务有限公司 Intelligent mobile terminal and fragment time management and utilization method thereof based on artificial intelligence
CN109567814B (en) * 2018-10-22 2022-06-28 深圳大学 Classification recognition method, computing device, system and storage medium for tooth brushing action
CN110534172B (en) * 2019-08-23 2022-07-19 青岛海尔科技有限公司 Oral cavity cleaning reminding method and device based on intelligent home operating system
CN110889322A (en) * 2019-10-09 2020-03-17 深圳市九洲电器有限公司 Method for preventing sedentary sitting and related product
CN111813280B (en) * 2020-05-28 2022-02-22 维沃移动通信有限公司 Display interface control method and device, electronic equipment and readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105117025A (en) * 2015-09-28 2015-12-02 广东欧珀移动通信有限公司 Regularly reminded method and system and regularly reminded device based on intelligent wearable device
CN106020356A (en) * 2016-05-19 2016-10-12 珠海市魅族科技有限公司 Control method of smart terminal, smart terminal and wearing device
CN106095136A (en) * 2016-06-10 2016-11-09 北京行云时空科技有限公司 A kind of wearable device controls the method for intelligent terminal

Also Published As

Publication number Publication date
CN107291242A (en) 2017-10-24

Similar Documents

Publication Publication Date Title
CN107291242B (en) Intelligent terminal control method and intelligent terminal
CN109938720B (en) Heart rate-based reminding method, wearable device and computer-readable storage medium
CN108888955B (en) Method and device for controlling visual angle in game
CN108628217B (en) Wearable device power consumption control method, wearable device and computer-readable storage medium
CN107743178B (en) Message playing method and mobile terminal
CN107734170B (en) Notification message processing method, mobile terminal and wearable device
CN110096195B (en) Sports icon display method, wearable device and computer readable storage medium
CN108762814B (en) Screen lightening method and related equipment
CN107633853B (en) Control method for playing audio and video files and user terminal
CN108874121A (en) Control method, wearable device and the computer readable storage medium of wearable device
CN110062273B (en) Screenshot method and mobile terminal
CN108762472A (en) Wearable device control method, wearable device and computer readable storage medium
CN108540665A (en) Based on flexible terminal and its massage method, computer readable storage medium
CN110138968A (en) A kind of incoming call reminding method, wearable device and storage medium
WO2023045897A1 (en) Adjustment method and apparatus for electronic device, and electronic device
CN110012148A (en) A kind of bracelet control method, bracelet and computer readable storage medium
CN109276881A (en) A kind of game control method, equipment
CN110139270B (en) Wearable device pairing method, wearable device and computer readable storage medium
CN108769206B (en) Data synchronization method, terminal and storage medium
CN109164908B (en) Interface control method and mobile terminal
CN110638437A (en) Control method based on heart rate monitoring, terminal, wearable device and storage medium
CN112764543B (en) Information output method, terminal equipment and computer readable storage medium
CN110532050B (en) Motion data refreshing method, wearable device and computer readable storage medium
CN107562305B (en) Method for setting dynamic icon, terminal and computer readable storage medium
WO2022227252A1 (en) Wearable device control method and apparatus, wearable device, and electronic device

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant