WO2021081768A1 - Interface switching method and apparatus, wearable electronic device, and storage medium - Google Patents
Interface switching method and apparatus, wearable electronic device, and storage medium
- Publication number
- WO2021081768A1 (PCT/CN2019/114076)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- behavior
- electronic device
- user interface
- wearable electronic
- scene
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
Definitions
- This application relates to the technical field of mobile terminals, and more specifically, to an interface switching method, device, wearable electronic equipment, and storage medium.
- Wearable electronic devices, such as smart watches, smart glasses, and smart bracelets, have become some of the most commonly used consumer electronic products in daily life. They are favored by more and more consumers because they are convenient to wear and provide users with more user-friendly services. However, switching the user interface on a traditional wearable electronic device is cumbersome to operate, which is inconvenient for the user.
- This application proposes an interface switching method and apparatus, a wearable electronic device, and a storage medium.
- An embodiment of the present application provides an interface switching method applied to a wearable electronic device that includes a sensor for collecting behavior data. The method includes: acquiring the behavior data collected by the sensor; performing feature extraction on the behavior data to obtain a behavior feature; inputting the behavior feature into a trained preset model to obtain the current behavior scene in which the wearable electronic device is located, the preset model being pre-trained to output a behavior scene recognition result according to the input behavior feature; and switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene.
- An embodiment of the present application provides an interface switching apparatus applied to a wearable electronic device that includes a sensor for collecting behavior data. The apparatus includes a data acquisition module, a feature acquisition module, a scene recognition module, and an interface switching module. The data acquisition module acquires the behavior data collected by the sensor; the feature acquisition module performs feature extraction on the behavior data to obtain a behavior feature; the scene recognition module inputs the behavior feature into a trained preset model to obtain the current behavior scene in which the wearable electronic device is located, the preset model being pre-trained to output a behavior scene recognition result according to the input behavior feature; and the interface switching module switches the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene.
- An embodiment of the present application provides a wearable electronic device, including: one or more processors; a memory; and one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs are configured to execute the interface switching method provided in the first aspect above.
- An embodiment of the present application provides a computer-readable storage medium. The computer-readable storage medium stores program code that can be invoked by a processor to execute the interface switching method provided in the first aspect.
- The solution provided in this application acquires the behavior data collected by the sensor of the wearable electronic device, performs feature extraction on the behavior data to obtain a behavior feature, and inputs the behavior feature into a trained preset model to obtain the current behavior scene in which the wearable electronic device is located, the preset model being pre-trained to output a behavior scene recognition result according to the input behavior feature. The current interface displayed by the wearable electronic device is then switched to the user interface corresponding to the current behavior scene. In this way, the current behavior scene of the wearable electronic device is automatically identified from the behavior feature corresponding to the sensor-collected behavior data, and the displayed current interface is switched to the user interface corresponding to that scene, reducing user operations when switching user interfaces and improving the user experience.
- FIG. 1 shows a flowchart of an interface switching method according to an embodiment of the present application.
- FIG. 2 shows a schematic diagram of an interface provided by an embodiment of the present application.
- FIG. 3 shows a schematic diagram of another interface provided by an embodiment of the present application.
- FIG. 4 shows a schematic diagram of another interface provided by an embodiment of the present application.
- FIG. 5 shows a flowchart of an interface switching method according to another embodiment of the present application.
- FIG. 6 shows a flowchart of an interface switching method according to another embodiment of the present application.
- FIG. 7 shows a flowchart of step S310 in the interface switching method provided by another embodiment of the present application.
- FIG. 8 shows a flowchart of an interface switching method according to still another embodiment of the present application.
- FIG. 9 shows a block diagram of an interface switching apparatus according to an embodiment of the present application.
- FIG. 10 is a block diagram of a wearable electronic device for executing the interface switching method according to an embodiment of the present application.
- FIG. 11 shows a storage unit for storing or carrying program code for implementing the interface switching method according to an embodiment of the present application.
- Wearable electronic devices, such as smart watches, are no longer limited to displaying time; they can also display information such as weather information, health information, and prompt information. The styles of the user interface of a wearable electronic device (such as the watch dial) therefore differ, and different users have different requirements for the style of the user interface. A single user interface style cannot meet every user's needs, so a wearable electronic device usually provides user interfaces of multiple styles.
- The inventor therefore proposes the interface switching method and apparatus, wearable electronic device, and storage medium provided by the embodiments of the present application. The current behavior scene of the wearable electronic device is automatically recognized from the behavior feature corresponding to the behavior data collected by the sensor, and the displayed current interface is switched to the user interface corresponding to the current behavior scene, reducing the user's operations when switching the user interface and improving the user experience.
- the specific interface switching method will be described in detail in the subsequent embodiments.
- FIG. 1 shows a schematic flowchart of an interface switching method provided by an embodiment of the present application.
- The interface switching method automatically identifies the current behavior scene of the wearable electronic device according to the behavior feature corresponding to the behavior data collected by the sensor, and switches the displayed current interface to the user interface corresponding to the current behavior scene, reducing the user's operations when switching user interfaces and enhancing the user experience.
- the interface switching method is applied to the interface switching device 400 as shown in FIG. 9 and the wearable electronic device 100 equipped with the interface switching device 400 (FIG. 10 ).
- the wearable electronic device may include sensors for collecting behavior data, and the sensors may include acceleration sensors, gyroscope sensors, gravity sensors, heart rate sensors, brain wave sensors, positioning sensors, infrared sensors, etc., which are not limited here.
- The following takes a wearable electronic device as an example to describe the specific process of this embodiment.
- the wearable electronic device applied in this embodiment can be a smart watch, a smart bracelet, etc., which is not limited here.
- The process shown in FIG. 1 will be described in detail below. The interface switching method may specifically include the following steps:
- Step S110 Obtain behavior data collected by the sensor.
- Various sensors for collecting behavior data may be provided in the wearable electronic device, such as acceleration sensors, gyroscope sensors, gravity sensors, heart rate sensors, brain wave sensors, positioning sensors, and infrared sensors.
- behavior data may refer to data used to characterize user behavior.
- User behaviors may include walking, standing still, running, squatting, hand movement, head shaking, and other types, and the specific user behavior is not limited here. The behavior data collected by these sensors can differ under different user behaviors, and the behavior data can reflect the characteristics of certain behavior scenes. For example, the scene of a user walking along the route to a home location can be reflected by positioning data, motion data, biometric data, environmental data, and time data. Therefore, the behavior data collected by the sensor can be used to identify the behavior scene in which the wearable electronic device is located.
- the wearable electronic device can acquire the behavior data collected by the sensor used to collect behavior data when recognizing the behavior scene.
- The behavior data acquired by the wearable electronic device may include data collected by various sensors: all behavior data collected by every sensor capable of collecting behavior data may be acquired, or only the behavior data collected by some sensors may be acquired, which is not limited here. It is understandable that the more types of behavior data are acquired (that is, the more types of sensors collect behavior data), the more features are available for identifying the behavior scene and the higher the feature dimension, which can improve the accuracy of behavior scene recognition.
- The wearable electronic device can obtain the behavior data collected by the sensor within a preset time period, so as to subsequently recognize the behavior scene. It is understandable that using behavior data collected over a period of time makes the subsequently recognized behavior scene more accurate.
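As an illustrative sketch (not part of the original disclosure), the behavior data collected within a preset time period can be modeled as a fixed-length window over the sensor stream; the class name, sample format, and window length below are all assumptions:

```python
from collections import deque

class SensorWindow:
    """Ring buffer holding the most recent sensor samples for one
    preset time window (names and window length are illustrative)."""

    def __init__(self, max_samples):
        self.buf = deque(maxlen=max_samples)

    def push(self, sample):
        # New samples evict the oldest ones once the window is full.
        self.buf.append(sample)

    def snapshot(self):
        # Behavior data handed to the feature extraction step.
        return list(self.buf)

# e.g. a 3-sample window over accelerometer magnitudes
w = SensorWindow(max_samples=3)
for v in [0.1, 0.5, 0.9, 1.2]:
    w.push(v)
window = w.snapshot()  # the oldest sample (0.1) has been evicted
```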
- Step S120 Perform feature extraction on the behavior data to obtain behavior features.
- After the wearable electronic device acquires the behavior data collected by the sensor, it can perform feature extraction on the behavior data to obtain a behavior feature, so that behavior scene recognition can subsequently be performed based on the behavior feature.
- The extracted features can include time series features, frequency domain features, statistical features, and so on; the specific extracted features are not limited here.
- The behavior data collected by the sensor may be time series data (in the time domain). The wearable electronic device may perform statistical feature extraction, obtaining the median, average, maximum, minimum, peak value, and so on from the behavior data detected by the sensor, thereby obtaining the statistical features of the time series data. The wearable electronic device can also obtain, from the time series data, the values at a certain point on the time axis and at points a preset time interval before and after it, thereby obtaining the time series features, which are not limited here.
- The wearable electronic device may also perform a fast Fourier transform on the time series data to obtain frequency domain data, then separate the high- and low-frequency signals, calculate the overall energy of the frequency domain signal, and take at least some coefficients as frequency domain features. For example, after performing a fast Fourier transform on the curve of acceleration over time, a curve of acceleration over frequency can be obtained, and the frequency domain features can be calculated from it.
- The specific method of extracting the behavior feature corresponding to the behavior data is not limited here.
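For illustration only, the statistical features named above (median, average, maximum, minimum, peak value) can be computed from one channel of time series data as follows; the function name and feature dictionary layout are assumptions:

```python
import statistics

def statistical_features(samples):
    """Statistical features named in the description: median, average,
    maximum, minimum, and peak-to-peak value of one sensor channel."""
    return {
        "median": statistics.median(samples),
        "mean": statistics.fmean(samples),
        "max": max(samples),
        "min": min(samples),
        # Peak-to-peak value as a simple stand-in for "peak value".
        "peak_to_peak": max(samples) - min(samples),
    }

feats = statistical_features([1.0, 3.0, 2.0, 6.0])
```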
- Step S130 Input the behavior feature into a pre-trained preset model to obtain the current behavior scene in which the wearable electronic device is located.
- The preset model is pre-trained to output a behavior scene recognition result according to the input behavior feature.
- The obtained behavior feature may be input into the trained preset model to obtain the recognition result that the preset model outputs according to the input behavior feature. The recognition result includes the behavior scene, and the recognized behavior scene is taken as the current behavior scene.
- The preset model may be trained from a large number of training samples. The training samples used to train the preset model may include input samples and output samples: the input samples may include behavior features, and the output samples may include the behavior scenes corresponding to those behavior features. The trained preset model can therefore output a recognition result according to the input behavior feature, and the recognition result includes the current behavior scene of the wearable electronic device.
- The recognition result output by the trained preset model may be a scene identifier of the behavior scene, where different behavior scenes can be stored in the form of different scene identifiers, for example, 1-byte integers (i.e., 0-255) identifying the different behavior scenes.
- the preset model may include Support Vector Machine (SVM), neural network, decision tree model, etc., which are not limited here.
- Step S140 Switch the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene.
- After the wearable electronic device recognizes the behavior scene in which it is located, it can determine the user interface corresponding to that behavior scene, and switch its current interface to the user interface corresponding to the current behavior scene, thereby realizing automatic switching of the user interface so that the switched-to user interface corresponds to the current behavior scene.
- the wearable electronic device may pre-store user interfaces corresponding to different behavior scenarios, and the wearable electronic device can determine the user interface to be switched to according to the recognized behavior scenarios. For example, when the recognition result includes the scene identifier of the behavior scene, the user interface corresponding to the scene identifier can be read, and the current interface of the wearable electronic device can be switched to the determined user interface.
- the user interface may include a dial interface, a home screen interface, a lock screen interface, or an application interface.
- The wearable electronic device can automatically switch to any of the dial interface, home screen interface, lock screen interface, and application interface, reducing user operations; the switched-to interface corresponds to the current behavior scene, which satisfies the user's needs.
- The user interface may be a dial interface, and behavior scenes may include a sport scene, a home scene, and a work scene. If the current behavior scene is the sport scene, the current dial can be switched to the dial corresponding to the sport scene; as shown in FIG. 2, this dial can display sport-related content such as time, step count, calories burned, and heart rate. If the current behavior scene is the work scene, the current dial can be switched to the dial corresponding to the work scene; as shown in FIG. 3, this dial can display work-related content such as date, time, and work schedule. If the current behavior scene is the home scene, the current dial can be switched to the dial corresponding to the home scene, which can display home-related content such as date, time, weather, and TV program reminders.
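The dial examples above can be sketched as a stored correspondence between scene identifiers and dial content; the 1-byte identifier values and the field names below are hypothetical:

```python
# Hypothetical 1-byte scene identifiers (0-255) for the three scenes.
SPORT, WORK, HOME = 0, 1, 2

# Pre-stored correspondence between behavior scenes and dial content.
DIAL_FOR_SCENE = {
    SPORT: ["time", "steps", "calories", "heart_rate"],
    WORK:  ["date", "time", "work_schedule"],
    HOME:  ["date", "time", "weather", "tv_reminders"],
}

def switch_dial(current_scene):
    """Read the user interface stored for the recognized scene
    identifier and return it as the dial to display."""
    return DIAL_FOR_SCENE[current_scene]
```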
- In the interface switching method, the behavior data collected by the sensor is acquired, feature extraction is performed on it to obtain a behavior feature, and the behavior feature is input into the trained preset model to obtain the current behavior scene in which the wearable electronic device is located. The current interface displayed by the wearable electronic device is then switched to the user interface corresponding to the current behavior scene. The current behavior scene of the wearable electronic device is thus automatically identified from the behavior feature corresponding to the sensor-collected behavior data, and the displayed current interface is switched to the corresponding user interface, meeting the user's needs for the user interface, reducing the user's operations when switching user interfaces, and improving the user experience.
- FIG. 5 shows a schematic flowchart of an interface switching method provided by another embodiment of the present application.
- This interface switching method can be applied to the above-mentioned wearable electronic device.
- the wearable electronic device includes a sensor for collecting behavioral data.
- The process shown in FIG. 5 will be described in detail below. The interface switching method may specifically include the following steps:
- Step S210 Obtain the behavior data collected by the sensor.
- Step S220 Perform feature extraction on the behavior data to obtain behavior features.
- For step S210 and step S220, reference may be made to the content of the foregoing embodiment, which will not be repeated here.
- Step S230 Perform preprocessing on the behavior feature to obtain the preprocessed behavior feature.
- the wearable electronic device may also perform preprocessing on the behavior characteristics before inputting the behavior characteristics into the trained preset model.
- preprocessing the behavior feature may include: performing feature cleaning and feature mining on the behavior feature.
- feature cleaning includes removing content in behavior features according to preset cleaning rules;
- Feature mining includes mining the behavior features to form features of more dimensions.
- Performing feature cleaning on the behavior feature may include removing missing values and abnormal values in the behavior feature, for example, removing incomplete data, data of the wrong type, and the like.
- Feature cleaning can also be missing-value processing. For a dimension whose proportion of missing values is less than a preset percentage, the missing values can be fitted based on the other values of that dimension. If the proportion of missing values is greater than the preset percentage, the feature is considered invalid and the dimension is removed. The preset percentage may be 35%, 40%, etc.; the specific preset percentage is not limited here.
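A minimal sketch of the missing-value processing described above, assuming a 40% preset percentage and mean-based fitting (both are illustrative choices, not requirements of the disclosure):

```python
def clean_dimension(values, max_missing_rate=0.4):
    """Missing-value processing: if the missing rate of a dimension
    exceeds the preset percentage (40% here, an assumption), the
    dimension is removed (returns None); otherwise missing entries
    are fitted from the observed values (mean imputation)."""
    present = [v for v in values if v is not None]
    missing_rate = 1 - len(present) / len(values)
    if missing_rate > max_missing_rate:
        return None  # invalid feature: remove this dimension
    fill = sum(present) / len(present)
    return [v if v is not None else fill for v in values]
```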
- performing feature mining on behavior features may include: using a boosted tree model to mine behavior features after feature cleaning.
- Before inputting the behavior feature into the boosting tree model, the wearable electronic device can also quantize the numerical features in the behavior feature into a vector. The quantized vector is then input into the boosting tree model, and the boosting tree model outputs a multi-dimensional feature vector according to the input vector, thereby obtaining the preprocessed behavior feature.
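The boosting-tree mining step can be illustrated with single-split decision stumps standing in for learned trees: the input vector is mapped to the leaf it reaches in each tree, and the leaf indices are one-hot encoded into a higher-dimensional feature vector. The stumps and thresholds below are invented for illustration; a real boosting-tree model would learn them from data:

```python
# Each "tree" is reduced to one threshold split on one input
# dimension: (dimension index, threshold). Invented for illustration.
STUMPS = [(0, 0.5), (1, 10.0), (0, 2.0)]

def mine_features(vector):
    """Leaf-index one-hot encoding: each stump contributes two output
    dimensions, so a 2-D input becomes a 6-D mined feature vector."""
    mined = []
    for dim, thresh in STUMPS:
        left = vector[dim] <= thresh
        mined += [1.0, 0.0] if left else [0.0, 1.0]
    return mined

mined = mine_features([1.0, 5.0])  # 2-D input, 6-D output
```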
- Step S240 Input the preprocessed behavior feature into a pre-trained preset model to obtain the current behavior scene in which the wearable electronic device is located. The preset model is pre-trained to output a behavior scene recognition result according to the input behavior feature.
- Step S250 Switch the current interface displayed by the wearable electronic device to a user interface corresponding to the current behavior scene.
- For step S240 and step S250, reference may be made to the content of the foregoing embodiment, which will not be repeated here.
- Step S260 When a switching operation is detected, the user interface displayed by the wearable electronic device is switched to a target user interface corresponding to the switching operation.
- The wearable electronic device can also detect a switching operation on the user interface it displays; when such an operation is detected, the current interface displayed by the wearable electronic device can be switched to the target user interface corresponding to the switching operation, so as to meet the user's needs.
- the switching operation on the user interface may be a shaking operation on the wearable electronic device.
- Detecting the switching operation may include: acquiring the shaking trajectory of the wearable electronic device, and determining that the switching operation is detected if the shaking trajectory satisfies a preset trajectory condition. It is understandable that a wearable electronic device is usually worn on the user; when the displayed user interface is automatically switched to the user interface corresponding to the behavior scene and the user is not satisfied with the result, the user can switch the user interface by shaking the wearable electronic device according to preset rules, which also makes switching convenient for the user.
- the wearable electronic device may determine the shaking trajectory of the wearable electronic device according to the angular velocity value detected by the gyroscope sensor and the like.
- The preset trajectory condition may be a preset judgment condition for determining whether to switch the user interface; for example, the condition may be that the acquired shaking trajectory matches any one of a plurality of pre-stored shaking trajectories.
- step S260 may include: acquiring a target user interface corresponding to the shaking track; switching the user interface displayed by the wearable electronic device to the target user interface.
- The wearable electronic device can pre-store correspondences between shaking trajectories and user interfaces, with different shaking trajectories corresponding to different user interfaces. When the shaking trajectory meets the preset trajectory condition, the wearable electronic device can determine the user interface corresponding to the shaking trajectory according to this correspondence. The user can therefore switch the displayed user interface to the desired one by shaking the wearable electronic device along different trajectories, which facilitates interface switching and improves the user experience.
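A sketch of the pre-stored trajectory-to-interface correspondence, assuming shaking trajectories have already been quantized into direction tokens (that quantization step, and all names below, are assumptions not made in the text):

```python
# Pre-stored correspondence between shaking trajectories and user
# interfaces; trajectories are quantized into direction tokens here.
TRAJECTORY_TO_UI = {
    ("left", "right"): "sport_dial",
    ("up", "down"):    "work_dial",
}

def handle_shake(trajectory):
    """If the acquired trajectory matches any stored trajectory, the
    switching operation is detected and the target user interface is
    returned; otherwise None (no switch)."""
    return TRAJECTORY_TO_UI.get(tuple(trajectory))
```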
- Step S270 If the switching operation is detected within a preset period of time, and the target user interface corresponds to a target behavior scene, input the behavior feature labeled with the target behavior scene into the preset model, and perform correction training on the preset model.
- The wearable electronic device may also determine whether the switching operation was detected within a preset time period. The preset time period can be relatively short, for example 1 to 5 minutes. If the operation was detected within the preset time period, it means that the user interface was previously switched incorrectly based on the identified behavior scene, and thus that the previously recognized behavior scene was wrong. The target behavior scene corresponding to the target user interface can therefore be determined, the behavior feature used in recognition, now labeled with the target behavior scene, can be input into the preset model, and the preset model can be correction-trained. That is, the behavior feature is used as the input sample and the target behavior scene as the output sample, and the preset model is trained so as to correct it and make its output more accurate.
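The correction-training step can be illustrated with a toy model; a nearest-centroid classifier stands in for the preset model here (the application itself names SVM, neural network, or decision tree models), and the centroid update rule is an assumption:

```python
class SceneClassifier:
    """Minimal nearest-centroid stand-in for the preset model."""

    def __init__(self, centroids):
        self.centroids = centroids  # {scene_id: feature vector}

    def predict(self, x):
        # Recognition result: the scene whose centroid is closest.
        def dist(scene):
            c = self.centroids[scene]
            return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(self.centroids, key=dist)

    def correct(self, x, target_scene, lr=0.5):
        # Correction training: move the target scene's centroid toward
        # the behavior feature labeled with the target behavior scene.
        c = self.centroids[target_scene]
        self.centroids[target_scene] = [
            a + lr * (b - a) for a, b in zip(c, x)
        ]

model = SceneClassifier({0: [0.0, 0.0], 1: [10.0, 10.0]})
before = model.predict([6.0, 6.0])   # misrecognized as scene 1
model.correct([6.0, 6.0], target_scene=0)
after = model.predict([6.0, 6.0])    # corrected to scene 0
```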
- The interface switching method acquires the collected behavior data and extracts the behavior feature, preprocesses the behavior feature to obtain the preprocessed behavior feature, inputs the preprocessed behavior feature into the trained preset model to obtain the current behavior scene in which the wearable electronic device is located, and then switches the displayed current interface to the user interface corresponding to the current behavior scene. When a switching operation is detected, the displayed interface is switched to the target user interface corresponding to the switching operation, and when the switching operation is detected within the preset time period, the preset model is corrected. Preprocessing the behavior feature in this way can improve recognition accuracy, and correcting the preset model can further improve the accuracy of behavior scene recognition.
- FIG. 6 shows a schematic flowchart of an interface switching method provided by another embodiment of the present application.
- This interface switching method can be applied to the above-mentioned wearable electronic device.
- the wearable electronic device includes a sensor for collecting behavioral data.
- The process shown in FIG. 6 will be described in detail below. The interface switching method may specifically include the following steps:
- Step S310 Obtain the corresponding relationship between the behavior scene and the user interface.
- The wearable electronic device may obtain the correspondence between behavior scenes and user interfaces in advance. Different behavior scenes may correspond to different user interfaces, or multiple behavior scenes may correspond to the same user interface; the specific correspondence is not limited here.
- step S310 may include:
- Step S311 Display a setting interface, where the setting interface is used to set the corresponding relationship between the behavior scene and the user interface.
- Displaying the setting interface may include: acquiring the user interfaces of multiple styles currently existing in the wearable electronic device and the multiple behavior scenes recognizable by the preset model, and displaying a setting interface containing a plurality of first options and a plurality of second options, where the first options correspond one-to-one to the user interfaces of the multiple styles and the second options correspond one-to-one to the multiple behavior scenes.
- the wearable electronic device can pre-store user interfaces of various styles, together with the behavior scenarios recognizable by the preset model, and display the setting interface according to these user interfaces and behavior scenarios.
- the setting interface includes a plurality of first options and a plurality of second options, so that the user can associate the behavior scene with the user interface through the first option and the second option.
- Step S312 According to the setting operation detected in the setting interface, the corresponding relationship between the behavior scene and the user interface is set, and the corresponding relationship is stored.
- the wearable electronic device can detect the setting operation in the setting interface, set the corresponding relationship between the behavior scene and the user interface according to that operation, and then store the corresponding relationship, so that the stored correspondence can be used to determine the user interface when switching.
- the wearable electronic device may also first display a first setting interface that includes scene options for selecting a behavior scene; after a touch operation on the scene option of a behavior scene is detected, a second setting interface is displayed.
- the second setting interface includes interface options for selecting a user interface. After a touch operation on an interface option is detected, the behavior scene of the operated scene option is associated with the user interface of the operated interface option, yielding the corresponding relationship between the behavior scene and the user interface.
- step S310 may also include: receiving the corresponding relationship between the behavior scene and the user interface from a server, where the corresponding relationship is generated by a mobile terminal, based on a detected operation associating a behavior scene with a user interface, and then sent to the server; the mobile terminal is associated with the wearable electronic device.
- it is understandable that in this way the user can associate behavior scenes with user interfaces through a mobile phone, tablet, or other device associated with the wearable electronic device, generating the correspondence between them.
- since the screen of the wearable electronic device is small, this implementation makes it easier for the user to associate the behavior scene with the user interface.
- Step S320 Obtain the behavior data collected by the sensor.
- Step S330 Perform feature extraction on the behavior data to obtain behavior features.
- Step S340 Input the behavior feature into a trained preset model to obtain the current behavior scene in which the wearable electronic device is located.
- the preset model is pre-trained to output the recognition result of the behavior scene according to the input behavior feature.
- steps S320 to S340 can refer to the content of the foregoing embodiment, and details are not described herein again.
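Steps S320 to S340 can be sketched end to end as below. The choice of features (mean and range of the raw samples) and the stub classifier are illustrative assumptions; the application itself leaves the feature extraction and model unspecified at this level.

```python
def extract_features(samples):
    """Toy feature extraction (step S330): reduce raw sensor samples
    to a (mean, range) feature tuple."""
    mean = sum(samples) / len(samples)
    return (mean, max(samples) - min(samples))

def recognize_scene(features, model):
    """Step S340: 'model' stands in for the trained preset model — any
    callable mapping a feature tuple to a behavior-scene label."""
    return model(features)
```

A usage sketch: raw accelerometer magnitudes are reduced to features, and a stub model thresholds the range to label the scene.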
- the wearable electronic device can also add behavior scenarios based on user operations. Therefore, the interface switching method may further include: acquiring scene-addition data, which includes a new behavior scene and its corresponding behavior characteristics, and updating the preset model according to the new behavior scene and its corresponding behavior characteristics. It is understandable that the wearable electronic device can acquire a new behavior scene and the behavior characteristics corresponding to it, then input the behavior characteristics labeled with that scene into the preset model for training, thereby updating the preset model.
- the interface switching method may further include: according to the detected update operation, associating the new behavior scene with any one of the multiple styles of user interface and then updating the corresponding relationship. It is understandable that by updating the correspondence between the behavior scene and the user interface, the wearable electronic device can switch to the user interface corresponding to the new behavior scene whenever it recognizes that scene.
- Step S350 Switch the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene.
- the wearable electronic device may determine the user interface corresponding to the current behavior scene according to the corresponding relationship obtained in step S310, and switch the displayed current interface to the user interface corresponding to the current behavior scene.
- in the interface switching method provided by this embodiment of the present application, the corresponding relationship between the behavior scene and the user interface is obtained in advance; behavior data is then acquired, behavior characteristics are extracted, and the behavior characteristics are input into the trained preset model to obtain the current behavior scene.
- the user interface corresponding to the current behavior scene is determined according to the corresponding relationship between the behavior scene and the user interface, and the displayed current interface is switched to that user interface, realizing automatic switching of the user interface.
- the corresponding relationship between the behavior scene and the user interface can be freely set by the user to meet the needs of different users.
- FIG. 8 shows a schematic flowchart of an interface switching method provided by still another embodiment of the present application.
- This interface switching method can be applied to the above-mentioned wearable electronic device.
- the wearable electronic device includes a sensor for collecting behavior data.
- the process shown in FIG. 8 will be described in detail below.
- the interface switching method may specifically include the following steps:
- Step S410 Obtain a training data set, where the training data set includes behavior features of samples marked with behavior scenes.
- the embodiments of the present application also include a training method for the preset model. It is worth noting that training of the preset model may be performed in advance based on the obtained training data set, and each subsequent behavior scene recognition can then use the already trained preset model, instead of training the model every time a behavior scene is recognized.
- behavior data in different behavior scenarios may be collected, the behavior characteristics corresponding to the behavior data may be extracted as sample behavior characteristics, and each sample behavior characteristic may be labeled with its behavior scene. In this way, sample behavior characteristics corresponding to multiple behavior scenarios can be obtained.
- the sample behavior feature is the input sample used for training
- the labeled behavior scene is the output sample used for training.
- Each set of training data may include one input sample and one output sample.
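The pairing of input and output samples described above can be written out concretely. The feature values and labels below are made-up illustrations, not data from the application.

```python
# Each group of training data pairs one input sample (the sample behavior
# feature) with one output sample (its annotated behavior scene).
training_set = [
    {"features": (1.2, 0.8), "scene": "running"},
    {"features": (0.0, 0.1), "scene": "sleeping"},
    {"features": (0.4, 0.3), "scene": "walking"},
]

inputs = [group["features"] for group in training_set]   # input samples
outputs = [group["scene"] for group in training_set]     # output samples
```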
- Step S420 Input the training data set into a neural network, train the neural network to obtain the trained preset model, and the preset model can determine the behavior scene corresponding to the behavior feature according to the behavior feature.
- the training data set may be input to the neural network, and the network trained on it to obtain the preset model.
- the neural network may be a deep neural network, which is not limited here.
- the following describes the training of the initial model based on the training data set.
- the behavior characteristics of samples in a set of data in the training data set are used as input samples of the neural network, and the behavior scenes marked in the set of data can be used as output samples of the neural network.
- the neurons in the input layer are fully connected with the neurons in the hidden layer
- the neurons in the hidden layer are fully connected with the neurons in the output layer, which can effectively extract potential features of different granularities.
- the number of hidden layers may be more than one, so as to better fit nonlinear relationships and make the trained preset model more accurate. It is understandable that the training process of the preset model may be completed by the wearable electronic device itself, or it may be completed elsewhere.
- in the latter case the wearable electronic device serves only as a direct or indirect user of the model; that is, the wearable electronic device can send the acquired behavior characteristics to the server storing the preset model and obtain the recognition result of the behavior scene from the server.
- the preset model obtained by training may be stored locally in the wearable electronic device, or it may be stored in a server communicatively connected with the wearable electronic device, which can reduce the storage space occupied on the wearable electronic device and improve its operating efficiency.
- the preset model may also be retrained and updated, periodically or from time to time, as new training data is obtained.
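The train-once, reuse-many-times pattern described above can be illustrated with a self-contained sketch. The application specifies a neural network; as a hedged stand-in that needs no external libraries, the classifier below uses nearest-centroid matching — the structure (fit on labeled samples once, then call the trained predictor for each recognition) is what the sketch is meant to show, not the model family.

```python
def train_model(inputs, outputs):
    """Fit on labeled feature tuples once (stand-in for neural-network
    training); returns a predictor reused for every later recognition."""
    sums, counts = {}, {}
    for x, y in zip(inputs, outputs):
        s = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            s[i] += v
        counts[y] = counts.get(y, 0) + 1
    # One centroid per behavior scene.
    centroids = {y: [v / counts[y] for v in s] for y, s in sums.items()}

    def predict(x):
        # Return the scene whose centroid is nearest to the feature tuple.
        return min(centroids,
                   key=lambda y: sum((a - b) ** 2 for a, b in zip(x, centroids[y])))
    return predict
```

Training happens in advance; each subsequent recognition is just a call to the returned `predict`, mirroring the text's point that the model is not retrained per recognition.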
- Step S430 Obtain the behavior data collected by the sensor.
- Step S440 Perform feature extraction on the behavior data to obtain behavior features.
- Step S450 Input the behavior feature into a pre-trained preset model to obtain the current behavior scene in which the wearable electronic device is located.
- the preset model is pre-trained to output the recognition result of the behavior scene according to the input behavior feature.
- Step S460 Switch the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene.
- the interface switching method may further include: determining whether a user interface corresponding to the current behavior scene exists in the wearable electronic device; if such a user interface exists, switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene; if it does not exist, acquiring the usage frequency of each user interface among all user interfaces existing in the wearable electronic device, and, according to those usage frequencies, switching the current interface displayed by the wearable electronic device to the most frequently used user interface.
- if the wearable electronic device does not store a user interface corresponding to the identified current behavior scene, it cannot switch the displayed current interface to a user interface corresponding to that scene.
- the current interface displayed by the wearable electronic device can be switched to the user interface with the highest frequency of use, so that the switched user interface can meet the needs of the user as much as possible.
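The fallback logic just described fits in a few lines. The interface names and counts are illustrative assumptions.

```python
def choose_interface(scene, scene_to_ui, usage_counts):
    """Return the interface configured for the recognized scene; if none
    is stored, fall back to the most frequently used interface."""
    if scene in scene_to_ui:
        return scene_to_ui[scene]
    # No correspondence stored: pick the interface with the highest usage.
    return max(usage_counts, key=usage_counts.get)
```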
- the interface switching method provided by this embodiment of the application includes a training method for the preset model.
- the initial model is trained with sample behavior characteristics labeled with behavior scenes, so as to obtain the trained preset model.
- the preset model can then output the recognition result of the behavior scene according to the behavior characteristics corresponding to the collected behavior data.
- the wearable electronic device obtains the behavior data, extracts the behavior characteristics, inputs the behavior characteristics into the trained preset model to obtain the current behavior scene, and switches the displayed current interface to the user interface corresponding to the current behavior scene, realizing automatic switching of the user interface, reducing user operations, and improving the user experience.
- FIG. 9 shows a structural block diagram of an interface switching apparatus 400 provided by an embodiment of the present application.
- the interface switching device 400 is applied to the above-mentioned wearable electronic device, and the wearable electronic device includes a sensor for collecting behavior data.
- the interface switching device 400 includes: a data acquisition module 410, a feature acquisition module 420, a scene recognition module 430, and an interface switching module 440.
- the data acquisition module 410 is used to acquire the behavior data collected by the sensor; the feature acquisition module 420 is used to perform feature extraction on the behavior data to obtain behavior features; the scene recognition module 430 is used to input the behavior features into a trained preset model to obtain the current behavior scene in which the wearable electronic device is located, the preset model being pre-trained to output the recognition result of the behavior scene according to the input behavior features; and the interface switching module 440 is configured to switch the current interface displayed by the wearable electronic device to a user interface corresponding to the current behavior scene.
- the scene recognition module 430 includes a feature processing unit and a feature input unit.
- the feature processing unit is used to preprocess the behavior feature to obtain the preprocessed behavior feature;
- the feature input unit is used to input the preprocessed behavior feature into a trained preset model.
- the feature processing unit may be specifically used to: perform feature cleaning and feature mining on the behavior feature.
- the interface switching module 440 may also be used to, after switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene, switch the user interface displayed by the wearable electronic device to the target user interface corresponding to a switching operation when such an operation is detected.
- the interface switching device 400 may further include: a model correction module.
- the model correction module is configured to, if the switching operation is detected within a preset period of time and the target user interface corresponds to a target behavior scene, label the behavior characteristics with the target behavior scene, input them into the preset model, and perform correction training on the preset model.
- the interface switching device 400 may further include: a trajectory acquisition module and an operation acquisition module.
- the trajectory acquisition module is used to acquire the shaking trajectory of the wearable electronic device; the operation acquisition module is used to determine that a switching operation is detected if the shaking trajectory meets a preset trajectory condition.
- the interface switching module 440 switching the user interface displayed by the wearable electronic device to the target user interface corresponding to the switching operation includes: acquiring the target user interface corresponding to the shaking trajectory, and switching the displayed user interface to that target user interface.
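The trajectory check and target lookup can be sketched as one function. Representing a shake trajectory as a short sequence of sampled values, and the tolerance threshold, are assumptions made for illustration.

```python
def detect_switch(trajectory, presets, tolerance=0.2):
    """Compare a shake trajectory (a sequence of sampled values) against
    each preset trajectory; return the corresponding target interface if
    one matches within the tolerance, or None if no switch is detected."""
    for target_ui, preset in presets.items():
        if len(trajectory) == len(preset) and all(
                abs(a - b) <= tolerance for a, b in zip(trajectory, preset)):
            return target_ui
    return None
```

Each preset trajectory maps to its own target interface, so a different shake gesture can select a different interface.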
- the interface switching device 400 may further include: a corresponding relationship acquisition module.
- the corresponding relationship acquisition module is configured to acquire the corresponding relationship between the behavior scene and the user interface before the current interface displayed by the wearable electronic device is switched to the user interface corresponding to the current behavior scene.
- the interface switching module 440 may include: an interface determining unit, configured to determine the user interface corresponding to the current behavior scene according to the corresponding relationship; and a switching execution unit, configured to switch the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene.
- the corresponding relationship acquisition module may include: an interface display unit, configured to display a setting interface used to set the corresponding relationship between the behavior scene and the user interface; and a corresponding relationship setting unit, configured to set the corresponding relationship between the behavior scene and the user interface based on the setting operation detected in the setting interface, and to store the corresponding relationship.
- the interface display unit may be specifically configured to: obtain the multiple styles of user interface currently existing in the wearable electronic device and the multiple behavior scenarios recognizable by the preset model, and display a setting interface that includes a plurality of first options and a plurality of second options, where the first options correspond one-to-one to the multiple styles of user interface and the second options correspond one-to-one to the multiple behavior scenarios.
- the interface switching device 400 may further include: a data acquisition module and a model update module.
- the data acquisition module is used to acquire scene-addition data, which includes a new behavior scene and its corresponding behavior characteristics; the model update module is used to update the preset model according to the new behavior scene and its corresponding behavior characteristics.
- the interface switching device 400 may further include: a correspondence update module, configured to, after the preset model is updated according to the new behavior scene and its corresponding behavior characteristics, associate the new behavior scene with any one of the multiple styles of user interface according to a detected update operation, and then update the corresponding relationship.
- the corresponding relationship acquisition module may be specifically configured to: receive the corresponding relationship between the behavior scene and the user interface from a server, where the corresponding relationship is generated by a mobile terminal, based on a detected operation associating a behavior scene with a user interface, and then sent to the server; the mobile terminal is associated with the wearable electronic device.
- the interface switching device 400 may further include: a training data acquisition module for acquiring a training data set, the training data set including sample behavior features labeled with behavior scenes; and a model training module for inputting the training data set into a neural network and training the neural network to obtain the trained preset model, which can determine the behavior scene corresponding to a behavior feature according to that feature.
- the interface switching module 440 may be specifically configured to: determine whether there is a user interface corresponding to the current behavior scene in the wearable electronic device; if there is a user interface corresponding to the current behavior scene, change the The current interface displayed by the wearable electronic device is switched to the user interface corresponding to the current behavior scene.
- the interface switching module 440 may also be used to: if no user interface corresponding to the current behavior scene exists, obtain the usage frequency of each user interface among all the user interfaces existing in the wearable electronic device, and, according to those usage frequencies, switch the current interface displayed by the wearable electronic device to the most frequently used user interface.
- the behavior data includes at least one of positioning data, exercise data, biometric data, environmental data, and time data;
- the user interface includes a dial interface, a home screen interface, a lock screen interface, or an application interface.
- the coupling between the modules may be electrical, mechanical or other forms of coupling.
- each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
- the above-mentioned integrated modules can be implemented in the form of hardware or software function modules.
- the preset model is pre-trained to output the recognition result of the behavior scene according to the input behavior characteristics, and the current interface displayed by the wearable electronic device is then switched to the user interface corresponding to the current behavior scene.
- in this way, the current behavior scene in which the wearable electronic device is located is automatically recognized from the behavior characteristics corresponding to the behavior data collected by the sensor, and the displayed current interface is switched to the user interface corresponding to that scene, reducing user operations in switching user interfaces and improving the user experience.
- the wearable electronic device 100 may be an electronic device capable of running application programs, such as a smart watch, a smart bracelet, or smart glasses.
- the wearable electronic device 100 in this application may include one or more of the following components: a processor 110, a memory 120, and one or more application programs, where one or more application programs may be stored in the memory 120 and configured to Executed by one or more processors 110, one or more programs are configured to execute the methods described in the foregoing method embodiments.
- the processor 110 may include one or more processing cores.
- the processor 110 uses various interfaces and lines to connect the various parts of the entire wearable electronic device 100, and performs various functions of the wearable electronic device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and calling data stored in the memory 120.
- the processor 110 may be implemented in at least one hardware form of digital signal processing (DSP), field-programmable gate array (FPGA), or programmable logic array (PLA).
- the processor 110 may be integrated with one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a modem, and the like.
- the CPU mainly processes the operating system, user interface, and application programs; the GPU is used for rendering and drawing of display content; the modem is used for processing wireless communication. It can be understood that the above-mentioned modem may not be integrated into the processor 110, but may be implemented by a communication chip alone.
- the memory 120 may include random access memory (RAM) or read-only memory (ROM).
- the memory 120 may be used to store instructions, programs, codes, code sets or instruction sets.
- the memory 120 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing the operating system, instructions for implementing at least one function (such as a touch function, a sound playback function, or an image playback function), and instructions for implementing the various method embodiments described above.
- the data storage area can store data created by the terminal 100 during use (such as a phone book, audio and video data, and chat records).
- FIG. 11 shows a structural block diagram of a computer-readable storage medium provided by an embodiment of the present application.
- the computer-readable medium 800 stores program code, and the program code can be invoked by a processor to execute the method described in the foregoing method embodiment.
- the computer-readable storage medium 800 may be an electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read Only Memory), EPROM, hard disk, or ROM.
- the computer-readable storage medium 800 includes a non-transitory computer-readable storage medium.
- the computer-readable storage medium 800 has storage space for the program code 810 for executing any method steps in the above-mentioned methods. These program codes can be read from or written into one or more computer program products.
- the program code 810 may, for example, be compressed in a suitable form.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Claims (20)
- 1. An interface switching method, applied to a wearable electronic device, the wearable electronic device comprising a sensor for collecting behavior data, the method comprising: acquiring the behavior data collected by the sensor; performing feature extraction on the behavior data to obtain behavior features; inputting the behavior features into a trained preset model to obtain the current behavior scene in which the wearable electronic device is located, the preset model being pre-trained to output a recognition result of the behavior scene according to the input behavior features; and switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene.
- 2. The method according to claim 1, wherein inputting the behavior features into the trained preset model comprises: preprocessing the behavior features to obtain preprocessed behavior features; and inputting the preprocessed behavior features into the trained preset model.
- 3. The method according to claim 2, wherein preprocessing the behavior features comprises: performing feature cleaning and feature mining on the behavior features.
- 4. The method according to any one of claims 1-3, wherein after switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene, the method further comprises: when a switching operation is detected, switching the user interface displayed by the wearable electronic device to a target user interface corresponding to the switching operation.
- 5. The method according to claim 4, wherein after switching the user interface displayed by the wearable electronic device to the target user interface corresponding to a target behavior scene, the method further comprises: if the switching operation is detected within a preset time period and the target user interface corresponds to the target behavior scene, labeling the behavior features with the target behavior scene, inputting them into the preset model, and performing correction training on the preset model.
- 6. The method according to claim 4 or 5, wherein before switching the user interface displayed by the wearable electronic device to the target user interface corresponding to the switching operation when the switching operation is detected, the method further comprises: acquiring a shaking trajectory of the wearable electronic device; and if the shaking trajectory satisfies a preset trajectory condition, determining that the switching operation is detected.
- 7. The method according to claim 6, wherein switching the user interface displayed by the wearable electronic device to the target user interface corresponding to the switching operation comprises: acquiring the target user interface corresponding to the shaking trajectory; and switching the user interface displayed by the wearable electronic device to the target user interface.
- 8. The method according to any one of claims 1-7, wherein before switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene, the method further comprises: acquiring a correspondence between behavior scenes and user interfaces; and switching the current interface displayed by the wearable electronic device to the user interface corresponding to the behavior scene comprises: determining, according to the correspondence, the user interface corresponding to the current behavior scene; and switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene.
- 9. The method according to claim 8, wherein acquiring the correspondence between behavior scenes and user interfaces comprises: displaying a setting interface used to set the correspondence between behavior scenes and user interfaces; and setting the correspondence between behavior scenes and user interfaces according to a setting operation detected in the setting interface, and storing the correspondence.
- 10. The method according to claim 9, wherein displaying the setting interface comprises: acquiring multiple styles of user interface currently existing in the wearable electronic device and multiple behavior scenes recognizable by the preset model; and displaying a setting interface comprising a plurality of first options and a plurality of second options, the first options corresponding one-to-one to the multiple styles of user interface and the second options corresponding one-to-one to the multiple behavior scenes.
- 11. The method according to claim 10, further comprising: acquiring scene-addition data, the scene-addition data comprising a new behavior scene and its corresponding behavior features; and updating the preset model according to the new behavior scene and its corresponding behavior features.
- 12. The method according to claim 11, wherein after updating the preset model according to the new behavior scene and its corresponding behavior features, the method further comprises: associating, according to a detected update operation, the new behavior scene with any one of the multiple styles of user interface, and then updating the correspondence.
- 13. The method according to claim 8, wherein acquiring the correspondence between behavior scenes and user interfaces comprises: receiving from a server the correspondence between behavior scenes and user interfaces, the correspondence being generated by a mobile terminal according to a detected operation associating a behavior scene with a user interface and then sent to the server, wherein the mobile terminal is associated with the wearable electronic device.
- 14. The method according to any one of claims 1-13, wherein the preset model is obtained by training through the following steps: acquiring a training data set comprising sample behavior features labeled with behavior scenes; and inputting the training data set into a neural network and training the neural network to obtain the trained preset model, the preset model being able to determine, from a behavior feature, the behavior scene corresponding to that feature.
- 15. The method according to any one of claims 1-14, wherein before switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene, the method further comprises: determining whether a user interface corresponding to the current behavior scene exists in the wearable electronic device; and if a user interface corresponding to the current behavior scene exists, switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene.
- 16. The method according to claim 15, wherein after determining whether a user interface corresponding to the current behavior scene exists in the wearable electronic device, the method further comprises: if no user interface corresponding to the current behavior scene exists, acquiring the usage frequency of each user interface among all user interfaces existing in the wearable electronic device; and switching, according to the usage frequency of each user interface, the current interface displayed by the wearable electronic device to the most frequently used user interface.
- 17. The method according to any one of claims 1-16, wherein the behavior data comprises at least one of positioning data, exercise data, biometric data, environmental data, and time data; and the user interface comprises a dial interface, a home screen interface, a lock screen interface, or an application interface.
- 18. An interface switching apparatus, applied to a wearable electronic device, the wearable electronic device comprising a sensor for collecting behavior data, the apparatus comprising: a data acquisition module, a feature acquisition module, a scene recognition module, and an interface switching module, wherein the data acquisition module is used to acquire the behavior data collected by the sensor; the feature acquisition module is used to perform feature extraction on the behavior data to obtain behavior features; the scene recognition module is used to input the behavior features into a trained preset model to obtain the current behavior scene in which the wearable electronic device is located, the preset model being pre-trained to output a recognition result of the behavior scene according to the input behavior features; and the interface switching module is used to switch the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene.
- 19. A wearable electronic device, comprising: one or more processors; a memory; and one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs being configured to execute the method according to any one of claims 1-17.
- 20. A computer-readable storage medium, wherein program code is stored in the computer-readable storage medium, the program code being callable by a processor to execute the method according to any one of claims 1-17.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201980099239.XA CN114223139B (zh) | 2019-10-29 | 2019-10-29 | 界面切换方法、装置、可穿戴电子设备及存储介质 |
PCT/CN2019/114076 WO2021081768A1 (zh) | 2019-10-29 | 2019-10-29 | 界面切换方法、装置、可穿戴电子设备及存储介质 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2019/114076 WO2021081768A1 (zh) | 2019-10-29 | 2019-10-29 | 界面切换方法、装置、可穿戴电子设备及存储介质 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021081768A1 true WO2021081768A1 (zh) | 2021-05-06 |
Family
ID=75714716
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/114076 WO2021081768A1 (zh) | 2019-10-29 | 2019-10-29 | 界面切换方法、装置、可穿戴电子设备及存储介质 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN114223139B (zh) |
WO (1) | WO2021081768A1 (zh) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113960911A (zh) * | 2021-11-02 | 2022-01-21 | 珠海读书郎软件科技有限公司 | 一种运动手表的表盘自动生成和切换的系统及方法 |
CN116173484A (zh) * | 2023-03-03 | 2023-05-30 | 乐渊网络科技(上海)有限公司 | 运动数据的处理方法、装置及电子设备 |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114675740A (zh) * | 2022-03-29 | 2022-06-28 | 西安歌尔泰克电子科技有限公司 | 一种腕带设备的表盘切换方法、装置、系统及存储介质 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015037804A1 (en) * | 2013-09-11 | 2015-03-19 | Lg Electronics Inc. | Wearable computing device and user interface method |
CN106598222A (zh) * | 2016-11-14 | 2017-04-26 | 上海斐讯数据通信技术有限公司 | 一种场景模式的切换方法和系统 |
CN107422944A (zh) * | 2017-06-09 | 2017-12-01 | 广东乐心医疗电子股份有限公司 | 一种自动调整菜单显示模式的方法与装置以及可穿戴设备 |
CN108703760A (zh) * | 2018-06-15 | 2018-10-26 | 安徽中科智链信息科技有限公司 | 基于九轴传感器的人体运动姿态识别系统及方法 |
CN108764059A (zh) * | 2018-05-04 | 2018-11-06 | 南京邮电大学 | 一种基于神经网络的人体行为识别方法及系统 |
CN108831526A (zh) * | 2018-05-21 | 2018-11-16 | 四川斐讯信息技术有限公司 | 一种具有识别功能的智能可穿戴运动设备及其识别方法 |
CN110010224A (zh) * | 2019-03-01 | 2019-07-12 | 出门问问信息科技有限公司 | 用户运动数据处理方法、装置、可穿戴设备及存储介质 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107817891B (zh) * | 2017-11-13 | 2020-01-14 | Oppo广东移动通信有限公司 | 屏幕控制方法、装置、设备及存储介质 |
CN110134316B (zh) * | 2019-04-17 | 2021-12-24 | 华为技术有限公司 | 模型训练方法、情绪识别方法及相关装置和设备 |
CN110334497B (zh) * | 2019-06-28 | 2021-10-26 | Oppo广东移动通信有限公司 | 显示界面的切换方法和穿戴式电子设备、存储介质 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113960911A (zh) * | 2021-11-02 | 2022-01-21 | 珠海读书郎软件科技有限公司 | System and method for automatic watch face generation and switching on a sports watch |
CN113960911B (zh) * | 2021-11-02 | 2022-09-20 | 珠海读书郎软件科技有限公司 | System and method for automatic watch face generation and switching on a sports watch |
CN116173484A (zh) * | 2023-03-03 | 2023-05-30 | 乐渊网络科技(上海)有限公司 | Motion data processing method and apparatus, and electronic device |
Also Published As
Publication number | Publication date |
---|---|
CN114223139A (zh) | 2022-03-22 |
CN114223139B (zh) | 2023-11-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021081768A1 (zh) | Interface switching method and apparatus, wearable electronic device and storage medium | |
CN110765939B (zh) | Identity recognition method and apparatus, mobile terminal and storage medium | |
CN110135497B (zh) | Model training method, and method and apparatus for facial action unit intensity estimation | |
CN104065928A (zh) | Behavior pattern statistics apparatus and method | |
CN104616002A (zh) | Facial recognition device for age-group determination | |
CN111144344B (zh) | Method, apparatus, device and storage medium for determining a person's age | |
CN103942243A (zh) | Display device and method of providing consumer-built information using the same | |
CN111027507A (zh) | Training dataset generation method and apparatus based on video data recognition | |
CN111524513A (zh) | Wearable device and voice transmission control method, apparatus and medium therefor | |
CN113128368A (zh) | Method, apparatus and system for detecting person interaction relationships | |
CN111967770A (zh) | Big-data-based questionnaire data processing method, apparatus and storage medium | |
JP2013157984A (ja) | UI providing method and video receiving apparatus applying the same | |
CN113111782A (zh) | Video surveillance method and apparatus based on salient object detection | |
CN114513694B (zh) | Score determination method and apparatus, electronic device and storage medium | |
CN112580472A (zh) | Fast and lightweight face recognition method, apparatus, machine-readable medium and device | |
WO2021147473A1 (zh) | Model training method, content generation method, and related apparatus | |
CN106155707B (zh) | Information processing method and electronic device | |
CN112906599A (zh) | Gait-based person identification method and apparatus, and electronic device | |
CN110716632A (zh) | Battery power management method and intelligent terminal | |
CN107832690B (zh) | Face recognition method and related product | |
CN114005174A (zh) | Working state determination method and apparatus, electronic device and storage medium | |
US20220101871A1 (en) | Live streaming control method and apparatus, live streaming device, and storage medium |
CN111797127B (zh) | Time-series data segmentation method and apparatus, storage medium and electronic device | |
CN110556099B (zh) | Command word control method and device | |
CN112580543A (zh) | Behavior recognition method, system and apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19950596 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19950596 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 18/10/2022) |
|