CN114935993A - Graphical interface interaction method, wearable device and computer-readable storage medium - Google Patents


Info

Publication number
CN114935993A
Authority
CN
China
Prior art keywords
monitoring
wearable device
touch input
graphical interface
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210532013.8A
Other languages
Chinese (zh)
Inventor
何岸
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
DO Technology Co ltd
Original Assignee
DO Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DO Technology Co ltd filed Critical DO Technology Co ltd
Priority to CN202210532013.8A
Publication of CN114935993A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 Sound input; Sound output
    • G06F 3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention provides a graphical interface interaction method, a wearable device, and a computer-readable storage medium. The graphical interface interaction method comprises the following steps: detecting a touch input to a monitoring application icon in a wearable device graphical interface; in response to a first touch input to the monitoring application icon, executing a monitoring application program associated with the monitoring application icon; and in response to a second touch input to the monitoring application icon, different from the first touch input, displaying an information window in the graphical interface, the information window including a first visual element provided by the monitoring application to represent the monitoring information. With this method, monitoring information can be viewed directly on the icon display interface of the wearable device without launching the monitoring application. This provides a more convenient and efficient interaction mode, so that the user can quickly learn the monitoring information of the wearable device while the device's energy consumption is reduced.

Description

Graphical interface interaction method, wearable device and computer-readable storage medium
Technical Field
The present invention relates to a graphical user interface of a wearable device, and more particularly, to a graphical interface interaction method, a wearable device, and a computer-readable storage medium.
Background
A wearable device (e.g., a smart watch) may provide a plurality of graphical interfaces that include time elements, from which the user selects, by sliding left or right on the screen or operating a button, the home interface presented after the device is unlocked. For example, the main interface of a smart watch is presented when the user raises the wrist to wake the screen, and it may also be displayed as an always-on interface.
A wearable device is generally equipped with a plurality of sensors that give it monitoring functions for human physiological information, human motion information, and environmental information. For example, an acceleration sensor, a gyroscope, and an altimeter are provided to monitor human motion information; a photoplethysmography sensor is provided to acquire blood-related information; ECG (electrocardiogram) electrodes are provided to acquire an electrocardiogram; a microphone is provided to acquire respiratory or environmental noise information; and a temperature sensor is provided to acquire body temperature or ambient temperature. These monitoring functions are typically implemented as applications resident in the memory of the wearable device, each providing a monitoring application icon in the main interface of the wearable device for the user to select.
In the existing way of viewing monitoring information, the user usually must launch the related application and then search for the monitoring information in the graphical interface that the application presents. Because the information can only be viewed after the application starts, the user needs many taps or key presses; the interaction is complex and time-consuming and wastes the wearable device's energy.
Disclosure of Invention
Embodiments of the invention aim to provide a graphical interface interaction method, a wearable device, and a computer-readable storage medium that offer a more convenient and efficient interaction mode, so that a user can quickly learn the monitoring information of the wearable device while the device's energy consumption is reduced.
In a first aspect, an embodiment of the present application provides a graphical interface interaction method, including:
detecting touch input of a monitoring application icon in a wearable device graphical interface;
in response to a first touch input to the monitoring application icon, executing a monitoring application associated with the monitoring application icon;
in response to a second touch input to the monitoring application icon, an information window is displayed in the graphical interface, the information window including a visual element provided by the monitoring application to represent the monitoring information, the second touch input being different from the first touch input.
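The dispatch described by the three steps above can be sketched as follows. This is a minimal illustration under assumed names (`Icon`, `handle_icon_touch`, and the returned strings are hypothetical), not the patented implementation:

```python
class Icon:
    """Illustrative stand-in for a monitoring application icon."""
    def __init__(self, app, latest_value):
        self.app = app                    # associated monitoring application
        self.latest_value = latest_value  # most recent monitored value

def handle_icon_touch(touch_kind, icon):
    """Dispatch a touch on a monitoring-application icon:
    a 'first' input launches the application, a 'second' (different)
    input shows an information window with the monitored value."""
    if touch_kind == "first":
        return "launched:" + icon.app
    if touch_kind == "second":
        return "window:{}={}".format(icon.app, icon.latest_value)
    raise ValueError("unknown touch kind: " + touch_kind)
```

For example, `handle_icon_touch("second", Icon("heart_rate", 72))` would surface the latest heart rate without launching the application.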
In one possible implementation of the first aspect, the method further includes: closing the information window when its display reaches a preset duration, or in response to a user input for interrupting the display of the information window.
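The close condition just described (preset display time elapsed, or an interrupting user input) can be expressed as a small predicate. A hedged sketch; the function name and parameters are invented for illustration:

```python
def window_should_close(elapsed_s, preset_s, interrupt_input):
    """Return True when the information window has been displayed for at
    least the preset duration, or the user gave an interrupting input
    (e.g. a touch outside the window or a key press)."""
    return elapsed_s >= preset_s or interrupt_input
```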
In one possible implementation of the first aspect, the user input for interrupting the display of the information window includes: a touch input to an area outside the information window, or a user input to a key of the wearable device.
In one possible implementation manner of the first aspect, after responding to the second touch input to the monitoring application icon, the method further includes:
the method comprises the steps of obtaining an audio output mode of the wearable device, broadcasting monitoring information in an audio mode if the wearable device is not in a mute mode, and providing tactile feedback based on a reminding state associated with the monitoring information if the wearable device is in the mute mode.
In one possible implementation of the first aspect, providing tactile feedback based on the reminder state associated with the monitoring information includes: identifying the reminder state associated with the monitoring information and providing different tactile feedback for different reminder states.
In one possible implementation of the first aspect, the method further includes: in response to the information window being closed, terminating the audio broadcast of the monitoring information or terminating the tactile feedback.
In one possible implementation manner of the first aspect, displaying an information window in a graphical interface includes: an information window is displayed centrally in the graphical interface.
In one possible implementation manner of the first aspect, displaying an information window in a graphical interface includes: an information window is displayed in the graphical interface proximate to the location of the monitoring application icon.
In one possible implementation manner of the first aspect, after detecting the touch input to the monitoring application icon in the graphical interface of the wearable device, the method further includes:
the touch input is recognized as the first touch input or the second touch input based on a touch intensity or a touch duration of the touch input.
In one possible implementation manner of the first aspect, the method further includes: in response to a third touch input to the graphical interface, the monitoring application icon is distinctively displayed, the third touch input being different from the first touch input and the second touch input.
In a second aspect, embodiments of the present application provide a wearable device, including a display, a processor, and a memory configured to store one or more programs for execution by the processor, the one or more programs including instructions for performing the above-described method.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the above method.
In the embodiments of the application, a touch input to a monitoring application icon in the graphical interface of the wearable device is detected; in response to a first touch input to the monitoring application icon, a monitoring application program associated with the icon is executed; and in response to a second touch input to the icon, different from the first touch input, an information window is displayed in the graphical interface, the window including a first visual element provided by the monitoring application to represent the monitoring information. Touching the monitoring application icon in the first touch mode enters the monitoring application, while touching the icon in the second, different touch mode displays the application's monitoring information as a window in the same graphical interface. In this way the monitoring information can be viewed directly on the icon display interface of the wearable device without launching the monitoring application, providing a more convenient and efficient interaction mode that lets the user quickly learn the monitoring information of the wearable device and reduces device energy consumption.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed by the embodiments or the prior-art description are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the invention, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a block diagram of a wearable device of a specific implementation of an electronic device of the present invention;
fig. 2A is a schematic diagram illustrating an update of a graphical interface when a wearable device performs a first touch operation according to an embodiment of the present application;
fig. 2B is a schematic diagram illustrating an update of a graphical interface when a wearable device performs a second touch operation according to an embodiment of the present application;
fig. 2C is a schematic diagram illustrating an update of a graphical interface when a wearable device performs a second touch operation according to another embodiment of the present application;
fig. 2D is a schematic diagram illustrating that a graphical interface is updated when a wearable device performs a third touch operation according to an embodiment of the present application;
FIG. 3 is a flowchart of a graphical interface interaction method provided by an embodiment of the present invention;
FIG. 4 is a flow chart of another graphical interface interaction method provided by embodiments of the present invention;
fig. 5 is a flowchart of another graphical interface interaction method provided in an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. The components of the embodiments, as generally described and illustrated in the figures, may be arranged and designed in a wide variety of configurations.
Thus, the following detailed description of the embodiments, as presented in the figures, is not intended to limit the scope of the claimed invention but is merely representative of selected embodiments. All other embodiments obtained by those skilled in the art from the embodiments of the present invention without creative effort fall within the protection scope of the invention.
As described above, it is desirable to provide a graphical interface interaction mode that lets a user quickly access the monitoring information acquired by a wearable device, reducing the time the user wastes on unnecessary interaction and saving device energy.
The basic idea of the invention is to detect a touch input to a monitoring application icon in the graphical interface of a wearable device; execute the monitoring application associated with the icon in response to a first touch input; and, in response to a second touch input different from the first, display an information window in the graphical interface, the window including a first visual element provided by the monitoring application to represent the monitoring information. The user enters the monitoring application by touching its icon in the first touch mode, while touching the icon in the second, different touch mode displays the application's monitoring information as a window in the same graphical interface. The monitoring information can thus be viewed directly on the icon display interface of the wearable device without launching the monitoring application, providing a more convenient and efficient interaction mode that lets the user quickly learn the monitoring information and reduces device energy consumption.
Fig. 1 shows a block diagram of a wearable device, which may include, but is not limited to, a smart watch, a smart bracelet, a smart wristband, a smart ring, and the like. Wearable device 100 may include one or more processors 101, memory 102, communication module 103, sensor module 104, display 105, audio module 106, speaker 107, microphone 108, camera module 109, motor 110, keys 111, indicator 112, battery 113, power management module 114. These components may communicate over one or more communication buses or signal lines.
The processor 101 is a final execution unit of information processing and program operation, and may execute an operating system or an application program stored in the memory 102 to execute various functional applications of the wearable device 100 and data processing. For example, the graphical interface interaction method provided by the embodiment of the invention is executed.
The memory 102 may be used to store computer-executable program code, which includes instructions. The memory 102 may include a program storage area and a data storage area. The storage program area may store an operating system, and application programs (such as a sound playing function and a graphical interface interaction function) required by at least one function. In some embodiments, the storage program area may store monitoring applications including an application for detecting human parameters and/or an application for detecting environmental information. The data storage area can store data monitored in the using process of the wearable device, such as exercise data of each exercise of the user, including step number, stride, pace, distance and the like; such as physiological data of the user including heart rate, blood oxygen, blood glucose concentration, energy expenditure (calories), etc.; such as environmental monitoring data including temperature, humidity, noise, barometric pressure, altitude, location, etc.
The communication module 103 may support the wearable device 100 to communicate with the network and the mobile terminal 200 through a wireless communication technology. The communication module 103 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. The communication module 103 may include one or more of a cellular mobile communication module, a short-range wireless communication module, a wireless internet module, and a location information module.
The sensor module 104 is used to measure a physical quantity or detect an operation state of the wearable device 100. The sensor module 104 may include an acceleration sensor 104A, a gyroscope sensor 104B, an air pressure sensor 104C, a magnetic sensor 104D, a biometric sensor 104E, a proximity sensor 104F, an ambient light sensor 104G, a touch sensor 104H, and the like. The sensor module 104 may also include control circuitry for controlling one or more sensors included in the sensor module 104.
The acceleration sensor 104A can detect the magnitude of the wearable device 100's acceleration in various directions, and can detect the magnitude and direction of gravity when the device is at rest. It can also be used to recognize the posture of the wearable device 100, which serves applications such as landscape/portrait switching and step counting. In one embodiment, the acceleration sensor 104A may be combined with the gyroscope sensor 104B to monitor the user's stride length, cadence, speed, and the like during exercise.
The gyroscope sensor 104B may be used to determine the motion pose of the wearable device 100. In some embodiments, the angular velocity of the wearable device about three axes (i.e., x, y, and z axes) may be determined by the gyroscope sensor 104B.
The air pressure sensor 104C is used to measure air pressure. In some embodiments, the wearable device 100 calculates altitude from the barometric pressure values measured by the air pressure sensor 104C to aid positioning and navigation.
The magnetic sensor 104D includes a Hall sensor, a magnetometer, or the like, and may be used to determine the user's position.
The biometric sensor 104E is used to measure physiological parameters of the user and includes, but is not limited to, a photoplethysmography (PPG) sensor, an ECG sensor, an EMG sensor, a blood glucose sensor, and a temperature sensor. For example, the wearable device 100 may measure the user's heart rate, blood oxygen, and blood pressure via signals from the photoplethysmography sensor and/or the ECG sensor, and identify the user's blood glucose value based on data generated by the blood glucose sensor. In some embodiments, the wearable device 100 may detect whether the user is asleep based on the acceleration sensor 104A and the biometric sensor 104E, identify the user's sleep stage, and identify sleep apnea.
The proximity sensor 104F is used to detect the presence of an object near the wearable device 100 without any physical contact. In some embodiments, the proximity sensor 104F may include a light-emitting diode and a light detector. The light-emitting diode may emit infrared light, and the wearable device 100 uses the light detector to detect light reflected from nearby objects. When reflected light is detected, it can be determined that an object is near the wearable device 100. The wearable device 100 can use the proximity sensor 104F to detect its wearing state.
The ambient light sensor 104G is used to sense ambient light level. In some embodiments, the wearable device may adaptively adjust display screen brightness based on perceived ambient light levels to reduce power consumption.
The touch sensor 104H, also referred to as a "touch device", is used to detect a touch operation on or near it. The touch sensor 104H can be disposed on the display screen 105, with which it forms a touch screen. The touch sensor 104H may be used to detect the touch duration, touch intensity, sliding direction, and so on of the user on the touch screen. Based on the detected duration, intensity, and sliding direction, the processor 101 identifies the user's touch input, such as a tap, a sustained press, or a double tap on a single graphical interface element, or a single-finger swipe, two-finger contact, two-finger swipe, or three-finger swipe on the touch screen.
The display screen 105 is used to display a graphical user interface (GUI) that may include graphics, text, icons, video, and any combination thereof. The display 105 may be a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. When the display screen 105 is a touch display screen, it can capture a touch signal on or over its surface and input the touch signal to the processor 101 as a control signal.
The audio module 106, the speaker 107, and the microphone 108 provide audio functions between the user and the wearable device 100, such as listening to music or making calls; for another example, when the wearable device 100 receives the user's request to view monitoring information, the information is broadcast through the speaker 107. The audio module 106 converts received audio data into an electrical signal and sends it to the speaker 107, which converts it into sound; conversely, the microphone 108 converts sound into an electrical signal and sends it to the audio module 106, which converts it into audio data.
The camera module 109 is used to capture still images or video. The camera module 109 may include an image sensor, an image signal processor (ISP), and a digital signal processor (DSP). The image sensor converts the optical signal into an electrical signal, the image signal processor converts the electrical signal into a digital image signal, and the digital signal processor converts the digital image signal into an image signal in a standard format such as RGB or YUV.
The motor 110 can convert an electrical signal into mechanical vibration, producing a vibration effect. In some embodiments, the motor 110 may be used to generate tactile feedback: when the user's hand is in contact with the screen or the keys of the wearable device, the vibration generated by the motor 110 gives the user a tactile sensation, for example when the user touches an application icon on the display screen, or as an alert when a detected noise value exceeds a preset threshold. The motor 110 is preferably a linear motor and can provide a variety of tactile feedback, for example vibrations of varying counts and amplitudes.
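"Different tactile feedback" from a linear motor typically means varying the pulse count and amplitude. A hedged sketch of such a mapping; the state names and (count, amplitude) pairs are invented for illustration:

```python
def haptic_pattern(reminder_state):
    """Map a reminder state to (pulse_count, amplitude) for a linear
    motor, where amplitude is normalized to 0..1. The mapping below is
    a made-up example, not a pattern specified by the patent."""
    table = {
        "info":    (1, 0.3),  # single gentle pulse
        "warning": (2, 0.6),  # two medium pulses
        "alarm":   (3, 1.0),  # three strong pulses
    }
    return table.get(reminder_state, (1, 0.3))
```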
The keys 111 include a power-on key, a volume key, and the like. The keys 111 may be mechanical keys (physical buttons) or touch keys. The indicator 112 is used for indicating the state of the wearable device 100, such as indicating the charging state, the change of the charge amount, and may also be used for indicating a message, a missed call, a notification, and the like. In some embodiments, the wearable device 100 provides vibratory feedback upon receiving the notification message from the mobile terminal application.
The battery 113 is used to supply power to various components of the wearable device. The power management module 114 is used for managing charging and discharging of the battery, and monitoring parameters such as battery capacity, battery cycle number, battery health (whether leakage occurs, impedance, voltage, current, and temperature). In some embodiments, the power management module 114 may charge the battery in a wired or wireless manner.
It should be understood that in some embodiments, the wearable device 100 may be comprised of one or more of the aforementioned components, and the wearable device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Fig. 2A is a schematic diagram illustrating an update of a graphical interface when a wearable device performs a first touch operation according to an embodiment of the present application; fig. 2B is a schematic diagram illustrating an update of a graphical interface when a wearable device performs a second touch operation according to an embodiment of the present application; fig. 2C is a schematic diagram illustrating an update of a graphical interface when a wearable device performs a second touch operation according to another embodiment of the present application; fig. 2D shows a schematic diagram of updating a graphical interface when a wearable device performs a third touch operation according to an embodiment of the present application.
As shown in fig. 2A, wearable device 200 has a touch display screen 210. In some embodiments, the wearable device 200 may be the wearable device 100 shown in fig. 1, such as a smart watch. The touch display screen 210 can receive the user's touch input and displays a first graphical interface 211 comprising a dial 201, a noise monitoring icon 202, a step-count monitoring icon 203, a heart rate monitoring icon 204, and a temperature monitoring icon 205. The noise monitoring icon 202 is associated with a noise monitoring application, the step-count monitoring icon 203 with a step-count monitoring application, the heart rate monitoring icon 204 with a heart rate monitoring application, and the temperature monitoring icon 205 with a temperature monitoring application. The first graphical interface 211 may also include icons of other monitoring applications, such as a blood oxygen monitoring program, a blood pressure monitoring program, a blood glucose detection program, a humidity detection program, an altitude detection program, a barometric pressure detection program, and the like.
Wearable device 200 also includes keys 220 for receiving user press inputs, and in some embodiments, wearable device 200 may also include a crown for receiving user rotation and press inputs.
As shown in fig. 2A, wearable device 200, in response to first touch input 230 to heart rate monitoring icon 204, executes a heart rate monitoring application associated with heart rate monitoring icon 204, updating first graphical interface 211 to heart rate monitoring graphical interface 212. The heart rate monitoring graphical interface 212 includes a heart rate measurement window 241, a historical heart rate window 242, a maximum heart rate visual element 243, and a minimum heart rate visual element 244. The first touch input 230 is preferably a tap input to the heart rate monitor icon 204, and in some embodiments, the first touch input 230 may also be a long press, a hard tap (a click with a touch intensity that exceeds a preset intensity), a double click, a multiple click, or the like.
In another embodiment, the first touch input 230 may also be replaced by user input via other input mechanisms, such as a press input or a rotation input on the keys 220 or the crown. For example, the heart rate monitoring icon may be brought into control focus by a rotational input on the crown, and the heart rate monitoring application may then be entered by a press input.
As shown in fig. 2B, in response to the second touch input 232 to the heart rate monitoring icon 204, the wearable device 200 updates the first graphical interface 211 to the second graphical interface 213 and displays an information window 250 in the center of the second graphical interface 213. The information window 250 includes a heart rate graphical element 251, a number element 252, and a text element 253 provided by the monitoring application to represent the monitoring information, so as to briefly present the most recent heart rate monitoring information. The information window 250 may include multiple visual elements representing the type of monitoring information, the monitoring value, and the reminder information, but includes at least a visual element representing the monitoring value. The visual elements may be graphics, text, icons, video, or other user interface elements. In fig. 2B, the heart rate graphical element 251 indicates the type of monitoring information, the number element 252 indicates the monitoring value, and the text element 253 serves as the reminder information, explaining the monitoring value to the user.
The second touch input 232 is preferably a long press input. In some embodiments, the second touch input 232 may also be a hard tap (a click with a touch intensity exceeding a preset intensity), a double tap, a multiple click, or the like, as long as it is distinguished from the first touch input 230. Wearable device 200 may identify a touch input as the first touch input 230 or the second touch input 232 based on its touch intensity or touch duration. In another embodiment, the second touch input 232 may also be replaced by user input via other input mechanisms, such as a press input or a rotation input on the keys 220 or the crown. For example, the heart rate monitoring icon may be brought into control focus by a rotational input on the crown, and the information window 250 may be displayed by multiple press inputs.
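The classification of a touch as the first or second input, based on intensity or duration, can be sketched as follows. This is an illustrative example only: the threshold values and function names are assumptions, not part of the patent text, and a real device would read these measurements from its touch controller.

```python
# Hypothetical thresholds; the patent only says the second input is
# distinguished by touch intensity or touch duration.
LONG_PRESS_SECONDS = 0.5    # assumed duration threshold for a long press
HARD_PRESS_INTENSITY = 0.8  # assumed normalized intensity threshold

def classify_touch(duration_s: float, intensity: float) -> str:
    """Return 'first' for a plain tap, 'second' for a long or hard press."""
    if duration_s >= LONG_PRESS_SECONDS or intensity >= HARD_PRESS_INTENSITY:
        return "second"
    return "first"
```

A short tap below both thresholds launches the application; a press that exceeds either threshold opens the information window instead.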
In some embodiments, wearable device 200 may feed back at least part of the monitoring information in the information window 250 to the user in a non-visual manner. For example, the text element 253 in the information window 250 can be announced by voice, or the reminder state associated with the monitoring information can be conveyed through tactile feedback. This provides richer, more convenient, and more efficient interactive feedback, helps the user understand the monitoring information more quickly, improves the enjoyment of interaction, and reduces device energy consumption. In some embodiments, the non-visual manner of feeding back the monitoring information may be selected based on the operating mode of the wearable device 200. For example, when the wearable device 200 is not in silent mode, the monitoring information (e.g., the text element 253) may be announced by voice while tactile feedback simultaneously conveys the reminder state associated with the monitoring information; when the wearable device 200 is in silent mode, only tactile feedback based on the reminder state associated with the monitoring information is provided.
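The selection of non-visual feedback channels based on the device's audio output mode might look like the following sketch; the function name and channel labels are hypothetical, not from the patent text.

```python
def select_feedback(silent_mode: bool) -> list[str]:
    """Pick non-visual feedback channels for the information window."""
    if silent_mode:
        # silent mode: convey the reminder state through haptics only
        return ["haptic"]
    # otherwise announce the monitoring information and vibrate
    return ["voice", "haptic"]
```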
The reminder state indicates whether the monitoring value acquired by the monitoring application warrants a warning to the user, and may be provided by the monitoring application. The reminder state may include a normal state and an alert state, and the wearable device 200 feeds back the reminder state associated with the monitoring information to the user when displaying the information window 250. In some embodiments, wearable device 200 may provide different feedback (tactile, visual, auditory, etc.) to the user based on the reminder state; for example, in the normal state the wearable device 200 may provide a first feedback or no feedback, and in the alert state it may provide a second feedback. In some embodiments, the wearable device 200 may set a heart rate threshold for the user in the non-exercise state: if the user's heart rate does not exceed the threshold in the non-exercise state, the state is normal; if it exceeds the threshold, the state is an alert state, and tactile, visual, or auditory feedback may be issued to remind the user. In some embodiments, the wearable device 200 may set a daily step count target for the user: if the user has not reached the target, the state is normal and the wearable device 200 may provide the first feedback or no feedback; if the user has reached the target, the state is an alert state and the wearable device 200 may provide the second feedback.
In some embodiments, the wearable device 200 may set an ambient noise threshold of 70 dB: if the noise monitored by the noise monitoring application does not meet or exceed the threshold, the state is normal and the wearable device 200 may provide the first feedback or no feedback; if the monitored noise meets or exceeds the threshold, the state is an alert state and the wearable device 200 may provide the second feedback.
In some embodiments, wearable device 200 may identify the reminder state associated with the monitoring information and provide different tactile feedback for different reminder states. For example, if the monitoring information is in the normal state, the wearable device 200 provides a first tactile feedback; if it is in the alert state, the wearable device 200 provides a second tactile feedback different from the first, such as a different vibration duration or vibration amplitude output by a linear motor.
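The reminder-state logic of the preceding paragraphs — heart rate, step count, and noise readings mapping to a normal or alert state, each with its own haptic pattern — can be sketched as follows. All names and vibration parameters are illustrative assumptions; 70 dB is the only threshold value the text itself gives.

```python
def reminder_state(kind: str, value: float, threshold: float) -> str:
    """Return 'normal' or 'alert' for one monitoring reading."""
    if kind == "heart_rate":
        # a non-exercise heart rate above the threshold triggers an alert
        return "alert" if value > threshold else "normal"
    if kind in ("steps", "noise"):
        # reaching the daily step goal, or noise meeting/exceeding the
        # ambient threshold, triggers the alert (second) feedback
        return "alert" if value >= threshold else "normal"
    return "normal"

def haptic_pattern(state: str) -> tuple[float, float]:
    """Map a reminder state to an assumed (duration s, amplitude 0-1) pair."""
    patterns = {"normal": (0.1, 0.3), "alert": (0.5, 1.0)}
    return patterns[state]
```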
In some embodiments, wearable device 200 may close the information window 250 once it has been displayed for a preset duration, or in response to a user input interrupting its display. For example, the wearable device 200 may preset the display duration of the information window 250 to 3 seconds; when the display duration reaches 3 seconds, the wearable device 200 closes the information window 250. Likewise, when the wearable device 200 detects a rotation or press input on the crown, it closes the information window 250. In some embodiments, the user input interrupting the display of the information window 250 may include: touch input to an area outside the information window 250, user input to the wearable device keys 220, or user input operating other input devices (e.g., microphone, crown). In some embodiments, wearable device 200 terminates the voice announcement of the monitoring information or terminates the tactile feedback in response to closing the information window, reducing unnecessary visual, auditory, and tactile output and the energy consumption of the wearable device.
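A minimal sketch of the window-dismissal behavior described above — auto-close after a preset duration, and close on any input outside the window — assuming a simple tick-driven UI loop (the class and method names are hypothetical):

```python
class InfoWindow:
    """Transient monitoring-information window with a display timeout."""

    def __init__(self, timeout_s: float = 3.0):
        self.timeout_s = timeout_s  # preset display duration (3 s example)
        self.elapsed = 0.0
        self.open = True

    def tick(self, dt: float) -> None:
        """Advance the display timer; close when the timeout is reached."""
        self.elapsed += dt
        if self.elapsed >= self.timeout_s:
            self.close()

    def on_user_input(self, target: str) -> None:
        """Any input not aimed at the window (key, crown, outside touch)
        interrupts and closes it."""
        if target != "window":
            self.close()

    def close(self) -> None:
        self.open = False
        # a real device would also stop the voice announcement and
        # haptic feedback here, as the text describes
```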
As shown in fig. 2C, wearable device 200 updates the first graphical interface 211 to a third graphical interface 214 in response to the second touch input 232 to the heart rate monitoring icon 204. In contrast to fig. 2B, the third graphical interface 214 provides a smaller information window 250 positioned close to the heart rate monitoring icon 204, and only a text element 253, which includes the monitoring value, is displayed in the information window 250. Because the information window 250 is positioned close to the application icon associated with it, the user can more clearly recognize which monitoring application the displayed monitoring information comes from, providing a more efficient interactive interface.
As shown in fig. 2D, a fourth graphical interface 215 is displayed on the touch display screen 210 of the wearable device 200, and the fourth graphical interface 215 includes the dial 201, the noise monitoring icon 202, the step-counting monitoring icon 203, the heart rate monitoring icon 204, and the temperature monitoring icon 205. Unlike figs. 2A-2C, the icons and the dial in the fourth graphical interface 215 have no outer frame. Specifically, when the fourth graphical interface 215 is the main (home) interface of the wearable device 200, developers usually blend the application icons into the background of the interface (e.g., display no outer frame) to make the interface attractive. However, this makes the application icons harder for the user to perceive. The present embodiment improves on this: as shown in fig. 2D, wearable device 200 updates the fourth graphical interface 215 to a fifth graphical interface 216 in response to a third touch input 234 to the fourth graphical interface 215. In the fifth graphical interface 216, the noise monitoring icon 202, the step-counting monitoring icon 203, the heart rate monitoring icon 204, and the temperature monitoring icon 205 are augmented with a box element 260 to display the monitoring application icons distinctively. In some embodiments, the monitoring application icons may also be displayed distinctively in other ways, such as changing their color scheme or brightness. This provides the user with a faster and more efficient interaction mode and reduces interaction errors, thereby reducing device energy consumption.
Wherein the third touch input 234 is different in touch pattern from the first and second touch inputs 230, 232. For example, a two-finger press, a three-finger press, a two-finger slide, or the like may be used. In another embodiment, the third touch input 234 may also be replaced by user input from other input mechanisms, such as a press input or a rotation input based on the keys 220 or crown.
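Taken together, the three touch inputs of figs. 2A-2D can be summarized as a simple dispatch, sketched below with illustrative gesture names (the patent leaves the exact gesture assignments open):

```python
def dispatch(gesture: str) -> str:
    """Map an assumed gesture name to the behavior described for figs. 2A-2D."""
    if gesture == "tap":
        return "launch_app"        # first touch input: run the monitoring app
    if gesture == "long_press":
        return "show_info_window"  # second touch input: display the window
    if gesture == "two_finger_press":
        return "highlight_icons"   # third touch input: distinguish the icons
    return "ignore"
```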
Fig. 3 is a flowchart of a graphical interface interaction method according to an embodiment of the present invention. The method may be implemented by a wearable device 100 as shown in fig. 1. The method comprises the following steps:
Step S301, detecting a touch input to a monitoring application icon in the graphical interface of the wearable device. In some embodiments, the touch input may be a tap, a long press, a hard tap (a click with a touch intensity exceeding a preset intensity), a double tap, a multiple click, or the like, applied to the monitoring application icon.
Step S302, in response to a first touch input to the monitoring application icon, executing a monitoring application associated with the monitoring application icon. Specifically, when a first touch input to the monitoring application icon is detected, the monitoring application program associated with the monitoring application icon is executed, and the wearable device displays a graphical interface of the monitoring application program (as shown in fig. 2A).
Step S303, in response to a second touch input to the monitoring application icon, displaying an information window in the graphical interface, where the information window includes a visual element provided by the monitoring application program for representing the monitoring information, and the second touch input is different from the first touch input. Specifically, the wearable device displays an information window on the graphical interface in response to the second touch input to the monitoring application icon, where the information window may include a plurality of visual elements representing the type of monitoring information, the monitoring value, and the reminder information, but at least includes a visual element representing the monitoring value (as shown in fig. 2B). Where the visual elements may be graphics, text, icons, video, or other user interface elements. In some embodiments, displaying an information window in the graphical interface comprises: the information window is displayed centrally in the graphical interface (as in fig. 2B) or in a location in the graphical interface near the monitoring application icon (as in fig. 2C).
In some embodiments, the touch input may also be replaced by user input of other input mechanisms, such as a key or crown based press input or a rotation input. For example, the monitor icon may be brought into the control focus by a rotational input of the crown, and the information window may be displayed by a plurality of pressing inputs.
In this embodiment, touching the monitoring application icon in the wearable device's graphical interface with a first touch gesture enters the monitoring application, while touching the icon with a second, different touch gesture displays the monitoring information of that application as a window in the graphical interface containing the icon. In this way, the monitoring information can be viewed directly from the icon display interface of the wearable device without launching the monitoring application, providing a more convenient and efficient interaction mode that lets the user quickly learn the wearable device's monitoring information and helps reduce device energy consumption.
Fig. 4 is a flowchart of another graphical interface interaction method provided in an embodiment of the present invention. The method may be implemented by a wearable device 100 as shown in fig. 1. The method comprises the following steps:
Step S401, detecting a touch input to a monitoring application icon in the graphical interface of the wearable device. In some embodiments, the touch input may be a tap, a long press, a hard tap (a click with a touch intensity exceeding a preset intensity), a double tap, a multiple click, or the like, applied to the monitoring application icon.
Step S402, recognizing the touch input as a first touch input or a second touch input based on the touch intensity or the touch duration of the touch input.
In step S403, in response to the first touch input to the monitoring application icon, a monitoring application associated with the monitoring application icon is executed. Specifically, when a first touch input to the monitoring application icon is detected, the monitoring application program associated with the monitoring application icon is executed, and the wearable device displays a graphical interface of the monitoring application program (as shown in fig. 2A).
Step S404, responding to a second touch input to the monitoring application icon, displaying an information window in the graphical interface, wherein the information window comprises a visual element which is provided by the monitoring application program and used for representing the monitoring information, and the second touch input is different from the first touch input. Specifically, the wearable device displays an information window on the graphical interface in response to the second touch input to the monitoring application icon, where the information window may include a plurality of visual elements representing the type of monitoring information, the monitoring value, the reminder information, and the like, but at least includes a visual element representing the monitoring value (see fig. 2B). Where the visual elements may be graphics, text, icons, video, or other user interface elements. In some embodiments, displaying an information window in the graphical interface comprises: the information window is displayed centrally in the graphical interface (as in fig. 2B), or in a location in the graphical interface near the monitoring application icon (as in fig. 2C).
Step S405, obtaining an audio output mode of the wearable device, broadcasting monitoring information in an audio mode if the wearable device is not in a mute mode, and providing tactile feedback based on a reminding state of the monitoring information if the wearable device is in the mute mode. In some embodiments, the wearable device may feed back at least a portion of the content of the monitoring information in the information window to the user in a non-visual manner. For example, a text element in the information window may be broadcasted in a voice manner, and for example, a reminder state associated with the monitoring information may be reflected in a tactile feedback manner. In some embodiments, the non-visual manner of providing feedback monitoring information to the user may be selected based on the operating mode of the wearable device. For example, when the wearable device is not in the silent mode, the monitoring information can be broadcasted in a voice mode, and a tactile feedback mode can be provided to reflect the reminding state associated with the monitoring information; for example, when the wearable device is in a silent mode, tactile feedback is provided based on the alert status associated with the monitoring information. Therefore, richer, more convenient and more efficient interactive feedback can be provided, the user can understand the monitoring information more quickly, the interactive pleasure can be improved, and the energy consumption of equipment is reduced.
The reminder state indicates whether the monitoring value acquired by the monitoring application warrants a warning to the user, and may be provided by the monitoring application. The reminder state may include a normal state and an alert state, and the wearable device feeds back the reminder state associated with the monitoring information to the user when displaying the information window. In some embodiments, the wearable device may provide different feedback (tactile, visual, auditory, etc.) to the user based on the reminder state; for example, in the normal state the wearable device may provide a first feedback or no feedback, and in the alert state it may provide a second feedback. In some embodiments, the wearable device may set a heart rate threshold for the user in the non-exercise state: if the user's heart rate does not exceed the threshold in the non-exercise state, the state is normal; if it exceeds the threshold, the state is an alert state, and tactile, visual, or auditory feedback may be issued to remind the user. In some embodiments, the wearable device may set a daily step count target for the user: if the user has not reached the target, the state is normal and the wearable device may provide the first feedback or no feedback; if the user has reached the target, the state is an alert state and the wearable device may provide the second feedback. In some embodiments, the wearable device may set an ambient noise threshold of 70 dB: if the noise monitored by the noise monitoring application does not meet or exceed the threshold, the state is normal and the wearable device may provide the first feedback or no feedback; if the monitored noise meets or exceeds the threshold, the state is an alert state and the wearable device may provide the second feedback.
Providing tactile feedback based on the reminder state of the monitoring information includes: the wearable device identifies the reminder state associated with the monitoring information and provides different tactile feedback for different reminder states. For example, if the monitoring information is in the normal state, the wearable device provides a first tactile feedback; if it is in the alert state, the wearable device provides a second tactile feedback different from the first, such as a different vibration duration or vibration amplitude output by a linear motor.
Step S406, in response to the display of the information window reaching a preset duration or in response to the user input of interrupting the display of the information window, closing the information window. Wherein the user input interrupting the display of the information window comprises: touch input to areas outside the information window and user input to the wearable device keys. In some embodiments, the audible announcement of the monitoring information or the termination of the haptic feedback is terminated in response to the closing of the information window. Thereby reducing unnecessary visual, auditory and tactile output and reducing the energy consumption of the wearable device.
Fig. 5 is a flowchart of another graphical interface interaction method provided in an embodiment of the present invention. The method may be implemented by a wearable device 100 as shown in fig. 1. The method comprises the following steps:
step S501, touch input of the monitoring application icon in the graphical interface of the wearable device is detected.
Step S502, in response to the first touch input to the monitoring application icon, executing a monitoring application program associated with the monitoring application icon.
Step S503, in response to a second touch input to the monitoring application icon, displaying an information window in the graphical interface, where the information window includes a visual element provided by the monitoring application program for representing the monitoring information, and the second touch input is different from the first touch input. In some embodiments, displaying an information window in the graphical interface comprises: the information window is displayed centrally in the graphical interface (as in fig. 2B) or in a location in the graphical interface near the monitoring application icon (as in fig. 2C).
In step S504, in response to a third touch input (as in fig. 2D) to the graphical interface, the monitoring application icon is distinctively displayed, and the third touch input is different from the first touch input and the second touch input. In some embodiments, the display of the monitoring application icons may be changed to distinctively display the monitoring application icons, such as by adding borders, changing color schemes, brightness of icons, and so forth. Therefore, a faster and more efficient interaction mode can be provided for the user, and the user interaction errors are reduced, so that the energy consumption of the equipment is reduced.
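The distinctive-display options in step S504 (adding a border, changing the color scheme or brightness of the icons) might be modeled as a style transformation, sketched here with hypothetical attribute names:

```python
def distinguish(icon: dict, mode: str = "border") -> dict:
    """Return a copy of an icon's style with one distinguishing change applied."""
    styled = dict(icon)  # leave the original icon style untouched
    if mode == "border":
        styled["border"] = True          # add a box element around the icon
    elif mode == "color":
        styled["palette"] = "highlight"  # switch to a highlight color scheme
    elif mode == "brightness":
        styled["brightness"] = 1.5       # brighten the icon
    return styled
```

On the third touch input the device would apply this transformation to every monitoring application icon, and revert it when the gesture ends or the interface is dismissed.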
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the above-described methods of this specification. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product including program code; when the program product runs on a wearable device, the program code causes the wearable device to perform the steps described in the above exemplary methods of this specification, for example, any one or more of the steps of fig. 3, fig. 4, or fig. 5.
It should be noted that the computer readable medium shown in the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. The computer readable storage medium may be, for example, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (12)

1. A graphical interface interaction method, comprising:
detecting touch input of a monitoring application icon in a graphical interface of wearable equipment;
in response to a first touch input to the monitoring application icon, executing a monitoring application associated with the monitoring application icon;
displaying an information window in the graphical interface in response to a second touch input to the monitoring application icon, the information window including a visual element provided by the monitoring application to represent monitoring information, the second touch input being different from the first touch input.
2. The interaction method of claim 1, wherein the method further comprises: and closing the information window in response to the display of the information window reaching a preset time or in response to user input for interrupting the display of the information window.
3. The interactive method of claim 2, wherein the user input interrupting the display of the information window comprises: touch input to an area outside the information window and user input to a wearable device key.
4. The interaction method of claim 2, further comprising, in response to the second touch input to the monitoring application icon:
the audio output mode of the wearable device is obtained, if the wearable device is not in the mute mode, the monitoring information is broadcasted in an audio mode, and if the wearable device is in the mute mode, the tactile feedback is provided based on the reminding state related to the monitoring information.
5. The interaction method of claim 4, wherein providing haptic feedback based on the alert status associated with the monitoring information comprises: and identifying a reminding state associated with the monitoring information, and providing different tactile feedback based on different reminding states.
6. The interactive method of claim 5, further comprising: terminating the audible broadcast of the monitoring information or terminating the haptic feedback in response to closing the information window.
7. The interactive method of claim 1, wherein displaying an information window in the graphical interface comprises: displaying the information window centrally in the graphical interface.
8. The interactive method of claim 1, wherein displaying an information window in the graphical interface comprises: and displaying the information window in the graphical interface close to the monitoring application icon.
9. The interaction method of claim 1, wherein after detecting the touch input to the monitoring application icon in the wearable device graphical interface, further comprising:
identifying the touch input as the first touch input or the second touch input based on a touch intensity or a touch duration of the touch input.
10. The interaction method of claim 1, wherein the method further comprises: in response to a third touch input to the graphical interface, the monitoring application icon is distinctively displayed, the third touch input being different from the first touch input and the second touch input.
11. A wearable device comprising a display, a processor, and a memory configured to store one or more programs for execution by the processor, the one or more programs comprising instructions for performing the method of any of claims 1-10.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 10.
CN202210532013.8A 2022-05-17 2022-05-17 Graphical interface interaction method, wearable device and computer-readable storage medium Pending CN114935993A (en)

Publications (1)

Publication Number Publication Date
CN114935993A true CN114935993A (en) 2022-08-23

Family

ID=82864196

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210532013.8A Pending CN114935993A (en) 2022-05-17 2022-05-17 Graphical interface interaction method, wearable device and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN114935993A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105045394A (en) * 2015-08-03 2015-11-11 Goertek Inc. Method and apparatus for starting preset function in wearable electronic terminal
CN107066192A (en) * 2015-03-08 2017-08-18 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US20190012059A1 (en) * 2016-01-14 2019-01-10 Samsung Electronics Co., Ltd. Method for touch input-based operation and electronic device therefor
US10459887B1 (en) * 2015-05-12 2019-10-29 Apple Inc. Predictive application pre-launch


Similar Documents

Publication Publication Date Title
US11769589B2 (en) Receivers for analyzing and displaying sensor data
KR20170136317A (en) Electronic apparatus and operating method thereof
CN109718112B (en) Medication reminding method and device and computer readable storage medium
KR20170019081A (en) Portable apparatus and method for displaying a screen
EP3067780A1 (en) Method for controlling terminal device, and wearable electronic device
CN113495609A (en) Sleep state judgment method and system, wearable device and storage medium
CN114668368A (en) Sleep state monitoring method, electronic equipment and computer readable storage medium
CN114637452A (en) Page control method and wearable device
CN108837271B (en) Electronic device, output method of prompt message and related product
CN114532992B (en) Method, device and system for detecting nap state and computer readable storage medium
CN114935993A (en) Graphical interface interaction method, wearable device and computer-readable storage medium
CN114995710A (en) Wearable device interaction method, wearable device and readable storage medium
CN107395910B (en) Incoming call notification method and mobile terminal
CN109660660B (en) Reminding method, mobile terminal and server
CN114209298A (en) PPG sensor control method and device and electronic equipment
CN113867666A (en) Information display method and device and wearable device
CN113703641A (en) Message preview method and device, wearable device and computer readable storage medium
CN114638247B (en) Man-machine interaction method and wearable device
CN115328351A (en) Icon display switching method, smart watch and computer storage medium
CN117555456A (en) Motion interface interaction method, wearable device and readable medium
CN114661216A (en) Alarm clock setting method, electronic equipment and computer readable storage medium
CN115328360A (en) Medal management method, electronic device and computer storage medium
CN114936302A (en) Music recommendation method and device, electronic equipment and computer readable storage medium
CN116483304A (en) Display control method, wrist wearing equipment and readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination