WO2022184030A1 - Interaction method and apparatus for wearable device - Google Patents

Interaction method and apparatus for wearable device (穿戴设备的交互方法和装置)

Info

Publication number
WO2022184030A1
WO2022184030A1 (PCT Application PCT/CN2022/078474)
Authority
WO
WIPO (PCT)
Prior art keywords
wearable device
target
information
input
target area
Prior art date
Application number
PCT/CN2022/078474
Other languages
English (en)
French (fr)
Inventor
钟国强
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Publication of WO2022184030A1 publication Critical patent/WO2022184030A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Definitions

  • the present application belongs to the technical field of mobile devices, and in particular relates to an interaction method and device for wearable devices, electronic devices and storage media.
  • In the related art, smart wearable devices differ from traditional electronic devices such as mobile phones, computers, and tablets in that they have neither a large interactive screen nor a keyboard for input. As a result, the social interactivity offered by smart wearable devices is far weaker than that of traditional electronic devices, which restricts their popularization.
  • the purpose of the embodiments of the present application is to provide an interaction method and device for a wearable device, an electronic device and a storage medium, which can solve the problem of poor interactivity of the smart wearable device.
  • In a first aspect, an embodiment of the present application provides an interaction method for a wearable device, applied to a first wearable device, including: determining a target area; acquiring motion information and position information of a second wearable device located in the target area; displaying a map interface, the map interface including the target area; and displaying an identifier of the second wearable device and the motion information on the map interface according to the position information; wherein the target area is determined by the position information of the first wearable device or by the user's input.
  • In a second aspect, an embodiment of the present application provides an interaction apparatus for a wearable device, arranged on the first wearable device, including: a determination module configured to determine a target area; an acquisition module configured to acquire motion information and position information of the second wearable device in the target area; a display module configured to display a map interface, the map interface including the target area; and a positioning module configured to display the identifier of the second wearable device and the motion information on the map interface according to the position information; wherein the target area is determined by the position information of the first wearable device or by the user's input.
  • In a third aspect, embodiments of the present application provide an electronic device, the electronic device including a processor, a memory, and a program or instruction stored in the memory and executable on the processor, where the program or instruction, when executed by the processor, implements the steps of the method according to the first aspect.
  • In a fourth aspect, an embodiment of the present application provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or instruction is executed by a processor, the steps of the method according to the first aspect are implemented.
  • In a fifth aspect, an embodiment of the present application provides a chip, the chip including a processor and a communication interface, the communication interface being coupled to the processor, where the processor is configured to run a program or an instruction to implement the method described in the first aspect.
  • In the embodiments of the present application, the motion information and position information of a second wearable device in a target area can be acquired, a map interface can be displayed, and the identifier and motion information of the second wearable device can be displayed on the map interface according to the position information. The identifier and motion information of the second wearable device displayed on the map interface can thus serve as an interaction entry between wearable devices, improving the interactivity between wearable devices.
  • FIG. 1 is one of the schematic flowcharts of the interaction method of the wearable device provided by the embodiment of the present application;
  • FIG. 2 is a second schematic flowchart of an interaction method for a wearable device provided by an embodiment of the present application
  • FIG. 3 is a schematic diagram of an application example of determining the position of a second wearable device provided by an embodiment of the present application
  • FIG. 4 is a schematic diagram of a map interface provided by an embodiment of the present application.
  • FIG. 5 is a third schematic flowchart of an interaction method for a wearable device provided by an embodiment of the present application.
  • FIG. 6 is a fourth schematic flowchart of an interaction method for a wearable device provided by an embodiment of the present application.
  • FIG. 7 is one of the schematic diagrams of an example of an interactive operation of a wearable device provided by an embodiment of the present application.
  • FIG. 8 is a second schematic diagram of an example of an interactive operation of a wearable device provided by an embodiment of the present application.
  • FIG. 9 is a third schematic diagram of an example of an interactive operation of a wearable device provided by an embodiment of the present application.
  • FIG. 10 is a fourth schematic diagram of an example of an interactive operation of a wearable device provided by an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of an interaction device for a wearable device provided by an embodiment of the present application.
  • FIG. 12 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • An embodiment of the present application discloses an interaction method for a wearable device, see FIG. 1 , including:
  • Step 101 Determine the target area.
  • the target area is determined by the position information of the first wearable device or the user's input.
  • Specifically, the target area may be determined by taking the position of the first wearable device as the center of a circle and a preset or currently input value as the radius. For example, if the preset radius is 5 meters, the area within 5 meters of the position of the first wearable device is determined as the target area. Of course, the target area can also be set manually, for example, by an input place name.
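  • For illustration, the following is a minimal sketch (in Python; not part of the published method, and the function names and the haversine helper are assumptions) of how such a circular target area could be represented: the position of the first wearable device is the center, and another device is treated as inside the target area when its great-circle distance to the center is within the radius.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in degrees, returned in meters."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_target_area(center, radius_m, point):
    """True when `point` lies within `radius_m` meters of `center`; both are (lat, lon) tuples."""
    return haversine_m(center[0], center[1], point[0], point[1]) <= radius_m

# Preset radius of 5 meters around the first wearable device, as in the example above.
first_device = (39.90420, 116.40740)
second_device = (39.90421, 116.40741)
print(in_target_area(first_device, 5.0, second_device))
```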
  • Step 102 Acquire motion information and position information of the second wearable device in the target area.
  • Specifically, the motion information of the second wearable device includes: a user identity (ID) corresponding to the second wearable device, running-distance ranking information, running-duration ranking information, swimming-lap ranking information, and the like.
  • When the user of the first wearable device enters the sports mode and starts exercising, the first wearable device begins recording the user's exercise data. The exercise data is synchronized to a server so that the server can rank exercise within a specific area. The ranking data is pushed to the first wearable device at regular intervals; it is not displayed automatically, but is stored in the storage area of the first wearable device. Only when the user wants to open the sports-ranking interface is the corresponding sports information displayed.
  • In this embodiment, there are two ways to obtain the motion information and position information of the second wearable device. In the first, each wearable device has bound account information. If the wearable device itself has embedded SIM (Embedded-Subscriber Identity Module, eSIM) networking and Global Positioning System (GPS) capability, the position information of the device can be obtained directly through GPS. By comparing and retrieving the GPS position, the specific name of the gymnasium, school playground, or park where the current user is located can be determined accurately. Then, over a Long Term Evolution (LTE) link, the wearable device's motion data, position information, date information, motion category, and other information are serialized, and a data table is generated and uploaded to a cloud server.
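  • To make the serialization and upload step concrete, here is a minimal sketch of what the generated data table entry might look like; the field names, the JSON encoding, and the upload URL are illustrative assumptions rather than details fixed by this application.

```python
import json
import urllib.request

def build_record(user_id, venue, date, category, distance_m, duration_s, position):
    """Serialize one wearable device's motion data into a JSON record for the cloud server."""
    return json.dumps({
        "user_id": user_id,        # account information bound to the wearable device
        "venue": venue,            # venue name resolved from the GPS position, e.g. a stadium
        "date": date,              # e.g. "2021-03-05"
        "category": category,      # e.g. "running", "swimming"
        "distance_m": distance_m,  # motion data
        "duration_s": duration_s,
        "position": position,      # (lat, lon)
    }).encode("utf-8")

def upload(record, url="https://example.invalid/api/motion"):  # hypothetical endpoint
    """POST the serialized record to the cloud server over the device's network link."""
    req = urllib.request.Request(url, data=record,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status
```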
  • The cloud server is responsible for extracting information such as the sports venue, date, and sports category from the data packets, and sorts the motion data using the sports venue, date, and sports category as the sorting scope.
  • the server will maintain the ranking data continuously, and push the ranking data to the first wearable device at regular intervals.
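  • A minimal sketch of the server-side ranking described above, assuming records shaped like those in the upload example; the grouping keys and field names are illustrative only.

```python
from collections import defaultdict

def rank(records):
    """Group motion records by (venue, date, category) and sort each group by distance, descending."""
    groups = defaultdict(list)
    for rec in records:
        groups[(rec["venue"], rec["date"], rec["category"])].append(rec)
    rankings = {}
    for key, recs in groups.items():
        recs.sort(key=lambda r: r["distance_m"], reverse=True)
        rankings[key] = [{"rank": i + 1, "user_id": r["user_id"], "distance_m": r["distance_m"]}
                         for i, r in enumerate(recs)]
    return rankings
```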
  • This implementation relies on a central server; the device needs to actively obtain its position via GPS and hand it over to the server for processing. The first wearable device obtains the motion information and position information of the second wearable device by accessing the server.
  • In the second way, the first wearable device no longer needs to rely on GPS to obtain position information, but a lightweight server needs to be deployed near the sports venue, for example around a stadium. The rationale is that the amount of data the server needs to process is not large; it only needs to sort the users' motion data. Traditional large servers are deployed centrally and can process massive amounts of data, but the data-refresh delay would be too high. Because a lightweight server is deployed near the venue, users can obtain the latest rankings with minimal delay.
  • Correspondingly, step 102 includes: the first wearable device sends an acquisition request to the server corresponding to the target area, so as to acquire the motion information and position information of the second wearable device in the target area.
  • Specifically, after entering the sports mode, the first wearable device cannot obtain the specific Internet Protocol (IP) address of the lightweight server in the current area, so it needs to send broadcast packets on the local network. When the lightweight server receives the broadcast request and completes authentication, it begins returning data packets together with its own IP address, and starts sending the first wearable device the motion information of second wearable devices within the range of the sports venue, or invitation information from second wearable devices.
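  • The discovery step can be pictured with the following sketch of a local-network broadcast; the port number, the message content, and the authentication payload are assumptions made only for illustration.

```python
import socket

DISCOVERY_PORT = 9999  # hypothetical port on which the lightweight venue server listens

def discover_server(timeout_s=2.0):
    """Broadcast a discovery request and return the address and first reply of the venue server."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(timeout_s)
    try:
        sock.sendto(b"WEARABLE_DISCOVERY", ("255.255.255.255", DISCOVERY_PORT))
        data, addr = sock.recvfrom(4096)   # the server replies with a data packet and its own IP
        return addr, data
    except socket.timeout:
        return None, None
    finally:
        sock.close()
```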
  • Correspondingly, indoors or when the target area is small, in order to increase positioning accuracy, see FIG. 2, the position information of the second wearable device is determined by the following method:
  • Step 201 Generate a broadcast packet.
  • the broadcast packet includes first positioning request information.
  • Step 202 Send the broadcast packet to multiple relay devices in the target area, so that the multiple relay devices parse the first positioning request information in the broadcast packet, generate second positioning request information, and send it to the server in the target area.
  • Because a server is deployed, a three-dimensional scan model of the target area can be stored in the server in advance. The relay devices need to be deployed in the target area in advance, for example at key positions in a stadium. The purpose of these relay devices is to strengthen and forward the relayed signal, and their final forwarding target is the server.
  • Step 203 Receive the position information of the second wearable device generated by the server according to the second positioning request information sent by the multiple relay devices.
  • Specifically, the server may receive multiple pieces of second positioning request information for the same second wearable device. By reconciling these multiple pieces of information, it obtains the exact position of the second wearable device and sends the accurate position information to the first wearable device in the form of a network data packet.
  • the second positioning request information needs to include the relative position between the relay device and the second wearable device, and the relative position between the relay device and the server.
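  • The correction can be illustrated with a small sketch that mirrors the two-relay FIG. 3 example of the description (relay device 1 reports a server-relative position of (-4, -1) and a device-relative position of (2, -3); relay device 2 reports (2, -1) and (-4, -3)). Each relay's report yields one estimate of the second wearable device's position in the server's coordinate frame; averaging the estimates is an assumed, illustrative way of performing the correction.

```python
def estimate_from_relay(relay_to_server, device_to_relay):
    """One relay's estimate of the device position, expressed in the server's coordinate frame."""
    return (relay_to_server[0] + device_to_relay[0],
            relay_to_server[1] + device_to_relay[1])

def correct(reports):
    """Combine the per-relay estimates (here by simple averaging) into one corrected position."""
    estimates = [estimate_from_relay(rs, dr) for rs, dr in reports]
    n = len(estimates)
    return (sum(x for x, _ in estimates) / n, sum(y for _, y in estimates) / n)

# Two relay devices, matching the FIG. 3 example: both estimates give (-2, -4).
reports = [((-4, -1), (2, -3)),   # relay device 1
           ((2, -1), (-4, -3))]   # relay device 2
print(correct(reports))           # (-2.0, -4.0)
```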
  • Through steps 201 to 203, the positioning requirement for the second wearable device when the target area is small can be satisfied.
  • Step 103 Display a map interface, where the map interface includes the target area.
  • Specifically, displaying the map interface may be triggered by the user's input, such as tapping an icon to open the map interface; it may also be performed automatically in response to other events, for example when the user accepts, on the first wearable device, invitation information from another wearable device.
  • the map interface can also be enlarged or reduced according to the user's operation, so that the user can view all the information displayed on the map interface.
  • Step 104 Display the identifier of the second wearable device and the movement information on the map interface according to the location information.
  • Specifically, referring to FIG. 4, the identifiers of the second wearable devices corresponding to user A and user B are displayed in the map interface. If user A and user B are exercising, the motion information of their second wearable devices and the updated positions of their identifiers after the exercise are displayed correspondingly in the map interface. By displaying the map interface, the position information of the first wearable device and the second wearable device can be shown, and users can be guided to meet.
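  • One way to picture how an identifier could be placed on the map interface according to the position information (step 104) is the simple linear projection below; it assumes a small, roughly rectangular target area and is purely illustrative.

```python
def to_map_pixels(position, area_bounds, map_size):
    """Convert a (lat, lon) position into (x, y) pixel coordinates of the map interface.

    area_bounds: ((lat_min, lon_min), (lat_max, lon_max)) bounding the target area.
    map_size:    (width_px, height_px) of the map view on the wearable device.
    """
    (lat_min, lon_min), (lat_max, lon_max) = area_bounds
    width, height = map_size
    x = (position[1] - lon_min) / (lon_max - lon_min) * width
    y = (lat_max - position[0]) / (lat_max - lat_min) * height  # screen y grows downward
    return int(x), int(y)

# Place user A's identifier on a 240x240 watch map covering the target area.
print(to_map_pixels((39.9043, 116.4075),
                    ((39.9040, 116.4070), (39.9046, 116.4080)), (240, 240)))
```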
  • In addition, referring to FIG. 5, after step 104 the method includes:
  • Step 501 In the case of receiving the invitation information sent by the target wearable device, display the invitation information.
  • the second wearable device includes the target wearable device.
  • The invitation information may be sent by the target wearable device through an operation of tapping a corresponding invitation button when the first wearable device is displayed in the interface of the target wearable device; for example, the invitation button may be a "player challenge (Player Killing, PK)" virtual button.
  • Step 502 Receive a first input for the target object.
  • the target object may be invitation information corresponding to the target wearable device or an identifier corresponding to the target wearable device.
  • Correspondingly, the first input may take various forms; for example, the first input is an operation of tapping the "OK" button corresponding to the target object, or a long-press operation or double-tap operation on the target object.
  • Step 503 In response to the first input, display a route between the location of the first wearable device and the location of the target wearable device corresponding to the target object.
  • Through steps 501 to 503, the route between the first wearable device and the target wearable device can further be displayed on the display interface of the first wearable device, so as to guide the users to meet.
  • After the users meet, a connection can be established according to a specific interaction. For example, when a somatosensory interaction action between the first wearable device and the target wearable device is detected, a connection between the first wearable device and the target wearable device is established. The somatosensory interaction action can take various forms, for example a handshake action or a hand-waving action.
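  • A rough sketch of how a handshake-style somatosensory action might be detected and paired; the acceleration threshold, the sampling format, and the time-skew tolerance are assumptions, not details given in this application.

```python
import math

def detect_shake(samples, threshold=25.0):
    """Return the timestamp of the first accelerometer sample whose magnitude exceeds the threshold.

    `samples` is a list of (timestamp_s, ax, ay, az) readings in m/s^2; returns None if no shake.
    """
    for t, ax, ay, az in samples:
        if math.sqrt(ax * ax + ay * ay + az * az) > threshold:
            return t
    return None

def shakes_match(t_first_device, t_target_device, max_skew_s=0.5):
    """Treat two detected shake events as one handshake when they occur nearly simultaneously."""
    return (t_first_device is not None and t_target_device is not None
            and abs(t_first_device - t_target_device) <= max_skew_s)
```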
  • Optionally, in order to control the first wearable device to enter a target mode, referring to FIG. 6, after step 503 the method further includes:
  • Step 601 Receive a second input from a user with respect to the first wearable device.
  • the second input may be a user's click operation in the map interface of the first wearable device, or other input operations of the user, or the like.
  • Step 602 In response to the second input, obtain a first input track corresponding to the second input.
  • the first input track may be generated through an operation in the map interface, for example, the first input track may be determined according to the selected starting point and ending point.
  • Step 603 Acquire connection information sent by the target wearable device, where the connection information is the second input track received by the target wearable device.
  • Step 604 Control the first wearable device to enter a target mode when the first input trajectory matches the second input trajectory.
  • the second input track may be generated for an operation in the map interface displayed in the target wearable device, for example, the second input track may be determined according to the selected starting point and ending point.
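  • The trajectory match of step 604 could be checked along the lines of the following sketch, which simply compares the two point sequences; the point format and the distance tolerance are assumptions.

```python
def trajectories_match(track_a, track_b, tolerance=10.0):
    """Compare two input trajectories, given as lists of (x, y) points such as a start and an end point.

    The tracks match when they have the same length and every pair of corresponding
    points lies within `tolerance` map units of each other.
    """
    if len(track_a) != len(track_b):
        return False
    return all(((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5 <= tolerance
               for (xa, ya), (xb, yb) in zip(track_a, track_b))

# First input track drawn on the first wearable device vs. the track received from the target device.
first_track = [(120, 80), (300, 240)]
second_track = [(118, 82), (302, 238)]
if trajectories_match(first_track, second_track):
    print("enter target mode")  # in the target mode, the two devices' motion information is compared
```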
  • In the target mode, the first wearable device compares the motion information of the first wearable device with the motion information of the second wearable device. By comparing the motion information of the first wearable device and the second wearable device, the interaction between the two devices can be realized, and the online and offline linkage between users can be enhanced.
  • With the interaction method for a wearable device of the embodiments of the present application, the motion information and position information of the second wearable device in the target area can be acquired, a map interface can be displayed, and the identifier and motion information of the second wearable device can be displayed on the map interface according to the position information. The identifier and motion information of the second wearable device displayed in the map interface can thus be used as an interaction entry between the wearable devices, improving the interactivity between the wearable devices.
  • FIG. 7 to FIG. 10 illustrate an example process of an interactive operation of the wearable device in this embodiment.
  • FIG. 7 shows the ranking information of the running distance displayed in the wearable watch under a usage scenario.
  • Step 1 Determine the target area.
  • Step 2 Obtain motion information and position information of the second wearable device in the target area.
  • the motion information includes running distance.
  • Step 3 After opening the sports ranking interface, you can display the ranking information of running distance by selecting the sports type.
  • the ranking information includes the ranking, the user identity document (ID) of each ranking, and the corresponding running distance, as shown in FIG. 7 .
  • The ranking information is displayed in order from top to bottom, and the user can swipe to browse. Further, by tapping "running distance" at the top, the user can quickly locate and jump to his or her own position.
  • each row can also display brief information of the user, among which the user-defined avatar can be displayed in the circle.
  • the user's identity document (ID) can be a user-defined nickname. Click the user's identity document (ID) or avatar to jump to the user's specific information page, as shown in Figure 8.
  • The specific information of the user includes the user's various sports information, which can be obtained from the server. The user's detail page is uploaded to the server and maintained from the time the user first uploads exercise data.
  • In addition, besides the user's recent exercise data, the user's specific information also includes some other data, such as the record of player challenges (Player Killing, PK) between users. For a user one appreciates or admires, the heart icon below can be tapped to like that user.
  • Step 4 Display a map interface, and display the identification and motion information of the second wearable device on the map interface according to the location information, as shown in FIG. 9 .
  • Step 5 In the case of receiving the invitation information sent by the target wearable device, display the invitation information.
  • the user device that receives the invitation information will directly pop up an invitation prompt. If the user device is currently in silent mode, the invitation information will be temporarily kept in the notification queue for a period of time. If the user does not view it after the set time, the invitation information will automatically expire.
  • Step 6 Receive a first input for the target object.
  • Step 7 In response to the first input, display a route between the position of the first wearable device and the position of the target wearable device corresponding to the target object, as shown in FIG. 10 .
  • After the users meet, confirmation can be made according to a specific interaction. For example, if the wearable watch detects a handshake by both parties, it automatically starts the target mode. In the target mode, the wearable watch compares its own motion information with the motion information of the target wearable device and displays the comparison result on the wearable watch.
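  • A minimal sketch of the comparison performed in the target mode; the compared fields and the form of the result are assumptions about how the comparison result could be produced for display.

```python
def compare_motion(own, other):
    """Compare two devices' motion information field by field and report who is ahead."""
    result = {}
    for field in ("distance_m", "duration_s"):
        diff = own.get(field, 0) - other.get(field, 0)
        result[field] = "ahead" if diff > 0 else ("behind" if diff < 0 else "tied")
    return result

# The wearable watch compares its own motion information with that of the target wearable device.
print(compare_motion({"distance_m": 5200, "duration_s": 1800},
                     {"distance_m": 4900, "duration_s": 1750}))
```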
  • It should be noted that, for the interaction method for a wearable device provided by the embodiments of the present application, the execution subject may be the interaction apparatus of the wearable device, or a control module in the interaction apparatus for executing the interaction method. In the embodiments of the present application, the interaction apparatus executing the interaction method is taken as an example to describe the interaction method for a wearable device provided by the embodiments of the present application.
  • The embodiment of the present application discloses an interaction apparatus for a wearable device, arranged on a first wearable device, see FIG. 11, including: a determination module 1101, configured to determine a target area;
  • an acquisition module 1102 configured to acquire motion information and position information of the second wearable device in the target area
  • a display module 1103, configured to display a map interface, where the map interface includes a target area;
  • a positioning module 1104, configured to display the identifier of the second wearable device and the motion information on the map interface according to the position information; where the target area is determined by the position information of the first wearable device or by the user's input.
  • the apparatus further includes: an invitation display module, configured to display the invitation information in the case of receiving the invitation information sent by the target wearable device;
  • a first receiving module for receiving a first input for the target object
  • a first response module configured to display a route between the position of the first wearable device and the position of the target wearable device corresponding to the target object in response to the first input
  • the second wearable device includes the target wearable device, and the target object is invitation information corresponding to the target wearable device or an identifier corresponding to the target wearable device.
  • the device further includes:
  • a second receiving module, configured to receive a second input from the user with respect to the first wearable device after the first response module displays the route between the position of the first wearable device and the position of the target wearable device corresponding to the target object;
  • a second response module configured to acquire a first input trajectory corresponding to the second input in response to the second input
  • a processing module configured to acquire connection information sent by the target wearable device, where the connection information is the second input track received by the target wearable device;
  • control module configured to control the first wearable device to enter a target mode when the first input trajectory matches the second input trajectory
  • where, in the target mode, the first wearable device compares the motion information of the first wearable device with the motion information of the second wearable device.
  • the acquisition module is specifically configured to: send an acquisition request to a server corresponding to the target area, so as to acquire motion information and position information of the second wearable device in the target area.
  • Optionally, the apparatus further includes a position information determination module configured to: generate a broadcast packet, where the broadcast packet includes first positioning request information; send the broadcast packet to multiple relay devices in the target area, so that the multiple relay devices parse the first positioning request information in the broadcast packet, generate second positioning request information, and send it to the server in the target area; and receive the position information of the second wearable device generated by the server according to the second positioning request information sent by the multiple relay devices.
  • the interaction device of the wearable device in the embodiment of the present application may be an electronic device, or may be a component in the electronic device, such as an integrated circuit or a chip.
  • the electronic device may be a terminal, or may be other devices other than the terminal.
  • For example, the electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a mobile Internet device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and may also be a server, a network attached storage (NAS) device, a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like, which is not specifically limited in the embodiments of the present application.
  • the interaction device of the wearable device in the embodiment of the present application may be a device having an operating system.
  • The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
  • the interaction device of the wearable device provided in the embodiment of the present application can implement each process implemented by the interaction device of the wearable device in the method embodiments of FIG. 1 to FIG. 10 . To avoid repetition, details are not described here.
  • The interaction apparatus of the wearable device in the embodiments of the present application can acquire the motion information and position information of the second wearable device in the target area, display a map interface, and display the identifier and motion information of the second wearable device on the map interface according to the position information. The identifier and motion information of the second wearable device displayed in the map interface can thus be used as an interaction entry between the wearable devices, improving the interactivity between the wearable devices.
  • Optionally, an embodiment of the present application further provides an electronic device, including a processor 1210, a memory 1209, and a program or instruction stored in the memory 1209 and executable on the processor 1210. When executed by the processor 1210, the program or instruction implements each process of the above embodiments of the interaction method for a wearable device and can achieve the same technical effects, which are not repeated here to avoid repetition.
  • FIG. 12 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of the application.
  • The electronic device 1200 includes, but is not limited to, at least some of the following components: a radio frequency unit 1201, a network module 1202, an audio output unit 1203, an input unit 1204, a sensor 1205, a display unit 1206, a user input unit 1207, an interface unit 1208, a memory 1209, and a processor 1210.
  • Those skilled in the art will understand that the electronic device 1200 may also include a power supply (such as a battery) for supplying power to the various components; the power supply may be logically connected to the processor 1210 through a power management system, so as to implement functions such as charge management, discharge management, and power-consumption management through the power management system. The structure of the electronic device shown in FIG. 12 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine some components, or use a different arrangement of components, which is not repeated here.
  • It should be understood that, in the embodiment of the present application, the input unit 1204 may include a graphics processing unit (GPU) 12041 and a microphone 12042; the graphics processing unit 12041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the display unit 1206 may include a display panel 12061, which may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the user input unit 1207 includes a touch panel 12071 and other input devices 12072 .
  • the touch panel 12071 is also called a touch screen.
  • the touch panel 12071 may include two parts, a touch detection device and a touch controller.
  • Other input devices 12072 may include, but are not limited to, physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, and joysticks, which are not described herein again.
  • In the embodiment of the present application, the radio frequency unit 1201 receives downlink data from a network-side device and forwards it to the processor 1210 for processing; in addition, it sends uplink data to the network-side device.
  • the radio frequency unit 1201 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the memory 1209 may be used to store software programs as well as various data.
  • the memory 1209 may mainly include a first storage area for storing programs or instructions and a second storage area for storing data, wherein the first storage area may store an operating system, an application program or instructions required for at least one function (such as a sound playback function, image playback function, etc.), etc.
  • memory 1209 may include volatile memory or non-volatile memory, or memory 1209 may include both volatile and non-volatile memory.
  • The non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synch-link DRAM (SLDRAM), or a direct Rambus RAM (DRRAM). The memory 1209 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
  • the processor 1210 may include one or more processing units; optionally, the processor 1210 may integrate an application processor and a modem processor, wherein the application processor mainly processes the operating system, user interface, application programs or instructions, etc., Modem processors mainly deal with wireless communications, such as baseband processors. It can be understood that, the above-mentioned modulation and demodulation processor may not be integrated into the processor 1210.
  • the processor 1210 is configured to: determine the target area;
  • the processor 1210 is configured to: acquire motion information and position information of the second wearable device in the target area;
  • a display unit 1206, configured to display a map interface, where the map interface includes a target area;
  • the display unit 1206 is configured to display the identifier of the second wearable device and the motion information on the map interface according to the position information; where the target area is determined by the position information of the first wearable device or by the user's input.
  • Optionally, after the display unit 1206 displays the identifier and motion information of the second wearable device on the map interface according to the position information:
  • a display unit 1206, configured to display the invitation information in the case of receiving the invitation information sent by the target wearable device
  • a user input unit 1207 configured to: receive a first input for the target object
  • a display unit 1206, configured to display a route between the position of the first wearable device and the position of the target wearable device corresponding to the target object in response to the first input;
  • the second wearable device includes the target wearable device, and the target object is invitation information corresponding to the target wearable device or an identifier corresponding to the target wearable device.
  • a user input unit 1207 configured to receive a second input from the user with respect to the first wearable device
  • the processor 1210 is configured to: in response to the second input, obtain a first input trajectory corresponding to the second input;
  • acquire connection information sent by the target wearable device, where the connection information is the second input track received by the target wearable device; and control the first wearable device to enter a target mode when the first input trajectory matches the second input trajectory;
  • where, in the target mode, the first wearable device compares the motion information of the first wearable device with the motion information of the second wearable device.
  • the processor 1210 is configured to: send an acquisition request to a server corresponding to the target area, so as to acquire motion information and position information of the second wearable device in the target area.
  • the processor 1210 is configured to: generate a broadcast packet, wherein the broadcast packet includes the first positioning request information;
  • the location information of the second wearable device generated by the server according to the second positioning request information sent by the multiple relay devices is received.
  • The electronic device of the embodiment of the present application can acquire the motion information and position information of the second wearable device in the target area, display a map interface, and display the identifier and motion information of the second wearable device on the map interface according to the position information. The identifier and motion information of the second wearable device displayed in the map interface thus serve as an interaction entry between the wearable devices, improving the interactivity between the wearable devices.
  • the electronic device embodiments in the embodiments of the present application are product embodiments corresponding to the above method embodiments, and all implementations in the above method embodiments are applicable to the electronic device embodiments, and can also achieve the same or similar technical effects. Therefore, it will not be repeated here.
  • the embodiments of the present application further provide a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or instruction is executed by a processor, each process of the above-mentioned embodiment of the interaction method for a wearable device is implemented, and can To achieve the same technical effect, in order to avoid repetition, details are not repeated here.
  • the processor is the processor in the electronic device described in the foregoing embodiments.
  • the readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk or an optical disk, and the like.
  • An embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above embodiments of the interaction method for a wearable device and achieve the same technical effects; to avoid repetition, details are not repeated here.
  • The chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
  • The methods of the above embodiments can be implemented by means of software plus a necessary general hardware platform, and of course also by hardware, but in many cases the former is the better implementation.
  • the technical solution of the present application can be embodied in the form of a software product in essence or in a part that contributes to the prior art, and the computer software product is stored in a storage medium (such as ROM/RAM, magnetic disk, CD-ROM), including several instructions to make a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) execute the methods described in the various embodiments of this application.

Abstract

The present application discloses an interaction method and apparatus for a wearable device, belonging to the technical field of mobile devices. The interaction method includes: determining a target area; acquiring motion information and position information of a second wearable device located in the target area; displaying a map interface, the map interface including the target area; and displaying an identifier of the second wearable device and the motion information on the map interface according to the position information; wherein the target area is determined by position information of a first wearable device or by a user's input.

Description

穿戴设备的交互方法和装置
相关申请的交叉引用
本申请要求于2021年03月05日提交的申请号为2021102470891,发明名称为“穿戴设备的交互方法和装置”的中国专利申请的优先权,其通过引用方式全部并入本申请。
技术领域
本申请属于移动设备技术领域,具体涉及一种穿戴设备的交互方法和装置、电子设备和存储介质。
背景技术
随着第五代移动通信技术(5th Generation Mobile Communication Technology,5G)时代的到来,智能终端产品的越来越得到了普及。智能手表作为智能终端设备的代表,已经被越来越多的人所拥有。围绕着智能手表展开的应用场景和应用技术的需求也日渐增加。越来越多的用户借助智能手表、智能手环等穿戴设备来进行身体健康数据的监测和运动数据的记录。
相关技术中,智能穿戴设备与传统的电子设备,如手机、电脑、平板等有所区别,既没有很大的交互屏幕,也无法通过键盘输入。因此,智能穿戴设备带来的社会交互性远远弱于传统的电子设备,制约了智能穿戴设备的普及。
发明内容
本申请实施例的目的是提供一种穿戴设备的交互方法和装置、电子设备和存储介质,能够解决智能穿戴设备的交互性较差的问题。
为了解决上述技术问题,本申请是这样实现的:
第一方面,本申请实施例提供了一种穿戴设备的交互方法,应用于第一穿戴设备,包括:
确定目标区域;
获取处于所述目标区域内第二穿戴设备的运动信息、位置信息;
显示地图界面,所述地图界面包括目标区域;
根据所述位置信息在所述地图界面显示所述第二穿戴设备的标识及所述运动信息;
其中,所述目标区域由所述第一穿戴设备的位置信息或用户的输入确定。
第二方面,本申请实施例提供了一种穿戴设备的交互装置,设置于第一穿戴设备,包括:
确定模块,用于确定目标区域;
获取模块,用于获取处于所述目标区域内第二穿戴设备的运动信息、位置信息;
显示模块,用于显示地图界面,所述地图界面包括目标区域;
定位模块,用于根据所述位置信息在所述地图界面显示所述第二穿戴设备的标识及所述运动信息;其中,所述目标区域由所述第一穿戴设备的位置信息或用户的输入确定。
第三方面,本申请实施例提供了一种电子设备,该电子设备包括处理器、存储器及存储在所述存储器上并可在所述处理器上运行的程序或指令,所述程序或指令被所述处理器执行时实现如第一方面所述的方法的步骤。
第四方面,本申请实施例提供了一种可读存储介质,所述可读存储介质上存储程序或指令,所述程序或指令被处理器执行时实现如第一方面所述的方法的步骤。
第五方面,本申请实施例提供了一种芯片,所述芯片包括处理器和通信接口,所述通信接口和所述处理器耦合,所述处理器用于运行程序或指令,实现如第一方面所述的方法。
在本申请实施例中,可以获取目标区域内第二穿戴设备的运动信息、位置信息,并显示地图界面,根据位置信息在地图界面显示第二穿戴设备的标识及运动信息,从而可以在地图界面中显示的第二穿戴设备的标识及运动信息作为穿戴设备之间的交互入口,提高穿戴设备之间的交互性。
附图说明
图1是本申请实施例提供的穿戴设备的交互方法的流程示意图之一;
图2是本申请实施例提供的穿戴设备的交互方法的流程示意图之二;
图3是本申请实施例提供的确定第二穿戴设备的位置的应用实例示意图;
图4是本申请实施例提供的地图界面的示意图;
图5是本申请实施例提供的穿戴设备的交互方法的流程示意图之三;
图6是本申请实施例提供的穿戴设备的交互方法的流程示意图之四;
图7是本申请实施例提供的穿戴设备的交互操作实例的示意图之一;
图8是本申请实施例提供的穿戴设备的交互操作实例的示意图之二;
图9是本申请实施例提供的穿戴设备的交互操作实例的示意图之三;
图10是本申请实施例提供的穿戴设备的交互操作实例的示意图之四;
图11是本申请实施例提供的穿戴设备的交互装置的结构示意图;
图12是本申请实施例提供的电子设备的结构示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
本申请的说明书和权利要求书中的术语“第一”、“第二”等是用于区别类似的对象,而不用于描述特定的顺序或先后次序。应该理解这样使用的数据在适当情况下可以互换,以便本申请的实施例能够以除了在这里图示或描述的那些以外的顺序实施。此外,说明书以及权利要求中“和/或”表示所连接对象的至少其中之一,字符“/”,一般表示前后关联对象是一种“或”的关系。
下面结合附图,通过具体的实施例及其应用场景对本申请实施例提供的数据交互方法和装置、电子设备和存储介质进行详细地说明。
本申请实施例公开了一种穿戴设备的交互方法,参见图1,包括:
步骤101、确定目标区域。
其中,所述目标区域由所述第一穿戴设备的位置信息或用户的输入确定。
具体地,目标区域可以将第一穿戴设备的位置信息作为圆心,根据预先设置或者当前输入的数据为半径,确定目标区域。
例如预先设置半径为5米,则将距离第一穿戴设备的位置为5米范围内的区域均确定为目标区域。
当然,目标区域也可以为人为设置,例如通过输入的地名确定目标区域。
步骤102、获取处于所述目标区域内第二穿戴设备的运动信息、位置信息。
具体地,第二穿戴设备的运动信息包括:第二穿戴设备对应的用户身份标识号(Identity document,ID)、跑步距离的排名信息、跑步时长的排名信息、游泳圈数的排名信息等。
在第一穿戴设备的用户进入运动模式并开始运动时,第一穿戴设备就会开始记录用户的运动数据。同时会将运动数据同步到服务器,以使服务器进行特定区域内的运动排名。运动排名的数据会以固定的时间间隔推送给第一穿戴设备。但是并不会主动跳出显示,存储于第一穿戴设备的存储区域内。当用户希望打开运动排名的界面的时候,对应的运动信息才会主动显示。
本实施例中,获取第二穿戴设备的运动信息及位置信息的途径有两种:
一种是每一个穿戴设备都有绑定的账号信息。如果穿戴设备本身具备嵌入式用户识别卡(Embedded-Subscriber Identity Module,eSIM)联网和全球定位系统(Global Positioning System,GPS)能力,可以直接通过全球定位系统(Global Positioning System,GPS)获取设备的位置信息。通过全球定位系统(Global Positioning System,GPS)位置的对比和检索,精确到当前用户所处的体育馆、学校操场或者公园的具体名字。然后通过长期演进技术(Long Term Evolution,LTE)链路,将穿戴设备的运动数据、位置信息、日期信息、运动类别等信息进行序列化后,生成数据表上传到云端服务器。
云端服务器负责从数据包中提取运动场所、日期以及运动类别等信息, 并根据运动场所、日期和运动类别作为排序范围,根据运动数据进行排序。服务器会不停地维护排名数据,并以固定的时间间隔推送排名数据给第一穿戴设备。
该实施方案依赖一个中心服务器,设备需要依赖全球定位系统(Global Positioning System,GPS)主动获取位置并交给服务器进行处理。第一穿戴设备通过访问服务器获取第二穿戴设备的运动信息和位置信息。
另一种是第一穿戴设备不再需要主动依靠全球定位系统(Global Positioning System,GPS)来获取位置信息。但是需要在运动场所附近部署轻量级的服务器。例如在一个运动场馆周围布置服务器。这么实施的理由是服务器需要处理的数据量并不是很大,只需要对用户的运动数据进行排序即可。传统的大型服务器是集中部署的,虽然可以处理海量数据,但是会造成数据刷新的延迟过高。由于在运动场所附近部署轻量级的服务器,用户能以最小的延迟获取最新的排名情况。
对应地,步骤102包括:第一穿戴设备向所述目标区域对应的服务器发送获取请求,以获取所述目标区域内的第二穿戴设备的运动信息、位置信息。
具体地,第一穿戴设备在进入运动模式后,并不能获取当前区域的轻量级服务器的具体网际互连协议(Internet Protocol,IP)地址。因此在进入运动模式后,需要在当地网络中发送广播包。当轻量级服务器接收到广播请求,并进行身份验证后,开始返回数据包和自己的网际互连协议(Internet Protocol,IP)地址,并开始向第一穿戴设备发送提供运动场馆范围内的第二穿戴设备的运动信息或第二穿戴设备的邀请信息。
对应地,在室内或者目标区域较小的时候,为了增加定位的准确性,参见图2,第二穿戴设备的位置信息通过以下方法确定:
步骤201、生成广播包。
其中,所述广播包中包括第一定位请求信息。
步骤202、发送所述广播包至目标区域内的多个中继设备,以使所述多个中继设备解析所述广播包中的第一定位请求信息,生成第二定位请求信息分别发送至所述目标区域的服务器。
其中,由于服务器的部署,可以提前将目标区域的三维扫描模型存储 在服务器中。中继设备需要提前部署于目标区域内,例如在体育馆的关键方位部署中继设备。这些中继设备的目的是增强并转发中继信号,其最终的转发目标就是服务器。
步骤203、接收所述服务器根据多个中继设备发送的第二定位请求信息生成的所述第二穿戴设备的位置信息。
具体地,服务器可能会收到多个针对同一第二穿戴设备的第二定位请求信息,通过该多个第二定位请求信息的矫正,来获取第二穿戴设备的准确位置,并将第二穿戴设备的准确位置信息以网络数据包的形式发送至第一穿戴设备。
具体地,第二定位请求信息中需要包括该中继设备与第二穿戴设备之间的相对位置,以及中继设备与服务器之间的相对位置。
参见图3,以两个中继设备为例,中继设备1发送的第二定位请求信息中,包含中继设备与服务器的相对位置为(-4,-1),中继设备与第二穿戴设备的相对位置为(2,-3),可以得到第二穿戴设备的位置为(-4,-1)+(2,-3)=(-2,-4);中继设备2发送的第二定位请求信息中,包含的中继设备2与服务器的相对位置为(2,-1),中继设备2与第二穿戴设备的相对位置为(-4,-3),那么最终确定第二穿戴设备的位置为(2,-1)+(-4,-3)=(-2,-4)。
通过步骤201~203,可以满足第二穿戴设备在目标区域较小的情况下的定位需求。
步骤103、显示地图界面,所述地图界面包括目标区域。
具体地,显示地图界面可以根据用户的输入指令而执行,例如通过点选图标打开地图界面;也可以根据其他命令而自动执行,例如用户在第一穿戴设备接受了其他穿戴设备的邀请信息。
进一步地,在显示地图界面后,还可以根据用户的操作放大或缩小地图界面,以便于用户对地图界面显示的所有信息的查看。
步骤104、根据所述位置信息在所述地图界面显示所述第二穿戴设备的标识及所述运动信息。
具体地,参见图4,图4示出了在地图界面中显示用户A和用户B对应的第二穿戴设备的标识。若用户A和用户B进行运动,那么对应地在 地图界面中显示第二穿戴设备的运动信息,以及第二穿戴设备的标识在运动后的位置信息。
通过显示地图界面,可以显示第一穿戴设备和第二穿戴设备的位置信息,可以引导用户见面。
另外,参见图5,在步骤104后,所述方法包括:
步骤501、在接收到所述目标穿戴设备发送的邀请信息的情况下,显示所述邀请信息。
其中,第二穿戴设备包括目标穿戴设备。
其中,目标穿戴设备发送邀请信息,可以为通过在目标穿戴设备的界面内显示第一穿戴设备时,点选对应的邀请按钮的操作,例如邀请按钮作可以为“玩家挑战(Player Killing,PK)”的虚拟按钮。
步骤502、接收针对目标对象的第一输入。
其中,目标对象可以为目标穿戴设备对应的邀请信息或目标穿戴设备对应的标识。
对应地,第一输入也可以为多种情况,例如第一输入为点选目标对象对应的“确定”按钮的操作,或者对目标对象的长按操作、双击操作等。
步骤503、响应于所述第一输入,显示所述第一穿戴设备位置与所述目标对象对应的目标穿戴设备的位置之间的路线。
通过步骤501~503,可以进一步地在第一穿戴设备的显示界面中显示第一穿戴设备和目标穿戴设备之间的路线,以便于引导用户见面。
在用户见面后,可以根据特定交互方式来实现连接。例如在检测到第一穿戴设备与目标穿戴设备之间执行体感交互动作的情况下,建立第一穿戴设备与目标穿戴设备之间的连接。
具体地,体感交互动作可以为多种,例如为握手动作、摆手动作等等。
可选地,为了实现控制第一穿戴设备进入目标模式,参见图6,在步骤503之后,所述方法还包括:
步骤601、接收用户针对所述第一穿戴设备的第二输入。
其中,第二输入可以为用户在第一穿戴设备的地图界面中的点选操作,或者用户的其他输入操作等。
步骤602、响应于所述第二输入,获取所述第二输入对应的第一输入 轨迹。
其中,第一输入轨迹可以为通过地图界面中的操作而生成,例如根据点选的起始点和终点确定第一输入轨迹。
步骤603、获取所述目标穿戴设备发送的连接信息,所述连接信息为所述目标穿戴设备的接收的第二输入轨迹。
步骤604、在所述第一输入轨迹和所述第二输入轨迹匹配的情况下,控制所述第一穿戴设备进入目标模式。
其中,第二输入轨迹可以为在目标穿戴设备中显示的地图界面中的操作而生成,例如根据点选的起始点和终点确定第二输入轨迹。
其中,在目标模式下,第一穿戴设备将第一穿戴设备的运动信息与第二穿戴设备的运动信息进行比较。
通过比较第一穿戴设备和第二穿戴设备的运动信息,可以实现第一穿戴设备和第二穿戴设备之间的交互,增强用户之间线上与线下的联动。
本申请实施例的穿戴设备的交互方法,可以获取目标区域内第二穿戴设备的运动信息、位置信息,并显示地图界面,根据位置信息在地图界面显示第二穿戴设备的标识及运动信息,从而可以在地图界面中显示的第二穿戴设备的标识及运动信息作为穿戴设备之间的交互入口,提高穿戴设备之间的交互性。
图7~图10示出了一个本实施例的穿戴设备的交互操作实例的过程。
如图7所示,图7示出了一种使用场景下在穿戴手表中显示的跑步距离的排名信息。
步骤1、确定目标区域。
步骤2、获取目标区域内的第二穿戴设备的运动信息以及位置信息。
其中,运动信息包括跑步距离。
步骤3、打开运动排名界面后,可以通过选择运动种类,展示跑步距离的排名信息。
其中,排名信息包括名次、每个名次的用户身份标识号(Identity document,ID)以及对应的跑步距离,如图7所示。排名信息从上到下根据次序展示,用户可以滑动来进行浏览。进一步地,可以通过点击上方的“跑步距离”,快速定位并跳转到自己所处的位置。
另外,每一行还可以展示用户的简略信息,其中圆圈中可以展示用户自定义的头像。用户的身份标识号(Identity document,ID)可以为用户自定义的昵称。点击用户身份标识号(Identity document,ID)或头像,可以跳转到用户的具体信息页,如图8所示。用户的具体信息包括该用户的各种运动信息,可以从服务器获取。用户的具体信息页面在用户第一次上传运动数据的时候同时上传服务器并维护。
另外,用户的具体信息除去包括用户近期的运动数据外,也包括一些其他数据,如用户之间进行玩家挑战(Player Killing,PK)的战绩记录。对于欣赏和佩服的用户,可以点击下方的爱心对用户进行点赞。
步骤4、显示地图界面,根据位置信息在地图界面显示第二穿戴设备的标识及运动信息,如图9所示。
步骤5、在接收到目标穿戴设备发送的邀请信息的情况下,显示邀请信息。
在具体使用时,收到邀请信息的用户设备会直接弹出邀请提示。如果用户设备当前开启静音模式,则会将邀请信息暂时保留在通知队列中,并持续一段时间。如果用户超过设定时间未查看,则邀请信息会主动过期。
步骤6、接收针对目标对象的第一输入。
步骤7、响应于第一输入,显示第一穿戴设备位置与目标对象对应的目标穿戴设备的位置之间的路线,如图10所示。
通过显示路线,可以引导用户A和用户B见面。
在用户见面后,可以根据特定交互方式来进行确认。如穿戴手表检测到双方的握手行为,则会自动开始目标模式。在所述目标模式下,穿戴手表将穿戴手表的运动信息与目标穿戴设备的运动信息进行比较,并将比较结果显示于穿戴手表中。
需要说明的是,本申请实施例提供的穿戴设备的交互方法,执行主体可以为穿戴设备的交互装置,或者,或者该穿戴设备的交互装置中的用于执行加载穿戴设备的交互方法的控制模块。本申请实施例中以穿戴设备的交互装置执行加载穿戴设备的交互方法为例,说明本申请实施例提供的穿戴设备的交互方法。
本申请实施例公开了一种穿戴设备的交互装置,设置于第一穿戴设备, 参见图11,包括:
确定模块1101,用于确定目标区域;
获取模块1102,用于获取处于所述目标区域内第二穿戴设备的运动信息、位置信息;
显示模块1103,用于显示地图界面,所述地图界面包括目标区域;
定位模块1104,用于根据所述位置信息在所述地图界面显示所述第二穿戴设备的标识及所述运动信息;其中,所述目标区域由所述第一穿戴设备的位置信息或用户的输入确定。
可选地,所述装置还包括:邀请显示模块,用于在接收到所述目标穿戴设备发送的邀请信息的情况下,显示所述邀请信息;
第一接收模块,用于接收针对目标对象的第一输入;
第一响应模块,用于响应于所述第一输入,显示所述第一穿戴设备位置与所述目标对象对应的目标穿戴设备的位置之间的路线;
其中,所述第二穿戴设备包括所述目标穿戴设备,所述目标对象为所述目标穿戴设备对应的邀请信息或所述目标穿戴设备对应的标识。
可选地,所述装置还包括:
第二接收模块,用于在所述第一响应模块显示所述第一穿戴设备位置与所述目标对象对应的目标穿戴设备的位置之间的路线之后,接收用户针对所述第一穿戴设备的第二输入;
第二响应模块,用于响应于所述第二输入,获取所述第二输入对应的第一输入轨迹;
处理模块,用于获取所述目标穿戴设备发送的连接信息,所述连接信息为所述目标穿戴设备的接收的第二输入轨迹;
控制模块,用于在所述第一输入轨迹和所述第二输入轨迹匹配的情况下,控制所述第一穿戴设备进入目标模式;
其中,在所述目标模式下,所述第一穿戴设备将所述第一穿戴设备的运动信息与所述第二穿戴设备的运动信息进行比较。
可选地,所述获取模块具体用于:向所述目标区域对应的服务器发送获取请求,以获取所述目标区域内的第二穿戴设备的运动信息、位置信息。
可选地,所述装置还包括位置信息确定模块,用于:
生成广播包,其中,所述广播包中包括第一定位请求信息;
发送所述广播包至目标区域内的多个中继设备,以使所述多个中继设备解析所述广播包中的第一定位请求信息,生成第二定位请求信息分别发送至所述目标区域的服务器;
接收所述服务器根据多个中继设备发送的第二定位请求信息生成的所述第二穿戴设备的位置信息。
本申请实施例中的穿戴设备的交互装置可以是电子设备,也可以是电子设备中的部件,例如集成电路或芯片。该电子设备可以是终端,也可以为除终端之外的其他设备。示例性的,电子设备可以为手机、平板电脑、笔记本电脑、掌上电脑、车载电子设备、移动上网装置(Mobile Internet Device,MID)、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、机器人、可穿戴设备、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本或者个人数字助理(personal digital assistant,PDA)等,还可以为服务器、网络附属存储器(Network Attached Storage,NAS)、个人计算机(personal computer,PC)、电视机(television,TV)、柜员机或者自助机等,本申请实施例不作具体限定。
本申请实施例中的穿戴设备的交互装置可以为具有操作系统的装置。该操作系统可以为安卓(Android)操作系统,可以为ios操作系统,还可以为其他可能的操作系统,本申请实施例不作具体限定。
本申请实施例提供的穿戴设备的交互装置能够实现图1至图10的方法实施例中穿戴设备的交互装置实现的各个过程,为避免重复,这里不再赘述。
本申请实施例的穿戴设备的交互装置,可以获取目标区域内第二穿戴设备的运动信息、位置信息,并显示地图界面,根据位置信息在地图界面显示第二穿戴设备的标识及运动信息,从而可以在地图界面中显示的第二穿戴设备的标识及运动信息作为穿戴设备之间的交互入口,提高穿戴设备之间的交互性。
可选的,本申请实施例还提供一种电子设备,包括处理器1210,存储器1209,存储在存储器1209上并可在所述处理器1210上运行的程序或指令,该程序或指令被处理器1210执行时实现上述穿戴设备的交互方法实 施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
图12为本申请实施例提供的一种电子设备的硬件结构示意图,如图12所示,该电子设备1200包括但不限于:射频单元1201、网络模块1202、音频输出单元1203、输入单元1204、传感器1205、显示单元1206、用户输入单元1207、接口单元1208、存储器1209、以及处理器1210等中的至少部分部件。
本领域技术人员可以理解,电子设备1200还可以包括给各个部件供电的电源(比如电池),电源可以通过电源管理系统与处理器1210逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。图12中示出的电子设备结构并不构成对电子设备的限定,电子设备可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置,在此不再赘述。
应理解的是,本申请实施例中,输入单元1204可以包括图形处理器(Graphics Processing Unit,GPU)12041和麦克风12042,图形处理器12041对在视频捕获模式或图像捕获模式中由图像捕获装置(如摄像头)获得的静态图片或视频的图像数据进行处理。显示单元1206可包括显示面板12061,可以采用液晶显示器、有机发光二极管等形式来配置显示面板12061。用户输入单元1207包括触控面板12071以及其他输入设备12072。触控面板12071,也称为触摸屏。触控面板12071可包括触摸检测装置和触摸控制器两个部分。其他输入设备12072可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆,在此不再赘述。
本申请实施例中,射频单元1201将来自网络侧设备的下行数据接收后,给处理器1210处理;另外,将上行的数据发送给网络侧设备。通常,射频单元1201包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器、双工器等。
存储器1209可用于存储软件程序以及各种数据。存储器1209可主要包括存储程序或指令的第一存储区和存储数据的第二存储区,其中,第一存储区可存储操作系统、至少一个功能所需的应用程序或指令(比如声音播放功能、图像播放功能等)等。此外,存储器1209可以包括易失性存储器或非易 失性存储器,或者,存储器1209可以包括易失性和非易失性存储器两者。其中,非易失性存储器可以是只读存储器(Read-Only Memory,ROM)、可编程只读存储器(Programmable ROM,PROM)、可擦除可编程只读存储器(Erasable PROM,EPROM)、电可擦除可编程只读存储器(Electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(Random Access Memory,RAM),静态随机存取存储器(Static RAM,SRAM)、动态随机存取存储器(Dynamic RAM,DRAM)、同步动态随机存取存储器(Synchronous DRAM,SDRAM)、双倍数据速率同步动态随机存取存储器(Double Data Rate SDRAM,
DDRSDRAM)、增强型同步动态随机存取存储器(Enhanced SDRAM,
ESDRAM)、同步连接动态随机存取存储器(Synch link DRAM,SLDRAM)和直接内存总线随机存取存储器(Direct Rambus RAM,DRRAM)。本申请实施例中的存储器1209包括但不限于这些和任意其它适合类型的存储器。
处理器1210可包括一个或多个处理单元;可选地,处理器1210可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序或指令等,调制解调处理器主要处理无线通信,如基带处理器。可以理解的是,上述调制解调处理器也可以不集成到处理器1210中。
其中,
处理器1210,用于:确定目标区域;
处理器1210,用于:获取处于所述目标区域内第二穿戴设备的运动信息、位置信息;
显示单元1206,用于显示地图界面,所述地图界面包括目标区域;
显示单元1206,用于根据所述位置信息在所述地图界面显示所述第二穿戴设备的标识及所述运动信息;其中,所述目标区域由所述第一穿戴设备的位置信息或用户的输入确定。
可选地,在显示单元1206根据位置信息在地图界面显示第二穿戴设备的标识及运动信息后:
显示单元1206,用于在接收到所述目标穿戴设备发送的邀请信息的情况下,显示所述邀请信息;
用户输入单元1207,用于:接收针对目标对象的第一输入;
显示单元1206,用于响应于所述第一输入,显示所述第一穿戴设备位置与所述目标对象对应的目标穿戴设备的位置之间的路线;
其中,所述第二穿戴设备包括所述目标穿戴设备,所述目标对象为所述目标穿戴设备对应的邀请信息或所述目标穿戴设备对应的标识。
可选地,用户输入单元1207,用于接收用户针对第一穿戴设备的第二输入;
处理器1210,用于:响应于所述第二输入,获取所述第二输入对应的第一输入轨迹;
获取所述目标穿戴设备发送的连接信息,所述连接信息为所述目标穿戴设备的接收的第二输入轨迹;
在所述第一输入轨迹和所述第二输入轨迹匹配的情况下,控制所述第一穿戴设备进入目标模式;
其中,在所述目标模式下,所述第一穿戴设备将所述第一穿戴设备的运动信息与所述第二穿戴设备的运动信息进行比较。
可选地,处理器1210用于:向所述目标区域对应的服务器发送获取请求,以获取所述目标区域内的第二穿戴设备的运动信息、位置信息。
可选地,处理器1210用于:生成广播包,其中,所述广播包中包括第一定位请求信息;
发送所述广播包至目标区域内的多个中继设备,以使所述多个中继设备解析所述广播包中的第一定位请求信息,生成第二定位请求信息分别发送至所述目标区域的服务器;
接收所述服务器根据多个中继设备发送的第二定位请求信息生成的所述第二穿戴设备的位置信息。
本申请实施例的电子设备,可以获取目标区域内第二穿戴设备的运动信息、位置信息,并显示地图界面,根据位置信息在地图界面显示第二穿戴设备的标识及运动信息,从而可以在地图界面中显示的第二穿戴设备的标识及运动信息作为穿戴设备之间的交互入口,提高穿戴设备之间的交互性。
本申请实施例中的电子设备实施例是与上述方法实施例对应的产品实施例,上述方法实施例中的所有实现方式均适用于该电子设备实施例, 亦可达到相同或相似的技术效果,故在此不再赘述。
本申请实施例还提供一种可读存储介质,所述可读存储介质上存储有程序或指令,该程序或指令被处理器执行时实现上述穿戴设备的交互方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
其中,所述处理器为上述实施例中所述的电子设备中的处理器。所述可读存储介质,包括计算机可读存储介质,如计算机只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁碟或者光盘等。
本申请实施例另提供了一种芯片,所述芯片包括处理器和通信接口,所述通信接口和所述处理器耦合,所述处理器用于运行程序或指令,实现上述穿戴设备的交互方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
应理解,本申请实施例提到的芯片还可以称为系统级芯片、系统芯片、芯片系统或片上系统芯片等。
需要说明的是,在本文中,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者装置不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者装置所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括该要素的过程、方法、物品或者装置中还存在另外的相同要素。此外,需要指出的是,本申请实施方式中的方法和装置的范围不限按示出或讨论的顺序来执行功能,还可包括根据所涉及的功能按基本同时的方式或按相反的顺序来执行功能,例如,可以按不同于所描述的次序来执行所描述的方法,并且还可以添加、省去、或组合各种步骤。另外,参照某些示例所描述的特征可在其他示例中被组合。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产 品的形式体现出来,该计算机软件产品存储在一个存储介质(如ROM/RAM、磁碟、光盘)中,包括若干指令用以使得一台终端(可以是手机,计算机,服务器,空调器,或者网络设备等)执行本申请各个实施例所述的方法。
上面结合附图对本申请的实施例进行了描述,但是本申请并不局限于上述的具体实施方式,上述的具体实施方式仅仅是示意性的,而不是限制性的,本领域的普通技术人员在本申请的启示下,在不脱离本申请宗旨和权利要求所保护的范围情况下,还可做出很多形式,均属于本申请的保护之内。

Claims (15)

  1. An interaction method for a wearable device, applied to a first wearable device, comprising:
    determining a target area;
    acquiring motion information and position information of a second wearable device located in the target area;
    displaying a map interface, the map interface comprising the target area;
    displaying an identifier of the second wearable device and the motion information on the map interface according to the position information;
    wherein the target area is determined by position information of the first wearable device or by a user's input.
  2. The interaction method for a wearable device according to claim 1, wherein after displaying the identifier of the second wearable device and the motion information on the map interface according to the position information, the method comprises:
    in a case where invitation information sent by a target wearable device is received, displaying the invitation information;
    receiving a first input for a target object;
    in response to the first input, displaying a route between the position of the first wearable device and the position of the target wearable device corresponding to the target object;
    wherein the second wearable device comprises the target wearable device, and the target object is the invitation information corresponding to the target wearable device or the identifier corresponding to the target wearable device.
  3. The interaction method for a wearable device according to claim 2, wherein after displaying the route between the position of the first wearable device and the position of the target wearable device corresponding to the target object, the method further comprises:
    receiving a second input from the user with respect to the first wearable device;
    in response to the second input, acquiring a first input trajectory corresponding to the second input;
    acquiring connection information sent by the target wearable device, the connection information being a second input trajectory received by the target wearable device;
    in a case where the first input trajectory matches the second input trajectory, controlling the first wearable device to enter a target mode;
    wherein, in the target mode, the first wearable device compares the motion information of the first wearable device with the motion information of the second wearable device.
  4. The interaction method for a wearable device according to claim 1, wherein acquiring the motion information and position information of the second wearable device located in the target area comprises:
    sending an acquisition request to a server corresponding to the target area, so as to acquire the motion information and position information of the second wearable device in the target area.
  5. The interaction method for a wearable device according to claim 4, wherein the position information of the second wearable device is determined by the following method:
    generating a broadcast packet, wherein the broadcast packet comprises first positioning request information;
    sending the broadcast packet to multiple relay devices in the target area, so that the multiple relay devices parse the first positioning request information in the broadcast packet, generate second positioning request information, and send it respectively to the server in the target area;
    receiving the position information of the second wearable device generated by the server according to the second positioning request information sent by the multiple relay devices.
  6. An interaction apparatus for a wearable device, arranged on a first wearable device, comprising:
    a determination module, configured to determine a target area;
    an acquisition module, configured to acquire motion information and position information of a second wearable device located in the target area;
    a display module, configured to display a map interface, the map interface comprising the target area;
    a positioning module, configured to display an identifier of the second wearable device and the motion information on the map interface according to the position information; wherein the target area is determined by position information of the first wearable device or by a user's input.
  7. The interaction apparatus for a wearable device according to claim 6, wherein the apparatus further comprises: an invitation display module, configured to display invitation information in a case where the invitation information sent by a target wearable device is received;
    a first receiving module, configured to receive a first input for a target object;
    a first response module, configured to display, in response to the first input, a route between the position of the first wearable device and the position of the target wearable device corresponding to the target object;
    wherein the second wearable device comprises the target wearable device, and the target object is the invitation information corresponding to the target wearable device or the identifier corresponding to the target wearable device.
  8. The interaction apparatus for a wearable device according to claim 7, wherein the apparatus further comprises:
    a second receiving module, configured to receive a second input from the user with respect to the first wearable device after the first response module displays the route between the position of the first wearable device and the position of the target wearable device corresponding to the target object;
    a second response module, configured to acquire, in response to the second input, a first input trajectory corresponding to the second input;
    a processing module, configured to acquire connection information sent by the target wearable device, the connection information being a second input trajectory received by the target wearable device;
    a control module, configured to control the first wearable device to enter a target mode in a case where the first input trajectory matches the second input trajectory;
    wherein, in the target mode, the first wearable device compares the motion information of the first wearable device with the motion information of the second wearable device.
  9. The interaction apparatus for a wearable device according to claim 6, wherein the acquisition module is specifically configured to: send an acquisition request to a server corresponding to the target area, so as to acquire the motion information and position information of the second wearable device in the target area.
  10. The interaction apparatus for a wearable device according to claim 9, wherein the apparatus further comprises a position information determination module configured to:
    generate a broadcast packet, wherein the broadcast packet comprises first positioning request information;
    send the broadcast packet to multiple relay devices in the target area, so that the multiple relay devices parse the first positioning request information in the broadcast packet, generate second positioning request information, and send it respectively to the server in the target area;
    receive the position information of the second wearable device generated by the server according to the second positioning request information sent by the multiple relay devices.
  11. An electronic device, comprising a processor, a memory, and a program or instruction stored in the memory and executable on the processor, wherein the program or instruction, when executed by the processor, implements the steps of the interaction method for a wearable device according to any one of claims 1 to 5.
  12. A readable storage medium, storing a program or instruction, wherein the program or instruction, when executed by a processor, implements the steps of the interaction method for a wearable device according to any one of claims 1 to 5.
  13. A chip, comprising a processor and a communication interface, the communication interface being coupled to the processor, wherein the processor is configured to run a program or instruction to implement the interaction method for a wearable device according to any one of claims 1 to 5.
  14. A computer program product, wherein the program product is executed by at least one processor to implement the interaction method for a wearable device according to any one of claims 1 to 5.
  15. An electronic device, comprising means for performing the steps of the interaction method for a wearable device according to any one of claims 1 to 5.
PCT/CN2022/078474 2021-03-05 2022-02-28 Interaction method and apparatus for wearable device WO2022184030A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110247089.1 2021-03-05
CN202110247089.1A CN112988930A (zh) 2021-03-05 Interaction method and apparatus for wearable device

Publications (1)

Publication Number Publication Date
WO2022184030A1 (zh)



Also Published As

Publication number Publication date
CN112988930A (zh) 2021-06-18


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22762495; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 22762495; Country of ref document: EP; Kind code of ref document: A1)