CN109200567B - Motion data interaction method and device and electronic equipment - Google Patents

Motion data interaction method and device and electronic equipment

Info

Publication number
CN109200567B
CN109200567B
Authority
CN
China
Prior art keywords
motion
electronic device
data
motion data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710528462.4A
Other languages
Chinese (zh)
Other versions
CN109200567A (en
Inventor
丁玲
钱允
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai
Priority to CN201710528462.4A
Publication of CN109200567A
Application granted
Publication of CN109200567B
Legal status: Active
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63B: APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/06: Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619: Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B24/00: Electric or electronic controls for exercising apparatus; controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0021: Tracking a path or terminating locations
    • A63B24/0062: Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • A63B24/0087: Electric or electronic controls for exercising apparatus of groups A63B21/00 - A63B23/00, e.g. controlling load
    • A63B2220/30: Speed (measuring of physical parameters relating to sporting activity)

Landscapes

  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to the technical field of terminals, and in particular to a motion data interaction method and apparatus and an electronic device. The method comprises the following steps: acquiring motion data of a first electronic device and motion data of each second electronic device matched with the motion data of the first electronic device; determining a motion type according to the motion data, and generating an online virtual motion scene corresponding to the motion type, wherein the virtual motion scene comprises avatars of the first electronic device and each second electronic device; and presenting the avatars and the motion data of the first electronic device and each second electronic device in the virtual motion scene. Embodiments of the invention enable a user to find other users with similar motion types, initiate online exercise with them, present those users as avatars, and view the exercise ranking and motion data of the online exercise in real time, thereby enhancing motion data interaction among different users and improving the user experience.

Description

Motion data interaction method and device and electronic equipment
Technical Field
Embodiments of the invention relate to the technical field of terminals, and in particular to a motion data interaction method and apparatus and an electronic device.
Background
As the pace of life accelerates and work grows busier, more and more people pay attention to exercise, and sports products have emerged accordingly. Existing electronic device systems provide a motion recording mode, such as step-counting software, which can record the user's motion type and motion track; users can also record and rank step counts through step-counting software or social software.
In implementing the invention, the inventors found at least the following problems in the prior art: the motion recording software built into existing electronic devices only displays step counts in a ranking, and the only motion content a user can share with remote friends is a preset step count. Yet users exercise in many modes and types; they cannot exercise online together with remote friends, nor share different motion data content, such as motion type or calories burned, with different friends as needed. The motion data recording in existing electronic devices therefore offers monotonous interactive content, little freedom in sharing, and a poor user experience.
Disclosure of Invention
Embodiments of the invention mainly address the technical problem of providing a motion data interaction method and apparatus and an electronic device, which can solve the prior-art problems of monotonous motion data interaction content and limited sharing freedom in electronic devices.
To solve the above technical problem, one technical solution adopted by embodiments of the invention is to provide a motion data interaction method, comprising the following steps:
acquiring motion data of a first electronic device and motion data of each second electronic device matched with the motion data of the first electronic device;
determining a motion type according to the motion data, and generating an online virtual motion scene corresponding to the motion type, wherein the virtual motion scene comprises avatars of the first electronic device and each second electronic device;
and presenting the avatars and the motion data of the first electronic device and each second electronic device in the virtual motion scene.
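The three steps above can be sketched in Python. The data fields, speed threshold, and classification cut-offs below are illustrative assumptions, not values from the patent:

```python
from dataclasses import dataclass

@dataclass
class MotionData:
    device_id: str
    speed_kmh: float      # current motion speed
    distance_km: float    # distance covered so far

def classify_motion(speed_kmh: float) -> str:
    # Illustrative cut-offs only; the patent does not specify values.
    if speed_kmh < 6:
        return "walking"
    if speed_kmh < 12:
        return "running"
    return "cycling"

def interact(first: MotionData, candidates: list[MotionData],
             speed_threshold: float = 2.0) -> dict:
    """Acquire matched motion data, determine a motion type,
    and build a virtual scene containing one avatar per device."""
    # Step 1: keep second devices whose motion data matches the first's.
    matched = [c for c in candidates
               if abs(c.speed_kmh - first.speed_kmh) <= speed_threshold]
    # Step 2: determine the motion type from the motion parameters.
    motion_type = classify_motion(first.speed_kmh)
    # Step 3: generate a scene presenting each device's avatar and data.
    return {
        "motion_type": motion_type,
        "avatars": {d.device_id: {"speed": d.speed_kmh,
                                  "distance": d.distance_km}
                    for d in [first, *matched]},
    }
```

Under these assumptions, a device moving at 10 km/h would be placed in a "running" scene together with any candidate within 2 km/h of its speed.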
Optionally, acquiring the motion data of the first electronic device and the motion data of each second electronic device matched with it includes:
acquiring the motion data of the first electronic device and of each second electronic device that satisfies a preset condition;
and acquiring the motion data of each second electronic device that both satisfies the preset condition and matches the motion data of the first electronic device.
Optionally, determining the motion type according to the motion data and generating the online virtual motion scene corresponding to the motion type, wherein the virtual motion scene comprises avatars of the first electronic device and each second electronic device, includes:
obtaining motion parameters from the motion data, and determining the motion type according to the motion parameters;
and generating an online virtual motion scene corresponding to the motion type, or matching a prestored online virtual motion scene according to the motion type, wherein the virtual motion scene comprises avatars of the first electronic device and each second electronic device.
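The "generate or match a prestored scene" alternative can be illustrated with a small lookup; the scene library and its fields are hypothetical:

```python
# Hypothetical library of prestored online virtual motion scenes.
PRESTORED_SCENES = {
    "running": {"background": "track", "lanes": 8},
    "cycling": {"background": "road", "lanes": 4},
}

def get_scene(motion_type: str) -> dict:
    """Match a prestored scene for the motion type when one exists,
    otherwise generate a default scene for that type."""
    if motion_type in PRESTORED_SCENES:
        return dict(PRESTORED_SCENES[motion_type])
    # No prestored scene: generate a generic one for this motion type.
    return {"background": "generic", "motion_type": motion_type}
```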
Optionally, presenting the avatars and the motion data of the first electronic device and each second electronic device in the virtual motion scene includes:
acquiring the avatars in the virtual motion scene, and either automatically assigning different avatars to the first electronic device and each second electronic device or assigning them according to a user instruction;
and, when the first electronic device and each second electronic device are in motion, presenting the corresponding motion data in real time on each device's avatar in the virtual motion scene.
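The avatar assignment step (automatic, or following a user instruction) might look like the following sketch; the avatar names and data shapes are placeholders:

```python
import itertools

AVAILABLE_AVATARS = ["fox", "rabbit", "panda", "tiger"]  # placeholder avatars

def assign_avatars(device_ids, user_choice=None):
    """Give each device a distinct avatar: honour an explicit user
    instruction when provided, otherwise allocate automatically."""
    if user_choice:                        # e.g. {"d1": "panda"}
        return dict(user_choice)
    pool = itertools.cycle(AVAILABLE_AVATARS)
    return {dev: next(pool) for dev in device_ids}

def render_frame(assignments, latest_data):
    """Attach the latest motion data to each device's avatar, as the
    virtual scene would do in real time while the devices move."""
    return {dev: {"avatar": avatar, **latest_data.get(dev, {})}
            for dev, avatar in assignments.items()}
```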
Optionally, each second electronic device that satisfies the preset condition includes:
each second electronic device whose distance from the first electronic device is within a preset distance;
and/or each second electronic device in a friend list maintained by the first electronic device.
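The two preset conditions, distance within a preset range and/or membership in a friend list, reduce to a filter over candidate devices. The 5 km default and (latitude, longitude) coordinate format below are assumptions for illustration:

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2)
         * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def eligible_devices(first_pos, candidates, friends, max_km=5.0):
    """Keep second devices within the preset distance of the first
    device and/or listed in its friend list."""
    return [dev for dev, pos in candidates.items()
            if haversine_km(first_pos, pos) <= max_km or dev in friends]
```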
Optionally, acquiring the motion data of each second electronic device that both satisfies the preset condition and matches the motion data of the first electronic device includes:
obtaining the motion speed of the first electronic device from its motion data;
obtaining the motion speed of each second electronic device from its motion data;
and, if the difference between the motion speed of the first electronic device and that of a second electronic device is within a threshold, matching the motion data of the two devices.
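The speed-matching criterion is a simple threshold comparison; the 1.5 km/h default below is an arbitrary example, as the patent does not fix a value:

```python
def match_by_speed(first_speed, device_speeds, threshold=1.5):
    """Match each second device whose speed differs from the first
    device's speed by no more than the threshold."""
    return {dev: s for dev, s in device_speeds.items()
            if abs(s - first_speed) <= threshold}
```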
To solve the above technical problem, another technical solution adopted by embodiments of the invention is to provide a motion data interaction apparatus applied to an electronic device, the apparatus comprising:
a data acquisition module for acquiring motion data of a first electronic device and motion data of each second electronic device matched with the motion data of the first electronic device;
a virtual scene generation module for determining a motion type according to the motion data and generating an online virtual motion scene corresponding to the motion type, wherein the virtual motion scene comprises avatars of the first electronic device and each second electronic device;
and a data presentation module for presenting the avatars and the motion data of the first electronic device and each second electronic device in the virtual motion scene.
Optionally, the data acquisition module includes:
a first data acquisition unit for acquiring the motion data of the first electronic device and of each second electronic device that satisfies a preset condition;
and a second data acquisition unit for acquiring the motion data of each second electronic device that both satisfies the preset condition and matches the motion data of the first electronic device.
Optionally, the virtual scene generation module includes:
a motion type acquisition unit for obtaining motion parameters from the motion data and determining the motion type according to the motion parameters;
and a virtual motion scene generation unit for generating an online virtual motion scene corresponding to the motion type or matching a prestored online virtual motion scene according to the motion type, wherein the virtual motion scene comprises avatars of the first electronic device and each second electronic device.
Optionally, the data presentation module includes:
a virtual motion scene acquisition unit for acquiring the avatars in the virtual motion scene and either automatically assigning different avatars to the first electronic device and each second electronic device or assigning them according to a user instruction;
and a motion data presentation unit for presenting, when the first electronic device and each second electronic device are in motion, the corresponding motion data in real time on each device's avatar in the virtual motion scene.
Optionally, each second electronic device that satisfies the preset condition includes:
each second electronic device whose distance from the first electronic device is within a preset distance;
and/or each second electronic device in a friend list maintained by the first electronic device.
Optionally, the second data acquisition unit includes:
a first speed acquisition unit for obtaining the motion speed of the first electronic device from its motion data;
a second speed acquisition unit for obtaining the motion speed of each second electronic device from its motion data;
and a matching unit for matching the motion data of the first electronic device with that of a second electronic device if the difference between their motion speeds is within a threshold.
To solve the above technical problem, a further technical solution adopted by embodiments of the invention is to provide an electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the motion data interaction method described above.
Another embodiment of the invention provides a computer program product comprising a computer program stored on a non-volatile computer-readable storage medium, the computer program comprising program instructions which, when executed by a processor, cause the processor to perform the motion data interaction method described above.
Another embodiment of the invention provides a non-transitory computer-readable storage medium storing computer-executable instructions that, when executed by one or more processors, cause the one or more processors to perform the motion data interaction method described above.
Embodiments of the invention provide a motion data interaction method and apparatus and an electronic device. They allow a user to initiate online exercise while exercising and to view motion data in a virtual motion scene in real time, promoting motion interaction among users and making online exercise more engaging.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required to be used in the embodiments of the present application will be briefly described below. It is obvious that the drawings described below are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
fig. 2 is a schematic view of a virtual scene according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a motion data interaction method according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart of step S100 in FIG. 3;
FIG. 5 is a schematic flow chart of step S102 in FIG. 4;
FIG. 6 is a schematic flow chart of step S200 in FIG. 3;
FIG. 7 is a schematic flow chart of step S300 in FIG. 3;
FIG. 8 is a functional block diagram of a motion data interaction apparatus according to an embodiment of the present invention;
FIG. 9 is a functional block diagram of the data acquisition module 100 of FIG. 8;
fig. 10 is a functional structure diagram of the second data obtaining unit 102 in fig. 9;
FIG. 11 is a functional block diagram of the virtual scene generation module 200 in FIG. 8;
FIG. 12 is a functional block diagram of the data presentation module 300 of FIG. 8;
fig. 13 is a hardware configuration diagram of an electronic device according to still another embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The motion data interaction method of embodiments of the invention can be executed on any suitable type of electronic device that has a user interaction means and a processor with computing capability, such as: portable phones, smart phones, tablet computers, notebook computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, and the like.
The motion data interaction apparatus of embodiments of the invention may be provided in the electronic device independently as a software or hardware functional unit, or may be integrated into the processor as a functional module, to execute the motion data interaction method of embodiments of the invention.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. As shown in fig. 1, the electronic apparatus 1000 includes a wireless communication unit 11, an audio/video (a/V) input unit 12, a user input unit 13, a sensing unit 14, an output unit 15, a display unit 16, a memory 17, an interface unit 18, and a controller 19.
The electronic device 1000 may be a multi-mode portable terminal that connects to communication networks using at least two communication methods or at least two operators, or a multi-standby portable terminal that is simultaneously connected to communication networks of at least two communication methods or at least two operators.
For illustration, a terminal according to an embodiment of the invention is described taking a multi-standby terminal as an example. The multi-standby terminal is a portable terminal that is simultaneously connected to three communication networks selected from a plurality of communication methods including, for example, Code Division Multiple Access (CDMA), Global System for Mobile Communications (GSM), Wideband Code Division Multiple Access (WCDMA), or Wireless Broadband (WiBro).
The wireless communication unit 11 may include at least one module capable of enabling wireless communication between the terminal and a wireless communication system or between the terminal and a network in which the electronic device is located. For example, the wireless communication unit 11 includes a broadcast receiving module, a mobile communication module, a wireless internet module, a short-range communication module, and a location information module.
The broadcast receiving module receives a broadcast signal and/or broadcast-associated information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server is a server that generates and transmits a broadcast signal and/or broadcast-associated information, or that receives a previously generated broadcast signal and/or broadcast-associated information and transmits it to the terminal. The broadcast signal may include not only a television broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal in which a data broadcast signal is combined with a television or radio broadcast signal.
The broadcast associated information may refer to information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast associated information may also be provided through a mobile communication network. In this case, the broadcast associated information may be received by the mobile communication module. The broadcasting related information may exist in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or an Electronic Service Guide (ESG) of digital video broadcasting-handheld (DVB-H).
The broadcast receiving module may receive a digital broadcast signal using a digital broadcasting system such as terrestrial digital multimedia broadcasting (DMB-T), satellite digital multimedia broadcasting (DMB-S), media forward link only (MediaFLO), digital video broadcasting-handheld (DVB-H), or terrestrial integrated services digital broadcasting (ISDB-T). The broadcast receiving module may also be configured to suit broadcasting systems other than the digital broadcasting systems described above. The broadcast signal and/or the broadcast-associated information received through the broadcast receiving module may be stored in the memory.
The mobile communication module transmits wireless signals to and receives wireless signals from at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signals may include voice call signals, video call signals, or various forms of data according to the transmission and reception of text/multimedia messages.
The wireless internet module is a module for wireless internet connection and may be built into or external to the terminal. Wireless internet technologies such as wireless LAN (WLAN, Wi-Fi), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), or High Speed Downlink Packet Access (HSDPA) may be used.
The short-range communication module refers to a module for performing short-range communication. Short range communication technologies such as Bluetooth (Bluetooth), Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), or ZigBee may be used. The positioning information module is a module for obtaining a position of the electronic device, such as a Global Positioning System (GPS) module.
In addition, an audio/video (a/V) input unit 12 is used to input an audio signal or a video signal, and may include a camera and a microphone (microphone). The camera processes a still image or a video frame such as a moving picture obtained by an image sensor in a video call mode or a photographing mode. The processed video frame may be displayed on a display unit. The video frames processed by the camera may be stored in the memory or may be transmitted to the outside through the wireless communication unit. Two or more cameras may be included depending on the user environment.
The microphone receives an external sound signal in a call mode, recording mode, or voice recognition mode and processes it into electrical voice data. In the call mode, the processed sound data may be converted into a format transmittable to a mobile communication base station through the mobile communication module and output. Various noise removal algorithms may be implemented in the microphone to remove noise generated while receiving the external sound signal.
The user input unit 13 generates input data for controlling the operation of the terminal by the user. The user input unit may include, for example, a keyboard, a dome switch (dome switch), a touch pad (constant voltage/constant current), a jog wheel (jog wheel), or a jog switch (jog switch). The user input unit may include an identification module selection switch for generating a selection signal for selecting a specific identification module among the plurality of selection modules.
The sensing unit 14 may detect a current state of the electronic device, such as an open/close state of the electronic device, a position of the terminal, whether contact is made with a user, a direction of the terminal, or acceleration/deceleration of the electronic device, to generate a sensing signal for controlling an operation of the terminal. For example, when the terminal is a slide phone type, whether the slide phone is opened or closed may be sensed. In addition, whether the power supply unit supplies power or whether an external device is connected to the interface unit may be sensed. The sensing unit may include, for example, a touch sensor and a proximity sensor. The touch sensor is a sensor for detecting a touch operation. For example, the touch sensor may have the form of a touch film, a touch sheet, or a touch unit.
The touch sensor may have an interlayer structure (hereinafter, referred to as a "touch screen") together with the display unit. The touch sensor may be configured to convert a pressure applied to a specific portion of the display unit or a change in capacitance generated at the specific portion of the display unit into an electrical input signal. The touch sensor may be configured to detect not only a position and an area of a touch but also a pressure of the touch.
When there is a touch input on the touch sensor, a signal (or signals) corresponding thereto is sent to the touch controller. The touch controller processes the signal(s) and then sends corresponding data to the controller. Thus, the controller may determine which region of the display unit is touched.
In some embodiments, the sensing unit 14 of the electronic device 1000 may include two touch units, namely a first touch unit 141 and a second touch unit 142, where the first touch unit 141 is capable of responding to a first touch instruction input by a user, and the second touch unit 142 is capable of responding to a second touch instruction input by the user, and each touch instruction may be a track operation, a click operation, a double-click operation, and the like performed on a touch-sensitive surface of the touch unit.
The proximity sensor may be arranged in an inner area of the terminal surrounded by or close to the touch screen. A proximity sensor refers to a sensor for detecting the presence of an object approaching a predetermined detection surface or existing in the vicinity using a force in an electromagnetic field or infrared light without mechanical contact. In addition, proximity sensors have a longer lifetime and higher utility than contact sensors.
Examples of the proximity sensor include a transmission type photosensor, a direct reflection type photosensor, a mirror reflection type photosensor, a high-frequency oscillation type proximity sensor, a capacitance type proximity sensor, a magnetic proximity sensor, and an infrared light proximity sensor. When the touch screen is capacitive, the touch screen is configured to detect the proximity of a pointer by a change in an electric field due to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.
"proximity touch" refers to an action that causes a pointer that does not contact the touch screen but is in proximity to the touch screen to be recognized as being located on the touch screen. "contact touch" refers to an action of actually contacting a pointer on a touch screen. The position where the pointer makes the proximity touch on the touch screen represents a position where the pointer vertically corresponds to the touch screen when the pointer makes the proximity touch.
The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement, etc.). Information corresponding to the detected proximity touch and the proximity touch pattern may be output on the touch screen.
The display unit 16 displays (outputs) information processed in the terminal. For example, when the terminal is in a call mode, a user interface (UI) or graphical user interface (GUI) related to the call is displayed. When the terminal is in a video call mode or photographing mode, photographed and/or received images and a UI or GUI are displayed.
The display unit 16 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor liquid crystal display (TFT LCD), an organic light emitting diode display (OLED), a flexible display, and a three-dimensional (3D) display.
There may be two or more display units depending on the implementation form of the terminal. For example, in a terminal, a plurality of displays may be separately or integrally arranged on a surface, or respectively arranged on different surfaces.
The output unit 15 is used to generate an output related to a visual sense, an auditory sense, or a touch, and may include a sound output module, an alarm unit, and a haptic module.
The sound output module may output audio data received from the wireless communication unit or output audio data stored in the memory upon receiving a call signal in a call mode, a recording mode, a voice recognition mode, or a broadcast reception mode. The sound output module may output a sound signal related to a certain function performed by the electronic device (e.g., a sound of receiving a call signal, a sound of receiving a message, etc.). The sound output module may include a receiver, a speaker, or a buzzer.
The alarm unit outputs a signal for notifying the occurrence of an event of the terminal. Examples of events occurring in the electronic device include receiving a call signal, receiving a message, a key signal input, and a touch input. The alarm unit may output a signal other than a video signal or an audio signal, for example, a signal notifying occurrence of an event by vibration. The video signal or the audio signal may be output through the display unit or the sound output module. Thus, the display unit or the sound output module may be classified as a part of the alarm unit.
The haptic module generates various haptic effects that a user can feel. A typical example of a haptic effect generated by a haptic module is vibration. The intensity and pattern of the vibrations generated by the haptic module may be controlled. For example, different vibrations may be output synthetically or sequentially.
The haptic module may generate various haptic effects other than vibration, such as the effects of various stimuli: for example, an array of pins that move vertically with respect to the skin-contacting surface, air jet or suction through an orifice or suction hole, rubbing across the skin surface, contact of electrodes, or electrostatic forces and the use of elements that can absorb or release heat to reproduce the effects of cold and heat.
The haptic module can transmit a haptic effect not only through direct touch but also through, for example, the muscle sense of a user's finger or arm. Two or more haptic modules may be provided depending on the configuration of the terminal.
The memory 17 may store a program for operating the controller, and may temporarily store input/output data (e.g., address book, message, still image, video, etc.). The memory may also store data related to various patterns of vibration and sound output when a touch input is applied to the touch screen.
The memory may include at least one of the following types of storage media: flash memory type memory, hard disk type memory, micro multimedia card type memory, card type memory (e.g., SD or XD memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), Programmable Read Only Memory (PROM), magnetic memory, magnetic disk, and optical disk. The terminal may operate in association with a network storage that performs a storage function of the memory on the internet.
The interface unit 18 serves as a path between the terminal and all external devices connected to it. The interface unit receives data from an external device, receives power and transfers it to each element within the terminal, or transmits data from within the terminal to the external device. For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting to a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port may be included in the interface unit.
The identification module is a chip storing various information for authenticating a user's access to the terminal, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), and a Universal Subscriber Identity Module (USIM). A device having an identification module (hereinafter referred to as "identification device") may be manufactured in the form of a smart card. Thus, the identification device may be connected with the terminal via the port.
The interface unit may be used as a path for supplying power from the cradle to the terminal when the terminal is connected to an external cradle, or a path for transmitting various command signals input by a user through the cradle to the electronic device. Various command signals or power input from the cradle may be used as a signal for identifying whether the terminal is properly mounted to the cradle.
Further, the controller 19 controls the overall operation of the electronic apparatus. For example, the controller 19 may perform control and processing related to a voice call, data communication, or video call. The controller 19 may include a multimedia module for playing multimedia; the multimedia module may be implemented within the controller 19 or separately from it.
In a hardware implementation, the embodiments described herein may be implemented using at least one of the following: application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units performing these functions. In some cases, these embodiments may be implemented by the controller 19.
In embodiments of the present invention, the electronic device supports the installation of various desktop applications, such as one or more of the following desktop applications: a drawing application, a presentation application, a word processing application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, a training support application, a photo management application, a digital camera application, a digital video recorder application, a web browsing application, a digital music player application, a digital video player application, and the like.
The electronic device disclosed in the embodiments of the present invention can implement various functional applications or scenarios on this basis.
Referring to fig. 2, fig. 2 is a schematic view of a virtual scene according to an embodiment of the present invention. As shown in fig. 2, the virtual scene may be implemented in the electronic device 1000 shown in fig. 1. The first electronic device 1001 and the second electronic device 1002 in the embodiment of the present invention are each an electronic device 1000 as shown in fig. 1. For simplicity, only two electronic devices are used as an example in the description.
When the user uses the first electronic device 1001 to record exercise, the first electronic device 1001 acquires the current user's motion data in real time, such as exercise frequency, calorie consumption, exercise type, and exercise speed. At this time, the first electronic device may, as needed, discover the second electronic devices 1002 in a predetermined area or the second electronic devices 1002 corresponding to friends in its friend list.
The motion data of the second electronic device 1002 is then acquired and matched against the motion data of the first electronic device. If both belong to the same motion type, the first electronic device generates a virtual scene. For example, if the motion type acquired by the first electronic device 1001 is running, a circular track is displayed on the display screen of the first electronic device; starting from the initial position, after a period of time the first electronic device 1001 and the second electronic device 1002 are positioned on the track according to the difference between their speeds. Fig. 2 shows the virtual scene when the first electronic device 1001 moves at a lower speed than the second electronic device 1002. The arrows indicate the directions of movement of the first and second electronic devices; the first electronic device is ranked 2nd and the second electronic device is ranked 1st.
In order to implement the application scenario, an embodiment of the present invention provides a flowchart of an interaction method for motion data, which is applied to an electronic device, and as shown in fig. 3, the embodiment includes:
step S100, obtaining the motion data of the first electronic device and the motion data of each second electronic device matched with the motion data of the first electronic device.
In a specific implementation, the user's motion data can be sensed automatically by a high-sensitivity acceleration sensor built into the first electronic device, or acquired through a smart wearable device bound to the electronic device, such as (but not limited to) a smart bracelet. Taking an acceleration sensor built into a mobile phone as an example, the method obtains the user's motion data from the acceleration sensor, specifically the user's step count; from this and the elapsed time it can calculate the motion speed and motion intensity, as well as the calories consumed and the corresponding fat mass, and, combined with GPS positioning, it can automatically generate the user's activity track.
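The step-based calculations described above can be sketched as follows. The stride length and per-step calorie figures are illustrative assumptions, not values given in the embodiment:

```python
STRIDE_M = 0.7        # assumed average stride length in meters
KCAL_PER_STEP = 0.04  # assumed calorie burn per step

def motion_metrics(steps, seconds):
    """Derive (speed m/s, cadence steps/min, kcal) from a step count
    recorded over a time window, as the passage describes."""
    distance_m = steps * STRIDE_M
    speed = distance_m / seconds if seconds else 0.0
    cadence = steps * 60.0 / seconds if seconds else 0.0
    kcal = steps * KCAL_PER_STEP
    return speed, cadence, kcal
```

For example, 120 steps in 60 seconds would yield a speed of 1.4 m/s and a cadence of 120 steps per minute under these assumed constants.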
Motion data that matches the first electronic device means a similar motion speed or the same motion type; the user can set the matching criterion as needed. The second electronic devices whose motion data matches provide a common reference for ranking in the virtual scene and give the user a sense of company during exercise.
Referring to fig. 2, the user may press the physical key 12 to trigger the first electronic device to acquire the motion data of each second electronic device matching its own motion data. Alternatively, the user may trigger the acquisition by operating a virtual key, or by inputting a preset track or a preset password on the touch screen of the first electronic device. Specifically, when the first electronic device detects that the track input by the user on its touch screen matches a preset reference track, it acquires the motion data of each second electronic device matching its own motion data; if the track does not match the reference track, no processing is done.
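The track-matching trigger can be illustrated with a minimal sketch. The point-wise comparison and the pixel tolerance are assumptions for illustration; a practical recognizer would first resample and normalise the strokes:

```python
def track_matches(points, reference, tol=20.0):
    """Return True if each sampled touch point (x, y) lies within `tol`
    pixels of the corresponding preset reference point."""
    if len(points) != len(reference):
        return False
    return all(abs(px - rx) <= tol and abs(py - ry) <= tol
               for (px, py), (rx, ry) in zip(points, reference))
```

Only when this check succeeds would the first electronic device proceed to acquire the matching second devices' motion data; otherwise it does nothing, as the embodiment describes.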
Step S200, determining a motion type according to the motion data, and generating an online virtual motion scene corresponding to the motion type according to the motion type, wherein the virtual motion scene comprises the first electronic device and the avatar of each second electronic device.
In a specific implementation, the motion type is determined according to the motion speed in the motion data of the first electronic device; motion types include fast running, jogging, walking, and the like. After the motion type is obtained, a corresponding online virtual motion scene is generated. The virtual motion scene also includes an avatar for the first electronic device and for each second electronic device. For example, if the motion data indicates that the user's motion type is walking, the online virtual motion scene may be a long straight path; if the user is running, the virtual motion scene may be a circular playground track. The avatar generally takes the form of a cartoon character, so that each electronic device is identified vividly.
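The mapping from speed to motion type and scene can be sketched as follows. The speed intervals and scene names are assumed for illustration, since the embodiment does not fix concrete thresholds:

```python
def motion_type(speed_mps):
    """Classify a motion speed (m/s) into a motion type; the interval
    boundaries are illustrative assumptions."""
    if speed_mps < 2.0:
        return "walking"
    if speed_mps < 3.5:
        return "jogging"
    return "running"

# Each motion type maps to a virtual scene, per the embodiment's examples.
SCENES = {
    "walking": "straight long path",
    "jogging": "circular playground track",
    "running": "circular playground track",
}
```

A speed of 1 m/s would thus select the straight-path scene, while 4 m/s would select the circular track.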
Step S300, presenting the avatar and the motion data of the first electronic device and of each second electronic device in the virtual motion scene.
In a specific implementation, when the first electronic device and each matched second electronic device move, the corresponding avatars in the virtual motion scene move in the same proportion as the real motion data, and each avatar displays the motion data of its electronic device in real time. The types of motion data displayed can be set according to the user's selection, and include one or more of motion speed, motion duration, calorie consumption, step count, and the like.
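The proportional mapping of real motion onto the scene can be sketched as follows; the course length and scene length are hypothetical parameters, and laps are assumed to wrap around on a circular track:

```python
def avatar_position(real_distance_m, course_length_m, scene_length_px):
    """Map a real distance onto the virtual scene at the same proportion.
    The modulo makes completed laps wrap around the track."""
    fraction = (real_distance_m % course_length_m) / course_length_m
    return fraction * scene_length_px
```

For instance, with a hypothetical 400 m course rendered across 800 px, a runner who has covered 100 m would be drawn a quarter of the way along the track.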
Optionally, after step S300, the method further includes: and when the end of the movement of the first electronic equipment is detected, controlling the first electronic equipment to send different movement data to corresponding second electronic equipment according to a preset priority.
In specific implementation, when it is detected that the first electronic device does not move any more, different motion data of the first electronic device can be sent to the corresponding second electronic device according to the corresponding preset priorities of different electronic devices.
Priorities can be set according to the closeness of the relationship between the user and friends, colleagues, or strangers. For example, the user's friend list may define several priorities: second electronic devices corresponding to friends have high priority, those corresponding to colleagues have medium priority, and those corresponding to clients and strangers have low priority. Besides the step count, the user of the first device also generates other parameters during exercise, such as calorie consumption.
When the second electronic device is detected to have the highest priority, the first electronic device sends all parameters recorded during the exercise to it, thereby realizing interaction with the second electronic device. When the second electronic device has medium priority, the first electronic device sends only the step count recorded during the exercise. When the second electronic device has low priority, the first electronic device does not send the parameters generated during the exercise to it, thereby protecting the user's privacy.
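The priority-based filtering described above can be sketched as follows. The field names are hypothetical, and low priority is assumed to receive nothing, following the privacy rationale of the embodiment:

```python
# Assumed field sets per priority level: high priority receives all
# recorded parameters, medium only the step count, low nothing.
FIELDS_BY_PRIORITY = {
    "high": ("steps", "speed", "duration", "calories"),
    "medium": ("steps",),
    "low": (),
}

def data_for_peer(motion_data, priority):
    """Select which recorded parameters to send to a peer of the given
    priority; unknown priorities are treated as low."""
    allowed = FIELDS_BY_PRIORITY.get(priority, ())
    return {k: v for k, v in motion_data.items() if k in allowed}
```

When the end of the exercise is detected, the first device would call this once per matched peer with that peer's configured priority.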
Optionally, an embodiment of the present invention further provides a flowchart of step S100 in fig. 3, and as shown in fig. 4, step S100 specifically includes:
step S101, acquiring motion data of the first electronic equipment and each second electronic equipment meeting preset conditions;
and step S102, acquiring the motion data of each second electronic device which is matched with the motion data of the first electronic device and meets the preset conditions.
In a specific implementation, step S101 obtains the motion data of the first electronic device and identifies the second electronic devices that meet the preset condition. The preset condition is that a second electronic device is within a preset distance of the first electronic device, or that it corresponds to an entry in the friend list maintained by the first electronic device.
During exercise, the user can interact with the second electronic devices corresponding to the friend list in the first electronic device; if their data cannot be obtained, the motion data of second electronic devices within the preset distance of the first electronic device can be acquired instead, reducing the boredom the user feels when exercising alone.
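The preset condition (friend list, with nearby devices as an alternative) can be sketched as follows. The flat coordinate representation and the 500 m default distance are assumptions for illustration; a real implementation would use GPS coordinates and a geodesic distance:

```python
import math

def meets_preset(peer, me, friends, max_dist_m=500.0):
    """A second device qualifies if its id is in the friend list, or if it
    lies within the assumed preset distance of the first device."""
    if peer["id"] in friends:
        return True
    dx = peer["x"] - me["x"]
    dy = peer["y"] - me["y"]
    return math.hypot(dx, dy) <= max_dist_m
```

The first device would apply this predicate to each discovered second device before requesting its motion data.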
In step S102, according to the motion data in the second electronic device that meets the preset condition, a second electronic device that matches the motion data in the first electronic device, that is, a second electronic device with a similar motion speed or motion type, is obtained.
Optionally, an embodiment of the present invention further provides another flow chart of step S102 in fig. 4, and as shown in fig. 5, step S102 specifically includes:
step S121, obtaining the movement speed of the first electronic equipment according to the movement data of the first electronic equipment;
step S122, respectively acquiring the motion speed of each second electronic device according to the motion data of each second electronic device;
step S123, if the difference between the motion speed of the first electronic device and the motion speed of each second electronic device is within a threshold, matching the motion data of the first electronic device with the motion data of each second electronic device.
In specific implementation, step S121 obtains the moving speed of the first electronic device according to the recorded number of steps of the first electronic device within the predetermined time, or obtains the moving speed of the first electronic device through GPS positioning.
In step S122, the first electronic device sends a motion data acquisition request to each second electronic device; the second electronic device then sends its motion data to the first electronic device in real time, or sends it to a cloud server in real time, from which the first electronic device obtains it. The motion speed of each second electronic device is calculated from the acquired motion data.
In step S123, if the difference between the motion speed of the first electronic device and the motion speed of each second electronic device is within a threshold, the motion data of the first electronic device is matched with the motion data of each second electronic device. Wherein the threshold value can be set by the user.
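The threshold match of step S123 can be sketched as follows; the 0.5 m/s default is only a placeholder for the user-settable threshold:

```python
def matched_peers(my_speed, peer_speeds, threshold=0.5):
    """Return the ids of peers whose motion speed differs from ours by no
    more than the threshold (m/s); these enter the shared virtual scene."""
    return [pid for pid, speed in peer_speeds.items()
            if abs(speed - my_speed) <= threshold]
```

Peers outside the threshold are simply left out of the virtual scene rather than rejected with an error.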
Optionally, an embodiment of the present invention further provides a flowchart of step S200 in fig. 3, and as shown in fig. 6, step S200 specifically includes:
step S201, obtaining motion parameters in the motion data, and determining motion types according to the motion parameters;
step S202, generating an online virtual motion scene corresponding to the motion type according to the motion type or matching a prestored online virtual motion scene according to the motion type, wherein the virtual motion scene comprises the first electronic device and the avatar of each second electronic device.
In a specific implementation, the motion parameters in step S201 include a motion track, a motion speed, and the like. The motion track can be obtained through GPS positioning built into the electronic device. The motion speed can be calculated from the number of steps taken, or the distance traveled, within a specified time. For example, by obtaining the motion speed from the motion data, the motion type of the electronic device can be determined from differences in speed: different motion speed intervals correspond to different motion types.
Step S202 specifically generates an online virtual motion scene from the motion type according to a certain rule, matches a pre-stored online virtual motion scene according to the motion type, or lets the user select a pre-stored online virtual motion scene. The virtual motion scene includes the avatars of the first electronic device and of each second electronic device.
Optionally, an embodiment of the present invention further provides a flowchart of step S300 in fig. 3, and as shown in fig. 7, step S300 specifically includes:
step S301, obtaining the avatars in the virtual motion scene and automatically assigning them to the first electronic device and each second electronic device, or receiving a user instruction to assign different avatars to the first electronic device and each second electronic device;
step S302, when the first electronic device and each second electronic device move, the virtual motion scene presents corresponding motion data in real time on the avatar corresponding to the first electronic device and each second electronic device.
In a specific implementation, step S301 obtains the avatars in the virtual motion scene and automatically assigns them to the first electronic device and each second electronic device, or the user assigns them according to his or her preference. An avatar can be a cartoon character or animal image downloaded from the network or designed by the user.
In step S302, when the first electronic device and each second electronic device move, the corresponding avatars in the virtual motion scene move in the same proportion as the real motion data, and each avatar displays the motion data of its electronic device in real time. The types of motion data displayed can be set according to the user's selection, and include one or more of motion speed, motion duration, calorie consumption, step count, and the like.
On the other hand, referring to fig. 8, fig. 8 is a functional structure diagram of an interactive device for sports data according to another embodiment of the present invention. The motion data interaction device is used for electronic equipment, and as shown in fig. 8, the device comprises:
a data obtaining module 100, configured to obtain motion data of the first electronic device and motion data of each second electronic device that matches the motion data of the first electronic device;
a virtual scene generation module 200, configured to determine a motion type according to the motion data, and generate an online virtual motion scene corresponding to the motion type according to the motion type, where the virtual motion scene includes the avatar of the first electronic device and each second electronic device;
a data presenting module 300, configured to present the avatar and the motion data of the first electronic device and each of the second electronic devices in the virtual motion scene, respectively.
In a specific implementation, the data obtaining module 100 is configured to obtain the motion data of the first electronic device. Specifically, the user's motion data can be sensed automatically by a high-sensitivity acceleration sensor built into the first electronic device, or acquired through a smart wearable device bound to the electronic device, such as (but not limited to) a smart bracelet. Taking an acceleration sensor built into a mobile phone as an example, the module obtains the user's motion data from the acceleration sensor, specifically the user's step count; from this and the elapsed time it can calculate the motion speed and motion intensity, as well as the calories consumed and the corresponding fat mass, and, combined with GPS positioning, it can automatically generate the user's activity track.
Motion data that matches the first electronic device means a similar motion speed or the same motion type; the user can set the matching criterion as needed. The second electronic devices whose motion data matches provide a common reference for ranking in the virtual scene and give the user a sense of company during exercise.
The virtual scene generating module 200 is configured to determine the motion type according to the motion speed in the motion data of the first electronic device; motion types include fast running, jogging, walking, and the like. After the motion type is obtained, a corresponding online virtual motion scene is generated. The virtual motion scene also includes an avatar for the first electronic device and for each second electronic device. For example, if the motion data indicates that the user's motion type is walking, the online virtual motion scene may be a long straight path; if the user is running, the virtual motion scene may be a circular playground track. The avatar generally takes the form of a cartoon character, so that each electronic device is identified vividly.
The data presentation module 300 is configured so that, when the first electronic device and each matched second electronic device move, the corresponding avatars in the virtual motion scene move in the same proportion as the real motion data, and each avatar displays the motion data of its electronic device in real time. The types of motion data displayed can be set according to the user's selection, and include one or more of motion speed, motion duration, calorie consumption, step count, and the like.
Optionally, the apparatus further includes a data pushing module, configured to control the first electronic device to send different motion data to the corresponding second electronic device according to a preset priority when detecting that the motion of the first electronic device is finished.
Specifically, the data pushing module is configured to send different motion data of the first electronic device to the corresponding second electronic device according to the preset corresponding priority of different electronic devices when it is detected that the first electronic device does not move any more.
Priorities can be set according to the closeness of the relationship between the user and friends, colleagues, or strangers. For example, the user's friend list may define several priorities: second electronic devices corresponding to friends have high priority, those corresponding to colleagues have medium priority, and those corresponding to clients and strangers have low priority. Besides the step count, the user of the first device also generates other parameters during exercise, such as calorie consumption.
And when the second electronic equipment is detected to be the highest priority, the first electronic equipment sends all parameters recorded in the motion to the second electronic equipment, so that the interaction with the second electronic equipment is realized. And when the second electronic equipment is detected to be of the medium priority, the first electronic equipment sends the step number recorded in the movement to the second electronic equipment. When the second electronic equipment is detected to be in low priority, the first electronic equipment does not send all parameters generated in motion to the second electronic equipment, and therefore privacy of a user is protected.
Optionally, an embodiment of the present invention further provides a functional structure schematic diagram of the data obtaining module 100 in fig. 8, as shown in fig. 9, the data obtaining module 100 includes:
a first data acquisition unit 101, configured to acquire motion data of the first electronic device and each second electronic device that meets a preset condition;
a second data obtaining unit 102, configured to obtain motion data of each second electronic device that meets a preset condition and matches the motion data of the first electronic device.
In a specific implementation, the first data obtaining unit 101 is configured to obtain the motion data of the first electronic device and to identify the second electronic devices that meet the preset condition. The preset condition is that a second electronic device is within a preset distance of the first electronic device, or that it corresponds to an entry in the friend list maintained by the first electronic device.
During exercise, the user can interact with the second electronic devices corresponding to the friend list in the first electronic device; if their data cannot be obtained, the motion data of second electronic devices within the preset distance of the first electronic device can be acquired instead, which avoids the boredom of exercising alone and increases the interest of exercising.
The second data obtaining unit 102 is configured to obtain, according to the motion data in the second electronic device that meets the preset condition, a second electronic device that matches the motion data in the first electronic device, that is, a second electronic device with a similar motion speed or motion type.
Optionally, an embodiment of the present invention further provides a functional structure schematic diagram of the second data acquiring unit 102 in fig. 9, as shown in fig. 10, the second data acquiring unit 102 includes:
a first speed obtaining unit 121, configured to obtain a movement speed of the first electronic device according to movement data of the first electronic device;
a second speed obtaining unit 122, configured to obtain the motion speed of each second electronic device according to the motion data of each second electronic device;
a matching unit 123, configured to match the motion data of the first electronic device with the motion data of the respective second electronic devices if a difference between the motion speed of the first electronic device and the motion speed of the respective second electronic devices is within a threshold.
In a specific implementation, the first speed obtaining unit 121 is configured to obtain the motion speed of the first electronic device from the recorded number of steps within a predetermined time, or to derive it from the distance traveled within the predetermined time as obtained through GPS positioning.
The second speed obtaining unit 122 is configured so that the first electronic device sends a motion data acquisition request to each second electronic device; the second electronic device then sends its motion data to the first electronic device in real time, or sends it to a cloud server in real time, from which the first electronic device obtains it. The motion speed of each second electronic device is calculated from the acquired motion data.
The matching unit 123 is configured to match the motion data of the first electronic device with the motion data of each second electronic device if the difference between the motion speed of the first electronic device and that of the second electronic device is within a threshold. The threshold can be set by the user.
Optionally, an embodiment of the present invention further provides a functional structure schematic diagram of the virtual scene generation module 200 in fig. 8, as shown in fig. 11, the virtual scene generation module 200 includes:
a motion type obtaining unit 201, configured to obtain a motion parameter in the motion data, and determine a motion type according to the motion parameter;
a virtual motion scene generating unit 202, configured to generate an online virtual motion scene corresponding to the motion type according to the motion type or match a pre-stored online virtual motion scene according to the motion type, where the virtual motion scene includes the avatar of the first electronic device and each second electronic device.
In a specific implementation, the motion parameters include a motion track, a motion speed, and the like. The motion track can be recorded through the GPS positioning built into the electronic device, and the motion speed can be calculated from the number of steps taken, or the distance traveled, within a specified time. For example, the motion type acquiring unit 201 is configured to acquire the motion speed in the motion data and determine the motion type of the electronic device from it, with different motion speed intervals corresponding to different motion types.
The virtual motion scene generating unit 202 is specifically configured to generate an online virtual motion scene from the motion type according to a preset rule, to match a pre-stored online virtual motion scene to the motion type, or to use a pre-stored online virtual motion scene chosen by the user. The virtual motion scene includes the avatars of the first electronic device and each second electronic device.
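As an illustration (again outside the claimed embodiments), the interval-based classification of unit 201 and the scene matching of unit 202 can be sketched as follows. The interval boundaries and scene names are assumptions, not values from the patent:

```python
# Illustrative sketch: map motion speed intervals to motion types, then
# match a pre-stored online virtual motion scene to the motion type.

SPEED_INTERVALS = [   # (exclusive upper bound in m/s, motion type) -- assumed values
    (2.0, "walking"),
    (4.0, "running"),
    (12.0, "cycling"),
]

PRESTORED_SCENES = {  # assumed scene names
    "walking": "park_path_scene",
    "running": "track_scene",
    "cycling": "road_scene",
}

def motion_type_from_speed(speed: float) -> str:
    """Classify a motion speed into a motion type by speed interval."""
    for upper, kind in SPEED_INTERVALS:
        if speed < upper:
            return kind
    return "vehicle"  # faster than any defined interval

def match_scene(motion_type: str) -> str:
    """Match a pre-stored online virtual motion scene to the motion type."""
    return PRESTORED_SCENES.get(motion_type, "default_scene")

print(match_scene(motion_type_from_speed(2.5)))  # track_scene
```

In the same spirit, a rule-based generator could synthesize a scene from the motion type instead of looking one up, and a user selection could override the matched scene, as the unit description allows.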
Optionally, an embodiment of the present invention further provides a functional structure schematic diagram of the data presenting module 300 in fig. 8, as shown in fig. 12, the data presenting module 300 includes:
a virtual image assigning unit 301, configured to obtain virtual images in the virtual motion scene, automatically assign different virtual images to the first electronic device and each second electronic device, or receive a user instruction to assign different virtual images to the first electronic device and each second electronic device;
the motion data presenting unit 302 is configured to present, in real time, corresponding motion data in the virtual motion scene in the avatar corresponding to the first electronic device and each second electronic device when the first electronic device and each second electronic device move.
In a specific implementation, the avatar assigning unit 301 obtains the avatars in the virtual motion scene and assigns them to the first electronic device and each second electronic device, either automatically or according to the user's preference. An avatar can be a cartoon character or an animal image downloaded from the network or designed by the user.
The motion data presenting unit 302 is configured to, when the first electronic device and each second electronic device move, move the corresponding avatars in the virtual motion scene in the same proportion as the real motion data, and to display the motion data of the corresponding electronic device on each avatar in real time. The types of motion data displayed can be set according to the user's selection and include one or more of motion speed, motion duration, calorie consumption, number of steps, and the like.
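As a final illustration (not part of the claimed embodiments), moving an avatar "in the same proportion as the real motion data" and attaching the user-selected metrics to it can be sketched as follows. The scale factor and metric names are assumptions chosen for the example:

```python
# Illustrative sketch: advance each avatar proportionally to the real
# distance covered and refresh the metrics displayed on it.

from dataclasses import dataclass, field

SCENE_SCALE = 0.01  # scene units per real metre (assumed value)

@dataclass
class Avatar:
    device_id: str
    position: float = 0.0              # position along the scene track
    metrics: dict = field(default_factory=dict)

    def update(self, real_distance_m: float, motion_data: dict, selected: list):
        """Move the avatar in proportion to the real distance and keep only
        the motion data types the user selected for display."""
        self.position = real_distance_m * SCENE_SCALE
        self.metrics = {k: motion_data[k] for k in selected if k in motion_data}

a = Avatar("first_device")
a.update(500.0, {"speed": 2.5, "steps": 600, "calories": 42}, ["speed", "calories"])
print(a.position, a.metrics)  # 5.0 {'speed': 2.5, 'calories': 42}
```

One such object per electronic device, updated as new motion data arrives, is enough to drive the real-time presentation the unit describes.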
As shown in fig. 13, a schematic diagram of a hardware structure of an electronic device according to another embodiment of the present invention is provided, where the electronic device 10 includes:
one or more processors 401 and a memory 402. The processor 401 and the memory 402 may be connected by a bus or in other manners; fig. 13 takes one processor 401 connected to the memory 402 by a bus as an example.
The memory 402, which is a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as program instructions/units corresponding to the interaction method of the motion data in the embodiment of the present invention (for example, the data acquisition module 100, the virtual scene generation module 200, and the data presentation module 300 shown in fig. 8). The processor 401 executes various functional applications and data processing of the electronic device, that is, implements the interaction method of the motion data in the above-described method embodiments, by executing the nonvolatile software program, instructions, and units stored in the memory 402.
The memory 402 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the electronic device, and the like. Further, the memory 402 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, memory 402 may optionally include memory located remotely from processor 401, which may be connected to the electronic device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more units are stored in the memory 402 and, when executed by the one or more processors 401, perform the motion data interaction method in any of the above-described method embodiments, for example, performing the above-described method steps S100 to S300 in fig. 3 and implementing the functions of modules 100 to 300 in fig. 8.
The electronic device 10 of the embodiment of the present invention takes various forms, performs the above-described steps illustrated in figs. 3 to 7, and can also implement the functions of the modules and units described in figs. 8 to 12. The electronic device 10 includes, but is not limited to:
(1) Mobile communication devices: such devices are characterized by mobile communication functions and are mainly aimed at providing voice and data communication. Such terminals include smart phones (e.g., the iPhone), multimedia phones, feature phones, and low-end phones.
(2) Ultra-mobile personal computer devices: such devices belong to the category of personal computers, have computing and processing functions, and generally also have mobile internet access. Such terminals include PDA, MID, and UMPC devices (e.g., the iPad).
(3) Portable entertainment devices: such devices can display and play video content, and generally also have mobile internet access. This type of device includes video players, handheld game consoles, smart toys, and portable vehicle-mounted navigation devices.
(4) Other electronic devices with video playing and internet access functions.
The electronic device can execute the motion data interaction method provided by the embodiment of the present invention and has the corresponding functional modules and beneficial effects. For technical details not described in this electronic device embodiment, refer to the motion data interaction method provided by the embodiment of the present invention.
Embodiments of the present invention also provide a non-transitory computer-readable storage medium storing computer-executable instructions which, when executed by one or more processors, perform, for example, the above-described method steps S100 to S300 in fig. 3 and implement the functions of modules 100 to 300 in fig. 8.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the embodiment.
Through the above description of the embodiments, those skilled in the art will clearly understand that the embodiments may be implemented by software plus a general hardware platform, and may also be implemented by hardware. With this in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
The above description is only an embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes performed by the present specification and drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (14)

1. An interaction method of motion data is applied to a first electronic device, and comprises the following steps:
acquiring motion data of the first electronic device and motion data of each second electronic device matched with the motion data of the first electronic device, wherein,
motion data matching the motion data of the first electronic device refers to motion data whose motion speed belongs to the same motion type as the motion speed of the first electronic device;
determining a motion type according to the motion data, and generating an online virtual motion scene corresponding to the motion type according to the motion type, wherein the virtual motion scene comprises the avatars of the first electronic device and each second electronic device,
the determining a type of motion from the motion data includes: obtaining motion parameters in the motion data, and determining a motion type according to the motion parameters, wherein the motion parameters comprise a motion track and the motion speed;
and respectively presenting the simulation images and the motion data of the first electronic equipment and the second electronic equipment in the virtual motion scene.
2. The method for interacting motion data according to claim 1, wherein the obtaining motion data of the first electronic device and motion data of each second electronic device matched with the motion data of the first electronic device comprises:
acquiring motion data of the first electronic equipment and each second electronic equipment meeting preset conditions;
and acquiring the motion data of each second electronic device which is matched with the motion data of the first electronic device and meets the preset condition.
3. The method according to claim 1, wherein the determining a motion type according to the motion data and generating an online virtual motion scene corresponding to the motion type according to the motion type, the virtual motion scene including the avatar of the first electronic device and each of the second electronic devices, comprises:
obtaining motion parameters in the motion data, and determining a motion type according to the motion parameters;
and generating an online virtual motion scene corresponding to the motion type according to the motion type, or matching a pre-stored online virtual motion scene according to the motion type, wherein the virtual motion scene comprises the avatars of the first electronic device and each second electronic device.
4. The method of claim 1, wherein presenting the avatar and the motion data of the first electronic device and each second electronic device in the virtual motion scene respectively comprises:
acquiring the avatars in the virtual motion scene, and automatically assigning different avatars to the first electronic device and each second electronic device, or receiving a user instruction to assign different avatars to the first electronic device and each second electronic device;
presenting, in the virtual motion scene, corresponding motion data in real time on the avatar corresponding to the first electronic device and each second electronic device when the first electronic device and each second electronic device move.
5. The interaction method for motion data according to claim 2, wherein each of the second electronic devices that satisfy the preset condition comprises:
each second electronic device meeting the condition that the distance between the second electronic device and the first electronic device is within a preset distance;
and/or each second electronic device corresponding to the friend list maintained by the first electronic device.
6. The method for interacting motion data according to claim 2 or 5, wherein the obtaining motion data of the respective second electronic devices that match the motion data of the first electronic device and satisfy a preset condition comprises:
acquiring the motion speed of the first electronic equipment according to the motion data of the first electronic equipment;
respectively acquiring the motion speed of each second electronic device according to the motion data of each second electronic device;
and if the difference value between the motion speed of the first electronic equipment and the motion speed of each second electronic equipment is within a threshold value, matching the motion data of the first electronic equipment with the motion data of each second electronic equipment.
7. An interaction device of motion data, applied to an electronic device, comprising:
a data acquisition module for acquiring motion data of a first electronic device and motion data of each second electronic device matched with the motion data of the first electronic device, wherein,
motion data matching the motion data of the first electronic device refers to motion data whose motion speed belongs to the same motion type as the motion speed of the first electronic device;
a virtual scene generation module, configured to determine a motion type according to the motion data and generate an online virtual motion scene corresponding to the motion type according to the motion type, wherein the virtual motion scene comprises the avatars of the first electronic device and each second electronic device,
the determining a type of motion from the motion data includes: obtaining motion parameters in the motion data, and determining a motion type according to the motion parameters, wherein the motion parameters comprise a motion track and the motion speed;
and the data presentation module is used for respectively presenting the simulation image and the motion data of the first electronic equipment and each second electronic equipment in the virtual motion scene.
8. The interactive device for sports data of claim 7, wherein the data acquisition module comprises,
the first data acquisition unit is used for acquiring the motion data of the first electronic equipment and each second electronic equipment meeting preset conditions;
and the second data acquisition unit is used for acquiring the motion data of each second electronic device which is matched with the motion data of the first electronic device and meets the preset condition.
9. The motion data interaction device of claim 7, wherein the virtual scene generation module comprises:
the motion type acquisition unit is used for acquiring motion parameters in the motion data and determining a motion type according to the motion parameters;
and the virtual motion scene generating unit is used for generating an online virtual motion scene corresponding to the motion type according to the motion type, or matching a pre-stored online virtual motion scene according to the motion type, wherein the virtual motion scene comprises the avatars of the first electronic device and each second electronic device.
10. The interactive device for athletic data of claim 7, wherein the data presentation module comprises,
the avatar assigning unit is used for acquiring the avatars in the virtual motion scene, and automatically assigning different avatars to the first electronic device and each second electronic device or receiving a user instruction to assign different avatars to the first electronic device and each second electronic device;
and the motion data presentation unit is used for presenting, in the virtual motion scene, corresponding motion data in real time on the avatar corresponding to the first electronic device and each second electronic device when the first electronic device and each second electronic device move.
11. The apparatus for interacting motion data according to claim 8, wherein each of the second electronic devices that satisfies the predetermined condition includes:
each second electronic device meeting the condition that the distance between the second electronic device and the first electronic device is within a preset distance;
and/or each second electronic device corresponding to the friend list maintained by the first electronic device.
12. The interactive device for sports data according to claim 8 or 11, wherein the second data obtaining unit includes,
the first speed acquisition unit is used for acquiring the movement speed of the first electronic equipment according to the movement data of the first electronic equipment;
a second speed obtaining unit, configured to obtain, according to the motion data of each second electronic device, a motion speed of each second electronic device;
and the matching unit is used for matching the motion data of the first electronic equipment with the motion data of each second electronic equipment if the difference value between the motion speed of the first electronic equipment and the motion speed of each second electronic equipment is within a threshold value.
13. An electronic device, comprising:
at least one processor; and the number of the first and second groups,
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of interacting with motion data of any of claims 1-6.
14. A non-transitory computer-readable storage medium storing computer-executable instructions that, when executed by one or more processors, cause the one or more processors to perform the method of interacting with athletic data of any of claims 1-6.
CN201710528462.4A 2017-07-01 2017-07-01 Motion data interaction method and device and electronic equipment Active CN109200567B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710528462.4A CN109200567B (en) 2017-07-01 2017-07-01 Motion data interaction method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710528462.4A CN109200567B (en) 2017-07-01 2017-07-01 Motion data interaction method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN109200567A CN109200567A (en) 2019-01-15
CN109200567B true CN109200567B (en) 2020-08-14

Family

ID=64992455

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710528462.4A Active CN109200567B (en) 2017-07-01 2017-07-01 Motion data interaction method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN109200567B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109893846A (en) * 2019-02-27 2019-06-18 深圳市科迈爱康科技有限公司 Movement exchange method, device and movement interactive terminal based on exercise data stream
CN109700467B (en) * 2019-02-27 2022-05-03 深圳市科迈爱康科技有限公司 Multi-person motion level comparison method and device and motion level comparison terminal
CN112153091B (en) * 2019-06-27 2022-05-13 北京百度网讯科技有限公司 Method and device for determining relevance of equipment
CN112839254A (en) * 2019-11-04 2021-05-25 海信视像科技股份有限公司 Display apparatus and content display method
CN110680294A (en) * 2019-11-05 2020-01-14 深圳市傲威奇科技有限公司 Data processing method and device, sound box and storage medium
CN112642133B (en) * 2020-11-24 2022-05-17 杭州易脑复苏科技有限公司 Rehabilitation training system based on virtual reality
CN112755495A (en) * 2020-12-09 2021-05-07 歌尔科技有限公司 Interaction method, equipment and system of motion data and computer readable storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008046443A1 (en) * 2006-10-20 2008-04-24 Senzathlon Gmbh System and method for virtual sports competitions and sports centric internet communities
CN104007822A (en) * 2014-05-30 2014-08-27 中山市永衡互联科技有限公司 Large database based motion recognition method and device
CN104856684A (en) * 2015-04-10 2015-08-26 深圳市虚拟现实科技有限公司 Moving object acquisition method and system
CN105797311A (en) * 2016-05-19 2016-07-27 湖北海山科技有限公司 Running machine based on internet and internet competition system
CN105903166A (en) * 2016-04-18 2016-08-31 北京小鸟看看科技有限公司 3D online sport competition method and system
CN106310643A (en) * 2016-09-27 2017-01-11 深圳市智游人科技有限公司 True environment-based live-action 3D motion system
CN106547829A (en) * 2016-10-10 2017-03-29 福建省够兄弟科技有限公司 A kind of processing method and system of sports data
CN106669115A (en) * 2017-01-14 2017-05-17 中州大学 Interactive physical training system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7736272B2 (en) * 2001-08-21 2010-06-15 Pantometrics, Ltd. Exercise system with graphical feedback and method of gauging fitness progress


Also Published As

Publication number Publication date
CN109200567A (en) 2019-01-15

Similar Documents

Publication Publication Date Title
CN109200567B (en) Motion data interaction method and device and electronic equipment
CN111408136B (en) Game interaction control method, device and storage medium
US10659844B2 (en) Interaction method and system based on recommended content
US20190221045A1 (en) Interaction method between user terminals, terminal, server, system, and storage medium
CN108984087B (en) Social interaction method and device based on three-dimensional virtual image
US20220391060A1 (en) Methods for displaying and providing multimedia resources
CN110198484B (en) Message pushing method, device and equipment
CN112016941B (en) Virtual article pickup method, device, terminal and storage medium
US10854009B2 (en) Method, apparatus, and system for obtaining virtual object, and storage medium
US10136289B2 (en) Cross device information exchange using gestures and locations
EP4120170A1 (en) Method and apparatus for displaying resources
CN106371689B (en) Picture joining method, apparatus and system
EP2546726A2 (en) Mobile terminal for outputing the execution screen of an application through another display device
CN110913066A (en) Display method and electronic equipment
CN105431813A (en) Attributing user action based on biometric identity
CN111327914B (en) Interaction method and related device
CN108958629B (en) Split screen quitting method and device, storage medium and electronic equipment
WO2022183707A1 (en) Interaction method and apparatus thereof
CN112836136A (en) Chat interface display method, device and equipment
CN104700040B (en) Authority control method and device
CN113018857B (en) Game operation data processing method, device, equipment and storage medium
CN112915547A (en) Virtual object acquisition method, device, terminal, server and readable storage medium
CN110890969B (en) Method and device for mass-sending message, electronic equipment and storage medium
CN111045562B (en) Interface display method, device, equipment and readable storage medium
CN107395876A (en) Self-adaptive adjusting method and device for screen display content and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant