CN112633473A - Wearable device based on AI and application data processing method thereof - Google Patents

Wearable device based on AI and application data processing method thereof Download PDF

Info

Publication number
CN112633473A
CN112633473A
Authority
CN
China
Prior art keywords
data
application
wearable device
neural network
network model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011511656.1A
Other languages
Chinese (zh)
Inventor
杨鹤
张慧敏
肖正飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Spreadtrum Communications Shanghai Co Ltd
Original Assignee
Spreadtrum Communications Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Spreadtrum Communications Shanghai Co Ltd filed Critical Spreadtrum Communications Shanghai Co Ltd
Priority to CN202011511656.1A priority Critical patent/CN112633473A/en
Publication of CN112633473A publication Critical patent/CN112633473A/en
Priority to PCT/CN2021/131289 priority patent/WO2022127497A1/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/445Program loading or initiating
    • G06F9/44505Configuring for program initiating, e.g. using registry, configuration files
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Abstract

The invention discloses an AI-based wearable device and an application data processing method thereof, wherein an NPU is integrated in the wearable device. The application data processing method comprises the following steps: in response to receiving a trigger instruction of an application program of the wearable device, acquiring application data associated with the application program; calling the NPU to input the application data into a trained preset neural network model and outputting application optimization data by running the preset neural network model; and running the application program according to the application optimization data to output a running result of the application program. By integrating the NPU in the wearable device, the invention meets the optimization requirements of the various application programs running on the wearable device and effectively provides diversified AI functions, thereby improving the user experience.

Description

Wearable device based on AI and application data processing method thereof
Technical Field
The invention relates to the field of intelligent terminals, in particular to wearable equipment based on AI (artificial intelligence) and an application data processing method thereof.
Background
A wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not merely a piece of hardware: it realizes powerful functions through software support, data interaction, and cloud interaction, and wearable devices will greatly change how we live and perceive the world.
At present, wearable devices such as smartwatches are generally equipped with mobile operating systems and can run various application programs to meet different user requirements. However, as intelligent-terminal applications continue to develop, existing smartwatches based on main control chips such as MCUs (micro control units) cannot adequately meet the optimization requirements of the various application programs and cannot provide diversified functions, which degrades the user experience.
Disclosure of Invention
The invention aims to overcome the defect in the prior art that wearable devices such as smartwatches cannot adequately meet the optimization requirements of various application programs and cannot provide diversified functions, thereby affecting the user experience, and to provide an AI-based wearable device and an application data processing method thereof.
The invention solves the technical problems through the following technical scheme:
An application data processing method of an AI-based wearable device, wherein an NPU (neural network processing unit, a general term for a new class of processors built around and accelerating neural network algorithms; broadly, this covers dedicated DSPs (digital signal processors), general-purpose NPUs, and the like) is integrated in the wearable device;
the application data processing method comprises the following steps:
in response to receiving a triggering instruction of an application program of the wearable device, acquiring application data associated with the application program;
calling the NPU to input the application data into a trained preset neural network model and outputting application optimization data by operating the preset neural network model;
and operating the application program according to the application optimization data to output an operation result of the application program.
Optionally, the step of acquiring application data associated with the application program in response to receiving a trigger instruction of the application program of the wearable device includes:
in response to receiving a trigger instruction of an audio playing program of the wearable device, acquiring audio-related data;
the step of invoking the NPU to input the application data to the trained preset neural network model and outputting application optimization data by operating the preset neural network model includes:
calling the NPU to input the audio-related data into a trained recurrent neural network model (RNN) and outputting tuned PCM (pulse code modulation) data by running the recurrent neural network model;
the step of running the application program according to the application optimization data to output a running result of the application program includes:
and operating the audio playing program according to the tuned PCM data to output audio.
Optionally, the audio-related data includes at least one of raw audio data, codec (the audio digital-to-analog/analog-to-digital conversion module) parameters, and audio device parameters.
Optionally, the step of acquiring application data associated with the application program in response to receiving a trigger instruction of the application program of the wearable device includes:
in response to receiving a trigger instruction of a positioning program of the wearable device, acquiring positioning related data;
the step of invoking the NPU to input the application data to the trained preset neural network model and outputting application optimization data by operating the preset neural network model includes:
calling the NPU to input the positioning related data into a trained Deep Neural Network model (DNN) and outputting AI-fused positioning data by operating the Deep Neural Network model;
the step of running the application program according to the application optimization data to output a running result of the application program includes:
and operating the positioning program according to the positioning data after AI fusion to output positioning information.
Optionally, the positioning related data includes at least one of modem (also commonly called the cellular or baseband chip, i.e., the mobile communication module on the terminal side) positioning data, GPS (global positioning system) positioning data, GNSS (global navigation satellite system) positioning data, Bluetooth positioning data, and Wi-Fi (wireless internet access) positioning data.
Optionally, the step of acquiring application data associated with the application program in response to receiving a trigger instruction of the application program of the wearable device includes:
in response to receiving a trigger instruction of a call application program of the wearable device, acquiring voice data and calibration information in various real environments;
the step of invoking the NPU to input the application data to the trained preset neural network model and outputting application optimization data by operating the preset neural network model includes:
calling the NPU to input the voice data and the calibration information into a trained recurrent neural network model and outputting emotion classification possibility of a preset type by operating the recurrent neural network model;
the step of running the application program according to the application optimization data to output a running result of the application program includes:
and operating the call application program according to the emotion classification possibility of the preset category to output an emotion classification result.
Optionally, the step of acquiring application data associated with the application program in response to receiving a trigger instruction of the application program of the wearable device includes:
acquiring environment-related data under various real environments in response to receiving a trigger instruction of an environment detection program of the wearable device;
the step of invoking the NPU to input the application data to the trained preset neural network model and outputting application optimization data by operating the preset neural network model includes:
calling the NPU to input the environment-related data into a trained deep neural network model and output a preset type of external environment classification possibility by operating the deep neural network model;
the step of running the application program according to the application optimization data to output a running result of the application program includes:
and operating the environment detection program according to the external environment classification possibility of the preset type to output an external environment classification result.
Optionally, the environment-related data comprises at least one of ambient sound data, sensor data, and location data.
Optionally, the NPU is a lightweight NPU (NPU lite); and/or,
the wearable device comprises a smart watch or a smart bracelet.
A wearable device comprises a main control chip, wherein the main control chip comprises a data acquisition module, a model calling module and an application processing module, and an NPU is further integrated in the wearable device;
the data acquisition module is configured to acquire application data associated with an application of the wearable device in response to receiving a triggering instruction of the application;
the model calling module is configured to call the NPU to input the application data to a trained preset neural network model and output application optimization data by running the preset neural network model;
the application processing module is configured to run the application program according to the application optimization data to output a running result of the application program.
Optionally, the data acquisition module is configured to acquire audio-related data in response to receiving a trigger instruction of an audio playing program of the wearable device;
the model calling module is configured to call the NPU to input the audio-related data to a trained recurrent neural network model and output tuned PCM data by operating the recurrent neural network model;
the application processing module is configured to run the audio playing program according to the tuned PCM data to output audio.
Optionally, the audio related data includes at least one of raw audio data, codec parameters, and audio device parameters.
Optionally, the data acquisition module is configured to acquire positioning related data in response to receiving a triggering instruction of a positioning program of the wearable device;
the model calling module is configured to call the NPU to input the positioning related data to a trained deep neural network model and output AI-fused positioning data after operating the deep neural network model;
the application processing module is configured to run the positioning program according to the AI fused positioning data to output positioning information.
Optionally, the positioning-related data comprises at least one of modem positioning data, GPS positioning data, GNSS positioning data, bluetooth positioning data, and Wi-Fi positioning data.
Optionally, the data acquisition module is configured to acquire voice data and calibration information in various real environments in response to receiving a trigger instruction of a call application program of the wearable device;
the model calling module is configured to call the NPU to input the voice data and the calibration information into a trained recurrent neural network model and output a preset category of emotion classification possibilities by operating the recurrent neural network model;
the application processing module is configured to run the call application program according to the emotion classification possibility of the preset category to output an emotion classification result.
Optionally, the data acquisition module is configured to acquire environment-related data in various real environments in response to receiving a trigger instruction of an environment detection program of the wearable device;
the model calling module is configured to call the NPU to input the environment-related data to a trained deep neural network model and output a preset kind of external environment classification possibility by operating the deep neural network model;
the application processing module is configured to run the environment detection program according to the external environment classification possibility of the preset category to output an external environment classification result.
Optionally, the environment-related data comprises at least one of ambient sound data, sensor data, and location data.
Optionally, the NPU is a lightweight NPU; and/or,
the wearable device comprises a smart watch or a smart bracelet.
An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the application data processing method of the AI-based wearable device as described above when executing the computer program.
A computer readable medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the application data processing method of the AI-based wearable device as described above.
On the basis of common knowledge in the art, the above preferred conditions can be combined arbitrarily to obtain preferred embodiments of the invention.
The positive effects of the invention are as follows:
According to the AI-based wearable device and its application data processing method provided by the invention, integrating an NPU in the wearable device meets the optimization requirements of the various application programs running on the wearable device and effectively provides diversified AI functions, thereby improving the user experience.
Drawings
The features and advantages of the present disclosure will be better understood upon reading the detailed description of embodiments of the disclosure in conjunction with the following drawings. In the drawings, components are not necessarily drawn to scale, and components having similar relative characteristics or features may have the same or similar reference numerals.
Fig. 1 is a flowchart illustrating an application data processing method of an AI-based wearable device according to an embodiment of the present invention.
Fig. 2a is a schematic diagram of application data processing of a wearable device in an audio playing application scenario according to an embodiment of the present invention.
Fig. 2b is a schematic diagram of application data processing of a wearable device in a positioning application scenario according to an embodiment of the present invention.
Fig. 2c is a schematic diagram of application data processing of the wearable device in an emotion classification application scenario according to an embodiment of the present invention.
Fig. 2d is a schematic diagram of application data processing of the wearable device in an external environment classification application scenario according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of a partial control structure of a wearable device according to another embodiment of the present invention.
Fig. 4 is a schematic structural diagram of an electronic device implementing an application data processing method of an AI-based wearable device according to another embodiment of the present invention.
Detailed Description
The invention is further illustrated by the following examples, which are not intended to limit the scope of the invention.
In order to overcome the above defects, the present embodiment provides an application data processing method for an AI-based wearable device, where an NPU is integrated in the wearable device. The application data processing method comprises the following steps: in response to receiving a trigger instruction of an application program of the wearable device, acquiring application data associated with the application program; calling the NPU to input the application data into a trained preset neural network model and outputting application optimization data by running the preset neural network model; and running the application program according to the application optimization data to output a running result of the application program.
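The three steps of the method can be outlined in code as follows. This is an illustrative sketch only: the function names (`acquire_data`, `npu_infer`, `run_app`, `on_trigger`) and the sample data are hypothetical, not taken from the patent, and the model is a stand-in for a trained network running on the NPU.

```python
# Hypothetical sketch of the three-step flow: acquire application data on a
# trigger, run it through a model on the NPU, then run the app with the result.

def acquire_data(app_name):
    # Step 1: gather the application data associated with the triggered app
    # (illustrative samples; real data would come from device subsystems).
    sources = {
        "audio_player": {"raw_audio": [0.1, 0.2], "codec_params": {"gain": 1.0}},
        "positioning": {"gps": (31.2, 121.5), "wifi": (31.21, 121.49)},
    }
    return sources[app_name]

def npu_infer(model, data):
    # Step 2: hand the data to the trained preset model running on the NPU.
    return model(data)

def run_app(app_name, optimized):
    # Step 3: run the application with the model's optimized output.
    return f"{app_name} output using {optimized}"

def on_trigger(app_name, model):
    data = acquire_data(app_name)
    optimized = npu_infer(model, data)
    return run_app(app_name, optimized)
```

In this sketch the dispatcher is the same for every scenario; only the data source and the model differ, which mirrors how the four application scenarios below reuse the same three steps.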
In this embodiment, preferably, the wearable device is a smart watch or a smart bracelet, but the present embodiment does not specifically limit the type of the wearable device, and the wearable device may be selected accordingly according to actual needs.
Smartwatches include adult smartwatches, children's smartwatches, elderly smartwatches, and the like, and various sensors can be added to support all kinds of information interconnection and intelligent experiences.
In this embodiment, preferably, the NPU adopts a lightweight NPU (e.g., ARM Cortex-M55, ARM Ethos-U65, etc.) optimized for the internet of things and the like to achieve the effect of low power consumption and guarantee a certain AI operation capability, so that the NPU can be effectively applied to wearable devices such as smartwatches. However, the present embodiment does not specifically limit the type of the NPU, and may be selected accordingly according to actual requirements.
In this embodiment, by integrating the NPU in the wearable device, the optimization requirements of the various application programs running on the wearable device are met and diversified AI functions are effectively provided, thereby improving the user experience.
Specifically, as an embodiment, as shown in fig. 1, the application data processing method of the AI-based wearable device provided by this embodiment mainly includes the following steps:
step 101, in response to receiving a trigger instruction of an application program, acquiring application data associated with the application program.
In this embodiment, four different AI optimized application scenarios of the smart watch are mainly provided, the first application scenario is an audio playing application scenario, the second application scenario is a positioning application scenario, the third application scenario is an emotion classification application scenario, and the fourth application scenario is an external environment classification application scenario.
In this step in the first application scenario, audio-related data is acquired in response to receiving a trigger instruction of an audio playback program of the smart watch.
Referring to fig. 2a, in the present embodiment, the audio-related data includes at least one of pre-collected raw audio data with good hearing quality, codec parameters, and audio device parameters, but is not limited thereto.
In this step in the second application scenario, positioning related data is acquired in response to receiving a trigger instruction of a positioning program of the smart watch.
Referring to fig. 2b, in the present embodiment, the positioning-related data includes at least one of modem positioning data, GPS positioning data, GNSS positioning data, bluetooth positioning data, and Wi-Fi positioning data, but is not limited thereto.
Referring to fig. 2c, in the present step in the third application scenario, in response to receiving a trigger instruction of a call application program of the smart watch, voice data and calibration information under various real environments are obtained, and specifically, user voice data and calibration information during a call process may be obtained.
In this step in the fourth application scenario, environment-related data in various real environments is acquired in response to receiving a trigger instruction of an environment detection program of the smart watch.
Referring to fig. 2d, in the present embodiment, the environment-related data includes at least one of environment sound data, sensor data and position data, but is not limited thereto.
Step 102, calling the NPU to input the application data to the trained preset neural network model and output the application optimization data by operating the preset neural network model.
In this step in a first application scenario, the NPU is invoked to input audio-related data into the trained recurrent neural network model and to output tuned PCM data by running the recurrent neural network model.
In this application scenario, during model training, the tuned audio data is verified in the smartwatch scenario and fed back to the model as calibration data to train the parameter model.
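As a toy illustration of the recurrent tuning stage, the sketch below uses a simple exponential smoother that carries hidden state across PCM samples, standing in for the trained recurrent neural network; `tune_pcm` and its `alpha` parameter are invented for illustration and are not part of the patent.

```python
def tune_pcm(samples, alpha=0.3):
    # Stand-in for the recurrent tuning model: an exponential smoother whose
    # hidden state persists across samples, the way an RNN cell's state does.
    state = 0.0
    out = []
    for s in samples:
        state = alpha * s + (1 - alpha) * state
        out.append(round(state, 4))
    return out
```

A real implementation would replace this smoother with the trained RNN's weights executing on the NPU, but the sequential, stateful structure of the computation is the same.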
In this step in a second application scenario, the NPU is invoked to input positioning related data to the trained deep neural network model and to output AI-fused positioning data via the operating deep neural network model.
In the application scenario, in the process of training the model, the network model is trained through the pre-collected data of the input type and the corresponding standard calibration data.
In this step in a third application scenario, the NPU is invoked to input the speech data and the calibration information to the trained recurrent neural network model and to output a predetermined category of emotion classification possibilities by operating the recurrent neural network model.
In this application scenario, in the training of the model, the classification model is trained using the calibration information and the speech data.
In this step in a fourth application scenario, the NPU is invoked to input the environment-related data to the trained deep neural network model and output the external environment classification possibility of the preset category through running the deep neural network model.
In this application scenario, in the process of training the model, the classification model is trained using the calibration information and the above input data.
And 103, operating the application program according to the application optimization data to output an operation result of the application program.
In this step in the first application scenario, an audio playback program is run according to the tuned PCM data to output audio.
Specifically, in use the audio playing program may be a simplified version of the above functions: with fewer parameter inputs, it mainly delivers maximum-power audio output while preserving the tuning effect, and its usage scenario is ordinary audio playback.
In this step in the second application scenario, a positioning program is run according to the AI-fused positioning data to output positioning information.
Specifically, the various positioning information acquired by the smartwatch is input to a model running on the NPU to obtain AI-fused positioning information, which is used as the final positioning information.
The available positioning means are intelligently fused to output more accurate positioning information. Since each positioning means has its own limitations in terms of signal, error, hotspot coverage, service availability, and so on, the fused positioning information can provide a more comprehensive and accurate positioning experience.
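The fusion idea can be illustrated with a confidence-weighted average over position fixes. In the patent the weighting would be learned by the trained deep neural network; here the weights are fixed illustrative confidences, and `fuse_positions` is a hypothetical helper name.

```python
def fuse_positions(fixes):
    # fixes: mapping of source name -> ((lat, lon), weight). A trained DNN
    # would produce the per-source weights; these are illustrative constants.
    total = sum(w for _, w in fixes.values())
    lat = sum(p[0] * w for p, w in fixes.values()) / total
    lon = sum(p[1] * w for p, w in fixes.values()) / total
    return round(lat, 6), round(lon, 6)

# Example: GPS trusted twice as much as Wi-Fi positioning.
example = {
    "gps": ((31.20, 121.50), 2.0),
    "wifi": ((31.26, 121.44), 1.0),
}
```

A learned fusion can also model per-source error characteristics (e.g., Wi-Fi drift indoors, GPS dropout in tunnels), which is what makes it stronger than a fixed weighted average like this one.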
In this step in the third application scenario, the call application is run according to the emotion classification possibility of the preset category to output an emotion classification result.
Specifically, during a call, the uplink call data is tapped off (bypassed) and fed to the NPU for model recognition, and the recognition result is then sent to the receiving end through a private protocol (the transport, such as network or SMS, is selectable).
A typical usage scenario is a children's or elderly smartwatch: emotion recognition reveals discrepancies between what the child says and their actual emotional state, helping family members manage more attentively.
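A minimal sketch of turning the model's raw per-class outputs into the "emotion classification possibilities of a preset category" is a softmax over logits. The class names and the `classify` helper are assumptions for illustration; the patent does not specify the category set or the output layer.

```python
import math

def classify(scores):
    # scores: raw per-class outputs (logits) from the recognition model.
    # Softmax converts them into class possibilities that sum to 1.
    exps = {k: math.exp(v) for k, v in scores.items()}
    total = sum(exps.values())
    probs = {k: v / total for k, v in exps.items()}
    return max(probs, key=probs.get), probs
```

The same post-processing applies unchanged to the fourth scenario below, where the preset categories are external environments rather than emotions.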
In this step in the fourth application scenario, the environment detection program is executed according to the external environment classification possibility of the preset kind to output the external environment classification result.
Specifically, when needed, the smartwatch's recording, sensor, and positioning functions are triggered remotely, the data is passed to the NPU for model recognition, and the recognition result is then sent to the receiving end through a private protocol (the transport, such as network or SMS, is selectable).
A typical usage scenario is a children's or elderly smartwatch: external environment classification reveals discrepancies between the environment the other party describes and the actual environment, helping family members manage more attentively.
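The step of sending the recognition result to the receiving end through a private protocol could be sketched as packing the classification into a small message. The JSON layout, field names, and the `build_report` helper are all hypothetical, since the patent leaves the protocol format unspecified.

```python
import json

def build_report(label, probability, channel="network"):
    # Hypothetical message for delivering the recognition result to the
    # guardian's receiving end; channel may be "network" or "sms", per the
    # text's note that the transport is selectable.
    return json.dumps({
        "type": "environment_classification",
        "label": label,
        "probability": probability,
        "channel": channel,
    })
```

On the receiving side the same schema would be parsed back with `json.loads` before being shown to the family member.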
The application data processing method of the AI-based wearable device provided by this embodiment effectively meets the AI optimization requirements of relatively lightweight models, such as power consumption optimization, audio effect optimization, positioning optimization, emotion recognition, and external environment recognition, and by integrating a lightweight NPU realizes AI optimization on a highly integrated smartwatch, thereby improving the user experience.
As another embodiment, there is also provided a wearable device using the application data processing method of the AI-based wearable device as described above.
Specifically, as shown in fig. 3, the wearable device mainly includes a main control chip 21 and an NPU22, and the main control chip 21 mainly includes a data obtaining module 211, a model calling module 212, and an application processing module 213.
In this embodiment, preferably, the wearable device is a smart watch or a smart bracelet, but the present embodiment does not specifically limit the type of the wearable device, and the wearable device may be selected accordingly according to actual needs.
Smartwatches include adult smartwatches, children's smartwatches, elderly smartwatches, and the like, and various sensors can be added to support all kinds of information interconnection and intelligent experiences.
In this embodiment, preferably, the NPU22 adopts a lightweight NPU (e.g., ARM Cortex-M55, ARM Ethos-U65, etc.) optimized for the internet of things and the like to achieve the effect of low power consumption and ensure a certain AI operation capability, so that the NPU22 can be effectively applied to wearable devices such as smartwatches. However, the present embodiment is not limited to the type of the NPU22, and may be selected according to actual requirements.
In this embodiment, four different AI optimized application scenarios of the smart watch are mainly provided, the first application scenario is an audio playing application scenario, the second application scenario is a positioning application scenario, the third application scenario is an emotion classification application scenario, and the fourth application scenario is an external environment classification application scenario.
In a first application scenario, the data obtaining module 211 is configured to obtain the audio related data in response to receiving a trigger instruction of an audio playing program of the smart watch.
The model invocation module 212 is configured to invoke the NPU22 to input audio-related data to the trained recurrent neural network model and to output tuned PCM data by running the recurrent neural network model.
The application processing module 213 is configured to run an audio playback program according to the tuned PCM data to output audio.
In the present embodiment, the audio-related data includes at least one of pre-collected original audio data with good hearing quality, codec parameters, and audio device parameters, but is not limited thereto.
In a second application scenario, the data obtaining module 211 is configured to obtain the positioning related data in response to receiving a trigger instruction of a positioning program of the smart watch.
The model invocation module 212 is configured to invoke the NPU22 to input positioning-related data to the trained deep neural network model and to output AI-fused positioning data via the run deep neural network model.
The application processing module 213 is configured to run a positioning program to output positioning information according to the AI-fused positioning data.
In this embodiment, the positioning-related data includes at least one of modem positioning data, GPS positioning data, GNSS positioning data, bluetooth positioning data, and Wi-Fi positioning data, but is not limited thereto.
In a third application scenario, the data obtaining module 211 is configured to obtain voice data and calibration information in various real environments in response to receiving a trigger instruction of a call application program of the smart watch.
The model invocation module 212 is configured to invoke the NPU22 to input the speech data and the calibration information to the trained recurrent neural network model and to output a preset category of emotion classification possibilities via running the recurrent neural network model.
The application processing module 213 is configured to run the call application according to the emotion classification possibilities of the preset categories to output an emotion classification result.
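The "emotion classification possibilities of preset categories" can be read as a probability distribution over a fixed label set. A minimal sketch, assuming the NPU model yields raw scores (logits) and using an invented label set:

```python
import math
from typing import List, Tuple

# Illustrative preset categories; the patent does not enumerate them.
EMOTION_CATEGORIES = ["calm", "happy", "angry"]


def emotion_result(logits: List[float]) -> Tuple[str, List[float]]:
    """Softmax over the preset categories, then pick the most likely label."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]   # shift by max for numeric stability
    total = sum(exps)
    probs = [e / total for e in exps]
    return EMOTION_CATEGORIES[probs.index(max(probs))], probs
```

The probabilities always sum to 1, and the returned label is the category with the highest score.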
In a fourth application scenario, the data obtaining module 211 is configured to obtain environment-related data in various real environments in response to receiving a trigger instruction of an environment detection program of the smart watch.
The model invocation module 212 is configured to invoke the NPU 22 to input the environment-related data to the trained deep neural network model and to output external environment classification possibilities of preset categories by running the deep neural network model.
The application processing module 213 is configured to run the environment detection program according to the external environment classification possibilities of the preset categories to output an external environment classification result.
In the present embodiment, the environment-related data includes at least one of environmental sound data, sensor data, and location data, but is not limited thereto.
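As a stand-in for the deep model's final layer, the sketch below scores each preset environment class with a dot product over a `[sound_level, vibration, speed]` feature vector; the class names and weight values are invented for illustration and are not from the patent.

```python
from typing import Dict, List, Tuple

# Toy linear "model": one weight row per preset environment category,
# over the feature vector [sound_level, vibration, speed].
WEIGHTS: Dict[str, List[float]] = {
    "indoor":  [0.1, -0.5, 0.2],
    "street":  [0.6,  0.4, -0.1],
    "transit": [0.3,  0.8, 0.5],
}


def classify_environment(features: List[float]) -> Tuple[str, Dict[str, float]]:
    """Score each class by dot product, return the best label and all scores."""
    scores = {env: sum(w * f for w, f in zip(ws, features))
              for env, ws in WEIGHTS.items()}
    return max(scores, key=scores.get), scores
```

A loud, low-vibration, stationary reading such as `[0.9, 0.1, 0.0]` scores highest for "street" under these toy weights.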
The wearable device provided by this embodiment effectively meets the AI optimization needs of relatively lightweight models, such as power consumption optimization, audio effect optimization, positioning effect optimization, emotion recognition, and external environment recognition. By integrating a lightweight NPU, it realizes AI optimization on a highly integrated smart watch, thereby improving the user experience.
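The four scenarios above share one pipeline: acquire application data on a trigger, run it through the NPU-hosted model, then feed the optimized data to the application. A minimal sketch of that dispatch pattern, with every callable a trivial illustrative stand-in:

```python
from typing import Any, Callable


def process_application_data(acquire: Callable[[], Any],
                             npu_model: Callable[[Any], Any],
                             run_app: Callable[[Any], Any]) -> Any:
    """Generic trigger -> NPU inference -> application-output flow."""
    app_data = acquire()              # data obtaining module
    optimized = npu_model(app_data)   # model invocation module (NPU)
    return run_app(optimized)         # application processing module


# Usage with trivial stand-ins:
result = process_application_data(
    acquire=lambda: [1, 2, 3],
    npu_model=lambda xs: [2 * x for x in xs],
    run_app=lambda ys: sum(ys),
)
# result == 12
```

Each concrete scenario then just supplies its own acquisition, model, and application callables.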
Fig. 4 is a schematic structural diagram of an electronic device according to another embodiment of the present invention. The electronic device comprises a memory, a processor, and a computer program stored on the memory and executable on the processor; when executing the program, the processor implements the application data processing method of the AI-based wearable device as in the above embodiments. The electronic device 30 shown in Fig. 4 is only an example and imposes no limitation on the functionality or scope of use of the embodiments of the present invention.
As shown in Fig. 4, the electronic device 30 may take the form of a general-purpose computing device, for example a server device. Its components may include, but are not limited to: at least one processor 31, at least one memory 32, and a bus 33 connecting the various system components (including the memory 32 and the processor 31).
The bus 33 includes a data bus, an address bus, and a control bus.
The memory 32 may include volatile memory, such as random access memory (RAM) 321 and/or cache memory 322, and may further include read-only memory (ROM) 323.
Memory 32 may also include a program/utility 325 having a set (at least one) of program modules 324, such program modules 324 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The processor 31 executes various functional applications and data processing, such as an application data processing method of the AI-based wearable device in the above embodiment of the present invention, by running a computer program stored in the memory 32.
The electronic device 30 may also communicate with one or more external devices 34 (e.g., a keyboard, a pointing device, etc.). Such communication may occur through input/output (I/O) interfaces 35. The electronic device 30 may also communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via a network adapter 36. As shown in Fig. 4, the network adapter 36 communicates with the other modules of the electronic device 30 via the bus 33. It should be understood that, although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 30, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID (disk array) systems, tape drives, and data backup storage systems.
It should be noted that although the above detailed description mentions several units/modules or sub-units/modules of the electronic device, this division is merely exemplary and not mandatory. Indeed, according to embodiments of the invention, the features and functionality of two or more of the units/modules described above may be embodied in a single unit/module; conversely, the features and functions of one unit/module described above may be further divided and embodied by a plurality of units/modules.
The present embodiment also provides a computer-readable storage medium on which a computer program is stored, which when executed by a processor, implements the steps in the application data processing method of the AI-based wearable device as in the above embodiments.
More specific examples (a non-exhaustive list) of the readable storage medium include: a portable disk, a hard disk, random access memory, read-only memory, erasable programmable read-only memory, an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In a possible embodiment, the invention can also be implemented as a program product comprising program code that, when the program product runs on a terminal device, causes the terminal device to perform the steps of the application data processing method of the AI-based wearable device as in the above embodiments.
The program code for carrying out the invention may be written in any combination of one or more programming languages, and may execute entirely on the user device, partly on the user device as a stand-alone software package, partly on the user device and partly on a remote device, or entirely on the remote device.
While specific embodiments of the invention have been described above, it will be appreciated by those skilled in the art that this is by way of example only, and that the scope of the invention is defined by the appended claims. Various changes and modifications to these embodiments may be made by those skilled in the art without departing from the spirit and scope of the invention, and these changes and modifications are within the scope of the invention.

Claims (20)

1. An application data processing method of an AI-based wearable device, wherein an NPU is integrated in the wearable device;
the application data processing method comprises the following steps:
in response to receiving a triggering instruction of an application program of the wearable device, acquiring application data associated with the application program;
calling the NPU to input the application data into a trained preset neural network model and outputting application optimization data by operating the preset neural network model;
and operating the application program according to the application optimization data to output an operation result of the application program.
2. The application data processing method of claim 1, wherein the step of acquiring application data associated with the application in response to receiving a triggering instruction of the application of the wearable device comprises:
in response to receiving a trigger instruction of an audio playing program of the wearable device, acquiring audio-related data;
the step of invoking the NPU to input the application data to the trained preset neural network model and outputting application optimization data by operating the preset neural network model includes:
calling the NPU to input the audio-related data into a trained recurrent neural network model and operating the recurrent neural network model to output tuned PCM data;
the step of running the application program according to the application optimization data to output a running result of the application program includes:
and operating the audio playing program according to the tuned PCM data to output audio.
3. The application data processing method of claim 2, wherein the audio-related data includes at least one of raw audio data, codec parameters, and audio device parameters.
4. The application data processing method of claim 1, wherein the step of acquiring application data associated with the application in response to receiving a triggering instruction of the application of the wearable device comprises:
in response to receiving a trigger instruction of a positioning program of the wearable device, acquiring positioning related data;
the step of invoking the NPU to input the application data to the trained preset neural network model and outputting application optimization data by operating the preset neural network model includes:
calling the NPU to input the positioning related data into a trained deep neural network model and output AI-fused positioning data by operating the deep neural network model;
the step of running the application program according to the application optimization data to output a running result of the application program includes:
and operating the positioning program according to the positioning data after AI fusion to output positioning information.
5. The application data processing method of claim 4, wherein the positioning-related data comprises at least one of modem positioning data, GPS positioning data, GNSS positioning data, bluetooth positioning data, and Wi-Fi positioning data.
6. The application data processing method of claim 1, wherein the step of acquiring application data associated with the application in response to receiving a triggering instruction of the application of the wearable device comprises:
in response to receiving a trigger instruction of a call application program of the wearable device, acquiring voice data and calibration information in various real environments;
the step of invoking the NPU to input the application data to the trained preset neural network model and outputting application optimization data by operating the preset neural network model includes:
calling the NPU to input the voice data and the calibration information into a trained recurrent neural network model and outputting emotion classification possibilities of preset categories by operating the recurrent neural network model;
the step of running the application program according to the application optimization data to output a running result of the application program includes:
and operating the call application program according to the emotion classification possibilities of the preset categories to output an emotion classification result.
7. The application data processing method of claim 1, wherein the step of acquiring application data associated with the application in response to receiving a triggering instruction of the application of the wearable device comprises:
in response to receiving a trigger instruction of an environment detection program of the wearable device, acquiring environment-related data in various real environments;
the step of invoking the NPU to input the application data to the trained preset neural network model and outputting application optimization data by operating the preset neural network model includes:
calling the NPU to input the environment-related data into a trained deep neural network model and outputting external environment classification possibilities of preset categories by operating the deep neural network model;
the step of running the application program according to the application optimization data to output a running result of the application program includes:
and operating the environment detection program according to the external environment classification possibilities of the preset categories to output an external environment classification result.
8. The application data processing method of claim 7, wherein the environment-related data comprises at least one of environmental sound data, sensor data, and location data.
9. The application data processing method of any one of claims 1 to 8, wherein the NPU is a lightweight NPU; and/or
the wearable device comprises a smart watch or a smart bracelet.
10. A wearable device, comprising a main control chip, wherein the main control chip comprises a data acquisition module, a model calling module, and an application processing module, and an NPU is further integrated in the wearable device;
the data acquisition module is configured to acquire application data associated with an application of the wearable device in response to receiving a triggering instruction of the application;
the model calling module is configured to call the NPU to input the application data to a trained preset neural network model and output application optimization data by running the preset neural network model;
the application processing module is configured to run the application program according to the application optimization data to output a running result of the application program.
11. The wearable device of claim 10, wherein the data acquisition module is configured to acquire audio-related data in response to receiving a triggering instruction of an audio playback program of the wearable device;
the model calling module is configured to call the NPU to input the audio-related data to a trained recurrent neural network model and output tuned PCM data by operating the recurrent neural network model;
the application processing module is configured to run the audio playing program according to the tuned PCM data to output audio.
12. The wearable device of claim 11, wherein the audio related data comprises at least one of raw audio data, codec parameters, and audio device parameters.
13. The wearable device of claim 10, wherein the data acquisition module is configured to acquire location-related data in response to receiving a triggering instruction of a location program of the wearable device;
the model calling module is configured to call the NPU to input the positioning-related data to a trained deep neural network model and output AI-fused positioning data by operating the deep neural network model;
the application processing module is configured to run the positioning program according to the AI fused positioning data to output positioning information.
14. The wearable device of claim 13, wherein the positioning-related data comprises at least one of modem positioning data, GPS positioning data, GNSS positioning data, bluetooth positioning data, and Wi-Fi positioning data.
15. The wearable device of claim 10, wherein the data acquisition module is configured to acquire voice data and calibration information in various real environments in response to receiving a triggering instruction of a call application of the wearable device;
the model calling module is configured to call the NPU to input the voice data and the calibration information into a trained recurrent neural network model and output a preset category of emotion classification possibilities by operating the recurrent neural network model;
the application processing module is configured to run the call application program according to the emotion classification possibility of the preset category to output an emotion classification result.
16. The wearable device of claim 10, wherein the data acquisition module is configured to acquire environment-related data in various real environments in response to receiving a triggering instruction of an environment detection program of the wearable device;
the model calling module is configured to call the NPU to input the environment-related data to a trained deep neural network model and output external environment classification possibilities of preset categories by operating the deep neural network model;
the application processing module is configured to run the environment detection program according to a preset category of external environment classification possibility probability to output an external environment classification result.
17. The wearable device of claim 16, wherein the environment-related data comprises at least one of environmental sound data, sensor data, and location data.
18. The wearable device of any of claims 10-17, wherein the NPU is a lightweight NPU; and/or
the wearable device comprises a smart watch or a smart bracelet.
19. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the computer program implements the steps of the method for processing application data of the AI-based wearable device according to any of claims 1 to 9.
20. A computer readable medium having stored thereon computer instructions, which when executed by a processor, implement the steps of the method of application data processing of an AI-based wearable device according to any of claims 1-9.
CN202011511656.1A 2020-12-18 2020-12-18 Wearable device based on AI and application data processing method thereof Pending CN112633473A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011511656.1A CN112633473A (en) 2020-12-18 2020-12-18 Wearable device based on AI and application data processing method thereof
PCT/CN2021/131289 WO2022127497A1 (en) 2020-12-18 2021-11-17 Ai-based wearable device, and application data processing method therefor


Publications (1)

Publication Number Publication Date
CN112633473A (en) 2021-04-09

Family

ID=75317669

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011511656.1A Pending CN112633473A (en) 2020-12-18 2020-12-18 Wearable device based on AI and application data processing method thereof

Country Status (2)

Country Link
CN (1) CN112633473A (en)
WO (1) WO2022127497A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022127497A1 (en) * 2020-12-18 2022-06-23 展讯通信(上海)有限公司 Ai-based wearable device, and application data processing method therefor

Citations (6)

Publication number Priority date Publication date Assignee Title
CN105979201A (en) * 2016-04-11 2016-09-28 Shanghai University Intelligent wearable device based on parallel processor
CN108573307A (en) * 2018-03-05 2018-09-25 Vivo Mobile Communication Co., Ltd. Method and terminal for processing a neural network model file
CN109602410A (en) * 2018-11-16 2019-04-12 Qingdao Zhenshi Technology Co., Ltd. Wearable device and pulse monitoring method thereof
US20190332931A1 (en) * 2018-04-25 2019-10-31 Fujitsu Limited Deep neural network training for application program generation
CN110554768A (en) * 2018-05-31 2019-12-10 Nubia Technology Co., Ltd. Intelligent wearable device control method and device and computer-readable storage medium
CN110841262A (en) * 2019-12-06 2020-02-28 Physical Education College of Zhengzhou University Football training system based on wearable equipment

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
CN105980008B (en) * 2014-02-24 2019-04-12 Sony Corporation Body position optimization and bio-signal feedback for smart wearable devices
CN107678799B (en) * 2017-09-30 2019-10-25 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Application program management and control method, device, storage medium, and electronic device
CN111753959A (en) * 2020-06-24 2020-10-09 Southern University of Science and Technology Neural network model optimization method, device, equipment and storage medium
CN112633473A (en) * 2020-12-18 2021-04-09 展讯通信(上海)有限公司 Wearable device based on AI and application data processing method thereof


Also Published As

Publication number Publication date
WO2022127497A1 (en) 2022-06-23

Similar Documents

Publication Publication Date Title
WO2021027267A1 (en) Speech interaction method and apparatus, terminal and storage medium
JP6505117B2 (en) Interaction of digital personal digital assistant by replication and rich multimedia at response
CN111724775B (en) Voice interaction method and electronic equipment
KR102040783B1 (en) Feedback controller for sending data
US20190103100A1 (en) Techniques for client-side speech domain detection and a system using the same
CN110910872A (en) Voice interaction method and device
CN111506291B (en) Audio data acquisition method, device, computer equipment and storage medium
US20210297494A1 (en) Intelligent layer to power cross platform, edge-cloud hybrid artificial intelligence services
CN101246687A (en) Intelligent voice interaction system and method thereof
US11537360B2 (en) System for processing user utterance and control method of same
US20210304020A1 (en) Universal client api for ai services
CN113168227A (en) Method of performing function of electronic device and electronic device using the same
Yang et al. An intelligent voice interaction system based on Raspberry Pi
WO2022143258A1 (en) Voice interaction processing method and related apparatus
CN112633473A (en) Wearable device based on AI and application data processing method thereof
CN114333774A (en) Speech recognition method, speech recognition device, computer equipment and storage medium
WO2021190225A1 (en) Voice interaction method and electronic device
CN112673367A (en) Electronic device and method for predicting user intention
CN111123326B (en) Positioning method, positioning device, storage medium and terminal
CN113409805A (en) Man-machine interaction method and device, storage medium and terminal equipment
CN113678119A (en) Electronic device for generating natural language response and method thereof
CN112802485B (en) Voice data processing method and device, computer equipment and storage medium
CN111580893A (en) Method for providing routine and electronic device supporting the same
WO2021202605A1 (en) A universal client api for ai services
CN114637531A (en) Method and device for dynamically generating application program interface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210409