CN111316199B - Information processing method and electronic device - Google Patents

Information processing method and electronic device

Info

Publication number: CN111316199B
Application number: CN201880072215.0A
Authority: CN (China)
Other versions: CN111316199A (published in Chinese)
Inventors: 潘尚斌, 孙忠, 李大伟
Assignee: Huawei Technologies Co., Ltd.
Legal status: Active (granted)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00: Details not covered by groups G06F3/00-G06F13/00 and G06F21/00
    • G06F1/26: Power supply means, e.g. regulation thereof
    • G06F1/32: Means for saving power


Abstract

The application provides an information processing method and an electronic device, applied to an electronic device that includes an application processor and a coprocessor. The method comprises the following steps: the coprocessor of the electronic device receives a recommendation-information reporting instruction from the application processor of the electronic device; according to the instruction, the coprocessor acquires service data generated by the application processor and environment data collected by a low-power always-on device of the electronic device; the coprocessor then performs an artificial-intelligence operation on the service data and the environment data using an artificial-intelligence algorithm to generate an operation result; and when the operation result meets a preset condition, the coprocessor reports the operation result to the application processor, so that the application processor displays the operation result as recommendation information.

Description

Information processing method and electronic device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an information processing method and an electronic device.
Background
As artificial intelligence becomes widespread on electronic devices, their capabilities in image processing, audio processing, language processing, and the like grow ever more powerful, and artificial intelligence has become an indispensable core competence of electronic devices.
Currently, the artificial-intelligence processing function is integrated into the Android operating system running on the application processor and is invoked by applications at the application layer. Because the function executes only when triggered by a user action or by an event received by the Android operating system, it is invoked only when a service requires it. Moreover, the application processor, constrained by power consumption, sleeps when there is no service, so the artificial-intelligence processing function does not run continuously. As a result, changes in user actions, behavioral intent, and the environment cannot be sensed in real time; the AI sensing capability cannot run automatically and instead depends on specific user actions or on calling certain application modules. The device is therefore not very intelligent, and the user experience suffers.
Disclosure of Invention
The application provides an information processing method and an electronic device, which are used to perform artificial-intelligence operations on the electronic-device side in real time, so that accurate recommendation information is provided to the user in a timely manner.
In a first aspect, an embodiment of the present application provides an information processing method applied to an electronic device that includes an application processor and a coprocessor. The method includes: the coprocessor of the electronic device receives a recommendation-information reporting instruction from the application processor of the electronic device; according to the instruction, the coprocessor acquires service data generated by the application processor and environment data collected by a low-power always-on device of the electronic device; the coprocessor then performs an artificial-intelligence operation on the service data and the environment data using an artificial-intelligence algorithm to generate an operation result; and when the operation result meets a preset condition, the coprocessor reports the operation result to the application processor, so that the application processor displays the operation result as recommendation information.
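To make these steps concrete, here is a minimal C sketch of the coprocessor-side flow. All names (cp_recv_cmd, cp_read_service_data, ai_run, SCORE_THRESHOLD, and so on) are invented for illustration and are not from the patent; the "preset condition" is modeled as a simple score threshold.

```c
#include <stdbool.h>
#include <stdint.h>

typedef struct { float score; uint32_t label; } ai_result_t;

/* Hypothetical platform helpers (assumptions, not the patent's API). */
extern bool cp_recv_cmd(uint32_t *cmd);                /* from the application processor */
extern void cp_read_service_data(void *buf, int len);  /* service data from the AP */
extern void cp_read_sensor_data(void *buf, int len);   /* low-power always-on devices */
extern ai_result_t ai_run(const void *svc, const void *env);
extern void cp_report_to_ap(const ai_result_t *r);     /* AP then displays the result */

#define CMD_REPORT_RECOMMENDATION 0x01u
#define SCORE_THRESHOLD 0.8f   /* stand-in for the "preset condition" */

void coprocessor_task(void)
{
    uint32_t cmd;
    uint8_t svc[256], env[256];

    for (;;) {
        if (!cp_recv_cmd(&cmd) || cmd != CMD_REPORT_RECOMMENDATION)
            continue;                             /* step 1: wait for the instruction */
        cp_read_service_data(svc, sizeof svc);    /* step 2: service data */
        cp_read_sensor_data(env, sizeof env);     /* step 2: environment data */
        ai_result_t r = ai_run(svc, env);         /* step 3: AI operation */
        if (r.score >= SCORE_THRESHOLD)           /* step 4: preset condition */
            cp_report_to_ap(&r);                  /* report for display */
    }
}
```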
In this embodiment of the application, the application processor and the coprocessor work cooperatively: the coprocessor can report its processing result to the application processor, or respond quickly within a specified time to a demand from the application processor, scheduling all available resources to complete real-time tasks. All real-time tasks can thus run in a coordinated and consistent way, with fast response and high reliability. Because the coprocessor combines service data with environment data in its operation, it can sense changes in the user's intent, expression, and environment in real time and give the user the ability to perceive application services seamlessly, making the electronic device more intelligent, the interaction more natural, and human-computer interaction more efficient.
In one possible design, while the application processor of the electronic device is in a sleep state, the coprocessor acquires environment data collected by a low-power always-on device of the electronic device; and while the application processor is in a sleep state, the coprocessor performs an artificial-intelligence operation on the service data and the environment data using an artificial-intelligence algorithm to generate an operation result.
In this way, the application processor is woken only after receiving an event reported by the coprocessor, so its power consumption is not significantly affected.
In one possible design, the electronic device receives a user input for triggering the recommendation function, and in response to the user input, the application processor sends the recommendation-information reporting instruction to the coprocessor. In this embodiment of the application, after the electronic device detects the user input, an AI operation is performed on the service data generated by that input, so that the recommendation result is updated. This makes the electronic device more intelligent and the interaction more natural, and can improve the efficiency of human-computer interaction.
In one possible design, the artificial-intelligence algorithm is solidified in the hardware of the coprocessor. This improves computational efficiency and, to some extent, reduces the power consumed during computation.
In a second aspect, an embodiment of the present application further provides an information processing method applied to an electronic device that includes an application processor and a coprocessor. The method comprises: the coprocessor of the electronic device acquires environment data collected by a low-power always-on device during a first time period, during which the application processor of the electronic device is in a sleep state; at a first moment, the coprocessor acquires service data from the application processor; the coprocessor then performs an artificial-intelligence operation on the environment data and the service data using an artificial-intelligence algorithm to generate an operation result; and finally, when the operation result meets a preset condition, the coprocessor reports the operation result to the application processor, waking it so that it displays the operation result as recommendation information.
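A companion sketch of this second, self-initiated flow, under the same assumptions as the earlier example (all helper names invented): environment data accumulates while the application processor sleeps, and the AP is woken only if the operation result passes the preset condition.

```c
#include <stdint.h>

#define SAMPLE_COUNT 64        /* length of the "first time period", illustrative */
#define SCORE_THRESHOLD 0.8f   /* stand-in for the "preset condition" */

typedef struct { float score; uint32_t label; } ai_result_t;

extern void collect_env_sample(uint8_t *buf);          /* always-on sensor sample */
extern void cp_read_service_data(void *buf, int len);  /* service data from the AP */
extern ai_result_t ai_run(const void *svc, const void *env);
extern void cp_wake_and_report(const ai_result_t *r);  /* wakes the sleeping AP */

void coprocessor_autonomous_task(void)
{
    uint8_t env[1024], svc[256];

    /* First time period: the AP sleeps while environment data accumulates. */
    for (int i = 0; i < SAMPLE_COUNT; i++)
        collect_env_sample(env);

    /* First moment: fetch service data, run the AI operation, and wake the
       AP only if the result satisfies the preset condition. */
    cp_read_service_data(svc, sizeof svc);
    ai_result_t r = ai_run(svc, env);
    if (r.score >= SCORE_THRESHOLD)
        cp_wake_and_report(&r);
}
```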
With this method, the electronic device does not depend on a specific user operation: it automatically senses changes in the user's intent, expression, and environment in real time, and can give the user the ability to perceive application services seamlessly, making the electronic device more intelligent, the interaction more natural, and human-computer interaction more efficient.
In one possible design, the artificial-intelligence algorithm is solidified in the hardware of the coprocessor. This improves computational efficiency and, to some extent, reduces the power consumed during computation.
In a third aspect, an embodiment of the present application provides an electronic device that includes a processor and a memory. The memory is configured to store one or more computer programs; when the one or more computer programs stored in the memory are executed by the processor, the electronic device is enabled to implement any of the possible designs of any of the aspects described above.
In a fourth aspect, the present application further provides an apparatus that includes modules/units for performing the method of any one of the possible designs of any of the above aspects. These modules/units may be implemented by hardware, or by hardware executing corresponding software.
In a fifth aspect, an embodiment of the present application further provides a computer-readable storage medium that includes a computer program which, when run on an electronic device, causes the electronic device to execute the method of any one of the possible designs of any of the above aspects.
In a sixth aspect, the present application further provides a computer program product which, when run on an electronic device, causes the electronic device to execute the method of any one of the possible designs of any of the above aspects.
These and other aspects of the present application will be more readily apparent from the following description of the embodiments.
Drawings
Fig. 1 is a schematic diagram of a communication-network interconnection scenario to which an embodiment of the present application applies;
fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 3 is a schematic diagram of an Android operating-system component architecture according to an embodiment of the present application;
fig. 4 is a schematic diagram of the system composition architecture of an RTOS according to an embodiment of the present application;
fig. 5a and fig. 5b are schematic flowcharts of an information processing method according to an embodiment of the present application;
fig. 6 is a schematic diagram illustrating interface changes in a panoramic search service according to an embodiment of the present application;
fig. 7 is a schematic diagram of a prediction-result generation process according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an information prediction apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the accompanying drawings. In the description of the embodiments of the present application, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, "a plurality" means two or more unless otherwise specified.
The information processing method provided by the embodiments of the present application can be applied to a scenario in which multiple electronic devices 100 are interconnected through a communication network, as shown in fig. 1. The communication network may be a local area network, or a wide area network switched through relay devices. When it is a local area network, it may be, for example, a Wi-Fi hotspot network, a Wi-Fi P2P network, a Bluetooth network, a ZigBee network, a near field communication (NFC) network, or another short-range communication network. When it is a wide area network, it may be, for example, a third-generation mobile communication technology (3G) network, a fourth-generation mobile communication technology (4G) network, a fifth-generation mobile communication technology (5G) network, a future-evolution public land mobile network (PLMN), the Internet, or the like. In the scenario shown in fig. 1, different electronic devices may exchange data through the communication network, for example pictures, text, and video, or the results of processing objects such as pictures, text, or video.
In some embodiments of the present application, the electronic device 100 shown in fig. 1 may be a portable electronic device that also includes other functions, such as personal digital assistant and/or music player functions, for example a mobile phone, a tablet computer, or a wearable device with a wireless communication function (e.g., a smart watch). Exemplary embodiments of the portable electronic device include, but are not limited to, devices carrying the operating systems shown in the original trademark figures (GPA0000288247890000051, GPA0000288247890000052) or other operating systems. The portable electronic device may also be another portable electronic device, such as a laptop with a touch-sensitive surface (e.g., a touch panel). It should also be understood that in some other embodiments of the present application, the electronic device 100 may not be a portable electronic device but a desktop computer with a touch-sensitive surface (e.g., a touch panel).
By way of example, the following describes the embodiment in detail by taking the electronic device 100 shown in fig. 2 as an example.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a USB interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display 194, a SIM card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a coprocessor, a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. In this embodiment of the application, the coprocessor integrates artificial intelligence (AI) capability, runs continuously in a low-power mode, and detects whether the user's behavioral intent or the device's surroundings have changed; upon detecting a change, it generates a corresponding event and reports it to the application processor. The application processor sleeps when there is no service, is woken after receiving an event reported by the coprocessor, and runs the application corresponding to that event. For the specific composition of the coprocessor and how it cooperates with the application processor, see the description of fig. 4 below.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
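As an illustration of the SDA/SCL transaction described above, the following sketch reads one register from a device such as a touch controller, using hypothetical bit-level primitives (i2c_start, i2c_write_byte, i2c_read_byte, i2c_stop); it is not the electronic device's actual driver code.

```c
#include <stdbool.h>
#include <stdint.h>

extern void    i2c_start(void);
extern void    i2c_stop(void);
extern bool    i2c_write_byte(uint8_t b);   /* returns true on ACK */
extern uint8_t i2c_read_byte(bool ack);     /* ack=false NACKs the last byte */

/* Read one register from a device on the I2C bus (7-bit address). */
uint8_t i2c_read_reg(uint8_t dev_addr, uint8_t reg)
{
    uint8_t val;

    i2c_start();
    i2c_write_byte((uint8_t)((dev_addr << 1) | 0)); /* address + write bit */
    i2c_write_byte(reg);                            /* register to read */
    i2c_start();                                    /* repeated start */
    i2c_write_byte((uint8_t)((dev_addr << 1) | 1)); /* address + read bit */
    val = i2c_read_byte(false);                     /* NACK ends the transfer */
    i2c_stop();
    return val;
}
```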
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 and the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display screen 194, the camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. It may be used to connect a charger to charge the electronic device 100, to transfer data between the electronic device 100 and a peripheral device, or to connect a headset and play audio through it. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive a charging input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via a USB interface. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In other embodiments, the power management module 141 may be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may also be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna module 1, the antenna module 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the cellular network antenna may be multiplexed into a wireless local area network diversity antenna. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power Amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLAN), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum-dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens, with N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: MPEG1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor, which processes input information quickly by referring to a biological neural network structure, for example, by referring to a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the memory 121 may include a high speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a Universal Flash Storage (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a hands-free call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can input a sound signal into the microphone 170C by speaking close to it. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones, which collect sound signals and additionally implement noise reduction. In still other embodiments, the electronic device 100 may be provided with three, four, or more microphones to collect sound signals, reduce noise, identify sound sources, implement directional recording, and so on.
The headset interface 170D is used to connect a wired headset. It may be a USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and can convert it into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material; when a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the pressure intensity from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation via the pressure sensor 180A, and can also compute the touched position from its detection signal. In some embodiments, touch operations that act on the same position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the Messages application icon, an instruction to view a message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the same icon, an instruction to create a new message is executed.
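A compact sketch of the threshold dispatch just described; FIRST_PRESSURE_THRESHOLD and the two handlers are illustrative names, not the device's real API.

```c
#define FIRST_PRESSURE_THRESHOLD 0.5f  /* normalized pressure, illustrative */

extern void view_sms(void);     /* open the message list */
extern void compose_sms(void);  /* create a new message */

/* Dispatch a touch on the Messages icon by its measured pressure. */
void on_sms_icon_touch(float pressure)
{
    if (pressure < FIRST_PRESSURE_THRESHOLD)
        view_sms();       /* light press: view messages */
    else
        compose_sms();    /* firm press: new message */
}
```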
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
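The anti-shake step can be sketched as a small compensation routine; the small-angle geometry, focal length, and helper names below are illustrative assumptions.

```c
#include <math.h>

extern float gyro_read_shake_angle_rad(void);   /* from gyro sensor 180B */
extern void  lens_move_mm(float dx);            /* actuate the lens module */

#define FOCAL_LENGTH_MM 4.0f   /* illustrative lens focal length */

/* Counteract device shake: convert the measured shake angle into a
   lens displacement and move the lens in the opposite direction. */
void anti_shake_tick(void)
{
    float theta = gyro_read_shake_angle_rad();
    float dx = FOCAL_LENGTH_MM * tanf(theta);   /* small-angle compensation */
    lens_move_mm(-dx);                          /* reverse movement cancels shake */
}
```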
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip holster. In some embodiments, when the electronic device 100 is a flip phone, it can detect the opening and closing of the flip cover via the magnetic sensor 180D, and then set features such as automatic unlocking upon opening according to the detected open or closed state of the holster or flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The method can also be used for recognizing the posture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, electronic device 100 may utilize range sensor 180F to range for fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The LED may be an infrared LED. The electronic device 100 emits infrared light outward through the LED and uses the photodiode to detect infrared light reflected from a nearby object. When sufficient reflected light is detected, the electronic device 100 can determine that an object is nearby; when insufficient reflected light is detected, it can determine that no object is nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding it close to the ear during a call, so that the screen is turned off automatically to save power. The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
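The emit-and-measure decision above can be sketched as follows; the LED/photodiode helpers and the reflection threshold are assumptions for illustration.

```c
#include <stdbool.h>
#include <stdint.h>

#define REFLECTION_THRESHOLD 200u  /* illustrative ADC count */

extern void     ir_led_on(void);
extern void     ir_led_off(void);
extern uint16_t photodiode_read(void);  /* reflected-light intensity */

/* Emit infrared light and decide from the reflection whether an
   object (e.g. the user's ear) is near the device. */
bool object_near(void)
{
    ir_led_on();
    uint16_t reflected = photodiode_read();
    ir_led_off();
    return reflected >= REFLECTION_THRESHOLD;  /* enough light back: near */
}
```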
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, electronic device 100 implements a temperature processing strategy using the temperature detected by temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold to avoid the low temperature causing the electronic device 100 to shut down abnormally. In other embodiments, when the temperature is lower than a further threshold, the electronic device 100 performs boosting on the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
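The three-threshold policy reads naturally as a periodic handler; the thresholds and action names below are illustrative stand-ins, not the device's actual values.

```c
#define T_HOT        45   /* degrees Celsius, illustrative */
#define T_COLD        0
#define T_VERY_COLD -10

extern void throttle_nearby_processor(void);  /* reduce performance near 180J */
extern void heat_battery(void);               /* warm the battery 142 */
extern void boost_battery_output(void);       /* raise the supply voltage */

/* Called periodically with the temperature reported by sensor 180J. */
void thermal_tick(int temp_c)
{
    if (temp_c > T_HOT)
        throttle_nearby_processor();  /* shed heat, protect the device */
    else if (temp_c < T_VERY_COLD)
        boost_battery_output();       /* avoid abnormal low-temp shutdown */
    else if (temp_c < T_COLD)
        heat_battery();               /* pre-warm before it gets critical */
}
```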
The touch sensor 180K, also referred to as a "touch panel", may be disposed on the display screen 194. It detects touch operations acting on or near it and can pass the detected touch operation to the application processor to determine the type of touch event, with a corresponding visual output provided via the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the electronic device 100 at a position different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of a vibrating bone block of the human vocal part. The bone conduction sensor 180M may also be in contact with the human pulse to receive a blood-pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset. The audio module 170 may parse out a voice signal based on the vibration signal acquired by the bone conduction sensor 180M, to implement a voice function. The application processor may parse heart-rate information based on the blood-pressure pulsation signal acquired by the bone conduction sensor 180M, to implement a heart-rate detection function.
The keys 190 include a power key, volume keys, and the like. The keys may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a subscriber identity module (SIM) card. A SIM card can be inserted into or removed from the SIM card interface 195 to contact or separate from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a standard SIM card, and so on. Multiple cards can be inserted into the same SIM card interface at the same time; the cards may be of the same type or different types. The SIM card interface 195 is also compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, that is, an embedded SIM card, which can be embedded in the electronic device 100 and cannot be separated from it.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. This embodiment uses the Android system with a layered architecture as an example to describe the software structure of the electronic device 100.
Fig. 3 is a block diagram of a software configuration of an application processor in the electronic device 100 according to the embodiment of the present invention.
The operating system of the application processor is an Android system, the Android system is divided into a plurality of layers by a layered architecture, and each layer has a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 3, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
Content providers are used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, phone calls dialed and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction, such as notifications of download completion or message alerts. The notification manager may also present notifications that appear in the top status bar of the system in the form of a chart or scroll-bar text, such as notifications from applications running in the background, or notifications that appear on the screen as a dialog window. Examples include text prompts in the status bar, a prompt tone, vibration of the electronic device, or a flashing indicator light.
The Android runtime comprises a core library and a virtual machine, and is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part consists of the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing the functions of object life cycle management, stack management, thread management, safety and exception management, garbage collection and the like.
The system library may include a plurality of functional modules, for example a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL).
The surface manager is used to manage the display subsystem and provide a fusion of the 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still-image files. The media library may support multiple audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. It includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
The following describes exemplary workflow of the software and hardware of the electronic device 100 in connection with capturing a photo scene.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the touch operation into a raw input event (including the touch coordinates, a timestamp of the touch operation, and the like), and the raw input event is stored at the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the event. Taking an example in which the touch operation is a tap and the corresponding control is the camera application icon: the camera application calls an interface of the application framework layer to start the camera application, which in turn starts the camera driver by calling the kernel layer, and a still image or video is captured through the camera.
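A schematic C rendering of that pipeline, kept in the same language as the other sketches here; the queue, hit-test, and launch helpers are invented for illustration and are not Android's real input stack.

```c
#include <stdbool.h>
#include <stdint.h>

typedef struct { int x, y; uint64_t ts_us; } raw_input_event_t;

extern void event_queue_push(const raw_input_event_t *ev);
extern bool event_queue_pop(raw_input_event_t *ev);
extern int  hit_test(int x, int y);   /* id of the control under the tap */
extern void start_camera_app(void);   /* framework -> app -> camera driver */
extern int  read_x(void);
extern int  read_y(void);
extern uint64_t now_us(void);

#define CAMERA_ICON_ID 42             /* illustrative control id */

/* Kernel layer: the touch IRQ becomes a stored raw input event. */
void touch_irq_handler(void)
{
    raw_input_event_t ev = { read_x(), read_y(), now_us() };
    event_queue_push(&ev);
}

/* Application framework layer: consume events, find the control, dispatch. */
void framework_input_loop(void)
{
    raw_input_event_t ev;
    while (event_queue_pop(&ev)) {
        if (hit_test(ev.x, ev.y) == CAMERA_ICON_ID)
            start_camera_app();
    }
}
```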
Fig. 4 is a block diagram of software structures of the application processor 210 and the coprocessor 220 in the electronic device 100 in the embodiment of the present application.
The operating system of the coprocessor 220 is a real-time operating system (RTOS). A layered architecture divides the RTOS into several layers, each with a clear role and division of labor; the layers communicate with each other through software interfaces. In some embodiments, the RTOS is divided, from bottom to top, into a kernel layer 221, an application framework layer 222, and an application layer 223.
The kernel layer 221 includes a peripheral driver module 2211, a hardware acceleration module 2212, and an AI operator library module 2213.
Peripheral driver module 2211: provides a software interface for mounting various peripheral chips. For example, an always-on low-power camera 230 may be mounted, providing a hardware basis for the coprocessor to sense the user's behavioral intent or environmental changes. From the image data collected by the low-power camera, the coprocessor can analyze features such as the user's actions and the surrounding environment, providing a data source for the coprocessor to process AI services.
Hardware acceleration module 2212: accelerates the process in which the AI engine module runs the AI model managed by the AI model management module by calling operators in the AI operator library module. It ensures that the AI engine module can quickly call operators in the AI operator library module in real time, and provides capability interfaces for the various AI algorithms in the AI algorithm model of the application framework layer.
AI operator library module 2213: the AI engine module of the application layer can run the AI model managed by the AI model management module by calling operators in the AI operator library module, to perform operations such as environment recognition or face recognition. Because the coprocessor's resources are limited, the AI operator library module, which involves a large amount of mathematical computation, can be solidified in hardware; most AI operators can then be implemented in hardware, avoiding the high processor load that software-implemented operators would generate. The kernel provides the interface of the hardware-solidified operators to the AI model management module 2233 at the application layer.
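To show what "solidified in hardware" means for the call path, here is a sketch in which a model layer resolves to a hardware-backed operator exposed through the kernel interface; hw_op_conv2d and layer_t are invented names, not the patent's API.

```c
typedef struct {
    const float *weights;
    int h, w, c;            /* layer geometry */
} layer_t;

/* Hardware-solidified operator, exposed to the application layer
   through the kernel's interface (name and signature assumed). */
extern int hw_op_conv2d(const float *in, const float *weights, float *out,
                        int h, int w, int c);

/* The AI engine resolves a model layer to an operator; routing it to the
   hardware version avoids the processor load of a software implementation. */
int run_model_layer(const layer_t *l, const float *in, float *out)
{
    return hw_op_conv2d(in, l->weights, out, l->h, l->w, l->c);
}
```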
Optionally, in some embodiments, the peripherals that can be mounted on the peripheral driver module may further include, but are not limited to: a sensor (which can be used to recognize user actions), a normally-open low-power-consumption microphone (which can be used to analyze characteristics of the user's voice and the like), and a position sensor (for example, a Global Positioning System (GPS), a wireless local area network (WiFi) module, or a modem, which can provide the user's position information). Data collected by the GPS/WiFi/modem can be used to generate positioning information; images collected by the low-power-consumption normally-open camera can be used to analyze the user's facial features, expressions, surrounding spatial environment factors, and the like; and audio data collected by the low-power-consumption normally-open microphone can be used to analyze the user's voice keywords, environmental background sounds, and the like. For example, the applications a user commonly uses at home differ from those commonly used at the office, so the data collected by these peripherals allow the coprocessor to further comprehensively judge the prediction result of the AI algorithm and improve its accuracy.
The application framework layer 222 includes: an AI application management module 2221, an AI algorithm management module 2222, and an AI algorithm model 2223.
The AI application management module 2221: classifies the data reported by the peripheral driver module 2211. For example, the received data is classified into an image class, a video class, an audio class, and the like, so that AI algorithm models 2223 of the corresponding classes can be called for analysis.
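A minimal sketch of this classification-and-routing step is shown below; the peripheral source names and model names are hypothetical, chosen only to mirror the examples in the text.

```python
# Hypothetical routing table: data class -> AI algorithm model to invoke.
MODEL_BY_CLASS = {
    "image": "face_recognition_model",
    "audio": "keyword_spotting_model",
    "position": "location_context_model",
}

def classify_report(source: str) -> str:
    """Map a peripheral source to a coarse data class."""
    if source == "low_power_camera":
        return "image"
    if source == "low_power_microphone":
        return "audio"
    if source in ("gps", "wifi", "modem"):
        return "position"
    return "unknown"

def select_model(source: str) -> str:
    """Pick the AI algorithm model matching the reported data's class."""
    return MODEL_BY_CLASS.get(classify_report(source), "no_model")

print(select_model("low_power_camera"))  # -> face_recognition_model
print(select_model("gps"))               # -> location_context_model
```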
The AI algorithm management module 2222: is responsible for algorithm management, and can select a corresponding AI algorithm model from the multiple running AI algorithm models 2223 for analysis, according to the different types of data reported by the AI application management module 2221.
AI algorithm model 2223: may be a collection of algorithm features of images or sounds associated with a certain service. For example, for a face recognition service, the AI algorithm model 2223 may be a set of features conforming to face contours. As another example, for a context-aware service, the AI algorithm model 2223 may be a collection of features conforming to a certain environment. The AI algorithm model 2223 may be trained using large-scale image data; after training is complete, an algorithm model is generated, and the corresponding AI operators can run the model to perform operations such as environment recognition or face recognition.
It should be noted that the AI algorithm model 2223 may be integrated in a software system by default, or may be updated to the coprocessor 220 by the application processor 210, which is not specifically limited in this embodiment of the present application.
The application layer 223 includes: an AI application layer module 2231, an AI engine module 2232, and an AI model management module 2233.
AI application layer module 2231: various continuously-open AI applications can be implemented at the application layer 223 according to the scenario requirements of the electronic device's service design. The AI application layer module 2231 may call various algorithms to obtain AI identification results for the devices mounted as peripherals, and may report corresponding AI event messages to the application processor 210. If the application processor 210 is in the sleep state, it is first awakened and then performs secondary processing on the AI event message.
The AI engine module 2232: is responsible for scheduling and coordinating the running of the AI algorithm models 2223. Since multiple AI algorithm models 2223 may run simultaneously, the scheduling management control of the AI engine module 2232 maximally ensures that the software runs in order.
AI model management module 2233: in some embodiments, the application processor 210 may also optimize the AI algorithm model 2223. For example, the result of the AI algorithm model 2223 may be comprehensively judged using positioning information such as GPS/WiFi/modem data, so as to improve the accuracy of the AI algorithm model 2223. The AI model management module 2233 in the application layer 223 can modify certain features in the AI algorithm model 2223 accordingly.
The kernel layer and the application framework layer are the core foundation of the whole software system: they are responsible for scheduling system resources and provide the application layer with the computing capability for AI algorithms, such as the AI operators, the AI engine, and the hardware accelerator. The AI operators are integrated into the coprocessor through hardware solidification, and the AI engine is responsible for scheduling and coordinating operator execution; for example, when multiple AI algorithms run simultaneously, the scheduling management control of the AI engine can exploit the hardware capability to the maximum extent and ensure that the software runs in order.
Because the application processor is constrained by power consumption, it cannot operate in real time; therefore, to recommend information to the user in real time, AI computing capability is integrated into the coprocessor. The coprocessor, with its integrated AI computing capability, can run continuously in a low-power-consumption mode to detect the user's action intent and environmental changes. The coprocessor is mounted on the application processor; when a corresponding event is detected, it reports an AI event message to the application processor to trigger its wake-up. The coprocessor can also perform AI operations at the request of the application processor and report the computation result. When external events or data are generated, the coprocessor can receive and process them at sufficient speed, report the processed result to the application processor within the specified time or respond quickly to the application processor's request, and schedule all available resources to complete real-time tasks, so that all real-time tasks run in coordination with fast response and high reliability.
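The report-and-wake handshake can be pictured with the sketch below, in which a blocking message queue stands in for the wake-up mechanism; the threshold value and message format are assumptions for illustration.

```python
import queue

ap_mailbox = queue.Queue()      # stands in for the AP's event channel
ap_state = {"awake": False}

def coprocessor_step(score: float, threshold: float = 0.8):
    """One iteration of the always-on loop: report if the result qualifies."""
    if score >= threshold:      # "operation result meets the preset condition"
        ap_mailbox.put({"type": "AI_EVENT", "result": score})

def application_processor_poll():
    """The AP sleeps until an AI event message arrives, then handles it."""
    event = ap_mailbox.get()    # the blocking get models the wake-up
    ap_state["awake"] = True
    return event

coprocessor_step(0.93)
print(application_processor_poll())  # {'type': 'AI_EVENT', 'result': 0.93}
```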
The application processor 210 and the coprocessor 220 work in a cooperative mode: the coprocessor carries the AI computing capability, while the application processor carries the various application service functions, providing a good human-machine experience for the user. When there is no service, the application processor can sleep normally in standby and enter a low-power-consumption mode. After the application processor is awakened by an AI event message sent by the coprocessor, it receives the event reported by the coprocessor and triggers the corresponding service scenario function.
The application processor 210: is responsible for running the various applications of the electronic device, including the UI human-machine interaction interface, cloud interaction, and the like. When there is no service, this main controller sleeps normally in standby and enters a low-power-consumption mode.
The application processor 210 may include: AI native 211, an AI event message manager (AI service) 212, and applications APP 213, APP 214, and APP 215.
AI native 211: receives the AI event message reported by the coprocessor 220 and thereby wakes the application processor 210. It may also send an AI algorithm model optimized by the application processor 210 to the AI engine module 2232 of the coprocessor 220, and the AI engine module 2232 may update the AI algorithm model 2223 through the AI model management module 2233.
AI event message manager (AI service) 212: receives the AI event message reported by AI native 211, manages the AI capability interfaces of the electronic device in a unified manner, and provides an AI application programming interface (API) for each service module. Various highlight service functions are implemented according to the product's service requirements; for example, different highlight service functions may be implemented by different applications (APP 213, APP 214, or APP 215).
Optionally, in some embodiments, if big-data processing is required, the AI service 212 may also transmit data to the cloud, completing a low-power-consumption service processing mode that combines the electronic device and the cloud.
In this embodiment of the application, the coprocessor's main frequency is low, the AI operators for the large amount of related mathematical operations are integrated through hardware solidification, and the peripherals are low-power-consumption normally-open devices. The AI sensing capability can therefore remain always on and run in a low-power-consumption mode, and the electronic device can automatically sense changes in the user's actions or environment without depending on any specific operation.
In fig. 4, when there is no service, the application processor sleeps normally in standby and enters a low-power-consumption mode. The coprocessor runs continuously in a low-power-consumption mode: it acquires the data collected by the low-power-consumption normally-open devices (such as the sensor and the low-power-consumption normally-open camera), calls the corresponding AI algorithm analysis according to the acquired data, and senses and infers in real time whether the user's actions, intent, or environmental characteristics have changed. If a change is determined, AI information is reported to the application processor; after AI native on the application processor detects the event, the application processor is awakened, and the application program APP 213, APP 214, or APP 215 then executes the service function corresponding to the event, for example, displaying recommendation information on the interface of the corresponding application. The scenario of the specific event is not specifically limited in this embodiment of the application. As an example, in a face recognition scenario, the occurrence of the specific event may indicate that, in the images captured by the low-power-consumption camera, a previous non-user face has changed into the user's face, or the user's face has changed into a non-user face. For example, the coprocessor may analyze, from the image data and the corresponding AI algorithm, whether the user's face can be detected in the image, and may determine from the previously acquired state that the face in the image has changed from non-user to user, or from user to non-user (which may also be understood as a change in the user's behavioral intent). As another example, in a scenario of identifying the environment surrounding the user, the occurrence of the specific event may indicate that the environmental scene around the user has changed in the images captured by the low-power-consumption camera. For example, the coprocessor may analyze, from the image data and the corresponding AI algorithm, that the previously acquired environmental scene around the user has changed into a target environmental scene.
The workflow of the coprocessor is illustrated below in connection with an image classification scenario.
For example, the AI algorithms in the coprocessor include an AI algorithm that uses a MobileNet model file for image classification; the AI operators called to run the MobileNet model file may be partly solidified in hardware and partly software operators in a software operator library. After the low-power-consumption normally-open camera collects multiple images in real time, the application layer of the coprocessor computes on the collected images using the MobileNet model, calling the AI algorithm model of the application framework layer during the computation and finally generating an image analysis result. Because the coprocessor's main frequency is low, the AI operators involving a large amount of mathematical operations can be integrated through hardware solidification, and the peripherals are low-power-consumption devices. In addition, the coprocessor does not need to be networked: the data collected by the low-power-consumption normally-open devices is stored in the coprocessor, so data security is high and the user's privacy is well protected.
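For background, MobileNet's characteristic operator is the depthwise-separable convolution, exactly the kind of math-heavy kernel that benefits from hardware solidification. The NumPy code below is a plain, unoptimized reference sketch of that operator, for illustration only.

```python
import numpy as np

def depthwise_separable_conv(x, dw_kernels, pw_kernels):
    """Reference depthwise-separable convolution (no padding, stride 1).

    x:          (H, W, C) input feature map
    dw_kernels: (k, k, C) one spatial filter per input channel
    pw_kernels: (C, C_out) 1x1 pointwise filters that mix channels
    """
    H, W, C = x.shape
    k = dw_kernels.shape[0]
    oh, ow = H - k + 1, W - k + 1

    # Depthwise stage: each channel is filtered independently.
    dw = np.zeros((oh, ow, C))
    for c in range(C):
        for i in range(oh):
            for j in range(ow):
                dw[i, j, c] = np.sum(x[i:i+k, j:j+k, c] * dw_kernels[:, :, c])

    # Pointwise stage: a 1x1 convolution recombines the channels.
    return dw @ pw_kernels  # shape (oh, ow, C_out)

x = np.random.rand(8, 8, 3)
print(depthwise_separable_conv(x, np.random.rand(3, 3, 3),
                               np.random.rand(3, 16)).shape)  # (6, 6, 16)
```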
The following embodiments may be implemented in the electronic device 100 having the above-described hardware structure. The following embodiment describes a specific implementation of the information processing method provided in the embodiment of the present application with reference to a structural diagram shown in fig. 4.
This embodiment of the application provides an information processing method performed by an electronic device, where the processor in the electronic device includes an application processor 210 and a coprocessor 220, and the coprocessor 220 may always run at a relatively low operating frequency so that it remains in a low-power-consumption operating state. Referring to fig. 5a, the specific process of the information processing method includes:
In step 301a, when the application processor 210 of the electronic device is in a sleep state during a first time period, the coprocessor 220 acquires, during the first time period, the environmental data collected by the low-power-consumption normally-open devices and the service data from the application processor 210.
The low-power-consumption normally-open devices mainly refer to devices such as the sensor, the low-power-consumption normally-open camera, the low-power-consumption normally-open microphone, and the GPS/WiFi/modem. The environmental data may refer to positioning data collected by the GPS/WiFi/modem, light brightness data collected by the sensor, images collected by the low-power-consumption normally-open camera, audio data collected by the low-power-consumption normally-open microphone, and the like. In addition, different service scenarios generate different service data, and the service data may be dotting (event-logging) records of the mobile terminal: for example, a dotting record generated at the moment a user opens an application or a particular service is detected in use; a dotting record generated when a user books a ride or navigates to a certain place; or a dotting record generated at the moment a user pays with a particular bank card. In this embodiment of the invention, the coprocessor 220 processes the acquired environmental data and service data in real time, regardless of the state of the application processor. Acquiring and processing the data while the application processor is asleep is only a special case; in the application processor's other states, the coprocessor continues to perform AI operations in real time unless specifically instructed otherwise by the application processor. Real environmental conditions and user intent are thus fed back in time, and recommendations or guidance are given according to a preset algorithm, providing value-added services to the user.
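As an illustration of what such a dotting record and buffer might look like, here is a minimal Python sketch; the field names and buffer representation are assumptions, since the application does not define a record format.

```python
import time
from dataclasses import dataclass, field

@dataclass
class DottingRecord:
    event: str       # e.g. "open_app", "book_ride", "card_payment"
    detail: str      # e.g. app name, destination, card identifier
    timestamp: float = field(default_factory=time.time)

buffer = []          # stands in for the shared cache region
buffer.append(DottingRecord("open_app", "Himalaya FM"))
buffer.append(DottingRecord("navigate", "World Trade Building"))

# The coprocessor would read such records in real time and feed them,
# together with the environmental data, into the AI algorithm.
for rec in buffer:
    print(rec.event, rec.detail, int(rec.timestamp))
```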
In step 302a, the coprocessor 220 performs an artificial intelligence operation on the environmental data and the service data using an artificial intelligence algorithm, and generates an operation result.
Step 303a, when the operation result meets the preset condition, the coprocessor 220 reports an event to the application processor 210, where the event includes the operation result.
In step 304a, according to the received event, the application processor 210 displays the operation result as recommendation information on the interface for the user to select.
That is to say, the coprocessor 220 acquires, in real time, the environmental data collected by the low-power-consumption normally-open devices, acquires the service data from the application processor 210, and invokes the AI algorithm to perform an AI operation on the collected environmental data and service data, generating a prediction result. In the case of a specific event (for example, the operation result being generated), the coprocessor 220 reports the operation result as an event to the application processor 210 (as an example, the coprocessor may generate an AI message at the application layer and report it to the main controller, i.e., the application processor). The application processor 210 (main controller) opens the corresponding application function according to the received AI message.
In this embodiment of the application, the application processor 210 may be responsible for running the various applications of the electronic device, including but not limited to: the user interface (UI), the human-machine interaction interface, face recognition, environment recognition, automatic screen on/off, and the like. When there is no service, the application processor 210 sleeps normally in standby and enters a low-power-consumption mode; it is awakened only after the coprocessor 220 reports an event. The application processor 210 may implement various highlight service functions according to the product's service requirements, or transmit the event message to other related service modules, which complete the final processing. As an example, the application processor 210 receives the prediction result reported by the coprocessor 220, runs the application corresponding to the prediction result, and displays the prediction result as recommendation information on the application's interface.
In this embodiment of the application, since the coprocessor 220 consumes little power during operation, real-time computation can be achieved, and the application processor 210 is woken only after receiving an event reported by the coprocessor 220, so the application processor's power consumption is not greatly affected. The coprocessor can perform AI operations while the application processor is in sleep or low-power-consumption mode, ensuring the real-time nature of the computation: every action of the user and every changed environmental factor is learned and computed on, each influencing the subsequent AI operation result (such as recommended content) in real time, so the user obtains feedback promptly. The method provided in this embodiment frees the electronic device from depending on specific user operations; it can automatically sense changes in the user's intent, expression, and environment in real time, provide the user with seamlessly perceived application services, make the electronic device more intelligent and its interaction more natural, and improve the efficiency of human-machine interaction. In this embodiment of the invention, the user's previous action immediately affects the current result, and the AI operation is performed in real time without waiting for the system to become idle. Because the coprocessor performs AI operations 7×24, large-scale data iteration is split into small data computations, greatly reducing consumption; the data can be processed in real time and the AI operation result provided for the user, which greatly improves immediacy and provides more value-added services to the user.
On the other hand, the present application further provides an information processing method; its specific process, shown in fig. 5b, may include:
Step 301b: the application processor 210 of the electronic device detects a first operation of the user; in response to the first operation, the application processor generates service data related to the first operation and saves it, for example, in a buffer.
Step 302b: the application processor 210 in the electronic device sends an instruction to the coprocessor 220, instructing the coprocessor to report recommendation information according to the service data. The instruction may be issued immediately after the user's first operation.
Step 303b: after receiving the instruction for reporting recommendation information from the application processor 210, the coprocessor 220 acquires the service data (for example, from the buffer) and the environmental data collected by the low-power-consumption normally-open devices within a set duration, where the set duration is determined according to the time information of the service data.
The specific contents of the environment data and the service data may be as described in step 301a above.
Step 304b: the coprocessor of the electronic device runs an AI algorithm on the service data and the environmental data to generate a prediction result.
The AI algorithm may include a recommendation algorithm, a collaborative filtering algorithm, a clustering algorithm, and the like.
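To make one of these concrete, the toy user-based collaborative filtering sketch below scores unseen applications for a target user from the usage of similar users; the data and weighting scheme are invented for illustration.

```python
import math

usage = {                       # user -> {app: usage frequency} (invented)
    "u1": {"phone": 5, "fm": 1, "maps": 3},
    "u2": {"phone": 4, "fm": 2, "maps": 3},
    "u3": {"fm": 5, "music": 4},
}

def cosine(a, b):
    """Cosine similarity between two sparse usage vectors."""
    dot = sum(a.get(k, 0) * b.get(k, 0) for k in set(a) | set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(target, top_k=2):
    """Score apps the target has not used, weighted by user similarity."""
    scores = {}
    for user, apps in usage.items():
        if user == target:
            continue
        sim = cosine(usage[target], apps)
        for app, freq in apps.items():
            if app not in usage[target]:
                scores[app] = scores.get(app, 0.0) + sim * freq
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

print(recommend("u1"))  # -> ['music']
```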
Step 305b, when the operation result meets the preset condition, the coprocessor 220 reports an event to the application processor 210, where the event includes the operation result.
In step 306b, according to the received event, the application processor 210 displays the operation result as recommendation information on the interface for the user to select.
In this embodiment of the invention, the user's previous action immediately affects the current result, and the AI operation is performed in real time without waiting for the system to become idle. When the user asks to view the recommendation result, the preceding operation affects the currently displayed recommendation. Because the coprocessor performs the AI operations 7×24, large-scale data iteration is split into small data computations, greatly reducing consumption; the data can be processed in real time and the AI operation result provided for the user, which greatly improves immediacy, raises interaction efficiency, increases the terminal's intelligence, and provides more value-added services to the user.
For example, referring to fig. 6, the handset detects that an earphone has been inserted into the earphone jack, and the user then opens the global search interface, which includes recommended applications the user may use, as shown in fig. 6a. If the user does not find the desired application in this interface, he can enter a keyword in the search bar; for example, as shown in fig. 6b, he enters FM in the search bar and downloads and installs Himalaya FM. If, after installing the application, the user does not run it but exits the global search or opens another application, then the next time the handset detects the earphone and the user opens the global search interface (an action that may occur immediately after exiting the global search or opening another application), the recommended applications displayed on the interface include applications the user may use but do not include Himalaya FM, as shown in fig. 6c. In contrast, if the user had installed and started Himalaya FM, as shown in fig. 6d, then the next time the user opens the global search interface after exiting the Himalaya FM application (again, this action may occur immediately after exiting Himalaya FM), Himalaya FM is included among the recommended applications, as shown in fig. 6e.
That is, the application processor of the handset records each operation of the user, generating and recording the service data corresponding to each operation. In the example above, the application processor records operations such as the user's FM search, the FM installation, and whether Himalaya FM was run as service data and stores them in the buffer; the coprocessor 220 of the handset then acquires the service data in real time (for example, from the buffer), runs the AI algorithm related to the recommendation service on the service data, and generates a prediction result in real time. The coprocessor then sends the prediction result to the application processor. Assuming the prediction result generated by the coprocessor indicates a high probability that the user will use Himalaya FM again in the future, after the application processor obtains the prediction result from the coprocessor, it notifies the window manager in the application layer to adjust the display information of the global search interface, and the next time the handset detects an earphone inserted into the earphone jack, the Himalaya FM application is immediately displayed in the application recommendation column, as shown in fig. 6e. In this way, each action of the user can influence the recommended content, the influence is real-time, and the user obtains feedback promptly; the method is therefore far superior in timeliness to the idle-time computation of the prior art, can predict the user's intent in time, and provides more intelligent and valuable recommendation information to the user.
In the above method, the electronic device also combines the environmental data collected by the peripheral low-power-consumption normally-open devices to comprehensively judge the prediction result. Illustratively, the handset detects that an earphone has been inserted into the earphone jack. If, in the first period of time, the user starts the phone application and the Himalaya FM application, the application processor of the handset records the user's operations, such as the call and the running of Himalaya FM, as service data and stores them in the buffer. Suppose the coprocessor of the handset runs the AI algorithm related to the recommendation service on the service data, and the resulting prediction shows a probability of 0.4 that the user will run the phone application at the next moment and a probability of 0.6 that the user will run the Himalaya FM application at the next moment. The coprocessor further acquires the data collected by low-power-consumption normally-open devices such as GPS and WiFi and determines the user's current position. If it judges that the user is currently at the office, the coprocessor adjusts the prediction result according to the user's current position: it raises the probability that the user will run the phone application at the next moment to 0.6, lowers the probability for the Himalaya FM application to 0.4, and reports the prediction result as an event to the application processor. After the application processor obtains the prediction result from the coprocessor, it notifies the window manager in the application layer to adjust the display information of the global search interface, and the next time the handset detects an earphone inserted into the earphone jack, the phone application is displayed in the application recommendation column.
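The location-based adjustment just described can be sketched as a simple re-weighting and renormalization of the predicted probabilities. The boost factors below are invented; the application only states that the position information is used to comprehensively judge the prediction.

```python
def adjust_by_location(probs: dict, location: str) -> dict:
    """Re-weight predicted app probabilities by the user's location."""
    boosts = {                                   # assumed factors
        "company": {"phone": 1.5, "himalaya_fm": 0.67},
        "home":    {"phone": 0.67, "himalaya_fm": 1.5},
    }
    weighted = {app: p * boosts.get(location, {}).get(app, 1.0)
                for app, p in probs.items()}
    total = sum(weighted.values())
    return {app: round(w / total, 2) for app, w in weighted.items()}

raw = {"phone": 0.4, "himalaya_fm": 0.6}
print(adjust_by_location(raw, "company"))
# -> {'phone': 0.6, 'himalaya_fm': 0.4}, matching the example above
```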
Further, as shown in fig. 7, the specific implementation of step 304b is that the coprocessor converts the environmental data and the service data into feature vectors, forms a feature matrix after discretization and normalization, substitutes the feature matrix as an input parameter into the AI algorithm corresponding to the service, and iteratively generates the prediction result. A feature vector is a floating-point number or a set of floating-point numbers used to represent a feature. For example, the feature vector used to characterize time is a floating-point number between 0.0 and 23.0, and the feature vector characterizing geographic location is composed of longitude and latitude. For the WiFi-signal feature, the coprocessor needs to convert the WiFi signals into a feature vector: it fixedly selects the service set identifiers (SSIDs) of the n (n > 1) WiFi signals with the highest signal strength, and then converts the n SSIDs into numbers through a hash algorithm to obtain n feature vectors. For example, in fig. 8, the training-data portion is a feature matrix generated by the coprocessor; the coprocessor substitutes the feature matrix as an input parameter into the AI algorithm corresponding to the service and generates probability values for three possible behaviors, where y1 is the first possible behavior with probability 0.67, y2 the second with probability 0.22, and y3 the third with probability 0.11.
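A minimal sketch of this feature construction follows: the time-of-day feature is normalized, latitude/longitude are scaled, and the n strongest SSIDs are hashed to numbers. The hash modulus and scaling constants are assumptions; the softmax at the end shows how three behavior probabilities of roughly 0.67, 0.22, and 0.11 (as in the y1/y2/y3 example) could come out of a model's raw scores.

```python
import hashlib
import math

def time_feature(hour: float) -> float:
    return hour / 23.0                    # scale 0.0-23.0 into [0, 1]

def position_features(lat: float, lon: float) -> list:
    return [lat / 90.0, lon / 180.0]      # scale latitude and longitude

def wifi_features(ssids: list, n: int = 3) -> list:
    """Hash the n strongest SSIDs to numbers in [0, 1)."""
    feats = [(int(hashlib.md5(s.encode()).hexdigest(), 16) % 10_000) / 10_000.0
             for s in ssids[:n]]
    return feats + [0.0] * (n - len(feats))   # pad if fewer than n networks

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

row = ([time_feature(20.0)] + position_features(39.9, 116.4)
       + wifi_features(["office-5G", "guest", "printer"]))
print(row)                        # one row of the feature matrix
print(softmax([1.8, 0.7, 0.0]))   # ~[0.67, 0.22, 0.11] for y1, y2, y3
```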
In one possible design, in addition to using the service data and the environmental data, the coprocessor may perform the AI operation in combination with user portrait data in a database in memory. For example, a user's portrait carries labels such as: city Beijing, male, company in the World Trade Building, favorite articles men's shoes and sports shoes, favorite brands Nike and Adidas. In addition, the coprocessor can also perform the AI operation in combination with scenario intelligence data acquired from the database in memory. Scenario intelligence manages and arranges the user's daily life and shows reminders in card form through an intelligent engine service; for example, at 20:00 the scenario intelligence data generates an alarm clock card reminding the user to set a travel alarm, with the recommended alarm time 3 hours before takeoff. The user portrait data and scenario intelligence data allow the coprocessor to further comprehensively judge the prediction result of the AI algorithm and improve its accuracy.
In summary, with the information processing method, this embodiment of the application can perform real-time information prediction according to the service data generated by the user's current operation, and comprehensively judge the prediction result in combination with the environmental data, the user portrait data, and the like. The method can automatically sense changes in the user's intent, expression, and environment in real time, provide the user with seamlessly perceived application services, make the electronic device more intelligent and its interaction more natural, and improve the efficiency of human-machine interaction.
An embodiment of the present application further provides a computer-readable storage medium, where the computer-readable storage medium includes a computer program, and when the computer program runs on an electronic device, the electronic device is caused to execute any one of the possible implementations of the information processing method.
Embodiments of the present application further provide a computer program product, which, when running on an electronic device, enables the electronic device to execute any one of the possible implementations of the information processing method.
In some embodiments of the present application, an information processing apparatus is disclosed, as shown in fig. 8, for implementing the methods described in the above method embodiments, including: a main processing module 801, a co-processing module 802, a transceiver module 803, and a display module 804.
The main processing module 801 is configured to support the electronic device in executing the method steps on the application processor side, for example, step 301b in fig. 5b; the co-processing module 802 is configured to support the electronic device in executing the method steps on the coprocessor side, for example, steps 303b and 304b in fig. 5b; and the transceiver module 803 is configured to support the main processing module 801 in sending instructions to the co-processing module 802 and in receiving events reported by the co-processing module. The display module 804 is configured to support the electronic device in displaying the recommendation information, for example, executing step 306b in fig. 5b. For all relevant details of the steps in the above method embodiments, refer to the functional descriptions of the corresponding functional modules, which are not repeated here.
Alternatively, the co-processing module 802 is configured to support the electronic device in executing the method steps on the coprocessor side, for example, steps 301a to 303a in fig. 5a; the transceiver module 803 is configured to support the co-processing module 802 in reporting events to the main processing module 801; and the main processing module 801 is configured to receive the events reported by the co-processing module 802. The display module 804 is configured to support the electronic device in displaying the recommendation information, for example, executing step 304a in fig. 5a. For all relevant details of the steps in the above method embodiments, refer to the functional descriptions of the corresponding functional modules, which are not repeated here.
In other embodiments of the present application, an electronic device is disclosed, as shown in fig. 9. The electronic device may include: an application processor 901; a coprocessor 905; a memory 902; a display 903; one or more applications (not shown); and one or more computer programs 904. These components may be connected by one or more communication buses 906.
Wherein the one or more computer programs 904 are stored in the memory 902 and configured to be executed by the application processor 901 and the co-processor 905, the one or more computer programs 904 comprising instructions which may be used to perform the steps as in the corresponding embodiment of fig. 5 b. Specifically, application processor 901 is configured to perform steps 301b and 302b in fig. 5b, coprocessor 905 is configured to perform steps 303b through 305b in fig. 5b, and display 903 is configured to perform step 306b in fig. 5 b.
Alternatively, the coprocessor 905 is configured to perform steps 301a to 303a in fig. 5a, the application processor 901 is configured to receive the events reported by the coprocessor 905, and the display 903 is configured to perform step 304a in fig. 5a according to the instruction of the application processor 901.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or make a contribution to the prior art, or all or part of the technical solutions may be implemented in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: flash memory, removable hard drive, read only memory, random access memory, magnetic or optical disk, and the like.
The above description is only a specific implementation of the embodiments of the present application, but the scope of the embodiments of the present application is not limited thereto, and any changes or substitutions within the technical scope disclosed in the embodiments of the present application should be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. An information processing method, applied to an electronic device comprising an application processor and a coprocessor, comprising the following steps:
the electronic device receives a user input for triggering a recommendation function; in response to the user input, the application processor sends an instruction for reporting recommendation information to the coprocessor;
according to the instruction, the coprocessor acquires, while the application processor of the electronic device is in a dormant state, service data generated by the application processor and environmental data generated by a low-power-consumption normally-open device of the electronic device, wherein the service data comprises service data generated by the last user operation before the instruction;
while the application processor of the electronic device is in a dormant state, the coprocessor performs an artificial intelligence operation on the service data, the environmental data, user portrait data, and scenario intelligence data using an artificial intelligence algorithm, to generate an operation result;
and when the operation result meets a preset condition, the coprocessor reports the operation result to the application processor, so that the application processor displays the operation result as recommendation information.
2. The method of claim 1, wherein the artificial intelligence algorithm is solidified in hardware of the co-processor.
3. An information processing method applied to an electronic device comprising an application processor and a coprocessor, comprising:
the coprocessor of the electronic device receives, from the application processor, an instruction for reporting recommendation information, wherein the instruction is generated by the application processor in response to a user input for triggering a recommendation function;
the coprocessor of the electronic device acquires environmental data collected by a low-power-consumption normally-open device in a first time period, the application processor of the electronic device being in a dormant state in the first time period;
the coprocessor of the electronic device acquires service data from the application processor in the first time period;
the coprocessor of the electronic device performs an artificial intelligence operation using an artificial intelligence algorithm on the environmental data, the service data, user portrait data, and scenario intelligence data in the first time period, to generate an operation result;
and when the operation result meets a preset condition, the coprocessor of the electronic device reports the operation result to the application processor, so as to wake the application processor and display the operation result as recommendation information.
4. The method of claim 3, wherein the artificial intelligence algorithm is solidified in hardware of the co-processor.
5. A computer storage medium having computer-executable instructions stored thereon for causing a computer to perform the method of any one of claims 1 to 4.
6. A computer program product, which, when executed by a computer, causes the computer to carry out the method of any one of claims 1 to 4.
7. An electronic device, characterized by comprising an application processor, a coprocessor, one or more memories, a display, and a low-power-consumption normally-open device;
wherein the coprocessor is coupled to the application processor, and the memory is used for storing data and instructions for being called by the coprocessor and the application processor;
the application processor is configured to receive a user input for triggering a recommendation function and, in response to the user input, send an instruction for reporting recommendation information to the coprocessor;
the coprocessor is configured to: receive the instruction for reporting recommendation information from the application processor of the electronic device; acquire, according to the instruction and while the application processor is in a dormant state, service data generated by the application processor and environmental data generated by the low-power-consumption normally-open device; perform, while the application processor is in a dormant state, an artificial intelligence operation on the service data, the environmental data, user portrait data, and scenario intelligence data using an artificial intelligence algorithm to generate an operation result; and, when the operation result meets a preset condition, report the operation result to the application processor so that the application processor displays the operation result as recommendation information;
the application processor is further configured to receive the operation result reported by the coprocessor;
and the display is configured to display the operation result as recommendation information according to the instruction of the application processor.
8. The electronic device of claim 7, wherein the application processor is specifically configured to:
receive a user input for triggering a recommendation function; and, in response to the user input, send an instruction for reporting recommendation information to the coprocessor.
9. The electronic device of any of claims 7-8, wherein the coprocessor is specifically configured to:
and carrying out artificial intelligence operation by using an artificial intelligence algorithm solidified in the hardware of the coprocessor.
10. An electronic device, characterized by comprising an application processor, a coprocessor, one or more memories, a display, and a low-power-consumption normally-open device;
wherein the coprocessor is coupled to the application processor and the memory is configured to store data and instructions for invocation by the coprocessor and the application processor;
the application processor is configured to receive a user input for triggering a recommendation function and, in response to the user input, send an instruction for reporting recommendation information to the coprocessor;
the coprocessor is used for receiving a recommendation information reporting instruction from an application processor of the electronic equipment; the application processor is used for acquiring environmental data collected by the low-power normally-open device in a first period of time, and is in a dormant state in the first period of time; acquiring service data from the application processor in a first time period; in a first period, carrying out artificial intelligence operation according to the environment data, the service data, the user portrait data and the scene intelligence data to generate an operation result; when the operation result meets a preset condition, reporting the operation result to the application processor so as to remind the application processor to display the operation result as recommended information;
the application processor is used for receiving the operation result reported by the coprocessor;
and the display is configured to display the operation result as recommendation information according to the instruction of the application processor.
11. The electronic device of claim 10, wherein the coprocessor is specifically configured to:
and carrying out artificial intelligence operation by using an artificial intelligence algorithm solidified in the hardware of the coprocessor.
CN201880072215.0A 2018-10-16 2018-10-16 Information processing method and electronic equipment Active CN111316199B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/110510 WO2020077540A1 (en) 2018-10-16 2018-10-16 Information processing method and electronic device

Publications (2)

Publication Number Publication Date
CN111316199A CN111316199A (en) 2020-06-19
CN111316199B true CN111316199B (en) 2022-08-19

Family

ID=70283351

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880072215.0A Active CN111316199B (en) 2018-10-16 2018-10-16 Information processing method and electronic equipment

Country Status (2)

Country Link
CN (1) CN111316199B (en)
WO (1) WO2020077540A1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111752713B (en) 2020-06-28 2022-08-05 浪潮电子信息产业股份有限公司 Method, device and equipment for balancing load of model parallel training task and storage medium
CN114222020B (en) * 2020-09-03 2022-11-25 华为技术有限公司 Position relation identification method and device and readable storage medium
WO2022047808A1 (en) * 2020-09-07 2022-03-10 华为技术有限公司 Image processing apparatus, electronic device, and image processing method
CN114466308B (en) * 2020-10-22 2023-10-10 华为技术有限公司 Positioning method and electronic equipment
CN112561047B (en) * 2020-12-22 2023-04-28 上海壁仞智能科技有限公司 Apparatus, method and computer readable storage medium for processing data
CN116133091A (en) * 2021-11-12 2023-05-16 华为技术有限公司 Message processing method and device and first electronic equipment
CN113886196B (en) * 2021-12-07 2022-03-15 上海燧原科技有限公司 On-chip power consumption management method, electronic device and storage medium
CN114861152A (en) * 2022-05-31 2022-08-05 Oppo广东移动通信有限公司 Method and device for processing biological characteristic information, electronic equipment and storage medium
CN118276665A (en) * 2022-12-30 2024-07-02 华为技术有限公司 Display method and communication device of intelligent watch and intelligent watch
CN116795628B (en) * 2023-05-24 2024-05-14 荣耀终端有限公司 Power consumption processing method of terminal equipment, terminal equipment and readable storage medium
CN117692998A (en) * 2023-07-27 2024-03-12 荣耀终端有限公司 Data acquisition method under abnormal dormancy condition and electronic equipment
CN116761207B (en) * 2023-08-22 2023-12-15 杭州纵横通信股份有限公司 User portrait construction method and system based on communication behaviors

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104516472A (en) * 2013-09-29 2015-04-15 联想(北京)有限公司 Processor and data processing method
CN104809501A (en) * 2014-01-24 2015-07-29 清华大学 Computer system based on brain-like coprocessor
CN106407364A (en) * 2016-09-08 2017-02-15 北京百度网讯科技有限公司 Information recommendation method and apparatus based on artificial intelligence
CN107094181A (en) * 2017-05-27 2017-08-25 广东欧珀移动通信有限公司 Information output method and related product
CN108197327A (en) * 2018-02-07 2018-06-22 腾讯音乐娱乐(深圳)有限公司 Song recommendations method, apparatus and storage medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9733392B2 (en) * 2008-06-27 2017-08-15 Deep Sciences, LLC Methods of using environmental conditions in sports applications
CN202391475U (en) * 2011-12-23 2012-08-22 北京中矿华沃科技股份有限公司 Portable safety recorder
CN104656873B (en) * 2013-11-25 2018-02-27 联想(北京)有限公司 A kind of information processing method and electronic equipment
KR102137097B1 (en) * 2014-08-21 2020-07-23 삼성전자주식회사 Processing Method for periodic event and Electronic device supporting the same
KR102299330B1 (en) * 2014-11-26 2021-09-08 삼성전자주식회사 Method for voice recognition and an electronic device thereof
CN107341201A (en) * 2017-06-20 2017-11-10 广东欧珀移动通信有限公司 Information-pushing method and Related product
CN107277904A (en) * 2017-07-03 2017-10-20 上海斐讯数据通信技术有限公司 A kind of terminal and voice awakening method
CN108057249B (en) * 2017-11-29 2020-07-24 腾讯科技(成都)有限公司 Service data processing method and device


Also Published As

Publication number Publication date
WO2020077540A1 (en) 2020-04-23
CN111316199A (en) 2020-06-19

Similar Documents

Publication Publication Date Title
CN111316199B (en) Information processing method and electronic equipment
KR102470275B1 (en) Voice control method and electronic device
CN109814766B (en) Application display method and electronic equipment
CN114816210B (en) Full screen display method and device of mobile terminal
CN111913750B (en) Application program management method, device and equipment
CN111602108B (en) Application icon display method and terminal
CN113254409A (en) File sharing method, system and related equipment
CN114095599B (en) Message display method and electronic equipment
CN113805797A (en) Network resource processing method, electronic device and computer readable storage medium
CN113641271A (en) Application window management method, terminal device and computer readable storage medium
CN112740148A (en) Method for inputting information into input box and electronic equipment
CN114650330A (en) Method, electronic equipment and system for adding operation sequence
CN114911400A (en) Method for sharing pictures and electronic equipment
WO2023207667A1 (en) Display method, vehicle, and electronic device
CN114444000A (en) Page layout file generation method and device, electronic equipment and readable storage medium
CN113380240B (en) Voice interaction method and electronic equipment
CN115421619A (en) Window display method and electronic equipment
CN114003241A (en) Interface adaptation display method and system of application program, electronic device and medium
CN114006976B (en) Interface display method and terminal equipment
CN115706753A (en) Application program management method and device and electronic equipment
CN117311484A (en) Method for adjusting power consumption of equipment and electronic equipment
CN114764316A (en) Focus synchronization method and electronic device
CN114490006A (en) Task determination method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant