WO2020077540A1 - Information processing method and electronic device

Information processing method and electronic device

Info

Publication number
WO2020077540A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
coprocessor
application processor
processor
artificial intelligence
Application number
PCT/CN2018/110510
Other languages
English (en)
Chinese (zh)
Inventor
潘尚斌
孙忠
李大伟
Original Assignee
Huawei Technologies Co., Ltd.
Application filed by Huawei Technologies Co., Ltd.
Priority to CN201880072215.0A (CN111316199B)
Priority to PCT/CN2018/110510
Publication of WO2020077540A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/26 Power supply means, e.g. regulation thereof
    • G06F 1/32 Means for saving power

Definitions

  • The present application relates to the field of terminal technologies, and in particular, to an information processing method and an electronic device.
  • At present, the artificial intelligence processing function is integrated in the Android operating system run by the application processor and is called by application programs at the application layer. Since the artificial intelligence processing function is triggered and executed based on a specific user action or an event received by the Android operating system, it is called by an application program only when a service requires it. Moreover, the application processor is limited by power consumption and is in a dormant state when there is no service, so the artificial intelligence processing function does not run continuously. It therefore cannot sense changes in user actions, behavioral intentions, or the environment in real time, and cannot run AI perception capabilities autonomously. Because it depends on specific actions or on calls from certain application modules, it is not highly intelligent and provides a poor user experience.
  • This application provides an information processing method and an electronic device for performing real-time artificial intelligence calculation on the electronic device side, so as to provide users with accurate recommendation information in a timely manner.
  • An embodiment of the present application provides an information processing method applied to an electronic device including an application processor and a coprocessor. The method includes: the coprocessor of the electronic device receives, from the application processor of the electronic device, an instruction for reporting recommendation information; the coprocessor then obtains, according to the instruction, the business data generated by the application processor and the environmental data collected by a low-power always-on device of the electronic device; the coprocessor then uses an artificial intelligence algorithm to perform artificial intelligence operations on the business data and the environmental data to generate an operation result; and when the operation result meets a preset condition, the coprocessor reports the operation result to the application processor, so that the application processor displays the operation result as recommendation information.
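  • To make the flow concrete, the following is a minimal C sketch of the coprocessor-side steps described above. All names (on_report_instruction, collect_business_data, collect_env_data, ai_infer, ap_report) and the threshold-style preset condition are illustrative assumptions; the embodiments do not define a concrete API.

```c
#include <stdio.h>

typedef struct { float features[4]; } BusinessData;  /* from the application processor */
typedef struct { float features[4]; } EnvData;       /* from low-power always-on devices */
typedef struct { int label; float score; } AiResult;

/* Stand-ins for the two data sources and the AI operation. */
static BusinessData collect_business_data(void) { BusinessData b = {{0.7f, 0, 0, 0}}; return b; }
static EnvData      collect_env_data(void)      { EnvData e = {{0.4f, 0, 0, 0}}; return e; }

static AiResult ai_infer(const BusinessData *b, const EnvData *e) {
    AiResult r = { 1, (b->features[0] + e->features[0]) / 2.0f };
    return r;
}

/* Report the operation result so the AP can display it as recommendation info. */
static void ap_report(const AiResult *r) {
    printf("report to AP: label=%d score=%.2f\n", r->label, r->score);
}

/* Entry point: runs when the coprocessor receives the instruction
   for reporting recommendation information from the AP. */
void on_report_instruction(float threshold) {
    BusinessData b = collect_business_data();
    EnvData      e = collect_env_data();
    AiResult     r = ai_infer(&b, &e);
    if (r.score >= threshold)      /* the preset condition */
        ap_report(&r);
}
```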
  • In this design, the application processor and the coprocessor are in a cooperative working mode: the coprocessor can report its processing result to the application processor within a prescribed time, or respond quickly to the requirements of the application processor, and it schedules all available resources to complete real-time tasks. All real-time tasks can therefore be controlled to run in harmony, with fast response and high reliability.
  • In this method, the coprocessor combines business data and environmental data to perform operations, and can sense changes in the user's intentions, expressions, and environment in real time. This provides users with the ability to seamlessly perceive application services, makes the electronic device more intelligent and the interaction more natural, and improves the efficiency of human-computer interaction.
  • In a possible design, when the application processor of the electronic device is in a sleep state, the coprocessor obtains environmental data collected by a low-power always-on device of the electronic device; while the application processor of the electronic device is in the dormant state, the coprocessor uses an artificial intelligence algorithm to perform artificial intelligence operations on the business data and the environmental data to generate an operation result.
  • In this design, the application processor is woken up only after receiving an event reported by the coprocessor, so the method does not have a great impact on the power consumption of the application processor. Overall, this method has a small effect on the overall power consumption of the electronic device.
  • the electronic device receives user input for triggering the recommendation function; in response to the user input, the application processor sends an instruction to report the recommendation information to the coprocessor.
  • In this design, after detecting the user input, the electronic device performs AI calculation on the business data generated by the user input and thereby updates the recommendation result, making the electronic device more intelligent and the interaction more natural, and improving the efficiency of human-computer interaction.
  • the artificial intelligence algorithm is solidified in the hardware of the coprocessor. In this way, not only can the calculation efficiency be improved, but also the power consumption generated during the calculation can be reduced to a certain extent.
  • An embodiment of the present application also provides an information processing method applied to an electronic device including an application processor and a coprocessor. The method includes: the coprocessor of the electronic device obtains, within a first period, environmental data collected by a low-power always-on device, where the application processor of the electronic device is in a sleep state during the first period; the coprocessor of the electronic device then obtains business data from the application processor at a first moment; the coprocessor of the electronic device then performs artificial intelligence operations using an artificial intelligence algorithm based on the environmental data and the business data to generate an operation result; finally, when the operation result meets a preset condition, the coprocessor of the electronic device reports the operation result to the application processor, so as to wake up the application processor to display the operation result as recommendation information.
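  • As an illustration only, the following C sketch shows the shape of this second method under simple assumptions: scalar environmental samples are buffered during the first period while the application processor sleeps, then fused with business data obtained at the first moment. The window size, the averaging step, and all names are hypothetical.

```c
#include <stdio.h>

#define ENV_WINDOW 32                    /* assumed size of the first-period buffer */

static float env_window[ENV_WINDOW];     /* environmental samples collected while
                                            the application processor is asleep */
static int   env_count = 0;

/* Stand-in for waking the AP and reporting the result as recommendation info. */
static void wake_and_report(float score) {
    printf("wake AP, recommend with score %.2f\n", score);
}

/* Called periodically during the first period (AP in sleep state). */
void on_env_sample(float sample) {
    env_window[env_count % ENV_WINDOW] = sample;
    env_count++;
}

/* Called at the "first moment", when business data arrives from the AP. */
void on_business_data(float business, float threshold) {
    int   n = env_count < ENV_WINDOW ? env_count : ENV_WINDOW;
    float env_avg = 0.0f;
    for (int i = 0; i < n; i++) env_avg += env_window[i];
    if (n > 0) env_avg /= (float)n;

    float score = 0.5f * business + 0.5f * env_avg;  /* stand-in AI operation */
    if (score >= threshold)                          /* the preset condition  */
        wake_and_report(score);
}
```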
  • the application processor and the coprocessor are in a cooperative working mode.
  • This method can make the electronic device independent of the user's specific operations: it can automatically sense the user's intentions, expressions, and environment changes in real time, and can provide the ability to seamlessly perceive application services, making the electronic device more intelligent and the interaction more natural, which can improve the efficiency of human-computer interaction.
  • the artificial intelligence algorithm is solidified in the hardware of the coprocessor. In this way, not only can the calculation efficiency be improved, but also the power consumption generated during the calculation can be reduced to a certain extent.
  • an embodiment of the present application provides an electronic device, including a processor and a memory.
  • the memory is used to store one or more computer programs; when the one or more computer programs stored in the memory are executed by the processor, the electronic device can implement any possible design method of any of the above aspects.
  • an embodiment of the present application further provides an apparatus.
  • the apparatus includes a module / unit that performs any possible design method of any one of the above aspects.
  • These modules / units can be implemented by hardware, and can also be implemented by hardware executing corresponding software.
  • a computer-readable storage medium is also provided in an embodiment of the present application.
  • The computer-readable storage medium includes a computer program, and when the computer program runs on an electronic device, the electronic device is caused to perform any possible design of any of the above aspects.
  • An embodiment of the present application further provides a computer program product that, when run on an electronic device, causes the electronic device to perform any possible design of any of the above aspects.
  • FIG. 1 is a schematic diagram of an applicable communication network interconnection scenario provided by an embodiment of this application;
  • FIG. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an Android operating system composition architecture provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of an RTOS system composition architecture provided by an embodiment of the present application.
  • FIG. 5a and FIG. 5b are schematic flowcharts of an information processing method provided by embodiments of the present application;
  • FIG. 6 is a schematic diagram of an interface change under a panoramic search service provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a prediction result generation process provided by an embodiment of this application.
  • FIG. 8 is a schematic structural diagram of an information prediction apparatus provided by an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • the information processing method provided in the embodiments of the present application may be applied to a scenario where multiple electronic devices 100 shown in FIG. 1 are interconnected based on a communication network.
  • The communication network may be a local area network, or a wide area network switched via a relay device.
  • The communication network may be a Wi-Fi hotspot network, a Wi-Fi P2P network, a Bluetooth network, a ZigBee network, a near field communication (NFC) network, or another short-distance communication network.
  • The communication network may also be a third-generation mobile communication technology (3G) network, a fourth-generation mobile communication technology (4G) network, a fifth-generation mobile communication technology (5G) network, a future public land mobile network (PLMN), the Internet, etc.
  • Different electronic devices can exchange data through the communication network, such as exchanging pictures, text, and videos, or exchanging the results of processing objects such as pictures, text, or videos.
  • The electronic device 100 shown in FIG. 1 may be a portable electronic device that also includes other functions such as a personal digital assistant function and/or a music player function, for example, a mobile phone, a tablet computer, or a wearable device with a wireless communication function (such as a smart watch).
  • Exemplary embodiments of portable electronic devices include, but are not limited to, portable electronic devices running various operating systems.
  • the above portable electronic device may also be other portable electronic devices, such as a laptop with a touch-sensitive surface (for example, a touch panel) or the like.
  • the electronic device 100 may not be a portable electronic device, but a desktop computer with a touch-sensitive surface (such as a touch panel).
  • the following uses the electronic device 100 as an example to specifically describe the embodiment.
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a USB interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, and a SIM card interface 195.
  • The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100.
  • The electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (application processor, AP), a coprocessor, a modem processor, and a graphics processor (graphics processing unit, GPU).
  • different processing units may be independent devices, or may be integrated in one or more processors.
  • The coprocessor integrates AI capabilities, runs continuously in a low-power mode, and detects whether the user's action intention or the surrounding environment of the device has changed; when a change is detected, it generates a corresponding event and reports it to the application processor.
  • the application processor is in a dormant state when there is no business.
  • When an event reported by the coprocessor is received, the application processor is woken up and runs the application program corresponding to the event.
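  • One way to picture this event path is sketched below. The message layout and the mailbox call are assumptions, since a real device would use a platform-specific mechanism (a mailbox, or shared memory plus an interrupt) to deliver the event and wake the application processor.

```c
#include <stdint.h>
#include <stdio.h>

typedef enum { EVT_USER_INTENT = 1, EVT_ENV_CHANGE = 2 } AiEventType;

typedef struct {
    AiEventType type;
    uint32_t    timestamp_ms;
    float       confidence;
} AiEventMsg;

/* Stub standing in for a platform IPC primitive; sending the message would
   raise an interrupt on the AP side, waking it if it is dormant. */
static void ipc_mailbox_send(const void *msg, uint32_t len) {
    (void)msg;
    printf("event sent (%u bytes); a dormant AP is woken by the interrupt\n",
           (unsigned)len);
}

/* Called by the coprocessor when it detects a change in user intent
   or in the surrounding environment. */
void report_ai_event(AiEventType type, float confidence, uint32_t now_ms) {
    AiEventMsg msg = { type, now_ms, confidence };
    ipc_mailbox_send(&msg, (uint32_t)sizeof msg);
}
```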
  • the controller may be the nerve center and command center of the electronic device 100.
  • the controller can generate the operation control signal according to the instruction operation code and the timing signal to complete the control of fetching instructions and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 110 is a cache memory.
  • The memory may store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses.
  • the processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the electronic device 100.
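  • For illustration, the sketch below reads one status byte from a touch controller using the Linux userspace i2c-dev interface; the bus number, the 0x38 device address, and the register are hypothetical, since the embodiments do not name a specific controller.

```c
#include <fcntl.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/i2c-dev.h>

int main(void) {
    int fd = open("/dev/i2c-1", O_RDWR);           /* one of the I2C buses */
    if (fd < 0) { perror("open"); return 1; }

    if (ioctl(fd, I2C_SLAVE, 0x38) < 0) {          /* 0x38: assumed sensor address */
        perror("ioctl"); close(fd); return 1;
    }

    unsigned char reg = 0x00;                      /* assumed status register */
    unsigned char status;
    if (write(fd, &reg, 1) == 1 && read(fd, &status, 1) == 1)
        printf("touch status: 0x%02x\n", status);  /* SDA carries the data,
                                                      SCL clocks the transfer */
    close(fd);
    return 0;
}
```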
  • the I2S interface can be used for audio communication.
  • the processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, to realize the function of answering the phone call through the Bluetooth headset.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface to realize the function of answering the call through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 to peripheral devices such as the display screen 194 and the camera 193.
  • The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), and so on.
  • the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the electronic device 100.
  • the processor 110 and the display screen 194 communicate through the DSI interface to realize the display function of the electronic device 100.
  • the GPIO interface can be configured via software.
  • the GPIO interface can be configured as a control signal or a data signal.
  • the GPIO interface may be used to connect the processor 110 to the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
  • the USB interface can be used to connect a charger to charge the electronic device 100, and can also be used to transfer data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present invention is only a schematic description, and does not constitute a limitation on the structure of the electronic device 100.
  • the electronic device 100 may also use different interface connection methods in the foregoing embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface.
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the electronic device 100. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and / or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the power management module 141 may also be disposed in the processor 110.
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the electronic device 100 can be realized by the antenna module 1, the antenna module 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the cellular antenna can be multiplexed as a wireless LAN diversity antenna. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide a wireless communication solution including 2G / 3G / 4G / 5G and the like applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (Low Noise Amplifier, LNA), etc.
  • the mobile communication module 150 can receive the electromagnetic wave from the antenna 1, filter and amplify the received electromagnetic wave, and transmit it to the modem processor for demodulation.
  • The mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves for radiation through the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be transmitted into a high-frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 170A, a receiver 170B, etc.), or displays an image or video through a display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110, and may be set in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area network (WLAN), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives the electromagnetic wave via the antenna 2, frequency-modulates and filters the electromagnetic wave signal, and sends the processed signal to the processor 110.
  • the wireless communication module 160 may also receive the signal to be transmitted from the processor 110, frequency-modulate it, amplify it, and convert it to electromagnetic waves through the antenna 2 to radiate it out.
  • the antenna 1 of the electronic device 100 and the mobile communication module 150 are coupled, and the antenna 2 and the wireless communication module 160 are coupled so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
  • the electronic device 100 realizes a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, connecting the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations, and is used for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the electronic device 100 may include 1 or N display screens, where N is a positive integer greater than 1.
  • the electronic device 100 can realize a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP processes the data fed back by the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the camera photosensitive element through the lens, and the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, which is converted into an image visible to the naked eye.
  • The ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image, and can also optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be disposed in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the electronic device 100 may include 1 or N cameras, where N is a positive integer greater than 1.
  • The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the energy at that frequency point.
  • Video codec is used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in various encoding formats, for example: MPEG1, MPEG2, MPEG3, MPEG4, and so on.
  • NPU is a neural-network (NN) computing processor.
  • the NPU can realize applications such as intelligent recognition of the electronic device 100, such as image recognition, face recognition, voice recognition, and text understanding.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121.
  • the memory 121 may include a storage program area and a storage data area.
  • The storage program area may store an operating system, application programs required by at least one function (such as a sound playback function, an image playback function, etc.), and so on.
  • the storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100 and the like.
  • the memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and so on.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, and an application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and also used to convert analog audio input into digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
  • The speaker 170A, also called a "horn", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • The receiver 170B, also known as an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be received by bringing the receiver 170B close to the ear.
  • The microphone 170C, also known as a "mic" or "mouthpiece", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones, which, in addition to collecting sound signals, can also implement a noise reduction function. In still other embodiments, the electronic device 100 may be provided with three, four, or more microphones to collect sound signals, reduce noise, identify sound sources, implement directional recording functions, and so on.
  • the headset interface 170D is used to connect wired headsets.
  • The headphone jack may be a USB interface, or a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be provided on the display screen 194.
  • The capacitive pressure sensor may include at least two parallel plates having conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position based on the detection signal of the pressure sensor 180A.
  • Touch operations that act on the same touch position but have different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
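  • Expressed as code, the dispatch above might look like the following C sketch; the normalized intensity scale, the threshold value, and the action names are illustrative assumptions.

```c
#include <stdio.h>

#define FIRST_PRESSURE_THRESHOLD 0.5f   /* assumed normalized intensity */

/* Same touch position (the short message application icon),
   different intensities -> different operation instructions. */
void on_sms_icon_touch(float intensity) {
    if (intensity < FIRST_PRESSURE_THRESHOLD)
        printf("execute: view the short message\n");
    else
        printf("execute: create a new short message\n");
}
```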
  • the gyro sensor 180B may be used to determine the movement posture of the electronic device 100.
  • In some embodiments, the angular velocity of the electronic device 100 around three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for shooting anti-shake.
  • the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate based on the angle, and allows the lens to counteract the shaking of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude by using the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
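  • The embodiments do not state which formula converts pressure to altitude; one common choice is the international barometric formula, sketched here with a standard sea-level pressure of 1013.25 hPa.

```c
#include <math.h>
#include <stdio.h>

/* International barometric formula: altitude in meters from pressure in hPa. */
double altitude_m(double pressure_hpa, double sea_level_hpa) {
    return 44330.0 * (1.0 - pow(pressure_hpa / sea_level_hpa, 1.0 / 5.255));
}

int main(void) {
    /* 954.6 hPa corresponds to roughly 500 m above sea level. */
    printf("%.1f m\n", altitude_m(954.6, 1013.25));
    return 0;
}
```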
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can detect the opening and closing of the flip holster using the magnetic sensor 180D.
  • In some embodiments, the electronic device 100 may detect the opening and closing of a clamshell according to the magnetic sensor 180D, and characteristics such as automatic unlocking upon opening the flip cover can be set based on the detected opening and closing state of the holster or the clamshell.
  • the acceleration sensor 180E can detect the magnitude of acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to recognize the posture of electronic devices, and be used in applications such as horizontal and vertical screen switching and pedometers.
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 can measure the distance by infrared or laser. In some embodiments, when shooting scenes, the electronic device 100 may use the distance sensor 180F to measure distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light outward through the light emitting diode.
  • the electronic device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100.
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense the brightness of ambient light.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived brightness of the ambient light.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, access to application locks, fingerprint taking pictures, fingerprint answering calls, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • The electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown of the electronic device 100 caused by low temperature. In still other embodiments, when the temperature is lower than yet another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
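  • A compact sketch of such a strategy follows; the two thresholds and the three actions are assumptions standing in for platform-specific power-management calls.

```c
#include <stdio.h>

#define T_HOT_C  45   /* assumed upper threshold */
#define T_COLD_C  0   /* assumed lower threshold */

/* Stubs standing in for platform power-management operations. */
static void reduce_processor_performance(void) { printf("throttle processor\n"); }
static void heat_battery(void)                 { printf("heat battery 142\n"); }
static void boost_battery_output_voltage(void) { printf("boost battery output\n"); }

/* Called with each temperature reading from temperature sensor 180J. */
void apply_thermal_policy(int temp_c) {
    if (temp_c > T_HOT_C) {
        reduce_processor_performance();  /* thermal protection */
    } else if (temp_c < T_COLD_C) {
        heat_battery();                  /* avoid abnormal low-temperature shutdown */
        boost_battery_output_voltage();
    }
}
```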
  • The touch sensor 180K is also known as a "touch panel". It may be disposed on the display screen 194 and is used to detect touch operations on or near it. The detected touch operation may be passed to the application processor to determine the type of touch event, and corresponding visual output may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display screen 194.
  • The bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 180M can also contact the human pulse and receive the blood pressure beating signal. In some embodiments, the bone conduction sensor 180M may also be disposed in an earphone.
  • the audio module 170 may parse out the voice signal based on the vibration signal of the vibrating bone block of the voice part acquired by the bone conduction sensor 180M to realize the voice function.
  • the application processor may analyze the heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M to implement the heart rate detection function.
  • the key 190 includes a power-on key, a volume key, and the like.
  • The keys may be mechanical keys or touch keys.
  • the electronic device 100 can receive key input and generate key signal input related to user settings and function control of the electronic device 100.
  • the motor 191 may generate a vibration prompt.
  • the motor 191 can be used for vibration notification of incoming calls and can also be used for touch vibration feedback.
  • Touch operations applied to different applications may correspond to different vibration feedback effects. The motor 191 can also produce different vibration feedback effects for different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.). The touch vibration feedback effect can also be customized.
  • the indicator 192 may be an indicator light, which may be used to indicate a charging state, a power change, and may also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a subscriber identity module (subscriber identity module, SIM).
  • the SIM card can be inserted into or removed from the SIM card interface to achieve contact and separation with the electronic device 100.
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, etc. Multiple cards can be inserted simultaneously in the same SIM card interface. The types of the multiple cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 can also be compatible with external memory cards.
  • the electronic device 100 interacts with the network through a SIM card to realize functions such as call and data communication.
  • the electronic device 100 uses eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the software system of the electronic device 100 may adopt a layered architecture, event-driven architecture, micro-core architecture, micro-service architecture, or cloud architecture.
  • The embodiments of the present invention take the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
  • FIG. 3 is a software block diagram of an application processor in the electronic device 100 according to an embodiment of the present invention.
  • the operating system of the application processor is the Android system, and the layered architecture divides the Android system into several layers, each of which has a clear role and division of labor.
  • the layers communicate with each other through a software interface.
  • the Android system is divided into four layers, from top to bottom are the application layer, the application framework layer, the Android runtime and the system library, and the kernel layer.
  • the application layer may include a series of application packages.
  • the application package may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • The application framework layer provides an application programming interface (API) and a programming framework for the applications at the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and so on.
  • the window manager is used to manage window programs.
  • The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • the data may include videos, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text and controls for displaying pictures.
  • the view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface including an SMS notification icon may include a view that displays text and a view that displays pictures.
  • the phone manager is used to provide the communication function of the electronic device 100. For example, the management of the call state (including connection, hang up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • The notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction.
  • the notification manager is used to notify the completion of downloading, message reminders, etc.
  • The notification manager can also present notifications that appear in the status bar at the top of the system in the form of a chart or scrolling text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window.
  • For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or the indicator light flashes.
  • Android Runtime includes core library and virtual machine. Android runtime is responsible for the scheduling and management of the Android system.
  • The core library contains two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in the virtual machine.
  • The virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
  • The system library may include multiple functional modules, for example, a surface manager, a media library, a 3D graphics processing library (for example, OpenGL ES), and a 2D graphics engine (for example, SGL).
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, etc.
  • the media library can support multiple audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least the display driver, camera driver, audio driver, and sensor driver.
  • When the touch sensor 180K receives a touch operation, the corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes touch operations into original input events (including touch coordinates, time stamps and other information of touch operations).
  • the original input events are stored in the kernel layer.
  • The application framework layer obtains the original input event from the kernel layer and identifies the control corresponding to the input event. Take the touch operation being a touch click operation and the control corresponding to the click operation being the camera application icon as an example.
  • The camera application calls the interface of the application framework layer to start the camera application, then starts the camera driver by calling the kernel layer, and captures still images or videos through the camera 193.
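  • The click-to-camera path above can be summarized in a toy C sketch; the event structure, the fixed icon rectangle used for hit-testing, and the launch call are all illustrative.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

typedef struct { int x, y; uint64_t timestamp_ms; } RawInputEvent;  /* kernel layer */

/* Framework layer: identify the control at the touch coordinates.
   A fixed rectangle stands in for the camera application icon. */
static const char *hit_test(const RawInputEvent *e) {
    if (e->x >= 0 && e->x < 96 && e->y >= 0 && e->y < 96)
        return "camera_icon";
    return "none";
}

static void start_camera_app(void) {        /* would start the camera driver */
    printf("camera application started; capturing via camera 193\n");
}

/* Kernel layer: package the hardware interrupt into a raw input event. */
void on_touch_interrupt(int x, int y, uint64_t now_ms) {
    RawInputEvent e = { x, y, now_ms };
    if (strcmp(hit_test(&e), "camera_icon") == 0)
        start_camera_app();
}
```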
  • FIG. 4 is a block diagram of the software structure of the application processor 210 and the coprocessor 220 in the electronic device 100 in the embodiment of the present application.
  • the operating system of the coprocessor 220 is a real-time operating system (RTOS).
  • the layered architecture divides the RTOS system into several layers, and each layer has a clear role and division of labor.
  • the layers communicate with each other through a software interface.
  • the RTOS system includes a kernel layer 221, an application framework layer 222, and an application layer 223 from bottom to top.
  • the kernel layer 221 includes: a peripheral driver module 2211, a hardware acceleration module 2212, and an AI operator library module 2213.
  • Peripheral driver module 2211: it can provide a software interface for mounting various peripheral chips. For example, an always-on low-power camera 230 may be mounted; the low-power camera provides a hardware basis for the coprocessor to perceive user behavior intentions or environmental changes. The coprocessor can analyze the characteristics of the user's movements and the surrounding environment from the image data collected by the low-power camera, which provides a data source for the coprocessor to process AI services.
  • Hardware acceleration module 2212: it can accelerate the process in which the AI engine module runs the AI model management module by calling operators in the AI operator library module in an accelerated mode. It can ensure that the AI engine module can quickly call the operators in the AI operator library module in real time, and it provides the capability interface for the various AI algorithms in the AI algorithm model of the application framework layer.
  • AI operator library module 2213: the AI engine module at the application layer can run the AI model management module at the application layer by calling the operators in the AI operator library module, so as to perform operations such as environment recognition or face recognition. Because the resources of the coprocessor are limited, the AI operator library module, which involves a large amount of mathematical calculation, can be solidified in hardware; most AI operators can then be implemented by the hardware, which avoids the high processor load that software-implemented operators would generate.
  • The interfaces of the hardware-solidified operators can be provided by the kernel to the application models of the AI model management module 2233; a sketch of such an operator call follows.
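  • The following sketch shows one plausible shape for such an interface: a kernel-side table maps operator IDs to hardware entry points that the AI engine calls. The table, the stand-in convolution, and every name here are assumptions; a real coprocessor would program its accelerator instead.

```c
/* Signature assumed for a hardware-solidified operator. */
typedef int (*AiOp)(const float *in, int n, float *out);

/* Stand-in for a hardware-accelerated operator; real code would hand the
   buffers to the accelerator rather than compute on the CPU. */
static int conv_hw(const float *in, int n, float *out) {
    for (int i = 0; i < n; i++) out[i] = in[i] * 0.5f;
    return 0;
}

/* Kernel-layer operator library: operator IDs -> hardware entry points. */
enum { OP_CONV = 0, OP_COUNT };
static const AiOp ai_op_table[OP_COUNT] = { conv_hw };

/* Called by the AI engine module to run one operator in real time. */
int ai_engine_run_op(int op_id, const float *in, int n, float *out) {
    if (op_id < 0 || op_id >= OP_COUNT) return -1;
    return ai_op_table[op_id](in, n, out);
}
```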
  • In some embodiments, the peripheral devices that can be mounted on the peripheral driver module may also include, but are not limited to: sensors (which can be used to identify user actions), always-on low-power microphones (which can be used to analyze features such as the user's voice), and position sensors, for example, a global positioning system (GPS), a wireless local area network (Wi-Fi) module, or a modem (which can be used to provide user location information).
  • The data collected by the GPS/Wi-Fi/modem can be used to generate positioning information; the images collected by the low-power always-on camera can be used to analyze the user's facial features, expressions, and environmental factors in the surrounding space; the data collected by the low-power always-on microphone can be used to analyze the user's voice keywords, environmental background sounds, etc.
  • For example, the applications a user commonly uses at home are usually different from the applications commonly used at the office, so the data collected by these peripheral devices can be used by the coprocessor to further refine the prediction results of the AI algorithms, which can improve the accuracy of the predictions.
  • the application framework layer 222 includes: an AI application management module 2221, an AI algorithm management module 2222, and an AI algorithm model 2223.
  • AI application management module 2221: it can classify the data reported by the peripheral driver module 2211. For example, the received data is divided into image categories, video categories, audio categories, etc., so that AI algorithm models 2223 of different categories can be called for analysis and processing.
  • AI algorithm management module 2222: it is responsible for algorithm management; according to the different types of data reported by the AI application management module 2221, the corresponding AI algorithm model can be selected from the multiple running AI algorithm models 2223 for analysis (a minimal dispatch sketch follows this list).
  • AI algorithm model 2223: it can be a set of algorithm features that conform to the images and sounds of certain services.
  • the AI algorithm model 2223 may be a set that conforms to the contour characteristics of the face.
  • the AI algorithm model 2223 may be a set of features that conform to a certain environmental scene.
  • the AI algorithm model 2223 can be trained through large-scale image data. After the training is completed, an algorithm model can be generated, and the corresponding AI operator can run the algorithm model to perform operations such as environment recognition or face recognition.
  • AI algorithm model 2223 may be integrated into the software system by default, or may be updated to the coprocessor 220 through the application processor 210, which is not specifically limited in the embodiment of the present application.
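  • The dispatch sketch referenced above: incoming peripheral data is classified by category and routed to a matching algorithm model. The categories mirror the text; the model functions are illustrative stubs.

```c
#include <stdio.h>

typedef enum { DATA_IMAGE, DATA_VIDEO, DATA_AUDIO } DataCategory;

/* Stubs standing in for AI algorithm models 2223. */
static void face_contour_model(const void *data) { (void)data; printf("run face-contour model\n"); }
static void scene_model(const void *data)        { (void)data; printf("run environment-scene model\n"); }
static void sound_model(const void *data)        { (void)data; printf("run voice/background-sound model\n"); }

/* AI application management (2221) classifies the data; AI algorithm
   management (2222) selects the model to run. */
void ai_dispatch(DataCategory category, const void *data) {
    switch (category) {
    case DATA_IMAGE: face_contour_model(data); break;
    case DATA_VIDEO: scene_model(data);        break;
    case DATA_AUDIO: sound_model(data);        break;
    }
}
```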
  • the application program layer 223 includes: an AI application layer module 2231, an AI engine module 2232, and an AI model management module 2233.
  • AI application layer module 2231: implements a variety of continuously running, normally-open AI applications in the application program layer 223 according to the scenario requirements of the electronic device's business design.
  • The AI application layer module 2231 can call various algorithms to obtain the AI recognition results of the various peripherally mounted devices, and can report the corresponding AI event message to the application processor 210. If the application processor 210 is in a dormant state, it is first awakened and then performs secondary processing of the AI event message.
  • AI engine module 2232: responsible for scheduling and coordinating the operation of the AI algorithm models 2223. Since multiple AI algorithm models 2223 may run at the same time, the scheduling management control of the AI engine module 2232 ensures, to the greatest extent, the orderly operation of the software.
  • AI model management module 2233: in some embodiments, the application processor 210 may also optimize the AI algorithm model 2223. For example, GPS/WIFI/modem positioning information can be used to comprehensively judge the results of the AI algorithm model 2223 and thereby improve its accuracy.
  • the AI model management module 2233 in the application layer 223 can modify certain features in the AI algorithm model 2223.
  • the kernel layer and the application framework layer are the core foundation of the entire software system, responsible for system resource scheduling, and provide the application layer with the computing power of the above-mentioned AI algorithms, such as providing AI operators, AI engines, and hardware accelerators.
  • the AI operator is integrated into the coprocessor in a hardware-hardened manner.
  • The AI engine is responsible for scheduling and coordinating AI operator operations. For example, when multiple AI algorithms are running at the same time, the scheduling management control of the AI engine can maximize the hardware's capabilities and ensure the orderly operation of the software.
  • this application integrates AI computing capabilities into the coprocessor.
  • the coprocessor integrates AI computing power and can continuously run in a low-power mode to detect user action intentions and environmental changes.
  • The coprocessor is mounted on the application processor. When a corresponding event is detected, the coprocessor triggers wake-up of the application processor by reporting an AI event message.
  • the coprocessor can also perform AI operations according to the requirements of the application processor and report the calculation results.
  • When an external event occurs or data is generated, the coprocessor can receive and process it at a sufficiently fast speed.
  • The processing results can be reported to the application processor, or a quick response can be made to the application processor's requirements within a specified time; all available resources are scheduled to complete real-time tasks, so all real-time tasks can be coordinated to run in harmony, with fast response and high reliability.
  • the application processor 210 and the coprocessor 220 are in a cooperative working mode.
  • the coprocessor carries AI computing capabilities, and the application processor carries various application service functions, providing users with a good human-machine experience.
  • When there is no business, the application processor sleeps in standby and enters a low power consumption mode. After being woken up by an AI event message sent by the coprocessor, the application processor receives the event reported by the coprocessor and triggers the corresponding business scenario function.
  • Application processor 210: responsible for running the various applications of the electronic device, including the UI human-computer interaction interface, cloud interaction, and so on. When there is no business, this main controller sleeps normally and enters a low power consumption mode.
  • The application processor 210 may include: AI local (AI native) 211, AI event message manager (AI service) 212, and applications (APP) 213, 214, and 215.
  • AI native 211: receives the AI event messages reported by the coprocessor 220 and wakes up the application processor 210.
  • the AI algorithm model optimized by the application processor 210 may also be sent to the AI engine module 2232 of the coprocessor 220, and the AI engine module 2232 may update the AI algorithm model 2223 through the AI model management module 2233.
  • AI event message manager (AI service) 212: receives AI event messages reported by AI native 211, manages the AI capability interface of the electronic device in a unified manner, and provides AI application programming interfaces (API) for each business module, so that various highlight business functions can be realized according to product business needs. For example, different highlight business functions can be implemented by different applications (APP 213, APP 214, or APP 215).
  • The AI service 212 can also transfer data to the cloud, forming a low-power business processing mode that combines the electronic device and the cloud.
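  • The following is a minimal sketch of this event path on the application processor side: AI native hands a reported event to the AI service, which routes it to registered business modules. All class and field names here (AIEvent, AIService, category, payload) are illustrative assumptions of this rewrite, not interfaces defined by the application.

```python
# Illustrative sketch only: AIEvent, AIService, and the callback scheme
# are assumptions, not interfaces defined by the application. It shows
# AI native handing a reported event to the AI service, which routes it
# to registered business modules (APPs).
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class AIEvent:
    category: str   # e.g. "face", "environment", "recommendation"
    payload: dict   # operation result carried with the event


class AIService:
    """Unified AI capability interface exposed to business modules."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[AIEvent], None]]] = {}

    def register(self, category: str,
                 callback: Callable[[AIEvent], None]) -> None:
        """A business module (APP) subscribes to one event category."""
        self._subscribers.setdefault(category, []).append(callback)

    def on_event(self, event: AIEvent) -> None:
        """Called by AI native after it has woken the application
        processor; routes the event to every interested APP."""
        for callback in self._subscribers.get(event.category, []):
            callback(event)


service = AIService()
service.register("recommendation",
                 lambda e: print("show recommendation:", e.payload))
service.on_event(AIEvent("recommendation", {"app": "Himalaya FM", "p": 0.6}))
```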
  • Because the main frequency of the coprocessor is low, the AI operators involving a large number of mathematical operations are integrated in a hardware-hardened manner, and the peripheral devices are low-power normally-open devices, the electronic device can run AI perception in normally-open mode and automatically sense changes in user actions or in the environment without relying on specific actions.
  • the application processor sleeps normally and enters a low-power mode.
  • Meanwhile, the coprocessor keeps running continuously in low-power mode: it obtains data collected by the low-power normally-open devices (such as sensors and low-power normally-open cameras), calls different AI algorithms to analyze the obtained data, and perceives in real time whether the user's actions, intentions, or environmental characteristics have changed. If a change is determined, an AI event message is reported to the application processor.
  • When the AI native of the application processor detects the event, the application processor is awakened; after being awakened, the application programs APP 213, APP 214, or APP 215 perform the business functions corresponding to the event, for example, displaying recommendation information on the interface of the corresponding application program.
  • The scenario of the specific event is not specifically limited in the embodiments of the present application.
  • For example, the occurrence of the specific event may indicate that the image collected by the low-power camera has changed from previously containing no user face to containing a user face, or from previously containing a user face to containing no user face.
  • Specifically, based on the image data and the corresponding AI algorithm, the coprocessor can analyze whether a user face is detected in the image and, from the previously collected state, judge that the image collected by the low-power camera has changed from containing no user face to containing a user face, or from containing a user face to containing no user face (which can also be understood as a change in user behavior intention).
  • the occurrence of the specific event may be used to indicate that the environment surrounding the user has changed in the image collected by the low-power camera.
  • The coprocessor may analyze the target environment scene in the image according to the image data and the corresponding AI algorithm, and may determine, from the previously collected environment scene around the user, that the scene has changed from the previous environment scene to the target environment scene.
  • the AI algorithm in the coprocessor includes the MobileNet model file.
  • the model file is used for image classification.
  • Part of the AI operators that need to be called to run the MobileNet model file can be solidified into hardware, and the other part can be software operators in the software operator library.
  • The application layer of the coprocessor uses the MobileNet model to perform operations on the multiple collected images, calling the AI algorithm model of the application framework layer during the calculation process, and finally generates the image analysis results.
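  • The embodiment does not name a runtime for the MobileNet model file. Purely as an illustration, the sketch below executes such a model file with the TensorFlow Lite interpreter, one common way to run MobileNet on resource-constrained processors; the model path, input preprocessing, and the classify() helper are assumptions of this rewrite.

```python
# Illustrative sketch: running a MobileNet model file for image
# classification with the TensorFlow Lite interpreter. The runtime,
# file name, and preprocessing are assumptions; the embodiment only
# requires that some operators run in hardware and others come from a
# software operator library.
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="mobilenet_v1.tflite")  # assumed file
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]


def classify(image: np.ndarray) -> int:
    """Classify one image; its dtype and (H, W, 3) shape must already
    match the model input (e.g. uint8 224x224x3 for quantized MobileNet)."""
    interpreter.set_tensor(inp["index"], image[np.newaxis, ...])
    interpreter.invoke()
    scores = interpreter.get_tensor(out["index"])[0]
    return int(np.argmax(scores))  # index of the most likely class


# The coprocessor's application layer would call classify() on each image
# collected by the low-power normally-open camera.
```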
  • Because the main frequency of the coprocessor is relatively low, AI operators involving a large number of mathematical operations are integrated in a hardware-hardened manner, and the peripheral devices are low-power devices, the overall power consumption of the coprocessor at runtime remains relatively low even though the peripheral devices are normally open, and AI calculations can be performed in real time.
  • the coprocessor does not need to be networked, and the data collected by the low-power normally-on device is stored in the coprocessor, and the data security is high, so the user's privacy can be well protected.
  • An embodiment of the present application provides an information processing method, which is executed by an electronic device.
  • the processor in the electronic device includes an application processor 210 and a co-processor 220.
  • The co-processor 220 may always operate at a relatively low frequency, so the coprocessor 220 can always be kept in a low-power running state.
  • the specific process of this information processing method is shown in Figure 5a, including:
  • Step 301a: when the application processor 210 of the electronic device is in the sleep state during a first period of time, the coprocessor 220 acquires the environmental data collected by the low-power normally-open devices during the first period of time, and obtains the business data from the application processor 210.
  • The low-power normally-open devices mainly refer to sensors, low-power normally-open cameras, low-power normally-open microphones, GPS/WIFI/modem, and other such devices.
  • Environmental data can refer to the positioning data collected by GPS/WIFI/modem, the light brightness data collected by the sensor, the images collected by the low-power normally-open camera, the audio data collected by the low-power normally-open microphone, and so on.
  • Different business scenarios correspondingly generate different business data, and the business data may be event records of the mobile terminal. For example: the record generated when the user opens an application or the use of a specific service is detected; the record generated when the user decides to take a taxi or navigate to a certain place; or the record of the moment when the user makes a payment with a bank card.
  • The coprocessor 220 processes the acquired environmental data and business data in real time, regardless of the state of the application processor; acquiring and processing data while the application processor is asleep is only a special case. In the other states of the application processor, the coprocessor continues to perform AI operations in real time unless otherwise specified by the application processor, providing real-time feedback on actual environmental conditions and user intentions and giving recommendations or guidance based on preset algorithms, so as to provide value-added services to users.
  • Step 302a: the coprocessor 220 uses artificial intelligence algorithms to perform artificial intelligence calculations based on the environmental data and business data to generate a calculation result.
  • Step 303a: when the calculation result meets the preset condition, the coprocessor 220 reports an event to the application processor 210, and the event includes the calculation result.
  • Step 304a: the application processor 210 displays the calculation result as recommendation information on the interface according to the received event, for the user to select.
  • In other words, the coprocessor 220 obtains in real time the environment data collected by the low-power normally-open devices, obtains the business data from the application processor 210, and calls the AI algorithm to perform AI operations on the collected environment data and the business data to generate prediction results.
  • When the preset condition is met, the coprocessor 220 reports the operation result as an event to the application processor 210 (as an example, the coprocessor can generate an AI message at its application layer and report the AI message to the main controller, that is, the application processor).
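  • The sketch below condenses steps 301a to 304a into a single polling loop. The function names (read_environment_data, read_business_data, run_ai_model, report_event), the polling period, and the probability threshold are assumptions of this rewrite, not elements defined by the application.

```python
# Minimal sketch of steps 301a-304a as one polling loop. All callables
# (read_environment_data, read_business_data, run_ai_model, report_event),
# the polling period, and the threshold are assumed placeholders.
import time

CONFIDENCE_THRESHOLD = 0.5  # assumed stand-in for the preset condition


def coprocessor_loop(read_environment_data, read_business_data,
                     run_ai_model, report_event, period_s=1.0):
    """Continuously collect data, run the AI operation, and report an
    event to the application processor when the condition is met."""
    while True:
        env = read_environment_data()    # low-power normally-open devices
        biz = read_business_data()       # generated by the application processor
        result = run_ai_model(env, biz)  # artificial intelligence operation
        if result["probability"] >= CONFIDENCE_THRESHOLD:
            report_event({"type": "ai_result", "result": result})
        time.sleep(period_s)             # keeps a low-power cadence
```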
  • The application processor 210 may be responsible for running the various applications of the electronic device, including but not limited to: the user interface (UI) human-computer interaction interface, face recognition, environment recognition, and automatic screen on/off.
  • When there is no service, the application processor 210 sleeps normally in standby and enters a low power consumption mode; after the coprocessor 220 reports an event, the application processor is woken up.
  • The application processor 210 can implement various highlight business functions according to product business requirements, or pass the event message to other related business modules, which complete the final processing.
  • The application processor 210 receives the prediction result reported by the coprocessor 220, runs the application program corresponding to the prediction result, and displays the prediction result as recommendation information on the interface of that application program.
  • Because the coprocessor 220 can perform AI operations while the application processor is asleep or in low-power mode, it ensures that the operations run in real time and can learn from and compute on each user action, so that every user action and every environmental change affects the subsequent AI operation results (such as recommended content). This influence is real-time, and the user can obtain feedback promptly.
  • Therefore, the embodiment of the present invention is greatly superior to the prior-art idle-time operation in terms of timeliness: it can infer the user's intention more promptly and provide users with more intelligent and valuable recommendation information.
  • The method provided by the embodiment of the present application makes the electronic device independent of specific user operations: it can automatically sense changes in the user's intentions, expressions, and environment in real time, and can provide the user with seamlessly perceived application services, making the electronic device more intelligent and the interaction more natural, which can improve the efficiency of human-computer interaction.
  • the user's previous action will have an immediate impact on the current result.
  • the AI operation is performed in real time, and there is no need to wait until the system is idle.
  • Because AI operations are performed by the coprocessor 7×24 hours, iteratively splitting large-scale data into small data calculations, consumption is greatly reduced and data can be processed in real time, giving AI operation results for the user to use; this greatly improves real-time performance and provides users with more value-added services, as the sketch below illustrates.
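  • One way to read the 7×24 splitting of large-scale data into small calculations is as online, incremental updating: each new business-data record adjusts a running score immediately instead of re-scoring all history in an idle-time batch. The following is a minimal sketch under that assumption; the decay constant and scoring scheme are illustrative, not from the application.

```python
# Minimal sketch of incremental (online) scoring: instead of re-scoring
# all historical records when the system is idle, each new business-data
# record updates a running score immediately. DECAY is an assumed value.
from typing import Dict, Optional

DECAY = 0.9   # assumed: older actions matter less

app_scores: Dict[str, float] = {}


def on_business_record(app_name: str) -> None:
    """Fold one new record into the running scores: O(#apps) work."""
    for name in app_scores:
        app_scores[name] *= DECAY          # fade previous evidence
    app_scores[app_name] = app_scores.get(app_name, 0.0) + 1.0


def top_recommendation() -> Optional[str]:
    return max(app_scores, key=app_scores.get) if app_scores else None


# Each user action immediately shifts the ranking:
on_business_record("Himalaya FM")
on_business_record("Phone")
on_business_record("Himalaya FM")
print(top_recommendation())   # -> Himalaya FM
```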
  • the present application also provides an information processing method.
  • the specific process is shown in FIG. 5b.
  • the specific process of the method may include:
  • Step 301b: the application processor 210 of the electronic device detects a first operation of the user; in response to the first operation, the application processor generates business data related to the first operation and saves the business data, for example, to the cache.
  • Step 302b: the application processor 210 in the electronic device sends an instruction to the coprocessor 220 of the electronic device, where the instruction is used to instruct the coprocessor to report recommendation information according to the business data.
  • The sending of the instruction may occur immediately after the user's first operation.
  • Step 303b: after receiving the instruction to report recommendation information from the application processor 210 of the electronic device, the coprocessor 220 obtains the business data (for example, from the cache area) and obtains the environmental data collected by the low-power normally-open devices within a set duration, where the set duration is determined based on the time information of the business data.
  • For the specific content of the environmental data and business data, refer to the relevant introduction in step 301a above.
  • Step 304b: the coprocessor of the electronic device runs an AI algorithm based on the business data and environmental data to generate a prediction result.
  • AI algorithms may include recommendation algorithms, collaborative filtering algorithms, clustering algorithms, etc.
  • Step 305b: when the calculation result meets the preset condition, the coprocessor 220 reports an event to the application processor 210, and the event includes the calculation result.
  • Step 306b: the application processor 210 displays the calculation result as recommendation information on the interface according to the received event, for the user to select.
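  • The sketch below condenses steps 301b to 306b, with in-process queues standing in for the interconnect between the two processors; all names, the 60-second collection window, and the probability threshold are assumptions of this rewrite.

```python
# Minimal sketch of steps 301b-306b; in-process queues stand in for the
# interconnect between the two processors, and every name, the 60 s
# window, and the threshold are assumptions of this rewrite.
import queue
import time
from typing import List

to_coprocessor: "queue.Queue[dict]" = queue.Queue()
to_application: "queue.Queue[dict]" = queue.Queue()
business_cache: List[dict] = []


def on_user_operation(op: str) -> None:
    """Steps 301b-302b: record business data, then instruct the coprocessor."""
    business_cache.append({"op": op, "t": time.time()})
    to_coprocessor.put({"cmd": "report_recommendation"})


def handle_instruction(read_env, run_ai, threshold=0.5) -> None:
    """Steps 303b-305b: fetch the data, run the AI algorithm, report."""
    to_coprocessor.get()                 # wait for the instruction
    t0 = business_cache[-1]["t"]         # set duration from business data
    env = read_env(since=t0 - 60.0)      # assumed 60 s collection window
    result = run_ai(business_cache, env)
    if result["probability"] >= threshold:
        to_application.put({"event": "recommendation", "result": result})


def demo_read_env(since):                # stub standing in for device drivers
    return {"location": "office"}


def demo_run_ai(biz, env):               # stub standing in for the AI algorithm
    return {"app": "Himalaya FM", "probability": 0.6}


on_user_operation("install Himalaya FM")
handle_instruction(demo_read_env, demo_run_ai)
print(to_application.get())              # the event the AP would display
```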
  • the user's previous action will have an immediate impact on the current result.
  • the AI operation is performed in real time, and there is no need to wait until the system is idle.
  • The previous operation affects the currently displayed recommendation result. Because AI operations are performed by the coprocessor 7×24 hours, iteratively splitting large-scale data into small data calculations, consumption is greatly reduced and data can be processed in real time, giving AI operation results for the user to use; this greatly improves real-time performance, improves interaction efficiency, increases terminal intelligence, and provides users with more value-added services.
  • For example, the mobile phone detects that a headset is inserted into the earphone jack, and the user then opens the global search interface.
  • The recommended applications on the interface include applications that the user may use, as shown in FIG. 6a. If the user does not find the desired application on this interface, the user can further enter keywords in the search bar. For example, as shown in FIG. 6b, the user enters FM in the search bar, and downloads and installs Himalaya FM. If, after installing this application, the user does not run it but quits the global search or opens other applications, then the next time the phone detects the headset and the user opens the global search interface (this action of opening the global search interface can occur immediately after the action of exiting the global search or opening other applications):
  • The recommended applications displayed on this interface include applications that the user may use, but do not include Himalaya FM, as shown in FIG. 6c.
  • However, if the user previously installed and started running Himalaya FM, as shown in FIG. 6d, then after the user exits the Himalaya FM application and next opens the global search interface (again, this action of opening the global search interface can occur immediately after the action of exiting the Himalaya FM application), the recommended applications include the Himalaya FM application, as shown in FIG. 6e.
  • This is because the application processor of the mobile phone records every operation of the user, that is, it generates and records business data corresponding to each operation. In the previous example, operations such as the user searching for FM, installing Himalaya FM, and whether Himalaya FM was run are recorded as business data, and the business data is stored in the cache area. The coprocessor 220 of the mobile phone then obtains the business data in real time (for example, from the cache area) and runs the AI algorithm related to the recommendation business based on the business data to generate prediction results in real time. Then the coprocessor sends the prediction results to the application processor.
  • After the application processor obtains the prediction result from the coprocessor, it notifies the window manager in the application layer to adjust the display information of the "global search interface"; the next time the phone detects that a headset is inserted into the earphone jack, it immediately displays the "Himalaya FM" application in the application recommendation bar, as shown in FIG. 6e. In this way, every action of the user has an impact on the subsequently recommended content; this impact is real-time, and the user can get feedback promptly. Therefore, the present invention is much better than prior-art idle-time calculation in terms of timeliness, can guess the user's intention more promptly, and provides users with more intelligent and valuable recommendation information.
  • In some embodiments, the electronic device also combines the environmental data collected by the peripheral low-power normally-open devices to comprehensively judge the prediction result.
  • For example, the mobile phone detects that a headset is inserted into the earphone jack. If the user started the phone application and the Himalaya FM application within the first period of time, the application processor of the mobile phone records the user's call and the operation of Himalaya FM as business data and saves the business data in the cache area. Assume that the mobile phone's coprocessor runs the AI algorithms related to the recommendation business based on the business data, and the prediction result shows that the probability that the user will run the phone application at the next moment is 0.4, while the probability that the user will run the Himalaya FM application at the next moment is 0.6.
  • The coprocessor further obtains the collected data of low-power normally-open devices such as GPS and WIFI, and determines the user's current location information. If the user is judged to currently be at the office, the coprocessor adjusts the prediction result according to the user's current location information: the probability that the user will run the phone application at the next moment is adjusted to 0.6, the probability that the user will run the Himalaya FM application at the next moment is adjusted to 0.4, and the adjusted prediction result is reported to the application processor as an event.
  • After the application processor obtains the prediction result from the coprocessor, it notifies the window manager in the application layer to adjust the display information of the "global search interface". The next time the mobile phone detects that a headset is inserted into the earphone jack, the phone application is displayed in the application recommendation bar.
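  • The adjustment described above can be sketched as multiplying the base probabilities by location-conditioned weights and renormalizing, so that context shifts, rather than replaces, the AI result. The weight values below are chosen only to reproduce the 0.6/0.4 example; they are not values from the application.

```python
# Sketch of comprehensively judging a prediction with location context:
# base probabilities are multiplied by location-conditioned weights and
# renormalized. The weights are assumed values chosen to reproduce the
# 0.6/0.4 example above.
from typing import Dict


def adjust_for_location(base: Dict[str, float],
                        weights: Dict[str, float]) -> Dict[str, float]:
    """Context shifts, rather than replaces, the AI prediction."""
    raw = {app: p * weights.get(app, 1.0) for app, p in base.items()}
    total = sum(raw.values())
    return {app: v / total for app, v in raw.items()}


base = {"phone": 0.4, "himalaya_fm": 0.6}
office_weights = {"phone": 2.25, "himalaya_fm": 1.0}  # assumed values
print(adjust_for_location(base, office_weights))
# -> phone ~= 0.6, himalaya_fm ~= 0.4
```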
  • In specific implementation, the coprocessor converts the environmental data and business data into feature vectors, which then undergo discretization and normalization processing to form a feature matrix; the coprocessor then substitutes the feature matrix into the AI algorithm corresponding to the business as input parameters and iteratively generates prediction results.
  • the so-called feature vector is a floating-point number or a set of floating-point numbers used to express features.
  • the feature vector used to express the feature of time is a floating point number between 0.0 and 23.0.
  • the feature vector used to express the feature of geographic location is composed of longitude and latitude.
  • For a WIFI signal, the coprocessor needs to convert it into a feature vector: the service set identifiers (SSIDs) of the n (n > 1) WIFI signals with the highest signal strength are selected, and then each of the n SSIDs is transformed into a number by a hash algorithm, yielding n feature vectors.
  • The training data part is the feature matrix converted by the coprocessor.
  • The coprocessor uses the feature matrix as an input parameter to the AI algorithm corresponding to the business to generate the probability values corresponding to three possible behaviors, where y1 is the first possible behavior with a probability of 0.67, y2 is the second possible behavior with a probability of 0.22, and y3 is the third possible behavior with a probability of 0.11.
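  • The feature pipeline described above (time and location features, SSID hashing, normalization, and a model producing three behavior probabilities) can be sketched as follows. The hash-and-scale scheme and the tiny linear model with random weights are illustrative assumptions of this rewrite, not the algorithm of the application.

```python
# Sketch of the feature pipeline: hour-of-day and location features plus
# hashed SSID features are normalized into one vector, and a tiny linear
# model turns the features into three behavior probabilities. The
# hash-and-scale scheme and the random weights are illustrative only.
import hashlib
import numpy as np


def ssid_features(ssids, n=3):
    """Hash the SSIDs of the n strongest WIFI signals into n numbers."""
    feats = [int(hashlib.md5(s.encode()).hexdigest(), 16) % 1000 / 1000.0
             for s in ssids[:n]]
    return feats + [0.0] * (n - len(feats))   # pad if fewer than n seen


def build_feature_vector(hour, lat, lon, ssids):
    """Discretize/scale raw signals into one normalized feature vector."""
    return np.array([hour / 23.0, lat / 90.0, lon / 180.0]
                    + ssid_features(ssids))


def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()


x = build_feature_vector(hour=9.0, lat=39.9, lon=116.4,
                         ssids=["office-5G", "guest", "printer"])
W = np.random.default_rng(0).normal(size=(3, x.size))  # assumed weights
y = softmax(W @ x)   # probabilities for behaviors y1, y2, y3
print(y, y.sum())    # three values summing to 1, cf. 0.67 / 0.22 / 0.11
```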
  • the coprocessor can also perform AI operations in combination with user profile data in a database in a memory.
  • For example, the labels of a user's portrait may be: city Beijing, gender male, company at the World Trade Center, favorite categories men's shoes and sports shoes, favorite brands Nike and Adidas, and so on.
  • the coprocessor can also perform AI operations in combination with the scene intelligence data obtained from the database in the memory.
  • So-called scene intelligence refers to managing and arranging the user's daily life and, through an intelligent engine service, reminding the user in the form of cards.
  • For example, the scene intelligence data generates an alarm card at 20:00 at night that reminds the user to set a travel alarm.
  • The recommended time for the alarm clock is 3 hours before departure or takeoff.
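  • As a tiny worked example of that rule (the card contents below are assumed sample data):

```python
# Worked example of the alarm rule: recommended alarm time = departure
# time minus 3 hours. The card contents are assumed sample data.
from datetime import datetime, timedelta

card = {"kind": "travel", "departure": datetime(2018, 10, 17, 8, 30)}
recommended_alarm = card["departure"] - timedelta(hours=3)
print(recommended_alarm.strftime("%H:%M"))  # -> 05:30
```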
  • the embodiments of the present application can use the above information processing method to perform real-time information prediction based on the business data generated by the user's current operation, and comprehensively judge the prediction results in combination with environmental data and user portrait data.
  • This method enables the electronic device both to generate recommendation information in real time and to always keep operating in a lower power consumption state.
  • This method can automatically sense the user's intentions, expressions, and environmental changes in real time, and can provide the user with seamlessly perceived application services, making electronic devices more intelligent and the interaction more natural, which can improve the efficiency of human-computer interaction.
  • An embodiment of the present application also provides a computer-readable storage medium.
  • the computer-readable storage medium includes a computer program.
  • When the computer program runs on an electronic device, the electronic device can perform any possible implementation of the foregoing information processing method.
  • An embodiment of the present application further provides a computer program product, which, when the computer program product runs on an electronic device, causes the electronic device to perform any possible implementation of the foregoing information processing method.
  • the embodiments of the present application disclose an information processing apparatus. As shown in FIG. 8, the information processing apparatus is used to implement the method described in each method embodiment above, which includes: a main processing module 801, a co-processing module 802, a transceiver module 803, and a display module 804.
  • The main processing module 801 is used to support the electronic device in executing the method steps on the application processor side, such as step 301b in FIG. 5b.
  • The co-processing module 802 is used to support the electronic device in executing the method steps on the coprocessor side, such as steps 303b to 305b in FIG. 5b.
  • The transceiver module 803 is used to support the main processing module 801 in sending instructions to the co-processing module 802 and in receiving events reported by the co-processing module.
  • The display module 804 is used to support the electronic device in displaying recommendation information, for example, performing step 306b in FIG. 5b. For all relevant content of the steps involved in the above method embodiments, refer to the function descriptions of the corresponding function modules; details are not repeated here.
  • Alternatively, the co-processing module 802 is used to support the electronic device in executing the method steps on the coprocessor side, for example, performing steps 301a to 303a in FIG. 5a.
  • the main processing module 801 is used to receive the event reported by the co-processing module 802.
  • The display module 804 is used to support the electronic device in displaying recommendation information, for example, performing step 304a in FIG. 5a. For all relevant content of the steps involved in the above method embodiments, refer to the function descriptions of the corresponding function modules; details are not repeated here.
  • the embodiments of the present application disclose an electronic device.
  • The electronic device may include: an application processor 901; a coprocessor 905; a memory 902; a display 903; one or more application programs (not shown); and one or more computer programs 904. The above devices can be connected through one or more communication buses 906.
  • the one or more computer programs 904 are stored in the above-mentioned memory 902 and are configured to be executed by the application processor 901 and the coprocessor 905.
  • The one or more computer programs 904 include instructions, and the instructions may be used to execute the steps in the corresponding embodiment of FIG. 5b.
  • the application processor 901 is used to perform steps 301b and 302b in FIG. 5b
  • the coprocessor 905 is used to perform steps 303b to 305b in FIG. 5b
  • the display 903 is used to perform step 306b in FIG. 5b.
  • the coprocessor 905 is used to execute steps 301a to 304a in FIG. 5a
  • the application processor 901 is used to receive events reported by the coprocessor 905
  • The display 903 is used to execute step 304a in FIG. 5a.
  • the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or software function unit.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium.
  • The technical solutions of the embodiments of the present application, in essence, the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application.
  • The foregoing storage media include: flash memory, removable hard disk, read-only memory, random access memory, magnetic disk, optical disk, and other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to an information processing method and an electronic device, applied to an electronic device comprising an application processor and a coprocessor. The method comprises the following operations: a coprocessor of an electronic device receives, from an application processor of the electronic device, an instruction to report recommendation information; the coprocessor then obtains, according to the instruction, service data generated by the application processor and environment data generated by a low-power normally-on device of the electronic device; the coprocessor performs an artificial intelligence operation on the service data and the environment data by using an artificial intelligence algorithm to generate an operation result; and when the operation result satisfies a preset condition, the coprocessor reports the operation result to the application processor, so that the application processor displays the operation result in the form of recommendation information. The method is used to carry out artificial intelligence operations on the electronic device side in real time, thereby providing accurate recommendation information to a user in a timely manner.
PCT/CN2018/110510 2018-10-16 2018-10-16 Procédé de traitement d'informations et dispositif électronique WO2020077540A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880072215.0A CN111316199B (zh) 2018-10-16 2018-10-16 一种信息处理方法及电子设备
PCT/CN2018/110510 WO2020077540A1 (fr) 2018-10-16 2018-10-16 Procédé de traitement d'informations et dispositif électronique

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/110510 WO2020077540A1 (fr) 2018-10-16 2018-10-16 Procédé de traitement d'informations et dispositif électronique

Publications (1)

Publication Number Publication Date
WO2020077540A1 true WO2020077540A1 (fr) 2020-04-23

Family

ID=70283351

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/110510 WO2020077540A1 (fr) 2018-10-16 2018-10-16 Procédé de traitement d'informations et dispositif électronique

Country Status (2)

Country Link
CN (1) CN111316199B (fr)
WO (1) WO2020077540A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114466308A (zh) * 2020-10-22 2022-05-10 华为技术有限公司 一种定位方法和电子设备
WO2023082989A1 (fr) * 2021-11-12 2023-05-19 华为技术有限公司 Procédé et appareil de traitement de message, et premier dispositif électronique
CN116761207A (zh) * 2023-08-22 2023-09-15 杭州纵横通信股份有限公司 一种基于通信行为的用户画像构建方法与系统
EP4198688A4 (fr) * 2020-09-07 2023-10-25 Huawei Technologies Co., Ltd. Appareil de traitement d'image, dispositif électronique et procédé de traitement d'image

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111752713B (zh) 2020-06-28 2022-08-05 浪潮电子信息产业股份有限公司 模型并行训练任务负载均衡方法、装置、设备及存储介质
CN114222020B (zh) * 2020-09-03 2022-11-25 华为技术有限公司 位置关系识别方法、设备及可读存储介质
CN112561047B (zh) * 2020-12-22 2023-04-28 上海壁仞智能科技有限公司 用于处理数据的装置、方法和计算机可读存储介质
CN113886196B (zh) * 2021-12-07 2022-03-15 上海燧原科技有限公司 片上功耗管理方法、电子设备及存储介质
CN114861152A (zh) * 2022-05-31 2022-08-05 Oppo广东移动通信有限公司 生物特征信息的处理方法、装置、电子设备以及存储介质
CN116795628B (zh) * 2023-05-24 2024-05-14 荣耀终端有限公司 终端设备的功耗处理方法、终端设备以及可读存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202391475U (zh) * 2011-12-23 2012-08-22 北京中矿华沃科技股份有限公司 便携式安全记录仪
CN104516472A (zh) * 2013-09-29 2015-04-15 联想(北京)有限公司 处理器和数据处理方法
CN104656873A (zh) * 2013-11-25 2015-05-27 联想(北京)有限公司 一种信息处理方法与电子设备
US20160148615A1 (en) * 2014-11-26 2016-05-26 Samsung Electronics Co., Ltd. Method and electronic device for voice recognition
CN107277904A (zh) * 2017-07-03 2017-10-20 上海斐讯数据通信技术有限公司 一种终端及语音唤醒方法

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9733392B2 (en) * 2008-06-27 2017-08-15 Deep Sciences, LLC Methods of using environmental conditions in sports applications
CN104809501B (zh) * 2014-01-24 2018-05-01 清华大学 一种基于类脑协处理器的计算机系统
KR102137097B1 (ko) * 2014-08-21 2020-07-23 삼성전자주식회사 소모 전류 저감 방법 및 이를 지원하는 전자 장치
CN106407364B (zh) * 2016-09-08 2021-06-11 北京百度网讯科技有限公司 一种基于人工智能的信息推荐方法和装置
CN107094181A (zh) * 2017-05-27 2017-08-25 广东欧珀移动通信有限公司 信息输出方法及相关产品
CN107341201A (zh) * 2017-06-20 2017-11-10 广东欧珀移动通信有限公司 信息推送方法及相关产品
CN108057249B (zh) * 2017-11-29 2020-07-24 腾讯科技(成都)有限公司 一种业务数据处理方法和装置
CN108197327B (zh) * 2018-02-07 2020-07-31 腾讯音乐娱乐(深圳)有限公司 歌曲推荐方法、装置及存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202391475U (zh) * 2011-12-23 2012-08-22 北京中矿华沃科技股份有限公司 便携式安全记录仪
CN104516472A (zh) * 2013-09-29 2015-04-15 联想(北京)有限公司 处理器和数据处理方法
CN104656873A (zh) * 2013-11-25 2015-05-27 联想(北京)有限公司 一种信息处理方法与电子设备
US20160148615A1 (en) * 2014-11-26 2016-05-26 Samsung Electronics Co., Ltd. Method and electronic device for voice recognition
CN107277904A (zh) * 2017-07-03 2017-10-20 上海斐讯数据通信技术有限公司 一种终端及语音唤醒方法

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4198688A4 (fr) * 2020-09-07 2023-10-25 Huawei Technologies Co., Ltd. Appareil de traitement d'image, dispositif électronique et procédé de traitement d'image
CN114466308A (zh) * 2020-10-22 2022-05-10 华为技术有限公司 一种定位方法和电子设备
CN114466308B (zh) * 2020-10-22 2023-10-10 华为技术有限公司 一种定位方法和电子设备
WO2023082989A1 (fr) * 2021-11-12 2023-05-19 华为技术有限公司 Procédé et appareil de traitement de message, et premier dispositif électronique
CN116761207A (zh) * 2023-08-22 2023-09-15 杭州纵横通信股份有限公司 一种基于通信行为的用户画像构建方法与系统
CN116761207B (zh) * 2023-08-22 2023-12-15 杭州纵横通信股份有限公司 一种基于通信行为的用户画像构建方法与系统

Also Published As

Publication number Publication date
CN111316199B (zh) 2022-08-19
CN111316199A (zh) 2020-06-19

Similar Documents

Publication Publication Date Title
EP3872807B1 (fr) Procédé de commande vocale et dispositif électronique
WO2021052263A1 (fr) Procédé et dispositif d'affichage d'assistant vocal
CN109814766B (zh) 一种应用显示方法及电子设备
WO2020077540A1 (fr) Procédé de traitement d'informations et dispositif électronique
WO2021063343A1 (fr) Procédé et dispositif d'interaction vocale
WO2020073288A1 (fr) Procédé de déclenchement de dispositif électronique permettant d'exécuter une fonction, et dispositif électronique associé
WO2023005282A9 (fr) Procédé et appareil de poussée de message
WO2021052139A1 (fr) Procédé d'entrée de geste et dispositif électronique
CN113805797B (zh) 网络资源的处理方法、电子设备及计算机可读存储介质
WO2022017474A1 (fr) Procédé de traitement de tâches et appareil associé
US11995317B2 (en) Method and apparatus for adjusting memory configuration parameter
WO2020024108A1 (fr) Procédé d'affichage d'icônes d'application et terminal
WO2021218429A1 (fr) Procédé de gestion d'une fenêtre d'application, dispositif terminal et support de stockage lisible par ordinateur
CN114079893A (zh) 蓝牙通信方法、终端设备及计算机可读存储介质
WO2021104122A1 (fr) Procédé et appareil permettant de répondre à une demande d'appel et dispositif électronique
WO2023273543A1 (fr) Procédé et appareil de gestion de dossier
US20240098354A1 (en) Connection establishment method and electronic device
CN115333941A (zh) 获取应用运行情况的方法及相关设备
WO2023207667A1 (fr) Procédé d'affichage, véhicule et dispositif électronique
CN116048831B (zh) 一种目标信号处理方法和电子设备
WO2021129453A1 (fr) Procédé de capture d'écran et dispositif associé
CN115206308A (zh) 一种人机交互的方法及电子设备
WO2024093703A1 (fr) Procédé et appareil de gestion d'instance, et dispositif électronique et support de stockage
CN118131891A (zh) 一种人机交互的方法和装置
CN117311484A (zh) 调整设备功耗的方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18937261

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18937261

Country of ref document: EP

Kind code of ref document: A1