WO2020233556A1 - Method for processing call content and electronic device - Google Patents

Method for processing call content and electronic device

Info

Publication number
WO2020233556A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
interface
call
webpage
application
Application number
PCT/CN2020/090956
Other languages
English (en)
Chinese (zh)
Inventor
丁宁
张子曰
曹林
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2020233556A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/725 Cordless telephones
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Definitions

  • the embodiments of the present application relate to communication technologies, and in particular, to a method for processing call content and an electronic device.
  • the electronic device displays a first interface 101.
  • the first interface 101 is a text message dialogue interface, which may include text and images. The user can perform multi-touch operations on any area in the first interface 101.
  • in response to a user's multi-touch operation on a certain area (for example, the first area), the electronic device performs image processing on that area to extract the text in it (for example, "heard Movie A is good") and splits the text into multiple fields (for example, "heard", "Movie A", and "good"). Then, the electronic device displays the second interface 102.
  • the second interface 102 includes multiple tags 1021, and each tag corresponds to a field. The user can click the search button 1022 after selecting one or more field labels (for example, selecting the field label of "Movie A").
  • the electronic device in response to the user clicking the search button 1022, the electronic device starts a browser application and displays an interface 103.
  • the interface 103 is an application interface of a browser.
  • the electronic device automatically fills the field corresponding to the field label selected by the user (for example, Movie A) into the search box and performs a search, so as to obtain information related to that field.
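The prior-art flow above (extract text from a touched region, split it into fields, fill a selected field label into a search box) can be sketched roughly as follows. The splitting rule and the search URL are illustrative assumptions, not anything defined by the patent:

```python
# Illustrative sketch of the prior-art flow: text extracted from a touched
# screen region is split into candidate fields, and a selected field label
# is filled into a search query. All names and URLs here are hypothetical.
from urllib.parse import quote

def split_into_fields(extracted_text: str) -> list[str]:
    """Naive whitespace splitting; a real device would use a segmentation model."""
    return extracted_text.split()

def build_search_url(selected_fields: list[str]) -> str:
    query = " ".join(selected_fields)
    # "example-search.com" is a placeholder endpoint, not from the patent.
    return "https://example-search.com/search?q=" + quote(query)

fields = split_into_fields("heard Movie-A is good")
url = build_search_url([fields[1]])  # user selects the "Movie-A" field label
```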
  • the prior art merely recognizes displayed content to provide related information; users, however, expect richer information services.
  • the embodiments of the present application provide a call content processing method and electronic device, which can process the call content to provide information related to the call content.
  • a method for processing call content includes: when the electronic device is in a call connection state, the electronic device receives a first input; in response to the first input, the electronic device obtains at least one piece of key information of the call content; the electronic device displays a first interface according to the key information, and the first interface includes a tag corresponding to the key information.
  • the content of the call may include first voice data and/or second voice data.
  • the first voice data is voice data generated by the electronic device by collecting external sounds
  • the second voice data is voice data that the electronic device receives from other electronic devices connected to it in a call.
  • the key information may include text data of part of the call content, keywords in the call content, and webpage links related to the keywords.
  • the tags include any one or more of text tags, keyword tags, and information tags. Text tags correspond to text data; keyword tags correspond to keywords; information tags correspond to web links.
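One possible in-memory representation of the three tag types described above is sketched below. This data model is purely illustrative; the patent defines no such structure:

```python
# Illustrative data model (an assumption, not from the patent) for the three
# tag types: text tags carry transcribed text, keyword tags carry extracted
# keywords, and information tags carry webpage links.
from dataclasses import dataclass

@dataclass
class Tag:
    kind: str      # "text", "keyword", or "info"
    payload: str   # text data, a keyword, or a webpage link

tags = [
    Tag("text", "Let's watch Movie A tonight"),
    Tag("keyword", "Movie A"),
    Tag("info", "https://example.com/movie-a"),  # placeholder link
]

# An information tag's payload is the webpage link it opens when tapped.
info_links = [t.payload for t in tags if t.kind == "info"]
```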
  • the first input may be that the electronic device is folded or the electronic device is unfolded.
  • the key information can be obtained in different ways.
  • the electronic device sends the content of the call to the first server; the first server receives the call content sent by the electronic device and converts it into text data; then, the first server extracts keywords from the text data and sends the keywords to the second server to obtain webpage links related to the keywords; finally, the first server sends the webpage links, the text data, and/or the keywords to the electronic device.
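The first-server pipeline described above (speech-to-text, keyword extraction, link lookup via a second server) might be sketched as follows, with every stage stubbed out. None of these function names, stopword rules, or endpoints come from the patent; they only illustrate the data flow:

```python
# Hedged sketch of the first-server pipeline: convert call audio to text,
# extract keywords, look up related webpage links, return all three.
def speech_to_text(call_audio: bytes) -> str:
    return "heard Movie A is good"          # stub for a real ASR service

def extract_keywords(text: str) -> list[str]:
    stopwords = {"heard", "is", "good"}     # toy extraction rule (assumption)
    kept = [w for w in text.split() if w.lower() not in stopwords]
    # Join surviving words into a single keyword phrase.
    return [" ".join(kept)] if kept else []

def lookup_links(keyword: str) -> list[str]:
    # Stands in for the query the first server sends to the second server.
    return [f"https://example.com/search?q={keyword.replace(' ', '+')}"]

def process_call_content(call_audio: bytes) -> dict:
    text = speech_to_text(call_audio)
    keywords = extract_keywords(text)
    links = [link for kw in keywords for link in lookup_links(kw)]
    return {"text": text, "keywords": keywords, "links": links}

result = process_call_content(b"...")
```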
  • a user can obtain key information related to the call content while making a call.
  • the electronic device receives a third input to the information tag, and in response to the third input, the electronic device displays a web page associated with the information tag.
  • the electronic device can access the webpage associated with the information tag through the browser core, and display the webpage in a web view (Webview).
  • users can quickly access web pages related to the content of the call through the information tag. Similar to browsing a web page in a browser application, the user can operate the web page to access other web pages.
  • the electronic device may receive a fourth input to the webpage; in response to the fourth input, the electronic device may display other webpages.
  • the electronic device can record the webpage link of the webpage visited by the user.
  • when the call ends, the electronic device can jump to the webpage being displayed, so that the user can continue to browse the webpage through the browser application after the call, which improves the user experience.
  • the electronic device can jump to the corresponding application interface when the call is ended.
  • this design specifically includes: the electronic device obtains the most recently recorded webpage link; determines, according to that webpage link, the application related to it; if the application is installed, the electronic device obtains the application link corresponding to the webpage link, starts the related application, and displays the corresponding application interface according to the application link; if the application is not installed, the electronic device starts the browser application and displays the corresponding webpage according to the webpage link.
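The end-of-call handoff described above can be sketched as below. The URL-to-app mapping table, the deep-link format, and all names are assumptions for illustration only; the patent does not specify them:

```python
# Hedged sketch of the end-of-call jump: take the most recently recorded
# webpage link, map it to a related application if one is installed,
# otherwise fall back to the browser application.
URL_TO_APP = {"video.example.com": "com.example.video"}  # hypothetical mapping

def on_call_ended(visited_links: list[str], installed_apps: set[str]) -> tuple[str, str]:
    latest = visited_links[-1]                 # most recently recorded link
    host = latest.split("/")[2]                # host part of "https://host/path"
    app = URL_TO_APP.get(host)
    if app and app in installed_apps:
        # Deep link into the related application (link form is assumed).
        return ("app", f"{app}://open?url={latest}")
    return ("browser", latest)                 # fall back to the browser

target = on_call_ended(
    ["https://video.example.com/movie-a"], {"com.example.video"}
)
```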
  • an embodiment of the present application provides an electronic device that includes a display screen, a processor, and a memory for storing a computer program, where the computer program includes instructions that, when executed by the processor, cause the electronic device to perform the method according to any one of the first aspect.
  • the present application provides a computer storage medium including computer instructions, which when the computer instructions run on an electronic device, cause the electronic device to execute the method described in any one of the first aspect.
  • this application provides a computer program product, which when the computer program product runs on an electronic device, causes the electronic device to execute the method described in any one of the first aspect.
  • the present application provides a graphical user interface, which specifically includes a graphical user interface displayed when an electronic device executes any method as in the first aspect.
  • the electronic device described in the second aspect, the computer storage medium described in the third aspect, the computer program product described in the fourth aspect, and the graphical user interface described in the fifth aspect provided above are all used to execute the corresponding method provided above; therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding method, which are not repeated here.
  • FIG. 1 is a schematic diagram of a scene of a display content processing method provided by the prior art
  • FIG. 2 is a schematic structural diagram of an electronic device provided by an embodiment of the application.
  • FIG. 3 is a software structure block diagram of an electronic device provided by an embodiment of the application.
  • FIG. 4 is a schematic flowchart of a call content processing method provided by an embodiment of the application.
  • FIG. 5 is a schematic diagram of a scene of another method for processing call content according to an embodiment of the application.
  • FIG. 6 is a schematic structural diagram of yet another electronic device provided by an embodiment of the application.
  • FIG. 7 is a schematic scenario diagram of another method for processing call content provided by an embodiment of the application.
  • FIG. 8 is a schematic diagram of a scene of another method for processing call content according to an embodiment of the application.
  • FIG. 9 is a schematic diagram of a scene of another method for processing call content according to an embodiment of the application.
  • FIG. 10 is a schematic diagram of a scene of another call content processing method provided by an embodiment of the application.
  • FIG. 11 is a schematic diagram of another scene of a call content processing method provided by an embodiment of the application.
  • FIG. 12 is a schematic diagram of a scene of another method for processing call content according to an embodiment of the application.
  • FIG. 13 is a schematic flowchart of another method for processing call content according to an embodiment of this application.
  • FIG. 14 is a schematic diagram of a scene of another method for processing call content according to an embodiment of the application.
  • the "user” in the embodiments of this application refers to a user who uses an electronic device.
  • "A and/or B" in the embodiments of the present application merely describes an association relationship between associated objects, indicating that three relationships may exist: A alone, both A and B, and B alone.
  • the character "/" in the embodiment of the present application generally indicates that the associated objects before and after are in an "or" relationship.
  • the method for processing call content provided by the embodiment of the present application may be applied to an electronic device.
  • the electronic device may be, for example, a mobile phone, a tablet computer, a laptop computer, a digital camera, a personal digital assistant (PDA), a navigation device, a mobile Internet device (MID), a wearable device, etc.
  • FIG. 2 shows a schematic diagram of the structure of the electronic device 100.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown, combine certain components, split certain components, or use a different arrangement of components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated access, reduces the waiting time of the processor 110, and improves system efficiency.
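The caching behavior above has a simple software analogue: results that were "just used" are served from a cache instead of being recomputed. The sketch below is a loose illustration of that idea, not the hardware cache itself:

```python
# Software analogy for the processor cache described above: repeated
# requests for the same value are answered from the cache, so the "slow"
# path runs only once. This illustrates the idea only; it is not hardware.
from functools import lru_cache

calls = 0

@lru_cache(maxsize=16)
def fetch(addr: int) -> int:
    global calls
    calls += 1            # counts "slow" accesses that bypass the cache
    return addr * 2       # stand-in for an expensive memory fetch

fetch(7); fetch(7); fetch(7)   # only the first call does real work
```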
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a two-way synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses.
  • the processor 110 may be coupled to the touch sensor 180K, charger, flash, camera 193, etc. through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the electronic device 100.
  • the I2S interface can be used for audio communication.
  • the processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to realize communication between the processor 110 and the audio module 170.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through an I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a two-way communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with the display screen 194, the camera 193 and other peripheral devices.
  • the MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), etc.
  • the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the electronic device 100.
  • the processor 110 and the display screen 194 communicate through a DSI interface to realize the display function of the electronic device 100.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and so on.
  • GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that complies with the USB standard specification, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transfer data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through the headphones. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present application is merely a schematic description, and does not constitute a structural limitation of the electronic device 100.
  • the electronic device 100 may also adopt different interface connection modes in the foregoing embodiments, or a combination of multiple interface connection modes.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive the wireless charging input through the wireless charging coil of the electronic device 100. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 can be used to cover one or more communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization.
  • antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves by the antenna 1, and perform processing such as filtering, amplifying and transmitting the received electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, etc.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation, amplify it, and convert it into electromagnetic wave radiation via the antenna 2.
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the Beidou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or the satellite-based augmentation systems (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, connected to the display 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, etc.
  • the display screen 194 includes a display panel.
  • the display panel can adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Miniled, a MicroLed, a Micro-oLed, a quantum dot light-emitting diode (QLED), etc.
  • the electronic device 100 may include one or N display screens 194, and N is a positive integer greater than one.
  • the electronic device 100 can implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP is used to process the data fed back from the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transfers the electrical signal to the ISP for processing and is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats.
  • the electronic device 100 may include 1 or N cameras 193, and N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects the frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • the NPU can realize applications such as intelligent cognition of the electronic device 100, such as image recognition, face recognition, voice recognition, text understanding, and so on.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, at least one application program (such as a sound playback function, an image playback function, etc.) required by at least one function.
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), etc.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by running instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • the microphone 170C converts the collected sound signal into an electrical signal, which is received by the audio module 170 and then converted into an audio signal.
  • the audio module can convert the audio signal into an electrical signal, which is received by the speaker 170A and converted into a sound signal for output.
  • the speaker 170A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • when the electronic device 100 answers a call or a voice message, the voice can be heard by bringing the receiver 170B close to the human ear.
  • the microphone 170C, also called a "mic" or "voice transmitter", is used to convert sound signals into electrical signals.
  • when making a sound, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals. In some other embodiments, the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
  • the earphone interface 170D may be a USB interface 130, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be provided on the display screen 194. For example, a capacitive pressure sensor may include at least two parallel plates with conductive material.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations that act on the same touch location but have different touch operation strengths may correspond to different operation instructions.
  • the gyro sensor 180B may be used to determine the movement posture of the electronic device 100.
  • the angular velocity of the electronic device 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shake of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can use the magnetic sensor 180D to detect the opening and closing of a flip holster. In some embodiments, when the electronic device 100 is a flip phone, it can detect the opening and closing of the flip according to the magnetic sensor 180D, and then set features such as automatic unlocking of the flip cover according to the detected state.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and used in applications such as horizontal and vertical screen switching, pedometers and so on.
  • the distance sensor 180F is used to measure distance. The electronic device 100 can measure distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F to measure distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • the electronic device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 can determine that there is no object near the electronic device 100.
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense the brightness of the ambient light.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived brightness of the ambient light.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, etc.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 executes to reduce the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • in some other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 caused by the low temperature.
  • in some other embodiments, when the temperature is lower than still another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by the low temperature.
  • the touch sensor 180K is also called a "touch device".
  • the touch sensor 180K may be disposed on the display screen 194, and the touch screen is composed of the touch sensor 180K and the display screen 194, which is also called a “touch screen”.
  • the touch sensor 180K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100, which is different from the position of the display screen 194.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can obtain the vibration signal of the bone mass vibrating with the human voice.
  • the bone conduction sensor 180M can also contact the human pulse and receive the blood pressure pulse signal.
  • the bone conduction sensor 180M may also be provided in an earphone, combined into a bone conduction earphone.
  • the audio module 170 can parse the voice signal based on the vibration signal of the vibrating bone block of the voice obtained by the bone conduction sensor 180M, and realize the voice function.
  • the application processor may analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the heart rate detection function.
  • the button 190 includes a power button, a volume button, and so on.
  • the button 190 may be a mechanical button. It can also be a touch button.
  • the electronic device 100 may receive key input, and generate key signal input related to user settings and function control of the electronic device 100.
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations applied to different applications can correspond to different vibration feedback effects.
  • touch operations acting on different areas of the display screen 194 can also correspond to different vibration feedback effects of the motor 191.
  • different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 195 is used to connect to the SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact and separation with the electronic device 100.
  • the electronic device 100 may support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 195 can insert multiple cards at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 may also be compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present application takes a layered Android system as an example to illustrate the software structure of the electronic device 100.
  • FIG. 3 is a block diagram of the software structure of the electronic device 100 according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, etc.
  • the application framework layer provides application programming interfaces (application programming interface, API) and programming frameworks for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include a window manager, a content provider, a view system, a phone manager, a resource manager, and a notification manager.
  • the window manager is used to manage window programs.
  • the window manager can obtain the size of the display, determine whether there is a status bar, lock the screen, take a screenshot, etc.
  • the content provider is used to store and retrieve data and make these data accessible to applications.
  • the data may include video, image, audio, phone calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls that display text and controls that display pictures.
  • the view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface that includes a short message notification icon may include a view that displays text and a view that displays pictures.
  • the phone manager is used to provide the communication function of the electronic device 100. For example, the management of the call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, etc.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and it can disappear automatically after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • the notification manager can also present notifications that appear in the status bar at the top of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt sound is played, the electronic device vibrates, or the indicator light flashes.
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), three-dimensional graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides a combination of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support multiple audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • when the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes touch operations into original input events (including touch coordinates, time stamps of touch operations, etc.).
  • the original input events are stored in the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a single-click operation and the control corresponding to the click operation being the control of the camera application icon as an example, the camera application calls the interface of the application framework layer to start the camera application, and then starts the camera driver by calling the kernel layer.
  • the camera 193 captures still images or videos.
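The layered event flow described above (hardware interrupt, kernel raw input event, framework control lookup, application start) can be sketched as follows. This is a minimal illustration of the layering only; all class and function names are hypothetical, not the actual Android implementation.

```python
# Hypothetical sketch of the input-event flow: kernel layer produces a raw
# input event; the framework layer maps its coordinates to a control; the
# application bound to that control is then started.

class RawInputEvent:
    """Original input event produced by the kernel layer."""
    def __init__(self, x, y, timestamp):
        self.x = x
        self.y = y
        self.timestamp = timestamp

class FrameworkLayer:
    """Maps touch coordinates to the control (e.g. an app icon) under them."""
    def __init__(self, controls):
        # controls: list of (name, x0, y0, x1, y1) hit rectangles
        self.controls = controls

    def identify_control(self, event):
        for name, x0, y0, x1, y1 in self.controls:
            if x0 <= event.x <= x1 and y0 <= event.y <= y1:
                return name
        return None

def dispatch(framework, event, launchers):
    """Start the application whose control the click landed on."""
    control = framework.identify_control(event)
    if control in launchers:
        return launchers[control](event)
    return "ignored"

framework = FrameworkLayer([("camera_icon", 0, 0, 100, 100)])
launchers = {"camera_icon": lambda e: "camera driver started"}
result = dispatch(framework, RawInputEvent(50, 60, timestamp=1000), launchers)
```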
  • FIG. 4 is a schematic flowchart of a method for processing call content according to an embodiment of the application. As shown in Figure 4, the method includes:
  • Step 401 When the electronic device is in a call connection state, the electronic device receives a first input.
  • the electronic device being in the call connection state means that the electronic device has established a call connection with other electronic devices.
  • the call may include: a voice call or a video call.
  • the first user uses an electronic device (eg, the first electronic device) to make a video call with other electronic devices (eg, the second electronic device) used by the second user.
  • the electronic device is in a call connection state with other electronic devices.
  • the electronic device displays a call interface (for example, interface 501).
  • the call interface may include a call service activation button (eg, button 5011) for starting the call service.
  • the electronic device receives the user's first input.
  • the first input is: the call service activation button (eg, button 5011) is touched.
  • the first input includes, but is not limited to, the foregoing manner.
  • the first input may be a signal sent by the other device for instructing the electronic device to start a call service.
  • the Bluetooth headset sends a signal to the electronic device to instruct the electronic device to start the call service.
  • the first input may be: unfolding or folding the electronic device.
  • the electronic device is a foldable mobile phone.
  • Step 402 In response to the first input, the electronic device starts a call service.
  • the call service refers to providing at least one key information required by the user by processing the content of the call.
  • the call service includes, but is not limited to: uncommon word interpretation, sentence supplement, voice error correction, grammatical error correction, schedule management, translation, weather query, and navigation.
  • the electronic device in response to the first input (button 5011 is touched), the electronic device initiates a call service.
  • the electronic device when starting the call service or in the process of starting the call service, the electronic device may display a notification message 5021 for notifying that the electronic device is starting the call service.
  • Step 403 The electronic device obtains at least one key information provided by the call service.
  • step 403 may specifically include steps 4031-4039:
  • Step 4031 The electronic device sends the first data to the first server.
  • the first data may include: first voice data and/or second voice data.
  • the application scenario shown in Figure 5(c) is: after the electronic device starts the call service, the second user asks, "Are there any good movies recently?"; the first user replies, "I heard that movie A (the name of a movie) is good".
  • the mobile communication module 150 of the electronic device receives the second voice data sent by the other device (e.g., the second electronic device). After the second voice data is converted into an electrical signal by the audio module 170, the electrical signal is converted by the speaker 170A into a sound signal for output (for example, "Are there any good movies recently?"). The electronic device may send the second voice data to the first server.
  • the first user approaches the microphone 170C to speak, and the first user's voice signal (for example, "I heard that movie A is good") is converted into an electrical signal by the microphone 170C, and then converted into the first voice data by the audio module 170.
  • the first voice data is voice data generated by the electronic device by collecting external sounds.
  • the electronic device sends the first voice data to the first server.
  • the first voice data corresponds to the content said by the first user
  • the second voice data corresponds to the content said by the second user. That is, the electronic device may send the content of the call between the first user and the second user (that is, the content of the call between the first electronic device and the second electronic device) to the first server.
  • Step 4032 The first server receives the first data sent by the electronic device.
  • Step 4033 The first server generates second data according to the first data.
  • the generating of the second data according to the first data may include the following steps 40331 to 40334:
  • Step 40331 The first server converts the voice data in the first data into text data.
  • for example, the electronic device sends the second voice data to the first server, and the first server converts the voice data into text data, obtaining the second text data: "Are there any good movies recently?".
  • the electronic device sends the first voice data to the first server, and the first server converts the voice data into text data, obtaining the first text data: "I heard that movie A is good!".
  • Step 40332 The first server extracts keywords in the text data.
  • the keyword extracted by the first server may be: “movies”.
  • the keyword extracted by the first server may be: “movie A”.
  • Step 40333 The first server determines the second server.
  • the second server is used to provide key information related to the extracted keywords.
  • the first server may determine the second server according to the extracted keywords.
  • the first server may store one or more lists. As shown in Table 1, the list includes one or more keywords and the names of one or more servers corresponding to the keywords.
  • the first server matches the extracted keywords with the keywords in the list, and determines that the second server is the server corresponding to the keywords in the list that matches the extracted keywords.
  • the second server can be one server or multiple servers.
  • the first server determines that the second server is "Fandango”.
  • the first server may determine the second server according to the user's setting or the user's usage habits.
  • the keywords extracted by the first server are "buy”, “mobile phone” and “Huawei P30Pro”.
  • the servers corresponding to the above keywords are “Amazon” and “Taobao”.
  • the first server may determine that the second server is "Taobao”.
  • the server list also includes semantic types.
  • the first server may determine the server corresponding to the semantic type according to the semantic type.
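The server-selection logic of step 40333 might behave as in the following sketch: extracted keywords are matched against a stored keyword-to-server list, and when several servers match, the user's setting or usage habit is used as the tie-break. The table contents and the tie-break rule are illustrative assumptions based on the examples above; Table 1 itself is not reproduced in this text.

```python
# Illustrative sketch of step 40333: match extracted keywords against a
# keyword -> server list, then narrow multiple candidates using a user
# preference (an assumed tie-break rule).

SERVER_LIST = {
    "movie": ["Fandango"],
    "movie A": ["Fandango"],
    "buy": ["Amazon", "Taobao"],
    "mobile phone": ["Amazon", "Taobao"],
}

def determine_second_server(keywords, user_preference=None):
    candidates = []
    for kw in keywords:
        for server in SERVER_LIST.get(kw, []):
            if server not in candidates:
                candidates.append(server)
    if user_preference in candidates:
        return user_preference  # the user's setting or usage habit wins
    return candidates[0] if candidates else None
```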
  • Step 40334 The first server generates second data.
  • the second data may include keywords.
  • the second data may be data in JavaScript Object Notation (JSON) format.
  • JSON is a data format derived from JavaScript object syntax that provides a simple way to describe complex objects.
  • the form of the second data may be different according to the second server.
  • the second data may be: information: ⁇ "type”: “mobile phone”; name: “Huawei P30Pro” " ⁇ .
  • the second data may be: information: {"name": "movie A"}.
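The JSON payloads of step 40334 could be assembled as in this sketch. The field names mirror the two examples above; the selection logic is otherwise an assumption.

```python
import json

# Sketch of step 40334: pack the extracted keywords into JSON-format
# "second data". Field names follow the two examples in the text; the
# branching rule is an illustrative assumption.

def build_second_data(keywords):
    if "mobile phone" in keywords and "Huawei P30Pro" in keywords:
        info = {"type": "mobile phone", "name": "Huawei P30Pro"}
    else:
        info = {"name": keywords[0]}
    return json.dumps({"information": info})

payload = build_second_data(["movie A"])
```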
  • Step 4034 The first server sends a first processing request to the second server.
  • the first processing request includes the second data.
  • the first processing request is used to instruct the second server to provide information related to the second data.
  • Step 4035 The second server receives the first processing request sent by the first server.
  • Step 4036 In response to the first processing request, the second server sends third data to the first server.
  • the third data may include links, such as a uniform resource locator (URL).
  • the electronic device can obtain information related to the keywords through the link.
  • the information may include: product information, plot introduction, movie reviews, full text of poems, maps, singer introduction, news, hotel rankings, etc.
  • for example, the third data may include a web link to the Taobao webpage. When the electronic device accesses the webpage through the web link, it can obtain information related to the Huawei P30Pro.
  • the third data may include a web link to the content introduction page of movie A on the Fandango webpage, such as {"web": "www.fandango.com/movie-a/movie-overview"}.
  • the electronic device accesses a webpage through the webpage link, it can obtain information related to Movie A.
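The second server's side of step 4036 can be pictured as a simple lookup that maps the keyword in the second data to a web link and returns it as "third data". This is a hypothetical illustration; the lookup table and function names are assumptions, and the URL is the one quoted in the text, not a verified endpoint.

```python
# Hypothetical sketch of step 4036 on the second server: map the keyword
# in the second data to a content-page link and return it as "third data".

CONTENT_PAGES = {
    "movie A": "www.fandango.com/movie-a/movie-overview",
}

def handle_first_processing_request(second_data):
    name = second_data["information"]["name"]
    link = CONTENT_PAGES.get(name)
    if link is None:
        return None  # no information available for this keyword
    return {"web": link}

third_data = handle_first_processing_request({"information": {"name": "movie A"}})
```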
  • Step 4037 The first server receives the third data sent by the second server.
  • Step 4038 The first server sends the fourth data to the electronic device.
  • the fourth data may include: keywords, text data and/or links.
  • the fourth data may further include: the name of the second server.
  • the fourth data may be:
  • Step 4039 The electronic device receives the fourth data sent by the first server.
  • the electronic device can acquire at least one piece of key information about the content of the call between the first electronic device and the second electronic device. It is understandable that, at this time, the at least one piece of key information provided by the call service is the fourth data.
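Combining the pieces above, the fourth data of step 4038 might bundle the text data, keyword, web link, and server name as follows. The exact structure is an assumption, since the text leaves the form of the fourth data open.

```python
# Sketch of step 4038: the first server bundles text data, keyword, web
# link, and server name into the "fourth data" sent to the electronic
# device. The field layout is an illustrative assumption.

def build_fourth_data(text, keyword, link, server_name):
    return {
        "text": text,
        "keyword": keyword,
        "web": link,
        "server": server_name,
    }

fourth_data = build_fourth_data(
    text="Are there any good movies recently?",
    keyword="movie A",
    link="www.fandango.com/movie-a/movie-overview",
    server_name="Fandango",
)
```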
  • Step 404 The electronic device displays a call service interface according to the at least one key information.
  • the electronic device displays an interface 503.
  • the interface 503 includes a call interface (e.g., interface 5036) and a call service interface (e.g., interface 5037).
  • the electronic device displays the call interface in the display area 5031, and displays the call service interface in the display area 5032.
  • the electronic device can generate a text label, a keyword label and/or an information label according to at least one key information provided by the call service.
  • the call service interface may include text labels (for example, label 5033 "Are there any good movies recently?"), keyword labels (for example, label 5034 "Movie A"), and/or information labels (for example, label 5035 "Movie Introduction", "Reservation").
  • the text label is used to display text data.
  • the keyword tag is used to display keywords.
  • the information label is associated with the link.
  • the information tag may be located near a text tag or keyword tag related to it.
  • the electronic device may receive a third input of the user to the information tag.
  • the third input may be: the user touches the information label.
  • the electronic device can access the webpage associated with the information tag through the browser kernel.
  • the electronic device can display the web page (e.g., the introduction page of movie A, the booking page of movie A) in a web view.
  • the electronic device may launch a browser application to access the webpage associated with the information tag.
  • the electronic device may display the application interface of the browser application in the second display area.
  • the call service interface may also include a website label.
  • the website tag is used to indicate the server corresponding to the web page link, such as Taobao, Amazon, etc.
  • the application tag or website tag may be located near the information tag related thereto.
  • the electronic device can distinguish and display labels of different categories by different colors, shapes, sizes, etc.
  • Step 405 The electronic device determines whether the second input is received.
  • if the electronic device receives the second input, the electronic device executes step 407; if the electronic device does not receive the second input from the first user, the electronic device executes step 406.
  • the second input is used to instruct the electronic device to end the call service.
  • the electronic device ends the call service.
  • the second input may be: the user long presses the power button.
  • the second input includes, but is not limited to, the above methods.
  • the second input may be: unfolding or folding the electronic device.
  • the electronic device is a foldable mobile phone, and when the angle α formed by the first part of the electronic device and the second part of the electronic device is less than a predetermined threshold, the electronic device ends the call service.
  • the second input may be a signal sent by the other device to instruct the electronic device to close the call service.
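The fold-angle form of the second input, in which the call service ends when the angle α between the two parts of a foldable phone drops below a predetermined threshold, can be sketched as follows. The 60-degree threshold is an arbitrary illustrative value, not one given in the text.

```python
# Sketch of the fold-angle second input: when the angle between the two
# halves of a foldable phone falls below a threshold, treat it as an
# instruction to end the call service.
# The 60-degree threshold is an assumed, illustrative value.

FOLD_THRESHOLD_DEG = 60

def is_end_call_service_input(fold_angle_deg):
    return fold_angle_deg < FOLD_THRESHOLD_DEG
```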
  • Step 406 The electronic device judges whether the call ends.
  • if the call has ended, step 407 is executed; if the electronic device is still in the call state, step 403 is executed.
  • Step 407 The electronic device ends the call service.
  • when the first user or the second user hangs up the phone, the first electronic device ends the call connection with the second electronic device.
  • the first electronic device determines that the call is over and ends the call service.
  • the electronic device displays an interface 504.
  • after the electronic device ends the call service, the electronic device stops acquiring the key information.
  • the method for processing call content can provide at least one piece of key information related to the call content by processing the call content in real time through a server.
  • the first user can obtain key information related to the content of the call while talking with the second user.
  • users can also quickly access web pages related to the call content by touching the information label.
  • the first display area 5031 for displaying the call interface may be smaller, and the second display area 5032 for displaying the call service interface may be larger.
  • the call interface may be displayed as a floating icon (for example, icon 701).
  • the electronic device repeatedly executes step 403 and step 404, and the electronic device generates one label after another.
  • the first user conducts a video call with the second user.
  • the electronic device repeatedly performs step 403 and step 404 to sequentially generate tags A-1, B-1, A-2, B-2, B-3, A-3, B-4, and B-5.
  • A-1, A-2 and A-3 are related to what the first user said.
  • B-1, B-2, B-3, B-4, and B-5 are related to what the second user said.
  • the electronic device can arrange multiple tags in chronological order.
  • the new label (for example, label A-3) is located under the old label (for example, label B-3).
  • the electronic device can arrange the tags according to users.
  • the tags related to the second user's conversation (for example, tags B-1, B-2, and B-3) are arranged on the left; the tags related to the first user's conversation (for example, tags A-1, A-2, and A-3) are arranged on the right.
  • the electronic device can scroll to display labels.
  • the old label disappears from above, and the new label appears from below, so that the user can view the labels related to the current conversation.
  • the electronic device can set the scrolling speed of the label according to the speech rate of the call, or the user can also specify the scrolling speed of the label.
  • the electronic device may stop scrolling to display the one or more related labels.
  • the electronic device may stop scrolling display.
  • the electronic device scrolls quickly to display tags related to the current conversation.
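The chronological, scrolling label display described above can be pictured as a bounded view over a growing list: new labels appear from below and the oldest scroll off the top. A minimal sketch, with the number of visible rows as an assumed screen capacity:

```python
# Minimal sketch of the scrolling label area: labels are kept in
# chronological order; only the most recent ones stay visible, so old
# labels "disappear from above" as new ones "appear from below".
# VISIBLE_ROWS is an assumed screen capacity.

VISIBLE_ROWS = 4

class LabelArea:
    def __init__(self):
        self.labels = []  # oldest first

    def add(self, label):
        self.labels.append(label)

    def visible(self):
        return self.labels[-VISIBLE_ROWS:]

area = LabelArea()
for tag in ["A-1", "B-1", "A-2", "B-2", "B-3", "A-3"]:
    area.add(tag)
```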
  • the electronic device may determine the arrangement direction and scroll direction of the tags according to the number of users participating in the call.
  • the electronic device sequentially generates tags E-1, C-1, D-1, E-2, and C-2.
  • C-1 and C-2 are related to the content said by the first user;
  • D-1 is related to the content said by the second user;
  • E-1 and E-2 are related to the content said by the third user.
  • the electronic device can display the labels of the first user in the upper third of the area, the labels of the second user in the middle third, and the labels of the third user in the lower third.
  • the new label is to the right of the old label.
  • the old label disappears from the left, and the new label appears from the right.
  • the new label can also be located to the left of the old label; the old label disappears from the right, and the new label appears from the left.
  • the label styles and label display manners in the embodiments of the present application include, but are not limited to, the manners described in the above embodiments.
  • the label style can be a round bubble.
  • the electronic device can determine the display position of the tag. Exemplarily, as shown in FIG. 9, when "movie A" is mentioned multiple times in this call, the electronic device may display a label related to movie A in the middle. For another example, the electronic device can determine the size of the label. Exemplarily, as shown in FIG. 9, when the frequency of mentioning "movie A" in this call is higher than the frequency of mentioning "movie B", the size of the tag associated with movie A determined by the electronic device is greater than the size of the tags related to movie B.
  • the electronic device can also determine the display attributes such as the shape, transparency or color of the label.
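The frequency-based sizing and positioning described above can be sketched as follows; the function name and the specific size formula are illustrative assumptions:

```python
from collections import Counter

def label_display(mentions, base_size=12, step=4):
    """Give each keyword a size that grows with how often it was mentioned
    in the call, and flag the most frequently mentioned keyword for
    display in the middle of the area."""
    freq = Counter(mentions)
    most_common = freq.most_common(1)[0][0]
    return {kw: {"size": base_size + step * (count - 1),
                 "position": "middle" if kw == most_common else "edge"}
            for kw, count in freq.items()}
```

For the FIG. 9 example, "movie A" mentioned twice and "movie B" mentioned once yields a larger, centered label for movie A.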
  • keywords, text data, and web links are taken above as examples of the key information of the call content. It is understandable that, as shown in FIG. 10, the key information includes but is not limited to these.
  • FIG. 10 is a schematic diagram of a scene of a method for processing call content according to an embodiment of the application.
  • when the system language of the electronic device is the first language (for example, Chinese) and the call language is the second language (for example, English), the first server generates second data (for example, text data: "What's up?") according to the first data (for example, voice data: "What's up").
  • the first server may send a first processing request to the second server, the first processing request including the second data.
  • the first processing request may be used to instruct the second server to provide the first language translation of the second data (for example, the Chinese translation of "What's up?").
  • the second server sends third data (for example, text data: "How are you doing?") to the first server.
  • the first server sends fourth data to the electronic device, where the fourth data includes third data and second data.
  • according to the fourth data, the electronic device generates a call service interface (for example, displaying the text data "What's up?" and "How are you doing?").
  • the call content processing method in the embodiment of the present application includes, but is not limited to, providing data related to the second data by another server different from the first server.
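The first-server flow in FIG. 10(a) can be sketched as follows. The `speech_to_text` and `translate` callables stand in for the recognition service and the second server's translation service; both names are assumptions for illustration:

```python
def first_server_handle(first_data, speech_to_text, translate):
    """Convert the first data (voice) to second data (text), obtain the
    third data (its first-language rendering) via a second server, and
    return fourth data combining both for the call service interface."""
    second_data = speech_to_text(first_data)   # e.g. "What's up?"
    third_data = translate(second_data)        # e.g. "How are you doing?"
    return {"original": second_data, "translation": third_data}
```

The electronic device then renders both strings of the fourth data on the call service interface, as in the example above.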
  • the application scenario shown in Figure 10(b) is: Jack said: "Twinkle, twinkle, little star”.
  • the first server can provide sentence supplement services.
  • the first server may include a lyrics library.
  • the first server matches the second data (text data: "Twinkle, twinkle, little star") with the lyrics in the lyrics library to obtain data related to the second data.
  • the data related to the second data may be: the subsequent lyrics of the lyrics involved in the text data, for example, "How I Wonder What You Are”.
  • the first server may include a poetry dictionary and/or a famous saying and sentence library to provide various sentence supplement services.
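The lyric-matching service can be sketched like this; the library contents, normalization, and function names are illustrative assumptions:

```python
# A stand-in for the first server's lyrics library: song -> ordered lines.
LYRICS_LIBRARY = {
    "Twinkle Twinkle Little Star": [
        "Twinkle, twinkle, little star",
        "How I wonder what you are",
    ],
}

def next_lyric(second_data):
    """Match the recognized text against the lyrics library and return
    the subsequent line, or None if nothing matches."""
    normalized = second_data.lower().rstrip("?.!,")
    for lines in LYRICS_LIBRARY.values():
        for i, line in enumerate(lines[:-1]):
            if line.lower() == normalized:
                return lines[i + 1]
    return None
```

A poetry dictionary or famous-sayings library would plug into the same lookup, returning the continuation of whatever line was recognized.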
  • the application scenario shown in Figure 10(c) is: Jack said: "My major is mechanical engineering," but Jack mistakenly pronounced the character 械 (xiè) in the Chinese word for "mechanical" (机械) as "jiè".
  • the first server can provide a voice error correction service, sending the correct phonetic notation (xiè) of the character in "mechanical" to the electronic device.
  • the electronic device may also provide data related to the second data.
  • the application scenario shown in Figure 10(d) is: Tom asks "Are you free in the afternoon of April 3?"
  • the first server generates second data (for example, text data: "Are you free in the afternoon on April 3?”) based on the first data (for example, voice data: "Are you free in the afternoon on April 3?").
  • the first server sends the above-mentioned second data to the electronic device.
  • the internal memory 121 of the electronic device includes the user's schedule.
  • the electronic device acquires and displays data related to the second data according to the second data (for example, Jack's itinerary for the afternoon of April 3).
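The schedule lookup in FIG. 10(d) amounts to a simple query against data already stored in the internal memory 121. The structure below, including the example date and entry, is an illustrative assumption:

```python
import datetime

# A stand-in for the schedule stored in the device's internal memory.
SCHEDULE = {
    datetime.date(2020, 4, 3): ["15:00 project review"],
}

def itinerary_for(date):
    """Return the user's itinerary for the date mentioned in the call;
    an empty list means that day is free."""
    return SCHEDULE.get(date, [])
```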
  • the above description takes, as an example, the case where the first data sent by the electronic device to the first server is voice data and the first server processes the voice data to generate the second data.
  • alternatively, in the call content processing method of the embodiments of the present application, the electronic device may perform the conversion of voice data to text data (step 40331) and the extraction of keywords (step 40332), and send the keywords as the first data to the first server.
  • the electronic device sends the user's voice data and the voice data of other users to the first server.
  • the electronic device (such as the first electronic device) may send the voice data of the first user to the first server, and the electronic devices of other users (such as the second electronic device) may send the voice data of the second user to the first server.
  • the electronic device may record the webpage links of each webpage visited by the user.
  • the electronic device can start a browser application and jump to the web page displayed when the call service ends according to the recorded web page link.
  • FIG. 11 is a schematic diagram of a scene of a method for processing call content according to an embodiment of the application.
  • the user touches the first information label (for example, the information label "Book Ticket").
  • the first information tag is associated with the first web page (for example, the movie theater selection page under the booking page of movie A).
  • the electronic device accesses the first webpage through the browser kernel, and the electronic device displays the first webpage in a network view.
  • the first webpage may include one or more webpage tags. Similar to browsing a webpage in a browser application, the user can operate on the webpage tags in the webpage to access more webpages.
  • the first webpage includes a first webpage tag (e.g., the webpage tag "ABC Cinema"), and the first webpage tag is associated with a second webpage (e.g., the time selection page of ABC cinema under the booking page of movie A).
  • the electronic device receives the user's fourth input to the first webpage.
  • the fourth input may be: touching the first webpage tag in the first webpage.
  • the electronic device accesses the second webpage through the browser kernel, and displays the second webpage in a webpage view.
  • the second webpage may also include one or more webpage tags, and the user can continue to operate the one or more webpage tags in the second webpage to access the one or more webpages associated with those tags.
  • the electronic device can record the webpage link of the webpage.
  • as shown in Table 2, the electronic device may store one or more historical access lists to record the web links of one or more web pages visited by the user.
  • as shown in FIG. 11(d) and FIG. 11(f), when the call ends, the electronic device starts a browser application and displays the second webpage according to the most recently recorded webpage link of the second webpage.
  • the electronic device may display a notification message to remind the user that the electronic device is closing the call service, and ask the user whether to continue access.
  • the electronic device can start the browser application and jump to the webpage displayed at the end of the call.
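The record-and-replay behaviour described above (the historical access list of Table 2 and the jump at the end of the call) can be sketched as follows; the class name and the example URLs are illustrative assumptions:

```python
class HistoryList:
    """Historical access list: records the webpage link of each page
    visited during the call service, so that a browser can be opened on
    the most recent one when the call ends."""

    def __init__(self):
        self._links = []

    def record(self, url):
        self._links.append(url)

    def last_link(self):
        # The webpage link last visited before the call service ended.
        return self._links[-1] if self._links else None
```

Recording the cinema-selection page and then the time-selection page leaves `last_link()` pointing at the time-selection page, which is the page the browser is asked to display after the call.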
  • the electronic device may not perform the above steps.
  • Figure 11(a) to Figure 11(f) illustrate another method for processing call content provided in the embodiments of the application.
  • the browser application is launched after the call ends and jumps to the web page displayed at the end of the call, so that the user can continue browsing the web coherently after the call ends, which improves the user experience.
  • the applications may include installation-free applications such as quick apps and mini programs.
  • the electronic device starts the application and jumps to the application interface corresponding to the webpage last visited by the user.
  • the method may further include:
  • Step 1301 the electronic device obtains the latest recorded webpage link.
  • the electronic device accesses the first webpage through the browser kernel, and the user can operate on the webpage tags in the webpage to access other webpages.
  • the newly recorded webpage link refers to the webpage link last visited by the user before ending the call service.
  • Step 1302 The electronic device determines whether an application related to the recently recorded webpage link is installed.
  • the electronic device can determine the related application according to the webpage link.
  • the application related to the webpage link may be a Taobao application.
  • if the application corresponding to the webpage is installed on the electronic device, execute step 1304; if the application is not installed, execute step 1303.
  • Step 1303 The electronic device starts the browser application, and jumps to the corresponding webpage according to the webpage link.
  • Step 1304 The electronic device sends the webpage link to the first server.
  • Step 1305 The first server receives the webpage link sent by the electronic device.
  • Step 1306 The first server sends a second processing request to the second server, where the second processing request includes the webpage link.
  • the second processing request is used to instruct the second server to return an application link corresponding to the webpage link.
  • Step 1307 The second server receives the second processing request sent by the first server.
  • Step 1308 In response to the second processing request, the second server sends an application link to the first server.
  • Step 1309 The first server receives the application link sent by the second server.
  • Step 1310 The first server sends the application link to the electronic device.
  • Step 1311. The electronic device receives the application link sent by the first server.
  • Step 1312. The electronic device starts the application, and jumps to the corresponding application interface according to the application link.
  • the electronic device can obtain the application link corresponding to the webpage link, and jump to the application interface corresponding to the webpage last visited by the user, so that the user can View related information through the app.
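Steps 1301-1312 reduce to the dispatch below. Here `is_installed` and `resolve_app_link` stand in for the installation check and the first/second-server round trip; both names and the example links are assumptions for illustration:

```python
def open_after_call(web_link, is_installed, resolve_app_link):
    """If an application related to the most recently recorded webpage
    link is installed, resolve the link to an application link and open
    the app (steps 1304-1312); otherwise open the browser (step 1303)."""
    if is_installed(web_link):
        app_link = resolve_app_link(web_link)  # server round trip, steps 1304-1311
        return ("app", app_link)               # step 1312
    return ("browser", web_link)               # step 1303
```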
  • the electronic device may directly send the second processing request to the second server without going through the first server, and the second server may directly send the application link to the electronic device.
  • the electronic device may directly send a webpage link to the first server without going through the second server, and the first server may directly send the application link to the electronic device.
  • the first server stores a comparison table, and the comparison table includes a web page link and a link address of an application interface corresponding to the web page link, that is, an application link. The first server determines the application link corresponding to the webpage link according to the comparison table, and sends the application link to the electronic device.
  • the electronic device may also determine the corresponding application link according to the web page link.
  • the link in the third data includes a webpage link of a webpage and an application link of an application interface corresponding to the webpage link.
  • the electronic device may determine the application link corresponding to the recently recorded webpage link according to the recently recorded webpage link, the webpage link and the application link included in the third data.
  • the third data is: {"web": "www.amazon.com/phones/huawei/p30-pro/", "app": "amazon://phones/huawei/p30-pro/"}.
  • the electronic device compares the most recently recorded webpage link (for example: www.amazon.com/phones/huawei/p30-pro/spec/) with the webpage link in the third data (www.amazon.com/phones/huawei/p30-pro/), and appends the additional field (spec/) as a suffix to the application link in the third data (amazon://phones/huawei/p30-pro/) to obtain the application link corresponding to the most recently recorded webpage link (amazon://phones/huawei/p30-pro/spec/). It should be noted that when this method is adopted, the webpage link and the application link need to maintain the same or similar suffix form.
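Using the document's own Amazon example, the suffix-append rule can be sketched as:

```python
def app_link_for(recorded_web_link, third_data):
    """Append the extra field of the recorded webpage link (relative to
    the webpage link in the third data) to the application link in the
    third data. Assumes the webpage link and the application link keep
    the same suffix form, as noted above."""
    web_base = third_data["web"]
    if not recorded_web_link.startswith(web_base):
        return None  # the recorded link does not belong to this mapping
    extra_field = recorded_web_link[len(web_base):]  # e.g. "spec/"
    return third_data["app"] + extra_field
```

With the third data {"web": "www.amazon.com/phones/huawei/p30-pro/", "app": "amazon://phones/huawei/p30-pro/"}, a recorded link ending in spec/ resolves to amazon://phones/huawei/p30-pro/spec/.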
  • the electronic device can start multiple corresponding applications and jump to the corresponding application interface. Or the electronic device can start a browser to display corresponding multiple webpage interfaces through multiple windows. Or the electronic device can also start the corresponding application and the browser to display the corresponding interface respectively.
  • an exemplary description will be given below in conjunction with the application scenario shown in FIG. 14.
  • the electronic device displays a call service interface 1410.
  • the user touches the first information tag (the information tag "book ticket"), and the first information tag is associated with the first web page (the movie theater selection page under the booking page of movie A).
  • the electronic device accesses the first webpage through the browser kernel, and the electronic device displays the webpage interface 1411 of the first webpage.
  • the electronic device receives the fourth input.
  • the fourth input may be: touching the first webpage tag in the first webpage.
  • in response to a touch operation on the first webpage label (the webpage label "ABC Cinema") in the first webpage, the electronic device accesses the second webpage (the time selection page under the booking page of movie A) through the browser kernel, and displays the web interface 1412 of the second webpage.
  • the electronic device may also display a call service home button 1401, a return button 1402 and/or a delete button 1403.
  • the return button is used to instruct the electronic device to display the webpage the user visited previously.
  • when the electronic device displays the second webpage 1412 and the user touches the return button 1402, in response to the touch operation on the return button, the electronic device displays the previously visited webpage interface 1411.
  • the delete button is used to instruct the electronic device to end the display of the web page interface and display the call service interface.
  • in response to a touch operation on the delete button 1403, the electronic device ends the display of the web interface and displays the call service interface in the second display area.
  • the call service home button 1401 is used to instruct the electronic device to display the call service interface.
  • the electronic device displays the call service interface 1410 together with the web interface 1412 in the second display area.
  • the electronic device displays the third information tag in the second display area.
  • when, at the end of the call, the second display area includes the second webpage 1412 (the time selection page under the booking page of movie A) and the fourth webpage 1413 (the purchase page of the Mate20Pro), assuming the electronic device has installed the first application (Fandango) corresponding to the second webpage but has not installed the application (Amazon) corresponding to the fourth webpage, the electronic device starts the first application and jumps to the application interface 1414 corresponding to the second webpage, and starts the browser to jump to the webpage interface 1415 corresponding to the fourth webpage.
  • the embodiment of the application discloses an electronic device, including: a display screen; a processor; a memory; one or more sensors; an application program; a computer program and a communication module.
  • the above devices can be connected through one or more communication buses.
  • the one or more computer programs are stored in the foregoing memory and configured to be executed by the one or more processors; the one or more computer programs include instructions, and the instructions may be used to execute the various steps in the foregoing embodiments.
  • the foregoing processor may specifically be the processor 110 shown in FIG. 1
  • the foregoing memory may specifically be the internal memory and/or the external memory 120 shown in FIG. 1
  • the foregoing display screen may specifically be the display screen 194 shown in FIG. 1
  • the above-mentioned sensor may specifically be one or more sensors in the sensor module 180 shown in FIG. 1
  • the above-mentioned communication module may be the mobile communication module 150 and/or the wireless communication module 160 shown in FIG. 1; the embodiments of the present application do not impose any restrictions on this.
  • GUI graphical user interface
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from a website, computer, server, or data center.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or a data center integrated with one or more available media.
  • the usable medium can be a magnetic medium (for example, a floppy disk, a hard disk, and a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).


Abstract

Embodiments of the present application provide a method for processing call content and an electronic device. The method includes the following steps: when an electronic device is in a connected call state, the electronic device receives a first input; in response to the first input, the electronic device acquires at least one piece of key information of the call content; and according to the key information, the electronic device displays a first interface, the first interface comprising a label corresponding to the key information. In the embodiments of the present application, call content can be processed, and at least one piece of key information about the call content is provided.
PCT/CN2020/090956 2019-05-20 2020-05-19 Method for processing call content and electronic device WO2020233556A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910416825.4A CN111970401B (zh) 2019-05-20 2019-05-20 Call content processing method, electronic device, and storage medium
CN201910416825.4 2019-05-20

Publications (1)

Publication Number Publication Date
WO2020233556A1 true WO2020233556A1 (fr) 2020-11-26

Family

ID=73357796

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/090956 WO2020233556A1 (fr) 2019-05-20 2020-05-19 Procédé de traitement de contenu d'appel et dispositif électronique

Country Status (2)

Country Link
CN (1) CN111970401B (fr)
WO (1) WO2020233556A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113660375A (zh) * 2021-08-11 2021-11-16 Vivo Mobile Communication Co., Ltd. Call method and apparatus, and electronic device
CN115268736A (zh) * 2021-04-30 2022-11-01 Huawei Technologies Co., Ltd. Interface switching method and electronic device
CN115309312A (zh) * 2021-04-21 2022-11-08 Petal Cloud Technology Co., Ltd. Content display method and electronic device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113672152A (zh) * 2021-08-11 2021-11-19 Vivo Mobile Communication (Hangzhou) Co., Ltd. Display method and apparatus
CN113761881A (zh) * 2021-09-06 2021-12-07 Beijing Zitiao Network Technology Co., Ltd. Method and apparatus for recognizing misspelled words

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090232288A1 (en) * 2008-03-15 2009-09-17 Microsoft Corporation Appending Content To A Telephone Communication
CN105279202A (zh) * 2014-07-25 2016-01-27 ZTE Corporation Method and apparatus for retrieving information
CN105550235A (zh) * 2015-12-07 2016-05-04 Xiaomi Technology Co., Ltd. Information acquisition method and apparatus
CN106713628A (zh) * 2016-12-14 2017-05-24 Beijing Xiaomi Mobile Software Co., Ltd. Method and apparatus for querying information stored in a mobile terminal, and mobile terminal
CN106777320A (zh) * 2017-01-05 2017-05-31 Zhuhai Meizu Technology Co., Ltd. Call assistance method and apparatus
US20170230497A1 (en) * 2016-02-04 2017-08-10 Samsung Electronics Co., Ltd. Electronic device and method of voice command processing therefor
CN107547717A (zh) * 2017-08-01 2018-01-05 Lenovo (Beijing) Co., Ltd. Information processing method, electronic device, and computer storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103379013B (zh) * 2012-04-12 2016-03-09 Tencent Technology (Shenzhen) Co., Ltd. Method and system for providing geographic information based on instant messaging


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115309312A (zh) * 2021-04-21 2022-11-08 Petal Cloud Technology Co., Ltd. Content display method and electronic device
CN115268736A (zh) * 2021-04-30 2022-11-01 Huawei Technologies Co., Ltd. Interface switching method and electronic device
CN113660375A (zh) * 2021-08-11 2021-11-16 Vivo Mobile Communication Co., Ltd. Call method and apparatus, and electronic device
CN113660375B (zh) * 2021-08-11 2023-02-03 Vivo Mobile Communication Co., Ltd. Call method and apparatus, and electronic device

Also Published As

Publication number Publication date
CN111970401A (zh) 2020-11-20
CN111970401B (zh) 2022-04-05


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20810121

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20810121

Country of ref document: EP

Kind code of ref document: A1