WO2023029916A1 - Annotation display method and apparatus, terminal device and readable storage medium - Google Patents

Annotation display method and apparatus, terminal device and readable storage medium

Info

Publication number
WO2023029916A1
Authority
WO
WIPO (PCT)
Prior art keywords
annotation
information
media data
terminal device
response
Prior art date
Application number
PCT/CN2022/111468
Other languages
English (en)
Chinese (zh)
Inventor
肖冬
胡熙
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2023029916A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40 Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/44 Browsing; Visualisation therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/20 Natural language analysis
    • G06F40/205 Parsing
    • G06F40/221 Parsing markup language streams
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/20 Natural language analysis
    • G06F40/279 Recognition of textual entities
    • G06F40/284 Lexical analysis, e.g. tokenisation or collocates

Definitions

  • The present application relates to the field of terminal devices, and in particular, to an annotation display method and apparatus, a terminal device, and a readable storage medium.
  • Playing media data on a terminal device is a very common scenario.
  • When a user plays media data such as video or audio on a terminal device, the user may need to annotate key content in the media data.
  • The embodiments of the present application provide an annotation display method and apparatus, a terminal device, and a readable storage medium, which address the inconvenience and low viewing efficiency caused by having to manually search for the annotated media data segment when reviewing an annotation.
  • In a first aspect, an embodiment of the present application provides an annotation display method applied to a terminal device, including: in response to an annotation display operation, displaying annotation information and playback information, where the playback information is used to prompt playback of the media data corresponding to the annotation information, and the playback information is obtained when the terminal device annotates the media data in response to an annotation operation while playing the media data.
  • For example, the terminal device may be a mobile phone, a tablet computer, an augmented reality (AR)/virtual reality (VR) device, a large-screen device, a notebook computer, a netbook, a personal digital assistant (PDA), or the like.
  • In this embodiment of the present application, when the terminal device responds to the annotation display operation, it displays annotation information and playback information, where the playback information is used to prompt playback of the media data corresponding to the annotation information and is obtained when the terminal device annotates the media data in response to an annotation operation while playing the media data.
  • In some embodiments, annotating the media data in response to the annotation operation includes: generating the annotation information in response to the annotation operation when the terminal device is playing the media data.
  • The playback information of the media data corresponding to the annotation information is also generated, where the playback information includes the playback path of the media data and the playback time information of the media data at the time the annotation information is generated.
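To make the relationship between the two pieces of information concrete, the following minimal sketch models the annotation information and the playback information as plain Java classes. All class and field names here are hypothetical illustrations; the patent does not prescribe any particular representation.

```java
// Minimal sketch of the records described above; all names are hypothetical.
final class PlaybackInfo {
    final String playbackPath; // local storage path or URL of the media data
    final long startMs;        // annotation start time point, in milliseconds
    final long endMs;          // annotation end time point, in milliseconds

    PlaybackInfo(String playbackPath, long startMs, long endMs) {
        this.playbackPath = playbackPath;
        this.startMs = startMs;
        this.endMs = endMs;
    }
}

final class Annotation {
    final String text;           // the annotation information itself
    final PlaybackInfo playback; // how to locate and replay the annotated segment

    Annotation(String text, PlaybackInfo playback) {
        this.text = text;
        this.playback = playback;
    }
}
```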
  • In some embodiments, the annotation information is generated in response to the annotation operation as follows: an annotation data input interface is displayed in response to an annotation start operation; annotation data is received through the annotation data input interface; receiving stops in response to an annotation end operation; and the annotation information is generated according to the annotation data.
  • The annotation data input interface may be an interface of an application program that provides annotation, note, or memo functions, and the annotation data may be input by the user through a virtual keyboard of the terminal device, an input device connected to the terminal device, or a voice input device.
  • In some other embodiments, the annotation information is generated in response to the annotation operation as follows: the media data segment to be annotated is acquired in response to the annotation start operation and the annotation end operation; keywords in the media data segment are identified; and the annotation information of the media data segment is generated according to the keywords.
  • In this way, the media data segment can be annotated automatically, and the annotation information aligns more closely with the content of the media data segment, so users can understand the content of the annotated segment more accurately when viewing it, which improves the user experience.
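The patent does not specify how the keywords are identified. One plausible baseline, sketched below under that assumption, is to take a text transcript of the media data segment (for example, from a speech recognizer, not shown here) and keep the most frequent non-trivial tokens as the generated annotation text.

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

final class KeywordAnnotator {
    // Naive frequency-based keyword extraction over a transcript of the segment.
    static String annotationFromTranscript(String transcript, int topK) {
        Map<String, Long> counts = Arrays.stream(transcript.toLowerCase().split("\\W+"))
                .filter(w -> w.length() > 3) // crude stop-word filter
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
        return counts.entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue().reversed())
                .limit(topK)
                .map(Map.Entry::getKey)
                .collect(Collectors.joining(", "));
    }
}
```

Calling `annotationFromTranscript(transcript, 3)` on a lecture segment would yield a short comma-separated keyword list to use as the annotation text; a production system would replace the frequency heuristic with a proper keyword-extraction model.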
  • In some embodiments, acquiring the media data segment to be annotated includes: acquiring an annotation start time point in response to the annotation start operation; acquiring an annotation end time point in response to the annotation end operation; and acquiring the media data segment to be annotated according to the annotation start time point, the annotation end time point, and the playback path of the media data.
  • In some embodiments, generating the playback information of the media data corresponding to the annotation information includes: acquiring the playback time information of the media data according to the annotation start time point and the annotation end time point, and generating the playback information according to the playback path of the media data and the acquired playback time information.
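Continuing the sketch, the start/end flow just described can be captured by a small helper that records the annotation start time point when the start operation arrives and assembles the playback information when the end operation arrives. `MediaPlayerLike` and `Annotator` are hypothetical names, not APIs from the patent.

```java
// Hypothetical stand-in for the media player the terminal device is running.
interface MediaPlayerLike {
    long currentPositionMs();  // current playback position
    long totalDurationMs();    // total duration of the media data
    String playbackPath();     // storage path or URL of the media data
}

final class Annotator {
    private long annotationStartMs = -1;

    // Called in response to the annotation start operation.
    void onAnnotationStart(MediaPlayerLike player) {
        annotationStartMs = player.currentPositionMs();
    }

    // Called in response to the annotation end operation; builds the record.
    Annotation onAnnotationEnd(MediaPlayerLike player, String annotationText) {
        long annotationEndMs = player.currentPositionMs();
        PlaybackInfo info = new PlaybackInfo(
                player.playbackPath(), annotationStartMs, annotationEndMs);
        return new Annotation(annotationText, info);
    }
}
```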
  • In some embodiments, the playback interface in which the terminal device plays the media data includes a progress bar, and the method further includes: displaying an annotation mark corresponding to the annotation information on the progress bar according to the annotation start time point and the annotation end time point.
  • In some embodiments, the method further includes: saving the media data segment corresponding to the annotation information locally according to the annotation start time point, the annotation end time point, and the playback path of the media data. Specifically, the media data segment to be saved is acquired according to these parameters, named according to the annotation information, and saved locally.
  • In this way, the user can accurately understand the content of the media data segment when viewing it, thereby improving the user experience. A sketch of the save step follows.
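As an illustration of the save step, the sketch below derives a local file name from the annotation information and delegates the actual clipping to a `MediaClipper` stub; both the naming rule and the helper are assumptions, since the patent leaves the clipping mechanism unspecified.

```java
import java.nio.file.Path;

final class SegmentSaver {
    // Sketch: name the clipped segment after the annotation and save it locally.
    static Path saveSegment(Annotation a, Path localDir) {
        String safeName = a.text.replaceAll("[^\\w\\- ]", "_"); // illustrative rule
        Path target = localDir.resolve(safeName + ".mp4");
        MediaClipper.clip(a.playback.playbackPath, a.playback.startMs,
                a.playback.endMs, target);
        return target;
    }
}

final class MediaClipper { // hypothetical stub, not a real API
    static void clip(String sourcePath, long startMs, long endMs, Path target) {
        // A real implementation would extract [startMs, endMs] from the media
        // at sourcePath (e.g., via a media framework) and write it to target.
    }
}
```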
  • In some embodiments, the method further includes: in response to a playback operation, playing the media data corresponding to the annotation information according to the playback information.
  • In a second aspect, an embodiment of the present application provides an annotation display apparatus applied to a terminal device, including:
  • a display module, configured to display annotation information and playback information in response to an annotation display operation, where the playback information is used to prompt playback of the media data corresponding to the annotation information and is obtained when the terminal device annotates the media data in response to an annotation operation while playing the media data.
  • In some embodiments, the annotation display apparatus further includes an annotation module, configured to generate the annotation information in response to the annotation operation when the terminal device plays the media data.
  • The annotation module also generates the playback information of the media data corresponding to the annotation information, where the playback information includes the playback path of the media data and the playback time information of the media data at the time the annotation information is generated.
  • In some embodiments, the annotation module is specifically configured to: display the annotation data input interface in response to the annotation start operation; receive the annotation data through the annotation data input interface; stop receiving annotation data in response to the annotation end operation; and generate the annotation information according to the annotation data.
  • In some other embodiments, the annotation module is specifically configured to: acquire the media data segment to be annotated in response to the annotation start operation and the annotation end operation; identify keywords in the media data segment; and generate the annotation information of the media data segment according to the keywords.
  • In some embodiments, the annotation module is specifically configured to: acquire the annotation start time point in response to the annotation start operation; acquire the annotation end time point in response to the annotation end operation; and acquire the media data segment to be annotated according to the annotation start time point, the annotation end time point, and the playback path of the media data.
  • In some embodiments, the annotation module is specifically configured to acquire the playback time information of the media data corresponding to the annotation information according to the annotation start time point and the annotation end time point, and to generate the playback information according to the playback path of the media data and the acquired playback time information.
  • In some embodiments, the playback interface in which the terminal device plays the media data includes a progress bar, and the apparatus further includes a marking module, configured to display the annotation mark corresponding to the annotation information on the progress bar according to the annotation start time point and the annotation end time point.
  • In some embodiments, the apparatus further includes a saving module, configured to save the media data segment corresponding to the annotation information locally according to the annotation start time point, the annotation end time point, and the playback path of the media data.
  • In some embodiments, the saving module is specifically configured to acquire the media data segment to be saved according to the annotation start time point, the annotation end time point, and the playback path of the media data, name the media data segment according to the annotation information, and save it locally.
  • In some embodiments, the apparatus further includes a playing module, configured to play the media data corresponding to the annotation information according to the playback information in response to a playback operation.
  • In a third aspect, an embodiment of the present application provides a terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the method provided in the first aspect when executing the computer program.
  • In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the method provided in the first aspect.
  • In a fifth aspect, an embodiment of the present application provides a computer program product, which, when run on a terminal device, causes the terminal device to execute the method provided in the first aspect.
  • In a sixth aspect, an embodiment of the present application provides a chip system, where the chip system includes a memory and a processor, and the processor executes a computer program stored in the memory to implement the method provided in the first aspect.
  • In a seventh aspect, an embodiment of the present application provides a chip system, where the chip system includes a processor coupled to the computer-readable storage medium provided in the fourth aspect, and the processor executes the computer program stored in the computer-readable storage medium to implement the method provided in the first aspect.
  • FIG. 1 shows a schematic structural diagram of a terminal device to which the annotation display method is applied;
  • FIG. 2 shows a schematic diagram of the software structure of a terminal device to which the annotation display method provided by an embodiment of the present application is applied;
  • FIG. 3 shows a schematic flowchart of an annotation display method provided by an embodiment of the present application;
  • FIG. 4 shows an interface of a terminal device applying the annotation display method when playing video data;
  • FIG. 5 shows an interface of a terminal device applying the annotation display method when responding to an annotation operation while playing video data;
  • FIGS. 6 to 10 show interfaces for annotating when a terminal device applying the annotation display method plays video data;
  • FIGS. 11 to 13 show further interfaces for annotating when a terminal device applying the annotation display method plays video data;
  • FIGS. 14 to 18 show interfaces on which a terminal device applying the annotation display method displays annotation information and playback information;
  • FIG. 19 shows a schematic structural diagram of an annotation display apparatus provided by an embodiment of the present application;
  • FIG. 20 shows a structural block diagram of a terminal device provided by an embodiment of the present application.
  • The term “if” may be construed, depending on the context, as “when”, “once”, “in response to determining”, or “in response to detecting”.
  • The annotation display method provided in the embodiments of the present application can be applied to mobile phones, tablet computers, augmented reality (AR)/virtual reality (VR) devices, large-screen devices, notebook computers, netbooks, personal digital assistants (PDAs), and other terminal devices; the embodiments of the present application do not impose any restrictions on the specific type of terminal device.
  • An annotation application program can be used to annotate the played content; the annotation application program may be an application such as a notes or memo application.
  • However, the annotation information in such an annotation application program has no correspondence with the media data, so when reviewing the annotation information it is impossible to quickly locate the corresponding media data or determine the playback progress of the media data at the time the annotation was made. The user must manually find the annotated media data segment, which is cumbersome, inconvenient, inefficient to view, and results in a poor user experience.
  • To this end, this application provides an annotation display method that displays annotation information and playback information when the terminal device responds to an annotation display operation, where the playback information is used to prompt playback of the media data corresponding to the annotation information and is obtained when the terminal device annotates the media data in response to an annotation operation while playing the media data.
  • FIG. 1 shows a schematic structural diagram of a terminal device to which the annotation display method is applied.
  • The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
  • the structure shown in the embodiment of the present application does not constitute a specific limitation on the terminal device 100 .
  • the terminal device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the terminal device 100 when the terminal device 100 is a mobile phone, a tablet computer, or a large-screen device, it may include all components shown in the illustration, or may include only some of the components shown in the illustration.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the terminal device 100 .
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from the memory. This avoids repeated access and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • processor 110 may include one or more interfaces.
  • The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • The I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flashlight, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through an I2C bus interface to realize the touch function of the terminal device 100 .
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding the analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface. Both I2S interface and PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • The bus may be a bidirectional communication bus that converts the data to be transmitted between serial and parallel forms.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), and the like.
  • the processor 110 communicates with the camera 193 through a CSI interface to realize the shooting function of the terminal device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the terminal device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193 , the display screen 194 , the wireless communication module 160 , the audio module 170 , the sensor module 180 and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the terminal device 100, and can also be used to transmit data between the terminal device 100 and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules shown in the embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation of the terminal device 100 .
  • the terminal device 100 may also adopt different interface connection modes in the foregoing embodiments, or a combination of multiple interface connection modes.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the terminal device 100 . While the charging management module 140 is charging the battery 142 , it can also provide power for electronic devices through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be disposed in the processor 110 . In some other embodiments, the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the terminal device 100 can be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the terminal device 100 .
  • The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • At least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • At least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the modem processor may be a stand-alone device. In some other embodiments, the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied to the terminal device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the terminal device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the terminal device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc. The GNSS may include a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the terminal device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), or the like.
  • the terminal device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the terminal device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • Light is transmitted through the lens to the photosensitive element of the camera, where the optical signal is converted into an electrical signal; the photosensitive element then transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the focal length of the lens can be used to indicate the viewing range of the camera, and the smaller the focal length of the lens, the larger the viewing range of the lens.
  • The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the terminal device 100 may include cameras 193 with two or more focal lengths.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the terminal device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the terminal device 100 may support one or more video codecs.
  • The terminal device 100 can play or record videos in multiple encoding formats, for example, Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
  • The NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the terminal device 100 can be realized through the NPU, such as: image recognition, face recognition, speech recognition, text understanding, etc.
  • the NPU or other processors may be used to perform operations such as analysis and processing on the images in the video stored by the terminal device 100 .
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the terminal device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving music, video and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the terminal device 100 by executing instructions stored in the internal memory 121 .
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system and at least one application program required by a function (such as a sound playing function, an image playing function, etc.).
  • the storage data area can store data (such as audio data, phonebook, etc.) created during the use of the terminal device 100 .
  • The internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), or the like.
  • the terminal device 100 may implement an audio function through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, and an application processor.
  • the audio module 170 is used for converting digital audio signals into analog audio signals for output, and also for converting analog audio input into digital audio signals.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • The speaker 170A, also referred to as a “horn”, is used to convert audio electrical signals into sound signals.
  • the terminal device 100 can listen to music through the speaker 170A, or listen to the hands-free call.
  • the speaker can play the comparison and analysis results provided by the embodiment of the present application.
  • The receiver 170B, also called an “earpiece”, is used to convert audio electrical signals into sound signals.
  • the receiver 170B can be placed close to the human ear to receive the voice.
  • The microphone 170C, also called a “mike” or “mic”, is used to convert sound signals into electrical signals.
  • the user can put his mouth close to the microphone 170C to make a sound, and input the sound signal to the microphone 170C.
  • the terminal device 100 may be provided with at least one microphone 170C. In some other embodiments, the terminal device 100 may be provided with two microphones 170C, which may also implement a noise reduction function in addition to collecting sound signals.
  • the terminal device 100 can also be provided with three, four or more microphones 170C to realize sound signal collection, noise reduction, identify sound sources, and realize directional recording functions, etc.
  • the earphone interface 170D is used for connecting wired earphones.
  • The earphone interface 170D may be the USB interface 130, or a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors.
  • a capacitive pressure sensor may be comprised of at least two parallel plates of conductive material.
  • the terminal device 100 determines the intensity of pressure according to the change in capacitance.
  • the terminal device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the terminal device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • In some embodiments, touch operations acting on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view short messages is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the same icon, an instruction to create a new short message is executed.
  • the gyroscope sensor 180B can be used to determine the motion posture of the terminal device 100 .
  • In some embodiments, the angular velocity of the terminal device 100 around three axes (i.e., the x, y, and z axes) may be determined by means of the gyroscope sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shaking angle of the terminal device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shaking of the terminal device 100 through reverse motion to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the terminal device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • The terminal device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip holster; when the terminal device is a clamshell device, it may likewise detect the opening and closing of the clamshell according to the magnetic sensor 180D, and set features such as automatic unlocking of the flip cover accordingly.
  • the acceleration sensor 180E can detect the acceleration of the terminal device 100 in various directions (generally three axes). When the terminal device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • the distance sensor 180F is used to measure the distance.
  • the terminal device 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the terminal device 100 may use the distance sensor 180F for distance measurement to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the terminal device 100 emits infrared light through the light emitting diode.
  • the terminal device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the terminal device 100 . When insufficient reflected light is detected, the terminal device 100 may determine that there is no object near the terminal device 100 .
  • the terminal device 100 can use the proximity light sensor 180G to detect that the user holds the terminal device 100 close to the ear to make a call, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in leather case mode, automatic unlock and lock screen in pocket mode.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the terminal device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the terminal device 100 is in the pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the terminal device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, access to the application lock, take pictures with fingerprints, answer incoming calls with fingerprints, and so on.
  • the temperature sensor 180J is used to detect temperature.
  • the terminal device 100 uses the temperature detected by the temperature sensor 180J to implement a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds the threshold, the terminal device 100 may reduce the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • the terminal device 100 when the temperature is lower than another threshold, the terminal device 100 heats the battery 142 to avoid abnormal shutdown of the terminal device 100 caused by the low temperature.
  • the terminal device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
  • The touch sensor 180K is also known as a “touch panel”.
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the terminal device 100 , which is different from the position of the display screen 194 .
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the human pulse and receive the blood pressure beating signal.
  • the bone conduction sensor 180M can also be disposed in the earphone, combined into a bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the terminal device 100 may receive key input and generate key signal input related to user settings and function control of the terminal device 100 .
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations applied to different applications may correspond to different vibration feedback effects.
  • the motor 191 may also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 .
  • Touch operations in different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) may also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be connected and separated from the terminal device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • the terminal device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of multiple cards may be the same or different.
  • the SIM card interface 195 is also compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the terminal device 100 interacts with the network through the SIM card to implement functions such as calling and data communication.
  • the terminal device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the terminal device 100 and cannot be separated from the terminal device 100 .
  • FIG. 2 shows a schematic diagram of the software structure of a terminal device in the annotation display method provided by an embodiment of the present application.
  • The operating system of the terminal device may be an Android system, a Microsoft Windows system, an Apple mobile operating system (iOS), or HarmonyOS (Hongmeng OS). Here, HarmonyOS is taken as an example for illustration.
  • HarmonyOS can be divided into four layers: a kernel layer, a system service layer, a framework layer, and an application layer; the layers communicate through software interfaces.
  • the kernel layer includes a kernel abstraction layer (Kernel Abstract Layer, KAL) and a driver subsystem.
  • KAL includes multiple kernels, such as the kernel Linux Kernel of the Linux system, the lightweight IoT system kernel LiteOS, etc.
  • The driver subsystem may include a hardware driver foundation (HDF), which provides unified peripheral access capabilities and a driver development and management framework.
  • the multi-core kernel layer can select the corresponding kernel for processing according to the requirements of the system.
  • the system service layer is a collection of core capabilities of the Hongmeng system, and the system service layer provides services to applications through the framework layer.
  • This layer can include:
  • The system basic capabilities subsystem set provides basic capabilities for the operation, scheduling, and migration of distributed applications on multiple devices in HarmonyOS. It can include subsystems such as a distributed soft bus, distributed data management, distributed task scheduling, multilingual runtime, a public basic library, multimodal input, graphics, security, artificial intelligence (AI), and the user program framework.
  • Among them, the multilingual runtime provides the C, C++, or JavaScript (JS) runtime and the basic system class library, and can also provide the runtime for static Java programs (that is, the parts developed in the Java language in the application or framework layer) using a compiler.
  • The basic software service subsystem set provides public, general-purpose software services for HarmonyOS. It can include subsystems such as event notification, telephony, multimedia, Design For X (DFX), and MSDP&DV.
  • The enhanced software service subsystem set provides HarmonyOS with differentiated, capability-enhanced software services for different devices. It can include smart-screen proprietary services, wearable proprietary services, and Internet of Things (IoT) proprietary business subsystems.
  • The hardware service subsystem set provides hardware services for HarmonyOS. It can include subsystems such as location services, biometric identification, wearable dedicated hardware services, and IoT dedicated hardware services.
  • The framework layer provides Java, C, C++, JS, and other multilanguage user program frameworks and ability frameworks for HarmonyOS application development, two user interface (UI) frameworks (a Java UI framework for the Java language and a JS UI framework for the JS language), and a multilanguage framework application programming interface (API) that is open to the outside world for various software and hardware services.
  • the application layer includes system applications and third-party non-system applications.
  • System applications may include applications installed by default on the electronic device, such as the desktop, the control bar, settings, and the phone application.
  • Extended applications may be non-essential applications developed and designed by the electronic device manufacturer, such as device housekeeper, replacement and migration, memo, and weather applications.
  • Third-party non-system applications are developed by other manufacturers but can run in HarmonyOS, such as games, navigation, social, or shopping applications.
  • the application of Hongmeng system consists of one or more meta-programs (Feature Ability, FA) or meta-services (Particle Ability, PA).
  • FA has a UI interface, which provides the ability to interact with users.
  • PA has no UI interface and provides the ability to run tasks in the background and a unified data access abstraction.
  • PA mainly provides support for FA, such as providing computing power as a background service, or providing data access capabilities as a data warehouse.
  • Applications developed based on FA or PA can realize specific business functions, support cross-device scheduling and distribution, and provide users with a consistent and efficient application experience.
  • HarmonyOS can realize hardware mutual assistance and resource sharing through distributed soft bus, distributed device virtualization, distributed data management and distributed task scheduling.
  • FIG. 3 shows a schematic flowchart of an annotation display method provided by an embodiment of the present application. As an example rather than a limitation, the method may be applied to the above-mentioned terminal device 100.
  • the annotation display method includes the following steps:
  • the media data may be video data or audio data.
  • The media data can be stored locally or come from the network.
  • When the terminal device plays locally stored media data, it can locate the media data according to its storage path, select a corresponding decoder according to the format of the media data, and decode the media data before playing it.
  • The locally stored media data may be video data captured or audio data recorded by the terminal device.
  • Taking video data as an example, assume that the video data captured by the terminal device is stored under the path "/internal storage/DCIM/Camera" in the Moving Picture Experts Group 4 (MP4) format. When playing the video data, the terminal device first searches for the video data to be played under "/internal storage/DCIM/Camera", then decodes the found video data with an MP4 decoder, and finally renders and displays the decoded data for playback.
  • When playing media data from the network, the media data can be downloaded according to the uniform resource locator (URL) corresponding to the media data. In some embodiments, the media data may be obtained through streaming, which may be real-time streaming or sequential streaming: real-time streaming can be realized through the Real-Time Streaming Protocol (RTSP) or Microsoft Media Server (MMS), and sequential streaming can be realized through a Hypertext Transfer Protocol (HTTP) or File Transfer Protocol (FTP) server. A sketch of the two playback paths follows.
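Under the assumptions stated in the lead-in, the two acquisition paths might be dispatched as follows. The handler methods are empty stubs and the scheme check is only an illustration, not a mechanism the patent prescribes.

```java
final class MediaSourceOpener {
    // Sketch: branch between the two acquisition paths described above.
    static void open(String mediaId) {
        if (mediaId.startsWith("http://") || mediaId.startsWith("https://")
                || mediaId.startsWith("rtsp://") || mediaId.startsWith("ftp://")) {
            // Network media: download by URL, or stream it (real-time streaming
            // via RTSP/MMS, sequential streaming via HTTP/FTP), then decode.
            openNetworkStream(mediaId);
        } else {
            // Local media, e.g. under /internal storage/DCIM/Camera: locate the
            // file by its storage path and choose a decoder by container format.
            openLocalFile(mediaId);
        }
    }

    private static void openNetworkStream(String url) { /* hypothetical stub */ }
    private static void openLocalFile(String path) { /* hypothetical stub */ }
}
```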
  • FIG. 4 shows an interface of a terminal device applying the annotation display method when playing video data.
  • the video data played by the terminal device in FIG. 4 comes from the network.
  • the interface includes a video playing area 301 and an information display area 302 .
  • the video playing area 301 is used to display the playing video data, which also includes a progress bar 3011 and a progress controller 3012, and the progress bar 3011 is used to display the playing progress of the video data.
• The playback progress can be displayed as time. For example, "1:15" on the left of the progress bar indicates that the video data has played for 1 minute and 15 seconds, and "10:45" on the right indicates that the total duration of the video data is 10 minutes and 45 seconds.
  • the position of the progress controller 3012 on the progress bar can be determined according to the ratio of the elapsed playing time to the total playing time.
  • the left side of the progress controller 3012 represents the duration that has been played, and the right represents the duration to be played, and the sum of the duration that has been played and the duration to be played is equal to the total duration.
  • the duration that has been played and the duration to be played can be represented by progress bars of different colors, textures, or thicknesses.
• The progress controller 3012 can also be used to control the playback progress. For example, referring to FIG. 4, when the terminal device receives a sliding operation acting on the area displaying the progress controller 3012, it can determine the corresponding playback time according to the direction and distance of the slide, and play the video data from that time.
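• The two mappings just described reduce to simple proportions; the sketch below renders them for illustration, with pixel-based coordinates as an assumption of the sketch. With FIG. 4's values, 1 minute 15 seconds out of 10 minutes 45 seconds places the controller at about 11.6% of the bar's width.

```kotlin
// Position of the progress controller: proportional to elapsed playing time.
fun controllerX(playedMs: Long, totalMs: Long, barWidthPx: Int): Int =
    ((playedMs.toDouble() / totalMs) * barWidthPx).toInt()

// Inverse mapping for a slide gesture: a horizontal position on the bar
// is converted back into the playback time to seek to.
fun seekTimeAtX(x: Int, barWidthPx: Int, totalMs: Long): Long =
    ((x.coerceIn(0, barWidthPx).toDouble() / barWidthPx) * totalMs).toLong()

// With FIG. 4's values: controllerX(75_000, 645_000, 1000) == 116
```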
• The information display area 302 may display information relevant to the video data, such as the title of the video data "Mathematics tutorial", play entries for other videos in the same album, recommended video data, and functional controls for comments, likes, forwarding, downloading, and so on.
  • This application does not limit the types and contents of the information displayed in the information display area 302 .
• FIG. 5 shows an interface when a terminal device applying the annotation display method responds to an annotation operation while playing video data.
  • annotation operation is used to instruct the terminal device to acquire annotation information.
  • Annotation operations may include annotation start operations and annotation stop operations.
  • the annotation start operation may be a touch operation on the video playback area 301, or may be a voice command received by the terminal device, or may be a control signal from a control device connected to the terminal device.
  • the touch operation acting on the video playback area 301 may be a double-tap operation with two fingers, or may also be a double-tap operation with a single finger, a triple-tap operation with two fingers, or a preset gesture operation, etc.
  • the present application does not limit the specific form of the touch operation.
• When the terminal device detects the touch operation acting on the video playback area 301, it determines that an annotation instruction has been received, generates an annotation mark 3013 on the progress bar, and then displays an annotation option box 3014 in the area corresponding to the annotation mark.
• The annotation mark 3013 is used to indicate that annotation information exists at this time point.
• The annotation option box 3014 is used to display at least one annotation method; as shown in FIG. 5, this embodiment provides two annotation methods, "take notes" and "mark segment".
• The "take notes" method is manual annotation, which requires receiving annotation information input by the user, while the "mark segment" method is automatic annotation, which identifies the content of the marked segment and generates an annotation automatically.
• The implementations of manual annotation and automatic annotation are not limited to these.
• When the terminal device plays video data and displays the video data playback interface, it renders images on different layers according to the received video data and the components of the playback interface.
• For example, the background of the interface can be rendered on layer one, the information display area 302 on layer two, the video data on layer three, the progress bar on layer four, the progress controller on layer five, and the annotation marks on layer six; the layers are then superimposed in sequence to output the interface shown in FIG. 5.
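• Purely as a sketch of the compositing order described above (the data structure and names are illustrative, not the embodiment's internal representation):

```kotlin
// Each interface component is rendered on its own layer with a z-index.
data class Layer(val name: String, val zIndex: Int)

val playbackLayers = listOf(
    Layer("interface background", 1),
    Layer("information display area 302", 2),
    Layer("video data", 3),
    Layer("progress bar", 4),
    Layer("progress controller", 5),
    Layer("annotation marks", 6),
)

// Superimpose in ascending z-order; the topmost layer is drawn last.
fun compositionOrder(): List<String> =
    playbackLayers.sortedBy { it.zIndex }.map { it.name }
```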
  • each annotation corresponds to an annotation mark
  • each annotation mark can be distinguished by a different color.
• For example, one annotation mark can be red and another blue; this application does not limit this.
• The voice command can be used to wake up the smart assistant and indicate the annotation to be made. For example, suppose waking the smart assistant requires a voice message containing the wake word "Xiaoyi Xiaoyi". When the voice message "Xiaoyi, Xiaoyi, start annotating" is received, the smart assistant is woken up by the wake word and sends an annotation instruction to the terminal device according to the content following the wake-up password (that is, "start annotating").
• In response to the annotation instruction, the terminal device generates an annotation mark 3013 on the progress bar and then displays the annotation option box 3014 in the area corresponding to the annotation mark.
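• A hedged sketch of the wake-word handling in the example above follows; the simple string match is an assumption of the sketch, since real assistants detect wake words with acoustic models rather than text comparison:

```kotlin
const val WAKE_WORD = "Xiaoyi Xiaoyi"

// Returns the command that follows the wake word, or null if the assistant
// was not woken. Commas are stripped so "Xiaoyi, Xiaoyi, start annotating"
// and "Xiaoyi Xiaoyi start annotating" are treated alike.
fun parseVoiceCommand(recognizedText: String): String? {
    val text = recognizedText.replace(",", " ").replace(Regex("\\s+"), " ").trim()
    if (!text.startsWith(WAKE_WORD)) return null
    return text.removePrefix(WAKE_WORD).trim().ifEmpty { null }
}

// parseVoiceCommand("Xiaoyi, Xiaoyi, start annotating") -> "start annotating"
```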
  • control device connected to the terminal device may be a keyboard, mouse, stylus, etc.
  • the keyboard and mouse may be connected to the terminal device in a wired or wireless manner.
  • the stylus can be wirelessly connected to the terminal device via Bluetooth.
• The control signal may be generated when a function button provided on the stylus is pressed.
• When the function button on the stylus receives a press operation, the stylus sends an annotation instruction to the terminal device.
• In response, the terminal device generates an annotation mark 3013 on the progress bar and then displays the annotation option box 3014 in the area corresponding to the annotation mark.
• FIGS. 6 to 10 show interfaces for annotating when a terminal device applying the annotation display method plays video data.
• When the terminal device receives a click operation on the area displaying "mark segment" in the annotation option box 3014, it determines that the annotation method is "mark segment". Then, referring to FIG. 7, the terminal device may close the annotation option box 3014 and start recording the segment to be annotated.
• During recording, the annotation mark 3013 can be extended to the position of the progress controller 3012 to indicate the recording progress.
• The terminal device continues to record the segment to be annotated until it receives an instruction to stop recording or the video data finishes playing.
• The instruction to stop recording can be sent through an annotation stop operation, such as the double-tap operation on the video playback area 301 shown in FIG. 7.
  • An annotation stop operation is implemented in a similar manner to an annotation start operation, and details are not repeated here.
• After the terminal device responds to the annotation stop operation, it displays the interface shown in FIG. 8, where the annotation information "general formula" is displayed in the annotation box 3015 corresponding to the annotation mark 3013.
  • the annotation information can be obtained by identifying high-frequency keywords in the recorded video data segment by an image recognition algorithm, or by identifying high-frequency keywords appearing in the audio of the recorded video data segment by a speech recognition algorithm.
• For example, suppose the statistics show that "general formula" appears 5 times, "test site" appears 3 times, and other keywords appear fewer than 3 times; then "general formula" can be determined as the annotation information of the video data segment.
• If multiple keywords are tied for the highest frequency, they can be used together as the annotation information of the video data segment (see the sketch following these examples).
• For example, suppose the statistics show that "general formula" appears 5 times, "test site" also appears 5 times, and other keywords appear fewer than 5 times; then "general formula, test site" can be determined as the annotation information of the video data segment.
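• A minimal sketch of this frequency rule, assuming keyword extraction (image or speech recognition) has already produced the raw keyword list upstream:

```kotlin
// Keep every keyword tied for the highest frequency and join them
// as the annotation text, per the examples above.
fun annotationFromKeywords(keywords: List<String>): String? {
    if (keywords.isEmpty()) return null
    val counts = keywords.groupingBy { it }.eachCount()   // keyword -> occurrence count
    val max = counts.values.maxOrNull() ?: return null
    return counts.filterValues { it == max }.keys.joinToString(", ")
}

// 5 x "general formula", 3 x "test site"        -> "general formula"
// 5 x "general formula", 5 x "test site" (tie)  -> "general formula, test site"
```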
• After the annotation stops, a function selection area 3016 may also be displayed.
  • the function selection area 3016 may include a preview image of the video data segment and a plurality of function icons, such as "take notes", "save the segment” and so on.
• Through "take notes", a secondary annotation can be performed on the segment of video data; through "save segment", the video data segment can be named and saved.
• After saving, the area of the function selection area 3016 that originally displayed the "save segment" function icon can be changed to the prompt message "segment has been saved", as shown in FIG. 10, to remind the user that the video has been saved locally.
• FIGS. 11 to 13 show another set of interfaces for annotating when a terminal device applying the annotation display method plays video data.
• When the terminal device receives a click operation on the area displaying "take notes" in the annotation option box 3014, it determines that the annotation method is "take notes". Alternatively, referring to FIG. 12, when the terminal device receives a click operation acting on the area displaying "take notes" in the function selection area 3016, it determines that the annotation method is "take notes".
• After determining that the annotation method is "take notes", the terminal device displays the interface shown in FIG. 13.
  • the information display area 302 is partially compressed, and the compressed area is used to display the note recording area 303 .
  • the note recording area includes the note title "Chapter 9 Sequence".
• The note title can be generated automatically by identifying the title and content keywords of the video data, or the terminal device can receive text information input by the user and name the note according to that text.
  • notes can be recorded with a stylus.
  • the stylus 304 is connected to the terminal device via Bluetooth, and when the note recording area 303 receives the movement track of the stylus in this area, the note content 3031 is displayed in this area according to the movement track of the stylus.
• Optionally, a stylus setting area 3032 can also be displayed in the note recording area 303 to set the display mode of the stylus movement track, such as pen, brush, or pencil handwriting.
  • the stylus pen setting area 3032 can also provide color setting, thickness setting, etc. of the stylus handwriting, which is not limited in this application.
  • the virtual keyboard program in the terminal device can also receive the user's input operation, record the input text according to the input operation, and display it in the information display area 302 as the note content 3031 .
• An external keyboard connected to the terminal device can also receive the user's input operation. After receiving the input operation, the external keyboard sends a corresponding input signal to the terminal device, and the terminal device generates the corresponding text according to the received input signal and displays it in the information display area 302 as the note content 3031.
  • the note-taking area 303 can be realized by a note-taking application program, for example, the terminal device can open the note-taking application program after determining that the annotation method is "note-taking", and display the interface of the note-taking application program in the note-taking area 303 .
  • Note recording area 303 can be displayed not only in the form of split screen, but also in the form of floating window, picture-in-picture, etc. After the note is recorded, it is stored in the directory corresponding to the note application.
• After annotation ends, the note recording area 303 may receive input operations again. If no annotation operation accompanies such an input operation, the input can be treated as plain note content. When displaying the note, the text entered in this way and the text entered during annotation can be distinguished by dividing lines, different background colors, or different text colors.
  • the playing information of the media data may include the playing path of the media data corresponding to the annotation information and the playing time information of the corresponding media data when the annotation information was generated.
  • the playback path of the media data includes a storage path of the local media data or a URL of the network media data.
  • the playback time information of the media data corresponding to when the annotation information is generated includes an annotation start time point and an annotation end time point.
• The annotation start time point is the playback position of the media data when the terminal device receives the annotation operation, and the annotation end time point is the playback position of the media data when the terminal device receives the instruction to stop recording.
• If a video data segment has been saved, the playback path of the media data may also include the storage path of that segment; a sketch of the resulting record follows.
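• For illustration, the playback information described above could be grouped as follows; the field names are assumptions of this sketch, not the embodiment's data format:

```kotlin
// One playback-information record per annotation.
data class PlaybackInfo(
    val mediaPath: String,                 // local storage path or network URL
    val annotationStartMs: Long,           // playback position when the annotation operation was received
    val annotationEndMs: Long,             // playback position when the stop-recording instruction was received
    val savedSegmentPath: String? = null,  // storage path of a saved segment, if any
)
```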
• FIGS. 14 to 18 show interfaces on which a terminal device applying the annotation display method displays annotation information and playback information.
• Referring to FIG. 14, when the annotation information is generated by "taking notes", the annotation information is stored in the note application program. FIG. 14 shows the interface of the note application program, which includes a note directory area 401 and a note content area 402.
  • the note directory area 401 also includes a note search box 4011 , a title 4012 of the note being displayed, and a title 4013 of the note to be selected. There can be multiple note titles 4013 to be selected.
  • the note search box 4011 is used to respond to user operations, search multiple stored notes according to the input content, and display the search results.
  • the title 4012 of the note being displayed is used to prompt the title of the note being displayed, and the title 4013 of the note to be selected is used to prompt other notes that can be displayed.
  • Note content area 402 includes note content 4021 , play information 4022 , note title 4023 and function area 4024 .
  • the note content 4021 is the annotation information
  • the play information 4022 is used to display the play information of the media data corresponding to the annotation information.
  • the content of the note title 4023 is consistent with the title 4012 of the note being displayed, and is used to prompt the title of the note being displayed.
  • the function area 4024 can provide various functions for modifying, organizing and sharing notes, which is not limited here.
• For a note associated with media data, a play icon 4014 can be displayed at its title to prompt the user that the note has media data that can be played.
• For example, a play icon 4014 can be displayed at the titles "Chapter 9 Sequence" and "English".
• When a click operation acting on an area displaying the play icon 4014 is received, the content of the note corresponding to the play icon 4014 can be displayed and the corresponding media data played.
  • annotation information may also be displayed on the playback interface of the media data.
• Each annotation mark corresponds to an annotation box 3015, and the annotation information corresponding to the annotation mark 3013 is shown in the annotation box 3015.
  • annotation display operation may be an independent operation or a combination of multiple operations.
  • the annotation display operation may include a click operation, a voice control operation, or a gesture operation.
• For example, when a click operation acting on an annotation mark is received, the annotation box 3015 corresponding to that annotation mark can be displayed.
• Alternatively, an annotation box 3015 corresponding to each annotation mark may be displayed in response to the click operation. If the annotation mark 3013 targeted by the annotation display operation corresponds to annotation information generated by "taking notes", then, referring to the interface shown in FIG. 17, while the annotation box 3015 is displayed, the interface of the note application can also display the corresponding note content 4021 (that is, the annotation information corresponding to the annotation mark) and the play information 4022.
  • annotation display operation may also include any number of combinations of click operations, voice-activated operations, and gesture operations.
• For example, when the terminal device receives a click operation acting on the icon of the note application program, the note application is started and at least one note title 4013 to be selected is displayed in it. When the terminal device receives a click operation on a note title 4013 (namely, a note including annotation information), the note corresponding to that title is displayed, and that title becomes the title 4012 of the note being displayed.
• Annotation display operations include but are not limited to click operations, voice control operations, and gesture operations; sliding operations, key operations, and other operation types capable of controlling the terminal device are also possible.
  • the playing information is used to prompt to play the media data corresponding to the annotation information.
  • the play information 4022 includes a thumbnail image of the media data corresponding to the play information and the prompt message "play video".
• When the terminal device receives a click operation acting on the display area of the play information 4022, it can play the video data according to the video URL or the path of the stored video segment, and the interface shown in FIG. 13 is displayed.
• If the video data is played according to the URL, playback can start from the playback time point recorded in the play information, beginning at the annotation start time point and continuing until the annotation end time point; if the video data is played according to the path of a stored video segment, it can be played directly.
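• A hedged sketch of these two branches, reusing the PlaybackInfo record sketched earlier and again assuming an Android-style MediaPlayer; the delayed pause is an approximation that ignores seek latency:

```kotlin
import android.media.MediaPlayer
import android.os.Handler
import android.os.Looper

// A saved segment plays directly from its beginning; a URL source seeks to
// the annotation start point and pauses at the annotation end point.
fun playAnnotatedMedia(info: PlaybackInfo, player: MediaPlayer) {
    val source = info.savedSegmentPath ?: info.mediaPath
    player.setDataSource(source)
    player.setOnPreparedListener { mp ->
        if (info.savedSegmentPath == null) {
            mp.seekTo(info.annotationStartMs.toInt())    // URL source: jump to the annotation start
            Handler(Looper.getMainLooper()).postDelayed(
                { if (mp.isPlaying) mp.pause() },        // stop at the annotation end
                info.annotationEndMs - info.annotationStartMs
            )
        }
        mp.start()
    }
    player.prepareAsync()
}
```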
  • the video playback interface 403 may be displayed in the form of a floating window or a picture-in-picture.
  • the video playback interface 403 can respond to control operations to implement functions such as changing position, size, playback progress, playback volume, pause, close, and full screen.
  • the size of the video playback interface 403 can be changed by responding to a pinch gesture operation acting on the area of the video playback interface 403, and the position of the video playback interface 403 can be moved by responding to a drag operation acting on the area of the video playback interface 403, etc.
• In this case as well, playback can start from the annotation start time point recorded in the play information corresponding to the annotation information and continue until the annotation end time point.
• The playback plug-in or the corresponding application program may also be woken up through an inter-program call connection, so as to play the media data.
  • sequence numbers of the steps in the above embodiments do not mean the order of execution, and the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation to the implementation process of the embodiment of the present application.
• The operations in the examples, such as the "click operation", are only examples; a "click operation" can be replaced by other operations, and this should not constitute any limitation on the implementation process of the embodiments of the present application.
  • FIG. 19 shows a schematic structural diagram of an apparatus for displaying annotations provided by an embodiment of the present application. For ease of description, only parts related to the embodiments of the present application are shown.
• The annotation display apparatus, applied to a terminal device, includes:
• a display module 501, configured to display annotation information and playback information in response to an annotation display operation;
• the playback information is used to prompt playing of the media data corresponding to the annotation information;
• the playback information is obtained when the terminal device annotates the media data in response to an annotation operation while playing the media data.
  • the annotation display apparatus further includes an annotation module 502, configured to generate annotation information in response to an annotation operation when the terminal device plays media data.
  • the annotation module 502 is further configured to generate play information of the media data corresponding to the annotation information, and the play information includes the play path of the media data and the play time information of the media data when the annotation information is generated.
• Optionally, the annotation module 502 is specifically configured to display an annotation data input interface in response to an annotation start operation.
• The annotation data is received through the annotation data input interface; receiving stops in response to an annotation end operation, and annotation information is generated according to the annotation data.
• Optionally, the annotation module 502 is specifically configured to acquire the media data segment to be annotated in response to the annotation start operation and the annotation end operation, identify keywords in the media data segment to be annotated, and generate annotation information of the segment according to the keywords.
• Optionally, the annotation module 502 is specifically configured to obtain the annotation start time point in response to the annotation start operation, obtain the annotation end time point in response to the annotation end operation, and acquire the media data segment to be annotated according to the annotation start time point, the annotation end time point, and the playback path of the media data.
  • the annotation module 502 is specifically configured to acquire the playback time information of the media data corresponding to the annotation information according to the annotation start time point and the annotation end time point.
  • the playing information of the media data corresponding to the annotation information is generated according to the playing path of the media data and the playing time information of the media data corresponding to the annotation information.
  • the playback interface of the terminal device to play the media data includes a progress bar.
  • the device also includes a marking module 503, configured to display an annotation mark corresponding to the annotation information on the progress bar according to the annotation start time point and the annotation end time point.
  • the device further includes a saving module 504, configured to save the media data segment corresponding to the annotation information locally according to the annotation start time point, the annotation end time point, and the playback path of the media data.
  • the saving module 504 is specifically configured to acquire the media data segment to be saved according to the annotation start time point, the annotation end time point, and the playback path of the media data. Name the media data segment to be saved according to the annotation information and save it locally.
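• As a small illustrative sketch of naming the saved segment after its annotation information; the character filtering and length cap are assumptions to keep the name filesystem-safe:

```kotlin
// Derive a file name for the saved segment from its annotation text.
fun segmentFileName(annotation: String, extension: String = ".mp4"): String =
    annotation.replace(Regex("""[\\/:*?"<>|]"""), "_").take(60) + extension

// segmentFileName("general formula") -> "general formula.mp4"
```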
  • the device further includes a playing module 505, configured to play the media data corresponding to the annotation information according to the playing information in response to the playing operation.
• Each functional unit and module in the embodiments may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit; the integrated units may be implemented in the form of hardware or in the form of software functional units.
  • FIG. 20 shows a structural block diagram of a terminal device provided by an embodiment of the present application.
• The terminal device 600 of this embodiment includes: at least one processor 601 (only one is shown in FIG. 20), a memory 602, and a computer program 603 stored in the memory 602 and runnable on the at least one processor 601. When the processor 601 executes the computer program 603, the steps of the annotation display method in each of the foregoing embodiments are implemented.
  • the terminal device 600 may be a smart phone, a tablet computer, a wearable device, an augmented reality (Augmented Reality, AR)/virtual reality (Virtual Reality, VR) device, a large-screen device, a vehicle terminal, and the like.
  • the terminal device may include, but not limited to, a processor 601 and a memory 602 .
• FIG. 20 is only an example of the terminal device 600 and does not constitute a limitation on it; the device may include more or fewer components than shown, combine some components, or use different components, and may, for example, also include input/output devices, network access devices, and so on.
• The processor 601 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
• The memory 602 may be an internal storage unit of the terminal device 600 in some embodiments, such as a hard disk or memory of the terminal device 600.
• In other embodiments, the memory 602 may also be an external storage device of the terminal device 600, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card equipped on the terminal device 600.
  • the memory 602 may also include both an internal storage unit of the terminal device 600 and an external storage device.
  • the memory 602 is used to store operating systems, application programs, boot loaders (BootLoader), data, and other programs, such as program codes of computer programs.
  • the memory 602 can also be used to temporarily store data that has been output or will be output.
  • the embodiment of the present application also provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the steps in the foregoing method embodiments can be realized.
• An embodiment of the present application provides a computer program product. When the computer program product runs on a mobile terminal, the mobile terminal implements the steps in the foregoing method embodiments.
  • An embodiment of the present application provides a chip system, the chip system includes a memory and a processor, and the processor executes a computer program stored in the memory, so as to implement the steps in the foregoing method embodiments.
  • An embodiment of the present application provides a chip system, the chip system includes a processor, the processor is coupled to a computer-readable storage medium, and the processor executes a computer program stored in the computer-readable storage medium, so as to implement the above-mentioned method embodiments. step.
• If the integrated unit is realized in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • the computer program includes computer program code
  • the computer program code may be in the form of source code, object code, executable file or some intermediate form.
• The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the photographing device/terminal device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk.
• In some jurisdictions, under legislation and patent practice, computer-readable media may not include electrical carrier signals and telecommunication signals.
  • the device embodiments described above are only illustrative.
• The division into modules or units is only a logical functional division; in actual implementation there may be other division methods. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • a unit described as a separate component may or may not be physically separated, and a component displayed as a unit may or may not be a physical unit, that is, it may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.

Abstract

The present application is applicable to the field of terminals, and relates to an annotation display method and apparatus, a terminal device, and a readable storage medium. The annotation display method comprises: displaying annotation information and playback information in response to an annotation display operation, the playback information being used to prompt playing of the media data corresponding to the annotation information, and the playback information being obtained by annotating the media data in response to an annotation operation while the terminal device plays the media data. The annotation information and the playback information are displayed together, so that when a user needs to view the media data corresponding to the annotation information, the media data can be played quickly according to the playback information; the operation is simple and convenient, viewing efficiency can be effectively improved, and the user experience is enhanced.
PCT/CN2022/111468 2021-08-31 2022-08-10 Procédé et appareil d'affichage d'annotation, dispositif terminal et support de stockage lisible WO2023029916A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111019507.8 2021-08-31
CN202111019507.8A CN115730091A (zh) 2021-08-31 2021-08-31 批注展示方法、装置、终端设备及可读存储介质

Publications (1)

Publication Number Publication Date
WO2023029916A1 true WO2023029916A1 (fr) 2023-03-09

Family

ID=85292003

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/111468 WO2023029916A1 (fr) 2021-08-31 2022-08-10 Procédé et appareil d'affichage d'annotation, dispositif terminal et support de stockage lisible

Country Status (2)

Country Link
CN (1) CN115730091A (fr)
WO (1) WO2023029916A1 (fr)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101930779A (zh) * 2010-07-29 2010-12-29 华为终端有限公司 一种视频批注方法及视频播放器
CN103024587A (zh) * 2012-12-31 2013-04-03 Tcl数码科技(深圳)有限责任公司 一种视频点播的信息标注及显示方法及装置
CN103517158A (zh) * 2012-06-25 2014-01-15 华为技术有限公司 一种生成可展示视频批注的视频的方法、装置及系统
US20140344661A1 (en) * 2013-05-20 2014-11-20 Google Inc. Personalized Annotations
CN105844987A (zh) * 2016-05-30 2016-08-10 深圳科润视讯技术有限公司 多媒体教学互动操作方法及装置


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116301556A (zh) * 2023-05-19 2023-06-23 安徽卓智教育科技有限责任公司 互动白板软件的互动方法、装置、电子设备及存储介质
CN116301556B (zh) * 2023-05-19 2023-08-11 安徽卓智教育科技有限责任公司 互动白板软件的互动方法、装置、电子设备及存储介质

Also Published As

Publication number Publication date
CN115730091A (zh) 2023-03-03

Similar Documents

Publication Publication Date Title
JP7142783B2 (ja) 音声制御方法及び電子装置
WO2021213120A1 (fr) Procédé et appareil de projection d'écran et dispositif électronique
WO2020211701A1 (fr) Procédé de formation de modèle, procédé de reconnaissance d'émotion, appareil et dispositif associés
WO2021103981A1 (fr) Procédé et appareil de traitement d'affichage à écran divisé, et dispositif électronique
WO2020259452A1 (fr) Procédé d'affichage plein écran pour terminal mobile et appareil
WO2020078299A1 (fr) Procédé permettant de traiter un fichier vidéo et dispositif électronique
CN113645351B (zh) 应用界面交互方法、电子设备和计算机可读存储介质
WO2021036770A1 (fr) Procédé de traitement d'écran partagé et dispositif terminal
WO2020173370A1 (fr) Procédé pour déplacer des icônes d'applications et dispositif électronique
WO2021082835A1 (fr) Procédé d'activation de fonction et dispositif électronique
WO2021063237A1 (fr) Procédé de commande de dispositif électronique et dispositif électronique
WO2021159746A1 (fr) Procédé et système de partage de fichiers et dispositif associé
WO2020088633A1 (fr) Procédé de paiement, dispositif et unité d'équipement utilisateur
WO2021052139A1 (fr) Procédé d'entrée de geste et dispositif électronique
WO2022001258A1 (fr) Procédé et appareil d'affichage à écrans multiples, dispositif terminal et support de stockage
WO2020024108A1 (fr) Procédé d'affichage d'icônes d'application et terminal
WO2021218429A1 (fr) Procédé de gestion d'une fenêtre d'application, dispositif terminal et support de stockage lisible par ordinateur
WO2022052740A1 (fr) Dispositif pliable et procédé de commande d'ouverture et de fermeture de celui-ci
WO2024045801A1 (fr) Procédé de capture d'écran, dispositif électronique, support et produit programme
WO2021042878A1 (fr) Procédé photographique et dispositif électronique
WO2023273543A1 (fr) Procédé et appareil de gestion de dossier
CN115756268A (zh) 跨设备交互的方法、装置、投屏系统及终端
CN115016697A (zh) 投屏方法、计算机设备、可读存储介质和程序产品
WO2021190524A1 (fr) Procédé de traitement de capture d'écran, interface utilisateur graphique et terminal
WO2023029916A1 (fr) Procédé et appareil d'affichage d'annotation, dispositif terminal et support de stockage lisible

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22863057

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE