WO2020078298A1 - A content editing method and terminal - Google Patents

A content editing method and terminal

Info

Publication number
WO2020078298A1
Authority
WO
WIPO (PCT)
Prior art keywords
content
terminal
graffiti
layer
editing
Prior art date
Application number
PCT/CN2019/110915
Other languages
English (en)
French (fr)
Inventor
刘霖
高曦
韩笑
Original Assignee
华为技术有限公司
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to US17/285,742 (published as US20220005241A1)
Priority to EP19873100.2A (published as EP3859680A4)
Publication of WO2020078298A1


Classifications

    • G06T 11/60: Editing figures and text; Combining figures or text
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F 3/04845: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/04886: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 40/166: Handling natural language data; Text processing; Editing, e.g. inserting or deleting
    • G06T 2200/24: Indexing scheme for image data processing or generation, in general, involving graphical user interfaces [GUIs]
    • G06T 2210/32: Indexing scheme for image generation or computer graphics; Image data format

Definitions

  • The present application relates to the technical field of terminals, and in particular to a content editing method and a terminal.
  • Terminals such as mobile phones and tablet computers are widely used.
  • Users can record moods, notes, plans, and other information through a terminal anytime and anywhere. Therefore, how to enable users to record information such as text, pictures, and graffiti more conveniently on a terminal has important practical value.
  • Embodiments of the present application provide a content editing method and terminal, which help make it easier for users to record information on the terminal, thereby improving the user experience.
  • A content editing method provided by an embodiment of the present application includes the following.
  • The terminal displays an editable user interface, where the editable user interface displays multimedia content and graffiti content.
  • The graffiti content is generated in response to user operations on the multimedia content.
  • The editable user interface includes a content editing area, and the content editing area is used to add or edit multimedia content and/or graffiti content.
  • The terminal may display the added or edited graffiti content in the content editing area in response to an operation of adding or editing graffiti content; or, the terminal may display the added or edited multimedia content in the content editing area in response to an operation of adding or editing multimedia content.
  • When multimedia content and graffiti content are displayed on the editable user interface, the multimedia content or graffiti content can be edited and/or added, which helps the user modify the multimedia content and/or graffiti content, thereby helping to improve the user experience.
  • The multimedia content in the embodiments of the present application may be text, pictures, audio, video, animated pictures, and so on.
  • The terminal switches between editing multimedia content and editing graffiti content in response to the user's switching operation.
  • The content editing area includes a first layer and a second layer.
  • The first layer is used to add or edit multimedia content.
  • The second layer is used to add or edit graffiti content.
  • In response to the user's first operation, the terminal switches the positions of the first layer and the second layer, and then, in response to an operation of adding or editing graffiti content, displays the added or edited graffiti content in the content editing area.
  • The user can graffiti anywhere on the second layer.
  • The editable user interface further includes a function button area, the function button area includes a drawing button, and the first operation is an operation on the drawing button. This helps simplify the user's operation.
  • The size of the second layer is the same as the size of the first layer.
  • The user can graffiti anywhere in the content editing area.
  • The second layer is transparent. This helps the user graffiti on top of the multimedia content.
  • The terminal stores the graffiti content in response to the operation of adding or editing graffiti content, where the graffiti content is saved in a sparse dot-matrix manner. This helps make the graffiti content independent of the image size.
  • After detecting that the added graffiti content or multimedia content reaches or exceeds a preset position of the content editing area, the terminal automatically expands the content editing area by a preset size. This helps users continue to add graffiti content and/or multimedia content.
  • An embodiment of the present application provides a terminal, including a display screen, one or more processors, and a memory; multiple application programs; and one or more computer programs, where the one or more computer programs are stored in the memory. The one or more computer programs include instructions that, when executed by the terminal, cause the terminal to perform the following steps:
  • displaying an editable user interface, where the editable user interface displays multimedia content and graffiti content, the graffiti content is generated in response to user operations on the multimedia content, and the editable user interface includes a content editing area,
  • where the content editing area is used to add or edit the multimedia content and/or graffiti content; and
  • displaying the added or edited multimedia content in the content editing area.
  • The instructions further include: an instruction to switch between editing the multimedia content and editing the graffiti content in response to the user's switching operation.
  • The content editing area includes a first layer and a second layer; the first layer is used to add or edit multimedia content, and the second layer is used to add or edit graffiti content; the instructions also include:
  • The editable user interface further includes a function button area, the function button area includes a drawing button, and the first operation is an operation on the drawing button.
  • The size of the second layer is the same as the size of the first layer.
  • The second layer is transparent.
  • The instructions further include: an instruction to store the graffiti content in response to an operation of adding or editing the graffiti content, where the graffiti content is saved in a sparse dot-matrix manner.
  • The instructions further include: an instruction to automatically extend the content editing area by a preset size after detecting that the added graffiti content or multimedia content reaches or exceeds the preset position of the content editing area.
  • An embodiment of the present application provides a chip, which is coupled to a memory in a terminal, so that when running, the chip calls a computer program stored in the memory to implement the method of the first aspect of the embodiments of the present application and any possible design of the first aspect.
  • A computer storage medium stores a computer program; when the computer program runs on a terminal, the terminal is caused to perform the method of the first aspect and any possible design of the first aspect.
  • A computer program product, when run on a terminal, causes the terminal to perform the method of the first aspect and any possible design of the first aspect.
  • FIG. 1 is a schematic diagram of a hardware structure of a terminal provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a user interface in an embodiment of this application.
  • FIG. 3 is a schematic diagram of the positional relationship between the first layer and the second layer according to an embodiment of the application;
  • FIG. 4 is a schematic diagram of a user interface in another embodiment of this application.
  • FIG. 6 is a schematic diagram of a storage format of graffiti content according to an embodiment of the application.
  • FIG. 7 is a schematic diagram of expansion of a content editing area according to an embodiment of the application.
  • FIG. 8 is a schematic diagram of a user interface according to another embodiment of this application.
  • FIG. 9 is a schematic flowchart of a content editing method according to an embodiment of this application.
  • FIG. 10 is a schematic structural diagram of a terminal according to an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of a terminal according to another embodiment of the present application.
  • “At least one” refers to one or more, and “multiple” refers to two or more.
  • “And/or” describes an association between related objects and indicates that three relationships may exist.
  • For example, A and/or B can represent the following three relationships: A exists alone, both A and B exist, and B exists alone.
  • A and B can each be singular or plural.
  • The character “/” generally indicates an “or” relationship between the related objects.
  • “At least one (item) of the following” or similar expressions refer to any combination of these items, including a single item or any combination of multiple items.
  • For example, at least one of a, b, or c can represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c can each be singular or plural.
  • In the embodiments of the present application, the terminal may be a portable terminal, such as a mobile phone, a tablet computer, a wearable device with a wireless communication function (such as a smart watch), an in-vehicle device, and the like.
  • Portable terminals include, but are not limited to, portable terminals running various operating systems.
  • The above-mentioned portable terminal may also be a laptop computer (Laptop) with a touch-sensitive surface (for example, a touch panel) or the like.
  • the terminal 100 may also be a desktop computer with a touch-sensitive surface (such as a touch panel).
  • The terminal 100 may include a processor 110, an internal memory 121, an external memory interface 122, an antenna 1, a mobile communication module 131, an antenna 2, a wireless communication module 132, an audio module 140, a speaker 140A, a receiver 140B, a microphone 140C, a headset interface 140D, a display screen 151, a subscriber identity module (SIM) card interface 152, a camera 153, a button 154, a sensor module 160, a universal serial bus (USB) interface 170, a charging management module 180, a power management module 181, and a battery 182.
  • the terminal 100 may further include a motor, an indicator, and the like.
  • the processor 110 may include one or more processing units.
  • For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • The memory in the processor 110 may be a cache.
  • The memory may store instructions or data that the processor 110 has just used or used repeatedly. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated access and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the terminal 100.
  • the internal memory 121 may include a storage program area and a storage data area.
  • The storage program area may store an operating system and an application program required by at least one function (such as a sound playback function, an image playback function, etc.).
  • the storage data area may store data (such as audio data, phone book, memo, etc.) created during the use of the terminal 100 and the like.
  • The internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and so on.
  • a non-volatile memory such as at least one disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and so on.
  • the external memory interface 122 may be used to connect an external memory card (for example, a Micro SD card) to realize the expansion of the storage capacity of the terminal 100.
  • the external memory card communicates with the processor 110 through the external memory interface 122 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the terminal 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization.
  • For example, the antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, an antenna may be used in combination with a tuning switch.
  • the mobile communication module 131 can provide a wireless communication solution including 2G / 3G / 4G / 5G and the like applied to the terminal 100.
  • the mobile communication module 131 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and so on.
  • The mobile communication module 131 can receive an electromagnetic wave signal through the antenna 1, perform processing such as filtering and amplification on the received electromagnetic wave signal, and transmit the signal to the modem processor for demodulation.
  • The mobile communication module 131 can also amplify the signal modulated by the modem processor and convert it into an electromagnetic wave signal through the antenna 1 to radiate it out.
  • at least part of the functional modules of the mobile communication module 131 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 131 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be transmitted into a high-frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 140A, a receiver 140B, etc.), or displays an image or video through a display screen 151.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110, and may be set in the same device as the mobile communication module 131 or other functional modules.
  • the wireless communication module 132 can provide wireless local area networks (wireless local area networks, WLAN) (such as Wi-Fi networks), Bluetooth (bluetooth, BT), and global navigation satellite system (GNSS) that are applied to the terminal 100 , Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR) and other wireless communication solutions.
  • the wireless communication module 132 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 132 receives the electromagnetic wave signal via the antenna 2, frequency-modulates and filters the electromagnetic wave signal, and sends the processed signal to the processor 110.
  • the wireless communication module 132 can also receive the signal to be transmitted from the processor 110, perform frequency modulation and amplification on it, and convert it to an electromagnetic wave signal via the antenna 2 to radiate it out.
  • the antenna 1 and the mobile communication module 131 are coupled, and the antenna 2 and the wireless communication module 132 are coupled so that the terminal 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS), etc.
  • the terminal 100 may implement audio functions through an audio module 140, a speaker 140A, a receiver 140B, a microphone 140C, a headphone interface 140D, an application processor, and the like. For example, music playback, recording, etc.
  • the audio module 140 can be used to convert digital audio information into an analog audio signal output and also to convert analog audio input into a digital audio signal.
  • the audio module 140 may also be used to encode and decode audio signals.
  • the audio module 140 may be disposed in the processor 110, or some functional modules of the audio module 140 may be disposed in the processor 110.
  • The speaker 140A, also called a “loudspeaker”, is used to convert audio electrical signals into sound signals.
  • The terminal 100 can be used to listen to music through the speaker 140A or to answer a hands-free call.
  • The receiver 140B, also called an “earpiece”, is used to convert audio electrical signals into sound signals.
  • When the terminal 100 answers a call or receives a voice message, the user can listen to the voice by holding the receiver 140B close to the ear.
  • The microphone 140C, also called a “mic”, is used to convert sound signals into electrical signals.
  • the terminal 100 may be provided with at least one microphone 140C.
  • the terminal 100 may be provided with two microphones 140C.
  • the terminal 100 may also implement a noise reduction function.
  • The terminal 100 may also be provided with three, four, or more microphones 140C to collect sound signals, reduce noise, identify sound sources, implement a directional recording function, and so on.
  • the headset interface 140D is used to connect wired headsets.
  • The earphone interface 140D may be the USB interface 170, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface, a cellular telecommunications industry association of the USA (CTIA) standard interface, etc.
  • the terminal 100 may implement a display function through a GPU, a display screen 151, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and connects the display screen 151 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations, and is used for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 151 can be used to display images, videos, and the like.
  • the display screen 151 may include a display panel.
  • The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the terminal 100 may include 1 or N display screens 151, where N is a positive integer greater than 1.
  • the terminal 100 may also implement a shooting function through an ISP, a camera 153, a video codec, a GPU, a display screen 151, an application processor, and so on.
  • The ISP can be used to process the data fed back by the camera 153. For example, when taking a picture, the shutter is opened, light is transmitted to the camera's photosensitive element through the lens, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, so that it is converted into an image visible to the naked eye.
  • The ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image, and can also optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be disposed in the camera 153.
  • the camera 153 may be used to capture still images or video.
  • the object generates an optical image through the lens and projects it onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the terminal 100 may include 1 or N cameras 153, where N is a positive integer greater than 1.
  • The button 154 may include a power button, a volume button, and the like.
  • The button 154 may be a mechanical button or a touch button.
  • The terminal 100 may receive a button input and generate a button signal input related to user settings and function control of the terminal 100.
  • the sensor module 160 may include one or more sensors.
  • the sensor module 160 may further include an environment sensor, a distance sensor, a proximity light sensor, a bone conduction sensor, and the like.
  • the touch sensor 160A may also be referred to as a “touch panel”.
  • the touch sensor 160A may be disposed on the display screen 151, and the touch sensor 160A and the display screen 151 constitute a touch screen, which may also be referred to as a "touch screen”.
  • the touch sensor 160A is used to detect a touch operation acting on or near it.
  • the touch sensor 160A may pass the detected touch operation to the application processor to determine the type of touch event.
  • the terminal 100 may then provide a visual output related to the touch operation through the display screen 151 or provide an auditory output related to the touch operation through the speaker 140A.
  • the touch sensor 160A may also be disposed on the surface of the terminal 100, which is different from the location where the display screen 151 is located.
  • the fingerprint sensor 160 may be used to collect fingerprints.
  • the terminal 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, access to application locks, fingerprint taking pictures, fingerprint answering calls, and the like.
  • the gyro sensor 160C may be used to determine the movement posture of the terminal 100.
  • The angular velocity of the terminal 100 about three axes (i.e., the x, y, and z axes) may be determined through the gyro sensor 160C.
  • The gyro sensor 160C can be used for image stabilization during shooting.
  • For example, the gyro sensor 160C detects the angle at which the terminal 100 shakes, and calculates, based on the angle, the distance that the lens module needs to compensate, allowing the lens to counteract the shake of the terminal 100 through reverse movement, thereby achieving image stabilization.
  • the gyro sensor 160C can also be used in scenarios such as navigation and somatosensory games.
  • the pressure sensor 160D is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 160D may be provided on the display screen 151.
  • The capacitive pressure sensor may include at least two parallel plates made of a conductive material. When a force is applied to the pressure sensor 160D, the capacitance between the electrodes changes, and the terminal 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 151, the terminal 100 detects the intensity of the touch operation by using the pressure sensor 160D. The terminal 100 may also calculate the touched position based on the detection signal of the pressure sensor 160D.
  • Touch operations that act on the same touch position but have different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the memo application icon, an instruction to view the memo is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the memo application icon, an instruction to create a new memo is executed.
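  • As an illustration of the pressure-threshold behaviour described above, the following Kotlin sketch maps a normalized touch pressure to one of the two memo operations. The threshold value and names are assumptions for illustration only, not the terminal's actual implementation.

```kotlin
// Hedged sketch: choose a memo operation from the touch pressure, assuming a single
// normalized pressure value and one "first pressure threshold".
enum class MemoAction { VIEW_MEMO, CREATE_MEMO }

fun actionForTouch(pressure: Float, firstPressureThreshold: Float = 0.5f): MemoAction =
    if (pressure < firstPressureThreshold) MemoAction.VIEW_MEMO else MemoAction.CREATE_MEMO

fun main() {
    println(actionForTouch(0.3f))  // VIEW_MEMO: lighter press views the memo
    println(actionForTouch(0.8f))  // CREATE_MEMO: firmer press creates a new memo
}
```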
  • the acceleration sensor 160E can detect the magnitude of the acceleration of the terminal 100 in various directions (generally three axes). When the terminal 100 is stationary, the magnitude and direction of gravity can be detected. The acceleration sensor 160E can also be used to recognize the posture of the terminal 100, and can also be applied to applications such as horizontal and vertical screen switching, pedometers, and the like.
  • the processor 110 may also include one or more interfaces.
  • the interface may be a SIM card interface 152.
  • the interface may also be a USB interface 170.
  • The interface may also be an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, etc.
  • In the embodiments of the present application, different modules of the terminal 100 may be connected through interfaces, so that the terminal 100 can implement different functions, for example, photographing and processing. It should be noted that the embodiments of the present application do not limit the connection manner of the interfaces in the terminal 100.
  • the SIM card interface 152 may be used to connect the SIM card.
  • the SIM card can be inserted into or removed from the SIM card interface 152 to achieve contact and separation with the terminal 100.
  • the terminal 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 152 can support Nano SIM cards, Micro SIM cards, SIM cards, and the like. Multiple SIM cards can be inserted into the same SIM card interface 152 at the same time. The types of the multiple cards may be the same or different.
  • the SIM card interface 152 can also be compatible with different types of SIM cards.
  • the SIM card interface 152 can also be compatible with external memory cards.
  • the terminal 100 can interact with the network through the SIM card to realize functions such as call and data communication.
  • the terminal 100 may use eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the terminal 100 and cannot be separated from the terminal 100.
  • the USB interface 170 is an interface conforming to the USB standard specification.
  • the USB interface 170 may include a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 170 may be used to connect a charger to charge the terminal 100, or may be used to transfer data between the terminal 100 and peripheral devices. It can also be used to connect headphones and play audio through the headphones.
  • The USB interface 170 can also be used to connect other terminals, such as an augmented reality (AR) device.
  • the charging management module 180 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 180 may receive the charging input of the wired charger through the USB interface 170.
  • the charging management module 180 may receive wireless charging input through the wireless charging coil of the terminal 100. While the charging management module 180 charges the battery 182, the power management module 181 can also supply power to the terminal 100.
  • the power management module 181 is used to connect the battery 182, the charging management module 180, and the processor 110.
  • the power management module 181 receives input from the battery 182 and / or the charging management module 180, and supplies power to the processor 110, internal memory 121, external memory, display screen 151, camera 153, mobile communication module 131, wireless communication module 132, and the like.
  • the power management module 181 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the power management module 181 may also be disposed in the processor 110.
  • the power management module 181 and the charging management module 180 may also be set in the same device.
  • the hardware structure of the terminal 100 shown in FIG. 1 is only an example.
  • the terminal 100 of the embodiment of the present application may have more or less components than shown in the figure, may combine two or more components, or may have different component configurations.
  • the various components shown in the figures can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and / or application specific integrated circuits.
  • The terminal 100 in the embodiments of the present application may implement different functions by installing different applications, where the applications may be native applications, such as Settings, Phone, Camera, etc., or may be third-party applications downloaded from an application store, such as WeChat, etc.
  • The terminal 100 can add or edit multimedia content (such as text, pictures, etc.) and/or graffiti content on the editable user interface, so that the user can conveniently modify or add multimedia content and graffiti content. This helps improve the user experience.
  • the content editing method of the embodiment of the present application can be applied to applications with the function of recording multimedia content.
  • the multimedia content in the embodiments of the present application may include text, audio, video, pictures, moving pictures, and so on.
  • the application with the function of recording multimedia content may be a memo or a document.
  • the following uses a memo as an example to describe in detail the content editing method provided by the embodiment of the present application.
  • When the method is applied to another application with the function of recording multimedia content, the implementation manner is similar to that of the memo and will not be repeated here.
  • the display screen 151 of the terminal 100 displays a main interface, where the main interface includes a memo icon.
  • the main interface may be the user interface 200 shown in FIG. 2.
  • the user interface 200 includes a memo icon 201.
  • the user interface 200 may also include icons of other applications, such as setting icons, camera icons, and gallery icons.
  • the user interface 200 may further include a status bar 202, a hideable navigation bar 203, and a Dock bar 204.
  • the status bar 202 may include the name of the operator (for example, China Mobile, etc.), mobile network (for example, 4G), Bluetooth icon, time, and remaining power.
  • the status bar 202 may further include a WiFi icon, an external device icon, and the like.
  • the navigation bar 203 may include a back button, a home button, and a historical task viewing button (menu button).
  • the Dock bar can include commonly used application icons, such as phone icons, information icons, mail icons, and weather icons. It should be noted that the application icon in the Dock bar can be set according to the needs of the user.
  • the terminal 100 displays the user interface 210 on the display screen 151 in response to the operation of the memo icon 201.
  • the user interface 210 may include a new memo button 211.
  • the user interface 210 may further include a saved memo list 212.
  • the user interface 210 may further include a search box 213.
  • The terminal 100 may, in response to a keyword entered by the user in the search box 213, cause the display screen 151 to display the memos containing the keyword in the saved memo list 212, thereby helping to improve the efficiency with which the user searches for a memo in the saved memo list 212.
  • The operation on the memo icon 201 may be the user's touch operation on the memo icon 201, or another operation, such as a long press or a force press.
  • the terminal 100 may detect the operation of the icon 201 of the memo through the touch sensor 160A.
  • the terminal 100 may also display the user interface 210 on the display screen 151 in response to other operations (for example, a voice instruction of “open memo”, a shortcut gesture operation, etc.).
  • The terminal 100 may respond to the operation on the memo icon 201 in the following manner:
  • After the touch sensor 160A of the terminal 100 detects the operation on the memo icon 201, it sends a touch event to the processor 110 (for example, the application processor).
  • The processor 110 determines that the type of the touch event is an operation to open the memo, and then notifies the display screen 151 to display the user interface 210.
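  • The dispatch chain described above can be pictured with the following Kotlin sketch: the touch sensor reports an event, the application processor classifies it, and the display is told what to show. The TouchEvent, Display, and ApplicationProcessor names are illustrative assumptions, not the terminal's real API.

```kotlin
// Hedged sketch of the touch-event flow: sensor 160A -> processor 110 -> display screen 151.
data class TouchEvent(val x: Int, val y: Int, val targetIcon: String)

class Display {
    fun show(ui: String) = println("display 151 shows: $ui")
}

class ApplicationProcessor(private val display: Display) {
    // Called after the touch sensor detects an operation and sends a touch event.
    fun onTouchEvent(event: TouchEvent) {
        if (event.targetIcon == "memo_icon_201") {
            // The event type is resolved to "open memo", so user interface 210 is shown.
            display.show("user interface 210 (memo list)")
        }
    }
}

fun main() {
    val processor = ApplicationProcessor(Display())
    processor.onTouchEvent(TouchEvent(x = 120, y = 480, targetIcon = "memo_icon_201"))
}
```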
  • the terminal 100 may display an editable user interface on the display screen 151 in response to the operation of the new memo button 211 or the memo saved in the memo list 212.
  • the terminal 100 may also display an editable user interface on the display screen 151 in response to other operations (such as a shortcut gesture operation, or a voice instruction for creating a new memo, or a voice instruction for opening a saved memo, etc.).
  • the editable user interface includes a content editing area.
  • the content editing area may display multimedia content and / or graffiti content.
  • The terminal 100 may display the added or edited graffiti content in the content editing area in response to an operation of adding or editing graffiti content; or, the terminal 100 may display the added or edited multimedia content in the content editing area in response to an operation of adding or editing multimedia content.
  • In the embodiments of the present application, the graffiti content may be generated in response to the user's operation on the multimedia content.
  • The content editing area includes a first layer and a second layer, where the first layer is used to add or edit multimedia content, and the second layer is used to add or edit graffiti content, so that the graffiti content can be generated in response to the user's operation on the multimedia content.
  • The positional relationship between the first layer and the second layer may be as shown in FIG. 3. Taking layer 301 as the first layer and layer 302 as the second layer as an example, when layer 301 is overlaid on layer 302, the terminal 100 may, in response to an operation of adding and/or editing multimedia content, display the added or edited multimedia content in the content editing area; when layer 302 is overlaid on layer 301, the terminal 100 may, in response to an operation of adding and/or editing graffiti content, display the added or edited graffiti content in the content editing area.
  • The terminal 100 may switch the positional relationship between layer 301 and layer 302 in response to the user's operation, thereby enabling the adding and/or editing of multimedia content or graffiti content.
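  • A minimal Kotlin sketch of this two-layer model follows, assuming the editing area simply tracks which layer is currently on top; the class and property names are illustrative, not the patent's implementation.

```kotlin
// Hedged sketch: whichever layer is on top receives the add/edit operations.
enum class LayerKind { MULTIMEDIA, GRAFFITI }

class ContentEditingArea {
    // Index 0 is the layer currently overlaid on the other one.
    private val stack = mutableListOf(LayerKind.MULTIMEDIA, LayerKind.GRAFFITI)

    val topLayer: LayerKind
        get() = stack.first()

    // The user's switching operation (for example, tapping the drawing button) swaps the layers.
    fun switchLayers() = stack.reverse()
}

fun main() {
    val area = ContentEditingArea()
    println(area.topLayer)  // MULTIMEDIA: text and pictures are added or edited
    area.switchLayers()
    println(area.topLayer)  // GRAFFITI: strokes are added or edited
}
```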
  • the operations of editing graffiti content may be operations such as deleting graffiti content, changing the color of the graffiti content, and changing the thickness of the lines of the graffiti content.
  • the operations of adding multimedia content in the embodiment of the present application may include operations such as adding text, pictures, video, and audio.
  • the operations of editing multimedia content may include operations of deleting multimedia content, replacing multimedia content, copying multimedia content, and pasting multimedia content.
  • Take the terminal 100 displaying an editable user interface on the display screen 151 in response to the operation of the new memo button 211 as an example. Since the editable user interface is displayed in response to the new memo button, when the user has not yet added multimedia content and/or graffiti content on the editable user interface, the content editing area may be blank, or may automatically display the time when the memo was created.
  • In response to the operation of the new memo button 211, the editable user interface displayed on the display screen 151 may be the user interface 220 shown in FIG. 2. In other embodiments, in response to the operation of the new memo button 211, the editable user interface displayed on the display screen 151 may be the user interface 230 shown in FIG.
  • the user interface 220 includes a content editing area 221.
  • the content editing area 221 includes a first layer and a second layer.
  • the terminal 100 may cover the first layer on the second layer in response to a touch operation on any position of the content editing area 221.
  • the terminal may display a virtual keyboard on the user interface 230 in response to a touch operation at any position of the content editing area 221.
  • After the virtual keyboard is displayed, the user interface 230 may become the user interface 220 shown in FIG. 2.
  • the terminal 100 may hide the virtual keyboard 223 in response to a pull-down operation on the user interface 220 or the like.
  • the user interface 220 includes a content editing area 221 and a virtual keyboard 223, and the first layer in the content editing area 221 is overlaid on the second layer.
  • the terminal 100 may display and add the corresponding text in the content editing area 221 in response to the user's operation of adding the text through the virtual keyboard 223. It should be noted that the terminal 100 adds text on the first layer.
  • the terminal 100 may also display the added picture, video, or audio in the content editing area 221 in response to the user's operation.
  • the terminal 100 may add pictures, video, or audio on the first layer.
  • the user interface 220 also includes a function bar 222.
  • The function bar 222 includes at least one of a camera button 227, a picture button 226, and a voice button 225, so as to facilitate users adding pictures, audio, and video. It should be noted that the function bar 222 may be hidden or not hidden.
  • the terminal 100 may open the gallery in response to the operation of the picture button 226, and display photos and videos saved in the gallery on the display screen 151, and the like.
  • The terminal 100, in response to an operation of selecting one or more pictures in the gallery, displays the selected one or more pictures in the content editing area 221.
  • the terminal 100 may turn on the camera in response to the operation of the camera button 227 and display the camera's shooting interface on the display screen 151.
  • the terminal 100 detects that the shooting is completed, it may display the photos or videos taken by the camera in the content editing area 221.
  • The terminal 100 can collect sound in the external environment (such as sound made by the user) through the microphone 140C or another sound collection device, and after the collection is completed, display the voice collected through the microphone 140C in the content editing area.
  • the function bar 222 may further include a text editing button 228.
  • The terminal 100 may edit the format of the text displayed in the content editing area 221 in response to the operation of the text editing button 228, which makes it convenient for the user to adjust the format of the text displayed in the content editing area 221.
  • the terminal 100 may switch the positions of the first layer and the second layer in response to the first operation.
  • the first operation may be a shortcut gesture operation, such as a three-finger slide-up operation, or a two-finger slide-down operation, or a voice command.
  • the function bar 222 further includes a drawing button 224.
  • the first operation may be an operation on the drawing button 224.
  • the drawing button in the embodiment of the present application may also be called a graffiti button, or other names, which is not limited. Take the drawing button as an example.
  • In response to the operation of the drawing button 224 included on the user interface 220, the terminal 100 switches the positions of the first layer and the second layer so that the second layer overlays the first layer. Therefore, the terminal 100 can display the added or edited graffiti content in the content editing area 221 in response to an operation of adding or editing graffiti content.
  • The size of the second layer is equal to the size of the first layer. Since the second layer is overlaid on the first layer, the user can graffiti anywhere in the content editing area 221. Therefore, the user can annotate or mark the multimedia content added to the first layer. It should be noted that the size of the first layer and the size of the second layer may also be different.
  • For example, the size of the second layer in the embodiments of the present application may be the size of the area where the multimedia content is located.
  • The size of the content editing area is the size of the first layer.
  • The second layer is transparent.
  • The first layer may be transparent or non-transparent, which is not limited. This helps the terminal 100 display the multimedia content and the graffiti content together.
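  • The effect of a transparent second layer can be illustrated with a simple compositing sketch: pixels of the second layer that carry no graffiti have zero alpha, so the multimedia content on the first layer remains visible beneath them. The pixel model below is an assumption for illustration, not the terminal's rendering code.

```kotlin
// Hedged sketch: composite one graffiti pixel over one multimedia pixel.
data class Pixel(val argb: Long) {
    val alpha: Int get() = ((argb shr 24) and 0xFFL).toInt()
}

// A fully transparent graffiti pixel lets the multimedia pixel below show through.
fun composite(multimediaPixel: Pixel, graffitiPixel: Pixel): Pixel =
    if (graffitiPixel.alpha == 0) multimediaPixel else graffitiPixel

fun main() {
    val textPixel = Pixel(0xFF000000L)    // opaque black text on the first layer
    val emptyPixel = Pixel(0x00000000L)   // untouched area of the transparent second layer
    val strokePixel = Pixel(0xFFFF0000L)  // red graffiti stroke on the second layer
    println(composite(textPixel, emptyPixel) == textPixel)     // true: the text stays visible
    println(composite(textPixel, strokePixel) == strokePixel)  // true: the stroke covers the text here
}
```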
  • the terminal 100 may also display a drawing toolbar in response to the operation of the drawing button 224 included on the user interface 220.
  • The drawing toolbar 411 includes at least one of a pen type and color selection button, an erase button, an undo input button, a redo input button, and a close button.
  • The embodiments of the present application do not limit the buttons included in the drawing toolbar, so that users can graffiti according to their different needs.
  • In response to an operation of the pen type and color selection button, the terminal 100 may display, on the display screen 151, various pens (such as a pencil, a writing brush, etc.) and colors (such as red, yellow, etc.) available for graffiti, where different pens draw different line types.
  • The user can select a pen and a color for graffiti from those displayed on the display screen 151. For example, if the user selects red and the writing brush, then when the user graffitis on the terminal 100, the color of the graffiti content is red and the line type is the line type corresponding to the writing brush.
  • The terminal 100 may erase the graffiti content at the position selected by the user in response to an operation of the erase button.
  • The terminal 100 may undo the most recently input graffiti content in response to an operation of the undo input button.
  • The terminal 100 may restore the most recently undone graffiti content in response to an operation of the redo input button.
  • The terminal 100 may hide the drawing toolbar 411 in response to an operation of the close button 417.
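  • The erase/undo/redo behaviour of the drawing toolbar can be sketched as a pair of stroke stacks, as below. Strokes are represented as plain strings here; the data structures are assumptions for illustration, not the patent's implementation.

```kotlin
// Hedged sketch: undo moves the newest stroke to a redo stack, redo moves it back.
class GraffitiEditor {
    private val strokes = ArrayDeque<String>()
    private val redoStack = ArrayDeque<String>()

    // A new stroke invalidates anything that could previously be redone.
    fun addStroke(stroke: String) { strokes.addLast(stroke); redoStack.clear() }

    // "Undo input": the most recently input stroke is set aside.
    fun undo() { strokes.removeLastOrNull()?.let { redoStack.addLast(it) } }

    // "Redo input": the most recently undone stroke is restored.
    fun redo() { redoStack.removeLastOrNull()?.let { strokes.addLast(it) } }

    // "Erase": remove the stroke at the position selected by the user.
    fun erase(stroke: String) { strokes.remove(stroke) }

    override fun toString() = strokes.toString()
}

fun main() {
    val editor = GraffitiEditor()
    editor.addStroke("stroke1")
    editor.addStroke("stroke2")
    editor.undo()
    println(editor)  // [stroke1]
    editor.redo()
    println(editor)  // [stroke1, stroke2]
}
```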
  • the user interface displayed by the terminal 100 may be the user interface 400 shown in FIG. 4.
  • the user interface 400 includes a content editing area 221 and a function bar 222.
  • The content editing area 221 displays the added multimedia content: “Today's meeting requires that work be done in detail and that in-depth investigations be conducted.
  • The following points are clarified: 1. Put an end to superficiality. 2. Work together to make a difference. 3. Details determine success or failure.”
  • The function bar 222 includes a drawing button 224.
  • the terminal 100 switches the positions of the first layer and the second layer in response to the operation of the drawing button 224 included on the user interface 220, and displays the drawing toolbar 411.
  • a user interface 410 is displayed, where the user interface 410 includes a content editing area 221 and a drawing toolbar 411.
  • the content editing area 221 displays graffiti content 420.
  • the user interface 410 may also include a function bar 222.
  • The first layer is layer 501.
  • The second layer is layer 502.
  • Layer 501 includes the added multimedia content.
  • Layer 502 includes the added graffiti content 420.
  • The content editing area 221 displays an overlay of the added multimedia content on layer 501 and the added graffiti content 420 on layer 502.
  • the terminal 100 may also display multimedia content and graffiti content in the content editing area 221 in response to the operation of the finish button 412.
  • the second layer is overlaid on the first layer.
  • In the embodiments of the present application, editing or adding of the multimedia content and the graffiti content can be switched back and forth. Specifically, the switching can be performed in response to the user's operation. For example, in response to the user's operation on the drawing button, the terminal switches from editing or adding multimedia content to editing or adding graffiti content. As another example, in response to the user's operation of completing the graffiti content, the terminal switches from editing or adding graffiti content to editing or adding multimedia content.
  • the sparse dot matrix method may be used to store the graffiti content in the embodiments of the present application.
  • The format of the graffiti content 420 when stored in a sparse dot matrix may be as shown in FIG. 6.
  • Specifically, as shown in FIG. 6, when the terminal 100 stores the graffiti content 420 in a sparse dot-matrix manner, the stored data includes a data header, a first stroke, and a second stroke. The data header includes an identifier of the graffiti content 420 (such as HWGRAFFI, where the identifier of the graffiti content 420 can be determined according to a preset rule), the version number of the memo (VERSION), the total number of strokes included in the graffiti content 420 (PATH_COUNT), the data length of the graffiti content 420 (DATA_COUNT), an identifier of the starting position of the first stroke (PATH_INDEX1), an identifier of the starting position of the second stroke (PATH_INDEX2), the color index used for the first stroke (COLOR_INDEX1), the color index used for the second stroke (COLOR_INDEX2), the pen type index used for the first stroke (PEN_INDEX1), and the pen type index used for the second stroke (PEN_INDEX2).
  • The first stroke includes the type identifier of the first stroke (STROKE1), which indicates the line type used for the first stroke, and the coordinates (X, Y) and width (WIDTH) of each point constituting the first stroke.
  • The second stroke includes the type identifier of the second stroke (STROKE2), which indicates the line type used for the second stroke, and the coordinates (X, Y) and width (WIDTH) of each point constituting the second stroke.
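  • The layout described above can be modeled roughly as follows: a header with per-stroke indices, followed by the strokes, each stroke holding its line-type identifier and the (x, y, width) values of its points. The field names follow the description (HWGRAFFI, PATH_COUNT, and so on); the concrete types and byte layout are assumptions, since the format is only shown schematically.

```kotlin
// Hedged sketch of the sparse dot-matrix layout: only recorded stroke points are kept,
// not a full bitmap, which is why the graffiti content does not depend on the image size.
data class GraffitiPoint(val x: Float, val y: Float, val width: Float)

data class Stroke(val lineType: Int, val points: List<GraffitiPoint>)  // lineType ~ STROKE1/STROKE2

data class GraffitiHeader(
    val identifier: String = "HWGRAFFI",  // identifier of the graffiti content
    val version: Int,                     // VERSION: version number of the memo
    val pathCount: Int,                   // PATH_COUNT: total number of strokes
    val dataCount: Int,                   // DATA_COUNT: data length of the graffiti content
    val pathIndex: List<Int>,             // PATH_INDEX1, PATH_INDEX2, ...: start position of each stroke
    val colorIndex: List<Int>,            // COLOR_INDEX1, COLOR_INDEX2, ...: color used by each stroke
    val penIndex: List<Int>               // PEN_INDEX1, PEN_INDEX2, ...: pen type used by each stroke
)

data class GraffitiContent(val header: GraffitiHeader, val strokes: List<Stroke>)

fun main() {
    val stroke1 = Stroke(lineType = 1, points = listOf(GraffitiPoint(10f, 20f, 3f), GraffitiPoint(12f, 24f, 3f)))
    val stroke2 = Stroke(lineType = 2, points = listOf(GraffitiPoint(50f, 60f, 5f)))
    val content = GraffitiContent(
        header = GraffitiHeader(
            version = 1, pathCount = 2, dataCount = 3,
            pathIndex = listOf(0, 2), colorIndex = listOf(0, 1), penIndex = listOf(0, 0)
        ),
        strokes = listOf(stroke1, stroke2)
    )
    println(content.header.pathCount)  // 2
}
```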
  • the terminal 100 saves the multimedia content and / or graffiti content displayed in the same content editing area into the same file.
  • For example, the multimedia content “Today's meeting requires that work be done in detail and that in-depth investigations be conducted. The following points are clarified: 1. Put an end to superficiality. 2. Work together to make a difference. 3. Details determine success or failure.” and the graffiti content 420 are saved in the same file.
  • When the second layer of the content editing area 221 is overlaid on the first layer, if the terminal 100 detects that the position of the graffiti content input by the user reaches a preset position in the content editing area (for example, the graffiti content entered by the user is 400 pixels from the bottom of the content editing area), the terminal 100 automatically expands the content editing area by a preset size.
  • Specifically, when the second layer of the content editing area is overlaid on the first layer, the terminal 100 may detect, through the processor 110 (for example, the application processor), that the position of the graffiti content input by the user reaches the preset position in the content editing area, and then automatically expand the content editing area by the preset size.
  • it should be noted that automatically expanding the content editing area by a preset size means automatically expanding the second layer and the first layer by the preset size at the same time.
  • for example, when the display screen 151 of the terminal 100 displays the user interface 410 shown in FIG. 4, if the terminal detects that the graffiti content 700 shown in FIG. 7, which the user is adding on the second layer 502, reaches position A, the second layer and the first layer automatically extend downward by the preset size. The shaded part in FIG. 7 is the extended part of the first layer and the second layer.
  • the preset size can be set as required. Typically, the preset size is set to the size of the initial content editing area. It should be understood that when the terminal 100 responds to the operation on the new memo button 211, the size of the content editing area included in the user interface displayed on the display screen 151 is the size of the initial content editing area.
  • in addition, because the storage space of the terminal 100 is limited, to avoid expanding the content editing area without limit, after detecting that the position of the graffiti content input by the user reaches the preset position in the content editing area, the terminal 100 automatically extends the area downward by the preset size only if it determines that the current size of the content editing area does not exceed a maximum limit. When the current size of the content editing area reaches the maximum limit, the terminal 100 no longer extends the content editing area downward.
  • when the first layer is overlaid on the second layer, the terminal 100 may also automatically expand the size of the content editing area when it detects that the multimedia content added by the user exceeds the current size of the content editing area. For the way in which the content editing area is automatically expanded, refer to the method described above, which is not repeated here. A rough sketch of the expansion rule follows.
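The expansion rule sketched below summarises the behaviour described above: grow by a preset step when the input approaches the bottom of the area, but never beyond a maximum limit. Only the 400-pixel example and the existence of a maximum limit come from the description; the constants and function names here are assumptions for illustration.

```kotlin
// Hedged sketch of the auto-expansion decision for the content editing area.
const val EXPAND_TRIGGER_PX = 400      // example distance from the bottom given in the text
const val INITIAL_HEIGHT_PX = 2000     // assumed size of the initial content editing area
const val MAX_HEIGHT_PX = 20000        // assumed maximum limit imposed by storage constraints

// Returns the new height of the content editing area (both layers are resized together).
fun maybeExpand(currentHeight: Int, lowestInputY: Int,
                step: Int = INITIAL_HEIGHT_PX): Int {
    val nearBottom = currentHeight - lowestInputY <= EXPAND_TRIGGER_PX
    val belowLimit = currentHeight < MAX_HEIGHT_PX
    return if (nearBottom && belowLimit) {
        (currentHeight + step).coerceAtMost(MAX_HEIGHT_PX)
    } else {
        currentHeight
    }
}

fun main() {
    var height = INITIAL_HEIGHT_PX
    height = maybeExpand(height, lowestInputY = 1700)   // within 400 px of the bottom -> expand
    println(height)                                     // 4000
    height = maybeExpand(height, lowestInputY = 1000)   // far from the bottom -> unchanged
    println(height)                                     // 4000
}
```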
  • for example, as shown in FIG. 8, the display screen 151 of the terminal 100 displays a user interface 800, where the user interface 800 includes a content editing area 801, and the area 802 is the part added after the initial content editing area is expanded.
  • the terminal 100 may respond to the user's up-and-down sliding operation in the area 810 of the display screen 151, so that the display screen 151 displays the content at the corresponding position in the content editing area 801.
  • taking the user interface 210 shown in FIG. 2 as an example, the terminal 100 in the embodiments of the present application may also respond to the user's operation of selecting a memo in the saved memo list 212, and display the user interface of the selected memo on the display screen 151.
  • the terminal may respond to the user's operations to edit and/or add multimedia content or graffiti content on the user interface of the selected memo.
  • the specific way of adding or editing graffiti content and multimedia content is similar to the way of adding or editing graffiti content and multimedia content in a newly created memo, and is not repeated here.
  • in combination with the foregoing embodiments and the accompanying drawings, the embodiments of the present application provide a content editing method, which can be implemented in a terminal 100 having the hardware structure shown in FIG. 1.
  • FIG. 9 is a schematic flowchart of a content editing method provided by an embodiment of the present application. The method includes the following steps.
  • Step 901: the terminal displays an editable user interface, where the editable user interface displays multimedia content and graffiti content, the graffiti content is generated in response to the user's operation on the multimedia content, the editable user interface includes a content editing area, and the content editing area is used to add or edit the multimedia content and/or the graffiti content.
  • Step 902: the terminal displays the added or edited graffiti content in the content editing area in response to the operation of adding or editing the graffiti content; or the terminal displays the added or edited multimedia content in the content editing area in response to the operation of adding or editing the multimedia content.
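Steps 901 and 902 amount to a display step followed by a dispatch on the kind of editing operation. One possible way to express that dispatch, purely as an illustrative sketch with invented names (EditOperation, EditableUserInterface), is:

```kotlin
// Illustrative dispatch for step 902: each operation updates one kind of content,
// and the updated content is then shown in the content editing area (step 901's UI).
sealed class EditOperation {
    data class AddOrEditMultimedia(val content: String) : EditOperation()
    data class AddOrEditGraffiti(val strokeDescription: String) : EditOperation()
}

class EditableUserInterface {
    private val multimedia = mutableListOf<String>()
    private val graffiti = mutableListOf<String>()

    fun display() = println("editing area -> multimedia=$multimedia graffiti=$graffiti") // step 901

    fun handle(op: EditOperation) {                                                      // step 902
        when (op) {
            is EditOperation.AddOrEditMultimedia -> multimedia.add(op.content)
            is EditOperation.AddOrEditGraffiti -> graffiti.add(op.strokeDescription)
        }
        display()   // the added or edited content is displayed in the content editing area
    }
}

fun main() {
    val ui = EditableUserInterface()
    ui.display()
    ui.handle(EditOperation.AddOrEditMultimedia("meeting notes"))
    ui.handle(EditOperation.AddOrEditGraffiti("stroke near point A"))
}
```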
  • the method provided by the embodiments of the present application is introduced from the perspective of the terminal as an execution subject.
  • the terminal may include a hardware structure and / or a software module, and implement the above functions in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether one of the above functions is executed in a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application of the technical solution and design constraints.
  • as shown in FIG. 10, an embodiment of the present application discloses a terminal 1000.
  • the terminal 1000 may include: a display screen 1001, one or more processors 1002, a memory 1003, a plurality of application programs 1004, and one or more computer programs 1005, where the above components may be connected through one or more communication buses 1006.
  • the one or more computer programs 1005 include instructions that are stored in the memory 1003 and configured to be executed by the one or more processors 1002 to implement the content editing method provided by the embodiments of the present application.
  • based on the same concept, FIG. 11 shows a terminal 1100 of the present application, including a display module 1101 and a processing module 1102, where the display module 1101 may be configured to perform step 901 in the content editing method shown in FIG. 9, and the processing module 1102 is configured to respond to the operation of adding or editing graffiti content or multimedia content in step 902, so that the display module 1101 performs the corresponding display.
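The split in FIG. 11 between a display module and a processing module can be sketched as two small components, where the processing module handles the add/edit operation and then asks the display module to show the result. The interface names and the call style below are assumptions, not structure mandated by the patent.

```kotlin
// Illustrative decomposition mirroring FIG. 11: the processing module reacts to
// add/edit operations and asks the display module to show the result.
interface DisplayModule {                 // roughly corresponds to display module 1101
    fun showEditableUserInterface(state: String)
}

class ConsoleDisplayModule : DisplayModule {
    override fun showEditableUserInterface(state: String) = println("display: $state")
}

class ProcessingModule(private val display: DisplayModule) {   // roughly module 1102
    private val contents = mutableListOf<String>()
    fun onAddOrEdit(content: String) {
        contents.add(content)                                  // handle the operation
        display.showEditableUserInterface(contents.toString()) // trigger the display
    }
}

fun main() {
    val processing = ProcessingModule(ConsoleDisplayModule())
    processing.onAddOrEdit("graffiti stroke")
    processing.onAddOrEdit("paragraph of text")
}
```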
  • the processors involved in the foregoing embodiments may be general-purpose processors, digital signal processors (DSPs), application-specific integrated circuits (ASICs), field programmable gate arrays (FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, or discrete hardware components.
  • such a processor may implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of the present application.
  • the general-purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the steps of the method disclosed in conjunction with the embodiments of the present application may be directly embodied and executed by a hardware decoding processor, or may be executed and completed by a combination of hardware and software modules in the decoding processor.
  • the software module may be located in a storage medium mature in the art, such as a random access memory (RAM), a flash memory, a read-only memory (ROM), a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory, and the processor reads the instructions in the memory and completes the steps of the above method in combination with its hardware.
  • the disclosed system, device, and method may be implemented in other ways.
  • the device embodiments described above are only schematic.
  • the division of the units is only a division of logical functions; in actual implementation there may be other division manners. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
  • if the function is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium.
  • based on this understanding, the technical solution of the present application essentially, or the part that contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions used to enable a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application.
  • the foregoing storage media include various media that can store program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A content editing method and a terminal. The method includes: a terminal displays an editable user interface (901), where the editable user interface displays multimedia content and graffiti content, the graffiti content is generated in response to a user's operation on the multimedia content, the editable user interface includes a content editing area, and the content editing area is used to add or edit the multimedia content and/or the graffiti content; the terminal displays the added or edited graffiti content in the content editing area in response to an operation of adding or editing the graffiti content, or the terminal displays the added or edited multimedia content in the content editing area in response to an operation of adding or editing the multimedia content (902). The method helps a user modify the multimedia content and/or the graffiti content, and thereby helps to improve user experience.

Description

一种内容编辑的方法及终端
本申请要求于2018年10月16日提交中国国家知识产权局,申请号为CN 201811202712.6、发明名称为“一种内容编辑的方法及终端”的中国专利申请,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端技术领域,特别涉及一种内容编辑的方法及终端。
背景技术
随着终端技术的进步,终端(例如手机、平板电脑等)等广泛应用于人们的工作和生活中。用户可以随时随地通过终端来记录心情、笔记、计划等信息。因此如何使得用户能够使用终端更便捷地记录如文字、图片、涂鸦等信息,具有重要的实用价值。
发明内容
本申请实施例提供了一种内容编辑的方法及终端,有助于使得用户在使用终端记录信息更加便捷,从而提高用户体验。
第一方面,本申请实施例提供的一种内容编辑的方法,该方法包括:
终端显示可编辑的用户界面,其中,可编辑的用户界面显示多媒体内容和涂鸦内容,涂鸦内容为响应用户在多媒体内容上操作生成,可编辑的用户界面包括内容编辑区域,内容编辑区域用于添加或编辑多媒体内容和/或涂鸦内容。终端可以响应于添加或编辑涂鸦内容的操作,在内容编辑区域显示添加或编辑后的涂鸦内容;或者,终端响应于添加或编辑多媒体内容的操作,在内容编辑区域显示添加或编辑后的多媒体内容。
本申请实施例中由于可编辑的用户界面上显示多媒体内容和涂鸦内容时,能够编辑和/或添加多媒体内容或者涂鸦内容,因而有助于用户修改的多媒体内容和/或涂鸦内容,从而有助于提高用户体验。
应理解,本申请实施例中多媒体内容可以为文字、图片、音频、视频、动图等。
在一些实施例中,终端响应用户的切换操作,切换对多媒体内容和涂鸦内容的编辑。通过上述技术方案,有助于简化用户编辑或添加多媒体内容和涂鸦内容的实现方式。
在一些实施例中,内容编辑区域包括第一图层和第二图层。第一图层用于添加或编辑多媒体内容,第二图层用于添加或编辑涂鸦内容;终端响应于用户的第一操作,切换第一图层与第二图层的位置后,响应于添加或编辑涂鸦内容的操作,在内容编辑区域显示添加或编辑后的涂鸦内容。从而使得用户可以在第二图层的任意位置涂鸦。
在一些实施例中,可编辑的用户界面还包括功能按钮区域,功能按钮区域 包括绘图按钮,第一操作为对绘图按钮的操作。从而有助于简化用户的操作方式。
在一些实施例中,第二图层的尺寸大小和第一图层的尺寸大小相同。从而使得用户可以在内容编辑区域的任意位置涂鸦。
在一些实施例中,第二图层为透明的。通过上述技术方案,有助于用户在多媒体内容上涂鸦。
在一些实施例中,终端响应于添加或编辑涂鸦内容的操作,存储涂鸦内容,其中,涂鸦内容是以稀疏点阵的方式保存的。有助于使得涂鸦内容不受图像大小的限制。
在一些实施例中,终端检测添加的涂鸦内容或多媒体内容达到或超过内容编辑区域的预设位置后,自动将内容编辑区域扩展预设大小。从而有助于用户添加涂鸦内容和/或多媒体内容。
第二方面,本申请实施例提供了一种终端,显示屏、一个或多个处理器、存储器;多个应用程序;以及一个或多个计算机程序,其中所述一个或多个计算机程序被存储在所述存储器中,所述一个或多个计算机程序包括指令,当所述指令被所述终端执行时,使得所述终端执行以下步骤:
显示可编辑的用户界面;其中,所述可编辑的用户界面显示多媒体内容和涂鸦内容,所述涂鸦内容为响应用户在所述多媒体内容上操作生成,所述可编辑的用户界面包括内容编辑区域,所述内容编辑区域用于添加或编辑所述多媒体内容和/或涂鸦内容;
响应于添加或编辑所述涂鸦内容的操作,在所述内容编辑区域显示添加或编辑后的涂鸦内容;
响应于添加或编辑所述多媒体内容的操作,在所述内容编辑区域显示添加或编辑后的多媒体内容。
在一种可能的设计中,所述指令还包括:响应用户的切换操作切换对所述多媒体内容和所述涂鸦内容的编辑的指令。
在一种可能的设计中,所述内容编辑区域包括第一图层和第二图层;所述第一图层用于添加或编辑多媒体内容,所述第二图层用于添加或编辑涂鸦内容;所述指令还包括:
响应于用户的第一操作,切换所述第一图层与所述第二图层的位置的指令。
在一种可能的设计中,所述可编辑的用户界面还包括功能按钮区域,所述功能按钮区域包括绘图按钮,所述第一操作为对绘图按钮的操作。
在一种可能的设计中,所述第二图层的尺寸大小和所述第一图层的尺寸大小相同。
在一种可能的设计中,所述第二图层为透明的。
在一种可能的设计中,所述指令还包括:响应于添加或编辑所述涂鸦内容的操作存储所述涂鸦内容的指令,其中,所述涂鸦内容是以稀疏点阵的方式保存的。
在一种可能的设计中,所述指令还包括:检测添加的涂鸦内容或多媒体内容达到或超过所述内容编辑区域的预设位置后,自动将所述内容编辑区域扩展 预设大小的指令。
第三方面,本申请实施例提供的一种芯片,所述芯片与终端中的存储器耦合,使得所述芯片在运行时调用所述存储器中存储的计算机程序,实现本申请实施例第一方面以及第一方面提供的任一可能设计的方法。
第四方面,本申请实施例的一种计算机存储介质,该计算机存储介质存储有计算机程序,当所述计算机程序在终端上运行时,使得终端执行第一方面以及第一方面任意一种可能的设计的方法。
第五方面,本申请实施例的一种计算机程序产品,当所述计算机程序产品在终端上运行时,使得所述终端执行第一方面以及第一方面任意一种可能的设计的方法。
另外,第二方面至第五方面中任一种可能设计方式所带来的技术效果可参见第一方面中不同设计方式所带来的技术效果,此处不再赘述。
附图说明
图1为本申请实施例提供的一种终端的硬件结构的示意图;
图2为本申请的一实施例中的用户界面示意图;
图3为本申请一实施例第一图层和第二图层的位置关系示意图;
图4为本申请另一实施例中的用户界面的示意图;
图5为本申请另一实施例中的图层的示意图;
图6为本申请一实施例涂鸦内容的存储格式示意图;
图7为本申请一实施例内容编辑区域的扩展示意图;
图8为本申请另一实施例的用户界面的示意图;
图9为本申请实施例内容编辑方法的流程示意图;
图10为本申请一实施例终端的结构示意图;
图11为本申请另一实施例终端的结构示意图。
具体实施方式
应理解,本申请实施例中“至少一个”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系。例如,A和/或B,可以表示以下三种关系:单独存在A,同时存在A和B,单独存在B。其中A、B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一(项)个”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a、b或c中的至少一项(个),可以表示:a,b,c,a和b,a和c,b和c,或a、b和c,其中a、b、c可以是单个,也可以是多个。
应理解,本申请实施例可以应用于终端中。本申请实施例中终端可以为便携式终端,诸如手机、平板电脑、具备无线通讯功能的可穿戴设备(如智能手表)、车载设备等。便携式终端的示例性实施例包括但不限于搭载
Figure PCTCN2019110915-appb-000001
Figure PCTCN2019110915-appb-000002
或者其它操作系统的便携式终端。上述便携式终端也可以是诸如具有触敏表面(例如触控面板)的膝上型计算机(Laptop)等。还应当理解的是,在本申请其他一些实施例中,终端100也可以是具有触敏表面(例如触控面板)的台式计算机。
如图1所示,为本申请实施例提供的一种终端的硬件结构示意图。具体的,终端100可以包括处理器110、内部存储器121、外部存储器接口122、天线1、移动通信模块131、天线2、无线通信模块132、音频模块140、扬声器140A、受话器140B、麦克风140C、耳机接口140D、显示屏151、用户标识模块(subscriber identification module,SIM)卡接口152、摄像头153、按键154、传感器模块160、通用串行总线(universal serial bus,USB)接口170、充电管理模块180、电源管理模块181和电池182。在另一些实施例中,终端100还可以包括马达、指示器等。
其中,处理器110可以包括一个或多个处理单元。例如:处理器110可以包括应用处理器(application processor,AP)、调制解调处理器、图形处理器(graphics processing unit,GPU)、图像信号处理器(image signal processor,ISP)、控制器、视频编解码器、数字信号处理器(digital signal processor,DSP)、基带处理器、和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
在一些实施例中,处理器110中还可以设置存储器,用于存储指令和数据。示例的,处理器110中的存储器可以为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从该存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行终端100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等。存储数据区可存储终端100使用过程中所创建的数据(比如音频数据、电话本、备忘录等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、通用闪存存储器(universal flash storage,UFS)等。
外部存储器接口122可以用于连接外部存储卡(例如,Micro SD卡),实现扩展终端100的存储能力。外部存储卡通过外部存储器接口122与处理器110通信,实现数据存储功能。例如将音乐、视频等文件保存在外部存储卡中。
天线1和天线2用于发射和接收电磁波信号。终端100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块131可以提供应用在终端100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块131可以包括至少一个滤波器、开关、功率放大器、低噪声放大器(low noise amplifier,LNA)等。移动通信模块131可以由天线1接收电磁波信号,并对接收的电磁波信号进行滤波、放大等处理,传送至调制解调处理器进行解调。移动通信模块131还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波信号辐射出去。在一些实施例中,移动通信模块131的至少部分功能模块可以被设置于处理器110中。在一些实施例中, 移动通信模块131的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器140A、受话器140B等)输出声音信号,或通过显示屏151显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块131或其他功能模块设置在同一个器件中。
无线通信模块132可以提供应用在终端100上的包括无线局域网(wireless local area networks,WLAN)(如Wi-Fi网络)、蓝牙(bluetooth,BT)、全球导航卫星系统(global navigation satellite system,GNSS)、调频(frequency modulation,FM)、近距离无线通信技术(near field communication,NFC)、红外技术(infrared,IR)等无线通信的解决方案。无线通信模块132可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块132经由天线2接收电磁波信号,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块132还可以从处理器110接收待发送的信号,对其进行调频、放大,经天线2转为电磁波信号辐射出去。
在一些实施例中,天线1和移动通信模块131耦合,天线2和无线通信模块132耦合,使得终端100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM)、通用分组无线服务(general packet radio service,GPRS)、码分多址接入(code division multiple access,CDMA)、宽带码分多址(wideband code division multiple access,WCDMA)、时分码分多址(time-division code division multiple access,TD-SCDMA)、长期演进(long term evolution,LTE)、BT、GNSS、WLAN、NFC、FM、和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS)、全球导航卫星系统(global navigation satellite system,GLONASS)、北斗卫星导航系统(beidou navigation satellite system,BDS)、准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)等。
终端100可以通过音频模块140、扬声器140A、受话器140B、麦克风140C、耳机接口140D以及应用处理器等实现音频功能。例如音乐播放、录音等。
音频模块140可以用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块140还可以用于对音频信号编码和解码。在一些实施例中,音频模块140可以设置于处理器110中,或将音频模块140的部分功能模块设置于处理器110中。
扬声器140A,也称“喇叭”,用于将音频电信号转换为声音信号。终端100可以通过扬声器140A收听音乐、或接听免提通话。
受话器140B,也称“听筒”,用于将音频电信号转换成声音信号。当终端100 接听电话或语音信息时,可以通过将受话器140B靠近人耳接听语音。
麦克风140C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风140C发声,麦克风140C可以用于采集用户的声音,然后,将用户的声音转换为电信号。终端100可以设置至少一个麦克风140C。在另一些实施例中,终端100可以设置两个麦克风140C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,终端100还可以设置三个、四个或更多麦克风140C,实现声音信号采集、降噪、还可以识别声音来源,实现定向录音功能等。
耳机接口140D用于连接有线耳机。耳机接口140D可以是USB接口130,也可以是3.5mm的开放移动终端平台(open mobile terminal platform,OMTP)标准接口、美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口等。
终端100可以通过GPU、显示屏151、以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏151和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏151可以用于显示图像、视频等。显示屏151可以包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD)、有机发光二极管(organic light-emitting diode,OLED)、有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED)、柔性发光二极管(flex light-emitting diode,FLED)、Miniled、MicroLed、Micro-oLed、量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,终端100可以包括1个或N个显示屏151,N为大于1的正整数。
终端100还可以通过ISP、摄像头153、视频编解码器、GPU、显示屏151以及应用处理器等实现拍摄功能。
ISP可以用于处理摄像头153反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点、亮度、肤色进行算法优化。ISP还可以对拍摄场景的曝光、色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头153可以用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB、YUV等格式的图像信号。在一些实施例中,终端100可以包括1个或N个摄像头153,N为大于1的正整数。
按键154可以包括开机键、音量键等。按键154可以是机械按键,也可以是触摸式按键。终端100可以接收按键输入,产生与终端100的用户设置以及功能控制有关的键信号输入。
传感器模块160可以包括一个或多个传感器。例如,触摸传感器160A、指纹传感器160B、陀螺仪传感器160C、压力传感器160D、加速度传感器160E等。在一些实施例中,传感器模块160还可以包括环境传感器、距离传感器、接近光传感器、骨传导传感器等。
触摸传感器160A,也可称为“触控面板”。触摸传感器160A可以设置于显示屏151,由触摸传感器160A与显示屏151组成触摸屏,也可称为“触控屏”。触摸传感器160A用于检测作用于其上或附近的触摸操作。触摸传感器160A可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。然后终端100可以通过显示屏151提供与触摸操作相关的视觉输出,或者通过扬声器140A提供与触摸操作相关的听觉输出。在另一些实施例中,触摸传感器160A也可以设置于终端100的表面,与显示屏151所处的位置不同。
指纹传感器160可以用于采集指纹。终端100可以利用采集的指纹特性实现指纹解锁、访问应用锁、指纹拍照、指纹接听来电等。
陀螺仪传感器160C可以用于确定终端100的运动姿态。在一些实施例中,可以通过陀螺仪传感器160C确定终端100围绕三个轴(即,x、y和z轴)的角速度。陀螺仪传感器160C可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器160C检测电子设100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消终端100的抖动,实现防抖。陀螺仪传感器160C还可以用于导航、体感游戏等场景。
压力传感器160D用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器160D可以设置于显示屏151。压力传感器160D的种类很多,如电阻式压力传感器、电感式压力传感器、电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。终端100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,终端100根据压力传感器180A检测所述触摸操作强度。终端100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于备忘录的应用图标时,执行查看备忘录的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于备忘录的应用图标时,执行新建备忘录的指令。
加速度传感器160E可检测终端100在各个方向上(一般为三轴)加速度的大小。当终端100静止时可检测出重力的大小及方向。加速度传感器160E还可以用于识别终端100的姿态,也可应用于横竖屏切换、计步器等应用。
在另一些实施例中,处理器110还可以包括一个或多个接口。例如,接口可以为SIM卡接口152。又例如,接口还可以为USB接口170。再例如,接口还可以为集成电路(inter-integrated circuit,I2C)接口、集成电路内置音频(inter-integrated circuit sound,I2S)接口、脉冲编码调制(pulse code modulation,PCM)接口、通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口、移动产业处理器接口(mobile industry processor interface,MIPI)、通 用输入输出(general-purpose input/output,GPIO)接口等。可以理解的是,本申请实施例可以通过接口连接终端100的不同模块,从而使得终端100能够实现不同的功能。例如拍照、处理等。需要说明的是,本申请实施例对终端100中接口的连接方式不作限定。
其中,SIM卡接口152可以用于连接SIM卡。SIM卡可以通过插入SIM卡接口152,或从SIM卡接口152拔出,实现和终端100的接触和分离。终端100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口152可以支持Nano SIM卡、Micro SIM卡、SIM卡等。同一个SIM卡接口152可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口152也可以兼容不同类型的SIM卡。SIM卡接口152也可以兼容外部存储卡。终端100可以通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,终端100可以采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在终端100中,不能和终端100分离。
USB接口170是符合USB标准规范的接口。例如,USB接口170可以包括Mini USB接口、Micro USB接口、USB Type C接口等。USB接口170可以用于连接充电器为终端100充电,也可以用于终端100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。USB接口170还可以用于连接其他终端,例如增强现实技术(augmented reality,AR)设备等。
充电管理模块180用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块180可以通过USB接口170接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块180可以通过终端100的无线充电线圈接收无线充电输入。充电管理模块180为电池182充电的同时,还可以通过电源管理模块181为终端100供电。
电源管理模块181用于连接电池182、充电管理模块180与处理器110。电源管理模块181接收电池182和/或充电管理模块180的输入,为处理器110、内部存储器121、外部存储器、显示屏151、摄像头153、移动通信模块131和无线通信模块132等供电。电源管理模块181还可以用于监测电池容量、电池循环次数、电池健康状态(漏电、阻抗)等参数。在其他一些实施例中,电源管理模块181也可以设置于处理器110中。在另一些实施例中,电源管理模块181和充电管理模块180也可以设置于同一个器件中。
应理解,图1所示的终端100的硬件结构仅是一个示例。本申请实施例的终端100可以具有比图中所示出的更多的或者更少的部件,可以组合两个或更多的部件,或者可以具有不同的部件配置。图中所示出的各种部件可以在包括一个或多个信号处理和/或专用集成电路在内的硬件、软件、或硬件和软件的组合中实现。
另外,应理解,本申请实施例中的终端100可以通过安装不同的应用来实现不同的功能,其中应用可以为原生(native)应用,例如设置、电话、相机等,还可以为从应用商店下载的第三方应用,例如微信等。
下面结合图1所示的终端100对本申请实施例提供的内容编辑的方法进行 详细说明。
本申请实施例中,终端100可以通过在可编辑的用户界面上添加或编辑多媒体内容(例如文字、图片等)和/或涂鸦内容,从而使得用户可以方便的修改或添加多媒体内容和涂鸦内容,有助于提高用户体验。
需要说明的是,本申请实施例的内容编辑的方法可以应用于具有记录多媒体内容功能的应用中。本申请实施例中的多媒体内容可以包括文字、音频、视频、图片、动图等。其中,具有记录多媒体内容功能的应用可以为备忘录,也可以为文档等。下面以备忘录为例,对本申请实施例提供的内容编辑的方法进行详细说明。当本申请实施例提供的方法应用于其它应用时,其实现方式与应用于备忘录类似,在此不再赘述。
终端100的显示屏151显示主界面,其中主界面包括备忘录图标。示例的,主界面可以如图2所示的用户界面200。用户界面200包括备忘录图标201。除此之外,用户界面200还可以包括其它应用的图标,例如设置图标、相机图标、图库图标等。在一些实施例中,用户界面200还可以包括状态栏202、可隐藏的导航栏203和Dock栏204。其中,状态栏202中可以包括运营商的名称(例如,中国移动等)、移动网络(如4G)、蓝牙图标、时间和剩余电量。此外,可以理解的是,在其他一些实施例中,状态栏202中还可以包括WiFi图标、外接设备图标等。导航栏203中可以包括后退按钮(back button)、主屏幕键按钮(home button)和历史任务查看按钮(menu button)。Dock栏中可以包括常用的应用图标,例如电话图标、信息图标、邮件图标和天气图标。需要说明的是,Dock栏中的应用图标可以根据用户的需求进行相应的设定。
终端100响应于对备忘录图标201的操作,在显示屏151显示用户界面210。示例的,用户界面210可以包括新建备忘录按钮211。在其他一些实施例中,当终端100已保存有备忘录时,用户界面210还可以包括已保存的备忘录列表212。在另一些实施例中,用户界面210中还可以包括搜索框213。终端100可以响应于用户在搜索框213中输入的关键字,使得显示屏151显示已保存的备忘录列表212中包括关键字的备忘录,从而有助于提高用户查找从已保存的备忘录列表212中查找备忘录的效率。
需要说明的是,对备忘录图标201的操作可以为用户对备忘录图标201的触摸操作,也可以为其它操作,例如长按或重按等操作。示例的,终端100可以通过触摸传感器160A来检测对备忘录的图标201的操作。此外,本申请实施例中终端100还可以响应于其它操作(例如,“打开备忘录”的语音指令、快捷手势操作等),在显示屏151显示用户界面210。
示例的,以对备忘录图标201的操作为对备忘录的图标201的触摸操作为例,终端100可以基于下列方式响应于对备忘录的图标201的操作:终端100的触摸传感器160A检测到对备忘录图标201的操作后,向处理器110(例如应用处理器)发送触摸事件,处理器110接收到触摸事件后,确定触摸事件的类型为打开备忘录的操作,然后通知显示屏151显示用户界面210。
终端100可以响应于对新建备忘录按钮211或者备忘录列表212中已保存的备忘录的操作,在显示屏151显示可编辑的用户界面。此外,终端100还可 以响应于其它操作(例如快捷手势操作,或者新建备忘录的语音指令、又或者打开某一已保存的备忘录的语音指令等),在显示屏151显示可编辑的用户界面。
其中,可编辑的用户界面包括内容编辑区域。内容编辑区域可以显示多媒体内容和/或涂鸦内容。终端100可以响应于添加或编辑多媒体内容的操作,在内容编辑区域显示添加或编辑后的涂鸦内容;或者,终端100响应于添加或编辑多媒体内容的操作,在内容编辑区域显示添加或编辑后的多媒体内容。本申请实施例中涂鸦内容可以响应用户在多媒体内容上的操作生成。在一些实施例中,内容编辑区域包括第一图层和第二图层,其中,第一图层用户添加或编辑多媒体内容,第二图层用于添加或编辑涂鸦内容,从而有助于简化涂鸦内容可以响应用户在多媒体内容上的操作生成的方式。示例的,第一图层和第二图层的位置关系可以如图3所示,以图层301为第一图层、图层302为第二图层为例,当图层301覆盖在图层302上时,终端100可以响应于添加和/或编辑多媒体内容的操作,在内容编辑区域显示添加或编辑后的多媒体内容;当图层302覆盖于图层301上时,终端100可以响应于添加和/或编辑涂鸦内容的操作,在内容编辑区域显示添加或编辑后的涂鸦内容。其中,终端100可以响应于用户的操作,实现图层301和图层302的位置关系的切换,从而实现添加和/或编辑多媒体内容或涂鸦内容。
其中,编辑涂鸦内容的操作可以为删除涂鸦内容、改变涂鸦内容的颜色、改变涂鸦内容的线条的粗细等的操作。本申请实施例中添加辑多媒体内容的操作可以包括添加文字、图片、视频、音频等操作,编辑多媒体内容的操作可以包括删除多媒体内容、替换多媒体内容、复制多媒体内容、粘贴多媒体内容等的操作。
以终端100响应于对新建备忘录按钮211的操作,在显示屏151显示可编辑的用户界面为例。由于显示屏151是响应于新建备忘录按钮显示的可编辑的用户界面,当用户还未在可编辑的用户界面上进行多媒体内容和/或涂鸦内容的添加操作时,内容编辑区域可以是空白的,也可以自动显示创建备忘录的时间。
在一些实施例中,响应于对新建备忘录按钮211的操作,在显示屏151显示的可编辑的用户界面可以为如图2所示的用户界面220。在另一些实施例中,响应于对新建备忘录按钮211的操作,在显示屏151显示的可编辑的用户界面可以为如图2所示的用户界面230。用户界面220包括内容编辑区域221。内容编辑区域221包括第一图层和第二图层。终端100可以响应于对内容编辑区域221任意位置的触摸操作,将第一图层覆盖于第二图层之上。另外,终端还可以响应于内容编辑区域221任意位置的触摸操作,在用户界面230显示虚拟键盘。其中用户界面230显示虚拟键盘后的用户界面可以为如图2中所示的用户界面220。在一些实施例中,终端100可以响应于在用户界面220上的下拉操作等,隐藏虚拟键盘223。
其中,用户界面220包括内容编辑区域221和虚拟键盘223,且内容编辑区域221中的第一图层覆盖于第二图层之上。终端100可以响应于用户通过虚拟键盘223添加文字的操作,在内容编辑区域221显示添加相应的文字。需要说明的是,终端100是在第一图层上添加文字的。在一些实施例中,终端100还 可以响应于用户的操作,在内容编辑区域221显示添加的图片、视频或者音频等。示例的,终端100可以在第一图层上添加图片、视频或者音频等。示例的,用户界面220还包括功能栏222。功能栏222包括相机按钮227、图片按钮226、语音按钮225中的至少一个。从而便于用户添加图片、音频和视频等。需要说明的是,功能栏222是可以隐藏的,也可以不隐藏。
终端100可以响应于对图片按钮226的操作,打开图库,并在显示屏151显示图库中保存的相片和视频等。终端100响应于选中图库中某一个或多个图片的操作,并在内容编辑区域223显示选中的图库中的某一个或多个图片。终端100可以响应于对相机按钮227的操作,打开相机,并在显示屏151显示相机的拍摄界面。当终端100检测到拍摄完成后,可以在内容编辑区域221显示通过相机拍摄的相片或者视频等。终端100可以响应于对语音按钮225的操作,通过麦克风140C或者其它声音采集设备来采集外接环境(如用户等发出的)中的声音,并在采集完成后,在内容编辑区域显示通过麦克风140C采集的语音。在一些实施例中,功能栏222中还可以包括文本编辑按钮228。终端100可以响应于对文本编辑按钮228的操作,对内容编辑区域221显示的文字的格式进行编辑。从而便于用户调整内容编辑区域221显示的文字的格式。
需要说明的是,终端100可以响应于第一操作,切换第一图层和第二图层的位置。在一些实施例中,第一操作可以为快捷手势操作,如三指上滑操作、或者两指下滑操作等,也可以为语音指令等。在另一些实施例中,功能栏222还包括绘图按钮224。第一操作可以为对绘图按钮224的操作。需要说明的是,本申请实施例中绘图按钮又可称之为涂鸦按钮,或者其它名称,对此不作限定。以绘图按钮为例。示例的,终端100响应于对用户界面220上包括的绘图按钮224的操作,切换第一图层和第二图层的位置,以使得第二图层覆盖于第一图层上。从而使得终端100可以响应于添加或编辑涂鸦内容的操作,在内容编辑区域221显示添加或编辑后的涂鸦内容。在一些实施例中,第二图层的尺寸大小等于第一图层的尺寸大小,由于第二图层覆盖于第一图层之上,因此,用户可以在内容编辑区域221的任意位置进行涂鸦。从而使得用户的可以对添加到第一图层上的多媒体内容进行批注或标记等。需要说明的是,第一图层的尺寸大小和第二图层的尺寸大小也可以不同。示例的,当第一图层的尺寸大小和第二图层的尺寸大小不同时,本申请实施例中第二图层的尺寸大小可以为多媒体内容所在区域的大小,在该种情况下,内容编辑区域的大小为第一图层的尺寸大小。在又一些实施例中,第二图层是透明的。本申请实施例中第一图层可以是透明的,也可以是非透明的,对此不作限定。从而有助于终端100显示多媒体内容和涂鸦内容。
另外,在一些实施例中,终端100响应于对用户界面220上包括的绘图按钮224的操作,还可以显示绘图工具栏。示例的,绘图工具栏441涂鸦工具栏中包括笔型和颜色选择按钮、擦除按钮、撤销键入按钮、恢复键入按钮和关闭按钮中的至少一个。本申请实施例对绘图工具栏包括的按钮不作限定。从而使得用户可以根据自己不同的需求进行涂鸦。示例的,终端100可以响应于对笔型和颜色选择按钮的操作,可以在显示屏151上显示用于涂鸦的各种笔(例如 铅笔、毛笔等,其中,不同笔画出的线型不同)以及颜色(例如红色、黄色等)。用户可以从显示屏151上显示的用于涂鸦的各种笔和颜色中进行选择,例如用户选择红色和毛笔,则用户在终端100上涂鸦时,涂鸦内容中的颜色为红色,线型为毛笔对应的线型。又示例的,终端100可以响应于对擦除按钮的操作,擦除用户选中位置的涂鸦内容。终端100可以响应于对撤销键入按钮的操作,撤销最近一次输入的涂鸦内容。终端100可以响应于对恢复键入按钮的操作,恢复最近一次撤销的涂鸦内容。终端100可以响应于对关闭按钮417的操作,隐藏涂鸦工具栏411。
示例的,当用户在第一图层上添加的多媒体内容为“今天的会议要求,要把工作做细,深入细节调研。明确以下几点:1.杜绝表面主义2.群策群力3.事无巨细,从细节上抓成败”时,终端100显示的用户界面可以如图4中所示的用户界面400。用户界面400包括内容编辑区域221和功能栏222。内容编辑区域221显示添加的多媒体内容“今天的会议要求,要把工作做细,深入细节调研。明确以下几点:1.杜绝表面主义2.群策群力3.事无巨细,从细节上抓成败”。功能栏223中包括绘图按钮224。终端100响应于对用户界面220上包括的绘图按钮224的操作,切换第一图层和第二图层的位置,以及显示绘图工具栏411。示例的,终端100若响应于用户添加涂鸦内容420的操作,显示用户界面410,其中,用户界面410包括内容编辑区域221和绘图工具栏411。内容编辑区域221显示涂鸦内容420。示例的,用户界面410还可以包括功能栏222。
例如,如图5所示,第一图层为图层501,第二图层为图层502,图层502上包括添加的多媒体内容,图层501上包括添加的涂鸦内容420。内容编辑区域221显示的为图层520上包括添加的多媒体内容和图层501上包括添加的涂鸦内容420的叠加。
需要说明的是,用户可以在图层502上任意位置添加或编辑涂鸦内容。
本申请实施例中,当如图4所示,用户完成添加或编辑涂鸦内容的操作时,可以通过对完成按钮412操作,保存涂鸦内容420。在一些实施例中,终端100还可以响应于对完成按钮412的操作,在内容编辑区域221显示多媒体内容和涂鸦内容。为了便于用户查看备忘录,在一些实施例中,在显示屏151上显示多媒体内容和涂鸦内容时,第二图层覆盖于第一图层之上。终端100在保存涂鸦内容420之后,若检测到用户对内容编辑区域221操作,则可以重新切换第一图层和第二图层的位置,以使得第一图层覆盖于第二图层之上,使得用户可以重新对内容编辑区域中的多媒体内容进行操作。
需要说明的是,本申请实施例中对于多媒体内容和涂鸦内容的编辑或添加是可以交互切换的,具体的,可以响应于用户的操作进行切换。例如,响应于用户对绘图按钮的操作从编辑或添加多媒体内容切换到编辑或添加涂鸦内容。再例如,响应于用户完成涂鸦内容的操作,从编辑或添加涂鸦内容切换到编辑或添加多媒体内容。
其中,为了便于终端100保存涂鸦内容,使得终端100不受涂鸦内容的图像显示大小的限制,本申请实施例中可以采用稀疏点阵的方法存储涂鸦内容。示例的,若图4所示的涂鸦内容420是用户输入两个比划完成的,则涂鸦内容 420以稀疏点阵存储时的格式示意图可以如图6所示。具体的,如图6所示,终端100以稀疏点阵的方式存储涂鸦内容420时,包括数据头、第一笔划和第二笔划,其中数据头包括涂鸦内容420的标识(如HWGRAFFI,其中涂鸦内容420的标识可以根据预设规则确定)、备忘录的版本号(VERSION)、涂鸦内容420所包括的笔划的总数目(PATH_COUNT)、涂鸦内容420的数据长度(DATA_COUNT)、用于提示保存第一笔划的起始位置的标识(PATH_INDEX1)、用于提示保存第二笔划的起始位置的标识(PATH_INDEX2)、第一笔划所使用的颜色索引(COLOR_INDEX1)、第二笔划所使用的颜色索引(COLOR_INDEX2)、第一笔划所使用的笔型索引(PEN_INDEX1)、第二笔划所使用的笔型索引(PEN_INDEX2)。第一笔划包括第一笔划的类型标识(STROKE1),用于指示第一笔划所使用的线型、以及组成第一比划的各个点的坐标(X、Y)以及宽度(WIDTH)。第二笔划包括第二笔划的类型标识(STROKE2),用于指示第二笔划所使用的线型、以及组成第二笔划的各个点的坐标(X、Y)以及宽度(WIDTH)。
需要说明的是,终端100将同一个内容编辑区域显示的多媒体内容和/或涂鸦内容保存到同一个文件中。以图4所述的用户界面410为例,多媒体内容“今天的会议要求,要把工作做细,深入细节调研。明确以下几点:1.杜绝表面主义2.群策群力3.事无巨细,从细节上抓成败”和涂鸦内容420保存在同一个文件中。
本申请实施例中终端100还可以当内容编辑区域221的第二图层覆盖于第一图层之上时,若检测到用户输入的涂鸦内容的位置到达在内容编辑区域中的预设位置(例如用户输入的涂鸦内容距离内容编辑区域的最底部400个像素),则自动将内容编辑区域扩展预设大小。示例的,终端100还可以通过处理器100(例如应用处理器)在内容编辑区域的第二图层覆盖于第一图层之上时,检测到用户输入的涂鸦内容的位置到达在内容编辑区域中的预设位置,则自动将内容编辑区域扩展预设大小。
其中,需要说明的是,本申请实施例中将内容编辑区域自动扩展预设大小指的是自动将第二图层和第一图层同时扩展预设大小。
示例的,终端100的显示屏151显示图4所示的用户界面410时,若检测到用户在第二图层502上添加的如图7所示涂鸦内容700的过程中,涂鸦到位置A后,第二图层和第一图层自动向下扩展预设大小。图7中阴影部分为第一图层和第二图层的扩展部分。其中预设大小可以根据需要进行设定。通常情况下,将预设大小可以设置为初始内容编辑区域的大小。应理解,当终端100响应于对新建备忘录按钮211的操作时,在显示屏151显示的用户界面所包括的内容编辑区域的大小为初始内容编辑区域的大小。此外,通常情况下,由于终端100的存储空间是有限的,为了避免无限制的扩展内容编辑区域,终端100可以在检测到用户输入的涂鸦内容的位置到达在内容编辑区域中的预设位置后,若确定内容编辑区域当前的大小未超过最大限制,则自动向下扩展预设大小。当内容编辑区域当前的大小达到最大限制时,则终端100不再向下扩展内容编辑区域的大小。
需要说明的是,本申请实施例中当第一图层覆盖与第二图层之上时,终端 100当检测到用户添加的多媒体信息超出内容编辑区域当前的大小时,也可以自动扩展内容编辑区域的大小,其中自动扩展内容编辑区域大小的方式可以参见上述自动扩展内容编辑区域大小的方式,在此不再赘述。
例如,如图8所示,终端100的显示屏151显示用户界面800,其中用户界面800包括内容编辑区域801,其中区域802为初始内容编辑区域扩展后的区域。终端100可以响应于用户在显示屏151的区域810上下滑动操作,使得显视屏151显示相应内容编辑区域801中相应位置的内容。
还需要说明的是,以图2所示的用户界面210为例,本申请实施例中终端100还可以响应于用户选中已保存的备忘录列表212中的备忘录的操作,在显示屏151显示选中的备忘录的用户界面。其中终端可以响应于用户的操作,对选中的备忘录的用户界面上的多媒体内容或者涂鸦内容进行相应编辑和/或添加。具体的添加或编辑涂鸦内容和多媒体内容的方式与在新建的备忘录中添加或编辑涂鸦内容和多媒体内容的方式类似,在此不再赘述。
本申请实施例中各个实施例可以相互结合使用,也可以单独使用。
结合上述实施例及附图,本申请实施例提供一种内容编辑的方法,该方法可以在具有图1所示的硬件结构的终端100中实现。
如图9所示,本申请实施例提供的内容编辑方法的流程示意图。包括以下步骤。
步骤901,终端显示可编辑的用户界面,其中,可编辑的用户界面显示多媒体内容和涂鸦内容,涂鸦内容为响应用户在多媒体内容上操作生成,可编辑的用户界面包括内容编辑区域,内容编辑区域用于添加或编辑所述多媒体内容和/或涂鸦内容。
步骤902,终端响应于添加或编辑涂鸦内容的操作,在内容编辑区域显示添加或编辑后的涂鸦内容;或者终端响应于添加或编辑多媒体内容的操作,在内容编辑区域显示添加或编辑后的多媒体内容。
本申请实施例中图9所示的方法的具体实现方式可以参见上述相关实施例的介绍。
上述本申请提供的实施例中,从终端作为执行主体的角度对本申请实施例提供的方法进行了介绍。为了实现上述本申请实施例提供的方法中的各功能,终端可以包括硬件结构和/或软件模块,以硬件结构、软件模块、或硬件结构加软件模块的形式来实现上述各功能。上述各功能中的某个功能以硬件结构、软件模块、还是硬件结构加软件模块的方式来执行,取决于技术方案的特定应用和设计约束条件。
如图10所示,本申请实施例公开了一种终端1000,该终端1000可以包括:显示屏1001、一个或多个处理器1002、存储器1003、和多个应用程序1004;以及一个或多个计算机程序1005,上述各器件可以通过一个或多个通信总线1006连接。其中该一个或多个计算机程序1005包括指令,所述指令被存储在上述存储器1003中并被配置为被该一个或多个处理器1002执行,实现本申请实施例提供的内容编辑的方法。
基于相同的构思,图11所示为本申请的一种终端1100,包括显示模块1101 和处理模块1102,其中显示模块1101可以用于执行图9所示的内容编辑方法中的步骤901,处理模块1102用于执行图9所示的内容编辑方法中的步骤902中响应于添加或编辑涂鸦内容或多媒体内容的操作,使得显示模块1101进行相应的显示。
上述各个实施例中涉及处理器可以是通用处理器、数字信号处理器(digital signal processor,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现成可编程门阵列(field programmable gate array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。可以实现或者执行本申请实施例中的公开的各方法、步骤及逻辑框图。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。结合本申请实施例所公开的方法的步骤可以直接体现为硬件译码处理器执行完成,或者用译码处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存取存储器(random access memory,RAM)、闪存、只读存储器(read-only memory,ROM)、可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器,处理器读取存储器中的指令,结合其硬件完成上述方法的步骤。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技 术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、ROM、RAM、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到的变化或替换,都应涵盖在本申请的保护范围之内,因此本申请的保护范围应以权利要求的保护范围为准。

Claims (18)

  1. A content editing method, wherein the method comprises:
    a terminal displays an editable user interface, wherein the editable user interface displays multimedia content and graffiti content, the graffiti content is generated in response to a user's operation on the multimedia content, the editable user interface comprises a content editing area, and the content editing area is used to add or edit the multimedia content and/or the graffiti content;
    the terminal displays the added or edited graffiti content in the content editing area in response to an operation of adding or editing the graffiti content; or
    the terminal displays the added or edited multimedia content in the content editing area in response to an operation of adding or editing the multimedia content.
  2. The method according to claim 1, wherein the method further comprises:
    the terminal switches between editing the multimedia content and editing the graffiti content in response to a switching operation of the user.
  3. The method according to claim 1 or 2, wherein the content editing area comprises a first layer and a second layer; the first layer is used to add or edit multimedia content, and the second layer is used to add or edit graffiti content;
    before the terminal displays the added or edited graffiti content in the content editing area in response to the operation of adding or editing the graffiti content, the method further comprises:
    the terminal switches the positions of the first layer and the second layer in response to a first operation of the user.
  4. The method according to claim 3, wherein the editable user interface further comprises a function button area, the function button area comprises a drawing button, and the first operation is an operation on the drawing button.
  5. The method according to claim 3 or 4, wherein the size of the second layer is the same as the size of the first layer.
  6. The method according to any one of claims 3 to 5, wherein the second layer is transparent.
  7. The method according to any one of claims 1 to 6, wherein the method further comprises:
    the terminal stores the graffiti content in response to the operation of adding or editing the graffiti content, wherein the graffiti content is saved in a sparse dot-matrix manner.
  8. The method according to any one of claims 1 to 7, wherein the method further comprises:
    after detecting that the added graffiti content or multimedia content reaches or exceeds a preset position of the content editing area, the terminal automatically expands the content editing area by a preset size.
  9. A terminal, comprising: a display screen, one or more processors, and a memory, wherein the processor is connected to the display screen and the memory through a bus;
    a plurality of application programs;
    and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprise instructions, and when the instructions are executed by the terminal, the terminal is enabled to perform the following steps:
    displaying an editable user interface, wherein the editable user interface displays multimedia content and graffiti content, the graffiti content is generated in response to a user's operation on the multimedia content, the editable user interface comprises a content editing area, and the content editing area is used to add or edit the multimedia content and/or the graffiti content;
    displaying the added or edited graffiti content in the content editing area in response to an operation of adding or editing the graffiti content; and
    displaying the added or edited multimedia content in the content editing area in response to an operation of adding or editing the multimedia content.
  10. The terminal according to claim 9, wherein the instructions further comprise:
    an instruction for switching between editing the multimedia content and editing the graffiti content in response to a switching operation of the user.
  11. The terminal according to claim 9 or 10, wherein the content editing area comprises a first layer and a second layer; the first layer is used to add or edit multimedia content, and the second layer is used to add or edit graffiti content; the instructions further comprise:
    an instruction for switching the positions of the first layer and the second layer in response to a first operation of the user.
  12. The terminal according to claim 11, wherein the editable user interface further comprises a function button area, the function button area comprises a drawing button, and the first operation is an operation on the drawing button.
  13. The terminal according to claim 11 or 12, wherein the size of the second layer is the same as the size of the first layer.
  14. The terminal according to any one of claims 11 to 13, wherein the second layer is transparent.
  15. The terminal according to any one of claims 9 to 14, wherein the instructions further comprise:
    an instruction for storing the graffiti content in response to the operation of adding or editing the graffiti content, wherein the graffiti content is saved in a sparse dot-matrix manner.
  16. The terminal according to any one of claims 9 to 15, wherein the instructions further comprise:
    an instruction for automatically expanding the content editing area by a preset size after detecting that the added graffiti content or multimedia content reaches or exceeds a preset position of the content editing area.
  17. A chip, wherein the chip is coupled to a memory in a terminal, so that when running, the chip invokes a computer program stored in the memory to implement the method according to any one of claims 1 to 8.
  18. A computer storage medium, wherein the computer-readable storage medium comprises a computer program, and when the computer program runs on a terminal, the terminal is enabled to perform the method according to any one of claims 1 to 8.
PCT/CN2019/110915 2018-10-16 2019-10-14 一种内容编辑的方法及终端 WO2020078298A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/285,742 US20220005241A1 (en) 2018-10-16 2019-10-14 Content Editing Method and Terminal
EP19873100.2A EP3859680A4 (en) 2018-10-16 2019-10-14 CONTENT PROCESSING PROCEDURES AND TERMINAL DEVICE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811202712.6A CN109523609B (zh) 2018-10-16 2018-10-16 一种内容编辑的方法及终端
CN201811202712.6 2018-10-16

Publications (1)

Publication Number Publication Date
WO2020078298A1 true WO2020078298A1 (zh) 2020-04-23

Family

ID=65772242

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/110915 WO2020078298A1 (zh) 2018-10-16 2019-10-14 一种内容编辑的方法及终端

Country Status (4)

Country Link
US (1) US20220005241A1 (zh)
EP (1) EP3859680A4 (zh)
CN (3) CN115187702A (zh)
WO (1) WO2020078298A1 (zh)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115187702A (zh) * 2018-10-16 2022-10-14 华为技术有限公司 一种内容编辑的方法及终端
CN110196675B (zh) * 2019-04-17 2022-07-15 华为技术有限公司 一种添加批注的方法及电子设备
CN115484396B (zh) * 2021-06-16 2023-12-22 荣耀终端有限公司 一种视频处理方法及电子设备
CN115484399B (zh) * 2021-06-16 2023-12-12 荣耀终端有限公司 一种视频处理方法和电子设备
CN113672134B (zh) * 2021-07-30 2024-06-04 北京搜狗科技发展有限公司 媒体信息编辑方法、装置、计算机可读介质及电子设备
CN118071880A (zh) * 2022-11-22 2024-05-24 荣耀终端有限公司 一种图片编辑方法、电子设备及可读存储介质
CN116774877B (zh) * 2023-08-21 2023-12-19 福昕鲲鹏(北京)信息科技有限公司 Ofd文档页面中涂鸦笔迹的擦除方法及装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130054229A1 (en) * 2011-08-31 2013-02-28 Samsung Electronics Co., Ltd. Portable device and method for multiple recording of data
CN105653195A (zh) * 2016-01-29 2016-06-08 广东欧珀移动通信有限公司 截屏方法及移动终端
CN106095261A (zh) * 2016-06-13 2016-11-09 网易(杭州)网络有限公司 一种在电子设备上添加笔记的方法和装置
CN106155542A (zh) * 2015-04-07 2016-11-23 腾讯科技(深圳)有限公司 图片处理方法及装置
CN109523609A (zh) * 2018-10-16 2019-03-26 华为技术有限公司 一种内容编辑的方法及终端

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7397949B2 (en) * 2000-06-21 2008-07-08 Microsoft Corporation Serial storage of ink and its properties
JP2002208022A (ja) * 2001-01-10 2002-07-26 Reile:Kk 表示制御方法、情報表示装置および媒体
CN1218574C (zh) * 2001-10-15 2005-09-07 华为技术有限公司 交互式视频设备及其字幕叠加方法
WO2005079349A2 (en) * 2004-02-12 2005-09-01 Mobileframe Llc Integrated deployment of software projects
US10318624B1 (en) * 2006-12-28 2019-06-11 Apple Inc. Infinite canvas
US8156445B2 (en) * 2008-06-20 2012-04-10 Microsoft Corporation Controlled interaction with heterogeneous data
US8885977B2 (en) * 2009-04-30 2014-11-11 Apple Inc. Automatically extending a boundary for an image to fully divide the image
US9477667B2 (en) * 2010-01-14 2016-10-25 Mobdub, Llc Crowdsourced multi-media data relationships
US20120324345A1 (en) * 2011-06-14 2012-12-20 Microsoft Corporation Transitioning between an Editing View and a Backstage View of an Electronic Document
CN103309568A (zh) * 2012-03-14 2013-09-18 联想(北京)有限公司 一种电子文档批注方法、装置及终端设备
CN102799365A (zh) * 2012-06-29 2012-11-28 鸿富锦精密工业(深圳)有限公司 电子设备及其圈选打印及传真方法
US9639969B1 (en) * 2013-07-25 2017-05-02 Overlay Studio, Inc. Collaborative design
US9383910B2 (en) * 2013-10-04 2016-07-05 Microsoft Technology Licensing, Llc Autoscroll regions
US20160232144A1 (en) * 2015-02-06 2016-08-11 Liang Zhou Browser extension allowing web users to draw on live web pages
US10897490B2 (en) * 2015-08-17 2021-01-19 E-Plan, Inc. Systems and methods for augmenting electronic content
US10346510B2 (en) * 2015-09-29 2019-07-09 Apple Inc. Device, method, and graphical user interface for providing handwriting support in document editing
CN105446647A (zh) * 2015-12-25 2016-03-30 智慧方舟科技有限公司 一种电纸书及其阅读笔记实现方法、装置
CN107025100A (zh) * 2016-02-01 2017-08-08 阿里巴巴集团控股有限公司 播放多媒体数据的方法、界面渲染方法及装置、设备
CN107291346B (zh) * 2016-04-12 2021-07-06 北京三星通信技术研究有限公司 终端设备的绘制内容处理方法、装置及终端设备
CN107132975A (zh) * 2017-05-26 2017-09-05 努比亚技术有限公司 一种控件编辑处理方法、移动终端以及计算机可读存储介质
WO2020082275A1 (zh) * 2018-10-24 2020-04-30 三星电子株式会社 终端设备的绘制内容处理方法、装置及终端设备

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130054229A1 (en) * 2011-08-31 2013-02-28 Samsung Electronics Co., Ltd. Portable device and method for multiple recording of data
CN106155542A (zh) * 2015-04-07 2016-11-23 腾讯科技(深圳)有限公司 图片处理方法及装置
CN105653195A (zh) * 2016-01-29 2016-06-08 广东欧珀移动通信有限公司 截屏方法及移动终端
CN106095261A (zh) * 2016-06-13 2016-11-09 网易(杭州)网络有限公司 一种在电子设备上添加笔记的方法和装置
CN109523609A (zh) * 2018-10-16 2019-03-26 华为技术有限公司 一种内容编辑的方法及终端

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MA, LI ET AL.: "The Client Design of Take Note Based on iOS", INFORMATION TECHNOLOGY, no. 3, 31 March 2017 (2017-03-31), pages 42 - 45,51, XP055801268 *
See also references of EP3859680A4

Also Published As

Publication number Publication date
CN109523609B (zh) 2022-05-24
CN109523609A (zh) 2019-03-26
EP3859680A1 (en) 2021-08-04
CN115187702A (zh) 2022-10-14
EP3859680A4 (en) 2021-11-10
US20220005241A1 (en) 2022-01-06
CN115187701A (zh) 2022-10-14

Similar Documents

Publication Publication Date Title
EP3896946B1 (en) Display method for electronic device having flexible screen and electronic device
WO2021213120A1 (zh) 投屏方法、装置和电子设备
WO2020078298A1 (zh) 一种内容编辑的方法及终端
US11669242B2 (en) Screenshot method and electronic device
US20220224665A1 (en) Notification Message Preview Method and Electronic Device
CN112449099B (zh) 一种图像处理方法、电子设备及云服务器
US20240179237A1 (en) Screenshot Generating Method, Control Method, and Electronic Device
WO2021063237A1 (zh) 电子设备的控制方法及电子设备
US11747953B2 (en) Display method and electronic device
CN110750317A (zh) 一种桌面的编辑方法及电子设备
WO2021057673A1 (zh) 一种图像显示方法及电子设备
US20240168624A1 (en) Screen capture method and related device
CN114115769A (zh) 一种显示方法及电子设备
CN114579016A (zh) 一种共享输入设备的方法、电子设备及系统
WO2021057699A1 (zh) 具有柔性屏幕的电子设备的控制方法及电子设备
WO2021013106A1 (zh) 一种折叠屏照明方法和装置
WO2023241209A9 (zh) 桌面壁纸配置方法、装置、电子设备及可读存储介质
WO2020221062A1 (zh) 一种导航操作方法及电子设备
WO2022143180A1 (zh) 协同显示方法、终端设备及计算机可读存储介质
WO2020078297A1 (zh) 一种冻屏处理方法及终端
WO2021052015A1 (zh) 一种触控屏控制方法和电子设备
WO2022111690A1 (zh) 一种共享输入设备的方法、电子设备及系统
WO2023179259A1 (zh) 一种信息分享方法及相关设备
WO2022217969A1 (zh) 一种在应用中启用功能的方法及装置
WO2024027374A1 (zh) 隐藏信息显示方法、设备、芯片系统、介质及程序产品

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19873100

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019873100

Country of ref document: EP

Effective date: 20210428