CN116048349B - Picture display method and device and terminal equipment - Google Patents

Picture display method and device and terminal equipment

Info

Publication number
CN116048349B
CN116048349B
Authority
CN
China
Prior art keywords
picture
display
terminal device
displaying
processing operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210724893.9A
Other languages
Chinese (zh)
Other versions
CN116048349A (en)
Inventor
李思文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202210724893.9A
Publication of CN116048349A
Application granted
Publication of CN116048349B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00: Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16: Constructional details or arrangements
    • G06F 1/1613: Constructional details or arrangements for portable computers
    • G06F 1/1615: Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
    • G06F 1/1616: Constructional details or arrangements for portable computers with several enclosures having relative motions, with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423: Digital output to display device, controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/1446: Digital output to display device, controlling a plurality of local displays, display composed of modules, e.g. video walls

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application provides a picture display method, a picture display device, and a terminal device. In the method, a first picture stored in the terminal device is first determined in response to a first determining operation on the terminal device; a second picture, obtained by processing the first picture, is then determined in response to a first processing operation on the first picture; the first picture and the second picture are then displayed simultaneously. The first picture is the unprocessed, original picture, and the second picture is the processed picture. In this case, the first picture and the second picture displayed on the display screen of the terminal device form a comparison, and the user can determine the effect of the picture processing by viewing both. If the comparison shows that the processing effect is unsatisfactory, the user can continue processing, so the scheme of the application also helps the user obtain a satisfactory picture processing effect.

Description

Picture display method and device and terminal equipment
Technical Field
The application relates to the field of terminal devices, and in particular to a picture display method, a picture display device, and a terminal device.
Background
In order to meet users' need to view pictures, various terminal devices currently support a picture display function. Based on this function, after a picture stored in the terminal device is selected, the terminal device can display the picture for the user to view.
In addition, users sometimes wish to process pictures, for example by adding a filter. To meet this need, the terminal device can receive a user's processing operation while displaying a picture and adjust the currently displayed picture according to the operation, so that the user obtains the processed picture.
However, after processing a picture with the existing scheme, the user often cannot determine the effect of the processing. For example, in some scenes the user increases the brightness of a picture, but the brightness is raised too far and the picture becomes distorted.
Disclosure of Invention
After processing a picture, the user often cannot determine the effect of the processing. To solve this technical problem, the application discloses a picture display method and device through the following embodiments.
In a first aspect, an embodiment of the present application discloses a method for displaying a picture, including:
Determining a first picture stored in a terminal device in response to a first determination operation for the terminal device;
determining a second picture obtained after the first picture is processed in response to a first processing operation for the first picture;
and displaying the second picture while displaying the first picture.
Through the above steps, the terminal device can display the first picture and the second picture simultaneously; the two displayed pictures form a comparison, and the user can determine the processing effect of the second picture by viewing them.
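As a rough illustration only (not taken from the patent; all class, method, and picture names below are hypothetical), the three claimed steps can be sketched like this, with a simple brightness shift standing in for the first processing operation:

```python
# Minimal sketch of the claimed flow: pick an original picture, apply a
# processing operation, then show original and result side by side.
# All names here are illustrative, not from the patent.

def adjust_brightness(pixels, delta):
    """A stand-in 'first processing operation': shift brightness, clamped to 0-255."""
    return [max(0, min(255, p + delta)) for p in pixels]

class PictureDisplay:
    def __init__(self, stored_pictures):
        self.stored = stored_pictures          # pictures stored on the device
        self.shown = []                        # what the screen currently shows

    def select(self, name):
        """Step 1: determine the first picture in response to a selection."""
        self.first = self.stored[name]
        return self.first

    def process(self, operation):
        """Step 2: derive the second picture from the first."""
        self.second = operation(self.first)
        # Step 3: display both pictures at the same time for comparison.
        self.shown = [self.first, self.second]
        return self.second

display = PictureDisplay({"landscape": [100, 150, 200]})
display.select("landscape")
display.process(lambda pic: adjust_brightness(pic, 80))
```

After these calls, `display.shown` holds both the unchanged original and the brightened copy, which is the side-by-side comparison the method claims.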
In an alternative design, before the displaying the first picture and the displaying the second picture, the method further includes:
and determining that the current state of the terminal equipment accords with the state of multi-picture display.
Through the steps, the terminal equipment can display the first picture and the second picture at the same time under the condition that the current state accords with the state of multi-picture display.
In an alternative design, the determining that the current state of the terminal device conforms to the state of multi-graph display includes:
the terminal device comprises a folding screen formed by a first screen and a second screen, and the angle between the first screen and the second screen is determined to be smaller than a first threshold value;
Or,
the terminal equipment is in a split screen display state, and the ratio of the area of a first split screen to the area of the whole screen is determined to be larger than a second threshold value, wherein the first split screen is the split screen for receiving the first processing operation;
or,
determining a functional page of the terminal equipment corresponding to the first processing operation as a functional page supporting multi-picture display;
or,
and receiving a trigger signal for enabling the terminal equipment to enter the multi-picture display state.
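The four alternative conditions above combine as a logical OR. A minimal sketch of such a check, assuming hypothetical threshold values and state field names (the patent does not specify any of these), might look like:

```python
# Illustrative check of "current state conforms to multi-picture display".
# Threshold values and state field names are assumptions, not from the patent.

FIRST_THRESHOLD_DEG = 90      # hypothetical folding-screen angle threshold
SECOND_THRESHOLD = 0.5        # hypothetical split-screen area ratio threshold

def supports_multi_picture(state):
    """Return True if any one of the four claimed conditions holds."""
    # Folding screen: angle between the two screens below the first threshold.
    if state.get("fold_angle_deg") is not None and state["fold_angle_deg"] < FIRST_THRESHOLD_DEG:
        return True
    # Split-screen display: ratio of the first split screen's area to the whole
    # screen exceeds the second threshold.
    if state.get("split_ratio") is not None and state["split_ratio"] > SECOND_THRESHOLD:
        return True
    # The functional page receiving the operation supports multi-picture display.
    if state.get("page_supports_multi"):
        return True
    # An explicit trigger signal for multi-picture display was received.
    if state.get("multi_trigger_received"):
        return True
    return False

print(supports_multi_picture({"fold_angle_deg": 60}))   # True
print(supports_multi_picture({"split_ratio": 0.3}))     # False
```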
In an alternative design, before the displaying the first picture and the displaying the second picture, the method further includes:
determining that the first processing operation includes a target operation, where the target operation is either an operation described in a white list (the white list describes operations that support multi-picture display) or an operation that adjusts a parameter of the first picture, the parameter including at least one of brightness, contrast, and saturation.
Through the above steps, the first picture and the second picture can be displayed simultaneously only in the case where the first processing operation includes the target operation.
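A minimal sketch of this gate, assuming hypothetical whitelist contents (the patent only says such a whitelist exists and names the three parameters):

```python
# Illustrative gate: display both pictures only when the first processing
# operation contains a "target operation". The whitelist contents are assumptions.

WHITELIST = {"filter", "crop"}                      # hypothetical whitelisted operations
PARAMETER_OPS = {"brightness", "contrast", "saturation"}

def contains_target_operation(operations):
    """True if any operation is whitelisted or adjusts brightness/contrast/saturation."""
    return any(op in WHITELIST or op in PARAMETER_OPS for op in operations)

print(contains_target_operation(["brightness"]))    # True: parameter adjustment
print(contains_target_operation(["rotate"]))        # False: in neither set
```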
In an alternative design, the displaying the second picture simultaneously with displaying the first picture includes:
The first processing operation includes an operation of dragging an intensity slider of a parameter of the first picture, and after the dragging of the intensity slider is finished, the second picture is displayed while the first picture is displayed.
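The point of this design is that the comparison view appears only once the drag finishes, not on every intermediate slider position. A sketch under that reading (event names and the brightness stand-in are assumptions):

```python
# Illustrative handling of the intensity slider: while the user drags, the
# processed picture is recomputed but only one picture is shown; when the drag
# ends, original and processed pictures are displayed together.
# Event names are assumptions for this sketch.

class SliderSession:
    def __init__(self, original):
        self.original = original
        self.processed = original
        self.shown = [original]                 # single-picture display while dragging

    def on_drag(self, intensity):
        # Recompute the processed picture as the slider moves.
        self.processed = [min(255, p + intensity) for p in self.original]

    def on_drag_end(self):
        # Only after dragging finishes are both pictures displayed at once.
        self.shown = [self.original, self.processed]

s = SliderSession([10, 20])
s.on_drag(5)       # intermediate position: still single-picture display
s.on_drag(50)      # final position
s.on_drag_end()    # now s.shown holds both pictures
```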
In an alternative design, after displaying the second picture while displaying the first picture, the method further includes:
and if the state of the terminal equipment is adjusted from the state conforming to the multi-picture display to the state conforming to the single-picture display, terminating displaying the first picture.
Through the steps, when the state of the terminal equipment is the state conforming to the single-picture display, the first picture is not displayed any more, and the second picture is displayed.
In an alternative design, after displaying the second picture while displaying the first picture, the method further includes:
determining a third picture in response to a second processing operation for the first picture, wherein the third picture is obtained after the second processing operation is performed on the first picture;
and displaying the first picture and the third picture simultaneously.
Through the above steps, after reprocessing the first picture, the terminal device can display the first picture and the third picture simultaneously, so that the user can determine the effect of the picture processing by viewing the two pictures displayed at the same time.
In an alternative design, after displaying the second picture while displaying the first picture, the method further includes:
determining a fourth picture in response to a third processing operation for the second picture, wherein the fourth picture is obtained after the third processing operation is performed on the second picture;
and displaying the first picture and the fourth picture simultaneously.
In an alternative design, after displaying the second picture while displaying the first picture, the method further includes:
determining a fifth picture in response to a fourth processing operation for the first picture, wherein the fifth picture is obtained after the fourth processing operation is performed on the first picture;
and displaying the fifth picture while displaying the first picture and the second picture.
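The three follow-up aspects above differ only in which picture the new operation is applied to and which pictures are then shown together. A compact sketch of the three cases (function names and the brightness stand-in are placeholders, not from the patent):

```python
# Illustrative sketch of the follow-up aspects: a second operation on the first
# picture yields a third picture (shown with the first); a third operation on the
# second picture yields a fourth (shown with the first); a fourth operation on
# the first picture yields a fifth, shown alongside both existing pictures.
# All function names are placeholders.

def redo_on_original(first, operation):
    """Second processing operation applied to the first picture -> third picture."""
    third = operation(first)
    return [first, third]                        # first and third shown simultaneously

def stack_on_processed(first, second, operation):
    """Third processing operation applied to the second picture -> fourth picture."""
    fourth = operation(second)
    return [first, fourth]                       # first and fourth shown simultaneously

def add_variant(first, second, operation):
    """Fourth processing operation on the first picture -> fifth picture."""
    fifth = operation(first)
    return [first, second, fifth]                # three-way comparison

brighten = lambda pic: [p + 10 for p in pic]     # stand-in operation
print(redo_on_original([1, 2], brighten))        # [[1, 2], [11, 12]]
```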
In a second aspect, an embodiment of the present application discloses a picture display device, including:
a first picture determining module, configured to determine a first picture stored in a terminal device in response to a first determining operation for the terminal device;
the second picture determining module is used for determining a second picture obtained after the first picture is processed in response to a first processing operation for the first picture;
And the picture display module is used for displaying the second picture while displaying the first picture.
In a third aspect, an embodiment of the present application discloses a terminal device, including: a processor and a memory; the memory stores program instructions that, when executed by the processor, cause the terminal device to perform the method of the first aspect.
In a fourth aspect, embodiments of the present application disclose a computer storage medium having stored therein a computer program or instructions which, when executed, perform a method as described in the first aspect.
In a fifth aspect, embodiments of the present application disclose a chip system comprising a processor coupled to a memory for executing a computer program or instructions stored in the memory, which when executed, performs a method as described in the first aspect.
In the embodiment of the application, after the terminal device obtains the second picture based on the first processing operation on the first picture, it displays the second picture while displaying the first picture. The first picture is the unprocessed, original picture and the second picture is the processed picture; in this case, the two pictures displayed simultaneously on the display screen of the terminal device form a comparison, so the user can determine the processing effect of the second picture by viewing both.
Further, if the comparison of the first picture with the second picture shows that the processing effect is unsatisfactory, the user can continue processing, so the scheme provided by the embodiment of the application also helps the user obtain a satisfactory picture processing effect.
Drawings
FIG. 1 (a) is an interface diagram of a terminal device;
FIG. 1 (b) is an interface diagram of another terminal device;
fig. 2 is a schematic structural diagram of a terminal device disclosed in an embodiment of the present application;
fig. 3 is a software structural block diagram of a terminal device disclosed in an embodiment of the present application;
fig. 4 is a software structural block diagram of another terminal device disclosed in an embodiment of the present application;
fig. 5 is a schematic workflow diagram of a picture display method according to an embodiment of the present application;
fig. 6 is an interface schematic diagram of a terminal device disclosed in an embodiment of the present application;
FIG. 7 is a schematic workflow diagram of another method for displaying pictures according to an embodiment of the present application;
FIG. 8 is a schematic workflow diagram of another method for displaying pictures according to an embodiment of the present application;
FIG. 9 is a schematic workflow diagram of another method for displaying pictures according to an embodiment of the present application;
FIG. 10 is a schematic workflow diagram of another method for displaying pictures according to an embodiment of the present application;
FIG. 11 is a workflow diagram illustrating an example of a picture display method disclosed in an embodiment of the present application;
fig. 12 is a schematic structural diagram of a picture display device according to an embodiment of the present disclosure;
fig. 13 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
The terminology used in the following embodiments is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification and the appended claims, the singular forms "a," "an," and "the" are intended to include expressions such as "one or more," unless the context clearly indicates otherwise. It should also be understood that in the embodiments below, "at least one" and "one or more" mean one, two, or more than two. The term "and/or" describes an association relationship between associated objects and indicates that three relationships are possible; for example, A and/or B may represent: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
For clarity and conciseness in the description of the following embodiments, a brief description of the related art will be given first:
various types of terminal devices (such as mobile phones, tablet computers, and notebook computers) support a picture display function. When a user needs to view a picture in the gallery of the terminal device, the user can select that picture, and the terminal device displays the corresponding picture based on the selection.
In addition, the user sometimes needs to process the picture, for example, add a filter to the picture, adjust the brightness of the picture, or if the picture is a portrait, make the portrait beautiful.
In the prior art, when a user processes a picture, the terminal device first displays the picture according to the user's selection and then adjusts the currently displayed picture according to the user's processing operation, so that the picture currently displayed by the terminal device becomes the processed picture.
In one example, where the terminal device is a smart phone, after the user selects a landscape picture stored in the terminal device, the terminal device displays it, as shown in fig. 1 (a). The terminal device then receives a processing operation performed on the picture by the user; in this example, the operation increases the brightness of the picture. After receiving the operation, the terminal device displays the brightened picture, and its screen may appear as shown in fig. 1 (b).
As is apparent from the above description of the prior art, in the prior art, a terminal device may receive a processing operation for a picture when displaying the picture, and after receiving the processing operation, the terminal device displays the processed picture. However, in this case, the user often cannot determine the effect of the picture processing.
For example, in some scenes, the processing operation of the user is to increase the brightness of the picture, and after receiving the processing operation, the terminal device displays the picture with increased brightness, but the brightness of the picture may be too high, resulting in picture distortion.
Aiming at the problem in the prior art that, after processing a picture, a user cannot determine the effect of the processing, the embodiment of the application provides a picture display method, a picture display device, and a terminal device.
In the scheme, after receiving a first processing operation for a first picture, terminal equipment carries out corresponding processing on the first picture to obtain a processed second picture, and displays the second picture while displaying the first picture, so that a user can determine the effect of picture processing by comparing the first picture with the second picture.
The terminal device in the embodiment of the present application may be a terminal device having a picture display function. In some embodiments, the terminal device may be a device capable of displaying pictures, such as a mobile phone, a tablet computer, a desktop, a laptop, a notebook, an Ultra-mobile personal computer (Ultra-mobile Personal Computer, UMPC), a handheld computer, a netbook, a personal digital assistant (Personal Digital Assistant, PDA), a wearable electronic device, and a smart watch, and the specific form of the terminal device is not particularly limited in this application.
In this embodiment, the structure of the terminal device may be shown in fig. 2, where fig. 2 is a schematic structural diagram of a terminal device to which the picture display method provided in the embodiment of the present application is applied.
As shown in fig. 2, the terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
Further, when the terminal device is a mobile phone, the terminal device may further include: antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, earphone interface 170D, and subscriber identity module (subscriber identification module, SIM) card interface 195, etc.
It will be appreciated that the structure illustrated in this embodiment does not constitute a specific limitation on the terminal device. In other embodiments, the terminal device may include more or fewer components than illustrated, some components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, wherein different processing units may be separate devices or may be integrated in one or more processors. A memory may also be provided in the processor 110 for storing instructions and data.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The charge management module 140 is configured to receive a charge input from a charger. The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G or the like applied on a terminal device. The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc. applied on the terminal device. In some embodiments, the antenna 1 of the terminal device is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the terminal device can communicate with the network and other devices through wireless communication technology.
The terminal device implements display functions through a graphics processor (graphics processing unit, GPU), a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like, and includes a display panel. A series of graphical user interfaces (graphical user interface, GUI) may be displayed on the display screen 194 of the terminal device, such as the home screen of the terminal device.
The terminal device may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to realize expansion of the memory capability of the terminal device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the terminal device and data processing by executing instructions stored in the internal memory 121.
The terminal device may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The terminal device can play music or hands-free calls through the speaker 170A. The receiver 170B, also referred to as an "earpiece," is used to convert audio electrical signals into sound signals. The microphone 170C, also referred to as a "mic," is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone 170C to input a sound signal. The earphone interface 170D is used to connect a wired earphone.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. Pressure sensors come in various types, such as resistive, inductive, and capacitive pressure sensors. The gyro sensor 180B may be used to determine the motion gesture of the terminal device. The air pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a Hall sensor; the terminal device may use it to detect the opening and closing of a flip cover. The acceleration sensor 180E may detect the magnitude of acceleration of the terminal device in various directions (typically three axes). The distance sensor 180F is used to measure distance. The proximity light sensor 180G may include a light-emitting diode (LED) and a light detector. The ambient light sensor 180L is used to sense the ambient light level. The fingerprint sensor 180H is used to collect fingerprints. The temperature sensor 180J is used to detect temperature. The touch sensor 180K, also referred to as a "touch panel," may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form a touch screen. The bone conduction sensor 180M may acquire a vibration signal. The keys 190 include a power key, volume keys, and the like, and may be mechanical keys, touch keys, or virtual keys. The motor 191 may generate a vibration cue. The indicator 192 may be an indicator light, used to indicate the state of charge, a change in charge, a missed call, a notification, and so on. The SIM card interface 195 is used to connect a SIM card.
In addition, an operating system is run on the components. Such as the iOS operating system developed by apple corporation, the Android open source operating system developed by google corporation, the Windows operating system developed by microsoft corporation, etc. An operating application may be installed on the operating system.
In order to determine the functional operations executed by each software architecture in the terminal device when the terminal device executes the scheme disclosed in the application, the embodiment of the application also discloses the software structure of the terminal device.
The operating system of the terminal device may employ a layered architecture, an event driven architecture, a micro-core architecture, a micro-service architecture, a cloud architecture, or the like. In the embodiment of the application, a layered Android system is taken as an example, and a software structure of a terminal device is illustrated.
Fig. 3 is an exemplary diagram of a software architecture block diagram of a terminal device according to an embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime (Android Runtime) and system libraries, and the kernel layer.
The application layer may include a series of application packages. As shown in fig. 3, the application layer may include application packages of applications such as cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, and short messages.
Further, in the embodiments of this application, the application layer also includes the system user interface (i.e., System UI). The System UI is used to manage the user interface and may provide status bar information (for example, displaying indicators such as remaining battery capacity and Wi-Fi or 3G/4G signal strength), notification panel display, the screenshot service, the wallpaper service, and the like.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions. As shown in FIG. 3, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager may obtain the size of the display screen, obtain parameters of each display area on the display interface, and so on. In the present application, the window manager may determine the area in which the terminal device is touched.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including camera icons.
The telephony manager is used to provide the communication functions of the mobile phone, such as management of call states (including connected, hung up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages, which may automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the system top status bar in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
The Android runtime includes a core library and a virtual machine, and is responsible for scheduling and management of the Android system. The core library consists of two parts: one part is the function interfaces that the Java language needs to call, and the other part is the core libraries of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager (surface manager), media libraries (media libraries), a three-dimensional graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL). The surface manager is used to manage the display subsystem and provides fusion of 2D and 3D layers for multiple applications. The media libraries support playback and recording of many common audio and video formats, as well as still image files, etc. The media libraries may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. The three-dimensional graphics processing library is used to implement three-dimensional graphics drawing, image rendering, composition, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing.
In addition, in this application, the system library further includes a state monitoring service, which can perform state detection based on the data reported by the sensor driver at the kernel layer.
The kernel layer is the layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
Although the Android system is taken as an example for explanation, the basic principles of the embodiments of this application are equally applicable to terminal devices based on iOS, Windows, or other operating systems.
When the software structure block diagram of the terminal device is as shown in fig. 3, if the touch area of the terminal device receives a touch operation, the sensor (for example, a pressure sensor or a temperature sensor) corresponding to the touch area outputs data corresponding to the touch operation. After the sensor driver at the kernel layer receives the data, it reports the data to the state monitoring service in the system library. The state monitoring service determines the touched position coordinates in the touch area according to the received data and reports them to the window manager of the application framework layer. The window manager determines the type of the touch operation (such as tap, side-swipe, or pull-down) and the specific touched position according to the position coordinates of the touch. The window manager may then transmit the type of the current touch operation and its specific position to the application layer, so that the application layer performs the corresponding operation; for example, the system user interface in the application layer controls the display screen of the terminal device to display the corresponding content, thereby implementing the picture display method provided by the embodiments of this application.
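As an illustrative sketch only (the patent provides no code), the reporting chain described above — sensor data at the kernel layer, position resolution in the state monitoring service, and operation classification in the window manager — might be modeled as follows; all class and method names are hypothetical:

```java
import java.util.Arrays;

public class TouchEventFlow {
    // State monitoring service: resolve the touched cell from sensor data
    // reported by the kernel-layer sensor driver (here, a grid of pressure readings).
    static int[] locateTouch(double[][] pressureGrid, double threshold) {
        for (int row = 0; row < pressureGrid.length; row++) {
            for (int col = 0; col < pressureGrid[row].length; col++) {
                if (pressureGrid[row][col] > threshold) {
                    return new int[] { row, col };
                }
            }
        }
        return null; // no touch detected
    }

    // Window manager: classify the operation type from start/end coordinates.
    static String classifyOperation(int[] start, int[] end) {
        if (start[0] == end[0] && start[1] == end[1]) {
            return "tap";
        }
        return end[0] > start[0] ? "pull-down" : "side-swipe";
    }

    public static void main(String[] args) {
        double[][] grid = { { 0.1, 0.1, 0.1 }, { 0.1, 0.9, 0.1 } };
        int[] pos = locateTouch(grid, 0.5);
        // The type and position would then be handed to the application layer.
        System.out.println(Arrays.toString(pos) + " -> " + classifyOperation(pos, pos));
    }
}
```

The grid representation and classification rules are stand-ins; the patent only specifies that coordinates flow upward through the state monitoring service to the window manager.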
Further, referring to the software architecture diagram shown in fig. 4, in the embodiments of this application, the system user interface of the application layer includes at least one unit, and each unit includes an interface control management module (i.e., UIControl), a picture rendering module (i.e., Render), and a state module (i.e., State). The system user interface of the application layer further includes an edit state module (i.e., EditState), an edit view module (i.e., EditView), an edit management module (i.e., SimpleEditManager), and a picture management module (i.e., MasterImage). The edit state module can interact with each unit.
The edit state module is used to control the switch of the double-picture button. If the double-picture button is turned on, the terminal device can display the first picture and the second picture at the same time; if the double-picture button is turned off, the terminal device does not display two pictures but displays only one picture. In one possible design, the edit state module may determine whether to turn on the double-picture button based on the state of the terminal device or the received processing operation for the picture. The edit state module can transmit switch changes of the double-picture button to the picture rendering module of each unit through the state module.
If the application layer includes a plurality of units, each unit may handle picture drawing in a different scene. For example, one of the units may be responsible for picture drawing in the filter-adding scene and another unit for picture drawing in the brightness-adjustment scene. If the application layer includes only one unit, that unit may handle the drawing of pictures in every scene.
The interface control management module in a unit can obtain the type of the touch operation and the specific touched position transmitted by the window manager; if the picture drawing in the current scene is handled by this unit, the interface control management module transmits the type of the touch operation and the specific touched position to the picture rendering module through the state module.
The picture rendering module in the unit determines the processed picture according to the type of the received touch operation and the specific touched position, and renders the corresponding picture in the picture preview area of the terminal device based on the state of the double-picture button. If the picture rendering module determines that the double-picture button is turned off, it renders only the processed picture in the picture preview area; in this case, the display screen of the terminal device displays only the processed picture. If the picture rendering module determines that the double-picture button is turned on, it renders both the picture before processing and the picture after processing in the picture preview area; in this case, the display screen of the terminal device displays the picture before processing and the picture after processing at the same time.
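As a minimal sketch (names are illustrative, not the patent's implementation), the rendering decision reduces to choosing which pictures go into the preview area based on the double-picture switch:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class PreviewRenderer {
    // Decide which pictures the rendering module draws in the preview area.
    static List<String> picturesToRender(String before, String after, boolean doublePictureOn) {
        return doublePictureOn
                ? Arrays.asList(before, after)        // button on: before and after together
                : Collections.singletonList(after);   // button off: processed picture only
    }

    public static void main(String[] args) {
        System.out.println(picturesToRender("original.jpg", "filtered.jpg", true));
        System.out.println(picturesToRender("original.jpg", "filtered.jpg", false));
    }
}
```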
In addition, the edit management module can package the picture before processing and the picture after processing, and the picture management module can be used to store the picture before processing and the picture after processing.
To clarify the solutions provided in this application, the following embodiments describe these solutions with reference to the drawings.
An embodiment of the present application provides a method for displaying a picture, which is applied to a terminal device, and referring to a workflow schematic diagram shown in fig. 5, the method for displaying a picture provided in the embodiment of the present application includes the following steps:
Step S11, in response to a first determination operation for the terminal device, determining a first picture stored in the terminal device.
In the embodiment of the application, at least one picture is generally stored in the terminal device. Wherein the first determining operation may be an operation of selecting a first picture stored in the terminal device. The first determining operation may include an operation for causing the terminal device to present a gallery of stored pictures, and an operation for selecting one picture in the gallery, for example.
In addition, after determining the first picture, the terminal device may also display the first picture so that the user may process the first picture later.
Step S12, a second picture obtained after the first picture is processed is determined in response to a first processing operation for the first picture.
The first processing operation may include a plurality of types of operations, depending on the user's processing requirements for the first picture. For example, the first processing operation may include an operation of adding a filter to the first picture, an operation of adjusting attributes of the first picture (e.g., brightness, contrast, saturation, etc.), or a beauty adjustment operation performed on the first picture (e.g., face slimming, eye enlargement, etc.).
In the embodiment of the application, the terminal device receives a first processing operation for the first picture, and determines that a picture obtained after the first picture is processed through the first processing operation is a second picture. The first processing operation may include an operation of a control for causing the terminal device to display a picture process, and an operation of selecting or adjusting a corresponding control, for example.
If a filter needs to be added to the first picture, the first processing operation may include an operation for causing the terminal device to display a plurality of filters and a selection operation for the filter to be added this time; if the brightness of the first picture needs to be increased, the first processing operation may include an operation for causing the terminal device to display a brightness slider and an operation of adjusting the brightness slider.
Step S13, displaying the second picture while displaying the first picture.
The first processing operation may include various forms of operations. If the first processing operation includes an operation of selecting a filter, in this step, the first picture and the second picture may be displayed at the same time after the operation of selecting the filter is received.
In addition, the first processing operation may also include an operation of dragging an intensity slider for a parameter of the first picture; by way of example, the parameter may include at least one of brightness, contrast, and saturation. In this case, the second picture is typically displayed alongside the first picture after the drag on the intensity slider ends.
If the dragging of the intensity slider is performed by the user's touch, the end of the drag can be determined after the end of the user's touch is detected, that is, when the user's finger is lifted from the screen of the terminal device.
In addition, when the terminal device displays the first picture and the second picture simultaneously, the two pictures may be displayed side by side on the left and right of the display screen of the terminal device, or one above the other; this is not limited in the embodiments of this application.
In the embodiments of this application, after the second picture is obtained based on the first processing operation, the second picture is displayed in addition to the first picture. The first picture is the unprocessed picture, i.e., the original picture, and the second picture is the processed picture. In this case, the first picture and the second picture displayed on the display screen of the terminal device form a comparison, and the user can determine the effect of the picture processing by viewing them.
Further, if the user is not satisfied with the processing effect after comparing the first picture with the second picture, the processing can be continued; thus, the solution provided by the embodiments of this application helps the user obtain a satisfactory picture processing effect.
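Steps S11 to S13 can be sketched as the following minimal flow; the gallery, the filter label, and the display format are all hypothetical placeholders rather than the patent's implementation:

```java
public class PictureDisplayFlow {
    // Step S11: determine a first picture stored on the device (selected from a gallery).
    static String selectFirstPicture(String[] gallery, int selectedIndex) {
        return gallery[selectedIndex];
    }

    // Step S12: determine the second picture obtained by processing the first one.
    // Real processing would transform pixels; here a label stands in for the result.
    static String process(String firstPicture, String operation) {
        return firstPicture + "+" + operation;
    }

    // Step S13: display the second picture while still displaying the first.
    static String displayBoth(String firstPicture, String secondPicture) {
        return firstPicture + " | " + secondPicture;
    }

    public static void main(String[] args) {
        String first = selectFirstPicture(new String[] { "scenery.jpg" }, 0);
        String second = process(first, "clean-filter");
        System.out.println(displayBoth(first, second));
    }
}
```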
To clarify the scheme provided by the embodiments of this application, an example is given below with reference to fig. 6, which is an interface schematic diagram of the terminal device displaying the first picture and the second picture at the same time.
In this example, the terminal device first receives a first determining operation of the user, and determines a first picture according to the first determining operation, and fig. 6 shows two scenery images, wherein the scenery image on the left side is the first picture.
Then, the terminal device receives a first processing operation for the first picture. In this example, the processing requires adding a filter to the first picture; accordingly, the first processing operation includes an operation for causing the terminal device to display a plurality of filters and an operation of selecting the filter that the user wants to add this time. Referring to fig. 6, the filters provided by the terminal device include an "original image" filter, a "classical" filter, a "morning light" filter, a "clean" filter, a "black and white" filter, a "blue-tone" filter, a "green onion" filter, and so on, and the filter selected by the first processing operation is the "clean" filter.
In this case, after receiving the first processing operation, the terminal device determines the second picture, which is the picture obtained by adding the "clean" filter to the first picture, and the terminal device displays the first picture and the second picture at the same time; the scenery image on the right side of fig. 6 is the second picture.
By viewing fig. 6, the user can compare the first picture with the second picture to determine the picture processing effect after the "clean" filter is added to the first picture.
Through the scheme provided by the above embodiment, the terminal device can display the first picture and the second picture simultaneously. In a possible embodiment, each time a picture needs to be processed, the terminal device may display the first picture and the second picture simultaneously through the operations of steps S11 to S13.
In addition, in another possible embodiment, referring to the workflow diagram shown in fig. 7, before performing the operation of step S13, the method further includes the steps of:
Step S14, determining whether the current state of the terminal device conforms to the state of multi-picture display, and if so, executing the operation of step S13.
Further, if it is determined in this step that the current state of the terminal device does not conform to the state of multi-picture display, the operation of step S15 may be further executed:
Step S15, displaying the second picture.
That is, in this embodiment, the terminal device displays the first picture and the second picture at the same time when the current state of the terminal device conforms to the state of the multi-picture display. If the current state of the terminal device does not accord with the state of the multi-picture display, the terminal device only displays the second picture after determining the second picture.
The state of multi-picture display may include various states. In one possible design, this application may determine that the current state of the terminal device conforms to the state of multi-picture display in any of the following ways:
(1) The terminal device includes a folding screen composed of a first screen and a second screen, and it is determined that the angle between the first screen and the second screen is smaller than a first threshold.
That is, if the terminal device includes a folding screen constituted of a first screen and a second screen, and an angle between the first screen and the second screen is smaller than a first threshold value, it may be determined that the current state of the terminal device corresponds to the state of multi-picture display, and after determining the second picture, the terminal device may display the first picture and the second picture at the same time.
The specific value of the first threshold may be preset and may be adjusted according to the operation of the user. In one possible example, the first threshold may be 5 degrees.
When the angle between the first screen and the second screen is smaller than the first threshold, the folding screen can generally be considered to be currently unfolded. With this arrangement, when the terminal device includes a folding screen and the folding screen is not folded, it can be determined that the current state of the terminal device conforms to the state of multi-picture display. When the folding screen of the terminal device is unfolded, the first screen and the second screen usually form one display screen; this display screen is large, so displaying the first picture and the second picture at the same time achieves a good picture display effect.
Therefore, by determining whether the current state of the terminal device conforms to the state of multi-picture display in way (1) above, the first picture and the second picture displayed simultaneously by the terminal device can have a good display effect, improving the user's viewing experience.
(2) The terminal device is in a split-screen state, and it is determined that the ratio of the area of the first split screen to the area of the whole screen is greater than a second threshold, where the first split screen is the split screen that receives the first processing operation.
The whole screen refers to the whole screen where the first split screen is located. If the terminal device comprises a folding screen formed by a first screen and a second screen, and the first split screen is positioned on the first screen, the whole screen refers to the whole of the first screen.
Users sometimes want the terminal device to perform multiple functions simultaneously, for example, the user may want to be able to view video played by the terminal device while processing pictures. In this case, the terminal device may perform screen division such that one screen is divided into at least two parts, each of which is one division, and different divisions may perform different functions.
If the terminal device is in the split-screen display state, whether the terminal device conforms to the state of multi-picture display can be determined based on the comparison between the ratio of the area of the first split screen to the area of the whole screen and the second threshold. Like the first threshold, the specific value of the second threshold may be preset and may be adjusted according to user operations. In one possible example, the second threshold may be 50%.
When the ratio of the area of the first split screen to the area of the whole screen is greater than the second threshold, the area of the first split screen that receives the first processing operation can generally be considered to be relatively large, so that a good picture display effect can be obtained when the first picture and the second picture are displayed simultaneously, improving the user's viewing experience.
(3) It is determined that the function page corresponding to the first processing operation is a function page supporting multi-picture display.
When a first processing operation is performed on the first picture, the terminal device needs to enter a corresponding function page. For example, if the first processing operation includes adding a filter to the first picture, then when the first picture is processed, a functional page to which the filter is added needs to be entered; if the first processing operation includes adjusting the attribute (such as brightness, contrast, saturation, etc.) of the first picture, when the first picture is processed, a corresponding adjustment function page needs to be entered; if the first processing operation includes a beauty adjustment of the first picture, a corresponding beauty adjustment function page needs to be entered when the first picture is processed.
In this scheme, whether the terminal device conforms to the state of multi-picture display is determined according to the function page corresponding to the first processing operation; if the function page corresponding to the first processing operation supports multi-picture display, the terminal device conforms to the state of multi-picture display.
In one possible design, the function pages supporting multi-picture display may be recorded; in this case, by querying the record, it can be determined whether the function page corresponding to the first processing operation is a function page supporting multi-picture display.
Through this scheme, the terminal device can determine whether to enter the multi-picture display state based on the function page corresponding to the first processing operation.
(4) A trigger signal for causing the terminal device to enter the multi-picture display state is received.
In this scheme, after receiving the trigger signal for causing the terminal device to enter the multi-picture display state, the terminal device determines that its current state conforms to the state of multi-picture display. The trigger signal may be implemented in various forms.
In one possible design, the terminal device displays a target control, and after receiving a touch for the target control, the terminal device determines that the trigger signal is received. In this case, when the user wants the terminal device to display the first picture and the second picture simultaneously, the user may touch the target control, so that the terminal device enters a state of multi-picture display.
In another possible design, the terminal device determines that the trigger signal is received after receiving the trigger operation. For example, the triggering operation may be a long-press operation on a touch interface of the terminal device, or if the terminal device includes a camera, the triggering operation may further include a specific gesture operation, and after the camera captures the gesture operation, it is determined that a triggering signal is received.
Through this scheme, users can decide whether to trigger the terminal device according to their own needs, so the way the terminal device enters the multi-picture display state better fits the user's needs.
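The four ways (1) to (4) above can be combined into a single predicate, as in the following sketch; the threshold values (5 degrees, 50%) are the example values from the text, and the page names are hypothetical assumptions:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class MultiPictureStatePolicy {
    static final double FIRST_THRESHOLD_DEGREES = 5.0; // example value from the text
    static final double SECOND_THRESHOLD_RATIO = 0.5;  // example value from the text
    // Hypothetical record of function pages that support multi-picture display.
    static final Set<String> MULTI_PICTURE_PAGES =
            new HashSet<>(Arrays.asList("filter", "adjust", "beauty"));

    static boolean conformsToMultiPictureDisplay(boolean hasFoldingScreen, double foldAngleDegrees,
                                                 boolean splitScreen, double firstSplitAreaRatio,
                                                 String functionPage, boolean triggerReceived) {
        if (hasFoldingScreen && foldAngleDegrees < FIRST_THRESHOLD_DEGREES) return true; // way (1)
        if (splitScreen && firstSplitAreaRatio > SECOND_THRESHOLD_RATIO) return true;    // way (2)
        if (MULTI_PICTURE_PAGES.contains(functionPage)) return true;                     // way (3)
        return triggerReceived;                                                          // way (4)
    }

    public static void main(String[] args) {
        // Unfolded folding screen (angle 3 degrees): multi-picture display applies.
        System.out.println(conformsToMultiPictureDisplay(true, 3.0, false, 0.0, "crop", false));
    }
}
```

Whether the four ways are alternatives or combined is a design choice; the text presents them as independent possible designs.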
In another possible embodiment, referring to the workflow diagram shown in fig. 8, before performing the operation of step S13, the following steps are further included:
Step S16, determining whether the first processing operation includes a target operation, and if so, executing the operation of step S13.
Further, if it is determined in this step that the first processing operation does not include the target operation, the operation of step S17 may also be executed:
Step S17, displaying the second picture.
That is, in this embodiment, the terminal device displays the first picture and the second picture at the same time in the case where the received first processing operation includes the target operation. If the first processing operation received by the terminal device does not comprise the target operation, the terminal device only displays the second picture after determining the second picture.
In one possible design, the target operation includes an operation described in a white list, where the white list is used to describe an operation supporting multi-view display. The white list can be preset, and in addition, a user can adjust operations recorded in the white list according to own requirements.
For example, if the user wishes the terminal device to display the first picture and the second picture simultaneously every time a picture undergoes beauty processing, the beauty processing of pictures may be set to be included in the white list.
In addition, the processing of a picture may also include cropping the picture; cropping does not change the content displayed by each remaining pixel in the picture. Therefore, if the processing of the first picture is cropping, the user generally does not need to compare the pictures before and after cropping. In this case, the cropping operation may be left out of the white list. Accordingly, if the first processing operation is a cropping of the first picture, it can be determined that the first processing operation does not include the target operation, and after determining the second picture, the terminal device displays only the second picture.
In this design, the terminal device may display the first picture and the second picture simultaneously when the first processing operation is an operation recorded in the white list, and the operation recorded in the white list may be set according to the user requirement.
Alternatively, in another possible design, the target operation includes an operation for adjusting a parameter of the first picture. Wherein the parameter may include at least one of brightness, contrast, and saturation.
By means of the design, the first picture and the second picture can be displayed simultaneously after the operation of adjusting the parameters of the first picture is received. After the parameters of the first picture are adjusted, the display content of the picture is changed, and the user is helped to better determine the picture processing effect by comparing the first picture with the second picture.
Alternatively, in another possible design, the target operation may include both an operation described in the white list and an operation for adjusting the parameter of the first picture, which is not limited in the embodiment of the present application.
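A hedged sketch of the target-operation check in the designs above — a configurable white list plus the parameter-adjustment operations; the operation names are hypothetical placeholders:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class TargetOperationPolicy {
    // White list of operations supporting multi-picture display (user-adjustable).
    static final Set<String> WHITE_LIST = new HashSet<>(Arrays.asList("beauty", "filter"));
    // Operations that adjust a parameter of the first picture.
    static final Set<String> PARAMETER_ADJUSTMENTS =
            new HashSet<>(Arrays.asList("brightness", "contrast", "saturation"));

    static boolean isTargetOperation(String operation) {
        // Cropping is deliberately absent from both sets: it does not change
        // per-pixel content, so no before/after comparison is needed.
        return WHITE_LIST.contains(operation) || PARAMETER_ADJUSTMENTS.contains(operation);
    }

    public static void main(String[] args) {
        System.out.println(isTargetOperation("beauty")); // white-listed
        System.out.println(isTargetOperation("crop"));   // not a target operation
    }
}
```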
The above embodiments respectively disclose a scheme of simultaneously displaying the first picture and the second picture after determining that the current state of the terminal device conforms to the state of multi-picture display, and a scheme of simultaneously displaying the first picture and the second picture after determining that the first processing operation includes the target operation. In another embodiment, the first picture and the second picture may be displayed at the same time only when the terminal device satisfies both conditions. Referring to the workflow diagram shown in fig. 9, the picture display method provided by this embodiment includes the following steps:
Step S21, in response to a first determination operation for the terminal device, determining a first picture stored in the terminal device.
Step S22, a second picture obtained after the first picture is processed is determined in response to a first processing operation for the first picture.
The specific implementation process of step S21 to step S22 is the same as the specific implementation process of step S11 to step S12, and reference may be made to each other, and the detailed description thereof will be omitted.
Step S23, determining whether the current state of the terminal device conforms to the state of multi-picture display, if so, executing the operation of step S24, and if not, executing the operation of step S26.
Step S24, determining whether the first processing operation includes a target operation, if yes, executing the operation of step S25, and if not, executing the operation of step S26.
Step S25, displaying the second picture while displaying the first picture.
The specific implementation process of step S25 is the same as that of step S13, and reference may be made to each other, which is not repeated here.
Step S26, displaying the second picture.
In addition, in the above steps and fig. 9, whether the first processing operation includes the target operation is determined after it is determined that the current state of the terminal device conforms to the state of multi-picture display. In the actual picture display process, there is no strict timing restriction between these two steps: it may also first be determined whether the first processing operation includes the target operation and then whether the current state of the terminal device conforms to the state of multi-picture display, or the two determinations may be made at the same time.
Through the scheme provided in this embodiment, the first picture and the second picture can be displayed simultaneously when the current state of the terminal device conforms to the multi-picture display state and the first processing operation received by the terminal device includes the target operation; otherwise, only the second picture is displayed.
In another embodiment of the present application, after the second picture obtained by processing the first picture is determined, the current state of the terminal device is determined to conform to the multi-picture display state, and the operation of simultaneously displaying the first picture and the second picture is performed. Referring to the workflow diagram shown in fig. 10, the method may further include the following steps:
Step S18, if the state of the terminal device is adjusted from a state conforming to multi-picture display to a state conforming to single-picture display, the display of the first picture is terminated. After the display of the first picture is terminated, the display screen of the terminal device displays only the second picture.
If the terminal device includes a folding screen formed by a first screen and a second screen, and the terminal device conforms to the multi-picture display state when the angle between the first screen and the second screen is smaller than a first threshold, then the terminal device conforms to the single-picture display state when the angle between the first screen and the second screen is not smaller than the first threshold, that is, when the first screen and the second screen are not in the same plane and the folding screen of the terminal device is in a folded state.
In addition, if the terminal device conforms to the multi-picture display state when it is in a split-screen display state and the ratio of the area of the first split screen to the area of the whole screen is greater than a second threshold, then the terminal device conforms to the single-picture display state when it is in the split-screen display state and this ratio is less than or equal to the second threshold.
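The two state checks above can be sketched as simple predicates. The threshold values here are hypothetical; the first and second thresholds are not fixed by the text.

```python
FIRST_THRESHOLD_DEG = 30.0  # hypothetical folding-angle threshold
SECOND_THRESHOLD = 0.5      # hypothetical split-screen area-ratio threshold

def folding_screen_multi(angle_deg: float) -> bool:
    # Multi-picture display when the angle between the first screen and the
    # second screen is smaller than the first threshold.
    return angle_deg < FIRST_THRESHOLD_DEG

def split_screen_multi(first_split_area: float, whole_area: float) -> bool:
    # Multi-picture display when the first split screen occupies more than
    # the second threshold's share of the whole screen.
    return first_split_area / whole_area > SECOND_THRESHOLD
```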
Compared with the multi-picture display state, when the terminal device conforms to the single-picture display state, the interface for displaying pictures is smaller. In this case, the display of the first picture is stopped and only the second picture is displayed, so that the second picture has a better display effect and the user can view it conveniently.
In addition, after the terminal device displays the first picture and the second picture simultaneously, the user, after comparing them, may be dissatisfied with the display effect of the second picture and wish to process the first picture again. For this case, the present application discloses another embodiment, which further includes the following steps after the terminal device simultaneously displays the first picture and the second picture:
first, a third picture is determined in response to a second processing operation for the first picture, wherein the third picture is obtained after the second processing operation is performed on the first picture.
Then, the first picture and the third picture are displayed simultaneously.
In this scheme, if the user is not satisfied with the display effect of the second picture, the first picture may be processed again, in which case the user performs the second processing operation on the first picture. After receiving the second processing operation, the terminal device processes the first picture according to the second processing operation to obtain a corresponding third picture, and simultaneously displays the first picture and the third picture. The user can confirm the effect of the current picture processing by checking the first picture and the third picture which are displayed simultaneously.
In one example, the first picture includes a portrait of a person, and the first processing operation is a face thinning operation on the portrait of the person. After receiving the first processing operation, the terminal equipment determines a second picture, and simultaneously displays the first picture and the second picture, wherein the portrait of the person included in the second picture is subjected to face thinning processing. However, after comparing the first picture and the second picture, the user considers that the face thinning amplitude is too large, which causes distortion of the second picture, and hopes to reprocess the first picture. In this case, the user performs a second processing operation on the first picture, the second processing operation including a face thinning operation, but the face thinning amplitude is smaller than that of the first processing operation. After receiving the second processing operation, the terminal device reprocesses the first picture based on the second processing operation, acquires a corresponding third picture, and simultaneously displays the first picture and the third picture so that a user can check whether the third picture is distorted.
Through the scheme of this embodiment, the terminal device can determine a corresponding third picture according to a received second processing operation for the first picture, where the third picture is obtained by processing the first picture again. Displaying the first picture and the third picture simultaneously satisfies the user's need to reprocess the first picture and makes it convenient for the user to judge the effect of the reprocessed picture by viewing the first picture and the third picture.
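A minimal sketch of the reprocessing flow, with pictures modeled as dicts of hypothetical attributes: the key point is that the second processing operation is applied to the original first picture, not to the second picture.

```python
def reprocess(first_picture, second_op):
    # The third picture is obtained by applying the second processing
    # operation to the ORIGINAL first picture.
    third_picture = second_op(first_picture)
    return [first_picture, third_picture]  # displayed simultaneously

# Example from the text: a milder face-thinning than the first operation.
first = {"face_width": 100}
strong_thinning = lambda p: {"face_width": p["face_width"] - 20}  # first op
mild_thinning = lambda p: {"face_width": p["face_width"] - 5}     # second op

second = strong_thinning(first)           # looked distorted to the user
shown = reprocess(first, mild_thinning)   # redo from the original picture
```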
Further, in this embodiment, if the terminal device displays the first picture and the second picture at the same time in a case where the received first processing operation includes the target operation, the terminal device may determine whether the second processing operation includes the target operation after determining the third picture. If the second processing operation includes the target operation, the terminal device displays the first picture and the third picture at the same time, and if the second processing operation does not include the target operation, the terminal device generally displays only the third picture.
In another application scenario, after the terminal device displays the first picture and the second picture simultaneously, the user, after comparing them, may be dissatisfied with the display effect of the second picture and wish to continue processing the second picture. For this case, the present application discloses another embodiment, which further includes the following steps after the terminal device simultaneously displays the first picture and the second picture:
Firstly, determining a fourth picture in response to a third processing operation for the second picture, wherein the fourth picture is obtained after the third processing operation is performed on the second picture;
then, the first picture and the fourth picture are displayed simultaneously.
In this scenario, if the user wishes to continue processing the second picture, a third processing operation may be performed on the second picture. After receiving the third processing operation, the terminal device processes the second picture according to the third processing operation to obtain a corresponding fourth picture, and displays the first picture and the fourth picture simultaneously. The user can confirm the effect of the current picture processing by viewing the simultaneously displayed first picture and fourth picture.
In one example, the first processing operation is to increase picture brightness. After receiving the first processing operation, the terminal equipment determines a second picture, and simultaneously displays the first picture and the second picture, wherein the second picture is a picture with improved brightness. However, the user is not satisfied with the second picture after comparing the first picture and the second picture, and wants to add a filter to the second picture. In this case, the user performs a third processing operation on the second picture, the third processing operation including an operation of adding a filter. After receiving the third processing operation, the terminal equipment continues to process the second picture based on the third processing operation, adds a filter corresponding to the third processing operation for the second picture, acquires a corresponding fourth picture, and simultaneously displays the first picture and the fourth picture, so that a user can determine whether the picture processing effect is satisfactory or not by comparing the first picture and the fourth picture.
Through the scheme of this embodiment, the terminal device can determine a corresponding fourth picture according to a received third processing operation for the second picture, where the fourth picture is obtained by further processing the second picture. Displaying the first picture and the fourth picture simultaneously satisfies the user's need to continue processing the second picture and makes it convenient for the user to judge the resulting picture effect by viewing the first picture and the fourth picture.
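In contrast to reprocessing, continuing to process applies the new operation to the second picture, chaining the two operations. A sketch under the same dict-based modeling assumption:

```python
def continue_processing(first_picture, second_picture, third_op):
    # The fourth picture is obtained by applying the third processing
    # operation to the SECOND picture, so the two operations compose.
    fourth_picture = third_op(second_picture)
    return [first_picture, fourth_picture]  # displayed simultaneously

# Example from the text: brightness was raised first, then a filter is added.
first = {"brightness": 50, "filter": None}
second = {**first, "brightness": 70}            # result of the first operation
add_filter = lambda p: {**p, "filter": "warm"}  # third processing operation
shown = continue_processing(first, second, add_filter)
```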
Further, in this embodiment, if the terminal device displays the first picture and the second picture at the same time in a case where the received first processing operation includes the target operation, the terminal device may determine whether the third processing operation includes the target operation after determining the fourth picture. If the third processing operation includes the target operation, the terminal device displays the first picture and the fourth picture at the same time, and if the third processing operation does not include the target operation, the terminal device generally displays only the fourth picture.
In some scenarios, a user may perform different processing operations on the first picture, and wish to compare picture processing effects corresponding to the different processing operations. For this application scenario, another embodiment is disclosed herein, which further includes the following steps after the terminal device displays the first picture and the second picture simultaneously:
Firstly, determining a fifth picture in response to a fourth processing operation for the first picture, wherein the fifth picture is obtained after the fourth processing operation is performed on the first picture;
then, the fifth picture is displayed while the first picture and the second picture are displayed.
In this scheme, the user performs various processing operations on the first picture. In this case, the terminal device determines a corresponding second picture after receiving the first processing operation for the first picture, and simultaneously displays the first picture and the second picture. Then, the terminal device receives a fourth processing operation for the first picture, determines a fifth picture according to the fourth processing operation, and displays the fifth picture simultaneously with the first picture and the second picture, namely, the terminal device displays the first picture, the second picture and the fifth picture simultaneously. The user can determine the picture processing effects respectively corresponding to different processing operations by looking at the pictures displayed simultaneously.
In one example, a user may wish to compare the processing effect of the first picture with two operations, increasing the brightness of the first picture and increasing the contrast of the first picture. In this case, the first processing operation may be to increase the brightness of the picture, and the terminal device determines a second picture after receiving the first processing operation, and simultaneously displays the first picture and the second picture, where the second picture is the picture after increasing the brightness. Then, the user performs a fourth processing operation on the first picture, the fourth processing operation being to improve the contrast of the picture. After receiving the fourth processing operation, the terminal device processes the first picture based on the fourth processing operation, obtains a fifth picture, wherein the fifth picture is a picture with improved contrast, and simultaneously displays the first picture, the second picture and the fifth picture.
The user can compare the display effects of the obtained pictures after processing the first picture through the first processing operation and the fourth processing operation respectively by looking at the pictures displayed at the same time, and determine whether the picture processing effects corresponding to the different processing operations are satisfied.
Through the scheme of the embodiment, after receiving various processing operations for the first picture, the terminal equipment can acquire each picture obtained after the first picture is processed through different processing operations, and simultaneously display each picture, so that a user can conveniently determine picture processing effects corresponding to different processing operations by checking each picture displayed simultaneously.
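The comparison scenario can be sketched as one processed picture per operation, all displayed next to the original; pictures and operations are again modeled as hypothetical dicts and functions.

```python
def compare_operations(first_picture, ops):
    # Each operation is applied independently to the first picture, so the
    # user can compare the effects side by side with the original.
    return [first_picture] + [op(first_picture) for op in ops]

# Example from the text: raise brightness vs. raise contrast (values hypothetical).
first = {"brightness": 50, "contrast": 50}
brighter = lambda p: {**p, "brightness": 70}       # first processing operation
higher_contrast = lambda p: {**p, "contrast": 70}  # fourth processing operation
shown = compare_operations(first, [brighter, higher_contrast])
```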
To clarify the advantages of the present application, an example is provided below in which the first processing operation is an operation of adding a filter, or the first processing operation is an adjustment of an attribute (e.g., brightness, contrast, or saturation) of the first picture or a beauty adjustment. Referring to the workflow diagram shown in fig. 11, this example includes the steps of:
Step S31, in response to a first determination operation for the terminal device, determining a first picture stored in the terminal device.
Step S32, a second picture obtained after the first picture is processed is determined in response to a first processing operation for the first picture. Referring to fig. 11, this step may correspond to different picture-processing scenarios. In one scenario, the first processing operation is used to add a filter to the first picture. In this case, the first processing operation may include an operation of causing the terminal device to enter a function page for adding a filter to the first picture and an operation of selecting any filter on that function page. Correspondingly, in this step, a second picture obtained after adding the filter to the first picture is determined.
In another scenario, the first processing operation adjusts an attribute of the first picture or performs a beauty adjustment. In this case, the first processing operation may include an operation of causing the terminal device to enter a function page for attribute adjustment or beauty adjustment of the first picture, a drag operation on an intensity slider for an attribute of the first picture or for beauty, and an operation of determining whether the drag on the intensity slider has ended. If the user drags the intensity slider by touching the touch area of the terminal device, in this step the terminal device determines the position to which the intensity slider is dragged and determines whether the drag has ended; if so, the subsequent operations continue, and if not, the operation of determining the drag position of the intensity slider continues. In this case, the drag on the intensity slider is generally determined to have ended when it is detected that the user has ended the drag (i.e., the user lifts his or her finger).
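The drag-tracking loop described above can be sketched as follows; the touch events are hypothetical (position, finger-lifted) pairs standing in for real touch input.

```python
def track_slider(events):
    """Return the final intensity-slider position once the drag ends, else None."""
    position = None
    for pos, lifted in events:
        position = pos   # record where the intensity slider was dragged to
        if lifted:       # the user lifted the finger: the drag has ended
            return position
    return None          # drag has not ended yet; keep tracking
```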
After step S32 is completed, the operations performed by the terminal device may include a variety of operations.
In one design, after step S32 is completed, the terminal device performs step S33, and step S33 includes the following operations:
Step S33, after determining the second picture, the terminal device displays the first picture and the second picture simultaneously.
In another design, after step S32 is completed, the terminal device performs step S34, and step S34 includes the following operations:
Step S34, determining whether the current state of the terminal device conforms to the multi-picture display state; if so, executing the operation of step S35, and if not, executing the operation of step S40.
The method for determining whether the current state of the terminal device conforms to the multi-picture display state may refer to the above embodiments and is not repeated here.
Step S35, determining whether the parameters of the picture obtained by processing the first picture through the first processing operation are in a default state, if so, executing the operation of step S36, and if not, executing the operation of step S38.
The parameter of the picture obtained after the first picture is processed through the first processing operation is in a default state, which means that the parameter of the first picture is not adjusted through the first processing operation. For example, if the first processing operation is to adjust the attribute of the first picture or make a beauty adjustment, the parameter of the picture obtained after the first processing operation processes the first picture is in a default state, which means that the adjustment amplitude is zero.
Step S36, determining whether the terminal device displays the first picture and the second picture at the same time, if yes, executing the operation of step S37, and if no, executing the operation of step S40.
If it is determined through the operation of step S35 that the parameters of the picture obtained by processing the first picture through the first processing operation are in the default state, the displayed content of the first picture is unchanged; that is, the content displayed by the second picture obtained by processing the first picture is the same as that of the first picture. In this case, the first picture and the second picture do not need to be displayed simultaneously.
In addition, in this case, if the terminal device does not currently display the first picture and the second picture at the same time, the first picture continues to be displayed.
Step S37, switching to a state of single-image display from the state of simultaneously displaying the first image and the second image.
The single-picture display state refers to a state in which the terminal device displays one picture.
Since the terminal device simultaneously displays the first picture and the second picture in step S36, the terminal device may display a transition animation from simultaneously displaying the first picture and the second picture to single picture display and then enter a state of single picture display.
Step S38, determining whether the terminal device displays the first picture and the second picture at the same time, if not, executing the operation of step S39, and if so, executing the operation of step S41.
Step S39, switching from the state of single-picture display to the state of simultaneously displaying the first picture and the second picture.
If it is determined through the operation of step S35 that the parameters of the picture obtained by processing the first picture through the first processing operation are not in the default state, and it is determined through the operation of step S38 that the terminal device is not simultaneously displaying the first picture and the second picture, the terminal device may adjust its picture display state and enter the state of simultaneously displaying the first picture and the second picture.
In one possible design, this step may display a transition animation from a single-picture display state to a state in which the first picture and the second picture are displayed simultaneously, and then enter a state in which the first picture and the second picture are displayed simultaneously.
Step S40, entering a single-diagram display state.
When the terminal device is in the single-picture display state, one picture is displayed; in one possible design, the picture displayed by the terminal device may be the processed picture.
Step S41, the display of the first picture and the second picture is kept, namely, the first picture and the second picture are continuously displayed at the same time.
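The branching of steps S34 to S41 reduces to a small decision function; this is an illustrative sketch with boolean placeholders for the device-specific checks.

```python
def update_display_state(multi_state: bool, params_default: bool) -> str:
    """Resulting display mode for steps S34-S41:
    'dual' = first and second pictures shown simultaneously."""
    if not multi_state:
        return "single"  # step S40: not in a multi-picture display state
    if params_default:
        # Steps S36/S37/S40: content unchanged, so one picture suffices
        # (switching out of dual display if currently in it).
        return "single"
    # Steps S38/S39/S41: parameters changed, so show both pictures
    # (switching into dual display if not already in it).
    return "dual"
```

Whether the device is currently in dual display only determines whether a transition animation is played (steps S37 and S39); the resulting mode is the same.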
In this example, the terminal device may adjust the state of the picture display, for example, from a state in which the first picture and the second picture are simultaneously displayed to a state in which the single picture is displayed, or from a state in which the single picture is displayed to a state in which the first picture and the second picture are simultaneously displayed. When the terminal device displays the first picture and the second picture at the same time, the user can determine the effect of picture processing by comparing the first picture with the second picture.
The following are device embodiments of the present application, which may be used to perform method embodiments of the present application. For details not disclosed in the device embodiments of the present application, please refer to the method embodiments of the present application.
The embodiment of the application discloses a picture display device. Referring to a schematic structural diagram shown in fig. 12, the picture display device includes: a first picture determination module 110, a second picture determination module 120, and a picture display module 130.
Wherein, the first picture determining module 110 is configured to determine, in response to a first determining operation for a terminal device, a first picture stored in the terminal device;
the second picture determining module 120 is configured to determine, in response to a first processing operation for the first picture, a second picture obtained after the first picture is processed;
The picture display module 130 is configured to display the second picture while displaying the first picture.
In one possible design, the picture display device is further configured to determine that the current state of the terminal device conforms to the multi-picture display state before the picture display module 130 displays the first picture and the second picture simultaneously.
The picture display device may determine that the current state of the terminal device conforms to the multi-picture display state in any of the following ways:
the terminal device includes a folding screen formed by a first screen and a second screen, and it is determined that the angle between the first screen and the second screen is smaller than a first threshold;
or the terminal device is in a split-screen display state, and it is determined that the ratio of the area of a first split screen to the area of the whole screen is greater than a second threshold, where the first split screen is the split screen that receives the first processing operation;
or it is determined that the function page of the terminal device corresponding to the first processing operation is a function page supporting multi-picture display;
or a trigger signal for causing the terminal device to enter the multi-picture display state is received.
In one possible design, before the picture display module 130 displays the first picture and the second picture simultaneously, the picture display device is further configured to determine that the first processing operation includes a target operation. The target operation includes an operation described in a white list, where the white list describes operations supporting multi-picture display; or the target operation includes an operation of adjusting a parameter of the first picture, where the parameter includes at least one of brightness, contrast, and saturation.
The first processing operation includes an operation of dragging an intensity slider for a parameter of the first picture, and the picture display module 130 is configured to display the second picture while displaying the first picture after the drag on the intensity slider is completed.
In one possible design, after the first picture and the second picture are displayed by the picture display module 130 at the same time, the picture display device is further configured to terminate displaying the first picture if the state of the terminal device is adjusted from the state conforming to the multi-picture display to the state conforming to the single-picture display.
In one possible design, after the first picture and the second picture are displayed by the picture display module 130 at the same time, the picture display device is further configured to determine, in response to a second processing operation for the first picture, a third picture, where the third picture is a picture obtained by performing the second processing operation on the first picture, and display the first picture and the third picture at the same time.
In one possible design, after the first picture and the second picture are displayed by the picture display module 130 at the same time, the picture display device is further configured to determine, in response to a third processing operation for the second picture, a fourth picture, where the fourth picture is a picture obtained by performing the third processing operation on the second picture, and display the first picture and the fourth picture at the same time.
In one possible design, after the first picture and the second picture are displayed by the picture display module 130 at the same time, the picture display device is further configured to determine, in response to a fourth processing operation for the first picture, a fifth picture, where the fifth picture is a picture obtained by performing the fourth processing operation on the first picture, and display the fifth picture while displaying the first picture and the second picture.
Correspondingly, the embodiment of the application discloses a terminal device, referring to a structural schematic diagram shown in fig. 13, where the terminal device includes:
the processor 1101 and a memory,
the memory is used for storing program instructions;
the processor 1101 is configured to invoke and execute program instructions stored in the memory, and when the program instructions stored in the memory are executed by the processor 1101, cause the terminal device to perform all or part of the steps in the embodiments corresponding to fig. 5 and fig. 7 to 10.
Further, the terminal device may further include: a transceiver 1102 and a bus 1103, and the memory includes a random access memory 1104 and a read-only memory 1105.
The processor is coupled to the transceiver, the random access memory, and the read-only memory through the bus. When the terminal device needs to run, the basic input/output system fixed in the read-only memory, or the bootloader of an embedded system, is started to boot the terminal device into a normal operation state. After the terminal device enters the normal operation state, the application program and the operating system run in the random access memory, so that the terminal device executes all or part of the steps in the embodiments corresponding to fig. 5 and fig. 7 to 10.
The terminal device in this embodiment may correspond to the terminal device in the embodiments corresponding to fig. 5 and fig. 7 to 10, and the processor, memory, and so on in the terminal device may implement the functions of the terminal device and/or the various steps and methods implemented in those embodiments, which are not described here for brevity.
It should be noted that this embodiment may also be implemented based on a general physical server combined with the network function virtualization (Network Function Virtualization, NFV) technology, in which case the terminal device is a virtual terminal device (e.g., a virtual host, a virtual router, or a virtual switch). The virtual terminal device may be a virtual machine (Virtual Machine, VM) that runs a program providing the message sending function, and the virtual machine is deployed on a hardware device (e.g., a physical server). A virtual machine is a complete computer system that is simulated by software, has complete hardware system functionality, and runs in a completely isolated environment. Those skilled in the art can, by reading this application, virtualize multiple devices with the above-described functions on a general physical server; this is not described in detail here.
In a specific implementation, an embodiment of the present application further provides a computer storage medium, where the computer storage medium stores a computer program or instructions that, when executed, may implement all or part of the steps in the embodiments corresponding to fig. 5 and fig. 7 to 10. The computer-readable storage medium may be provided in any device and may be a random-access memory (RAM); the memory may also include a non-volatile memory, such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory may also include a combination of the above types of memory.
Embodiments of the present application also provide a chip system including a processor coupled to a memory for supporting the apparatus to implement the functions involved in the above aspects, for example, displaying a first picture and a second picture simultaneously. In one possible design, the chip system further includes a memory for holding computer instructions and data necessary for the picture display device. The chip system may be formed of a chip or may include a chip and other discrete devices.
The method embodiments described herein may be independent schemes or may be combined according to internal logic, and these schemes fall within the protection scope of the present application.
It will be appreciated that in the various method embodiments described above, the methods and operations performed by the terminal device may also be performed by components (e.g., chips or circuits) that may be used in the terminal device.
The above embodiments describe the method provided in the present application. It will be appreciated that the terminal device, in order to implement the above-described functions, includes corresponding hardware structures and/or software modules that perform each of the functions. Those of skill in the art will readily appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the present application may divide the functional modules of the terminal device according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
The various illustrative logical units and circuits described in the embodiments of the application may be implemented or performed with a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, it may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method or algorithm described in the embodiments of the present application may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may reside in RAM, flash memory, ROM, EPROM, EEPROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. In an example, a storage medium may be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC, and the ASIC may reside in user equipment (UE). In the alternative, the processor and the storage medium may reside as discrete components in a UE.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and the sequence numbers should not constitute any limitation on the implementation of the embodiments of the present application.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium, or a semiconductor medium (e.g., a solid-state drive (SSD)).
The embodiments of this specification are described with mutual reference to their common or similar parts, with each embodiment focusing on its differences from the others. In particular, the apparatus and system embodiments are described relatively briefly because they are substantially similar to the method embodiments; for relevant details, refer to the description of the method embodiments.
It will be apparent to those skilled in the art that the techniques of the embodiments of the present invention may be implemented by software plus a necessary general-purpose hardware platform. Based on this understanding, the technical solutions of the embodiments of the present invention, in essence or in the part that contributes to the prior art, may be embodied in the form of a software product. The software product may be stored in a storage medium, such as ROM/RAM, a magnetic disk, or an optical disk, and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments, or in some parts of the embodiments, of the present invention.
The various embodiments in this specification are described with mutual reference to their common or similar parts. In particular, for the embodiments of the picture display apparatus disclosed in the present application, the description is relatively brief because they are substantially similar to the method embodiments; for relevant details, refer to the description in the method embodiments.
The embodiments of the present invention described above do not limit the scope of the present invention.

Claims (12)

1. A picture display method, characterized by comprising:
determining a first picture stored in a terminal device in response to a first determination operation for the terminal device;
determining a second picture obtained after the first picture is processed in response to a first processing operation for the first picture;
determining that the first processing operation includes a target operation, wherein the target operation includes an operation described in a white list, the white list is used for describing operations that support multi-picture display, and the white list does not include a cropping operation; or the target operation includes an operation for adjusting a parameter of the first picture;
displaying the second picture while displaying the first picture;
the display areas of the first picture and the second picture are the same in size, and an editing menu corresponding to the first processing operation is located in an area other than the display areas of the first picture and the second picture.
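The gate described in claim 1 (display the pictures side by side only when the first processing operation is whitelisted or merely adjusts a parameter) can be sketched as follows. This is a minimal illustration; the operation names and whitelist contents are assumptions, not taken from the patent:

```python
# Hypothetical whitelist of operations that support multi-picture display.
# Cropping is deliberately absent: a cropped result would no longer match
# the original picture's display-area size (see the last clause of claim 1).
MULTI_PICTURE_WHITELIST = {"brightness", "contrast", "saturation", "filter"}

def is_target_operation(operation: str, adjusts_parameter: bool = False) -> bool:
    """True if the first processing operation should trigger side-by-side display."""
    return operation in MULTI_PICTURE_WHITELIST or adjusts_parameter
```

A whitelist (rather than a blacklist) keeps the default behavior conservative: an unknown operation stays in single-picture mode unless it is explicitly marked as comparison-friendly.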
2. The method of claim 1, wherein, prior to the displaying the second picture while displaying the first picture, the method further comprises:
determining that the current state of the terminal device conforms to a multi-picture display state.
3. The method of claim 2, wherein the determining that the current state of the terminal device conforms to a multi-picture display state comprises:
the terminal device comprises a folding screen formed by a first screen and a second screen, and determining that the angle between the first screen and the second screen is smaller than a first threshold;
or,
the terminal device is in a split-screen display state, and determining that the ratio of the area of a first split screen to the area of the whole screen is larger than a second threshold, wherein the first split screen is the split screen that receives the first processing operation;
or,
determining that a functional page of the terminal device corresponding to the first processing operation is a functional page supporting multi-picture display;
or,
receiving a trigger signal for causing the terminal device to enter the multi-picture display state.
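The four alternative conditions of claim 3 can be modeled as a single predicate. The field names and threshold values below are illustrative assumptions only, not values from the patent:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DeviceState:
    fold_angle: Optional[float] = None    # degrees between first and second screen
    split_ratio: Optional[float] = None   # first split-screen area / whole-screen area
    page_supports_multi: bool = False     # function page supports multi-picture display
    multi_display_triggered: bool = False # explicit trigger signal received

FOLD_ANGLE_THRESHOLD = 60.0   # "first threshold" of claim 3 (assumed value)
SPLIT_RATIO_THRESHOLD = 0.5   # "second threshold" of claim 3 (assumed value)

def allows_multi_picture(state: DeviceState) -> bool:
    """Any one of the four claim-3 conditions suffices for multi-picture display."""
    if state.fold_angle is not None and state.fold_angle < FOLD_ANGLE_THRESHOLD:
        return True
    if state.split_ratio is not None and state.split_ratio > SPLIT_RATIO_THRESHOLD:
        return True
    return state.page_supports_multi or state.multi_display_triggered
```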
4. The method of claim 1, wherein the displaying the second picture while displaying the first picture comprises:
the first processing operation includes an operation of dragging an intensity slider for a parameter of the first picture, and after the dragging of the intensity slider ends, displaying the second picture while displaying the first picture.
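One possible reading of claim 4, with assumed event names: the comparison pane appears only once the slider drag gesture ends, so intermediate drag values do not repeatedly spawn a second picture:

```python
class SliderController:
    """Hypothetical slider handler; a sketch, not the patented implementation."""

    def __init__(self):
        self.show_comparison = False  # whether the second picture is shown beside the first

    def on_drag(self, value: float) -> None:
        # Intermediate slider values: keep editing in place, no comparison pane yet.
        self.show_comparison = False

    def on_drag_end(self, value: float) -> None:
        # Drag finished: display the second (processed) picture beside the first.
        self.show_comparison = True
```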
5. The method of claim 2, wherein, after the displaying the second picture while displaying the first picture, the method further comprises:
if the state of the terminal device is adjusted from a state conforming to multi-picture display to a state conforming to single-picture display, terminating the displaying of the first picture.
6. The method according to any one of claims 1 to 5, wherein, after the displaying the second picture while displaying the first picture, the method further comprises:
determining a third picture in response to a second processing operation for the first picture, wherein the third picture is obtained after the second processing operation is performed on the first picture;
and displaying the first picture and the third picture simultaneously.
7. The method according to any one of claims 1 to 5, wherein, after the displaying the second picture while displaying the first picture, the method further comprises:
determining a fourth picture in response to a third processing operation for the second picture, wherein the fourth picture is obtained after the third processing operation is performed on the second picture;
and displaying the first picture and the fourth picture simultaneously.
8. The method according to any one of claims 1 to 5, wherein, after the displaying the second picture while displaying the first picture, the method further comprises:
determining a fifth picture in response to a fourth processing operation for the first picture, wherein the fifth picture is obtained after the fourth processing operation is performed on the first picture;
and displaying the fifth picture while displaying the first picture and the second picture.
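Claims 6 to 8 describe how further processing operations update the side-by-side view. The three cases can be illustrated as follows; all structure below is an assumption for exposition, not the patented implementation:

```python
class EditingSession:
    def __init__(self, first: str, second: str):
        # Start from the claim-1 state: the first (original) and second
        # (processed) pictures are displayed side by side.
        self.displayed = [first, second]

    def reedit_original(self, third: str) -> None:
        # Claim 6: a second processing operation on the first picture yields a
        # third picture; show the first and third pictures simultaneously.
        self.displayed = [self.displayed[0], third]

    def edit_result(self, fourth: str) -> None:
        # Claim 7: a third processing operation on the second picture yields a
        # fourth picture; show the first and fourth pictures simultaneously.
        self.displayed = [self.displayed[0], fourth]

    def add_pane(self, fifth: str) -> None:
        # Claim 8: a fourth processing operation on the first picture yields a
        # fifth picture, displayed alongside the first and second pictures.
        self.displayed = self.displayed + [fifth]
```

In claims 6 and 7 the comparison stays two-wide with the original always anchored on one side, while claim 8 grows the layout to three panes.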
9. A picture display device, comprising:
a first picture determining module, configured to determine a first picture stored in a terminal device in response to a first determining operation for the terminal device;
a second picture determining module, configured to determine, in response to a first processing operation for the first picture, a second picture obtained after the first picture is processed;
a picture display module, configured to, after determining that the first processing operation includes a target operation, display the second picture while displaying the first picture, wherein the target operation includes an operation described in a white list, the white list is used for describing operations that support multi-picture display, and the white list does not include a cropping operation; or the target operation includes an operation for adjusting a parameter of the first picture;
the display areas of the first picture and the second picture are the same in size, and an editing menu corresponding to the first processing operation is located in an area other than the display areas of the first picture and the second picture.
10. A terminal device, comprising: a processor and a memory; the memory stores program instructions that, when executed by the processor, cause the terminal device to perform the method of any of claims 1-8.
11. A computer storage medium having stored therein a computer program or instructions which, when executed, implement the method of any one of claims 1-8.
12. A chip system comprising a processor coupled to a memory, the processor being configured to execute a computer program or instructions stored in the memory which, when executed, implement the method of any one of claims 1-8.
CN202210724893.9A (priority date and filing date 2022-06-23): Picture display method and device and terminal equipment. Active. CN116048349B (en)

Priority Applications (1)

Application Number: CN202210724893.9A; Priority Date: 2022-06-23; Filing Date: 2022-06-23; Title: Picture display method and device and terminal equipment


Publications (2)

CN116048349A, published 2023-05-02 (application publication)
CN116048349B, published 2024-04-12 (grant)

Family

ID=86131883

Country Status (1)

CN (1) CN116048349B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104111778A (en) * 2014-06-25 2014-10-22 小米科技有限责任公司 Method and device for picture display
CN107909634A (en) * 2017-11-30 2018-04-13 努比亚技术有限公司 Image display method, mobile terminal and computer-readable recording medium
CN109213416A (en) * 2018-08-31 2019-01-15 维沃移动通信有限公司 A kind of display information processing method and mobile terminal
CN109597550A (en) * 2018-11-19 2019-04-09 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN110188074A (en) * 2019-05-13 2019-08-30 珠海格力电器股份有限公司 A kind of Document Editing label display methods and equipment
CN111310041A (en) * 2020-02-12 2020-06-19 腾讯科技(深圳)有限公司 Image-text publishing method, model training method and device and storage medium
CN111382289A (en) * 2020-03-13 2020-07-07 闻泰通讯股份有限公司 Picture display method and device, computer equipment and storage medium
CN112017257A (en) * 2020-08-31 2020-12-01 北京字节跳动网络技术有限公司 Image processing method, apparatus and storage medium
CN112363683A (en) * 2020-11-23 2021-02-12 Vidaa美国公司 Method for supporting multi-layer display of webpage application and display equipment
CN112825040A (en) * 2019-11-21 2021-05-21 腾讯科技(深圳)有限公司 User interface display method, device, equipment and storage medium
CN113805762A (en) * 2021-09-29 2021-12-17 腾讯科技(深圳)有限公司 Page content display method, related device, equipment and storage medium
CN113946258A (en) * 2021-10-18 2022-01-18 珠海格力电器股份有限公司 Picture editing processing method and device, storage medium, processor and terminal equipment




Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant