CN113157357A - Page display method, device, terminal and storage medium - Google Patents

Page display method, device, terminal and storage medium

Info

Publication number
CN113157357A
Authority
CN
China
Prior art keywords
target
background
foreground object
background image
page
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010076679.8A
Other languages
Chinese (zh)
Inventor
罗义
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202010076679.8A
Publication of CN113157357A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 - 2D [Two Dimensional] image generation
    • G06T 11/40 - Filling a planar surface by adding surface attributes, e.g. colour or texture

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to the technical field of image processing and provides a page display method, device, terminal and storage medium. The method includes: adjusting an original background image in a target page into a target background image corresponding to the current display mode; identifying an associated background area of each foreground object in the target page within the target background image; adjusting each foreground object to a target object corresponding to the display mode according to the background pixel value of its associated background area; and generating the target page from all the target objects and the target background image. With this technical solution, the adjustment strategy for a foreground object is determined from its associated background area, so the adjusted background image and foreground object are prevented from ending up the same or similar in color, and the display effect of the page is improved.

Description

Page display method, device, terminal and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a page display method, apparatus, terminal, and storage medium.
Background
Terminal devices such as mobile phones and tablet computers can adjust the color of each display object in a page according to the current display mode so that the page matches that mode. For example, in a dark color display mode, foreground objects such as text and icons can be color-inverted and the background can be dimmed, so that the displayed page matches the dark color display mode. Because the page background and the foreground objects are adjusted separately, the adjusted page background and a foreground object may end up with the same or a similar color, which reduces the contrast between the foreground object and the background and degrades the display effect of the page.
Disclosure of Invention
The embodiments of the present application provide a page display method, apparatus, terminal and storage medium, which can solve the problem in existing page display technology that the background image and the foreground objects in a page may end up the same or similar in color after being adjusted according to a display mode.
In a first aspect, an embodiment of the present application provides a method for displaying a page, including:
adjusting an original background image in a target page into a target background image corresponding to the current display mode;
identifying associated background areas of each foreground object in the target page in the target background image;
adjusting the foreground object to a target object corresponding to the display mode according to the background pixel value of the associated background area;
and generating the target page according to all the target objects and the target background image.
In a possible implementation manner of the first aspect, the identifying an associated background region of each foreground object in the target page in the target background image includes:
dividing the target background image into a plurality of candidate background areas;
and selecting the associated background area corresponding to the foreground object from the candidate background areas according to the center coordinates of the foreground object and the boundary coordinates of the target background image.
In a possible implementation manner of the first aspect, the dividing the target background image into a plurality of candidate background regions includes:
obtaining a minimum foreground size, and using the minimum foreground size as the block size;
and dividing the target background image based on the block size to obtain a plurality of candidate background areas.
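As a rough illustration of this division step, the following Python sketch (all names are hypothetical, not taken from the patent) treats the target background image as a width-by-height canvas and cuts it into a grid of candidate regions whose block size equals the minimum foreground size; clipping the partial blocks at the right and bottom edges is an added assumption.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Region:
    x: int  # left edge of the candidate region
    y: int  # top edge of the candidate region
    w: int  # region width
    h: int  # region height

def divide_background(img_w: int, img_h: int, min_fg_w: int, min_fg_h: int) -> List[List[Region]]:
    """Split the target background image into candidate regions, one grid cell per block."""
    rows = []
    for top in range(0, img_h, min_fg_h):
        row = []
        for left in range(0, img_w, min_fg_w):
            row.append(Region(left, top,
                              min(min_fg_w, img_w - left),   # clip the last column
                              min(min_fg_h, img_h - top)))   # clip the last row
        rows.append(row)
    return rows

# Example: a 1080x2340 background divided with a 90x90 minimum foreground size
grid = divide_background(1080, 2340, 90, 90)
print(len(grid), "rows x", len(grid[0]), "columns")   # 26 rows x 12 columns
```

Using the minimum foreground size as the block size keeps each candidate region comparable in scale to the smallest foreground object that may be matched against it.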
In a possible implementation manner of the first aspect, the selecting, according to the center coordinates of the foreground object and the boundary coordinates of the target background image, the associated background region corresponding to the foreground object from the candidate background regions includes:
configuring related reference row and column numbers for each candidate background area according to the display position of each candidate background area in the target background image;
[Formula rendered as an image in the original publication]
where Column0 is the reference column number of the candidate background region; Row0 is the reference row number of the candidate background region; (SrcX0, SrcY0) are the boundary coordinates of the target background image; (pX, pY) are the center coordinates of the candidate background region; and (Sizex, Sizey) is the region size of the candidate background region;
importing the center coordinates of the foreground object into a preset row-column conversion model, and calculating a target row-column number of the foreground object; the row-column conversion model is specifically:
[Formula rendered as an image in the original publication]
where Column1 is the column number in the target row-column number; Row1 is the row number in the target row-column number; (SrcX1, SrcY1) are the center coordinates of the foreground object; (Sizex, Sizey) is the region size of the candidate background region; and (Targetx, Targety) is the object size of the foreground object;
and selecting the candidate background area matched with the reference row and column number and the target row and column number as the associated background area of the foreground object.
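The two conversion formulas above appear only as images in the original publication, so the sketch below is a guess at their intent rather than a reproduction: each candidate region's reference row/column number is derived from its center coordinates relative to the image boundary and the block size, the foreground object's center is mapped the same way, and the candidate region whose reference numbers equal the target numbers is taken as the associated background region. The rounding, the reading of (SrcX0, SrcY0) as the top-left boundary, and the omission of the object size (Targetx, Targety) are all assumptions.

```python
import math
from typing import List, Optional, Tuple

def region_row_col(center: Tuple[float, float],
                   boundary: Tuple[float, float],
                   block: Tuple[float, float]) -> Tuple[int, int]:
    """Map a center coordinate to (row, column) indices in the candidate-region grid.

    `boundary` is taken as the top-left coordinate of the target background image
    and `block` as the candidate block size (Sizex, Sizey)."""
    column = math.ceil((center[0] - boundary[0]) / block[0])
    row = math.ceil((center[1] - boundary[1]) / block[1])
    return row, column

def associated_region_index(region_centers: List[Tuple[float, float]],
                            fg_center: Tuple[float, float],
                            boundary: Tuple[float, float],
                            block: Tuple[float, float]) -> Optional[int]:
    """Return the index of the candidate region whose reference row/column numbers
    match the target row/column numbers computed from the foreground object's center."""
    target = region_row_col(fg_center, boundary, block)
    for i, center in enumerate(region_centers):
        if region_row_col(center, boundary, block) == target:
            return i
    return None
```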
In a possible implementation manner of the first aspect, if the display mode is a dark color display mode, the adjusting the foreground object to a target object corresponding to the display mode according to the background pixel value of the associated background area includes:
acquiring a pixel value and transparency of a central coordinate of the associated background area;
determining a background brightness value of the associated background area according to the pixel value of the center coordinate and the transparency;
if the background brightness value is larger than a preset brightness threshold value, performing brightness reduction processing on the foreground object to generate the target object;
and if the background brightness value is smaller than or equal to the brightness threshold value, performing reverse color processing on the foreground object to generate the target object.
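One way to read this step is sketched below, under assumptions the text does not spell out: the center pixel's RGB value is composited against a black backdrop using its transparency, a Rec. 601 luma is taken as the background brightness value, and the foreground object is dimmed when that value exceeds the threshold or color-inverted otherwise. The luma formula, the black backdrop and the 0.6 dimming factor are illustrative choices, not taken from the patent.

```python
def background_luminance(rgb, alpha):
    """Estimate the brightness of the associated background region from its center pixel.

    rgb: (r, g, b), each channel in 0..255; alpha: transparency in 0.0..1.0.
    The pixel is composited over black before the luma is computed (assumption)."""
    r, g, b = (c * alpha for c in rgb)
    return 0.299 * r + 0.587 * g + 0.114 * b          # Rec. 601 luma

def adjust_foreground_dark_mode(fg_pixels, bg_rgb, bg_alpha, luma_threshold=128):
    """Dim the foreground when the background is bright, otherwise invert its colors."""
    if background_luminance(bg_rgb, bg_alpha) > luma_threshold:
        # brightness-reduction processing (here: scale every channel down)
        return [tuple(int(c * 0.6) for c in px) for px in fg_pixels]
    # reverse-color processing
    return [tuple(255 - c for c in px) for px in fg_pixels]
```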
In a possible implementation manner of the first aspect, after determining the brightness value of the associated background region according to the pixel value of the center coordinate and the transparency, the method further includes:
determining a characteristic brightness value corresponding to the foreground object according to the pixel value of each pixel point in the foreground object;
if the brightness difference value between the background brightness value and the characteristic brightness value is larger than a preset reverse color threshold value, performing brightness reduction processing on the foreground object to generate the target object;
and if the brightness difference value is smaller than or equal to the reverse color threshold value, performing reverse color processing on the foreground object to generate the target object.
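For the variant just described, the following sketch (same hypothetical conventions as above) takes the mean luma of the foreground object's pixels as its characteristic brightness value and chooses between brightness reduction and reverse color from the brightness difference; using the mean and the absolute difference are assumptions.

```python
def characteristic_luminance(fg_pixels):
    """Mean Rec. 601 luma over the foreground object's pixels (one plausible choice)."""
    lumas = [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in fg_pixels]
    return sum(lumas) / len(lumas)

def choose_adjustment(bg_luma: float, fg_pixels, invert_threshold: float = 96.0) -> str:
    """Return which processing the foreground object should receive."""
    diff = abs(bg_luma - characteristic_luminance(fg_pixels))
    return "brightness_reduction" if diff > invert_threshold else "reverse_color"

# Bright foreground over a dark background: the contrast is already large, so dim it.
print(choose_adjustment(30.0, [(230, 230, 230), (250, 250, 250)]))   # brightness_reduction
```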
In a possible implementation manner of the first aspect, the identifying an associated background region of each foreground object in the target page in the target background image includes:
determining a display level of the foreground object within the target page;
selecting a target background image of a level adjacent to the display level in the target page as an associated background image of the foreground object;
determining an associated background region of the foreground object within the associated background image.
In a possible implementation manner of the first aspect, the adjusting the foreground object to be a target object corresponding to the display mode according to a background pixel value of the associated background area includes:
generating a preview object corresponding to the foreground object according to the background pixel value of the associated background area and the pixel value of the foreground object;
and generating the target object corresponding to the foreground object according to the pixel values of all the associated objects corresponding to the preview object and the pixel values of the preview object.
In a second aspect, an embodiment of the present application provides an apparatus for page display, including:
the background image adjusting unit is used for adjusting the original background image in the target page into a target background image corresponding to the current display mode;
the associated background area identification unit is used for identifying the associated background area of each foreground object in the target page in the target background image;
a foreground object adjusting unit, configured to adjust the foreground object to a target object corresponding to the display mode according to a background pixel value of the associated background area;
and the target page generating unit is used for generating the target page according to all the target objects and the target background image.
In a third aspect, an embodiment of the present application provides a terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the page display method according to any one of the above first aspects when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the page display method according to any one of the above first aspects.
In a fifth aspect, an embodiment of the present application provides a computer program product, which, when running on a terminal device, causes the terminal device to execute the method for page display according to any one of the above first aspects.
It is to be understood that the beneficial effects of the second to fifth aspects can be found in the related descriptions of the first aspect, and are not described herein again.
Compared with the prior art, the embodiment of the application has the advantages that:
according to the method and the device, the original background image in the target page is adjusted to obtain the target background image, then the adjustment strategy of the foreground object is determined according to the brightness value of the associated background area of each foreground object in the target background image, the target object corresponding to the foreground object is generated according to the adjustment strategy, finally the target page is generated according to the target background image and the target object, when the foreground object is adjusted, the adjustment strategy is determined according to the associated background area, the situation that the color of the adjusted background image is the same as or similar to that of the foreground object is avoided, and the display effect of the page is improved.
Drawings
In order to describe the technical solutions in the embodiments of the present application more clearly, the drawings required for the embodiments or the prior-art description are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a block diagram of a partial structure of a mobile phone provided in an embodiment of the present application;
fig. 2 is a schematic diagram of a software structure of a mobile phone according to an embodiment of the present application;
FIG. 3 is a flowchart of an implementation of a method for displaying a page according to a first embodiment of the present application;
FIG. 4 is a schematic diagram of a dark display mode according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a dark display mode according to an embodiment of the present application;
FIG. 6 is a schematic illustration of an identification of an associated background region provided by an embodiment of the present application;
FIG. 7 is a schematic illustration of an identification of an associated background region provided by another embodiment of the present application;
fig. 8 is a flowchart illustrating a specific implementation of a page displaying method S302 according to a second embodiment of the present application
FIG. 9 is an interaction diagram of processing units in the process of page display provided by an embodiment of the present application;
FIG. 10 is a schematic flowchart of page display based on a dark display mode according to an embodiment of the present application;
fig. 11 is a flowchart illustrating a specific implementation of a method S801 for displaying a page according to a third embodiment of the present application;
fig. 12 is a schematic diagram illustrating a partition of a candidate background area according to an embodiment of the present application;
fig. 13 is a flowchart illustrating a detailed implementation of a method S802 for displaying a page according to a fourth embodiment of the present application;
FIG. 14 is a schematic diagram of a selection of target column and row numbers provided by an embodiment of the present application;
fig. 15 is a flowchart illustrating a specific implementation of a method S304 for page display according to a fifth embodiment of the present application;
fig. 16 is a flowchart illustrating a specific implementation of a method for displaying a page according to a sixth embodiment of the present application;
fig. 17 is a flowchart illustrating a detailed implementation of a method S302 for displaying a page according to a seventh embodiment of the present application;
fig. 18 is a flowchart illustrating a detailed implementation of a method S303 for displaying a page according to an eighth embodiment of the present application;
FIG. 19 is a schematic diagram illustrating the generation of a target object according to an embodiment of the present application;
FIG. 20 is a block diagram of a page display device according to an embodiment of the present application;
fig. 21 is a schematic diagram of a terminal device according to another embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when," "upon," "in response to determining," or "in response to detecting." Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining," "in response to determining," "upon detecting [the described condition or event]," or "in response to detecting [the described condition or event]."
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The page display method provided by the embodiment of the application can be applied to mobile phones, tablet computers, wearable devices, vehicle-mounted devices, Augmented Reality (AR)/Virtual Reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, Personal Digital Assistants (PDA) and other terminal devices, and can also be applied to databases, servers and service response systems based on terminal artificial intelligence, and the embodiment of the application does not limit the specific types of the terminal devices at all.
For example, the terminal device may be a station (ST) in a WLAN, a cellular phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a Wireless Local Loop (WLL) station, a Personal Digital Assistant (PDA) device, a handheld device with wireless communication capability, a computing device or another processing device connected to a wireless modem, a computer, a laptop, a handheld communication device, a handheld computing device, or another device for communicating on a wireless system and a next-generation communication system, such as a mobile terminal in a 5G network or a mobile terminal in a future-evolution Public Land Mobile Network (PLMN), and so on.
By way of example and not limitation, when the terminal device is a wearable device, the wearable device may be a generic term for devices designed for daily wear by applying wearable technology, such as glasses, gloves, watches, clothes and shoes with a shooting function. A wearable device is a portable device that is worn directly on the body or integrated into the user's clothes or accessories; it is attached to the user's body and is used for recording images while the user moves, or for acquiring environmental images according to shooting instructions initiated by the user. A wearable device is not only a hardware device, but also realizes powerful functions through software support, data interaction and cloud interaction. In a broad sense, wearable smart devices include full-featured, large-sized devices that can realize complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, and devices that focus only on a certain type of application function and need to be used together with other devices such as smartphones, such as smart watches with display screens, smart bracelets, and the like.
Take the terminal device as a mobile phone as an example. Fig. 1 is a block diagram illustrating a partial structure of a mobile phone according to an embodiment of the present disclosure. Referring to fig. 1, the cellular phone includes: radio Frequency (RF) circuit 110, memory 120, input unit 130, display unit 140, sensor 150, audio circuit 160, near field communication module 170, processor 180, and power supply 190. Those skilled in the art will appreciate that the handset configuration shown in fig. 1 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 1:
the RF circuit 110 may be used for receiving and transmitting signals during information transmission and reception or during a call, and in particular, receives downlink information from a base station and then processes the received downlink information to the processor 180; in addition, the data for designing uplink is transmitted to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one Amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuitry 110 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE)), e-mail, Short Message Service (SMS), and the like, and receives page data about a target page transmitted by a server through the RF circuit 110 and generates the target page according to the page data.
The memory 120 may be used to store software programs and modules, and the processor 180 executes various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 120, for example, storing page data of a target page in a cache area of the memory 120, adjusting the page data according to the current display mode, and then generating the target page. The memory 120 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone, and the like. Further, the memory 120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 130 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone 100. Specifically, the input unit 130 may include a touch panel 131 and other input devices 132. The touch panel 131, also referred to as a touch screen, may collect touch operations of a user (e.g., operations of the user on or near the touch panel 131 using any suitable object or accessory such as a finger or a stylus pen) thereon or nearby, and drive the corresponding connection device according to a preset program.
The display unit 140 may be used to display information input by the user or information provided to the user and various menus of the mobile phone, such as outputting an adjusted correction image. The Display unit 140 may include a Display panel 141, and optionally, the Display panel 141 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 131 can cover the display panel 141, and when the touch panel 131 detects a touch operation on or near the touch panel 131, the touch operation is transmitted to the processor 180 to determine the type of the touch event, and then the processor 180 provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although the touch panel 131 and the display panel 141 are shown as two separate components in fig. 1 to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 131 and the display panel 141 may be integrated to implement the input and output functions of the mobile phone.
The handset 100 may also include at least one sensor 150, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 141 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 141 and/or the backlight when the mobile phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The handset 100 may also include a camera 160. Optionally, the position of the camera on the mobile phone 100 may be front-located or rear-located, which is not limited in this embodiment of the application.
Optionally, the mobile phone 100 may include a single camera, a dual camera, or a triple camera, which is not limited in this embodiment.
For example, the cell phone 100 may include three cameras, one being a main camera, one being a wide camera, and one being a tele camera.
Optionally, when the mobile phone 100 includes a plurality of cameras, the plurality of cameras may be all front-mounted, all rear-mounted, or a part of the cameras front-mounted and another part of the cameras rear-mounted, which is not limited in this embodiment of the present application.
The terminal device may receive the device information page sent by the other device through the near field communication module 170, for example, the near field communication module 170 is integrated with a bluetooth communication module, establishes communication connection with the other mobile phone through the bluetooth communication module, receives device information fed back by the other mobile phone, and generates a device information page corresponding to the other mobile phone. Although fig. 1 shows the near field communication module 170, it is understood that it does not belong to the essential constitution of the cellular phone 100, and may be omitted entirely as needed within the scope not changing the essence of the application.
The processor 180 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby integrally monitoring the mobile phone. Alternatively, processor 180 may include one or more processing units; preferably, the processor 180 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 180.
The handset 100 also includes a power supply 190 (e.g., a battery) for powering the various components, which may preferably be logically connected to the processor 180 via a power management system, so as to manage charging, discharging, and power consumption via the power management system.
The handset 100 also includes an audio circuit, a speaker, and a microphone, which provide an audio interface between the user and the handset. The audio circuit can transmit the electrical signal converted from the received audio data to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit and converted into audio data; the audio data is then processed by the processor 180 and sent, for example, to another mobile phone via the RF circuit 110, or output to the memory 120 for further processing. For example, the terminal device may collect the user's voice signal through the audio circuit, set the display mode of the mobile phone based on the voice signal, adjust the target page displayed in the foreground based on the currently configured display mode, and output the target page corresponding to the current display mode.
Fig. 2 is a schematic diagram of a software structure of the mobile phone 100 according to the embodiment of the present application. Taking the operating system of the mobile phone 100 as an Android system as an example, in some embodiments, the Android system is divided into four layers, which are an application layer, an application Framework (FWK) layer, a system layer and a hardware abstraction layer, and the layers communicate with each other through a software interface.
As shown in fig. 2, the application layer may be a series of application packages, and the application packages may include short messages, calendars, cameras, videos, navigation, gallery, calls, and other applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application programs of the application layer. The application framework layer may include some predefined functions, such as functions for receiving events sent by the application framework layer.
As shown in FIG. 2, the application framework layers may include a window manager, a resource manager, and a notification manager, among others.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a short dwell, and does not require user interaction. Such as a notification manager used to inform download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scroll bar text at the top status bar of the system, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. Such as prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, flashing an indicator light, etc.
The application framework layer may further include:
a view system that includes visual controls, such as controls for displaying text and controls for displaying pictures. The view system may be used to build applications. A display interface may be composed of one or more views. For example, a display interface including a short-message notification icon may include a view for displaying text and a view for displaying pictures. The page adjustment in this application may run at the application framework layer: text-type objects in the target page are adjusted through the text control, the background image in the target page is adjusted through the picture display control, and all the adjusted page data are packaged to generate the target page.
The phone manager is used to provide the communication functions of the handset 100. Such as management of call status (including on, off, etc.).
The system layer may include a plurality of functional modules. For example: a sensor service module, a physical state identification module, a three-dimensional graphics processing library (such as OpenGL ES), and the like.
The sensor service module is used for monitoring sensor data uploaded by various sensors in a hardware layer and determining the physical state of the mobile phone 100;
the physical state recognition module is used for analyzing and recognizing user gestures, human faces and the like;
the three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The system layer may further include:
the surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used still image files, video format playback and recording, and audio, among others. The media library may support a variety of audio-video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, and the like.
The hardware abstraction layer is a layer between hardware and software. The hardware abstraction layer may include a display driver, a camera driver, a sensor driver, a microphone driver, and the like, and is used for driving related hardware of the hardware layer, such as a display screen, a camera, a sensor, a microphone, and the like.
In the embodiment of the present application, the execution subject of the flow is a device in which a program for page display is installed. As an example and not by way of limitation, the device of the page display program may specifically be a terminal device, and the terminal device may be a smart phone, a tablet computer, a notebook computer, and the like used by a user, and when receiving a display request, adjust a target page corresponding to the display request according to a current display mode, and enable the displayed target page to match the display mode. Fig. 3 shows a flowchart of an implementation of the method for displaying a page provided in the first embodiment of the present application, which is detailed as follows:
in S301, the original background image in the target page is adjusted to a target background image corresponding to the current display mode.
In this embodiment, the terminal device is configured with a display screen, and may respond to a display request initiated by a user through the display screen, for example, displaying an operation interface in an application program, or displaying an image stored in a database through an album application. Before the display operation is executed, the terminal device may adjust the content of the target page to be displayed through the page display method provided in this embodiment, so that the output target page is matched with the current display mode, thereby improving the consistency between the whole page and the display mode, and improving the display effect of the terminal device.
In a possible implementation manner, the display mode may be a dark color display mode, which may also be referred to as a "night mode": when the display environment is dark, or according to the actual display requirement of the user, the display objects such as controls and pictures in the display interface of the terminal device are set to a display mode with a lower overall brightness value. It should be noted that the dark color display mode does not adjust the brightness values of all objects in the page to low values; rather, the overall visual effect of the page belongs to a low-brightness mode, that is, the average brightness value of the page is lower than a preset brightness threshold. For example, the background image in the page may be set to a low-brightness image or a black background; in order to keep the text, controls and the like in the page visible to the user, their color may be configured to be white or another color with higher brightness. Since the display area of these objects is relatively small compared with the background image, the overall visual effect of the page is still a low-brightness display that matches the current display mode.
Illustratively, fig. 4 shows a schematic diagram of a dark display mode provided by an embodiment of the present application. Wherein, fig. 4 (a) shows the display effect of a certain target page before the adjustment to the deep color display mode; fig. 4 (b) shows the display effect of the target page after the adjustment to the dark color display mode. As shown in fig. 4, the background image of the target page is a monochrome image, which is white and belongs to a color with high brightness, at this time, the background image needs to be subjected to reverse color processing to be adjusted to a black background, and since the color of other foreground objects in the original page is black, in order to make the content visible, the foreground image needs to be correspondingly subjected to reverse color processing to adjust the color of the foreground object to white, so that the tone of the whole page is a dark tone.
Illustratively, fig. 5 shows a schematic diagram of a dark color display mode provided by an embodiment of the present application. Fig. 5 (a) shows the display effect of a target page before adjustment to the dark color display mode; fig. 5 (b) shows the display effect of the target page after adjustment to the dark color display mode. Referring to fig. 5, the background image of the target page is a multi-color mixed image. A multi-color image cannot be darkened simply by inverting its colors, because reverse color processing may spoil the display effect of the original image or even make it unable to express its original content; therefore the dark color processing mainly reduces the contrast and the brightness value of the background image. In the prior art, each display object in the page is processed independently without considering the relevance between objects, and content such as text and icons is adjusted by color inversion, so the inverted text and icons may end up with the same or a similar color as the background image after dark color processing, which reduces their visibility and affects the overall display effect. In other words, because the existing page display technology processes each display object independently while different display objects in a page are displayed overlapped, the same or similar colors may occur when they are displayed together. To avoid this, when adjusting the page, the terminal device first processes the background image, then determines a corresponding adjustment strategy according to the associated background area of each foreground object in the background image and performs a color adjustment matched with the display mode, so that the adjusted foreground object matches the current display mode while the contrast between the foreground object and the background image is ensured, and parts of the generated target page are prevented from becoming invisible due to identical or similar colors.
In one possible implementation, the display mode may further include: a strong contrast display mode, a high brightness mode and the like, wherein if the display mode is the strong contrast mode, the contrast between the background image and the foreground object is required to be strong, and the terminal device can adjust the saturation, contrast, sharpness and other parameter values of the background image and the foreground object in the subsequent operation so as to enable the contrast of the page to be strong and to be matched with the display mode; correspondingly, when the display mode is the high-brightness mode, the terminal device can improve the proportion of the high-brightness color area, reduce the proportion of the shadow color area and improve the brightness value of the whole page by adjusting the high-brightness color areas of the background image and the foreground object. The terminal device can determine a page adjustment strategy according to different display modes, and adjust each display object in the page based on the page adjustment strategy so as to enable the adjusted target page to be matched with the current display mode.
In this embodiment, the terminal device may be configured with a default display mode, and the original page data of each target page may be data generated based on the default display mode, that is, the target page is generated according to the original page data in the default display mode, without adjusting the original page data. The default display mode may be referred to as a regular display mode. If the terminal device detects that the current display mode is the default display mode, the terminal device can output and display the received page data of the target page without adjusting parameters such as color or brightness of a display object in the page; otherwise, if the terminal device detects that the current display mode is the non-default display mode, the terminal device obtains a page adjustment strategy corresponding to the current display mode, adjusts the display object in the target page through the page adjustment strategy, and executes the relevant operations from S301 to S304.
In a possible implementation manner, when detecting that a display mode change instruction is input by a user, the terminal device detects whether a page displayed on a current foreground is matched with a changed display mode, and if so, does not need to adjust the currently displayed page; otherwise, if the currently displayed page does not match the changed display mode, the currently displayed page is taken as the target page, and the related operations from S301 to S304 are executed.
In a possible implementation manner, after detecting that the display mode is changed, the terminal device may acquire the application programs currently running in the background, and acquire the operation pages corresponding to the application programs. And matching each operation page with the current display mode, and if any operation page is not matched with the current display mode, adjusting the page in the modes from S301 to S304.
In a possible implementation manner, the terminal device may further store, in a local memory, page data of a target page in different display modes, or a server corresponding to the target page may configure corresponding page data according to the display modes, and the terminal device may obtain, from the local memory or the page server, page data matched with a current display mode and output the target page based on the obtained page data; if the local memory or the page server does not store the page data matched with the current display mode, adjusting the display object in the target page through the operations from S301 to S304, so as to generate the target page corresponding to the display mode.
In this embodiment, the target page may include a plurality of display objects, namely background images and foreground objects. The number of background images may be one or more; that is, the target page may be obtained by splicing multiple different background images, and the image size of each background image may differ, may match the display size of the terminal device, or may be adjusted according to the actual display content, which is not limited herein. The background images may also be superimposed on each other, so that the background of the target page is formed by superimposing a plurality of background images; different background images can be configured with corresponding transparencies so that the bottom-layer background image and the upper-layer background images are superimposed to form the page background of the target page. In this case, when adjusting the original background images to target background images matched with the current display mode, the terminal device may first adjust the original background image of the bottommost layer, and then determine the adjustment policy of the original background image of the adjacent upper layer based on the luminance value of the target background image of the bottommost layer: if the luminance difference between the luminance value of the target background image of the bottommost layer and the luminance value of the original background image of the adjacent upper layer is smaller than a preset adjustment threshold, the original background image of the upper layer is processed by a first adjustment algorithm (for example, reverse color processing); conversely, if the luminance difference is greater than or equal to the preset adjustment threshold, the original background image of the upper layer is processed by a second adjustment algorithm (for example, brightness reduction).
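A minimal sketch of this layered strategy is given below, assuming each background layer can be summarized by an average luminance and that the two adjustment algorithms can be represented by callbacks; the threshold value and all names are invented for illustration.

```python
from typing import Callable, List

def adjust_layered_background(layers: List[dict],
                              darken_bottom: Callable[[dict], dict],
                              invert_layer: Callable[[dict], dict],
                              dim_layer: Callable[[dict], dict],
                              adjust_threshold: float = 64.0) -> List[dict]:
    """Adjust stacked background layers from the bottom up.

    Each layer is a dict with at least a 'luminance' key (0..255). The bottommost
    layer is adjusted first; every upper layer is then inverted or dimmed depending
    on how far its original luminance is from the already-adjusted layer below it."""
    adjusted = [darken_bottom(layers[0])]
    for upper in layers[1:]:
        below = adjusted[-1]
        if abs(below["luminance"] - upper["luminance"]) < adjust_threshold:
            adjusted.append(invert_layer(upper))   # first adjustment algorithm
        else:
            adjusted.append(dim_layer(upper))      # second adjustment algorithm
    return adjusted

# Toy usage with stand-in adjustment callbacks
layers = [{"luminance": 240.0}, {"luminance": 200.0}, {"luminance": 40.0}]
adjusted = adjust_layered_background(
    layers,
    darken_bottom=lambda l: {"luminance": 255.0 - l["luminance"]},
    invert_layer=lambda l: {"luminance": 255.0 - l["luminance"]},
    dim_layer=lambda l: {"luminance": l["luminance"] * 0.5},
)
```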
In this embodiment, different display modes correspond to different adjustment algorithms, and the terminal device can obtain the adjustment algorithm associated with the current display mode. An adjustment algorithm may be configured with different adjustment models for different display object types, for example, a first adjustment model for background images, a second adjustment model for text objects, and a third adjustment model for icon objects. For display objects of the same type, different adjustment models may further be distinguished according to object attributes: for example, a first adjustment model is adopted when the background image is a monochrome background image, and a second adjustment model is adopted for a multi-color background image. A corresponding adjustment model may also be determined according to the original color of the original background image, for example, the first adjustment model is adopted if the average pixel value of the original background image is detected to be between 125 and 255, and the second adjustment model is adopted if the average pixel value is detected to be between 0 and 125. The specific adjustment manner can thus be determined across several dimensions: the display mode, the image level, the object type and the chromaticity of the image.
For example, suppose the current display mode is the dark color display mode and a certain original background image is a monochrome black image. The terminal device acquires the page adjustment strategy corresponding to the dark color display mode; this strategy is divided into two different adjustment algorithms for the background image and the foreground objects, namely a first adjustment algorithm and a second adjustment algorithm. When detecting that the object currently to be adjusted is a background image, the terminal device adjusts the original background image with the first adjustment algorithm. The first adjustment algorithm identifies that the original background image is a monochrome image whose color is black; in that case the original background image is not adjusted, and only the backlight brightness of the display screen of the terminal device needs to be reduced.
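To make the model-selection idea concrete, here is a hypothetical dispatch function in Python; the cut-off of 125 on the average pixel value and all model names merely restate the example above and are not definitions from the patent.

```python
def pick_background_model(is_monochrome: bool, average_pixel: float) -> str:
    """Choose an adjustment model for a background image from its attributes."""
    if is_monochrome:
        return "model_monochrome"        # e.g. a black background may be left unchanged
    if 125 < average_pixel <= 255:
        return "model_1_bright"          # bright multi-color background
    return "model_2_dark"                # average pixel value in [0, 125]

def pick_model(display_mode: str, object_type: str, **attributes) -> str:
    """Dispatch on display mode and display-object type; only the dark mode is sketched."""
    if display_mode != "dark":
        return "default_model"
    if object_type == "background":
        return pick_background_model(attributes.get("is_monochrome", False),
                                     attributes.get("average_pixel", 0.0))
    if object_type == "text":
        return "model_text"
    return "model_icon"

print(pick_model("dark", "background", is_monochrome=False, average_pixel=180.0))   # model_1_bright
```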
In S302, associated background regions of the foreground objects in the target page in the target background image are identified.
In this embodiment, the target page includes a background image and a foreground object. The foreground object is specifically a display object displayed on an upper layer of the background image, and the foreground object includes but is not limited to: foreground icons, characters, controls, touch animation, popups and other display objects.
In this embodiment, the object information of each foreground object may include the center coordinates of the foreground object within the target page and the object size of the foreground object. The terminal device may identify the associated background area corresponding to the foreground object according to these two parameters, i.e., the display position and the object size.
In one possible implementation, the associated background area may be determined as follows: if the target page contains multiple background images, the terminal device, according to the background area of each target background image, takes the target background image whose background area contains the center coordinates of the foreground object as the target background image associated with the foreground object; the terminal device then determines the coverage area of the foreground object in that target background image according to the object size and the center coordinates of the foreground object, and identifies this coverage area as the associated background area of the foreground object in the target background image.
Exemplarily, fig. 6 shows an identification diagram of an associated background area provided in an embodiment of the present application. Referring to fig. 6, the target page includes 3 background images, namely background image 1, background image 2 and background image 3, and further includes one foreground object, icon 1. The terminal device determines the center coordinates of the foreground object by reading the object information of icon 1; these coordinates fall into the background area of background image 1, so background image 1 is identified as the associated background image of icon 1. The terminal device then extracts the corresponding associated background area in the associated background image according to the object size of icon 1.
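A possible implementation of this lookup, with invented helper names: each background image is described by its bounding rectangle, the image whose rectangle contains the foreground object's center becomes the associated background image, and the coverage rectangle of the object inside it becomes the associated background region.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Rect:
    left: float
    top: float
    width: float
    height: float

    def contains(self, x: float, y: float) -> bool:
        return (self.left <= x < self.left + self.width and
                self.top <= y < self.top + self.height)

def associated_background_region(backgrounds: List[Rect],
                                 fg_center: Tuple[float, float],
                                 fg_size: Tuple[float, float]) -> Optional[Tuple[int, Rect]]:
    """Return (index of the associated background image, coverage rectangle of the object)."""
    cx, cy = fg_center
    w, h = fg_size
    for i, bg in enumerate(backgrounds):
        if bg.contains(cx, cy):
            # region covered by the foreground object (not clipped to the image bounds)
            return i, Rect(cx - w / 2, cy - h / 2, w, h)
    return None
```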
In a possible implementation manner, the target page includes a plurality of target background images, and different target background images can be overlapped with each other by adjusting the transparency, so as to form a page background of the target page. In this case, the terminal device may perform visible layer merging on all target background images of the target page, that is, merge a plurality of target background images in different layers according to the transparency of each target background image to generate a merged background image, and then determine an associated background region corresponding to the foreground object from the merged background image.
Exemplarily, fig. 7 shows an identification diagram of an associated background area provided in another embodiment of the present application. Referring to fig. 7, the target page includes background image 1 and background image 2; background image 1 is located at the display level below background image 2, and the foreground object is located at the display level above background image 2. In this case, the target background images corresponding to the foreground object are background image 1 and background image 2. The terminal device can therefore merge background image 1 and background image 2 to obtain a merged background image, and determine the corresponding associated background area in the merged background image according to the center coordinates and the object size of the foreground object.
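The merging of overlapping background layers can be pictured as ordinary alpha compositing; the sketch below blends per-pixel RGB lists from bottom to top using each layer's transparency, which is one reasonable reading of "visible layer merging" rather than the patent's exact method.

```python
from typing import List, Sequence, Tuple

Pixel = Tuple[float, float, float]

def merge_layers(layers: List[Tuple[Sequence[Pixel], float]]) -> List[Pixel]:
    """Composite same-sized background layers from bottom to top.

    `layers` is a list of (pixels, alpha) pairs, bottom layer first; every layer's
    pixel list has the same length. Each upper layer is blended over the running
    result with the classic "over" rule: out = upper * alpha + lower * (1 - alpha)."""
    merged = list(layers[0][0])                     # start from the bottom layer
    for pixels, alpha in layers[1:]:
        merged = [tuple(u * alpha + l * (1.0 - alpha) for u, l in zip(up, low))
                  for up, low in zip(pixels, merged)]
    return merged

# Toy example: a two-pixel white bottom layer under a half-transparent black layer
bottom = ([(255, 255, 255), (255, 255, 255)], 1.0)
top = ([(0, 0, 0), (0, 0, 0)], 0.5)
print(merge_layers([bottom, top]))   # [(127.5, 127.5, 127.5), (127.5, 127.5, 127.5)]
```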
In S303, the foreground object is adjusted to be a target object corresponding to the display mode according to the background pixel value of the associated background area.
In this embodiment, after determining the associated background area, the terminal device may calculate the background pixel value corresponding to the associated background area. The background pixel value may be the pixel average of the associated background area; in this case, the terminal device accumulates the pixel values of all pixel points in the associated background area, calculates the pixel average corresponding to the associated background area from the accumulated value, and uses this average as the background pixel value of the associated background area. Optionally, the terminal device may determine a weighting weight for each pixel point according to the distance between the pixel point and the center coordinates of the associated background area, where a smaller distance from the center corresponds to a larger weight, perform weighted accumulation based on the weights and pixel values to obtain a weighted accumulated value, average it, and use the weighted average as the background pixel value of the associated background area.
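A sketch of the weighted variant is given below, assuming weights inversely proportional to the distance from the region center; the text only requires that closer pixels weigh more, so the 1/(1 + d) form is an arbitrary choice.

```python
import math
from typing import List, Tuple

def weighted_background_value(pixels: List[Tuple[float, Tuple[float, float]]],
                              center: Tuple[float, float]) -> float:
    """Weighted mean pixel value of an associated background region.

    `pixels` is a list of (value, (x, y)) pairs; pixels closer to `center`
    receive larger weights."""
    total, weight_sum = 0.0, 0.0
    for value, (x, y) in pixels:
        d = math.hypot(x - center[0], y - center[1])
        w = 1.0 / (1.0 + d)          # closer to the center -> larger weight
        total += w * value
        weight_sum += w
    return total / weight_sum

# Toy region: a center pixel of value 200 and two direct neighbours of value 100
region = [(200.0, (5.0, 5.0)), (100.0, (4.0, 5.0)), (100.0, (6.0, 5.0))]
print(round(weighted_background_value(region, (5.0, 5.0)), 1))   # 150.0
```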
In one possible implementation manner, the terminal device may perform downsampling on the target background image, that is, divide the target background image into a plurality of grid regions through a preset grid, and use the pixel value of the central coordinate point of each grid region as the pixel value of that grid region, so as to generate a downsampled image of the target background image. The terminal device then determines the background pixel value of the associated background region according to the pixel values of the grid regions corresponding to the associated background region in the downsampled image. If the display area of the foreground object covers two or more grid regions, the covered grid regions are used as the associated background area of the foreground object, or the grid region in which the center coordinates of the foreground object are located is used as the associated background area corresponding to the foreground object. If the associated background area corresponds to a plurality of grid regions, the background pixel value of the associated background area may be determined according to the average value of the pixel values of these grid regions.
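A minimal sketch of this grid downsampling, assuming a row-major ARGB buffer and a square cell size; all names here are illustrative only:

```kotlin
// Down-samples a background image by keeping only the centre pixel of each
// grid cell, as described above. Cells at the right/bottom edge are clamped.
fun downsampleByGrid(pixels: IntArray, width: Int, height: Int, cell: Int): IntArray {
    val cols = (width + cell - 1) / cell
    val rows = (height + cell - 1) / cell
    val out = IntArray(cols * rows)
    for (row in 0 until rows) for (col in 0 until cols) {
        // Centre coordinate of this grid cell, clamped to the image bounds.
        val cx = minOf(col * cell + cell / 2, width - 1)
        val cy = minOf(row * cell + cell / 2, height - 1)
        out[row * cols + col] = pixels[cy * width + cx]
    }
    return out
}
```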
In this embodiment, the manner in which the terminal device adjusts the foreground object may be: the terminal equipment acquires a foreground pixel value of the foreground object and adjusts the foreground object according to the foreground pixel value, the background pixel value and the color tone corresponding to the current display mode.
In a possible implementation manner, if the display mode is a dark color display mode, for example a night display mode, and the terminal device detects that the pixel difference value between the foreground pixel value and the background pixel value is smaller than a preset contrast threshold, the terminal device may perform reverse color processing on the foreground object. On the contrary, if the pixel difference value between the foreground pixel value and the background pixel value is greater than or equal to the preset contrast threshold, the foreground object already has high visibility when superimposed on the target background image before adjustment; if reverse color processing were still performed, the visibility of the foreground object superimposed on the target background area would be reduced. Therefore, the terminal device does not perform reverse color processing on the foreground object, but reduces the brightness value of the foreground object so as to match the dark color display mode.
In a possible implementation manner, if the display mode is a high-contrast mode and the terminal device detects that the pixel difference value between the foreground pixel value and the background pixel value is smaller than the preset contrast threshold, reverse color processing is performed on the foreground object, which improves the contrast between the foreground object and the target background image. On the contrary, if the pixel difference value between the foreground pixel value and the background pixel value is greater than or equal to the preset contrast threshold, the foreground object and the target background image already have a high contrast before adjustment; in this case, the contrast of the foreground object can be increased without performing reverse color processing, so as to obtain the target object.
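As a rough sketch of this decision logic, the code below compares a single representative ARGB value of the foreground with that of the associated background; the per-channel difference metric and the threshold value of 96 are assumptions, since the application does not fix either.

```kotlin
import kotlin.math.abs

enum class Adjustment { INVERT, DIM, BOOST_CONTRAST }

// Chooses an adjustment for a foreground object given its representative pixel,
// the background pixel of its associated background area, and the display mode.
fun chooseAdjustment(foreground: Int, background: Int, darkMode: Boolean, threshold: Int = 96): Adjustment {
    val diff = abs(((foreground shr 16) and 0xFF) - ((background shr 16) and 0xFF)) +
               abs(((foreground shr 8) and 0xFF) - ((background shr 8) and 0xFF)) +
               abs((foreground and 0xFF) - (background and 0xFF))
    return when {
        diff < threshold -> Adjustment.INVERT      // low contrast: reverse-colour the foreground
        darkMode -> Adjustment.DIM                 // contrast already high in dark mode: lower brightness only
        else -> Adjustment.BOOST_CONTRAST          // high-contrast mode: raise contrast instead of inverting
    }
}

// Reverse-colour processing: invert each RGB channel, keep the alpha channel.
fun invert(argb: Int): Int = (argb and 0xFF000000.toInt()) or (argb.inv() and 0x00FFFFFF)
```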
In this embodiment, a plurality of foreground objects may be included in the target page, the operations of S302 and S303 are performed on each foreground object to generate a target object related to each foreground object, and if it is detected that all the foreground objects are adjusted, the operation of S304 may be performed at this time.
In S304, the target page is generated according to all the target objects and the target background image.
In this embodiment, the terminal device may package the target object corresponding to each adjusted foreground object and the target background image corresponding to the original background image, and generate a target page matched with the current display mode. In this case, the terminal device records the display coordinates and the object level of each foreground object in the target page, and similarly, the terminal device also records the display coordinates and the object level of each original background image in the target page, and based on the above parameters, the terminal device can determine the display area of each display object (i.e., the target object and the target background image) in the target page, and perform the splicing and the superimposition display, thereby obtaining the target page.
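Purely as an illustration of the packaging step, the sketch below orders display objects by object level so that background images are drawn first and target objects are superimposed on top; the data shape and the draw callback are assumptions, not structures defined by this application.

```kotlin
data class DisplayObject(val level: Int, val x: Int, val y: Int, val name: String)

// Draws all display objects in ascending level order so that background images
// come first and target objects are superimposed on top of them.
fun composeTargetPage(objects: List<DisplayObject>, draw: (DisplayObject) -> Unit) {
    objects.sortedBy { it.level }.forEach(draw)
}

fun main() {
    val page = listOf(
        DisplayObject(0, 0, 0, "target background image"),
        DisplayObject(1, 40, 80, "icon 1"),
        DisplayObject(1, 40, 160, "text 1")
    )
    composeTargetPage(page) { println("draw ${it.name} at (${it.x}, ${it.y})") }
}
```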
As can be seen from the above, in the page display method provided in this embodiment of the present application, the original background image in the target page is adjusted to obtain the target background image, then the adjustment policy of the foreground object is determined according to the brightness value of the associated background region of each foreground object in the target background image, the target object corresponding to the foreground object is generated according to the adjustment policy, and finally the target page is generated according to the target background image and the target object.
Fig. 8 is a flowchart illustrating a specific implementation of step S302 of a method for page display according to a second embodiment of the present application. Referring to fig. 8, with respect to the embodiment described in fig. 3, in the method for displaying a page provided by this embodiment, S302 includes S801 to S802, which are specifically described as follows:
further, the identifying an associated background region of each foreground object in the target page in the target background image includes:
in S801, the target background image is divided into a plurality of candidate background regions.
In this embodiment, the terminal device may perform region division on the target background image, so as to divide a target background image with a larger area into a plurality of candidate background regions with smaller areas, which facilitates determination of the associated background region in the subsequent foreground object adjustment process. If a background area matching the object shape were intercepted from the target background image as the associated background area of each foreground object, the calculation amount would be large: the contour information of the foreground object would need to be obtained, area framing would need to be performed on the target background image according to the contour information, and the extracted candidate background areas would differ in size. When the number of foreground objects is large and their contours are complex, identifying the associated background areas in this way consumes considerable calculation resources, which reduces the generation efficiency of the target page, prolongs the time consumed by page generation, and degrades the use experience of the user. Therefore, the terminal device may divide the target background image into a plurality of candidate background areas according to a preset division rule, so that the associated background area can be selected from the candidate background areas without intercepting the target background image according to the object outline of the foreground object, which reduces the operation amount of the terminal device and improves the generation efficiency of the target page.
In a possible implementation manner, the terminal device may generate grid lines in units of a preset area shape, and divide the target background image based on the grid lines, so as to obtain a plurality of candidate background areas. The area shape may be a polygon such as a rectangle, a square, or a triangle.
In a possible implementation manner, the terminal device may identify the object size of each foreground object in the target page, select the smallest object size as the target size, configure a grid based on the target size, and divide the target background image into a plurality of candidate background regions based on the grid. Because the grid is configured according to the minimum object size of the foreground objects, each foreground object can be matched with a candidate background area close to its own shape and size, which avoids a plurality of foreground objects corresponding to the same associated background area because the candidate background areas are too large, while still keeping the number of candidate background areas as small as possible and improving the selection efficiency. Region division of the image is in effect a down-sampling process: the higher the down-sampling ratio, the greater the improvement in operation speed. Dividing the grid according to the minimum size of the foreground objects therefore improves the operation speed while ensuring the goodness of fit between the associated background region and the foreground object, which improves the accuracy of the subsequent adjustment operation.
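A short sketch of configuring the grid from the smallest foreground object; the ForegroundObject type and the rounding-up of partial cells are assumptions for illustration only.

```kotlin
data class ForegroundObject(val width: Int, val height: Int)

// Configures the grid from the smallest foreground object, then reports how
// many candidate regions (columns x rows) the background is split into.
// Assumes the list of foreground objects is non-empty.
fun candidateGrid(objects: List<ForegroundObject>, bgWidth: Int, bgHeight: Int): Pair<Int, Int> {
    val block = objects.minOf { minOf(it.width, it.height) }   // minimum foreground size
    val cols = (bgWidth + block - 1) / block
    val rows = (bgHeight + block - 1) / block
    return cols to rows
}
```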
In S802, the associated background region corresponding to the foreground object is selected from the candidate background regions according to the center coordinates of the foreground object and the boundary coordinates of the target background image.
In this embodiment, when the terminal device configures the display position of each foreground object, the display position is the position of the foreground object relative to the display screen. However, each target background image also has its own display area; therefore, to determine the associated background area of a foreground object in the target background image, the display area of each candidate background area relative to the display screen must be considered. The terminal device therefore needs to acquire the boundary coordinates of the target background image; the display area of the target background image can be determined from any boundary coordinate together with the image size of the target background image. According to the display position of each candidate background area on the target background image, the display area corresponding to each candidate background area can then be determined, and by identifying the display area into which the center coordinates of the foreground object fall, the associated background area of the foreground object in the target background image can be determined.
In one possible implementation, the manner of identifying the associated background area may be: determining a display area of the target background image according to the boundary coordinates and the image size of the target background image, determining the display area of each candidate background area based on the position of each candidate background area, determining the area coordinates of each candidate background area based on the display area of each candidate background area, calculating the distance value between the area coordinates and the center coordinates of the foreground object, and selecting the candidate background area with the distance value smaller than a preset distance threshold value as the associated background area.
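The distance-based selection just described might look as follows; the Region type and the distance threshold are assumptions used only to illustrate the idea.

```kotlin
import kotlin.math.hypot

data class Region(val centerX: Double, val centerY: Double)

// Picks the candidate regions whose centre lies within `maxDistance` of the
// foreground object's centre, as in S802.
fun selectAssociatedRegions(
    candidates: List<Region>,
    foregroundCenterX: Double,
    foregroundCenterY: Double,
    maxDistance: Double
): List<Region> =
    candidates.filter { hypot(it.centerX - foregroundCenterX, it.centerY - foregroundCenterY) < maxDistance }
```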
Exemplarily, fig. 9 shows an interaction schematic diagram of each processing unit in the process of page display provided by an embodiment of the present application. Referring to fig. 9, the terminal device includes a display mode adjustment unit, a bitmap processing unit, a background image adjustment unit, a history layer color library, and a foreground object processing unit. The specific implementation process is as follows:
1. the display mode adjusting unit can be used for analyzing the display objects contained in the target page and sending the original background image in the target page to the bitmap processing unit for processing;
2. the display mode adjusting unit may import the foreground objects obtained by the analysis into the foreground object processing unit, where the display mode adjusting unit may determine the center coordinates and object colors of the foreground objects before importing them into the foreground object processing unit; it should be noted that steps 1 and 2 can be performed simultaneously;
3. after obtaining the original background image, the bitmap processing unit may analyze at least one boundary coordinate and the image size corresponding to the original background image, and in particular, may determine pixel information of the background image, for example, an average pixel value, a feature pixel value, and the color types contained in the background image.
4. Importing an original background image containing pixel information and coordinate information into a background image adjusting unit, adjusting the original background image according to the pixel information and an adjusting algorithm matched with a current display mode to obtain a target background image, importing the target background image into a preprocessing unit, and performing region division on the target background image through the preprocessing unit to obtain a plurality of candidate background regions;
5. after the target background image is divided, the history layer color library can store and record the color of each layer to generate a corresponding database; specifically, the history layer color library can identify the background pixel value corresponding to each candidate background area and return each background pixel value to the background image adjusting unit;
6. the background image adjusting unit can send the partitioned target background image and the background pixel values of each candidate background area to the foreground object processing unit;
7. the foreground object processing unit can adjust each foreground object according to the pixel value of each candidate background area and the corresponding display area, so as to generate the target objects;
8. the foreground object processing unit returns the generated target objects to the display mode adjusting unit, and the display mode adjusting unit combines the target objects with the target background image to generate the target page.
Illustratively, fig. 10 shows a flow diagram of page display based on a dark color display mode according to an embodiment of the present application. Referring to fig. 10, the target page includes four different display objects: the root view RootView, the background layer ImageView, the first foreground text TextView1, and the second foreground text TextView2. The terminal device imports the four display objects into the preprocessing unit and determines the corresponding preprocessing modes, where the preprocessing unit may determine the adjustment modes of the objects to be displayed in the order of their display levels. Because the RootView is at the lowest display level, the preprocessing unit first determines the adjustment mode of the RootView; the RootView is a monochromatic image, and a monochromatic image needs to be subjected to reverse color processing when adjusting to the dark color display mode. The ImageView is the display object of the second level and is a multicolor background image; when adjusting to the dark color display mode, its brightness value is reduced and no reverse color processing is performed. Finally, TextView1 and TextView2 are the display objects of the third level, and the background image of the previous level is the ImageView; if reverse color processing were performed on TextView1 and TextView2, the color of the reversed text objects would be the same as or similar to the color of the associated background region, so the terminal device changes the original reverse color strategy and adopts brightness reduction processing instead. The terminal device identifies the adjustment strategies of all display objects through the preprocessing unit, adjusts the display objects according to these strategies through the adjustment unit, outputs the target background image and the target objects, and generates the target page based on the target background image and the target objects.
In the embodiment of the application, a plurality of candidate background areas are obtained by blocking the target background image, and the associated background area is selected from the candidate background areas according to the center coordinates of each foreground object, so that the selection efficiency of the associated background area can be improved and the construction duration of the target page can be reduced.
Fig. 11 shows a flowchart of a specific implementation of step S801 of a method for page display according to a third embodiment of the present application. Referring to fig. 11, with respect to the embodiment shown in fig. 8, in the method for displaying a page provided in this embodiment, S801 includes S1101 to S1102, which are specifically described below:
further, the dividing the target background image into a plurality of candidate background regions includes:
in S1101, a minimum foreground size is acquired and used as the determined block size.
In this embodiment, when the terminal device performs region division on the target background image, the size of the basic region, that is, the above-mentioned block size, needs to be obtained. The terminal device may determine the block size by detecting the minimum size of the foreground object included in the target page, or may determine the block size according to the minimum foreground size of the industry standard.
In S1102, the target background image is divided based on the block size to obtain a plurality of candidate background regions.
In this embodiment, after the block size corresponding to each grid cell is determined, the display screen may be divided into a plurality of areas based on the block size, and a plurality of candidate background areas are obtained according to the grid cells covered by the target background image on the display screen. Exemplarily, fig. 12 shows a schematic diagram of dividing candidate background regions according to an embodiment of the present application. Referring to fig. 12, the target page includes two background images, namely a background image 1 and a background image 2, where the boundary coordinates of the background image 1 do not coincide with the boundary of the display screen. After obtaining the block size, the terminal device may divide the entire display screen based on the block size to obtain a plurality of blocks, and the background image 1 and the background image 2 can then be divided into regions according to the generated grid lines, so as to obtain a plurality of candidate background regions. The background image 1 contains 3 × 2 candidate background regions; because its boundary has a certain offset with respect to the screen boundary, the rows and columns of these candidate background regions are offset relative to the grid of the entire display screen.
In the embodiment of the application, the block size is determined (for example, according to the resolution of the display or the minimum foreground size), and the target background image is divided into regions based on the block size to obtain a plurality of candidate background regions, which improves the accuracy of the partitioning and avoids overlap between the associated background regions of different foreground objects.
Fig. 13 is a flowchart illustrating a specific implementation of step S802 of a method for page display according to a fourth embodiment of the present application. Referring to fig. 13, with respect to the embodiment described in fig. 8, in the method for displaying a page provided in this embodiment, S802 includes S1301 to S1303, which are specifically detailed as follows:
further, the selecting the associated background region corresponding to the foreground object from the candidate background regions according to the center coordinates of the foreground object and the boundary coordinates of the target background image includes:
in S1301, configuring a reference row and column number for each candidate background region according to the display position of each candidate background region in the target background image;
Column0 = (pX - SrcX0) / Sizex;  Row0 = (pY - SrcY0) / Sizey
wherein Column0 is the reference column number of the candidate background region; Row0 is the reference row number of the candidate background region; (SrcX0, SrcY0) are the boundary coordinates of the target background image; (pX, pY) are the center coordinates of the candidate background region; (Sizex, Sizey) is the region size of the candidate background region.
In this embodiment, after dividing the target background image into a plurality of candidate background regions, the terminal device may configure corresponding reference row and column numbers according to display positions of the candidate background regions in the target background image, and determine positions of the candidate background regions through grid coordinates, thereby implementing downsampling processing on the target background image.
It should be noted that, if the target page includes a plurality of foreground objects, S1301 only needs to be performed once, that is, the reference row and column number of each candidate background area is determined once. Because the target background image is gridded, only the approximate position of a foreground object needs to be determined when identifying its associated background area, which reduces the amount of calculation in selecting the associated background area and improves the selection efficiency.
In S1302, the center coordinates of the foreground object are imported into a preset row-column conversion model, and the target row and column number of the foreground object is calculated; the row-column conversion model is specifically:
Column1 = (SrcX1 + Targetx/2) / Sizex;  Row1 = (SrcY1 + Targety/2) / Sizey
wherein Column1 is the column number in the target row and column number; Row1 is the row number in the target row and column number; (SrcX1, SrcY1) are the center coordinates of the foreground object; (Sizex, Sizey) is the region size of the candidate background region; (Targetx, Targety) is the object size of the foreground object.
In this embodiment, in the process of calculating the row and column numbers, the terminal device may calculate the target row and column numbers and the reference row and column numbers in a manner of rounding down, in which case, the above conversion formula may be changed to:
Column1 = ⌊(SrcX1 + Targetx/2) / Sizex⌋;  Row1 = ⌊(SrcY1 + Targety/2) / Sizey⌋
wherein ⌊·⌋ is the rounding-down (floor) function.
For example, if the center coordinates of a foreground object are (25, 25), the object size of the foreground object is 3 × 3 pixels, and the region size of each candidate background area is 5 × 5, the formula gives (25 + 3/2)/5 = 5.3, which the rounding function rounds down to a target row and column number of 5.
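The conversion can be sketched as follows; whether the boundary offset (SrcX0, SrcY0) is subtracted before the division is exposed as an optional parameter, since the worked example above omits it, and that choice is an assumption.

```kotlin
import kotlin.math.floor

// Row/column conversion with floor rounding, matching the worked example above:
// centre (25, 25), object size 3 x 3, cell size 5 x 5 gives 5.
fun targetColumn(centerX: Double, targetW: Double, cellW: Double, offsetX: Double = 0.0): Int =
    floor((centerX - offsetX + targetW / 2) / cellW).toInt()

fun main() {
    println(targetColumn(25.0, 3.0, 5.0))   // (25 + 1.5) / 5 = 5.3 -> 5
}
```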
In this embodiment, since a certain offset exists between the target background image and the display screen, in order to determine the relative position between the foreground object and the target background image, the offset corresponding to the boundary coordinates of the target background image needs to be considered when calculating the target row number corresponding to the foreground object in the target background image. The boundary coordinate is specifically a boundary point closest to the origin of coordinates corresponding to the display screen. For example, if the origin coordinate of the display screen is the screen boundary point at the upper left corner, the boundary coordinate of the target background image is the corresponding boundary coordinate of the upper left corner of the target background image in the coordinate system of the display screen.
Fig. 14 is a schematic diagram illustrating selection of a target row and column number according to an embodiment of the present application. Referring to fig. 14, the target background image does not completely coincide with the screen boundary of the terminal device, and has a certain offset, namely (SrcX0, SrcY0). When determining the target row and column number of the foreground object relative to the target background image, the offset needs to be considered to avoid misaligned selection of the associated background area. As shown in fig. 14, if the offset is not considered, the row and column number of the foreground object relative to the entire display screen is (3, 1), while its row and column number relative to the target background image is (2, 1). Because a certain offset exists between the abscissa of the target background image and the origin of coordinates of the display screen, obtaining the associated background area based on the row and column number (3, 1) would select a dislocated background area and affect the subsequent adjustment effect.
In S1303, the candidate background region where the reference row and column number matches the target row and column number is selected as the associated background region of the foreground object.
In this embodiment, the terminal device may select, from the candidate background regions, a candidate background region in which the reference row and column number is matched with the target row and column number as the associated background region.
In the embodiment of the application, the selection efficiency of the associated background area can be improved by performing row-column coding on the candidate background area and configuring the corresponding associated background area for each foreground object according to the row-column number.
Fig. 15 is a flowchart illustrating a specific implementation of step S303 of a method for page display according to a fifth embodiment of the present application. Referring to fig. 15, with respect to the embodiment described in fig. 3, in the method for displaying a page provided by this embodiment, S303 includes S1501 to S1504, which are specifically detailed as follows:
further, if the display mode is a dark color display mode, the adjusting the foreground object to a target object corresponding to the display mode according to the background pixel value of the associated background area includes:
in S1501, the pixel value of the center coordinate of the associated background area and the transparency are acquired.
In this embodiment, in order to improve the adjustment efficiency, the terminal device may use the pixel value of the center coordinate of each candidate background region of the target background image as the characteristic pixel value of the entire candidate background region, so that during the adjustment of the foreground image, the target background image may be down-sampled into a mesh image, and the pixel value and the transparency of each mesh image may be determined according to the pixel value and the transparency of the center coordinate, so that the storage space of the target background image in the buffer region may be reduced, and the processing efficiency of the subsequent operation may be improved.
In S1502, a background luminance value of the associated background region is determined according to the pixel value of the center coordinate and the transparency.
In this embodiment, the background luminance value is related to a pixel value and also related to transparency of the target background image, and according to the size of the transparency, a color contribution ratio of the next-level image layer to the current target background image may be determined, for example, if the next-level image of the target background image is a white image and the current transparency is not 0, on the basis of the original pixel value, a proportion of white is added in a superimposing manner, so as to improve the luminance value of the entire image.
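A rough sketch of deriving a background luminance value from the centre pixel and the layer transparency; compositing over a white underlying layer, the standard luminance weights, and the brightness threshold of 128 are assumptions used only for illustration.

```kotlin
// Estimates the background luminance of a grid cell from its centre pixel and
// the layer transparency, compositing over an assumed underlying colour.
fun backgroundLuminance(
    centerArgb: Int,
    layerAlpha: Double,
    underR: Int = 255, underG: Int = 255, underB: Int = 255
): Double {
    val r = ((centerArgb shr 16) and 0xFF) * layerAlpha + underR * (1 - layerAlpha)
    val g = ((centerArgb shr 8) and 0xFF) * layerAlpha + underG * (1 - layerAlpha)
    val b = (centerArgb and 0xFF) * layerAlpha + underB * (1 - layerAlpha)
    // Standard relative-luminance weights.
    return 0.299 * r + 0.587 * g + 0.114 * b
}

// Bright background: dim the foreground rather than invert it (S1503/S1504).
fun shouldDimInsteadOfInvert(luminance: Double, threshold: Double = 128.0): Boolean =
    luminance > threshold
```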
In S1503, if the background luminance value is greater than a preset luminance threshold, the luminance of the foreground object is reduced to generate the target object.
In this embodiment, because the current display mode is a dark color display mode, the brightness of the entire screen needs to be lowered. If it is detected that the brightness value of the associated background area is greater than the preset brightness threshold, the current associated background area has a relatively high brightness, while the default color of the foreground object before adjustment is often a low-brightness color such as black; if the color were reversed, it would become a high-brightness color, so that the adjusted associated background area and the adjusted foreground object would have the same or similar colors. To avoid this, the terminal device does not reverse the color of the foreground object, but performs brightness reduction processing on the foreground object so as to generate a target object with lower brightness.
The brightness reduction processing may be performed as follows: the terminal device converts the foreground object into an image in HSV format, adjusts the value of the brightness (V) channel, and recombines the adjusted V channel with the other two channels, thereby reducing the brightness of the foreground object.
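A minimal sketch of this per-pixel brightness reduction, using java.awt.Color only as a convenient RGB/HSB helper; the scaling factor of 0.6 is an assumption.

```kotlin
import java.awt.Color

// Lowers the brightness (V) component of a pixel, leaving hue and saturation
// untouched, and preserves the original alpha channel.
fun reduceBrightness(argb: Int, factor: Float = 0.6f): Int {
    val r = (argb shr 16) and 0xFF
    val g = (argb shr 8) and 0xFF
    val b = argb and 0xFF
    val hsb = Color.RGBtoHSB(r, g, b, null)
    val rgb = Color.HSBtoRGB(hsb[0], hsb[1], hsb[2] * factor)
    return (argb and 0xFF000000.toInt()) or (rgb and 0x00FFFFFF)
}
```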
In S1504, if the background brightness value is less than or equal to the brightness threshold, performing a reverse color processing on the foreground object to generate the target object.
In this embodiment, if the terminal device detects that the luminance value of the current associated background region is less than or equal to the luminance threshold, the terminal device may perform reverse color processing on the foreground object, and since the default color of the foreground object before adjustment is often a low-luminance color such as black, the foreground object may be changed to a color with higher luminance after the reverse color processing, and at this time, a higher color contrast exists between the foreground object and the associated background, so that the foreground object is clearly visible, and the visibility of the foreground object is improved.
In the embodiment of the application, the brightness value of the associated background area is determined by identifying and acquiring the pixel value and the transparency of the central coordinate of the associated background area, and the adjustment strategy of the foreground object is determined based on the brightness value, so that the condition that the background and the foreground object have the same or similar color is avoided, and the adjustment effect is improved.
Fig. 16 is a flowchart illustrating a specific implementation of a method for displaying a page according to a sixth embodiment of the present application. Referring to fig. 16, with respect to the embodiment shown in fig. 15, after the brightness value of the associated background area is determined according to the pixel value of the center coordinate and the transparency, the method for displaying a page provided by this embodiment further includes S1601 to S1603, which are specifically detailed as follows:
further, after the determining the brightness value of the associated background area according to the pixel value of the center coordinate and the transparency, the method further includes:
in S1601, a feature brightness value corresponding to the foreground object is determined according to the pixel value of each pixel point in the foreground object.
In this embodiment, the terminal device may determine the adjustment policy of the foreground object directly according to the background brightness value, or may determine the adjustment policy according to the actual brightness value of the foreground object. In the latter case, the terminal device may calculate the pixel mean value of the foreground object according to the pixel values of the pixel points in the foreground object, and determine the characteristic brightness value of the foreground object based on the pixel mean value. In a possible implementation manner, the terminal device may select the pixel value of the center coordinate of the foreground object as the characteristic pixel value of the foreground object, and calculate the characteristic brightness value of the foreground object based on this characteristic pixel value.
In S1602, if the brightness difference between the background brightness value and the feature brightness value is greater than a preset reverse color threshold, performing brightness reduction processing on the foreground object to generate the target object.
In this embodiment, if the terminal device detects that the luminance difference between the background luminance value and the feature luminance value of the foreground object before adjustment is greater than the preset reverse color threshold, it indicates that the contrast before adjustment has met the display requirement, and it is only necessary to adjust the luminance without performing reverse color processing on the foreground object.
In S1603, if the brightness difference is less than or equal to the inverse color threshold, performing inverse color processing on the foreground object to generate the target object.
In this embodiment, if the terminal device detects that the luminance difference between the background luminance value and the feature luminance value of the foreground object before adjustment is less than or equal to the preset reverse color threshold, it indicates that the contrast difference before adjustment is small, and at this time, the contrast between the foreground object and the background image can be improved by performing reverse color processing on the foreground object, so that the reverse color processing is performed.
In the embodiment of the application, the accuracy of the adjustment operation can be improved by acquiring the characteristic brightness value of the foreground object and determining the adjustment strategy according to the difference value between the characteristic brightness value and the background brightness value.
Fig. 17 is a flowchart illustrating a specific implementation of step S302 of a method for displaying a page according to a seventh embodiment of the present application. Referring to fig. 17, with respect to any one of the embodiments shown in fig. 3, fig. 8, fig. 11, fig. 13, fig. 15, and fig. 16, in the method for displaying a page provided in this embodiment, S302 includes S1701 to S1703, which are described in detail as follows:
further, the identifying an associated background region of each foreground object in the target page in the target background image includes:
in S1701, a display level of the foreground object within the target page is determined.
In this embodiment, the target page may include a plurality of layers, each layer corresponding to a display level; the foreground objects are often at higher display levels and the background images at lower display levels, which achieves the display effect in which the background images form a base and the foreground objects cover the background images. A page may be formed by overlapping a plurality of background images to constitute the background of the entire page, that is, one foreground object may correspond to different background images at different display levels. The visibility of the foreground object, and whether the same or similar colors occur, are mainly affected by the background image at the adjacent display level. In this case, the terminal device needs to determine the background image at the display level adjacent to the foreground object, and therefore needs to identify the display level of the foreground object within the target page.
In S1702, a target background image of a level adjacent to the display level is selected as an associated background image of the foreground object in the target page.
In this embodiment, the terminal device may obtain the display level of each target background image, and select a target background image whose display level is adjacent to the display level of the foreground object as the associated background image of the foreground object.
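A small sketch of selecting the background layer at the adjacent (next lower) display level; the Layer type is an assumed data shape for illustration only.

```kotlin
data class Layer(val level: Int, val name: String)

// Picks the background layer whose display level sits directly below the
// foreground object's level, per S1702. Returns null if none exists.
fun associatedBackground(backgrounds: List<Layer>, foregroundLevel: Int): Layer? =
    backgrounds.filter { it.level < foregroundLevel }.maxByOrNull { it.level }
```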
In S1703, an associated background region of the foreground object within the associated background image is determined.
In this embodiment, after determining the associated background image of the foreground object, the terminal device may determine the corresponding associated background area from the associated background image according to the display position of the foreground object. The manner of determining the associated background area may refer to the related description of S302, or may be determined in the manners provided in the second to fourth embodiments, which are not described herein again.
In the embodiment of the application, the target background image adjacent to the foreground object level is determined as the associated background image, so that the associated background area can be accurately determined under the condition that a plurality of background images are overlapped, and the accuracy of determining the subsequent adjustment mode is improved.
Fig. 18 is a flowchart illustrating a specific implementation of step S303 of a method for displaying a page according to an eighth embodiment of the present application. Referring to fig. 18, with respect to any one of the embodiments shown in fig. 3, fig. 8, fig. 11, fig. 13, fig. 15, and fig. 16, in the method for displaying a page provided by this embodiment, S303 includes S1801 to S1802, which are specifically described as follows:
further, the adjusting the foreground object to the target object corresponding to the display mode according to the background pixel value of the associated background area includes:
in S1801, a preview object corresponding to the foreground object is generated according to the background pixel value of the associated background region and the pixel value of the foreground object.
In this embodiment, the terminal device may determine the adjustment mode of a foreground object according to the levels above and below it, and may also determine the adjustment mode according to the adjustment modes of adjacent foreground objects at the same level, so as to ensure the consistency of the overall display effect. In this case, the terminal device may generate a preview object of the foreground object according to the background pixel value of the associated background region and the pixel value of the foreground object. This is performed for all foreground objects; at this time, a preview object is not necessarily the display effect of the finally output adjusted foreground object.
In S1802, the target object corresponding to the foreground object is generated according to the pixel values of all the associated objects corresponding to the preview object and the pixel values of the preview object.
In this embodiment, the terminal device may generate a corresponding preview page according to all preview objects, and the terminal device may determine an associated object corresponding to each preview object according to the display position of each preview object. The associated object is another preview object associated with the preview object after the processing in S1801. The association relationship may be that the display positions of the two preview objects are adjacent; the object types of the two preview objects can be the same; particularly, if two preview objects are text objects and the two preview objects are located in the same sentence or paragraph, it can be recognized that the two preview objects have an association relationship.
In this embodiment, the terminal device may identify whether a difference between a pixel value of the preview object and a pixel value of the associated object is smaller than a preset associated threshold, and if so, identify that display effects between the preview object and the associated object are uniform, and take the preview object as a target object; on the contrary, if it is detected that the pixel difference value between the preview object and the associated object is greater than or equal to the associated threshold, it is recognized that the display effect between the preview object and the associated object is not uniform, at this time, the preview object and the associated object may be adjusted according to the pixel mean value of the preview object and the associated object, and the adjusted preview object is used as the target object to be displayed.
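A simplified sketch of this consistency check on grey-level values; the grey-level representation and the association threshold of 64 are assumptions for illustration only.

```kotlin
import kotlin.math.abs

// Harmonises a preview object with its associated objects: if the grey-level
// gap exceeds the association threshold, the preview value is pulled to the
// mean of the group; otherwise it is kept as-is.
fun harmonise(previewGrey: Int, associatedGreys: List<Int>, associationThreshold: Int = 64): Int {
    if (associatedGreys.isEmpty()) return previewGrey
    val mean = (associatedGreys.sum() + previewGrey) / (associatedGreys.size + 1)
    val maxGap = associatedGreys.maxOf { abs(it - previewGrey) }
    return if (maxGap < associationThreshold) previewGrey else mean
}
```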
Fig. 19 shows a schematic diagram of generation of a target object according to an embodiment of the present application. See fig. 19 (a) for a preview page generated based on respective preview objects; fig. 19 (b) shows the target page after the preview object is adjusted based on the related object. Referring to fig. 19, it can be determined that the preview page contains four foreground objects, namely, the text "front", "scene", "pair", and "image", wherein after adjustment is performed according to the associated background area, the preview object of the text "front" is adjusted to be white, and the preview objects of other texts are adjusted to be black. Since the four objects are related objects, a uniform display effect is required, and the text "front" can be adjusted to a display effect consistent with the adjustment mode of other related objects, that is, to black, so as to ensure the consistency of the display effect of the related text.
In the embodiment of the application, the adjustment strategy of the foreground objects is determined according to the associated objects through the associated objects of the foreground objects in the same level, so that the uniformity of the display effect among the associated objects in the same level is ensured, and the display effect is improved.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 20 is a block diagram illustrating a structure of a page display apparatus according to an embodiment of the present application, corresponding to the page display method according to the foregoing embodiment, and only the relevant portions of the page display apparatus according to the embodiment of the present application are shown for convenience of illustration.
Referring to fig. 20, the apparatus for page display includes:
a background image adjusting unit 201, configured to adjust an original background image in a target page to a target background image corresponding to a current display mode;
an associated background region identification unit 202, configured to identify an associated background region of each foreground object in the target page in the target background image;
a foreground object adjusting unit 203, configured to adjust the foreground object to a target object corresponding to the display mode according to a background pixel value of the associated background area;
a target page generating unit 204, configured to generate the target page according to all the target objects and the target background image.
Optionally, the associated background area identifying unit 202 includes:
a candidate background region dividing unit configured to divide the target background image into a plurality of candidate background regions;
and the candidate background area selecting unit is used for selecting the associated background area corresponding to the foreground object from the candidate background areas according to the center coordinates of the foreground object and the boundary coordinates of the target background image.
Optionally, the candidate background region dividing unit includes:
the block size determining unit is used for acquiring the minimum foreground size and taking the minimum foreground size as the determined block size;
and the grid dividing unit is used for dividing the target background image based on the block size to obtain a plurality of candidate background areas.
Optionally, the candidate background region selecting unit includes:
a reference row and column number acquisition unit, configured to configure a relevant reference row and column number for each candidate background region according to the display position of each candidate background region in the target background image;
Column0 = (pX - SrcX0) / Sizex;  Row0 = (pY - SrcY0) / Sizey
wherein Column0 is the reference column number of the candidate background region; Row0 is the reference row number of the candidate background region; (SrcX0, SrcY0) are the boundary coordinates of the target background image; (pX, pY) are the center coordinates of the candidate background region; (Sizex, Sizey) is the region size of the candidate background region;
the target row and column number acquisition unit is used for importing the center coordinates of the foreground object into a preset row and column conversion model and calculating the target row and column number of the foreground object; the line-row conversion model specifically comprises:
Column1 = (SrcX1 + Targetx/2) / Sizex;  Row1 = (SrcY1 + Targety/2) / Sizey
wherein Column1 is the column number in the target row and column number; Row1 is the row number in the target row and column number; (SrcX1, SrcY1) are the center coordinates of the foreground object; (Sizex, Sizey) is the region size of the candidate background region; (Targetx, Targety) is the object size of the foreground object;
and the associated background area determining unit is used for selecting the candidate background area with the reference row number matched with the target row number as the associated background area of the foreground object.
Optionally, the display mode is a dark color display mode, and the foreground object adjusting unit 203 includes:
a central coordinate acquisition unit for acquiring a pixel value and a transparency of a central coordinate of the associated background area;
a background luminance value determination unit configured to determine a background luminance value of the associated background region according to the pixel value of the center coordinate and the transparency;
the first brightness reduction processing unit is used for performing brightness reduction processing on the foreground object to generate the target object if the background brightness value is greater than a preset brightness threshold value;
and the first reverse color processing unit is used for performing reverse color processing on the foreground object to generate the target object if the background brightness value is less than or equal to the brightness threshold value.
Optionally, the page display unit further includes:
a feature brightness value obtaining unit, configured to determine a feature brightness value corresponding to the foreground object according to the pixel value of each pixel point in the foreground object;
the second brightness reduction processing unit is used for performing brightness reduction processing on the foreground object to generate the target object if a brightness difference value between the background brightness value and the characteristic brightness value is greater than a preset reverse color threshold value;
and the second reverse color processing unit is used for performing reverse color processing on the foreground object to generate the target object if the brightness difference value is smaller than or equal to the reverse color threshold value.
Optionally, the associated background area identifying unit 202 includes:
a display level determination unit for determining a display level of the foreground object within the target page;
the associated background image determining unit is used for selecting a target background image of a level adjacent to the display level in the target page as an associated background image of the foreground object;
an associated background region determining unit for determining an associated background region of the foreground object within the associated background image.
Optionally, the foreground object adjusting unit 203 includes:
a preview object generating unit, configured to generate a preview object corresponding to the foreground object according to a background pixel value of the associated background region and a pixel value of the foreground object;
and the associated object adjusting unit is used for generating the target object corresponding to the foreground object according to the pixel values of all associated objects corresponding to the preview object and the pixel values of the preview object.
Therefore, the page display device provided in the embodiment of the present application can also adjust the original background image in the target page to obtain the target background image, then determine the adjustment strategy of the foreground object according to the brightness value of the associated background area of each foreground object in the target background image, generate the target object corresponding to the foreground object according to the adjustment strategy, and finally generate the target page according to the target background image and the target object.
Fig. 21 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 21, the terminal device 21 of this embodiment includes: at least one processor 210 (only one shown in fig. 21), a memory 211, and a computer program 212 stored in the memory 211 and executable on the at least one processor 210, the processor 210 implementing the steps in any of the various page display method embodiments described above when executing the computer program 212.
The terminal device 21 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The terminal device may include, but is not limited to, a processor 210, a memory 211. Those skilled in the art will appreciate that fig. 21 is only an example of the terminal device 21, and does not constitute a limitation to the terminal device 21, and may include more or less components than those shown, or combine some components, or different components, such as an input/output device, a network access device, and the like.
The Processor 210 may be a Central Processing Unit (CPU), and the Processor 210 may be other general purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field-Programmable Gate arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 211 may in some embodiments be an internal storage unit of the terminal device 21, such as a hard disk or a memory of the terminal device 21. The memory 211 may also be an external storage device of the terminal device 21 in other embodiments, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 21. Further, the memory 211 may also include both an internal storage unit and an external storage device of the terminal device 21. The memory 211 is used for storing an operating system, an application program, a BootLoader (BootLoader), data, and other programs, such as program codes of the computer program. The memory 211 may also be used to temporarily store data that has been output or is to be output.
It should be noted that, because the contents of information interaction, execution process, and the like between the above-mentioned apparatuses/units are based on the same concept as that of the method embodiment of the present application, specific functions and technical effects thereof can be referred to specifically in the method embodiment section, and are not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
An embodiment of the present application further provides a network device, where the network device includes: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, the processor implementing the steps of any of the various method embodiments described above when executing the computer program.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
The embodiments of the present application provide a computer program product, which when running on a mobile terminal, enables the mobile terminal to implement the steps in the above method embodiments when executed.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the method implemented by the present application may be implemented by a computer program, which may be stored in a computer-readable storage medium and can implement the steps of the above-mentioned method embodiments when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer readable medium may include at least: any entity or device capable of carrying computer program code to a photographing apparatus/terminal apparatus, a recording medium, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium. Such as a usb-disk, a removable hard disk, a magnetic or optical disk, etc. In certain jurisdictions, computer-readable media may not be an electrical carrier signal or a telecommunications signal in accordance with legislative and patent practice.
In the above embodiments, the description of each embodiment has its own emphasis; for parts not described or detailed in a certain embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or as a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the apparatus/network device embodiments described above are merely illustrative; the division of the modules or units is only one logical division, and there may be other divisions in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. On the other hand, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and replacements do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the protection scope of the present application.

Claims (11)

1. A method of page display, comprising:
adjusting an original background image in a target page into a target background image corresponding to the current display mode;
identifying associated background areas of each foreground object in the target page in the target background image;
adjusting the foreground object to a target object corresponding to the display mode according to the background pixel value of the associated background area;
and generating the target page according to all the target objects and the target background image.
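For illustration only, the four steps of claim 1 can be sketched as a minimal, runnable Python flow; the dictionary-based page model and the placeholder adjustments inside the loop are assumptions made for this sketch, not structures defined in the application.

    # Minimal sketch of the claim-1 flow; the page model and stub adjustments are assumed.
    def render_target_page(page: dict, display_mode: str) -> dict:
        # Step 1: adjust the original background image for the current display mode.
        target_background = {"image": page["background"], "mode": display_mode}
        target_objects = []
        for obj in page["foreground_objects"]:
            # Step 2: identify the associated background region of this foreground object
            # (claims 2-4 refine this choice; here the whole background stands in for it).
            associated_region = target_background
            # Step 3: adjust the foreground object according to the region's pixel values
            # (claims 5-6 describe the dim-or-invert decision; a tag stands in for it here).
            target_objects.append({"object": obj, "adjusted_for": associated_region["mode"]})
        # Step 4: generate the target page from all target objects and the target background.
        return {"background": target_background, "objects": target_objects}

    page = {"background": "wallpaper.png", "foreground_objects": ["clock_icon", "label"]}
    dark_page = render_target_page(page, "dark")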
2. The method of claim 1, wherein the identifying associated background regions of respective foreground objects within the target page within the target background image comprises:
dividing the target background image into a plurality of candidate background areas;
and selecting the associated background area corresponding to the foreground object from the candidate background areas according to the center coordinates of the foreground object and the boundary coordinates of the target background image.
3. The method of claim 2, wherein the dividing the target background image into a plurality of candidate background regions comprises:
acquiring a minimum foreground size, and taking the minimum foreground size as a block size;
and dividing the target background image based on the block size to obtain a plurality of candidate background areas.
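As a purely illustrative sketch of claims 2-3, the following Python snippet divides a background image, represented only by its width and height, into a grid of candidate regions whose block size equals the minimum foreground size; the Region type and function name are hypothetical.

    # Illustrative grid division of the target background image (claims 2-3).
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Region:
        left: int
        top: int
        width: int
        height: int

    def divide_into_candidate_regions(bg_width: int, bg_height: int,
                                      block_width: int, block_height: int) -> List[Region]:
        # The block size is taken to be the minimum foreground size (claim 3).
        regions = []
        for top in range(0, bg_height, block_height):
            for left in range(0, bg_width, block_width):
                regions.append(Region(left, top,
                                      min(block_width, bg_width - left),
                                      min(block_height, bg_height - top)))
        return regions

    # Example: a 1080 x 1920 background with a 270 x 240 minimum foreground size
    # yields a 4-column by 8-row grid of 32 candidate regions.
    grid = divide_into_candidate_regions(1080, 1920, 270, 240)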
4. The method of claim 2, wherein the selecting the associated background region corresponding to the foreground object from the candidate background regions according to the center coordinates of the foreground object and the boundary coordinates of the target background image comprises:
configuring an associated reference row-column number for each candidate background area according to the display position of the candidate background area in the target background image:
[formula shown as figure FDA0002378659880000021 in the original filing]
wherein Column0 is the reference column number of the candidate background area; Row0 is the reference row number of the candidate background area; (SrcX0, SrcY0) are the boundary coordinates of the candidate background image; (pX, pY) are the center coordinates of the candidate background region; and (Sizex, Sizey) is the region size of the candidate background region;
importing the center coordinates of the foreground object into a preset row-column conversion model, and calculating a target row-column number of the foreground object; the row-column conversion model specifically comprises:
[formula shown as figure FDA0002378659880000022 in the original filing]
wherein Column1 is the column number in the target row-column number; Row1 is the row number in the target row-column number; (SrcX1, SrcY1) are the center coordinates of the foreground object; (Sizex, Sizey) is the region size of the candidate background region; and (Targetx, Targety) is the object size of the foreground object;
and selecting the candidate background area whose reference row-column number matches the target row-column number as the associated background area of the foreground object.
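The exact formulas of claim 4 are present only as figures in the filing, so the sketch below shows one plausible row-column mapping consistent with the variable definitions: both the candidate regions and the foreground object's center are snapped to the block grid, and a region is associated with the object when the two row-column numbers match. The shared grid origin and the omission of the object-size term (Targetx, Targety) are simplifying assumptions.

    # Hypothetical row-column mapping; the filing's exact formulas are figure images.
    def reference_row_column(px: float, py: float, src_x0: float, src_y0: float,
                             size_x: float, size_y: float) -> tuple:
        # Reference row/column of a candidate region from its centre (pX, pY),
        # the background boundary (SrcX0, SrcY0) and the block size (Sizex, Sizey).
        return int((py - src_y0) // size_y), int((px - src_x0) // size_x)

    def target_row_column(src_x1: float, src_y1: float, src_x0: float, src_y0: float,
                          size_x: float, size_y: float) -> tuple:
        # Row/column of the block that the foreground object's centre (SrcX1, SrcY1)
        # falls into; the object-size term of claim 4 is not modelled here.
        return int((src_y1 - src_y0) // size_y), int((src_x1 - src_x0) // size_x)

    # A foreground object centred at (400, 900) on a background with origin (0, 0) and
    # 270 x 240 blocks maps to row 3, column 1, so the candidate region with reference
    # row-column (3, 1) becomes its associated background region.
    assert target_row_column(400, 900, 0, 0, 270, 240) == (3, 1)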
5. The method according to claim 1, wherein if the display mode is a dark display mode, the adjusting the foreground object to be the target object corresponding to the display mode according to the background pixel value of the associated background area comprises:
acquiring a pixel value and transparency of a central coordinate of the associated background area;
determining a background brightness value of the associated background area according to the pixel value of the center coordinate and the transparency;
if the background brightness value is larger than a preset brightness threshold value, performing brightness reduction processing on the foreground object to generate the target object;
and if the background brightness value is smaller than or equal to the brightness threshold value, performing reverse color processing on the foreground object to generate the target object.
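A minimal sketch of the dark-mode branch of claim 5, assuming an sRGB/Rec. 709 luma formula weighted by the transparency for the background brightness; the 0.5 threshold and the 0.6 dimming factor are illustrative values, not taken from the filing.

    # Hedged sketch of claim 5: compute a background brightness, then dim or reverse-colour.
    def background_brightness(r: int, g: int, b: int, alpha: float) -> float:
        # Rec. 709 luma on 8-bit channels, scaled to [0, 1] and weighted by transparency.
        return ((0.2126 * r + 0.7152 * g + 0.0722 * b) / 255.0) * alpha

    def adjust_foreground_pixel(rgb: tuple, bg_brightness: float,
                                threshold: float = 0.5, dim_factor: float = 0.6) -> tuple:
        r, g, b = rgb
        if bg_brightness > threshold:
            # Bright associated background: brightness-reduction processing.
            return (int(r * dim_factor), int(g * dim_factor), int(b * dim_factor))
        # Dark (or equal) associated background: reverse-colour processing.
        return (255 - r, 255 - g, 255 - b)

    # A light-grey icon over a bright, opaque background region is dimmed in dark mode.
    brightness = background_brightness(235, 235, 235, alpha=1.0)   # roughly 0.92
    dimmed = adjust_foreground_pixel((200, 200, 200), brightness)  # (120, 120, 120)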
6. The method of claim 5, further comprising, after determining the background brightness value of the associated background region according to the pixel value of the center coordinate and the transparency:
determining a characteristic brightness value corresponding to the foreground object according to the pixel value of each pixel point in the foreground object;
if the brightness difference value between the background brightness value and the characteristic brightness value is larger than a preset reverse color threshold value, performing brightness reduction processing on the foreground object to generate the target object;
and if the brightness difference value is smaller than or equal to the reverse color threshold value, performing reverse color processing on the foreground object to generate the target object.
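For the refinement of claim 6, a short sketch of the decision alone: treating the brightness difference as an absolute value and using 0.3 as the reverse-colour threshold are assumptions, and the computation of the characteristic brightness (for instance an average over the object's pixels) is not shown.

    # Hedged sketch of the claim-6 decision between dimming and reverse-colour processing.
    def choose_adjustment(bg_brightness: float, characteristic_brightness: float,
                          reverse_threshold: float = 0.3) -> str:
        if abs(bg_brightness - characteristic_brightness) > reverse_threshold:
            return "dim"      # brightness-reduction processing
        return "invert"       # reverse-colour processing

    assert choose_adjustment(0.9, 0.2) == "dim"
    assert choose_adjustment(0.5, 0.4) == "invert"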
7. The method according to any of claims 1-6, wherein said identifying associated background regions of respective foreground objects within said target page within said target background image comprises:
determining a display level of the foreground object within the target page;
selecting a target background image of a level adjacent to the display level in the target page as an associated background image of the foreground object;
determining an associated background region of the foreground object within the associated background image.
8. The method according to any one of claims 1-6, wherein said adjusting the foreground object to the target object corresponding to the display mode according to the background pixel value of the associated background region comprises:
generating a preview object corresponding to the foreground object according to the background pixel value of the associated background area and the pixel value of the foreground object;
and generating the target object corresponding to the foreground object according to the pixel values of all the associated objects corresponding to the preview object and the pixel values of the preview object.
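Claim 8 does not spell out how the preview object and the associated objects are combined, so the following sketch is only one possible reading: the preview pixel is an alpha blend of the foreground over the associated background, and the target pixel averages the preview with the associated objects' pixels. Both choices, and the alpha value, are assumptions.

    # Speculative sketch of claim 8: preview by alpha blending, target by averaging.
    def blend(fg: tuple, bg: tuple, alpha: float) -> tuple:
        # Alpha-blend a foreground pixel over a background pixel, per RGB channel.
        return tuple(int(alpha * f + (1.0 - alpha) * b) for f, b in zip(fg, bg))

    def target_pixel(preview: tuple, associated_pixels: list) -> tuple:
        # Average the preview pixel with the pixels of its associated objects.
        pixels = [preview] + list(associated_pixels)
        return tuple(sum(channel) // len(pixels) for channel in zip(*pixels))

    preview = blend((30, 30, 30), (240, 240, 240), alpha=0.8)           # (72, 72, 72)
    target = target_pixel(preview, [(200, 200, 200), (180, 180, 180)])  # (150, 150, 150)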
9. An apparatus for page display, comprising:
the background image adjusting unit is used for adjusting the original background image in the target page into a target background image corresponding to the current display mode;
the associated background area identification unit is used for identifying the associated background area of each foreground object in the target page in the target background image;
a foreground object adjusting unit, configured to adjust the foreground object to a target object corresponding to the display mode according to a background pixel value of the associated background area;
and the target page generating unit is used for generating the target page according to all the target objects and the target background image.
10. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 8 when executing the computer program.
11. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 8.
CN202010076679.8A 2020-01-23 2020-01-23 Page display method, device, terminal and storage medium Pending CN113157357A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010076679.8A CN113157357A (en) 2020-01-23 2020-01-23 Page display method, device, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN113157357A true CN113157357A (en) 2021-07-23

Family

ID=76881965

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010076679.8A Pending CN113157357A (en) 2020-01-23 2020-01-23 Page display method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN113157357A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102591848A (en) * 2010-11-22 2012-07-18 微软公司 Selection of foreground characteristics based on background
CN105005461A (en) * 2015-06-23 2015-10-28 深圳市金立通信设备有限公司 Icon display method and terminal
CN105677349A (en) * 2016-01-05 2016-06-15 努比亚技术有限公司 Desktop icon display method and device
CN105809645A (en) * 2016-03-28 2016-07-27 努比亚技术有限公司 Word display method and device and mobile terminal
CN106126214A (en) * 2016-06-17 2016-11-16 青岛海信移动通信技术股份有限公司 The determination method and device of text color on a kind of interface
CN106851003A (en) * 2017-02-27 2017-06-13 努比亚技术有限公司 The method and device of text color is adjusted according to wallpaper color
CN107291334A (en) * 2017-06-27 2017-10-24 努比亚技术有限公司 A kind of icon font color determines method and apparatus
CN110609722A (en) * 2019-08-09 2019-12-24 华为技术有限公司 Dark mode display interface processing method, electronic equipment and storage medium

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113486216A (en) * 2021-07-27 2021-10-08 中国银行股份有限公司 Page merging method, device, server, medium and product
CN113486216B (en) * 2021-07-27 2024-02-09 中国银行股份有限公司 Page merging method, device, server, medium and product
CN113781959A (en) * 2021-09-23 2021-12-10 Oppo广东移动通信有限公司 Interface processing method and device
CN116466952A (en) * 2023-06-19 2023-07-21 成都赛力斯科技有限公司 Control visual effect element adjusting method and device, electronic equipment and storage medium
CN116466952B (en) * 2023-06-19 2023-09-01 成都赛力斯科技有限公司 Control visual effect element adjusting method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN107885533B (en) Method and device for managing component codes
EP3944080A1 (en) A method, an apparatus and a computer program product for creating a user interface view
CN108959361B (en) Form management method and device
CN113157357A (en) Page display method, device, terminal and storage medium
CN108196755B (en) Background picture display method and device
CN110795007B (en) Method and device for acquiring screenshot information
WO2021143284A1 (en) Image processing method and apparatus, terminal and storage medium
WO2022022177A1 (en) Display method and electronic device
CN111105474B (en) Font drawing method, font drawing device, computer device and computer readable storage medium
CN113518243A (en) Image processing method and device
CN112950525A (en) Image detection method and device and electronic equipment
CN112750190B (en) Three-dimensional thermodynamic diagram generation method, device, equipment and storage medium
US20230362293A1 (en) Method for Configuring Theme Color of Terminal Device, Apparatus, and Terminal Device
CN111275607B (en) Interface display method and device, computer equipment and storage medium
CN113553368A (en) Tag information processing method and device of multilayer pie chart and terminal
CN115936998A (en) Picture splicing method and device, electronic equipment and storage medium
CN115798418A (en) Image display method, device, terminal and storage medium
CN112037545B (en) Information management method, information management device, computer equipment and storage medium
CN113031838B (en) Screen recording method and device and electronic equipment
CN111694535B (en) Alarm clock information display method and device
CN113518171B (en) Image processing method, device, terminal equipment and medium
CN114257755A (en) Image processing method, device, equipment and storage medium
CN112560903A (en) Method, device and equipment for determining image aesthetic information and storage medium
CN114063945A (en) Mobile terminal and image display method thereof
CN116672707B (en) Method and electronic device for generating game prediction frame

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination