CN111552451A - Display control method and device, computer readable medium and terminal equipment - Google Patents


Info

Publication number
CN111552451A (application number CN202010274861.4A; granted publication CN111552451B)
Authority
CN
China
Prior art keywords
drawn, image, proportion, pixels, color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010274861.4A
Other languages
Chinese (zh)
Other versions
CN111552451B (en)
Inventor
姚坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Realme Chongqing Mobile Communications Co Ltd
Original Assignee
Realme Chongqing Mobile Communications Co Ltd
Priority date
Filing date
Publication date
Application filed by Realme Chongqing Mobile Communications Co Ltd
Priority to CN202010274861.4A
Publication of CN111552451A
Application granted
Publication of CN111552451B
Legal status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/147 - Digital output to display device using display panels
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure relates to the field of electronic device technologies, and in particular to a display control method, a display control apparatus, a computer-readable medium, and a terminal device. The method includes the following steps: acquiring a page to be converted in response to a target control operation; identifying the page to be converted to obtain an image to be drawn and/or a region to be drawn in the page to be converted; sampling the image to be drawn and/or the region to be drawn to obtain a target number of pixels, and calculating the proportion of each type of pixel; invoking an image drawing strategy, and determining a color transformation strategy for the image to be drawn based on the format of the image to be drawn and the proportion of each type of pixel; and/or invoking a region drawing strategy, and determining a color transformation strategy for the region to be drawn based on the size of the region to be drawn and the proportion of each type of pixel. The present disclosure improves the display quality of the dark mode.

Description

Display control method and device, computer readable medium and terminal equipment
Technical Field
The present disclosure relates to the field of electronic device technologies, and in particular, to a display control method, a display control apparatus, a computer-readable medium, and a terminal device.
Background
Smart mobile terminal devices such as mobile phones and tablet computers occupy an increasing share of people's daily lives. When a user operates a phone in a dark environment, a display interface that keeps a high screen brightness irritates the user's eyes and causes discomfort.
In the prior art, the screen brightness can be adjusted to adapt to changes in ambient light. However, this approach affects all of the content in the display interface: the contrast of the displayed content tends to decrease, making it difficult for the user to recognize what is displayed on the screen.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure provides a display control method, a display control apparatus, a computer-readable medium, and a terminal device, which can perform selective color transformation according to page content and adapt the color scheme to the way the user controls the terminal.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a display control method including:
acquiring a page to be converted in response to a target control operation;
identifying the page to be converted to obtain an image to be drawn and/or a region to be drawn in the page to be converted;
sampling the image to be drawn and/or the region to be drawn to obtain a target number of pixels, and calculating the proportion of each type of pixel;
invoking an image drawing strategy, and determining a color transformation strategy for the image to be drawn based on the format of the image to be drawn and the proportion of each type of pixel; and/or
invoking a region drawing strategy, and determining a color transformation strategy for the region to be drawn based on the size of the region to be drawn and the proportion of each type of pixel.
According to a second aspect of the present disclosure, there is provided a display control apparatus comprising:
a page-to-be-converted acquiring module, configured to acquire a page to be converted in response to a target control operation;
a page-to-be-converted identification module, configured to identify the page to be converted to obtain an image to be drawn and/or a region to be drawn in the page to be converted;
a pixel sampling module, configured to sample the image to be drawn and/or the region to be drawn to obtain a target number of pixels and to calculate the proportion of each type of pixel;
an image drawing module, configured to invoke an image drawing strategy and determine a color transformation strategy for the image to be drawn based on the format of the image to be drawn and the proportion of each type of pixel; and/or
a region drawing module, configured to invoke a region drawing strategy and determine a color transformation strategy for the region to be drawn based on the size of the region to be drawn and the proportion of each type of pixel.
According to a third aspect of the present disclosure, there is provided a computer-readable medium having stored thereon a computer program which, when executed by a processor, implements the display control method described above.
According to a fourth aspect of the present disclosure, there is provided a terminal device comprising:
one or more processors;
a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the display control method described above.
According to the display control method provided by the embodiments of the present disclosure, the image to be drawn and the region to be drawn in the page to be converted are identified and their pixels are sampled, so that the style, content, and color composition of the image to be drawn and the region to be drawn can be determined from the pixel-type proportions, the image size, and the image format. The corresponding drawing strategy can then be invoked, and the corresponding color transformation strategy executed on the image to be drawn and the region to be drawn respectively, so that the color scheme of all the content in the page is more reasonable in the converted dark mode, improving the display quality of the dark mode.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 schematically illustrates a flow chart of a display control method in an exemplary embodiment of the present disclosure;
FIG. 2 is a schematic diagram that schematically illustrates a page in a normal display mode, in an exemplary embodiment of the disclosure;
FIG. 3 is a schematic diagram that schematically illustrates a page after a color transform has been performed, in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates a composition diagram of a display control apparatus in an exemplary embodiment of the present disclosure;
fig. 5 schematically illustrates an electronic device structure diagram of a terminal device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
In existing smart terminal devices, the colors of page content are generally displayed according to a default or customized theme, and the color configuration is generally fixed. When a user operates the terminal device in a dimly lit environment, for example using a mobile phone or tablet computer at night with the lights off, the displayed content can only be adjusted by changing the screen brightness. However, this approach reduces the contrast of page content to some extent, degrades the display quality, and tends to cause eye discomfort.
In view of the above-described drawbacks and disadvantages of the related art, a display control method is provided in the present exemplary embodiment. Referring to fig. 1, the display control method described above may include the steps of:
s11, responding to the target control operation, and acquiring the page to be converted;
s12, identifying the page to be converted to obtain an image to be drawn and/or an area to be drawn in the page to be converted;
s13, sampling the image to be drawn and/or the area to be drawn to obtain pixel points with target quantity, and calculating the proportion of each type of pixel points;
s14, calling an image drawing strategy, and determining a color transformation strategy of the image to be drawn based on the format of the image to be drawn and the proportion of each type of pixel points; and/or
And S15, calling a region drawing strategy, and determining the color transformation strategy of the region to be drawn based on the size of the region to be drawn and the proportion of each type of pixel points.
In the display control method provided by this exemplary embodiment, on the one hand, identifying the image to be drawn and the region to be drawn in the page to be converted and sampling their pixels makes it possible to accurately obtain the pixel-type proportions, image size, image format, and color composition of each. On the other hand, a corresponding color transformation strategy can then be executed for the image to be drawn and the region to be drawn according to this information, so that the color scheme of all parts of the page is more reasonable in the converted dark mode, improving the display quality of the dark mode and reducing irritation to the user's eyes.
Hereinafter, each step of the display control method in the present exemplary embodiment will be described in more detail with reference to the drawings and examples.
Step S11, in response to the target control operation, acquires a page to be converted.
In this exemplary embodiment, the display control method can be applied to a smart terminal device equipped with a display, such as a mobile phone, tablet computer, or notebook computer. For example, the system application list of the terminal device may be configured with a dark mode (also called dark color mode, night mode, or dim mode) option for performing color transformation on each page. The dark mode can be defined as a darker color scheme: the colors of text and system icons are optimized using the best contrast between the text foreground and the dark background, ensuring consistency, comfort, and legibility to the human eye, and it can be applied to both system content and third-party applications.
Specifically, the terminal device may enter the dark mode in response to the user selecting the dark mode option in an interactive interface, for example in a settings list or a system application list.
After the terminal device enters the dark mode, all or some of the interactive interface views of the system and applications can be obtained and used as pages to be converted. For example, the interactive interface views of applications currently running in the foreground and in the background can be used as pages to be converted; or the multi-level interactive interface views associated with the current interactive interface view can be used as pages to be converted; or, after the user starts a new application, each interactive interface view of that application can be used as a page to be converted. This avoids the situation where, immediately after the system enters the dark mode, too many pages to be converted must be processed at the same time, increasing the computational load on the processor.
Alternatively, in other exemplary embodiments of the present disclosure, each interactive interface view may be taken as a page to be converted in advance, with the color transformation performed in the background during idle periods.
Step S12, identifying the page to be converted to obtain the image to be drawn and/or the area to be drawn in the page to be converted.
In this exemplary embodiment, after the dark mode is entered, each page to be converted may be identified, and each piece of content in it, such as the regions to be drawn, images to be drawn, and text to be drawn, may be read. For example, the page to be converted may include text, pictures, buttons, controls, icons, and dividing lines. The text to be drawn and the image to be drawn can be superimposed on the region to be drawn and displayed on the upper layer. For example, the "Settings" page in the normal display mode shown in fig. 2 includes a plurality of images to be drawn 201, such as the icon images for airplane mode, Bluetooth, and search; a plurality of texts to be drawn 202, such as "Settings", the user account entry, and "Find phone"; a region to be drawn 203; and, further, a dividing line 204.
In some exemplary embodiments of the present disclosure, the page to be converted may be further divided into a plurality of regions to be identified, according to the dividing lines contained in the page or according to the page layout, so that content identification and color transformation can be performed in sequence according to the arrangement order of the regions in the page.
For example, referring to fig. 2, the current page may be divided into a plurality of regions to be identified 205 by its dividing lines, and the image to be drawn, text to be drawn, and region to be drawn in each region may then be identified in turn according to their order in the current page. Alternatively, multiple identification task threads can be created to identify and color-transform several regions simultaneously and asynchronously, shortening the processing time and improving page rendering efficiency.
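The asynchronous variant described above can be sketched roughly as follows. This is only an illustration, not the patent's implementation: the `identify_region` and `transform_region` callables are hypothetical placeholders for the identification and color-transformation steps.

```python
from concurrent.futures import ThreadPoolExecutor

def convert_page(regions, identify_region, transform_region, max_workers=4):
    """Identify and color-transform several regions concurrently
    instead of strictly in page order."""
    def work(region):
        content = identify_region(region)   # read images/text/areas in the region
        return transform_region(content)    # apply the color transformation
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves the page order of the results even though
        # the regions are processed asynchronously
        return list(pool.map(work, regions))
```

Because `pool.map` returns results in input order, the transformed regions can still be composited back into the page in their original arrangement.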
Step S13, sampling the image to be drawn and/or the area to be drawn to obtain pixel points with target quantity, and calculating the proportion of each type of pixel points.
In this exemplary embodiment, after the one or more images to be drawn and/or regions to be drawn in a page to be converted have been determined, pixel sampling may be performed on them according to a preset sampling rule, and the proportion of each type of pixel calculated.
Specifically, the sampling rule may be: the pixels of the picture are divided uniformly and symmetrically into 10 rows and 10 columns, and 100 points are collected in total. In addition, to ensure that the boundary is not left unsampled, 10 points are collected on each of the horizontal and vertical boundaries of the picture, bringing the total to about 120 points. The pixel types may include any one or more of: color pixels, transparent pixels, white pixels, black pixels, and gray pixels.
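The sampling rule above can be sketched in Python as follows. The exact placement of the points is an assumption, since the disclosure only fixes the counts (a 10 x 10 grid plus 10 points per boundary direction):

```python
def sample_points(width, height, rows=10, cols=10, edge_points=10):
    """Return roughly 120 sampling coordinates: a uniform, symmetric
    10x10 grid (100 points) plus 10 points on the horizontal boundary
    and 10 on the vertical boundary so the edges are not missed."""
    pts = []
    for r in range(rows):
        for c in range(cols):
            # center of each grid cell
            pts.append(((c + 0.5) * width / cols, (r + 0.5) * height / rows))
    for i in range(edge_points):
        pts.append(((i + 0.5) * width / edge_points, 0))   # horizontal boundary
        pts.append((0, (i + 0.5) * height / edge_points))  # vertical boundary
    # clamp to valid integer pixel coordinates
    return [(min(int(x), width - 1), min(int(y), height - 1)) for x, y in pts]
```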
After the pixels are sampled, the percentage of each of the five pixel types can be calculated. Specifically, the following formulas may be used:
color pixel percentage = total color pixels / (total sampled pixels - total transparent pixels);
transparent pixel percentage = total transparent pixels / total sampled pixels;
white pixel percentage = total white pixels / (total white pixels + total black pixels + total gray pixels);
black pixel percentage = total black pixels / (total white pixels + total black pixels + total gray pixels);
gray pixel percentage = total gray pixels / (total white pixels + total black pixels + total gray pixels).
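The five percentages can be computed as follows. The RGBA classification cut-offs (`alpha_cut`, `chroma_cut`, `white_cut`, `black_cut`) are assumptions for illustration, since the disclosure does not define when a sample counts as white, black, gray, color, or transparent; the ratio formulas themselves follow the text above.

```python
from collections import Counter

def classify(rgba, alpha_cut=16, chroma_cut=25, white_cut=220, black_cut=35):
    """Assign one sampled RGBA pixel to one of the five types."""
    r, g, b, a = rgba
    if a < alpha_cut:
        return "transparent"
    if max(r, g, b) - min(r, g, b) > chroma_cut:  # visibly saturated
        return "color"
    v = (r + g + b) // 3
    if v >= white_cut:
        return "white"
    if v <= black_cut:
        return "black"
    return "gray"

def percentages(samples):
    """Compute the five proportions using the formulas given in the text."""
    n = Counter(classify(p) for p in samples)
    total = len(samples)
    opaque = total - n["transparent"]
    wbg = n["white"] + n["black"] + n["gray"]
    return {
        "color": n["color"] / opaque if opaque else 0.0,
        "transparent": n["transparent"] / total if total else 0.0,
        "white": n["white"] / wbg if wbg else 0.0,
        "black": n["black"] / wbg if wbg else 0.0,
        "gray": n["gray"] / wbg if wbg else 0.0,
    }
```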
Step S14, invoking an image drawing strategy, and determining a color transformation strategy for the image to be drawn based on the format of the image to be drawn and the proportion of each type of pixel.
In this exemplary embodiment, an image to be drawn may be displayed superimposed on a region to be drawn, as shown in fig. 2, so the format of the image to be drawn may be identified first.
Specifically, when the proportion of color pixels in the image to be drawn is greater than a first threshold, the image to be drawn is skipped. For example, the first threshold may be set to any value between 6% and 10%: when the color pixel percentage exceeds, say, 8%, the image to be drawn is judged to be predominantly colored, and no processing is performed on such an image.
Alternatively, in some exemplary embodiments, when the image to be drawn is identified as being in the .9 (nine-patch) format and the proportion of white pixels is greater than a second threshold, a white-background color inversion is performed on the image to be drawn; or, when the image to be drawn is in the .9 format, the sum of the white and black pixel proportions is greater than a third threshold, and the white proportion exceeds the black proportion, a white-background color inversion is likewise performed.
For example, the second threshold may be configured as 80% and the third threshold as 90%. For a .9 image, if the white pixel percentage exceeds 80%, or the white plus black percentage exceeds 90% with white dominating, the image is judged to be mostly white and to serve as a background, so its color needs to be inverted to black.
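The image drawing strategy of this step can be sketched as a decision function. `pct` is the dictionary of proportions produced in the sampling step; the returned labels are illustrative names, not terms from the disclosure:

```python
def image_strategy(image_format, pct, first=0.08, second=0.80, third=0.90):
    """Decide the color transformation for an image to be drawn."""
    if pct["color"] > first:
        return "ignore"                       # colored image: leave untouched
    if image_format == ".9":
        if pct["white"] > second:
            return "invert_white_background"  # mostly white background -> black
        if (pct["white"] + pct["black"] > third
                and pct["white"] > pct["black"]):
            return "invert_white_background"
    return "no_op"
```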
Step S15, invoking a region drawing strategy, and determining a color transformation strategy for the region to be drawn based on the size of the region to be drawn and the proportion of each type of pixel.
In this exemplary embodiment, for a region to be drawn, the foreground and background may be distinguished according to the region's size. Specifically, when the size of the region to be drawn is identified as smaller than the target size and the proportion of white pixels is greater than a fourth threshold, a white-background color inversion is performed on the region to be drawn; or
when the size of the region to be drawn is smaller than the target size, the proportion of transparent pixels is greater than a fifth threshold, and the proportion of white pixels is also greater than the fifth threshold, a white-foreground color operation is performed on the region to be drawn; or
when the size of the region to be drawn is smaller than the target size, the proportion of transparent pixels is greater than the fifth threshold, and the proportion of black pixels is greater than the fifth threshold, a black-foreground color inversion is performed on the region to be drawn; or
when the size of the region to be drawn is smaller than the target size and the proportion of gray pixels exceeds the fifth threshold, a gray-foreground color inversion is performed on the region to be drawn.
For example, the target size may be configured as 50 dp, the fourth threshold as 95%, and the fifth threshold as 50%.
When the region to be drawn is within 50 dp, it is most likely foreground; accordingly, when the white pixel percentage exceeds 95%, the region is judged to be a white background, and the white background can be inverted to a black background.
If the transparent pixel percentage exceeds 50% and the white pixel percentage exceeds 50%, a white foreground is determined, and no processing need be performed.
If the transparent pixel percentage exceeds 50% and the black pixel percentage exceeds 50%, a black foreground is determined, and the black foreground can be inverted to a white foreground.
If the gray percentage exceeds 50%, a gray foreground is determined, and the gray foreground can be inverted to a white foreground.
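For regions below the target size, the rules above can be sketched as follows. Thresholds follow the worked example (50 dp, 95%, 50%), and the returned labels are illustrative; per the example, a white foreground is reported but left unprocessed.

```python
def small_region_strategy(size_dp, pct, target_dp=50, fourth=0.95, fifth=0.50):
    """Color transformation for a region smaller than the target size."""
    if size_dp >= target_dp:
        return None                       # handled by the large-region rules
    if pct["white"] > fourth:
        return "invert_white_background"  # white background -> black
    if pct["transparent"] > fifth and pct["white"] > fifth:
        return "white_foreground"         # per the example: left unprocessed
    if pct["transparent"] > fifth and pct["black"] > fifth:
        return "invert_black_foreground"  # black foreground -> white
    if pct["gray"] > fifth:
        return "invert_gray_foreground"   # gray foreground -> white
    return "no_op"
```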
In this exemplary embodiment, for a region to be drawn, when the size of the region to be drawn is identified as larger than the target size, the proportion of transparent pixels falls in the first transparent-pixel interval, and the proportion of white pixels is greater than the second threshold, a white-background color inversion is performed on the region to be drawn; or,
when the size of the region to be drawn is larger than the target size, the proportion of transparent pixels falls in the first transparent-pixel interval, and the proportion of black pixels is greater than the second threshold, a black-background color operation is performed on the region to be drawn; or
when the size of the region to be drawn is larger than the target size, the proportion of transparent pixels falls in the second transparent-pixel interval, and the proportion of white pixels is greater than the second threshold, a white-foreground color operation is performed on the region to be drawn; or
when the size of the region to be drawn is larger than the target size, the proportion of transparent pixels falls in the second transparent-pixel interval, and the proportion of black pixels is greater than the second threshold, a black-foreground color inversion is performed on the region to be drawn; or
when the size of the region to be drawn is larger than the target size, the proportion of transparent pixels falls in the third transparent-pixel interval, and the proportion of white pixels is greater than the third threshold, a white-foreground color operation is performed on the region to be drawn; or
when the size of the region to be drawn is larger than the target size, the proportion of transparent pixels falls in the third transparent-pixel interval, and the proportion of black pixels is greater than the third threshold, a black-foreground color inversion is performed on the region to be drawn.
For example, the target size may be configured as 50 dp; the first transparent-pixel interval may be configured as less than 25%; the second transparent-pixel interval as 25% to 50%; and the third transparent-pixel interval as greater than 80%.
When the region to be drawn exceeds 50 dp and the transparent pixel percentage is below 25%: if the white pixel percentage is greater than 80%, the region is judged to be a white background; if the black pixel percentage is greater than 80%, a black background. The white background may then be inverted to a black background, while the black background is left unprocessed.
If the transparent pixel percentage is between 25% and 50%: a white pixel percentage above 80% indicates a white foreground, and a black pixel percentage above 80% a black foreground. The white foreground may be left unprocessed, while the black foreground is inverted to a white foreground.
If the transparent pixel percentage exceeds 80%: a white pixel percentage above 60% indicates a white foreground, and a black pixel percentage above 60% a black foreground. Again, the white foreground may be left unprocessed and the black foreground inverted to a white foreground.
Any other white foreground or black background can also be left unprocessed, keeping its original color. The numerical values configured in the above embodiments are exemplary and can be adjusted according to user requirements.
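The large-region rules can likewise be sketched as a decision function, with the interval boundaries and thresholds taken from the worked example above (80% inside the first two intervals, 60% in the third); the labels are illustrative names:

```python
def large_region_strategy(size_dp, pct, target_dp=50):
    """Color transformation for a region larger than the target size."""
    if size_dp <= target_dp:
        return None                          # handled by the small-region rules
    t = pct["transparent"]
    if t < 0.25:                             # first interval: little transparency -> background
        if pct["white"] > 0.80:
            return "invert_white_background"
        if pct["black"] > 0.80:
            return "black_background"        # left unprocessed
    elif t <= 0.50:                          # second interval -> foreground
        if pct["white"] > 0.80:
            return "white_foreground"        # left unprocessed
        if pct["black"] > 0.80:
            return "invert_black_foreground"
    elif t > 0.80:                           # third interval: sparse foreground
        if pct["white"] > 0.60:
            return "white_foreground"
        if pct["black"] > 0.60:
            return "invert_black_foreground"
    return "no_op"                           # keep the original colors
```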
In addition, in other exemplary embodiments of the present disclosure, as shown in fig. 2, text to be drawn may also be displayed superimposed on a region to be drawn. The text to be drawn can then be set to a target color according to the color of the underlying region after the inversion operation. For example, as shown in fig. 3, the text to be drawn may be changed to white or gray. Alternatively, if the region beneath the text has been changed to white, the text can be changed to black or another color with high contrast against the region to be drawn.
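One simple way to pick a high-contrast target text color for the post-inversion region is a luminance test. This is a sketch using the common Rec. 709 luma weights, which the disclosure does not itself specify:

```python
def text_color_for(background_rgb):
    """Choose white text on dark backgrounds and black text on light
    ones, using Rec. 709 luma weights as a contrast proxy."""
    r, g, b = background_rgb
    luminance = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return (255, 255, 255) if luminance < 128 else (0, 0, 0)
```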
Based on the above, in other exemplary embodiments of the present disclosure, the method may further include: acquiring brightness and/or contrast information of the color-transformed image to be drawn and/or region to be drawn together with current environment information, and optimizing the color-transformed content according to the current environment information.
For example, after the color transformation is performed, the brightness of the current environment may be collected and the page colors evaluated. Combining the ambient brightness with the brightness and contrast of each part of the page, or with one or more parameters such as sharpness, color saturation, color temperature, and hue, the transformed page can be optimized to match the ambient light, improving the display quality. For instance, each parameter can be fine-tuned using preset optimization rules.
According to the method provided by the embodiments of the present disclosure, identifying the image to be drawn, the region to be drawn, the text to be drawn, and other content in the page to be converted makes it possible to accurately determine what the page contains. Sampling the pixels of the region to be drawn and the image to be drawn accurately yields their color composition, and judging the format or size of the image allows the foreground, background, and corresponding colors to be determined accurately, so that the corresponding color transformation strategy can be executed precisely, effectively improving the display quality of the dark mode. In addition, after the color transformation, the page is further color-optimized in combination with the light intensity of the terminal device's current environment, so that the page colors better match the environmental characteristics and the user's actual needs, protecting the user's eyes.
It is to be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the method according to an exemplary embodiment of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Further, as shown in fig. 4, the present exemplary embodiment further provides a display control apparatus 40, applied to a terminal device configured with a display screen, which includes: a to-be-converted page acquisition module 401, a to-be-converted page identification module 402, a pixel point sampling module 403, an image drawing module 404, and an area drawing module 405. Specifically:
the to-be-converted page obtaining module 401 may be configured to respond to a target control operation and obtain a to-be-converted page.
The to-be-converted page identification module 402 may be configured to identify the to-be-converted page to obtain an image to be drawn and/or a region to be drawn in the to-be-converted page.
The pixel point sampling module 403 may be configured to sample the image to be drawn and/or the area to be drawn to obtain a target number of pixel points, and calculate a ratio of each type of pixel point.
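The sampling performed by module 403 can be sketched as below. The uniform-stride grid and the helper signature `get_pixel(x, y)` are illustrative assumptions; the patent does not fix a particular sampling scheme, only that a target number of pixel points is obtained.

```python
import math

def sample_pixels(width, height, get_pixel, target=256):
    """Uniformly sample roughly `target` pixel points from a
    width x height image or region. `get_pixel(x, y)` returns
    the pixel value at that coordinate."""
    # Choose a stride so the grid of samples is close to the target count.
    step = max(1, int(math.sqrt(width * height / target)))
    return [get_pixel(x, y)
            for y in range(0, height, step)
            for x in range(0, width, step)]
```

For example, a 100 x 100 region sampled with `target=25` yields a 5 x 5 grid of 25 pixel points.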
The image drawing module 404 may be configured to invoke an image drawing policy, and determine a color transformation policy of the image to be drawn based on the format of the image to be drawn and the ratio of each type of pixel point.
The region drawing module 405 may be configured to invoke a region drawing policy, and determine a color transformation policy of the region to be drawn based on the size of the region to be drawn and the ratio of each type of pixel point.
In one example of the present disclosure, the pixel point includes: any one or more of a color pixel, a transparent pixel, a white pixel, a black pixel, and a gray pixel.
In an example of the present disclosure, the image rendering module 404 may include: a first image rendering unit (not shown in the figure).
The first image drawing unit may be configured to perform an ignoring operation on the image to be drawn when a ratio of color pixels of the image to be drawn is greater than a first threshold.
In one example of the present disclosure, the image rendering module 404 may include: a second image drawing unit (not shown in the figure).
The second image drawing unit may be configured to perform a white background color inversion operation on the image to be drawn when the image to be drawn is identified as being in the .9 (nine-patch) format and the proportion of white pixels is greater than a second threshold; or
when the image to be drawn is identified as being in the .9 format, the sum of the proportions of white pixels and black pixels is greater than a third threshold, and the proportion of white pixels is greater than the proportion of black pixels, perform a white background color inversion operation on the image to be drawn.
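The image drawing policy of the two units above can be sketched as a single decision function. The concrete threshold values and the nine-patch check are assumptions for illustration; the patent names the thresholds only abstractly.

```python
# Illustrative threshold values; the patent does not fix them.
FIRST_THRESHOLD = 0.10   # proportion of color pixels
SECOND_THRESHOLD = 0.50  # proportion of white pixels
THIRD_THRESHOLD = 0.70   # proportion of white + black pixels

def image_color_policy(is_nine_patch, ratios):
    """ratios: dict with 'color', 'white', 'black' pixel proportions."""
    if ratios["color"] > FIRST_THRESHOLD:
        return "ignore"  # colorful image: keep its original colors
    if is_nine_patch:
        if ratios["white"] > SECOND_THRESHOLD:
            return "invert_white_background"
        if (ratios["white"] + ratios["black"] > THIRD_THRESHOLD
                and ratios["white"] > ratios["black"]):
            return "invert_white_background"
    return "no_transform"
```

The "ignore" branch corresponds to the first image drawing unit; both nine-patch branches correspond to the second.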
In an example of the present disclosure, the image to be drawn is displayed in an overlapping manner on the area to be drawn.
In one example of the present disclosure, the region drawing module 405 may include: a first region drawing unit (not shown in the figure).
The first region drawing unit may be configured to perform a white background color inversion operation on the area to be drawn when it is identified that the size of the area to be drawn is smaller than a target size and the proportion of white pixels is greater than a fourth threshold; or
when it is identified that the size of the area to be drawn is smaller than the target size, the proportion of transparent pixels is greater than a fifth threshold, and the proportion of white pixels is greater than the fifth threshold, perform a white foreground color inversion operation on the area to be drawn; or
when it is identified that the size of the area to be drawn is smaller than the target size, the proportion of transparent pixels is greater than the fifth threshold, and the proportion of black pixels is greater than the fifth threshold, perform a black foreground color inversion operation on the area to be drawn; or
when it is identified that the size of the area to be drawn is smaller than the target size and the proportion of gray pixels exceeds the fifth threshold, perform a gray foreground color inversion operation on the area to be drawn.
In an example of the present disclosure, the region drawing module 405 may further include: a second region drawing unit (not shown in the figure).
The second region drawing unit may be configured to perform a white background color inversion operation on the area to be drawn when it is identified that the size of the area to be drawn is larger than the target size, the proportion of transparent pixels is in a first transparent pixel interval, and the proportion of white pixels is greater than the second threshold; or
when it is identified that the size of the area to be drawn is larger than the target size, the proportion of transparent pixels is in the first transparent pixel interval, and the proportion of black pixels is greater than the second threshold, perform a black background color inversion operation on the area to be drawn; or
when it is identified that the size of the area to be drawn is larger than the target size, the proportion of transparent pixels is in a second transparent pixel interval, and the proportion of white pixels is greater than the second threshold, perform a white foreground color inversion operation on the area to be drawn; or
when it is identified that the size of the area to be drawn is larger than the target size, the proportion of transparent pixels is in the second transparent pixel interval, and the proportion of black pixels is greater than the second threshold, perform a black foreground color inversion operation on the area to be drawn; or
when it is identified that the size of the area to be drawn is larger than the target size, the proportion of transparent pixels is in a third transparent pixel interval, and the proportion of white pixels is greater than the third threshold, perform a white foreground color inversion operation on the area to be drawn; or
when it is identified that the size of the area to be drawn is larger than the target size, the proportion of transparent pixels is in the third transparent pixel interval, and the proportion of black pixels is greater than the third threshold, perform a black foreground color inversion operation on the area to be drawn.
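Taken together, the two region drawing units amount to a decision tree over the region size and the transparent/white/black/gray pixel proportions. The sketch below is an assumed concretization: every threshold value and all three transparent-pixel intervals are illustrative, since the patent names them only abstractly.

```python
# Illustrative values; the patent names these quantities abstractly.
SECOND, THIRD, FOURTH, FIFTH = 0.5, 0.6, 0.5, 0.4
INTERVAL_1 = (0.0, 0.2)   # first transparent pixel interval
INTERVAL_2 = (0.2, 0.6)   # second transparent pixel interval
INTERVAL_3 = (0.6, 1.01)  # third transparent pixel interval

def region_color_policy(size, target_size, ratios):
    """ratios: dict with 'transparent', 'white', 'black', 'gray'."""
    t, w, b, g = (ratios[k] for k in ("transparent", "white", "black", "gray"))
    if size < target_size:              # first region drawing unit
        if w > FOURTH:
            return "invert_white_background"
        if t > FIFTH and w > FIFTH:
            return "invert_white_foreground"
        if t > FIFTH and b > FIFTH:
            return "invert_black_foreground"
        if g > FIFTH:
            return "invert_gray_foreground"
    else:                               # second region drawing unit
        if INTERVAL_1[0] <= t < INTERVAL_1[1]:
            if w > SECOND:
                return "invert_white_background"
            if b > SECOND:
                return "invert_black_background"
        elif INTERVAL_2[0] <= t < INTERVAL_2[1]:
            if w > SECOND:
                return "invert_white_foreground"
            if b > SECOND:
                return "invert_black_foreground"
        else:
            if w > THIRD:
                return "invert_white_foreground"
            if b > THIRD:
                return "invert_black_foreground"
    return "no_transform"
```

A small mostly-white region is thus treated as a white background to invert, while a large, largely transparent region with white content has only its white foreground inverted.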
In an example of the present disclosure, a text to be rendered is superimposed and displayed on the region to be rendered; the apparatus may further comprise a text rendering module (not shown in the figures).
The text drawing module can be used for configuring the text to be drawn as a target color according to the color of the region to be drawn after the reverse color operation.
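A minimal sketch of this text drawing step: choose the target text color by the luminance of the region's post-inversion color. The BT.601 luminance weights and the 128 cutoff are illustrative assumptions, not values from the patent.

```python
def text_color_for(region_rgb):
    """Pick a contrasting target color for text drawn over the region."""
    r, g, b = region_rgb
    # Approximate relative luminance (ITU-R BT.601 weights).
    luminance = 0.299 * r + 0.587 * g + 0.114 * b
    # Light text on dark regions, dark text on light regions.
    return (255, 255, 255) if luminance < 128 else (0, 0, 0)
```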
In an example of the present disclosure, the proportion of each type of pixel point is calculated as follows:
color pixel proportion = total color pixels / (total sampled pixels − total transparent pixels);
transparent pixel proportion = total transparent pixels / total sampled pixels;
white pixel proportion = total white pixels / (total white pixels + total black pixels + total gray pixels);
black pixel proportion = total black pixels / (total white pixels + total black pixels + total gray pixels);
gray pixel proportion = total gray pixels / (total white pixels + total black pixels + total gray pixels).
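The five formulas above translate directly into code. The classification thresholds in `classify_pixel` are illustrative assumptions, since the patent does not define when a pixel counts as white, black, or gray.

```python
def classify_pixel(r, g, b, a):
    """Assign one RGBA pixel to a single category (thresholds assumed)."""
    if a == 0:
        return "transparent"
    if r == g == b:          # achromatic pixel
        if r >= 230:
            return "white"
        if r <= 25:
            return "black"
        return "gray"
    return "color"

def pixel_ratios(pixels):
    """pixels: iterable of (r, g, b, a) tuples sampled from the page."""
    counts = {"color": 0, "transparent": 0, "white": 0, "black": 0, "gray": 0}
    for p in pixels:
        counts[classify_pixel(*p)] += 1
    total = sum(counts.values())
    opaque = total - counts["transparent"]           # denominator for color
    achromatic = counts["white"] + counts["black"] + counts["gray"]
    return {
        "color": counts["color"] / opaque if opaque else 0.0,
        "transparent": counts["transparent"] / total if total else 0.0,
        "white": counts["white"] / achromatic if achromatic else 0.0,
        "black": counts["black"] / achromatic if achromatic else 0.0,
        "gray": counts["gray"] / achromatic if achromatic else 0.0,
    }
```

Note that, per the formulas, the white/black/gray proportions are relative only to the achromatic pixels, so they may each exceed the color pixel proportion even in a colorful image.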
In one example of the present disclosure, the apparatus may further include: an optimization execution module (not shown).
The optimization execution module can be used for acquiring brightness and/or contrast information of the image to be drawn and/or the area to be drawn after color conversion and current environment information, and executing optimization operation on the image to be drawn after color conversion according to the current environment information.
The details of each module in the display control apparatus are described in detail in the corresponding display control method, and therefore are not described herein again.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided and embodied by a plurality of modules or units.
Fig. 5 shows a schematic diagram of a wireless communication device suitable for implementing an embodiment of the invention.
It should be noted that the electronic device 600 shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of the application of the embodiments of the present disclosure.
As shown in fig. 5, the electronic device 600 may specifically include: a processor 610, an internal memory 621, an external memory interface 622, a Universal Serial Bus (USB) interface 630, a charging management module 640, a power management module 641, a battery 642, an antenna 1, an antenna 2, a mobile communication module 650, a wireless communication module 660, an audio module 670, a speaker 671, a receiver 672, a microphone 673, an earphone interface 674, a sensor module 680, a display screen 690, a camera module 691, an indicator 692, a motor 693, buttons 694, and a Subscriber Identity Module (SIM) card interface 695. The sensor module 680 may include a depth sensor 6801, a pressure sensor 6802, a gyroscope sensor 6803, an air pressure sensor 6804, a magnetic sensor 6805, an acceleration sensor 6806, a distance sensor 6807, a proximity light sensor 6808, a fingerprint sensor 6809, a temperature sensor 6810, a touch sensor 6811, an ambient light sensor 6812, and a bone conduction sensor 6813.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the electronic device 600. In other embodiments of the present application, the electronic device 600 may include more or fewer components than illustrated, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 610 may include one or more processing units, such as: the Processor 610 may include an Application Processor (AP), a modem Processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband Processor, and/or a Neural Network Processor (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 610 for storing instructions and data. The memory may store instructions for implementing six modular functions: detection instructions, connection instructions, information management instructions, analysis instructions, data transmission instructions, and notification instructions, and execution is controlled by the processor 610. In some embodiments, the memory in the processor 610 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 610. If the processor 610 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 610, thereby increasing the efficiency of the system.
In some embodiments, processor 610 may include one or more interfaces. The interfaces may include an Inter-Integrated Circuit (I2C) interface, an Inter-IC Sound (I2S) interface, a Pulse Code Modulation (PCM) interface, a Universal Asynchronous Receiver/Transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a General-Purpose Input/Output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous Serial bus including a Serial Data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 610 may include multiple sets of I2C buses. The processor 610 may be coupled to the touch sensor 6811, the charger, the flash, the camera module 691, etc., through different I2C bus interfaces, respectively. For example: the processor 610 may be coupled to the touch sensor 6811 via an I2C interface, such that the processor 610 and the touch sensor 6811 communicate via an I2C bus interface to implement touch functionality of the electronic device 600.
The I2S interface may be used for audio communication. In some embodiments, processor 610 may include multiple sets of I2S buses. The processor 610 may be coupled to the audio module 670 via an I2S bus to enable communication between the processor 610 and the audio module 670. In some embodiments, the audio module 670 may communicate audio signals to the wireless communication module 660 via an I2S interface to enable answering a call via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 670 and the wireless communication module 660 may be coupled by a PCM bus interface. In some embodiments, the audio module 670 may also transmit audio signals to the wireless communication module 660 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 610 and the wireless communication module 660. For example: the processor 610 communicates with the bluetooth module in the wireless communication module 660 through the UART interface to implement the bluetooth function. In some embodiments, the audio module 670 may transmit the audio signal to the wireless communication module 660 through the UART interface, so as to realize the function of playing music through the bluetooth headset.
The MIPI interface may be used to connect the processor 610 with the display screen 690, the camera module 691, and other peripheral devices. The MIPI Interface includes a Camera Serial Interface (CSI), a display screen Serial Interface (DSI), and the like. In some embodiments, the processor 610 and the camera module 691 communicate via a CSI interface to implement the camera function of the electronic device 600. The processor 610 and the display screen 690 communicate via the DSI interface to implement the display function of the electronic device 600.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 610 with the camera module 691, the display screen 690, the wireless communication module 660, the audio module 670, the sensor module 680, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 630 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 630 may be used to connect a charger to charge the electronic device 600, or to transmit data between the electronic device 600 and a peripheral device. It can also be used to connect earphones and play audio through them, and to connect other electronic devices, such as AR devices.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 600. In other embodiments of the present application, the electronic device 600 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 640 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 640 may receive charging input from a wired charger via the USB interface 630. In some wireless charging embodiments, the charging management module 640 may receive a wireless charging input through a wireless charging coil of the electronic device 600. The charging management module 640 may also supply power to the electronic device through the power management module 641 while charging the battery 642.
The power management module 641 is configured to connect the battery 642, the charging management module 640 and the processor 610. The power management module 641 receives the input from the battery 642 and/or the charging management module 640, and supplies power to the processor 610, the internal memory 621, the display screen 690, the camera module 691, the wireless communication module 660, and the like. The power management module 641 may also be configured to monitor battery capacity, battery cycle count, battery state of health (leakage, impedance), and other parameters. In some other embodiments, the power management module 641 may be disposed in the processor 610. In other embodiments, the power management module 641 and the charging management module 640 may be disposed in the same device.
The wireless communication function of the electronic device 600 may be implemented by the antenna 1, the antenna 2, the mobile communication module 650, the wireless communication module 660, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 600 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 650 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 600. The mobile communication module 650 may include at least one filter, a switch, a power Amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 650 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the filtered electromagnetic wave to the modem processor for demodulation. The mobile communication module 650 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 650 may be disposed in the processor 610. In some embodiments, at least some of the functional blocks of the mobile communication module 650 may be disposed in the same device as at least some of the blocks of the processor 610.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 671, the receiver 672, etc.) or displays an image or video through the display screen 690. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 610, and may be located in the same device as the mobile communication module 650 or other functional modules.
The Wireless Communication module 660 may provide a solution for Wireless Communication applied to the electronic device 600, including Wireless Local Area Networks (WLANs) (e.g., Wireless Fidelity (Wi-Fi) network), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 660 may be one or more devices integrating at least one communication processing module. The wireless communication module 660 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on electromagnetic wave signals, and transmits the processed signals to the processor 610. The wireless communication module 660 may also receive a signal to be transmitted from the processor 610, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the electronic device 600 is coupled to the mobile communication module 650 and the antenna 2 is coupled to the wireless communication module 660, so that the electronic device 600 may communicate with networks and other devices via wireless communication technologies. The wireless communication technologies may include Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Time-Division Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include the Global Positioning System (GPS), the Global Navigation Satellite System (GLONASS), the BeiDou Navigation Satellite System (BDS), the Quasi-Zenith Satellite System (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 600 implements display functions via the GPU, the display screen 690, the application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 690 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 610 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 690 is used to display images, video, and the like. The display screen 690 includes a display panel. The display panel may be a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), an Active-Matrix Organic Light-Emitting Diode (AMOLED), a Flexible Light-Emitting Diode (FLED), a Mini LED, a Micro LED, a Micro OLED, a Quantum-dot Light-Emitting Diode (QLED), or the like. In some embodiments, the electronic device 600 may include 1 or N display screens 690, N being a positive integer greater than 1.
The electronic device 600 may implement a shooting function through the ISP, the camera module 691, the video codec, the GPU, the display screen 690, the application processor, and the like.
The ISP is used to process the data fed back by the camera module 691. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera module 691.
The camera module 691 is for capturing still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a Complementary Metal-Oxide-Semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 600 may include 1 or N camera modules 691, where N is a positive integer greater than 1, and if the electronic device 600 includes N cameras, one of the N cameras is the main camera.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 600 selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 600 may support one or more video codecs. In this way, the electronic device 600 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a Neural-Network (NN) computing processor, which processes input information quickly by using a biological Neural Network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 600 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 622 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 600. The external memory card communicates with the processor 610 through the external memory interface 622 to implement data storage functions. For example, files such as music, video, etc. are saved in an external memory card.
Internal memory 621 may be used to store computer-executable program code, including instructions. The internal memory 621 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (e.g., audio data, phone book, etc.) created during use of the electronic device 600, and the like. In addition, the internal memory 621 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk Storage device, a Flash memory device, a Universal Flash Storage (UFS), and the like. The processor 610 executes various functional applications of the electronic device 600 and data processing by executing instructions stored in the internal memory 621 and/or instructions stored in a memory provided in the processor.
The electronic device 600 may implement audio functions through the audio module 670, the speaker 671, the receiver 672, the microphone 673, the headset interface 674, an application processor, and the like. Such as music playing, recording, etc.
The audio module 670 is used to convert digital audio information into an analog audio signal output and also used to convert an analog audio input into a digital audio signal. The audio module 670 may also be used to encode and decode audio signals. In some embodiments, the audio module 670 may be disposed in the processor 610, or some functional modules of the audio module 670 may be disposed in the processor 610.
The speaker 671, also called "horn", is used to convert the electrical audio signals into sound signals. The electronic apparatus 600 can listen to music through the speaker 671 or listen to a hands-free call.
A receiver 672, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic device 600 receives a call or voice information, it can receive voice by placing the receiver 672 close to the ear.
A microphone 673, also known as a "microphone", is used to convert acoustic signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal into the microphone 673 by making a sound near the microphone 673 through the mouth of the user. The electronic device 600 may be provided with at least one microphone 673. In other embodiments, the electronic device 600 may be provided with two microphones 673 to implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 600 may further include three, four, or more microphones 673 to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headset interface 674 is used to connect wired headsets. The headset interface 674 may be the USB interface 630, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The depth sensor 6801 is used to obtain depth information of the scene. In some embodiments, the depth sensor may be disposed in the camera module 691.
The pressure sensor 6802 is used to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor 6802 may be disposed on the display screen 690. There are many types of pressure sensors 6802, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material; when a force acts on the pressure sensor 6802, the capacitance between the electrodes changes, and the electronic device 600 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 690, the electronic device 600 detects the intensity of the touch operation via the pressure sensor 6802, and can also calculate the touch position from its detection signal. In some embodiments, touch operations applied to the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the same icon, an instruction to create a new short message is executed.
The gyro sensor 6803 may be used to determine the motion pose of the electronic device 600. In some embodiments, the angular velocity of the electronic device 600 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 6803. The gyro sensor 6803 can be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 6803 detects the shake angle of the electronic device 600, calculates the distance the lens module needs to compensate for according to that angle, and lets the lens counteract the shake through a reverse movement, thereby achieving anti-shake. The gyro sensor 6803 can also be used for navigation and motion-sensing gaming scenarios.
The air pressure sensor 6804 is used to measure air pressure. In some embodiments, the electronic device 600 calculates altitude from the barometric pressure values measured by the air pressure sensor 6804 to aid positioning and navigation.
The magnetic sensor 6805 comprises a Hall sensor. The electronic device 600 may use the magnetic sensor 6805 to detect the opening and closing of a flip holster. In some embodiments, when the electronic device 600 is a flip phone, it can detect the opening and closing of the flip cover via the magnetic sensor 6805, and features such as automatic unlocking upon flip-open can then be configured according to the detected open or closed state of the holster or flip cover.
The acceleration sensor 6806 can detect the magnitude of acceleration of the electronic device 600 in various directions (typically along three axes). When the electronic device 600 is stationary, the magnitude and direction of gravity may be detected. The sensor can also be used to recognize the posture of the electronic device, for applications such as landscape/portrait switching and pedometers.
The distance sensor 6807 is used to measure distance. The electronic device 600 may measure distance by infrared or laser. In some embodiments, when photographing a scene, the electronic device 600 may use the distance sensor 6807 to measure distance to achieve fast focusing.
The proximity light sensor 6808 may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared LED. The electronic device 600 emits infrared light outward through the LED and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the electronic device 600 can determine that an object is nearby; when insufficient reflected light is detected, it can determine that no object is nearby. The electronic device 600 can use the proximity light sensor 6808 to detect that the user is holding the device close to the ear during a call, so as to automatically turn off the screen and save power. The proximity light sensor 6808 can also be used in holster mode and pocket mode to automatically lock and unlock the screen.
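The proximity decision above amounts to a simple threshold test on the reflected-light level; a minimal sketch, assuming a hypothetical threshold and unit (neither is specified in the text):

```python
# Sketch of the proximity decision described for proximity light sensor 6808.
# The reflected-light threshold and its unit are illustrative assumptions.
REFLECTED_LIGHT_THRESHOLD = 100  # hypothetical ADC counts

def object_nearby(reflected_light: int) -> bool:
    """Sufficient reflected infrared light implies an object is close."""
    return reflected_light >= REFLECTED_LIGHT_THRESHOLD

def should_turn_off_screen(reflected_light: int, in_call: bool) -> bool:
    """Turn off the display when the user holds the device to the ear in a call."""
    return in_call and object_nearby(reflected_light)
```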
The fingerprint sensor 6809 is used to collect fingerprints. The electronic device 600 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint photographing, fingerprint-based call answering, and the like.
The temperature sensor 6810 is used to detect temperature. In some embodiments, the electronic device 600 implements a temperature handling strategy using the temperature detected by the temperature sensor 6810. For example, when the temperature reported by the temperature sensor 6810 exceeds a threshold, the electronic device 600 reduces the performance of a processor located near the sensor, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 600 heats the battery 642 when the temperature falls below another threshold, to prevent a low temperature from causing an abnormal shutdown. In still other embodiments, when the temperature falls below a further threshold, the electronic device 600 boosts the output voltage of the battery 642 to avoid an abnormal shutdown caused by low temperature.
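The three-threshold strategy above can be sketched as a small decision function. All three temperature values below are illustrative assumptions; the patent only describes the ordering of thresholds, not their magnitudes.

```python
# Sketch of the temperature handling strategy for temperature sensor 6810.
# Threshold values are placeholders, not taken from the patent.
HIGH_TEMP_C = 45.0       # above this: throttle the nearby processor
LOW_TEMP_C = 0.0         # below this: heat the battery
VERY_LOW_TEMP_C = -10.0  # below this: also boost the battery output voltage

def thermal_actions(temp_c: float) -> list:
    """Return the list of protective actions for a given temperature."""
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("reduce_processor_performance")
    if temp_c < LOW_TEMP_C:
        actions.append("heat_battery")
    if temp_c < VERY_LOW_TEMP_C:
        actions.append("boost_battery_voltage")
    return actions
```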
The touch sensor 6811 is also referred to as a "touch device". The touch sensor 6811 may be disposed on the display screen 690; together they form a touch screen, also called a "touch panel". The touch sensor 6811 detects touch operations applied on or near it and can pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display screen 690. In other embodiments, the touch sensor 6811 may be disposed on the surface of the electronic device 600 at a location different from that of the display screen 690.
The ambient light sensor 6812 is used to sense the ambient light level. The electronic device 600 may adaptively adjust the brightness of the display screen 690 based on the perceived ambient light level. The ambient light sensor 6812 can also be used to automatically adjust the white balance when taking a picture, and can cooperate with the proximity light sensor 6808 to detect whether the electronic device 600 is in a pocket, so as to prevent accidental touches.
The bone conduction sensor 6813 can acquire vibration signals. In some embodiments, the bone conduction sensor 6813 can acquire the vibration signal of a vibrating bone of the human vocal part. The bone conduction sensor 6813 may also contact the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor 6813 may be disposed in a headset, integrated into a bone conduction headset. The audio module 670 may parse out a voice signal from the bone vibration signal acquired by the bone conduction sensor 6813 to implement a voice function, and the application processor may parse heart rate information from the blood pressure pulsation signal acquired by the bone conduction sensor 6813 to implement a heart rate detection function.
The keys 694 include a power key, volume keys, and the like. The keys 694 may be mechanical keys or touch keys. The electronic device 600 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 600.
The motor 693 may generate vibration cues. The motor 693 can be used for incoming call vibration prompts as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects, and touch operations applied to different areas of the display screen 690 may likewise correspond to different vibration feedback effects. Different application scenarios (such as time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects, and the touch vibration feedback effect may additionally support customization.
The indicator 692 may be an indicator light that can be used to indicate the charging status and battery level changes, or to indicate messages, missed calls, notifications, and the like.
The SIM card interface 695 is used to connect a SIM card. A SIM card can be inserted into or pulled out of the SIM card interface 695 to attach it to or detach it from the electronic device 600. The electronic device 600 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 695 can support a Nano SIM card, a Micro SIM card, a standard SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 695 simultaneously, and the types of these cards may be the same or different. The SIM card interface 695 may also be compatible with different types of SIM cards as well as with an external memory card. The electronic device 600 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 600 employs an eSIM, i.e., an embedded SIM card; the eSIM card can be embedded in the electronic device 600 and cannot be separated from it.
In particular, according to an embodiment of the present invention, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the invention include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program containing program code for performing the method illustrated in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section, and/or installed from a removable medium. When executed by a central processing unit (CPU), the computer program performs the various functions defined in the system of the present application.
It should be noted that the computer readable medium shown in the embodiment of the present invention may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present invention may be implemented by software or by hardware, and the described units may also be disposed in a processor. The names of these units do not in any way limit the units themselves.
It should be noted that, as another aspect, the present application also provides a computer-readable medium, which may be included in the electronic device described in the above embodiments, or may exist separately without being assembled into the electronic device. The computer-readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method described in the embodiments above. For example, the electronic device may implement the steps shown in fig. 1.
Furthermore, the above-described figures are merely schematic illustrations of the processes involved in methods according to exemplary embodiments of the invention and are not intended to be limiting. It will be readily understood that these processes do not indicate or limit a chronological order, and that they may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (13)

1. A display control method, comprising:
acquiring a page to be converted in response to a target control operation;
identifying the page to be converted to acquire an image to be drawn and/or a region to be drawn in the page to be converted;
sampling the image to be drawn and/or the region to be drawn to obtain a target number of pixel points, and calculating the proportion of each type of pixel point;
calling an image drawing strategy, and determining a color transformation strategy of the image to be drawn based on the format of the image to be drawn and the proportion of each type of pixel point; and/or
calling a region drawing strategy, and determining a color transformation strategy of the region to be drawn based on the size of the region to be drawn and the proportion of each type of pixel point.
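The sampling and pixel-classification steps of claim 1 can be sketched roughly as below. This is a non-authoritative illustration: the RGBA thresholds separating white, black, and gray pixels and the random-sampling scheme are assumptions, since the claim does not specify them.

```python
import random

def sample_pixels(pixels, target_count, seed=0):
    """Sample a target number of pixels from an image or region to be drawn."""
    rng = random.Random(seed)  # fixed seed for reproducibility in this sketch
    if len(pixels) <= target_count:
        return list(pixels)
    return rng.sample(list(pixels), target_count)

def classify(rgba):
    """Classify one RGBA pixel as transparent, white, black, gray, or color."""
    r, g, b, a = rgba
    if a == 0:
        return "transparent"
    if r == g == b:    # achromatic pixel; split by brightness (thresholds assumed)
        if r >= 230:
            return "white"
        if r <= 25:
            return "black"
        return "gray"
    return "color"
```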
2. The method of claim 1, wherein the pixel points comprise any one or more of: color pixels, transparent pixels, white pixels, black pixels, and gray pixels.
3. The method according to claim 2, wherein calling the image drawing strategy and determining the color transformation strategy of the image to be drawn based on the format of the image to be drawn and the proportion of each type of pixel point comprises:
when the proportion of color pixels in the image to be drawn is greater than a first threshold, performing an ignoring operation on the image to be drawn.
4. The method according to claim 2, wherein calling the image drawing strategy and determining the color transformation strategy of the image to be drawn based on the format of the image to be drawn and the proportion of each type of pixel point comprises:
when the image to be drawn is identified as being in a 9-patch format and the proportion of white pixels is greater than a second threshold, performing a white-background color inversion operation on the image to be drawn; or
when the image to be drawn is identified as being in a 9-patch format, the sum of the proportion of white pixels and the proportion of black pixels is greater than a third threshold, and the proportion of white pixels is greater than the proportion of black pixels, performing a white-background color inversion operation on the image to be drawn.
5. The method according to claim 3 or 4, wherein the image to be drawn is displayed superimposed on the area to be drawn.
6. The method according to claim 2, wherein calling the region drawing strategy and determining the color transformation strategy of the area to be drawn based on the size of the area to be drawn and the proportion of each type of pixel point comprises:
when it is identified that the size of the area to be drawn is smaller than a target size and the proportion of white pixels is greater than a fourth threshold, performing a white-background color inversion operation on the area to be drawn; or
when it is identified that the size of the area to be drawn is smaller than the target size, the proportion of transparent pixels is greater than a fifth threshold, and the proportion of white pixels is greater than the fifth threshold, performing a white-foreground color inversion operation on the area to be drawn; or
when it is identified that the size of the area to be drawn is smaller than the target size, the proportion of transparent pixels is greater than the fifth threshold, and the proportion of black pixels is greater than the fifth threshold, performing a black-foreground color inversion operation on the area to be drawn; or
when it is identified that the size of the area to be drawn is smaller than the target size and the proportion of gray pixels exceeds the fifth threshold, performing a gray-foreground color inversion operation on the area to be drawn.
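The small-region branches of claim 6 form a cascade of proportion tests. A minimal sketch, assuming placeholder threshold values and hypothetical operation names (the patent fixes neither):

```python
# Sketch of the small-region branch of the region drawing strategy (claim 6).
# fourth/fifth threshold values are placeholders for illustration only.
def small_region_transform(props, fourth_threshold=0.6, fifth_threshold=0.3):
    """props maps pixel type -> proportion, for a region smaller than the target size."""
    if props.get("white", 0) > fourth_threshold:
        return "white_background_inversion"
    transparent = props.get("transparent", 0)
    if transparent > fifth_threshold and props.get("white", 0) > fifth_threshold:
        return "white_foreground_inversion"
    if transparent > fifth_threshold and props.get("black", 0) > fifth_threshold:
        return "black_foreground_inversion"
    if props.get("gray", 0) > fifth_threshold:
        return "gray_foreground_inversion"
    return "no_change"
```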
7. The method according to claim 2, wherein calling the region drawing strategy and determining the color transformation strategy of the area to be drawn based on the size of the area to be drawn and the proportion of each type of pixel point comprises:
when it is identified that the size of the area to be drawn is larger than the target size, the proportion of transparent pixels is in a first transparent pixel interval, and the proportion of white pixels is greater than the second threshold, performing a white-background color inversion operation on the area to be drawn; or
when it is identified that the size of the area to be drawn is larger than the target size, the proportion of transparent pixels is in the first transparent pixel interval, and the proportion of black pixels is greater than the second threshold, performing a black-background color inversion operation on the area to be drawn; or
when it is identified that the size of the area to be drawn is larger than the target size, the proportion of transparent pixels is in a second transparent pixel interval, and the proportion of white pixels is greater than the second threshold, performing a white-foreground color inversion operation on the area to be drawn; or
when it is identified that the size of the area to be drawn is larger than the target size, the proportion of transparent pixels is in the second transparent pixel interval, and the proportion of black pixels is greater than the second threshold, performing a black-foreground color inversion operation on the area to be drawn; or
when it is identified that the size of the area to be drawn is larger than the target size, the proportion of transparent pixels is in a third transparent pixel interval, and the proportion of white pixels is greater than a third threshold, performing a white-foreground color inversion operation on the area to be drawn; or
when it is identified that the size of the area to be drawn is larger than the target size, the proportion of transparent pixels is in the third transparent pixel interval, and the proportion of black pixels is greater than the third threshold, performing a black-foreground color inversion operation on the area to be drawn.
8. The method according to claim 6 or 7, wherein text to be drawn is displayed superimposed on the area to be drawn;
the method further comprises: configuring the text to be drawn to a target color according to the color of the area to be drawn after the color inversion operation.
9. The method according to claim 1 or 2, wherein the proportion of each type of pixel point is calculated in the following manner:
the color pixel percentage is the total number of color pixels/(the total number of pixel points-the total number of transparent pixels);
the transparent pixel percentage is equal to the total number of transparent pixels/the total number of pixel points;
white pixel percentage is the total number of white pixels/(total number of white pixels + total number of black pixels + total number of gray pixels);
black pixel percentage-total black pixels/(total white pixels + total black pixels + total gray pixels);
gray pixel percentage is the total number of gray pixels/(total number of white pixels + total number of black pixels + total number of gray pixels).
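The five formulas of claim 9 translate directly into code; a minimal sketch (the dictionary keys are illustrative names, and the guards against empty denominators are an added assumption):

```python
def pixel_proportions(counts):
    """Compute per-type pixel proportions as defined in claim 9.

    counts holds the totals of color, transparent, white, black, and gray
    pixels from the sampling step; together they make up all pixel points.
    """
    total = sum(counts.values())              # total number of pixel points
    opaque = total - counts["transparent"]    # denominator for color pixels
    wbg = counts["white"] + counts["black"] + counts["gray"]
    return {
        "color": counts["color"] / opaque if opaque else 0.0,
        "transparent": counts["transparent"] / total if total else 0.0,
        "white": counts["white"] / wbg if wbg else 0.0,
        "black": counts["black"] / wbg if wbg else 0.0,
        "gray": counts["gray"] / wbg if wbg else 0.0,
    }
```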
10. The method of claim 1, further comprising:
acquiring brightness and/or contrast information of the color-transformed image to be drawn and/or region to be drawn together with current environment information, and performing an optimization operation on the color-transformed image according to the current environment information.
11. A display control apparatus, characterized by comprising:
the page to be converted acquiring module is used for responding to the target control operation and acquiring a page to be converted;
the page to be converted identification module is used for identifying the page to be converted so as to acquire an image to be drawn and/or a region to be drawn in the page to be converted;
the pixel point sampling module is used for sampling the image to be drawn and/or the area to be drawn to obtain pixel points with target quantity and calculating the proportion of each type of pixel points;
the image drawing module is used for calling an image drawing strategy and determining a color transformation strategy of the image to be drawn based on the format of the image to be drawn and the proportion of each type of pixel points; and/or
And the region drawing module is used for calling a region drawing strategy and determining the color transformation strategy of the region to be drawn based on the size of the region to be drawn and the proportion of each type of pixel points.
12. A computer-readable medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the display control method according to any one of claims 1 to 10.
13. A terminal device, comprising:
one or more processors;
a storage device for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the display control method of any one of claims 1 to 10.
CN202010274861.4A 2020-04-09 2020-04-09 Display control method and device, computer readable medium and terminal equipment Active CN111552451B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010274861.4A CN111552451B (en) 2020-04-09 2020-04-09 Display control method and device, computer readable medium and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010274861.4A CN111552451B (en) 2020-04-09 2020-04-09 Display control method and device, computer readable medium and terminal equipment

Publications (2)

Publication Number Publication Date
CN111552451A true CN111552451A (en) 2020-08-18
CN111552451B CN111552451B (en) 2023-08-22

Family

ID=71998590

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010274861.4A Active CN111552451B (en) 2020-04-09 2020-04-09 Display control method and device, computer readable medium and terminal equipment

Country Status (1)

Country Link
CN (1) CN111552451B (en)


Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080198175A1 (en) * 2007-02-20 2008-08-21 Microsoft Corporation Drag-And-Drop Pasting For Seamless Image Composition
JP2009010983A (en) * 1996-02-26 2009-01-15 Richard A Holub Method of associating different image display devices, color error detection method, and color measurement device
CN101625618A (en) * 2008-07-09 2010-01-13 纬创资通股份有限公司 Screen display area segmenting method, screen display area segmenting system and computer program product
WO2015039567A1 (en) * 2013-09-17 2015-03-26 Tencent Technology (Shenzhen) Company Limited Method and user apparatus for window coloring
CN104657417A (en) * 2014-12-17 2015-05-27 东软集团股份有限公司 Thermodynamic diagram processing method and thermodynamic diagram processing system
CN104657465A (en) * 2015-02-10 2015-05-27 腾讯科技(深圳)有限公司 Webpage display control method and device
CN106603838A (en) * 2016-12-06 2017-04-26 深圳市金立通信设备有限公司 Image processing method and terminal
US20170372479A1 (en) * 2016-06-23 2017-12-28 Intel Corporation Segmentation of objects in videos using color and depth information
CN108198146A (en) * 2017-12-29 2018-06-22 努比亚技术有限公司 A kind of noise-reduction method, equipment and computer readable storage medium
CN108877734A (en) * 2018-06-14 2018-11-23 Oppo广东移动通信有限公司 The color temperature adjusting method and Related product of touching display screen
US20190156526A1 (en) * 2016-12-28 2019-05-23 Shanghai United Imaging Healthcare Co., Ltd. Image color adjustment method and system
US20190371008A1 (en) * 2018-06-01 2019-12-05 Adobe Inc. Generating enhanced digital images by selectively transforming raster images to vector drawing segments


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111813492A (en) * 2020-08-19 2020-10-23 深圳市欢太科技有限公司 Display method and device of fast application, electronic equipment and storage medium
CN111813492B (en) * 2020-08-19 2024-05-03 深圳市欢太科技有限公司 Display method and device for quick application, electronic equipment and storage medium
WO2022042232A1 (en) * 2020-08-27 2022-03-03 华为技术有限公司 Dark mode generation method for user interface, electronic device, and storage medium
CN114449231A (en) * 2020-10-31 2022-05-06 荣耀终端有限公司 Image conversion method and device
CN114449231B (en) * 2020-10-31 2023-11-24 荣耀终端有限公司 Image conversion method and device
CN112650434A (en) * 2020-12-29 2021-04-13 微医云(杭州)控股有限公司 Scale generation method and device, electronic equipment and storage medium
CN114047447A (en) * 2021-11-10 2022-02-15 京东方科技集团股份有限公司 Electric quantity prompting method, electronic label, system and storage medium
WO2023220929A1 (en) * 2022-05-17 2023-11-23 北京小米移动软件有限公司 Interface display method and apparatus, terminal, and storage medium

Also Published As

Publication number Publication date
CN111552451B (en) 2023-08-22

Similar Documents

Publication Publication Date Title
CN111552451B (en) Display control method and device, computer readable medium and terminal equipment
CN113810600B (en) Terminal image processing method and device and terminal equipment
CN113810601B (en) Terminal image processing method and device and terminal equipment
CN111132234A (en) Data transmission method and corresponding terminal
CN111182140B (en) Motor control method and device, computer readable medium and terminal equipment
CN111770282B (en) Image processing method and device, computer readable medium and terminal equipment
WO2021169515A1 (en) Method for data exchange between devices, and related device
CN111930335A (en) Sound adjusting method and device, computer readable medium and terminal equipment
CN114880251A (en) Access method and access device of storage unit and terminal equipment
CN112188094B (en) Image processing method and device, computer readable medium and terminal equipment
CN113467735A (en) Image adjusting method, electronic device and storage medium
CN112037157A (en) Data processing method and device, computer readable medium and electronic equipment
CN111935705A (en) Data service management method and device, computer readable medium and terminal equipment
CN114005016A (en) Image processing method, electronic equipment, image processing system and chip system
CN116055859B (en) Image processing method and electronic device
CN113674258B (en) Image processing method and related equipment
CN114221402A (en) Charging method and device of terminal equipment and terminal equipment
CN112527220B (en) Electronic equipment display method and electronic equipment
CN115706869A (en) Terminal image processing method and device and terminal equipment
CN113467747A (en) Volume adjustment method, electronic device, storage medium, and computer program product
CN114064381A (en) USB interface water inflow detection method and electronic equipment
CN113391735A (en) Display form adjusting method and device, electronic equipment and storage medium
CN114466238A (en) Frame demultiplexing method, electronic device and storage medium
CN111432156A (en) Image processing method and device, computer readable medium and terminal equipment
CN111586236A (en) Electronic equipment marking method and device, computer readable medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant