CN117672110A - Display control system and method for tablet personal computer - Google Patents

Display control system and method for tablet personal computer

Info

Publication number
CN117672110A
Authority
CN
China
Prior art keywords
data
scene
compensation
determining
tablet computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311724601.2A
Other languages
Chinese (zh)
Other versions
CN117672110B (en)
Inventor
王亦方
刘根
张中明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oudu Lifang Technology Co ltd
Original Assignee
Guangdong Oudu Lifang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oudu Lifang Technology Co ltd filed Critical Guangdong Oudu Lifang Technology Co ltd
Priority to CN202311724601.2A priority Critical patent/CN117672110B/en
Publication of CN117672110A publication Critical patent/CN117672110A/en
Application granted granted Critical
Publication of CN117672110B publication Critical patent/CN117672110B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/30 Monitoring
    • G06F11/3003 Monitoring arrangements specially adapted to the computing system or computing system component being monitored
    • G06F11/302 Monitoring arrangements specially adapted to the computing system or computing system component being monitored where the computing system component is a software system
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/30 Monitoring
    • G06F11/34 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3438 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment monitoring of user actions
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2201/00 Indexing scheme relating to error detection, to error correction, and to monitoring
    • G06F2201/865 Monitoring of software
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/0613 The adjustment depending on the type of the information to be displayed
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/0626 Adjustment of display parameters for control of overall brightness
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/066 Adjustment of display parameters for control of contrast

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of this specification disclose a tablet computer display control system and method. The system includes: an environment data acquisition module configured to acquire environment data; a motion data acquisition module configured to acquire motion data of the tablet computer; a storage module configured to store operation data and usage data of a user; and a processing module configured to: determine scene data based on at least one of the motion data and the environment data; determine a display mode of the tablet computer based on the scene data and the usage data; determine a compensation scene based on the scene data; determine protection parameters based on the compensation scene; and control the tablet computer to display a picture based on the protection parameters.

Description

Display control system and method for tablet personal computer
Technical Field
The present disclosure relates to the field of device display technologies, and in particular, to a display control system and method for a tablet computer.
Background
Tablet computers are now in widespread use, and prolonged use can harm vision. CN109995942B provides an eye protection method and system for an intelligent terminal that use hardware such as the terminal's distance sensor, light sensor, and camera to comprehensively identify the usage environment and the user's state, and use eye protection software to identify the content being viewed, so that the eye protection threshold is adjusted intelligently, improving the practicality and effectiveness of the method and its eye protection effect. However, that method does not address how to adjust the terminal device in a complex environment, for example, how the screen should be adjusted when the user operates the device while riding a vehicle, walking, or in other complex scenes.
Therefore, this specification provides a display control system and method for a tablet computer, so that the screen display adapts itself to complex environments and the user's eyes are better protected.
Disclosure of Invention
One of the embodiments of the present disclosure provides a tablet computer display control system, comprising: an environment data acquisition module configured to acquire environment data; a motion data acquisition module configured to acquire motion data of the tablet computer; a storage module configured to store operation data and usage data of a user; and a processing module configured to: determine scene data based on at least one of the motion data and the environment data; determine a display mode of the tablet computer based on the scene data and the usage data; determine a compensation scene based on the scene data; determine protection parameters based on the compensation scene; and control the tablet computer to display a picture based on the protection parameters.
One of the embodiments of the present disclosure provides a display control method for a tablet computer, executed by a processor, comprising: determining scene data based on at least one of environment data and motion data of the tablet computer; determining a display mode based on the scene data and usage data of a user; determining a compensation scene based on the scene data; determining protection parameters based on the compensation scene; and controlling the tablet computer to display a picture based on the protection parameters.
One of the embodiments of the present disclosure provides a display control device for a tablet computer, comprising a processor configured to execute the foregoing display control method for the tablet computer.
One of the embodiments of the present disclosure provides a computer-readable storage medium storing computer instructions that, when read by a computer, cause the computer to perform the foregoing tablet computer display control method.
Drawings
The present specification will be further elucidated by way of example embodiments, which will be described in detail by means of the accompanying drawings. The embodiments are not limiting, in which like numerals represent like structures, wherein:
FIG. 1 is a block diagram of a tablet computer display control system according to some embodiments of the present disclosure;
FIG. 2 is an exemplary flowchart of a tablet computer display control method according to some embodiments of the present description;
FIG. 3 is a schematic diagram of a method of determining a display mode according to some embodiments of the present disclosure;
FIG. 4 is a schematic illustration of determining display parameters according to some embodiments of the present description;
FIG. 5 is a schematic diagram illustrating determining compensation parameters according to some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings that are required to be used in the description of the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some examples or embodiments of the present specification, and it is possible for those of ordinary skill in the art to apply the present specification to other similar situations according to the drawings without inventive effort. Unless otherwise apparent from the context of the language or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It will be appreciated that "system," "apparatus," "unit" and/or "module" as used herein is one method for distinguishing between different components, elements, parts, portions or assemblies at different levels. However, if other words can achieve the same purpose, the words can be replaced by other expressions.
As used in this specification and the claims, the terms "a," "an," and/or "the" are not specific to the singular and may include the plural, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; they do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
A flowchart is used in this specification to describe the operations performed by the system according to the embodiments of this specification. It should be appreciated that the operations are not necessarily performed precisely in the order shown; rather, steps may be processed in reverse order or simultaneously, and other operations may be added to or removed from these processes.
In order to reduce the damage terminal equipment causes to eyesight, CN109995942B provides an eye protection method and system for an intelligent terminal that use hardware such as the terminal's distance sensor, light sensor, and camera to comprehensively identify the usage environment and the user's state, and use eye protection software to identify the content being viewed, so that the eye protection threshold is adjusted intelligently. This offers some protection for the eyes but does not address adjusting the device in a complex environment, for example, while walking or riding (a bus, a subway, etc.), where environmental factors change greatly. Therefore, the present disclosure proposes a display control system and method for a tablet computer, so that the screen display adapts itself to complex environments and the user's eyes are better protected.
Fig. 1 is a block diagram of a tablet computer display control system according to some embodiments of the present disclosure. As shown in fig. 1, the tablet computer display control system 100 may include an environmental data acquisition module 110, a motion data acquisition module 120, a storage module 130, and a processing module 140.
The environmental data collection module 110 may be used to obtain environmental data around the tablet computer.
The motion data acquisition module 120 may be used to acquire motion data of a tablet computer.
The storage module 130 may be used to store operation data and usage data of a user.
The processing module 140 may be configured to determine scene data based on at least one of the motion data and the environmental data; determining a display mode of the tablet computer based on the scene data and the usage data; determining a compensation scene based on the scene data; determining protection parameters based on the compensation scene; and controlling the tablet personal computer to display the picture based on the protection parameters.
In some embodiments, the processing module 140 may determine scene data based on the environmental data and the motion data; determine user characteristics based on the usage data; and determine the display mode based on the scene data and the user characteristics.
In some embodiments, the processing module 140 may be further configured to obtain the sequence of ambient light intensities in response to the compensation scene meeting a first condition, the first condition comprising the compensation scene being an ambient light compensation scene; based on the sequence of ambient light intensities, a basic display parameter is determined according to a preset period.
In some embodiments, the processing module 140 may be further configured to obtain facial image data of the user in response to the compensation scene meeting a second condition, the second condition including the compensation scene being a motion compensation scene; determine a usage distance feature based on the facial image data; determine estimated motion data based on the motion data and the scene data; and determine compensation parameters based on the estimated motion data and the usage distance feature.
See fig. 3 and its description for details on scene data and display modes; see figs. 4 and 5 for details on compensation scenes, basic display parameters, and compensation parameters.
It should be understood that the system shown in fig. 1 and its modules may be implemented in a variety of ways. It should be noted that the above description of the tablet computer display control system and the modules thereof is for convenience of description only, and is not intended to limit the present disclosure to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the principles of the system, various modules may be combined arbitrarily or a subsystem may be constructed in connection with other modules without departing from such principles. In some embodiments, the environmental data collection module, the motion data collection module, the storage module, and the processing module disclosed in fig. 1 may be different modules in one system, or may be one module to implement the functions of two or more modules described above. For example, each module may share one memory module, or each module may have a respective memory module. Such variations are within the scope of the present description.
Fig. 2 is an exemplary flowchart of a tablet computer display control method according to some embodiments of the present description. As shown in fig. 2, the process 200 includes the following steps. In some embodiments, the process 200 may be performed by a processor.
At step 210, scene data is determined based on at least one of the environmental data and the motion data.
The environmental data refers to data about the tablet computer's surroundings at the time; for example, it may include ambient light intensity, ambient sound data, and the like.
In some embodiments, the environmental data may be collected by the environmental data collection module 110. The environmental data acquisition module 110 may include at least one of an image acquisition device, a light sensor, and a sound sensor.
The motion data refers to data describing the tablet computer's movement, for example, its speed, direction, and acceleration. The tablet computer moves together with the user: for example, when the user walks while using it, the tablet rotates and/or shifts with the user's motion.
In some embodiments, the motion data may be collected by the motion data collection module 120. The motion data acquisition module 120 may include at least a vibration sensor.
The scene data refers to the specific type of scene the user is in when using the tablet computer, for example, whether the environment is indoor or outdoor, or another usage scene such as walking or riding in a vehicle.
In some embodiments, the scene data may further include hiking scene data and riding scene data. Hiking scene data is scene data of a user using the tablet computer while walking, for example, while walking in a park. Riding scene data is scene data of a user using the tablet computer while riding, for example, while riding the subway.
In some embodiments, the processing module may determine the scene data by querying a scene data reference table based on the environmental data and the motion data. The scene data reference table may be built from historical data of the user's tablet use and contains historical environment data, historical motion data, and the corresponding reference scene data.
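As a concrete illustration, the reference-table lookup above might look like the following Python sketch; the bucketing thresholds and table entries are invented for illustration and are not values from this specification.

```python
# A minimal sketch of the scene data reference-table lookup. Thresholds and
# table entries are hypothetical, not from the patent.

def bucket_light(lux: float) -> str:
    # Discretize ambient light intensity into coarse levels.
    return "dim" if lux < 100 else "indoor" if lux < 1000 else "outdoor"

def bucket_motion(speed_mps: float) -> str:
    # Discretize device speed into coarse motion levels.
    return "static" if speed_mps < 0.2 else "walking" if speed_mps < 3 else "riding"

# Reference table built from historical environment/motion data (hypothetical).
SCENE_TABLE = {
    ("indoor", "static"): "indoor-stationary",
    ("outdoor", "walking"): "hiking",
    ("dim", "riding"): "riding",
}

def determine_scene(lux: float, speed_mps: float) -> str:
    key = (bucket_light(lux), bucket_motion(speed_mps))
    return SCENE_TABLE.get(key, "default")

print(determine_scene(50.0, 4.2))  # -> "riding"
```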
In some embodiments, the processing module may determine the scene data through a machine learning model. For example, the scene data may be determined by a scene determination layer of a first determination model based on the environment data and the motion data, wherein the first determination model is a machine learning model.
For more on the first determination model and its scene determination layer see the relevant description in fig. 3.
Step 220, determining a display mode based on the scene data and the usage data of the user.
The usage data refers to data about the user's use of the tablet computer. The usage data may include the software and functions currently used by the user as well as historical display data. Historical display data is the adjustment data of display parameters when the user used different software in the past, and includes at least one of a historical display mode, historical screen brightness, and historical screen contrast; it may also include other historical parameters, determined according to the actual situation.
The display mode refers to the manner in which the content displayed on the screen is browsed. Display modes may include page-flip/scroll modes, manual/automatic modes, and the like. In an automatic mode, the display mode may also include values of parameters such as the page-turning speed and the scrolling speed.
In some embodiments, the processing module may determine the display mode by querying a display mode reference table based on the scene data and the usage data. The display mode reference table may be preset based on historical experience and includes historical usage data, historical scene data, and the reference display modes corresponding to them.
In some embodiments, the processing module may determine the display mode through a machine learning model. For example, the display mode may be determined by a mode determination layer of the first determination model based on the scene data and the usage data, where the first determination model is a machine learning model.
For more details on the first determination model and its mode determination layer, see the relevant description in fig. 3.
At step 230, a compensation scene is determined based on the scene data.
While the user is using the tablet computer, the requirements on screen parameters differ between scenes. For example, when the light is dim or the screen shakes, the screen needs to be adjusted differently so as to reduce the influence of the user's scene on viewing the tablet. The compensation scene is a parameter reflecting the user's need to adjust the screen parameters.
In some embodiments, the compensation scene may include an ambient light compensation scene and a motion compensation scene. An ambient light compensation scene refers to a scene in which the tablet computer is compensated because the ambient light changes greatly. A motion compensation scene refers to a scene in which the tablet computer is compensated because it is in motion.
In some embodiments, the compensation scene may also include a combined compensation scene, i.e., a scene that requires light compensation and motion compensation simultaneously.
In some embodiments, the processing module may determine the compensation scene by querying a compensation scene reference table based on the scene data. The compensation scene reference table records the correspondence between scene data and compensation scenes, and may be built from historical scene data and historical compensation scene data based on historical experience.
Step 240, determining a protection parameter based on the compensation scene, and controlling the tablet computer to display the picture based on the protection parameter.
The protection parameters are parameters for adjusting the display mode of the tablet personal computer to protect eyes of a user. In some embodiments, the protection parameters may include at least one of a base display parameter and a compensation parameter.
The basic display parameters are the display parameters of the tablet computer and may include brightness, contrast, and the like. For example, when a user is on an airplane, the light environment changes greatly, which tires or harms the user's eyes; the basic display parameters can then be used to adjust the screen so that the display stays within a range comfortable for the user's eyes.
The compensation parameter is a parameter that compensates for shaking of the tablet computer. For example, when a user rides in a car, starting and stopping shake the tablet and tire the user's eyes; the compensation parameters can compensate for this shaking so that the user's line of sight and the shaking screen stay nearly aligned or relatively static.
In some embodiments, the protection parameters may be determined based on preset rules. For example, the processing module may preset protection parameters for different compensation scenes and apply the corresponding protection parameters once the compensation scene is determined.
In some embodiments, in response to the compensation scene meeting the first condition, the processing module may obtain an ambient light intensity sequence under the current scene; based on the sequence of ambient light intensities, a basic display parameter is determined according to a preset period. For more details on determining basic display parameters see fig. 4 and related content.
In some embodiments, the processing module may obtain facial image data of the user in response to the compensation scene meeting the second condition; determine a usage distance feature based on the facial image data; determine estimated motion data based on the motion data and the scene data; and determine the compensation parameters according to the compensation period based on the estimated motion data and the usage distance feature. See fig. 5 and the related content for further details on determining compensation parameters.
In some embodiments, the processing module may control the tablet computer to display a screen based on the protection parameters.
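The overall flow of process 200 can be summarized in the runnable sketch below; every helper is a trivial stand-in for the richer logic of FIGS. 3-5, and all names and values are illustrative assumptions, not the method of this specification.

```python
# A runnable end-to-end sketch of process 200 with stubbed helpers.

def determine_scene_data(env: dict, motion: dict) -> str:      # step 210
    return "riding" if motion["speed"] > 3.0 else "indoor"

def determine_display_mode(scene: str, usage: dict) -> str:    # step 220
    return usage.get(scene, "manual-page-turn")

def determine_compensation_scene(scene: str) -> str:           # step 230
    return "motion" if scene == "riding" else "ambient_light"

def determine_protection_params(comp_scene: str) -> dict:      # step 240
    if comp_scene == "ambient_light":
        return {"brightness": 0.6, "contrast": 0.5}            # basic display parameters
    return {"v_shift": 4, "h_shift": 0}                        # compensation parameters

env = {"lux": 300.0}
motion = {"speed": 5.0}
usage = {"riding": "auto-scroll"}

scene = determine_scene_data(env, motion)
mode = determine_display_mode(scene, usage)
comp = determine_compensation_scene(scene)
params = determine_protection_params(comp)
print(scene, mode, comp, params)  # riding auto-scroll motion {'v_shift': 4, 'h_shift': 0}
```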
According to some embodiments of this specification, by determining the tablet computer's scene data, then the compensation scene, and then the protection parameters, the display parameters can be adjusted adaptively to the user's characteristics in different complex environments without manual interaction, so that the screen display changes and the user gets a better, more intelligent, and more convenient experience.
FIG. 3 is a schematic diagram of a method of determining a display mode according to some embodiments of the present description. As shown in fig. 3, the process of determining the display mode includes the following.
In some embodiments, the processing module determines scene data based on the environmental data and the motion data; determines user characteristics based on the usage data; and determines the display mode based on the scene data and the user characteristics.
The user characteristics refer to the user's display-mode preferences when using the tablet computer. For example, the user characteristics may include that the user habitually uses manual page turning with 80% screen brightness when stationary indoors.
In some embodiments, the user characteristics may be obtained by statistical analysis of the user's historical usage data, for example, by counting how often the user reads in each display mode under different circumstances. If the user used APP1 10 times in total, 8 times in manual page-turning mode and 2 times in manual scroll mode, the user preference feature may be [(APP1, manual page turning, 0.8), (APP1, manual scroll, 0.2)].
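A minimal sketch of this statistical analysis, reproducing the APP1 example above; the record format is an assumption for illustration.

```python
# Count how often each display mode was used per app, normalize to
# frequencies, and emit (app, mode, frequency) preference tuples.

from collections import Counter

history = [("APP1", "manual-page-turn")] * 8 + [("APP1", "manual-scroll")] * 2

counts = Counter(history)                       # per (app, mode) usage counts
totals = Counter(app for app, _ in history)     # per app totals

user_features = [
    (app, mode, round(n / totals[app], 2)) for (app, mode), n in counts.items()
]
print(user_features)  # [('APP1', 'manual-page-turn', 0.8), ('APP1', 'manual-scroll', 0.2)]
```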
In some embodiments, the processing module may determine the display mode 320 through a first determination model 310, the first determination model 310 being a machine learning model. The first determination model 310 may include a scene determination layer 311 and a mode determination layer 312.
In some embodiments, the scene determination layer 311 is used to determine scene data. The inputs of the scene determination layer 311 may include the environment data 340, the motion data sequence 350, and the pose data sequence 360, and the outputs may include the scene data 330.
The motion data sequence 350 is a sequence of the tablet's motion data over a period of time, for example, all motion data recorded during a user's ride. See fig. 2 and the related content for details on motion data.
The pose data describes the tablet computer's position and orientation in three-dimensional space, and may include position data and attitude data (e.g., inclination and tilt angle). It can be acquired by pose acquisition software installed on the tablet. The pose data sequence 360 is a sequence of the pose data over a period of time.
For more on the environment data 340 and the scene data 330, see the relevant content in fig. 2.
In some embodiments, the scene determination layer 311 may be trained separately, by gradient descent or other means, on a number of first training samples with first labels. A first training sample may include sample environment data, a sample motion data sequence, and a sample pose data sequence. The first label may be the real scene corresponding to the sample data and may be obtained by a technician labeling the samples.
In some embodiments, the input of the scene determination layer 311 also includes an ambient sound type, which is obtained from the ambient sound data.
The ambient sound type is the type of ambient noise around the tablet computer, such as human voices, car horns, or flowing water.
In some embodiments, the ambient sound type may be determined by an audio classification algorithm or a machine learning model. Audio classification algorithms may include the minimum distance method, decision trees, and the like.
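A minimal sketch of the minimum distance method mentioned above, assuming per-class centroid feature vectors; the two-dimensional features and centroid values are invented for illustration.

```python
# Minimum-distance audio classification: assign the sound to the class whose
# feature centroid is nearest in Euclidean distance.

import math

CENTROIDS = {                       # per-class mean feature vectors (hypothetical)
    "human voice":   [0.30, 0.60],
    "car horn":      [0.80, 0.20],
    "flowing water": [0.10, 0.90],
}

def classify_sound(features: list[float]) -> str:
    # Pick the class whose centroid is closest to the observed features.
    return min(CENTROIDS, key=lambda c: math.dist(features, CENTROIDS[c]))

print(classify_sound([0.75, 0.25]))  # -> "car horn"
```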
In some embodiments, when the input of the scene determination layer further includes an ambient sound type, the samples when training the scene determination layer further include a sample ambient sound type.
According to some embodiments of the present disclosure, adding the ambient sound type to the scene determination layer gives the model an additional informative factor and reduces the gap between its output and the true value, so that the output scene data is more realistic and the protection parameters can later be determined more accurately, making display adjustment more precise.
In some embodiments, the mode determination layer 312 is used to determine a display mode of the tablet. The inputs to the mode determination layer 312 may include scene data 330, user characteristics 380, usage data 370, and operational data 390, and the outputs may include display modes 320.
The operation data refers to the user's operations on the tablet. The operation data 390 may include the user's clicks, slides, inputs, and similar operations on the tablet.
For more on user features 380, usage data 370, and operational data 390, see the relevant description earlier in this specification.
In some embodiments, the mode determination layer 312 may be trained separately, by gradient descent or other means, on a number of second training samples with second labels. A second training sample may include sample scene data, sample user characteristics, and the corresponding sample usage data and sample operation data. The second label may be the display mode appropriate to the sample's conditions, obtained either from the display mode set by the user or from a technician's labeling.
In some embodiments of the present disclosure, introducing a machine learning model, trained on a large number of samples, makes it possible to determine the scene data and the display mode of the tablet computer more accurately, which in turn helps determine the protection parameters more accurately.
According to some embodiments of the present disclosure, bringing user characteristics into the determination of the display mode adds a personalized factor, so that in various environments the tablet computer presents a display mode closer to the user's existing habits.
FIG. 4 is a schematic diagram illustrating determining display parameters according to some embodiments of the present description.
In some embodiments, the protection parameters include a base display parameter 430. The basic display parameters 430 refer to basic parameters for controlling display effects in the tablet pc. For example, the basic display parameters may include screen brightness, contrast, refresh rate, resolution, and the like.
In some embodiments, the processing module 140 may obtain the sequence of ambient light intensities 420 in response to the compensation scene 410 meeting a first condition, the first condition comprising the compensation scene being an ambient light compensation scene; based on the sequence of ambient light intensities 420, a basic display parameter 430 is determined according to a preset period.
An ambient light intensity sequence is a sequence of light intensities, over a given time, produced by light sources at different positions in the current scene. In some embodiments, the sequence may be acquired by a sensor (e.g., a light sensor) in the tablet computer.
The first condition is a condition for judging how to determine the protection parameter. The processing module may determine the base display parameter based on the sequence of ambient light intensities when the compensation scene meets the first condition. In some embodiments, the first condition includes the compensation scene being an ambient light compensation scene.
An ambient light compensation scene refers to a scene in which the ambient light changes frequently, such as a train or car passing through tunnels or a subway starting and stopping periodically.
The preset period refers to a period in which the basic display parameters are determined. In some embodiments, the preset period may be preset according to the actual situation.
In some embodiments, the processing module may determine the basic display parameters according to the aforementioned preset period to determine the appropriate parameters according to the current scenario in which the user uses the tablet computer.
In some embodiments, the processing module adjusts the preset period in response to the environmental data meeting a preset environmental condition.
In some embodiments, the preset environmental condition may be that the change amplitude of the light intensity in the ambient light intensity sequence is greater than an amplitude threshold, or that the change frequency is greater than a frequency threshold. When the change amplitude or frequency is large, the preset period needs to be shortened to keep the picture from becoming hard to see or too dazzling.
In some embodiments, the initially preset period may be preset based on the scene data.
In some embodiments, the processing module may query a preset period reference table to determine how much to shorten the preset period, based on the change amplitude or change frequency of the ambient light intensity data. The table records the correspondence between reference change amplitudes and frequencies of the ambient light intensity data and reference shortening amounts of the preset period, and may be constructed from prior knowledge or historical data; for example, the larger the change amplitude of the ambient light intensity data, the larger the shortening amount of the preset period.
In some embodiments, in response to the preset period being less than a period threshold, the processing module updates the basic display parameters using the period threshold instead. Adjusting too frequently could itself tire the user's eyes, so when the preset period falls below the period threshold, the basic display parameters are updated at the period threshold. For example, if the period threshold is 5 s and the adjusted preset period would be 3 s, the basic display parameters are still updated every 5 s.
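A minimal sketch of this preset-period logic, shortening the period in proportion to the light change amplitude and clamping at the period threshold; the thresholds and the proportional rule are illustrative assumptions, not values from this specification.

```python
# Shorten the basic-display-parameter update period when ambient light varies
# strongly, but never go below the period threshold.

def update_period(base_period_s: float, light_seq: list[float],
                  amp_threshold: float = 200.0, period_floor_s: float = 5.0) -> float:
    amplitude = max(light_seq) - min(light_seq)
    period = base_period_s
    if amplitude > amp_threshold:           # preset environmental condition met
        period = base_period_s * amp_threshold / amplitude  # shorten proportionally
    return max(period, period_floor_s)      # clamp at the period threshold

print(update_period(10.0, [100.0, 900.0]))  # amplitude 800 -> 2.5 s, clamped to 5.0
```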
In some embodiments, the processing module may determine the basic display parameters in several ways. For example, it may look up a parameter comparison table according to the preset period based on the current ambient light intensity sequence. The parameter comparison table records the correspondence between reference ambient light intensity sequences, reference preset periods, and reference basic display parameters, and may be constructed from prior knowledge or historical data; for example, the stronger the ambient light in the sequence, the higher the brightness in the basic display parameters.
In some embodiments, the processing module may also construct a protection feature vector from the current display mode, the ambient light intensity sequence, and the current battery level, and determine the basic display parameters from the retrieval result of this vector in a vector database. The vector database contains a number of reference vectors, each associated with basic display parameters; the reference vectors are built from the historical display mode, historical ambient light intensity sequence, and historical battery level in the historical data. By computing the vector distance between the protection feature vector and each reference vector, the processing module can take the basic display parameters associated with the closest reference vector as the recommended basic display parameters.
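A minimal sketch of this vector retrieval, assuming a three-component protection feature vector (encoded display mode, mean ambient light, battery level) and Euclidean distance; in a real system the components would be normalized to comparable scales, and all stored values here are invented.

```python
# Nearest-neighbor retrieval of basic display parameters from a tiny
# in-memory "vector database".

import math

# (reference vector, basic display parameters) built from historical data.
VECTOR_DB = [
    ([1.0, 300.0, 0.80], {"brightness": 0.7, "contrast": 0.6}),
    ([0.0,  50.0, 0.20], {"brightness": 0.3, "contrast": 0.5}),
]

def recommend_params(query: list[float]) -> dict:
    _, params = min(VECTOR_DB, key=lambda row: math.dist(query, row[0]))
    return params

# query: display mode (encoded), mean ambient light, battery level
print(recommend_params([1.0, 280.0, 0.75]))  # -> {'brightness': 0.7, 'contrast': 0.6}
```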
In some embodiments of the present disclosure, determining the basic display parameters from the current ambient light intensity sequence, the current battery level, and the like allows them to better match the power budget of the usage scenario; in a scene such as hiking, where battery must be conserved, determining the basic display parameters from fewer inputs can also achieve some power saving.
In some embodiments, the processing module is further configured to: based on the current usage data of the user, the basic display parameters are adjusted.
In some embodiments, the processing module adjusts the base display parameter in response to the user's current usage time being greater than a usage time threshold. For example, brightness is adjusted to an appropriate level, display contrast is optimized, and the like.
In some embodiments, the adjustment amplitude may be determined in several ways, for example by preset rules based on the usage time and the current battery level, such as: the longer the usage time and the more battery remaining, the larger the adjustment amplitude.
In some embodiments, the adjustment amplitude may be determined by querying an adjustment amplitude reference table based on the usage time and the current battery level. The table records the correspondence between reference usage times, reference battery levels, and reference basic display parameters, and can be constructed from prior knowledge or historical data; for example, the longer the usage time and the more battery remaining, the more the basic display parameters should shift toward eye protection (slightly lower brightness, slightly lower contrast, higher refresh rate). The adjustment amplitude may be the difference between the reference basic display parameters found for the current usage time and battery level and the current basic display parameters.
In some embodiments of the present disclosure, the adjustment of the basic display parameter based on the ambient light intensity, the current usage data of the user, and the like may enable the basic display parameter to be more matched with the current usage situation of the user, thereby better protecting the eye health of the user.
In some embodiments of this specification, in an ambient light compensation scene, determining the basic display parameters from the ambient light intensity yields better-suited parameters. Screen brightness, refresh rate, and contrast all affect the user's eye health: lower brightness and contrast and a higher refresh rate are gentler on the eyes, but too high a refresh rate wastes power, and too low a brightness or contrast hurts the experience. Using the ambient light intensity makes the basic display parameters better match the environment and better protects the user's eyes; for example, the ambient light intensity can also decide whether to turn on eye protection, blue light filtering, or anti-glare functions.
Fig. 5 is a schematic diagram illustrating determining compensation parameters according to some embodiments of the present description.
In some embodiments, the protection parameters further include compensation parameters 540. The compensation parameter 540 refers to a parameter for correcting the current display screen.
In some embodiments, the compensation parameters 540 may keep the user's eyes and the picture they view relatively stationary. For example, when a user operates the tablet computer while walking, the body's up-and-down motion makes the eyes shake relative to the tablet; when the eyes move up relative to the screen, the picture is corrected to move up as well, so that the eyes and the viewed picture stay relatively static. Illustratively, the compensation parameters 540 may include at least one of a horizontal movement magnitude, a vertical movement magnitude, a rotation magnitude, and a scaling, and the achieved effects may include moving the picture left or right, zooming in or out, and tilting.
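A minimal sketch of applying such compensation parameters to a display window; the window model and parameter names are illustrative assumptions.

```python
# Apply horizontal/vertical shift, rotation, and scaling to a window so the
# picture follows the detected eye motion.

from dataclasses import dataclass

@dataclass
class Window:
    x: float       # window centre on screen, pixels
    y: float
    scale: float   # zoom factor
    angle: float   # rotation, degrees

def apply_compensation(win: Window, h_shift: float, v_shift: float,
                       rotation: float, scaling: float) -> Window:
    # Move/rotate/scale the window along with the eye motion so that the
    # eyes and the viewed picture stay approximately relatively static.
    return Window(win.x + h_shift, win.y + v_shift,
                  win.scale * scaling, win.angle + rotation)

w = apply_compensation(Window(512, 384, 1.0, 0.0),
                       h_shift=0.0, v_shift=-12.0, rotation=0.0, scaling=1.0)
print(w)  # eyes moved up 12 px -> picture shifted up 12 px to follow them
```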
In some embodiments, in response to the scene data type belonging to the riding scene type, the processing module enters a window mode. The window mode shrinks the current browsing picture so that it occupies only part of the screen, which makes effects such as enlarging and shrinking the picture easier to realize.
In some embodiments, the processing module may also determine initial picture data based on the motion data and the scene data.
The initial picture data refers to the initial size and position of the picture. As described above for the window mode, the initial picture data must be determined so that compensation effects such as zooming in and out can be realized conveniently.
In some embodiments, the initial picture data is related to the motion data. For example, when the tablet computer shakes strongly in the horizontal direction, the horizontal size of the picture in the initial picture data is made smaller: the tablet's screen size is limited, and if the initial picture is too large, the window has little room to move on the screen and the compensation effect suffers. Correlating the initial picture data with the motion data therefore safeguards the compensation effect.
In some embodiments, the processing module may construct a picture feature vector based on the motion data and the scene data, and determine the initial picture data based on a search result of the picture feature vector in the vector database. The vector database comprises a plurality of reference vectors and initial picture data corresponding to each reference vector. The reference vector is constructed based on the motion data and scene data in the history data. The processing module can select initial picture data corresponding to the reference vector with the minimum vector distance as the initial picture data by calculating the vector distance between the picture feature vector and the reference vector.
In some embodiments, the processing module is further configured to: acquire facial image data 510 of the user in response to the compensation scene 410 meeting a second condition, the second condition including the compensation scene 410 being a motion compensation scene; determine a usage distance feature 520 based on the facial image data 510; determine estimated motion data 530 based on the motion data and the scene data 330; and determine compensation parameters 540 based on the estimated motion data 530 and the usage distance feature 520.
The second condition is another condition for judging how to determine the protection parameters. The processing module may determine the compensation parameter when the compensation scenario meets the second condition. In some embodiments, the second condition includes the compensation scene being a motion compensation scene.
The facial image data 510 is image data of the user's face, which may be acquired by the tablet computer's camera.
The usage distance feature 520 refers to the distance between the user's eyes and the tablet screen.
In some embodiments, the processing module may determine the usage distance feature in a variety of ways based on the facial image data. For example, the processing module may determine facial image features by an image analysis algorithm based on the facial image data, query a distance lookup table based on the facial image features, and determine the usage distance features. The distance comparison table can be determined based on statistical analysis of a large amount of sample data, and comprises sample facial image features and corresponding reference distances.
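This specification leaves the image analysis algorithm open; one common geometric stand-in, shown below, estimates the viewing distance from the interpupillary distance measured in pixels using a pinhole-camera model. The focal length and the 63 mm average interpupillary distance are assumptions, not values from this specification.

```python
# Pinhole-camera distance estimate: distance = focal_length_px * real_size /
# size_in_pixels, applied to the measured eye separation in the face image.

def viewing_distance_mm(eye_separation_px: float,
                        focal_px: float = 1400.0, ipd_mm: float = 63.0) -> float:
    return focal_px * ipd_mm / eye_separation_px

print(round(viewing_distance_mm(220.0)))  # -> 401 (about 40 cm from the screen)
```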
In some embodiments, the processing module may also determine the usage distance feature through a machine learning model based on the facial image. For example, the usage distance feature may be determined by the distance feature determination layer of a second determination model, which is a machine learning model; see the relevant description below for details.
The estimated motion data 530 refers to the user's predicted future motion data.
In some embodiments, the processing module may determine the estimated motion data in several ways based on the motion data and the scene data. For example, it may query a motion data comparison table, determined from historical data, which contains sample motion data, sample scene data, and the corresponding reference motion data.
In some embodiments, the processing module may also determine the estimated motion data through a machine learning model based on the motion data, for example by the motion data prediction layer of the second determination model, which is a machine learning model; see the relevant description below for details.
In some embodiments, the processing module may determine the compensation parameters 540 in several ways based on the estimated motion data 530 and the usage distance feature 520. For example, it may query a compensation parameter comparison table, which records the correspondence between reference motion data, reference usage distance features, and reference compensation parameters.
The compensation parameter comparison table can be constructed from prior knowledge or historical data; for example, the larger the tablet computer's vertical shake amplitude in the motion data and the closer the user's eyes are to the tablet screen in the usage distance feature, the larger the vertical movement amplitude of the window in the compensation parameters.
In some embodiments, the processing module may determine the compensation parameters 540 by a compensation parameter determination model.
In some embodiments, the compensation parameter determination model may be a machine learning model of the custom structure described below, or may be another neural network model. For example, convolutional neural network models.
In some embodiments, the compensation parameter determination model may include a distance feature determination layer, a motion data prediction layer, and a parameter determination layer.
The distance feature determination layer may be used to determine the usage distance features 520. In some embodiments, the input to the distance feature determination layer may include facial image data, and the output includes using the distance features 520. In some embodiments, the distance feature determination layer may be a convolutional neural network (Convolutional Neural Networks, CNN).
In some embodiments, the distance feature determination layer may be trained separately by gradient descent or otherwise based on a plurality of third training samples with third tags. In some embodiments, the third training sample may include sample facial image data. The third label corresponding to the third training sample can be the actual use distance characteristic of the sample user corresponding to the sample facial image data, and can be determined by a manual labeling or automatic labeling mode.
The motion data prediction layer may be used to determine the estimated motion data 530. In some embodiments, the input of the motion data prediction layer may include a motion data sequence, a pose data sequence, and scene data, and its output may include the estimated motion data 530.
For more details on the motion data sequence, the pose data sequence, see the relevant description in fig. 3.
For more content on scene data see the relevant description in fig. 2.
In some embodiments, the motion data prediction layer may be a recurrent neural network model (Recurrent Neural Network, RNN).
In some embodiments, the motion data prediction layer may be trained separately by gradient descent or other means based on a plurality of fourth training samples with fourth labels.
In some embodiments, the fourth training sample may include a sample motion data sequence, a sample pose data sequence, and sample scene data corresponding to the first historical time. The fourth label corresponding to the fourth training sample may be actual motion data corresponding to the second historical time. Wherein the second historical time is later than the first historical time.
The parameter determination layer may be used to determine the compensation parameters 540. In some embodiments, the input of the parameter determination layer may include the estimated motion data 530, the usage distance feature 520, and picture feature data, and the output may include the compensation parameters 540. In some embodiments, the parameter determination layer may be a neural network (NN) model.
The picture feature data refers to data describing the picture displayed on the tablet computer. In some embodiments, the picture feature data may include optical character recognition (OCR) information, text information, and image information, and may be obtained by running an OCR algorithm over the tablet's picture.
In some embodiments, the parameter determination layer may be trained separately by gradient descent or otherwise based on a plurality of fifth training samples with fifth labels.
In some embodiments, the fifth training sample may include sample motion data, sample usage distance features, and sample picture feature data. The fifth label may be the optimal compensation parameters for that sample, obtained by experimenting with multiple sets of compensation parameters: several candidate sets are configured and tested under the sample conditions, and the set with the highest experimental score, e.g., the highest comfort rating from the experimenters, is selected as the optimal compensation parameters for the sample.
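A minimal sketch of how the three layers might compose, written in PyTorch as an assumed framework; all layer sizes, feature dimensions, and the four-component output (horizontal shift, vertical shift, rotation, scaling) are illustrative, not values from this specification.

```python
# Three-layer compensation parameter model: CNN distance feature layer,
# GRU motion data prediction layer, and an NN parameter determination layer.

import torch
import torch.nn as nn

class CompensationModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Distance feature determination layer (CNN over face images).
        self.distance = nn.Sequential(
            nn.Conv2d(3, 8, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 4),
        )
        # Motion data prediction layer (RNN over a motion/pose/scene sequence).
        self.motion = nn.GRU(input_size=10, hidden_size=16, batch_first=True)
        # Parameter determination layer (plain NN over the fused features).
        self.params = nn.Sequential(
            nn.Linear(4 + 16 + 6, 32), nn.ReLU(), nn.Linear(32, 4),
        )

    def forward(self, face, seq, picture_feat):
        d = self.distance(face)                   # usage distance feature
        _, h = self.motion(seq)                   # estimated motion data
        fused = torch.cat([d, h[-1], picture_feat], dim=1)
        return self.params(fused)                 # compensation parameters

model = CompensationModel()
out = model(torch.randn(1, 3, 64, 64),            # face image
            torch.randn(1, 20, 10),               # 20-step motion/pose sequence
            torch.randn(1, 6))                    # picture feature data
print(out.shape)  # torch.Size([1, 4])
```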
According to the compensation parameter determination model of some embodiments of this specification, estimating the motion data and then determining the compensation parameters takes into account factors such as the distance from the user's eyes to the tablet screen, the motion data, and the scene data, making the determination of the compensation parameters efficient and accurate and avoiding the errors of manual determination.
In some embodiments, the motion compensation scene includes a periodic compensation scene and an aperiodic compensation scene.
A periodic compensation scene is a scene in which the tablet computer's motion data is periodic, for example using the tablet while walking or on a treadmill, or the regular starting and stopping of a subway. An aperiodic compensation scene is a scene in which the motion data has no obvious periodicity; the ambient light compensation scene, for example, is an aperiodic compensation scene.
In some embodiments, in response to the compensation scene being a periodic compensation scene, the processing module obtains motion data of the tablet computer; determining periodic features of the motion data based on the motion data; based on the motion data and the periodic characteristics, compensation parameters are determined.
The periodic characteristic refers to the duration of completing one periodic movement when the tablet personal computer performs the periodic movement. The periodic features relate to the context in which the user is using the tablet. For example, a user may use a tablet while walking, with periodic features of the tablet consistent with the user's stride frequency.
In some embodiments, the processing module may determine whether the scene is a periodic motion compensation scene by applying a time-series periodicity detection algorithm to the tablet computer's motion data. A time-series periodicity detection algorithm identifies patterns or periodic variations that repeat in time-series data; exemplary algorithms include the Fourier transform and autocorrelation coefficient methods. If the current scene is a periodic motion compensation scene, the same algorithm can also determine the periodic features of the motion data. For more on motion data, see above.
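A minimal sketch of autocorrelation-based period detection, one of the time-series methods named above; the 0.5 acceptance threshold and the simulated gait signal are assumptions.

```python
# Detect the dominant period of a motion signal via the normalized
# autocorrelation function; return None for aperiodic signals.

import numpy as np

def detect_period(signal: np.ndarray):
    x = signal - signal.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags 0..N-1
    acf /= acf[0]                                       # normalize to acf[0] = 1
    below = np.where(acf < 0)[0]                        # skip the initial lobe
    if len(below) == 0:
        return None                                     # no oscillation: aperiodic
    start = int(below[0])
    lag = start + int(np.argmax(acf[start:]))           # strongest repeat lag
    return lag if acf[lag] > 0.5 else None              # weak repeat: aperiodic

t = np.arange(200)
walk = np.sin(2 * np.pi * t / 25)   # simulated gait bounce with period 25
print(detect_period(walk))          # -> 25
```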
In some embodiments, the processing module may determine the compensation parameters based on the motion data and the periodic feature as follows: obtain periodic motion data based on the motion data and the periodic feature; then retrieve from a template database based on the periodic motion data to determine the compensation parameters.
The periodic motion data refers to the motion data of the tablet computer within one motion period. It will be appreciated that if the scene is a periodic motion compensation scene, the motion data is periodic, so only the compensation parameters within one period need to be determined; compensation at other times can reuse those parameters.
The template database stores a plurality of template periodic motion data and the template period compensation parameters corresponding to each template periodic motion data.
The template periodic motion data refers to standard periodic motion data of the tablet computer in different scenes, for example, periodic motion data of the tablet computer when users walk at different stride frequencies, or when users ride different vehicles.
The template period compensation parameters may be obtained by experimenting on multiple groups of compensation parameters. For example, multiple groups of compensation parameters may be set and tested under the condition of the template periodic motion data; the group with the highest experimental score is selected as the template period compensation parameters for that template periodic motion data, where the experimental score may be the comfort rating reported by the experimenter.
In some embodiments, the processing module retrieves from the template database based on the periodic motion data and determines the compensation parameters as follows: compute the similarity between the periodic motion data and the template periodic motion data in the template database, and, in response to the similarity being greater than a threshold, take the template period compensation parameters corresponding to that template periodic motion data as the current period compensation parameters. In some embodiments, the similarity between the periodic motion data and the template periodic motion data may be obtained by computing the distance between the two. The distance may include, but is not limited to, cosine distance, Euclidean distance, Manhattan distance, Mahalanobis distance, or Minkowski distance.
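A minimal retrieval sketch under these assumptions is shown below: cosine similarity stands in for the distance measure, the template periodic motion data are assumed to be resampled to the same length as the query, and the 0.9 threshold is illustrative.

    import numpy as np

    def retrieve_compensation(periodic_motion: np.ndarray,
                              template_db: list,
                              threshold: float = 0.9):
        """template_db is a list of (template_motion, template_parameters)
        pairs; returns the template period compensation parameters of the
        best match, or None if no template exceeds the threshold."""
        best_sim, best_params = -1.0, None
        for template_motion, params in template_db:
            sim = float(np.dot(periodic_motion, template_motion) /
                        (np.linalg.norm(periodic_motion) *
                         np.linalg.norm(template_motion)))
            if sim > best_sim:
                best_sim, best_params = sim, params
        return best_params if best_sim > threshold else None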
In some embodiments, in response to no template periodic motion data with a similarity greater than the threshold being retrieved from the template database, the processing module determines the period compensation parameters through the distance feature determination layer and the parameter determination layer of the compensation parameter determination model.
In some embodiments, when the processing module determines the period compensation parameters through the compensation parameter determination model, the current periodic motion data takes the place of the estimated motion data, since the motion data is periodic. The period compensation parameters can thus be determined by replacing the estimated motion data input to the parameter determination layer with the periodic motion data, without changing the input of the distance feature determination layer. For more on the compensation parameter determination model, see above.
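Under the assumptions of the earlier training sketch, this substitution can be pictured as a change at the call site only; the function and tensor names below are hypothetical.

    import torch

    # Hypothetical call-site sketch: in a periodic compensation scene the
    # periodic motion data stands in for the estimated motion data, while
    # the usage distance feature produced by the distance feature
    # determination layer is passed through unchanged.
    def determine_period_compensation(parameter_layer, periodic_motion,
                                      distance_feature, picture_features):
        features = torch.cat([periodic_motion, distance_feature,
                              picture_features])
        return parameter_layer(features.unsqueeze(0))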
In some embodiments, the compensation scene further includes a composite compensation scene. The composite compensation scene is a scene that requires both ambient light compensation and motion compensation, such as a subway scene.
In some embodiments, in response to the current compensation scene being a composite compensation scene, the processing module determines the protection parameters based on the ambient light intensity sequence and the motion data. For more description of determining the compensation scene, see the relevant description in step 230 of fig. 2 of this specification.
In some embodiments, in response to the current compensation scene being a composite compensation scene, the processing module determines the basic display parameters while also determining the compensation parameters, and both act simultaneously. For more on determining the basic display parameters, see the relevant description in fig. 4. For more on determining the compensation parameters, see the methods for determining the compensation parameters in the motion compensation scene and the periodic compensation scene described above.
In some embodiments of this specification, the display picture is corrected by the compensation parameters so that it better matches the current scene and the user's current motion state, improving the comfort of using the tablet computer.
Some embodiments of the present disclosure provide a tablet computer display control device, including a processor, where the processor is configured to execute the foregoing tablet computer display control method.
Some embodiments of the present disclosure provide a computer-readable storage medium storing computer instructions that, when read by a computer, cause the computer to perform the aforementioned tablet computer display control method.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly stated here, various modifications, improvements, and adaptations of this specification may occur to those skilled in the art. Such modifications, improvements, and adaptations are suggested in this specification and therefore remain within the spirit and scope of the exemplary embodiments of this specification.
Meanwhile, this specification uses specific words to describe its embodiments. References to "one embodiment," "an embodiment," and/or "some embodiments" mean that a particular feature, structure, or characteristic relates to at least one embodiment of this specification. Therefore, it should be emphasized and appreciated that two or more references to "an embodiment," "one embodiment," or "an alternative embodiment" in various places in this specification do not necessarily refer to the same embodiment. Furthermore, particular features, structures, or characteristics of one or more embodiments of this specification may be combined as appropriate.
Likewise, it should be noted that, to simplify the presentation of this disclosure and thereby aid the understanding of one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, does not imply that the subject matter of this specification requires more features than are recited in the claims. Indeed, claimed subject matter may lie in less than all features of a single disclosed embodiment.
Each patent, patent application, patent application publication, and other material, such as articles, books, specifications, publications, documents, and the like, cited in this specification is hereby incorporated by reference in its entirety. Excepted are application history documents that are inconsistent with or conflict with the content of this specification, as well as any documents, currently or later appended to this specification, that limit the broadest scope of the claims of this specification. It is noted that, if the description, definition, and/or use of a term in material appended to this specification is inconsistent with or conflicts with what is described in this specification, the description, definition, and/or use of the term in this specification controls.
Finally, it should be understood that the embodiments described in this specification are merely illustrative of the principles of the embodiments of this specification. Other variations are possible within the scope of this description. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present specification may be considered as consistent with the teachings of the present specification. Accordingly, the embodiments of the present specification are not limited to only the embodiments explicitly described and depicted in the present specification.

Claims (10)

1. A tablet computer display control system, comprising:
the environment data acquisition module is configured to acquire environment data;
the motion data acquisition module is configured to acquire motion data of the tablet computer;
a storage module configured to store operation data and usage data of a user;
a processing module configured to:
determining scene data based on at least one of the motion data and the environmental data;
determining a display mode of the tablet computer based on the scene data and the usage data;
determining a compensation scene based on the scene data;
determining a protection parameter based on the compensation scene; and
controlling the tablet personal computer to display images based on the protection parameter.
2. The tablet display control system of claim 1, wherein the processing module is further configured to:
determining the scene data based on the environment data and the motion data;
determining a user characteristic based on the usage data;
the display mode is determined based on the environmental data and the user characteristics.
3. The tablet computer display control system of claim 1, wherein the protection parameters include basic display parameters;
the processing module is further configured to:
acquiring an ambient light intensity sequence in response to the compensation scene conforming to a first condition, the first condition comprising the compensation scene being an ambient light compensation scene;
and determining the basic display parameters according to a preset period based on the ambient light intensity sequence.
4. The tablet computer display control system of claim 1, wherein the protection parameters include compensation parameters;
the processing module is further configured to:
acquiring facial image data of the user in response to the compensation scene conforming to a second condition, the second condition comprising the compensation scene being a motion compensation scene;
determining a usage distance feature based on the facial image data;
determining estimated motion data based on the motion data and the scene data;
and determining the compensation parameter based on the estimated motion data and the using distance characteristic.
5. A method for controlling display of a tablet computer, the method being executed on a processor, the method comprising:
determining scene data based on at least one of the environmental data and the motion data of the tablet computer;
determining a display mode based on the scene data and the usage data of the user;
determining a compensation scene based on the scene data;
determining a protection parameter based on the compensation scene; and
controlling the tablet personal computer to display images based on the protection parameter.
6. The tablet computer display control method of claim 5, wherein the determining a display mode based on the scene data and the usage data of the user comprises:
determining the scene data based on the environment data and the motion data;
determining a user characteristic based on the usage data;
and determining a display mode of the tablet computer based on the environment data and the user characteristics.
7. The tablet computer display control method of claim 5, wherein the protection parameters include basic display parameters;
the determining protection parameters based on the compensation scene includes:
acquiring an ambient light intensity sequence in response to the compensation scene conforming to a first condition, the first condition comprising the compensation scene being an ambient light compensation scene;
and determining the basic display parameters according to a preset period based on the ambient light intensity sequence.
8. The tablet computer display control method of claim 5, wherein the protection parameters include compensation parameters;
the determining protection parameters based on the compensation scene includes:
acquiring facial image data of the user in response to the compensation scene conforming to a second condition, the second condition comprising the compensation scene being a motion compensation scene;
determining a usage distance feature based on the facial image data;
determining estimated motion data based on the motion data and the scene data;
and determining the compensation parameter based on the estimated motion data and the using distance characteristic.
9. A tablet computer display control device comprising a processor for executing the tablet computer display control method of any one of claims 5 to 8.
10. A computer-readable storage medium storing computer instructions, wherein, when a computer reads the computer instructions in the storage medium, the computer performs the tablet computer display control method of any one of claims 5 to 8.
CN202311724601.2A 2023-12-15 2023-12-15 Display control system and method for tablet personal computer Active CN117672110B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311724601.2A CN117672110B (en) 2023-12-15 2023-12-15 Display control system and method for tablet personal computer

Publications (2)

Publication Number Publication Date
CN117672110A true CN117672110A (en) 2024-03-08
CN117672110B CN117672110B (en) 2024-07-09

Family

ID=90084486

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311724601.2A Active CN117672110B (en) 2023-12-15 2023-12-15 Display control system and method for tablet personal computer

Country Status (1)

Country Link
CN (1) CN117672110B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090079754A1 (en) * 2007-09-25 2009-03-26 Himax Technologies Limited Display parameter adjusting method and apparatus for scene change compensation
CN101751209A (en) * 2008-11-28 2010-06-23 联想(北京)有限公司 Method and computer for adjusting screen display element
CN103885593A (en) * 2014-03-14 2014-06-25 深圳市中兴移动通信有限公司 Handheld terminal and screen anti-shake method and device of handheld terminal
CN107665698A (en) * 2017-11-13 2018-02-06 维沃移动通信有限公司 Ambient light intensity compensation method and device
CN109597555A (en) * 2018-12-06 2019-04-09 刘美连 A kind of method and system adjusting display mode according to scene and object
CN110022409A (en) * 2019-04-16 2019-07-16 维沃移动通信有限公司 A kind of terminal control method and mobile terminal
US20200160815A1 (en) * 2017-06-02 2020-05-21 Guangdong Xiaye Household Electrical Appliances Co., Ltd Control method
CN111651133A (en) * 2020-06-12 2020-09-11 广西世纪创新显示电子有限公司 Intelligent control display system and control method
CN111752516A (en) * 2020-06-10 2020-10-09 Oppo(重庆)智能科技有限公司 Screen adjustment method and device for terminal equipment, terminal equipment and storage medium
CN112187995A (en) * 2020-08-28 2021-01-05 北京小米移动软件有限公司 Illumination compensation method, illumination compensation device, and storage medium
CN113424550A (en) * 2019-01-09 2021-09-21 杜比实验室特许公司 Display management with ambient light compensation
CN114639332A (en) * 2022-03-21 2022-06-17 展讯半导体(南京)有限公司 Eye protection control method, system, equipment and storage medium for display screen
US20220238079A1 (en) * 2021-01-26 2022-07-28 Fuzhou Boe Optoelectronics Technology Co., Ltd. Display module and display method thereof, and display device
CN116431458A (en) * 2023-06-08 2023-07-14 深圳市华卓智能科技有限公司 Intelligent management system and method for tablet personal computer

Also Published As

Publication number Publication date
CN117672110B (en) 2024-07-09

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant