CN117037659A - Display control method and device - Google Patents

Display control method and device

Info

Publication number
CN117037659A
Authority
CN
China
Prior art keywords
parameter
area
light
brightness
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310946548.4A
Other languages
Chinese (zh)
Inventor
王煜坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN202310946548.4A
Publication of CN117037659A
Legal status: Pending


Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/14 Display of multiple viewports
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/0626 Adjustment of display parameters for control of overall brightness
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/0686 Adjustment of display parameters with two or more screen areas displaying information with different brightness or colours
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00 Aspects of the architecture of display systems
    • G09G2360/04 Display device controller operating with a plurality of display units
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00 Aspects of the architecture of display systems
    • G09G2360/14 Detecting light within display terminals, e.g. using a single or a plurality of photosensors
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

Embodiments of the application disclose a display control method and apparatus, where the method includes: obtaining a first parameter and a second parameter, the first parameter representing position information of a target object and the second parameter representing a relative pose of a target device; determining a target area based on the first parameter and the second parameter, the target area being an area that produces a visual influence on the target object and belonging to a screen area of the target device; and performing brightness compensation on the target area in response to an instruction directed at the visual influence.

Description

Display control method and device
Technical Field
The present application relates to the field of display control technologies, and in particular, to a display control method and apparatus.
Background
With the rapid development of digital and information technologies, electronic products are used ever more widely, and during use an electronic device may produce a visual influence on the human eye, such as glare. Such a visual influence degrades the user experience of the electronic device; how best to mitigate it is therefore a technical problem to be solved by those skilled in the art.
Disclosure of Invention
The embodiment of the application provides a display control method, which comprises the following steps:
obtaining a first parameter and a second parameter; the first parameter represents the position information of the target object, and the second parameter represents the relative pose of the target device;
determining a target area based on the first parameter and the second parameter; the target area is an area that produces a visual influence on the target object, and belongs to a screen area of the target device;
and performing brightness compensation on the target area in response to an instruction directed at the visual influence.
In some embodiments, the obtaining the second parameter comprises:
obtaining the second parameter if the target device comprises at least two screens; the second parameter characterizes a first included angle between the at least two screens;
obtaining the second parameter if the target device comprises one screen; the second parameter characterizes a second included angle between a first region and a second region, where the one screen includes the first region and the second region.
In some embodiments, the obtaining the second parameter comprises:
obtaining the second parameter if the target device comprises a screen; the second parameter characterizes a third included angle between the screen and the target plane;
After the second parameter is obtained, the method further comprises:
obtaining a third parameter; the third parameter characterizes the position of a target light source, wherein the target light source is a light source capable of generating visual influence on the target object;
the determining a target area based on the first parameter and the second parameter includes:
the target region is determined based on the first parameter, the second parameter, and the third parameter.
In some embodiments, if the target device includes at least two screens, determining a target area based on the first parameter and the second parameter and performing brightness compensation on the target area includes one of:
determining a target area comprising a light-reflecting area based on the first parameter and the second parameter, and adjusting the light-emitting brightness of the light-reflecting area;
determining a target area comprising a light-emitting area based on the first parameter and the second parameter, and adjusting the light-emitting brightness of the light-emitting area;
and determining a target area comprising a light-emitting area and a light-reflecting area based on the first parameter and the second parameter, and cooperatively adjusting the light-emitting brightness of the light-emitting area and the light-reflecting area.
In some embodiments, adjusting the light-emitting brightness of an area includes:
determining the reflection brightness of the light-reflecting area, and adjusting the light-emitting brightness of the light-reflecting area and/or the light-emitting area according to the reflection brightness.
In some embodiments, the determining a target region including a light emitting region and a light reflecting region based on the first parameter and the second parameter includes:
constructing all observation light paths based on the first parameter and the second parameter; each observation light path starts from the target object, is reflected at a light-reflecting point of the second screen, and ends at a light-emitting point of the first screen; the at least two screens include at least the first screen and the second screen;
determining the screen area where each light-emitting point is located as the light-emitting area, and determining the screen area where each light-reflecting point is located as the light-reflecting area;
and determining the light-reflecting area and the light-emitting area together as the target area.
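The grouping of light paths into areas could be sketched as follows; the point lists and the axis-aligned bounding-box policy are illustrative assumptions, not details given in the text:

```python
# Illustrative sketch only: collapse the per-light-path reflection points and
# light-emitting points into rectangular screen regions. The concrete pixel
# coordinates below are hypothetical.

def bounding_region(points):
    """Axis-aligned bounding box (x_min, y_min, x_max, y_max) of 2-D screen points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

# One reflection point on the second screen and one emitting point on the
# first screen per constructed observation light path (screen pixels).
reflect_points = [(120, 40), (180, 55), (150, 70)]
emit_points = [(300, 400), (360, 420), (330, 450)]

light_reflecting_area = bounding_region(reflect_points)
light_emitting_area = bounding_region(emit_points)
target_area = (light_reflecting_area, light_emitting_area)
```

The two rectangles together play the role of the target area; any region representation (per-pixel masks, convex hulls) would serve equally well.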
In some embodiments, cooperatively adjusting the light-emitting brightness of the light-emitting area and the light-reflecting area includes:
if the focus of the target object is located on the second screen, adjusting the light-emitting brightness of the light-emitting area; otherwise, adjusting the light-emitting brightness of the light-reflecting area.
In some embodiments, the adjusting the light emitting brightness of the light emitting region includes:
obtaining an adjustment coefficient, and adjusting the luminous brightness of the luminous area based on the adjustment coefficient;
the adjusting the light-emitting brightness of the light-reflecting area comprises the following steps:
obtaining a fourth parameter and a fifth parameter; the fourth parameter characterizes the transmissivity of the first screen and the fifth parameter characterizes the reflectivity of the second screen;
determining the reflection brightness of the light-reflecting area based on the fourth parameter, the fifth parameter, and the brightness of the brightness-adjusted light-emitting area; the reflection brightness represents the brightness of the reflection that the adjusted light-emitting area forms in the light-reflecting area;
and reducing the light-emitting brightness of the light-reflecting area by the amount of the reflection brightness.
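A minimal numeric sketch of this compensation step, assuming the reading L_reflect = transmissivity × reflectivity × emitting-area brightness (the text does not give an explicit formula, so this and all values below are illustrative assumptions):

```python
# Hedged sketch of "determine the reflection brightness from the fourth
# parameter, the fifth parameter and the adjusted light-emitting brightness,
# then reduce the reflecting area's brightness by that amount".

def reflection_brightness(transmissivity, reflectivity, emit_brightness):
    """Brightness of the reflection the emitting area forms on the other screen."""
    return transmissivity * reflectivity * emit_brightness

def compensated_brightness(current_brightness, transmissivity, reflectivity,
                           emit_brightness):
    """Reduce the reflecting area's own output by the reflection brightness."""
    drop = reflection_brightness(transmissivity, reflectivity, emit_brightness)
    return max(0.0, current_brightness - drop)

# Example: 90% transmissive first screen, 8% reflective second screen,
# emitting area adjusted to 200 nits, reflecting area currently at 150 nits.
new_brightness = compensated_brightness(150.0, 0.9, 0.08, 200.0)
```

The `max(0.0, ...)` clamp simply keeps the sketch physical when the reflection is brighter than the area's own output.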
In some embodiments, adjusting the light-emitting brightness of an area includes:
determining the different color channels corresponding to each emitted light ray; one area corresponds to a plurality of emitted light rays, and the sum of the light-emitting brightness corresponding to each emitted light ray is the area's light-emitting brightness;
and adjusting the channel brightness of the different color channels to adjust the light-emitting brightness corresponding to the emitted light.
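A hedged sketch of such per-channel adjustment, assuming an RGB model in which the area's light-emitting brightness is the sum of the channel brightnesses (the channel model and values are assumptions for illustration):

```python
# Sketch only: scale each colour channel so the channel sum (taken here as the
# area's light-emitting brightness) reaches a target total.

def adjust_channels(rgb, target_total):
    """Scale (r, g, b) channel brightnesses so their sum equals target_total."""
    total = sum(rgb)
    if total == 0:
        return (0.0, 0.0, 0.0)
    scale = target_total / total
    return tuple(round(c * scale, 6) for c in rgb)

# Reduce an area emitting 200 units in total down to 150 units.
adjusted = adjust_channels((100.0, 60.0, 40.0), 150.0)
```

Uniform scaling preserves the area's colour balance; a real implementation could instead weight channels differently, which the text leaves open.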
The present application provides a display control apparatus, including:
an obtaining module, configured to obtain the first parameter and the second parameter; the first parameter represents the position information of the target object, and the second parameter represents the relative pose of the target device;
a determining module, configured to determine a target area based on the first parameter and the second parameter; the target area is an area that produces a visual influence on the target object, and belongs to a screen area of the target device;
and an adjusting module, configured to perform brightness compensation on the target area in response to an instruction directed at the visual influence.
According to the technical solutions provided by the embodiments of the application, the light-reflecting area on the screen is determined, the reflection intensity of that area is then determined, scenes are distinguished for different target devices, and the light-emitting intensities of different areas of the target device are adjusted separately; by adjusting the screen's light-emitting intensity, the visual influence of the reflection on the target object is weakened or eliminated.
Drawings
Fig. 1 is a schematic flow chart of a display control method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a method for determining a target area according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a display control device according to an embodiment of the present application;
fig. 4 is a schematic diagram of a hardware entity of a computer device according to an embodiment of the present application.
Detailed Description
For a more complete understanding of the nature and the technical content of the embodiments of the present application, reference should be made to the following detailed description of embodiments of the application, taken in conjunction with the accompanying drawings, which are meant to be illustrative only and not limiting of the embodiments of the application.
It should be noted that the terms "first", "second", and "third" in the embodiments of the present application merely distinguish similar objects and do not imply a specific order; where permitted, the objects so distinguished may be interchanged, so that the embodiments described herein can be practiced in sequences other than those illustrated or described.
In the embodiments of the present application, the technical solution may be implemented on any electronic device, such as a tablet computer, a desktop computer, a notebook computer, a server, a public information kiosk, or a mobile phone.
Fig. 1 is a flow chart of a display control method according to an embodiment of the present application. As shown in fig. 1, the method may include steps 101 to 103:
step 101, obtaining a first parameter and a second parameter.
In the embodiments of the present application, the current scene includes at least a target object and a target device, where the target device displays image content to the target object. While the target object receives that content, illumination may cause a reflection on the area of the target device where the content is displayed, so that the target object cannot receive the original image content.
In some embodiments, the first parameter characterizes position information of the target object. The target object is an object capable of receiving the image content displayed by the target device. By way of example, the target object may be, but is not limited to, a person, a video camera, a still camera, and the like. The above-mentioned position information may be a position coordinate of the target object in the world coordinate system, specifically, the first parameter is a position coordinate of the image capturing portion of the target object in the world coordinate system, for example, in the case that the target object is a person, the first parameter is a position coordinate of a person's eye in the world coordinate system; in the case where the target object is a camera, the first parameter is a coordinate position of a lens of the camera in a world coordinate system.
In some embodiments, the first parameter may be obtained by an internal sensor deployed in the target object or target device, or may be obtained by an external sensor deployed outside the target object. In an exemplary case where the target object is a person, the first parameter may be acquired through a wearable sensor disposed on the head, or may be acquired through an eye tracking device disposed in the current scene, and the first parameter may be acquired based on an image recognition and tracking algorithm. The eye tracking device may be a third party device or a sensor device in the target device.
In some embodiments, the second parameter characterizes a relative pose of the target device, such as an angle, orientation, etc. of the target device. The target device is a device capable of displaying image content. By way of example, the target device may be, but is not limited to, a tablet, notebook, cell phone, etc. The second parameter may be a relative pose between screens of the target device in the world coordinate system or a relative pose between a screen and a placement plane, for example, when the target device includes one screen, the second parameter may be a relative pose between a plurality of areas between screens in the world coordinate system or a relative pose between one screen and a desktop on which the screen is placed; when the target device comprises at least two screens, the second parameter is the relative pose between the at least two screens in the world coordinate system.
In some embodiments, the second parameter may be obtained by internal sensors within the target device, and the relative pose of the target device may also be determined by detecting changes in the angle of parts of the target device as they deform. For example, when the target device includes one screen, a gravity sensor may detect the pose of the device, or the bending angle of a bendable portion of the screen may be detected to determine the relative pose; when the target device includes two screens, the rotation angle of the shaft connecting the two screens may be detected to obtain the second parameter.
Step 102, determining a target area based on the first parameter and the second parameter.
The target area is an area which generates visual influence on the target object, and belongs to a screen area of the target device.
In some embodiments, a visual influence on the target object refers to a negative effect that interferes with the user's normal viewing. For example, it may arise when strong light, or the projection of other images, overlaps in the target area with the content the target object intends to observe: the target object is watching a movie playing on the screen, light emitted by other sources is reflected off the screen and superimposed on the movie, and the target object can no longer see the movie's original content clearly.
In some embodiments, the area that affects the vision of the target object, i.e., the target area, is an area that may directly or indirectly affect that vision. For example, if area a of screen A produces a reflection on area b of screen B, then area b directly affects the user's vision and area a indirectly affects it; area a and/or area b may each be regarded as a target area, and the number of target areas is not limited. In this example, all areas of screen A and screen B are screen areas of the target device; screens A and B and areas a and b merely illustrate how the areas affecting the target's vision relate to the screen areas, and do not limit the number of screens or areas.
In some embodiments, the target area may be determined directly by the target device's internal system calculating from the first parameter and the second parameter, or by deploying a sensor for the target object, analyzing the acquired visual image, and combining it with the two parameters. For example, the target device may use the first and second parameters to determine the light path along which light it emits reaches the target object after being reflected, and thereby determine the target area; alternatively, a sensor deployed for the target object may acquire a real-time image, the reflective area in the image may be identified, and the corresponding target area on the screen in the world coordinate system may be determined from the first and second parameters.
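As a concrete illustration of the light-path reasoning above, the following sketch (not the patent's implementation; all function names and coordinates are hypothetical) uses the standard mirror-image method: a reflected ray from an emitting point reaches the eye exactly when the straight line from the eye to the point's mirror image crosses the reflecting screen's plane.

```python
# Hypothetical sketch: locate the glare point on a mirroring screen for one
# light-emitting point, given the viewer position (first parameter) and the
# screen plane derived from the pose (second parameter).

def reflect_point(p, plane_point, normal):
    """Reflect point p across the plane through plane_point with unit normal."""
    d = sum((p[i] - plane_point[i]) * normal[i] for i in range(3))
    return tuple(p[i] - 2 * d * normal[i] for i in range(3))

def glare_point(eye, emitter, plane_point, normal):
    """Intersection of the eye -> mirror-image-of-emitter segment with the plane."""
    image = reflect_point(emitter, plane_point, normal)
    direction = tuple(image[i] - eye[i] for i in range(3))
    denom = sum(direction[i] * normal[i] for i in range(3))
    if abs(denom) < 1e-12:
        return None  # segment parallel to the screen plane: no reflection point
    t = sum((plane_point[i] - eye[i]) * normal[i] for i in range(3)) / denom
    if not (0.0 < t < 1.0):
        return None  # plane not crossed between eye and mirror image
    return tuple(eye[i] + t * direction[i] for i in range(3))

# Mirroring screen in the z = 0 plane; eye and emitter both in front (z > 0).
eye = (0.0, 0.0, 1.0)
emitter = (2.0, 0.0, 1.0)
hit = glare_point(eye, emitter, (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
```

By symmetry the glare point lands midway between eye and emitter; collecting such points over all emitting pixels yields the light-reflecting region described in the later embodiments. A real implementation would additionally clip the hit point to the screen's physical bounds.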
Step 103, in response to the instruction for visual impact, performing brightness compensation on the target area.
In some embodiments, the instruction is issued automatically by the system after it has performed steps 101-102 and determined the target area and how to adjust it; the instruction may be one to adjust the display of the target area so as to eliminate the visual influence of the target area on the target object.
In some embodiments, brightness compensation of the target area means adjusting the brightness of the target area; specifically, the whole target area may be adjusted, or some of its pixels may be adjusted individually. Any control method capable of adjusting screen brightness may be used in the embodiments of the present application, which is not specifically limited herein.
In summary, when an electronic device is in use, light emitted by other screens or light sources may form a reflection on the screen and interfere with the user's normal use. Merely adjusting the screen angle so that light is no longer reflected into the eye works poorly when the reflection area is large or there are many light sources. The embodiments of the present application therefore do not adjust the screen angle; instead, through steps 101-103, the target area requiring brightness compensation is determined from the position information of the target object and the pose information of the target device, and brightness compensation is performed on that area to weaken or eliminate the adverse visual influence of glare on the user, improving the viewing experience. These steps can also be repeated as the first parameter and the second parameter change, correcting the position and extent of the target area so that the user always obtains a good compensation effect during use.
It should be noted that the target area determined from the first parameter and the second parameter is the area capable of producing a visual influence on the target object, and the subsequent brightness adjustment is applied to that area. Since the target object keeps moving, the first parameter changes accordingly, and so does the corresponding target area; the target area must therefore be re-determined from the first and second parameters. If, after the first parameter changes, brightness compensation were still applied to the original target area, a good compensation effect could not be obtained and the user experience would be further degraded.
Hereinafter, "glare" and "reflection" have the same meaning: the physical phenomenon produced when light is reflected by the screen. In the following embodiments the two terms are interchangeable; the only difference is that "glare" emphasizes the subjective effect of the reflection on the human eye, while "reflection" objectively describes the optical phenomenon from the device's perspective.
In an embodiment of the present application, the target device in the current scene includes at least two screens. When the target object looks at one of them, light emitted by the other screens may cause a glare phenomenon in the area of the watched screen where content is displayed, so that the target object cannot view the original display content.
In the above scenario, the second parameter obtained in step 101 of fig. 1 may be updated as: obtaining the second parameter if the target device comprises at least two screens; the second parameter characterizes a first angle between the at least two screens.
In some embodiments, the at least two screens of the target device are mutually independent screens, each of which can display content and be switched on or off independently; illustratively, screen EFAB and screen ABCD in fig. 2 are two mutually independent screens of the device.
In some embodiments, the included angle between the at least two screens may be obtained by detecting the rotation angle of the shaft connecting the screens, or by determining each screen's plane equation in the world coordinate system and computing the angle between them. Taking a notebook computer as an example: when it is closed, the shaft between the two screens has not rotated and the angle between them can be regarded as 0°; when it is opened, the shaft's rotation angle can be regarded as the angle between the two screens. Alternatively, a coordinate system may be established with one screen plane as the reference plane and the screen center as the origin; substituting the notebook's dimensions gives each screen's plane equation, and solving for the angle between the planes yields the angle between the screens. The second parameter obtained in this embodiment may be applied in step 102 to determine the target area based on the first parameter and the second parameter.
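The plane-equation route above reduces to an angle between plane normals; a minimal sketch, with hypothetical unit normals in a shared coordinate system:

```python
# Sketch only: recover the first included angle from each screen's unit normal.
import math

def angle_between_screens(n1, n2):
    """Angle in degrees between two screen planes, given their unit normals."""
    dot = sum(a * b for a, b in zip(n1, n2))
    dot = max(-1.0, min(1.0, dot))  # clamp against floating-point drift
    return math.degrees(math.acos(dot))

# Two screens whose unit normals are 60 degrees apart.
normal_a = (0.0, math.sin(math.radians(60.0)), math.cos(math.radians(60.0)))
normal_b = (0.0, 0.0, 1.0)
angle = angle_between_screens(normal_a, normal_b)
```

Depending on which way the normals are oriented, the hinge's dihedral angle is either this value or its supplement; a real implementation would fix the orientation convention once.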
In an embodiment of the present application, the target device in the current scene includes one screen, which may be foldable. When the user bends the screen, it is divided into two areas along the fold; when the user focuses on the content of one area, light emitted by the other area can produce glare in the area the user is watching and disturb viewing.
In some embodiments, the target device that includes one screen in the above scenario is a foldable device whose screen can be deformed, by bending or otherwise, into two display areas. For example, the target device may be a foldable mobile phone or a foldable tablet computer. The first area and the second area are the two display areas formed, with the bending region as the boundary, when the foldable device's screen is bent, and the second included angle is the angle between them. If the target device is a foldable mobile phone, bending its screen deforms it and forms two display areas separated by the crease; these are the first area and the second area, and the angle between them is the second included angle.
In some embodiments, the second parameter may be obtained by a gyroscope built into the target device detecting the relative pose between the two regions, or directly by detecting the device's bending angle. For example, when the target device is a foldable mobile phone, several gyroscopes may be placed in the phone to detect the pose of the different areas, or the bending angle of the bendable portion may be detected directly to determine the second parameter. The second parameter obtained in this embodiment may be applied in step 102 to determine the target area based on the first parameter and the second parameter.
In an embodiment of the present application, the target device in the current scene includes one screen, which may be a single non-foldable screen. While the user looks at the content the screen is playing, illumination from a light source outside the target device is reflected by the screen and superimposed on that content, so the user cannot see the content as originally displayed.
In the above scenario, the target device includes a non-deformable screen, i.e., the screen is a single area that cannot be bent or otherwise deformed into multiple display areas. For example, the target device may be a monitor, a notebook computer, or a mobile phone without a flexible screen. The target plane is the platform or plane on which the target device is placed, e.g., the desktop under a monitor or the podium under a notebook computer; if the target device is not placed on any plane or table top, the horizontal plane may be taken as the target plane. The third included angle is the angle between the plane of the target device's screen and the target plane. For example, if the target device is a mobile phone, the second parameter is the angle between the plane of the phone's screen and the horizontal plane; if the target device is a notebook computer placed on a desktop, the second parameter is the angle between the notebook's screen and the desktop.
In some embodiments, the second parameter may be obtained by taking the horizontal plane directly as the target plane and using the device's internal sensor to obtain the pose relative to it. For example, when a notebook computer is placed on a desktop, the desktop may be regarded as horizontal, and the angle between the notebook's screen and the horizontal plane, obtained from a built-in sensor, is then the second parameter, i.e., the angle between the screen and the desktop. The second parameter obtained in this embodiment may be applied in step 102 to determine the target area based on the first parameter and the second parameter.
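One plausible sensor route (an assumption for illustration; the text only says a built-in sensor is used) derives the screen's tilt from the gravity vector reported in the screen's own frame:

```python
# Sketch only: third included angle from an accelerometer reading (gx, gy, gz)
# expressed in the screen frame, with z along the screen normal.
import math

def screen_tilt_deg(gravity):
    """Angle between the screen plane and the horizontal target plane."""
    gx, gy, gz = gravity
    g = math.sqrt(gx * gx + gy * gy + gz * gz)
    # Screen lying flat: gravity is entirely along the normal (|gz| = g, tilt 0).
    # Screen vertical: gravity lies in the screen plane (gz = 0, tilt 90).
    return math.degrees(math.acos(max(-1.0, min(1.0, abs(gz) / g))))

tilt = screen_tilt_deg((0.0, 6.93, 6.93))  # equal in-plane and normal parts
```

The flat and vertical limit cases in the comments are the sanity checks; any offset between the sensor frame and the screen frame would need a fixed calibration rotation first.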
In some embodiments, after obtaining the second parameter, the method further comprises: obtaining a third parameter; the third parameter characterizes a position of a target light source, which is a light source that can have a visual impact on the target object.
In some embodiments, the current scene includes a target object, a target device, and a target light source, the target object and the target device representing the same objects and devices as in the embodiments described above.
In the above scenario, the third parameter characterizes the position information of the target light source. The target light source is a light source other than the target device. Specifically, the target light source can be any light-emitting object, such as an incandescent lamp, a flashlight, or the sun, and can also be an object that does not emit light but reflects it, such as a plane mirror or glass. The position may be the position coordinate of the target light source in the world coordinate system; specifically, the third parameter is the position coordinate of the light-emitting or light-reflecting portion of the target light source in the world coordinate system. For example, when the target light source is an incandescent lamp, the third parameter is the position coordinate of the incandescent lamp in the world coordinate system; when the target light source is a plane mirror, the third parameter is the position coordinate, in the world coordinate system, of the part of the plane mirror whose reflected light can reach the target object via reflection by the screen.
In some embodiments, the third parameter may be obtained by identifying a light-source image captured by a photographing device external to the target device, or determined by infrared scanning. For example, where the target light source is an incandescent lamp, an image of the light source may be acquired through a photographing device disposed outside the target device and identified to obtain the third parameter; alternatively, the distance and angle between the incandescent lamp and the target device may be acquired through infrared scanning to obtain the third parameter. The third parameter obtained in this embodiment may be applied to step 102, where the target area is determined based on the first parameter, the second parameter and the third parameter.
In some embodiments, the determining the target area based on the first parameter and the second parameter may be further implemented by:
the target region is determined based on the first parameter, the second parameter, and the third parameter.
In some embodiments, if the target device includes a screen, determining the target area based on the first parameter, the second parameter, and the third parameter may further be implemented by:
the target object is regarded as a luminous light source, and the directions of all reflected light rays of the light rays emitted by the target object after being reflected by the screen are determined based on the first parameter and the second parameter; based on the third parameter and the direction of the reflected light, a set of reflected light passing through the target light source in the reflected light is determined, and a set of reflection points corresponding to the reflected light is determined as a target area, where the target area determined in this embodiment may be the area where brightness compensation is performed in step 103.
In the embodiment of the application, the target device in the current scene comprises at least two screens, wherein light rays emitted by one screen can form glare on the other screen, and the glare can have bad visual influence on the target object when the target object views any one of the screens.
In some embodiments, in the above scenario, the determining the target area based on the first parameter and the second parameter, and performing brightness compensation on the target area may be implemented by the following method: determining a target area comprising a light reflecting area based on the first parameter and the second parameter, and adjusting the light emitting brightness of the light reflecting area; determining a target area comprising a light emitting area based on the first parameter and the second parameter, and adjusting the light emitting brightness of the light emitting area; and determining a target area comprising a light emitting area and a light reflecting area based on the first parameter and the second parameter, and cooperatively adjusting the light emitting brightness of the light emitting area and the light reflecting area.
The following concerns adjusting the light-emitting brightness of the light-reflecting area for the target area determined based on the first parameter and the second parameter. The embodiment of the application can adjust the overall light-emitting brightness of the screen where the light-reflecting area is located, or adjust the brightness only of the area of the screen where the light-reflecting area is located.
In the process of adjusting the overall brightness of the screen where the light reflecting area is located, the screen comprises the light reflecting area, so that the light emitting brightness of the light reflecting area is correspondingly adjusted after the overall brightness of the screen is adjusted.
In the process of adjusting the brightness of the area of the screen where the light-reflecting area is located, the area whose brightness needs to be adjusted, namely the light-reflecting area, may be determined first. The light-reflecting area is the screen area, observed by the target object, in which reflection appears on the screen. Its position is not fixed and changes as the first parameter and the second parameter change, but it does not exceed the screen range of the target device. For example, when the target object observes the second screen, it may observe the glare or projection formed on the second screen by the content played on the first screen; the area where that glare or projection is located is the light-reflecting area, and as the position of the target object and the included angle between the first screen and the second screen change, the position of the light-reflecting area on the second screen changes correspondingly. The light-emitting brightness of the light-reflecting area is the brightness emitted by the screen within that area; specifically, it is the sum of the brightness of all pixel points of the screen within the light-reflecting area. For example, the screen within the light-reflecting area on the second screen itself emits light, and the sum of the luminous intensities of the pixels in that area is the light-emitting brightness of the light-reflecting area.
In some embodiments, observation light paths of the target object may be established based on the first parameter and the second parameter, and the area on the second screen formed by all light paths that start from the target object and can reach the first screen through reflection by the second screen is determined as the light-reflecting area. For example, when the target object is a person, the human eye can be regarded as a light source emitting light: the directions of the rays emitted by the eye after reflection by the second screen are obtained based on the first parameter and the second parameter, and the area formed on the second screen by the rays whose reflections can reach the first screen is determined as the light-reflecting area.
In some embodiments, the brightness of the light-reflecting area may be adjusted either by adjusting the overall brightness of the screen on which it is located, or by adjusting the brightness of only the pixels within the light-reflecting area.
The following concerns the above-described process of determining a target area including the light-emitting area based on the first parameter and the second parameter, and adjusting the light-emitting brightness of the light-emitting area. The embodiment of the application can adjust the overall light-emitting brightness of the screen where the light-emitting area is located, or adjust the brightness only of the area of the screen where the light-emitting area is located.
In the process of adjusting the overall light-emitting brightness of the screen where the light-emitting area is located, since the screen includes the light-emitting area, after the overall light-emitting brightness of the screen is adjusted, the light-emitting brightness of the light-emitting area is correspondingly adjusted.
In the process of adjusting the light-emitting brightness of the area of the screen where the light-emitting area is located, the area where the brightness needs to be adjusted, that is, the light-emitting area, may be determined first. The light-emitting area is an area where light emitted from the screen can reach the target object through reflection of the light-reflecting area, wherein the light-emitting area and the light-reflecting area are positioned on different screens. For example, there is a region a on the first screen, and the light emitted by the region a may reach the target object by being reflected by the second screen, so that the region a on the first screen is the light emitting region.
In some embodiments, the light-emitting area may be determined by the same method as in the above embodiments: the area on the first screen reached by light emitted from the target object after reflection by the second screen is determined as the light-emitting area. The brightness of the light-emitting area can then be adjusted based on the method for adjusting the brightness of the light-reflecting area in the above embodiments.
The following concerns the above-described process of determining, based on the first parameter and the second parameter, a target area including both a light-emitting area and a light-reflecting area, and cooperatively adjusting the light-emitting brightness of the light-emitting area and the light-reflecting area. The embodiment of the application can adjust the overall brightness of the screens where the light-reflecting area and the light-emitting area are located, or adjust the brightness only of the light-reflecting area and the light-emitting area themselves.
In the process of adjusting the overall brightness of the screen where the light reflecting area and the light emitting area are located, the screen respectively comprises the light reflecting area and the light emitting area, so that after the overall brightness of the screen is adjusted, the light emitting brightness of the light reflecting area and the light emitting area is correspondingly adjusted.
In the process of adjusting the brightness of the areas of the screen where the light reflecting area and the light emitting area are located, the area where the brightness needs to be adjusted, namely the light reflecting area and the light emitting area, can be determined first.
In some embodiments, cooperative adjustment means that the brightness of the light-emitting area and the light-reflecting area may be adjusted at the same time, or sequentially; the order and the number of adjustments are not limited. The adjustment method for the light-emitting brightness of each area may be the same as in the above embodiments, i.e., after adjustment the screen overcomes the visual influence of the glare on the target object.
In some embodiments, adjusting the light emission luminance of the region includes: and determining the reflection brightness of the reflection area, and adjusting the luminous brightness of the reflection area and/or adjusting the luminous brightness of the luminous area according to the reflection brightness.
In some embodiments, the reflection brightness of the light-reflecting area characterizes the brightness of the reflection formed by the light emitted from the light-emitting area after being reflected by the light-reflecting area. The light emitted by the light-emitting area of the first screen passes through the first screen's panel, travels to the second screen, is reflected there, and finally reaches the target object; the sum of the brightness of the light reaching the target object is the reflection brightness.
In some embodiments, the brightness of the light-emitting area can be adjusted so that the reflection brightness is reduced to a negligible level; the brightness of the light-reflecting area can be adjusted so that, after the reflection brightness is superimposed on it, it is the same as the brightness of the light-reflecting area before adjustment; or the brightness of both areas can be adjusted simultaneously to overcome the visual influence caused by the reflection. For an area whose brightness has been adjusted, its brightness differs from that of the other areas on the same screen, since the brightness of a screen is by default uniform and would otherwise produce obvious glare. If the brightness of a certain area on the screen is already low, no obvious visual influence arises after it is superimposed with the reflection brightness, and no adjustment is needed. Illustratively, after adjusting the brightness of the light-emitting area on the first screen, the brightness of the light-emitting area differs from that of the other areas on the first screen.
In some embodiments, the determining a target region including a light emitting region and a light reflecting region based on the first parameter and the second parameter includes: constructing all observation light paths based on the first parameter and the second parameter; the observation light path starts from the target object, reflects through a reflection point of the second screen and falls on a luminous point of the first screen; the at least two screens include at least the first screen and a second screen; determining a screen area where each light emitting point is located as the light emitting area, and determining a screen area where each light reflecting point is located as the light reflecting area; and determining the light reflecting area and the light emitting area as the target area.
The observation light path is characterized by taking the target object as a light source: it is the light path formed by the rays emitted by the target object. The light-reflecting point is the contact point between the observation light path and the second screen when the observation light path is reflected by the second screen, and the light-emitting point is the intersection formed by the observation light path and the first screen after the observation light path is reflected by the second screen.
The target area can be determined from the constructed observation light paths as follows. Take the center of the second screen (ABCD) as the origin O, the axis through O parallel to the horizontal direction of the second screen as the X-axis, the axis through O parallel to the vertical direction of the second screen as the Y-axis, and the axis through O perpendicular to the second screen as the Z-axis, establishing a spatial rectangular coordinate system. Let the four vertices of the second screen be A(x_A, y_A, z_A), B(x_B, y_B, z_B), C(x_C, y_C, z_C) and D(x_D, y_D, z_D), let the four vertices of the first screen be A(x_A, y_A, z_A), B(x_B, y_B, z_B), E(x_E, y_E, z_E) and F(x_F, y_F, z_F), and let the included angle between the first screen and the second screen be θ. The position coordinate of the target object is I(x_I, y_I, z_I). Then x_A = x_D = x_F = x_0, x_B = x_C = x_E = −x_0, y_A = y_B = y_0, y_C = y_D = −y_0, and z_A = z_B = z_C = z_D = 0. From the included angle θ it is easy to find that z_E = z_F = 2y_0·sin θ and y_E = y_F = 2y_0·cos θ. The equation of the second screen is z = 0 (−x_0 ≤ x ≤ x_0, −y_0 ≤ y ≤ y_0), and the normal vector of its plane is n = (0, 0, 1). The plane equation of the first screen (AFEB) is A_0·x + B_0·y + C_0·z + D_0 = 0 (−x_0 ≤ x ≤ x_0).
The parameters A_0, B_0, C_0 and D_0 in the equation of the first screen can be derived from equation (1).
With the above as preparation for establishing the observation light paths, the establishment of one observation light path is taken as an example for calculation.
Let the coordinates of any point G on the second screen be (x', y', 0); point G can be regarded as a reflection point on the second screen. The light should be emitted from a point H on the first screen, which can be regarded as a light-emitting point; the light emitted from point H enters the human eye I after being reflected at point G on the second screen. Since the light path is reversible, the human eye can be regarded as a light source to establish the observation light path, i.e., the light is regarded as emitted from point I. The equation of the incident ray IG is shown in formula (2):

(x − x_I)/(x' − x_I) = (y − y_I)/(y' − y_I) = (z − z_I)/(−z_I)   formula (2);

The incident ray direction vector is shown in formula (3):

L = (x' − x_I, y' − y_I, −z_I), x' ∈ [−x_0, x_0], y' ∈ [−y_0, y_0]   formula (3);

Since the second screen lies in the plane z = 0 with normal vector n = (0, 0, 1), reflection flips the z-component of L, and the direction vector of the corresponding reflected ray GH is shown in formula (4):

L' = L − 2(L·n)·n = (x' − x_I, y' − y_I, z_I)   formula (4);

The equation of the reflected ray GH is shown in formula (5):

(x − x')/(x' − x_I) = (y − y')/(y' − y_I) = z/z_I   formula (5);

At this time, the observation light path is established; the complete observation light path is IG–GH, with light emitted from point I and reaching point H after reflection at point G.

If the reflected ray has an intersection point H with the first screen, the following system, formula (6), is solvable; reflection occurs at position G, and point G is the reflection point corresponding to point H:

(x − x')/(x' − x_I) = (y − y')/(y' − y_I) = z/z_I,
A_0·x + B_0·y + C_0·z + D_0 = 0 (−x_0 ≤ x ≤ x_0)   formula (6):
Extending x' over [−x_0, x_0] and y' over [−y_0, y_0] in formula (6), the above method determines the observation light paths formed by all rays emitted from point I toward the second screen. The area formed by the intersection points of these observation light paths with the first screen can thus be obtained and determined as the light-emitting area, and the corresponding area formed by the reflection points on the second screen is the light-reflecting area.
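The construction of one observation light path can be sketched as follows, under the coordinate convention above (second screen in the plane z = 0, hinge along y = y_0). The parameterization of the first screen by its height h above the hinge and the function name are illustrative assumptions of this sketch, not part of the application.

```python
import math

def observation_path(eye, gx, gy, theta, x0, y0, h):
    """Trace one observation light path I -> G -> H (light path reversibility).

    eye: target object position I = (x_I, y_I, z_I), with z_I > 0.
    (gx, gy): candidate reflection point G on the second screen
              (plane z = 0, |x| <= x0, |y| <= y0).
    theta: included angle between the two screens, in radians.
    h: height of the first screen above the hinge line y = y0, z = 0.
    Returns the light-emitting point H on the first screen, or None.
    """
    xi, yi, zi = eye
    if not (-x0 <= gx <= x0 and -y0 <= gy <= y0):
        return None
    # Incident direction I -> G is (gx - xi, gy - yi, -zi); reflection in
    # the plane z = 0 (normal (0, 0, 1)) flips the z-component: formula (4).
    r = (gx - xi, gy - yi, zi)
    # First screen: contains the hinge point B0 = (0, y0, 0) and rises
    # along d = (0, -cos(theta), sin(theta)); its unit normal is n1.
    d = (0.0, -math.cos(theta), math.sin(theta))
    n1 = (0.0, math.sin(theta), math.cos(theta))
    denom = r[1] * n1[1] + r[2] * n1[2]        # r . n1 (n1 has no x part)
    if abs(denom) < 1e-12:
        return None                            # ray parallel to first screen
    t = (y0 - gy) * n1[1] / denom              # solve ((G + t r) - B0) . n1 = 0
    if t <= 0:
        return None
    hx, hy, hz = gx + t * r[0], gy + t * r[1], t * r[2]
    # H must lie inside the first-screen rectangle: |x| <= x0 and the
    # coordinate u along d, measured from the hinge, within [0, h].
    u = (hy - y0) * d[1] + hz * d[2]
    if -x0 <= hx <= x0 and 0.0 <= u <= h:
        return (hx, hy, hz)
    return None
```

Sweeping (gx, gy) over the second screen and collecting the G points that return a hit gives the light-reflecting area, while the returned H points form the light-emitting area.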
In the embodiment of the application, the target device in the current scene comprises two screens, and when the target object looks at one of the screens, the light rays emitted by the other screen form reflection, so that adverse visual influence is generated on the target object.
In some embodiments, the co-adjusting the light emitting region and the light reflecting region comprises: and if the focus of the target object is positioned on the second screen, adjusting the luminous brightness of the luminous area, and if the target condition is not met, adjusting the luminous brightness of the light reflecting area.
In some embodiments, the focus of the target object characterizes the location the target object is attending to. For example, if the target object is a person looking at the second screen, the focal point of the person's gaze is located on the second screen; if the target object is a camera that needs to capture the content of the second screen, the focal point of the camera is located on the second screen.
In some embodiments, the target condition characterizes that the visual influence of the reflection has been overcome. For example, if, after the brightness of the light-emitting area on the first screen is adjusted, the target object still observes reflection on the second screen, the target condition is not satisfied, i.e., the brightness compensation of the light-emitting area alone is insufficient to eliminate the visual influence on the user; if the target object no longer observes reflection on the second screen, the target condition is satisfied.
In some embodiments, the focal position of the target object may be determined by a tracking device built in the target device, and after determining that the focal point of the target object is located on the second screen, the light-emitting brightness of the light-emitting area on the first screen may be adjusted, specifically, the light-emitting brightness of the entire light-emitting area may be adjusted, or the brightness of a partial area within the light-emitting area may be adjusted.
In some embodiments, the internal system of the target device may calculate the value of the reflection brightness corresponding to the adjusted light-emitting area and judge whether that reflection brightness would have a visual influence on the target object, thereby determining whether the target condition is met; alternatively, it may be determined whether the light currently reaching the target object includes reflection that affects the target object's vision, and the target condition judged accordingly.
In some embodiments, when the target condition is not satisfied, the light-emitting brightness of the light-reflecting area is adjusted, and the adjustment method may be the same as the above-described method of adjusting the light-emitting brightness of the light-emitting area.
In some embodiments, the adjusting the light emitting brightness of the light emitting region includes: obtaining an adjustment coefficient, and adjusting the luminous brightness of the luminous area based on the adjustment coefficient; the adjusting the light-emitting brightness of the light-reflecting area comprises the following steps: obtaining a fourth parameter and a fifth parameter; the fourth parameter characterizes the transmissivity of the first screen and the fifth parameter characterizes the reflectivity of the second screen; determining the reflection brightness of the reflection area based on the fourth parameter, the fifth parameter and the brightness of the brightness-adjusted light-emitting area; the reflection brightness represents the brightness of reflection formed by the light-emitting area in the reflection area after adjustment; and reducing the light-emitting brightness of the light-reflecting area by units of the light-reflecting brightness.
In some embodiments, the light-emitting brightness of a region may be represented by the brightness value of each pixel, and the adjustment coefficient characterizes the operation parameter applied to each brightness value in the region. For example, if the brightness value of all pixels in the current light-emitting area is 100 and the brightness is to be adjusted to 80% of the current brightness, the adjustment coefficient is 0.8 and the adjusted pixel brightness value is 100 × 0.8 = 80. Due to the factory settings of the screen, the adjustment coefficient has a maximum and a minimum value, i.e., the screen brightness has an upper and a lower limit while the screen is on.
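The scaling just described can be sketched as follows; the clamp bounds lo and hi, standing in for the screen's factory brightness limits, are illustrative assumptions.

```python
def adjust_region_luminance(pixel_values, k, lo=0.0, hi=255.0):
    """Scale every pixel brightness value in the region by the adjustment
    coefficient k, clamped to the screen's factory limits [lo, hi]."""
    return [min(max(v * k, lo), hi) for v in pixel_values]
```

For instance, scaling a region of pixels at brightness 100 with k = 0.8 yields 80 per pixel, while values pushed past the factory limit are clamped to it.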
In some embodiments, the transmittance characterizes the ratio of the amount of light transmitted through an object to the amount of incident light. Illustratively, if a transparent cover sits on the screen and 90% of the light emitted by the pixel points passes through the cover while the remaining 10% is reflected by it without passing through, the transmittance of the transparent cover is 90%. The reflectivity characterizes the ratio of the amount of light reflected by the surface of an object to the amount of incident light; taking the transparent cover as an example again, if 90% of the light striking it is reflected and the remaining 10% passes through, the reflectivity of the cover is 90%. Differences in transmittance and reflectivity may be caused by differences in the screen cover plate, coating, and viewing angle.
In some embodiments, the reflection brightness of the light-reflecting area can be determined as follows. Let the brightness of a light-emitting point a_i on the first screen after adjustment be L_{a_i}, let the transmittance of the first screen be p(α), and let the reflectivity of the second screen be r(α); then the reflection brightness at the corresponding reflection point a'_i on the second screen is L_{a'_i} = L_{a_i}·p(α)·r(α). Further, reducing the light-emitting brightness of the reflection point a'_i by this value makes the sum of the reflection brightness and the light-emitting brightness at the reflection point the same as the light-emitting brightness before adjustment. It should be noted that the reflection brightness L_{a'_i} is smaller than the light-emitting brightness of the reflection point before adjustment.
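A minimal numeric sketch of this compensation, assuming fixed scalar transmittance p and reflectivity r (the application's p(α) and r(α) at one viewing angle); the function names are illustrative.

```python
def reflection_luminance(l_emit_adj, p, r):
    """Brightness of the glare at the reflection point: the adjusted
    emitting-point brightness attenuated by the first screen's
    transmittance p and the second screen's reflectivity r."""
    return l_emit_adj * p * r

def compensate_reflection_point(l_reflect_emit, l_emit_adj, p, r):
    """Lower the reflection point's own emission by the glare brightness,
    so that glare + emission equals the original emission."""
    return max(l_reflect_emit - reflection_luminance(l_emit_adj, p, r), 0.0)
```

For example, with an adjusted emitting-point brightness of 100, p = 0.9 and r = 0.5, the glare contributes 45; a reflection point originally emitting 200 is lowered to 155, and 155 + 45 restores the original 200.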
In some embodiments, adjusting the area light emission luminance includes: determining different color channels corresponding to each emitted ray; one area corresponds to a plurality of emitted light rays, and the sum of the luminous brightness corresponding to each emitted light ray is the regional luminous brightness; and adjusting the channel brightness of the channels with different colors to adjust the luminous brightness corresponding to the emitted light.
In some embodiments, a pixel includes three different color channels, R, G and B, and a single pixel emits light of the mixed color of the three channels. A region contains a plurality of pixels, each of which emits light, and the sum of the brightness of all pixels in the region is the light-emitting brightness of the region. The brightness of each pixel can be changed by adjusting the brightness of each of its channels; because a brightness difference exists between the individual color channels and the light combined from them, adjusting the brightness of a single color channel can improve the accuracy of the adjustment.
In some embodiments, the above-mentioned adjustment of the area light-emitting brightness may be achieved by the following method:
setting a first screen luminous point b on the first screen i The luminous brightness of (2) isThe R, G, B brightness is respectivelyI.e. < ->Reflection point c on second screen i Is +.>The R, G, B brightness is ∈>I.e. < ->Respectively adjust the luminous points b i And a reflection point c i So that the adjusted luminous point b i Luminous intensity +.>The adjusted reflection point c i Is a luminance of light emitted from the light sourceSatisfy->And->The adjustment operation is carried out on each reflecting point and the corresponding luminous point of the reflecting area, so that the target condition can be met, and the visual influence of the reflecting on the target object is overcome.
Based on the foregoing embodiments, the embodiment of the present application further provides a display control device, and fig. 3 is a schematic structural diagram of the display control device provided by the embodiment of the present application. The display control device 300 includes an acquisition module 301, a determination module 302, and an adjustment module 303.
An obtaining module 301, configured to obtain a first parameter and a second parameter; the first parameter characterizes the position information of the target object, and the second parameter characterizes the pose information of the target device.
In some embodiments, the obtaining module 301 is configured to obtain the second parameter when the target device includes at least two screens; the second parameter characterizes a first angle between the at least two screens.
In some embodiments, the obtaining module 301 is configured to obtain the second parameter when the target device includes a screen; the second parameter characterizes a second included angle between the first area and the second area; the one screen includes the first region and the second region.
In some embodiments, the obtaining module 301 is configured to obtain an adjustment coefficient.
In some embodiments, the obtaining module 301 is configured to obtain a fourth parameter and a fifth parameter; the fourth parameter characterizes a transmissivity of the first screen and the fifth parameter characterizes a reflectivity of the second screen.
In some embodiments, the obtaining module 301 is configured to obtain the second parameter when the target device includes a screen; the second parameter characterizes a third included angle between the one screen and the target plane.
In some embodiments, the obtaining module 301 is configured to obtain a third parameter; the third parameter characterizes a position of a target light source, which is a light source that can have a visual impact on the target object.
A determining module 302, configured to determine a target area based on the first parameter and the second parameter; and the target area is an area which generates visual influence on the target object, and belongs to a screen area of the target equipment.
In some embodiments, the determining module 302 is configured to determine a target area including a retroreflective area based on the first parameter and the second parameter.
In some embodiments, the determining module 302 is configured to determine a target area comprising the light emitting area based on the first parameter and the second parameter.
In some embodiments, the determining module 302 is configured to determine a target area including a light emitting area and a light reflecting area based on the first parameter and the second parameter.
In some embodiments, the determining module 302 is configured to determine a retroreflective brightness of the retroreflective regions.
In some embodiments, the determining module 302 is configured to construct all observation light paths based on the first parameter and the second parameter. An observation light path starts from the target object, is reflected at a reflection point of the second screen, and falls on a light-emitting point of the first screen. The at least two screens include at least the first screen and the second screen.
And determining the screen area where each light emitting point is located as the light emitting area, and determining the screen area where each light reflecting point is located as the light reflecting area.
And determining the light reflecting area and the light emitting area as the target area.
In some embodiments, the determining module 302 is configured to determine a different color channel corresponding to each emitted light; one region corresponds to a plurality of emitted light rays, and the sum of the light-emitting brightness corresponding to each emitted light ray is the region light-emitting brightness.
An adjustment module 303, configured to perform brightness compensation on the target area in response to an instruction for visual impact.
In some embodiments, the adjusting module 303 is configured to adjust the light emitting brightness of the light reflection area.
In some embodiments, the adjusting module 303 is configured to adjust the light-emitting brightness of the light-emitting region.
In some embodiments, the adjusting module 303 is configured to cooperatively adjust the light-emitting brightness of the light-emitting area and the light-reflecting area.
In some embodiments, the adjusting module 303 is configured to adjust the light-emitting brightness of the light-reflecting area and/or adjust the light-emitting brightness of the light-emitting area according to the light-reflecting brightness.
In some embodiments, the adjusting module 303 is configured to adjust the light-emitting brightness of the light-emitting area when the focus of the target object is located on the second screen, and, if the target condition is still not met, to adjust the light-emitting brightness of the light-reflecting area.
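That two-stage policy (dim the light-emitting area first; fall back to the light-reflecting area only if the target condition still fails) can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the callback-based interface is an assumption.

```python
def cooperative_adjust(focus_on_second_screen, target_condition_met,
                       dim_emitting_area, dim_reflecting_area):
    """Two-stage cooperative brightness compensation.

    When the target object's focus is on the second screen, first dim the
    light-emitting area; if the target condition is still not met, also dim
    the light-reflecting area.
    """
    if not focus_on_second_screen:
        return  # no cooperative adjustment needed in this sketch
    dim_emitting_area()
    if not target_condition_met():
        dim_reflecting_area()
```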
In some embodiments, the adjusting module 303 is configured to adjust the light-emitting brightness of the light-emitting area based on the adjustment coefficient.
In some embodiments, the adjusting module 303 is configured to determine the reflection brightness of the light-reflecting area based on the fourth parameter, the fifth parameter, and the light-emitting brightness of the brightness-adjusted light-emitting area; the reflection brightness represents the brightness of the reflection that the adjusted light-emitting area forms in the light-reflecting area; and to reduce the light-emitting brightness of the light-reflecting area by the amount of the reflection brightness.
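Reading the fourth parameter as the first screen's transmittance and the fifth parameter as the second screen's reflectance (as claim 8 does), the compensation step reduces to one multiplication and one subtraction. A minimal Python sketch; the function names and the linear-attenuation model are illustrative assumptions:

```python
def reflection_luminance(transmittance, reflectance, emitting_luminance):
    """Luminance of the ghost image that the (already-dimmed) light-emitting
    area forms in the light-reflecting area: attenuated once by the first
    screen's transmittance and once by the second screen's reflectance."""
    return transmittance * reflectance * emitting_luminance

def compensate_reflecting_area(current_luminance, transmittance, reflectance,
                               adjusted_emitting_luminance):
    """Reduce the reflecting area's own output by the reflection luminance so
    the perceived (emitted + reflected) brightness stays level; clamp at 0."""
    ghost = reflection_luminance(transmittance, reflectance,
                                 adjusted_emitting_luminance)
    return max(current_luminance - ghost, 0.0)
```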
In some embodiments, the adjusting module 303 is configured to adjust the channel brightness of the different color channels to adjust the light emitting brightness corresponding to the emitted light.
The description of the apparatus embodiments above is similar to that of the method embodiments, and they have beneficial effects similar to those of the method embodiments. In some embodiments, the functions or modules included in the apparatus provided by the embodiments of the present application may be used to perform the methods described in the foregoing method embodiments; for technical details not disclosed in the apparatus embodiments of the present application, reference should be made to the description of the method embodiments of the present application.
It should be noted that, in the embodiments of the present application, if the above display control method is implemented in the form of software function modules and sold or used as an independent product, it may also be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the embodiments of the present application, in essence, or the part contributing to the related art, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (Read Only Memory, ROM), a magnetic disk, an optical disk, or other media capable of storing program code. Thus, the embodiments of the application are not limited to any specific hardware, software, or firmware, or any combination thereof.
The embodiment of the application provides a computer device, which comprises a memory and a processor, wherein the memory stores a computer program capable of running on the processor, and the processor realizes part or all of the steps in the method when executing the program.
Embodiments of the present application provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs some or all of the steps of the above-described method. The computer readable storage medium may be transitory or non-transitory.
Embodiments of the present application provide a computer program comprising computer readable code which, when run in a computer device, causes a processor in the computer device to perform some or all of the steps for carrying out the above method.
Embodiments of the present application provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program which, when read and executed by a computer, performs some or all of the steps of the above-described method. The computer program product may be realized in particular by means of hardware, software or a combination thereof. In some embodiments, the computer program product is embodied as a computer storage medium, in other embodiments the computer program product is embodied as a software product, such as a software development kit (Software Development Kit, SDK), or the like.
It should be noted here that: the above description of various embodiments is intended to emphasize the differences between the various embodiments, the same or similar features being referred to each other. The above description of apparatus, storage medium, computer program and computer program product embodiments is similar to that of method embodiments described above, with similar advantageous effects as the method embodiments. For technical details not disclosed in the embodiments of the apparatus, the storage medium, the computer program and the computer program product of the present application, reference should be made to the description of the embodiments of the method of the present application.
Fig. 4 is a schematic diagram of a hardware entity of a computer device according to an embodiment of the present application. As shown in Fig. 4, the hardware entity of the computer device 400 includes a processor 401 and a memory 402, wherein the memory 402 stores a computer program executable on the processor 401, and the processor 401 implements the steps of the method of any of the embodiments described above when executing the program.
The memory 402 is configured to store instructions and applications executable by the processor 401, and may also cache data (e.g., image data, audio data, voice communication data, and video communication data) to be processed or already processed by each module in the processor 401 and the computer device 400; it may be implemented by a flash memory (FLASH) or a random access memory (Random Access Memory, RAM).
The processor 401, when executing a program, implements the steps of the display control method of any one of the above embodiments. The processor 401 generally controls the overall operation of the computer device 400.
An embodiment of the present application provides a computer storage medium storing one or more programs executable by one or more processors to implement the steps of the display control method of any of the embodiments above.
It should be noted here that: the description of the storage medium and apparatus embodiments above is similar to that of the method embodiments described above, with similar benefits as the method embodiments. For technical details not disclosed in the embodiments of the storage medium and the apparatus of the present application, please refer to the description of the method embodiments of the present application.
The processor may be at least one of an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a digital signal processor (Digital Signal Processor, DSP), a digital signal processing device (Digital Signal Processing Device, DSPD), a programmable logic device (Programmable Logic Device, PLD), a field-programmable gate array (Field Programmable Gate Array, FPGA), a central processing unit (Central Processing Unit, CPU), a controller, a microcontroller, or a microprocessor. It will be appreciated that the electronic device implementing the above processor function may be another device; the embodiments of the present application are not specifically limited in this regard.
The computer storage medium/memory may be a read-only memory (Read Only Memory, ROM), a programmable read-only memory (Programmable Read-Only Memory, PROM), an erasable programmable read-only memory (Erasable Programmable Read-Only Memory, EPROM), an electrically erasable programmable read-only memory (Electrically Erasable Programmable Read-Only Memory, EEPROM), a ferroelectric random access memory (Ferroelectric Random Access Memory, FRAM), a flash memory (Flash Memory), a magnetic surface memory, an optical disk, or a compact disc read-only memory (Compact Disc Read-Only Memory, CD-ROM); it may also be various terminals including one or any combination of the above memories, such as mobile phones, computers, tablet devices, and personal digital assistants.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in various embodiments of the present application, the sequence numbers of the steps/processes described above do not imply an order of execution; the execution order of the steps/processes should be determined by their functions and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present application. The foregoing embodiment numbers of the present application are merely for description and do not represent the relative merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are only illustrative; for example, the division of the units is only a logical function division, and there may be other divisions in actual implementation, such as: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling, or communicative connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communicative connection between devices or units may be electrical, mechanical, or in other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may all be integrated in one processing unit, or each unit may serve as a separate unit, or two or more units may be integrated in one unit; the integrated unit may be implemented in the form of hardware or in the form of hardware plus software functional units. Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments may be implemented by program instructions instructing the relevant hardware; the foregoing program may be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a read-only memory (Read Only Memory, ROM), a magnetic disk, or an optical disk.
Alternatively, the integrated units of the present application described above may be stored in a computer-readable storage medium if implemented in the form of software functional modules and sold or used as independent products. Based on such an understanding, the technical solution of the present application, in essence, or the part contributing to the related art, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a ROM, a magnetic disk, or an optical disk.
The foregoing is merely an embodiment of the present application, but the protection scope of the present application is not limited thereto. Any change or substitution readily conceivable by a person skilled in the art within the technical scope disclosed by the present application shall be covered by the protection scope of the present application.

Claims (10)

1. A display control method, the method comprising:
obtaining a first parameter and a second parameter; the first parameter represents the position information of the target object, and the second parameter represents the relative pose of the target device;
Determining a target region based on the first parameter and the second parameter; the target area is an area which generates visual influence on the target object, and belongs to a screen area of the target equipment;
performing brightness compensation on the target area in response to an instruction for the visual influence.
2. The method of claim 1, the obtaining a second parameter comprising:
obtaining the second parameter if the target device comprises at least two screens; the second parameter characterizes a first included angle between the at least two screens;
obtaining the second parameter if the target device comprises a screen; the second parameter characterizes a second included angle between the first area and the second area; the one screen includes the first region and the second region.
3. The method of claim 1, the obtaining a second parameter comprising:
obtaining the second parameter if the target device comprises a screen; the second parameter characterizes a third included angle between the screen and the target plane;
after the second parameter is obtained, the method further comprises:
obtaining a third parameter; the third parameter characterizes the position of a target light source, wherein the target light source is a light source capable of generating visual influence on the target object;
The determining a target area based on the first parameter and the second parameter includes:
the target region is determined based on the first parameter, the second parameter, and the third parameter.
4. The method of claim 1, if the target device includes at least two screens, determining a target area based on the first parameter and the second parameter, brightness compensating the target area, comprising one of:
determining a target area comprising a light reflecting area based on the first parameter and the second parameter, and adjusting the light emitting brightness of the light reflecting area;
determining a target area comprising a light emitting area based on the first parameter and the second parameter, and adjusting the light emitting brightness of the light emitting area;
and determining a target area comprising a light emitting area and a light reflecting area based on the first parameter and the second parameter, and cooperatively adjusting the light emitting brightness of the light emitting area and the light reflecting area.
5. The method of claim 4, adjusting the light-emitting brightness of the region, comprising:
and determining the reflection brightness of the reflection area, and adjusting the luminous brightness of the reflection area and/or adjusting the luminous brightness of the luminous area according to the reflection brightness.
6. The method of claim 4, the determining a target region comprising a light emitting region and a light reflecting region based on the first parameter and the second parameter, comprising:
constructing all observation light paths based on the first parameter and the second parameter; the observation light path starts from the target object, reflects through a reflection point of the second screen and falls on a luminous point of the first screen; the at least two screens include at least the first screen and a second screen;
determining a screen area where each light emitting point is located as the light emitting area, and determining a screen area where each light reflecting point is located as the light reflecting area;
and determining the light reflecting area and the light emitting area as the target area.
7. The method of claim 4, the cooperatively adjusting the brightness of the light-emitting area and the light-reflecting area comprising:
and if the focus of the target object is positioned on the second screen, adjusting the luminous brightness of the luminous area, and if the target condition is not met, adjusting the luminous brightness of the light reflecting area.
8. The method of claim 7, the adjusting the light-emitting brightness of the light-emitting region comprising:
obtaining an adjustment coefficient, and adjusting the luminous brightness of the luminous area based on the adjustment coefficient;
The adjusting the light-emitting brightness of the light-reflecting area comprises the following steps:
obtaining a fourth parameter and a fifth parameter; the fourth parameter characterizes the transmissivity of the first screen and the fifth parameter characterizes the reflectivity of the second screen;
determining the reflection brightness of the reflection area based on the fourth parameter, the fifth parameter and the brightness of the brightness-adjusted light-emitting area; the reflection brightness represents the brightness of reflection formed by the light-emitting area in the reflection area after adjustment;
and reducing the light-emitting brightness of the light-reflecting area by the amount of the reflection brightness.
9. The method of claim 4, adjusting regional luminescence brightness, comprising:
determining different color channels corresponding to each emitted ray; one area corresponds to a plurality of emitted light rays, and the sum of the luminous brightness corresponding to each emitted light ray is the regional luminous brightness;
and adjusting the channel brightness of the channels with different colors to adjust the luminous brightness corresponding to the emitted light.
10. A display control apparatus, characterized in that the apparatus comprises:
the acquisition module is used for acquiring the first parameter and the second parameter; the first parameter represents the position information of the target object, and the second parameter represents the pose information of the target device;
A determining module for determining a target area based on the first parameter and the second parameter; the target area is an area which generates visual influence on the target object, and belongs to a screen area of the target equipment;
and the adjusting module is used for responding to the instruction aiming at visual influence and carrying out brightness compensation on the target area.
CN202310946548.4A 2023-07-28 2023-07-28 Display control method and device Pending CN117037659A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310946548.4A CN117037659A (en) 2023-07-28 2023-07-28 Display control method and device


Publications (1)

Publication Number Publication Date
CN117037659A true CN117037659A (en) 2023-11-10

Family

ID=88601503


Country Status (1)

Country Link
CN (1) CN117037659A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination