CN113379855A - Image processing method, apparatus, device, computer program product and storage medium - Google Patents

Image processing method, apparatus, device, computer program product and storage medium

Info

Publication number
CN113379855A
Authority
CN
China
Prior art keywords
color
user
color information
target
information
Legal status
Pending
Application number
CN202110695060.XA
Other languages
Chinese (zh)
Inventor
王非非
丁卫涛
鲁公涛
Current Assignee
Goertek Optical Technology Co Ltd
Original Assignee
Goertek Optical Technology Co Ltd
Priority date
2021-06-22
Filing date
2021-06-22
Publication date
2021-09-10
Application filed by Goertek Optical Technology Co Ltd
Priority to CN202110695060.XA
Publication of CN113379855A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/90: Determination of colour characteristics

Abstract

The invention discloses an image processing method applied to a head-up display system, comprising the following steps: collecting sample image data displayed by the head-up display system, and generating a standard color card according to the collected sample image data; collecting data from a user by using the standard color card to determine the correspondence between the color information the user can recognize and the color information of the sample image data; and calibrating the image information in the head-up display system according to the correspondence to obtain a target image. The invention also discloses an apparatus, a device, a computer program product and a storage medium. According to the invention, the color information the user can recognize normally is obtained by collecting data from the user, and the image information in the head-up display system is calibrated based on that color information, so that a target image the user can recognize normally is obtained and the driving safety of the user is improved.

Description

Image processing method, apparatus, device, computer program product and storage medium
Technical Field
The present invention relates to the field of driving assistance technologies, and in particular, to an image processing method, an apparatus, a device, a computer program product, and a storage medium.
Background
At present, to improve driving safety, many vehicles are equipped with a head-up display (HUD) system for driving assistance. The HUD system can project current driving information such as vehicle speed, engine speed, remaining range and tire pressure, as well as interactive information such as navigation and multimedia, through the windshield in front of the driver to form a virtual display, so that the driver can obtain the corresponding driving information without lowering his or her head to check the instrument panel, avoiding the dangerous driving behavior of taking the eyes off the driving path to look at the instrument panel while driving.
However, a person with poor color discrimination can clearly distinguish a color only when it is relatively saturated, and can distinguish a change of hue only when the wavelengths differ greatly. Meanwhile, to keep the displayed information intuitive, the driving information in a HUD system is mostly distinguished and marked by color, so a color-weak person driving a vehicle fitted with a HUD system still faces a considerable safety hazard if he or she cannot correctly recognize the driving information displayed by the HUD system in different colors.
Disclosure of Invention
The main object of the present invention is to provide an image processing method, apparatus, device, computer program product and storage medium applied to a head-up display system, so as to improve the driving safety of people with color weakness.
To achieve the above object, the present invention provides an image processing method including the following steps:
acquiring sample image data displayed by a head-up display system, and generating a standard color chart according to the acquired sample image data;
acquiring data of a user by using the standard color card to determine a corresponding relation between color information which can be identified by the user and the color information of the sample image data;
and calibrating the image information in the head-up display system according to the corresponding relation to obtain a target image.
Optionally, the step of generating a standard color chart according to the acquired sample image data includes:
performing color extraction on the sample image data to acquire color information of the sample image data, wherein the color information of the sample image data comprises RGB values of the sample image data;
and generating a standard color card according to the RGB value of the sample image data.
Optionally, the step of performing data acquisition on a user by using the standard color chart to determine a correspondence between color information recognizable by the user and color information of the sample image data includes:
acquiring data of a user by using the standard color card to acquire color information which can be identified by the user, wherein the color information which can be identified by the user comprises a target RGB value;
fitting the target RGB values with RGB values of the sample image data to determine correspondence between color information recognizable by the user and color information of the sample image data.
Optionally, the step of acquiring data of the user by using the standard color chart to obtain color information that can be recognized by the user includes:
acquiring a data acquisition instruction of a user, and determining a target color card from the standard color cards according to the data acquisition instruction;
adjusting the color information of the target color card according to the data acquisition instruction so as to adjust the color information of the target color card to the target color information which can be identified by the user;
and recording the target color information of the target color card to obtain the color information which can be identified by the user.
Optionally, the step of adjusting the color information of the target color card according to the data acquisition instruction to adjust the color information of the target color card to the target color information that can be recognized by the user includes:
acquiring a first target color card from the target color card, and adjusting the color information of the first target color card according to the data acquisition instruction;
and when the confirmation instruction of the user is detected, returning and executing the step of acquiring the first target color card from the target color card until the color information of all the color cards in the target color card is adjusted to the target color information which can be identified by the user.
Optionally, after the step of performing calibration processing on the image in the head-up display system according to the corresponding relationship to obtain the target image, the method further includes:
displaying the target image on the head-up display system, and acquiring an image adjusting instruction of the user;
and returning and executing the step of performing data acquisition on the user by using the standard color card according to the image adjusting instruction so as to determine the corresponding relation between the color information which can be identified by the user and the color information of the sample image data.
Further, to achieve the above object, the present invention also provides an image processing apparatus comprising:
the sample generation module is used for acquiring sample image data displayed by the head-up display system and generating a standard color card according to the acquired sample image data;
the data acquisition module is used for acquiring data of a user by using the standard color card so as to determine the corresponding relation between color information which can be identified by the user and the color information of the sample image data;
and the image processing module is used for carrying out calibration processing on the image information in the head-up display system according to the corresponding relation to obtain a target image.
Further, to achieve the above object, the present invention also provides an image processing apparatus comprising: a memory, a processor and an image processing program stored on the memory and executable on the processor, the image processing program, when executed by the processor, implementing the steps of the image processing method as described above.
In addition, to achieve the above object, the present invention also provides a storage medium having an image processing program stored thereon, the image processing program implementing the steps of the image processing method as described above when executed by a processor.
Furthermore, to achieve the above object, the present invention also provides a computer program product comprising a computer program which, when being executed by a processor, realizes the steps of the image processing method as described above.
The embodiment of the invention provides an image processing method, apparatus, device, computer program product and storage medium. In the prior art, a color-weak person cannot clearly recognize the information displayed by a head-up display system while driving, which creates a serious driving safety hazard. In contrast, in the embodiment of the invention, a standard color card is generated by collecting sample image data displayed by the head-up display system; data is collected from the user with the standard color card to determine the correspondence between the color information the user can recognize and the color information of the sample image data; and the image information in the head-up display system is calibrated according to the correspondence to obtain a target image.
Drawings
Fig. 1 is a schematic hardware structure diagram of an embodiment of an image processing apparatus according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a first embodiment of an image processing method according to the present invention;
fig. 3 is a functional block diagram of an image processing apparatus according to an embodiment of the invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the following description, suffixes such as "module", "component" or "unit" used to denote elements are used only to facilitate the explanation of the present invention and have no specific meaning in themselves. Thus, "module", "component" and "unit" may be used interchangeably.
The image processing device (also called terminal, device or terminal device) in the embodiment of the invention can be a terminal device with display and data processing functions, such as a PC, a smart phone, a tablet computer, a portable computer and the like.
As shown in fig. 1, the terminal may include: a processor 1001 such as a CPU, a network interface 1004, a user interface 1003, a memory 1005 and a communication bus 1002. The communication bus 1002 is used to enable connection and communication between these components. The user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard), and optionally may also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a storage device separate from the processor 1001.
Optionally, the terminal may further include a camera, a radio frequency (RF) circuit, a sensor, an audio circuit, a WiFi module, and the like, where the sensors include, for example, light sensors, motion sensors and other sensors. Specifically, the light sensor may include an ambient light sensor, which can adjust the brightness of the display screen according to the brightness of the ambient light, and a proximity sensor, which can turn off the display screen and/or the backlight when the mobile terminal is moved close to the ear. As one type of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally three axes), can detect the magnitude and direction of gravity when the mobile terminal is stationary, and can be used for applications that recognize the attitude of the mobile terminal (such as switching between portrait and landscape, related games and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer and tapping). Of course, the mobile terminal may also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer and an infrared sensor, which are not described here again.
Those skilled in the art will appreciate that the terminal structure shown in fig. 1 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, a memory 1005, which is a storage medium, may include therein an operating system, a network communication module, a user interface module, and an image processing program.
In the terminal shown in fig. 1, the network interface 1004 is mainly used for connecting to a backend server and performing data communication with the backend server; the user interface 1003 is mainly used for connecting a client (user side) and performing data communication with the client; and the processor 1001 may be configured to call an image processing program stored in the memory 1005, which when executed by the processor, implements the operations in the image processing method provided by the embodiments described below.
Based on the hardware structure of the device, the embodiment of the image processing method is provided.
Referring to fig. 2, in a first embodiment of the image processing method of the present invention, the image processing method includes:
step S10, collecting sample image data displayed by the head-up display system, and generating a standard color chart according to the collected sample image data;
the image processing method is implemented in the image processing equipment, is applied to a head-up display system HUD of a vehicle, can be integrated with a central control system of the vehicle, and displays driving information such as vehicle speed, rotating speed, endurance, tire pressure, navigation and the like in front of a driver through projection, so that the driver can acquire the driving information under the condition that the sight line of the driver does not leave a driving route. The image processing method of the present invention is implemented in an image processing device, which may be a terminal device with data processing and/or display functions, such as a personal computer, a tablet computer, and a smart television, and when the image processing method is applied to a head-up display device integrated with an automobile central control system, the device may also be a central control computer of the automobile central control system.
Although an existing HUD system can project driving information onto the front windshield ahead of the driver and thus avoid the safety hazard caused by the driver lowering his or her head to check the instrument panel, a color-weak driver may still be unable to clearly recognize driving information marked by color. For example, navigation commonly uses red and green to distinguish road congestion; a person with red or green weakness who cannot clearly tell red from green cannot obtain the road condition information in time. The present invention therefore provides an image processing method that processes the image information projected by the HUD so that a color-weak person can clearly recognize the driving information, thereby improving the driving safety of color-weak people.
Specifically, the images in the HUD system are first sampled to acquire sample image data, and a standard color card is generated from the sampled data. During sampling, the images of the different driving-information display interfaces of the HUD need to be sampled, to ensure that all of the image information projected and displayed by the HUD can be made recognizable to color-weak people. The standard color card generated from the sampled data is then used to determine the color information of the images currently displayed in the HUD system.
Further, in step S10, the step of generating a standard color chart from the acquired sample image data is refined into steps A1-A2:
Step A1, performing color extraction on the sample image data to obtain color information of the sample image data, wherein the color information of the sample image data includes RGB values of the sample image data;
step A2, generating a standard color chart according to the RGB values of the sample image data.
When generating the standard color card, color extraction is first performed on the sample image data obtained by sampling, so as to obtain the color information of the images in the HUD system; the obtained color information includes the RGB values of the sample image data, that is, the channel values of the different color channels, and the standard color card is then generated based on the extracted RGB values. Each generated standard color card is a monochrome card, that is, each card image contains only one color, and the generated standard color cards cover the colors contained in all display interfaces of the HUD system; besides the three primary colors red, green and blue, they may include colors commonly used in driving information, such as yellow. For example, when the sample image data includes an image of the navigation information display interface containing green, red, blue and yellow display information, the channel values of the green, red and blue information are extracted respectively to generate the corresponding monochrome color cards, and the RGB values of the yellow display information are extracted to generate the corresponding yellow color card. The colors contained in all images of the sample image data are extracted and corresponding color cards are generated, while display information whose color repeats one already extracted is not processed again, which reduces unnecessary computation.
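For illustration only, a possible realization of steps A1-A2 is sketched below in Python, assuming the sampled HUD images are available as ordinary image files; the file names, pixel-count threshold and card size are example assumptions and do not form part of the disclosure.

```python
# Illustrative sketch only: one possible realization of steps A1-A2.
import numpy as np
from PIL import Image

def extract_sample_colors(image_paths, min_pixels=50):
    """Collect the distinct RGB values used by the sampled HUD images (step A1)."""
    colors = set()
    for path in image_paths:
        pixels = np.asarray(Image.open(path).convert("RGB")).reshape(-1, 3)
        values, counts = np.unique(pixels, axis=0, return_counts=True)
        for value, count in zip(values, counts):
            # Skip stray anti-aliasing pixels; a color already extracted from an
            # earlier image is not added twice because a set is used.
            if count >= min_pixels:
                colors.add(tuple(int(c) for c in value))
    return sorted(colors)

def make_color_cards(colors, size=(200, 200)):
    """Generate one monochrome card image per extracted RGB value (step A2)."""
    return {rgb: Image.new("RGB", size, rgb) for rgb in colors}

# Hypothetical sample images of two HUD display interfaces.
cards = make_color_cards(extract_sample_colors(["hud_navigation.png", "hud_dashboard.png"]))
```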
Step S20, collecting data of a user by using the standard color card to determine the corresponding relation between the color information which can be identified by the user and the color information of the sample image data;
after the standard color card is generated, data collection is performed on a user by using the standard color card, in this embodiment, the user includes a color weakness driver, and by performing data collection on the user, a corresponding relationship between color information that can be recognized by the user and color information of sample image data is determined. As is clear, the color weakness includes red weakness (type a) having a poor ability to distinguish red, green weakness (type b) having a poor ability to distinguish green, blue weakness having a poor ability to distinguish blue, and total color weakness having a poor ability to distinguish red, green, and blue, in which red weakness (type a) and green weakness (type b) are more common, and the number of patients having blue weakness is very small, and the number of patients having total color weakness is less common, but the possibility of existence is not excluded. When different patients with weak colors have multiple colors, if the discrimination capability of some colors is poor, the information acquisition capability is reduced, and potential safety hazards are generated in the driving process.
Because the standard color card is generated from the sample image data collected from the HUD system, collecting data from the user with the standard color card and obtaining the color information the user can recognize makes it possible to determine whether the user can clearly recognize the color information of the system, such as its current color saturation.
Further, in step S20, the step of collecting data from the user with the generated standard color chart to determine the correspondence between the color information the user can recognize and the color information of the sample image data is refined into steps B1-B2:
step B1, acquiring data of a user by using the standard color card to acquire color information which can be identified by the user, wherein the color information which can be identified by the user comprises a target RGB value;
step B2, fitting the target RGB values with the RGB values of the sample image data to determine a correspondence between the color information recognizable by the user and the color information of the sample image data.
Data is collected from the user with the cards of different colors in the generated standard color cards. During data collection, the color saturation of a generated standard color card is adjusted until the user can clearly recognize the color information of the card, and the color information of the card at that moment is recorded; this is the color information the user can recognize, and it includes the target RGB value corresponding to each color the user can recognize. The RGB values of the standard color cards are then fitted against the target RGB values corresponding to the color information the user can recognize, so as to determine the correspondence between the color information the user can recognize and the color information of the sample image data, where the correspondence can be expressed as a functional expression obtained by data fitting. For example, suppose the three channels of the sample image data have values R1, G1 and B1, and the three channels of a picture the user considers normal have values R2, G2 and B2; the correspondences between R1 and R2, between G1 and G2, and between B1 and B2 are calculated respectively. If the fitting shows a linear relationship between the RGB values of the sample image data and the RGB values the user can recognize, the correspondences between R1 and R2, G1 and G2, and B1 and B2 can be regarded as proportional relationships, expressed as the functional expression of formula (1):
X2 = k·X1 + b    (1)
where X2 is the RGB value corresponding to the color information the user considers normal, X1 is the RGB value corresponding to the color information of the sample image data, and k and b are scale parameters whose values are obtained from the data fitting result. Through this data collection method, the correspondence between the RGB values of the different colors the user can recognize and the RGB values of the sample image data can be obtained. When fitting the RGB values, the fit can be made directly on the RGB values of a card of the same color before and after adjustment, giving the variation law of the RGB values for that color and hence the correspondence; alternatively, the variation laws of the R, G and B values across different colors can be fitted separately from cards of different colors; or the two fitting methods can be combined. Specifically, different fitting modes can be provided in the HUD system, and a different data fitting mode is selected for users with different types of color weakness, so as to obtain a more accurate fitting result.
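For illustration only, the fit of formula (1) for a single color channel can be sketched as a least-squares fit, for example as follows; the numeric values are invented for the example.

```python
# Illustrative sketch only: a least-squares fit of formula (1) for one channel.
import numpy as np

sample_values = np.array([60, 120, 180, 240])   # X1: channel values of the sample image data
user_values = np.array([95, 150, 205, 255])     # X2: channel values the user confirmed as recognizable

# Fit X2 = k * X1 + b; np.polyfit returns the slope first, then the intercept.
k, b = np.polyfit(sample_values, user_values, deg=1)
print(f"X2 = {k:.3f} * X1 + {b:.3f}")
```

In practice, the more card adjustments that are recorded for a channel, the more stable the estimates of k and b become.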
And step S30, carrying out calibration processing on the image information in the head-up display system according to the corresponding relation to obtain a target image.
After the correspondence between the color information the user can recognize and the color information of the images in the HUD system is determined, the image information in the HUD system is calibrated according to the correspondence, and the images in the HUD system are processed into target images the user can clearly recognize, thereby improving the user's ability to obtain information. The calibration of the image information in the HUD system may be performed at the imaging end or at the projection end of the HUD system; both achieve the same effect, and no specific limitation is made here.
It should be noted that, in the color information of the sample image data, different colors and/or different color saturations are obtained by adjusting the values of the three color channels based on the three primary colors red, green and blue. Therefore, to reduce the amount of computation, data may be collected from the user based on the three primary colors only: a standard color card containing only red, green and blue is generated from the sample image data, the single-channel values of the three primary colors the user can recognize normally are determined, the correspondence between the channel values of the three primary colors the user can recognize and the channel values of the three primary colors of the images in the HUD system is determined, and the images in the HUD system are calibrated based on that correspondence. If the correspondence between the color information the user can recognize and the color information of the HUD system is determined based on the three primary colors only, then, when the image information in the HUD system is calibrated and a color is a mixture of red, green and blue, the channel values of the three color channels are processed simultaneously according to the correspondence of the RGB values, so as to increase the color saturation of the image and turn the color information of the HUD image into color information the user can easily recognize.
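For illustration only, the calibration of step S30 can be sketched as a per-channel application of formula (1) at the imaging end, for example as follows; the fitted parameters shown are assumed values, not results from the disclosure.

```python
# Illustrative sketch only: applying a per-channel mapping of the form of
# formula (1) to an image before the HUD projects it.
import numpy as np

channel_fit = {"R": (1.10, 12.0), "G": (1.25, 5.0), "B": (1.00, 0.0)}  # assumed (k, b) per channel

def calibrate_image(image_rgb: np.ndarray) -> np.ndarray:
    """Map every channel of an HxWx3 uint8 image with X2 = k * X1 + b."""
    out = image_rgb.astype(np.float32)
    for idx, channel in enumerate("RGB"):
        k, b = channel_fit[channel]
        out[..., idx] = k * out[..., idx] + b
    # Clamp to the displayable range before the HUD projects the target image.
    return np.clip(out, 0, 255).astype(np.uint8)
```

The same mapping could equally be applied at the projection end, as noted above.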
Further, after the step S30, steps C1-C2 are also included:
step C1, displaying the target image on the head-up display system and acquiring the image adjustment instruction of the user;
and step C2, returning and executing the step of collecting data of the user by using the standard color card according to the image adjusting instruction so as to determine the corresponding relation between the color information which can be identified by the user and the color information of the sample image data.
The color information of the images in the HUD system is processed according to the correspondence between the color information the user can recognize and the color information of the sample image data, so as to obtain target images the user can recognize, and the processed target images are then projected and displayed by the HUD system, so that the degree to which the user can recognize the processed images can be determined. If an image adjustment instruction from the user is obtained, it indicates that the user is not satisfied with the currently processed target image and the color information of the image needs to be adjusted further, so the step of collecting data from the user is performed again.
It can be understood that, depending on the application scenario, the image processing method in this embodiment can also be applied to display terminals such as smart televisions, so that the ability of color-weak people to recognize the image information displayed by the terminal is improved, which improves the user experience.
In this embodiment, a standard color card is generated by collecting sample image data displayed by the head-up display system and according to the collected sample image data; data is collected from the user with the standard color card to determine the correspondence between the color information the user can recognize and the color information of the sample image data; and the image information in the head-up display system is calibrated according to the correspondence to obtain a target image. Data is collected from the color-weak user based on the color information of the images in the head-up display system, so that the correspondence between the color information the color-weak user can recognize and the color information of the images in the head-up display system is determined; the image information in the head-up display system is calibrated according to this correspondence and adjusted into image information the color-weak user can clearly recognize, which improves the color-weak user's ability to recognize the image information in the head-up display system and improves the driving safety of the color-weak user.
Further, on the basis of the above-described embodiment of the present invention, a second embodiment of the image processing method of the present invention is proposed.
This embodiment refines step B1 of the first embodiment into steps B11-B13:
step B11, acquiring a data acquisition instruction of a user, and determining a target color card from the standard color cards according to the data acquisition instruction;
step B12, adjusting the color information of the target color card according to the data acquisition instruction so as to adjust the color information of the target color card to the target color information which can be identified by the user;
and step B13, recording the target color information of the target color card to obtain the color information which can be identified by the user.
Based on the foregoing embodiment, this embodiment describes the data collection from the user in more detail. Specifically, data collection is mainly carried out through information interaction with the user: a data collection instruction from the user is obtained first, where the instruction includes a data collection type and can be triggered by the user clicking a preset data collection button on the HUD system. Color-weak people may have partial color weakness or total color weakness, and partial color weakness includes red weakness, green weakness and blue weakness; if the user has partial color weakness, data only needs to be collected for the colors the user discriminates poorly, and there is no need to collect data and calibrate for the colors the user can recognize normally. A target color card is therefore determined from the generated standard color cards according to the user's data collection instruction and color weakness type. Taking a red-weak user as an example, the cards containing a red channel value are selected from the generated standard color cards as the target color cards, and data is collected from the user with the selected target color cards.
Further, when data is collected from the user, the color information of a standard color card, such as its color saturation, is adjusted until the color information of the card becomes color information the user can recognize normally, and that color information is then recorded, mainly as its RGB value, thereby obtaining the color information the user can recognize normally.
The refinement of step B12 includes steps B121-B122:
step B121, obtaining a first target color card from the target color card, and adjusting color information of the first target color card according to the data acquisition instruction;
and step B122, when the confirmation instruction of the user is detected, returning and executing the step of acquiring the first target color card from the target color card until the color information of all the color cards in the target color card is adjusted to the target color information which can be identified by the user.
Furthermore, the selected target color cards include a plurality of cards, possibly of different colors. First, one card is selected from the target color cards and its color information is adjusted until a confirmation instruction from the user is detected, at which point the current RGB value of that card is recorded. For example, for a red-weak user, a red card is selected from the target color cards corresponding to red weakness and its color saturation is adjusted continuously until the user can recognize its color normally, and the RGB value of the red card at that moment is recorded; then another card of a different color containing an R value is selected from the target color cards, and the above steps are repeated until data has been collected for all cards in the selected target color cards, giving the color information the user can recognize normally for the different colors that contain red.
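For illustration only, the collection loop of steps B121-B122 can be sketched as follows; the adjustment and confirmation callbacks are placeholders for whatever interaction mechanism the HUD system provides and are not part of the disclosure.

```python
# Illustrative sketch only: the per-card collection loop of steps B121-B122.
from typing import Callable, Dict, List, Tuple

RGB = Tuple[int, int, int]

def collect_user_colors(target_cards: List[RGB],
                        adjust_card: Callable[[RGB], RGB],
                        user_confirmed: Callable[[], bool]) -> Dict[RGB, RGB]:
    """Map each sample-card RGB value to the RGB value the user finally recognizes."""
    recognized: Dict[RGB, RGB] = {}
    for original_rgb in target_cards:            # step B121: take one target card
        current_rgb = original_rgb
        while not user_confirmed():              # adjust until a confirmation instruction arrives
            current_rgb = adjust_card(current_rgb)
        recognized[original_rgb] = current_rgb   # step B122: record and move to the next card
    return recognized
```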
When the color of a card is adjusted, adjusting the channel value of only one color channel may change the color of the card for a color-weak user, so the channel values of all three color channels of the card can be adjusted simultaneously, increasing the color saturation, brightness and so on of the card without changing its original color, thereby improving the user's ability to recognize it.
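For illustration only, adjusting the three channels together so that saturation and brightness increase while the hue is preserved can be sketched by way of an HSV conversion, for example as follows; the step sizes are arbitrary example values.

```python
# Illustrative sketch only: raise saturation and brightness of a card while
# keeping its hue by changing all three RGB channels together.
import colorsys

def boost_card(rgb, saturation_step=0.10, value_step=0.05):
    """Return the card color with higher saturation and brightness but the same hue."""
    h, s, v = colorsys.rgb_to_hsv(*(c / 255.0 for c in rgb))
    s = min(1.0, s + saturation_step)
    v = min(1.0, v + value_step)
    return tuple(round(c * 255) for c in colorsys.hsv_to_rgb(h, s, v))

print(boost_card((180, 40, 40)))  # a dull red made more saturated and brighter
```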
In this embodiment, data is collected from the user by selecting target color cards from the generated standard color cards, so that data collection is targeted at users with different types of color weakness and unnecessary computation is reduced. At the same time, collecting data from the user with cards of different colors yields the color information the user can recognize normally for each color, which improves the accuracy of data collection and helps to improve the effect of the image processing.
Further, referring to fig. 3, an embodiment of the present invention also proposes an image processing apparatus including:
a sample generation module 10, configured to acquire sample image data displayed by the head-up display system and generate a standard color chart according to the acquired sample image data;
a data acquisition module 20, configured to perform data acquisition on a user by using the standard color chart, so as to determine a correspondence between color information that can be recognized by the user and color information of the sample image data;
and the image processing module 30 is configured to perform calibration processing on the image information in the head-up display system according to the corresponding relationship to obtain a target image.
Optionally, the sample generation module 10 is further configured to:
performing color extraction on the sample image data to acquire color information of the sample image data, wherein the color information of the sample image data comprises RGB values of the sample image data;
and generating a standard color card according to the RGB value of the sample image data.
Optionally, the data acquisition module 20 is further configured to:
acquiring data of a user by using the standard color card to acquire color information which can be identified by the user, wherein the color information which can be identified by the user comprises a target RGB value;
fitting the target RGB values with RGB values of the sample image data to determine correspondence between color information recognizable by the user and color information of the sample image data.
Optionally, the data acquisition module 20 is further configured to:
acquiring a data acquisition instruction of a user, and determining a target color card from the standard color cards according to the data acquisition instruction;
adjusting the color information of the target color card according to the data acquisition instruction so as to adjust the color information of the target color card to the target color information which can be identified by the user;
and recording the target color information of the target color card to obtain the color information which can be identified by the user.
Optionally, the data acquisition module 20 is further configured to:
acquiring a first target color card from the target color card, and adjusting the color information of the first target color card according to the data acquisition instruction;
and when the confirmation instruction of the user is detected, returning and executing the step of acquiring the first target color card from the target color card until the color information of all the color cards in the target color card is adjusted to the target color information which can be identified by the user.
Optionally, the image processing apparatus further comprises an image display module, configured to:
displaying the target image on the head-up display system, and acquiring an image adjusting instruction of the user;
and returning and executing the step of performing data acquisition on the user by using the standard color card according to the image adjusting instruction so as to determine the corresponding relation between the color information which can be identified by the user and the color information of the sample image data.
Furthermore, an embodiment of the present invention further provides a storage medium, where an image processing program is stored, and the image processing program, when executed by a processor, implements the operations in the image processing method provided by the above-mentioned embodiment.
In addition, an embodiment of the present invention further provides a computer program product, which includes a computer program, and when the computer program is executed by a processor, the computer program implements the operations in the image processing method provided in the foregoing embodiments.
The embodiments of the apparatus, the computer program product, and the storage medium of the present invention may refer to the embodiments of the image processing method of the present invention, and are not described herein again.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity/action/object from another entity/action/object without necessarily requiring or implying any actual such relationship or order between such entities/actions/objects; the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
For the apparatus embodiment, since it is substantially similar to the method embodiment, it is described relatively simply, and reference may be made to some descriptions of the method embodiment for relevant points. The above-described apparatus embodiments are merely illustrative, in that elements described as separate components may or may not be physically separate. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the invention. One of ordinary skill in the art can understand and implement it without inventive effort.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware, but in many cases the former is the better implementation. Based on such understanding, the part of the technical solution of the present invention that is essential, or that contributes to the prior art, may be embodied in the form of a software product, which is stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) as described above and includes several instructions for enabling a terminal device (e.g. a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the image processing method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. An image processing method applied to a head-up display system is characterized by comprising the following steps:
acquiring sample image data displayed by a head-up display system, and generating a standard color chart according to the acquired sample image data;
acquiring data of a user by using the standard color card to determine a corresponding relation between color information which can be identified by the user and the color information of the sample image data;
and calibrating the image information in the head-up display system according to the corresponding relation to obtain a target image.
2. The image processing method of claim 1, wherein the step of generating a standard color chart from the acquired sample image data comprises:
performing color extraction on the sample image data to acquire color information of the sample image data, wherein the color information of the sample image data comprises RGB values of the sample image data;
and generating a standard color card according to the RGB value of the sample image data.
3. The image processing method according to claim 2, wherein the step of performing data acquisition on a user using the standard color chart to determine correspondence between the color information recognizable by the user and the color information of the sample image data includes:
acquiring data of a user by using the standard color card to acquire color information which can be identified by the user, wherein the color information which can be identified by the user comprises a target RGB value;
fitting the target RGB values with RGB values of the sample image data to determine correspondence between color information recognizable by the user and color information of the sample image data.
4. The image processing method according to claim 3, wherein the step of acquiring data of a user by using the standard color chart to obtain color information recognizable by the user comprises:
acquiring a data acquisition instruction of a user, and determining a target color card from the standard color cards according to the data acquisition instruction;
adjusting the color information of the target color card according to the data acquisition instruction so as to adjust the color information of the target color card to the target color information which can be identified by the user;
and recording the target color information of the target color card to obtain the color information which can be identified by the user.
5. The image processing method of claim 4, wherein the target color card comprises a plurality of color cards, and the step of adjusting the color information of the target color card according to the data acquisition instruction to adjust the color information of the target color card to the target color information recognizable by the user comprises:
acquiring a first target color card from the target color card, and adjusting the color information of the first target color card according to the data acquisition instruction;
and when the confirmation instruction of the user is detected, returning and executing the step of acquiring the first target color card from the target color card until the color information of all the color cards in the target color card is adjusted to the target color information which can be identified by the user.
6. The image processing method as claimed in claim 1, wherein after the step of performing the calibration process on the image in the head-up display system according to the correspondence relationship to obtain the target image, the method further comprises:
displaying the target image on the head-up display system, and acquiring an image adjusting instruction of the user;
and returning and executing the step of performing data acquisition on the user by using the standard color card according to the image adjusting instruction so as to determine the corresponding relation between the color information which can be identified by the user and the color information of the sample image data.
7. An image processing apparatus characterized by comprising:
the sample generation module is used for acquiring sample image data displayed by the head-up display system and generating a standard color card according to the acquired sample image data;
the data acquisition module is used for acquiring data of a user by using the standard color card so as to determine the corresponding relation between the color information which can be identified by the user and the color information of the sample image data;
and the image processing module is used for carrying out calibration processing on the image information in the head-up display system according to the corresponding relation to obtain a target image.
8. An image processing apparatus characterized by comprising: memory, a processor and an image processing program stored on the memory and executable on the processor, the image processing program, when executed by the processor, implementing the steps of the image processing method according to any one of claims 1 to 6.
9. A storage medium, characterized in that the storage medium has stored thereon an image processing program which, when executed by a processor, implements the steps of the image processing method according to any one of claims 1 to 6.
10. A computer program product comprising a computer program, characterized in that the computer program realizes the steps of the image processing method according to any one of claims 1 to 6 when executed by a processor.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110695060.XA CN113379855A (en) 2021-06-22 2021-06-22 Image processing method, apparatus, device, computer program product and storage medium

Publications (1)

Publication Number Publication Date
CN113379855A 2021-09-10

Family

ID=77578511

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110695060.XA Pending CN113379855A (en) 2021-06-22 2021-06-22 Image processing method, apparatus, device, computer program product and storage medium

Country Status (1)

Country Link
CN (1) CN113379855A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106199954A (en) * 2016-08-30 2016-12-07 喻阳 A kind of Optical devices correcting the blue weak achromatopsia of yellow and method for designing thereof
US10911748B1 (en) * 2018-07-10 2021-02-02 Apple Inc. Display calibration system
CN111656759A (en) * 2018-11-13 2020-09-11 华为技术有限公司 Image color correction method and device and storage medium
CN110728724A (en) * 2019-10-21 2020-01-24 深圳创维-Rgb电子有限公司 Image display method, device, terminal and storage medium
CN111782845A (en) * 2020-08-17 2020-10-16 Oppo(重庆)智能科技有限公司 Image adjusting method, image adjusting device and mobile terminal


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210910