EP2175414A1 - Informationsumwandlungsverfahren, informationsumwandlungsvorrichtung und informationsumwandlungsprogramm - Google Patents

Informationsumwandlungsverfahren, informationsumwandlungsvorrichtung und informationsumwandlungsprogramm Download PDF

Info

Publication number
EP2175414A1
EP2175414A1 (application EP08792201A)
Authority
EP
European Patent Office
Prior art keywords
textures
color
information conversion
colors
differences
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP08792201A
Other languages
English (en)
French (fr)
Other versions
EP2175414A4 (de)
Inventor
Kenta Shimamura
Po-Chieh Hung
Yorihiro Yamaya
Mayumi Takeda
Shin-Ichiroh Kitoh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Konica Minolta Inc
Original Assignee
Konica Minolta Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Inc filed Critical Konica Minolta Inc
Publication of EP2175414A1
Publication of EP2175414A4
Legal status: Withdrawn

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/40Picture signal circuits
    • H04N1/40012Conversion of colour to monochrome
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61FFILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F9/00Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
    • A61F9/08Devices or methods enabling eye-patients to replace direct visual perception by another kind of perception
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/001Texturing; Colouring; Generation of texture or colour
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001Teaching or communicating with blind persons
    • G09B21/003Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001Teaching or communicating with blind persons
    • G09B21/008Teaching or communicating with blind persons using visual presentation of the information for the partially sighted

Definitions

  • the present invention relates to information conversion methods, information conversion apparatuses, and information conversion programs that make chromatic image display possible in a state that is suitable for observation by color vision deficient persons.
  • the present invention relates to information conversion methods, information conversion apparatuses, and information conversion programs that, when a chromatic image display is made as an achromatic image display, make it possible to guess the original chromatic image display.
  • Color vision deficiency is a disorder of the eye in which there is some abnormality in the recognition of and discrimination between colors, due to defects or abnormalities of the cone photoreceptor cells that perceive color.
  • As described in Table 9.1, "Classifications of color vision deficient persons and abbreviated symbols for them" (p. 189), of "Fundamentals of Color Engineering" by Mitsuo Ikeda (Asakura Shoten), color vision deficient persons are classified according to the photoreceptor cells for red (L cone cells), green (M cone cells), and blue (S cone cells), and according to their sensitivities (see Fig. 10).
  • A person having no sensitivity in one of these photoreceptor cell types is called color blind, classified as type P in the case of L cone cells, type D in the case of M cone cells, and type T in the case of S cone cells.
  • When one of the sensitivities is merely low, the condition is called color weakness, classified respectively as types PA, DA, and TA.
  • The color vision characteristics of types P, D, and T are such that, as described in Table 9.13, "Color confusion lines of dichromatic color vision deficient persons" (p. 205), of "Fundamentals of Color Engineering" by Mitsuo Ikeda (Asakura Shoten), the colors present on such a line (a color confusion line) appear as a completely identical color and cannot be distinguished from one another (see Fig. 11).
  • Such color vision deficient persons cannot distinguish the colors of an image in the same manner as a person with normal color vision, and hence image display or image conversion adapted for color vision deficient persons is necessary.
  • A phenomenon similar to color vision abnormality can occur even for normal persons under light sources with restricted spectral distributions. Further, this phenomenon can also occur when photographing using a camera.
  • The technology in Non-patent Document 1 improves the ease of distinguishing by converting the display into colors that can be distinguished by color vision abnormal persons.
  • However, since there is a trade-off between the amount of color change for color vision abnormal persons and the colors recognized by persons with normal color vision, when conversion is made into colors that can be recognized by color vision abnormal persons, the colors change greatly, and there is a large change in impression from the original display. Because of this, it is difficult to share documents between normal persons and color vision abnormal persons. Although there is a setting that keeps the color change as small as possible, in that case there is little improvement in the ease of distinguishing for color vision abnormal persons. In addition, since the changed color is determined according to the color content of the display, there is the significant problem that the original colors change.
  • The technology in Patent Document 1 not only classifies the display data into data for which color-to-shape conversion is performed and data for which it is not, but also further classifies the data by shape (points, lines, surfaces, etc.); a table associating predetermined colors with shapes is held, and the result of this classification is converted into shapes by referring to the table.
  • The technology described in Patent Document 2 is an apparatus that photographs a subject and converts the display so that it can be distinguished by a color vision abnormal person. In this method, the areas with roughly the same color as the (one or more) colors of the locations identified by the user within the photographed subject are made distinguishable from other areas, using texture or blinking.
  • With these methods, however, the original color cannot be maintained.
  • When an object of a single color is converted into a shape, it very often comes to contain multiple colors. While the multiple colors make it possible to distinguish the object from another object of roughly the same color, even if one of those colors is kept as the original color, the overall color of the object becomes a synthesis of the multiple colors and can sometimes differ from the original color.
  • The technology in Patent Document 3, in a machine using RGB video signals, uses the method of increasing each of the RGB ratios for a person with color weakness.
  • However, this method is limited to color weakness (the trichromatic type of color vision abnormality).
  • Further, this method cannot be used unless it is first investigated which of the many types of color vision characteristics in color weakness the person has.
  • In Patent Document 5, for color blind persons, the colors are converted so that the degrees of color mixture and the color differences become small.
  • The method of Patent Document 5 converts colors for color vision deficient persons so that the sum of the difference from the original color and the extent to which the converted color lies on a color confusion line is made as small as possible.
  • Like the above Non-patent Document 1, this method has the problem that distinguishing while maintaining the original color is difficult, since there is a trade-off between distinguishability and maintaining the original color.
  • The present invention has been made in order to solve the above problems, and its purpose is to provide an information conversion method, an information conversion apparatus, and an information conversion program that realize a chromatic image display, in a condition suitable for observation by both normal persons and persons with color vision abnormality, in which distinguishing close to the original view as observed by normal persons is possible.
  • A further purpose of the present invention is to provide an information conversion method, an information conversion apparatus, and an information conversion program that, even when color image data is output in monochrome, make it possible to distinguish the original chromatic condition.
  • The present invention for solving the above problems is as described below.
  • the areas in the image in which the results of light reception are similar are colors that lie on the same color confusion line.
  • the ability to distinguish is enhanced further by making said textures have patterns or hatching with different angles according to the differences in the original colors.
  • said textures have patterns or hatching with different angles according to the differences in the original colors.
  • the ability to distinguish is enhanced further by making said textures have different contrasts according to the differences in the original colors.
  • the ability to distinguish is enhanced further by making said textures change with time according to the differences in the original colors.
  • the ability to distinguish is enhanced further by making said textures move in different directions according to the differences in the original colors.
  • the ability to distinguish is enhanced further by making said textures have a combination of two or more of: patterns or hatching with different angles according to the differences in the original colors; different contrasts according to the differences in the original colors; changing with time or moving at different speeds according to the differences in the original colors; or moving in different directions or at different speeds according to the differences in the original colors.
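The bullets above describe driving several texture parameters at once from the differences in the original colors. As a minimal sketch, the relative position of a color along its confusion line can be mapped to a hatching angle, a contrast, and a blink rate together; all parameter ranges below are illustrative assumptions, not values from the patent.

```python
def texture_params(t):
    """Map a relative position t in [0, 1] along a color confusion line
    to a bundle of texture parameters. The ranges are illustrative
    assumptions; the patent leaves the exact correspondence open."""
    assert 0.0 <= t <= 1.0
    return {
        "angle_deg": 45.0 + 90.0 * t,   # hatching angle, 45..135 degrees
        "contrast":  0.2 + 0.8 * t,     # pattern contrast, low..high
        "blink_hz":  0.5 + 2.5 * t,     # temporal parameter: blink rate
    }

p = texture_params(0.0)   # one end of the confusion line
q = texture_params(1.0)   # the other end
```

Varying two or more parameters together, as the text suggests, makes the apparent change larger and the colors easier to tell apart.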
  • The above color confusion line method of the present invention has a particularly large effect on color blindness. Although color confusion lines are defined for color blind persons, the color confusions of types PA, DA, and TA, who are color vision weak persons, are respectively similar to the color confusions of types P, D, and T, who are color blind persons; therefore, with the method of the present invention, it is possible to provide a suitable display for both color blind persons and color vision weak persons.
  • Since the angle of the texture is determined for each area of a predetermined prescribed number of pixels, a large memory is not necessary for superimposing a texture such as hatching on the image. For example, by dividing the image into strips of the prescribed pixel width, it becomes possible to process the image with only a small quantity of memory.
  • When the pattern has become a checkered pattern due to dithering or the like, or in the case of a simple vertical or horizontal stripe pattern, the appearance is that of an average color; therefore, upon judging that dither or noise is present, the pattern is not treated as a segment, and textures such as hatching are added in a condition that is close to the visual appearance.
  • Fig. 2 is a block diagram showing the detailed configuration of an information conversion apparatus 100 according to a first preferred embodiment of the present invention.
  • the block diagram of the present information conversion apparatus 100 also expresses the processing procedure of the information conversion method, and the different routines of the information conversion program.
  • In Fig. 2, only the parts necessary for explaining the operation of the present preferred embodiment are shown; various other items that are well known as parts of an information conversion apparatus, such as a power supply switch and a power supply circuit, have been omitted.
  • The information conversion apparatus 100 is provided with: a control section 101 that carries out control for generating textures according to the color vision characteristics; a storage section 103 that stores information related to the color vision characteristics and the textures corresponding to them; an operation section 105 from which the operator inputs instructions related to the color vision characteristics information and the texture information; a texture generating section 110 that, according to the image data, the color vision characteristics information, and the texture information, generates various textures whose conditions differ according to the differences in the original colors for the regions on a color confusion line where, although the colors in the chromatic image are different, the results of light reception on the light receiving side are similar and hence difficult to distinguish; and an image processing section 120 that synthesizes the textures generated by the texture generating section 110 with the original image data and outputs the result.
  • Fig. 1 shows the basic processing steps of the present preferred embodiment.
  • First, the color vision characteristics that are to be the target when carrying out information conversion of a color image according to the present preferred embodiment are determined (Step S11 in Fig. 1). However, by making settings as described later, it is also possible to set the color vision characteristics in a fixed manner instead of carrying out this determination step.
  • This color vision characteristics information is either input by the operator using the operation section 105, or supplied from an external apparatus.
  • In the case of a color vision abnormal person, the color vision characteristics information can be the information as to which of the types shown in Fig. 10 the person belongs to, or the information as to which colors the person finds difficult to distinguish.
  • In other words, the color vision characteristics information is information related to the areas for which the colors are different in the chromatic image but the results of light reception at the light receiving side are similar (similar and hence difficult to distinguish).
  • the chromatic image data is input to the information conversion apparatus 100 (Step S12 in Fig. 1 ).
  • Next, the control section 101 determines the type of textures to be added when carrying out information conversion of the color image according to the present preferred embodiment (Step S13 in Fig. 1).
  • This type of texture is determined by the texture information, and this texture information is either input by the operator via the operation section 105, or is supplied as texture information from an external apparatus. Or else, it is also possible for the control section 101 to determine the texture information according to the image data.
  • a texture means the pattern in an image.
  • this also means hatching in the form of grid patterns as is shown in Fig. 3c .
  • Although the expression is in monochrome due to the specifications for patent application drawings, in actual fact this should be taken to represent grid patterns in color.
  • the composition of hatching need not only be 2-valued rectangular waveforms, but can also be smooth waveforms such as a sine wave.
  • The texture generating section 110 generates textures according to the differences in the original colors for the areas that have different colors in the chromatic image but for which the results of light reception on the light receiving side are similar and difficult to distinguish (Step S14 in Fig. 1).
  • The textures can be textures having patterns or hatching with different angles, textures having patterns or hatching with different contrasts, textures that change such as by blinking at different intervals, or textures that move at different periods, at different speeds, or in different directions.
  • If there is an instruction from the operation section 105 or from an external apparatus, textures are selected in accordance with that instruction; otherwise, the textures determined by the control section 101 are selected.
  • the texture generating section 110 under instruction from the control section 101 generates textures of different types, or with different angles, or with different contrasts, or textures that change at different periods.
  • The red before information conversion ( Fig. 4b ) and the green before information conversion ( Fig. 4c ) are in a condition in which it is difficult to distinguish between them when viewed by a person with color vision abnormality.
  • a hatching with an angle of 45 degrees is generated as the texture ( Fig. 4d ).
  • a hatching with an angle of 135 degrees is generated as the texture ( Fig. 4e ).
  • hatching with continuously changing angles according to that position is generated as the texture.
  • The textures have different contrasts in their pattern or hatching according to the differences in the original colors.
  • As for the duty ratio of the pattern or hatching, it is also possible to change the thickness of the hatching lines continuously according to the position on the color confusion line. It is also possible to change the duty ratio according to the brightness of the color to be expressed.
  • This texture can also be a combination of two or more of: patterns or hatching with different angles according to the differences between the original colors; different contrasts according to the differences between the original colors; changing with time or moving at different speeds according to the differences between the original colors; and moving in different directions or at different speeds according to the differences between the original colors.
  • The textures generated in the texture generating section 110 as described above and the original image are synthesized (Step S15 in Fig. 1). At this time, it is desirable that no change occurs in the average color or average density of the image before and after adding the textures. For example, in the state in which the textures have been added, dark colored hatching is added on a base part of a lighter color than the color of the original image. In this manner, it is desirable that observation by a normal person is not affected and the original view is retained, by keeping the average color of the region to which a texture has been added unchanged from, or close to, the original color.
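The requirement that the average color not change when a texture is superimposed can be met by adding a zero-mean stripe pattern: the darker stripes are balanced by equally lighter stripes. A plain-Python sketch, where the stripe amplitude `delta` and `period` are illustrative assumptions:

```python
def add_hatching(region, delta=30, period=8):
    """Superimpose 45-degree two-tone hatching on a region of pixel
    values while keeping the average value unchanged (a sketch; `delta`
    and `period` are illustrative assumptions). `region` is a list of
    rows of grey-level values in 0..255."""
    half = period // 2
    out = []
    for y, row in enumerate(region):
        new_row = []
        for x, v in enumerate(row):
            # Stripes alternate +delta / -delta along the 45-degree
            # diagonal, so over a full period the added pattern
            # averages to zero and the region's mean is preserved.
            d = delta if ((x + y) // half) % 2 == 0 else -delta
            new_row.append(min(255, max(0, v + d)))
        out.append(new_row)
    return out

flat = [[128] * 16 for _ in range(16)]
hatched = add_hatching(flat)
avg = sum(sum(r) for r in hatched) / 256   # mean stays at 128
```

Because the added pattern sums to zero over the region (and no clipping occurs for mid-grey input), a normal observer sees the same average color, while the stripes remain visible up close.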
  • The image after conversion, obtained by adding textures to the original image in this manner in the image processing section, is output to an external apparatus such as a display device or an image forming apparatus (Step S16 in Fig. 1).
  • The information conversion apparatus 100 can exist independently, or can be incorporated inside an existing image processing apparatus, image display apparatus, or image outputting apparatus. Further, when incorporated inside another apparatus, it can be configured to share the image processing section or the control section of that other apparatus.
  • Textures include patterns or hatching with different angles, patterns or hatching with different contrasts, textures that change such as by blinking at different periods, textures that move with different periods, speeds, or directions, and combinations of a plurality of these.
  • The parameters of the type of texture specify what pattern, hatching, angle, or contrast the texture is to have.
  • Parameters such as the period of blinking of the texture constitute the temporal parameters of the texture. These parameters can be determined in the following manner.
  • The temporal parameters (period, speed, etc.) used when changing the texture of the image and/or the parameters of the type of texture are determined so as to correspond to the relative position of the color of the object on the color confusion line.
  • Although the position naturally differs depending on the coordinate system, such as RGB or XYZ, the position can be taken, for example, as the position on the u'v' chromaticity diagram.
  • The relative position is the position expressed as a ratio with respect to the overall length of the line.
  • As a method of actually expressing the position, it is also possible to add further reference points apart from points C and D. For example, the achromatic point, the points of intersection with the black body locus, a point from a simulation of color vision abnormality, and so on can be added as a new reference point E, and the relative position of point B can be taken on the line segment CE or the line segment ED.
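The relative position described above can be computed directly from chromaticity coordinates. A sketch using (u', v') pairs; the coordinate values are illustrative, and projecting B onto the line is an assumption for points that sit slightly off the segment:

```python
def relative_position(B, C, D):
    """Relative position t of chromaticity point B on the confusion-line
    segment CD, as the ratio CB/CD (t = 0 at C, t = 1 at D). Points are
    (u', v') pairs; B is projected onto the line, an assumption covering
    points that sit slightly off the segment."""
    cd = (D[0] - C[0], D[1] - C[1])
    cb = (B[0] - C[0], B[1] - C[1])
    # Project CB onto CD and normalize by |CD|^2, then clamp to [0, 1].
    t = (cb[0] * cd[0] + cb[1] * cd[1]) / (cd[0] ** 2 + cd[1] ** 2)
    return min(1.0, max(0.0, t))

t = relative_position((0.25, 0.5), (0.2, 0.5), (0.3, 0.5))
# t is 0.5: B sits midway between C and D
```

Adding a further reference point E, as the text suggests, amounts to computing the same ratio on segment CE or ED instead of CD.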
  • Changing the temporal parameters (period, speed, etc.) and/or the parameters of the type of texture according to the position means obtaining, using a conversion function or a conversion table, the temporal information (period, speed, etc.) and/or part of the parameters of the type of texture from position information such as the value of Equation (3-1-1). It is also possible to vary two or more parameters, and the discrimination effect can be increased by making the apparent change large.
  • Although the above parameters can be continuous or discontinuous, it is desirable that they are continuous.
  • When the change is continuous, distinguishing close to the original view equivalent to observation by normal persons becomes possible in a condition suitable for observation by color vision abnormal persons; colors can be grasped accurately, and even fine differences in color can be understood.
  • In digital processing, of course, the change will not be completely continuous.
  • Next, the contrast of textures is explained using a concrete example of parameter change.
  • Consider the method of changing the contrast of hatching as a change of the temporal parameters (period, speed, etc.) and/or the parameters of the type of texture of a concrete image.
  • the contrast Cont_b of the color of point B is obtained using Equation (3-5-1).
  • This is a method of interpolating the contrast along the line segment CD, taking the contrast Cont_c of point C and the contrast Cont_d of point D as references, and determining the contrast Cont_b according to the position of point B.
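Since Equation (3-5-1) itself is not reproduced on this page, the interpolation it describes can plausibly be read as a linear blend between the reference contrasts at C and D:

```python
def contrast_at(t, cont_c, cont_d):
    """Contrast Cont_b of point B by linear interpolation along CD,
    with t the relative position of B (t = 0 at C, t = 1 at D). This is
    a plausible reading of Equation (3-5-1), which the text references
    but does not reproduce."""
    return cont_c + (cont_d - cont_c) * t

cont_b = contrast_at(0.25, 0.2, 1.0)   # 0.2 + 0.8 * 0.25 = 0.4
```

A color a quarter of the way from C toward D thus receives a contrast a quarter of the way between the two reference contrasts, keeping the change continuous along the confusion line as the text recommends.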
  • the color intensity is the length from the black point which is the origin to the target color, and is as shown in Fig. 6 .
  • the maximum value of the intensity differs depending on the chromaticity.
  • Based on Equation (3-5-2), the intensity P can also be expressed by Equation (3-5-3).
  • Equation (3-5-2) is an intensity equation in which the maximum intensities of R, G, and B can each be changed by the ratios of the coefficients a, b, and c.
  • Equation (3-5-3) is an equation of intensity in which the intensities have been normalized to be 1.0 at the maximum brightness.
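Equations (3-5-2) and (3-5-3) are likewise not reproduced here. Given the description of intensity as the length from the black point (origin) to the target color, one plausible reconstruction in weighted RGB space is the following; both the Euclidean form and the coefficient names are assumptions:

```python
import math

def intensity(r, g, b, ka=1.0, kb=1.0, kc=1.0):
    """One plausible reconstruction of Equation (3-5-2): intensity as
    the distance from the black point (origin) to the target color in a
    weighted RGB space, with ka, kb, kc setting the relative maximum
    contributions of R, G, B. The equation itself is not reproduced in
    the text, so this form is an assumption."""
    return math.sqrt((ka * r) ** 2 + (kb * g) ** 2 + (kc * b) ** 2)

def intensity_norm(r, g, b, ka=1.0, kb=1.0, kc=1.0):
    """Intensity normalized to 1.0 at maximum brightness (r = g = b = 1),
    in the spirit of Equation (3-5-3)."""
    return intensity(r, g, b, ka, kb, kc) / intensity(1.0, 1.0, 1.0, ka, kb, kc)
```

The normalized form captures the point made in the text that the maximum intensity differs with chromaticity: dividing by the intensity at maximum brightness puts all colors on a common 0-to-1 scale.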
  • The average of all the colors displayed when the temporal parameters (period, speed, etc.) are changed at the time of changing the texture of the image, and/or when the type of texture is changed, is made roughly equal to the color of the image before conversion.
  • Although the method of simply adding up all the colors and dividing by the number of colors is simple, it is desirable to use an average that takes area into account, or an average that takes display duration into account.
  • 'Adding up' refers to light synthesis by additive mixing of colors, whether the present preferred embodiment is applied to light-emitting displays such as display monitors or electrical signboards, or to printed matter such as paper or painted signboards.
  • 'Roughly equal' can mean within a color difference of 12 or less from the reference value, which is treated as the same color in the JIS color system (JIS Z 8729 (1980)), or within a color difference of 20 or less from the reference value, which is the management level for color names given on page 290 of the New Color Science Handbook, 2nd edition.
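The two 'roughly equal' tolerances can be checked with the ordinary Euclidean (CIE76) color difference in L*a*b*, the color system specified via JIS Z 8729; the sample colors below are arbitrary:

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference between two L*a*b* colors (the color
    system specified in JIS Z 8729)."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

def roughly_equal(lab1, lab2, tol=12.0):
    """'Roughly equal' in the sense above: within a color difference of
    12 (same-color criterion), or of 20 with tol=20.0 (the color-name
    management level)."""
    return delta_e76(lab1, lab2) <= tol

ok = roughly_equal((50, 10, 10), (52, 14, 4))   # difference is about 7.5
```

Such a check could be applied to the area- or duration-weighted average of the textured region against the original color before conversion.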
  • The chromaticity of all the colors displayed when the temporal parameters (period, speed, etc.) are changed at the time of changing the texture of the image, or when the type of texture is changed, is made roughly equal to the chromaticity of the object before conversion.
  • Although it is possible to change only the chromaticity of the texture pattern, in this case it becomes difficult to recognize the hatching as hatching because of the color vision characteristics of humans: changes in darkness and brightness are more easily recognized than changes in chromaticity.
  • By unifying the chromaticity, it becomes possible to observe that the parts constitute the same object, and there is less feeling of discomfort. The chromaticity that leads to the judgment of color names can be conveyed without mistakes.
  • the spatial frequency of the pattern of the texture used is changed according to the shape and size of the image.
  • the frequency is set according to the size of the image to which the texture is applied and according to the size of the text characters contained in the image.
  • When the spatial frequency of the pattern is low and its periodicity cannot be recognized within the image, the viewer cannot recognize the pattern as a pattern, and may instead recognize it as a separate image.
  • When the spatial frequency of the pattern as seen by the observer is high, it may not be possible to recognize the presence or absence of the pattern. In particular, as the distance from the observer to the display increases, the frequency seen by the observer becomes higher, and it becomes difficult to recognize the presence or absence of the pattern.
  • Therefore, the lower limit of the frequency is set according to the overall size of the object,
  • and the upper limit of the frequency is set according to the overall text character size; any frequency within those lower and upper limits is used.
  • When the frequency is higher than the lower limit, the periodicity of the pattern within the object can be recognized, and since it becomes clear that the pattern is really a pattern, the pattern is not mistakenly recognized as a separate object.
  • The frequency can be raised up to a high frequency of the same level as the text character size.
  • the object characteristics detection section 107 extracts the spatial frequency contained in the image, the text character size, the size of figure objects, etc., as the object characteristics information, and conveys them to the control section 101.
  • the control section 101 determines the spatial frequency of the texture according to the object characteristics.
  • The spatial frequency already present in the object is avoided, and the texture frequency is made higher or lower than that frequency. This is done in order to avoid confusion between the object and the hatching, and to make the presence or absence of hatching recognizable.
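The frequency bounds described in the preceding bullets can be sketched as follows. The specific constants (at least four periods across the object, the 0.01 avoidance margin) are illustrative assumptions, not values from the patent:

```python
def texture_frequency(object_size_px, char_size_px, object_freq=None):
    """Pick a spatial frequency (cycles/pixel) for a texture pattern.
    Lower limit: at least four periods must fit across the object so the
    periodicity is visible; upper limit: no finer than the text character
    scale. The constants and the avoidance margin are illustrative
    assumptions."""
    lo = 4.0 / object_size_px
    hi = 1.0 / char_size_px
    f = (lo + hi) / 2.0               # start from the middle of the band
    # Avoid a dominant frequency already present in the object, so the
    # hatching is not confused with the object's own pattern.
    if object_freq is not None and abs(f - object_freq) < 0.01:
        f = object_freq + 0.02
    return max(lo, min(hi, f))

f = texture_frequency(200, 10)        # band here is [0.02, 0.1]
```

In the apparatus, the object characteristics detection section 107 would supply `object_size_px`, the character size, and any dominant object frequency, and the control section 101 would make this choice.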
  • Contour lines are provided at the locations where hatching is used. By doing this, confusion between hatching and object is avoided. This can be used not only for hatching but also for other textures.
  • contour lines are provided to the image for which hatching is used as the texture. It is desirable that the contour line is of the average color of the texture.
  • The shape of the image becomes clear due to the contour line; moreover, by making the contour line the average color of the texture, the two colors of the hatching's slanted lines and the contour line differ, so it becomes difficult to confuse the image to which hatching is added with its neighboring images.
  • One of the parameters is taken to be the angle used when segmenting the region. This makes distinguishing easy; in addition, since the observer has an absolute reference for angles, the chromaticity can be judged more accurately. If the correspondence between angle and chromaticity is determined in advance, the legend is easy to memorize.
  • the angle of region segmentation is used as a parameter.
  • the parameter of the angle can be viewed easily as a change in the shape, and can be judged absolutely.
  • the angle Ang of point B under the conditions shown in Fig. 5 is determined by the following Equation (3-12-1). If the point B is taken as the center of the line CD, the angle Ang of the points BCD can be like any one of Figs. 9a, 9b, and 9c .
Ang = 90 × (BD / CD) + 45    (3-12-1)
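Equation (3-12-1) maps the position of point B on segment CD to a hatching angle between 45 and 135 degrees:

```python
def hatch_angle(bd, cd):
    """Hatching angle from Equation (3-12-1): Ang = 90 * (BD / CD) + 45.
    BD is the distance from B to D and CD the full segment length, so
    the angle runs from 45 degrees (B coinciding with D, BD = 0) to
    135 degrees (B coinciding with C, BD = CD)."""
    return 90.0 * (bd / cd) + 45.0

mid = hatch_angle(0.5, 1.0)   # 90.0 when B is at the midpoint of CD
```

This matches the text: a point at the center of CD yields a 90-degree angle, corresponding to the configurations shown in Figs. 9a, 9b, and 9c.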
  • the color confusion line was taken as a concrete example of the region in which the results of light reception on the light receiving side were similar and could not be discriminated, it is not necessary to restrict to this. For example, it is possible to apply this similarly even when it is not the shape of a line but is a band or a region having a specific area in the chromaticity diagram.
  • textures including patterns or hatching with different angles, textures having patterns or hatching with different contrasts, textures that change with time such as blinking at different periods, textures that move with different periods or speeds or in different directions, textures that move with different speeds or in different directions, distinguishing close to the original view equivalent to the observation by normal persons becomes possible in a condition suitable for observation by a color vision abnormality person.
  • this type of effect can also be used when a normal person or a camera observes or photographs images under a light source having a special spectral distribution.
  • under a light source having two types of single color lights, it is only possible to see colors that lie on the line connecting those chromaticity points in the chromaticity diagram. For other directions, by adding the textures indicated in the present invention, it is possible to distinguish the colors.
  • as textures, it is not only possible to use patterns, hatching, or the contrast, angle, blinking, etc., of the patterns or hatching, but also, in the case of printed matter, etc., it is possible to include touch feeling realized by projections and depressions. Because of this, according to the differences in the original colors, distinguishing close to the original view, equivalent to observation by normal persons, becomes possible in a condition suitable for observation by color vision abnormality persons. In the case of a display device, this can be realized by forming or changing the projections and depressions by the extent of projection of a plurality of pins, and in the case of printed matter, it is possible to realize smoothness or roughness using paints.
  • the above explanations were of concrete examples of making distinguishing easy by adding textures to color regions that are difficult to distinguish in a chromatic image.
  • the above preferred embodiment can also be applied to colors that are difficult to distinguish among achromatic colors (gray scale), or to dark and light colors in a single-color monochromatic image, and a similarly good effect of making distinguishing easy can be obtained.
  • Fig. 12 is a flow chart showing the operations (the procedure of execution of the information conversion method) of an information conversion apparatus 100 according to a second preferred embodiment of the present invention.
  • Fig. 13 is a block diagram showing the detailed configuration inside an information conversion apparatus 100 according to a second preferred embodiment of the present invention.
  • the image is divided into prescribed areas, and the hatching angle is determined for each typical value of the pixel values (colors) of those areas. Since each such area has a certain extent, this has the feature that visual recognition of the hatching angle inside that area is improved.
  • while the following second preferred embodiment uses hatching as a concrete example of a texture, and concrete examples are described in which the hatching angle is determined for each of the prescribed areas, it is also possible to apply this to the first preferred embodiment described above. Therefore, duplicate explanations are omitted for the parts that are common to the first preferred embodiment described above, and explanations are given mainly for the parts that are different from the first preferred embodiment.
  • the information conversion apparatus 100 is configured to have a control section 101 that executes the control for generating textures according to the color vision characteristics, a storage section 103 that stores the information, etc., related to the color vision characteristics and the textures corresponding to them, an operation section 105 from which the operator inputs instructions related to the color vision characteristics information and the texture information, a texture generating section 110' that generates, according to the image data, the color vision characteristics information, and the texture information, various textures with different conditions according to the differences in the original colors for the regions on the color confusion line where, although the colors are different in the chromatic image, the results of light reception on the light receiving side are similar and hence the colors are difficult to distinguish, and an image processing section 120 that synthesizes and outputs the textures generated by the texture generating section 110' and the original image data.
  • the texture generating section 110' is configured to be provided with an N-line buffer 111, a color position/hatching amount generation section 112, an angle calculation section 113, and an angle data storage section 114.
  • the N-line buffer is prepared (Step S1201 in Fig. 12 ), and every N lines of the RGB image data from an external apparatus are stored in turn in that N-line buffer (Step S1202 in Fig. 12 ).
  • the image data is segmented into areas configured from a plurality of pixels set in advance.
  • the area is segmented, and in the angle calculation section 113, areas of N pixels x N pixels are cut out (Step S1203 in Fig. 12 ), and the typical value is calculated for each of those areas.
  • this area of N x N pixels can also be resolved further in terms of the color distribution.
  • the resolving is done in terms of a plurality of areas (segments), and the typical value for each of those segments is obtained. Because of this, in case the boundary of the image (the border part of a color change) lies within a predetermined area, it is possible to obtain a clean hatching without any artifacts.
  • a general method of segmentation is used for resolving the areas.
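As a hedged sketch of the area segmentation and typical-value step described above, the following computes one typical value per N x N area; the function name and the choice of the arithmetic mean as the concrete form of the typical value are assumptions for illustration:

```python
import numpy as np

def block_typical_values(rgb, n=16):
    """Split an H x W x 3 image into n x n areas and return the
    average (typical) colour of each area.

    One hatching angle is later determined per area from this value;
    n = 16 follows the sixteen-pixel example of Figs. 23 and 24.
    """
    h, w, c = rgb.shape
    out = np.empty((h // n, w // n, c))
    for by in range(h // n):
        for bx in range(w // n):
            # Average all pixels of the n x n area into one colour.
            area = rgb[by * n:(by + 1) * n, bx * n:(bx + 1) * n]
            out[by, bx] = area.reshape(-1, c).mean(axis=0)
    return out
```

Using an area average instead of per-pixel values is what suppresses the moiré and angle-jump artifacts shown in Figs. 21b and 22b.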
  • a line that is almost perpendicular to the color confusion line and that passes through the end of the color region is drawn as an auxiliary line (it can be a straight line, a piecewise linear line, or a curved line).
  • the angle and contrast are made maximum on the auxiliary line B that passes through red and blue, and the angle and contrast are made minimum on the auxiliary line A that passes through green.
  • the hatching parameter, the angle, is determined based on the above auxiliary line A and auxiliary line B.
  • the hatching angle is made equal to 45 degrees on the auxiliary line B passing through red and blue
  • the hatching angle is made equal to 135 degrees on the auxiliary line A passing through green.
  • the triangle shown in the figure is the sRGB area, and the auxiliary line for green passes approximately through the fundamental color green of AdobeRGB (a trademark or a registered trademark of Adobe Systems Inc. in USA and in other countries, same hereinafter).
  • the color position/hatching amount generation section 112 determines the intensity of contrast.
  • the contrast intensity is determined as shown in Fig. 15 ( Step S1212 in Fig. 12 ). Further, here, the calculation is made not for the above-described N x N pixels but for each pixel.
  • while the relationship is made proportional to the angle, at the color region boundary where there is no margin in the intensity direction, either the contrast intensity is made weak or the brightness of the original color is adjusted.
  • the intensity can be adjusted so that the target color is within the color region, and also, the contrast can be made weak.
  • R'G'B' is made equal to RGB/α, and Cont is made equal to Cont/α.
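The scaling R'G'B' = RGB/α, Cont = Cont/α can be sketched as follows; the concrete definition of α used here (the smallest factor that brings the brightest hatched value back into the displayable range) is an assumption for illustration, as is the function name:

```python
def fit_hatching_in_gamut(rgb, cont):
    """Scale the base colour and the contrast amplitude together so
    that rgb + cont stays within the displayable range [0, 1].

    Implements R'G'B' = RGB / alpha and Cont' = Cont / alpha, where
    alpha >= 1 is chosen just large enough to fit (assumed definition).
    """
    alpha = max(1.0, max(c + cont for c in rgb))
    scaled = tuple(c / alpha for c in rgb)
    return scaled, cont / alpha
```

Dividing both quantities by the same α keeps the ratio of hatching contrast to base brightness unchanged, so the perceived texture strength is preserved while the result fits in the color region.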
  • the elements constituting the hatching image are prepared in advance as one line. Information at the sub-pixel level is also recorded in this hatching element. This is called the hatching element data.
  • the data of an appropriate location is called from the hatching element data.
  • hatching is generated by carrying out prescribed sampling from a sine curve. This sampling depends on the X coordinate, the Y coordinate, and the angle. It is good to use the following equations, which are also shown in Fig. 16 , for the calculation.
  • the part of the trigonometric functions can be calculated in advance and can be put in the form of a table, thereby making it possible to carry out the calculations at a high speed.
  • X axis COS (Angle x ⁇ /180) x
  • N_SUB Y axis TAN (Angle x ⁇ /180) x COS (Angle x ⁇ /180) x
  • X axis TAN (Angle x ⁇ /180) x COS (Angle x ⁇ /180) x N_SUB Y axis: COS (Angle x ⁇ /180) x N_SUB 1 Period of hatching: CYCLE (to 8) Number of sub-pixels: N_SUB (to 64) Angle: Angle (45 to 135)
  • the hatching information read out as above is superimposed on the pixel value according to the contrast intensity, thereby obtaining the new image data (Step S1207 in Fig. 12 ).
  • red and blue hatching of 45 degrees
  • gray (achromatic) hatching of 90 degrees
  • green hatching of 135 degrees
  • the angle of coverage of the range of the color region from the convergence point of the color confusion line has been set so as to avoid the respective angles of the color confusion lines of the first color vision abnormality persons, the second color vision abnormality persons, and the third color vision abnormality persons.
  • the change in the hatching angle is made observable. Because of this, distinguishing becomes possible for all of the color vision abnormality persons.
  • since gray has been set as the middle point, it is convenient to assume the green of AdobeRGB for green. Because of this, at the same time, it also becomes possible to handle the colors of a broader color region.
  • when similar colors within the same area are present at the top, bottom, left, and right, and the number of their connections is more than the number of pixels constituting the area, they are treated as a segment, and an average color computed from all the pixels constituting it is assigned.
  • the pixels that do not satisfy this are handled as exceptions; all the points of exception within a square block are collected together, and a comprehensive average color is assigned.
  • if the pattern is a checkered pattern due to dither, etc., or a simple vertical or horizontal pattern, since it appears visually as an average color, it is not treated as a segment.
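The segment/exception rule described in the points above can be sketched as follows, assuming grayscale blocks, a similarity tolerance `tol`, and a minimum segment size `min_size`; all three parameter choices, and the use of flood fill to find connected similar pixels, are illustrative, not taken from the patent:

```python
from collections import deque
import numpy as np

def segment_average_colors(block, tol=10, min_size=None):
    """Assign segment averages within one square block.

    4-connected pixels whose values are within `tol` of the seed pixel
    form a segment; segments with at least `min_size` pixels receive
    their own average colour, and all remaining (exception) pixels
    share one comprehensive average, as described above.
    """
    h, w = block.shape
    if min_size is None:
        min_size = (h * w) // 4  # illustrative threshold
    labels = -np.ones((h, w), dtype=int)
    out = block.astype(float)
    label = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy, sx] != -1:
                continue
            # Flood fill over 4-neighbours with similar value.
            comp = [(sy, sx)]
            labels[sy, sx] = label
            q = deque(comp)
            while q:
                y, x = q.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w and labels[ny, nx] == -1
                            and abs(float(block[ny, nx]) - float(block[sy, sx])) <= tol):
                        labels[ny, nx] = label
                        comp.append((ny, nx))
                        q.append((ny, nx))
            label += 1
            ys = [p[0] for p in comp]
            xs = [p[1] for p in comp]
            if len(comp) >= min_size:
                out[ys, xs] = block[ys, xs].mean()  # segment average
            else:
                out[ys, xs] = np.nan  # mark as exception for pooling
    exc = np.isnan(out)
    if exc.any():
        out[exc] = block[exc].mean()  # comprehensive exception average
    return out
```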
  • Fig. 20a from left to right, shows 19 color samples that change gradually from green to red.
  • Fig. 20b from left to right, shows 19 color samples that change gradually from green to red with hatching added according to the present preferred embodiment.
  • Fig. 21a is an image in which the color is changing gradually (chromatic) so that top left is magenta and bottom right is green, and also, gray (density of achromatic color) is changing gradually so that top right is black and bottom left is white.
  • Fig. 21b is the image of Fig. 21a to which hatching is added by calculating the angle in units of 1 pixel, and it shows the generation of a moire pattern phenomenon; it can be seen that there are hatching angles other than the expected ones at gray (should have been a hatching angle of 90 degrees) and green (should have been a hatching angle of 120 degrees). Further, within the green region, there are areas with a sudden, unintended change in the hatching angle.
  • Fig. 22a is an image in which the color (chromatic) is changing gradually so that top left is red and bottom right is cyan, and gray (density of achromatic color) is changing gradually so that top right is black and bottom left is white.
  • Fig. 22b is the image of Fig. 22a to which hatching is added by calculating the angle in units of 1 pixel, and it shows the generation of a moire pattern phenomenon; it can be seen that there is a hatching angle other than the intended one at red (should have been a hatching angle of about 45 degrees to 60 degrees) (at the position indicated by the arrow in the figure).
  • Fig. 23a is similar to Fig. 21a , and is an image in which the color is changing gradually (chromatic) so that top left is magenta and bottom right is green, and also, gray (density of achromatic color) is changing gradually so that top right is black and bottom left is white.
  • Fig. 23b is the image of Fig. 23a to which hatching is added with the angle calculated for areas in units of sixteen pixels; the hatching angle is 90 degrees for gray, about 120 degrees for green, and about 60 degrees for magenta, so the image can be viewed with the desired hatching angles. Further, there is no sudden change in the hatching angle.
  • Fig. 24a is similar to Fig. 22a , and is an image in which the color (chromatic) is changing gradually so that top left is red and bottom right is cyan, and gray (density of achromatic color) is changing gradually so that top right is black and bottom left is white.
  • Fig. 24b is the image of Fig. 24a to which hatching is added with the angle calculated for areas in units of sixteen pixels; the hatching angle is 90 degrees for gray, about 45 degrees for red, and about 120 degrees for cyan, so the image can be viewed with the desired hatching angles. Further, there is no sudden change in the hatching angle.
  • textures such as hatching were added to color images, thereby making it possible for both normal color vision persons and persons with color vision abnormality to recognize the differences in color.
  • the feature is that the above first preferred embodiment and the second preferred embodiment are applied.
  • the final monochrome image is formed by adding hatchings of different angles according to the differences in the colors. Because of this, the problem that colors cannot be distinguished during monochrome printing is solved. In this case, this can be realized by incorporating a circuit or program implementing the above preferred embodiments in a computer, printer, or copying machine.
  • in addition to a (main) hatching, an auxiliary hatching is added by calculating the hatching angle in a direction that is roughly at right angles to the main hatching (see Fig. 26 ).
  • the hatching is formed on the image by superimposing this auxiliary hatching along with the main hatching. Because of this, it will be possible to distinguish between different colors even with monochrome printing or even for fully color blind persons.
  • the frequency and angle are made different in the auxiliary hatching compared to those in the main hatching.
  • Main hatching 45 to 135 degrees
  • Auxiliary hatching -45 to 45 degrees (or -30 to 30 degrees, in order to avoid overlapping).
  • the frequency is made higher than in the main hatching, thereby making it thinner.
  • a frequency of twice the frequency in the main hatching is good. Because of this, it is possible to distinguish between the types of hatching.
  • the main hatching is made vertical while the auxiliary hatching is made horizontal in order to make the discrimination of colors easier.
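A minimal sketch of deriving the auxiliary hatching from the main hatching under the ranges given above (perpendicular angle, doubled frequency); the function name and the dictionary return format are assumptions for illustration:

```python
def hatching_parameters(main_angle):
    """Derive auxiliary-hatching parameters from a main hatching angle.

    Main hatching angles lie in 45..135 degrees; the auxiliary hatching
    is perpendicular (main - 90, giving -45..45 degrees) and uses twice
    the frequency so that it appears thinner, as described above.
    """
    if not 45.0 <= main_angle <= 135.0:
        raise ValueError("main hatching angle should lie in 45..135 degrees")
    return {
        "main": {"angle": main_angle, "frequency": 1.0},
        "aux": {"angle": main_angle - 90.0, "frequency": 2.0},
    }
```

For gray (main angle 90 degrees) this gives a vertical main hatching and a horizontal auxiliary hatching, matching the bottom-right case visible in Fig. 27.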
  • regarding the hatching intensity, it is good to take it as zero near gray, and to increase the hatching intensity according to the distance from gray, for example, in the u'v' chromaticity diagram.
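The u'v'-based intensity rule can be sketched as below; the CIE 1976 u'v' conversion is standard, while the linear ramp, the `gain` value, and the use of the D65 white point as "gray" are assumptions for illustration:

```python
import math

def uv_prime(X, Y, Z):
    """CIE 1976 u'v' chromaticity coordinates from XYZ tristimulus values."""
    d = X + 15.0 * Y + 3.0 * Z
    return 4.0 * X / d, 9.0 * Y / d

# u'v' of the D65 white point, taken here as the "gray" reference (assumption).
D65_UV = (0.1978, 0.4683)

def hatching_intensity(u, v, gain=4.0, max_intensity=1.0):
    """Zero intensity near gray, growing linearly with u'v' distance
    from gray and clipped at max_intensity; ramp shape is illustrative."""
    dist = math.hypot(u - D65_UV[0], v - D65_UV[1])
    return min(max_intensity, gain * dist)
```

Because u'v' is an approximately perceptually uniform chromaticity scale, equal distances from gray correspond to roughly equal perceived color differences, which makes it a natural space for this ramp.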
  • Fig. 27 is an example showing the condition in which these types of main hatching and auxiliary hatching have been used together, and it can be seen that the bottom right is horizontal/vertical, indicating the case of gray.

EP08792201A 2007-08-07 2008-08-05 Informationsumwandlungsverfahren, informationsumwandlungsvorrichtung und informationsumwandlungsprogramm Withdrawn EP2175414A4 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2007205955 2007-08-07
JP2008008401 2008-01-17
PCT/JP2008/064015 WO2009020115A1 (ja) 2007-08-07 2008-08-05 情報変換方法、情報変換装置、および、情報変換プログラム

Publications (2)

Publication Number Publication Date
EP2175414A1 true EP2175414A1 (de) 2010-04-14
EP2175414A4 EP2175414A4 (de) 2011-05-18

Family

ID=40341347

Family Applications (1)

Application Number Title Priority Date Filing Date
EP08792201A Withdrawn EP2175414A4 (de) 2007-08-07 2008-08-05 Informationsumwandlungsverfahren, informationsumwandlungsvorrichtung und informationsumwandlungsprogramm

Country Status (4)

Country Link
US (1) US8422071B2 (de)
EP (1) EP2175414A4 (de)
JP (1) JP4760979B2 (de)
WO (1) WO2009020115A1 (de)


Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5495338B2 (ja) * 2009-09-30 2014-05-21 Necディスプレイソリューションズ株式会社 画像信号処理装置及び画像信号処理方法
JP2011086090A (ja) * 2009-10-15 2011-04-28 Konica Minolta Holdings Inc 画像変換方法、画像変換装置、画像変換システム及び画像変換プログラム
JP5232756B2 (ja) * 2009-10-28 2013-07-10 京セラドキュメントソリューションズ株式会社 画像処理装置および画像形成装置
JP2011250167A (ja) * 2010-05-27 2011-12-08 Sony Corp 色変換装置、色変換方法及びプログラム
JP5676971B2 (ja) * 2010-08-18 2015-02-25 キヤノン株式会社 情報処理装置、表示制御方法、プログラム
JP5707099B2 (ja) * 2010-11-05 2015-04-22 浅田 一憲 色覚補助装置、色覚補助方法及びプログラム
EP2525326B1 (de) 2011-05-20 2014-05-14 Sony Corporation Bildverarbeitungsverfahren und Bildverarbeitungsvorrichtung
JP5790298B2 (ja) * 2011-08-17 2015-10-07 セイコーエプソン株式会社 画像処理装置、画像処理プログラムおよび画像処理方法
JP5838669B2 (ja) 2011-09-05 2016-01-06 株式会社リコー 画像処理装置、画像処理方法、プログラムおよび記録媒体
JP2013088726A (ja) * 2011-10-20 2013-05-13 Sharp Corp 表示システムおよび表示プログラム
JP6061382B2 (ja) * 2013-01-16 2017-01-18 東洋インキScホールディングス株式会社 画像処理装置、画像処理方法、画像処理プログラム
US9153055B2 (en) * 2013-06-25 2015-10-06 Xerox Corporation Color content in document rendering for colorblind users
JP6060062B2 (ja) * 2013-07-30 2017-01-11 京セラドキュメントソリューションズ株式会社 画像処理装置及びプログラム
US20150310767A1 (en) * 2014-04-24 2015-10-29 Omnivision Technologies, Inc. Wireless Typoscope
US20150332482A1 (en) * 2014-05-16 2015-11-19 Microsoft Corporation Detecting conformance of graphical output data from an application to a convention
JP2016165026A (ja) * 2015-03-06 2016-09-08 株式会社沖データ 画像処理装置
CN104881633B (zh) * 2015-04-28 2017-11-14 广东欧珀移动通信有限公司 一种色盲模式启动方法及智能眼镜
CN117475965B (zh) * 2023-12-28 2024-03-15 广东志慧芯屏科技有限公司 一种低功耗反射屏色彩增强方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030095705A1 (en) * 2001-11-21 2003-05-22 Weast John C. Method and apparatus for modifying graphics content prior to display for color blind use
US20040085327A1 (en) * 2002-11-01 2004-05-06 Tenebraex Corporation Technique for enabling color blind persons to distinguish between various colors
US20040223641A1 (en) * 2003-02-14 2004-11-11 Fuji Xerox Co., Ltd Document processing apparatus
JP2005051405A (ja) * 2003-07-31 2005-02-24 Kyocera Mita Corp 画像処理装置

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001257867A (ja) 2000-03-13 2001-09-21 Minolta Co Ltd 画像処理装置、印刷装置、画像処理方法および記録媒体
JP2001293926A (ja) 2000-04-17 2001-10-23 Seiko Epson Corp プリンタ、プリンタホスト、これらを備えたプリンタシステム、及びプリンタホストの動作プログラムが記憶されている記憶媒体
JP2003223635A (ja) 2002-01-29 2003-08-08 Nippon Hoso Kyokai <Nhk> 映像表示装置および撮影装置
US7605930B2 (en) * 2002-08-09 2009-10-20 Brother Kogyo Kabushiki Kaisha Image processing device
JP4228670B2 (ja) * 2002-11-29 2009-02-25 富士ゼロックス株式会社 画像処理装置および画像処理方法ならびに画像処理プログラム
EP1453006A1 (de) 2003-02-28 2004-09-01 Océ-Technologies B.V. Transformiertes digitales Farbbild mit verbesserter Farbunterscheidbarkeit für Farbblinde
JP4097628B2 (ja) * 2004-06-01 2008-06-11 裕 大隅 色覚異常者への色存在の教示方法および教示プログラム
JP2005182432A (ja) 2003-12-19 2005-07-07 Toyo Ink Mfg Co Ltd 色変換装置
ATE430544T1 (de) 2003-12-03 2009-05-15 Tenebraex Corp System und verfahren zur identifizierung mindestens einer farbe für einen anwender
JP2005190009A (ja) * 2003-12-24 2005-07-14 Fuji Xerox Co Ltd 色覚支援装置、色覚支援方法、及び色覚支援プログラム
JP2006154982A (ja) * 2004-11-26 2006-06-15 Fuji Xerox Co Ltd 画像処理装置、画像処理方法、及びプログラム


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2009020115A1 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2528317A3 (de) * 2011-05-23 2014-07-30 Seiko Epson Corporation Bildverarbeitungsvorrichtung, Bildverarbeitungsverfahren und Druckmaterial
US9100623B2 (en) 2011-05-23 2015-08-04 Seiko Epson Corporation Image processing device and method for adding textures to background and to an object
CN103259961A (zh) * 2012-02-21 2013-08-21 精工爱普生株式会社 图像处理装置、图像处理方法以及程序
EP2632144A1 (de) * 2012-02-21 2013-08-28 Seiko Epson Corporation Bildverarbeitungsvorrichtung, Bildverarbeitungsverfahren und Programm
US8908963B2 (en) 2012-02-21 2014-12-09 Seiko Epson Corporation Image processing apparatus, image processing method, and program
CN103259961B (zh) * 2012-02-21 2015-11-11 精工爱普生株式会社 图像处理装置及图像处理方法
EP2993664A3 (de) * 2014-09-05 2016-03-30 Samsung Display Co., Ltd. Anzeigevorrichtung, anzeigesteuerungsverfahren und anzeigeverfahren
US10078988B2 (en) 2014-09-05 2018-09-18 Samsung Display Co., Ltd. Display apparatus, display control method, and display method

Also Published As

Publication number Publication date
WO2009020115A1 (ja) 2009-02-12
EP2175414A4 (de) 2011-05-18
US8422071B2 (en) 2013-04-16
JPWO2009020115A1 (ja) 2010-11-04
JP4760979B2 (ja) 2011-08-31
US20100134810A1 (en) 2010-06-03

Similar Documents

Publication Publication Date Title
US8422071B2 (en) Information conversion method, information conversion apparatus, and information conversion program
US20110090237A1 (en) Information conversion method, information conversion apparatus, and information conversion program
US7916152B2 (en) Technique for enabling color blind persons to distinguish between various colors
US20140153825A1 (en) Technique for enabling color blind persons to distinguish between various colors
EP1869875B1 (de) Farbumwandlungseinheit zur verringerung des farbsaums
JP2008086011A (ja) 色欠損画像をエンハンスするためのシステムおよび方法
JP5273389B2 (ja) 画像処理装置、画像処理方法、プログラムおよび記録媒体
US11094093B2 (en) Color processing program, color processing method, color sense inspection system, output system, color vision correction image processing system, and color vision simulation image processing system
JP4429558B2 (ja) 画像表示方法及び装置
EP2066110B1 (de) Abbildungsverfahren zusammen mit verankerten Abbildungsbahnen mit verbesserter Einheitlichkeit
WO2016098301A1 (en) Image processing apparatus and image processing method
US20190246895A1 (en) System and method for device assisted viewing for colorblindness
EP2005412A2 (de) Verfahren für blinde personen zur unterscheidung zwischen verschiedenen farben
US6721069B1 (en) Mapping from full color to highlight color and texture
US6760123B1 (en) Method and system for mapping color to texture in a copier
Meguro et al. Simple color conversion method to perceptible images for color vision deficiencies
JP6977416B2 (ja) 画像処理装置、情報処理装置およびプログラム
JP5177222B2 (ja) 文書ファイル取扱方法、文書ファイル取扱装置、および、文書ファイル取扱プログラム
EP3496381B1 (de) Drucksache, drucksachenherstellungsverfahren, bilderzeugungsvorrichtung und trägermittel
WO2009133946A1 (ja) 画像処理方法、画像処理装置、および、画像処理プログラム
Kanazawa et al. Color conversion for multi-primary displays using a spherical average method
Koshikizawa et al. Color Appearance Control for Color Vision Deficiency by Projector-Camera System
JP2009296431A (ja) 画像処理装置、画像処理方法、プログラムおよび記録媒体
Connah et al. An investigation into perceptual hue-ordering

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20100128

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA MK RS

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20110415

17Q First examination report despatched

Effective date: 20120319

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20160125