CN117203602A - Color vision assisting system - Google Patents


Info

Publication number
CN117203602A
CN117203602A (application CN202280029851.1A)
Authority
CN
China
Prior art keywords
color
information
user
color vision
presentation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280029851.1A
Other languages
Chinese (zh)
Inventor
上原皓
井元麻纪
铃木龙一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Group Corp
Original Assignee
Sony Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corp filed Critical Sony Group Corp
Publication of CN117203602A


Classifications

    • G06F3/04845: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • A61B3/0041: Apparatus for testing the eyes; operational features thereof characterised by display arrangements
    • A61B3/0325: Devices for presenting test symbols or characters, provided with red and green targets
    • G06F1/163: Wearable computers, e.g. on a belt
    • G06F3/014: Hand-worn input/output arrangements, e.g. data gloves
    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G06Q30/00: Commerce
    • G06T7/90: Image analysis; determination of colour characteristics

Abstract

An information processing apparatus according to the present technology includes a presentation processing unit configured to present color vision assistance information generated based on a captured image of an object and color vision characteristic information of a user to the user.

Description

Color vision assisting system
Technical Field
The present technology relates to an information processing apparatus, an information processing method, and a color vision assisting system, and in particular, to a technique for assisting a user in color vision based on a color vision characteristic of the user.
Background
It is known that color vision, i.e., the perception of red, green, and blue, varies from person to person (color vision diversity).
The following patent document 1 discloses a technique for improving visibility for a user having a color vision deficiency by enlarging the displayed content.
Further, the following patent document 2 discloses a technique for determining the type and degree of color vision deficiency and a corrective spectral characteristic curve using a color vision tester, and designing color-correcting glasses based on this information.
Further, the following patent document 3 discloses a technique for correcting the colors of an image based on color vision characteristic information.
CITATION LIST
Patent document
Patent document 1: Japanese Patent Application Laid-Open No. 2009-86212
Patent document 2: Japanese Patent No. 5110760
Patent document 3: Japanese Patent Application Laid-Open No. 2009-71541
Disclosure of Invention
Problems to be solved by the invention
The aim of the present technology is to enable a user to accurately identify the color of an object that the user actually sees.
Solution to the problem
An information processing apparatus according to the present technology includes a presentation processing unit configured to present color vision assistance information generated based on a captured image of an object and color vision characteristic information of a user to the user.
As described above, color vision assistance information generated based on the captured image of the object and the color vision characteristic information of the user is presented to the user, and thus the user can be assisted in easily recognizing the color of the object actually seen by the user.
An information processing method according to the present technology is an information processing method in which an information processing apparatus presents color vision assistance information generated based on a captured image of an object and color vision characteristic information of a user to the user.
With such an information processing method, an information processing apparatus according to the present technology described above can be realized.
Furthermore, the color vision assisting system according to the present technology includes: an imaging unit configured to capture an image of an object; an auxiliary information generating unit configured to generate color vision auxiliary information based on the captured image of the subject obtained by the imaging unit and color vision characteristic information of the user; and a presentation processing unit configured to present the color vision auxiliary information generated by the auxiliary information generating unit to a user.
As described above, color vision assistance information generated based on the captured image of the object and the color vision characteristic information of the user is presented to the user, and thus the user can be assisted in easily recognizing the color of the object actually seen by the user.
Drawings
Fig. 1 is a diagram showing an example of an external configuration of an information processing apparatus 1 as an embodiment according to the present technology.
Fig. 2 is a diagram showing an example of the electrical internal configuration of the information processing apparatus 1.
Fig. 3 is a functional block diagram showing functions of the information processing apparatus as an embodiment.
Fig. 4 is a diagram for describing a specific example of the calibration of the color vision characteristics.
Fig. 5 is an explanatory diagram of an example of the first type of behavior.
Fig. 6 is an explanatory diagram of an example of the second type of behavior.
Fig. 7 is a diagram showing an example of presenting color vision assistance information.
Fig. 8 is an explanatory diagram showing an example of color vision auxiliary information of the second element level.
Fig. 9 is an explanatory diagram of an enlarged image of an object.
Fig. 10 is a diagram for describing another example of color vision assistance information at the second element level.
Fig. 11 is a flowchart showing a process related to calibration for obtaining color vision characteristic information of a user.
Fig. 12 is a flowchart showing a process for detecting a user's behavior and presenting color vision assistance information in response to the behavior detection result.
Fig. 13 is an explanatory diagram of a first modification related to the device configuration.
Fig. 14 is an explanatory diagram of a second modification related to the device configuration.
Fig. 15 is an explanatory diagram of a third modification related to the device configuration.
Fig. 16 is an explanatory diagram of a fourth modification related to the device configuration.
Fig. 17 is an explanatory diagram of a fifth modification related to the device configuration.
Detailed Description
Hereinafter, embodiments will be described in the following order.
<1. Configuration of information processing apparatus as an embodiment>
<2. Color vision assisting method as an embodiment>
<3. Procedure>
<4. Modifications related to device configuration>
<5. Modifications related to presented information>
<6. Other modifications>
<7. Overview of embodiments>
<8. Present technology>
<1. Configuration of information processing apparatus as an embodiment>
Fig. 1 shows an example of an external configuration of an information processing apparatus 1 as an embodiment according to the present technology.
The information processing apparatus 1 includes a display screen 17a capable of displaying various types of information, and assists the user in terms of color vision by presenting information on the display screen 17a.
In the present example, the information processing apparatus 1 is configured as a wristband-type apparatus and includes: a main body 1a, on which the display screen 17a is formed; and a band portion 1b, which is connected to the main body 1a and is attached to the user's wrist.
Further, the information processing apparatus 1 includes an imaging unit 21, which captures images with a solid-state imaging element. In the present example, the imaging unit 21 is provided separately from the main body 1a, specifically at a predetermined position on the band portion 1b. More specifically, in the present example, the imaging unit 21 is provided on the band portion 1b so as to be located on the palm side of the user's wrist when the information processing apparatus 1 is worn, while the main body 1a is located on the back side of the hand.
In this example, a color image may be displayed on the display screen 17a, and the imaging unit 21 may generate the color image as a captured image.
As will be described later, the information processing apparatus 1 presents, via the display screen 17a, color vision assistance information generated based on a captured image of an object obtained by the imaging unit 21 and the color vision characteristic information of the user.
Note that in this example, the information processing apparatus 1 is formed in a shape mimicking a wristwatch, but does not necessarily have a clock function.
Fig. 2 shows an example of an electrical internal configuration of the information processing apparatus 1.
As shown, the information processing apparatus 1 includes a Central Processing Unit (CPU) 11, a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, and a nonvolatile memory unit 14. The nonvolatile memory unit 14 includes, for example, an electrically erasable programmable read-only memory (EEPROM) or the like.
The CPU 11 executes various types of processing in accordance with a program stored in the ROM 12 or nonvolatile memory unit 14 or a program loaded from the storage unit 19 to the RAM 13 as described later. The RAM 13 also appropriately stores data and the like necessary for the CPU 11 to execute various types of processing.
The CPU 11, ROM 12, RAM 13, and nonvolatile memory unit 14 are connected to each other via a bus 23. In addition, an input/output interface 15 is also connected to the bus 23.
The input/output interface 15 can be connected to the following units: an input unit 16 for the user to perform input operations; a display unit 17 including a liquid crystal panel, an organic electroluminescence (EL) panel, or the like; a voice output unit 18 including a speaker or the like; a storage unit 19; and a communication unit 20.
Here, the display screen 17a described above is a display screen included in the display unit 17.
The input unit 16 refers to an input device used by a user who uses the information processing apparatus 1. For example, various types of operation elements and operation devices such as a keyboard, a mouse, keys, a dial, a touch panel, and a remote controller are assumed as the input unit 16. The operation of the user is sensed by the input unit 16, and a signal corresponding to the input operation is interpreted by the CPU 11.
The display unit 17 includes the display screen 17a described above, and can display various types of information on the display screen 17a based on instructions from the CPU 11. Further, the display unit 17 may display various types of operation menus, icons, messages, and the like based on instructions from the CPU 11, that is, as a Graphical User Interface (GUI).
The storage unit 19 includes, for example, a storage medium such as a Hard Disk Drive (HDD) or a solid-state memory. The storage unit 19 may store various types of data such as image data captured by the imaging unit 21, and the like. Further, the storage unit 19 may also be used to store program data for causing the CPU 11 to execute various types of processing.
The communication unit 20 performs communication processing via a network including the Internet, and performs wired or wireless communication (e.g., near field communication) with peripheral devices.
Further, in the present example, the imaging unit 21 and the sensor unit 22 are connected to the input/output interface 15.
The imaging unit 21 includes, for example, a solid-state imaging element of the complementary metal oxide semiconductor (CMOS) type or the charge coupled device (CCD) type. In the solid-state imaging element, a plurality of pixels, each having a photoelectric conversion element such as a photodiode, are arranged two-dimensionally. The imaging unit 21 performs, for example, correlated double sampling (CDS) processing, automatic gain control (AGC) processing, and the like on the electric signal generated by photoelectric conversion at each pixel, and further performs analog/digital (A/D) conversion processing to obtain captured image data as digital data.
The sensor unit 22 comprehensively indicates various types of sensors for detecting the behavior of the user. In the present example, the sensor unit 22 is provided with a motion sensor (for example, an acceleration sensor, an angular velocity sensor, or the like) that detects a motion of a body part of a user wearing the information processing apparatus 1.
Note that a sensor to be provided in the sensor unit 22 should be appropriately selected according to the form, application, and the like of the information processing apparatus 1, and a modification thereof will be described again later.
Further, in the present example, a sensor serving as a Radio Frequency Identification (RFID) reader is provided in the sensor unit 22. Therefore, in the case where an "object" described later is an article or the like to which an RFID tag is attached, the information stored in the RFID tag can be read.
<2. Color vision assisting method as an embodiment>
Fig. 3 is a functional block diagram showing the function of the CPU 11 in the information processing apparatus 1 as an embodiment.
As shown, the CPU 11 includes a characteristic information acquisition unit F1, a behavior detection unit F2, an auxiliary information generation unit F3, and a presentation processing unit F4.
In the present embodiment, each of these functional units realizes a function of assisting color vision of a user whose color vision function is different from that of a general user. Hereinafter, for convenience, a person having a color vision function different from that of a general person is referred to as a "person having color vision diversity".
The characteristic information acquisition unit F1 acquires color vision characteristic information based on information input by the user. The color vision characteristic information is information indicating the characteristics of a person's color vision. For a user who is a person having color vision diversity, e.g., someone who finds it difficult to distinguish red from brown, or blue from purple, it indicates how the user's color vision function differs from typical color vision taken as a reference.
The color vision characteristic information of the person having the color vision diversity is appropriately grasped, and thus the person can be appropriately assisted in color vision based on the color vision characteristic information.
In the present embodiment, the color vision characteristic information of the user is acquired by performing the calibration of the color vision characteristic. In this example, the calibration is performed in the following process: the reference color and the sample color are displayed on the display screen 17a and the user is allowed to select the sample color that appears the same as the reference color.
A specific example of the calibration of the color vision characteristic will be described with reference to fig. 4.
Calibration is basically performed by a method in which a predetermined color (such as red, green, or blue) is used as a reference color, and a user is allowed to select a sample color that looks the same as the reference color for each reference color.
Fig. 4A shows the screen for starting calibration for a certain reference color. As shown in the figure, a color image of the target reference color and a start button B1 are displayed on the start screen. Here, a color image refers to an image having a single specific color, for example an image filled entirely with red.
When the start button B1 is operated on the start screen, a color image of the sample color is displayed on the display screen 17a together with a color image of the reference color. In this example, the sample colors displayed side by side with the reference color are sequentially changed, and the user is allowed to select the sample color that appears the same as the reference color.
Specifically, fig. 4B to 4D each show a state in which the sample colors are sequentially changed in response to the operation of the start button B1 on the start screen shown in fig. 4A. In fig. 4, the sample color and the reference color are each represented by the number of oblique lines.
In response to the operation of the start button B1, a color image of the sample color is displayed on the display screen 17a, and a stop button B2 is displayed instead of the start button B1. In changing the sample color, the user operates the stop button B2 at a timing when the sample color that appears to be the same as the reference color is displayed. The stop button B2 is operated to select the sample color, and the characteristic information acquisition unit F1 may designate the sample color displayed on the display screen 17a as a color visually recognized by the user as the same as the reference color at the timing when the stop button B2 is operated.
Here, in sequentially changing the sample colors, it is desirable to temporarily hide the color image of the sample color whenever the sample color is switched. This makes it easy for the user to recognize the switching of the sample color.
Furthermore, it is conceivable to change the sample color in a different pattern for each target reference color. For example, between the case where the target reference color is red and the case where the target reference color is blue, the hue of the sample color is changed in different modes, or the like. Here, the color change may be performed by changing at least one of hue, saturation, or brightness of a color image to be displayed.
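As a minimal sketch of generating such a sample-color sequence, the snippet below sweeps only the hue around a base value while holding saturation and brightness fixed; the text notes that any of hue, saturation, or brightness may be varied. The function name, step count, and sweep width are illustrative assumptions, not taken from the patent.

```python
import colorsys

def sample_color_sequence(base_hue, steps=12, hue_span=0.15):
    """Generate candidate sample colors around a base hue.

    base_hue is in [0, 1) (colorsys convention); returns RGB tuples
    with components in 0-255. The linear hue sweep is one possible
    change pattern; a different pattern could be used per reference
    color, as the text suggests.
    """
    colors = []
    for i in range(steps):
        # Sweep hue linearly across hue_span, centered on base_hue.
        h = (base_hue - hue_span / 2 + hue_span * i / (steps - 1)) % 1.0
        r, g, b = colorsys.hsv_to_rgb(h, 1.0, 1.0)
        colors.append((round(r * 255), round(g * 255), round(b * 255)))
    return colors

# A sequence of reddish sample colors centered on pure red (hue 0.0):
reds = sample_color_sequence(0.0)
```

In practice each tuple would be rendered as a full-screen color image next to the reference color, with a brief blank frame between switches as described above.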
Fig. 4E shows an example of a screen in the case where the stop button B2 is operated at the timing of fig. 4D as an example of a screen displayed in response to the operation of the stop button B2.
In the case of operating the stop button B2, a color image of the reference color and a color image of the sample color at the time of operating the stop button B2 are displayed on the display screen 17a, respectively, and the reset button B3 and the next button B4 are displayed.
The user can instruct the information processing apparatus 1 to reset the selection of the sample color by operating the reset button B3. Accordingly, in the case of operating the reset button B3, the characteristic information acquisition unit F1 realizes reselection of the sample color by displaying the start screen shown in fig. 4A on the display screen 17 a.
On the other hand, in the case where the next button B4 is operated, the characteristic information acquisition unit F1 acquires information of the sample color displayed on the display screen 17a at the timing when the stop button B2 is operated as information visually recognized by the user as the same color as the reference color. Here, the color information refers to color information specified by hue, saturation, and brightness.
Further, in the case where the next button B4 is operated, the characteristic information acquisition unit F1 acquires color vision characteristic information of the next reference color. Specifically, for the next reference color, a start screen similar to the start screen shown in fig. 4A is displayed, the sample color is sequentially changed in response to the operation of the start button B1, and a screen on which the reset button B3 and the next button B4 are arranged is displayed in response to the operation of the stop button B2, similarly to fig. 4E.
In the case of operating the reset button B3, the characteristic information acquisition unit F1 causes the start screen of the next reference color to be displayed again, and allows the user to reselect the sample color that appears the same as the reference color. Further, in the case where the next button B4 is operated, the characteristic information acquisition unit F1 acquires information of the sample color displayed when the stop button B2 is operated during the sample color change of the next reference color as information of a color that appears to be the same as the next reference color.
Here, in the case where the target reference color is the last reference color, the characteristic information acquisition unit F1 displays an end screen indicating the end of calibration on the display screen 17a in response to the operation of the next button B4.
Since calibration for obtaining color vision characteristic information is performed by the above-described method, the user does not need to perform any operation to change the sample color. Further, since the sample color is displayed side by side with the reference color, it is easy for the user to compare the two, which helps obtain accurate color vision characteristic information.
Thus, it is possible to improve the accuracy of the calibration while reducing the burden of user operations related to the calibration, and ultimately to improve the accuracy of the color vision assistance.
Here, human color vision characteristics are known to vary both within a day and from day to day. In view of such variation, calibration should be performed relatively frequently. If the operation load were large, it would be difficult to perform calibration frequently, reducing the accuracy of color vision assistance based on the user's color vision characteristic information. Since the operation load of calibration is reduced, calibration can easily be performed at high frequency, which also improves the accuracy of the color vision assistance.
Note that, as understood from the above description, it is desirable that calibration for obtaining color vision characteristic information be performed not just once but multiple times. For example, in consideration of the within-day variation described above, calibration may be performed several times a day; in view of day-to-day variation, it should be performed at least once a day.
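The overall calibration flow described above (for each reference color, cycle sample colors until the user presses stop, then confirm with "next" or redo with "reset") can be sketched as a simple loop. The callback names below are hypothetical stand-ins for the UI interaction of Fig. 4, not part of the patent.

```python
def run_calibration(reference_colors, select_sample, user_confirms):
    """Sketch of the calibration flow.

    select_sample(ref) stands in for the stop-button interaction: it
    returns the sample color that was on screen when the user pressed
    stop. user_confirms() models the next/reset choice: True accepts
    the selection, False repeats it for the same reference color.
    """
    characteristic = {}  # reference color -> color the user sees as identical
    for ref in reference_colors:
        while True:
            selected = select_sample(ref)
            if user_confirms():   # "next" accepts; "reset" repeats
                characteristic[ref] = selected
                break
    return characteristic

# Example with scripted stand-ins: the first confirmation is a "reset",
# so the red selection is repeated once before being accepted.
answers = iter([False, True, True, True])
result = run_calibration(
    ["red", "green", "blue"],
    select_sample=lambda ref: f"sample-for-{ref}",
    user_confirms=lambda: next(answers),
)
```

The resulting mapping from each reference color to the user's matching sample color is one concrete form the color vision characteristic information could take.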
The description returns to fig. 3.
The behavior detection unit F2 detects a behavior of the user with respect to the object. Specifically, the behavior detection unit F2 detects the behavior of the user with respect to the object based on the detection signal of the motion sensor (in this example, the detection signal of the acceleration or the angular velocity) included in the sensor unit 22 shown in fig. 2. More specifically, the behavior of the user is detected from the movement of the user's hand on the side where the information processing apparatus 1 is worn.
In this example, a first type of behavior and a second type of behavior are detected as behaviors of the user with respect to the object. Specifically, as shown in fig. 5, the first type of behavior is the user extending his/her hand toward the object (moving the hand closer to the object). As shown in fig. 6, the second type of behavior is the user lifting up the object.
Note that fig. 5 and 6 show an example in which the object is a shirt as one piece of clothing, but the object is not limited thereto.
In this example, in order to detect the first type of behavior, the image captured by the imaging unit 21 is used together with the detection signal of the motion sensor. Specifically, the behavior detection unit F2 determines that the user has performed the first type of behavior when the detected signal waveform pattern of the motion sensor matches or is similar to the waveform pattern produced when a hand is extended forward, and image analysis of the captured image shows the object gradually approaching.
Further, the behavior detection unit F2 detects the second type of behavior using at least the detection signal of the motion sensor. Specifically, the behavior detection unit F2 determines that the user has performed the second type of behavior in a case where it has been recognized from the detection signal of the motion sensor that the user has moved his/her hand upward by a predetermined amount or more within a predetermined time after the detection of the first type of behavior.
Note that the method of detecting the first type of behavior and the second type of behavior is not limited to the above-described method, and various methods may be considered.
For example, an example has been described above in which the detection result of the first type of behavior is used in the detection of the second type of behavior, but it is also conceivable to perform the detection of the second type of behavior not based on the detection result of the first type of behavior but as an independent detection. In this case, it is conceivable to detect the second type of behavior, for example, under the following determination conditions: it is recognized from the captured image that the user's hand and the object are moving in conjunction with each other (moving relative to the background), and it is recognized from the detection signal of the motion sensor that the user's hand has moved upward by a predetermined amount or more within a period of time in which the user's hand and the object are moving in conjunction with each other.
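As a rough illustration of combining motion-sensor and image cues in the two-stage manner described above, the sketch below uses simple threshold rules. The signal names, thresholds, and rules are all assumptions made for illustration; the text specifies only waveform-pattern matching plus image analysis, not any particular rule.

```python
def detect_behaviors(forward_accel, vertical_disp, object_scale,
                     reach_threshold=1.5, lift_threshold=0.3):
    """Toy two-stage behavior detector.

    forward_accel: motion-sensor acceleration along the reaching axis.
    vertical_disp: estimated vertical displacement of the hand over time.
    object_scale: apparent size of the object in successive captured frames
                  (a stand-in for "the object gradually approaches").
    """
    # First type: a reaching-motion acceleration peak combined with the
    # object growing monotonically larger in the captured image.
    reached = (max(forward_accel) >= reach_threshold
               and all(b >= a for a, b in zip(object_scale, object_scale[1:])))
    # Second type: checked only after the first type was detected -- the
    # hand moves upward by a predetermined amount within the window.
    lifted = reached and (max(vertical_disp) - vertical_disp[0]) >= lift_threshold
    return reached, lifted

# Reaching toward and then lifting the object:
reach_and_lift = detect_behaviors([0.1, 1.8, 0.2], [0.0, 0.1, 0.4], [10, 12, 15])
# No notable motion:
idle = detect_behaviors([0.1, 0.2, 0.1], [0.0, 0.0, 0.0], [10, 10, 10])
```

A real implementation would compare against recorded waveform templates rather than a single peak threshold, but the gating of the second detection on the first follows the description above.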
As will be described later, in the present embodiment, the result of the user's behavior detected by the behavior detection unit F2 is used to present information of color vision assistance.
In fig. 3, the auxiliary information generating unit F3 generates color vision auxiliary information based on the captured image of the subject and the color vision characteristic information. The color vision assistance information is information for assisting the user in terms of color vision. Specifically, in the present example, information for helping a user (e.g., a person having color vision diversity) easily recognize the color of the object captured by the imaging unit 21 is generated as color vision assistance information.
Specific examples of the color vision auxiliary information will be described again below.
The presentation processing unit F4 presents color vision auxiliary information to the user. Specifically, in the present example, color vision assistance information is displayed on the display screen 17a of the display unit 17.
In the present example, the presentation processing unit F4 presents information indicating a color as presentation of color vision auxiliary information.
The "information indicating a color" referred to herein refers to information that enables a user to recall a specific color when presented with the information. Specific examples thereof include the above-described color image (image having a specific color), character information indicating color names (such as "blue", "red", "purple", and "bluish purple"), and the like.
Based on the captured image of the object and the color vision characteristic information of the user, information for helping the user (e.g., a person having color vision diversity) easily recognize the color of the object captured by the imaging unit 21 is generated as the color vision auxiliary information described above.
Fig. 7 is a diagram showing an example of presenting color vision assistance information.
Fig. 7 shows an example in which a color image of a color corresponding to the color of the object captured by the imaging unit 21 is displayed on the display screen 17a as color vision assistance information.
Here, the types of color vision diversity are classified according to which of the photoreceptors (red-sensitive photoreceptor, green-sensitive photoreceptor, and blue-sensitive photoreceptor) has declined function. The types are not limited to those in which a decline in function is observed in only one of the red-sensitive, green-sensitive, and blue-sensitive photoreceptors; there are also types in which a decline in function is observed in two of them.
For example, it is difficult for so-called dichromats with reduced function of the green-sensitive photoreceptors to distinguish between red and green, orange and yellowish green, green and brown, blue and violet, and the like. For example, in the case where the object is purple, the purple of the object appears more bluish than it actually is, and it is therefore difficult to distinguish from blue.
Therefore, in the case where the user is such a person having color vision diversity, when the color of the target shirt is purple, a color image of magenta (purple with a stronger red component) is generated as the color vision assistance information and displayed on the display screen 17a. The user can thus easily recognize the actual color of the object by referring to the color image displayed as the color vision assistance information. In particular, in the case described above, it becomes easy for the user to recognize that the color of the object, which the user perceives as blue, is actually purple.
In the color vision assistance information, the color to be presented to the user is determined based on the type of color vision diversity of the user (the type of color vision characteristics) and the color of the imaged object. The type of color vision diversity can be obtained from the above-described calibration results. Specifically, the characteristic information acquisition unit F1 specifies the type of color vision diversity from the calibration results and obtains information on the type of color vision diversity as the color vision characteristic information. When the type of color vision diversity of the user can be specified, a color that makes the actual color of the object easy to perceive for a person having that type of color vision diversity (hereinafter, such a color is referred to as an "auxiliary color") can also be specified.
In the present example, as described above, the auxiliary information generating unit F3 obtains the auxiliary color to be presented as color vision assistance information based on the color vision characteristic information acquired by the characteristic information acquisition unit F1 (i.e., the information on the type of color vision diversity) and the information on the color of the object specified from the image captured by the imaging unit 21.
Note that the auxiliary color can be obtained, for example, by preparing a table that stores auxiliary color information for each combination of the type of color vision diversity and the color of the object, and referring to that table.
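The table lookup described in the note above can be sketched as follows. The type names, color names, and fallback behavior are illustrative assumptions for this sketch, not values taken from the embodiment.

```python
# Hypothetical auxiliary-color table keyed by
# (type of color vision diversity, object color).
AUXILIARY_COLOR_TABLE = {
    # A user with reduced green-sensitive photoreceptor function
    # perceives purple as bluish, so a redder magenta is presented.
    ("deutan", "purple"): "magenta",
    ("deutan", "green"): "bright_green",
    ("protan", "red"): "orange_red",
}

def lookup_auxiliary_color(cv_type: str, object_color: str) -> str:
    """Return the auxiliary color for the given color vision type and
    object color; fall back to the object color itself when no entry
    exists (e.g., for users with typical color vision)."""
    return AUXILIARY_COLOR_TABLE.get((cv_type, object_color), object_color)
```

For example, `lookup_auxiliary_color("deutan", "purple")` would yield the magenta entry, matching the shirt example above.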
Here, in the case where the user is a person having color vision diversity, the auxiliary color indicated by the color vision auxiliary information is a color different from the color of the object.
The color of an object perceived by a user having color vision diversity differs from the actual color of the object. Presenting information of an auxiliary color different from the color of the object therefore helps such a user recognize the actual color of the object.
Further, in the present embodiment, the color vision characteristic information acquired by calibration can be restated as difference information between a reference color and the color that appears the same as that reference color. The auxiliary color of the object can then be restated as a color determined based on such color vision characteristic information (i.e., such difference information).
Here, the difference information between the reference color and the color that appears the same as the reference color can be said to be difference information between the reference color and the color the user perceives when viewing the reference color. In other words, the presentation processing unit F4, which obtains the auxiliary color by the above-described exemplary method and presents color vision assistance information indicating the auxiliary color, presents, as the color vision assistance information, information indicating a color determined based on the difference information between the color of the object and the color the user perceives when viewing the color of the object.
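One way to make the difference information concrete is the following sketch: if calibration shows that the user perceives reference color R as P, the shift (R − P) can be added to the object color to strengthen the component the user under-perceives. This particular arithmetic is an assumption for illustration; the embodiment states only that the auxiliary color is determined based on the difference information.

```python
def auxiliary_color_from_difference(object_color, reference, perceived):
    """Shift the object color by the calibration difference (reference
    minus perceived), clamping each RGB channel to the 0-255 range."""
    return tuple(
        max(0, min(255, c + (r - p)))
        for c, r, p in zip(object_color, reference, perceived)
    )
```

For example, if a purple reference (128, 0, 128) is perceived with too little red and too much blue, the shift pushes the presented color toward magenta, consistent with the shirt example above.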
In the present example, the presentation processing unit F4 presents color vision assistance information based on the user behavior result detected by the behavior detection unit F2. Specifically, the presentation processing unit F4 switches the information element level of the information presented as the color vision assistance information in response to the type of the user behavior detected by the behavior detection unit F2.
The information element level referred to herein is the level of the elements constituting the information. Specifically, a difference in information element level is, for example, a difference in the amount or number of types of information to be presented, a difference in the quality or specificity of the information to be presented, or the like.
Here, the inventors of the present application have obtained the following knowledge through observation of the behavior of people with color vision diversity: differences in the type of behavior, such as extending a hand toward an object, picking up an object, or changing the way an object is held, correspond to differences in the information element level of the information the user requires for decision making. For example, the act of extending a hand toward an object typically indicates a request by the user to identify the color of the object. Further, the behavior of picking up an object or changing the way it is held typically indicates a request by the user to obtain more detailed information in addition to the color of the object. Thus, as described above, the information element level of the information to be presented to the user is switched in response to the type of behavior of the user.
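The behavior-to-level mapping described above can be sketched minimally as follows; the behavior labels and presentation payloads are assumed names for illustration, not identifiers from the embodiment.

```python
def select_presentation(behavior: str) -> dict:
    """Map a detected behavior type to the information element level
    and the items to present as color vision assistance information."""
    if behavior == "reach_for_object":      # first type of behavior
        return {"level": 1, "items": ["auxiliary_color_image"]}
    if behavior == "pick_up_object":        # second type of behavior
        return {"level": 2,
                "items": ["auxiliary_color_image", "attribute_info"]}
    return {"level": 0, "items": []}        # no assistance presented
```

A reaching behavior thus yields only the auxiliary color image (first element level), while picking up the object adds attribute information (second element level).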
In the present example, in the case where the behavior detection unit F2 detects the above-described first type of behavior (the behavior of extending the hand toward the object), the presentation processing unit F4 presents, on the display screen 17a, color vision assistance information including only a color image indicating the auxiliary color, as shown in fig. 7.
Hereinafter, an information level including only information indicating a color for color vision assistance as described above is referred to as a "first element level" as an information element level of color vision assistance information.
On the other hand, in the case where the behavior detection unit F2 detects the above-described second type of behavior (behavior of lifting the object), as shown in fig. 8, the presentation processing unit F4 displays color vision assistance information of a second element level higher than the first element level in terms of the information element level on the display screen 17 a.
Specifically, fig. 8A shows an example in which attribute information of the object (including attribute information other than color) is displayed as color vision assistance information of the second element level, in addition to the color image of the first element level shown in fig. 7. This can be expressed as an example of displaying both information indicating the auxiliary color of the object and information other than information indicating a color.
Here, character information indicating "purple" of the color of the object and character information indicating the size, composition, and price of the shirt object are illustrated as attribute information of the object.
Here, in the case where the object is an article as in the present example, it is conceivable that information read by the above-described RFID reader from an RFID tag attached to the article is displayed as the attribute information of the object.
Fig. 8B shows color vision assistance information of the second element level, which corresponds to a case where the user has previously input a desired color (requested color) with respect to the commodity object. Specifically, the color vision auxiliary information in this case is obtained by displaying image information in which a color image indicating the color of the object (the "viewed color" in the figure) and a color image indicating the auxiliary color of the color requested by the user are arranged side by side.
In other words, a color image indicating the color of the object and a color image indicating the color corresponding to the color requested by the user are displayed side by side.
As described above, the information element level of the color vision assistance information to be presented is switched in response to the type of behavior of the user. Information can thus be presented at the information element level implicitly requested by the user through his/her natural behavior, which helps the user understand the object not only from the perspective of color vision but also from perspectives other than color vision. Furthermore, presenting the assistance information based on the natural behavior of the user makes it possible to provide accessibility with high affinity to the daily life of the user.
In particular, in the case where, as the color vision assistance information of the second element level, a color image indicating the color of the object and a color image indicating the color corresponding to the color requested by the user are displayed side by side as shown in fig. 8B, the user can be appropriately assisted in reaching a higher-order understanding of whether the object is the object the user desires in terms of hue (an understanding beyond the primary understanding of merely knowing the color of the object).
Note that in this example, the display of color vision assistance information related to the color requested by the user as shown in fig. 8B may be switched from the display of color vision assistance information shown in fig. 8A.
Specifically, in the present example, in the case where the second type of behavior is detected, the presentation processing unit F4 first displays the color vision assistance information including the attribute information shown in fig. 8A. After the color vision assistance information including the attribute information is displayed in this way, if the user has previously input a requested color, a display switching operation for the color vision assistance information related to the requested color as shown in fig. 8B is accepted. When the switching operation is performed, the color vision assistance information related to the requested color is displayed on the display screen 17a. Thereafter, each time the display switching operation is performed, the display is switched between the color vision assistance information including the attribute information and the color vision assistance information related to the requested color.
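The switching sequence just described can be sketched as a small state holder. The view names are assumptions for illustration: the display starts with the attribute-information view, and a switching operation is accepted only when a requested color has previously been input.

```python
class SecondLevelDisplay:
    """Sketch of the second-element-level display switching."""

    VIEWS = ("attribute_info", "requested_color_comparison")

    def __init__(self, requested_color_set: bool):
        self.requested_color_set = requested_color_set
        self.index = 0  # attribute information is shown first

    def current_view(self) -> str:
        return self.VIEWS[self.index]

    def on_switch_operation(self) -> str:
        # The switch is accepted only when the user has previously
        # input a requested color; otherwise the view is unchanged.
        if self.requested_color_set:
            self.index = 1 - self.index
        return self.current_view()
```

Each accepted switching operation toggles between the two views, matching the alternation between figs. 8A and 8B.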
Here, an example of displaying information indicating the auxiliary color of the object has been described above as an example of color vision assistance information, but displaying an enlarged image of the object is also conceivable as color vision assistance information.
Fig. 9 is an explanatory diagram of an enlarged image of an object.
Fig. 9A shows a captured image of an object before enlargement. For example, in the case where the object includes a fine color arrangement pattern portion, as in the resistive element shown in the drawing, it is difficult for a person having color vision diversity to recognize the colors of the color arrangement pattern portion. Thus, as shown in fig. 9B, an enlarged image of the object is displayed on the display screen 17a as color vision assistance information.
Note that, in the color vision assistance information in this case, as shown in fig. 9B for example, the aiming mark M may be displayed at a predetermined position in the image frame (such as the center of the image frame), and character information indicating the color of the portion indicated by the aiming mark M may be displayed as part of the color vision assistance information (for example, the display of "red" in the figure).
Further, for example, in the case where the color arrangement pattern in the color arrangement pattern portion represents attribute information of an object (resistance value in the resistance element) as in the resistance element, the attribute information obtained by image analysis of the object may be displayed as a part of color vision auxiliary information (for example, display of "1kΩ±5%" in the figure).
In the present example, the display of color vision assistance information by means of the above-described enlarged image is performed under the following conditions: the behavior detection unit F2 detects the first type of behavior of the user, and the object captured in the image captured by the imaging unit 21 includes a fine color arrangement pattern portion.
Specifically, in the case where the behavior detection unit F2 detects the first type of behavior of the user, the presentation processing unit F4 in the present example analyzes the image captured by the imaging unit 21 and determines the presence or absence of the fine color arrangement pattern portion. It is conceivable to determine the presence or absence of the fine color arrangement pattern portion by, for example, detecting color arrangement pattern portions in which portions of different colors are arranged in a predetermined pattern and determining that an arrangement interval of each color in the color arrangement pattern portions is equal to or smaller than a predetermined interval.
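The determination condition above might be sketched as follows, treating the color arrangement pattern as a list of pixel positions at which each color segment starts. The threshold value is an assumed parameter, not a value specified by the embodiment.

```python
FINE_INTERVAL_THRESHOLD_PX = 8  # assumed "predetermined interval"

def is_fine_color_pattern(segment_positions: list[int]) -> bool:
    """Return True when adjacent color segments are spaced closely
    enough (at or below the threshold) that the enlarged-image
    assistance should be offered."""
    if len(segment_positions) < 2:
        return False
    intervals = [b - a for a, b in
                 zip(segment_positions, segment_positions[1:])]
    return all(i <= FINE_INTERVAL_THRESHOLD_PX for i in intervals)
```

A resistor's color bands, whose segments start only a few pixels apart, would satisfy this condition, while a broad two-tone garment would not.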
In the case where it is determined that the fine color arrangement pattern portion exists, the presentation processing unit F4 generates an enlarged image including an image area of the fine color arrangement pattern portion in the captured image, and displays color vision assistance information including the enlarged image on the display screen 17 a.
Note that, although not shown, in the case of displaying an enlarged image as described above, color vision assistance information including information indicating the auxiliary color of the object may be displayed together with the enlarged image. For example, in the example of fig. 9B, information indicating the auxiliary color of the portion indicated by the aiming mark M (e.g., a color image or color character information) may be displayed together with the enlarged image.
Further, as described above, color vision assistance information including attribute information of an object as shown in fig. 8A and color vision assistance information related to a required color as shown in fig. 8B are exemplified as color vision assistance information at the second element level, but various examples of color vision assistance information at the second element level are conceivable. For example, examples as shown in fig. 10A and 10B are also conceivable.
Fig. 10A exemplifies, in the case where the object is a shirt, color vision assistance information including the following: information indicating the color of the object ("purple T-shirt" in the figure); and information on colors of pants that go well with the shirt object ("black" and "white" in the figure).
In addition, fig. 10B exemplifies, in the case where the object is a tomato, color vision assistance information including the following: information indicating the color of the object (in the figure, "ripe tomato", that is, a red tomato); and information suggesting recommended dishes in which to use the object (in the figure, "marinate" and "curry").
<3. Procedure >
Next, specific processing examples for realizing the color vision assisting method of the embodiment as described above will be described with reference to the flowcharts of figs. 11 and 12.
Note that the CPU 11 executes the processing shown in fig. 11 and 12 according to a program stored in the ROM 12 or the storage unit 19.
Fig. 11 is a flowchart showing a process related to calibration for obtaining color vision characteristic information of a user.
First, in step S101, the CPU 11 sets the reference color identifier n to an initial value "1". The reference color identifier n is the following identifier: which is used to identify a reference color to be processed among a plurality of reference colors defined as calibration targets.
In step S102 subsequent to step S101, the CPU 11 displays a start screen of the nth reference color. Specifically, for the nth reference color, a color image including the reference color and a screen of the start button B1 as shown in fig. 4A are displayed on the display screen 17 a.
In step S103 subsequent to step S102, the CPU 11 waits for the start button B1 to be operated, as a process of waiting for the start operation.
In the case where the start button B1 is operated and it is determined to perform the start operation, the CPU 11 proceeds to step S104 and starts the sequential change of the sample colors corresponding to the nth reference color. Specifically, the display processing of sequentially changing the sample color in a predetermined pattern for the nth reference color is started.
In step S105 following step S104, the CPU 11 waits for the above-described stop button B2 to be operated, as a process of waiting for the stop operation.
In the case where it is determined that the stop button B2 is operated and the stop operation is performed, the CPU 11 proceeds to step S106 and displays the operation screen after the stop. Specifically, as shown in fig. 4E, a screen including a color image of a reference color (nth reference color), a color image of a sample color at the time of stopping the operation, a reset button B3, and a next button B4 is displayed.
In step S107 subsequent to step S106, the CPU 11 determines the presence or absence of an operation for proceeding to the next step (i.e., the presence or absence of an operation of the next button B4). In the case where it is determined that the operation for proceeding to the next step has not been performed, the CPU 11 determines the presence or absence of the reset operation (i.e., the presence or absence of an operation of the reset button B3) in step S108. In the case where it is determined in step S108 that the reset operation has not been performed, the CPU 11 returns to step S107 and again determines the presence or absence of the operation for proceeding to the next step. The loop of steps S107→S108→S107 forms a waiting process that continues until either the operation for proceeding to the next step or the reset operation is performed.
In the case where it is determined in step S108 that the reset operation has been performed, the CPU 11 returns to step S102. Thus, for the nth reference color, a sample color that appears the same as the nth reference color may be selected again.
Further, in the case where it is determined in step S107 that the operation for proceeding to the next step has been performed, the CPU 11 proceeds to step S109 and stores the sample color at the time of stopping as the perceived color corresponding to the nth reference color. Specifically, information of the sample color displayed on the display screen 17a when it is determined that the stop operation in step S105 has been performed is stored in a predetermined storage means (e.g., the RAM 13, the nonvolatile storage unit 14, etc.) as information visually recognized by the user as the same color as the nth reference color.
In step S110 subsequent to step S109, the CPU 11 determines whether the reference color identifier n is greater than or equal to the upper limit value N. In the case where the reference color identifier n is not greater than or equal to the upper limit value N, the CPU 11 proceeds to step S111, increments the reference color identifier n by 1, and returns to step S102. Thus, for the next reference color, processing is performed that allows the user to select a sample color that appears the same as that reference color.
On the other hand, in the case where it is determined in step S110 that the reference color identifier n is greater than or equal to the upper limit value N, the CPU 11 proceeds to step S112, displays an end screen, and ends the series of processing shown in fig. 11. Note that in step S112, a screen including information for notifying the user of the completion of calibration, such as "calibration has been completed", is displayed as the end screen.
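The calibration flow of fig. 11 can be condensed into the following sketch, in which the interactive steps (start, sequential sample color change, stop, confirm or reset) are abstracted into a callback that returns the sample color the user finally selected for each reference color. The function and parameter names are illustrative.

```python
def run_calibration(reference_colors, pick_sample_color):
    """For each reference color, record the sample color the user
    perceives as identical to it. The resulting mapping
    (reference color -> perceived color) is the color vision
    characteristic information."""
    perceived = {}
    for ref in reference_colors:          # loop via S101/S110/S111
        sample = pick_sample_color(ref)   # steps S102-S109 (user stops
        perceived[ref] = sample           # the change and confirms)
    return perceived
```

The reset operation of step S108 corresponds to the callback internally re-running the selection for the same reference color before returning.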
Fig. 12 is a flowchart showing a process for detecting a user's behavior and presenting color vision assistance information in response to a behavior detection result.
In fig. 12, the CPU 11 determines in step S201 whether the first type of behavior has been detected. As described above, in the present example, the first type of behavior is a behavior in which the user extends his/her hand toward the object (see fig. 5), and the behavior can be detected based on, for example, the image captured by the imaging unit 21 and the detection signal of the motion sensor in the sensor unit 22, as described above.
In the case where it is determined in step S201 that the first type of behavior has been detected, the CPU 11 proceeds to step S203, and generates and presents color vision assistance information of the first element level.
As described above, in the present example, for the color vision assistance information of the first element level, switching is performed between the display of color vision assistance information based only on a color image of the auxiliary color as shown in fig. 7 and the display of color vision assistance information including an enlarged image of the object as shown in fig. 9B, according to whether the object captured in the image captured by the imaging unit 21 includes a fine color arrangement pattern portion. Specifically, in response to determining in step S201 that the first type of behavior has been detected, the CPU 11 analyzes the image captured by the imaging unit 21 and determines the presence or absence of the fine color arrangement pattern portion. As described above, it is conceivable to determine the presence or absence of the fine color arrangement pattern portion by, for example, detecting color arrangement pattern portions in which portions of different colors are arranged in a predetermined pattern and determining whether the arrangement interval of each color in the color arrangement pattern portion is equal to or smaller than a predetermined interval.
In the case where it is determined that the fine color arrangement pattern portion exists, the CPU 11 generates an enlarged image including an image area of the fine color arrangement pattern portion in the captured image, and displays color vision assistance information including the enlarged image on the display screen 17 a.
On the other hand, in the case where it is determined that there is no fine color arrangement pattern portion, the CPU 11 designates an auxiliary color for the color of the object designated from the captured image based on the color vision characteristic information of the user obtained from the above-described calibration result, and displays color vision auxiliary information including a color image derived from the auxiliary color on the display screen 17 a.
In response to execution of the process in step S203 as described above, the CPU 11 advances the process to step S205. Note that the processing in step S205 will be described later.
On the other hand, in the case where it is determined in step S201 that the first type of behavior is not detected, the CPU 11 proceeds to step S202 and determines whether the second type of behavior has been detected. In the present example, as described above, the second type of behavior is a behavior in which the user lifts up the object (see fig. 6), and the behavior may be detected based on, for example, the detection signal of the motion sensor and the detection result of the first type of behavior as described above.
In the case where it is determined in step S202 that the second type of behavior has been detected, the CPU 11 proceeds to step S204, and generates and presents color vision assistance information of the second element level.
As described above, in the present example, switching from the display of the second-element-level color vision assistance information including the attribute information of the object (fig. 8A) to the display of the second-element-level color vision assistance information related to the color requested by the user (fig. 8B) is performed in response to a switching operation by the user. Specifically, in response to determining in step S202 that the second type of behavior has been detected, the CPU 11 first displays the color vision assistance information including the attribute information shown in fig. 8A on the display screen 17a. Thereafter, if the user has previously input a requested color, the CPU 11 accepts a display switching operation for the color vision assistance information related to the requested color as shown in fig. 8B, and displays the color vision assistance information related to the requested color on the display screen 17a when the switching operation is performed. Thereafter, each time the display switching operation is performed, the display is switched between the color vision assistance information including the attribute information and the color vision assistance information related to the requested color.
The CPU 11 advances the process to step S205 in response to execution of the process in step S204.
Further, in the case where it is determined in step S202 that the second type of behavior is not detected, the CPU 11 skips the processing in step S204, and advances the processing to step S205.
In step S205, the CPU 11 determines whether or not to end the processing, that is, whether or not a predetermined condition determined in advance as a condition for ending the processing is satisfied, for example, the power supply of the information processing apparatus 1 being turned off.
If it is determined in step S205 that the processing is not to be ended, the CPU 11 returns to step S201. Thus, for example, after the first type of behavior is detected in step S201 and color vision assistance information of the first element level is presented in step S203, the second type of behavior may be detected in step S202 and color vision assistance information of the second element level may be presented in step S204. Further, for example, after the second type of behavior is detected in step S202 and color vision assistance information of the second element level is presented in step S204, the first type of behavior toward another object may be detected, and color vision assistance information of the first element level may be presented for that other object in step S203.
Furthermore, in the case where neither the first type of behavior nor the second type of behavior is detected, color vision assistance information of the first element level or the second element level is not presented.
In the case where it is determined in step S205 that the processing ends, the CPU 11 ends a series of processing shown in fig. 12.
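The loop of fig. 12 can be sketched as follows, with behavior detection, presentation, and the end condition abstracted as callables; the behavior labels are assumed names for illustration.

```python
def assistance_loop(detect_behavior, present, should_end):
    """Mirror steps S201-S205: on each pass, detect the user's
    behavior and present assistance information at the matching
    element level; when neither behavior type is detected,
    present nothing and loop again."""
    while not should_end():                  # step S205
        behavior = detect_behavior()
        if behavior == "first_type":         # step S201 -> S203
            present(element_level=1)
        elif behavior == "second_type":      # step S202 -> S204
            present(element_level=2)
```

Because the loop re-enters step S201 each pass, a first-element-level presentation for one object can be followed by a second-element-level presentation, or by a first-element-level presentation for another object, exactly as described above.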
<4 > modification related to device construction
As described above, an example has been described in which a single wristwatch-type information processing apparatus 1 includes all of the elements for realizing color vision assistance as in the present embodiment. Specifically, one information processing apparatus 1 includes all of the following elements: a "characteristic information acquisition unit" that acquires the color vision characteristic information of the user; an "imaging unit" that obtains a captured image of the object; an "auxiliary information generation unit" that generates color vision assistance information based on the captured image of the object and the color vision characteristic information; a "behavior detection unit" that detects the behavior of the user with respect to the object; a "sensor unit" for behavior detection, which detects, for example, motion; a "presentation processing unit" that presents color vision assistance information to the user; and a "presentation unit" that presents the color vision assistance information in response to the processing of the presentation processing unit.
However, the form of the information processing apparatus according to the embodiment is not limited to the wristwatch type. Furthermore, it is not necessary for a single information processing apparatus to include all of the elements described above. Hereinafter, modifications of the device configuration for realizing color vision assistance as an embodiment will be described.
Fig. 13 is an explanatory diagram of a first modification related to the device configuration.
In the first modification, a wristwatch-type device 30 and a glasses-type information processing apparatus 1A are provided. In this case, the device 30 serves as a device that detects the movement of the user's hand on the side on which the device 30 is worn. Further, the information processing apparatus 1A includes the imaging unit 21 and the display unit 17, and can capture an image of the object and display color vision assistance information.
In this case, the device 30 includes the above-described "sensor unit" that detects the movement of the user's hand, while the information processing apparatus 1A includes the above-described "characteristic information acquisition unit", "imaging unit" (imaging unit 21), "auxiliary information generation unit", "behavior detection unit", "presentation processing unit", and "presentation unit" (display unit 17). The information processing apparatus 1A performs the calibration for obtaining the color vision characteristic information as described above, detects the behavior of the user based on the detection signal of the hand movement obtained by the device 30, generates color vision assistance information based on the color vision characteristic information, and causes the presentation unit to present the color vision assistance information based on the behavior detection result of the user.
Note that in the glasses-type information processing apparatus 1A, it is conceivable that the display unit 17 is configured as, for example, a projector apparatus or the like that projects an image onto a lens portion of glasses or a retina of a user.
Fig. 14 is an explanatory diagram of a second modification related to the device configuration.
The second modification provides a wristwatch-type device 30, an information processing device 1B, and a server device 32 capable of communicating with the information processing device 1B via a network 31.
The device 30 serves as a device for detecting a movement of a hand in a similar manner to the first modification. The information processing apparatus 1B includes an imaging unit 21 and a display unit 17.
In this case, the information processing apparatus 1B may include at least the "characteristic information acquisition unit", "imaging unit", "presentation processing unit", and "presentation unit" among the above-described elements. At least one of the processes of the above-described "auxiliary information generating unit" and "behavior detection unit" is performed by the server device 32 instead of the information processing apparatus 1B. In the case where the "behavior detection unit" is provided in the server device 32, the information processing apparatus 1B transmits a detection signal of the motion of the user obtained by the device 30 to the server device 32, and acquires information indicating a detection result of the behavior of the user obtained by the server device 32. Further, in the case where the "auxiliary information generating unit" is provided in the server device 32, the information processing apparatus 1B transmits the image captured by the imaging unit and the color vision characteristic information acquired by calibration to the server device 32, and acquires the color vision auxiliary information generated by the server device 32 based on these pieces of information; the "presentation processing unit" then causes the "presentation unit" to present the color vision auxiliary information.
Fig. 15 is an explanatory diagram of a third modification related to the device configuration.
The third modification is an example of the glasses-type information processing apparatus 1C including each of the elements described above.
In this case, it is conceivable to provide, as the "sensor unit" for detecting the behavior of the user, a myoelectric potential sensor that detects movement of muscles near the eyes of the user, a line-of-sight sensor that detects a line-of-sight direction, or the like. In this case, as the behavior of the user, for example, a behavior related to how the user views the object (such as a state in which the user is merely glancing at the object or a state in which the user is gazing at the object) may be detected. It is conceivable to switch the information element level of the color vision auxiliary information to be presented in response to the type of behavior.
Note that even in the case of the glasses-type information processing apparatus 1C, the behavior may be detected based on the movement of the user's hand. For example, in the case where the hand of the user is captured in the captured image of the imaging unit 21, a behavior related to the movement of the hand of the user (for example, the user extending his/her hand toward the object, lifting the object, or the like) may be detected by performing image analysis on the image captured by the imaging unit 21. Note that in this case, it can be said that the imaging unit 21 (image sensor) also functions as a "sensor unit" for detecting the behavior of the user.
Fig. 16 is an explanatory diagram of a fourth modification related to the device configuration.
The fourth modification provides a chopstick-type device 30D and a glasses-type information processing apparatus 1D. In this case, the device 30D is provided with a "sensor unit" (motion sensor) for detecting the movement of the chopstick-type device 30D. Further, the information processing apparatus 1D is provided with an "imaging unit" (imaging unit 21) and a "presentation unit" (display unit 17), and is also provided with a "characteristic information acquisition unit", an "auxiliary information generating unit", a "behavior detection unit", and a "presentation processing unit".
For example, it is difficult for the above-mentioned two-color viewer (dichromat) to perceive the doneness of meat, and there is a possibility that the two-color viewer will eat uncooked meat. The fourth modification envisages providing assistance in such a case. In this case, the "behavior detection unit" in the information processing apparatus 1D detects the behavior of the user holding the object (in this case, meat) with the chopstick-type device 30D based on the detection signal of the motion sensor provided in the device 30D. Then, in response to the detection of the behavior, the "presentation processing unit" causes the "presentation unit" (display unit 17) to present the color vision auxiliary information generated by the "auxiliary information generating unit".
Note that also in this case, it is conceivable to switch the information element level in response to the type of behavior. For example, it is conceivable to present color vision auxiliary information of a first element level (for example, a color image of an auxiliary color of the object) for a behavior in which the user holds the object with the chopstick-type device 30D, and color vision auxiliary information of a second element level (for example, information other than the color image, such as an estimated cooking time of the meat) for a behavior in which the object is turned over (the meat is flipped), or the like.
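The element-level switching just described can be sketched as follows. The behavior names and presented contents are hypothetical examples chosen to mirror the chopstick-type case above, not terms of this disclosure.

```python
def assistance_for(behavior: str) -> dict:
    """Return color vision auxiliary information matching the detected behavior."""
    if behavior == "hold":   # first type: holding the meat with the device
        return {"level": 1, "content": "color image of an auxiliary color"}
    if behavior == "flip":   # second type: turning the meat over
        return {"level": 2, "content": "estimated remaining cooking time"}
    return {"level": 0, "content": None}  # no assistance presented
```

A behavior that does not match either type simply yields no presentation, which corresponds to presenting assistance only when a relevant behavior is detected.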
Fig. 17 is an explanatory diagram of a fifth modification related to the device configuration.
The fifth modification is an example of the mirror-type information processing apparatus 1E. The information processing apparatus 1E includes all the elements of the above-described "characteristic information acquisition unit", "imaging unit", "auxiliary information generation unit", "behavior detection unit", "sensor unit", "presentation processing unit", and "presentation unit".
The information processing apparatus 1E is provided with an imaging unit 21 that captures an image in the same direction as the direction in which the mirror surface faces. Thus, an image of the user standing in front of the mirror can be captured. In this case, the mirror surface also serves as a display screen capable of displaying images, and presents information to the user standing in front of the mirror.
Examples of applications of the information processing apparatus 1E include an application for assisting the user in selecting clothing. For example, in the case where a user standing in front of the mirror performs the action of holding a shirt against his or her chest, color vision auxiliary information is presented on the mirror surface. For example, as in the example of Fig. 7, it is conceivable to present, as the color vision auxiliary information, a color image of an auxiliary color indicating the color of the shirt as the object, or the like.
In the fifth modification as well, it is conceivable to switch the information element level in response to the type of behavior. For example, for the action of holding a shirt against the chest, a color image of an auxiliary color indicating the color of the shirt is presented as the color vision auxiliary information of the first element level as described above. In the case where the user holds the shirt against the chest while turning to the right or left, color vision auxiliary information including information suggesting a color of pants that matches the shirt is presented as the color vision auxiliary information of the second element level, and so on. Alternatively, it is also conceivable in this case to display, as the color vision auxiliary information of the second element level, an image of pants matching the shirt superimposed on the legs of the user reflected in the mirror.
Note that in this case, it is conceivable to detect the behavior of the user by performing image analysis on the captured image of the imaging unit 21. In this case, the imaging unit 21 functions not only as a sensor for detecting the color of the object but also as a "sensor unit" for detecting the behavior of the user.
<5. Modifications related to presented information>
Although an example in which only visual information is presented as color vision auxiliary information has been described above, at least any one of auditory, tactile, and olfactory information may be presented as color vision auxiliary information in addition to visual information.
Here, examples of the unit for presenting the tactile information include a vibrator generating vibration, a blower device providing tactile stimulation by wind, a device providing electrotactile stimulation, and the like. Further, examples of the unit for presenting the olfactory information include a perfume diffuser and the like.
Here, it is conceivable to switch the information element levels about auditory information, tactile information, and olfactory information to be presented in response to the type of behavior of the user.
Examples of presenting the above-described audible information, tactile information, and olfactory information in response to the types of the first type of behavior and the second type of behavior will be described below.
At the time of the first type of behavior
Auditory: a sound reminiscent of the object
Tactile: weak vibration with long vibration intervals
Olfactory: a scent reminiscent of the object
At the time of the second type of behavior
Auditory: a sound indicating the object
Tactile: strong vibration with short vibration intervals
Olfactory: a scent indicating the object
In terms of hearing, the above-mentioned "sound reminiscent of the object" may be, for example, the spoken word "clothing" in the case where the object is a shirt, or the spoken word "vegetable" in the case where the object is a tomato. On the other hand, the "sound indicating the object" may be, for example, the spoken word "shirt" in the case where the object is a shirt, or the spoken word "tomato" in the case where the object is a tomato.
In terms of smell, examples of the above-described "scent reminiscent of the object" include a "sweet" scent, a "grass-like" scent, and the like in the case where the object is a tomato. On the other hand, examples of the "scent indicating the object" include the scent of the tomato itself in the case where the object is a tomato.
Note that in terms of presentation of tactile information, for example, it is conceivable to allow the user to perceive characters indicating the form, color, name, or the like of an object in a Morse code manner.
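The per-modality presentations listed above can be expressed as a simple lookup keyed by behavior type and modality. This is an illustrative sketch only; the vibration parameters and string labels are assumed values, not specified by this disclosure.

```python
PRESENTATION = {
    "first": {
        "auditory": "sound reminiscent of the object",
        "tactile": {"strength": "weak", "interval_ms": 800},
        "olfactory": "scent reminiscent of the object",
    },
    "second": {
        "auditory": "sound indicating the object",
        "tactile": {"strength": "strong", "interval_ms": 200},
        "olfactory": "scent indicating the object",
    },
}

def select_presentation(behavior_type: str, modality: str):
    """Pick the presentation for a detected behavior type and a modality."""
    return PRESENTATION[behavior_type][modality]
```

For instance, a second-type behavior paired with the tactile modality yields the stronger, shorter-interval vibration pattern.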
<6. Other modifications >
Here, the embodiments are not limited to the specific examples described so far, and configurations as various modifications may be adopted.
For example, an example in which an enlarged image of an object is displayed as color vision auxiliary information has been described above; the magnification ratio of the image at this time may be determined based on information of magnification ratios set in enlargement operations performed by the user on displayed objects in the past.
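One conceivable way to derive a default magnification from that operation history is sketched below. The use of the median and the history values themselves are assumptions for illustration; the disclosure only states that past enlargement operations inform the ratio.

```python
from statistics import median

# Magnifications the user set in past enlargement operations (assumed values).
past_magnifications = [1.5, 2.0, 3.0, 2.0]

# Use the median of the history as the default magnification for the
# enlarged image presented as color vision auxiliary information.
default_magnification = median(past_magnifications)
```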
Further, when a color image is displayed as color vision auxiliary information, it is also conceivable to consider a difference in color appearance due to ambient light in determining the color of the color image. That is, for example, the type of the ambient light is specified by performing image analysis on the image captured by the imaging unit 21, and the color of the color image is determined according to the specified type of the ambient light.
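A simple white-balance-style correction consistent with the above can be sketched as follows. The ambient-light classification and the white-point values are assumptions for illustration; the disclosure specifies only that the ambient light type, identified by image analysis, influences the determined color.

```python
# Assumed white points for two ambient light types (illustrative values).
WHITE_POINTS = {
    "daylight": (255, 255, 255),
    "incandescent": (255, 214, 170),  # warm light: red-shifted white point
}

def correct_for_ambient(rgb, light_type):
    """Scale each RGB channel so the color still reads as intended under
    the specified ambient light (a simple white-balance-style correction)."""
    wp = WHITE_POINTS[light_type]
    return tuple(min(255, round(c * 255 / w)) for c, w in zip(rgb, wp))
```

Under daylight the color is unchanged, while under the assumed incandescent light the green and blue channels are boosted to offset the warm cast.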
Further, it is also conceivable to use a distance measurement sensor such as a time of flight (ToF) sensor, an illuminance sensor, or the like as a sensor for detecting the behavior of the user, for example, in addition to the motion sensor, the line-of-sight sensor, or the like as exemplified above.
<7. Overview of embodiments >
As described above, the information processing apparatuses (the information processing apparatuses 1, 1A, 1B, 1C, 1D, and 1E) according to the embodiments each include a presentation processing unit (a presentation processing unit F4) that presents color vision assistance information generated based on a captured image of an object and color vision characteristic information of a user to the user.
As described above, the color vision assistance information generated based on the captured image of the object and the color vision characteristic information of the user is presented to the user, and thus can help the user easily recognize the color of the object that the user actually sees.
Therefore, even in the case where the user is a person having color vision diversity, the user can be enabled to accurately recognize the color of the object actually seen.
Further, in the information processing apparatus according to the embodiment, the presentation processing unit presents information indicating a color as presentation of color vision auxiliary information.
As described above, the information indicating the color is presented to the user as color vision assistance information, and thus, in the case where the user is a person having color vision diversity, the user can be allowed to recall the actual color of the object that the user is viewing.
Therefore, it is possible to enable the user to accurately recognize the color of the object actually seen, with respect to the user of the person having the color vision diversity.
Further, in the information processing apparatus according to the embodiment, the presentation processing unit presents information indicating a color different from that of the object as the presentation of the color vision auxiliary information.
The color of an object as perceived by a user who is, for example, a person with color vision diversity may differ from the actual color of the object.
Thus, as described above, information of a color different from that of the object is presented, and thus it is possible to make it easier for a user (e.g., a person having color vision diversity) to recognize the actual color of the object.
Further, in the information processing apparatus according to the embodiment, the presentation processing unit presents information indicating a color determined based on difference information between the color of the object and the perceived color when the user perceives the color of the object as presentation of the color vision assistance information.
Accordingly, color information determined to compensate for differences between the actual color of the object and the color perceived by the user (e.g., a person having color vision diversity) may be presented.
Thus, it is possible to make it easier for a user (e.g., a person having color vision diversity) to recognize the actual color of the object.
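The determination of a color from the difference information described above can be sketched with a simple additive model. This model, and the sample values, are assumptions for illustration; the disclosure states only that the presented color is determined based on the difference between the object's color and the perceived color.

```python
def _clamp(v: int) -> int:
    """Keep a channel value within the 0-255 range."""
    return max(0, min(255, v))

def compensated_color(actual, perceived):
    """Shift the presented color by the perception error so that the color
    the user perceives moves toward the actual color of the object."""
    return tuple(_clamp(a + (a - p)) for a, p in zip(actual, perceived))
```

For example, if a red channel is perceived weaker than it actually is, the presented color over-weights that channel by the same amount, compensating for the difference.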
Further, in the information processing apparatus according to the embodiment, the presentation processing unit presents a color image as the presentation of the color vision auxiliary information.
Using image information as color vision auxiliary information makes it possible for a user to intuitively understand the actual color of an object.
Further, in the information processing apparatus according to the embodiment, the presentation processing unit presents the enlarged image of the object as the presentation of the color vision auxiliary information.
In a case where the color arrangement pattern portion of the object is fine, for example, in a case where the object is small, it is difficult for a person having color vision diversity to recognize the colors of the color arrangement pattern portion when viewing the object with the naked eye. Thus, an enlarged image of the object is presented as the color vision auxiliary information.
Therefore, even if the color arrangement pattern portion in the object is fine, the user (e.g., a person having color vision diversity) can easily recognize the actual color of the color arrangement pattern portion.
Further, the information processing apparatus according to the embodiment includes a behavior detection unit (behavior detection unit F2) configured to detect a behavior of the user with respect to the object, wherein the presentation processing unit switches the information element level presented as the color vision assistance information in response to the type of the behavior.
Thus, information can be presented according to the information element level implicitly required by the user through his or her natural behavior, and this can help the user understand the object not only from the perspective of color perception but also from perspectives other than color perception. Furthermore, the presentation of auxiliary information based on the natural behavior of the user makes it possible to provide accessibility with high affinity to the daily life of the user.
Further, in the information processing apparatus according to the embodiment, in the case where the type of the behavior detected by the behavior detection unit is the first type, the presentation processing unit presents only the information indicating the color as the presentation of the color vision assistance information, and in the case where the type of the behavior detected by the behavior detection unit is the second type different from the first type, the presentation processing unit presents the information indicating the color and the different information other than the information indicating the color as the presentation.
The presentation processing unit may present appropriate information corresponding to each of a case where the user desires to recognize the color of the object and a case where the user desires to recognize an element other than the color of the object.
Thus, information can be presented according to the information element level implicitly required by the user through his natural behavior, and it can be advantageous for the user to understand the object not only from the perspective of color perception but also from the perspective other than color perception.
Further, in the information processing apparatus according to the embodiment, in the case where the type of the behavior detected by the behavior detection unit is the first type, the presentation processing unit presents only a color image indicating the color of the object as the presentation of the color vision auxiliary information; and in the case where the type of the behavior detected by the behavior detection unit is a second type different from the first type, the presentation processing unit displays a color image indicating a color of the object and a color image indicating a color corresponding to a color requested by the user side by side as a presentation.
Thus, for example, in the case where there is a color requested by the user in terms of an object such as a commodity, comparison between the color requested by the user and the color of the object can be facilitated.
Thus, the user can be suitably assisted in having a higher-order understanding of whether the object is an object desired by the user in terms of hue.
Further, the information processing apparatus according to the embodiment includes a characteristic information acquisition unit (characteristic information acquisition unit F1) that acquires color vision characteristic information based on information input by a user.
Accordingly, it is possible to receive an information input for specifying a color vision characteristic from a user, and acquire color vision characteristic information of the user from the information.
Therefore, it is possible to perform color vision assistance based on the color vision characteristics of the individual user, and to improve the accuracy of the color vision assistance.
Further, in the information processing apparatus according to the embodiment, the characteristic information acquisition unit causes the display screen to display the reference color and the sample color, allows the user to select the sample color that appears the same as the reference color, and obtains the color vision characteristic information based on the information of the selected sample color.
As described above, the user is allowed to select the sample color that appears the same as the reference color, and thus the information of the type of color vision diversity can be obtained from the difference between the sample color and the reference color as the color vision characteristic information of the user.
Therefore, as the color vision assistance, appropriate assistance according to the color vision characteristics of the user can be performed.
Further, in the information processing apparatus according to the embodiment, the characteristic information acquisition unit allows the user to select a sample color that appears the same as the reference color while sequentially changing the sample color displayed on the display screen side by side with the reference color.
This eliminates the need for the user to perform an operation of changing the color of the sample in the calibration for obtaining the color vision characteristic information of the user. Further, since the sample color is displayed side by side with the reference color, it is easy for the user to compare the reference color with the sample color, and to obtain accurate color vision characteristic information.
Thus, it is possible to improve the accuracy of the calibration while reducing the burden of user operations related to the calibration, and ultimately to improve the accuracy of the color vision assistance.
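The calibration just summarized can be sketched as follows: sample colors are changed automatically next to a fixed reference color, the user selects the sample that appears the same, and the selection yields the color vision characteristic information. Representing that information as a per-channel offset, and the specific colors used, are assumptions for illustration only.

```python
REFERENCE = (220, 60, 60)  # reference color shown on the display screen
# Sample colors cycled sequentially beside the reference (assumed values).
SAMPLES = [(220, 60, 60), (160, 120, 60), (120, 120, 120)]

def characteristic_from_selection(reference, selected):
    """Derive characteristic information as the per-channel offset between
    the sample the user judged identical and the reference color."""
    return tuple(s - r for s, r in zip(selected, reference))

# Simulate the user selecting the second cycled sample as "looks the same":
profile = characteristic_from_selection(REFERENCE, SAMPLES[1])
```

A nonzero offset between the selected sample and the reference indicates the direction and magnitude of the user's color vision diversity on each channel, which the auxiliary information generation can then compensate for.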
Further, the information processing apparatus according to the embodiment includes an auxiliary information generating unit (auxiliary information generating unit F3) that can generate color vision auxiliary information based on the color vision characteristic information.
Thus, the processing of generating color vision assistance information and presenting color vision assistance information can be performed by one device.
Further, the information processing apparatus according to the embodiment includes a presentation unit (display unit 17) that can present the color vision auxiliary information in response to processing by the presentation processing unit.
Thus, the processing of generating color vision assistance information and presenting color vision assistance information can be performed by one device.
Further, the information processing apparatus according to the embodiment includes an imaging unit (imaging unit 21) that obtains a captured image of an object.
Thus, a series of operations of capturing an image of an object, generating color vision assistance information related to the object, and presenting the generated color vision assistance information can be completed by one apparatus.
Further, the information processing method according to the embodiment is an information processing method in which the information processing apparatus presents color vision assistance information generated based on the captured image of the object and the color vision characteristic information of the user to the user.
With such an information processing method, the information processing apparatus of the embodiment as described above can be realized.
Further, the color vision assisting system according to the embodiment includes: an imaging unit configured to capture an image of an object; an auxiliary information generating unit configured to generate color vision auxiliary information based on the captured image of the subject obtained by the imaging unit and color vision characteristic information of the user; and a presentation processing unit configured to present the color vision auxiliary information generated by the auxiliary information generating unit to a user.
As described above, color vision assistance information generated based on the captured image of the object and the color vision characteristic information of the user is presented to the user, and thus the user can be assisted in easily recognizing the color of the object actually seen by the user.
Therefore, even in the case where the user is a person having color vision diversity, the user can be enabled to accurately recognize the color of the object actually seen.
Note that the effects described in this specification are merely examples, are not limited, and other effects may exist.
<8. Present technology>
Note that the present technology may also have the following configuration:
(1)
an information processing apparatus comprising:
and a presentation processing unit configured to present color vision assistance information generated based on the captured image of the object and color vision characteristic information of the user to the user.
(2)
The information processing apparatus according to the above (1), wherein
The presentation processing unit presents information indicating a color as a presentation of the color vision auxiliary information.
(3)
The information processing apparatus according to the above (2), wherein
The presentation processing unit presents information indicating a color different from that of the object as presentation of the color vision auxiliary information.
(4)
The information processing apparatus according to the above (2) or (3), wherein
The presentation processing unit presents, as the presentation of the color vision assistance information, information indicating a color determined based on difference information between a color of the object and a perceived color when the user perceives the color of the object.
(5)
The information processing apparatus according to any one of (2) to (4) above, wherein
The presentation processing unit presents a color image as a presentation of the color vision auxiliary information.
(6)
The information processing apparatus according to any one of (1) to (5) above, wherein
The presentation processing unit presents the enlarged image of the object as a presentation of the color vision assistance information.
(7)
The information processing apparatus according to any one of (1) to (6), further comprising
A behavior detection unit configured to detect a behavior of the user with respect to the object,
wherein the presentation processing unit switches an information element level of information presented as the color vision assistance information in response to the type of the behavior.
(8)
The information processing apparatus according to (7), wherein
In the case where the type of the behavior detected by the behavior detection unit is a first type, the presentation processing unit presents only information indicating a color as presentation of the color vision auxiliary information; and in the case where the type of the behavior detected by the behavior detection unit is a second type different from the first type, the presentation processing unit presents information indicating the color and different information other than the information indicating the color as presentations.
(9)
The information processing apparatus according to (7), wherein
In the case where the type of the behavior detected by the behavior detection unit is the first type, the presentation processing unit presents only a color image indicating a color of the object as the presentation of the color vision assistance information; and in the case where the type of the behavior detected by the behavior detection unit is a second type different from the first type, the presentation processing unit displays a color image indicating a color of the object and a color image indicating a color corresponding to a color required by the user side by side as a presentation.
(10)
The information processing apparatus according to any one of (1) to (9) above, further comprising
And a characteristic information acquisition unit configured to acquire the color vision characteristic information based on information input by the user.
(11)
The information processing apparatus according to the above (10), wherein
The characteristic information acquisition unit causes a display screen to display a reference color and a sample color, allows the user to select a sample color that appears the same as the reference color, and acquires the color vision characteristic information based on information of the selected sample color.
(12)
The information processing apparatus according to the above (11), wherein
The characteristic information acquisition unit allows the user to select the sample color that appears the same as the reference color while sequentially changing the sample color displayed side by side with the reference color on the display screen.
(13)
The information processing apparatus according to any one of (1) to (12), further comprising
an auxiliary information generating unit configured to generate the color vision auxiliary information based on the color vision characteristic information.
(14)
The information processing apparatus according to the above (13), further comprising
And a presentation unit configured to present the color vision auxiliary information in response to presentation by the presentation processing unit.
(15)
The information processing apparatus according to the above (14), further comprising
An imaging unit configured to obtain a captured image of the object.
(16)
An information processing method in which
an information processing apparatus presents color vision assistance information generated based on a captured image of an object and color vision characteristic information of a user to the user.
(17) A color vision assisting system comprising:
An imaging unit configured to capture an image of an object;
an auxiliary information generating unit configured to generate color vision auxiliary information based on a captured image of an object obtained by the imaging unit and color vision characteristic information of a user; and
and a presentation processing unit configured to present the color vision auxiliary information generated by the auxiliary information generating unit to the user.
List of reference numerals
1. 1A, 1B, 1C, 1D, 1E information processing apparatus
1a main body
1b belt portion
11 CPU
17. Display unit
17a display screen
18. Voice output unit
21. Imaging unit
22. Sensor unit
F1 Characteristic information acquisition unit
F2 Behavior detection unit
F3 Auxiliary information generating unit
F4 Presentation processing unit
B1 Start button
B2 Stop button
B3 Reset button
B4 Next button
M aiming mark
30 30D device
31. Network system
32. Server device

Claims (17)

1. An information processing apparatus comprising:
and a presentation processing unit configured to present color vision assistance information generated based on the captured image of the object and color vision characteristic information of the user to the user.
2. The information processing apparatus according to claim 1,
wherein the presentation processing unit presents information indicating a color as the presentation of the color vision auxiliary information.
3. The information processing apparatus according to claim 2,
wherein the presentation processing unit presents information indicating a color different from the color of the object as the presentation of the color vision auxiliary information.
4. The information processing apparatus according to claim 2,
wherein the presentation processing unit presents, as the presentation of the color vision assistance information, information indicating a color determined based on difference information between a color of the object and a perceived color when the user perceives the color of the object.
5. The information processing apparatus according to claim 2,
wherein the presentation processing unit presents a color image as the presentation of the color vision auxiliary information.
6. The information processing apparatus according to claim 1,
wherein the presentation processing unit presents the enlarged image of the object as the presentation of the color vision auxiliary information.
7. The information processing apparatus according to claim 1, further comprising:
a behavior detection unit configured to detect a behavior of the user with respect to the object,
Wherein the presentation processing unit switches an information element level of information presented as the color vision assistance information in response to the type of the behavior.
8. The information processing apparatus according to claim 7,
wherein, in a case where the type of the behavior detected by the behavior detection unit is a first type, the presentation processing unit presents, as the color vision assistance information, only information indicating a color; and in a case where the type of the behavior is a second type different from the first type, the presentation processing unit presents both the information indicating the color and additional information other than the information indicating the color.
9. The information processing apparatus according to claim 7,
wherein, in a case where the type of the behavior detected by the behavior detection unit is a first type, the presentation processing unit presents, as the color vision assistance information, only a color image indicating the color of the object; and in a case where the type of the behavior is a second type different from the first type, the presentation processing unit displays, side by side, a color image indicating the color of the object and a color image indicating a color corresponding to a color desired by the user.
10. The information processing apparatus according to claim 1, further comprising:
a characteristic information acquisition unit configured to acquire the color vision characteristic information based on information input by the user.
11. The information processing apparatus according to claim 10,
wherein the characteristic information acquisition unit causes a display screen to display a reference color and sample colors, allows the user to select the sample color that appears to the user to be the same as the reference color, and acquires the color vision characteristic information based on information on the selected sample color.
12. The information processing apparatus according to claim 11,
wherein the characteristic information acquisition unit allows the user to select the sample color that appears the same as the reference color while the sample colors displayed side by side on the display screen are sequentially changed.
13. The information processing apparatus according to claim 1, further comprising:
an assistance information generating unit configured to generate the color vision assistance information based on the color vision characteristic information.
14. The information processing apparatus according to claim 13, further comprising:
a presentation unit configured to present the color vision assistance information in response to processing by the presentation processing unit.
15. The information processing apparatus according to claim 14, further comprising:
an imaging unit configured to obtain a captured image of the object.
16. An information processing method configured to:
causing an information processing apparatus to present, to a user, color vision assistance information generated based on a captured image of an object and color vision characteristic information of the user.
17. A color vision assisting system comprising:
an imaging unit configured to capture an image of an object;
an assistance information generating unit configured to generate color vision assistance information based on the captured image of the object obtained by the imaging unit and color vision characteristic information of a user; and
a presentation processing unit configured to present the color vision assistance information generated by the assistance information generating unit to the user.
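The claims above describe a pipeline: capture an image of an object, compare the object's color with how the user perceives it (per the user's color vision characteristic information), and generate assistance information from the difference. The following is a minimal, hypothetical sketch of that flow; the `ColorVisionProfile` class, the single `red_sensitivity` parameter, and all function names are illustrative assumptions, not taken from the patent text.

```python
from dataclasses import dataclass

RGB = tuple[float, float, float]

@dataclass
class ColorVisionProfile:
    """Color vision characteristic information for one user (claim 10).

    A real system would derive this from the sample-color selection test
    of claims 11-12; here it is reduced to one parameter for illustration.
    """
    red_sensitivity: float  # 1.0 = typical vision; lower weakens red perception

def perceived_color(true_color: RGB, profile: ColorVisionProfile) -> RGB:
    """Roughly simulate the color the user perceives."""
    r, g, b = true_color
    return (r * profile.red_sensitivity, g, b)

def generate_assistance_info(object_color: RGB,
                             profile: ColorVisionProfile) -> dict:
    """Generate color vision assistance information from the difference
    between the object's color and the user's perceived color (claim 4)."""
    seen = perceived_color(object_color, profile)
    diff = tuple(abs(t - s) for t, s in zip(object_color, seen))
    return {
        "object_color": object_color,
        "perceived_color": seen,
        "difference": diff,
        # Present extra assistance only when the perceptual gap is large.
        "needs_assistance": max(diff) > 0.1,
    }

# Example: a saturated red is hard to distinguish for this profile.
profile = ColorVisionProfile(red_sensitivity=0.4)
info = generate_assistance_info((0.9, 0.2, 0.1), profile)
print(info["needs_assistance"])  # True
```

In the claimed system, the presentation processing unit would then decide how much of this information to show (a color name only, or a side-by-side color comparison) based on the detected user behavior of claims 7-9.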
CN202280029851.1A 2021-04-27 2022-02-21 Color vision assisting system Pending CN117203602A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2021075311 2021-04-27
JP2021-075311 2021-04-27
PCT/JP2022/006906 WO2022230323A1 (en) 2021-04-27 2022-02-21 Color vision assistance system

Publications (1)

Publication Number Publication Date
CN117203602A true CN117203602A (en) 2023-12-08

Family

ID=83846960

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280029851.1A Pending CN117203602A (en) 2021-04-27 2022-02-21 Color vision assisting system

Country Status (4)

Country Link
JP (1) JPWO2022230323A1 (en)
CN (1) CN117203602A (en)
DE (1) DE112022002343T5 (en)
WO (1) WO2022230323A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5110760B1 (en) 1970-07-13 1976-04-06
JP4466067B2 (en) * 2003-12-19 2010-05-26 富士ゼロックス株式会社 Color vision support device and color vision support program
JP5101995B2 (en) 2007-09-10 2012-12-19 株式会社リコー Input control apparatus and image forming apparatus
JP2009071541A (en) 2007-09-12 2009-04-02 Ricoh Co Ltd Image processor, image processing method, program, and recording medium
JP2017085461A (en) * 2015-10-30 2017-05-18 株式会社日本総合研究所 Color conversion device, color conversion system and program

Also Published As

Publication number Publication date
JPWO2022230323A1 (en) 2022-11-03
WO2022230323A1 (en) 2022-11-03
DE112022002343T5 (en) 2024-02-15

Similar Documents

Publication Publication Date Title
US20220284844A1 (en) Dark mode display interface processing method, electronic device, and storage medium
CN108986766B (en) Information display terminal and information display method
TWI279735B (en) Method and device of color correction for projector, and projector
US10617301B2 (en) Information processing device and information processing method
CN104850432B (en) Adjust the method and device of color
CN109155053B (en) Information processing apparatus, information processing method, and recording medium
CN105573627B (en) It is a kind of to prompt the method and device that user carries out eyeshield by intelligent glasses
US11478062B2 (en) Makeup item presenting system, makeup item presenting method, and makeup item presenting server
EP3054372A2 (en) Method of providing notification and electronic device for implementing same
US10757337B2 (en) Information processing apparatus and information processing method to control exposure for imaging the eye
CN107168671A (en) A kind of information prompting method and device
JP2017085461A (en) Color conversion device, color conversion system and program
CN106980480A (en) Display system, eyewear and the control method for showing system
CN106961546A (en) Information processor and method, camera device, display device, control method
CN113055752A (en) Image quality adjusting method and device and smart television
JP6566240B2 (en) Information processing apparatus, information processing method, and program
CN107329573B (en) Control method and electronic device
CN117203602A (en) Color vision assisting system
CN112153292A (en) Shooting method and device and electronic equipment
JP2013157845A (en) Electronic mirror and program
EP1706081B1 (en) System and method for identifying at least one color for a user
CN105468135B (en) A kind of information processing method and electronic equipment
CN112395030A (en) Page processing method and device, electronic device and storage medium
JP2016092430A (en) Imaging system, information processing device, imaging method, program and storage medium
CN106341615B (en) Control the method and terminal of flash lamp

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination