WO2023053517A1 - Control method for visibility information acquisition device, and visibility information acquisition device - Google Patents
- Publication number: WO2023053517A1 (international application PCT/JP2022/012970)
- Authority: WIPO (PCT)
- Prior art keywords
- identifier
- visibility
- display
- subject
- information acquisition
- Prior art date
Classifications
- G09G3/001—Control arrangements or circuits for visual indicators other than cathode-ray tubes, using specific devices, e.g. projection systems; display of non-alphanumerical information
- G06T15/20—Perspective computation
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/1423—Digital output to display device; controlling a plurality of local displays, e.g. CRT and flat panel display
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/38—Display of a graphic pattern with means for controlling the display position
- G06F2203/04801—Cursor retrieval aid, i.e. visual aspect modification, blinking, colour changes, enlargement or other visual cues, for helping the user to find the cursor in graphical user interfaces
- G09G2320/0257—Reduction of after-image effects
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2354/00—Aspects of interface with display user
Definitions
- the present invention relates to a visibility information acquisition device control method and a visibility information acquisition device.
- Patent Document 1 discloses a virtual space image providing method for providing a virtual space image visually recognized by a user to a head mounted display (HMD).
- In this method, the rotation direction and rotation speed of the HMD are obtained, and the end regions on both sides of the virtual space image, in the on-screen direction corresponding to the rotation direction, are processed with a range and intensity corresponding to the rotation speed.
- The present invention has been made with a focus on the above-mentioned points. It is an object of the present invention to provide a control method for a visibility information acquisition device, and a visibility information acquisition device, capable of accurately acquiring information necessary for generating a virtual space image that achieves visibility close to the appearance of the real space when the subject's viewpoint moves.
- One aspect of the present invention is a control method for a visibility information acquisition device that displays a first identifier at a first position in front of a subject, displays a second identifier at a second position spaced from the first position in the front-rear direction, and acquires visibility information of the second identifier when the subject's viewpoint is moved from the first position to the second position. This control method comprises: a first display step of displaying the first identifier at the first position so that the subject focuses on the first position; a second display step of displaying the second identifier at the second position; a control step of controlling the display state of the second identifier according to a preset transition condition so as to transition the visibility of the second identifier from a reference state to a target state different from the reference state; and an acquisition step of acquiring the visibility information of the second identifier from the subject.
- According to the control method of the visibility information acquisition device of the present invention, information necessary for generating a virtual space image that achieves visibility close to the appearance of the real space when the subject's viewpoint moves can be obtained with high accuracy.
- FIG. 1 is a conceptual diagram showing the configuration of a visibility information acquisition device according to an embodiment of the present invention.
- FIG. 2 is a conceptual diagram showing an example of a display unit realized in real space in the embodiment.
- FIG. 3 is a conceptual diagram showing an example of a display unit realized in virtual space in the embodiment.
- FIG. 4 is a flowchart showing an example of visibility information acquisition processing when the viewpoint moves in the moving-away direction in the embodiment.
- FIG. 5 is a diagram showing an example of identifiers displayed in the virtual space image when the viewpoint moves in the moving-away direction in the embodiment.
- FIG. 6 is a graph showing an example of the temporal change in visibility of the second identifier whose display state is controlled according to the transition condition in the embodiment.
- FIG. 7 is a diagram showing an example of an identifier displayed in a virtual space image when the viewpoint moves in the approaching direction in the embodiment.
- FIG. 8 is a diagram showing an example of determining a transition condition suitable for a driving simulator system based on the visibility information acquired by the visibility information acquisition device of the embodiment, and performing display control of a virtual space image according to that transition condition.
- FIG. 1 is a conceptual diagram showing the configuration of a visibility information acquisition device according to one embodiment of the present invention.
- the visibility information acquisition device 1 of this embodiment includes, for example, a display unit 2 and an information processing unit 3.
- The display unit 2 can display the first identifier at a first position in front of the subject S seated at a predetermined position, and can display the second identifier at a second position spaced from the first position in the front-rear direction.
- Each of the first position and the second position includes a position in virtual space as well as a position in real space.
- The display unit 2 according to the present embodiment may be implemented, for example, by arranging two display devices in the real space, or by setting two display areas in the virtual space using a head mounted display (HMD) or the like.
- FIG. 2 is a conceptual diagram showing an example of the display unit 2 implemented in the real space.
- The display unit 2 includes a tablet terminal 21 arranged at a position near the front of the subject S (hereinafter referred to as the "near position" Pn), and a display 22 arranged at a position far in front of the subject S (hereinafter referred to as the "far position" Pf).
- the tablet terminal 21 and the display 22 are arranged substantially in front of the subject S, respectively.
- The tablet terminal 21 is installed diagonally below and in front of the subject S with its screen facing the subject S, and an identifier In is displayed in an area positioned substantially at the center of the screen. The display 22 is installed at a height that does not overlap the tablet terminal 21 within the field of view of the subject S, and an identifier If is displayed in an area positioned substantially at the center of its screen.
- The distance Dn (FIG. 1) from the subject S to the tablet terminal 21 is set to, for example, 0.2 m, and the distance Df (FIG. 1) from the subject S to the display 22 is set to, for example, 5 m. However, the arrangement of the tablet terminal 21 and the display 22 is not limited to the above example.
- In the display unit 2, when the viewpoint of the subject S is moved from the tablet terminal 21 arranged at the near position Pn to the display 22 arranged at the far position Pf (hereinafter referred to as "movement of the viewpoint in the moving-away direction"), the near position Pn, the identifier In, and the tablet terminal 21 correspond to the first position, the first identifier, and the first display device of the present invention, while the far position Pf, the identifier If, and the display 22 correspond to the second position, the second identifier, and the second display device. Conversely, when the viewpoint is moved from the display 22 at the far position Pf to the tablet terminal 21 at the near position Pn (movement of the viewpoint in the approaching direction), the far position Pf, the identifier If, and the display 22 correspond to the first position, the first identifier, and the first display device of the present invention, and the near position Pn, the identifier In, and the tablet terminal 21 correspond to the second position, the second identifier, and the second display device.
- FIG. 3 is a conceptual diagram showing an example of the display section 2' implemented in the virtual space.
- The display unit 2' is implemented on a virtual space image VR displayed on the HMD mounted on the subject S's head. On the virtual space image VR, a first region 21' is formed below the horizontally central portion, and a second region 22' is formed above the horizontally central portion.
- the first region 21' and the second region 22' are set so as to be spaced apart in the depth direction (the front-rear direction as seen from the subject S) in the virtual space.
- the first area 21' is a virtual representation of the screen of the tablet terminal 21 in the real space shown in FIG.
- the second area 22' is a virtual representation of the screen of the display 22 in the real space shown in FIG.
- Information corresponding to the near position Pn (the distance Dn from the subject S) is set as the depth information of the first region 21', and information corresponding to the far position Pf (the distance Df from the subject S) is set as the depth information of the second region 22'.
- An identifier In is displayed approximately in the center of the first area 21'
- an identifier If is displayed approximately in the center of the second area 22'.
- the information processing section 3 includes, for example, a storage section 31, a display control section 32, a notification section 33, an information acquisition section 34, and a condition change section 35 as its functional blocks.
- the hardware configuration of the information processing section 3 is configured using, for example, a computer system including a processor, a memory, a user input interface, and a communication interface. That is, the information processing section 3 realizes the functions of the above blocks by reading and executing the programs stored in the memory by the processor of the computer system.
- The storage unit 31 stores transition conditions that, when the viewpoint of the subject S moves, cause the visibility of the second identifier (identifier In or If) displayed in the region around the destination of the viewpoint to transition from a reference state to a target state different from the reference state. A desired visibility level can be set for the reference state, and the level may be adjustable.
- the target state of visibility is set to be higher than the reference state.
- the setting of the reference state and target state of visibility is not limited to the above example, and it is also possible to set a target state lower than the reference state.
- The transition conditions include, for example, a display time T during which the display of the second identifier is continued, a delay time L from the completion of the subject S's viewpoint movement to the start of the visibility state transition of the second identifier, a transition time Δ required for the visibility of the second identifier to transition from the reference state to the target state, and a time constant τ that determines the temporal change in the degree of increase when the visibility of the second identifier is raised.
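- The four parameters above can be grouped into a single record. A minimal sketch in Python (the class and field names are illustrative, not taken from the patent; the example values are placeholders except for T, which is one of the four candidate display times given later in the description):

```python
from dataclasses import dataclass


@dataclass
class TransitionCondition:
    """Illustrative container for the transition condition parameters.

    display_time_s  -- display time T: how long the second identifier stays shown
    delay_s         -- delay time L: wait after the viewpoint movement completes
                       before the visibility transition starts
    transition_s    -- transition time (Delta): duration of the transition from
                       the reference state to the target state
    time_constant_s -- time constant (tau): shapes the temporal change in the
                       degree of visibility increase
    """
    display_time_s: float
    delay_s: float
    transition_s: float
    time_constant_s: float


# Placeholder values for one trial (T = 0.5 s is from the description).
cond = TransitionCondition(display_time_s=0.5, delay_s=0.1,
                           transition_s=0.3, time_constant_s=0.2)
```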
- The storage unit 31 also stores image data of the identifiers In and If to be displayed on the display unit 2, and the visibility information of the second identifier acquired by the information acquisition unit 34 is stored and accumulated in the storage unit 31 as well. The transition conditions may be stored in the storage unit 31 in advance, or transition conditions received from an external device (not shown) may be temporarily stored in the storage unit 31 and output to the display control unit 32. Details of the transition conditions, the image data of the identifiers In and If, and the visibility information of the second identifier will be described later.
- The display control unit 32 causes the display unit 2 to display the second identifier after displaying the first identifier, and controls the display state of the second identifier according to the transition conditions stored in the storage unit 31. Specifically, when visibility information corresponding to movement of the viewpoint in the moving-away direction is acquired, the display control unit 32 displays the identifier In on the tablet terminal 21 of the display unit 2 (or in the first area 21' of the display unit 2'), then displays the identifier If on the display 22 of the display unit 2 (or in the second area 22' of the display unit 2') and controls the display state of the identifier If according to the transition condition. Conversely, when visibility information corresponding to movement of the viewpoint in the approaching direction is acquired, the display control unit 32 first causes the display 22 of the display unit 2 (or the second area 22' of the display unit 2') to display the identifier If, then displays the identifier In on the tablet terminal 21 of the display unit 2 (or in the first area 21' of the display unit 2') and controls the display state of the identifier In according to the transition condition.
- the state of visibility of the second identifier is controlled by, for example, blurring the image displayed as the second identifier.
- Blur processing is processing that changes the amount of information in image data so that the image appears blurred.
- the blurring process is image processing that reduces the amount of information that the subject S can visually confirm.
- Specific examples of the blurring process include, for the image displayed as the second identifier: processing that reduces the amount of information; processing that lowers the resolution; processing that reduces the display area in stages; processing that increases the display area in stages; or a combination of these processes. In particular, the process of increasing the display area in stages and the process of decreasing the display area in stages can be performed in order or alternately to easily reproduce an out-of-focus state.
- The reference state of the visibility of the second identifier is, for example, the blurred, out-of-focus state after the blurring process is performed, and represents a state in which the amount of information that the subject S can visually confirm about the image is small. The target state of the visibility of the second identifier is, for example, the in-focus state before the blurring process is performed, and represents a state in which the subject S can visually confirm the image with a large amount of information.
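- One concrete way to realize the blurring described above is a simple box filter applied to the identifier image: a larger kernel radius removes more spatial detail (the reference, out-of-focus state), while radius 0 leaves the sharp image (the target state). This is an illustrative sketch, not the patent's actual implementation:

```python
def box_blur(img, radius):
    """Blur a 2D grayscale image (list of lists) with a box filter.

    A larger radius averages over a wider neighbourhood, reducing the
    amount of information the subject can visually confirm; radius 0
    returns the image unchanged (apart from int -> float conversion).
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out


# A sharp vertical edge: blurring smears it, lowering local contrast.
sharp = [[0, 0, 255, 255] for _ in range(4)]
blurred = box_blur(sharp, 1)
```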
- The notification unit 33 outputs an alert sound prompting the subject S to move the viewpoint from the first position to the second position, via a speaker or the like (not shown) provided in the display unit 2, 2' or the information processing unit 3.
- The notification unit 33 outputs the alert sound in conjunction with the display control unit 32. Specifically, the notification unit 33 outputs the alert sound after a predetermined time (for example, 2 to 5 seconds) has elapsed since the display control unit 32 displayed the first identifier on the display unit 2. The timing at which the alert sound is output from the notification unit 33 therefore substantially coincides with the timing at which the second identifier is displayed on the display unit 2, 2' by the display control unit 32.
- The means by which the notification unit 33 notifies the subject S is not limited to an alert sound. For example, characters or the like prompting the subject to move the viewpoint may be displayed, or the experiment manager may call out to prompt the subject S to move the viewpoint. The notification may also be given without explicit information prompting the subject to move from the first position to the second position.
- the information acquisition unit 34 acquires the visibility information of the second identifier from the subject S during the transition from the reference state of the visibility of the second identifier to the target state.
- The visibility information of the second identifier is, for example, information indicating the result of the subject S's evaluation of how the second identifier appears while its visibility transitions from the reference state to the target state, or information indicating the result of detecting the focus position of the subject S with a sensor or the like (not shown). Information indicating the subject S's evaluation of the appearance of the second identifier is input to the information acquisition unit 34 via a user input interface (not shown) connected to the information processing unit 3, or an output signal from a sensor or the like that detects the focus position of the subject S is input to the information acquisition unit 34; this input information is acquired as the visibility information of the second identifier.
- the visibility information acquired by the information acquisition unit 34 is stored in the storage unit 31 in association with the corresponding transition condition.
- The condition changing unit 35 changes the transition condition stored in the storage unit 31 based on the visibility information of the second identifier acquired by the information acquisition unit 34. The changed transition condition is used when obtaining visibility information from the next time onward. The condition changing unit 35 may be provided as necessary, and may be omitted.
- FIG. 4 is a flowchart showing an example of visibility information acquisition processing when the viewpoint moves away.
- FIG. 5 is a diagram showing an example of an identifier displayed on the display unit 2' (virtual space image VR) when the viewpoint moves away.
- In step S10 of FIG. 4, the display control unit 32 displays the identifier In (first identifier) in the first area 21' on the virtual space image VR.
- The identifier In is preferably a fixation index, such as "+", that easily fixes the viewpoint of the subject S at that position.
- the upper part of FIG. 5 shows a state in which the identifier In using the fixation index of "+” is displayed in the first region 21' on the virtual space image VR by the process of step S10.
- At this stage, the identifier If is not yet displayed on the display 22 of the display unit 2 or in the second area 22' on the virtual space image VR of the display unit 2'. Since only the identifier In is displayed, the subject S starts to gaze at the identifier In, and a state is formed in which the subject S is focused on the near position Pn (first position).
- In step S20, the display control unit 32 displays the identifier If (second identifier) on the display 22 of the display unit 2 or in the second area 22' on the virtual space image VR of the display unit 2', and the notification unit 33 outputs an alert sound prompting the subject S to move the viewpoint. The display of the identifier If and the output of the alert sound in step S20 are preferably performed after about 2 to 5 seconds have elapsed since the display of the identifier In in step S10. By providing such a standby time, a state in which the subject S is focused on the near position Pn can be reliably created.
- a random index is preferably used as the identifier If displayed in step S20.
- the random index can be, for example, a number randomly selected from a plurality of numbers, an image of a road sign, an icon, characters, or the like.
- a number (random index) randomly selected from six numbers 0 to 5 is used as the identifier If.
- the middle part of FIG. 5 shows a state in which an identifier If using a random index of "0" is displayed in the second area 22' on the virtual space image VR by the process of step S20.
- At this stage, the identifier If is displayed with its visibility in the reference state, that is, a blurred, out-of-focus state produced by the blurring process.
- In response to the alert sound, the subject S moves the viewpoint from the near position Pn to the far position Pf and visually recognizes the identifier If, displayed as a random index at the far position Pf.
- In step S30, the display control unit 32 controls the display state of the identifier If according to the transition conditions stored in the storage unit 31, and transitions the visibility of the identifier If from the reference state to the target state (in-focus state). The lower part of FIG. 5 shows the identifier If displayed in the second area 22' on the virtual space image VR after the visibility transition has completed and the target state has been reached.
- In step S40, the information acquisition unit 34 acquires from the subject S the visibility information of the identifier If during the transition of its visibility from the reference state to the target state. The information acquisition unit 34 acquires this visibility information when information indicating the subject S's evaluation of the appearance of the identifier If is input via the user input interface, or when an output signal from a sensor or the like that detects the focus position of the subject S is input to the information acquisition unit 34.
- The subject S's evaluation of how the identifier If appears may be performed, for example, according to the following six-level criteria: 6: very easy to read; 5: easy to read; 4: can be read without difficulty; 3: somewhat difficult to read, but readable; 2: hardly readable; 1: unable to read.
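- The six-level scale can be encoded directly for validating the subject's input. A small sketch (labels paraphrased from the description; the function name is illustrative):

```python
# Six-level readability scale from the description (6 = best, 1 = worst).
READABILITY_SCALE = {
    6: "Very easy to read",
    5: "Easy to read",
    4: "Can be read without difficulty",
    3: "Somewhat difficult to read, but readable",
    2: "Hardly readable",
    1: "Unable to read",
}


def is_valid_rating(rating):
    """Validate a rating entered by the subject via the input interface."""
    return rating in READABILITY_SCALE
```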
- While the visibility of the identifier If transitions from the reference state to the target state, the subject S evaluates which of the six criteria the appearance of the identifier If corresponds to, and inputs the evaluation result to the information acquisition unit 34.
- The evaluation result may be input to the information acquisition unit 34 at any timing: during the transition of the visibility of the identifier If, when the transition is completed, or between the completion of the transition and the end of the display of the identifier If.
- In this way, the information acquisition unit 34 can obtain more accurate visibility information.
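- The flow of steps S10 to S40 described above can be sketched as an event sequence. All class, function, and parameter names below are illustrative scaffolding, not the patent's API; real timing (the 2 to 5 second standby, the alert output) is reduced to recorded events:

```python
class StubDisplay:
    """Stand-in for display unit 2 / 2' (illustrative only)."""
    def show_first_identifier(self, mark): pass      # S10
    def show_second_identifier(self, blurred): pass  # S20
    def transition_to_target(self): pass             # S30


class StubNotifier:
    """Stand-in for notification unit 33 (illustrative only)."""
    def wait(self, seconds): pass   # real code would actually wait
    def alert(self): pass           # real code would play the alert sound


def run_acquisition_trial(display, notifier, get_rating, standby_s=3.0):
    """One visibility-information acquisition trial (steps S10-S40)."""
    events = []
    display.show_first_identifier("+")            # S10: fixation index at Pn
    events.append("S10")
    notifier.wait(standby_s)                      # 2-5 s standby time
    display.show_second_identifier(blurred=True)  # S20: If in reference state
    notifier.alert()                              # prompt viewpoint movement
    events.append("S20")
    display.transition_to_target()                # S30: apply transition condition
    events.append("S30")
    rating = get_rating()                         # S40: six-level evaluation
    events.append("S40")
    return events, rating


events, rating = run_acquisition_trial(StubDisplay(), StubNotifier(),
                                       get_rating=lambda: 5)
```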
- As described above, the transition conditions include the display time T during which the display of the second identifier is continued, the delay time L from the completion of the subject S's viewpoint movement to the start of the visibility transition of the second identifier, the transition time Δ required for the visibility of the second identifier to transition from the reference state to the target state, and the time constant τ that determines the temporal change in the degree of increase when the visibility of the second identifier is raised.
- As the display time T, for example, a time selected from four values of 1.0 seconds, 0.5 seconds, 0.35 seconds, and 0.25 seconds can be set as one of the transition conditions.
- The delay time L, the transition time Δ, and the time constant τ can be set appropriately in consideration of conditions related to the subject S, such as age, sex, visual acuity, eye health, degree of eye opening, and dominant eye, as well as conditions related to the environment of the real space or virtual space.
- FIG. 6 is a graph showing an example of the temporal change in visibility of the second identifier (identifier If) whose display state is controlled according to the transition condition when the viewpoint moves in the moving-away direction.
- the vertical axis of the graph in FIG. 6 represents the state of visibility V
- the horizontal axis represents time t.
- the state of the visibility V on the vertical axis increases as the distance from the intersection (origin) with the horizontal axis increases.
- the solid line in the graph of FIG. 6 represents temporal changes in the visibility of the second identifier whose display state is controlled according to the transition condition.
- the graph of FIG. 6 shows an example in which the value of the time constant ⁇ , which is one of the transition conditions, is changed to three values of 0.1, 0.2, and 0.3.
- the transition time ⁇ is the time required to complete the transition to the target state V1 after the delay time L has elapsed.
- the curve C over the transition time α represents the temporal change in the visibility of the second identifier when the visibility of the second identifier is raised to the target state V1. By expressing the curve C with the following equation (1), it is possible to approximate the temporal change of the focal length due to the focus adjustment function of the eyes of the subject S:
F = 1 / ( Do + ( Dt − Do ) × ( 1 − e^(−t/τ) ) ) … (1)
- t represents time [s] after the start of transition.
- F represents the focal length [m] at time t.
- Do represents a diopter that is the reciprocal of the focal length at the start of viewpoint movement.
- Dt represents a diopter that is the reciprocal of the focal length at the end of the viewpoint movement.
- e represents Napier's number (the base of natural logarithms).
- ⁇ represents a time constant.
- when the viewpoint moves in the receding direction as shown in FIG. 5, the distance Dn from the subject S to the near position Pn can be used as the focal length at the start of the viewpoint movement, and the distance Df from the subject S to the far position Pf can be used as the focal length at the end of the viewpoint movement.
- in this case, Do in the above equation (1) means 1/Dn, and Dt means 1/Df.
- the focal length F in the above equation (1) corresponds to the state of visibility V at time t.
- the time constant ⁇ is set according to the transition condition, and the length of the transition time ⁇ (the shape of the curve C) changes according to the time constant ⁇ .
- the dashed line in the graph of FIG. 6 represents the temporal change in visibility corresponding to the blurring process applied to the virtual space image in the conventional technology as described above.
- the blurring process applied to the virtual space image is executed at a speed corresponding to the performance of hardware responsible for image processing. Therefore, the transition from the reference state V0 of visibility to the target state V1 is completed in a short period of time substantially simultaneously with the movement of the viewpoint.
- as one of the transition conditions, for example, a condition may be set as to whether the acquisition of visibility information is performed in the real space as shown in FIG. 2 or in the virtual space as shown in FIG. 3.
- in addition, the presence or absence of the blurring process and the presence or absence of the delay time L may each be set as one of the transition conditions.
- in step S50, the information acquisition unit 34 stores the acquired visibility information of the identifier If in the storage unit 31 in association with the corresponding transition condition. As a result, information (visibility information) about how the identifier If appears when the visibility state of the identifier If is changed according to the transition condition is accumulated in the storage unit 31 .
- in step S60, the condition changing unit 35 changes the transition condition stored in the storage unit 31 based on the visibility information of the identifier If acquired by the information acquiring unit 34.
- the transition condition is changed, for example, such that when the visibility information of the identifier If acquired by the information acquisition unit 34 corresponds to evaluation criterion 2 described above, the display time T, the delay time L, and the like set as the transition conditions are changed to different values so that the visibility information acquired from the next iteration onward improves.
- thereafter, the process returns to step S10, and the above series of processes is repeatedly executed.
- here, an example of changing the transition condition based on the acquired visibility information has been described; however, visibility information may instead be acquired while keeping the transition condition unchanged.
- FIG. 7 is a flowchart illustrating an example of visibility information acquisition processing when the viewpoint moves in the direction of approaching.
- FIG. 8 is a diagram showing an example of an identifier displayed on the display unit 2' (virtual space image VR) when the viewpoint moves in the approaching direction.
- in step S110 of FIG. 7, an identifier If (first identifier) using a fixation index of "+" is displayed in the second region 22' on the virtual space image VR.
- the upper part of FIG. 8 shows a state in which the identifier If is displayed in the second area 22' on the virtual space image VR by the process of step S110.
- at this stage, the identifier In is not displayed on the tablet terminal 21 of the display unit 2 or in the first area 21' on the virtual space image VR of the display unit 2'. Since only the identifier If is displayed, the subject S starts to gaze at the identifier If, and a state is formed in which the subject S is focused on the far position Pf (first position).
- in step S120, the display control unit 32 displays an identifier In (second identifier) using a random index on the tablet terminal 21 of the display unit 2 or in the first region 21' on the virtual space image VR of the display unit 2', and the notification unit 33 outputs an alert sound prompting the subject S to move the viewpoint.
- as in the case of moving the viewpoint in the receding direction, the display of the identifier In and the output of the alert sound in step S120 are preferably performed after waiting for a predetermined time from the display of the identifier If in step S110.
- the middle part of FIG. 8 shows a state in which an identifier In using a random index of "0" is displayed in the first area 21' on the virtual space image VR by the process of step S120.
- at this stage, the identifier In is displayed with its visibility in the reference state, that is, an out-of-focus, blurred state resulting from the blurring process.
- the subject S moves the viewpoint from the far position Pf to the near position Pn in response to the alert sound, and visually recognizes the identifier In using the random index displayed at the near position Pn.
- in step S130, the display control unit 32 controls the display state of the identifier In according to the transition conditions stored in the storage unit 31, and causes the visibility of the identifier In to transition from the reference state to the target state (in-focus state). The lower part of FIG. 8 shows a state in which the identifier In, having reached the target state after the visibility transition is completed, is displayed in the first area 21' on the virtual space image VR.
- in step S140, the information acquisition unit 34 acquires from the subject S the visibility information of the identifier In during the transition of the visibility of the identifier In from the reference state to the target state.
- the acquisition of the visibility information of the identifier In by the information acquisition unit 34 is performed in the same manner as the acquisition of the visibility information of the identifier If when the viewpoint moves in the receding direction, as described above.
- in this case, Do in the above equation (1) means 1/Df, and Dt means 1/Dn.
- in step S150, the information acquisition unit 34 stores the acquired visibility information of the identifier In in the storage unit 31 in association with the corresponding transition condition. As a result, information (visibility information) about how the identifier In appears when the visibility state of the identifier In is changed according to the transition condition is accumulated in the storage unit 31 .
- in step S160, the condition changing unit 35 changes the transition condition stored in the storage unit 31 based on the visibility information of the identifier In acquired by the information acquiring unit 34.
- thereafter, the process returns to step S110, and the above series of processes is repeatedly executed.
- the series of processes for moving the viewpoint in the receding direction (steps S10 to S60 in FIG. 4) and the series of processes for moving the viewpoint in the approaching direction (steps S110 to S160 in FIG. 7) may also be executed alternately and repeatedly. In addition, a plurality of sets with mutually different transition conditions may be executed repeatedly in a predetermined order or randomly. For example, if sets for the two movement directions are executed alternately, both the visibility information for the case where the viewpoint moves in the receding direction and the visibility information for the case where it moves in the approaching direction can be acquired, which makes it easier to evaluate a desired transition condition.
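The alternating execution of receding and approaching sets under several transition conditions could be organized as below. The function names, the dictionary-based condition records, and the way a set reports the subject's rating are assumptions for illustration, not the embodiment's actual control flow.

```python
import itertools
import random

def run_set(direction, condition, get_rating):
    """Run one set: display the identifiers for `direction` ('receding' or
    'approaching') under `condition`, then collect the subject's six-level
    rating via `get_rating` and return it with its context."""
    rating = get_rating(direction, condition)  # 1 (unable to read) .. 6 (very easy)
    return {"direction": direction, "condition": condition, "rating": rating}

def run_session(conditions, repeats, get_rating, shuffle=False):
    """Alternate receding/approaching sets for each transition condition;
    conditions may be visited in a fixed order or randomly."""
    log = []
    for _ in range(repeats):
        order = list(conditions)
        if shuffle:
            random.shuffle(order)
        for cond, direction in itertools.product(order, ("receding", "approaching")):
            log.append(run_set(direction, cond, get_rating))
    return log

# Example with a stub rater that always answers 4 ("can be read without difficulty"):
log = run_session(conditions=[{"tau": 0.1}, {"tau": 0.3}], repeats=1,
                  get_rating=lambda d, c: 4)
print([entry["direction"] for entry in log])
# ['receding', 'approaching', 'receding', 'approaching']
```

Accumulating the returned log per condition corresponds to the storage of visibility information in the storage unit 31 described above.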
- the visibility information acquired by the visibility information acquisition device 1 of the present embodiment as described above, together with the associated transition conditions, can be used, for example, in a driving simulator system employed for visibility evaluation of various objects in the development of vehicles such as automobiles, or for simulated driving experiences; specifically, it can be used for the display control of virtual space images performed when the user's viewpoint moves.
- FIG. 9 shows an example in which transition conditions suitable for a driving simulator system are determined based on the visibility information acquired by the visibility information acquisition device 1 of the present embodiment, and display control of a virtual space image is performed according to those transition conditions.
- the virtual space image shown in FIG. 9 is an image displayed on the HMD or the like worn by the user, and expresses the scene in the virtual space that the user can see from the driver's seat of the vehicle.
- the virtual space image includes, as objects representing the vehicle, the upper part of the steering wheel, the upper part of the dashboard, the right front pillar, the front edge of the roof, the rearview mirror, the right side mirror, and the like.
- the number "8" displayed at the bottom center of the virtual space image is an object for evaluating visibility near the upper end of the steering wheel.
- the virtual space image also includes roads, sidewalks, buildings, road signs (stop), etc. as objects representing stationary objects outside the vehicle.
- the user's viewpoint (x mark) is at the near position Pn on the number "8" displayed near the upper end of the steering wheel.
- Objects located inside the enclosed viewpoint peripheral area An are in focus, and objects located outside the viewpoint peripheral area An are out of focus and blurred.
- the white arrow in the figure indicates the direction in which the user's viewpoint moves; the viewpoint moves from the near position Pn to the far position Pf. Note that the x mark indicating the user's viewpoint is not displayed in the actual virtual space image.
- when the user's viewpoint moves, the display state of the objects in the viewpoint peripheral area Af at the movement destination is controlled according to the transition condition determined based on the visibility information acquired by the visibility information acquisition device 1, and the visibility of those objects transitions from a blurred state (reference state) to a focused state (target state).
- the virtual space image shown in the lower part of FIG. 9 shows a state in which visibility transition has been completed.
- conversely, the display state of the objects in the viewpoint peripheral area An at the movement source is controlled so that their visibility transitions from the in-focus state to the blurred state, opposite to the transition of the visibility of the objects in the viewpoint peripheral area Af at the movement destination.
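In a driving-simulator renderer, the determined transition condition could drive per-frame blur strength for the two viewpoint areas: hold full blur during the delay L, then reduce it exponentially with time constant τ for the destination area Af, and apply the opposite transition to the source area An. This is a sketch under assumed function names and a normalized blur scale, not the embodiment's actual rendering code.

```python
import math

def blur_strength(t_since_move, delay, tau, max_blur=1.0):
    """Blur applied to objects in the destination area Af, as a fraction of
    max_blur: held unchanged during the delay L, then decaying toward 0
    (in focus) with time constant tau."""
    if t_since_move < delay:
        return max_blur  # reference state held during the delay time L
    t = t_since_move - delay
    return max_blur * math.exp(-t / tau)  # transition toward the target state

def source_blur(t_since_move, delay, tau, max_blur=1.0):
    """Opposite transition for the source area An: sharp -> blurred."""
    return max_blur - blur_strength(t_since_move, delay, tau, max_blur)

print(round(blur_strength(0.05, delay=0.1, tau=0.2), 2))  # 1.0 (still within L)
print(round(blur_strength(1.1, delay=0.1, tau=0.2), 2))   # ~0.01 (nearly in focus)
```

Evaluating these per frame against the time elapsed since the viewpoint movement reproduces the delayed, gradual focus transition described above, rather than the near-instant blur switch of the conventional technique.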
- by determining the transition condition based on the visibility information acquired by the visibility information acquisition device 1 of the present embodiment as described above and performing display control of the virtual space image in the driving simulator system according to that transition condition, it becomes possible in vehicle development and the like to accurately evaluate the real-space visibility of an object displayed in a virtual space image. Also, when a simulated vehicle driving experience is provided using the driving simulator system, a more realistic driving experience can be offered to the user.
- as described above, in the visibility information acquisition device 1 of the present embodiment, the display control unit 32 displays the first identifier at the first position in front of the subject S, displays the second identifier at the second position spaced from the first position in the front-rear direction, and causes the viewpoint of the subject S to move from the first position to the second position.
- the display control unit 32 then controls the display state of the second identifier according to the transition condition to cause the visibility of the second identifier to transition from the reference state to the target state, and the information acquisition unit 34 acquires from the subject S the visibility information of the second identifier during the transition.
- a series of processes from displaying the first identifier to acquiring visibility information is set as one set, and multiple sets are repeatedly executed under mutually different transition conditions. This makes it possible to easily acquire visibility information corresponding to various transition conditions.
- in each set, the display time T of the second identifier is set within 1 second. This reflects the fact that, when a person moves the viewpoint, the time required to focus on the movement destination is approximately one second or less.
- furthermore, the transition condition is changed based on the acquired visibility information. This makes it possible to set various transition conditions efficiently and to acquire a wide range of visibility information more easily.
- in the visibility information acquisition device 1 of the present embodiment, if the first and second identifiers are displayed on first and second display devices (the tablet terminal 21 and the display 22), a virtual space display device such as an HMD becomes unnecessary, and visibility information can be obtained at low cost. Further, if a notification (output of an alert sound) prompting the subject to move the viewpoint is performed, the viewpoint of the subject S can be reliably moved, making it possible to obtain more accurate visibility information.
- the present invention is not limited to the above-described embodiments, and various modifications and changes are possible based on the technical idea of the present invention.
- for example, in the above-described embodiment, an example has been described in which the visibility of the second identifier is maintained in the reference state during the period from the completion of the movement of the viewpoint of the subject S until the delay time L elapses; however, the display state of the second identifier may be controlled so that its visibility is slightly increased during the delay time L.
- in the above-described embodiment, the transition of the visibility of the second identifier from the reference state to the target state is performed according to the function of equation (1); however, the state transition of the visibility of the second identifier may instead be performed according to a map or the like that associates elapsed time with focal length.
- in the above-described embodiment, the diopters Do and Dt in equation (1) are the reciprocals of the distances Dn and Df; instead, the reciprocal of the distance (focal length) from the subject S to the identifier In displayed at the near position Pn and the reciprocal of the distance (focal length) from the subject S to the identifier If displayed at the far position Pf may be used as the diopters Do and Dt. In this case, the degree of change in the visibility of the second identifier can be varied according to the difference between the focal length at the start of the viewpoint movement and the focal length at the end of the viewpoint movement.
- in the above-described embodiment, the functions (blocks) of the storage unit 31, the display control unit 32, the notification unit 33, the information acquisition unit 34, and the condition change unit 35 are implemented in one information processing unit 3 (computer system); however, the entity (control device) that implements each of the above functions is not limited to a single device, and each of the functions described above may be distributed over and implemented by a plurality of information processing apparatuses.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Computer Graphics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Computing Systems (AREA)
- Geometry (AREA)
- Processing Or Creating Images (AREA)
- Controls And Circuits For Display Device (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
FIG. 1 is a conceptual diagram showing the configuration of a visibility information acquisition device according to one embodiment of the present invention.
In FIG. 1, the visibility information acquisition device 1 of the present embodiment includes, for example, a display unit 2 and an information processing unit 3. The display unit 2 can display a first identifier at a first position in front of a subject S seated at a predetermined position, and can display a second identifier at a second position spaced from the first position in the front-rear direction. Each of the first position and the second position includes not only a position in real space but also a position in virtual space. The display unit 2 of the present embodiment may be realized, for example, by arranging two display devices in real space, or by setting two display areas in virtual space using a head-mounted display (HMD) or the like.
In FIG. 2, the display unit 2 has a tablet terminal 21 arranged at a near position (hereinafter, "near position") Pn in front of the subject S, and a display 22 arranged at a far position (hereinafter, "far position") Pf in front of the subject S. The tablet terminal 21 and the display 22 are each arranged roughly in front of the subject S. The tablet terminal 21 is installed obliquely below and in front of the subject S with its screen facing the subject S. An identifier In is displayed in a region located at approximately the center of the screen of the tablet terminal 21. The display 22 is installed at a height that does not overlap the tablet terminal 21 within the field of view of the subject S. An identifier If is displayed in a region located at approximately the center of the screen of the display 22. In the present embodiment, the distance Dn (FIG. 1) from the subject S to the tablet terminal 21 is set to, for example, 0.2 m, and the distance Df (FIG. 1) from the subject S to the display 22 is set to, for example, 5 m. However, the arrangement of the tablet terminal 21 and the display 22 is not limited to this example.
In FIG. 3, the display unit 2' is realized on a virtual space image VR displayed on an HMD worn on the head of the subject S. In the virtual space image VR, a first area 21' is formed on the lower side of the center in the left-right direction, and a second area 22' is formed on the upper side of the center in the left-right direction. The first area 21' and the second area 22' are set so as to be spaced apart in the depth direction (the front-rear direction as seen from the subject S) in the virtual space. The first area 21' virtually realizes the screen of the tablet terminal 21 in the real space shown in FIG. 2, and the second area 22' virtually realizes the screen of the display 22 in the real space shown in FIG. 2. Accordingly, in the virtual space image VR displayed on the HMD, information corresponding to the near position Pn (distance Dn from the subject S) is set as the depth information of the first area 21', and information corresponding to the far position Pf (distance Df from the subject S) is set as the depth information of the second area 22'. The identifier In is displayed at approximately the center of the first area 21', and the identifier If is displayed at approximately the center of the second area 22'.
FIG. 4 is a flowchart showing an example of the visibility information acquisition process when the viewpoint moves in the receding direction. FIG. 5 is a diagram showing an example of the identifiers displayed on the display unit 2' (virtual space image VR) when the viewpoint moves in the receding direction.
6: Very easy to read
5: Easy to read
4: Can be read without difficulty
3: Somewhat difficult to read but readable
2: Barely readable
1: Unable to read
While the visibility of the identifier If transitions from the reference state to the target state, the subject S evaluates which of the above six evaluation criteria the appearance of the identifier If corresponds to, and inputs the evaluation result to the information acquisition unit 34. The evaluation result may be input to the information acquisition unit 34 at any timing: during the transition of the visibility of the identifier If, when the transition is completed, or between the completion of the transition and the end of the display of the identifier If. However, if the result of evaluating the visibility of the identifier If at least from the start of the transition to its completion is input to the information acquisition unit 34, the information acquisition unit 34 can obtain more accurate visibility information.
In equation (1) above, t represents the time [s] after the start of the transition. F represents the focal length [m] at time t. Do represents the diopter that is the reciprocal of the focal length at the start of the viewpoint movement. Dt represents the diopter that is the reciprocal of the focal length at the end of the viewpoint movement. e represents Napier's number (the base of natural logarithms). τ represents a time constant. When the viewpoint moves in the receding direction as shown in FIG. 5, the distance Dn from the subject S to the near position Pn can be used as the focal length at the start of the viewpoint movement, and the distance Df from the subject S to the far position Pf can be used as the focal length at the end of the viewpoint movement. In this case, Do in equation (1) means 1/Dn, and Dt means 1/Df.
FIG. 7 is a flowchart showing an example of the visibility information acquisition process when the viewpoint moves in the approaching direction. FIG. 8 is a diagram showing an example of the identifiers displayed on the display unit 2' (virtual space image VR) when the viewpoint moves in the approaching direction.
2, 2'…display unit
3…information processing unit
21…tablet terminal
21'…first area of the virtual space image
22…display
22'…second area of the virtual space image
31…storage unit
32…display control unit
33…notification unit
34…information acquisition unit
35…condition change unit
An, Af…viewpoint peripheral areas
L…delay time
Pn…near position
Pf…far position
S…subject
V0…reference state of visibility
V1…target state of visibility
VR…virtual space image
α…transition time
τ…time constant
Claims (7)
- A control method for a visibility information acquisition device that displays a first identifier at a first position in front of a subject, displays a second identifier at a second position spaced from the first position in the front-rear direction, and acquires visibility information of the second identifier when the viewpoint of the subject is moved from the first position to the second position, the control method comprising:
a first display step of displaying the first identifier at the first position so that the subject is brought into a state of being focused on the first position;
a second display step of displaying the second identifier at the second position and causing the viewpoint of the subject to move from the first position to the second position;
a control step of controlling a display state of the second identifier according to a preset transition condition to cause the visibility of the second identifier to transition from a reference state to a target state different from the reference state; and
an acquisition step of acquiring, from the subject, visibility information of the second identifier during the transition of the visibility of the second identifier from the reference state to the target state.
- The control method for a visibility information acquisition device according to claim 1, wherein the first display step, the second display step, the control step, and the acquisition step are treated as one set, a plurality of sets are repeatedly executed, and the transition condition in the control step differs between the plurality of sets.
- The control method for a visibility information acquisition device according to claim 2, wherein in each of the plurality of sets, the display time of the second identifier is set within 1 second.
- The control method for a visibility information acquisition device according to claim 2, further comprising a step of changing, based on the visibility information acquired in the acquisition step, the transition condition used in the control step of the next set.
- The control method for a visibility information acquisition device according to claim 1, wherein the first display step displays the first identifier on a first display device arranged at the first position, and the second display step displays the second identifier on a second display device arranged at the second position and performs a notification prompting the subject to move the viewpoint.
- A visibility information acquisition device that displays a first identifier at a first position in front of a subject, displays a second identifier at a second position spaced from the first position in the front-rear direction, and acquires visibility information of the second identifier when the viewpoint of the subject is moved from the first position to the second position, the device comprising:
a display unit capable of displaying the first identifier at the first position and displaying the second identifier at the second position;
a storage unit that stores a transition condition for causing the visibility of the second identifier to transition from a reference state to a target state different from the reference state;
a display control unit that causes the display unit to display the second identifier after displaying the first identifier, and controls a display state of the second identifier according to the transition condition stored in the storage unit;
a notification unit that prompts the subject to move the viewpoint; and
an information acquisition unit that acquires, from the subject, visibility information of the second identifier during the transition of the visibility of the second identifier from the reference state to the target state.
- The visibility information acquisition device according to claim 6, further comprising a condition change unit that changes the transition condition stored in the storage unit based on the visibility information acquired by the information acquisition unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280064035.4A CN117980987A (zh) | 2021-09-30 | 2022-03-22 | 视认性信息取得装置的控制方法和视认性信息取得装置 |
KR1020247007968A KR20240035929A (ko) | 2021-09-30 | 2022-03-22 | 시인성 정보 취득 장치의 제어 방법 및 시인성 정보 취득 장치 |
US18/695,238 US20240274043A1 (en) | 2021-09-30 | 2022-03-22 | Control method for visibility information acquisition device and visibility information acquisition device |
EP22875400.8A EP4411718A1 (en) | 2021-09-30 | 2022-03-22 | Control method for visibility information acquisition device and visibility information acquisition device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-161259 | 2021-09-30 | ||
JP2021161259A JP2023050904A (ja) | 2021-09-30 | 2021-09-30 | 視認性情報取得装置の制御方法および視認性情報取得装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023053517A1 true WO2023053517A1 (ja) | 2023-04-06 |
Family
ID=85782155
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/012970 WO2023053517A1 (ja) | 2021-09-30 | 2022-03-22 | 視認性情報取得装置の制御方法および視認性情報取得装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20240274043A1 (ja) |
EP (1) | EP4411718A1 (ja) |
JP (1) | JP2023050904A (ja) |
KR (1) | KR20240035929A (ja) |
CN (1) | CN117980987A (ja) |
WO (1) | WO2023053517A1 (ja) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH045948A (ja) * | 1990-04-24 | 1992-01-09 | Nissan Motor Co Ltd | 認知機能測定装置 |
JP2000245697A (ja) * | 1999-03-04 | 2000-09-12 | Nippon Hoso Kyokai <Nhk> | 視機能測定装置および視機能測定用記憶媒体 |
JP2014130204A (ja) * | 2012-12-28 | 2014-07-10 | Seiko Epson Corp | 表示装置、表示システム、および、表示装置の制御方法 |
JP2017059196A (ja) * | 2015-12-22 | 2017-03-23 | 株式会社コロプラ | 仮想現実空間映像表示方法、及び、プログラム |
JP2017138701A (ja) | 2016-02-02 | 2017-08-10 | 株式会社コロプラ | 仮想空間画像提供方法、及びそのプログラム |
US20200311881A1 (en) * | 2018-01-23 | 2020-10-01 | Facebook Technologies, Llc | Systems and Methods for Generating Defocus Blur Effects |
CN112806952A (zh) * | 2020-12-31 | 2021-05-18 | 北京大学第三医院(北京大学第三临床医学院) | 一种动态离焦曲线测试系统及其测试方法 |
JP2021514716A (ja) * | 2018-02-23 | 2021-06-17 | エシロール・アンテルナシオナル | 被検者の視機能を変化させるための方法、被検者の球面屈折矯正の必要性を測定するための方法、及びこれらの方法を実施するための光学システム |
-
2021
- 2021-09-30 JP JP2021161259A patent/JP2023050904A/ja active Pending
-
2022
- 2022-03-22 EP EP22875400.8A patent/EP4411718A1/en active Pending
- 2022-03-22 US US18/695,238 patent/US20240274043A1/en active Pending
- 2022-03-22 KR KR1020247007968A patent/KR20240035929A/ko unknown
- 2022-03-22 WO PCT/JP2022/012970 patent/WO2023053517A1/ja active Application Filing
- 2022-03-22 CN CN202280064035.4A patent/CN117980987A/zh active Pending
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH045948A (ja) * | 1990-04-24 | 1992-01-09 | Nissan Motor Co Ltd | 認知機能測定装置 |
JP2000245697A (ja) * | 1999-03-04 | 2000-09-12 | Nippon Hoso Kyokai <Nhk> | 視機能測定装置および視機能測定用記憶媒体 |
JP2014130204A (ja) * | 2012-12-28 | 2014-07-10 | Seiko Epson Corp | 表示装置、表示システム、および、表示装置の制御方法 |
JP2017059196A (ja) * | 2015-12-22 | 2017-03-23 | 株式会社コロプラ | 仮想現実空間映像表示方法、及び、プログラム |
JP2017138701A (ja) | 2016-02-02 | 2017-08-10 | 株式会社コロプラ | 仮想空間画像提供方法、及びそのプログラム |
US20200311881A1 (en) * | 2018-01-23 | 2020-10-01 | Facebook Technologies, Llc | Systems and Methods for Generating Defocus Blur Effects |
JP2021514716A (ja) * | 2018-02-23 | 2021-06-17 | エシロール・アンテルナシオナル | 被検者の視機能を変化させるための方法、被検者の球面屈折矯正の必要性を測定するための方法、及びこれらの方法を実施するための光学システム |
CN112806952A (zh) * | 2020-12-31 | 2021-05-18 | 北京大学第三医院(北京大学第三临床医学院) | 一种动态离焦曲线测试系统及其测试方法 |
Also Published As
Publication number | Publication date |
---|---|
KR20240035929A (ko) | 2024-03-18 |
JP2023050904A (ja) | 2023-04-11 |
US20240274043A1 (en) | 2024-08-15 |
CN117980987A (zh) | 2024-05-03 |
EP4411718A1 (en) | 2024-08-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9771083B2 (en) | Cognitive displays | |
EP3691926A1 (en) | Display system in a vehicle | |
Pfannmüller et al. | A comparison of display concepts for a navigation system in an automotive contact analog head-up display | |
JP2017030737A (ja) | 車両用表示装置、および車両用表示方法 | |
JP2021140785A (ja) | 注意に基づいた通知 | |
EP3575926A1 (en) | Method and eye tracking system for providing an approximate gaze convergence distance of a user | |
CN112384883A (zh) | 可穿戴设备及其控制方法 | |
US11915340B2 (en) | Generating a display of an augmented reality head-up display for a motor vehicle | |
CN113924518A (zh) | 控制机动车的增强现实平视显示器装置的显示内容 | |
Sudkamp et al. | The role of eye movements in perceiving vehicle speed and time-to-arrival at the roadside | |
WO2023053517A1 (ja) | 視認性情報取得装置の制御方法および視認性情報取得装置 | |
KR20230034448A (ko) | 차량 및 그 제어 방법 | |
WO2023053515A1 (ja) | 仮想空間画像生成装置および方法 | |
CN106095375B (zh) | 显示控制方法和装置 | |
JP2007280203A (ja) | 情報提示装置、自動車、及び情報提示方法 | |
JP7261370B2 (ja) | 情報処理装置、情報処理システム、情報処理方法、および、コンピュータプログラム | |
WO2023053516A1 (ja) | 仮想空間画像生成装置および方法 | |
JP2021056358A (ja) | ヘッドアップディスプレイ装置 | |
KR20210025765A (ko) | 차량 디스플레이 제어 장치, 그를 포함한 시스템 및 그 방법 | |
WO2021200913A1 (ja) | 表示制御装置、画像表示装置、及び方法 | |
JP2021124972A (ja) | 情報処理システム、および情報処理プログラム | |
JP2017102331A (ja) | 車両用ヘッドアップディスプレイの評価支援装置 | |
CN113360109A (zh) | 车载显示方法、设备及车辆 | |
KR20230171396A (ko) | 헤드업 디스플레이를 위한 고스트 이미지 완화 | |
CN118778255A (zh) | 图像处理方法、抬头显示装置、计算机可读介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22875400 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20247007968 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280064035.4 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18695238 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202417026309 Country of ref document: IN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022875400 Country of ref document: EP Effective date: 20240430 |