WO2024029680A1 - Method and device for HUD calibration using a camera inside a vehicle - Google Patents

Method and device for HUD calibration using a camera inside a vehicle

Info

Publication number
WO2024029680A1
WO2024029680A1 (PCT/KR2023/003612)
Authority
WO
WIPO (PCT)
Prior art keywords
image
measurement device
virtual image
pattern
measuring
Prior art date
Application number
PCT/KR2023/003612
Other languages
English (en)
Korean (ko)
Inventor
정영주
Original Assignee
숙명여자대학교산학협력단
Priority date
Filing date
Publication date
Application filed by 숙명여자대학교산학협력단
Publication of WO2024029680A1


Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0025 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for optical correction, e.g. distorsion, aberration
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/011 Head-up displays characterised by optical features comprising device for correcting geometrical aberrations, distortion
    • G02B2027/0132 Head-up displays characterised by optical features comprising binocular systems
    • G02B2027/0134 Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/50 Depth or shape recovery
    • G06T7/70 Determining position or orientation of objects or cameras

Definitions

  • the present invention relates to a HUD calibration method and device using an automobile interior camera, and more specifically, to a method and device for measuring optical characteristics of a three-dimensional virtual image generated by an augmented reality device.
  • Augmented reality (AR) is a field of virtual reality: a computer graphics technique that synthesizes virtual objects or information into the real environment so that they appear to be objects existing in that environment. It is frequently used in digital media.
  • Augmented reality is also called mixed reality (MR) because it combines the real world with a virtual world carrying additional information in real time and displays them as a single image.
  • Augmented reality can be used in remote medical diagnosis, broadcasting, architectural design, and manufacturing process management.
  • With the recent widespread adoption of smartphones, augmented reality has entered a full-fledged commercialization phase, and various products are being developed in the game and mobile solution industries as well as in the education field.
  • A wearable computer, such as a head mounted display (HMD), may be used to realize augmented reality outdoors, and a head up display (HUD) may be used to realize it inside a vehicle.
  • A head mounted display enables augmented reality by displaying computer graphics and text in real time over the actual environment seen by the user.
  • A head up display (HUD) implements augmented reality by displaying various information necessary for vehicle operation on the outside of the vehicle's windshield.
  • A head-up display projects light output from inside the vehicle onto a virtual plane located outside the windshield of the vehicle, so that the driver can view information necessary for operation on that virtual plane without moving his or her eyes while driving, thereby implementing augmented reality.
  • Geometric characteristics, including the position of the virtual plane formed by the augmented reality device, are determined by the optical characteristics of the individual augmented reality device, such as an HMD or HUD.
  • the present invention seeks to provide a method and device for measuring optical characteristics of a virtual image generated by an augmented reality device.
  • The present invention also seeks to provide a method and device that use the optical characteristics of the virtual image generated by the augmented reality device to calculate, with respect to the user of the augmented reality device, the distance to the virtual image, the lookdown/up angle of the virtual image, the horizontal/vertical field of view, static distortion, the ghosting level, etc.
  • The method for measuring the optical characteristics of an augmented reality device includes: capturing, using a plurality of cameras arranged around a predetermined measurement reference position, a test image including a plurality of patterns output on a virtual plane by the augmented reality device; obtaining angle of view information including information about the angle of view of the plurality of cameras and arrangement information including information about the arrangement of the plurality of cameras; and calculating coordinates of the plurality of patterns with respect to the measurement reference position, based on the plurality of captured images captured by the plurality of cameras, the angle of view information, and the arrangement information.
  • The plurality of cameras include a central camera located at the measurement reference position, and a left camera and a right camera located symmetrically on either side of the measurement reference position, and the plurality of patterns are aligned horizontally and vertically in the test image.
  • In the step of calculating the coordinates of the plurality of patterns, the coordinates of the plurality of patterns may be calculated using the number of horizontal pixels of the plurality of captured images, the coordinates of the plurality of patterns in the plurality of captured images, the angle of view of the plurality of cameras included in the angle of view information, and the distance between the left camera and the right camera included in the arrangement information.
  • the coordinates of the plurality of patterns may be calculated using Equation 1.
  • In Equation 1, x_ij, y_ij, and z_ij are the x-, y-, and z-axis coordinates, with respect to the measurement reference position, of the pattern located i-th horizontally and j-th vertically; the separation term is the distance between the left camera and the right camera; M is the number of horizontal pixels of the plurality of captured images; the angle term is the angle of view of the plurality of cameras; and m^L_ij, m^R_ij, and m^C_ij are the horizontal pixel coordinates of that pattern in the captured images of the left camera, the right camera, and the center camera, respectively.
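  • Equation 1 itself appears only as an image in the original publication and is not reproduced in this text. For reference, a plausible reconstruction consistent with the definitions above, assuming pinhole cameras with square pixels, a camera separation denoted Δ and a horizontal angle of view denoted θ (both symbols are placeholders, since the patent's own symbols are not legible here), and with N and n^C_ij the vertical pixel count and the pattern's vertical pixel coordinate in the center image, is:

```latex
% Hypothetical reconstruction of Equation 1 (standard stereo triangulation), not verbatim from the patent.
z_{ij} = \frac{\Delta \, M}{2\tan(\theta/2)\,\left(m^{L}_{ij}-m^{R}_{ij}\right)}, \qquad
x_{ij} = \frac{2\tan(\theta/2)}{M}\left(m^{C}_{ij}-\frac{M}{2}\right) z_{ij}, \qquad
y_{ij} = \frac{2\tan(\theta/2)}{M}\left(\frac{N}{2}-n^{C}_{ij}\right) z_{ij}
```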
  • calculating a virtual image distance between the measurement reference position and the virtual plane using the coordinates of the measurement reference position and the coordinates of at least one of the plurality of patterns on the virtual plane may be included.
  • the virtual image distance may be calculated using Equation 2.
  • D VI is the virtual image distance
  • x 22 , y 22 , and z 22 are the coordinates of one of the plurality of patterns.
  • The method may further include calculating a look down/up angle from the measurement reference position toward the virtual plane, using the coordinates of the measurement reference position and the coordinates of at least one of the plurality of patterns on the virtual plane.
  • the lookdown/up angle may be calculated using Equation 3.
  • θ_down/up is the lookdown/up angle
  • x 22 , y 22 , and z 22 are the coordinates of one of the plurality of patterns.
  • The method may further include calculating the horizontal field of view at the measurement reference position, using the coordinates of the measurement reference position and the coordinates of the two patterns located at both ends in the horizontal direction among the plurality of patterns on the virtual plane.
  • the horizontal viewing angle may be calculated using Equation 4.
  • θ_HFOV is the horizontal viewing angle
  • O is the coordinate of the measurement reference position
  • P 21 and P 23 are the coordinates of the two patterns located at both ends in the horizontal direction.
  • The method may further include calculating the vertical field of view at the measurement reference position, using the coordinates of the measurement reference position and the coordinates of the two patterns located at both ends in the vertical direction among the plurality of patterns on the virtual plane.
  • the vertical viewing angle may be calculated using Equation 5.
  • θ_VFOV is the vertical viewing angle
  • O is the coordinate of the measurement reference position
  • P 12 and P 32 are the coordinates of the two patterns located at both ends in the vertical direction.
  • The method may further include calculating static distortion for each of the three axes with respect to the measurement reference position, based on the coordinates of the plurality of patterns on the virtual plane.
  • In the step of calculating the coordinates of the plurality of patterns, the coordinates of a plurality of ghost patterns corresponding to each of the plurality of patterns may further be calculated, and the method may further include calculating a ghosting level based on the coordinates of the plurality of patterns and the coordinates of the plurality of ghost patterns.
  • The optical characteristic measuring device for an augmented reality device includes:
  • a photographing unit that photographs, using a plurality of cameras arranged around a predetermined measurement reference position, a test image including a plurality of patterns output on a virtual plane by the augmented reality device;
  • an acquisition unit that acquires angle of view information including information about the angle of view of the plurality of cameras and arrangement information including information about the arrangement of the plurality of cameras
  • a calculation unit that calculates coordinates of the plurality of patterns based on the measurement reference position, based on the plurality of captured images captured by the plurality of cameras, the angle of view information, and the arrangement information.
  • The plurality of cameras include a central camera located at the measurement reference position, and a left camera and a right camera located symmetrically on either side of the measurement reference position, and the plurality of patterns are aligned horizontally and vertically in the test image.
  • The calculation unit may calculate the coordinates of the plurality of patterns using the number of horizontal pixels of the plurality of captured images, the coordinates of the plurality of patterns in the plurality of captured images, the angle of view of the plurality of cameras included in the angle of view information, and the distance between the left camera and the right camera included in the arrangement information.
  • the calculation unit may calculate the coordinates of the plurality of patterns using Equation 6.
  • In Equation 6, x_ij, y_ij, and z_ij are the x-, y-, and z-axis coordinates, with respect to the measurement reference position, of the pattern located i-th horizontally and j-th vertically; the separation term is the distance between the left camera and the right camera; M is the number of horizontal pixels of the plurality of captured images; the angle term is the angle of view of the plurality of cameras; and m^L_ij, m^R_ij, and m^C_ij are the horizontal pixel coordinates of that pattern in the captured images of the left camera, the right camera, and the center camera, respectively.
  • the calculation unit may further calculate a virtual image distance between the measurement reference position and the virtual plane using the coordinates of the measurement reference position and the coordinates of at least one of the plurality of patterns on the virtual plane.
  • the calculation unit may calculate the virtual image distance using Equation 7.
  • D VI is the virtual image distance
  • x 22 , y 22 , and z 22 are the coordinates of one of the plurality of patterns.
  • The calculation unit may further calculate the lookdown/up angle with respect to the virtual plane from the measurement reference position, using the coordinates of the measurement reference position and the coordinates of at least one of the plurality of patterns on the virtual plane.
  • the calculation unit may calculate the lookdown/up angle using Equation 8.
  • θ_down/up is the lookdown/up angle
  • x 22 , y 22 , and z 22 are the coordinates of one of the plurality of patterns.
  • The calculation unit may further calculate the horizontal viewing angle at the measurement reference position, using the coordinates of the measurement reference position and the coordinates of the two patterns located at both ends in the horizontal direction among the plurality of patterns on the virtual plane.
  • the calculation unit may calculate the horizontal viewing angle using Equation 9.
  • θ_HFOV is the horizontal viewing angle
  • O is the coordinate of the measurement reference position
  • P 21 and P 23 are the coordinates of the two patterns located at both ends in the horizontal direction.
  • The calculation unit may further calculate the vertical viewing angle at the measurement reference position, using the coordinates of the measurement reference position and the coordinates of the two patterns located at both ends in the vertical direction among the plurality of patterns on the virtual plane.
  • the calculation unit may calculate the vertical viewing angle using Equation 10.
  • θ_VFOV is the vertical viewing angle
  • O is the coordinate of the measurement reference position
  • P 12 and P 32 are the coordinates of the two patterns located at both ends in the vertical direction.
  • the calculation unit may further calculate static distortion for each of the three axes based on the measurement reference position, based on the coordinates of the plurality of patterns on the virtual plane.
  • The calculation unit may further calculate coordinates of a plurality of ghost patterns corresponding to each of the plurality of patterns, based on the plurality of captured images, the angle of view information, and the arrangement information, and may further calculate the ghosting level based on the coordinates of the plurality of patterns and the coordinates of the plurality of ghost patterns.
  • the present invention has the effect of easily measuring the optical characteristics of a virtual image generated by an augmented reality device by using a plurality of cameras.
  • The present invention has the effect of calculating, using the optical characteristics of the virtual image generated by the augmented reality device and with respect to the user of the augmented reality device, the distance to the virtual image, the lookdown/up angle of the virtual image, the horizontal/vertical field of view, static distortion, the ghosting level, etc.
  • FIG. 1 is a flowchart showing a method of measuring optical characteristics of an augmented reality device according to an embodiment of the present invention.
  • Figure 2 is a flowchart showing a method for calculating a virtual image distance according to an embodiment of the present invention.
  • Figure 3 is a flowchart showing a lookdown/up angle calculation method according to an embodiment of the present invention.
  • Figure 4 is a flowchart showing a method for calculating a horizontal viewing angle according to an embodiment of the present invention.
  • Figure 5 is a flowchart showing a method for calculating a vertical viewing angle according to an embodiment of the present invention.
  • Figure 6 is a flowchart showing a method for calculating static distortion according to an embodiment of the present invention.
  • Figure 7 is a flowchart showing a method for calculating a ghosting level according to an embodiment of the present invention.
  • Figure 8 is a block diagram showing an apparatus for measuring optical characteristics of an augmented reality device according to an embodiment of the present invention.
  • Figures 9A and 9B are diagrams for explaining an environment for measuring optical characteristics of an augmented reality device according to an embodiment of the present invention.
  • Figure 10 is a diagram for explaining the results of shooting a test image on a virtual plane using a plurality of cameras according to an embodiment of the present invention.
  • Figures 11A and 11B are diagrams for explaining the coordinates of a plurality of patterns included in a captured image using a plurality of cameras according to an embodiment of the present invention.
  • Figures 12a and 12b are diagrams for explaining a method of calculating coordinates of a plurality of patterns according to an embodiment of the present invention.
  • Figure 13 is a diagram for explaining a method of calculating a virtual image distance according to an embodiment of the present invention.
  • Figures 14a and 14b are diagrams for explaining a method of calculating lookdown/up angles according to an embodiment of the present invention.
  • Figures 15A and 15B are diagrams for explaining a method of calculating the horizontal viewing angle and vertical viewing angle according to an embodiment of the present invention.
  • Figure 16 is a diagram for explaining a method of calculating static distortion according to an embodiment of the present invention.
  • Figure 17 is a diagram for explaining a method of calculating a ghosting level according to an embodiment of the present invention.
  • Figure 18 shows a method of measuring optical properties according to embodiments.
  • Figure 19 shows an example of a multi-viewpoint image according to embodiments.
  • Figure 20 shows a method for displaying the location of a virtual image.
  • Figure 21 shows a process for generating points and depth according to embodiments.
  • Figure 22 shows equal depth grouping according to embodiments.
  • Figure 23 shows HUD virtual object projection according to embodiments.
  • Figure 24 shows an optical characteristic measuring device according to embodiments.
  • first, second, A, and B may be used to describe various components, but the components should not be limited by the terms. The above terms are used only for the purpose of distinguishing one component from another.
  • a first component may be named a second component, and similarly, the second component may also be named a first component without departing from the scope of the present invention.
  • the term and/or includes any of a plurality of related stated items or a combination of a plurality of related stated items.
  • The present invention relates to a method and device for measuring the optical characteristics of an augmented reality device, and the measurement can be performed in the following environment.
  • The user's eyes are located in the eye box, and a virtual plane generated by the output of the augmented reality device may be formed outside a transparent or translucent screen (e.g., a vehicle's windshield).
  • the user can see the entire virtual plane by moving only his eyes.
  • a plurality of cameras may be arranged in the eye box centered on the measurement reference position. More specifically, cam C may be placed at the measurement reference position, and cam L and cam R may be placed at positions symmetrical to both sides. Meanwhile, a plurality of patterns may be arranged horizontally and vertically (e.g., 3x3) in the test image.
  • the present invention is not limited to being practiced only in such environments, and of course can be practiced in many different environments.
  • the location and size of the eye box, the number and arrangement of cameras, the number and arrangement of patterns included in the test image, etc. may vary depending on the measurement environment.
  • FIG. 1 is a flowchart showing a method of measuring optical characteristics of an augmented reality device according to an embodiment of the present invention.
  • the optical properties measurement device uses a plurality of cameras arranged around a predetermined measurement reference position to capture a test image including a plurality of patterns output on a virtual plane by the augmented reality device.
  • one camera may be placed at the measurement reference position located in the center of the eye box, and the remaining cameras may be placed symmetrically toward the front at the same height on both sides of the camera.
  • the optical characteristic measurement device can be connected to a plurality of cameras wirelessly or wired and transmit a command to photograph a test image on a virtual plane.
  • In step S120, the optical characteristic measuring device acquires angle of view information including information about the angle of view of the plurality of cameras and arrangement information including information about the arrangement of the plurality of cameras.
  • the optical characteristic measuring device may receive information about the angle of view of the camera and information about the arrangement of the camera from the user, and obtain angle of view information and arrangement information.
  • the information regarding the camera's angle of view may be a horizontal angle of view
  • the information regarding the arrangement of the camera may be the separation distance between cameras symmetrically arranged on both sides of the measurement reference position.
  • The optical characteristic measurement device then calculates the coordinates of the plurality of patterns with respect to the measurement reference position, based on the plurality of captured images captured by the plurality of cameras, the angle of view information, and the arrangement information.
  • Specifically, the optical characteristic measuring device can calculate the three-dimensional coordinates of the plurality of patterns on the virtual plane, with the measurement reference position as the origin (0, 0, 0), using information about the size of the plurality of captured images, the in-image coordinates of the plurality of patterns included in the plurality of captured images, the angle of view of the plurality of cameras, and the arrangement of the plurality of cameras.
  • The plurality of cameras include a central camera located at the measurement reference position, and a left camera and a right camera located symmetrically on either side of the measurement reference position, and the plurality of patterns are displayed horizontally and vertically in the test image.
  • the plurality of cameras may be a central camera (cam C ) located at the measurement reference position, and a left camera (cam L ) and a right camera (cam R ) located symmetrically around the measurement reference position. Additionally, nine patterns can be arranged horizontally and vertically in the test image.
  • the optical characteristic measurement device includes the number of horizontal pixels of the plurality of captured images, the coordinates of the plurality of patterns in the plurality of captured images, the angle of view of the plurality of cameras included in the angle of view information, and the left and right cameras included in the arrangement information. Using the distance between them, the three-dimensional coordinates of each of the nine patterns on a virtual plane with the measurement reference position as the origin (0, 0, 0) can be calculated.
  • The optical characteristic measuring device may calculate the coordinates of the plurality of patterns using Equation 1, in which x_ij, y_ij, and z_ij are the x-, y-, and z-axis coordinates, with respect to the measurement reference position, of the pattern located i-th horizontally and j-th vertically; the separation term is the distance between the left and right cameras; M is the number of horizontal pixels of the captured images; the angle term is the angle of view of the cameras; and m^L_ij, m^R_ij, and m^C_ij are the horizontal pixel coordinates of that pattern in the captured images of the left camera, the right camera, and the center camera, respectively.
  • The central camera (cam_C) is placed at the measurement reference position, which is the center of the eye box, and the left camera (cam_L) and right camera (cam_R) are placed on either side, spaced apart by the camera separation distance.
  • the optical characteristic measurement device can shoot a test image on the virtual plane using the central camera (cam C ), left camera (cam L ), and right camera (cam R ) arranged to face the front.
  • The test image captured using the left camera (captured image by cam_L) is biased to the right, the test image captured using the central camera (captured image by cam_C) is not biased, and the test image captured using the right camera (captured image by cam_R) may be biased to the left.
  • P ij may be the three-dimensional coordinate of the center of the horizontal i-th and vertical j-th pattern.
  • The pixel coordinates of the nine patterns appearing in the captured images can be expressed as P^L_ij, P^C_ij, and P^R_ij, which refer to the coordinates of the patterns appearing in the images captured by the left camera (cam_L), the central camera (cam_C), and the right camera (cam_R), respectively: P^L_ij = (m^L_ij, n^L_ij), P^C_ij = (m^C_ij, n^C_ij), and P^R_ij = (m^R_ij, n^R_ij). P^L_ij, P^C_ij, and P^R_ij may be the pixel coordinates of the center of the pattern located i-th horizontally and j-th vertically.
  • The proportional relationship shown in Equation 2 below is established, where z is the distance along the z-axis from the measurement reference position to the virtual plane, the separation term is the distance between the left and right cameras, the angle term is the camera's angle of view, m^L_ij and m^R_ij are the horizontal pixel coordinates of the pattern located i-th horizontally and j-th vertically in the captured images of the left and right cameras, respectively, and M is the number of horizontal pixels in the captured images.
  • Equation 1 can be obtained by rearranging Equation 2, and the coordinates x_11, y_11, and z_11 can be calculated through Equation 1.
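  • The following is a minimal sketch, in Python, of the triangulation just described. It assumes pinhole cameras with square pixels, all cameras facing straight ahead, and the center camera at the measurement reference position; the function name, the baseline parameter `delta`, and the use of the center image's vertical pixel coordinate `n_C` for the y-coordinate are illustrative assumptions, not taken from the patent.

```python
import math

def pattern_coordinates(m_L, m_R, m_C, n_C, M, N, fov_deg, delta):
    """Estimate 3-D pattern coordinates (x, y, z) relative to the measurement
    reference position from the pattern's pixel coordinates in the three images.

    Assumptions (not from the patent text): pinhole cameras with square pixels,
    all three cameras facing straight ahead, center camera at the origin.
    """
    # Focal length expressed in pixels, derived from the horizontal angle of view.
    f_px = M / (2.0 * math.tan(math.radians(fov_deg) / 2.0))
    disparity = m_L - m_R            # horizontal shift between left and right images
    z = delta * f_px / disparity     # depth along the viewing (z) axis
    x = (m_C - M / 2.0) * z / f_px   # lateral offset from the center camera's axis
    y = (N / 2.0 - n_C) * z / f_px   # vertical offset (image rows grow downward)
    return x, y, z

# Illustrative numbers only: 6 cm baseline, 1920x1080 images, 60 degree angle of view.
x, y, z = pattern_coordinates(m_L=972, m_R=960, m_C=966, n_C=470,
                              M=1920, N=1080, fov_deg=60.0, delta=0.06)
print(round(x, 2), round(y, 2), round(z, 2))  # z comes out to roughly 8.3 m
```

  • With the illustrative numbers in this example, a 12-pixel disparity at a 6 cm baseline and a 60° angle of view yields a depth of roughly 8.3 m.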
  • Figure 2 is a flowchart showing a method for calculating a virtual image distance according to an embodiment of the present invention.
  • the optical property measurement device uses a plurality of cameras arranged around a predetermined measurement reference position to capture a test image including a plurality of patterns output on a virtual plane by the augmented reality device.
  • In step S220, the optical characteristic measuring device acquires angle of view information including information about the angle of view of the plurality of cameras and arrangement information including information about the arrangement of the plurality of cameras.
  • In step S230, the optical characteristic measuring device calculates the coordinates of the plurality of patterns with respect to the measurement reference position, based on the plurality of captured images captured by the plurality of cameras, the angle of view information, and the arrangement information.
  • In step S240, the optical characteristic measuring device calculates a virtual image distance between the measurement reference position and the virtual plane using the coordinates of the measurement reference position and the coordinates of at least one of the plurality of patterns on the virtual plane.
  • The optical characteristic measurement device can calculate the virtual image distance as the distance from the measurement reference position (0, 0, 0) to the coordinates (x_22, y_22, z_22) of P_22.
  • the optical characteristic measurement device may calculate the virtual image distance using Equation 3.
  • D VI is the virtual image distance
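  • Equation 3 is shown only as an image in the original; as described, the virtual image distance is the distance from the origin to P_22, which corresponds to a Euclidean norm. A minimal sketch, with hypothetical coordinates:

```python
import math

def virtual_image_distance(p22):
    """Distance from the measurement reference position (0, 0, 0) to pattern P_22.
    A sketch of what Equation 3 plausibly computes: a Euclidean norm."""
    x22, y22, z22 = p22
    return math.sqrt(x22**2 + y22**2 + z22**2)

print(virtual_image_distance((0.03, 0.35, 8.31)))  # about 8.32, in the same units as the inputs
```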
  • Figure 3 is a flowchart showing a lookdown/up angle calculation method according to an embodiment of the present invention.
  • In step S310, the optical characteristic measurement device uses a plurality of cameras arranged around a predetermined measurement reference position to capture a test image including a plurality of patterns output on a virtual plane by the augmented reality device.
  • In step S320, the optical characteristic measuring device acquires angle of view information including information about the angle of view of the plurality of cameras and arrangement information including information about the arrangement of the plurality of cameras.
  • In step S330, the optical characteristic measurement device calculates the coordinates of the plurality of patterns with respect to the measurement reference position, based on the plurality of captured images captured by the plurality of cameras, the angle of view information, and the arrangement information.
  • In step S340, the optical characteristic measurement device calculates a look down/up angle from the measurement reference position toward the virtual plane, using the coordinates of the measurement reference position and the coordinates of at least one of the plurality of patterns on the virtual plane.
  • the lookdown/up angle is an angle representing the difference between the height of the eyebox and the virtual plane, and indicates whether the user is looking up or down at the virtual plane.
  • the optical property measurement device may calculate the lookdown/up angle using Equation 4.
  • θ_down/up is the lookdown/up angle
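  • Equation 4 is likewise not reproduced here. One plausible form, measuring the angle between the line of sight to P_22 and the horizontal plane (the sign convention is assumed), is sketched below:

```python
import math

def look_down_up_angle(p22):
    """Angle between the line from the measurement reference position to P_22
    and the horizontal (x-z) plane. Positive = looking up, negative = looking down.
    This is one plausible form of Equation 4; the patent's exact formula and sign
    convention are not visible in this extraction."""
    x22, y22, z22 = p22
    return math.degrees(math.atan2(y22, math.hypot(x22, z22)))

print(round(look_down_up_angle((0.03, 0.35, 8.31)), 2))  # about 2.41 degrees (looking up)
```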
  • Figure 4 is a flowchart showing a method for calculating a horizontal viewing angle according to an embodiment of the present invention.
  • the optical properties measurement device uses a plurality of cameras arranged around a predetermined measurement reference position to capture a test image including a plurality of patterns output on a virtual plane by the augmented reality device.
  • In step S420, the optical characteristic measuring device acquires angle of view information including information about the angle of view of the plurality of cameras and arrangement information including information about the arrangement of the plurality of cameras.
  • In step S430, the optical characteristic measurement device calculates the coordinates of the plurality of patterns with respect to the measurement reference position, based on the plurality of captured images captured by the plurality of cameras, the angle of view information, and the arrangement information.
  • In step S440, the optical characteristic measurement device calculates the horizontal field of view at the measurement reference position using the coordinates of the measurement reference position and the coordinates of the two patterns located at both ends in the horizontal direction among the plurality of patterns on the virtual plane.
  • When O = (0, 0, 0) is the measurement reference position and P_21 and P_23 are the two patterns located at both ends in the horizontal direction among the plurality of patterns on the virtual plane, the angle ∠P_21 O P_23 can be calculated as the horizontal viewing angle.
  • the optical characteristic measuring device may calculate the horizontal viewing angle using Equation 5.
  • θ_HFOV is the horizontal viewing angle
  • O is the coordinate of the measurement reference position
  • P 21 and P 23 are the coordinates of two patterns located at both ends in the horizontal direction among a plurality of patterns.
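  • A minimal sketch of the angle ∠P_21 O P_23 described for Equation 5, computed from the dot product of the two direction vectors; the pattern coordinates used in the example are hypothetical:

```python
import math

def angle_at_origin(p_a, p_b):
    """Angle subtended at the measurement reference position O = (0, 0, 0) by two
    pattern centers, computed from the dot product of the two direction vectors."""
    dot = sum(a * b for a, b in zip(p_a, p_b))
    norm_a = math.sqrt(sum(a * a for a in p_a))
    norm_b = math.sqrt(sum(b * b for b in p_b))
    return math.degrees(math.acos(dot / (norm_a * norm_b)))

# Hypothetical left-edge and right-edge pattern coordinates on the virtual plane.
p21, p23 = (-0.85, 0.35, 8.31), (0.91, 0.35, 8.31)
print(round(angle_at_origin(p21, p23), 2))  # horizontal field of view in degrees
```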
  • Figure 5 is a flowchart showing a method for calculating a vertical viewing angle according to an embodiment of the present invention.
  • the optical property measurement device uses a plurality of cameras arranged around a predetermined measurement reference position to capture a test image including a plurality of patterns output on a virtual plane by the augmented reality device.
  • In step S520, the optical characteristic measuring device acquires angle of view information including information about the angle of view of the plurality of cameras and arrangement information including information about the arrangement of the plurality of cameras.
  • In step S530, the optical characteristic measurement device calculates the coordinates of the plurality of patterns with respect to the measurement reference position, based on the plurality of captured images captured by the plurality of cameras, the angle of view information, and the arrangement information.
  • In step S540, the optical characteristic measurement device calculates the vertical field of view at the measurement reference position using the coordinates of the measurement reference position and the coordinates of the two patterns located at both ends in the vertical direction among the plurality of patterns on the virtual plane.
  • When O = (0, 0, 0) is the measurement reference position and P_12 and P_32 are the two patterns located at both ends in the vertical direction among the plurality of patterns on the virtual plane, the angle ∠P_12 O P_32 can be calculated as the vertical viewing angle.
  • the optical property measurement device may calculate the vertical viewing angle using Equation 6.
  • θ_VFOV is the vertical viewing angle
  • O is the coordinate of the measurement reference position
  • P 12 and P 32 are the coordinates of the two patterns located at both ends in the vertical direction.
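  • The vertical viewing angle of Equation 6 can be obtained with the same helper used in the horizontal viewing angle sketch above, applied to the vertically outermost patterns P_12 and P_32 (the coordinates below are hypothetical):

```python
# Reusing angle_at_origin() from the horizontal field-of-view sketch above.
p12, p32 = (0.03, 0.92, 8.31), (0.03, -0.25, 8.31)   # hypothetical top and bottom patterns
print(round(angle_at_origin(p12, p32), 2))           # vertical field of view in degrees
```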
  • Figure 6 is a flowchart showing a method for calculating static distortion according to an embodiment of the present invention.
  • the optical properties measurement device uses a plurality of cameras arranged around a predetermined measurement reference position to capture a test image including a plurality of patterns output on a virtual plane by the augmented reality device.
  • In step S620, the optical characteristic measuring device acquires angle of view information including information about the angle of view of the plurality of cameras and arrangement information including information about the arrangement of the plurality of cameras.
  • In step S630, the optical characteristic measurement device calculates the coordinates of the plurality of patterns with respect to the measurement reference position, based on the plurality of captured images captured by the plurality of cameras, the angle of view information, and the arrangement information.
  • In step S640, the optical characteristic measurement device calculates static distortion for each of the three axes with respect to the measurement reference position, based on the coordinates of the plurality of patterns on the virtual plane.
  • Static distortion indicates the degree of deviation of the three-dimensional coordinates of the plurality of patterns from the lines corresponding to each of the three axes (x, y, z).
  • the optical properties measurement device can calculate the static distortion for each of the three axes using Equation 7.
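  • Equation 7 is not reproduced in this text, so the following is only one plausible way to compute a per-axis deviation of the pattern grid from the ideal straight lines it should lie on; the aggregation (maximum absolute deviation) is an assumption:

```python
import statistics

def static_distortion(pattern_coords):
    """One plausible realization of per-axis static distortion: for each axis,
    measure how far the 3-D pattern centers deviate from the straight grid lines
    they would lie on if the virtual plane were undistorted.

    pattern_coords: dict mapping (i, j) -> (x, y, z) for the i-th row, j-th column.
    Returns the maximum absolute deviation along each of the x, y, z axes.
    """
    rows = {i for (i, _) in pattern_coords}
    cols = {j for (_, j) in pattern_coords}
    dev_x, dev_y, dev_z = [], [], []
    # Patterns in the same column should share an x value; patterns in the same row
    # should share a y value; all patterns should share a z value if the virtual
    # plane is fronto-parallel to the measurement reference position.
    for j in cols:
        xs = [pattern_coords[(i, j)][0] for i in rows]
        dev_x += [abs(x - statistics.mean(xs)) for x in xs]
    for i in rows:
        ys = [pattern_coords[(i, j)][1] for j in cols]
        dev_y += [abs(y - statistics.mean(ys)) for y in ys]
    zs = [p[2] for p in pattern_coords.values()]
    dev_z = [abs(z - statistics.mean(zs)) for z in zs]
    return max(dev_x), max(dev_y), max(dev_z)
```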
  • Figure 7 is a flowchart showing a method for calculating a ghosting level according to an embodiment of the present invention.
  • the optical property measurement device uses a plurality of cameras arranged around a predetermined measurement reference position to capture a test image including a plurality of patterns output on a virtual plane by the augmented reality device.
  • In step S720, the optical characteristic measuring device acquires angle of view information including information about the angle of view of the plurality of cameras and arrangement information including information about the arrangement of the plurality of cameras.
  • In step S730, the optical characteristic measurement device calculates the coordinates of the plurality of patterns and the coordinates of a plurality of ghost patterns with respect to the measurement reference position, based on the plurality of captured images captured by the plurality of cameras, the angle of view information, and the arrangement information.
  • A ghost pattern may appear on a vehicle's windshield, which transmits half of the incoming light and reflects the other half. More specifically, referring to FIG. 17, the two physical layers of the windshield cause a ghost phenomenon, so that the pattern on the virtual plane and the ghost pattern corresponding to it appear to the user as overlapping or blurred double images.
  • the optical characteristic measuring device may calculate the coordinates of a plurality of ghost patterns corresponding to each of the plurality of patterns using the same method as the method of calculating the coordinates of the plurality of patterns.
  • the optical characteristic measuring device may calculate a ghosting level based on the coordinates of a plurality of patterns and the coordinates of a plurality of ghost patterns.
  • the optical characteristic measuring device can calculate the ghosting level from the gap between the original pattern and the corresponding ghost pattern.
  • the optical property measurement device can calculate the ghosting level using Equation 8.
  • Ghost is the ghosting level
  • x_Gij, y_Gij, and z_Gij are the x, y, and z coordinates of the ghost pattern corresponding to the pattern located i-th horizontally and j-th vertically.
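  • A minimal sketch of a ghosting level derived from the gap between each pattern and its corresponding ghost pattern; averaging the gaps is an assumption, since Equation 8 is not reproduced here and the patent may aggregate the gaps differently:

```python
import math

def ghosting_level(patterns, ghosts):
    """Ghosting level from the gap between each pattern and its ghost pattern.
    patterns, ghosts: dicts mapping (i, j) -> (x, y, z) coordinates, with the same keys.
    Returns the mean Euclidean gap (the aggregation choice is an assumption)."""
    gaps = [math.dist(patterns[key], ghosts[key]) for key in patterns]
    return sum(gaps) / len(gaps)
```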
  • Figure 8 is a block diagram showing an apparatus for measuring optical characteristics of an augmented reality device according to an embodiment of the present invention.
  • an apparatus 800 for measuring optical characteristics of an augmented reality device includes a photographing unit 810, an acquisition unit 820, and a calculation unit 830.
  • the photographing unit 810 uses a plurality of cameras arranged around a predetermined measurement reference position to photograph a test image including a plurality of patterns output on a virtual plane by an augmented reality device.
  • the acquisition unit 820 acquires angle of view information including information about the angle of view of the plurality of cameras and arrangement information including information about the arrangement of the plurality of cameras.
  • the calculation unit 830 calculates the coordinates of a plurality of patterns based on the measurement reference position based on the plurality of captured images, view angle information, and arrangement information captured by the plurality of cameras.
  • When the plurality of cameras are a central camera located at the measurement reference position and a left camera and a right camera located symmetrically on either side of the measurement reference position, and the plurality of patterns are arranged horizontally and vertically in the test image, the calculation unit 830 can calculate the coordinates of the plurality of patterns using the number of horizontal pixels of the plurality of captured images, the coordinates of the plurality of patterns in the plurality of captured images, the angle of view of the plurality of cameras included in the angle of view information, and the distance between the left and right cameras included in the arrangement information.
  • the calculation unit 830 may calculate the coordinates of a plurality of patterns using Equation 9.
  • In Equation 9, x_ij, y_ij, and z_ij are the x-, y-, and z-axis coordinates, with respect to the measurement reference position, of the pattern located i-th horizontally and j-th vertically; the separation term is the distance between the left and right cameras; M is the number of horizontal pixels of the captured images; the angle term is the angle of view of the cameras; and m^L_ij, m^R_ij, and m^C_ij are the horizontal pixel coordinates of that pattern in the captured images of the left camera, the right camera, and the center camera, respectively.
  • the calculation unit 830 may further calculate the virtual image distance between the measurement reference position and the virtual plane using the coordinates of the measurement reference position and the coordinates of at least one of a plurality of patterns on the virtual plane.
  • the calculation unit 830 may calculate the virtual image distance using Equation 10.
  • D VI is the virtual image distance
  • x 22 , y 22 , and z 22 are the coordinates of one pattern among a plurality of patterns.
  • The calculation unit 830 may further calculate the lookdown/up angle with respect to the virtual plane from the measurement reference position, using the coordinates of the measurement reference position and the coordinates of at least one of the plurality of patterns on the virtual plane.
  • the calculation unit 830 may calculate the lookdown/up angle using Equation 11.
  • θ_down/up is the lookdown/up angle
  • x 22 , y 22 , and z 22 are the coordinates of one pattern among a plurality of patterns.
  • The calculation unit 830 may further calculate the horizontal viewing angle at the measurement reference position, using the coordinates of the measurement reference position and the coordinates of the two patterns located at both ends in the horizontal direction among the plurality of patterns on the virtual plane.
  • the calculation unit 830 may calculate the horizontal viewing angle using Equation 12.
  • θ_HFOV is the horizontal viewing angle
  • O is the coordinate of the measurement reference position
  • P 21 and P 23 are the coordinates of the two patterns located at both ends in the horizontal direction.
  • The calculation unit 830 may further calculate the vertical viewing angle at the measurement reference position, using the coordinates of the measurement reference position and the coordinates of the two patterns located at both ends in the vertical direction among the plurality of patterns on the virtual plane.
  • the calculation unit 830 may calculate the vertical viewing angle using Equation 13.
  • θ_VFOV is the vertical viewing angle
  • O is the coordinate of the measurement reference position
  • P 12 and P 32 are the coordinates of the two patterns located at both ends in the vertical direction.
  • the calculation unit 830 may further calculate static distortion for each of the three axes based on the measurement reference position, based on the coordinates of a plurality of patterns on the virtual plane.
  • The calculation unit 830 may further calculate the coordinates of a plurality of ghost patterns corresponding to each of the plurality of patterns, based on the plurality of captured images, the angle of view information, and the arrangement information, and may further calculate the ghosting level based on the coordinates of the plurality of patterns and the coordinates of the plurality of ghost patterns.
  • the above-described embodiments of the present invention can be written as a program that can be executed on a computer, and can be implemented in a general-purpose digital computer that operates the program using a computer-readable recording medium.
  • The computer-readable recording media include magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical read media (e.g., CD-ROM, DVD, etc.).
  • cameras according to embodiments may correspond to a light measuring device (LMD).
  • An optical measurement device may generate a virtual image plane and generate images including patterns at different positions.
  • Methods for measuring optical properties include: generating images including points of each pattern of a virtual image plane using one or more light measurement devices, each image being captured based on the one or more light measurement devices and corresponding to at least one of a left image, a central image, and a right image; and generating positions of the points based on the one or more light measurement devices and each pattern; and the positions may be obtained based on the field of view of the one or more light measurement devices and the gap between the left light measurement device and the right light measurement device.
  • One LMD can capture three images of the virtual plane at the center, left, and right positions, or multiple LMDs can capture three images of the virtual plane at the center, left, and right positions.
  • The positions of the points may be estimated based on, for example, the capture angle for the left LMD location and the capture angle for the right LMD location.
  • Using Equation 1, the coordinate value of the position of the pattern in the image can be calculated.
  • a left image including points in the pattern is captured based on the left light measurement device of one or more light measurement devices, and
  • the central image containing the points is captured based on the central optical measurement device of one or more optical measuring devices, and the right image containing points in the pattern is captured based on the right optical measuring device of the one or more optical measuring devices.
  • the index according to embodiments may indicate the number of positions, and the field of view may correspond to the angle of view.
  • The position coordinate values can be calculated based on the horizontal pixel index of the left light measurement device, the horizontal pixel index of the right light measurement device, the horizontal pixel index of the center light measurement device, and the field of view of the left light measurement device.
  • The method of measuring optical properties may further include measuring a virtual image distance for the virtual image plane based on the position in the pattern at the center and an optical measurement device.
  • the step of measuring the look down angle and look up angle for the virtual image plane may be further included based on the position within the pattern and the virtual image distance.
  • A method for measuring optical properties may include: measuring horizontal distortion for the virtual image plane based on the center and a line between the center top point and the center bottom point; and measuring vertical distortion for the virtual image plane based on the center and a line between a point left of center and a point right of center.
  • An optical characteristic measurement method/device according to embodiments may be referred to as a method/device according to embodiments.
  • Figure 18 shows a method of measuring optical properties according to embodiments.
  • the optical properties measuring device can measure optical properties based on FIGS. 1 to 7, 18, etc.
  • the photographing unit 810 may generate at least one image.
  • the optical properties measuring apparatus according to the embodiments of FIG. 25 may generate at least one multi-viewpoint image. For example, n multi-view images may be generated.
  • the devices shown in FIGS. 8 and 25 may be referred to as a light measuring device (LMD), a camera, a processor equipped with a camera, an optical characteristic measuring device, etc.
  • Multi-view images according to embodiments may be acquired by an optical characteristic measurement device according to embodiments mounted on a vehicle, an optical characteristic measurement device detachable from the vehicle, a light measuring device (LMD), a camera, a processor equipped with a camera, etc.
  • the vehicle may include an optical characteristic measuring device near the dashboard, black box, and rearview mirror, an optical characteristic measuring device detachable from the vehicle, a light measuring device (LMD), a camera, and a processor equipped with a camera.
  • One or more cameras can acquire multi-view images and can be installed at various locations in the vehicle.
  • The method for measuring optical properties may perform the following operations on one or more multi-viewpoint images. For example, when n multi-viewpoint images are captured, the following calibration operation can be applied to each of the n images.
  • the method of measuring optical properties according to embodiments may further include displaying the position of the virtual image.
  • the method of measuring optical properties according to embodiments may further include calculating the depth of the position of the virtual image.
  • the method of measuring optical properties according to embodiments may further include estimating the depth of the virtual image.
  • the method of measuring optical properties according to embodiments may further include matching the depth of the position and the depth of the virtual image.
  • the method of measuring optical properties according to embodiments may further include a calibration step.
  • the multi-view image acquired in step 1800 may be a left image 1900, a center image 1901, and a right image 1902 captured by a camera or an optical measurement device.
  • In step 1803, positions 1 to 9 (changeable) of the HUD virtual image can be displayed on the multi-view image (see FIG. 20).
  • the virtual object may be an image visible to the user's eyes.
  • the HUD virtual object may be projected as circles 1 to 9 in FIG. 20.
  • In step 1804, the user can calculate the depth of the positions corresponding to circles 1 to 9 of FIG. 20.
  • The depth of the circles in patterns 1 to 9 may be calculated by at least one camera or light measurement device.
  • In step 1805, the points and depth of the multi-view image can be obtained (see FIG. 21), and the coordinate value of the geometric position of the virtual image can be obtained.
  • For the process of obtaining the coordinate values, refer to the explanations of Equation 1 to Equation 4.
  • Depth may mean the z value of a point of a virtual plane (image), such as in FIG. 11.
  • In step 1806, the matching can be displayed as shown in FIG. 22.
  • In step 1807, storing the positions with the same depth (for example, position number 5; see FIG. 22) corresponds to calibration.
  • Vehicles according to embodiments can implement an accurate HUD by using, when implementing the HUD, the locations where the depth measured by the camera and the depth of the HUD virtual image are the same (e.g., No. 5) and ignoring the locations with different depths (e.g., No. 6 and No. 8; see Figure 22).
  • the driver can select whether to match the depth of the virtual image and perform HUD correction.
  • Figure 19 shows an example of a multi-viewpoint image according to embodiments.
  • three images may be generated: a left image 1900, a center image 1901, and a right image 1902.
  • An optical characteristic measurement device may include at least one camera.
  • An optical characteristic measurement device may include a light measurement unit.
  • The position of the light measurement unit of the optical characteristic measurement device may change, or the light measurement unit may move within the position where it is mounted on the vehicle. For example, as the light measurement unit moves to the center, left, right, etc., a multi-view image can be generated (acquired) as shown in FIG. 19.
  • Figure 20 shows a method for displaying the location of a virtual image.
  • Figure 20 illustrates step 1803 of Figure 18.
  • An optical characteristic measurement method and device can display multiple positions of a HUD virtual image in a multi-view image. Positions according to embodiments may include positions 1 to 9.
  • the virtual image is an image for the HUD and can have depth i.
  • the multi-view image 2000 is the multi-view image (left, center, right) described in FIGS. 18-19.
  • Virtual Image (2001) is a virtual image of the HUD.
  • a virtual image can have a certain depth.
  • the virtual image may be the box area 2001 in Figure 20.
  • Location (2002) is the location for virtual image (2001). Location (2002) is a changeable value. Location 2002 may have one or more location values. For example, there may be 9 locations for the virtual image, and may be referred to as first location, second location, ..., ninth location, etc.
  • a vehicle can project a HUD virtual object on a VID (Vehicle Information Device). The driver and user of the vehicle can visually see the virtual object.
  • VID Vehicle Information Device
  • the driver can select the location of the virtual image based on the UI. Some of the first to ninth positions may be displayed on the camera screen based on the touch method, signal input transmission method, etc.
  • the method/device for measuring optical properties may display positions of the virtual image based on numbers, such as 2001.
  • Virtual images and locations are displayed on the HUD, and the driver can select one or more locations based on the UI.
  • 2001 shows an example of the displayed results.
  • Figure 21 shows a process for generating points and depth according to embodiments.
  • the multi-view images 2100, 2101, and 2102 are the above-described multi-view images.
  • Points 2103 are corresponding points of each multi-view image.
  • In step 1805, the coordinate value of point 2103 can be calculated, and the depth of point 2103 can be obtained.
  • the value of point 2103 of the virtual image can be used.
  • the depth i of point 2103 may be the depth of the virtual image. Depth i may be HUD depth i.
  • Figure 22 shows equal depth grouping according to embodiments.
  • The method/device according to embodiments can calculate the depths on the HUD virtual image and group positions with the same calculated depth using RGB colors.
  • For example, positions 1, 2, 4, and 5 correspond to the same calculated depth and are displayed in red; positions 3 and 6 are grouped in green; and positions 7 to 9 are grouped in blue.
  • Drivers can select locations through the UI. By touching the camera screen or HUD screen, the group whose depth is the same as that of the HUD virtual object can be selected from among red, green, and blue.
  • If the depths are the same, O can be displayed, and if the depths are different, X can be displayed.
  • Positions with the same depth can be grouped using color information.
  • The driver can select among positions 1 to 9.
  • HUD calibration can be performed based on the selected colors.
  • The VID can display information based on the calculated depth.
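  • The equal-depth grouping of FIG. 22 can be sketched in Python as follows: positions whose calculated depths agree within a tolerance are placed in one group, and each group receives an RGB color. The tolerance value, the example depths, and the grouping rule (compare against the first member of each group) are assumptions for illustration.

    def group_by_depth(depths, tolerance_m=0.1):
        """Group position numbers whose depths agree within tolerance_m and
        assign one color per group, mirroring the red/green/blue grouping of FIG. 22."""
        colors = ["red", "green", "blue", "yellow", "magenta"]
        groups = []  # list of (depth of first member, [position numbers])
        for pos, depth in sorted(depths.items(), key=lambda kv: kv[1]):
            for group in groups:
                if abs(group[0] - depth) <= tolerance_m:
                    group[1].append(pos)
                    break
            else:
                groups.append((depth, [pos]))
        return {colors[i]: members for i, (_, members) in enumerate(groups)}


    # Example depths in meters for positions 1 to 9 (illustrative values only).
    depths = {1: 5.00, 2: 5.05, 3: 6.20, 4: 4.98, 5: 5.02, 6: 6.25, 7: 7.50, 8: 7.48, 9: 7.52}
    print(group_by_depth(depths))
    # {'red': [4, 1, 5, 2], 'green': [3, 6], 'blue': [8, 7, 9]}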
  • Figure 23 shows HUD virtual object projection according to embodiments.
  • The method/device according to embodiments may project a HUD virtual object on the VID as shown in FIG. 23 after the calibration process.
  • An error tolerance range (e.g., +X meters / -X meters) may be applied.
  • The depth can be calculated, and the HUD virtual object can be projected based on the corrected and selected depth.
  • HUD calibration optimized for the driver can be performed.
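  • A minimal check of whether a camera-measured depth matches the HUD depth within the error tolerance range mentioned above might look like the Python sketch below; the symmetric default tolerance and the function name are illustrative assumptions.

    def depth_matches(camera_depth_m: float, hud_depth_m: float,
                      tol_plus_m: float = 0.2, tol_minus_m: float = 0.2) -> bool:
        """True if the camera depth lies within [hud - tol_minus, hud + tol_plus].
        Matching positions would be marked O, the rest X (cf. FIG. 22)."""
        return hud_depth_m - tol_minus_m <= camera_depth_m <= hud_depth_m + tol_plus_m


    print(depth_matches(5.05, 5.0))  # True  -> O
    print(depth_matches(6.20, 5.0))  # False -> X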
  • Figure 24 shows an optical property measurement device according to embodiments.
  • The method of Figure 18 can be performed by the device of Figure 24.
  • Each component in Figure 24 may correspond to hardware, software, processor, and/or a combination thereof.
  • Figure 24 may correspond to the device of Figure 8.
  • The light measurement unit may be a camera. It may be the light measurement unit that acquires virtual images in FIGS. 1 to 7 and 18.
  • The processor may be a processor that performs the operations of Figures 1 to 7, Figure 18, etc.
  • The memory can store data and information related to the operation of the processor and can provide the necessary data to the processor.
  • The memory may be connected to the optical measurement unit, the processor, etc.
  • The optical measurement unit may capture multi-view images as shown in step 1800 and FIG. 19.
  • The processor may display positions 1 to 9 of the HUD virtual image on the multi-view image as in step 1803.
  • The processor may calculate the depth of positions 1 to 9 of the virtual image as in step 1804.
  • The process of calculating the depth (coordinate z value) of the pattern in the image can refer to Equations 1 to 4.
  • The calculated depth at positions 1 to 9 may be the camera depth i.
  • The processor may estimate the depth of the HUD virtual image as in step 1805.
  • Corresponding points can be acquired at various locations by a camera or a light measurement device (LDM), the coordinates of the points can be calculated, and the depth can be calculated.
  • The processor compares the camera depth i calculated (estimated) from the multi-view image (positions 1 to 9, some of which may be selected by the driver) with the HUD depth i of the virtual image (2103), selects the positions where the two depths are the same, and displays and groups them on the UI.
  • A calibrated HUD virtual object can be projected by computing the depths of specific locations in a multi-view image acquired by a camera or light measurement device (LDM), grouping locations with the same depth, and selecting among them.
  • The processor may: 1) project a HUD virtual object on the VID (the image visible to the user's eyes); 1-1) allow the driver to select the object through the UI; 1-2) receive a signal, such as a touch on the camera screen, to display some of positions 1 to 9; 2) through SW processing, obtain corresponding points and depths based on multiple LDM acquisitions; 3) through SW processing, group equal depths using color information; 4) provide UI-based selection information to the driver; 4-1) receive input such as touches on the camera, LDM, or VID screen, and select the location whose depth is the same as that of the HUD virtual object from among the colored groups, so that the information required for driving can be calibrated and displayed on the VID based on the selected location and its depth; and 5) project HUD virtual objects onto the VID. Steps 1 to 5 can be performed repeatedly.
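  • Tying steps 1) to 5) together, one possible shape for the repeated calibration cycle is sketched below in Python. Every helper passed in (project, select_positions, measure_depths, choose_group) is a placeholder for the HUD, LDM, VID, and UI operations described above, not an actual API; the grouping helper is a compact version of the sketch given earlier.

    from typing import Callable, Dict, Iterable, List


    def _group_by_depth(depths: Dict[int, float], tol: float = 0.1) -> Dict[str, List[int]]:
        """Compact equal-depth grouping (see the grouping sketch above)."""
        colors, groups = ["red", "green", "blue"], []
        for pos, depth in sorted(depths.items(), key=lambda kv: kv[1]):
            for group in groups:
                if abs(group[0] - depth) <= tol:
                    group[1].append(pos)
                    break
            else:
                groups.append((depth, [pos]))
        return {colors[i % len(colors)]: members for i, (_, members) in enumerate(groups)}


    def calibration_cycle(project: Callable[[float], None],
                          select_positions: Callable[[], List[int]],
                          measure_depths: Callable[[Iterable[int]], Dict[int, float]],
                          choose_group: Callable[[Dict[str, List[int]]], str],
                          nominal_hud_depth_m: float) -> float:
        """One pass of steps 1) to 5): project, measure, group, select, re-project."""
        project(nominal_hud_depth_m)            # 1) project the HUD virtual object on the VID
        positions = select_positions()          # 1-1 / 1-2) driver-selected subset of 1..9
        depths = measure_depths(positions)      # 2) corresponding points -> depths (LDM)
        groups = _group_by_depth(depths)        # 3) equal-depth grouping by color
        color = choose_group(groups)            # 4) driver picks the matching color group
        chosen = groups[color]
        calibrated = sum(depths[p] for p in chosen) / len(chosen)
        project(calibrated)                     # 5) project the corrected virtual object
        return calibrated


    # Dummy wiring that only shows the call shape; real implementations would drive
    # the HUD, the light measurement device (LDM), the VID, and the driver UI.
    result = calibration_cycle(
        project=lambda d: print(f"project at {d:.2f} m"),
        select_positions=lambda: [1, 2, 5],
        measure_depths=lambda ps: {p: 5.0 + 0.01 * p for p in ps},
        choose_group=lambda groups: next(iter(groups)),
        nominal_hud_depth_m=5.0,
    )
    print("calibrated depth:", round(result, 3))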
  • The method for measuring optical properties may include: generating a multi-view image, the multi-view image including a left image, a center image, and a right image; displaying the position of a virtual image for the multi-view image; calculating the depth of the position of the virtual image; estimating the depth of the virtual image; matching the depth of the position and the depth of the virtual image; and calibrating the error of the multi-view image based on the matched depth.
  • The virtual image includes points in each pattern for the virtual image, and the positions of the points are generated based on one or more light measurement devices and each pattern.
  • The left image, including points within the pattern of the virtual image, is captured from the left position of the one or more light measurement devices; the center image, including points within the pattern, is captured from the central position of the one or more light measurement devices; and the right image, including points within the pattern, is captured from the right position of the one or more light measurement devices.
  • The coordinate values of the position are calculated based on the horizontal pixel index of the left light measurement device, the horizontal pixel index of the right light measurement device, the horizontal pixel index of the central light measurement device, and the field of view of the left light measurement device.
  • The method of measuring optical properties may further include measuring a virtual image distance for the virtual image based on the position within the central pattern and the optical measurement device.
  • The method of measuring optical properties may further include measuring the look-down angle and look-up angle for the virtual image plane based on the position within the pattern and the virtual image distance.
  • The optical property measurement method may further include: measuring a horizontal field of view for the virtual image plane based on the distance to a point to the left of center and the distance to a point to the right of center; and measuring a vertical field of view for the virtual image plane based on the distance to the central top point and the distance to the central bottom point.
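  • The exact field-of-view equations are not reproduced in this excerpt; one common geometric reading, assumed here, is that each field of view is the angle subtended at the measurement device by the corresponding pair of points (left/right of center for horizontal, top/bottom for vertical). The Python sketch below computes that angle from recovered 3D point coordinates; the example coordinates are illustrative.

    import math


    def subtended_angle_deg(point_a, point_b, origin=(0.0, 0.0, 0.0)):
        """Angle in degrees subtended at `origin` by two 3D points.

        Feed the left/right points for a horizontal field of view, or the
        top/bottom points for a vertical field of view."""
        va = [a - o for a, o in zip(point_a, origin)]
        vb = [b - o for b, o in zip(point_b, origin)]
        dot = sum(a * b for a, b in zip(va, vb))
        norm_a = math.sqrt(sum(a * a for a in va))
        norm_b = math.sqrt(sum(b * b for b in vb))
        return math.degrees(math.acos(max(-1.0, min(1.0, dot / (norm_a * norm_b)))))


    # Points 1 m left/right of the optical axis on a virtual image plane 5 m away,
    # and points 0.5 m above/below the axis on the same plane.
    h_fov = subtended_angle_deg((-1.0, 0.0, 5.0), (1.0, 0.0, 5.0))
    v_fov = subtended_angle_deg((0.0, 0.5, 5.0), (0.0, -0.5, 5.0))
    print(round(h_fov, 2), round(v_fov, 2))  # ~22.62 and ~11.42 degrees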
  • The optical property measurement method may further include: measuring horizontal distortion for the virtual image plane based on the center and a line between the center top point and the center bottom point; and measuring vertical distortion for the virtual image plane based on the center and a line between a point to the left of center and a point to the right of center.
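  • The disclosure's exact distortion definition is likewise not reproduced here; a common measure, assumed for this sketch, is the perpendicular deviation of the center from the straight line joining the two reference points, optionally expressed as a fraction of the line length. The example pixel coordinates are illustrative.

    import math


    def line_deviation(center, point_a, point_b):
        """Perpendicular distance (in pixels) from `center` to the line through
        point_a and point_b, plus that distance as a fraction of the line length."""
        (cx, cy), (ax, ay), (bx, by) = center, point_a, point_b
        line_len = math.hypot(bx - ax, by - ay)
        # |cross product| / base length = point-to-line distance.
        deviation = abs((bx - ax) * (ay - cy) - (ax - cx) * (by - ay)) / line_len
        return deviation, deviation / line_len


    # Example: the center sits 2 px off the line joining the top and bottom points.
    dev_px, dev_ratio = line_deviation(center=(322, 240),
                                       point_a=(320, 140), point_b=(320, 340))
    print(round(dev_px, 2), f"{dev_ratio:.1%}")  # 2.0 px, 1.0%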
  • A method of measuring optical properties may include: generating a multi-view image, the multi-view image including a left image, a center image, and a right image; displaying positions of the virtual image for the multi-view image; estimating the depths of the positions of the virtual image; grouping the positions based on depth; selecting some positions from the grouped positions; and projecting the corrected virtual object based on the selected positions and depths.
  • A device that performs the method for measuring optical properties may include: a memory; a processor coupled to the memory; and an optical measurement unit connected to the processor, wherein the processor generates a multi-view image, the multi-view image including a left image, a center image, and a right image, displays the position of the virtual image for the multi-view image, calculates the depth of the position of the virtual image, estimates the depth of the virtual image, matches the depth of the position and the depth of the virtual image, and calibrates the error of the multi-view image based on the matched depth.
  • The optical property measurement device may include: a memory; a processor coupled to the memory; and an optical measurement unit connected to the processor, wherein the processor generates a multi-view image, the multi-view image including a left image, a center image, and a right image, displays the positions of the virtual image for the multi-view image, estimates the depths of the positions of the virtual image, groups the positions based on depth, selects some positions from the grouped positions, and projects the corrected virtual object based on the selected positions and depths.
  • 3D HUD vehicle service can be improved from the perspective of driver visibility and convenience. Additionally, by mounting an optical measurement device on a vehicle, measurement parameters for measuring optical characteristics, such as points and depth for a HUD virtual image, can be efficiently obtained. Based on the obtained parameter information, a calibration process that can reduce errors from the driver's perspective can be efficiently performed. 3D HUD can be combined with autonomous driving technology to enable safe and accurate autonomous driving.
  • The various components of the devices of the embodiments may be implemented by hardware, software, firmware, or a combination thereof.
  • Various components of the embodiments may be implemented with one chip, for example, one hardware circuit.
  • The components according to the embodiments may be implemented as separate chips.
  • At least one of the components of the device according to the embodiments may be composed of one or more processors capable of executing one or more programs, and the one or more programs may perform one or more of the operations/methods according to the embodiments, or may include instructions for performing them.
  • Executable instructions for performing the methods/operations of the device may be stored in a non-transitory CRM or other computer program product configured for execution by one or more processors, or may be stored in a temporary CRM or other computer program product configured for execution by one or more processors. Additionally, memory according to embodiments may be understood as a concept that includes not only volatile memory (e.g., RAM) but also non-volatile memory, flash memory, and PROM. Additionally, implementation in the form of a carrier wave, such as transmission over the Internet, is also possible. Additionally, the processor-readable recording medium may be distributed over computer systems connected via a network, so that the processor-readable code can be stored and executed in a distributed manner.
  • Terms such as first, second, etc. may be used to describe various components of the embodiments. However, the interpretation of the various components according to the embodiments should not be limited by these terms; they are merely used to distinguish one component from another. For example, a first user input signal may be referred to as a second user input signal, and similarly, the second user input signal may be referred to as the first user input signal. Use of these terms should be interpreted without departing from the scope of the various embodiments.
  • The first user input signal and the second user input signal are both user input signals, but they do not mean the same user input signal unless the context clearly indicates otherwise.
  • Operations according to embodiments described in this document may be performed by a transmitting and receiving device including a memory and/or a processor, depending on the embodiments.
  • The memory may store programs for processing/controlling operations according to embodiments, and the processor may control the various operations described in this document.
  • The processor may be referred to as a controller, etc.
  • Operations may be performed by firmware, software, and/or a combination thereof, and the firmware, software, and/or combination thereof may be stored in the processor or in the memory.
  • The transmitting and receiving device may include a transmitting and receiving unit that transmits and receives media data, a memory that stores instructions (program code, algorithms, flowcharts and/or data) for the processes according to embodiments, and a processor that controls the operations of the transmitting and receiving device.
  • A processor may be referred to as a controller, etc., and may correspond to, for example, hardware, software, and/or a combination thereof. Operations according to the above-described embodiments may be performed by the processor. Additionally, the processor may be implemented as an encoder/decoder, etc. for the operations of the above-described embodiments.
  • Embodiments may be applied in whole or in part to point cloud data transmission and reception devices and systems.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An optical characteristic measurement method according to embodiments may comprise the steps of: generating multi-view images, the multi-view images comprising a left image, a center image, and a right image; displaying the location of a virtual image for the multi-view images; calculating the depth of the location of the virtual image; estimating the depth of the virtual image; matching the depth of the location and the depth of the virtual image; and calibrating an error between the multi-view images on the basis of the matched depth. In addition, the optical characteristic measurement method may comprise the steps of: generating multi-view images, the multi-view images comprising a left image, a center image, and a right image; displaying the locations of virtual images for the multi-view images; estimating the depths of the locations of the virtual images; grouping the locations on the basis of the depths; selecting some locations from among the grouped locations; and projecting a corrected virtual object on the basis of the selected locations and depths.
PCT/KR2023/003612 2022-08-05 2023-03-17 Procédé et dispositif d'étalonnage de hud à l'aide d'une caméra à l'intérieur d'un véhicule WO2024029680A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2022-0098030 2022-08-05
KR1020220098030A KR20240020051A (ko) 2022-08-05 2022-08-05 자동차 내부 카메라를 이용한 hud 캘리브레이션 방법 및 장치

Publications (1)

Publication Number Publication Date
WO2024029680A1 true WO2024029680A1 (fr) 2024-02-08

Family

ID=89849063

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/003612 WO2024029680A1 (fr) 2022-08-05 2023-03-17 Procédé et dispositif d'étalonnage de hud à l'aide d'une caméra à l'intérieur d'un véhicule

Country Status (2)

Country Link
KR (1) KR20240020051A (fr)
WO (1) WO2024029680A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150012976A (ko) * 2013-07-26 2015-02-04 삼성전자주식회사 다시점 영상 처리 장치 및 그 영상 처리 방법
US20170310997A1 (en) * 2012-12-27 2017-10-26 Nippon Telegraph And Telephone Corporation Image encoding method, image decoding method, image encoding apparatus, image decoding apparatus, image encoding program, and image decoding program
KR20190072742A (ko) * 2017-12-18 2019-06-26 전자부품연구원 캘리브레이션된 멀티 카메라 입력 기반 실시간 초다시점 중간시점 영상 합성 방법 및 시스템
KR20190118846A (ko) * 2018-04-11 2019-10-21 한국과학기술연구원 다초점 증강현실 장치
KR20210108283A (ko) * 2020-02-25 2021-09-02 광운대학교 산학협력단 깊이영상 카메라로부터 획득된 3d 영상의 품질 향상 방법

Also Published As

Publication number Publication date
KR20240020051A (ko) 2024-02-14

Similar Documents

Publication Publication Date Title
WO2020141729A1 (fr) Dispositif de mesure corporelle et procédé de commande associé
WO2014148698A1 (fr) Dispositif d'affichage et son procédé de commande
WO2015142023A1 (fr) Procédé et dispositif corporel pour fournir une interface d'entrée virtuelle
WO2016171403A1 (fr) Dispositif électronique et procédé
WO2017111268A1 (fr) Dispositif d'affichage déformable et procédé d'affichage d'image utilisant ce dernier
WO2014058086A1 (fr) Dispositif de traitement d'image et procédé de traitement d'image
WO2016056737A1 (fr) Dispositif d'affichage et son procédé de commande
WO2019117652A1 (fr) Appareil à prisme, et appareil photographique comprenant celui-ci
WO2020101420A1 (fr) Procédé et appareil de mesurer des caractéristiques optiques d'un dispositif de réalité augmentée
WO2016104952A1 (fr) Appareil d'affichage et procédé d'affichage
WO2022050507A1 (fr) Procédé et système de surveillance d'un module de génération d'énergie photovoltaïque
WO2021251549A1 (fr) Dispositif d'affichage
WO2015183050A1 (fr) Système de poursuite optique, et procédé de calcul de posture et d'emplacement de partie marqueur dans un système de poursuite optique
WO2018212608A1 (fr) Système de marquage mobile, procédé de commande de dispositif de marquage mobile, et support d'enregistrement lisible par ordinateur
WO2020060235A1 (fr) Dispositif de caméra
WO2019125036A1 (fr) Procédé de traitement d'image et appareil d'affichage associé
WO2021125903A1 (fr) Dispositif pouvant être porté comprenant un appareil de suivi de l'œil et procédé de fonctionnement du dispositif pouvant être porté
WO2021040156A1 (fr) Dispositif de mesure du corps et procédé de commande associé
WO2015194773A1 (fr) Dispositif d'affichage et son procédé de commande
WO2010120120A2 (fr) Procédé permettant une interaction d'utilisateur dans un laser, et dispositif associé
WO2020209624A1 (fr) Dispositif de visiocasque et procédé de fonctionnement associé
WO2016017906A1 (fr) Dispositif d'affichage, dispositif de correction d'affichage, système de correction d'affichage, et procédé de correction d'affichage
WO2014178610A1 (fr) Système de suivi optique et méthode de suivi utilisant le système
WO2019031676A1 (fr) Procédé de traitement d'images et dispositif associé
WO2024029680A1 (fr) Procédé et dispositif d'étalonnage de hud à l'aide d'une caméra à l'intérieur d'un véhicule

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23850193

Country of ref document: EP

Kind code of ref document: A1