WO2016115874A1 - Binocular AR head-mounted device capable of automatically adjusting depth of field and depth-of-field adjustment method - Google Patents

Binocular AR head-mounted device capable of automatically adjusting depth of field and depth-of-field adjustment method Download PDF

Info

Publication number
WO2016115874A1
WO2016115874A1 (PCT/CN2015/086360)
Authority
WO
WIPO (PCT)
Prior art keywords
distance
human eye
preset
mapping relationship
information
Prior art date
Application number
PCT/CN2015/086360
Other languages
English (en)
French (fr)
Inventor
黄琴华
李薪宇
Original Assignee
成都理想境界科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 成都理想境界科技有限公司
Publication of WO2016115874A1 publication Critical patent/WO2016115874A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays

Definitions

  • the present invention relates to the field of head-mounted display devices, and more particularly to a binocular AR head-mounted device capable of automatically adjusting depth of field and a depth-of-field adjustment method thereof.
  • head-mounted display devices are an ideal application environment for Augmented Reality (AR) technology, as they can present virtual information within the real environment through the head-mounted device window.
  • an embodiment of the present invention first provides a depth of field adjustment method for a binocular AR headset, the method comprising:
  • acquiring the distance dis from the target object to the human eye; obtaining, according to the distance dis and a preset distance mapping relationship δ, the center point pair coordinate data of the left and right sets of effective display information corresponding to the distance dis from the target object to the human eye, wherein the preset distance mapping relationship δ represents the mapping between the center point pair coordinate data and the distance dis from the target object to the human eye;
  • displaying, according to the center point pair coordinate data, the information source images of the virtual information to be displayed on the left and right image display sources respectively.
  • the distance dis of the target to the human eye is obtained by a binocular stereo vision system.
  • the distance dis from the target to the human eye is determined according to the following expression: dis = Z + h, with Z = f·T/(x_l − x_r), where
  • h is the distance from the binocular stereo vision system to the human eye,
  • Z is the distance between the target and the binocular stereo vision system,
  • T is the baseline distance,
  • f is the focal length, and
  • x_l and x_r are the x coordinates of the target in the left image and the right image, respectively.
  • the spatial line-of-sight information when the human eye gazes at the target is detected by a gaze tracking system, and the distance dis from the target to the human eye is determined from this spatial line-of-sight data.
  • the distance dis from the target to the human eye is then determined according to expression (1) below.
  • the distance dis of the target to the human eye is determined by the camera imaging scale.
  • the distance dis of the target to the human eye is determined by the depth of field camera.
  • the information source image of the virtual information to be displayed is respectively displayed on the left and right image display sources with the center point pair coordinate as the center position.
  • the information source images of the virtual information to be displayed are respectively displayed on the left and right image display sources, with a position offset from the center point pair coordinates in a preset direction as the center position.
  • the method further comprises: correcting the preset distance mapping relationship δ when the user first uses the headset, and/or each time the user uses the headset.
  • the step of correcting the preset distance mapping relationship δ comprises:
  • controlling the image display source of the headset to display preset information source images on the left and right image display sources respectively;
  • acquiring the line-of-sight space vectors of the human eyes when the preset information source images displayed on the left and right image display sources are observed to overlap in front of the human eyes, and obtaining a first distance from these line-of-sight space vectors;
  • obtaining a second distance from the coordinate data of the preset information source images on the left and right image display sources, using the preset distance mapping relationship δ;
  • determining a correction factor from the first distance and the second distance;
  • correcting the preset distance mapping relationship δ with the correction factor.
  • the preset distance mapping relationship ⁇ is expressed as:
  • dis represents the distance from the target to the human eye
  • h represents the fitted curve function
  • (SL, SR) represents the coordinate data of the center point pair of the two sets of effective display information.
  • constructing the preset distance mapping relationship δ includes:
  • Step 1: displaying preset test images at preset positions of the left and right image display sources;
  • Step 2: acquiring the line-of-sight space vector when the user gazes at the virtual test chart, and determining, from the line-of-sight space vector and the display positions of the preset test images, one set of mapping relationship data between the preset test image positions and the distance of the corresponding target object from the human eye;
  • Step 3: successively reducing the center distance of the preset test images according to a preset rule, and repeating Step 2 until k sets of mapping relationship data between the preset test image positions and the distances of the corresponding target objects from the human eye are obtained;
  • Step 4: fitting the k sets of mapping relationship data between the preset test image positions and the distances of the corresponding target objects from the human eye, thereby constructing the preset distance mapping relationship δ.
  • the invention also provides a binocular AR head-mounted device capable of automatically adjusting the depth of field, comprising:
  • an optical system;
  • An image display source including a left image display source and a right image display source
  • a distance data acquisition module for acquiring data relating to a distance dis of the target object to the human eye
  • a data processing module coupled to the distance data acquisition module, configured to determine the distance dis from the target object to the human eye from the data related to that distance, to determine, in combination with the preset distance mapping relationship δ, the center point pair coordinate data of the two sets of effective display information corresponding to the distance dis, and to display the information source images of the virtual information to be displayed on the left and right image display sources respectively, according to the center point pair coordinate data;
  • the preset distance mapping relationship ⁇ represents a mapping relationship between the center point pair coordinate data and a distance dis of the target object to the human eye.
  • the distance data collection module comprises any one of the following items:
  • a single camera, a binocular stereo vision system, a depth-of-field camera, or a gaze tracking system.
  • the data processing module is configured to display the information source images of the virtual information to be displayed on the left and right image display sources respectively, with a position offset from the center point pair coordinates in a preset direction as the center position.
  • the data processing module is configured to display the information source images of the virtual information to be displayed on the left and right image display sources respectively, with the center point pair coordinates as the center position.
  • the binocular AR headset further corrects the preset distance mapping relationship δ when the user first uses the headset, and/or each time the user uses the headset.
  • the preset distance mapping relationship ⁇ is expressed as:
  • dis represents the distance from the target to the human eye
  • h represents the fitted curve function
  • (SL, SR) represents the coordinate data of the center point pair of the two sets of effective display information.
  • the binocular AR head-mounted device and the depth-of-field adjustment method provided by the invention can accurately superimpose virtual information near the gaze point of the human eye, so that the virtual information is highly integrated with the environment, realizing augmented reality in the true sense.
  • the solution of the invention is simple: provided the distance mapping relationship δ is preset in the headset, only the distance from the target object to the human eye needs to be obtained.
  • the distance from the target to the human eye can be obtained in various ways, for example by binocular ranging or a depth-of-field camera, whose hardware technology is mature, highly reliable and low-cost.
  • traditional depth-of-field adjustment works by changing the image distance of the optical elements.
  • the invention breaks with this traditional thinking: without changing the optical device structure, it adjusts the depth of field by adjusting the equivalent center distance between the two sets of effective display information on the image display source, which is pioneering and more practical than changing the optical focal length.
  • FIG. 1 is a schematic diagram of the spatial lines of sight of the human eyes
  • FIG. 2 is a schematic flow chart of a depth-of-field adjustment method of a binocular AR head-mounted device according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram of camera imaging
  • FIG. 4 is a schematic diagram of the equivalent symmetry axis OS of the left and right image source halves and the equivalent symmetry axis OA of the two optical systems according to an embodiment of the present invention
  • FIG. 5 is a schematic diagram of the test chart used when calibrating the distance mapping relationship δ according to an embodiment of the present invention
  • FIG. 6 is a schematic diagram showing the gradual change of the test chart when calibrating the distance mapping relationship δ according to an embodiment of the present invention.
  • Figure 1 shows a schematic view of the human eye space line of sight.
  • A, B, C, and D respectively represent objects in different orientations in space.
  • when the human eye observes or gazes at one of these objects, the line-of-sight directions of the left and right eyes are the space vectors represented by the corresponding line segments.
  • for example, when the human eye gazes at object A, the line-of-sight directions of the left eye OL and the right eye OR are the space vectors represented by line segment OLA and line segment ORA, respectively; when the human eye gazes at object B, the line-of-sight directions of the left eye OL and the right eye OR
  • are the space vectors represented by line segment OLB and line segment ORB.
  • when the human eye gazes at a target, the left line-of-sight vector L of the human eyes in the user coordinate system can be expressed as (Lx, Ly, Lz, Lα, Lβ, Lγ), where (Lx, Ly, Lz) are the coordinates of a point on the left line-of-sight vector and (Lα, Lβ, Lγ) are the direction angles of the left line-of-sight vector; similarly, the right line-of-sight vector R can be expressed as (Rx, Ry, Rz, Rα, Rβ, Rγ).
  • using spatial analytic geometry, the left and right line-of-sight vectors of the human eye can be used to obtain the vertical distance dis of the gaze point (for example, object A) from the user, as given by expression (1).
  • the left and right eyes of the wearer can respectively observe two left and right virtual images.
  • when the line of sight of the left eye observing the left virtual image and the line of sight of the right eye observing the right virtual image converge in space, the wearer's two eyes observe a single overlapping virtual picture at a certain distance from the wearer.
  • the distance of this virtual picture from the human eye is determined by the line-of-sight space vectors formed by the left and right virtual images with the left and right eyes, respectively.
  • when the distance of the virtual picture from the human eye equals the vertical distance dis of the target from the user, the virtual picture has a spatial position consistent with the target.
  • the line-of-sight space vector formed by the left and right eyes is determined by the object to be viewed, and on the binocular head-wearing device, the center point pair coordinates of the left and right sets of effective display information can determine the line-of-sight space vector formed by the left and right eyes of the user.
  • the projection distance Ln of the virtual image in the binocular head-mounted device thus has a correspondence with the center point pair coordinates of the left and right sets of effective display information on the headset image source; when the distance Ln of the virtual image from the human eye is set equal to the distance dis of the target object from the user,
  • this correspondence can be converted into the distance mapping relationship δ.
  • that is, the distance mapping relationship δ represents the mapping between the center point pair coordinates of the left and right sets of effective display information on the headset image display source (which can also be understood as pixel point pairs on the image display source) and the distance dis from the target object to the human eye.
  • the distance mapping relationship ⁇ may be either a formula or a discrete data correspondence, and the present invention is not limited thereto.
  • the distance mapping relationship ⁇ can be obtained in a plurality of different ways (for example, determining the distance mapping relationship ⁇ by means of offline calibration, etc., and obtaining the distance before leaving the factory.
  • the mapping relationship ⁇ is stored in the headwear, etc., and the present invention is also not limited thereto.
  • FIG. 2 is a schematic flow chart of the depth-of-field adjustment method of the binocular AR headset provided by this embodiment.
  • in step S201, when the user views a target object in the external environment using the headset, the distance dis from the target object to the human eye is acquired.
  • the headwear device obtains the distance dis of the target object to the human eye through the binocular stereo vision system in step S201.
  • the binocular stereo vision system mainly uses the parallax principle to perform ranging. Specifically, the binocular stereo vision system can determine the distance dis of the target object from the human eye according to the following expression:
  • dis = Z + h, with Z = f·T/(x_l − x_r), where
  • h is the distance from the binocular stereo vision system to the human eye,
  • Z is the distance between the target and the binocular stereo vision system,
  • T is the baseline distance,
  • f is the focal length of the binocular stereo vision system, and
  • x_l and x_r are the x coordinates of the target in the left and right images, respectively.
  • the binocular stereo vision system may be implemented by using different specific devices, and the present invention is not limited thereto.
  • the binocular stereo vision system can be two cameras with the same focal length, a moving camera, or other reasonable devices.
  • the head mounted device may also adopt other reasonable methods to obtain the distance dis of the target object to the human eye, and the present invention is not limited thereto.
  • the headset can obtain the distance dis from the target object to the human eye through a depth-of-field camera, or it can detect the spatial line-of-sight data when the human eye gazes at the target through a gaze tracking system and determine the distance dis from that data.
  • the distance from the target to the human eye can also be determined by the camera imaging ratio.
  • when the headset obtains the distance dis through a depth-of-field camera, it can calculate the depth of field ΔL according to the following standard expressions: ΔL1 = F·δ·L²/(f² + F·δ·L), ΔL2 = F·δ·L²/(f² − F·δ·L), ΔL = ΔL1 + ΔL2 = 2·f²·F·δ·L²/(f⁴ − F²·δ²·L²), where
  • ΔL1 and ΔL2 represent the near (foreground) depth of field and the far (background) depth of field, respectively,
  • δ represents the diameter of the permissible circle of confusion,
  • f represents the focal length of the lens,
  • F represents the aperture value, and
  • L represents the focus distance.
  • in this case, the depth of field ΔL is taken as the distance dis from the target to the human eye.
  • when the headset calculates the distance dis from the target object to the human eye by detecting, through the gaze tracking system, the spatial line-of-sight data while the human eye gazes at the target, it can determine the distance dis using the content illustrated in FIG. 1 and expression (1), which is not repeated here.
  • when the headset calculates the distance from the target to the human eye by the camera imaging scale,
  • the actual size of the target needs to be stored in a database in advance; the camera then captures an image containing the target, the pixel size of the target in the captured image is calculated, the captured image is used to retrieve the stored actual size of the target from the database, and finally the distance dis from the target to the human eye is calculated from the captured image size and the actual size.
  • Fig. 3 is a schematic view of camera imaging, in which AB represents the object, A′B′ represents the image, the object distance OB is u, and the image distance OB′ is v; the triangle similarity relationship gives x/y = u/v (expression (6)), i.e. u = (x/y)·v (expression (7)), where x is the object length and y is the image length.
  • when the camera focal length is fixed, the object distance can be calculated according to expression (7).
  • in this method, the distance from the target to the human eye is the object distance u, the actual size of the target is the object length x, and the pixel size of the target is the image length y.
  • the image distance v is determined by the internal optical structure of the camera. After the optical structure of the camera is determined, the image distance v is a fixed value.
  • the preset distance mapping relationship ⁇ can be used to determine the effective display information of the left and right groups. Center point pair coordinate data.
  • the preset distance mapping relationship ⁇ is preset in the headset, and may be either a formula or a discrete data correspondence.
  • the distance mapping relationship ⁇ can be expressed by the following expression:
  • (SL, SR) represents the coordinates of the center point pair of the effective display information
  • h represents the distance between the target object to the human eye and the coordinates of the center point pair of the effective display information.
  • the distance mapping relationship ⁇ may also be expressed in other reasonable forms, and the present invention is not limited thereto.
  • in step S203, after the center point pair coordinate data of the two sets of effective display information is obtained, the center point pair coordinates of the two sets of effective display information are used as the reference position, and the information source images of the virtual information to be displayed are displayed left and right on the image display source.
  • taking the center point pair coordinates as the reference position means taking the corresponding pixel point pair coordinates as the center positions of the effective display information,
  • and displaying the information source images of the virtual information to be displayed left and right on the image display source; the user can then see the virtual information at the target position through the headset.
  • the information source image of the virtual information may also be displayed in other reasonable manners according to the center point pair coordinate as the reference position, and the present invention is not limited thereto.
  • alternatively, taking the center point pair coordinates as the reference position may mean taking positions offset from the coordinates of the corresponding pixel point pair as the center positions of the effective display information, and displaying the information source images of the virtual information to be displayed
  • left and right on the image display source. In this case, the user can see the virtual information next to the target through the headset.
  • the virtual information can be displayed next to the target by setting a certain offset to avoid obstructing the target, which is more in line with the user's habits.
  • when offset, the information source images of the left and right virtual information preferably remain synchronized, i.e. the center distance and relative position of the left and right information source images remain unchanged; only their position on the image display source changes.
  • the distance mapping relationship ⁇ is preset inside the headset, which can be obtained by an offline calibration test. Generally, the distance mapping relationship ⁇ is tested by the manufacturer and stored in the headwear device before leaving the factory. The distance mapping relationship ⁇ is related to the structure of the head-mounted device. After the structure is fixed, the distance mapping relationship ⁇ is almost fixed.
  • the distance mapping relationship ⁇ can be obtained by collecting the Q test user data through the gaze tracking system, and each test user observes the k sets of test charts.
  • Q is a positive integer. It should be noted that the value of Q can be 1 if needed.
  • the equivalent symmetry axis OS of the left and right image sources coincides with the equivalent symmetry axis OA of the two sets of optical systems.
  • OL and OR represent the left and right eyes respectively, D represents the pupil distance, and d0 represents the distance between the main optical axes of the two optical systems.
  • by having each test user observe the k sets of test charts, k sets of line-of-sight space vectors (i.e. the user's spatial line-of-sight information data) are obtained,
  • from which the k correspondences between the test chart center point coordinate data on the image display source and the line-of-sight vector data can be derived.
  • for each user, the step of acquiring the k correspondences between the test chart center point coordinate data on the image display source and the spatial line-of-sight data includes:
  • Step 1: after the tester puts on the headset, the image display source of the headset displays two identical test charts, one on the left and one on the right.
  • the test charts displayed by the image display source are cross charts; the center distance between the two cross charts L1 and L2 is d1, and the center points of the cross charts are symmetric about OS (in the present embodiment, the virtual images are taken as symmetric about OS), where the center distance d1 of the two cross charts L1 and L2 is smaller than the distance d0 between the main optical axes of the two optical systems.
  • Step 2: when the test user gazes through the headset window at the virtual cross chart projected and overlapped in front of the human eyes, the gaze tracking system records the line-of-sight space vector of the test user gazing at the virtual cross chart, thereby obtaining one set of data.
  • the distance of the virtual picture from the human eye is determined by the line-of-sight space vectors formed by the left and right virtual images with the left and right eyes, respectively.
  • when this distance equals the vertical distance dis of the target from the user, the virtual picture has a spatial position consistent with the target.
  • the coordinates of the left and right cross charts displayed by the image source in the first set of test charts, in the image source coordinate system, are denoted (SLX1, SLY1) and (SRX1, SRY1), respectively.
  • while the image source displays this cross chart, the gaze tracking system records the left and right eye line-of-sight vector coordinates of the current test user gazing, through the headset window, at the virtual chart that fully overlaps after projection by the headset optical system;
  • the left and right eye line-of-sight vector coordinates when gazing at the first set of test charts are recorded as (ELX1, ELY1) and (ERX1, ERY1).
  • the positions {(SLX1, SLY1), (SRX1, SRY1)} of the left and right cross charts displayed by the image source in the first set of test charts are abbreviated as (SL1, SR1),
  • and the test user's left and right eye line-of-sight vector coordinates {(ELX1, ELY1), (ERX1, ERY1)} at this time are abbreviated as (EL1, ER1); expression (8) can then be written as the correspondence (SL1, SR1) ↔ (EL1, ER1), denoted expression (9).
  • from the left and right eye line-of-sight vectors, the distance Ln_1 from the gaze point to the human eye can be obtained,
  • giving the mapping between the distance Ln_1 of the virtual picture that the user sees through the headset window after projection by the headset and the center coordinates (SL1, SR1) of the left and right display information on the headset image source, namely the correspondence (SL1, SR1) ↔ Ln_1, denoted expression (10).
  • Step 3: successively reduce the center distance of the left and right cross charts displayed on the image display source according to a preset rule (see FIG. 6), repeating Step 2 each time the center distance is reduced.
  • after k such operations, k sets of data are obtained; each set of data is a correspondence between the cross chart center point coordinate data on the image display source and the spatial line-of-sight data, namely (SLi, SRi) ↔ (ELi, ERi) for i = 1, …, k.
  • from these k sets of data, the distances of the virtual pictures of the image information seen by the user through the headset window after projection by the headset can be obtained,
  • giving the k mappings between these distances and the center coordinates of the left and right display information on the headset image source, namely (SLi, SRi) ↔ Ln_i for i = 1, …, k.
  • data fitting over the k·Q sets of mapping relationship data yields the fitted curve function h between the left/right point pair coordinates on the display screen and the human-eye line-of-sight data;
  • given the fitted curve formula h, the coordinate data of any pair of left and right points on the display screen can be substituted into it to obtain the corresponding distance of the virtual projected information from the human eye, as shown in the following formula: Ln_p = h(SLp, SRp) (expression (15)), where
  • (SLp, SRp) represents the center position coordinates of one pair of left-right symmetric information displayed on the headset image source, and
  • Ln_p represents the distance of the virtual picture from the human eye.
  • expression (15) can be simplified as Ln = h(SL, SR) (expression (16)), where
  • Ln represents the distance from the virtual picture to the human eye, and (SL, SR) represents the center position coordinates of a pair of left-right symmetric information displayed on the headset image source.
  • of course, the center position coordinates (SL, SR) must lie within the corresponding image display source.
  • since the virtual picture must have the same spatial depth as the target during use, the distance Ln of the virtual picture seen by the user through the headset window from the human eye is equal to the distance dis from the target object to the human eye,
  • so expression (16) is equivalent to dis = h(SL, SR) (expression (17)).
  • when a user first uses the headset, a method similar to the calibration of the distance mapping relationship δ can be used to perform a simple
  • calibration so that the distance mapping relationship δ is better adapted to that user.
  • likewise, each time a user wears the head-mounted device, there is a slight deviation in the wearing position,
  • so a similar method can be used to correct the distance mapping relationship δ at each wearing.
  • in other words, the distance mapping relationship δ is corrected for different users, or for the different usage states of a user.
  • specifically, when the user puts on the headset, the headset starts and the image display source displays left-right symmetric
  • cross charts; the eye tracking system records the line-of-sight space vectors of the user's eyes as the user gazes at the overlapping cross chart projected in front of the eyes, and from this data set the headset makes a user-adapted correction of the mapping relationship δ between the distance Ln_p of the virtual projected information from the human eye and the
  • left-right symmetric pixel point pairs (SLp, SRp) on the image display source of the device, which can be expressed by introducing a correction factor w into the fitted relationship, e.g. Ln_p = w·h(SLp, SRp) (expression (18)).
  • in this correction process, a set of left-right symmetric cross chart coordinates on the display screen and the corresponding line-of-sight space vector data of the user are recorded; from this line-of-sight data and expression (1), the corresponding projection distance Ln_x, i.e. the first distance, is calculated.
  • meanwhile, the projection distance data Ln_y corresponding to the cross chart coordinates, i.e. the second distance, is obtained using the mapping relationship δ stored in the headset. Comparing this second distance Ln_y with the first distance Ln_x yields a compensation coefficient (i.e. the correction factor) w, chosen so that the root mean square error between the calculated data and the test data is minimized.
  • gaze tracking is a technique for acquiring a subject's current gaze direction using various electronic, optical or other detection means. It uses certain eye structures and features whose relative positions remain unchanged as the eyeball rotates as references, extracts gaze variation parameters between these invariant features and position-varying features, and then obtains the gaze direction through a geometric model or a mapping model.
  • when the user views the external environment through the headset, the distances of targets at different depths of field in front of the human eye from the user are obtained by one of the four foregoing methods, and the user can issue instructions to the headset through external control (such as voice control or button control),
  • for example requesting display of information about one of the targets (for example, target A); upon receiving the instruction, the headset will, according to the distance of the user-specified target (for example, target A) from the user, display
  • the information related to that target (for example, target A) correspondingly on the device image source.
  • that is, from the distance of the target from the user, the device central processor obtains the coordinates (SLp, SRp) of a pixel point pair, and the information to be projected related to the target is
  • displayed identically on the left and right of the device image source, centered on (SLp, SRp) or at a certain offset from (SLp, SRp).
  • the user can then see, through the headset window, a virtual projection of the information related to the specified target at a certain distance from the user (namely, the distance of the target from the user).
  • the embodiment further provides a binocular AR head-mounted device capable of automatically adjusting the depth of field, which includes an image display source, a distance data acquisition module and a data processing module, the distance mapping relationship δ being stored in the data processing module.
  • the distance mapping relationship ⁇ represents a mapping relationship between the coordinates of the center point pair of the left and right sets of effective display information on the image display source of the headset and the distance dis of the target object from the human eye.
  • the distance data acquisition module acquires data related to the target object to the human eye distance dis, and transmits the data to the data processing module.
  • the distance data acquisition module may be any one of a single camera, a binocular stereo vision system, a depth of field camera, and a line of sight tracking system.
  • when the distance data acquisition module is a single camera, it can acquire the data related to the distance dis from the target object to the human eye through the camera imaging scale.
  • when the distance data acquisition module is a binocular stereo vision system, it can use parallax-based ranging to obtain the data related to the distance dis from the target object to the human eye.
  • when the distance data acquisition module is a gaze tracking system,
  • it acquires the data related to the distance dis from the target object to the human eye according to the foregoing expression (1).
  • the distance data acquisition module is a depth of field camera, the distance data acquisition module can directly obtain data related to the distance dis of the target object to the human eye.
  • the data processing module calculates the distance dis from the target object to the human eye from the data transmitted by the distance data acquisition module, and obtains, according to the distance mapping relationship δ, the center
  • point pair coordinate data of the two sets of effective display information corresponding to the distance dis; the data processing module then controls the image display source to display the information source images of the virtual information to be displayed on the image display source, using the corresponding point pair coordinate data as the reference position.
  • displaying the information source images of the virtual information with the corresponding point pair coordinates as the reference position may mean
  • displaying the left and right information source images on the image display source centered on the corresponding point pair coordinates, or centered at a certain offset from the center point pair coordinates; the present invention is not limited thereto.
  • the distance mapping relationship ⁇ may also be obtained or modified in other reasonable manners, and the present invention is also not limited thereto.
  • the binocular AR headset and the depth-of-field adjustment method provided by the present invention can accurately superimpose virtual information near the gaze point of the human eye, so that the virtual information is highly integrated with the environment, realizing augmented reality in the true sense.
  • the solution of the invention is simple: provided the distance mapping relationship δ is preset in the headset, only the distance from the target object to the human eye needs to be obtained.
  • the distance from the target to the human eye can be obtained in various ways, for example by binocular ranging or a depth-of-field camera, whose hardware technology is mature, highly reliable and low-cost.
  • traditional depth-of-field adjustment works by changing the image distance of the optical elements.
  • the invention breaks with this traditional thinking: without changing the optical device structure, it adjusts the depth of field by adjusting the equivalent center distance between the two sets of effective display information on the image display source, which is pioneering and more practical than changing the optical focal length.
  • the invention is not limited to the specific embodiments described above.
  • the invention extends to any new feature or any new combination disclosed in this specification, as well as any novel method or process steps or any new combination disclosed.

Abstract

A binocular AR head-mounted device capable of automatically adjusting depth of field and a depth-of-field adjustment method, the method comprising: acquiring the distance dis from the target object to the human eye; obtaining, according to the distance dis from the target object to the human eye and a preset distance mapping relationship δ, the center point pair coordinate data of the left and right sets of effective display information corresponding to the distance dis, wherein the preset distance mapping relationship δ represents the mapping between the center point pair coordinate data and the distance dis from the target object to the human eye; and displaying, according to the center point pair coordinate data, the information source images of the virtual information to be displayed on the left and right image display sources respectively. The method can accurately superimpose virtual information near the gaze point of the human eye, so that the virtual information is highly integrated with the environment, realizing augmented reality in the true sense.

Description

Binocular AR head-mounted device capable of automatically adjusting depth of field and depth-of-field adjustment method
Cross-Reference to Related Applications
This application claims priority to Chinese patent application CN201510029879.7, filed on January 21, 2015 and entitled "Binocular AR head-mounted device capable of automatically adjusting depth of field and depth-of-field adjustment method", the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the field of head-mounted display devices, and in particular to a binocular AR head-mounted device capable of automatically adjusting depth of field and a depth-of-field adjustment method thereof.
Background
With the rise of wearable devices, head-mounted display devices have become a research and development focus of major companies and are gradually entering the public eye. Head-mounted display devices are an ideal application environment for Augmented Reality (AR) technology, as they can present virtual information within the real environment through the head-mounted device window.
However, most existing AR head-mounted display devices superimpose AR information considering only its correlation with the X and Y coordinates of the target position, without considering the depth information of the target. As a result, the virtual information merely floats in front of the human eye with little integration into the environment, leading to a poor user experience with AR head-mounted display devices.
In the prior art, there are methods for adjusting the depth of field on a head-mounted device, but most of them mechanically adjust the optical structure of the optical lens assembly to change the image distance of the optical elements and thereby adjust the depth of field of the virtual image. This manner of depth-of-field adjustment makes the head-mounted device bulky and costly, and its precision is difficult to control.
Summary of the Invention
The technical problem to be solved by the present invention is that existing AR head-mounted devices, which adjust the depth of field of the virtual image by mechanical adjustment, are bulky and costly and their precision is difficult to control. To solve the above problems, an embodiment of the present invention first provides a depth-of-field adjustment method for a binocular AR head-mounted device, the method comprising:
acquiring the distance dis from the target object to the human eye;
obtaining, according to the distance dis from the target object to the human eye and a preset distance mapping relationship δ, the center point pair coordinate data of the left and right sets of effective display information corresponding to the distance dis, wherein the preset distance mapping relationship δ represents the mapping between the center point pair coordinate data and the distance dis from the target object to the human eye;
displaying, according to the center point pair coordinate data, the information source images of the virtual information to be displayed on the left and right image display sources respectively.
According to an embodiment of the present invention, the distance dis from the target object to the human eye is obtained by a binocular stereo vision system.
According to an embodiment of the present invention, the distance dis from the target object to the human eye is determined according to the following expression:
dis = Z + h, with Z = f·T/(x_l − x_r)
where h represents the distance from the binocular stereo vision system to the human eye, Z represents the distance between the target object and the binocular stereo vision system, T represents the baseline distance, f represents the focal length, and x_l and x_r represent the x coordinates of the target object in the left image and the right image, respectively.
According to an embodiment of the present invention, the spatial line-of-sight information when the human eye gazes at the target is detected by a gaze tracking system, and the distance dis from the target object to the human eye is determined from the spatial line-of-sight data.
According to an embodiment of the present invention, the distance dis from the target object to the human eye is determined according to the following expression:
[expression (1): the formula computing dis from the left and right line-of-sight vectors is rendered as an image in the original]
where (Lx, Ly, Lz) and (Lα, Lβ, Lγ) respectively represent the coordinates and direction angles of the target point on the left line-of-sight vector, and (Rx, Ry, Rz) and (Rα, Rβ, Rγ) respectively represent the coordinates and direction angles of the target point on the right line-of-sight vector.
According to an embodiment of the present invention, the distance dis from the target object to the human eye is determined by the camera imaging scale.
According to an embodiment of the present invention, the distance dis from the target object to the human eye is determined by a depth-of-field camera.
According to an embodiment of the present invention, in the method, the information source images of the virtual information to be displayed are displayed on the left and right image display sources respectively, with the center point pair coordinates as the center position.
According to an embodiment of the present invention, in the method, the information source images of the virtual information to be displayed are displayed on the left and right image display sources respectively, with a position offset from the center point pair coordinates in a preset direction as the center position.
According to an embodiment of the present invention, the method further comprises: correcting the preset distance mapping relationship δ when the user first uses the head-mounted device and/or each time the user uses the head-mounted device.
According to an embodiment of the present invention, the step of correcting the preset distance mapping relationship δ comprises:
controlling the image display source of the head-mounted device to display preset information source images on the left and right image display sources respectively;
acquiring the line-of-sight space vectors of the human eyes when the preset information source images displayed on the left and right image display sources are observed to overlap in front of the human eyes, and obtaining a first distance from the line-of-sight space vectors;
obtaining a second distance from the coordinate data of the preset information source images on the left and right image display sources, using the preset distance mapping relationship δ;
determining a correction factor from the first distance and the second distance;
correcting the preset distance mapping relationship δ with the correction factor.
According to an embodiment of the present invention, the preset distance mapping relationship δ is expressed as:
dis = h(SL, SR)
where dis represents the distance from the target object to the human eye, h represents the fitted curve function, and (SL, SR) represents the coordinate data of the center point pair of the left and right sets of effective display information.
According to an embodiment of the present invention, in the method, constructing the preset distance mapping relationship δ comprises:
Step 1: displaying preset test images at preset positions of the left and right image display sources;
Step 2: acquiring the line-of-sight space vector when the user gazes at the virtual test chart, and determining, from the line-of-sight space vector and the display positions of the preset test images, one set of mapping relationship data between the preset test image positions and the distance of the corresponding target object from the human eye;
Step 3: successively reducing the center distance of the preset test images according to a preset rule, and repeating Step 2 until k sets of mapping relationship data between the preset test image positions and the distances of the corresponding target objects from the human eye are obtained;
Step 4: fitting the k sets of mapping relationship data between the preset test image positions and the distances of the corresponding target objects from the human eye, thereby constructing the preset distance mapping relationship δ.
The present invention also provides a binocular AR head-mounted device capable of automatically adjusting the depth of field, comprising:
an optical system;
an image display source, comprising a left image display source and a right image display source;
a distance data acquisition module, configured to acquire data related to the distance dis from the target object to the human eye;
a data processing module, connected to the distance data acquisition module and configured to determine the distance dis from the target object to the human eye from the data related to the distance dis, to determine, in combination with a preset distance mapping relationship δ, the center point pair coordinate data of the left and right sets of effective display information corresponding to the distance dis, and to display, according to the center point pair coordinate data, the information source images of the virtual information to be displayed on the left and right image display sources respectively;
wherein the preset distance mapping relationship δ represents the mapping between the center point pair coordinate data and the distance dis from the target object to the human eye.
According to an embodiment of the present invention, the distance data acquisition module comprises any one of the following:
a single camera, a binocular stereo vision system, a depth-of-field camera, or a gaze tracking system.
According to an embodiment of the present invention, the data processing module is configured to display the information source images of the virtual information to be displayed on the left and right image display sources respectively, with a position offset from the center point pair coordinates in a preset direction as the center position.
According to an embodiment of the present invention, the data processing module is configured to display the information source images of the virtual information to be displayed on the left and right image display sources respectively, with the center point pair coordinates as the center position.
According to an embodiment of the present invention, the binocular AR head-mounted device further corrects the preset distance mapping relationship δ when the user first uses the head-mounted device and/or each time the user uses the head-mounted device.
According to an embodiment of the present invention, the preset distance mapping relationship δ is expressed as:
dis = h(SL, SR)
where dis represents the distance from the target object to the human eye, h represents the fitted curve function, and (SL, SR) represents the coordinate data of the center point pair of the left and right sets of effective display information.
The binocular AR head-mounted device and depth-of-field adjustment method provided by the present invention can accurately superimpose virtual information near the gaze point of the human eye, so that the virtual information is highly integrated with the environment, realizing augmented reality in the true sense.
The solution of the present invention is simple: provided the distance mapping relationship δ is preset in the head-mounted device, only the distance from the target object to the human eye needs to be obtained. This distance can be obtained in various ways, for example by binocular ranging or a depth-of-field camera, whose hardware technology is mature, highly reliable and low-cost.
Traditional depth-of-field adjustment always works by changing the image distance of the optical elements. The present invention breaks with this traditional thinking: without changing the optical device structure, it adjusts the depth of field by adjusting the equivalent center distance between the left and right sets of effective display information on the image display source, which is pioneering and more practical than changing the optical focal length.
Further features and advantages of the present invention will be set forth in the following description, and will in part become apparent from the description or be understood by practicing the present invention. The objects and other advantages of the present invention can be realized and obtained by the structures particularly pointed out in the description, the claims and the drawings.
Brief Description of the Drawings
In order to explain the technical solutions of the embodiments of the present invention or of the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art can obtain other drawings from them without inventive effort:
FIG. 1 is a schematic diagram of the spatial lines of sight of the human eyes;
FIG. 2 is a schematic flow chart of a depth-of-field adjustment method of a binocular AR head-mounted device according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of camera imaging;
FIG. 4 is a schematic diagram of the equivalent symmetry axis OS of the left and right image source halves and the equivalent symmetry axis OA of the two sets of optical systems according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of the test chart used when calibrating the distance mapping relationship δ according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of the gradual change of the test chart when calibrating the distance mapping relationship δ according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings of the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without inventive effort fall within the scope of protection of the present invention.
When the human eyes (the left eye OL and the right eye OR) gaze at targets in different regions of space, the line-of-sight vectors of the left eye OL and the right eye OR differ. FIG. 1 shows a schematic diagram of the spatial lines of sight of the human eyes. In FIG. 1, A, B, C and D represent objects in different spatial directions; when the human eye observes or gazes at one of them, the line-of-sight directions of the left and right eyes are the space vectors represented by the corresponding line segments.
For example, when the human eye gazes at object A, the line-of-sight directions of the left eye OL and the right eye OR are the space vectors represented by line segment OLA and line segment ORA, respectively; when the human eye gazes at object B, they are the space vectors represented by line segment OLB and line segment ORB. Once the line-of-sight space vectors of the left and right eyes when gazing at a target (for example, object A) are known, the distance between that target and the human eye can be calculated from them.
When the human eye gazes at a target (for example, object A), the left line-of-sight vector L of the human eyes in the user coordinate system can be expressed as (Lx, Ly, Lz, Lα, Lβ, Lγ), where (Lx, Ly, Lz) are the coordinates of a point on the left line-of-sight vector and (Lα, Lβ, Lγ) are the direction angles of the left line-of-sight vector; similarly, the right line-of-sight vector R can be expressed as (Rx, Ry, Rz, Rα, Rβ, Rγ).
According to spatial analytic geometry, the vertical distance dis of the gaze point (for example, object A) from the user can be obtained from the left and right line-of-sight vectors of the human eyes:
[expression (1): the formula computing dis from the left and right line-of-sight vectors is rendered as an image in the original]
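The exact form of expression (1) survives only as an image in the source document. As a purely illustrative sketch of this kind of computation (not the patent's formula), the following Python fragment triangulates the two gaze rays defined by the vectors above and returns the depth of their near-intersection; the function name and the midpoint-of-common-perpendicular construction are assumptions:

    import numpy as np

    def gaze_point_distance(p_l, ang_l, p_r, ang_r):
        # p_l, p_r: points on the left/right line-of-sight vectors, e.g. (Lx, Ly, Lz).
        # ang_l, ang_r: direction angles in degrees, e.g. (L_alpha, L_beta, L_gamma).
        # Direction cosines from the direction angles.
        d_l = np.cos(np.radians(np.asarray(ang_l, dtype=float)))
        d_r = np.cos(np.radians(np.asarray(ang_r, dtype=float)))
        p_l = np.asarray(p_l, dtype=float)
        p_r = np.asarray(p_r, dtype=float)
        # Closest points on the two (generally skew) rays p + t*d.
        a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
        w0 = p_l - p_r
        d, e = d_l @ w0, d_r @ w0
        denom = a * c - b * b          # zero only if the rays are parallel
        t = (b * e - c * d) / denom
        s = (a * e - b * d) / denom
        # Take the midpoint of the common perpendicular as the gaze point.
        gaze_point = 0.5 * ((p_l + t * d_l) + (p_r + s * d_r))
        return gaze_point[2]           # depth along the assumed viewing (z) axis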
In the field of augmented reality head-mounted devices, through a binocular head-mounted device the wearer's left and right eyes can respectively observe left and right virtual images. When the line of sight of the left eye observing the left virtual image and the line of sight of the right eye observing the right virtual image converge in a spatial region, the wearer's two eyes observe a single overlapping virtual picture at a certain distance from the wearer. The distance of this virtual picture from the human eye is determined by the line-of-sight space vectors formed by the left and right virtual images with the left and right eyes. When the distance of the virtual picture from the human eye equals the vertical distance dis of the target from the user, the virtual picture has a spatial position consistent with the target.
The line-of-sight space vectors formed by the left and right eyes are determined by the object being viewed, while on a binocular head-mounted device the center point pair coordinates of the left and right sets of effective display information in turn determine the line-of-sight space vectors formed by the user's left and right eyes. The projection distance Ln of the virtual image in the binocular head-mounted device therefore corresponds to the center point pair coordinates of the left and right sets of effective display information on the head-mounted device image source; when the distance Ln of the virtual picture from the human eye is set equal to the vertical distance dis of the target object from the user, this correspondence can be converted into the distance mapping relationship δ. That is, the distance mapping relationship δ represents the mapping between the center point pair coordinates (which can also be understood as pixel point pairs on the image display source) of the left and right sets of effective display information on the head-mounted device image display source and the distance dis from the target object to the human eye.
It should be noted that, in different embodiments of the present invention, the distance mapping relationship δ can be either a formula or a discrete data correspondence; the present invention is not limited in this respect.
It should also be noted that, in different embodiments of the present invention, the distance mapping relationship δ can be obtained in a number of different ways (for example, by determining it through offline calibration and storing the result in the head-mounted device before leaving the factory); the present invention is likewise not limited in this respect.
FIG. 2 shows a schematic flow chart of the depth-of-field adjustment method of the binocular AR head-mounted device provided by this embodiment.
In the depth-of-field adjustment method provided by this embodiment, in step S201, when the user views a target object in the external environment through the head-mounted device, the distance dis from that target object to the human eye is acquired.
In this embodiment, in step S201 the head-mounted device obtains the distance dis from the target object to the human eye through a binocular stereo vision system. The binocular stereo vision system mainly uses the parallax principle for ranging. Specifically, the binocular stereo vision system can determine the distance dis of the target object from the human eye according to the following expression:
dis = Z + h, with Z = f·T/(x_l − x_r)
where h represents the distance from the binocular stereo vision system to the human eye, Z represents the distance between the target object and the binocular stereo vision system, T represents the baseline distance, f represents the focal length of the binocular stereo vision system, and x_l and x_r represent the x coordinates of the target object in the left image and the right image, respectively.
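As a purely illustrative aid (not part of the patent text), the following Python sketch evaluates the parallax-ranging relation as reconstructed above, dis = f·T/(x_l − x_r) + h; the function name and the example values are assumptions:

    def stereo_distance_to_eye(f, T, x_left, x_right, h):
        # f: focal length of the stereo cameras (pixels)
        # T: baseline distance between the two cameras (same unit as the result)
        # x_left, x_right: x coordinates of the target in the left/right images (pixels)
        # h: distance from the stereo vision system to the human eye
        disparity = x_left - x_right
        if disparity <= 0:
            raise ValueError("target must be in front of the cameras (positive disparity)")
        Z = f * T / disparity   # target-to-camera distance from the parallax principle
        return Z + h            # add the camera-to-eye offset to obtain dis

    # Example: f = 800 px, baseline 6 cm, disparity 16 px, cameras 2 cm in front of the eyes:
    # stereo_distance_to_eye(800, 0.06, 412, 396, 0.02) -> 3.02 (meters)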
It should be noted that, in different embodiments of the present invention, the binocular stereo vision system can be implemented with different specific devices; the present invention is not limited in this respect. For example, in different embodiments the binocular stereo vision system can be two cameras with the same focal length, a single moving camera, or other reasonable devices.
It should also be noted that, in other embodiments of the present invention, the head-mounted device can use other reasonable methods to obtain the distance dis from the target object to the human eye; the present invention is likewise not limited in this respect. For example, in different embodiments the head-mounted device can obtain the distance dis through a depth-of-field camera, detect the spatial line-of-sight data when the human eye gazes at the target through a gaze tracking system and determine the distance dis from that data, or determine the distance dis through the camera imaging scale.
When the head-mounted device obtains the distance dis from the target object to the human eye through a depth-of-field camera, it can calculate the depth of field ΔL according to the following expressions:
ΔL1 = F·δ·L² / (f² + F·δ·L)
ΔL2 = F·δ·L² / (f² − F·δ·L)
ΔL = ΔL1 + ΔL2 = 2·f²·F·δ·L² / (f⁴ − F²·δ²·L²)
where ΔL1 and ΔL2 represent the near (foreground) depth of field and the far (background) depth of field respectively, δ represents the diameter of the permissible circle of confusion, f represents the focal length of the lens, F represents the aperture value, and L represents the focus distance. In this case, the depth of field ΔL is taken as the distance dis from the target object to the human eye.
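A minimal Python sketch of these standard depth-of-field expressions, offered purely for illustration (names and example values are assumptions):

    def depth_of_field(f, F, L, c):
        # f: lens focal length, F: aperture value (f-number),
        # L: focus distance, c: permissible circle-of-confusion diameter
        # (written as delta in the text above). All lengths in the same unit.
        # Note: if f**2 <= F*c*L the focus distance is beyond the hyperfocal
        # distance and the background depth of field is effectively infinite.
        dof_near = F * c * L**2 / (f**2 + F * c * L)   # foreground depth, delta-L1
        dof_far = F * c * L**2 / (f**2 - F * c * L)    # background depth, delta-L2
        return dof_near, dof_far, dof_near + dof_far   # total depth of field, delta-L

    # Example: 50 mm lens at F2.8 focused at 3 m with c = 0.03 mm:
    # depth_of_field(0.05, 2.8, 3.0, 0.00003) -> (~0.27 m, ~0.34 m, ~0.61 m)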
When the head-mounted device calculates the distance dis from the target object to the human eye by detecting, through the gaze tracking system, the spatial line-of-sight data while the human eye gazes at the target, it can determine the distance dis using the content set forth with respect to FIG. 1 and expression (1), which is not repeated here.
When the head-mounted device calculates the distance dis from the target object to the human eye through the camera imaging scale, the actual size of the target object needs to be stored in a database in advance. The camera then captures an image containing the target object and the pixel size of the target object in the captured image is calculated; the captured image is then used to retrieve the stored actual size of the target object from the database; finally, the distance dis from the target object to the human eye is calculated from the captured image size and the actual size.
FIG. 3 shows a schematic diagram of camera imaging, in which AB represents the object and A′B′ represents the image; denoting the object distance OB as u and the image distance OB′ as v, the triangle similarity relationship gives:
x / y = u / v    (6)
From expression (6):
u = (x / y) · v    (7)
where x represents the object length and y represents the image length.
When the camera focal length is fixed, the object distance can be calculated according to expression (7). In this embodiment, the distance from the target object to the human eye is the object distance u, the actual size of the target object is the object length x, and the pixel size of the target object is the image length y. The image distance v is determined by the internal optical structure of the camera; once the optical structure of the camera is fixed, the image distance v is a fixed value.
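A minimal Python sketch of this imaging-scale calculation from expressions (6) and (7); the database lookup is mocked with a dictionary, and all names, keys and scale factors are illustrative assumptions:

    # Known real-world sizes of recognizable targets, stored in advance (object length x, meters).
    KNOWN_OBJECT_SIZES_M = {"door": 2.0, "a4_sheet": 0.297}

    def distance_by_imaging_scale(target, pixel_size, image_distance_v, pixels_per_meter):
        # target: key used to retrieve the stored actual size x from the database
        # pixel_size: measured size of the target in the captured image (pixels)
        # image_distance_v: fixed image distance v of the camera (meters)
        # pixels_per_meter: sensor scale converting the pixel count to the image length y
        x = KNOWN_OBJECT_SIZES_M[target]      # actual object length x
        y = pixel_size / pixels_per_meter     # image length y on the sensor
        return x / y * image_distance_v       # expression (7): u = (x / y) * v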
As shown again in FIG. 2, after the distance dis from the target object to the human eye is obtained, in step S202 the center point pair coordinate data of the left and right sets of effective display information is determined from the distance dis using the preset distance mapping relationship δ. In this embodiment, the preset distance mapping relationship δ is preset in the head-mounted device, and can be either a formula or a discrete data correspondence.
Specifically, in this embodiment the distance mapping relationship δ can be expressed by the following expression:
dis = h(SL, SR)
where dis represents the distance from the target object to the human eye, (SL, SR) represents the coordinates of the center point pair of the effective display information, and h represents the fitted curve function between the distance dis from the target object to the human eye and the coordinates of the center point pair of the effective display information.
It should be noted that, in other embodiments of the present invention, the distance mapping relationship δ can also take other reasonable forms; the present invention is not limited in this respect.
After the center point pair coordinate data of the left and right sets of effective display information is obtained, in step S203 the information source images of the virtual information to be displayed are displayed left and right on the image display source, using the center point pair coordinate data of the left and right sets of effective display information as the reference position.
In this embodiment, taking the center point pair coordinates as the reference position means taking the corresponding pixel point pair coordinates as the center positions of the effective display information and displaying the information source images of the virtual information to be displayed left and right on the image display source. The user can then see the virtual information at the target position through the head-mounted device.
It should be noted that, in other embodiments of the present invention, the information source images of the virtual information can also be displayed in other reasonable ways relative to the center point pair coordinates as the reference position; the present invention is not limited in this respect. For example, in one embodiment of the invention, taking the center point pair coordinates as the reference position means taking positions offset by a certain amount from the coordinates of the corresponding pixel point pairs as the center positions of the effective display information and displaying the information source images of the virtual information to be displayed left and right on the image display source. In this case, the user can see the virtual information next to the target through the head-mounted device.
In that embodiment, setting a certain offset displays the virtual information next to the target, avoiding occlusion of the target, which better matches user habits.
It should be pointed out that, in that embodiment, the information source images of the left and right virtual information preferably remain synchronized during the offset, i.e. the center distance and the relative position of the left and right information source images remain unchanged; only their position on the image display source changes.
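As noted above, δ may be stored as a discrete data correspondence rather than a formula. The following Python sketch of steps S202/S203 is offered purely for illustration under that assumption; the table values, the interpolation scheme and all names are illustrative, not the patent's stored format:

    import numpy as np

    # Illustrative discrete form of delta: distance dis (m) -> center x-coordinates
    # (SL, SR) of the left and right effective display information (pixels).
    DELTA_TABLE = [
        (0.5, (400.0, 240.0)),
        (1.0, (380.0, 260.0)),
        (2.0, (370.0, 270.0)),
        (5.0, (364.0, 276.0)),
    ]

    def center_point_pair(dis):
        # Interpolate the center point pair (SL, SR) for a measured distance dis.
        d = np.array([row[0] for row in DELTA_TABLE])
        sl = np.interp(dis, d, [row[1][0] for row in DELTA_TABLE])
        sr = np.interp(dis, d, [row[1][1] for row in DELTA_TABLE])
        return sl, sr

    # Step S203 would then draw the virtual information centered on (SL, SR) in the
    # left/right sources, optionally shifted by a fixed synchronized offset so that
    # the information sits beside the target instead of covering it.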
In this embodiment, the distance mapping relationship δ is preset inside the head-mounted device and can be obtained through an offline calibration test. Generally, the distance mapping relationship δ is measured by the manufacturer and stored in the head-mounted device before leaving the factory. The distance mapping relationship δ is related to the structure of the head-mounted device; once the structure is fixed, the distance mapping relationship δ is essentially fixed as well.
However, for different users, the wearing deviation needs to be corrected by a certain correction coefficient. To disclose the solution of the present invention more fully, one calibration method for the distance mapping relationship δ is illustrated below; it should be pointed out that this is only an example, and the calibration method is not limited to this one.
The distance mapping relationship δ can be obtained by collecting data from Q test users through the gaze tracking system, with each test user observing k sets of test charts, where Q is a positive integer. It should be pointed out that the value of Q can be 1 if needed.
Assume that the resolutions of the left and right display source regions of the image display source of the head-mounted device are both N*M, i.e. the horizontal and vertical resolutions are M and N respectively. As shown in FIG. 4, the equivalent symmetry axis OS of the left and right image source halves coincides with the equivalent symmetry axis OA of the two sets of optical systems. In FIG. 4, OL and OR represent the left and right eyes respectively, D represents the pupil distance, and d0 represents the distance between the main optical axes of the two optical systems.
When determining the distance mapping relationship δ, by having each test user observe the k sets of test charts, k sets of the test users' line-of-sight space vectors (i.e. the users' spatial line-of-sight information data) are obtained; from these k sets of line-of-sight space vector data, the k correspondences between the test chart center point coordinate data on the image display source and the line-of-sight space vector data can be obtained.
Specifically, in this embodiment, for each user the step of acquiring the k correspondences between the test chart center point coordinate data on the image display source and the spatial line-of-sight data includes:
Step 1: after the tester puts on the head-mounted device, the image display source of the head-mounted device displays two identical test charts, one on the left and one on the right. As shown in FIG. 5, in this embodiment the test charts displayed by the image display source are, by way of example, cross charts; the center distance between the two cross charts L1 and L2 is d1, and the center points of the cross charts are symmetric about OS (in this embodiment the virtual images are taken as symmetric about OS by way of example), where the center distance d1 of the two cross charts L1 and L2 is smaller than the distance d0 between the main optical axes of the two optical systems.
Step 2: when the test user gazes through the head-mounted device window at the virtual cross chart projected and overlapped in front of the human eyes, the gaze tracking system records the line-of-sight space vector of the test user gazing at the virtual cross chart, thereby obtaining one set of data.
The distance of this virtual picture from the human eye is determined by the line-of-sight space vectors formed by the left and right virtual images with the left and right eyes. When the distance of the virtual picture from the human eye equals the vertical distance dis of the target from the user, the virtual picture has a spatial position consistent with the target.
In this embodiment, the coordinates of the left and right cross charts displayed by the image source in the first set of test charts, in the image source coordinate system, are denoted (SLX1, SLY1) and (SRX1, SRY1) respectively. While the image source displays this cross chart, the gaze tracking system records the left and right eye line-of-sight vector coordinates of the current test user gazing, through the head-mounted device window, at the virtual chart that fully overlaps after projection by the head-mounted device optical system; the left and right eye line-of-sight vector coordinates of the test user when gazing at the first set of test charts are denoted (ELX1, ELY1) and (ERX1, ERY1). One thus obtains a mapping between the image source cross chart positions and the corresponding left and right eye line-of-sight vector coordinates, namely:
{(SLX1, SLY1), (SRX1, SRY1)} ↔ {(ELX1, ELY1), (ERX1, ERY1)}    (8)
In this embodiment, the positions {(SLX1, SLY1), (SRX1, SRY1)} of the left and right cross charts displayed by the image source in the first set of test charts are abbreviated as (SL1, SR1), and the left and right eye line-of-sight vector coordinates {(ELX1, ELY1), (ERX1, ERY1)} of the test user at this time are abbreviated as (EL1, ER1); expression (8) can then be written as:
(SL1, SR1) ↔ (EL1, ER1)    (9)
According to the human-eye vision theory illustrated in FIG. 1 and expression (1), the distance Ln_1 from the gaze point to the human eye can be obtained from the left and right eye line-of-sight vectors; one thus obtains the mapping between the distance Ln_1 from the user of the virtual picture of the image information seen by the user through the head-mounted device window after projection by the head-mounted device, and the center coordinates (SL1, SR1) of the left and right display information on the head-mounted device image source, namely:
(SL1, SR1) ↔ Ln_1    (10)
Step 3: successively reduce the center distance of the left and right cross charts displayed on the image display source according to a preset rule (see FIG. 6); after each reduction of the center distance, repeat Step 2.
After k such operations, a total of k sets of data are obtained, each set being a correspondence between the cross chart center point coordinate data on the image display source and the spatial line-of-sight data, namely:
(SLi, SRi) ↔ (ELi, ERi),  i = 1, …, k    (11)
According to the vision theory illustrated in FIG. 1 and expression (1), using the above k sets of data, the k mappings between the distance from the user of the virtual picture of the image information seen by the user through the head-mounted device window after projection by the head-mounted device and the center distance of the left and right display information on the head-mounted device image source can be obtained, namely:
(SLi, SRi) ↔ Ln_i,  i = 1, …, k    (12)
Performing the above operations on Q test users yields a total of k·Q sets of mapping relationships, namely:
(SLi_q, SRi_q) ↔ Ln_i_q,  i = 1, …, k,  q = 1, …, Q    (13)
Fitting these k·Q sets of mapping relationship data yields the fitted curve function h between the left/right point pair coordinates on the display screen and the human-eye spatial line-of-sight data. Given the obtained fitted curve formula h and the coordinate data of a pair of left and right points on the display screen, the coordinate data can be substituted into the fitted curve formula to calculate the corresponding distance of the required virtual projected information from the human eye, as shown in the following expression:
Ln_p = h(SLp, SRp)    (15)
where (SLp, SRp) represents the center position coordinates of one pair of left-right symmetric information displayed on the head-mounted device image source, and Ln_p represents the distance of the virtual picture from the human eye.
Expression (15) can be simplified as:
Ln = h(SL, SR)    (16)
where Ln represents the distance from the virtual picture to the human eye, and (SL, SR) represents the center position coordinates of a pair of left-right symmetric information displayed on the head-mounted device image source. Of course, the center position coordinates (SL, SR) must lie within the corresponding image display source.
Since during use of the head-mounted device the virtual picture must have the same spatial depth as the target, the distance Ln from the human eye of the virtual picture seen by the user through the head-mounted device window is equal to the distance dis from the target object to the human eye. Expression (16) is therefore equivalent to:
dis = h(SL, SR)    (17)
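A minimal Python sketch of this fitting step, assuming (as in FIG. 5) that the charts are symmetric about OS so that each pair (SLp, SRp) can be summarized by its center distance; the polynomial model, names and sample numbers are illustrative assumptions, not the patent's fitting procedure:

    import numpy as np

    def fit_h(center_distances_px, measured_Ln_m, degree=3):
        # Fit the curve h mapping a symmetric center point pair (summarized by the
        # center distance between the left and right test charts) to the distance Ln
        # measured from the gaze vectors via expression (1).
        coeffs = np.polyfit(center_distances_px, measured_Ln_m, degree)
        return np.poly1d(coeffs)

    # k*Q calibration samples (illustrative numbers):
    d_px = [120, 140, 160, 180, 200]    # chart center distances on the display source
    ln_m = [0.45, 0.7, 1.2, 2.4, 6.0]   # distances measured via the gaze tracking system
    h = fit_h(d_px, ln_m)
    print(h(170))                       # interpolated virtual-image distance Ln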
Since each user's line of sight differs, when a user first uses the head-mounted device a method similar to the calibration of the distance mapping relationship δ can be used to perform a simple calibration of the distance mapping relationship δ in order to obtain a better display effect, so that the distance mapping relationship δ is better adapted to that user. Likewise, each time a user puts on the head-mounted device there is a slight deviation in the wearing position, and a similar method can be used at each wearing to correct the distance mapping relationship δ.
Specifically, in this embodiment, the distance mapping relationship δ is corrected for different users or for the different usage states of a user as follows: when the user puts on the head-mounted device, the head-mounted device starts and the image display source displays left-right symmetric cross charts; the eye tracking system records the line-of-sight space vectors of the user's eyes as the user gazes at the overlapping cross chart projected in front of the eyes; from this data set, the head-mounted device makes a user-adapted correction of the mapping relationship δ between the distance Ln_p of the virtual projected information from the human eye and the left-right symmetric pixel point pairs (SLp, SRp) of the image display source on the device, which can be expressed as:
Ln_p = w · h(SLp, SRp)    (18)
where w represents the correction factor.
Similarly, expression (18) can be equated to:
dis = w · h(SL, SR)    (19)
In the above correction process, a set of left-right symmetric cross chart coordinates on the display screen, recorded by the correction test system when the user first wears the device, together with the corresponding line-of-sight space vector data of the user, is obtained; from this line-of-sight space vector data and expression (1), the corresponding projection distance Ln_x, i.e. the first distance, can be calculated.
At the same time, from the coordinates of the left-right symmetric cross charts on the display screen, the projection distance data Ln_y corresponding to the cross chart coordinates, i.e. the second distance, can be obtained using the mapping relationship δ stored in the head-mounted device. Comparing this second distance Ln_y with the aforementioned first distance Ln_x yields a compensation coefficient (i.e. the correction factor) w, chosen so that the root mean square error between the calculated data and the test data is minimized.
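A minimal Python sketch of determining the correction factor, assuming the multiplicative form of expression (18): the scalar w minimizing the RMS error between w·Ln_y (second distances from the stored δ) and Ln_x (first distances from the user's gaze data) has the closed least-squares form below. All names and values are illustrative:

    import numpy as np

    def correction_factor(ln_x, ln_y):
        # Least-squares w minimizing sum((ln_x - w * ln_y)**2), i.e. the RMS error
        # between the gaze-measured first distances ln_x and the corrected second
        # distances w * ln_y predicted by the stored mapping delta.
        ln_x = np.asarray(ln_x, dtype=float)
        ln_y = np.asarray(ln_y, dtype=float)
        return float(ln_x @ ln_y / (ln_y @ ln_y))

    # Example: the user's measured distances run ~5% longer than the factory mapping:
    # correction_factor([1.05, 2.1, 3.15], [1.0, 2.0, 3.0]) -> 1.05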
If the distance mapping relationship δ is to be corrected to adapt to the user, the factory device needs to be equipped with a gaze tracking system; if no user-adaptive correction of the distance mapping relationship δ is needed, the factory device need not be equipped with one. Gaze tracking is a technique for acquiring a subject's current gaze direction using various electronic, optical or other detection means. It uses certain eye structures and features whose relative positions remain unchanged as the eyeball rotates as references, extracts gaze variation parameters between these invariant features and position-varying features, and then obtains the gaze direction through a geometric model or a mapping model.
When the user views the external environment through the head-mounted device of the present invention, the distances from the user of targets at different depths of field in front of the human eye are obtained by one of the four methods described above. The user can issue instructions to the head-mounted device through external control (such as voice control or button control), for example requesting display of the information of one of the targets (for example, target A). Upon receiving the instruction, the head-mounted device displays the information related to the specified target (for example, target A) correspondingly on the device image source according to the distance of that target from the user. That is, from the distance of the target (for example, target A) from the user, the device central processor obtains the coordinates (SLp, SRp) of a pixel point pair; the information to be projected related to that target is displayed identically on the left and right of the device image source, centered on (SLp, SRp) or at a certain offset from (SLp, SRp). Through the head-mounted device window, the user can then see a virtual projection of the information related to the specified target at a certain distance from the user (namely, the distance of the target from the user).
This embodiment also provides a binocular AR head-mounted device capable of automatically adjusting the depth of field, which includes an image display source, a distance data acquisition module and a data processing module; the distance mapping relationship δ is stored in the data processing module. The distance mapping relationship δ represents the mapping between the center point pair coordinates of the left and right sets of effective display information on the head-mounted device image display source and the distance dis of the target object from the human eye.
When the user views the external environment through the head-mounted device, the distance data acquisition module acquires the data related to the distance dis from the target object to the human eye and transmits the data to the data processing module. In different embodiments of the present invention, the distance data acquisition module can be any one of a single camera, a binocular stereo vision system, a depth-of-field camera, or a gaze tracking system.
When the distance data acquisition module is a single camera, it can acquire the data related to the distance dis from the target object to the human eye through the camera imaging scale. When it is a binocular stereo vision system, it can obtain the data related to the distance dis using parallax-based ranging. When it is a gaze tracking system, it acquires the data related to the distance dis according to the aforementioned expression (1). When it is a depth-of-field camera, it can directly acquire the data related to the distance dis from the target object to the human eye.
The data processing module calculates the distance dis from the target object to the human eye from the data transmitted by the distance data acquisition module, and obtains, according to the distance mapping relationship δ, the coordinate data of the center point pairs of the left and right sets of effective display information corresponding to the distance dis. The data processing module controls the image display source to display the information source images of the virtual information to be displayed left and right on the image display source, using the corresponding point pair coordinate data as the reference position.
It should be noted that, in different embodiments of the present invention, the data processing module controlling the image display source to display the information source images of the virtual information with the corresponding point pair coordinates as the reference position may mean displaying the left and right information source images centered on the corresponding point pair coordinates, or displaying them centered at a certain offset from the center point pair coordinates; the present invention is not limited in this respect.
The principles and processes by which the head-mounted device obtains and corrects the distance mapping relationship δ have been set forth in detail above and are not repeated here. It should be noted that, in other embodiments of the present invention, the distance mapping relationship δ can also be obtained or corrected in other reasonable ways; the present invention is likewise not limited in this respect.
As can be seen from the above description, the binocular AR head-mounted device and depth-of-field adjustment method provided by the present invention can accurately superimpose virtual information near the gaze point of the human eye, so that the virtual information is highly integrated with the environment, realizing augmented reality in the true sense.
The solution of the present invention is simple: provided the distance mapping relationship δ is preset in the head-mounted device, only the distance from the target object to the human eye needs to be obtained. This distance can be obtained in various ways, for example by binocular ranging or a depth-of-field camera, whose hardware technology is mature, highly reliable and low-cost.
Traditional depth-of-field adjustment always works by changing the image distance of the optical elements. The present invention breaks with this traditional thinking: without changing the optical device structure, it adjusts the depth of field by adjusting the equivalent center distance between the left and right sets of effective display information on the image display source, which is pioneering and more practical than changing the optical focal length.
All features disclosed in this specification, and all steps of any method or process disclosed, can be combined in any manner, except for mutually exclusive features and/or steps.
Any feature disclosed in this specification (including any appended claims, abstract and drawings) can, unless specifically stated otherwise, be replaced by other equivalent or similarly purposed alternative features. That is, unless specifically stated otherwise, each feature is only one example of a series of equivalent or similar features.
The present invention is not limited to the specific embodiments described above. The present invention extends to any new feature or any new combination disclosed in this specification, as well as to the steps of any new method or process disclosed or any new combination thereof.

Claims (19)

  1. A depth-of-field adjustment method for a binocular AR head-mounted device, wherein the method comprises:
    acquiring the distance dis from the target object to the human eye;
    obtaining, according to the distance dis from the target object to the human eye and a preset distance mapping relationship δ, the center point pair coordinate data of the left and right sets of effective display information corresponding to the distance dis, wherein the preset distance mapping relationship δ represents the mapping between the center point pair coordinate data and the distance dis from the target object to the human eye;
    displaying, according to the center point pair coordinate data, the information source images of the virtual information to be displayed on the left and right image display sources respectively.
  2. The method according to claim 1, wherein the distance dis from the target object to the human eye is obtained by a binocular stereo vision system.
  3. The method according to claim 2, wherein the distance dis from the target object to the human eye is determined according to the following expression:
    dis = Z + h, with Z = f·T/(x_l − x_r)
    where h represents the distance from the binocular stereo vision system to the human eye, Z represents the distance between the target object and the binocular stereo vision system, T represents the baseline distance, f represents the focal length, and x_l and x_r represent the x coordinates of the target object in the left image and the right image, respectively.
  4. The method according to claim 1, wherein the spatial line-of-sight information when the human eye gazes at the target object is detected by a gaze tracking system, and the distance dis from the target object to the human eye is determined from the spatial line-of-sight data.
  5. The method according to claim 4, wherein the distance dis from the target object to the human eye is determined according to the following expression:
    [expression (1): the formula computing dis from the left and right line-of-sight vectors is rendered as an image in the original]
    where (Lx, Ly, Lz) and (Lα, Lβ, Lγ) respectively represent the coordinates and direction angles of the target point on the left line-of-sight vector, and (Rx, Ry, Rz) and (Rα, Rβ, Rγ) respectively represent the coordinates and direction angles of the target point on the right line-of-sight vector.
  6. The method according to claim 1, wherein the distance dis from the target object to the human eye is determined by the camera imaging scale.
  7. The method according to claim 1, wherein the distance dis from the target object to the human eye is determined by a depth-of-field camera.
  8. The method according to claim 1, wherein, in the method, the information source images of the virtual information to be displayed are displayed on the left and right image display sources respectively, with the center point pair coordinates as the center position.
  9. The method according to claim 1, wherein, in the method, the information source images of the virtual information to be displayed are displayed on the left and right image display sources respectively, with a position offset from the center point pair coordinates in a preset direction as the center position.
  10. The method according to claim 1, wherein the method further comprises: correcting the preset distance mapping relationship δ when the user first uses the head-mounted device and/or each time the user uses the head-mounted device.
  11. The method according to claim 1, wherein the step of correcting the preset distance mapping relationship δ comprises:
    controlling the image display source of the head-mounted device to display preset information source images on the left and right image display sources respectively;
    acquiring the line-of-sight space vectors of the human eyes when the preset information source images displayed on the left and right image display sources are observed to overlap in front of the human eyes, and obtaining a first distance from the line-of-sight space vectors;
    obtaining a second distance from the coordinate data of the preset information source images on the left and right image display sources, using the preset distance mapping relationship δ;
    determining a correction factor from the first distance and the second distance;
    correcting the preset distance mapping relationship δ with the correction factor.
  12. The method according to claim 1, wherein the preset distance mapping relationship δ is expressed as:
    dis = h(SL, SR)
    where dis represents the distance from the target object to the human eye, h represents the fitted curve function, and (SL, SR) represents the coordinate data of the center point pair of the left and right sets of effective display information.
  13. The method according to claim 1, wherein, in the method, constructing the preset distance mapping relationship δ comprises:
    Step 1: displaying preset test images at preset positions of the left and right image display sources;
    Step 2: acquiring the line-of-sight space vector when the user gazes at the virtual test chart, and determining, from the line-of-sight space vector and the display positions of the preset test images, one set of mapping relationship data between the preset test image positions and the distance of the corresponding target object from the human eye;
    Step 3: successively reducing the center distance of the preset test images according to a preset rule, and repeating Step 2 until k sets of mapping relationship data between the preset test image positions and the distances of the corresponding target objects from the human eye are obtained;
    Step 4: fitting the k sets of mapping relationship data between the preset test image positions and the distances of the corresponding target objects from the human eye, thereby constructing the preset distance mapping relationship δ.
  14. A binocular AR head-mounted device capable of automatically adjusting the depth of field, wherein it comprises:
    an optical system;
    an image display source, comprising a left image display source and a right image display source;
    a distance data acquisition module, configured to acquire data related to the distance dis from the target object to the human eye;
    a data processing module, connected to the distance data acquisition module and configured to determine the distance dis from the target object to the human eye from the data related to the distance dis, to determine, in combination with a preset distance mapping relationship δ, the center point pair coordinate data of the left and right sets of effective display information corresponding to the distance dis, and to display, according to the center point pair coordinate data, the information source images of the virtual information to be displayed on the left and right image display sources respectively;
    wherein the preset distance mapping relationship δ represents the mapping between the center point pair coordinate data and the distance dis from the target object to the human eye.
  15. The binocular AR head-mounted device according to claim 14, wherein the distance data acquisition module comprises any one of the following:
    a single camera, a binocular stereo vision system, a depth-of-field camera, or a gaze tracking system.
  16. The binocular AR head-mounted device according to claim 14, wherein the data processing module is configured to display the information source images of the virtual information to be displayed on the left and right image display sources respectively, with a position offset from the center point pair coordinates in a preset direction as the center position.
  17. The binocular AR head-mounted device according to claim 14, wherein the data processing module is configured to display the information source images of the virtual information to be displayed on the left and right image display sources respectively, with the center point pair coordinates as the center position.
  18. The binocular AR head-mounted device according to claim 14, wherein the binocular AR head-mounted device further corrects the preset distance mapping relationship δ when the user first uses the head-mounted device and/or each time the user uses the head-mounted device.
  19. The binocular AR head-mounted device according to claim 14, wherein the preset distance mapping relationship δ is expressed as:
    dis = h(SL, SR)
    where dis represents the distance from the target object to the human eye, h represents the fitted curve function, and (SL, SR) represents the coordinate data of the center point pair of the left and right sets of effective display information.
PCT/CN2015/086360 2015-01-21 2015-08-07 Binocular AR head-mounted device capable of automatically adjusting depth of field and depth-of-field adjustment method WO2016115874A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510029879 2015-01-21
CN201510029879.7 2015-01-21

Publications (1)

Publication Number Publication Date
WO2016115874A1 true WO2016115874A1 (zh) 2016-07-28

Family

ID=56416370

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/086360 WO2016115874A1 (zh) 2015-01-21 2015-08-07 Binocular AR head-mounted device capable of automatically adjusting depth of field and depth-of-field adjustment method

Country Status (2)

Country Link
CN (1) CN106199964B (zh)
WO (1) WO2016115874A1 (zh)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107092355A (zh) * 2017-04-07 2017-08-25 北京小鸟看看科技有限公司 控制移动终端在vr头戴设备中内容输出位置的方法、装置和系统
CN112101275A (zh) * 2020-09-24 2020-12-18 广州云从洪荒智能科技有限公司 多目摄像头的人脸检测方法、装置、设备及介质
CN112890761A (zh) * 2020-11-27 2021-06-04 成都怡康科技有限公司 一种视力测试提示方法及可穿戴设备
CN112914494A (zh) * 2020-11-27 2021-06-08 成都怡康科技有限公司 一种基于视标自适应调节的视力测试方法及可穿戴设备
CN114252235A (zh) * 2021-11-30 2022-03-29 青岛歌尔声学科技有限公司 头戴显示设备的检测方法、装置、头戴显示设备及介质
CN114564108A (zh) * 2022-03-03 2022-05-31 北京小米移动软件有限公司 图像展示的方法、装置和存储介质
CN114757829A (zh) * 2022-04-25 2022-07-15 歌尔股份有限公司 拍摄校准方法、系统、设备及存储介质
CN117351074A (zh) * 2023-08-31 2024-01-05 中国科学院软件研究所 基于头戴式眼动仪和深度相机的视点位置检测方法及装置

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107116555A (zh) * 2017-05-27 2017-09-01 芜湖星途机器人科技有限公司 基于无线zigbee室内定位的机器人导向移动系统
WO2018232630A1 (zh) * 2017-06-21 2018-12-27 深圳市柔宇科技有限公司 三维影像预处理方法、装置及头戴显示设备
CN108663799B (zh) * 2018-03-30 2020-10-09 蒋昊涵 一种vr图像的显示控制系统及其显示控制方法
CN108632599B (zh) * 2018-03-30 2020-10-09 蒋昊涵 一种vr图像的显示控制系统及其显示控制方法
CN108710870A (zh) * 2018-07-26 2018-10-26 苏州随闻智能科技有限公司 智能穿戴设备及智能穿戴设备系统
CN112731665B (zh) * 2020-12-31 2022-11-01 中国人民解放军32181部队 一种自适应双目立体视觉微光夜视头戴系统

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05328408A (ja) * 1992-05-26 1993-12-10 Olympus Optical Co Ltd ヘッド・マウンテッド・ディスプレイ
JPH11202256A (ja) * 1998-01-20 1999-07-30 Ricoh Co Ltd 頭部搭載型画像表示装置
US20130050258A1 (en) * 2011-08-25 2013-02-28 James Chia-Ming Liu Portals: Registered Objects As Virtualized, Personalized Displays
CN103487938A (zh) * 2013-08-28 2014-01-01 成都理想境界科技有限公司 头戴显示装置
CN103917913A (zh) * 2011-10-05 2014-07-09 谷歌公司 在近眼显示器上自动聚焦的方法
CN104076513A (zh) * 2013-03-26 2014-10-01 精工爱普生株式会社 头戴式显示装置、头戴式显示装置的控制方法、以及显示系统

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103336575B (zh) * 2013-06-27 2016-06-29 深圳先进技术研究院 一种人机交互的智能眼镜系统及交互方法
CN103499886B (zh) * 2013-09-30 2015-07-08 北京智谷睿拓技术服务有限公司 成像装置和方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05328408A (ja) * 1992-05-26 1993-12-10 Olympus Optical Co Ltd ヘッド・マウンテッド・ディスプレイ
JPH11202256A (ja) * 1998-01-20 1999-07-30 Ricoh Co Ltd 頭部搭載型画像表示装置
US20130050258A1 (en) * 2011-08-25 2013-02-28 James Chia-Ming Liu Portals: Registered Objects As Virtualized, Personalized Displays
CN103917913A (zh) * 2011-10-05 2014-07-09 谷歌公司 在近眼显示器上自动聚焦的方法
CN104076513A (zh) * 2013-03-26 2014-10-01 精工爱普生株式会社 头戴式显示装置、头戴式显示装置的控制方法、以及显示系统
CN103487938A (zh) * 2013-08-28 2014-01-01 成都理想境界科技有限公司 头戴显示装置

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107092355A (zh) * 2017-04-07 2017-08-25 北京小鸟看看科技有限公司 控制移动终端在vr头戴设备中内容输出位置的方法、装置和系统
CN107092355B (zh) * 2017-04-07 2023-09-22 北京小鸟看看科技有限公司 控制移动终端在vr头戴设备中内容输出位置的方法、装置和系统
CN112101275A (zh) * 2020-09-24 2020-12-18 广州云从洪荒智能科技有限公司 多目摄像头的人脸检测方法、装置、设备及介质
CN112890761A (zh) * 2020-11-27 2021-06-04 成都怡康科技有限公司 一种视力测试提示方法及可穿戴设备
CN112914494A (zh) * 2020-11-27 2021-06-08 成都怡康科技有限公司 一种基于视标自适应调节的视力测试方法及可穿戴设备
CN114252235A (zh) * 2021-11-30 2022-03-29 青岛歌尔声学科技有限公司 头戴显示设备的检测方法、装置、头戴显示设备及介质
CN114564108A (zh) * 2022-03-03 2022-05-31 北京小米移动软件有限公司 图像展示的方法、装置和存储介质
CN114757829A (zh) * 2022-04-25 2022-07-15 歌尔股份有限公司 拍摄校准方法、系统、设备及存储介质
CN117351074A (zh) * 2023-08-31 2024-01-05 中国科学院软件研究所 基于头戴式眼动仪和深度相机的视点位置检测方法及装置

Also Published As

Publication number Publication date
CN106199964B (zh) 2019-06-21
CN106199964A (zh) 2016-12-07

Similar Documents

Publication Publication Date Title
WO2016115874A1 (zh) 能自动调节景深的双目ar头戴设备及景深调节方法
WO2016115870A1 (zh) 双目ar头戴显示设备及其信息显示方法
WO2016115871A1 (zh) 能自动调节景深的双目ar头戴设备及景深调节方法
WO2016115873A1 (zh) 双目ar头戴显示设备及其信息显示方法
WO2016115872A1 (zh) 双目ar头戴显示设备及其信息显示方法
US10271042B2 (en) Calibration of a head mounted eye tracking system
US11854171B2 (en) Compensation for deformation in head mounted display systems
JP2020034919A (ja) 構造化光を用いた視線追跡
CN110764613B (zh) 基于头戴式眼动模组的眼动追踪校准方法
WO2020139736A1 (en) Headset adjustment for optimal viewing
JP6596678B2 (ja) 視線測定装置および視線測定方法
US20230255476A1 (en) Methods, devices and systems enabling determination of eye state variables
CN104345454A (zh) 头戴式视觉辅助系统及其成像方法
JP6324119B2 (ja) 回転角度算出方法、注視点検出方法、情報入力方法、回転角度算出装置、注視点検出装置、情報入力装置、回転角度算出プログラム、注視点検出プログラム及び情報入力プログラム
CN109308472B (zh) 一种基于虹膜投影匹配函数的三维视线估计方法
CN105872527A (zh) 双目ar头戴显示设备及其信息显示方法
TWI761930B (zh) 頭戴式顯示裝置以及距離量測器
KR101817436B1 (ko) 안구 전위 센서를 이용한 영상 표시 장치 및 제어 방법
CN109917908B (zh) 一种ar眼镜的图像获取方法及系统
JP6496917B2 (ja) 視線測定装置および視線測定方法
US20230393655A1 (en) Electronic apparatus
EP4236755A1 (en) Systems and methods for visual field testing in head-mounted displays

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15878544

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15878544

Country of ref document: EP

Kind code of ref document: A1