WO2023184109A1 - Eye tracking device, method, and display device - Google Patents


Info

Publication number
WO2023184109A1
Authority
WO
WIPO (PCT)
Prior art keywords
eye
eyeball
tracking
eye tracking
cameras
Application number
PCT/CN2022/083482
Other languages
English (en)
French (fr)
Inventor
薛亚冲
董学
吴仲远
孙建康
闫桂新
邵继洋
Original Assignee
京东方科技集团股份有限公司
北京京东方技术开发有限公司
Application filed by 京东方科技集团股份有限公司, 北京京东方技术开发有限公司 filed Critical 京东方科技集团股份有限公司
Priority to PCT/CN2022/083482
Priority to CN202280000578.XA (CN117280303A)
Publication of WO2023184109A1



Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present application relates to the field of eye tracking technology, and in particular to an eye tracking device, method and display device.
  • An eye-tracking device is a device that determines the location of eye gaze.
  • Current eye tracking devices usually include an eye tracking camera and a fill light.
  • the eye tracking camera can capture images within a certain range.
  • the eye tracking device can measure the pupil position of the eyeball in the image and the position of the fill light's spot on the eyeball, and from these determine the gaze position of the eyeball.
  • the range that the eye-tracking camera can capture is the tracking range of the eye-tracking device. When the eyeball is located in the tracking range, the eye-tracking device can determine the gaze position of the eyeball.
  • the tracking range of the above-mentioned eye tracking device is small.
  • Embodiments of the present application provide an eye tracking device, method and display device.
  • the technical solutions are as follows:
  • an eye tracking device which includes: a bracket, a control component, a fill light component, and at least two eye tracking cameras;
  • the at least two eye-tracking cameras are located at at least two different positions of the bracket, and at least part of the shooting ranges of the at least two eye-tracking cameras do not overlap;
  • the fill light component is located on the bracket;
  • the control component is electrically connected to the at least two eye-tracking cameras, and is used to determine the gaze position of the eyeballs within the shooting range of the at least two eye-tracking cameras.
  • the shooting ranges of two adjacent eye-tracking cameras have overlapping areas.
  • the size of the overlapping area is larger than the preset size of the eyeball.
  • the eye tracking device further includes: an image acquisition component;
  • the image acquisition component is located on the bracket and is electrically connected to the control component.
  • the shooting range of the at least two eye tracking cameras is located within the shooting range of the image acquisition component.
  • there is a separation area between the edges of the shooting ranges of the at least two eye-tracking cameras and the edge of the shooting range of the image acquisition component, and the size of the separation area is larger than the preset face size.
  • the bracket is located outside the surface to be observed, the bracket includes a mounting part for mounting the surface to be observed, and the control component is used to determine the gaze position, on the surface to be observed, of eyeballs within the shooting range of the at least two eye-tracking cameras.
  • the control component is used to determine the gaze position, on the surface to be observed, of eyeballs that are within the shooting range of the at least two eye-tracking cameras and whose distance from the surface to be observed satisfies a preset distance range.
  • the optical axes of the at least two eye-tracking cameras are parallel to each other and coplanar.
  • the first plane determined by the optical axes of the at least two eye-tracking cameras intersects a perpendicular line passing through the center of the surface to be observed, and the intersection point is located within the preset distance range.
  • the intersection point is located at the center of the preset distance range.
  • the surface to be observed is a display surface of a display panel, and the bracket includes a mounting portion for mounting the display panel;
  • the at least two eye tracking cameras are located below the mounting part.
  • the at least two eye-tracking cameras are arranged in a horizontal direction on the bracket.
  • the surface to be observed is a display surface of a display panel, and the bracket includes a mounting portion for mounting the display panel;
  • the eye tracking device also includes: an image acquisition component;
  • the image acquisition component is located on the bracket and below the installation part.
  • the image acquisition component is electrically connected to the control component.
  • the shooting range of the at least two eye tracking cameras is located within the shooting range of the image acquisition component.
  • the fill light assembly includes at least two fill light lamps, and the at least two fill light lamps are located at different positions on the bracket.
  • the at least two fill lights are arranged in a horizontal direction on the bracket.
  • the image acquisition component includes a color camera
  • the eye tracking camera includes an infrared camera
  • the resolution of the color camera is smaller than the resolution of the infrared camera.
  • an eye tracking method is provided.
  • the method is used in a control component of an eye tracking device.
  • the eye tracking device further includes: a bracket, a fill light component, and at least two eye tracking cameras;
  • the at least two eye-tracking cameras are located at at least two different positions of the bracket, and at least part of the shooting ranges of the at least two eye-tracking cameras do not overlap;
  • the fill light component is located on the bracket;
  • the control component is electrically connected to the at least two eye tracking cameras,
  • the method includes:
  • the gaze position of the eyeball is determined based on the first image.
  • the eye tracking device further includes: an image acquisition component; the image acquisition component is located on the bracket and is electrically connected to the control component,
  • the method further includes:
  • Determining the gaze position of the eyeball based on the first image includes:
  • a gaze position of the eyeball is determined based on the position of the eyeball in the first image and the first image.
  • the first image is an image collected by a first eye-tracking camera among the at least two eye-tracking cameras,
  • Determining the gaze position of the eyeball based on the position of the eyeball in the first image and the first image includes:
  • line of sight models of the at least two eye-tracking cameras are obtained, where the line of sight model of an eye-tracking camera is used to determine the gaze position of the eyeball from the position of the pupil of the eyeball and the position of the light spot in the first image collected by that eye-tracking camera;
  • the gaze position of the eyeball is determined based on the line of sight models of the at least two eyeball tracking cameras, and the pupil position and light spot position of the eyeball.
  • determining the gaze position of the eyeball based on the line of sight model of the at least two eyeball tracking cameras, and the pupil position and light spot position of the eyeball includes:
  • a target gaze position is determined.
  • the first image is an image collected by a first eye-tracking camera among the at least two eye-tracking cameras,
  • the fill light assembly includes at least two fill light lamps, the at least two fill light lamps are located at different positions on the bracket, and the first image includes two eyeballs,
  • obtaining, based on the position of the eyeball in the first image and the first image, the position of the pupil of the eyeball and the position, on the eyeball, of the light spot of the light emitted by the fill light component includes:
  • the positions of the pupils of the two eyeballs are obtained, as well as the positions, on the two eyeballs, of the light spots of the light emitted by the at least two fill lights;
  • the obtaining the line of sight models of the at least two eye-tracking cameras includes:
  • the sample data includes the position of the pupil of one of the two eyeballs and the position of at least one light spot on that eyeball;
  • the line of sight model corresponding to the sample data is the line of sight model of the first eye-tracking camera among the at least two eye-tracking cameras.
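The sight-model idea above (mapping pupil and spot positions in a camera's image to a gaze position) is commonly realized as a regression fitted from calibration samples. The patent does not specify the model form; the sketch below, with hypothetical names, uses a linear least-squares fit from the pupil-spot offset vector to screen coordinates purely as an illustration:

```python
import numpy as np

def fit_sight_model(pupils, spots, gazes):
    """Fit an affine map from pupil-spot offset vectors to gaze positions.

    pupils, spots: (N, 2) pixel positions from N calibration images.
    gazes: (N, 2) known on-screen gaze positions for those samples.
    Returns a (3, 2) matrix M such that [dx, dy, 1] @ M ~= [gx, gy].
    """
    d = np.asarray(pupils, float) - np.asarray(spots, float)
    X = np.hstack([d, np.ones((len(d), 1))])  # affine design matrix
    M, *_ = np.linalg.lstsq(X, np.asarray(gazes, float), rcond=None)
    return M

def predict_gaze(model, pupil, spot):
    """Map one pupil/spot pair to an estimated gaze position."""
    d = np.asarray(pupil, float) - np.asarray(spot, float)
    return np.append(d, 1.0) @ model

# Synthetic calibration data: gaze depends affinely on the pupil-spot offset.
rng = np.random.default_rng(0)
pupils = rng.uniform(0, 1280, (9, 2))
spots = pupils - rng.uniform(-20, 20, (9, 2))
true_M = np.array([[12.0, 0.5], [0.3, 11.0], [640.0, 360.0]])
gazes = np.hstack([pupils - spots, np.ones((9, 1))]) @ true_M

model = fit_sight_model(pupils, spots, gazes)
est = predict_gaze(model, pupils[0], spots[0])
```

Because the synthetic data are exactly affine, the fit recovers the generating matrix; real calibration data would carry detection noise, which is why multiple spots per eyeball (as described above) help.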
  • a display device including a display panel, a housing, and an eye tracking device;
  • the eye tracking device includes: a control component, a light filling component, an image acquisition component and at least two eye tracking cameras;
  • the light supplement component, the image acquisition component and the at least two eye-tracking cameras are located on the housing, and are all facing the light emitting direction of the display panel.
  • the at least two eye-tracking cameras are located at at least two different positions on the housing;
  • the control component is electrically connected to the at least two eye tracking cameras.
  • the at least two eye-tracking cameras are located below the display panel on the housing and arranged in a horizontal direction.
  • the display surface of the display panel is rectangular, and one side of the display surface is parallel to the horizontal direction;
  • the distance between any two adjacent eye-tracking cameras is equal.
  • the eye tracking cameras are coplanar with the display surface of the display panel, the number of eye tracking cameras is 3, and the distance between two adjacent eye tracking cameras among the 3 eye tracking cameras satisfies:
  • L is the distance between two adjacent eye-tracking cameras;
  • p is the width of the eyeball;
  • e is the distance between the two eyeballs;
  • a is the horizontal field of view angle of the eye tracking camera;
  • D1 and D2 are respectively the preset minimum distance and maximum distance, in a direction perpendicular to the display surface, between the target area for eye tracking by the eye tracking cameras and the display surface;
  • the overall field of view angle of the three eye-tracking cameras is preset.
  • the optical axes of the three eye-tracking cameras are parallel to each other and coplanar, and the first plane determined by the optical axes of the three eye-tracking cameras intersects a perpendicular line passing through the center of the display surface, with the intersection point located within the target area.
  • the intersection point is located at the center of the target area in a direction perpendicular to the display surface, and the eye tracking camera satisfies:
  • b is the angle between the optical axis of the eye tracking camera and the display surface;
  • k is the distance, in the vertical direction, between the center of the display surface and the eye tracking camera.
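The relation between b and k is not reproduced in this text, but if the optical-axis plane is meant to meet the screen-center perpendicular at the middle of the target area, simple trigonometry gives one plausible reading. The sketch below computes the axis tilt relative to that perpendicular; the numeric values (k = 300 mm, D1 = 400 mm, D2 = 700 mm) are assumptions, not the patent's:

```python
import math

def tilt_angle_deg(k_mm, d1_mm, d2_mm):
    """Tilt of the camera optical axis relative to the perpendicular
    through the screen center, assuming the axis crosses that
    perpendicular at the middle of the target area, (d1+d2)/2 away.

    k_mm: vertical distance between the screen center and the cameras.
    """
    d_mid = (d1_mm + d2_mm) / 2
    # The axis rises k_mm over a horizontal run of d_mid.
    return math.degrees(math.atan(k_mm / d_mid))

b_tilt = tilt_angle_deg(300, 400, 700)  # assumed geometry
```

With these assumed numbers the tilt comes out a little under 29 degrees; the patent's angle b (measured against the display surface) would be its complement under this reading.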
  • the image capture component is located below the display panel on the housing, and is equidistant from both ends of the lower edge of the display surface.
  • the housing includes a base and a frame located on the base;
  • the light supplement component, the image acquisition component and the at least two eye tracking cameras are all installed on the frame.
  • the at least two eye-tracking cameras can form a larger tracking range, and the control component can determine the gaze position of the eyeball within this larger tracking range, which solves the problem of the small tracking range of eye tracking devices in the related art and achieves the effect of increasing the tracking range of the eye tracking device.
  • Figure 1 is a schematic structural diagram of an eye tracking device provided by an embodiment of the present application.
  • Figure 2 is a schematic structural diagram of another eye tracking device provided by an embodiment of the present application.
  • Figure 3 is a bottom view of the eye tracking device shown in Figure 2;
  • Figure 4 is a left view of the eye tracking device shown in Figure 3;
  • Figure 5 is a schematic diagram of an eyeball viewing display surface in an embodiment of the present application.
  • Figure 6 is a flow chart of an eye tracking method provided by an embodiment of the present application.
  • Figure 7 is a flow chart of another eye tracking method provided by an embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of a display device provided by an embodiment of the present application.
  • In the related art, an eye tracking device usually includes only one eye tracking camera. Due to the particularity of its function, the shooting range of a conventional eye tracking camera is usually small; the shooting range can be measured by the field of view of the camera together with the preset distance range for eye tracking (for the same tracking distance range, the larger the field of view, the larger the shooting range, and the smaller the field of view, the smaller the shooting range). For example, the field of view (FOV) is usually below 30 degrees. As a result, the tracking range of the eye tracking device (which can be the same as the shooting range of the eye-tracking camera) is small, the eyeballs can easily leave the tracking range, and the user experience of the eye tracking device is poor.
  • the preset distance refers to the distance between the eyeball and the eyeball tracking camera.
  • the preset distance range is the range of the distance.
  • the eye tracking camera can track eyeballs that are within that distance range and located in its field of view.
  • Eye tracking cameras are usually infrared cameras, and infrared cameras with a larger field of view are more difficult to manufacture, usually require customization, and are more expensive. For example, taking a 32-inch screen with a viewing distance of 400-700 mm, a viewing range of 60° (that is, the minimum angle between the viewing line of sight and the screen is greater than 30°), a pupil positioning error of p (pixels), and a line of sight calculation accuracy of ±1°, it can be found from these parameters that the number of pixels within 1° in the image captured by the eye-tracking camera must be at least 45.2p, which determines the angular resolution required of the image collected by the infrared camera.
  • the embodiments of the present application provide an eye tracking device, method and display device, which can solve this problem.
  • FIG. 1 is a schematic structural diagram of an eye tracking device 10 provided by an embodiment of the present application.
  • the eye tracking device 10 may be partially or fully integrated into a display device.
  • the eye tracking device 10 includes: a bracket 11 , a control component 12 , a fill light component 13 and at least two eye tracking cameras 14 .
  • At least two eye-tracking cameras 14 are located at at least two different positions of the bracket 11 , and at least part of the shooting ranges (f1 and f2) of the at least two eye-tracking cameras 14 do not overlap.
  • the fill light component 13 is located on the bracket 11 .
  • the control component 12 is electrically connected to at least two eye-tracking cameras 14 and is used to determine the gaze position of the eyeballs within the shooting range of the at least two eye-tracking cameras 14 .
  • Figure 1 shows a structure in which the eye tracking device includes two eye tracking cameras, but the number of eye tracking cameras can be larger, such as 3, 4, 5, 6, 7 or 8; the embodiment of the present application does not limit this.
  • the shooting ranges f1 and f2 of the two eye-tracking cameras 14 have partial overlapping areas, and there are also partially non-overlapping areas.
  • the two eye-tracking cameras 14 together form a larger shooting range.
  • The eye tracking device provided by the embodiment of the present application is provided with at least two eye tracking cameras, and at least part of the shooting ranges of the at least two eye tracking cameras do not overlap. The at least two eye tracking cameras can therefore form a larger tracking range, and the control component can determine the eye gaze position within this larger tracking range, which solves the problem of the small tracking range of eye tracking devices in the related art and achieves the effect of increasing the tracking range.
  • In addition, the eye tracking device can achieve a larger eye tracking range through multiple eye tracking cameras with smaller shooting ranges, without using an eye tracking camera with a larger shooting range, thus reducing the manufacturing difficulty and cost of the eye tracking device.
  • FIG. 2 is a schematic structural diagram of another eye tracking device provided by an embodiment of the present application. This eye tracking device has some adjustments based on the eye tracking device shown in FIG. 1 .
  • the shooting ranges of two adjacent eye-tracking cameras 14 have an overlapping area q1 .
  • the combined shooting range formed by the shooting ranges of the at least two eye-tracking cameras 14 can be a continuous range, which avoids the problem of being unable to perform eye tracking when the eyeball is located between the shooting ranges of two adjacent eye-tracking cameras 14.
  • the size of the overlapping area q1 may be larger than the preset size of the eyeball. Specifically, the size of the narrowest part of the overlapping area q1 may be greater than or equal to the preset maximum size of the eyeball. That is to say, the overlapping area q1 of the shooting ranges of adjacent eye-tracking cameras 14 can accommodate at least one eyeball. This ensures that at least one of the adjacent eye-tracking cameras 14 can collect a complete image of the eyeball, which facilitates analysis of the eyeball image and avoids the problem that an eye-tracking camera 14 can only collect an image of part of the eyeball, making the image difficult to analyze.
  • the preset size of the eyeball may refer to the normal size of the eyeball, for example, it may be greater than or equal to the normal maximum size of the eyeball.
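The overlap requirement above can be checked with simple fan-beam geometry for two cameras with parallel optical axes. The sketch below uses assumed numbers (camera FOV 30°, camera spacing 180 mm, eyeball size 30 mm, viewing distances 400-700 mm); none of these values come from the patent:

```python
import math

def overlap_width_mm(fov_deg, spacing_mm, dist_mm):
    """Width of the region covered by both of two adjacent cameras
    with parallel optical axes, at viewing distance dist_mm.

    Each camera covers a strip of width 2*dist*tan(fov/2); shifting
    one camera sideways by spacing_mm reduces the shared strip by
    exactly that spacing.
    """
    half = math.radians(fov_deg) / 2
    return 2 * dist_mm * math.tan(half) - spacing_mm

# Overlap grows with distance, so checking the closest distance suffices.
EYEBALL_MM = 30  # assumed preset eyeball size
near = overlap_width_mm(30, 180, 400)
far = overlap_width_mm(30, 180, 700)
```

With these assumed values the overlap at the closest distance is already a little wider than one eyeball, so the requirement holds everywhere in the range.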
  • Figure 2 shows a structure in which the number of eye-tracking cameras is three, but the embodiment of the present application does not limit this.
  • Figure 3 is a bottom view of the eye tracking device shown in Figure 2 (the control assembly is not shown in Figure 3).
  • the bracket 11 includes a mounting part 111 for mounting the surface m1 to be observed, and the control component is used to determine the gaze position, on the surface m1 to be observed, of eyeballs within the shooting range of the at least two eye-tracking cameras.
  • the surface m1 to be observed may be the display surface of the display panel
  • the mounting part 111 may be a mounting part for mounting the display panel.
  • the control component is used to determine, within the shooting range of the at least two eye-tracking cameras 14, the gaze position on the surface m1 to be observed of eyeballs whose distance from the surface m1 to be observed satisfies a preset distance range.
  • the preset distance range may be a range in which the vertical distance from the surface m1 to be observed is greater than D1 and less than D2. It should be noted that in this structure the eye tracking camera is tilted relative to the surface m1 to be observed, so the preset distance range can be a range converted based on the tilt angle; the converted preset distance range refers to the range of distance in the direction perpendicular to the surface m1 to be observed.
  • the optical axes of at least two eye-tracking cameras 14 are parallel to each other and coplanar.
  • the first plane m2 determined by the optical axes of the at least two eye-tracking cameras 14 intersects the perpendicular line c1 passing through the center of the surface to be observed, and the intersection j is located within the preset distance range. That is to say, the vertical distance between the intersection j and the surface m1 to be observed is greater than D1 and less than D2.
  • At least two eye-tracking cameras 14 are located below the mounting part 111, and the eye-tracking cameras 14 are located below the surface m1 to be observed.
  • the lower part here may refer to the side of the surface m1 to be observed facing the ground. Since the eyeball has structures such as eyelashes and eyelids, when the eyeball tracking camera 14 is located at a higher position, it may be blocked by the eyelashes, eyelids and other structures, making it difficult to obtain a relatively complete and clear image of the eyeball.
  • In the eye tracking device provided by the embodiment of the present application, the eye tracking camera 14 is located below the mounting part 111, in a lower position, so it is less likely to be blocked by such structures and can more easily collect a complete and clear image of the eyeball.
  • the intersection j is located at the center of the preset distance range. That is, the vertical distance between the intersection j and the surface m1 to be observed is (D1+D2)/2.
  • At least two eye-tracking cameras 14 are arranged in a horizontal direction on the bracket 11; that is, the at least two eye tracking cameras 14 are arranged on the bracket 11 in a direction parallel to the horizontal plane. When observing the surface to be observed (such as the display surface of a monitor), the eyeball usually moves only in the direction parallel to the horizontal plane; even if it moves in the vertical direction, the moving distance is small, so it is unlikely to escape the shooting range of the eye-tracking cameras 14 in the vertical direction. Arranging the at least two eye-tracking cameras 14 in the horizontal direction on the bracket 11 therefore makes the most of each eye-tracking camera 14 and further increases the tracking range formed by the shooting ranges of the at least two eye tracking cameras 14.
  • FIG. 4 is a left side view of the eye tracking device shown in FIG. 3 .
  • the eye tracking device also includes: an image acquisition component 15; the image acquisition component 15 is located on the bracket 11 and is electrically connected to the control component 12.
  • the shooting range of at least two eye tracking cameras 14 is located within the shooting range of the image acquisition component 15. That is to say, the shooting range of the image acquisition component 15 is larger than the shooting range of the at least two eye-tracking cameras 14 and includes the shooting range of the at least two eye-tracking cameras 14 .
  • the field of view angle of the image acquisition component 15 is greater than the field of view angle of any eye tracking camera 14.
  • the field of view angle v of the image acquisition component 15 is greater than the field of view angle a of the eye tracking camera 14, and both at the shortest distance of the preset distance range (where the vertical distance from the surface m1 to be observed is D1) and at the farthest distance (where the vertical distance from the surface m1 to be observed is D2), the area that the image acquisition component 15 can capture is larger than the area that the at least two eye tracking cameras 14 can capture.
  • the image acquisition component 15 is not an eye-tracking camera, and when it has a larger field of view, the manufacturing difficulty and cost are not too high. In this way, an image acquisition component 15 with a larger field of view can be provided in the eye tracking device to cooperate with at least two eye tracking cameras 14 to determine the gaze position of the eyeballs. For example, due to the small field of view of the eye-tracking camera, it may be difficult to include the image of the entire human face in the collected image, and it will also be difficult to determine the position of the eyeball based on the image of the human face.
  • the image acquisition component 15 has a larger field of view, so the probability that the acquired image includes an image of the entire face is relatively high, thereby reducing the difficulty of determining the position of the eyeball based on the image of the face and improving the accuracy of the determined eyeball position, which is conducive to the determination of the eye gaze position by the eye tracking device.
  • there is a separation area g1 between the edge of the shooting range of the eye tracking cameras 14 and the edge of the shooting range of the image acquisition component 15, and the size of the separation area g1 is larger than the preset face size. This ensures that the image acquisition component 15 can capture images of a complete human face even at the edge of the shooting range of the eye tracking cameras 14.
  • the size of the preset face may refer to a preset standard face model, for example, the size in the horizontal direction may be 135 mm or larger.
  • the size of the separation area g1 may be larger than the preset face size at the minimum distance of the preset distance range of the eye tracking device (that is, at s1 shown in Figure 2). This is because the shooting ranges of the eye-tracking camera 14 and the image acquisition component 15 are both fan-shaped in the horizontal direction, and the field of view of the image acquisition component 15 is larger than that of the eye-tracking camera 14, so the size of the separation area g1 (the shaded area in Figure 2) in the direction parallel to the surface m1 to be observed gradually increases in the direction away from the surface m1 to be observed; as shown in Figure 2, the size x1 of the separation area g1 at s1, the minimum distance of the preset distance range, is smaller than its size x2 at s2, the maximum distance. Based on this, when the size of the separation area g1 is larger than the preset face size at the minimum distance of the preset distance range, then at every position in the preset distance range the size of the separation area g1 between the edge of the shooting range of the image acquisition component 15 and the edge of the shooting range of the eye tracking camera 14 will be larger than the preset face size.
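The claim that the separation area widens with distance follows from the two fan-shaped ranges sharing (approximately) the same apex. A sketch with assumed angles (image-camera FOV v = 90°, combined tracking FOV 60°; both values are illustrative only, taken from the example parameters elsewhere in the text):

```python
import math

def separation_width_mm(v_deg, a_total_deg, dist_mm):
    """Per-side gap between the image-acquisition camera's coverage edge
    and the tracking cameras' combined coverage edge at distance dist_mm,
    assuming both fan-shaped ranges open from roughly the same point.
    """
    hv = math.radians(v_deg) / 2
    ha = math.radians(a_total_deg) / 2
    # Each half-width grows linearly with distance; the gap is their difference.
    return dist_mm * (math.tan(hv) - math.tan(ha))

x1 = separation_width_mm(90, 60, 400)  # at the minimum distance s1
x2 = separation_width_mm(90, 60, 700)  # at the maximum distance s2
```

With these assumed angles the gap at 400 mm is already around 169 mm, above the 135 mm face width quoted above, and it only grows with distance, matching the argument in the text.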
  • the image acquisition component 15 is located on the bracket 11 and below the mounting part 111 , and the image acquisition component 15 is electrically connected to the control component 12 .
  • the lower part here may refer to the side of the surface m1 to be observed facing the ground. Since the eyeball has structures such as eyelashes and eyelids, when the image acquisition component 15 is located at a higher position, it may be blocked by the eyelashes, eyelids and other structures, making it difficult to obtain a more complete and clear image of the eyeball.
  • In the eye tracking device provided by the embodiment of the present application, the image acquisition component 15 is located below the mounting part 111, in a lower position. The probability that the image acquisition component 15 is blocked by structures such as eyelashes and eyelids is therefore small, the probability of acquiring a complete and clear image of the eyeball is higher, and it is easier to determine the gaze position of the eyeball.
  • the image acquisition component 15 includes a color camera
  • the eye tracking camera 14 includes an infrared camera
  • the resolution of the color camera is smaller than that of the infrared camera. Since the color camera is used to determine the position of the eyeball, but does not need to determine the position of the pupil and light spot, the requirement for resolution is low, and a color camera with a lower resolution can be set in the image acquisition component 15 to implement the function.
  • the color camera is a conventional camera, which is less difficult to manufacture. When the resolution is lower, the cost will be lower, which can reduce the cost of the eye tracking device.
  • the color camera has a resolution of 640*480, a frame rate of 120fps, and a horizontal field of view of 90°.
  • the fill light assembly 13 includes at least two fill light lamps 131 , and the at least two fill light lamps 131 are located at different positions on the bracket 11 .
  • the light emitted by the fill light can form a spot on the eyeball. Since the position of the fill light and the position of the eye tracking camera are known, one method is to use the position of the pupil of the eyeball in the image collected by the eye tracking camera, the position of the spot formed on the eyeball by the light emitted by the fill light, and the positions of the fill light and the eye tracking camera to determine the gaze position of the eyeball.
  • When the image collected by the eye-tracking camera does not include the complete eyeball, the light spot may be lost, and the gaze position of the eyeball cannot be determined.
  • In the embodiment of the present application, multiple fill lights are provided at different positions, so the multiple light spots generated on the eyeball by the light they emit are also located at different positions, and the probability of all light spots being lost at the same time is smaller, which increases the probability of successfully determining the gaze position of the eyeball.
  • the fill light can be an infrared fill light.
  • the at least two fill lights 131 are arranged in a horizontal direction on the bracket 11. As noted above, the eyeball usually moves in the horizontal direction relative to the surface m1 to be observed, so when the at least two fill lights 131 are arranged in the horizontal direction on the bracket 11, they can cover a larger range of eye movement, thereby improving the tracking range of the eye tracking device.
  • The eye tracking device provided by the embodiment of the present application is provided with at least two eye tracking cameras, and at least part of the shooting ranges of the at least two eye tracking cameras do not overlap. The at least two eye tracking cameras can therefore form a larger tracking range, and the control component can determine the eye gaze position within this larger tracking range, which solves the problem of the small tracking range of eye tracking devices in the related art and achieves the effect of increasing the tracking range.
  • In addition, the eye tracking device can achieve a larger eye tracking range through multiple eye tracking cameras with smaller shooting ranges, without using an eye tracking camera with a larger shooting range, thus reducing the manufacturing difficulty and cost of the eye tracking device.
  • the embodiment of the present application shows that the eye tracking device and the display panel are both located on the bracket to form an integral device.
  • the eye tracking device and the display panel can also be separate structures, and the embodiment of the present application does not limit this.
  • the eye tracking device includes 3 infrared cameras, 2 infrared LED lights, 1 color camera, and a control component. The color camera has a lower resolution and is used to capture the human face; its image parameters are as follows: resolution 640*480, frame rate 120fps, horizontal field of view 90°. The three infrared cameras have a higher resolution and are used to capture pupil images; they can be abbreviated as IR1, IR2, and IR3, with parameters as follows: resolution 1280*720, frame rate 120fps, horizontal field of view 30°. An infrared LED light with a wavelength of 850 nanometers is set between adjacent infrared cameras for infrared fill light; the overall hardware layout can be seen in Figure 4.
  • the process of determining the theoretical parameters of an eye-tracking camera can include:
  • Figure 5 is a schematic diagram of an eyeball observing a display surface in an embodiment of the present application.
  • the size of the display surface m1 of the screen is P (this size is the length of the diagonal of the display surface of the screen);
  • the horizontal observation range of the eyeball is H°
  • the vertical observation range is V°
  • the preset distance between the eyeball and the display surface m1 of the screen is (D1, D2)
  • the radius of the eyeball is r;
  • the angle subtended by the screen at the eyeball is: θ_H*θ_V, as shown in Figure 2;
  • the arc length of the pupil's lateral movement is C:
  • the pupil positioning error is n pixels, and the required sight line calculation accuracy is e°.
  • the degree of pupil rotation is ⁇ H.
  • when the pupil's gaze position moves in the horizontal direction, the number of pixels it moves is m:
  • the opening angle formed by the pupil movement arc length C in the camera is ⁇ , and the formula is as follows:
  • the viewing distance is set to 400-700mm.
  • the viewing distance can be equal to the preset distance of the eye tracking device for eye tracking.
  • the viewing angle range is 60°.
  • the pupil positioning error is 0 pixel, and the line-of-sight calculation accuracy is ≤1°. Based on these known parameters and the above content, the number of pixels captured by the eye-tracking camera within a 1° range needs to be at least 45.2, that is, the eye-tracking camera needs at least 45.2 pixels per degree.
  • this function can be realized through at least two eye-tracking cameras with smaller field of view and one color camera.
  • the optical axes of the image acquisition component 15 (including the color camera) and the three eye tracking cameras (infrared cameras) point directly forward; the color camera has a large field of view, and its shooting range covers the shooting ranges of all the infrared cameras.
  • the preset distance of the eye tracking device for eye tracking can be set to 400-700mm, and the required eye tracking range is ±30°. That is, the parameter requirements of the eye tracking device are: at the nearest operating distance of 400mm, line-of-sight calculation can still be performed when the human head moves by ±30°; similarly, at 700mm, line-of-sight calculation must still be possible when the head moves by ±30°.
  • the shooting ranges of adjacent eye tracking cameras are required to have an overlapping area whose width is greater than the width of a single eye and whose height is greater than the height of a single eye;
  • the width of a single eye is about 30mm, and the height of a single eye is about 20mm; then the horizontal overlapping area of adjacent eye tracking cameras is at least 30mm, and the vertical overlapping area is at least 20mm.
  • the width of the overlapping area q1 at the minimum distance s1 of the preset distance range of the eye-tracking device is p (single eye width)
  • the lateral FOV of the eye-tracking camera is ⁇
  • the overall field of view of the eye-tracking device is θ.
  • the width (dimension in the horizontal direction) of eye tracking at this distance can be obtained:
  • the field of view angle must be ⁇
  • the lateral tracking width W1 must be at least:
  • the horizontal tracking width W2 is at least:
  • the following settings are made: at the left boundary in the horizontal direction, right-eye tracking is used, and at the right boundary in the horizontal direction, left-eye tracking is used, so that when the head moves to a boundary, the eye tracking camera does not need to cover both eyes; covering one eye is sufficient.
  • the width of the horizontal shooting coverage of the three eye tracking cameras at s1 needs to be ⁇ W1
  • the width of the horizontal shooting coverage at s2 needs to be ⁇ W2, as shown in Figure 2; then:
  • the distance between adjacent eye tracking cameras 14 in the horizontal direction is 166mm, as shown in Figure 2; at s1, the size of the overlapping area q1 in the horizontal direction is 30mm. If the camera aspect ratio is 16:9, the longitudinal field of view is 16°. From the above, the PPD required by the eye tracking camera is 45.2, so the horizontal resolution is 1265 and the vertical resolution is 723; the resolution of the eye-tracking camera therefore only needs to be greater than 1265*723.
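As a sanity check on the numbers above, the spacing and resolution can be reproduced with a short script. This is only a sketch: the overlap constraint L ≤ 2·D1·tan(α/2) − p is inferred from the geometry described in this section (one camera's coverage at the nearest distance, minus a single-eye-wide overlap), and the lateral FOV is derived here from the stated 16° longitudinal FOV and 16:9 aspect ratio rather than quoted directly:

```python
import math

D1 = 400.0        # nearest tracking distance s1, in mm
p = 30.0          # single-eye width, in mm (required horizontal overlap)
ppd = 45.2        # required pixels per degree from the accuracy derivation
fov_v = 16.0      # longitudinal field of view, degrees

# Lateral FOV from the 16:9 aspect ratio (ratio applied in tangent space)
fov_h = 2 * math.degrees(math.atan((16 / 9) * math.tan(math.radians(fov_v / 2))))

# Width covered by one camera at the nearest distance D1
cam_width = 2 * D1 * math.tan(math.radians(fov_h / 2))

# Adjacent cameras may be at most this far apart while keeping a p-wide overlap
L_max = cam_width - p
print(f"max spacing ≈ {L_max:.0f} mm")   # ≈ 170 mm; the text chooses 166 mm

# Minimum sensor resolution implied by the PPD requirement
res_h = ppd * fov_h   # ≈ 1268, close to the 1265 stated in the text
res_v = ppd * fov_v   # ≈ 723
```

The small gap between the computed ≈170 mm and the chosen 166 mm leaves margin for the eye-spacing term e that the patent's full formula includes.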
  • the eye-tracking camera provided by the embodiment of the present application greatly reduces the requirement on lateral resolution (as for the longitudinal resolution requirement, the eye-tracking camera provided by the embodiment of the present application can have the same longitudinal resolution as the eye-tracking camera in the related art).
  • the resolution of an eye-tracking camera in the eye-tracking device provided by the embodiment of the present application may be 1280*800.
  • the placement angles and positions of the three eye tracking cameras 14 and the image acquisition component 15 are as shown in Figure 3.
  • the camera angle is determined as follows: the optical axes of the at least two eye-tracking cameras 14 are parallel to each other and coplanar; the first plane m2 determined by these optical axes intersects the vertical line c1 passing through the center of the surface to be observed, and the intersection point j lies within the preset distance range D1-D2, for example at the intermediate distance (D1+D2)/2. If the distance between the center of the display surface m1 and the eye-tracking camera at the bottom is k, then the placement angle b of the camera optical axis is:
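A minimal sketch of this camera-tilt calculation, assuming the relation b = atan(k / ((D1+D2)/2)) that follows from the geometry just described (aiming the optical axis at the intersection point j at the middle of the preset distance range); the value of k used below is a hypothetical example, not a figure from the patent:

```python
import math

def placement_angle_deg(k_mm: float, d1_mm: float, d2_mm: float) -> float:
    """Tilt of the camera optical axis, in degrees.

    The axis is aimed at a point on the perpendicular through the display
    center, at the middle of the preset distance range (D1+D2)/2, while the
    camera sits a distance k below the display center.
    """
    mid = (d1_mm + d2_mm) / 2
    return math.degrees(math.atan(k_mm / mid))

# Hypothetical example: cameras 180 mm below the display center,
# preset tracking distance 400-700 mm.
b = placement_angle_deg(180, 400, 700)
print(f"placement angle b ≈ {b:.1f}°")
```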
  • the shooting range of the color camera needs to cover the shooting ranges of all the infrared cameras; according to the standard face model, the average width of the face is 136mm.
  • a buffer area of face width can be added to the left and right of the eye tracking camera shooting area.
  • the horizontal shooting width of the three eye tracking cameras at 600mm can be calculated to be 952mm. Therefore, the horizontal shooting width of the color camera is:
  • the horizontal FOV of a color camera at 600mm is at least:
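The colon above precedes a formula image not reproduced in this extract; the arithmetic it stands for can be sketched as follows, using the figures given in this passage (952 mm infrared coverage at 600 mm, plus one 136 mm face-width buffer on each side):

```python
import math

ir_coverage = 952.0   # horizontal shooting width of the three IR cameras at 600 mm
face_width = 136.0    # average face width from the standard face model, mm
distance = 600.0      # viewing distance, mm

# Add one face-width buffer on each side of the IR coverage
color_width = ir_coverage + 2 * face_width   # 1224 mm

# Horizontal FOV needed for the color camera to span that width at 600 mm
fov_h = 2 * math.degrees(math.atan((color_width / 2) / distance))
print(f"color camera horizontal FOV ≈ {fov_h:.1f}°")
```

The result, ≈91°, is consistent with the 90° horizontal field of view listed for the color camera earlier in this section.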
  • the eye tracking device provided by the embodiment of the present application can achieve eye tracking over a horizontal range of more than 60° using only three conventional infrared cameras and one conventional color camera, far exceeding the eye tracking device in the related art, whose tracking angle in the horizontal direction is only about 32°; and the total cost of three conventional infrared cameras and one conventional color camera is relatively low, generally no more than a thousand yuan.
  • the eye tracking device can also have other parameters; for example, the number of eye tracking cameras, the field of view of the eye tracking camera, and the preset distance range for eye tracking can also take other values.
  • FIG. 6 is a flow chart of an eye tracking method provided by an embodiment of the present application.
  • the eye tracking method can be applied to the control component of the eye tracking device provided by the above embodiment.
  • the method includes the following steps:
  • Step 601 Turn on the fill light component.
  • Step 602 Obtain first images acquired by at least two eye-tracking cameras.
  • Step 603 Determine the gaze position of the eyeball based on the first image.
  • the eye tracking method acquires first images through at least two eye tracking cameras and determines the gaze position of the eyeball based on the first images. Because the shooting ranges of the at least two eye tracking cameras at least partially do not overlap, the at least two eye tracking cameras can form a larger tracking range, within which the control component can determine the gaze position of the eyeball. This solves the problem in the related art that the tracking range for the eye's gaze position is small, and achieves the effect of increasing the tracking range of the eye tracking device.
  • FIG. 7 is a flow chart of an eye tracking method provided by an embodiment of the present application.
  • the eye tracking method can be applied to the control component of the eye tracking device provided by the above embodiment.
  • the method includes the following steps:
  • Step 701 Turn on the fill light component.
  • the light emitted by the fill light component can form a light spot on the eyeball. Since the position of the fill light component and the position of the eye tracking camera are known, one approach is to determine the gaze position of the eyeball using the position of the eyeball's pupil and the position of the spot formed on the eyeball by the light of the fill light component in the image collected by the eye tracking camera, together with the positions of the fill light component and the eye tracking camera.
  • the fill light assembly includes at least two fill light lamps, and the at least two fill light lamps are located at different positions on the bracket.
  • the light emitted by the fill light can form a spot on the eyeball. Since the position of the fill light and the position of the eye tracking camera are known, one approach is to determine the gaze position of the eyeball using the position of the eyeball's pupil and the position of the spot formed on the eyeball by the fill light in the image collected by the eye tracking camera, together with the positions of the fill light and the eye tracking camera.
  • the image collected by the eye-tracking camera may not include the complete eyeball, and the light spot may be lost; in that case, the gaze position of the eyeball cannot be determined.
  • multiple fill lights are provided at different positions, so the multiple light spots that their light produces on the eyeball are also located at different positions, and the probability of all the spots being lost at the same time is smaller, which increases the probability of successfully determining the gaze position of the eyeball.
  • Step 702 Obtain the second image collected by the image acquisition component.
  • the image acquisition component has a large shooting range, and the shooting range of each eye tracking camera is located within the shooting range of the image acquisition component.
  • Step 703 Obtain face information based on the second image.
  • the face information may include the location of the area of the face in the second image.
  • a face detection algorithm can be used to obtain the face information in the second image.
  • Step 704 Determine the position of the eyeball in the second image based on the face information.
  • the position of the eyeball in the second image can be further determined based on the face information, and the position can be identified in the form of eyeball area coordinates.
  • the face pose coordinates can also be determined based on the face information.
  • Step 705 Determine the position of the eyeball in the first image based on the position of the eyeball in the second image.
  • the conversion process may include:
  • the internal parameter matrices of camera A (camera A can refer to the image acquisition component) and camera B (camera B can refer to the eye tracking camera) are as follows:
  • z represents the distance from the eyeball to the surface to be observed, s represents the horizontal distance between camera A and camera B, and t represents the vertical distance between camera A and camera B; (u A , v A ) represent the eyeball area coordinates in camera A, and (u B , v B ) represent the eyeball area coordinates in camera B; fxA is a parameter related to the horizontal focal length of camera A and can be regarded as the horizontal focal length of camera A in pixel units; fxB can be regarded as the horizontal focal length of camera B in pixel units, fyA as the vertical focal length of camera A in pixel units, and fyB as the vertical focal length of camera B in pixel units.
  • U A is a parameter related to the horizontal resolution of camera A, which can be approximately half of the horizontal resolution of camera A; V A is a parameter related to the vertical resolution of camera A, which can be approximately half of the vertical resolution of camera A.
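The pixel-coordinate conversion between camera A and camera B described above can be sketched as a standard pinhole back-projection at depth z followed by re-projection. This is an illustrative implementation under the stated assumptions (principal points U and V taken as half the resolution), not the patent's exact formula:

```python
def convert_pixel(uA, vA, z, s, t, fxA, fyA, UA, VA, fxB, fyB, UB, VB):
    """Map a pixel (uA, vA) seen by camera A to the pixel (uB, vB) in camera B.

    z: distance from the eyeball to the cameras (depth), same unit as s and t.
    s, t: horizontal/vertical offset of camera B relative to camera A.
    fx*, fy*: focal lengths in pixel units; U*, V*: principal points.
    """
    # Back-project the pixel to a 3D point at depth z in camera A's frame
    x = (uA - UA) * z / fxA
    y = (vA - VA) * z / fyA
    # Shift into camera B's frame, then project with B's intrinsics
    uB = fxB * (x - s) / z + UB
    vB = fyB * (y - t) / z + VB
    return uB, vB

# With identical intrinsics and zero offset, the pixel maps to itself:
print(convert_pixel(400, 300, 600, 0, 0, 1000, 1000, 320, 240, 1000, 1000, 320, 240))
# → (400.0, 300.0)
```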
  • Step 706 Based on the position of the eyeball in the first image and the first image, obtain the position of the pupil of the eyeball and the position of the light spot on the eyeball of the light emitted by the supplementary light component.
  • This step can include:
  • an adaptive threshold detection method based on cumulative grayscale histogram can be used for pupil detection to obtain the position of the pupil.
  • the method includes:
  • the pupil center coordinates can be used as the coordinates of the pupil.
  • the pupil center coordinates can also be obtained through other methods, and the embodiments of the present application do not limit this.
  • this pixel area includes the pupil coordinates and is centered on the pupil coordinates; the value of m can be determined based on the diameter of the pupil (for example, when the pupil diameter range is 20 to 50 pixels, m can be set to 80, so that the spot is detected within an 80-pixel area).
  • the spot is generally located near the pupil. Therefore, spot detection within a certain range around the pupil can improve the accuracy and speed of spot detection.
  • the spot detection threshold can be set to 200, which is an empirical value. The spot image is obtained by binarization according to this threshold, and the spot center coordinates are obtained by ellipse fitting; the spot center coordinates can be used as the position of the spot.
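A simplified sketch of this spot-detection step: windowed thresholding around the pupil, with the window size m=80 and threshold 200 taken from the text. For brevity it uses an intensity-region centroid in place of the ellipse fit mentioned above:

```python
import numpy as np

def spot_center(img, pupil_xy, m=80, thresh=200):
    """Return the (x, y) center of the bright spot near the pupil, or None.

    img: 2D grayscale array; pupil_xy: (x, y) pupil coordinates;
    m: side of the search window centered on the pupil; thresh: binarization level.
    """
    px, py = pupil_xy
    h, w = img.shape
    x1, x2 = max(0, px - m // 2), min(w, px + m // 2)
    y1, y2 = max(0, py - m // 2), min(h, py + m // 2)
    roi = img[y1:y2, x1:x2]
    ys, xs = np.nonzero(roi >= thresh)       # binarize within the window
    if xs.size == 0:
        return None                          # spot lost
    return (x1 + xs.mean(), y1 + ys.mean())  # centroid of the bright region

# Synthetic example: a 5x5 bright patch near the pupil at (100, 100)
img = np.zeros((200, 200), dtype=np.uint8)
img[100:105, 110:115] = 255
print(spot_center(img, (100, 100)))   # → (112.0, 102.0)
```

Searching only near the pupil, as the text explains, both speeds up detection and reduces false positives from other reflections.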
  • step 706 includes:
  • the positions of the pupils of the two eyeballs are obtained, as well as the positions of the spots formed on the two eyeballs by the light emitted by the at least two fill lights.
  • Step 707 Obtain the line of sight models of at least two eye-tracking cameras.
  • if the line-of-sight models of the at least two eye-tracking cameras have already been determined before the current moment, the previously determined models can be obtained directly; if they have not been determined before the current moment, the line-of-sight models of the at least two eye-tracking cameras are determined in this step.
  • the line of sight model of the eye tracking camera is used to determine the gaze position of the eyeball based on the position of the eyeball's pupil and the position of the light spot in the first image collected by the eyeball tracking camera.
  • the first image includes two eyeballs. If the fill light component includes multiple fill lights, then for each eye tracking camera, the pupil of each eyeball and the spot generated by each fill light can correspond to a line-of-sight model. For example, if the number of fill lights is n, then each eye tracking camera corresponds to 2n line-of-sight models.
  • step 707 may include:
  • a line of sight model corresponding to at least one sample data is determined.
  • the sample data includes the position of a pupil of one eyeball among the two eyeballs, and the position of a light spot on one eyeball, and the line of sight model corresponding to at least one sample data is the line of sight model of the first eyeball tracking camera.
  • determining the line of sight model corresponding to at least one sample data may specifically include:
  • the display surface of the display panel can display preset content.
  • the display panel can display 9 calibration points in sequence.
  • the eyeball looks at the calibration point for a certain time (such as 2 seconds); the first calibration point then disappears, the second calibration point appears, and the same operation as for the first calibration point is performed until all 9 calibration points have been displayed. For each eye tracking camera, the relative coordinates of the pupil and the first spot (from one of the two fill lights) generate 9 calibration data, while the relative coordinates of the pupil and the second spot (from the other fill light) also generate 9 calibration data, that is:
  • Each eye tracking camera will generate four sets of calibration data (the first light spot of the left eye - pupil, the second light spot of the left eye - pupil, the first light spot of the right eye - pupil, the second light spot of the right eye - pupil).
  • the four sets of calibration data obtained by the first eye-tracking camera are coordinates in the coordinate system of the first eye-tracking camera. These coordinates can be converted into coordinates in the world coordinate system.
  • a conversion formula can be:
  • u B and v B represent the spot coordinates in the coordinate system of the first eye tracking camera
  • RB represents the rotation of the optical axis of the first eye tracking camera relative to the world coordinate axes, and T B represents the coordinate difference between the origin of the first eye tracking camera's coordinate system and the origin of the world coordinate system (the origin of the world coordinate system can be located at the center of the display surface, with the +Y axis pointing directly upward, the +X axis directly to the right, and the +Z axis directly forward);
  • x w , y w , z w represents the spot coordinates in the world coordinate system;
  • the internal parameter matrix in the formula is that of the first eye-tracking camera; for the other eye-tracking cameras, it can be replaced by the internal parameter matrix of the corresponding eye-tracking camera.
  • the coordinate conversion method of the pupil can be similar to this method, and will not be described again in the embodiment of the present application.
  • the line of sight model of the first eye tracking camera can be determined through these four sets of calibration data.
  • the line of sight model uses a polynomial regression model, for example of the following form:
    G_x = a_0 + a_1·x + a_2·y + a_3·x·y + a_4·x² + a_5·y²
    G_y = b_0 + b_1·x + b_2·y + b_3·x·y + b_4·x² + b_5·y²
    where (G_x, G_y) is the gaze position, a_0 to a_5 and b_0 to b_5 are unknown parameters obtained through the above four sets of calibration data, and x and y are the relative coordinates of the spot and the pupil.
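The calibration-fit step can be sketched as follows, assuming a common 6-term polynomial form (constant, x, y, xy, x², y²) for the regression — an assumption, since the patent's exact polynomial terms are not reproduced in this extract:

```python
import numpy as np

def features(x, y):
    """6-term polynomial basis in the spot-pupil relative coordinates (x, y)."""
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_gaze_model(rel_xy, gaze_xy):
    """Least-squares fit of the gaze mapping from calibration data.

    rel_xy: (N, 2) spot-pupil relative coordinates at the calibration points.
    gaze_xy: (N, 2) known on-screen gaze positions (e.g. the 9 calibration points).
    Returns (a, b): coefficient vectors for the x and y gaze coordinates.
    """
    A = features(rel_xy[:, 0], rel_xy[:, 1])
    a, *_ = np.linalg.lstsq(A, gaze_xy[:, 0], rcond=None)
    b, *_ = np.linalg.lstsq(A, gaze_xy[:, 1], rcond=None)
    return a, b

def predict(a, b, x, y):
    f = features(np.atleast_1d(x), np.atleast_1d(y))
    return (f @ a).item(), (f @ b).item()

# Synthetic check: generate a 3x3 grid of calibration points from known coefficients
a_true = np.array([1.0, 2.0, 0.5, 0.1, 0.3, -0.2])
b_true = np.array([-1.0, 0.4, 3.0, 0.2, -0.1, 0.5])
xs, ys = np.meshgrid([-1.0, 0.0, 1.0], [-1.0, 0.0, 1.0])
rel = np.column_stack([xs.ravel(), ys.ravel()])
F = features(rel[:, 0], rel[:, 1])
gaze = np.column_stack([F @ a_true, F @ b_true])
a_fit, b_fit = fit_gaze_model(rel, gaze)
print(np.allclose(a_fit, a_true), np.allclose(b_fit, b_true))   # → True True
```

Nine calibration points are enough to over-determine the six unknowns per coordinate, which is why the 9-point calibration sequence described above suffices.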
  • the line-of-sight model of the first eye-tracking camera can be converted, through coordinate transformation, into the line-of-sight models of the other eye-tracking cameras.
  • (that is, the conversion relationship between coordinates in the coordinate system of the first eye-tracking camera and coordinates in the coordinate system of the second eye-tracking camera can be determined); a coordinate conversion formula is as follows:
  • u Bn and v Bn represent the coordinates of the light spot in the coordinate system of the second eye tracking camera
  • u B2 and v B2 represent the image coordinates in the first eye tracking camera.
  • fxBn represents the horizontal focal length of the second eye tracking camera, fyBn represents the vertical focal length of the second eye tracking camera, fyB2 represents the vertical focal length of the first eye tracking camera, and fxB2 represents the horizontal focal length of the first eye tracking camera;
  • U B2 is a parameter related to the horizontal resolution of the first eye-tracking camera and may be approximately half of that horizontal resolution; V B2 is a parameter related to the vertical resolution of the first eye-tracking camera and may be approximately half of that vertical resolution; s represents the horizontal distance between the first eye-tracking camera and the second eye-tracking camera, and t represents the vertical distance between them;
  • first eye-tracking The internal parameter matrices of camera B2 and second eye tracking camera Bn can be as follows:
  • the conversion relationship between the coordinates of the second eye tracking camera and the coordinates of the world coordinate system can be determined. Then based on these conversion relationships, the line of sight model of each eye tracking camera can be determined through the above four sets of calibration data.
  • Step 708 Determine the gaze position of the eyeball based on the line of sight models of at least two eyeball tracking cameras, as well as the pupil position of the eyeball and the position of the light spot.
  • Step 708 may include:
  • At least one gaze position of the eyeball can be determined through the line of sight models of at least two eye tracking cameras.
  • if only one gaze position is determined, this gaze position can be determined as the target gaze position; if multiple gaze positions are determined, the target gaze position can be determined based on the average of the coordinates of the multiple gaze positions.
  • although the eye tracking device used in the eye tracking method provided by the embodiment of the present application has multiple eye tracking cameras, only some of them (at least two) need to be turned on when implementing the eye tracking method. For example, when the eyeball is within the shooting range of one eye-tracking camera, that eye-tracking camera and the eye-tracking cameras adjacent to it can be turned on while the other eye-tracking cameras remain off. This saves power consumption of the eye tracking device while ensuring that the eye tracking function is implemented normally.
  • for example, if the eye tracking device includes five eye tracking cameras arranged horizontally and the eyeball is within the shooting range of the central one, the central eye tracking camera and the two adjacent eye tracking cameras can be turned on while the two edge eye tracking cameras remain off, which saves the power consumption of those two eye tracking cameras without affecting the normal implementation of the eye tracking function.
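The camera-selection policy in this example can be sketched as a small helper; the "active camera plus its immediate neighbors" rule follows the description, and the function name is illustrative:

```python
def cameras_to_enable(active: int, n_cameras: int) -> list:
    """Enable the camera whose shooting range contains the eyeball,
    plus its immediate neighbors; leave the rest off to save power."""
    return [i for i in (active - 1, active, active + 1) if 0 <= i < n_cameras]

# Five cameras in a row, eyeball in front of the central one (index 2):
print(cameras_to_enable(2, 5))   # → [1, 2, 3]; the two edge cameras stay off
```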
  • the eye tracking method acquires first images through at least two eye tracking cameras and determines the gaze position of the eyeball based on the first images. Because the shooting ranges of the at least two eye tracking cameras at least partially do not overlap, the at least two eye tracking cameras can form a larger tracking range, within which the control component can determine the gaze position of the eyeball. This solves the problem in the related art that the tracking range for the eye's gaze position is small, and achieves the effect of increasing the tracking range of the eye tracking device.
  • the embodiment of the present application further provides a display device 20 .
  • the display device 20 includes a display panel 21 , a housing 22 and an eye tracking device 23 .
  • the eye tracking device 23 includes: a control component 231 , a fill light component 232 , an image acquisition component 233 and at least two eye tracking cameras 234 .
  • the fill light component 232 , the image acquisition component 233 and at least two eye tracking cameras 234 are located on the housing 22 , and all face the light-emitting direction of the display panel 21 .
  • the at least two eye tracking cameras 234 are located at at least two different positions on the housing 22 .
  • the control component 231 is electrically connected to at least two eye tracking cameras 234 .
  • the bracket in the eye tracking device provided in the above embodiments can be combined with the housing of the display device, for example, can be integrated with the housing, or can be installed on the housing.
  • At least two eye-tracking cameras 234 are located below the display panel 21 on the housing 22 and arranged along the horizontal direction f1.
  • the display surface m1 of the display panel 21 is rectangular, and one side of the display surface m1 is parallel to the horizontal direction f1.
  • the distance between any two adjacent eye tracking cameras 234 is equal.
  • the eye-tracking cameras 234 are coplanar with the display surface m1 of the display panel 21 , the number of eye-tracking cameras 234 is 3, and the spacing between two adjacent eye-tracking cameras 234 among the three eye-tracking cameras 234 satisfies:
  • where L is used to determine the spacing, p is determined by the width of the eyeball, e is determined by the distance between the two eyeballs, α is the horizontal field of view of the eye tracking camera 234, D1 and D2 are respectively the minimum distance and maximum distance, in the direction perpendicular to the display surface m1, between the display surface m1 and the preset target area in which the eye tracking cameras 234 perform eye tracking, and θ is the preset overall field of view of the three eye-tracking cameras.
  • the optical axes of the three eye-tracking cameras 234 are parallel to each other and coplanar.
  • the first plane determined by the optical axes of the three eye-tracking cameras intersects the perpendicular line passing through the center of the display surface m1, and the intersection point is located in the target area.
  • the intersection point is located at the center of the target area in the direction perpendicular to the display surface, and the eye tracking camera satisfies:
  • b is used to determine the angle between the optical axis of the eye tracking camera 234 and the display surface m1
  • k is the distance in the vertical direction between the center of the display surface m1 and the eye tracking camera 234.
  • the image capture component 233 is located below the display panel 21 on the housing 22 and is equidistant from both ends of the lower edge of the display surface m1. That is, the image capture component 233 can be located at the middle position of the lower edge of the display surface m1, so that the shooting range of the image capture component 233 can cover the shooting range of the eye tracking camera 234.
  • the housing 22 includes a base 221 and a frame 222 located on the base 221; the light filling component 232, the image acquisition component 233 and at least two eye tracking cameras 234 are all installed on the frame 222.
  • the control component 231 may include a processor in the display device.
  • the control component 231 may be located in the host of the desktop display device.
  • the display panel can be various display panels such as a liquid crystal display panel or an organic light-emitting diode display panel.
  • the display device can be a monitor located on a table, or can be a vertical monitor, etc., and can determine the gaze position of the eyeballs when displaying images.
  • the display device provided by the embodiment of the present application is provided with at least two eye-tracking cameras whose shooting ranges at least partially do not overlap. The at least two eye-tracking cameras can therefore form a larger tracking range, within which the control component can determine the gaze position of the eyeballs. This solves the problem in the related art that the tracking range of an eye-tracking device is small, and achieves the effect of increasing the tracking range of the eye-tracking device.


Abstract

The present application provides an eye-tracking device, an eye-tracking method, and a display device, belonging to the field of eye-tracking technology. The eye-tracking device (10) includes: a bracket (11), a control component (12), a fill-light component (13), and at least two eye-tracking cameras (14); the at least two eye-tracking cameras (14) are located at at least two different positions on the bracket, and the shooting ranges of the at least two eye-tracking cameras (14) at least partially do not overlap; the control component (12) is electrically connected to the at least two eye-tracking cameras (14). Because the eye-tracking device (10) includes at least two eye-tracking cameras (14) whose shooting ranges at least partially do not overlap, the at least two eye-tracking cameras (14) can form a larger tracking range, within which the control component can determine the gaze position of the eyeball. This solves the problem in the related art that the tracking range of an eye-tracking device is small, and achieves the effect of increasing the tracking range of the eye-tracking device.

Description

Eye-tracking device, method, and display device — Technical Field
The present application relates to the field of eye-tracking technology, and in particular to an eye-tracking device, an eye-tracking method, and a display device.
Background
An eye-tracking device is a device capable of determining the gaze position of an eyeball.
Current eye-tracking devices usually include an eye-tracking camera and a fill light. The eye-tracking camera can capture images within a certain range, and the eye-tracking device can determine the gaze position of the eyeball from the pupil position of the eyeball in the image and the position of the fill light's spot on the eyeball. The range that the eye-tracking camera can capture is the tracking range of the eye-tracking device; when the eyeball is within this tracking range, the eye-tracking device can determine the gaze position of the eyeball.
However, the tracking range of the above eye-tracking device is small.
Summary of the Invention
The embodiments of the present application provide an eye-tracking device, an eye-tracking method, and a display device. The technical solutions are as follows:
According to one aspect of the embodiments of the present application, an eye-tracking device is provided, the eye-tracking device including: a bracket, a control component, a fill-light component, and at least two eye-tracking cameras;
the at least two eye-tracking cameras are located at at least two different positions on the bracket, and the shooting ranges of the at least two eye-tracking cameras at least partially do not overlap;
the fill-light component is located on the bracket;
the control component is electrically connected to the at least two eye-tracking cameras and is used to determine the gaze position of an eyeball within the shooting ranges of the at least two eye-tracking cameras.
Optionally, among the at least two eye-tracking cameras, the shooting ranges of two adjacent eye-tracking cameras have an overlapping area.
Optionally, the size of the overlapping area is larger than the size of a preset eyeball.
Optionally, the eye-tracking device further includes: an image acquisition component;
the image acquisition component is located on the bracket and is electrically connected to the control component, and the shooting ranges of the at least two eye-tracking cameras are located within the shooting range of the image acquisition component.
Optionally, there is a spacing area between the edges of the shooting ranges of the at least two eye-tracking cameras and the edge of the shooting range of the image acquisition component, and the size of the spacing area is larger than the size of a preset human face.
Optionally, the bracket is located outside the surface to be observed and includes a mounting portion for arranging the surface to be observed; the control component is used to determine the gaze position, on the surface to be observed, of an eyeball within the shooting ranges of the at least two eye-tracking cameras.
Optionally, the control component is used to determine the gaze position, on the surface to be observed, of an eyeball that is within the shooting ranges of the at least two eye-tracking cameras and whose distance from the surface to be observed falls within a preset distance range;
the optical axes of the at least two eye-tracking cameras are parallel to each other and coplanar, the first plane determined by the optical axes of the at least two eye-tracking cameras intersects the perpendicular line passing through the center of the surface to be observed, and the intersection point lies within the preset distance range.
Optionally, the intersection point is located at the center of the preset distance range.
Optionally, the surface to be observed is the display surface of a display panel, and the bracket includes a mounting portion for mounting the display panel;
the at least two eye-tracking cameras are located below the mounting portion.
Optionally, the at least two eye-tracking cameras are arranged on the bracket along the horizontal direction.
Optionally, the surface to be observed is the display surface of a display panel, and the bracket includes a mounting portion for mounting the display panel;
the eye-tracking device further includes: an image acquisition component;
the image acquisition component is located on the bracket, below the mounting portion, and is electrically connected to the control component, and the shooting ranges of the at least two eye-tracking cameras are located within the shooting range of the image acquisition component.
Optionally, the fill-light component includes at least two fill lights, and the at least two fill lights are located at different positions on the bracket.
Optionally, the at least two fill lights are arranged on the bracket along the horizontal direction.
Optionally, the image acquisition component includes a color camera, the eye-tracking camera includes an infrared camera, and the resolution of the color camera is smaller than the resolution of the infrared camera.
根据本申请实施例的另一方面,提供一种眼球跟踪方法,所述方法用于眼 球跟踪装置的控制组件中,所述眼球跟踪装置还包括:支架、补光组件以及至少两个眼球跟踪相机;所述至少两个眼球跟踪相机位于所述支架的至少两个不同的位置,且所述至少两个眼球跟踪相机的拍摄范围至少具有部分拍摄范围不重叠;所述补光组件位于所述支架上;所述控制组件与所述至少两个眼球跟踪相机电连接,
所述方法包括:
开启所述补光组件;
获取所述至少两个眼球跟踪相机获取的第一图像;
基于所述第一图像确定眼球的注视位置。
可选地,所述眼球跟踪装置还包括:图像采集组件;所述图像采集组件位于所述支架上,且与所述控制组件电连接,
所述基于所述第一图像确定眼球的注视位置之前,所述方法还包括:
获取所述图像采集组件采集的第二图像;
基于所述第二图像获取人脸信息;
基于所述人脸信息确定眼球在所述第二图像中的位置;
所述基于所述第一图像确定眼球的注视位置,包括:
基于所述眼球在所述第二图像中的位置,确定所述眼球在所述第一图像中的位置;
基于所述眼球在所述第一图像中的位置以及所述第一图像,确定所述眼球的注视位置。
可选地,所述第一图像为所述至少两个眼球跟踪相机中的第一眼球跟踪相机采集的图像,
所述基于所述眼球在所述第一图像中的位置以及所述第一图像,确定所述眼球的注视位置,包括:
基于所述眼球在所述第一图像中的位置以及所述第一图像,获取所述眼球的瞳孔的位置,以及所述补光组件发出的光线在所述眼球上的光斑的位置;
获取所述至少两个眼球跟踪相机的视线模型,所述眼球跟踪相机的视线模型用于根据所述眼球跟踪相机采集得到的第一图像中,所述眼球的瞳孔的位置以及所述光斑的位置,确定所述眼球的注视位置;
基于所述至少两个眼球跟踪相机的视线模型,以及所述眼球的瞳孔位置和光斑的位置,确定所述眼球的注视位置。
可选地,所述基于所述至少两个眼球跟踪相机的视线模型,以及所述眼球的瞳孔位置和光斑的位置,确定所述眼球的注视位置,包括:
分别通过所述至少两个眼球跟踪相机的视线模型确定得到所述眼球的至少一个注视位置;
基于所述至少一个注视位置,确定一个目标注视位置。
可选地,所述第一图像为所述至少两个眼球跟踪相机中第一眼球跟踪相机采集的图像,
所述补光组件包括至少两个补光灯,所述至少两个补光灯位于所述支架上的不同位置,所述第一图像中包括两个眼球,
所述基于所述眼球在所述第一图像中的位置以及所述第一图像,获取所述眼球的瞳孔的位置,以及所述补光组件发出的光线在所述眼球上的光斑的位置,包括:
基于所述两个眼球在所述第一图像中的位置以及所述第一图像,获取所述两个眼球的瞳孔的位置,以及所述至少两个补光灯发出的光线在所述两个眼球上的光斑的位置;
所述获取所述至少两个眼球跟踪相机的视线模型,包括:
基于每个样本数据,确定至少一个所述样本数据对应的视线模型,所述样本数据包括所述两个眼球中一个眼球的瞳孔的位置,以及所述一个眼球上的一个光斑的位置,至少一个所述样本数据对应的视线模型为所述至少两个眼球跟踪相机中所述第一眼球跟踪相机的视线模型。
根据本申请实施例的另一方面,提供一种显示设备,所述显示设备包括显示面板、外壳以及眼球跟踪装置;
所述眼球跟踪装置包括:控制组件、补光组件、图像采集组件以及至少两个眼球跟踪相机;
所述补光组件、图像采集组件以及所述至少两个眼球跟踪相机位于所述外壳上,且均朝向所述显示面板的出光方向,所述至少两个眼球跟踪相机位于所述外壳的至少两个不同的位置;
所述控制组件与所述至少两个眼球跟踪相机电连接。
可选地,所述至少两个眼球跟踪相机在所述外壳上位于所述显示面板的下方,且沿水平方向排布。
可选地,所述显示面板的显示面呈矩形,且所述显示面的一条边与水平方 向平行;
所述至少两个眼球跟踪相机中,任意两个相邻的眼球跟踪相机之间的距离相等。
可选地,所述眼球跟踪相机与所述显示面板的显示面共面,所述眼球跟踪相机的数量为3,3个所述眼球跟踪相机中两个相邻的眼球跟踪相机之间的间距满足:
$$D_2\left(\tan\frac{\theta}{2}-\tan\frac{\alpha}{2}\right)-e\ \le\ L\ \le\ 2D_1\tan\frac{\alpha}{2}-p$$
其中,L用于确定所述间距,p由眼球宽度决定,e由两个眼球之间的间距决定,α为所述眼球跟踪相机的水平视场角,D1和D2分别为预设的所述眼球跟踪相机进行眼球跟踪的目标区域在垂直于所述显示面的方向上,与所述显示面之间的最小距离以及最大距离;
$$L\ \ge\ D_1\left(\tan\frac{\theta}{2}-\tan\frac{\alpha}{2}\right)$$
θ为预设的3个所述眼球跟踪相机的整体视场角。
可选地,3个所述眼球跟踪相机的光轴相互平行且共面,3个所述眼球跟踪相机的光轴确定的第一平面与经过所述显示面的中心的垂线相交,且交点位于所述目标区域内。
可选地,所述交点位于所述目标区域在垂直于所述显示面的方向上的中心,所述眼球跟踪相机满足:
$$\tan b=\frac{2k}{D_1+D_2}$$
其中,所述b用于确定所述眼球跟踪相机的光轴与所述显示面之间的夹角,所述k为所述显示面的中心与所述眼球跟踪相机在垂直方向上之间的距离。
可选地,所述图像采集组件在所述外壳上位于所述显示面板的下方,且与所述显示面的下边缘的两端的距离相等。
可选地,所述外壳包括底座以及位于所述底座上的框体;
所述补光组件、所述图像采集组件以及所述至少两个眼球跟踪相机均安装于所述框体上。
本申请实施例提供的技术方案带来的有益效果至少包括:
通过在眼球跟踪装置中设置至少两个眼球跟踪相机,且这至少两个眼球跟踪相机的拍摄范围至少具有部分拍摄范围不重叠,进而这至少两个眼球跟踪相机可以构成一个较大的跟踪范围,控制组件可以确定位于这一较大的跟踪范围内的眼球的注视位置,解决了相关技术中眼球跟踪装置的跟踪范围较小的问题,实现了增大眼球跟踪装置的跟踪范围的效果。
附图说明
为了更清楚地说明本申请实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1是本申请实施例提供的一种眼球跟踪装置的结构示意图;
图2是本申请实施例提供的另一种眼球跟踪装置的结构示意图;
图3是图2所示的眼球跟踪装置的仰视图;
图4是图3所示的眼球跟踪装置的左视图;
图5是本申请实施例中一种眼球观察显示面的示意图;
图6是本申请实施例提供的一种眼球跟踪方法的流程图;
图7是本申请实施例提供的另一种眼球跟踪方法的流程图;
图8是本申请实施例提供的一种显示设备的结构示意图。
具体实施方式
为使本申请的目的、技术方案和优点更加清楚,下面将结合附图对本申请实施方式作进一步地详细描述。
相关技术中,眼球跟踪装置通常仅包括一个眼球跟踪相机。而眼球跟踪相机由于其功能的特殊性,常规的眼球跟踪相机的拍摄范围(拍摄范围可以由眼球跟踪相机的视场角以及进行眼球跟踪的预设距离范围来衡量,对于同样的跟踪距离的范围,视场角越大,则拍摄范围越大,视场角越小,则拍摄范围越小)通常较小,例如视场角(Field of View,FOV)通常在30度以下,这就导致眼球跟踪装置的跟踪范围(跟踪范围可以与眼球跟踪相机的拍摄范围相同)较小,眼球非常容易脱离眼球跟踪装置的跟踪范围,导致眼球跟踪装置的用户体验较差。其中,预设距离是指眼球与眼球跟踪相机之间的距离,预设距离范围为该距离的范围,眼球跟踪相机可以对该范围内,且位于视场角中的眼球进行跟踪。
目前的眼球跟踪相机通常为红外相机,而视场角较大的红外相机,制造难度较大,通常需要定制,且成本较高。示例性的,以32寸屏幕为例,观看距离设定为400-700mm,观看范围为60°(也即是观看视线与屏幕的最小夹角大于30度),瞳孔定位误差为0个像素(p),视线计算精度<±1°;利用这些参数可求得,眼球跟踪相机在1°范围内拍摄的图像包含的像素数至少需要达到45.2p,即红外相机所采集到的图像的角分辨率(Pixels Per Degree,PPD)(角分辨率指视场角中的平均每1°夹角内填充的像素点的数量)至少为45.2,才能保证视线计算精度达到<±1°;即,如果眼球追踪水平范围为60°,那么对应的横向分辨率需要达到60*45.2=2712p;当前市面上分辨率如此之高、且帧率达到120fps(每秒帧数;眼球跟踪相机通常要求较高的帧率)的红外相机,制造难度通常较大,且成本较高。
本申请实施例提供了一种眼球跟踪装置、方法以及显示设备,可以解决该问题。
图1是本申请实施例提供的一种眼球跟踪装置的结构示意图,该眼球跟踪装置10可以部分或者全部结合于显示设备中。该眼球跟踪装置10包括:支架11、控制组件12、补光组件13以及至少两个眼球跟踪相机14。
至少两个眼球跟踪相机14位于支架11的至少两个不同的位置,且至少两个眼球跟踪相机14的拍摄范围(f1和f2)至少具有部分拍摄范围不重叠。
补光组件13位于支架11上。
控制组件12与至少两个眼球跟踪相机14电连接,用于确定至少两个眼球跟踪相机14的拍摄范围内的眼球的注视位置。
图1示出了眼球跟踪装置包括两个眼球跟踪相机的结构,但眼球跟踪相机的数量还可以为更多,如3、4、5、6、7、8等,本申请实施例对此不进行限制。
由图1可以看出,两个眼球跟踪相机14的拍摄范围f1和f2存在部分重叠区域,还存在部分不重叠的区域,两个眼球跟踪相机14共同的构成了一个较大的拍摄范围。
综上所述,本申请实施例提供的眼球跟踪装置,通过设置至少两个眼球跟踪相机,且这至少两个眼球跟踪相机的拍摄范围至少具有部分拍摄范围不重叠,进而这至少两个眼球跟踪相机可以构成一个较大的跟踪范围,控制组件可以确定位于这一较大的跟踪范围内的眼球的注视位置,解决了相关技术中眼球跟踪装置的跟踪范围较小的问题,实现了增大眼球跟踪装置的跟踪范围的效果。
另外,该眼球跟踪装置可以通过多个拍摄范围较小的眼球跟踪相机,来实现一个较大的眼球的跟踪范围,而无需使用拍摄范围较大的眼球跟踪相机,如此便降低了眼球跟踪装置的制造难度以及成本。
图2是本申请实施例提供的另一种眼球跟踪装置的结构示意图,该眼球跟踪装置在图1所示的眼球跟踪装置的基础上进行了一些调整。
其中,可选地,至少两个眼球跟踪相机14中,两个相邻的眼球跟踪相机14的拍摄范围具有重叠区域q1。如此可以使这至少两个眼球跟踪相机14的拍摄范围所构成的合并拍摄范围是一个连续的范围,避免了眼球位于两个相邻的眼球跟踪相机14的拍摄范围之间时,无法进行眼球跟踪的问题。
其中,重叠区域q1的尺寸可以大于预设眼球的尺寸,具体的,重叠区域q1的最小位置处尺寸可以大于或等于预设眼球的最大位置处尺寸。也即是相邻的眼球跟踪相机14的拍摄范围的重叠区域q1,至少可以容纳一个眼球,如此可以保证相邻的眼球跟踪相机14中,任意一个眼球跟踪相机14都可以采集得到完整的眼球的图像,便于对眼球的图像进行分析,避免出现眼球跟踪相机14仅能够采集得到部分眼球的图像,难以对眼球的图像进行分析的问题。其中,预设眼球的尺寸可以参考眼球的常规尺寸,例如可以大于或等于眼球常规的最大尺寸。
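上述"相邻相机重叠区域至少容纳一个眼球"的约束,可以用如下 Python 数值草图验证(示意代码,非本申请的一部分;其中视场角 28°、间距 166mm、单眼宽度 30mm 均为假设的示例取值):

```python
import math

def overlap_width(d, alpha_deg, spacing):
    """相邻两相机在距离 d 处拍摄范围的水平重叠宽度:2*d*tan(α/2) - 间距。"""
    return 2 * d * math.tan(math.radians(alpha_deg) / 2) - spacing

EYE_WIDTH = 30.0                    # 标准人脸单眼宽度(mm),示例值
w = overlap_width(400, 28, 166)     # 最近使用距离 400mm 处的重叠宽度
print(round(w, 1))                  # 约 33.5mm,大于单眼宽度
```

重叠宽度随观察距离增大而增大,因此只需在最近使用距离处校验该约束即可。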
图2示出的是眼球跟踪相机的数量为3的结构,但本申请实施例对此不进行限制。
图3是图2所示的眼球跟踪装置的仰视图(图3中未示出控制组件)。其中,支架11包括用于设置待观察面m1的安装部111,控制组件用于确定至少两个眼球跟踪相机的拍摄范围内的眼球在待观察面m1上的注视位置。在一种示例性的实施例中,该待观察面m1可以为显示面板的显示面,进而安装部111可以为用于安装显示面板的安装部。
在一种示例性的实施例中,控制组件用于确定至少两个眼球跟踪相机14的拍摄范围内,与待观察面m1之间的距离满足预设距离范围的眼球在待观察面m1上的注视位置。如图3所示,该预设距离范围可以为与待观察面m1的垂直距离大于D1且小于D2的范围。需要说明的是,由于此种结构下,眼球跟踪相机相对于待观察面m1倾斜设置,因而该预设距离范围可以是基于倾斜角度所转换得到的预设距离范围,转换后的预设距离范围是指垂直于待观察面m1的方向上的距离的范围。
至少两个眼球跟踪相机14的光轴相互平行且共面,至少两个眼球跟踪相机14的光轴确定的第一平面m2与经过待观察面的中心的垂线c1相交,且交点j位于预设距离范围内。也即是该交点j与待观察面m1的垂直距离大于D1且小于D2。
在一种示例性的实施例中,至少两个眼球跟踪相机14位于安装部111的下方,进而该眼球跟踪相机14位于待观察面m1的下方。这里的下方可以是指待观察面m1朝向地面的一侧。由于眼球具有睫毛以及眼皮等结构,因而当眼球跟踪相机14位于较高的位置时,可能会被睫毛以及眼皮等结构阻挡,进而难以获取较为完整清晰的眼球的图像,而本申请实施例提供的眼球跟踪装置中,眼球跟踪相机14位于安装部111的下方,处于一个较低的位置,当眼球观察待观察面m1时,眼球跟踪相机14被睫毛以及眼皮等结构阻挡的概率较小,获取较为完整清晰的眼球的图像的概率较大,便于确定眼球的注视位置。其中,交点j位于预设距离范围内的中心。也即是,交点j与待观察面m1的垂直距离为:
(D1+D2)/2。眼球位于该位置处的概率较大,交点j位于该位置时,可以进一步提升获取的眼球图像的清晰度(眼球越靠近眼球跟踪相机的光轴,则获取的眼球的图像就会越清晰)。
在一种示例性的实施例中,至少两个眼球跟踪相机14在支架11上沿水平方向排布。也即是这至少两个眼球跟踪相机14沿平行于水平面的方向在支架11上排布。由于眼球在观察待观察面(如显示器的显示面)时,通常仅在平行于水平面的方向上移动,即便在垂直于水平面的方向上移动,移动距离也会较小,难以脱离眼球跟踪相机14在竖直方向上的拍摄范围,因而在本申请实施例中,将至少两个眼球跟踪相机14在支架11上沿水平方向排布,可以最大化利用每个眼球跟踪相机14,进一步增大这至少两个眼球跟踪相机14的拍摄范围所构成的跟踪范围。
图4是图3所示的眼球跟踪装置的左视图。其中,眼球跟踪装置还包括:图像采集组件15;图像采集组件15位于支架11上,且与控制组件12电连接,至少两个眼球跟踪相机14的拍摄范围位于图像采集组件15的拍摄范围内。也即是该图像采集组件15的拍摄范围大于至少两个眼球跟踪相机14的拍摄范围,且包括了该至少两个眼球跟踪相机14的拍摄范围。在眼球跟踪装置的预设距离范围确定后,该图像采集组件15的视场角大于任意一个眼球跟踪相机14的视场角,具体的,请参考图2,图像采集组件15的视场角v大于眼球跟踪相机14的视场角a,且在预设距离范围的最近距离处(与待观察面m1的垂直距离为D1的位置处),以及最远距离处(与待观察面m1的垂直距离为D2的位置处),该图像采集组件15所能拍摄到的区域均大于该至少两个眼球跟踪相机14所能拍摄到的区域。
该图像采集组件15并非是眼球跟踪相机,具有较大的视场角时,制造难度以及成本也不会太高。如此便可以在眼球跟踪装置中设置一个视场角较大的图像采集组件15,以与至少两个眼球跟踪相机14配合,来确定眼球的注视位置。示例性的,由于眼球跟踪相机的视场角较小,所采集的图像中,可能难以包括整个人脸的图像,进而基于人脸的图像来确定眼球的位置也会较为困难。而图像采集组件15的视场角较大,采集得到的图像包括整个人脸的图像的概率较大,进而降低了基于人脸的图像来确定眼球的位置的难度,提升了确定的眼球位置的精确程度,有利于眼球跟踪装置进行眼球注视位置的确定。
在一种示例性的实施例中,至少两个眼球跟踪相机14的拍摄范围的边缘与图像采集组件15的拍摄范围的边缘之间具有间隔区域g1,间隔区域g1的尺寸大于预设人脸的尺寸。如此便可以确保在眼球跟踪相机14的拍摄范围的边缘处,图像采集组件15也能够采集得到完整人脸的图像。该预设人脸的尺寸可以参考预设的标准人脸模型,例如在水平方向上的尺寸可以为135毫米或更大。
其中,间隔区域g1尺寸可以在眼球跟踪装置的预设距离范围的最小距离处(也即是图2所示的s1处)大于预设人脸的尺寸。这是由于眼球跟踪相机14以及图像采集组件15的拍摄范围在水平方向上均为扇形,且图像采集组件15的视场角大于眼球跟踪相机14的视场角,进而该间隔区域g1(即图2中的阴影区域)的尺寸(在平行于待观察面m1方向上的尺寸)会沿远离待观察面m1的方向逐渐增大,如图2中该间隔区域g1在眼球跟踪装置的预设距离范围的最小距离处s1的尺寸x1,小于在最大距离处s2的尺寸x2。基于此,当间隔区域g1尺寸在眼球跟踪装置的预设距离范围的最小距离处大于预设人脸的尺寸时,则在预设距离范围的各个位置,图像采集组件15的拍摄范围的边缘与眼球跟踪相机14的拍摄范围的边缘之间的间隔区域g1的尺寸均会大于预设人脸的尺寸。
在一种示例性的实施例中,请参考图4,图像采集组件15位于支架11上,且位于安装部111的下方,图像采集组件15与控制组件12电连接。这里的下方可以是指待观察面m1朝向地面的一侧。由于眼球具有睫毛以及眼皮等结构,因而当图像采集组件15位于较高的位置时,可能会被睫毛以及眼皮等结构阻挡,进而难以获取较为完整清晰的眼球的图像,而本申请实施例提供的眼球跟踪装置中,图像采集组件15位于安装部111的下方,处于一个较低的位置,当眼球观察待观察面m1时,图像采集组件15被睫毛以及眼皮等结构阻挡的概率较小,获取较为完整清晰的眼球的图像的概率较大,便于确定眼球的注视位置。
在一种示例性的实施例中,图像采集组件15包括彩色相机,眼球跟踪相机14包括红外相机,彩色相机的分辨率小于红外相机的分辨率。由于彩色相机用于确定眼球的位置,但可以不进行瞳孔以及光斑的位置的确定,因而对于分辨率的要求较低,进而图像采集组件15中便可以设置分辨率较低的彩色相机来实现功能,另外,彩色相机为常规的相机,制造难度较低,在分辨率也较低时,其成本也会较低,如此便可以降低眼球跟踪装置的成本。示例性的,彩色相机的分辨率为640*480,帧率为120fps,水平视场角为90°。
请参考图4,补光组件13包括至少两个补光灯131,至少两个补光灯131位于支架11上的不同位置。补光灯发出的光线可以在眼球上形成光斑,由于补光灯的位置以及眼球跟踪相机的位置是已知的,进而一种方法是根据眼球跟踪相机采集的图像中,眼球的瞳孔的位置以及补光灯发出的光线在眼球中的光斑的位置,以及补光灯和眼球跟踪相机的位置来确定眼球的注视位置。但是,由于眼球所处位置以及角度等原因,可能导致眼球跟踪相机所采集得到的图像并不包括完整的眼球的图像,进而可能会存在光斑丢失的情况,此时就无法确定眼球的注视位置。
而本申请实施例中,设置了多个补光灯,且这多个补光灯位于不同的位置,进而这多个补光灯发出的光线在眼球上产生的多个光斑也会位于不同的位置,而多个光斑同时丢失的概率较小,进而便提升了能够成功确定眼球的注视位置的概率。
需要说明的是,当眼球跟踪相机包括红外相机时,该补光灯可以为红外补光灯。
可选地,该至少两个补光灯131在支架11上沿水平方向排布。由上述内容可知,眼球通常相对于待观察面m1,在水平方向上移动,继而当该至少两个补光灯131在支架11上沿水平方向排布时,可以使补光灯131覆盖更大的眼球活动范围,进而提升该眼球跟踪装置的跟踪范围。
综上所述,本申请实施例提供的眼球跟踪装置,通过设置至少两个眼球跟踪相机,且这至少两个眼球跟踪相机的拍摄范围至少具有部分拍摄范围不重叠,进而这至少两个眼球跟踪相机可以构成一个较大的跟踪范围,控制组件可以确定位于这一较大的跟踪范围内的眼球的注视位置,解决了相关技术中眼球跟踪装置的跟踪范围较小的问题,实现了增大眼球跟踪装置的跟踪范围的效果。
另外,该眼球跟踪装置可以通过多个拍摄范围较小的眼球跟踪相机,来实现一个较大的眼球的跟踪范围,而无需使用拍摄范围较大的眼球跟踪相机,如此便降低了眼球跟踪装置的制造难度以及成本。
本申请实施例示出的是眼球跟踪装置与显示面板均位于支架上构成了一个整体的装置,但眼球跟踪装置与显示面板也可以为分离的结构,本申请实施例对此不进行限制。
在一个具体的例子中,本申请实施例提供的眼球跟踪装置包含3个红外相机,2个红外LED灯,1个彩色相机,和控制组件;其中彩色相机分辨率较低,用于采集人脸图像,参数如下:分辨率640*480,帧率120fps,水平视场角90°;另外3个红外相机分辨率较高,用于采集瞳孔图像,3个红外相机可以简称:IR1、IR2以及IR3,参数如下:分辨率1280*720,帧率120fps,水平视场角30°;相邻红外相机之间设置一个波长为850纳米的红外LED灯,用于红外补光;整体硬件结构布局可以参考图4。
眼球跟踪相机的理论参数的确定过程可以包括:
请参考图5,图5是本申请实施例中一种眼球观察显示面的示意图,其中,屏幕的显示面m1的尺寸为P(该尺寸为屏幕的显示面的对角线的长度,本示例以长宽比为16:9的屏幕进行说明),眼球的水平观察范围为H°,纵向观察范围为V°,眼球与屏幕的显示面m1的预设距离为(D1,D2),眼球半径为r;
则在距离屏幕d处,屏幕在眼球中所成的张角为θ_H*θ_V,如图5所示;
则,屏幕宽*高为:
$$\frac{16P}{\sqrt{16^2+9^2}}\times\frac{9P}{\sqrt{16^2+9^2}}=\frac{16P}{\sqrt{337}}\times\frac{9P}{\sqrt{337}}$$
屏幕在眼球中所成张角:
$$\theta_H=2\arctan\!\left(\frac{16P}{2d\sqrt{337}}\right),\qquad \theta_V=2\arctan\!\left(\frac{9P}{2d\sqrt{337}}\right)$$
下面以横向分辨率计算为例,保持头部不动,眼球从注视屏幕左侧移动至注视屏幕右侧,瞳孔横向运动弧长为C:
$$C=\frac{\pi r\,\theta_H}{180}$$
瞳孔定位误差为n个像素,需求的视线计算精度为e°,则上述瞳孔在从注视屏幕左侧移动至注视屏幕右侧后,瞳孔转动的度数为θ_H,当瞳孔的注视位置在水平方向上移动时,其移动的像素数为m:
$$m=\frac{(n+1)\,\theta_H}{e}$$
瞳孔运动弧长C在相机中所成张角为α,公式如下:
$$\alpha=2\arctan\!\left(\frac{C}{2d}\right)$$
设相机横向分辨率为M,根据相机拍摄到的眼球运动像素数m、眼球运动在相机中所成张角α、相机本身的横向视场角H之间的等比关系可得:
$$M=\frac{m\,H}{\alpha}$$
本示例性实施例中,以32寸屏幕为例,观看距离设定为400-700mm,该观看距离可以与眼球跟踪装置进行眼球跟踪的预设距离相等,观看角度范围为60°,瞳孔定位误差为0pixel,视线计算精度<±1°;基于这些已知参数以及上述内容可知,眼球跟踪相机在1°范围内拍摄的图像包含的像素数至少需要达到45.2个,即,眼球跟踪相机的PPD至少为45.2,才能保证视线计算精度达到<±1°;即,如果眼球追踪横向使用范围为60°,那么对应的横向分辨率需要达到60*45.2=2712;当前市面上如此高分辨率且帧率达到120fps的红外相机较少,且成本较高。
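上述PPD与横向分辨率的换算关系非常直接,可用如下一行式的 Python 草图验证(示意代码,45.2与60°取自上文示例):

```python
ppd = 45.2                         # 每 1° 视场所需的像素数(Pixels Per Degree)
h_range_deg = 60                   # 眼球追踪所需的水平范围(°)
h_resolution = ppd * h_range_deg   # 单相机方案所需的横向分辨率
print(round(h_resolution))         # 2712
```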
而本申请实施例通过至少两个视场角较小的眼球跟踪相机以及一个彩色相机即可以实现该功能。
对于横向布局,如图2、图3和图4所示,图像采集组件15(包括彩色相机)以及三个眼球跟踪相机(红外相机)的光轴指向正前方,彩色相机视场角大,其拍摄范围覆盖所有红外相机的拍摄范围。
可以预先设置眼球跟踪装置进行眼球跟踪的预设距离为400-700mm,眼球追踪需求范围是±30°,即眼球跟踪装置的参数要求为:在最近使用距离400mm处,人头部运动达到±30°,也能够进行视线计算;同样在700mm处也要确保人头部运动达到±30°的情况下,能够进行视线计算。则在400mm处,要求相邻的眼球跟踪相机的拍摄范围存在重叠区域,且重叠区域的宽度大于单眼宽度,重叠区域的高度大于单眼的高度;对于标准人脸(本领域确定的一个标准人脸,其具有较为标准的参数,用于进行人脸相关的各种计算),单眼宽度约为30mm,单眼高度约为20mm;则可得相邻眼球跟踪相机横向交叠区域至少为30mm,纵向交叠区域至少为20mm。
设相邻眼球跟踪相机横向间距为L,在眼球跟踪装置的预设距离范围的最小距离s1处的交叠区域q1宽度为p(单眼宽度),眼球跟踪相机横向FOV为α,眼球跟踪装置的整体的视场角为θ,在眼球跟踪装置的预设距离范围的最小距离s1处,横向跟踪宽度≥W1,最大距离s2处,横向跟踪宽度≥W2,其中,跟踪宽度是指眼球跟踪相机可以在该距离处进行眼球跟踪的宽度(水平方向上的尺寸),则可得:
在s1处,要满足视场角为θ,横向追踪宽度W1至少为:
$$W_1=2D_1\tan\frac{\theta}{2}$$
在s2处,横向追踪宽度W2至少为:
$$W_2=2D_2\tan\frac{\theta}{2}$$
为减少对眼球跟踪相机的视场角的需求(降低相机FOV可减小相机分辨率需求),在s2处,进行如下设定:在水平方向上的左侧边界,使用右眼追踪,在水平方向上的右侧边界,使用左眼追踪,这样设定后,当头部移动到边界时,眼球跟踪相机的拍摄无需覆盖两只眼,只需要覆盖单眼即可。
三个眼球跟踪相机在s1处横向拍摄覆盖的宽度需要≥W1,在s2处横向拍摄覆盖的宽度需要≥W2,如图2所示;则:
$$\begin{cases}2D_1\tan\dfrac{\alpha}{2}-L\ \ge\ p\\[6pt]2D_2\tan\dfrac{\alpha}{2}+2L+2e\ \ge\ W_2\end{cases}$$
e为两个眼球之间的间距,示例性的,可以为65mm,由此可得:α≥28°,取α=28°,则L=166mm。
也即是相邻的眼球跟踪相机14在水平方向上的间距为166mm,如图2所示;在s1处,重叠区域q1在水平方向上的尺寸为30mm;若相机横纵比为16:9,纵向视场角为16°;由上述可知,眼球跟踪相机需求的PPD为45.2,则横向分辨率为1265,纵向分辨率为723。进而眼球跟踪相机的分辨率大于1265*723即可,相较于相关技术中眼球跟踪相机的横向分辨率需要达到2712,本申请实施例提供的眼球跟踪相机大大降低了对于横向分辨率的要求(对于纵向分辨率的要求,本申请实施例提供的眼球跟踪相机可以与相关技术中眼球跟踪相机的纵向分辨率相同)。示例性的,本申请实施例提供的眼球跟踪装置中的一个眼球跟踪相机的分辨率可以为1280*800。
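上文求解单相机视场角α与相邻间距L的过程,可以整理为如下 Python 数值草图(示意实现,非本申请的一部分;两条约束分别对应"最近距离处相邻重叠容纳单眼"与"最远距离处边界单眼覆盖",参数取上文示例值):

```python
import math

D1, D2 = 400.0, 700.0       # 预设距离范围(mm)
theta = math.radians(60)    # 整体视场角需求
p, e = 30.0, 65.0           # 单眼宽度、双眼间距(mm)

# 两条约束取等号时联立,可解出最小可行的单相机视场角 α:
#   2*D1*tan(α/2) - p = D2*(tan(θ/2) - tan(α/2)) - e
t = (D2 * math.tan(theta / 2) - e + p) / (2 * D1 + D2)
alpha_min = 2 * math.degrees(math.atan(t))     # 约 27.7°,向上取整为 28°

alpha = math.radians(28)
L_low = D2 * (math.tan(theta / 2) - math.tan(alpha / 2)) - e   # 约 164.6mm
L_high = 2 * D1 * math.tan(alpha / 2) - p                      # 约 169.5mm
print(math.ceil(alpha_min), round(L_low, 1), round(L_high, 1))
```

文中取L=166mm,恰好落在该可行区间内。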
3个眼球跟踪相机14和图像采集组件15的摆放角度和位置如图3所示。
其中,相机角度确定方式:至少两个眼球跟踪相机14的光轴相互平行且共面,至少两个眼球跟踪相机14的光轴确定的第一平面m2与经过待观察面的中心的垂线c1相交,且交点j位于预设距离D1-D2之间,例如在中间距离(D1+D2)/2处。若显示面m1的中心到底边的眼球跟踪相机之间的距离为k,则相机光轴的摆放角度b为:
$$b=\arctan\!\left(\frac{2k}{D_1+D_2}\right)$$
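上述光轴摆放角度的计算可以写成如下小函数(示意代码;其中k=220mm为假设的显示面中心到眼球跟踪相机在垂直方向上的距离,并非文中给定值):

```python
import math

def axis_angle_deg(k, d1, d2):
    """光轴相对水平方向的上仰角 b:tan(b) = k / ((D1 + D2) / 2)。"""
    return math.degrees(math.atan(2 * k / (d1 + d2)))

b = axis_angle_deg(220, 400, 700)
print(round(b, 1))   # 约 21.8°
```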
对于彩色相机,其拍摄范围需要覆盖所有红外相机的拍摄范围;依据标准人脸模型,人脸平均宽度为136mm,为确保在眼球跟踪相机拍摄区域的边界处,RGB相机也能采集到完整人脸,可以在眼球跟踪相机拍摄区域左右各增加一个人脸宽度的缓冲区域,基于上述公开内容可以求得三个眼球跟踪相机在600mm处的水平拍摄宽度为952mm,因此彩色相机的水平拍摄宽度为:
952+136*2=1224mm;
由此,彩色相机在600mm处的水平FOV至少为:
arctan(1224/2/600)*2≈90°;
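上述彩色相机水平视场角的估算过程可以用如下 Python 草图复核(示意代码,参数取上文示例值;精确计算结果约为91°,文中将其近似为90°):

```python
import math

ir_coverage = 952.0     # 3 个红外相机在 600mm 处的水平拍摄宽度(上文结论)
face_width = 136.0      # 标准人脸平均宽度(mm)
rgb_width = ir_coverage + 2 * face_width   # 左右各留一个人脸宽度的缓冲
rgb_fov = 2 * math.degrees(math.atan(rgb_width / 2 / 600))
print(rgb_width, round(rgb_fov))   # 1224.0 91
```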
在硬件方面,本申请实施例提供的眼球跟踪装置,可以仅使用3个常规的红外相机和1个常规的彩色相机,即可实现水平范围超过60°的眼球追踪,远超相关技术中眼球跟踪装置约32°的水平跟踪角度;并且3个常规的红外相机和1个常规的彩色相机的总成本较低,一般不超过千元。
需要说明的是,上述示例性的例子给出了一种眼球跟踪装置的具体参数,但本申请实施例对此并不进行限制,也即是眼球跟踪装置还可以具有其他的参数,例如眼球跟踪相机的数量,眼球跟踪相机的视场角以及眼球跟踪相机进行眼球跟踪的预设距离范围等参数还可以为其他值。
图6是本申请实施例提供的一种眼球跟踪方法的流程图,该眼球跟踪方法可以应用于上述实施例提供的眼球跟踪装置的控制组件中,该方法包括下面几个步骤:
步骤601、开启补光组件。
步骤602、获取至少两个眼球跟踪相机获取的第一图像。
步骤603、基于第一图像确定眼球的注视位置。
综上所述,本申请实施例提供的眼球跟踪方法,通过至少两个眼球跟踪相机获取第一图像,并基于该第一图像确定眼球的注视位置,由于这至少两个眼球跟踪相机的拍摄范围至少具有部分拍摄范围不重叠,进而这至少两个眼球跟踪相机可以构成一个较大的跟踪范围,如此控制组件便可以确定位于这一较大的跟踪范围内的眼球的注视位置,解决了相关技术中对于眼球的注视位置的跟踪范围较小的问题,实现了增大眼球跟踪装置的跟踪范围的效果。
图7是本申请实施例提供的一种眼球跟踪方法的流程图,该眼球跟踪方法可以应用于上述实施例提供的眼球跟踪装置的控制组件中,该方法包括下面几个步骤:
步骤701、开启补光组件。
补光组件发出的光线可以在眼球上形成光斑,由于补光组件的位置以及眼球跟踪相机的位置是已知的,进而一种方法是根据眼球跟踪相机采集的图像中,眼球的瞳孔的位置以及补光组件发出的光线在眼球中的光斑的位置,以及补光组件和眼球跟踪相机的位置来确定眼球的注视位置。
在一种示例性的实施例中,补光组件包括至少两个补光灯,至少两个补光灯位于支架上的不同位置。补光灯发出的光线可以在眼球上形成光斑,由于补光灯的位置以及眼球跟踪相机的位置是已知的,进而一种方法是根据眼球跟踪相机采集的图像中,眼球的瞳孔的位置以及补光灯发出的光线在眼球中的光斑的位置,以及补光灯和眼球跟踪相机的位置来确定眼球的注视位置。但是,由于眼球所处位置以及角度等一些原因,导致眼球跟踪相机所采集得到的图像并不包括完整的眼球的图像,进而可能会存在光斑丢失的情况发生,此时就无法确定眼球的注视位置。
而本申请实施例中,设置了多个补光灯,且这多个补光灯位于不同的位置,进而这多个补光灯发出的光线在眼球上产生的多个光斑也会位于不同的位置,而多个光斑同时丢失的概率较小,进而便提升了能够成功确定眼球的注视位置的概率。
步骤702、获取图像采集组件采集的第二图像。
该图像采集组件具有一个较大的拍摄范围,每个眼球跟踪相机的拍摄范围均位于该图像采集组件的拍摄范围内。
步骤703、基于第二图像获取人脸信息。
由于该图像采集组件的拍摄范围较大,进而该图像采集组件采集得到的第二图像包括完整的人脸的概率较大,基于第二图像获取人脸信息也会更为精确。该人脸信息可以包括人脸在第二图像中的区域的位置。
示例性的,可以采用人脸检测算法来得到第二图像中的人脸信息。
步骤704、基于人脸信息确定眼球在第二图像中的位置。
确定了人脸信息后,即可以进一步基于人脸信息确定眼球在第二图像中的位置,该位置可以以眼球区域坐标的形式标识。此外,还可以基于人脸信息确定人脸位姿坐标。
步骤705、基于眼球在第二图像中的位置,确定眼球在第一图像中的位置。
由于眼球在第二图像中的位置等信息是在图像采集组件的坐标系下确定的参数,而第一图像为眼球跟踪相机获取的图像,进而要先进行坐标系的转换,该转换过程可以包括:
$$z\begin{bmatrix}u_A\\v_A\\1\end{bmatrix}=K_A\begin{bmatrix}x\\y\\z\end{bmatrix},\qquad z\begin{bmatrix}u_B\\v_B\\1\end{bmatrix}=K_B\begin{bmatrix}x-s\\y-t\\z\end{bmatrix}$$
上式中,相机A(相机A可以指图像采集组件)和相机B(相机B可以指眼球跟踪相机)的内部参数矩阵如下:
$$K_A=\begin{bmatrix}f_{xA}&0&U_A\\0&f_{yA}&V_A\\0&0&1\end{bmatrix},\qquad K_B=\begin{bmatrix}f_{xB}&0&U_B\\0&f_{yB}&V_B\\0&0&1\end{bmatrix}$$
z表示眼球到待观察面的距离,s表示相机A和相机B在水平方向的间距,t表示相机A和相机B在竖直方向上的间距;(u_A, v_A)表示相机A中的眼球区域坐标,(u_B, v_B)表示相机B中的眼球区域坐标;f_{xA}为与相机A水平焦距相关的参数,可以认为是以一个像素为单位时相机A的水平焦距;类似的,f_{xB}可以认为是以一个像素为单位时相机B的水平焦距,f_{yA}可以认为是以一个像素为单位时相机A的垂直焦距,f_{yB}可以认为是以一个像素为单位时相机B的垂直焦距;U_A为和相机A的水平分辨率相关的参数,可以约为相机A的水平分辨率的一半,V_A为和相机A的垂直分辨率相关的参数,可以约为相机A的垂直分辨率的一半,U_B为和相机B的水平分辨率相关的参数,可以约为相机B的水平分辨率的一半,V_B为和相机B的垂直分辨率相关的参数,可以约为相机B的垂直分辨率的一半。
如此便可以确定眼球在第一图像中的位置。
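上述"反投影、平移、再投影"的坐标转换可以封装为如下 Python 小函数(示意实现;内参数值与平移方向均为假设,且假设两相机光轴平行、仅存在水平/竖直平移s、t):

```python
def map_pixel(uv, z, K_src, K_dst, s, t):
    """将源相机像素坐标 (u, v) 在深度 z 处反投影为 3D 点,
    平移 (s, t) 后再投影到目标相机的像素坐标。
    K = (fx, fy, U, V),即水平/垂直焦距与主点坐标,均以像素为单位。"""
    u, v = uv
    fx_s, fy_s, U_s, V_s = K_src
    fx_d, fy_d, U_d, V_d = K_dst
    x = (u - U_s) * z / fx_s           # 反投影到源相机坐标系
    y = (v - V_s) * z / fy_s
    u_d = fx_d * (x - s) / z + U_d     # 平移后投影到目标相机
    v_d = fy_d * (y - t) / z + V_d
    return u_d, v_d

K_rgb = (500.0, 500.0, 320.0, 240.0)     # 彩色相机内参(假设值)
K_ir = (1000.0, 1000.0, 640.0, 400.0)    # 红外相机内参(假设值)
print(map_pixel((320, 240), 500.0, K_rgb, K_ir, s=100.0, t=50.0))   # (440.0, 300.0)
```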
步骤706、基于眼球在第一图像中的位置以及第一图像,获取眼球的瞳孔的位置,以及补光组件发出的光线在眼球上的光斑的位置。
该步骤可以包括:
1)瞳孔检测。
示例性的,可以采用基于累积灰度直方图的自适应阈值检测方法来进行瞳孔检测,以获取瞳孔的位置。该方法包括:
1.1获取当前眼球区域的灰度直方图;
1.2将所得的灰度直方图按照从小到大进行累加,由于瞳孔区域灰度值较小,因此在累加后的灰度直方图中,瞳孔区域的灰度值和非瞳孔区域灰度值之间会出现一个拐点,这个拐点记为当前瞳孔区域的阈值;
1.3得到瞳孔区域的阈值后,对眼球区域进行二值化处理,以及椭圆拟合,即可以得到瞳孔中心坐标。该瞳孔中心坐标即可以作为瞳孔的坐标。
当然,也可以通过其他方式来得到瞳孔中心坐标,本申请实施例对此不进行限制。
2)进行光斑检测。
2.1在上述瞳孔坐标周围m*m的像素区域(该像素区域包括瞳孔坐标,且以瞳孔坐标为中心,m的取值可以依据瞳孔的直径来确定,例如瞳孔直径范围为20~50个像素时,则m可以为80)内进行光斑检测,对于眼球跟踪装置,光斑一般位于瞳孔附近,因此在瞳孔周围一定范围内进行光斑检测,可以提高光斑检测精度和速度。
2.2设定光斑检测阈值为200,该值为经验值,依据该阈值二值化得到光斑图像,并椭圆拟合得到光斑中心坐标,该光斑中心坐标可以作为光斑的位置。
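上述"累积灰度直方图找拐点、二值化、求中心"的流程可以用 NumPy 写成如下简化草图(示意实现,非本申请的具体算法:以合成图像代替真实眼部图像,以相邻有像素灰度级之间的最大间隔近似"拐点",并以质心代替椭圆拟合):

```python
import numpy as np

def pupil_center(gray):
    """简化的瞳孔定位:自适应阈值 + 二值化 + 质心(假设图像中至少有明暗两类灰度级)。"""
    hist = np.bincount(gray.ravel(), minlength=256)
    levels = np.flatnonzero(hist)               # 图像中出现过的灰度级
    i = int(np.argmax(np.diff(levels)))         # 暗的瞳孔与亮背景之间的最大间隔
    threshold = (levels[i] + levels[i + 1]) / 2
    ys, xs = np.nonzero(gray < threshold)       # 瞳孔区域(灰度低于阈值)
    return float(xs.mean()), float(ys.mean())   # 以质心近似瞳孔中心

# 合成测试图:灰度 200 的亮背景上,以 (50, 50) 为心、半径 10 的暗圆代表瞳孔
img = np.full((100, 100), 200, dtype=np.uint8)
yy, xx = np.ogrid[:100, :100]
img[(xx - 50) ** 2 + (yy - 50) ** 2 <= 100] = 10
print(pupil_center(img))   # (50.0, 50.0)
```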
由于眼球的数量通常为2,进而第一图像中包括两个眼球。若补光组件包括多个补光灯,每个补光灯也会在眼球上形成一个光斑,在此基础上,步骤706包括:
基于两个眼球在第一图像中的位置以及第一图像,获取两个眼球的瞳孔的位置,以及至少两个补光灯发出的光线在两个眼球上的光斑的位置。
步骤707、获取至少两个眼球跟踪相机的视线模型。
在当前时刻之前,若已经确定了至少两个眼球跟踪相机的视线模型,则可以直接获取之前确定的视线模型;若当前时刻之前未确定至少两个眼球跟踪相机的视线模型,则可以在本步骤确定至少两个眼球跟踪相机的视线模型。
其中,眼球跟踪相机的视线模型用于根据眼球跟踪相机采集得到的第一图像中,眼球的瞳孔的位置以及光斑的位置,确定眼球的注视位置。
由于眼球的数量通常为2,进而第一图像中包括两个眼球。若补光组件包括多个补光灯,则对于每一个眼球跟踪相机,每一个眼球的瞳孔,与每一个补光灯产生的光斑均可以对应一个视线模型。例如,补光灯的数量为n,则对于每一个眼球跟踪相机,包括2n个视线模型。
进而步骤707可以包括:
基于每个样本数据,确定至少一个样本数据对应的视线模型。
其中,样本数据包括两个眼球中一个眼球的瞳孔的位置,以及一个眼球上的一个光斑的位置,至少一个样本数据对应的视线模型为第一眼球跟踪相机的视线模型。
其中,基于每个样本数据,确定至少一个样本数据对应的视线模型具体可以包括:
1)确定第一眼球跟踪相机的视线模型。
在确定第一眼球跟踪相机的视线模型时,显示面板的显示面可以显示预设的内容,示例性的,显示面板可以依次出现9个标定点,当第一个标定点出现时,眼球注视该点一定时间(如2秒);第一个标定点消失,出现第二个标定点,执行与第一个标定点相同的操作,直到9个标定点全部显示完成;对于每个眼球跟踪相机,两个补光灯中的一个补光灯的第一光斑与瞳孔的相对坐标生成9个标定数据,同时另一个补光灯的第二光斑与瞳孔的相对坐标也会生成9个标定数据,即:每个眼球跟踪相机会生成四组标定数据(左眼第一光斑-瞳孔,左眼第二光斑-瞳孔,右眼第一光斑-瞳孔,右眼第二光斑-瞳孔)。
通过第一眼球跟踪相机得到的四组标定数据为第一眼球跟踪相机的坐标系中的坐标,可以将这些坐标转换为世界坐标系中的坐标,示例性的,一种转换公式可以为:
$$z\begin{bmatrix}u_B\\v_B\\1\end{bmatrix}=K_B\left(R_B\begin{bmatrix}x_w\\y_w\\z_w\end{bmatrix}+T_B\right)$$
其中,u_B、v_B表示第一眼球跟踪相机的坐标系中的光斑坐标,R_B表示第一眼球跟踪相机的光轴相对于世界坐标轴的旋转矩阵,T_B表示第一眼球跟踪相机的坐标原点相对世界坐标系原点的坐标差(世界坐标系原点可以位于显示面的中心位置,正上方为+Y轴,正右方为+X轴,正前方为+Z轴);x_w、y_w、z_w表示世界坐标系下的光斑坐标;
$$K_B=\begin{bmatrix}f_{xB}&0&U_B\\0&f_{yB}&V_B\\0&0&1\end{bmatrix}$$
为第一眼球跟踪相机的内参矩阵,其他眼球跟踪相机进行世界坐标系的转换时,可以将该部分内参矩阵替换为对应的眼球跟踪相机的内参矩阵。瞳孔的坐标转换方式可以与该方式类似,本申请实施例在此不再赘述。
得到了世界坐标系下的四组标定数据后,可以先通过这四组标定数据来确定第一眼球跟踪相机的视线模型。
在一种示例性的实施例中,视线模型采用多项式回归模型,如下所示
$$\begin{cases}X_G=a_0+a_1x+a_2y+a_3xy+a_4x^2+a_5y^2\\Y_G=b_0+b_1x+b_2y+b_3xy+b_4x^2+b_5y^2\end{cases}$$
其中,X_G和Y_G为眼球注视位置在世界坐标系中的坐标,a_0、a_1、a_2、a_3、a_4、a_5,b_0、b_1、b_2、b_3、b_4、b_5为未知的参数,可通过上述四组标定数据求得,x和y为光斑与瞳孔的相对坐标。
将眼球跟踪相机对应的四组标定数据代入视线模型中,可以得到第一眼球跟踪相机的四个视线模型。
2)基于第一眼球跟踪相机的视线模型与其他眼球跟踪相机的相对位置,确定其他眼球跟踪相机的视线模型。
可以通过坐标转换来将第一眼球跟踪相机的视线模型转换为其他眼球跟踪相机的视线模型。示例性的,首先可以确定第一眼球跟踪相机的坐标系中的坐标与第二眼球跟踪相机(第二眼球跟踪相机可以为至少两个眼球跟踪相机中,除第一眼球跟踪相机外的任一眼球跟踪相机)的坐标系中的坐标的转换关系,一种坐标转换公式如下:
$$z\begin{bmatrix}u_{B2}\\v_{B2}\\1\end{bmatrix}=K_{B2}\begin{bmatrix}x\\y\\z\end{bmatrix},\qquad z\begin{bmatrix}u_{Bn}\\v_{Bn}\\1\end{bmatrix}=K_{Bn}\begin{bmatrix}x-s\\y-t\\z\end{bmatrix}$$
其中,u_{Bn}、v_{Bn}表示第二眼球跟踪相机的坐标系中的光斑的坐标,u_{B2}、v_{B2}表示第一眼球跟踪相机中的图像坐标;与上述实施例类似,f_{xBn}可以表示第二眼球跟踪相机的水平焦距,f_{yBn}可以表示第二眼球跟踪相机的垂直焦距,f_{yB2}可以表示第一眼球跟踪相机的垂直焦距,f_{xB2}可以表示第一眼球跟踪相机的水平焦距;U_{B2}为和第一眼球跟踪相机的水平分辨率相关的参数,可以约为第一眼球跟踪相机的水平分辨率的一半,V_{B2}为和第一眼球跟踪相机的垂直分辨率相关的参数,可以约为第一眼球跟踪相机的垂直分辨率的一半;s表示第一眼球跟踪相机和第二眼球跟踪相机在水平方向的间距,t表示第一眼球跟踪相机和第二眼球跟踪相机在竖直方向上的间距;第一眼球跟踪相机B2和第二眼球跟踪相机Bn的内参矩阵可以如下:
$$K_{B2}=\begin{bmatrix}f_{xB2}&0&U_{B2}\\0&f_{yB2}&V_{B2}\\0&0&1\end{bmatrix},\qquad K_{Bn}=\begin{bmatrix}f_{xBn}&0&U_{Bn}\\0&f_{yBn}&V_{Bn}\\0&0&1\end{bmatrix}$$
通过该内参矩阵以及上述转换世界坐标系的方法,即可以确定第二眼球跟踪相机的坐标与世界坐标系的坐标的转换关系。之后便可以基于这些转换关系,通过上述的四组标定数据来确定每个眼球跟踪相机的视线模型。
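上述"以标定数据拟合多项式视线模型"的求解可以用最小二乘写成如下 Python 草图(示意实现,非本申请的具体算法:标定数据为按已知系数合成的数据,系数为虚构值):

```python
import numpy as np

def design(x, y):
    """构造多项式回归的设计矩阵:[1, x, y, xy, x², y²]。"""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2], axis=1)

def fit_gaze_model(x, y, gx, gy):
    """由标定数据(光斑-瞳孔相对坐标 x, y 与注视点 gx, gy)拟合 a、b 系数。"""
    A = design(x, y)
    a, *_ = np.linalg.lstsq(A, np.asarray(gx, float), rcond=None)
    b, *_ = np.linalg.lstsq(A, np.asarray(gy, float), rcond=None)
    return a, b

# 3x3 共 9 个标定点(与文中 9 点标定对应),数据由已知系数合成
a_true = np.array([1.0, 2.0, 3.0, 0.5, 0.1, 0.2])
b_true = np.array([-1.0, 0.5, 2.0, 0.0, 0.3, 0.1])
xs, ys = np.meshgrid([0.0, 1.0, 2.0], [0.0, 1.0, 2.0])
xs, ys = xs.ravel(), ys.ravel()
A = design(xs, ys)
a_hat, b_hat = fit_gaze_model(xs, ys, A @ a_true, A @ b_true)
print(np.allclose(a_hat, a_true), np.allclose(b_hat, b_true))   # True True
```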
步骤708、基于至少两个眼球跟踪相机的视线模型,以及眼球的瞳孔位置和光斑的位置,确定眼球的注视位置。
步骤708可以包括:
1)分别通过至少两个眼球跟踪相机的视线模型确定得到眼球的至少一个注视位置。
由于获取光斑可能失败,进而可能无法通过每一个视线模型均得到注视位置,因此本申请实施例中,可以分别通过至少两个眼球跟踪相机的视线模型确定得到眼球的至少一个注视位置。
2)基于至少一个注视位置,确定一个目标注视位置。
当得到的注视位置的数量为1时,则可以将这一个注视位置确定为目标注视位置;
当得到的注视位置的数量大于或等于2个时,则可以基于这多个注视位置的坐标的平均值,确定目标注视位置。
如此可以提升确定的目标注视位置的准确性。
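上述"多个注视位置取坐标平均"的融合策略可以写作如下 Python 小函数(示意代码):

```python
def fuse_gaze(points):
    """对各视线模型得到的注视位置取坐标平均,得到一个目标注视位置。"""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

print(fuse_gaze([(100.0, 200.0)]))                   # (100.0, 200.0):仅一个时直接采用
print(fuse_gaze([(100.0, 200.0), (110.0, 220.0)]))   # (105.0, 210.0):多个时取均值
```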
在一种示例性的实施例中,由于本申请实施例提供的眼球跟踪方法所应用的眼球跟踪装置中,眼球跟踪相机的数量为多个,进而在实现眼球跟踪方法时,可以开启其中的至少两个眼球跟踪相机。例如,当眼球位于某一个眼球跟踪相机的拍摄范围内时,则可以开启该眼球跟踪相机,以及该眼球跟踪相机相邻的眼球跟踪相机,其他眼球跟踪相机则不开启,如此可以在保证眼球跟踪的功能正常实现的情况下,节省眼球跟踪装置的功耗。
示例性的,眼球跟踪装置包括水平排布的5个眼球跟踪相机,则当眼球位于中央的眼球跟踪相机的拍摄范围内时,则可以开启该中央的眼球跟踪相机,以及相邻的两个眼球跟踪相机,最边缘的两个眼球跟踪相机则不开启,如此便节省了两个眼球跟踪相机的功耗,且不会影响眼球跟踪的功能正常实现。
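上述"仅开启眼球所在相机及其相邻相机"的策略可以写为如下小函数(示意代码;假设相机按水平排布以0至n-1编号):

```python
def active_cameras(current, n):
    """返回需开启的相机编号集合:眼球所在相机及其左右相邻相机。"""
    return {i for i in (current - 1, current, current + 1) if 0 <= i < n}

print(sorted(active_cameras(2, 5)))   # [1, 2, 3]:中央相机及相邻两个,最边缘两个不开启
print(sorted(active_cameras(0, 5)))   # [0, 1]:边缘相机只有一个相邻相机
```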
综上所述,本申请实施例提供的眼球跟踪方法,通过至少两个眼球跟踪相机获取第一图像,并基于该第一图像确定眼球的注视位置,由于这至少两个眼球跟踪相机的拍摄范围至少具有部分拍摄范围不重叠,进而这至少两个眼球跟踪相机可以构成一个较大的跟踪范围,如此控制组件便可以确定位于这一较大的跟踪范围内的眼球的注视位置,解决了相关技术中对于眼球的注视位置的跟踪范围较小的问题,实现了增大眼球跟踪装置的跟踪范围的效果。
根据本申请实施例的另一方面,如图8所示,本申请实施例还提供一种显示设备20,该显示设备20包括显示面板21、外壳22以及眼球跟踪装置23。
眼球跟踪装置23包括:控制组件231、补光组件232、图像采集组件233以及至少两个眼球跟踪相机234。
补光组件232、图像采集组件233以及至少两个眼球跟踪相机234位于外壳22上,且均朝向显示面板21的出光方向,至少两个眼球跟踪相机234位于外壳22的至少两个不同的位置。
控制组件231与至少两个眼球跟踪相机234电连接。
其中,上述实施例中提供的眼球跟踪装置中的支架可以结合于该显示设备的外壳上,例如可以与该外壳为一体件,或者,可以安装于该外壳上。
可选地,至少两个眼球跟踪相机234在外壳22上位于显示面板21的下方,且沿水平方向f1排布。
可选地,显示面板21的显示面m1呈矩形,且显示面m1的一条边与水平方向f1平行;
至少两个眼球跟踪相机234中,任意两个相邻的眼球跟踪相机234之间的距离相等。
可选地,眼球跟踪相机234与显示面板21的显示面m1共面,眼球跟踪相机234的数量为3,3个眼球跟踪相机234中两个相邻的眼球跟踪相机234之间的间距满足:
$$D_2\left(\tan\frac{\theta}{2}-\tan\frac{\alpha}{2}\right)-e\ \le\ L\ \le\ 2D_1\tan\frac{\alpha}{2}-p$$
其中,L用于确定间距,p由眼球宽度决定,e由两个眼球之间的间距决定,α为眼球跟踪相机234的水平视场角,D1和D2分别为预设的眼球跟踪相机234进行眼球跟踪的目标区域在垂直于显示面m1的方向上,与显示面m1之间的最小距离以及最大距离;
$$L\ \ge\ D_1\left(\tan\frac{\theta}{2}-\tan\frac{\alpha}{2}\right)$$
θ为预设的3个眼球跟踪相机的整体视场角。
可选地,3个眼球跟踪相机234的光轴相互平行且共面,3个眼球跟踪相机的光轴确定的第一平面与经过显示面m1的中心的垂线相交,且交点位于目标区域内。
可选地,交点位于目标区域在垂直于显示面的方向上的中心,眼球跟踪相机满足:
$$\tan b=\frac{2k}{D_1+D_2}$$
其中,b用于确定眼球跟踪相机234的光轴与显示面m1之间的夹角,k为显示面m1的中心与眼球跟踪相机234在垂直方向上之间的距离。
可选地,图像采集组件233在外壳22上位于显示面板21的下方,且与显示面m1的下边缘的两端的距离相等。也即是该图像采集组件233可以位于显示面m1的下边缘的中间位置处,如此可以便于图像采集组件233的拍摄范围覆盖眼球跟踪相机234的拍摄范围。
可选地,外壳22包括底座221以及位于底座221上的框体222;补光组件232、图像采集组件233以及至少两个眼球跟踪相机234均安装于框体222上。
控制组件231可以包括显示设备中的处理器,例如该显示设备为台式显示设备时,则该控制组件231可以位于台式显示设备的主机中。
该显示面板可以为液晶显示面板或者有机发光二极管显示面板等各种显示面板。
该显示设备可以为置于桌上的显示器,也可以为立式显示器等,可以在显示图像时,确定眼球的注视位置。
另外,关于该显示设备中眼球跟踪装置的内容,可以参考上述实施例,本申请实施例在此不再赘述。
综上所述,本申请实施例提供的显示设备,该显示设备通过设置至少两个眼球跟踪相机,且这至少两个眼球跟踪相机的拍摄范围至少具有部分拍摄范围不重叠,进而这至少两个眼球跟踪相机可以构成一个较大的跟踪范围,控制组件可以确认位于这较大的跟踪范围内进行眼球的注视位置,解决了相关技术中眼球跟踪装置的跟踪范围较小的问题,实现了增大眼球跟踪装置的跟踪范围的效果。
本领域普通技术人员可以理解实现上述实施例的全部或部分步骤可以通过硬件来完成,也可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,上述提到的存储介质可以是只读存储器,磁盘或光盘等。
以上所述仅为本申请的可选实施例,并不用以限制本申请,凡在本申请的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (27)

  1. 一种眼球跟踪装置,其特征在于,所述眼球跟踪装置包括:支架、控制组件、补光组件以及至少两个眼球跟踪相机;
    所述至少两个眼球跟踪相机位于所述支架的至少两个不同的位置,且所述至少两个眼球跟踪相机的拍摄范围至少具有部分拍摄范围不重叠;
    所述补光组件位于所述支架上;
    所述控制组件与所述至少两个眼球跟踪相机电连接,用于确定所述至少两个眼球跟踪相机的拍摄范围内的眼球的注视位置。
  2. 根据权利要求1所述的眼球跟踪装置,其特征在于,所述至少两个眼球跟踪相机中,两个相邻的眼球跟踪相机的拍摄范围具有重叠区域。
  3. 根据权利要求2所述的眼球跟踪装置,其特征在于,所述重叠区域的尺寸大于预设眼球的尺寸。
  4. 根据权利要求1所述的眼球跟踪装置,其特征在于,所述眼球跟踪装置还包括:图像采集组件;
    所述图像采集组件位于所述支架上,且与所述控制组件电连接,所述至少两个眼球跟踪相机的拍摄范围位于图像采集组件的拍摄范围内。
  5. 根据权利要求4所述的眼球跟踪装置,其特征在于,所述至少两个眼球跟踪相机的拍摄范围的边缘与所述图像采集组件的拍摄范围的边缘之间具有间隔区域,所述间隔区域的尺寸大于预设人脸的尺寸。
  6. 根据权利要求1所述的眼球跟踪装置,其特征在于,所述支架包括了用于设置待观察面的安装部,所述控制组件用于确定所述至少两个眼球跟踪相机的拍摄范围内的眼球在所述待观察面上的注视位置。
  7. 根据权利要求6所述的眼球跟踪装置,其特征在于,所述控制组件用于确定所述至少两个眼球跟踪相机的拍摄范围内,与所述待观察面之间的距离满足 预设距离范围的眼球在所述待观察面上的注视位置;
    所述至少两个眼球跟踪相机的光轴相互平行且共面,所述至少两个眼球跟踪相机的光轴确定的第一平面与经过所述待观察面的中心的垂线相交,且交点位于所述预设距离范围内。
  8. 根据权利要求7所述的眼球跟踪装置,其特征在于,所述交点位于所述预设距离范围内的中心。
  9. 根据权利要求7所述的眼球跟踪装置,其特征在于,所述待观察面为显示面板的显示面,所述支架包括用于安装所述显示面板的安装部;
    所述至少两个眼球跟踪相机位于所述安装部的下方。
  10. 根据权利要求7所述的眼球跟踪装置,其特征在于,所述至少两个眼球跟踪相机在所述支架上沿水平方向排布。
  11. 根据权利要求6所述的眼球跟踪装置,其特征在于,所述待观察面为显示面板的显示面,所述支架包括用于安装所述显示面板的安装部;
    所述眼球跟踪装置还包括:图像采集组件;
    所述图像采集组件位于所述支架上,且位于所述安装部的下方,所述图像采集组件与所述控制组件电连接,所述至少两个眼球跟踪相机的拍摄范围位于图像采集组件的拍摄范围内。
  12. 根据权利要求1-11任一所述的眼球跟踪装置,其特征在于,所述补光组件包括至少两个补光灯,所述至少两个补光灯位于所述支架上的不同位置。
  13. 根据权利要求12所述的眼球跟踪装置,其特征在于,所述至少两个补光灯在所述支架上沿水平方向排布。
  14. 根据权利要求4所述的眼球跟踪装置,其特征在于,所述图像采集组件包括彩色相机,所述眼球跟踪相机包括红外相机,所述彩色相机的分辨率小于 所述红外相机的分辨率。
  15. 一种眼球跟踪方法,其特征在于,所述方法用于眼球跟踪装置的控制组件中,所述眼球跟踪装置还包括:支架、补光组件以及至少两个眼球跟踪相机;所述至少两个眼球跟踪相机位于所述支架的至少两个不同的位置,且所述至少两个眼球跟踪相机的拍摄范围至少具有部分拍摄范围不重叠;所述补光组件位于所述支架上;所述控制组件与所述至少两个眼球跟踪相机电连接,
    所述方法包括:
    开启所述补光组件;
    获取所述至少两个眼球跟踪相机获取的第一图像;
    基于所述第一图像确定眼球的注视位置。
  16. 根据权利要求15所述的方法,其特征在于,所述眼球跟踪装置还包括:图像采集组件;所述图像采集组件位于所述支架上,且与所述控制组件电连接,
    所述基于所述第一图像确定眼球的注视位置之前,所述方法还包括:
    获取所述图像采集组件采集的第二图像;
    基于所述第二图像获取人脸信息;
    基于所述人脸信息确定眼球在所述第二图像中的位置;
    所述基于所述第一图像确定眼球的注视位置,包括:
    基于所述眼球在所述第二图像中的位置,确定所述眼球在所述第一图像中的位置;
    基于所述眼球在所述第一图像中的位置以及所述第一图像,确定所述眼球的注视位置。
  17. 根据权利要求16所述的方法,其特征在于,所述基于所述眼球在所述第一图像中的位置以及所述第一图像,确定所述眼球的注视位置,包括:
    基于所述眼球在所述第一图像中的位置以及所述第一图像,获取所述眼球的瞳孔的位置,以及所述补光组件发出的光线在所述眼球上的光斑的位置;
    获取所述至少两个眼球跟踪相机的视线模型,所述眼球跟踪相机的视线模型用于根据所述眼球跟踪相机采集得到的第一图像中,所述眼球的瞳孔的位置 以及所述光斑的位置,确定所述眼球的注视位置;
    基于所述至少两个眼球跟踪相机的视线模型,以及所述眼球的瞳孔位置和光斑的位置,确定所述眼球的注视位置。
  18. 根据权利要求17所述的方法,其特征在于,所述基于所述至少两个眼球跟踪相机的视线模型,以及所述眼球的瞳孔位置和光斑的位置,确定所述眼球的注视位置,包括:
    分别通过所述至少两个眼球跟踪相机的视线模型确定得到所述眼球的至少一个注视位置;
    基于所述至少一个注视位置,确定一个目标注视位置。
  19. 根据权利要求17所述的方法,其特征在于,所述第一图像为所述至少两个眼球跟踪相机中第一眼球跟踪相机采集的图像,所述补光组件包括至少两个补光灯,所述至少两个补光灯位于所述支架上的不同位置,所述第一图像中包括两个眼球,
    所述基于所述眼球在所述第一图像中的位置以及所述第一图像,获取所述眼球的瞳孔的位置,以及所述补光组件发出的光线在所述眼球上的光斑的位置,包括:
    基于所述两个眼球在所述第一图像中的位置以及所述第一图像,获取所述两个眼球的瞳孔的位置,以及所述至少两个补光灯发出的光线在所述两个眼球上的光斑的位置;
    所述获取所述至少两个眼球跟踪相机的视线模型,包括:
    基于每个样本数据,确定至少一个所述样本数据对应的视线模型,所述样本数据包括所述两个眼球中一个眼球的瞳孔的位置,以及所述一个眼球上的一个光斑的位置,至少一个所述样本数据对应的视线模型为所述至少两个眼球跟踪相机中所述第一眼球跟踪相机的视线模型。
  20. 一种显示设备,其特征在于,所述显示设备包括显示面板、外壳以及眼球跟踪装置;
    所述眼球跟踪装置包括:控制组件、补光组件、图像采集组件以及至少两 个眼球跟踪相机;
    所述补光组件、图像采集组件以及所述至少两个眼球跟踪相机位于所述外壳上,且均朝向所述显示面板的出光方向,所述至少两个眼球跟踪相机位于所述外壳的至少两个不同的位置;
    所述控制组件与所述至少两个眼球跟踪相机电连接。
  21. 根据权利要求20所述的显示设备,其特征在于,所述至少两个眼球跟踪相机在所述外壳上位于所述显示面板的下方,且沿水平方向排布。
  22. 根据权利要求21所述的显示设备,其特征在于,所述显示面板的显示面呈矩形,且所述显示面的一条边与水平方向平行;
    所述至少两个眼球跟踪相机中,任意两个相邻的眼球跟踪相机之间的距离相等。
  23. 根据权利要求22所述的显示设备,其特征在于,所述眼球跟踪相机与所述显示面板的显示面共面,所述眼球跟踪相机的数量为3,3个所述眼球跟踪相机中两个相邻的眼球跟踪相机之间的间距满足:
    $$D_2\left(\tan\frac{\theta}{2}-\tan\frac{\alpha}{2}\right)-e\ \le\ L\ \le\ 2D_1\tan\frac{\alpha}{2}-p$$
    其中,L用于确定所述间距,p由眼球宽度决定,e由两个眼球之间的间距决定,α为所述眼球跟踪相机的水平视场角,D1和D2分别为预设的所述眼球跟踪相机进行眼球跟踪的目标区域在垂直于所述显示面的方向上,与所述显示面之间的最小距离以及最大距离;
    $$L\ \ge\ D_1\left(\tan\frac{\theta}{2}-\tan\frac{\alpha}{2}\right)$$
    θ为预设的3个所述眼球跟踪相机的整体视场角。
  24. 根据权利要求23所述的显示设备,其特征在于,3个所述眼球跟踪相机的光轴相互平行且共面,3个所述眼球跟踪相机的光轴确定的第一平面与经过所述显示面的中心的垂线相交,且交点位于所述目标区域内。
  25. 根据权利要求24所述的显示设备,其特征在于,所述交点位于所述目标区域在垂直于所述显示面的方向上的中心,所述眼球跟踪相机满足:
    $$\tan b=\frac{2k}{D_1+D_2}$$
    其中,所述b用于确定所述眼球跟踪相机的光轴与所述显示面之间的夹角,所述k为所述显示面的中心与所述眼球跟踪相机在垂直方向上之间的距离。
  26. 根据权利要求20-25任一所述的显示设备,其特征在于,所述图像采集组件在所述外壳上位于所述显示面板的下方,且与所述显示面的下边缘的两端的距离相等。
  27. 根据权利要求20-25任一所述的显示设备,其特征在于,所述外壳包括底座以及位于所述底座上的框体;
    所述补光组件、所述图像采集组件以及所述至少两个眼球跟踪相机均安装于所述框体上。

