WO2024041488A1 - Electronic Device - Google Patents

Electronic Device (Download PDF)

Info

Publication number
WO2024041488A1
Authority
WO
WIPO (PCT)
Prior art keywords
image acquisition
angle
acquisition device
light
electronic device
Application number
PCT/CN2023/114070
Other languages
English (en)
French (fr)
Inventor
李俊生
黄通兵
Original Assignee
北京七鑫易维信息技术有限公司
Application filed by 北京七鑫易维信息技术有限公司
Publication of WO2024041488A1

Classifications

    • G06F1/16: Constructional details or arrangements
    • G06F1/1686: Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated camera
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit
    • G06F3/005: Input arrangements through a video camera
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/013: Eye tracking input arrangements
    • H04N23/55: Optical parts specially adapted for electronic image sensors; mounting thereof
    • H04N23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N23/611: Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body

Definitions

  • the present disclosure relates to the technical field of electronic devices, and in particular, to an electronic device.
  • Eye tracking usually refers to tracking eye movements by measuring the position of the eye's gaze point or the movement of the eyeball relative to the head.
  • Eye tracking module in an electronic device, the movement and gaze direction of the user's eyes when looking at a specific target can be tracked, which is beneficial to the interaction between the user and the electronic device.
  • Eye tracking technology can be implemented using optical recording methods.
  • the principle of the optical recording method is to use an infrared camera to record the subject's eye movements, that is, to obtain eye images that can reflect eye movements, and to extract eye features from the acquired eye images to establish a line of sight estimation model.
  • the eye features may include: pupil position, pupil shape, iris position, iris shape, eyelid position, eye canthus position, light spot position (also called the Purkinje spot), etc.
  • Optical recording methods include the pupillary-corneal reflex method.
  • the principle of the pupil-corneal reflection method is that a near-infrared light source illuminates the eye while an infrared camera photographs the eye; at the same time, the reflection point of the light source on the cornea, i.e. the light spot, is captured, thereby obtaining an eye image with a light spot.
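As an illustration of the pupil-corneal reflection principle described above (not part of the patent disclosure), the following Python sketch assumes the pupil center and the corneal glint (Purkinje spot) have already been detected in the infrared eye image, and maps the pupil-glint offset vector to a screen coordinate with a simple polynomial model fitted from calibration samples; all names and numbers are hypothetical.

```python
import numpy as np

def pupil_glint_vector(pupil_center, glint_center):
    """Offset of the pupil center from the corneal glint, in image pixels."""
    return np.asarray(pupil_center, float) - np.asarray(glint_center, float)

def fit_gaze_model(vectors, screen_points):
    """Fit a second-order polynomial mapping from pupil-glint vectors to
    screen coordinates, using one calibration sample per calibration point."""
    v = np.asarray(vectors, float)
    x, y = v[:, 0], v[:, 1]
    # Design matrix: [1, x, y, x*y, x^2, y^2]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, float), rcond=None)
    return coeffs  # shape (6, 2): one column for screen x, one for screen y

def estimate_gaze(coeffs, vector):
    """Map a single pupil-glint vector to an estimated on-screen gaze point."""
    x, y = vector
    features = np.array([1.0, x, y, x * y, x**2, y**2])
    return features @ coeffs

# Nine calibration points and their measured pupil-glint vectors (made-up values).
calib_vectors = [(dx, dy) for dx in (-20, 0, 20) for dy in (-12, 0, 12)]
calib_screen = [(sx, sy) for sx in (100, 960, 1820) for sy in (100, 540, 980)]
model = fit_gaze_model(calib_vectors, calib_screen)
print(estimate_gaze(model, (5.0, -3.0)))
```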
  • setting an eye tracking module in an electronic device is usually implemented by setting the eye tracking module on a display screen of the electronic device.
  • the present disclosure provides an electronic device that realizes an eye-tracking function, without increasing the thickness of the display screen of the electronic device, through an image acquisition device provided inside the host and a light source and a light-transmitting area provided on the host.
  • the present disclosure provides an electronic device.
  • the electronic device includes a display screen and a host.
  • An image acquisition device is provided inside the host.
  • a light source and a light-transmitting area are provided on the host.
  • the light-transmitting area overlaps with the projection of the collection range of the image acquisition device on the host, and the image acquisition device is configured to collect, through the light-transmitting area, eye images of a user using the electronic device, the eye images including light spots formed by the light source.
  • the number of the light sources is at least two, the light sources are arranged at least on both sides of the image acquisition device, and the luminous center of each light source is arranged on the same straight line as the focal point of the lens of the image acquisition device.
  • a transparent cover plate or filter is provided on the light-transmitting area.
  • the image acquisition device and the host are fixedly installed at a set angle.
  • the image acquisition device is arranged toward the plane where the keyboard area on the host is located, and the image acquisition device directly acquires the eye image through the light-transmitting area.
  • a reflective device is provided in the host, the image acquisition device faces the reflective device, and the reflective device is configured to reflect light reflected by the user's eyes.
  • the reflective device includes one or more of the following: a mirror, a prism, and a dichroic mirror.
  • the image acquisition device is movably installed in the host.
  • the electronic device also includes:
  • a motor, wherein the motor is arranged in the host, the motor is connected to the image acquisition device, and the motor is configured to adjust the acquisition range of the image acquisition device.
  • the electronic device also includes:
  • a device angle measurement device, wherein the device angle measurement device is connected to the image acquisition device and is configured to collect device angle parameters of the image acquisition device and to transmit the device angle parameters to the processor, for the processor to perform calibration point correction based on the device angle parameters.
  • the elevation angle of the image acquisition device is determined based on one or more of the following:
  • the position of the user's head; and the adjustment parameters of the display screen, where the adjustment parameters are determined based on the current screen angle parameters of the display screen.
  • one end of the display screen is connected to one end of the host through a second rotating shaft, a screen angle measuring device is provided in the second rotating shaft, and the screen angle measuring device is configured to measure the screen angle parameter of the display screen and transmit the screen angle parameter to the processor, for the processor to perform calibration point correction based on the screen angle parameter.
  • the present disclosure provides an electronic device.
  • the electronic device includes: a display screen and a host; an image acquisition device is provided inside the host; a light source and a light-transmitting area are provided on the host; the light-transmitting area overlaps with the projection of the collection range of the image acquisition device on the host; and the image acquisition device is configured to collect, through the light-transmitting area, an eye image of a user using the electronic device, where the eye image includes a light spot formed by the light source.
  • the above technical solution realizes the eye tracking function of the electronic device without increasing the thickness of the display screen of the electronic device through the image acquisition device provided inside the host and the light source and light-transmitting area provided on the host.
  • Figure 1 is a schematic structural diagram of an electronic device provided according to the present disclosure
  • Figure 2 is a partial enlarged view of the image acquisition device and the light-transmitting area from the perspective shown in Figure 1 provided according to the present disclosure
  • Figure 3 is a schematic structural diagram of yet another electronic device provided according to the present disclosure.
  • Figure 4 is a schematic structural diagram of yet another electronic device provided according to the present disclosure.
  • Figure 5 is a partial enlarged view of the image acquisition device, the light-transmitting area, the motor, the first rotating shaft and the device angle measurement device from the perspective shown in Figure 4 provided according to the present disclosure;
  • Figure 6a is a schematic diagram of a user's gaze point position provided according to the present disclosure.
  • Figure 6b is a schematic diagram of the electronic device correcting the calibration point when the user's head position changes and the screen angle parameter of the display screen remains unchanged according to the present disclosure
  • Figure 7a is a schematic diagram before the position of the display screen is changed according to the present disclosure.
  • Figure 7b is a schematic diagram of the electronic device correcting the calibration point when the user's head position remains unchanged and the screen angle parameter of the display screen changes, according to the present disclosure.
  • In the description of the present disclosure, it should be noted that the terms "center", "upper", "lower", "left", "right", "front", "back", etc. indicate orientations or positional relationships based on those shown in the accompanying figures. For example, "top" and "bottom" are taken along the header-to-footer direction of the page; "left" and "right" are taken facing the page; and "front" and "back" refer to the directions perpendicular to the page, pointing out of and into the page, respectively.
  • FIG 1 is a schematic structural diagram of an electronic device according to the present disclosure. As shown in Figure 1, the present disclosure provides an electronic device.
  • the electronic device includes:
  • An image acquisition device 3 is provided inside the host 2.
  • a light source 4 and a light-transmitting area 5 are provided on the host 2.
  • the light-transmitting area 5 overlaps with the projection of the collection range of the image acquisition device 3 on the host 2, and the image acquisition device 3 is configured to collect, through the light-transmitting area 5, the eye image of the user using the electronic device, the eye image including the light spots formed by the light source 4.
  • the image acquisition device 3 is arranged inside the host 2, and the light source 4 and the light-transmitting area 5 are arranged on the host 2, so that the image acquisition device 3 can collect the eye image of the user using the electronic device through the light-transmitting area 5. It realizes the eye tracking function of the electronic device without increasing the thickness of the electronic device.
  • here, a laptop computer is taken as an example of the electronic device.
  • the electronic device is not limited to a laptop computer and can be any electronic device that requires eye tracking.
  • the display screen 1 may include a housing, and a screen provided on the housing for display.
  • the host 2 includes a casing, and a keyboard may also be provided on the casing.
  • the display screen 1 is connected to the host computer 2 through a second rotating axis.
  • the second rotating axis may refer to the rotating axis connecting the display screen 1 and the host computer 2.
  • through the second rotating shaft, the electronic device can set the display screen 1 and the host 2 at different relative angles; specifically, the relative positions of the display screen 1 and the host 2 can be adjusted according to actual needs.
  • image acquisition device 3 includes but is not limited to: infrared camera equipment, infrared image sensor, camera or video camera, etc.
  • the way in which the image acquisition device 3 is installed inside the host 2 is not limited, as long as the image acquisition device 3 can be installed inside the host 2 .
  • for example, the image acquisition device 3 can be fixedly installed in the host 2. For another example, the image acquisition device 3 can be movably installed in the host 2 through a slide rail; the acquisition range can then be adjusted by sliding the image acquisition device 3 left and right in the slide rail, ensuring that the image acquisition device 3 can still collect the user's eye image when the user's position relative to the electronic device changes. For yet another example, the image acquisition device 3 can be movably installed in the host 2 through a motor, and the motor controls the rotation of the image acquisition device 3, thereby adjusting the range over which the image acquisition device 3 collects images of the user's eyes.
  • the installation position of the image capture device 3 inside the host 2 is not limited, as long as the image capture device 3 can capture the user's eye image when the user uses the electronic device.
  • the eye image can be considered as an image including the user's eyes.
  • the image acquisition device 3 is disposed inside the host 2.
  • the image acquisition device 3 is located between the second rotating shaft and the keyboard, at a set distance from the center of the second rotating shaft.
  • the center position may be the position of the midpoint of the second rotating axis.
  • the set distance is not limited here, and can be determined according to the size of the electronic device.
  • the light source 4 is disposed on the host computer 2 so that the light emitted by the light source 4 can be reflected to the electronic device through the user's eyes, thereby forming light spots in the eye image collected by the image acquisition device 3 .
  • the image acquisition device 3 takes pictures of the eyes and correspondingly captures the reflection points of the light source 4 on the cornea, that is, the light spots, also known as Purkinje spots, thus forming an eye image with light spots.
  • the wavelength of the light source 4 is not limited, as long as it can ensure that the image acquisition device 3 can collect eye images with light spots.
  • the light source 4 may emit infrared light in a wavelength band that is safe for the human eye. Compared with light in other wavelength bands, when infrared light in this human-eye-safe band irradiates the eye, it is largely absorbed by the water in the lens of the human eye, which greatly reduces the light intensity reaching the retina.
  • the arrangement of the light sources 4 on the host 2 is not limited, as long as light spots can exist in the eye image; for example, the light sources may be arranged in a predetermined pattern such as a straight line, a square, or a circle.
  • the number of light sources 4 is not limited, and the position of the light sources 4 on the host computer 2 is not limited, as long as it can ensure that the image acquisition device 3 can collect eye images with light spots.
  • Different light sources 4 can be set at different positions on the host computer 2, as long as the eye image includes light spots.
  • for example, two light sources 4 are arranged on the host 2 close to the second rotating shaft, one at the left end and one at the right end of the image acquisition device 3. The two light sources 4 and the image acquisition device 3 lie on the same straight line, with the image acquisition device 3 at the midpoint between the two light sources 4, as shown in Figure 1.
  • the position of the light-transmitting area 5 on the host 2 is not limited, as long as the image capture device 3 can collect eye images of the user using the electronic device through the light-transmitting area 5 .
  • the light-transmitting area 5 is located above the image acquisition device 3.
  • the light-transmitting area 5 can lie in a plane parallel to the host 2, or at a certain angle to that plane; the specific arrangement depends on the position and shooting range of the image acquisition device 3, and it is sufficient that the light-transmitting area 5 ultimately covers the shooting range of the image acquisition device 3.
  • the light-transmitting area 5 overlaps with the projection of the collection range of the image acquisition device 3 on the host 2. It can be understood that the light-transmitting area 5 lies within the collection range of the image acquisition device 3 and covers the entire shooting range of the image acquisition device 3 toward the field of view of the human eye. After the light emitted by the light source 4 is reflected by the user's eyes, the reflected light first passes through the light-transmitting area 5 and is then collected by the image acquisition device 3.
  • the light-transmitting area 5 can transmit all the light reflected by the user's eyes, or can transmit only the light reflected by the user's eyes corresponding to the wavelength band of the light source 4, which can be set according to actual needs.
  • the image acquisition device 3 is arranged inside the host 2, and the light source 4 and the light-transmitting area 5 are arranged on the host 2, so that the image acquisition device 3 can collect the eye image of the user using the electronic device through the light-transmitting area 5, thereby realizing the eye-tracking function of the electronic device.
  • the eye tracking function of the electronic device can be realized without increasing the thickness of the display screen in the electronic device.
  • the number of light sources 4 is at least two, each light source 4 is arranged at least on both sides of the image acquisition device 3, and the luminous center of the light source 4 is arranged on the same straight line as the focus of the lens of the image acquisition device 3.
  • the number of light sources 4 is at least two, and the light sources 4 are arranged at least on both sides of the image acquisition device 3; it can be understood that at least one light source 4 is distributed on each side of the image acquisition device 3, and the luminous center of each light source 4 is set on the same straight line as the focal point of the lens of the image acquisition device 3.
  • for example, there are six light sources 4; the light sources 4 are distributed in the area between the second rotating shaft and the keyboard on the host 2, around the image acquisition device 3, and three light sources 4 can be arranged in a straight line on each side of the image acquisition device 3.
  • for another example, there are three light sources 4: one light source 4 is distributed on each side of the image acquisition device 3, and the remaining light source 4 can be distributed at any position on the keyboard. The position of the light source 4 on the keyboard can be adjusted according to actual needs, as long as the light emitted by the three light sources 4 can be received by the image acquisition device 3 after being reflected by the user's eyes.
  • when a light source 4 is formed by an array of light-emitting elements, its luminous center is the geometric center point of the array in which it is located.
  • the focus of the lens of the image acquisition device 3 may refer to the convergence point of parallel light rays after being refracted by the lens of the image acquisition device 3 .
  • when the image acquisition device 3 has a single lens, the focal point of the lens of the image acquisition device 3 is the focal point of that lens; when the lens of the image acquisition device 3 is a lens group composed of multiple lenses, the focal point of the lens of the image acquisition device 3 is the focal point of the lens group.
  • in other embodiments, the luminous center of the light source 4 and the focal point of the lens of the image acquisition device 3 need not lie on the same straight line, as long as the light emitted by the light source 4 can be received by the image acquisition device 3 after being reflected by the user's eyes; the position of the luminous center of the light source can be set according to actual needs.
  • a transparent cover plate or filter is provided on the light-transmitting area 5 .
  • the transparent cover can cover the entire shooting range of the image acquisition device 3 to the viewing angle of the human eye, and can transmit all the light reflected by the user's eyes.
  • the light emitted by the light source 4 is reflected by the user's eyes, and the light reflected by the user's eyes is emitted to the image capture device 3 through the transparent cover, and then is received by the image capture device 3 .
  • the image acquisition device 3 installed inside the host 2 can be protected, and at the same time, the acquisition range of the image acquisition device 3 will not be blocked.
  • the filter may refer to an optical device used to select a required radiation wavelength band.
  • the type of the filter is not limited, for example, it can be a thin film filter.
  • the filter can be determined according to the wavelength band of the light source 4, so that the light emitted by the light source 4 can pass through the filter and then reach the image acquisition device 3 after being reflected by the user's eyes.
  • for example, when the light source 4 emits infrared light, the light reflected by the user's eyes is infrared light. The filter can be chosen to pass only infrared light in the band corresponding to the light source 4, while light in other bands cannot pass the filter. In this way, the light that finally reaches the image acquisition device 3 is only the light in the wavelength band corresponding to the light source 4, which makes the eye image collected by the image acquisition device 3 clearer.
  • the size of the transparent cover or filter is not limited, as long as the transparent cover or filter can cover the entire shooting range of the image acquisition device 3 to the viewing angle of the human eye.
  • for example, the transparent cover or filter may coincide with the entire light-transmitting area; for another example, the transparent cover or filter may be larger than the light-transmitting area and cover the entire light-transmitting area.
  • by arranging a transparent cover on the light-transmitting area 5, the image acquisition device 3 is protected; by arranging a filter on the light-transmitting area 5, the accuracy of the eye images collected by the image acquisition device 3 is improved.
  • the image acquisition device 3 and the host computer 2 are fixedly installed at a set angle.
  • the set angle may refer to the angle between the image acquisition device 3 and the plane where the host 2 is located.
  • the set angle is not limited. Different set angles can make the image acquisition device 3 correspond to different acquisition ranges.
  • the angle can be any angle greater than or equal to 0 degrees, as long as the image capture device 3 can capture the user's eye image.
  • the method of fixed installation of the image acquisition device 3 is not limited, as long as the image acquisition device 3 can be fixed inside the host 2 .
  • the image acquisition device 3 is fixed inside the host 2 through welding, and is at a set angle with the host 2; the image acquisition device 3 is fixed inside the host 2 through threaded connection, and is at a set angle with the host 2.
  • by fixing the image acquisition device 3 and the host 2 at a set angle, without a moving mechanism, the image acquisition device 3 can be more reliably fixed inside the host 2; at the same time, when installing the image acquisition device 3, different acquisition ranges can be provided by adjusting the set angle.
  • the image acquisition device 3 is arranged toward the plane where the keyboard area on the host 2 is located, and the image acquisition device 3 directly acquires eye images through the light-transmitting area 5 .
  • the image capture device 3 is arranged toward the plane where the keyboard area on the host 2 is located. It can be understood that the projection area of the shooting range of the image capture device 3 on the host 2 is on the plane where the keyboard area on the host 2 is located.
  • Figure 2 is a partially enlarged view of the image capture device and the light-transmitting area from the perspective shown in Figure 1 provided according to the present disclosure. It can be seen from Figure 2 that the image capture device 3 is set toward the plane where the keyboard area 6 on the host 2 is located, and the user's eyes are The reflected light passes through the light-transmitting area 5 and is collected by the image acquisition device 3 to form a light spot. That is, the image acquisition device 3 can directly collect eye images through the light-transmitting area 5. There is no need to add other components, thus saving the internal space of the host 2.
  • a reflective device is provided in the host 2, the image acquisition device 3 faces the reflective device, and the reflective device is configured to reflect the light reflected by the user's eyes.
  • FIG. 3 is a schematic structural diagram of yet another electronic device provided according to the present disclosure.
  • a reflection device 7 provided inside the host 2 is shown.
  • the type of the reflecting device 7 is not limited, as long as it can reflect the light reflected by the user's eyes.
  • it may include one or more of the following: mirrors, prisms and dichroic mirrors, configured to reflect light reflected by the user's eyes.
  • the image acquisition device 3 faces the reflective device 7, and the reflective device 7 is configured to reflect the light reflected by the user's eyes. It can be understood that after the light emitted by the light source 4 is reflected by the user's eyes, it passes through the light-transmitting area 5 and reaches the reflective device 7; the reflective device 7 reflects this light, which is then collected by the image acquisition device 3.
  • the location of the reflection device 7 inside the host 2 is not limited, as long as the reflection device 7 can reflect the light reflected by the user's eyes when the image acquisition device 3 collects the image of the user's eyes.
  • for example, the reflective device 7 is arranged inside the host 2, within the collection range of the image acquisition device 3, and forms a certain angle with the plane where the host 2 is located. By changing the angle between the reflective device 7 and the plane where the host 2 is located, the optical path of the light reflected by the user's eyes is changed, thereby adjusting the collection range of the image acquisition device 3 (see the sketch below).
  • the light-transmitting area 5 and the reflecting device 7 can correspond to the light source 4 , so that the light emitted by the light source 4 can pass through the light-transmitting area 5 and then reach the reflecting device 7 after being reflected by the user's eyes.
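To illustrate why tilting the reflective device shifts the collection range (this 2D geometry example is not from the patent), the sketch below reflects a ray about a mirror normal and shows that rotating the mirror by a given angle rotates the reflected ray by twice that angle:

```python
import numpy as np

def reflect(ray, normal):
    """Reflect a 2D ray direction about a unit mirror normal: r' = r - 2(r.n)n."""
    ray = np.asarray(ray, float)
    n = np.asarray(normal, float)
    n = n / np.linalg.norm(n)
    return ray - 2.0 * np.dot(ray, n) * n

def mirror_normal(tilt_deg):
    """Unit normal of a mirror tilted by `tilt_deg` from the plane of the host."""
    t = np.radians(tilt_deg)
    return np.array([np.sin(t), np.cos(t)])

incoming = np.array([0.0, -1.0])  # light travelling straight down toward the host
for tilt in (40.0, 45.0, 50.0):
    out = reflect(incoming, mirror_normal(tilt))
    print(f"mirror tilt {tilt:4.1f} deg -> reflected direction {np.round(out, 3)}")
```

With a 45-degree tilt the downward ray is turned to the horizontal; changing the tilt by 5 degrees turns the reflected ray by 10 degrees, which is why a small adjustment of the reflective device noticeably changes the collection range.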
  • for example, the light source 4 is an infrared light source, a filter is provided on the light-transmitting area 5, the filter is an infrared filter, and the reflecting device 7 is a dichroic mirror. After the light reflected by the user's eyes is filtered by the infrared filter, only infrared light in the band corresponding to the light source 4 can reach the reflecting device 7. The reflecting device 7 almost completely reflects light in the band corresponding to the light source 4 while almost completely transmitting light in other bands, so the reflecting device 7 reflects the light reflected by the user's eyes, which is finally collected by the image acquisition device 3. In this way, only infrared light in the band corresponding to the light source 4 is ultimately collected by the image acquisition device 3, making the eye images collected by the image acquisition device 3 clearer.
  • a reflective device 7 is provided in the host 2 to reflect the light reflected by the user's eyes. By changing the position of the reflective device 7 in the host 2, or the angle between the reflective device 7 and the plane where the host 2 is located, the optical path of the light reflected by the user's eyes changes, thereby making the collection range of the image acquisition device 3 more flexible.
  • the reflective device 7 includes one or more of the following: a mirror, a prism, and a dichroic mirror.
  • the type of prism is not limited as long as it can be set to reflect the light reflected by the user's eyes.
  • it can be a simple prism, a roof prism or a compound prism, etc.
  • a simple prism can refer to a prism with all working surfaces perpendicular to the main cross section. It is divided into prisms in the form of primary reflection, secondary reflection and triple reflection;
  • a roof prism can refer to a prism with a ridge surface, and the ridge surface is a mutually perpendicular reflecting surface;
  • a compound prism can refer to a prism composed of two or more prisms.
  • a dichroic mirror can separate incident light into specific spectral bands and change the light path of part of the spectrum when the light is incident at 45 degrees or another large angle; its characteristic is that it almost completely transmits light of certain wavelengths while almost completely reflecting light of other wavelengths.
  • the type of dichroic mirror is not limited, for example, it can be a single-bandpass dichroic mirror or a multi-bandpass dichroic mirror.
  • the light reflected by the user's eyes can have different properties after being reflected by the reflective device 7 .
  • the image acquisition device 3 in the present disclosure can also be movable inside the host 2 .
  • the image acquisition device 3 is movably installed in the host computer 2 .
  • the manner in which the image acquisition device 3 is movably installed in the host 2 is not limited, as long as the image acquisition device 3 can be movably installed in the host 2. For example, the image acquisition device 3 is movably installed in the host 2 through a base: the base is fixed in the host 2, the image acquisition device 3 is connected to the base through a rotating shaft, and the image acquisition device 3 can rotate on the base about the rotating shaft, thereby adjusting the range over which the image acquisition device 3 collects the user's eye images.
  • the electronic device further includes:
  • the motor 8 is arranged in the host 2, the motor 8 is connected with the image acquisition device 3, and the motor 8 is configured to adjust the acquisition range of the image acquisition device 3.
  • the motor 8 may refer to an electromagnetic device that converts or transmits electric energy according to the law of electromagnetic induction.
  • the installation position of the motor 8 inside the host computer 2 is not limited, as long as the motor 8 can control the rotation of the image acquisition device 3 .
  • the motor 8 can be disposed in an area close to the second rotation axis inside the host 2.
  • the position of the motor 8 is determined according to the position of the image acquisition device 3 in the host 2.
  • the way in which the motor 8 is arranged inside the host machine 2 is not limited, as long as the motor 8 can be arranged inside the host machine 2 .
  • for example, the motor 8 can be fixedly or movably installed inside the host 2.
  • the type of the motor 8 is not limited in this disclosure, as long as it can control the rotation of the image acquisition device 3 .
  • it can be a DC motor, an asynchronous motor or a synchronous motor, etc.
  • the way in which the motor 8 is connected to the image acquisition device 3 is not limited.
  • the motor 8 and the image acquisition device 3 are connected through the first rotating shaft.
  • the image acquisition device 3 can be fixed on the host 2, the motor 8 is also fixed on the host 2, and the motor 8 and the image acquisition device 3 are movably connected.
  • the image acquisition device 3 is fixedly connected to the motor 8 and then movably connected to the host 2 .
  • FIG 4 is a schematic structural diagram of another electronic device according to the present disclosure.
  • for example, the image acquisition device 3 is movably installed in the host 2 through the motor 8, and the motor 8 is connected to the image acquisition device 3 through the first rotating shaft 9.
  • the first rotating shaft 9 may refer to a shaft connecting the motor 8 and the image acquisition device 3 and configured to bear both bending moment and torque during rotation.
  • the first rotating shaft 9 is not limited in this disclosure, as long as the motor 8 can control the rotation of the image acquisition device through the first rotating shaft 9 .
  • the first rotating shaft 9 is a straight-shaped rotating shaft.
  • the motor 8 is connected to the image acquisition device 3 through the first rotating shaft 9 and is configured to control the rotation of the image acquisition device 3; by controlling this rotation, the motor 8 adjusts the acquisition range of the image acquisition device 3.
  • the electronic device further includes:
  • the equipment angle measurement device 10 is connected to the image acquisition device 3.
  • the equipment angle measurement device 10 is configured to collect the device angle parameters of the image acquisition device 3.
  • the device angle measurement device 10 is also configured to transmit the device angle parameters to the processor, for the processor to perform calibration point correction based on the device angle parameters.
  • the device angle measuring device 10 may refer to a device configured to measure the angle of the device, and configured to collect device angle parameters of the image acquisition device 3 .
  • the device angle parameter may be a parameter characterizing the angle or angle transformation of the image acquisition device 3 .
  • the device angle parameters of the image acquisition device 3 may include, but are not limited to: a first parameter before the image acquisition device 3 is rotated by the motor 8, a second parameter after the image acquisition device 3 is rotated by the motor 8, and a set angle parameter of the image acquisition device 3. The set angle parameter may refer to an angle parameter of the image acquisition device 3 set according to actual needs, and the rotation angle of the image acquisition device 3 can be obtained from the set angle parameter and the second parameter measured after the motor 8 rotates, for example as a third parameter.
  • the angle of rotation of the image acquisition device 3 can be obtained through the device angle measuring device 10, which facilitates control of the shooting range of the image acquisition device 3.
  • the first parameter, the second parameter and the third parameter may be angle-related parameters, and the angle may be the angle of the image acquisition device 3 relative to the reference object.
  • the reference object is not limited and can be set according to the actual situation.
  • the reference object is host 2.
  • the set angle parameter may be the initial angle between the image acquisition device 3 and the host computer 2 .
  • the type of the device angle measuring device 10 is not limited, as long as it can measure the angle at which the image capturing device 3 rotates following the motor 8 .
  • the equipment angle measuring device 10 can be an angle sensor.
  • the angle sensor has a rotatable shaft; when the shaft rotates, the angle sensor counts according to the rotation of the shaft: when it rotates in one direction, the count increases, and when the rotation direction changes, the count decreases.
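The counting behaviour described above resembles an incremental encoder. The following toy model is for illustration only; the counts-per-degree resolution is an assumed value, not something specified in the patent:

```python
class IncrementalAngleSensor:
    """Toy model of an angle sensor that counts increments of shaft rotation."""

    def __init__(self, counts_per_degree: float = 10.0):
        self.counts_per_degree = counts_per_degree  # assumed resolution
        self.count = 0

    def on_shaft_step(self, direction: int) -> None:
        """direction = +1 for one rotation direction, -1 for the opposite one."""
        self.count += 1 if direction > 0 else -1

    def angle_degrees(self) -> float:
        """Accumulated rotation angle relative to the starting position."""
        return self.count / self.counts_per_degree


sensor = IncrementalAngleSensor()
for _ in range(150):   # rotate one way...
    sensor.on_shaft_step(+1)
for _ in range(50):    # ...then partially back the other way
    sensor.on_shaft_step(-1)
print(sensor.angle_degrees())  # 10.0 degrees net rotation
```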
  • for another example, the device angle measuring device 10 may be the motor 8 itself: since the image acquisition device 3 is controlled by the motor 8, the angle parameters of the image acquisition device 3 are known to the motor 8, and the motor 8 is able to determine the angle parameters of the image acquisition device 3.
  • the method of connecting the equipment angle measuring device 10 to the image capturing device 3 is not limited, as long as the angle measuring device 10 can measure the angle of the image capturing device 3 rotating with the motor 8 .
  • the equipment angle measurement device 10 is connected to the image acquisition device 3 and the motor 8 through the first rotating shaft 9.
  • when the motor 8 rotates, it drives the first rotating shaft 9 to rotate, and the first rotating shaft 9 in turn drives the image acquisition device 3 to rotate, so that the device angle measuring device 10 determines the device angle parameter of the image acquisition device 3 according to the rotation of the first rotating shaft 9.
  • Figure 5 is a partial enlarged view of the image acquisition device, the light-transmitting area, the motor, the first rotating shaft and the device angle measurement device from the perspective shown in Figure 4, provided according to the present disclosure. It can be seen from Figure 5 that the image acquisition device 3 is movably installed in the host 2 through the motor 8, the motor 8 is connected to the image acquisition device 3 through the first rotating shaft 9, and the device angle measuring device 10 is connected to the image acquisition device 3 and collects the device angle parameters of the image acquisition device 3.
  • when the motor 8 rotates, the motor 8 drives the first rotating shaft 9 to rotate, and the first rotating shaft 9 can drive the image acquisition device 3 to rotate, so that the device angle measuring device 10 determines the device angle parameter of the image acquisition device 3 according to the rotation of the first rotating shaft 9.
  • the equipment angle measuring device 10 is further configured to transmit the equipment angle parameters to the processor, so that the processor can perform calibration point correction based on the equipment angle parameters.
  • the calibration point can be understood as a point displayed on the display screen 1 for line-of-sight calibration.
  • the device angle measuring device 10 transmits the device angle parameters to the processor, which allows the processor to obtain the rotation angle of the image acquisition device 3, that is, the elevation angle of the image acquisition device 3, and then perform calibration point correction through the device angle parameters.
  • the method of correcting the calibration point through the device angle parameters is not limited.
  • for example, the processor obtains the rotation angle of the image acquisition device 3 according to the device angle parameters uploaded by the device angle measurement device 10, and thereby obtains the angle change of the central optical axis of the image acquisition device 3; using this angle change, the calibration point is corrected so that the calibration point and the gaze point always maintain a one-to-one correspondence before and after the direction of the central optical axis of the image acquisition device 3 is adjusted.
  • the correction of the calibration point by the processor based on the device angle parameter is realized.
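The patent does not give formulas for this correction; as a hedged sketch, one simple model shifts the on-screen calibration point by an amount proportional to the change in the camera's optical-axis elevation, so that it again coincides with the gaze point. The pixel-per-degree sensitivity below is a hypothetical parameter that in practice would come from the calibrated gaze model:

```python
def correct_calibration_point(calib_point_px, delta_elevation_deg,
                              px_per_degree_vertical=35.0):
    """Shift an on-screen calibration point to compensate for a rotation of the
    camera's central optical axis.

    calib_point_px         -- (x, y) calibration point, in screen pixels
    delta_elevation_deg    -- rotation of the optical axis (positive = tilted up)
    px_per_degree_vertical -- assumed sensitivity of the estimated gaze position
                              to the camera elevation (hypothetical value)
    """
    x, y = calib_point_px
    # The estimated gaze shifts vertically with the optical axis, so the
    # calibration point is moved by the opposite amount to keep the one-to-one
    # correspondence between calibration point and gaze point.
    return (x, y - delta_elevation_deg * px_per_degree_vertical)


# Camera tilted up by 2 degrees after the motor follows the user's head:
print(correct_calibration_point((960, 540), delta_elevation_deg=2.0))
```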
  • the elevation angle of the image acquisition device 3 is determined based on one or more of the following: the position of the user's head; and the adjustment parameters of the display screen 1, where the adjustment parameters are determined based on the current screen angle parameters of the display screen 1.
  • the user's head position may refer to the position of the user's head relative to the display screen 1 .
  • the screen angle parameter of the display screen 1 may refer to the angle parameter of the display screen 1 relative to the host computer 2.
  • the screen angle parameters may include, but are not limited to: a first angle parameter before the angle of the display screen 1 relative to the host 2 changes, a second angle parameter after the angle of the display screen 1 relative to the host 2 changes, and a third angle parameter when the angle of the display screen 1 relative to the host 2 is a set angle.
  • the set angle may be the angle of the display screen 1 relative to the host 2 set according to actual needs. For example, when the electronic device is turned off, the corresponding set angle of the display screen 1 relative to the host 2 is 0.
  • the adjustment parameters of the display screen 1 can be determined through the screen angle parameters, that is, as the angle change of the display screen 1 relative to the host 2 when the display screen 1 is adjusted: the angle change relative to the display screen 1 before adjustment, or the angle change relative to the set angle.
  • the image acquisition device 3 can synchronously adjust the elevation angle of the image acquisition device 3 according to the change in the user's head position.
  • the processor performs calibration point correction according to the device angle parameters determined by the elevation angle of the image acquisition device 3 .
  • the elevation angle of the image acquisition device 3 can also be adjusted synchronously according to the adjustment parameters of the display screen 1, so that the collection of the user's eye images can always be guaranteed while the adjustment parameters of the display screen 1 change.
  • the elevation angle of the image acquisition device 3 can be adjusted synchronously according to the user's head position and the adjustment parameters of the display screen 1 .
  • since the elevation angle of the image acquisition device 3 is determined by one or more of the user's head position and the adjustment parameters of the display screen 1, it can be adjusted synchronously when the position of the user's head and/or the adjustment parameters of the display screen 1 change, so that the image acquisition device 3 can always ensure the collection of images of the user's eyes.
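The patent does not prescribe how the two inputs are combined; the following minimal sketch assumes the camera is aimed at the user's eyes and follows the screen adjustment one-to-one, purely as an illustration:

```python
import math

def target_elevation_deg(head_height_m, head_distance_m,
                         screen_delta_deg=0.0, base_elevation_deg=0.0):
    """Candidate elevation angle for the image acquisition device.

    head_height_m      -- height of the user's eyes above the camera
    head_distance_m    -- horizontal distance from the camera to the eyes
    screen_delta_deg   -- change in the display screen angle (adjustment parameter)
    base_elevation_deg -- the set angle at which the camera was installed
    """
    # Point the optical axis toward the eyes...
    head_term = math.degrees(math.atan2(head_height_m, head_distance_m))
    # ...and follow the screen adjustment one-to-one (assumed policy).
    return base_elevation_deg + head_term + screen_delta_deg


print(target_elevation_deg(head_height_m=0.35, head_distance_m=0.50,
                           screen_delta_deg=-5.0))
```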
  • one end of the display screen 1 is connected to one end of the host computer 2 through a second rotating shaft.
  • a screen angle measuring device 11 is provided in the second rotating shaft.
  • the screen angle measuring device 11 is configured to measure the screen angle parameter of the display screen 1 and transmit the screen angle parameter to the processor, for the processor to perform calibration point correction based on the screen angle parameter.
  • the screen angle measuring device 11 may refer to a device configured to measure the relative angle between the display screen 1 and the host 2 , and the screen angle parameter of the display screen 1 can be obtained through the screen angle measuring device 11 .
  • the type of the screen angle measuring device 11 is not limited, as long as it can determine the screen angle parameter of the display screen 1 , for example, it can be an angle sensor.
  • the screen angle measuring device 11 and the device angle measuring device 10 may be the same angle measuring device, or they may be different angle measuring devices, as long as they can achieve the corresponding functions of the angle measuring devices.
  • the position of the screen angle measuring device 11 in the second rotation axis is not limited, as long as the screen angle parameter of the display screen 1 can be determined.
  • the screen angle measuring device 11 may be arranged at the rightmost end of the second rotating axis; for another example, the screen angle measuring device 11 may be arranged at the leftmost end of the second rotating axis.
  • the screen angle measurement device 11 transmits the screen angle parameters to the processor.
  • the processor may obtain the angle change of the display screen 1 relative to the host 2, and then perform calibration point correction through the screen angle parameters.
  • the method of correcting the calibration point through the screen angle parameter is not limited.
  • for example, when the screen angle parameter of the display screen 1 changes, the calibration point on the display screen 1 correspondingly changes with the angle change of the display screen 1, and the user's gaze point on the display screen 1 also changes. The electronic device determines the rotation angle of the display screen 1 according to the screen angle measuring device 11, derives the positional relationship between the changed calibration point and the changed gaze point, and moves the changed calibration point to the position corresponding to the changed gaze point, so that the gaze point and the calibration point always maintain a one-to-one correspondence before and after the angle position of the display screen 1 is adjusted.
  • the processor corrects the calibration point based on the screen angle parameter.
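For the screen-rotation case, a hedged 2D (side-view) sketch: the screen rotates about the second rotating shaft, the user's gaze ray toward the old calibration point is unchanged, and the corrected calibration position is taken where that ray meets the rotated screen. The hinge-at-origin geometry and millimetre values are illustrative only:

```python
import numpy as np

def screen_point(phi_deg, dist_mm):
    """Side-view position of a point `dist_mm` up the screen, with the hinge
    (second rotating shaft) at the origin and the host along +x."""
    phi = np.radians(phi_deg)
    return dist_mm * np.array([np.cos(phi), np.sin(phi)])

def corrected_calibration_distance(eye_pos, phi_old_deg, phi_new_deg, dist_old_mm):
    """Distance along the rotated screen at which the user's unchanged gaze ray
    (aimed at the old calibration point) now intersects the screen."""
    eye = np.asarray(eye_pos, float)
    target_old = screen_point(phi_old_deg, dist_old_mm)
    ray = target_old - eye                       # gaze direction (head unchanged)
    screen_dir = screen_point(phi_new_deg, 1.0)  # unit vector along rotated screen
    # Solve eye + t*ray = s*screen_dir for (t, s); s is the on-screen distance.
    A = np.column_stack([ray, -screen_dir])
    t, s = np.linalg.solve(A, -eye)
    return s


# Eye 400 mm in front of and 300 mm above the hinge; screen opened 10 deg further.
print(corrected_calibration_distance(eye_pos=(400.0, 300.0),
                                     phi_old_deg=100.0, phi_new_deg=110.0,
                                     dist_old_mm=200.0))
```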
  • the elevation angle of the image acquisition device 3 can be adjusted synchronously according to the adjustment parameters of the display screen 1, so that the collection of the user's eye images can always be guaranteed while the adjustment parameters of the display screen 1 change.
  • the processor performs calibration point correction based on the device angle parameters determined by the elevation angle of the image acquisition device 3 and the screen angle parameters of the display screen 1 measured by the screen angle measurement device 11 .
  • the elevation angle of the image acquisition device 3 can be adjusted according to the user's head position and the adjustment parameters of the display screen 1 .
  • the processor performs calibration point correction based on the device angle parameters determined by the elevation angle of the image acquisition device 3 and the screen angle parameters of the display screen 1 measured by the screen angle measurement device 11 .
  • the electronic device in the present disclosure can measure the device angle parameters of the image acquisition device 3 by providing the device angle measurement device 10, and can measure the screen angle parameters of the display screen 1 by providing the screen angle measurement device 11, so that the processor can perform calibration point correction according to the device angle parameters and/or the screen angle parameters.
  • Figures 6a and 6b show a schematic diagram of the electronic device correcting the calibration point when the user's head position changes and the screen angle parameter of the display screen 1 remains unchanged.
  • Figure 6a is a schematic diagram of a user's gaze point position provided according to the present disclosure.
  • FIG. 6b is a schematic diagram of the electronic device correcting the calibration point when the user's head position changes and the screen angle parameter of the display screen remains unchanged according to the present disclosure.
  • in Figure 6a, the user's head position is head position 16, the user's gaze point on the display screen 1 is gaze point 12, and the calibration point is calibration point 13; the gaze point position is calibrated when the user gazes at the display screen 1, so that the gaze point 12 and the calibration point 13 are finally at the same position on the display screen 1.
  • Figure 6a also shows the user's line of sight 14 and the central optical axis 15 of the image capture device 3. Among them, the user's line of sight 14 is the path of the user's line of sight from the eyes to the display screen 1 .
  • as shown in Figure 6b, when the user's head position changes, the electronic device synchronously adjusts the elevation angle of the image acquisition device 3 according to the change in the user's head position; that is, the electronic device controls the motor 8 to drive the image acquisition device 3 to rotate and adjust the direction of the central optical axis of the image acquisition device 3. When the angle of the image acquisition device 3 changes, the calibration point deflects with the image acquisition device 3; that is, as the central optical axis of the image acquisition device 3 changes, the calibration point 13 deflects with the image acquisition device 3 to calibration point 20.
  • the electronic device measures the device angle parameter of the image acquisition device 3 with the device angle measurement device 10 to obtain the rotation angle of the image acquisition device 3, and thereby obtains the angle change of the central optical axis of the image acquisition device 3; it then derives the positional relationship between the calibration point 20 and the gaze point 12 and moves the calibration point 20 to the gaze point 12, so that the calibration point 20 and the gaze point 12 always maintain a one-to-one correspondence before and after the direction of the central optical axis of the image acquisition device 3 is adjusted.
  • the correction of the calibration point by the processor based on the device angle parameter is realized.
  • Figures 7a and 7b are schematic diagrams of the electronic device correcting the calibration point when the user's head position remains unchanged but the screen angle parameter of the display screen changes.
  • Figure 7a is a schematic diagram before the position of the display screen is changed, according to the present disclosure.
  • Figure 7b is a schematic diagram of the electronic device correcting the calibration point when the user's head position remains unchanged and the screen angle parameter of the display screen changes according to the present disclosure.
  • in Figure 7a, the position of the display screen before the change is position 23, the user's gaze point on the display screen is gaze point 21, the calibration point is calibration point 22, the user's line of sight is line of sight 24, and the central optical axis of the image acquisition device 3 is central optical axis 25.
  • as shown in Figure 7b, the electronic device obtains the adjustment angle of the display screen based on the screen angle parameter of the display screen measured by the screen angle measuring device 11, derives the positional relationship between the calibration point 29 and the gaze point 27, and moves the calibration point 29 to the position corresponding to the gaze point 27.
  • the gaze point and the calibration point always maintain a one-to-one correspondence before and after the angle position of the display screen is adjusted.
  • the one-to-one correspondence between the gaze point and the calibration point is always maintained before and after the screen angle parameter is changed, enabling the processor to correct the calibration point based on the screen angle parameter.
  • the electronic device provided by the present disclosure includes a display screen, a host, an eye camera module (i.e., an eye camera, also called the image acquisition device), a reflection device, an eye camera window (i.e., the light-transmitting area), and an infrared lamp (i.e., the light source).
  • the eye camera is installed in the internal structural space of the host and is installed separately from the screen (i.e., the display screen).
  • An eye camera captures an image of the eye directly, or an eye camera captures an image of the eye through reflection from a reflective device.
  • the reflective device may be a mirror, prism or dichroic mirror.
  • the luminous center of the infrared lamp and the focal point of the lens of the eye camera are set on the same axis.
  • the eye camera is installed at a certain angle with the display screen, and the reflection device is installed at a certain angle with the eye camera.
  • an infrared filter is set on the optical path between the eye camera and the human eye.
  • the eye camera can also be installed inside the host, with the eye camera installed separately from the screen.
  • the eye camera can be set up in the following four situations:
  • in the first situation, the eye camera is set inside the host and is separated from the screen; the eye camera is fixedly and separately installed and does not adjust with the screen. A first angle measurement device (i.e., the screen angle measurement device) is provided on the screen rotation axis (i.e., the second rotation axis) and transmits the current screen angle parameters to the processor. The processor corrects the calibration points based on the screen adjustment parameters.
  • in the second situation, the eye camera is set inside the host and is separated from the screen, with a reflection device and a window (i.e., the light-transmitting area) in front of the eye camera; the eye camera and the reflection device are fixedly installed and do not adjust with the screen. A first angle measurement device is provided on the screen rotation axis, and the first angle measurement device transmits the current screen angle parameters to the processor. The processor corrects the calibration points based on the screen adjustment parameters.
  • 3. As shown in Figure 5, the eye camera is arranged inside the host and separately from the screen; the eye camera is movably installed and adjusts synchronously with the screen so that the relative angle between the eye camera and the screen remains unchanged.
  • Because the relative angle between the eye camera and the screen remains unchanged, the position of the calibration point remains unchanged before and after the eye camera and the screen are adjusted. If the position of either the eye camera or the screen changes on its own, the position of the calibration point changes. By correcting the position of the calibration point, the calibration point and the gaze point always maintain a one-to-one correspondence before and after adjustment.
  • A first angle measurement device is provided on the screen rotation shaft, and the first angle measurement device transmits the current screen angle parameter to the processor.
  • The eye camera is connected to the motor through a rotating shaft (i.e., the first rotating shaft), and a second angle measurement device (i.e., the device angle measurement device) is provided at the rotating shaft; the second angle measurement device transmits the current eye camera angle parameter (i.e., the device angle parameter) to the processor.
  • The processor synchronously adjusts the elevation angle of the eye camera according to the screen adjustment parameter and corrects the calibration point.
  • 4. As shown in Figure 5, the eye camera is arranged inside the host and separately from the screen; the eye camera is movably installed and adjusts synchronously with changes in the head position.
  • A first angle measurement device is provided on the screen rotation shaft, and the first angle measurement device transmits the current screen angle parameter to the processor.
  • The eye camera is connected to the motor through a rotating shaft, and a second angle measurement device is provided at the rotating shaft; the second angle measurement device transmits the current eye camera angle parameter to the processor.
  • The processor synchronously adjusts the elevation angle of the eye camera according to changes in the user's head position and corrects the calibration point.
  • the image acquisition device is arranged below the screen, and the image acquisition device and the screen are arranged separately.
  • the installation position of the image acquisition device is therefore more flexible.
  • the image acquisition device is installed in the host casing.
  • the space inside the host casing is larger, making it easier to hide the image acquisition device.
  • even though the image acquisition device is relatively large, it is arranged separately from the screen and therefore does not occupy screen space.
  • the screen can be made thinner and lighter, making the appearance of the electronic device more concise and beautiful.
  • when the image acquisition device is fixedly installed inside the host, the advantage is that there is no moving mechanism, which is more reliable. The gaze point is estimated by passing the measured display screen angle to an algorithm configured to calculate the current intersection of the line of sight with the display screen.
  • when the image acquisition device is movably installed inside the host, the advantage is that it can accommodate a larger range of head movement, and the image acquisition device has a wider acquisition range and greater flexibility. The gaze point is estimated by passing the measured image acquisition device angle and display screen angle to an algorithm configured to calculate the current intersection of the line of sight with the display screen (a minimal sketch of this intersection follows this list).
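Both estimation paths above reduce to intersecting the user's line of sight with a display plane whose orientation is known from the angle measurement devices. The following minimal Python sketch works in a 2-D vertical slice through the hinge; the coordinate frame, the example numbers, and the assumption that the gaze direction is already available (in practice it would be derived from the eye image and, for a movable camera, the device angle parameter) are illustrative assumptions, not the algorithm of the application itself.

```python
import numpy as np

def gaze_point_on_screen(eye_pos, gaze_dir, screen_angle_deg, screen_len_m):
    """Intersect the line of sight with the display in a 2-D vertical slice.

    Illustrative frame: hinge at the origin, x axis along the keyboard toward
    the user, y axis up.  `screen_angle_deg` is the opening angle between the
    host (keyboard plane) and the display.  Returns the distance from the
    hinge to the gaze point measured along the screen, or None if the line
    of sight misses the screen.
    """
    theta = np.deg2rad(screen_angle_deg)
    u = np.array([np.cos(theta), np.sin(theta)])   # unit vector along the screen
    d = np.asarray(gaze_dir, dtype=float)
    d = d / np.linalg.norm(d)                      # unit line-of-sight direction
    e = np.asarray(eye_pos, dtype=float)

    # Solve e + t*d = s*u for t (range along the gaze ray) and s (along screen).
    a = np.column_stack([d, -u])
    try:
        t, s = np.linalg.solve(a, -e)
    except np.linalg.LinAlgError:                  # gaze parallel to the screen
        return None
    if t <= 0 or not (0.0 <= s <= screen_len_m):
        return None
    return s

# Example: eye 35 cm in front of and 25 cm above the hinge, looking slightly
# downward toward a 21 cm tall display opened to 110 degrees.
print(gaze_point_on_screen([0.35, 0.25], [-0.95, -0.15], 110.0, 0.21))
# prints roughly 0.196 (metres from the hinge along the screen)
```

In a real implementation the returned distance along the screen would then be mapped to a pixel coordinate using the display's physical size and resolution.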

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present disclosure provides an electronic device. The electronic device includes a display screen and a host. An image acquisition device is arranged inside the host, and a light source and a light-transmitting area are arranged on the host. The light-transmitting area overlaps with the projection, on the host, of the acquisition range of the image acquisition device. The image acquisition device is configured to acquire, through the light-transmitting area, an eye image of a user using the electronic device, and the eye image includes a light spot formed by the light source. With the image acquisition device arranged inside the host and the light source and light-transmitting area arranged on the host, the electronic device provided by the present disclosure implements an eye-tracking function without increasing the thickness of the display screen of the electronic device.

Description

一种电子设备
本申请要求于2022年08月22日提交中国专利局、申请号为202211007084.2、申请名称“一种电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本公开涉及电子设备技术领域,尤其涉及一种电子设备。
背景技术
眼球追踪通常是指通过测量眼睛的注视点的位置或者眼球相对头部的运动而实现对眼球运动的追踪。通过在电子设备中设置眼球追踪模块,能够追踪用户在看特定目标时眼睛的运动和注视方向,有利于用户与电子设备的交互。
眼球追踪技术可以采用光学记录法实现。光学记录法的原理是:利用红外相机记录被测试者的眼睛运动情况,即获取能够反映眼睛运动的眼部图像,从获取到的眼部图像中提取眼部特征,以建立视线的估计模型。其中,眼部特征可以包括:瞳孔位置、瞳孔形状、虹膜位置、虹膜形状、眼皮位置、眼角位置、光斑位置(或者普尔钦斑)等。光学记录法包括瞳孔-角膜反射法。瞳孔-角膜反射法的原理是,近红外光源照向眼睛,由红外相机对眼部进行拍摄,同时拍摄到光源在角膜上的反射点即光斑,由此获取到带有光斑的眼部图像。
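The pupil-corneal reflection method sketched in the preceding paragraph depends on locating the glint (Purkinje image) of the near-infrared source inside the captured eye image. Purely as an illustration of that step, the fragment below finds the centroid of the saturated spot with plain NumPy; the threshold value, the frame size, and the assumption that the glint is the brightest compact region in the frame are assumptions of this sketch, not details from the application.

```python
import numpy as np

def find_glint(eye_image, threshold=240):
    """Return the (row, col) centroid of the specular glint in an 8-bit
    grayscale eye image, or None if no pixel exceeds the threshold.

    Assumes the near-infrared light source produces a small, nearly
    saturated spot on the cornea that is brighter than everything else.
    """
    img = np.asarray(eye_image, dtype=np.uint8)
    bright = img >= threshold                 # candidate glint pixels
    if not bright.any():
        return None
    rows, cols = np.nonzero(bright)
    return rows.mean(), cols.mean()           # centroid of the bright spot

# Toy example: a dark 120x160 frame with a small bright spot near (40, 80).
frame = np.full((120, 160), 30, dtype=np.uint8)
frame[39:42, 79:82] = 255
print(find_glint(frame))                      # approximately (40.0, 80.0)
```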
现有技术中,在电子设备中设置眼球追踪模块,通常是通过在电子设备的显示屏上设置眼球追踪模块来实现的。
然而,将眼球追踪模块设置在电子设备的显示屏上来实现眼球追踪,会增加显示屏的厚度,进而增加了电子设备的厚度,降低了电子设备的便携性 和用户的体验感。
发明内容
本公开提供了一种电子设备,通过设置在主机内部的图像采集设备和设置在主机上的光源和透光区域,在不增加电子设备显示屏厚度的前提下,实现了电子设备的眼球追踪功能。
第一方面,本公开提供了一种电子设备,所述电子设备,包括:显示屏和主机,所述主机内部设置有图像采集设备,所述主机上设置有光源和透光区域,所述透光区域与所述图像采集设备的采集范围在所述主机上的投影交叠,所述图像采集设备设置为通过所述透光区域采集使用所述电子设备的用户的眼部图像,所述眼部图像包括有由所述光源形成的光斑。
可选地,所述光源的个数为至少两个,各所述光源至少设置在所述图像采集设备的两侧,所述光源发光中心与所述图像采集设备的透镜的焦点设置在同一直线上。
可选地,所述透光区域上设置有透明盖板或滤光片。
可选地,所述图像采集设备与所述主机呈设定角度固定安装。
可选地,所述图像采集设备朝向所述主机上键盘区域所在平面设置,所述图像采集设备直接通过所述透光区域采集所述眼部图像。
可选地,所述主机内设置有反射装置,所述图像采集设备朝向所述反射装置,所述反射装置设置为反射经用户眼部反射的光线。
可选地,所述反射装置包括如下一个或多个:
镜面、棱镜和二向色镜。
可选地,所述图像采集设备活动安装在所述主机内。
可选地,所述电子设备还包括:
电机,所述电机设置在所述主机内,所述电机与所述图像采集设备连接,所述电机设置为调整所述图像采集设备的采集范围。
可选地,所述电子设备还包括:
设备角度测量装置,所述设备角度测量装置与所述图像采集设备连接,所述设备角度测量装置设置为采集所述图像采集设备的设备角度参数,所述设备角度测量装置还设置为将所述设备角度参数传输至处理器,以供所述处理器基于所述设备角度参数进行校准点纠正。
可选地,所述图像采集设备的仰角基于如下一个或多个确定:
用户头部位置;所述显示屏的调整参数,所述调整参数基于当前所述显示屏的屏幕角度参数确定。
可选地,所述显示屏的一端通过第二转轴和所述主机的一端连接,所述第二转轴内设置有屏幕角度测量装置,所述屏幕角度测量装置设置为测量所述显示屏的屏幕角度参数,并将所述屏幕角度参数传输至处理器,以供所述处理器基于所述屏幕角度参数进行校准点纠正。
本公开提供了一种电子设备。所述电子设备包括:显示屏和主机,所述主机内部设置有图像采集设备,所述主机上设置有光源和透光区域,所述透光区域与所述图像采集设备的采集范围在所述主机上的投影交叠,所述图像采集设备设置为通过所述透光区域采集使用所述电子设备的用户的眼部图像,所述眼部图像包括有由所述光源形成的光斑。上述技术方案通过设置在主机内部的图像采集设备和设置在主机上的光源和透光区域,在不增加电子设备显示屏厚度的前提下,实现了电子设备的眼球追踪功能。
应当理解,本部分所描述的内容并非旨在标识本公开的实施例的关键或重要特征,也不用于限制本公开的范围。本公开的其它特征将通过以下的说明书而变得容易理解。
附图说明
为了更清楚地说明本公开中的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本公开的 一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1是根据本公开提供的一种电子设备的结构示意图;
图2是根据本公开提供的图1所示视角针对图像采集设备和透光区域的局部放大图;
图3是根据本公开提供的又一种电子设备的结构示意图;
图4是根据本公开提供的又一种电子设备的结构示意图;
图5是根据本公开提供的图4所示视角针对图像采集设备、透光区域、电机、第一转轴和设备角度测量装置的局部放大图;
图6a是根据本公开提供的用户注视点位置的示意图;
图6b是根据本公开提供的用户的头部位置改变而显示屏的屏幕角度参数不变时电子设备纠正校准点的示意图;
图7a是根据本公开提供的显示屏位置改变之前的示意图;
图7b是根据本公开提供的用户的头部位置不变而显示屏的屏幕角度参数改变时电子设备纠正校准点的示意图。
具体实施方式
下面结合附图和实施例对本公开作进一步的详细说明。可以理解的是,此处所描述的具体实施例仅仅用于解释本公开,而非对本公开的限定。另外还需要说明的是,为了便于描述,附图中仅示出了与本公开相关的部分而非全部结构。
在不冲突的情况下,本公开中的实施例及实施例中的特征可以相互组合。本公开使用的术语“包括”及其变形是开放性包括,即“包括但不限于”。术语“基于”是“至少部分地基于”。术语“一个实施例”表示“至少一个实施例”。
需要注意,本公开中提及的“第一”、“第二”等概念仅用于对相应内 容进行区分,并非用于限定顺序或者相互依存关系。
需要注意,本公开中提及的“一个”、“多个”的修饰是示意性而非限制性的,本领域技术人员应当理解,除非在上下文另有明确指出,否则应该理解为“一个或多个”。
在本公开的描述中,需要说明的是,术语“中心”、“上”、“下”、“左”、“右”、“前”、“后”等指示的方位或位置关系为基于附图所示的方位或位置关系。例如,“上”和“下”是沿纸面的页眉和页脚方向而设定的;“左”和“右”是面向纸面的方向而设定的,“前”是垂直于纸面且从纸背向纸面方向;“后”是垂直于纸面且从纸面向纸背方向。这种设定仅是为了便于描述本公开,而不是指示所指的装置或元件必须具有特定的方位,因此不能理解为对本公开的限制。此外,下面所描述的本公开不同实施方式中所涉及的技术特征或技术方案只要彼此之间未构成冲突就可以相互结合。
实施例
图1是根据本公开提供的一种电子设备的结构示意图,如图1所示,本公开提供了一种电子设备,所述电子设备包括:
显示屏1和主机2,所述主机2内部设置有图像采集设备3,所述主机2上设置有光源4和透光区域5,所述透光区域5与所述图像采集设备3的采集范围在所述主机2上的投影交叠,所述图像采集设备3设置为通过所述透光区域5采集使用所述电子设备的用户的眼部图像,所述眼部图像包括有由所述光源4形成的光斑。
本公开中将图像采集设备3设置在主机2内部,光源4和透光区域5设置在主机2上,使图像采集设备3能够通过透光区域5采集使用电子设备的用户的眼部图像,在不增加电子设备厚度的同时,实现了电子设备的眼球追踪功能。
本实施例以电子设备为笔记本电脑为例,此处不限定电子设备为笔记本电脑,可以为任意需要进行眼球追踪的电子设备。
显示屏1可以包括壳体,以及设置在壳体上的设置为显示的屏幕。
主机2包括一壳体,在壳体上还可以设置有键盘。
显示屏1通过第二转轴和主机2连接,第二转轴可以是指连接显示屏1与主机2的转轴,电子设备通过第二转轴可以使显示屏1与主机2的相对位置呈现不同的夹角,具体可以根据实际需要调整显示屏1和主机2的相对位置。
图像采集设备3的种类此处不作限定,只要能够得到眼部图像即可,示例性的,图像采集设备3包括但不限于:红外摄像设备、红外图像传感器、照相机或摄像机等。
图像采集设备3在主机2内部的设置方式不作限定,只要能够将图像采集设备3设置在主机2内部即可。如,图像采集设备3可以固定设置在主机2内;又如,图像采集设备3可以通过滑轨可移动式的设置在主机2内,当用户相对于电子设备的位置发生变化时,可以通过调整图像采集设备3在滑轨内左右滑动,进而保证在用户相对于电子设备的位置发生变化时图像采集设备3仍可以采集到用户的眼部图像;又如,图像采集设备3可以在主机2内活动设置,通过电机控制图像采集设备3转动,进而调整图像采集设备3采集用户眼部图像的范围。
图像采集设备3在主机2内部的设置位置不作限定,只要能够在用户使用电子设备时,图像采集设备3能够采集到用户的眼部图像即可。其中,眼部图像可以认为是包括用户眼睛的图像。
在一个实施例中,图像采集设备3设置在主机2内部,图像采集设备3所处的位置在第二转轴与键盘之间,距离第二转轴中心位置设定距离的位置处,第二转轴的中心位置可以为第二转轴的中点所在位置。设定距离此处不作限定,具体可以根据电子设备的尺寸确定。
光源4设置在主机2上,以使得光源4发射的光能够经用户眼部反射至电子设备,进而在图像采集设备3所采集的眼部图像中形成光斑。具体的, 在光源4照向用户眼睛后,由图像采集设备3对眼部进行拍照,相应拍摄光源4在角膜上的反射点,即光斑,又称普尔钦斑,由此形成带有光斑的眼部图像。
光源4的波长不作限定,只要能够保证图像采集设备3能够采集到带有光斑的眼部图像即可。
示例性的,光源4可以为处在人眼安全波段的红外光,相对于其他波段的光,在人眼安全波段的红外光照射到眼部时,会被人眼的晶状体中的水体大幅度吸收,使光到达视网膜时的强度大大降低。
设置在主机2上的光源4的排布不做限定,只要能够在眼部图像中存在光斑即可,如以预定的排布方式,如一字型,口字形,圆形等。
光源4的个数不作限定,光源4设置在主机2上的位置不作限定,只要能够保证图像采集设备3能够采集到带有光斑的眼部图像即可。
在一个实施例中,光源4至少一个。不同的光源4可以设置在主机2上的不同位置处,只要能够保证眼部图像中包括有光斑即可。
在一个实施例中,光源4有两个,光源4分别设置在主机2上靠近第二转轴位置处,且两个光源4分别设置在图像采集设备3的左右两端,如两个光源4与图像采集设备3位于同一直线上,图像采集设备3为两个光源4的中点,如图1中所示。
透光区域5设置在主机2上的位置不作限定,只要能够使图像采集设备3通过透光区域5采集使用电子设备的用户的眼部图像即可。如,透光区域5处于图像采集设备3的上方,透光区域5可以设置在与主机2平行的平面内,也可以与主机2平行的平面呈一定的角度,具体需要根据图像采集设备3的拍摄范围确定,最终保证透光区域5的位置覆盖图像采集设备3的拍摄范围即可。
透光区域5与图像采集设备3的采集范围在主机2上的投影交叠,可以理解为,透光区域5处在图像处理设备3的采集范围内,覆盖整个图像采集 设备3到人眼的视场角的拍摄范围。光源4发出的光经过用户眼部反射后,经用户眼部反射的光首先经过透光区域5,再被图像采集设备3所采集。
透光区域5可以透过全部经用户眼部反射的光,也可以只透过与光源4波段相对应经用户眼部反射的光,具体可以根据实际需要来设定。
本公开中将图像采集设备3设置在主机2内部,光源4和透光区域5设置在主机2上,使图像采集设备3能够通过透光区域5采集使用电子设备的用户的眼部图像,实现了电子设备的眼球追踪功能。此外,通过将在图像采集设备3设置在主机2内部,无需增加电子设备中显示屏的厚度,即可实现电子设备的眼球追踪功能。
在一个实施例中,光源4的个数为至少两个,各光源4至少设置在图像采集设备3的两侧,光源4的发光中心与图像采集设备3的透镜的焦点设置在同一直线上。
其中,光源4的个数为至少两个,各光源4至少设置在图像采集设备3的两侧,可以理解为,图像采集设备3的两侧分别至少分布一个光源4,以使光源4的发光中心与图像采集设备3的透镜的焦点设置在同一直线上。
在一个实施例中,光源4有六个,光源4分布在主机2上第二转轴与键盘中间的区域,并围绕图像采集设备3分布。
在一个实施例中,光源4有六个,图像采集设备3的两侧分别分布三个光源4,三个光源4在图像采集设备3的各侧可以均呈一字型排列。
在一个实施例中,光源4有三个,其中,图像采集设备3的两侧分别分布一个光源4,另外一个光源4可以分布在键盘中的任意位置,可以根据实际需要调整光源4在键盘的位置,只要能够保证三个光源4发出的光被用户眼部反射后可以被图像采集设备3所接收即可。
当光源4有多个时,发光中心为阵列所在的几何中心点。
图像采集设备3的透镜的焦点可以是指平行光线经图像采集设备3的透镜折射后的会聚点。当图像采集设备3的透镜为一个时,图像采集设备3的 透镜的焦点为该透镜的焦点;当图像采集设备3的透镜为多个透镜组成的透镜组时,图像采集设备3的透镜的焦点即为透镜组的焦点。
将光源4的发光中心与图像采集设备3的透镜的焦点设置在同一直线上,能够保证经光源4发出的光被用户眼部反射后可以被图像采集设备3所接收。
在一个实施例中,光源4的发光中心与图像采集设备3的透镜的焦点也可以不在一条直线上,只要能够保证经光源4发出的光被用户眼部反射后可以被图像采集设备3所接收即可,具体可以根据实际需要设定光源发光中心的位置。
在一个实施例中,透光区域5上设置有透明盖板或滤光片。
当透光区域5上设置有透明盖板时,透明盖板可以覆盖整个图像采集设备3到人眼的视场角的拍摄范围,且能够透过全部经用户眼部反射的光。光源4发射的光通过用户眼部反射,被用户眼部反射的光线经透明盖板射到图像采集设备3,进而被图像采集设备3接收。通过在透光区域5上设置透明盖板,可以对设置在主机2内部的图像采集设备3起到保护作用,同时也不会遮挡图像采集设备3的采集范围。
当透光区域5上设置有滤光片时,滤光片可以是指用来选取所需辐射波段的光学器件。滤光片的类型不作限定,如可以是薄膜滤光片。
滤光片可以根据光源4的波段确定,使得光源4发出的光经过用户眼部反射后可以通过滤光片再到达图像采集设备3,如光源4为红外光时,经用户眼部反射的光线可以为红外光线,滤光片只能通过与光源4对应波段的红外光,而其他波段的光不能通过滤光片。经用户眼部反射的光线通过滤光片过滤后,使得最终到达图像采集设备3的光线只有光源4对应波段的光,从而使图像采集设备3采集到的眼部图像更加清晰。
透明盖板或滤光片的大小不作限定,只要能够使透明盖板或滤光片覆盖整个图像采集设备3到人眼的视场角的拍摄范围即可。如,透明盖板或滤光片的大小与整个透光区域重合;又如,透明盖板或滤光片的大小大于透光区 域,并覆盖整个透光区域。
本公开中通过在透光区域5上设置透明盖板,实现了对图像采集设备3的保护;通过在透光区域5上设置滤光片,提高了图像采集设备3采集的眼部图像的精度。
在一个实施例中,图像采集设备3与主机2呈设定角度固定安装。
其中,设定角度可以是指设定的图像采集设备3与主机2所在平面的夹角,对设定角度不作限定,不同的设定角度可以使图像采集设备3对应不同的采集范围,设定角度可以为大于等于0度的任意角度,只要保证图像采集设备3能够采集用户眼部图像即可。
图像采集设备3固定安装的方式不作限定,只要能使图像采集设备3固定在主机2内部即可。如,图像采集设备3通过焊接方式固定在主机2内部,与主机2呈设定角度;图像采集设备3通过螺纹连接固定在主机2内部,与主机2呈设定角度。
通过将图像采集设备3与主机2呈设定角度固定安装,没有运动机构结构,可以使图像采集设备3更加可靠的固定在主机2内部,同时在安装图像采集设备3时,通过调整设定角度,还可以使图像采集设备3有不同的采集范围。
在一个实施例中,图像采集设备3朝向主机2上键盘区域所在平面设置,图像采集设备3直接通过透光区域5采集眼部图像。
图像采集设备3朝向主机2上键盘区域所在平面设置,可以理解为,图像采集设备3的拍摄范围在主机2上的投影区域处在主机2上键盘区域所在的平面。
图2是根据本公开提供的图1所示视角针对图像采集设备和透光区域的局部放大图,由图2可知,图像采集设备3朝向主机2上键盘区域6所在的平面设置,用户眼睛所反射的光线,经过透光区域5被图像采集设备3所采集,以形成光斑,即图像采集设备3可以直接通过透光区域5采集眼部图像, 无需增加其余部件,节省了主机2的内部空间。
在一个实施例中,主机2内设置有反射装置,图像采集设备3朝向反射装置,反射装置设置为反射经用户眼部反射的光线。
图3是根据本公开提供的又一种电子设备的结构示意图,在图3中,示出了主机2内部设置的反射装置7。
反射装置7的类型不作限定,只要能够反射经用户眼部反射的光线即可。如可以是包括如下一个或多个:镜面、棱镜和二向色镜,设置为反射经用户眼部反射的光线。
图像采集设备3朝向反射装置7,反射装置7设置为反射经用户眼部反射的光线,可以理解为,光源4发出的光经用户眼部反射后,经过透光区域5到达反射装置7,反射装置7反射经用户眼部反射的光线,再被图像采集设备3所采集。
反射装置7在主机2内部的设置位置不作限定,只要在图像采集设备3采集用户的眼部图像时,反射装置7能够反射经用户眼部反射的光线即可。如,反射装置7设置在主机2内部,反射装置7处在图像采集装置3的采集范围内,且与主机2所在的平面呈一定的夹角,通过改变反射装置7与主机2所在平面的夹角,使反射经用户眼部反射的光线的光路发生改变,进而调整图像采集设备3的采集范围。
透光区域5和反射装置7可以与光源4对应,使得光源4发出的光经过用户眼部反射后可以通过透光区域5再到达反射装置7。
在一个实施例中,光源4为红外光,透光区域5上设置有滤光片,滤光片为红外滤光片,反射装置7为二向色镜。经用户眼部反射的光经过红外滤光片过滤后,只有光源4对应波段的红外光可以到达反射装置7,反射装置7对光源4对应波段的红外光几乎完全透过,而对其他波长的光几乎完全反射,反射装置7反射经用户眼部反射的光,最后被图像采集设备3所采集。因此,经用户眼部反射的光经过红外滤光片和二向色镜之后,最终只有光源4对应 波段的红外光可以被图像采集设备3所采集,使得图像采集设备3采集到的眼部图像更加清晰。
本公开中在主机2内设置反射装置7来反射经用户眼部反射的光线,可以通过改变反射装置7在主机2内所处的位置或反射装置7与主机2所在平面的夹角,使反射经用户眼部反射的光线的光路发生改变,进而使图像采集设备3的采集范围更加灵活。
在一个实施例中,反射装置7包括如下一个或多个:
镜面、棱镜和二向色镜。
其中,当平行入射的光线射到镜面时,仍会平行地向一个方向反射出来。棱镜的类型不作限定,只要能够设置为反射经用户眼部反射的光线即可,如可以是简单棱镜、屋脊棱镜或者复合棱镜等,简单棱镜可以是指所有工作面均与主截面垂直的棱镜,分为一次反射、二次反射和三次反射形式的棱镜;屋脊棱镜可以是指带有屋脊面的棱镜,屋脊面为互相垂直的反射面;复合棱镜可以是指由两个或更多个棱镜组成的棱镜。二向色镜可以是指45度入射或大角度入射时,将入射光分离出特定的光谱进而改变部分光谱的光路方向,其特点是对一定波长的光几乎完全透过,而对其他波长的光几乎完全反射。二向色镜的类型不作限定,如可以是单带通二向色镜或者多带通二向色镜等。
通过选取不同的反射装置7,可以使经用户眼部反射的光线经反射装置7反射后具有不同的性质。
本公开中的图像采集设备3还可以活动设置在主机2内部。
在一个实施例中,图像采集设备3活动安装在主机2内。
图像采集设备3活动安装在主机2内的方式不作限定,只要能够将图像采集设备3活动安装在主机2内即可。如,图像采集设备3通过底座活动安装在主机2内,其中,底座固定在主机2内,图像采集设备3通过转轴连接底座,图像采集设备3可以通过转轴在底座上转动,进而调整图像采集设备3采集用户眼部图像的范围。
在一个实施例中,电子设备还包括:
电机8,电机8设置在主机2内,电机8与图像采集设备3连接,电机8设置为调整图像采集设备3的采集范围。
电机8可以是指依据电磁感应定律实现电能转换或传递的一种电磁装置。
电机8在主机2内部的设置位置不作限定,只要能够使电机8控制图像采集设备3转动即可。如,电机8可以设置在主机2内部靠近第二转轴的区域内,具体根据图像采集设备3在主机2内的设置位置确定电机8的设置位置。
电机8在主机2内部的设置方式不作限定,只要能够使电机8设置在主机2内部即可。如,电机8可以固定设置或活动设置在主机2内部。
电机8的类型本公开不作限定,只要能够控制图像采集设备3转动即可。如可以是直流电机、异步电机或者同步电机等。
电机8与图像采集设备3连接的方式不作限定,如电机8与图像采集设备3通过第一转轴连接。
在一个实施例中,图像采集设备3可以固定在主机2上,电机8也固定在主机2上,电机8和图像采集设备3活动连接。
在一个实施例中,图像采集设备3和电机8固定连接,然后再和主机2活动连接。
图4是根据本公开提供的又一种电子设备的结构示意图,在图4中,示出了图像采集设备3通过电机8活动安装在主机2内,电机8通过第一转轴9与图像采集设备3连接。
第一转轴9可以是指链接电机8与图像采集设备3的设置为转动工作中既承受弯矩又承受扭矩的轴。
本公开中对第一转轴9不作限定,只要能够使电机8通过第一转轴9控制图像采集设备转动即可。如,第一转轴9为一字型转轴。
电机8通过第一转轴9与图像采集设备3连接,设置为控制图像采集设 备3转动,通过电机8控制图像采集设备3转动,调整图像采集设备3的采集范围。
在一个实施例中,电子设备还包括:
设备角度测量装置10,设备角度测量装置10与图像采集设备3连接,设备角度测量装置10设置为采集图像采集设备3的设备角度参数,设备角度测量装置10还设置为将设备角度参数传输至处理器,以供处理器基于设备角度参数进行校准点纠正。
设备角度测量装置10可以是指设置为测量设备角度的装置,设置为采集图像采集设备3的设备角度参数。
设备角度参数可以为表征图像采集设备3角度或角度变换的参数。其中,图像采集设备3的设备角度参数可以包括但不限于图像采集设备3未通过电机8转动之前的第一参数、图像采集设备3通过电机8转动之后的第二参数、图像采集设备3通过电机8转动的角度变化的第三参数和图像采集设备3的设定角度参数,设定角度参数可以是指根据实际需要设定的图像采集设备3的角度参数,可以设置为根据设定角度参数和图像采集设备3通过电机8转动之后的第二参数来度量图像采集设备3转动的角度,如第三参数。通过设备角度测量装置10可以得到图像采集设备3转动的角度,方便对图像采集设备3的拍摄范围进行控制。
第一参数、第二参数和第三参数可以为与角度相关的参数,角度可以为图像采集设备3相对于参照物的角度,参照物不作限定,可以根据实际情况设定。如参照物为主机2。设定角度参数可以为图像采集设备3与主机2的初始角度。
设备角度测量装置10的类型不作限定,只要能够测量图像采集设备3跟随电机8转动的角度即可。如设备角度测量装置10可以是角度传感器,角度传感器中有能够转动的轴,当轴转动时,角度传感器就会根据轴的转动计数,往一个方向转动时,计数增加,转动方向改变时,计数减少。又如,设备角 度测量装置10为电机8,由于图像采集设备3是电机8控制的,故图像采集设备3的角度参数对于电机8是已知的。电机8能够确定图像采集设备3的角度参数。
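To make the device angle parameters described above concrete, the toy class below models a counting angle sensor on the first rotating shaft: the count rises for one direction of rotation and falls for the other, the set angle parameter is the installation angle, and the third parameter is the rotation relative to that set angle. The resolution constant and all names are invented for illustration only.

```python
class ShaftAngleSensor:
    """Toy model of the counting angle sensor described above: the count
    increases for one direction of shaft rotation and decreases for the
    other."""

    COUNTS_PER_DEGREE = 4          # illustrative resolution, not from the source

    def __init__(self, set_angle_deg=0.0):
        self.set_angle_deg = set_angle_deg   # "set angle parameter"
        self._count = 0

    def on_shaft_step(self, direction):
        self._count += 1 if direction > 0 else -1

    def device_angle_deg(self):
        # second parameter: current angle of the image acquisition device
        return self.set_angle_deg + self._count / self.COUNTS_PER_DEGREE

    def rotation_since_set_deg(self):
        # third parameter: rotation relative to the set angle parameter
        return self.device_angle_deg() - self.set_angle_deg


sensor = ShaftAngleSensor(set_angle_deg=15.0)
for _ in range(20):                # the motor turns the shaft 20 steps upward
    sensor.on_shaft_step(+1)
print(sensor.device_angle_deg(), sensor.rotation_since_set_deg())  # 20.0 5.0
```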
设备角度测量装置10与图像采集设备3连接的方式不作限定,只要能够使角度测量装置10能够测量图像采集设备3随电机8转动的角度即可。如,设备角度测量装置10与图像采集设备3和电机8通过第一转轴9连接,当电机8转动时,带动第一转轴9转动,第一转轴9可以带动图像采集设备3转动,进而可以在第一转轴9转动时,设备角度测量装置10根据第一转轴9的转动确定图像采集设备3的设备角度参数。
图5是根据本公开提供的图4所示视角针对图像采集设备、透光区域、电机、第一转轴和设备角度测量装置的局部放大图,由图5可知,图像采集设备3通过电机8活动安装在主机2内,电机8通过第一转轴9与图像采集设备3连接。设备角度测量装置10与图像采集设备3连接,采集图像采集设备3的设备角度参数。当电机8转动时,电机8带动第一转轴9转动,第一转轴9可以带动图像采集设备3转动,使得设备角度测量装置10根据第一转轴9的转动确定图像采集设备3的设备角度参数。
设备角度测量装置10还设置为将所述设备角度参数传输至处理器,以供所述处理器基于所述设备角度参数进行校准点纠正。
校准点可以认为是显示屏1上显示的点,设置为进行视线校准。
设备角度测量装置10将设备角度参数传输至处理器,可以使处理器获取图像采集设备3的转动角度,即图像采集设备3的仰角,进而通过设备角度参数进行校准点纠正。
通过设备角度参数进行校准点纠正的方式不作限定,如,处理器根据设备角度测量装置10上传的设备角度参数获得图像采集设备3转动的角度,进而获得图像采集设备3的中心光轴的角度变化,通过图像采集设备3的中心光轴的角度变化使校准点与注视点在图像采集设备3的中心光轴方向调整前 后始终保持一一对应关系。
通过在图像采集设备3的中心光轴方向调整前后使校准点与注视点保持一一对应,实现了处理器基于设备角度参数对校准点的纠正。
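As a rough numerical illustration of this correction, the sketch below assumes a simplified 2-D geometry in which pitching the camera's central optical axis by a small angle shifts the estimated on-screen point by approximately the eye-to-screen distance times the tangent of that angle, so the correction moves the calibration point back by the same amount. The linearised geometry and the parameter names are assumptions of this sketch, not the derivation used in the disclosure.

```python
import math

def correct_calibration_point(calib_y_m, delta_phi_deg, eye_to_screen_m):
    """Move the calibration point back onto the gaze point after the image
    acquisition device has pitched by `delta_phi_deg` (the change in the
    device angle parameter reported by the device angle measurement device).

    Small-angle 2-D approximation: rotating the central optical axis by
    delta_phi shifts the estimated on-screen point by roughly
    eye_to_screen * tan(delta_phi) along the screen's vertical direction,
    so the correction subtracts that shift.
    """
    shift = eye_to_screen_m * math.tan(math.radians(delta_phi_deg))
    return calib_y_m - shift

# Example: the camera tilted up by 3 degrees while the user kept looking at
# the same spot on a screen about 0.5 m away.
print(correct_calibration_point(calib_y_m=0.12, delta_phi_deg=3.0,
                                eye_to_screen_m=0.5))   # about 0.094
```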
在一个实施例中,图像采集设备3的仰角基于如下一个或多个确定:
用户头部位置;显示屏1的调整参数,调整参数基于当前显示屏1的屏幕角度参数确定。
其中,用户的头部位置可以是指用户头部相对于显示屏1的位置。
显示屏1的屏幕角度参数可以是指显示屏1相对于主机2的角度参数,屏幕角度参数可以包括但不限于显示屏1相对于主机2的角度发生变化之前的第一角度参数、显示屏1相对于主机2的角度发生变化之后的第二角度参数和显示屏1相对于主机2的角度为设定角度时的第三角度参数。其中,设定角度可以是根据实际需要设定的显示屏1相对于主机2的角度,如电子设备关闭时对应显示屏1相对于主机2的设定角度为0。
通过屏幕角度参数可以确定显示屏1的调整参数,即调整显示屏1时相对于主机2的角度变化,或相对于调整前的显示屏1的角度变化,或相对于设定角度显示屏1的角度变化。
当用户的头部位置发生改变,而显示屏1的调整参数未发生变化时,图像采集设备3为了达到更好的采集效果,可以根据用户的头部位置变化同步调整图像采集设备3的仰角。当图像采集设备3的仰角发生变化时,处理器根据图像采集设备3的仰角确定的设备角度参数进行校准点纠正。
当用户的头部位置不变,显示屏1的调整参数变化时,可以根据显示屏1的调整参数同步调整图像采集设备3的仰角,使图像采集设备3在显示屏1的调整参数变化的过程中始终可以保证对用户眼部图像的采集。
当用户的头部位置和显示屏1的调整参数均发生变化时,可以根据用户的头部位置和显示屏1的调整参数同步调整图像采集设备3的仰角。
通过用户头部位置和显示屏1的调整参数中的一个或多个确定图像采集 设备3的仰角,可以在用户头部位置和/或显示屏1的调整参数发生改变时,同步调整图像采集设备3的仰角,使图像采集设备3始终可以保证对用户眼部图像的采集。
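A minimal sketch of the synchronous elevation adjustment described above: the target elevation commanded to the motor follows the change in the screen angle parameter (which keeps the relative angle between the image acquisition device and the display constant) and optionally adds an offset derived from the detected head position. The additive combination and all names are illustrative assumptions, not the control law of the disclosure.

```python
def target_camera_elevation(base_elev_deg, screen_angle_deg,
                            screen_angle_ref_deg, head_offset_deg=0.0):
    """Elevation angle to command to the motor on the first rotating shaft.

    Following the screen angle parameter by the same delta keeps the
    relative angle between the image acquisition device and the display
    unchanged; `head_offset_deg` models an extra correction derived from
    the detected head position.
    """
    screen_delta = screen_angle_deg - screen_angle_ref_deg
    return base_elev_deg + screen_delta + head_offset_deg


# The display is opened from 100 to 115 degrees and the head rises slightly:
print(target_camera_elevation(base_elev_deg=25.0,
                              screen_angle_deg=115.0,
                              screen_angle_ref_deg=100.0,
                              head_offset_deg=2.0))   # 42.0
```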
在一个实施例中,显示屏1的一端通过第二转轴和主机2的一端连接,第二转轴内设置有屏幕角度测量装置11,屏幕角度测量装置11设置为测量显示屏1的屏幕角度参数,并将屏幕角度参数传输至处理器,以供处理器基于屏幕角度参数进行校准点纠正。
屏幕角度测量装置11可以是指设置为测量显示屏1与主机2相对角度的装置,通过屏幕角度测量装置11可以获取显示屏1的屏幕角度参数。
屏幕角度测量装置11的类型不作限定,只要能够确定显示屏1的屏幕角度参数即可,如可以是角度传感器。屏幕角度测量装置11与设备角度测量装置10可以是相同的角度测量装置,也可以是不同的角度测量装置,只要能够实现角度测量装置对应的作用即可。
屏幕角度测量装置11在第二转轴内设置的位置不作限定,只要能够确定显示屏1的屏幕角度参数即可。如可以是屏幕角度测量装置11设置在第二转轴的最右端;又如,屏幕角度测量装置11设置在第二转轴的最左端。
屏幕角度测量装置11将屏幕角度参数传输至处理器,可以是处理器获取显示屏1相对于主机2角度的变化,进而通过屏幕角度参数进行校准点纠正。
通过屏幕角度参数进行校准点纠正的方式不作限定,如,当显示屏1的屏幕角度参数发生变化时,相应的,显示屏1上的校准点跟随显示屏1的角度变化发生改变,用户在显示屏1上的注视点也发生改变。电子设备根据屏幕角度测量装置11确定显示屏1的转动角度,推导出发生改变之后的校准点与改变之后的注视点的位置关系,并将发生改变之后的校准点移动至改变之后的注视点对应的位置,从而使注视点与校准点在显示屏1角度位置调整前后始终保持一一对应关系。
在屏幕角度参数改变前后使注视点与校准点始终保持一一对应关系,实 现了处理器基于屏幕角度参数对校准点的纠正。
当用户的头部位置不变,显示屏1的调整参数变化时,可以根据显示屏1的调整参数同步调整图像采集设备3的仰角,使图像采集设备3在显示屏1的调整参数变化的过程中始终可以保证对用户眼部图像的采集。处理器根据图像采集设备3的仰角确定的设备角度参数和屏幕角度测量装置11测量的显示屏1的屏幕角度参数进行校准点纠正。
当用户的头部位置和显示屏1的调整参数均发生变化时,可以根据用户的头部位置和显示屏1的调整参数调整图像采集设备3的仰角。处理器根据图像采集设备3的仰角确定的设备角度参数和屏幕角度测量装置11测量的显示屏1的屏幕角度参数进行校准点纠正。
本公开中的电子设备均可通过设置设备角度测量装置10测量图像采集设备3的设备角度参数,设置屏幕角度测量装置测量显示屏1的屏幕角度参数,处理器可以根据设备角度参数和/或屏幕角度参数进行校准点纠正。
在一个实施例中,图6a和图6b所示为用户的头部位置改变而显示屏1的屏幕角度参数不变时电子设备纠正校准点的示意图,图6a是根据本公开提供的用户注视点位置的示意图;图6b是根据本公开提供的用户的头部位置改变而显示屏的屏幕角度参数不变时电子设备纠正校准点的示意图。
如图6a所示,用户的头部位置为头部位置16,用户在显示屏1上的注视点为注视点12,校准点为校准点13,设置为用户注视显示屏1时,对用户的注视点位置进行校准,使注视点12与校准点13最终处于显示屏1上的同一位置。图6a还示出了用户的视线14和图像采集设备3的中心光轴15。其中,用户的视线14为用户的视线由眼部到达显示屏1的路径。
如图6b所示,当用户的头部位置由头部位置16移动到头部位置17时,相应的,用户的视线由视线14变为视线18,图像采集设备3的中心光轴由中心光轴15变为中心光轴19。
假设头部移动,显示屏1和显示屏1所显示画面固定不动,视线在显示 屏1上的注视点不变,当用户的头部位置由头部位置16移动到头部位置17,为了达到更好的采集效果,电子设备根据用户头部位置变化,同步调整图像采集设备3的仰角,即电子设备控制电机8带动图像采集设备3转动,调整图像采集设备3中心光轴的方向。
当图像采集设备3的角度变化时,校准点会跟随图像采集设备3偏转,即图像采集设备3的中心光轴发生改变,校准点13会跟随图像采集设备3偏转至校准点20处。电子设备根据设备角度测量装置10测量图像采集设备3的设备角度参数获得图像采集设备3转动的角度,进而获得图像采集设备3的中心光轴的角度变化,推导出校准点20的位置与注视点12的位置关系,并将校准点20移动至注视点12处,从而使校准点20与注视点12在图像采集设备3的中心光轴方向调整前后始终保持一一对应关系。通过在图像采集设备3的中心光轴方向调整前后使校准点与注视点保持一一对应,实现了处理器基于设备角度参数对校准点的纠正。
在一个实施例中,图7a和图7b所示为用户的头部位置不变而显示屏的屏幕角度参数改变时电子设备纠正校准点的示意图,图7a是根据本公开提供的显示屏位置改变之前的示意图;图7b是根据本公开提供的用户的头部位置不变而显示屏的屏幕角度参数改变时电子设备纠正校准点的示意图。
如图7a所示,显示屏位置改变之前为位置23,用户在显示屏上的注视点为注视点21,校准点为校准点22,用户的视线为24、图像采集设备3的中心光轴为25。
如图7b所示,假设用户的头部位置不变,视线方向固定不变,用户的视线为24,图像采集设备3的中心光轴为25,当显示屏的位置由23调整为26时,显示屏上的校准点跟随显示屏角度调节由校准点22调整至校准点29,用户视线在屏幕上的注视点由注视点21变为注视点27。电子设备根据屏幕角度测量装置11测量的显示屏的屏幕角度参数获得显示屏的调整角度,推导出校准点29与注视点27的位置关系,并将校准点由校准点29移动到校准点 28处,使注视点与校准点在显示屏角度位置调整前后始终保持一一对应关系。在屏幕角度参数改变前后使注视点与校准点始终保持一一对应关系,实现了处理器基于屏幕角度参数对校准点的纠正。
以下对本公开进行示例性描述,
本公开提供的电子设备包括显示屏、主机、眼图相机模组(即眼图相机,又称图像采集设备)、反射装置、眼图相机窗口(即透光区域)、红外灯(即光源)。
眼图相机设置在主机内部结构空间内,与屏幕(即显示屏)分体设置。
眼图相机直接捕获眼部图像,或者,眼图相机通过反射装置反射捕获眼部图像。
反射装置可以是镜面、棱镜或二向色镜。
红外灯至少有一组,设置在主机上,设置在眼图相机两侧。红外灯发光中心与眼图相机的透镜的焦点被设置在同一轴线上。
眼图相机与显示屏呈一定角度安装,反射装置与眼图相机呈一定夹角安装。
红外滤光片设置在眼图相机与人眼之间的光路上。
在一个实施例中,眼图相机还可以设置在主机内部,眼图相机与屏幕分体设置。眼图相机分以下四种情况被设置:
1.如图2所示,眼图相机设置在主机内部,与屏幕分体设置,眼图相机单独固定安装,眼图相机不跟随屏幕调节。
屏幕转轴(即第二转轴)上设置有第一角度测量装置(即屏幕角度测量装置),第一角度测量装置将当前屏幕角度参数传输至处理器。处理器根据屏幕调整参数,纠正校准点。
2.如图3所示,眼图相机设置在主机内部,与屏幕分体设置,眼图相机前方设置有反射装置和窗口(即透光区域),眼图相机和反射装置固定安装,不跟随屏幕调节。
屏幕转轴上设置有第一角度测量装置,第一角度测量装置将当前屏幕角度参数传输至处理器。处理器根据屏幕调整参数,纠正校准点。
3.如图5所示,眼图相机设置在主机内部,与屏幕分体设置,眼图相机活动安装,眼图相机跟随屏幕相对同步调节,以保证眼图相机与屏幕间的相对角度不变。基于眼图相机与屏幕间的相对角度不变,校准点在眼图相机和屏幕调节前后的位置保持不变。若眼图相机和屏幕中的任意一个位置发生改变,则校准点的位置将发生改变。通过纠正校准点的位置实现校准点和注视点在调节前后始终保持一一对应关系。
屏幕转轴上设置有第一角度测量装置,第一角度测量装置将当前屏幕角度参数传输至处理器。眼图相机通过转轴(即第一转轴)与电机连接,转轴处设置有第二角度测量装置(即设备角度测量装置),第二角度测量装置将当前眼图相机角度参数(即设备角度参数)传输至处理器。处理器根据屏幕调整参数,同步调整眼图相机仰角,纠正校准点。
4.如图5所示,眼图相机设置在主机内部,与屏幕分体设置,眼图相机活动安装,眼图相机跟随头部位置变化而同步调节。
屏幕转轴上设置有第一角度测量装置,第一角度测量装置将当前屏幕角度参数传输至处理器。眼图相机通过转轴与电机连接,转轴处设置有第二角度测量装置,第二角度测量装置将当前眼图相机角度参数传输至处理器。处理器根据用户头部位置变化,同步调整眼图相机仰角,纠正校准点。
本公开提供的电子设备通过将图像采集设备设置在屏幕下方,图像采集设备与屏幕分体设置。图像采集设备的设置位置更加灵活,图像采集设备安装在主机壳体内,主机壳体内空间较大,更容易隐藏图像采集设备;图像采集设备体积较大,与屏幕分体设置,不占用屏幕空间,屏幕可以做的比较轻薄,使电子设备的外观更简洁美观。
本公开中,图像采集设备固定安装在主机内部时,好处是没有运动机构结构,更加可靠。通过测量显示屏角度传递给算法设置为计算视线与显示屏 的当前交点的方式估算注视点。
本公开中,图像采集设备活动安装在主机内部时,好处是适应更大的头动范围,图像采集设备的采集范围更广,灵活性更大。通过测量图像采集设备和显示屏角度传递给算法设置为计算视线与显示屏的当前交点的方式估算注视点。
需要说明的是,上述仅为本公开的较佳实施例及所运用技术原理。本领域技术人员会理解,本公开不限于这里所述的特定实施例,对本领域技术人员来说能够进行各种明显的变化、重新调整和替代而不会脱离本公开的保护范围。因此,虽然通过以上实施例对本公开进行了较为详细的说明,但是本公开不仅仅限于以上实施例,在不脱离本公开构思的情况下,还可以包括更多其他等效实施例,而本公开的范围由所附的权利要求范围决定。

Claims (12)

  1. 一种电子设备,包括:显示屏和主机,所述主机内部设置有图像采集设备,所述主机上设置有光源和透光区域,所述透光区域与所述图像采集设备的采集范围在所述主机上的投影交叠,所述图像采集设备设置为通过所述透光区域采集使用所述电子设备的用户的眼部图像,所述眼部图像包括有由所述光源形成的光斑。
  2. 根据权利要求1所述的电子设备,其中,所述光源的个数为至少两个,各所述光源至少设置在所述图像采集设备的两侧,所述光源发光中心与所述图像采集设备的透镜的焦点设置在同一直线上。
  3. 根据权利要求1所述的电子设备,其中,所述透光区域上设置有透明盖板或滤光片。
  4. 根据权利要求1所述的电子设备,其中,所述图像采集设备与所述主机呈设定角度固定安装。
  5. 根据权利要求4所述的电子设备,其中,所述图像采集设备朝向所述主机上键盘区域所在平面设置,所述图像采集设备直接通过所述透光区域采集所述眼部图像。
  6. 根据权利要求1所述的电子设备,其中,所述主机内设置有反射装置,所述图像采集设备朝向所述反射装置,所述反射装置设置为反射经用户眼部反射的光线。
  7. 根据权利要求6所述的电子设备,所述反射装置包括如下一个或多个:
    镜面、棱镜和二向色镜。
  8. 根据权利要求1所述的电子设备,其中,所述图像采集设备活动安装在所述主机内。
  9. 根据权利要求8所述的电子设备,其中,还包括:
    电机,所述电机设置在所述主机内,所述电机与所述图像采集设备连接,所述电机设置为调整所述图像采集设备的采集范围。
  10. 根据权利要求8所述的电子设备,其中,还包括:
    设备角度测量装置,所述设备角度测量装置与所述图像采集设备连接,所述设备角度测量装置设置为采集所述图像采集设备的设备角度参数,所述设备角度测量装置还设置为将所述设备角度参数传输至处理器,以供所述处理器基于所述设备角度参数进行校准点纠正。
  11. 根据权利要求9所述的电子设备,其中,所述图像采集设备的仰角基于如下一个或多个确定:
    用户头部位置;所述显示屏的调整参数,所述调整参数基于当前所述显示屏的屏幕角度参数确定。
  12. 根据权利要求1至11中任意一项所述的电子设备,其中,所述显示屏的一端通过第二转轴和所述主机的一端连接,所述第二转轴内设置有屏幕角度测量装置,所述屏幕角度测量装置设置为测量所述显示屏的屏幕角度参数,并将所述屏幕角度参数传输至处理器,以供所述处理器基于所述屏幕角度参数进行校准点纠正。
PCT/CN2023/114070 2022-08-22 2023-08-21 一种电子设备 WO2024041488A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211007084.2 2022-08-22
CN202211007084.2A CN117666706A (zh) 2022-08-22 2022-08-22 一种电子设备

Publications (1)

Publication Number Publication Date
WO2024041488A1 true WO2024041488A1 (zh) 2024-02-29

Family

ID=90012510

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/114070 WO2024041488A1 (zh) 2022-08-22 2023-08-21 一种电子设备

Country Status (2)

Country Link
CN (1) CN117666706A (zh)
WO (1) WO2024041488A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101813976A (zh) * 2010-03-09 2010-08-25 华南理工大学 基于soc的视线跟踪人机交互方法及装置
CN103677270A (zh) * 2013-12-13 2014-03-26 电子科技大学 一种基于眼动跟踪的人机交互方法
CN106873774A (zh) * 2017-01-12 2017-06-20 北京奇虎科技有限公司 基于视线跟踪的交互控制方法、装置及智能终端
CN107357429A (zh) * 2017-07-10 2017-11-17 京东方科技集团股份有限公司 用于确定视线的方法、设备和计算机可读存储介质
CN109857253A (zh) * 2019-02-03 2019-06-07 北京七鑫易维信息技术有限公司 一种眼球追踪装置及方法
US10319154B1 (en) * 2018-07-20 2019-06-11 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for dynamic vision correction for in-focus viewing of real and virtual objects
CN113808160A (zh) * 2021-08-05 2021-12-17 虹软科技股份有限公司 视线方向追踪方法和装置

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101813976A (zh) * 2010-03-09 2010-08-25 华南理工大学 基于soc的视线跟踪人机交互方法及装置
CN103677270A (zh) * 2013-12-13 2014-03-26 电子科技大学 一种基于眼动跟踪的人机交互方法
CN106873774A (zh) * 2017-01-12 2017-06-20 北京奇虎科技有限公司 基于视线跟踪的交互控制方法、装置及智能终端
CN107357429A (zh) * 2017-07-10 2017-11-17 京东方科技集团股份有限公司 用于确定视线的方法、设备和计算机可读存储介质
US10319154B1 (en) * 2018-07-20 2019-06-11 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for dynamic vision correction for in-focus viewing of real and virtual objects
CN109857253A (zh) * 2019-02-03 2019-06-07 北京七鑫易维信息技术有限公司 一种眼球追踪装置及方法
CN113808160A (zh) * 2021-08-05 2021-12-17 虹软科技股份有限公司 视线方向追踪方法和装置

Also Published As

Publication number Publication date
CN117666706A (zh) 2024-03-08

Similar Documents

Publication Publication Date Title
TWI569040B (zh) 自動調焦頭戴式顯示裝置
CN106797422B (zh) 视线检测用的装置
US20190353894A1 (en) Method of utilizing defocus in virtual reality and augmented reality
US7364297B2 (en) Digital documenting ophthalmoscope
US20100278394A1 (en) Apparatus for Iris Capture
CN209117975U (zh) 一种眼球追踪模组和头戴式显示设备
US10488917B2 (en) Gaze-tracking system and method of tracking user's gaze using reflective element
US9004684B2 (en) Fundus camera
WO2015014058A1 (zh) 眼睛光学参数检测系统及眼睛光学参数检测方法
CN106796386B (zh) 投影型显示装置
WO2014041248A1 (en) Gaze guidance arrangement
EP0411475B1 (en) Photoscreening camera system
WO2014075557A1 (zh) 人脸识别装置
CN102657516A (zh) 视网膜自动成像系统
CN218273337U (zh) 头戴式显示设备
WO2024041488A1 (zh) 一种电子设备
KR101780669B1 (ko) 단일 카메라를 이용한 양안 촬영 장치
JP4148700B2 (ja) 目画像撮像装置
WO2010056542A1 (en) Apparatus for iris capture
CN117687475A (zh) 一种可折叠的电子设备
WO2023246816A1 (zh) 一种眼球追踪光学系统及头戴式设备
WO2023246814A1 (zh) 一种眼球追踪光学系统及头戴式设备
KR20200140168A (ko) 안면 피부 미용기기
WO2023246815A1 (zh) 一种眼球追踪光学系统及头戴式设备
JP2021062162A (ja) 走査型眼底撮影装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23856578

Country of ref document: EP

Kind code of ref document: A1