WO2020103763A1 - Method for controlling a display screen according to the eyeball focus point, and head-mounted electronic device - Google Patents

Method for controlling a display screen according to the eyeball focus point, and head-mounted electronic device

Info

Publication number
WO2020103763A1
Authority
WO
WIPO (PCT)
Prior art keywords
display screen
display
electronic device
user
focus
Prior art date
Application number
PCT/CN2019/118623
Other languages
English (en)
Chinese (zh)
Inventor
周国名
陈健
臧旭
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to EP19886694.9A priority Critical patent/EP3862845B1/fr
Priority to FIEP19886694.9T priority patent/FI3862845T3/fi
Priority to US17/295,699 priority patent/US20220019282A1/en
Publication of WO2020103763A1 publication Critical patent/WO2020103763A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26Power supply means, e.g. regulation thereof
    • G06F1/32Means for saving power
    • G06F1/3203Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3206Monitoring of events, devices or parameters that trigger a change in power modality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26Power supply means, e.g. regulation thereof
    • G06F1/32Means for saving power
    • G06F1/3203Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3206Monitoring of events, devices or parameters that trigger a change in power modality
    • G06F1/3231Monitoring the presence, absence or movement of users
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26Power supply means, e.g. regulation thereof
    • G06F1/32Means for saving power
    • G06F1/3203Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234Power saving characterised by the action undertaken
    • G06F1/325Power saving in peripheral device
    • G06F1/3265Power saving in display device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038Indexing scheme relating to G06F3/038
    • G06F2203/0381Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer

Definitions

  • the present application relates to the field of communication technology, and in particular, to a method for controlling a display screen according to eye focus and a head-mounted electronic device.
  • an augmented reality (AR) device can display virtual images to the user while the user watches a real-world scene, and the user can also interact with the virtual images to achieve an augmented reality effect.
  • power consumption and heat dissipation are among the technical bottlenecks of head-mounted electronic devices.
  • the display screen accounts for the main part of the power consumption and heat dissipation of a head-mounted electronic device.
  • the image displayed on the display screen of a head-mounted electronic device can interfere with the user's view of the real world; in particular, when the background color of the image is bright, such as white, it is difficult for the user to see the real world behind the image clearly.
  • the present application provides a method for controlling a display screen according to the eyeball focus point, and a head-mounted electronic device, which save power and reduce the impact of displayed images on viewing the real world.
  • the present application provides a method for controlling a display screen according to the eyeball focus point. The method is applied to a head-mounted electronic device that includes a display screen, and the display screen is transparent when turned off. The method includes: displaying an image on the display screen while the focus of the user's eyeball is within a first distance range; and when it is detected that the focus of the user's eyeball has not been within the first distance range for a duration greater than or equal to a first duration, turning off the display screen.
  • the above method for controlling the display screen according to the focus of the eyeball is implemented.
  • When the display screen is turned off, the user can view physical objects through the transparent display screen. This reduces the influence of the displayed image on the user's view of the real world and also reduces the power consumption of the display screen.
  • the display screen is turned off when one or more of the following conditions occur:
  • a. The processor calls the focal length detection sensor and detects that the duration for which the focus of the user's eyeball falls on the fixed focus of the display screen is below a first threshold; the first threshold may be, for example, 1 second.
  • b. The processor calls the focal length detection sensor and detects that the frequency with which the focus of the user's eyeball leaves the fixed focus of the display screen exceeds a second threshold; the second threshold may be, for example, twice per minute.
  • c. The processor calls the focal length detection sensor and detects that the focus of the user's eyeball has stayed away from the fixed focus of the display screen for longer than a first duration; the first duration may be, for example, 1 minute.
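The three off-trigger conditions above can be sketched as a single predicate. The function below is an illustrative sketch, not part of the claimed method; the parameter names and default values simply mirror the example thresholds in the text (1 second, twice per minute, 1 minute).

```python
def should_turn_off(dwell_s, refocus_per_min, away_s,
                    first_threshold_s=1.0,
                    second_threshold_per_min=2.0,
                    first_duration_s=60.0):
    """Return True if any of the three example off-conditions holds.

    dwell_s:          how long the eye focus has rested on the display's
                      fixed focus (condition a: below the first threshold)
    refocus_per_min:  how often the focus leaves the fixed focus
                      (condition b: above the second threshold)
    away_s:           how long the focus has stayed away from the fixed
                      focus (condition c: longer than the first duration)
    """
    return (dwell_s < first_threshold_s
            or refocus_per_min > second_threshold_per_min
            or away_s > first_duration_s)
```

Any one condition being true suffices, matching the "one or more of the following conditions" wording above.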
  • the method further includes: turning on the display screen when it is detected that the focus of the user's eyeball has fallen within the first distance range for a duration greater than or equal to a second duration.
  • That is, when the focus of the user's eyeball returns to the fixed focus of the display screen, the display screen is turned on.
  • This improves the operating convenience of the head-mounted electronic device.
  • the display screen is turned on when one or more of the following conditions occur:
  • a. The processor calls the focal length detection sensor and detects that the focus of the user's eyeball has fallen on the fixed focus of the display screen for longer than the second duration; the second duration may be, for example, 1 second.
  • b. The processor calls the focal length detection sensor and detects that the focus of the user's eyeball falls on the fixed focus of the display screen with a frequency of more than twice per minute.
  • c. The processor calls the focal length detection sensor and detects that the focus of the user 200's eyeball has fallen near the fixed focus of the display screen for longer than 1 second.
  • d. The processor calls the focal length detection sensor and detects that the focus of the user's eyeball has stayed outside the fixed focus of the display screen for less than 1 second.
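Taken together, the off- and on-conditions describe a hysteresis: the screen turns off only after the focus has stayed out of the first distance range for the first duration, and turns back on once the focus dwells in range for the second duration. The class below is a minimal sketch of that behaviour under stated assumptions (periodic focus samples with a known time step); the class and method names are invented for illustration.

```python
class DisplayGate:
    """Hysteresis between display off and on based on eye-focus dwell time.

    The 1 s on-threshold and 60 s off-threshold mirror the example values
    in the text; they are illustrative, not fixed by the method.
    """
    def __init__(self, on_after_s=1.0, off_after_s=60.0):
        self.on_after_s = on_after_s
        self.off_after_s = off_after_s
        self.display_on = True
        self._in_range_s = 0.0    # time focus has been in the first distance range
        self._out_range_s = 0.0   # time focus has been outside it

    def update(self, focus_in_range, dt_s):
        """Feed one focus sample covering dt_s seconds; return display state."""
        if focus_in_range:
            self._in_range_s += dt_s
            self._out_range_s = 0.0
        else:
            self._out_range_s += dt_s
            self._in_range_s = 0.0
        if self.display_on and self._out_range_s >= self.off_after_s:
            self.display_on = False
        elif not self.display_on and self._in_range_s >= self.on_after_s:
            self.display_on = True
        return self.display_on
```

Resetting the opposite counter on every sample means brief glances do not accumulate toward either threshold.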
  • the display screen is a liquid crystal display screen, and turning off the display screen while playing a video on the display screen includes turning off the backlight of the display screen.
  • In this way, when the user needs to look at a physical object while watching a video, the video playback is paused and the display is turned off, which reduces power consumption and makes it more convenient for the user to watch the video.
  • Because only the display is turned off and the screen driver IC is not, the driver IC does not need to be initialized and configured again when video playback resumes, which improves the response speed of resuming playback.
  • Playing a video on the display screen means that the display refreshes and displays a sequence of different images over time.
  • Pausing video playback means that the image displayed on the display no longer changes with time; the image subsequently displayed is the image that was being displayed at the moment of pausing.
  • closing the display screen may refer to closing the first area on the display screen.
  • Opening the display screen may refer to opening the first area of the display screen.
  • The first area is turned off, and the image displayed in the display area other than the first area does not change, until the display screen resumes playing the video.
  • the electronic device can also detect the depth of focus of the user's eyeball.
  • the electronic device can also detect the angle of focus of the user's eyeball, that is, the gaze direction of the eyeball.
  • the first area on the display screen that is turned off may be the area where the direction of the eye's viewing angle falls on the display screen.
  • the first area on the display screen that is turned off may also be an area where the eyeball is projected vertically on the display screen.
  • when it is detected that the focus of the user's eyeball has not been within the first distance range for a duration greater than or equal to the first duration, the head-mounted electronic device may perform any one of the following: reduce the display brightness; turn off the display; move the area of the image on the display; or move and reduce the area of the image on the display.
  • the display screen is a liquid crystal display screen, and turning off the display screen while displaying navigation information on the display screen includes turning off the backlight of the display screen and one or more of the following: the display panel, the driving circuit of the backlight, and the driving circuit of the display screen.
  • the backlight of the display, the display panel and the driver IC can be turned off to further save power consumption.
  • the display screen is an organic light-emitting diode display screen, and turning off the display screen includes turning off the display panel of the display screen.
  • In this way, when the user needs to look at a physical object while watching a video, the video playback is paused and the display is turned off, which reduces power consumption and makes it more convenient for the user to watch the video.
  • Because only the display panel is turned off and the screen driver IC is not, the driver IC does not need to be initialized and configured again when video playback resumes, which improves the response speed of resuming playback.
  • the display screen is an organic light-emitting diode display screen, and turning off the display screen includes turning off the driving circuit of the display screen and the display panel of the display screen.
  • both the display panel and the driver IC can be turned off to further save power consumption.
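The embodiments above pair each panel technology with a different set of components to power down, trading deeper power savings against resume latency (keeping the driver IC powered avoids re-initialisation). The mapping can be sketched as follows; the component label strings are illustrative, not identifiers from the application.

```python
def components_to_power_down(panel, quick_resume):
    """Which display components to switch off, per the embodiments above.

    panel:        "lcd" or "oled"
    quick_resume: True  -> keep the driver IC powered so playback can
                           resume without re-initialisation
                  False -> also power down the driver circuitry
    """
    if panel == "lcd":
        parts = ["backlight"]               # video-pause case: backlight only
        if not quick_resume:
            # navigation case: backlight plus panel and both driver circuits
            parts += ["display_panel", "backlight_driver", "screen_driver_ic"]
    elif panel == "oled":
        parts = ["display_panel"]           # OLED has no backlight
        if not quick_resume:
            parts += ["screen_driver_ic"]
    else:
        raise ValueError("unknown panel type: " + panel)
    return parts
```

The `quick_resume` flag captures the trade-off stated in the text: turning off more components saves more power, at the cost of re-initialising the driver IC on wake.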
  • the method further includes turning off the display screen when any one or more of the following is detected: the focus of the user's eyeball has not been within the first distance range for a duration greater than or equal to the first duration; a pressing operation of the first key; a first gesture; a first voice signal.
  • the first gesture may be, for example, one or more of the following gestures: scissors hand, fist, finger snapping, OK gesture, etc.
  • the correspondence between the first key and the instruction to close the display screen may be preset by the system, or may be set in response to the user's operation.
  • the correspondence between the first gesture and the instruction to close the display screen may also be preset by the system, or may be set in response to the user's operation.
  • the correspondence between the first voice signal and the instruction to turn off the display screen may be preset by the system, or may be set in response to the user's operation.
  • the method further includes turning on the display screen when any one or more of the following is detected: the focus of the user's eyeball has fallen within the first distance range for a duration greater than or equal to the second duration; a pressing operation of the second key; a second gesture; a second voice signal.
  • the correspondence between the second key and the instruction to open the display screen may be preset by the system, or may be set in response to the user's operation.
  • the correspondence between the second gesture and the instruction to open the display screen may also be preset by the system, or may be set in response to the user's operation.
  • the correspondence between the second voice signal and the instruction to open the display screen may be preset by the system, or may be set in response to the user's operation.
  • the second gesture may be, for example, one or more of the following gestures: scissors hand, fist, finger snapping, OK gesture, etc.
  • the user operation may also be a touch operation on the display screen or a brain wave signal.
  • the head-mounted electronic device can also use the camera to detect the eyeball in a specific state, for example, when the eyeball rolls up, rolls down, or blinks quickly, and turn the display off or on accordingly.
  • the head-mounted electronic device can also use the speed sensor to detect that the movement speed exceeds a speed threshold, and then turn off the display screen or move the navigation information displayed on the display screen to the side of the display screen.
  • Turning off the display screen in this case reduces the risk that the image displayed on the display screen endangers a user who is moving too fast in a navigation scenario, providing convenience for the user.
  • the head-mounted electronic device can also determine, according to the device's positioning, that there is no intersection within a distance threshold in the direction of movement along the current walking route, for example no intersection within 1 km, and then turn off the display screen.
  • When no intersection requiring the user's attention is being approached, turning off the display screen saves power consumption of the head-mounted electronic device and reduces the obstruction of the user's line of sight by the displayed image, providing convenience for the user.
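The intersection check described above reduces, under simple assumptions, to a look-ahead test along the route. In the sketch below, route positions are one-dimensional distances in metres along the walking route (an assumption for illustration), and the 1 km threshold is the example value from the text.

```python
def display_needed_on_route(position_m, upcoming_intersections_m,
                            lookahead_m=1000.0):
    """True if any intersection lies within lookahead_m ahead of the user.

    position_m:                current position along the route, in metres
    upcoming_intersections_m:  route positions of intersections, in metres
    lookahead_m:               the distance threshold (1 km in the example)
    """
    return any(0 <= x - position_m <= lookahead_m
               for x in upcoming_intersections_m)
```

When this returns False, the device may turn the display off until the next intersection comes within the look-ahead distance.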
  • the electronic device may turn off the display panel and the driver IC when it detects that the images of N consecutive physical objects shown on the display screen have not been focused on by the eyeball.
  • the display screen and the driver IC are turned on again to display the image of the physical object in the current viewing angle of the electronic device when any of the following occurs: a. the focus of the eyeball falls on the fixed focus of the display screen; b. a user operation is received; c. it is detected that M (an integer greater than 1) physical objects have entered the viewing angle of the electronic device.
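The N-consecutive-objects-off and M-objects-on behaviour above can be sketched as a small counter-based state machine. N and M are left unspecified in the text beyond M being an integer greater than 1, so the defaults below are arbitrary illustrative choices, as are the class and method names.

```python
class ObjectFocusMonitor:
    """Turn the panel (and driver IC) off after n_off consecutive displayed
    object images receive no eye focus, and back on after m_on new objects
    enter the field of view."""
    def __init__(self, n_off=3, m_on=2):
        self.n_off, self.m_on = n_off, m_on
        self.display_on = True
        self._unfocused = 0     # consecutive unfocused object images
        self._new_objects = 0   # objects entering view while display is off

    def object_shown(self, was_focused):
        """Record one displayed object image and whether it drew eye focus."""
        if self.display_on:
            self._unfocused = 0 if was_focused else self._unfocused + 1
            if self._unfocused >= self.n_off:
                self.display_on = False
                self._new_objects = 0
        return self.display_on

    def object_entered_view(self):
        """Record a new physical object entering the device's viewing angle."""
        if not self.display_on:
            self._new_objects += 1
            if self._new_objects >= self.m_on:
                self.display_on = True
                self._unfocused = 0
        return self.display_on
```

Conditions a and b (refocusing, or a user operation) would simply set `display_on = True` directly; only condition c needs the counter.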
  • the head-mounted electronic device can also use machine learning to obtain the number of times or the probability that the user has historically focused on the images corresponding to different physical objects.
  • the head-mounted electronic device can determine the trigger condition for turning off the display screen according to the historical focus count or probability for each type of physical object. The greater the historical focus count or probability, the looser the trigger condition for turning off the display screen can be set for that type of object: for example, a smaller first threshold, and a larger second threshold and first duration.
  • In this way, machine learning is used to determine which display content the user prefers, and the condition for turning off the display screen is then set according to that preference. This saves power while determining more accurately whether the display should be turned off for the user.
  • the head-mounted electronic device can use machine learning to obtain the number of times or the frequency with which the user has historically focused on the corresponding images in different scenarios.
  • the head-mounted electronic device may determine the trigger condition for turning off the display screen according to the historical focus count or frequency for the physical objects in each scenario. The greater the historical focus count or frequency, the looser the trigger condition for turning off the display screen can be set in that scenario: for example, a smaller first threshold, and a larger second threshold and first duration.
  • In this way, machine learning is used to determine which display scenarios the user prefers, and the condition for turning off the display screen is then set according to those preferences. This saves power while determining more accurately whether the display should be turned off for the user.
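One hedged way to realise the "looser trigger condition" described above is to scale the three thresholds by the learned focus probability. The linear scaling factors below are illustrative assumptions, not prescribed by the method; the base values mirror the examples in the text.

```python
def adjusted_thresholds(focus_probability,
                        base_first_threshold_s=1.0,
                        base_second_threshold_per_min=2.0,
                        base_first_duration_s=60.0):
    """Relax the off-triggers for content the user historically focuses on.

    focus_probability in [0, 1]: learned probability that the user focuses
    on this class of object (or scenario). A higher probability yields a
    smaller first threshold and a larger second threshold and first
    duration, i.e. a looser condition for turning the screen off.
    """
    p = min(max(focus_probability, 0.0), 1.0)   # clamp to [0, 1]
    return {
        "first_threshold_s": base_first_threshold_s * (1.0 - 0.5 * p),
        "second_threshold_per_min": base_second_threshold_per_min * (1.0 + p),
        "first_duration_s": base_first_duration_s * (1.0 + p),
    }
```

The learned probability itself would come from whatever model the device trains on the user's focus history; this sketch only shows how such a value could loosen the trigger condition.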
  • the present application provides another method for controlling a display screen according to the eyeball focus point. The method is applied to a head-mounted electronic device that includes a display screen, and the display screen is transparent when turned off.
  • The method includes: playing a video on the display screen while the focus of the user's eyeball is within a first distance range; and when it is detected that the focus of the user's eyeball has not been within the first distance range for a duration greater than or equal to a first duration, turning off the display area on the display panel corresponding to the viewing angle of the user's eyeball focus and pausing the video on the display screen.
  • the method further includes: when it is detected that the focus of the user's eyeball has fallen within the first distance range for a duration greater than or equal to a second duration, turning on the display area on the display panel corresponding to the viewing angle of the user's eyeball focus and resuming playback of the video on the display screen.
  • the present application provides a head-mounted electronic device, including one or more processors and one or more memories. The one or more memories are coupled to the one or more processors and are used to store computer program code, where the computer program code includes computer instructions. When the one or more processors execute the computer instructions, the terminal performs the method for controlling a display screen according to the eyeball focus point provided in the first aspect, the second aspect, any possible implementation of the first aspect, or any possible implementation of the second aspect.
  • the present application provides a computer storage medium including computer instructions that, when run on a terminal, cause the terminal to perform the method for controlling a display screen according to the eyeball focus point provided in the first aspect, the second aspect, any possible implementation of the first aspect, or any possible implementation of the second aspect.
  • when the focus of the user's eyeball is not on the fixed focus of the display screen, the display screen is turned off. The user can then view physical objects through the transparent display screen, which reduces the influence of the displayed image on the user's view of the real world and also reduces the power consumption of the display screen.
  • FIG. 1 is a schematic structural diagram of a head-mounted electronic device provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of the principle of eyeball focal length measurement in an electronic device provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of some human-computer interactions in a navigation scenario provided by an embodiment of the present application.
  • FIGS. 4a-4k are schematic diagrams of some embodiments of human-computer interaction provided by embodiments of the present application.
  • FIG. 5 is a schematic diagram of some human-computer interactions in a navigation scenario provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of other human-computer interactions in a navigation scenario provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of human-computer interaction provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a human-computer interaction provided by an embodiment of the present application.
  • FIGS. 9a and 9b are schematic structural diagrams of a display screen provided by embodiments of the present application.
  • FIG. 10 is a schematic flowchart of a method for controlling a display screen according to eye focus according to an embodiment of the present invention.
  • the electronic devices involved in the embodiments of the present application are introduced.
  • the electronic device may be a head-mounted electronic device. Users can wear a head-mounted electronic device to achieve effects such as virtual reality (VR), augmented reality (AR), and mixed reality (MR).
  • the head-mounted electronic device may be glasses, a headset, goggles, and the like.
  • the electronic device may also be other devices that include a display screen, such as an autonomous vehicle that includes a display screen.
  • FIG. 1 is a schematic structural diagram of a head-mounted electronic device according to an embodiment of the present application.
  • the user's eyes can see the image presented on the display screen of the head mounted electronic device.
  • the display screen is transparent, the user's eyes can see the physical object through the display screen, or the user's eyes can see the image displayed by another display device through the display screen.
  • the embodiments of the present application take a head-mounted electronic device as an example of the electronic device, but the embodiments are not limited to head-mounted electronic devices, and the electronic device may also be another device.
  • the head-mounted electronic device 100 may include a processor 110, a memory 120, a sensor module 130, a microphone 140, a key 150, an input / output interface 160, a communication module 170, a camera 180, a battery 190, a display screen 1100, and the like.
  • the sensor module 130 may include a focal length detection optical sensor 131, which is used to detect the focal length of the eyeball of the user 200.
  • the sensor module 130 may further include other sensors, such as a sound detector, a proximity light sensor, a distance sensor, a gyro sensor, an ambient light sensor, an acceleration sensor, and a temperature sensor.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the head-mounted electronic device 100.
  • the head-mounted electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a video processing unit (VPU), a controller, memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be a nerve center and a command center of the head-mounted electronic device 100.
  • the controller can generate the operation control signal according to the instruction operation code and the timing signal to complete the control of fetching instructions and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory may store instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • Interfaces may include an inter-integrated circuit (I2C) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, a universal serial bus (USB) interface, a serial peripheral interface (SPI), and so on.
  • the I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses.
  • the processor 110 may be respectively coupled with a focal length detection optical sensor 131, a battery 190, a camera 180, etc. through different I2C bus interfaces.
  • the processor 110 may couple the focal length detection optical sensor 131 through the I2C interface, so that the processor 110 and the focal length detection optical sensor 131 communicate through the I2C bus interface to obtain the user's eye focal length.
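Reading the eyeball focal length over I2C, as described above, might look like the following sketch. The sensor's I2C address, register layout, and millimetre encoding are invented for illustration (a real focal length detection sensor defines its own); any bus object exposing `read_i2c_block_data()`, such as `smbus2.SMBus` on Linux, could be passed in.

```python
SENSOR_ADDR = 0x52    # hypothetical I2C address of the focal length sensor
REG_FOCUS_MM = 0x10   # hypothetical register: focal distance, 16-bit big-endian, mm

def read_eye_focus_mm(bus):
    """Read the current eye focal distance in millimetres from the sensor."""
    hi, lo = bus.read_i2c_block_data(SENSOR_ADDR, REG_FOCUS_MM, 2)
    return (hi << 8) | lo

class FakeBus:
    """Stand-in for smbus2.SMBus so the sketch runs without hardware."""
    def read_i2c_block_data(self, addr, reg, n):
        return [0x01, 0xF4]   # 0x01F4 = 500 mm

print(read_eye_focus_mm(FakeBus()))  # 500
```

On real hardware, `FakeBus()` would be replaced by `smbus2.SMBus(1)` (or whichever bus number the sensor is wired to), and the result compared against the first distance range described earlier.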
  • the SPI interface can be used for the connection between the processor and the sensor.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the communication module 170.
  • the processor 110 communicates with the Bluetooth module in the communication module 170 through the UART interface to implement the Bluetooth function.
  • the MIPI interface can be used to connect the processor 110 to the display 1100, the camera 180 and other peripheral devices.
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI) and so on.
  • the processor 110 and the camera 180 communicate through a CSI interface to implement the shooting function of the head-mounted electronic device 100.
  • the processor 110 and the display screen 1100 communicate through a DSI interface to realize the display function of the head-mounted electronic device 100.
  • the GPIO interface can be configured via software.
  • the GPIO interface can be configured as a control signal or a data signal.
  • the GPIO interface may be used to connect the processor 110 to the camera 180, the display screen 1100, the communication module 170, the sensor module 130, the microphone 140, and the like.
  • GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the head mounted electronic device 100, and can also be used to transfer data between the head mounted electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as mobile phones.
• the USB interface may be USB 3.0, which is compatible with the high-speed DisplayPort (DP) interface for signal transmission and can transmit high-speed video and audio data.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic description, and does not constitute a limitation on the structure of the head-mounted electronic device 100.
  • the head-mounted electronic device 100 may also use different interface connection methods in the foregoing embodiments, or a combination of multiple interface connection methods.
  • the head-mounted electronic device 100 may include a wireless communication function.
  • the communication module 170 may include a wireless communication module and a mobile communication module.
  • the wireless communication function can be realized by an antenna (not shown), a mobile communication module (not shown), a modem processor (not shown), a baseband processor (not shown), and the like.
  • the antenna is used to transmit and receive electromagnetic wave signals.
  • the head-mounted electronic device 100 may include multiple antennas, and each antenna may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module may provide a wireless communication solution including 2G / 3G / 4G / 5G and the like applied to the head-mounted electronic device 100.
  • the mobile communication module may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and so on.
  • the mobile communication module can receive electromagnetic waves from the antenna, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module can also amplify the signal modulated by the modem processor and convert it to electromagnetic wave radiation through the antenna.
  • at least part of the functional modules of the mobile communication module may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module and at least part of the module of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be transmitted into a high-frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker, etc.), or displays an image or video through the display screen 1100.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110, and may be set in the same device as the mobile communication module or other functional modules.
• the wireless communication module can provide wireless communication solutions applied to the head-mounted electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and so on.
  • the wireless communication module may be one or more devices integrating at least one communication processing module.
  • the wireless communication module receives the electromagnetic wave through the antenna, frequency-modulates and filters the electromagnetic wave signal, and sends the processed signal to the processor 110.
  • the wireless communication module may also receive the signal to be transmitted from the processor 110, frequency-modulate it, amplify it, and convert it to electromagnetic wave radiation through the antenna.
  • the antenna of the head mounted electronic device 100 and the mobile communication module are coupled so that the head mounted electronic device 100 can communicate with the network and other devices through wireless communication technology.
• the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, and so on.
• the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
  • the head-mounted electronic device 100 realizes a display function through a GPU, a display screen 1100, and an application processor.
  • the GPU is a microprocessor for image processing, and connects the display screen 1100 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations, and is used for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the number of display screens 1100 in the head-mounted electronic device 100 may be two, corresponding to two eyeballs of the user 200, respectively.
• the content displayed on these two display screens can be displayed independently. Different images can be displayed on the two display screens to enhance the stereoscopic effect of the image.
  • the number of display screens 1100 in the head-mounted electronic device 100 may also be one, corresponding to two eyeballs of the user 200.
  • the head-mounted electronic device 100 can realize a shooting function through an ISP, a camera 180, a video codec, a GPU, a display screen 1100, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 180. For example, when taking a picture, the shutter is opened, the light is transmitted to the camera photosensitive element through the lens, and the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, which is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 180.
  • the camera 180 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the head-mounted electronic device 100 may include 1 or N cameras 180, where N is a positive integer greater than 1.
  • the camera 180 may be installed on the side of the head-mounted electronic device 100, and may also be installed at a position between two display screens on the head-mounted electronic device 100.
  • the camera 180 is used to capture images and videos in the user's 200 perspective in real time.
  • the head-mounted electronic device 100 generates a virtual image based on the captured real-time image and video, and displays the virtual image through the display screen 1100.
• the processor 110 may determine the virtual image displayed on the display screen 1100 according to the still image or video image captured by the camera 180 and the data acquired by the sensor module 130 (such as brightness, sound, etc.), so as to superimpose the virtual image on a real-world object.
  • the digital signal processor is used to process digital signals, in addition to digital image signals, it can also process other digital signals.
  • the digital signal processor is used to perform Fourier transform on the energy at the frequency point.
  • Video codec is used to compress or decompress digital video.
  • the head mounted electronic device 100 may support one or more video codecs.
  • the head-mounted electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural-network (NN) computing processor.
  • the NPU can realize applications such as intelligent recognition of the head-mounted electronic device 100, such as image recognition, face recognition, voice recognition, and text understanding.
  • the memory 120 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes instructions stored in the memory 120 to execute various functional applications and data processing of the head mounted electronic device 100.
  • the memory 120 may include a storage program area and a storage data area.
• the storage program area may store an operating system and application programs required by at least one function (such as a sound playback function, an image playback function, etc.).
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the head-mounted electronic device 100.
• the memory 120 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and so on.
  • the head-mounted electronic device 100 may implement audio functions through an audio module, a speaker, a microphone 140, a headphone jack, and an application processor. For example, music playback, recording, etc.
  • the audio module is used to convert digital audio information into analog audio signal output, and also used to convert analog audio input into digital audio signal.
  • the audio module can also be used to encode and decode audio signals.
  • the audio module may be disposed in the processor 110, or some functional modules of the audio module may be disposed in the processor 110.
• the speaker, also called a "horn", is used to convert audio electrical signals into sound signals.
  • the head-mounted electronic device 100 can listen to music through a speaker, or listen to a hands-free call.
• the microphone 140, also called a "mic" or "mouthpiece", is used to convert sound signals into electrical signals.
  • the head-mounted electronic device 100 may be provided with at least one microphone 140. In other embodiments, the head-mounted electronic device 100 may be provided with two microphones 140, which can also achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the head-mounted electronic device 100 may also be provided with three, four, or more microphones 140 to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the head mounted electronic device 100 may include a sound detector 132 that can detect and process voice signals used to control the portable electronic device.
  • the sound detector may include a microphone 140.
  • the head-mounted electronic device 100 can use the microphone 140 to convert sound into electrical signals.
  • the sound detector 132 may then process the electrical signal and recognize the signal as a command of the head mounted display system 1300.
• the processor 110 may be configured to receive voice signals from the microphone 140. After receiving a voice signal, the processor 110 may run the sound detector 132 to recognize the voice command. For example, when a voice instruction is received, the head-mounted electronic device 100 may obtain a contact from the stored user contact list and automatically dial the contact's phone number.
  • the headphone jack is used to connect wired headphones.
• the headphone jack can be a USB jack, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the head-mounted electronic device 100 may include one or more keys 150 that can control the head-mounted electronic device and provide users with access to functions on the head-mounted electronic device 100.
  • the keys 150 may be in the form of buttons, switches, dials, and touch or near-touch sensing devices (such as touch sensors).
• the user 200 can open the display screen 1100 of the head mounted electronic device 100 by pressing a button.
• the keys 150 include a power-on key, a volume key, and the like.
• the keys 150 may be mechanical keys or touch keys.
  • the head mounted electronic device 100 may receive key input, and generate key signal input related to user settings and function control of the head mounted electronic device 100.
  • the head-mounted electronic device 100 may include an input-output interface 160, and the input-output interface 160 may connect other devices to the head-mounted electronic device 100 through suitable components.
  • Components may include audio / video jacks, data connectors, etc., for example.
• the head-mounted electronic device 100 can implement eye tracking, for example, by using an infrared device (such as an infrared emitter) together with an image acquisition device (such as a camera).
  • the proximity light sensor may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the head mounted electronic device 100 emits infrared light outward through the light emitting diode.
  • the head mounted electronic device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the head-mounted electronic device 100. When insufficient reflected light is detected, the head mounted electronic device 100 may determine that there is no object near the head mounted electronic device 100.
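The sufficient/insufficient reflected-light decision described above amounts to a simple threshold test. The sketch below is a hedged illustration; the normalized photodiode reading and the threshold value are assumptions, not values from the patent:

```python
# Illustrative sketch: judging whether an object is near the device from
# the photodiode's reading of reflected infrared light. The threshold is
# an assumed, normalized value.

REFLECTED_LIGHT_THRESHOLD = 0.5  # assumed normalized reading

def object_nearby(reflected_light: float) -> bool:
    """Return True when sufficient reflected IR light is detected,
    i.e. an object is judged to be near the head-mounted device."""
    return reflected_light >= REFLECTED_LIGHT_THRESHOLD

print(object_nearby(0.8))  # sufficient reflected light
print(object_nearby(0.1))  # insufficient reflected light
```

In a real device the threshold would be calibrated per sensor, and hysteresis or debouncing would typically be added to avoid flicker near the boundary.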
  • the head-mounted electronic device 100 may use a proximity light sensor to detect a gesture operation at a specific position of the head-mounted electronic device 100 to achieve the purpose of associating the gesture operation with an operation command.
  • the head mounted electronic device 100 can measure the distance by infrared or laser. In some embodiments, the head mounted electronic device 100 may use a distance sensor to measure distance to achieve fast focusing.
• the gyro sensor may be used to determine the movement posture of the head mounted electronic device 100. In some embodiments, the angular velocity of the head mounted electronic device 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyro sensor.
  • the gyro sensor can also be used for navigation and somatosensory game scenes.
  • the ambient light sensor is used to sense the ambient light brightness.
  • the head-mounted electronic device 100 can adaptively adjust the brightness of the display screen 1100 according to the perceived ambient light brightness.
  • the ambient light sensor can also be used to automatically adjust the white balance when taking pictures.
  • the acceleration sensor can detect the magnitude of acceleration of the head mounted electronic device 100 in various directions (generally three axes). When the head mounted electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the attitude of the head-mounted electronic device and used in applications such as pedometers.
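When the device is stationary, the accelerometer measures only gravity, so device attitude can be estimated from the gravity vector's direction in the sensor frame. This is a standard technique, sketched here as a hedged illustration (the axis convention is an assumption):

```python
# Hypothetical sketch: estimating pitch and roll from a stationary
# three-axis accelerometer reading (in units of g). Assumes z points
# out of the device's back when lying flat.
import math

def attitude_from_gravity(ax: float, ay: float, az: float):
    """Return (pitch, roll) in degrees from a gravity vector."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device lying flat: gravity entirely along +z, so pitch = roll = 0.
print(attitude_from_gravity(0.0, 0.0, 1.0))
```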
  • the temperature sensor is used to detect the temperature.
• the head mounted electronic device 100 uses the temperature detected by the temperature sensor to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor exceeds a threshold, the head-mounted electronic device 100 reduces the performance of a processor located near the temperature sensor in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the head-mounted electronic device 100 heats the battery 190 to avoid an abnormal shutdown caused by low temperature. In some other embodiments, when the temperature is below still another threshold, the head mounted electronic device 100 boosts the output voltage of the battery 190 to avoid an abnormal shutdown caused by low temperature.
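The temperature processing strategy above maps temperature ranges to protective actions. A minimal sketch, in which all threshold values and action names are assumptions chosen for illustration:

```python
# Illustrative temperature processing strategy: thresholds and action
# names are assumed, not taken from the patent.

HIGH_TEMP_C = 45.0       # above this, throttle the nearby processor
LOW_TEMP_C = 0.0         # below this, heat the battery
VERY_LOW_TEMP_C = -10.0  # below this, also boost the battery output voltage

def temperature_policy(temp_c: float) -> list:
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("reduce_processor_performance")
    if temp_c < LOW_TEMP_C:
        actions.append("heat_battery")
    if temp_c < VERY_LOW_TEMP_C:
        actions.append("boost_battery_voltage")
    return actions

print(temperature_policy(50.0))   # hot: throttle only
print(temperature_policy(-15.0))  # very cold: heat and boost
```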
  • the focal length detection optical sensor 131 is used to detect the focal length of the eyeball of the user 200.
  • the head-mounted electronic device 100 may further include an infrared light source 1200.
  • the focal length detection optical sensor 131 may detect the focal length of the eyeball of the user 200 in cooperation with the infrared light source 1200.
  • the focal length detection optical sensor 131 and the infrared light source 1200 may be disposed on the side of the display screen close to the eyeball.
• the number of focal length detection optical sensors 131 and infrared light sources 1200 may each be two, and each eyeball may correspond to one focal length detection optical sensor 131 and one infrared light source 1200 for detecting the focal length at which that eyeball focuses.
  • the positions and numbers of the focal length detection optical sensor 131, the infrared light source 1200, and the camera 180 on the head mounted electronic device 100 shown in FIG. 1 are only used to explain the embodiments of the present application and should not constitute a limitation.
  • the number of the focal length detection optical sensor 131 and the infrared light source 1200 may be one.
• one focal length detection optical sensor 131 and one infrared light source 1200 can be used to detect the focal length at which one eyeball focuses, or to detect the focal lengths at which two eyeballs focus simultaneously.
  • FIG. 2 is a schematic diagram of the principle of eye focal length measurement in a head-mounted electronic device provided by an embodiment of the present application.
  • the infrared light source 1200, the eyeball, and the focal length detection optical sensor 131 constitute an optical system.
  • the focal length detection optical sensor 131 may include a slit diaphragm Z and an image sensor (for example, CCD or CMOS).
  • the infrared light source 1200 can periodically emit infrared light. The infrared light is projected to the eyeball, and the eyeball reflects the infrared light.
• depending on the refractive state of the eyeball, the reflected light passing through the diaphragm Z forms eyeball images of different shapes on the imaging plane.
  • the eyeball image may be a circular light spot including light and dark parts.
  • the focal length detection optical sensor 131 can detect the size of the bright area of the eyeball image, and then calculate the diopter and the focal length of the eyeball according to the size of the bright area.
  • the relationship between the bright area size and the focal length can be determined by the following data in the optical system: the distance A from the principal plane of the eye to the principal plane of the focal length detection optical sensor 131, the radius of the eye, and the eccentric distance E from the center of the infrared light source to the edge of the aperture of the diaphragm.
  • the focal length detection optical sensor 131 may be located at the edge of the head mounted electronic device 100. That is, the optical axis of the focal length detection optical sensor 131 and the optical axis where the eyeball is located do not overlap, and the relationship between the size of the bright area and the focal length can be compensated according to the positions of the focal length detection optical sensor 131 and the eyeball.
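The text characterizes the bright-area-to-focal-length relationship only qualitatively (it depends on the distance A, the eye radius, and the eccentric distance E). As a hedged illustration, the sketch below uses a textbook eccentric-photorefraction approximation; this formula is an assumption for the example, not the relationship the patent actually uses:

```python
# Illustrative only: one common eccentric-photorefraction approximation
# relates the bright crescent width s to defocus D (diopters) by
#   s = p/2 - e / (a * |D|)
# where p is pupil diameter, e the eccentric distance of the light
# source from the aperture edge, and a the eye-to-sensor distance.

def defocus_from_crescent(s: float, p: float, e: float, a: float) -> float:
    """Estimate |defocus| in diopters from crescent width s (all in meters)."""
    denom = a * (p / 2.0 - s)
    if denom <= 0:
        raise ValueError("crescent size outside the measurable range")
    return e / denom

# Assumed geometry: 4 mm pupil, 10 mm eccentricity, 0.5 m working distance.
d = defocus_from_crescent(s=0.001, p=0.004, e=0.010, a=0.5)
print(round(d, 1))  # defocus magnitude in diopters
```

As the text notes, when the sensor sits off the eye's optical axis, this mapping would additionally need a position-dependent compensation term.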
  • the display screen 1100 is used to display images, videos and the like.
  • the display screen 1100 includes a display panel.
• the display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • embodiments of the present application provide an electronic device and a display method of the electronic device. The following describes several examples provided by the embodiments of the present application.
  • FIG. 3 is a schematic diagram of some embodiments of human-computer interaction provided by embodiments of the present application.
  • the electronic device 100 may be set in a video playback mode.
  • the video clip is displayed on the display screen 1100 of the electronic device 100.
  • the user 200 can focus the eyeball focus on the fixed focus of the display screen 1100 to see a clear image.
  • the fixed focus may be, for example, a position 2 meters away from the user's 200 eyeballs.
• the fixed focus may also be within a distance range in front of the eyes of the user 200, for example, 2 to 4 meters in front of the eyes of the user 200. This distance range is the first distance range.
• a scenario may arise in which the user 200 needs to watch a physical object other than the display screen 1100. For example, as shown in 3b in FIG. 3, when the user 200 needs to observe the solid object 300, he can focus his eyeballs on the solid object 300.
• the focal length detection sensor in the electronic device 100 detects that the focus of the user 200's eyeballs has moved beyond the fixed focus of the display screen 1100; for example, it detects that the focus falls on the physical object 300 located 0.5 meters in front of the eyes of the user 200.
  • the electronic device 100 may turn off the display in the first area 1101 on the display screen 1100.
  • the first area 1101 that is turned off is transparent.
  • the user 200 can view the physical object 300 through the first area 1101.
• the electronic device 100 may also pause the video image played on the display screen. While the electronic device 100 pauses the video image played on the display screen, it can also pause playing the audio corresponding to the video.
  • playing video on the display means that the display refreshes and displays different images in sequence in time.
• pausing video playback means that the image displayed on the display does not change with time, and the image subsequently displayed on the display is the image that was displayed at the moment of pausing.
• the display in the first area 1101 is turned off, and the image displayed in the display area other than the first area 1101 does not change until the display screen resumes playing the video.
• when the display screen 1100 is an OLED display screen, turning off the display in the first area 1101 can be achieved by the processor controlling the power supply of the pixels in the first area of the OLED display panel to be turned off.
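Cutting power to the pixels of one region can be pictured as a per-pixel power mask over the panel, with powered-off pixels emitting nothing so the region becomes transparent. A minimal sketch; panel dimensions and region coordinates are illustrative assumptions:

```python
# Illustrative per-pixel power mask for turning off a rectangular
# region of an OLED panel. Coordinates are assumptions.

def power_mask(width, height, off_region):
    """off_region = (x0, y0, x1, y1), exclusive right/bottom edges.
    Returns a row-major grid: 1 = pixel powered, 0 = pixel off."""
    x0, y0, x1, y1 = off_region
    return [[0 if (x0 <= x < x1 and y0 <= y < y1) else 1
             for x in range(width)] for y in range(height)]

mask = power_mask(8, 4, (2, 1, 6, 3))
print(sum(map(sum, mask)))  # 32 pixels total, 8 off, 24 still powered
```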
• the processor in the electronic device 100 may determine that the focus of the user 200's eyeballs has moved beyond the fixed focus of the display screen 1100 by one or more of the following conditions: a. the processor calls the focal length detection sensor and detects that the duration for which the focus of the user 200's eyeballs falls on the fixed focus of the display screen 1100 is below a first threshold, and the first threshold may be, for example, 1 second; b. the processor calls the focal length detection sensor and detects that the frequency at which the focus of the user 200's eyeballs moves beyond the fixed focus of the display screen 1100 exceeds a second threshold, which may be, for example, twice per minute; c. the processor calls the focal length detection sensor and detects that the focus of the user 200's eyeballs has stayed away from the fixed focus of the display screen 1100 for a first duration, which may be, for example, 1 minute.
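The conditions a-c above can be combined so that satisfying any one of them counts as "focus moved away". A hedged sketch; the thresholds mirror the examples in the text, while the function and parameter names are assumptions, not the patent's implementation:

```python
# Illustrative combination of conditions (a)-(c); example thresholds
# follow the text, everything else is assumed.

FIRST_THRESHOLD_S = 1.0       # (a) dwell time on the fixed focus, seconds
SECOND_THRESHOLD_PER_MIN = 2  # (b) departures from the fixed focus, per minute
FIRST_DURATION_S = 60.0       # (c) continuous time away, seconds

def focus_left_fixed_focus(dwell_s: float,
                           leaves_per_min: float,
                           away_s: float) -> bool:
    """Any one condition being met is treated as 'focus moved away'."""
    return (dwell_s < FIRST_THRESHOLD_S
            or leaves_per_min > SECOND_THRESHOLD_PER_MIN
            or away_s >= FIRST_DURATION_S)

print(focus_left_fixed_focus(dwell_s=0.4, leaves_per_min=0, away_s=0.0))
print(focus_left_fixed_focus(dwell_s=2.0, leaves_per_min=1, away_s=5.0))
```

Whether the device requires one condition or several to hold ("one or more") is left open by the text; the sketch takes the permissive reading.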
• the electronic device 100 may resume displaying in the first area 1101 of the display screen 1100 that was turned off.
• when the display screen 1100 is an OLED display screen, restoring the display in the first area 1101 can be achieved by the processor controlling the restoration of the power supply to the pixels in the first area of the OLED display panel.
• the image displayed after the first area 1101 of the display screen 1100 is restored can be the same as the image that was displayed when the first area 1101 was turned off, as shown in 3c in FIG. 3.
• the electronic device 100 may resume playing video images.
• the processor in the electronic device 100 may determine that the focus of the user 200's eyeballs has returned to the fixed focus of the display screen 1100 by one or more of the following conditions: a. the processor calls the focal length detection sensor and detects that the focus of the user 200's eyeballs falls on the fixed focus of the display screen 1100 for more than a second duration, which may be, for example, 1 second; b. the processor calls the focal length detection sensor and detects that the frequency at which the focus of the user 200's eyeballs falls on the fixed focus of the display screen 1100 exceeds twice per minute; c. the processor calls the focal length detection sensor and detects that the focus of the user 200's eyeballs falls near the fixed focus of the display screen 1100 for more than 1 second; d. the processor calls the focal length detection sensor and detects that the focus of the user 200's eyeballs falls outside the fixed focus of the display screen 1100 for less than 1 second.
  • the vicinity of the fixed focus may be 1 to 5 meters in front of the eye of the user 200.
  • the second duration and the first duration may be equal or different.
• in addition to detecting the focus of the user's eyeballs, the electronic device 100 can also detect the depth at which the user's eyeballs focus.
  • the electronic device 100 can also detect the angle of focus of the user's eyeball, that is, the gaze direction of the eyeball.
  • the first area 1101 on the display screen 1100 that is turned off may be the area where the direction of the eyeball angle falls on the display screen 1100.
  • the first area 1101 on the display screen 1100 that is turned off may also be an area where the eyeball is projected vertically on the display screen 1100.
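Locating the first area from the gaze angle amounts to intersecting the gaze ray with the screen plane. An illustrative sketch under assumed geometry (flat screen at a fixed distance in front of the eye; the patent does not give this model):

```python
# Illustrative sketch: mapping the eyeball's gaze direction to the
# point where it lands on the display, which could serve as the
# center of the first area to turn off. Geometry is assumed.
import math

def gaze_point_on_screen(screen_dist_m: float, yaw_deg: float, pitch_deg: float):
    """Intersect a gaze ray from the eye with a screen plane located
    screen_dist_m in front of the eye; returns (x, y) offsets in meters."""
    x = screen_dist_m * math.tan(math.radians(yaw_deg))
    y = screen_dist_m * math.tan(math.radians(pitch_deg))
    return x, y

# Looking straight ahead hits the point directly in front of the eye,
# i.e. the vertical projection of the eyeball onto the screen.
print(gaze_point_on_screen(0.05, 0.0, 0.0))
```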
  • the display of the first area of the display screen is turned off.
  • the user can view the physical object through the transparent first area.
  • the influence of the displayed image on the user's viewing of the real world can be reduced.
• when the user needs to watch a physical object, pausing video playback and turning off the display in the first area of the display screen can reduce power consumption and make it more convenient for the user to watch the video.
• since only the display panel is turned off, the screen driver IC does not need to be turned off and does not need to be initialized and reconfigured when video playback resumes, which improves the response speed of resuming playback.
  • the processor of the electronic device 100 may reduce the brightness of the first area or turn off display.
  • the processor of the electronic device 100 can also move or reduce the area on the display screen for displaying images.
  • FIGS. 4a-4d are schematic diagrams of some embodiments of human-computer interaction provided by embodiments of the present application.
  • the processor of the electronic device 100 may decrease the brightness of the first area 1101.
  • the user 200 can view the solid object 300 through the transparent first area 1101, which can reduce the influence of the displayed image on the user viewing the real world.
• since only the brightness of the first area 1101 is reduced, neither the screen driver IC nor the display panel needs to be turned off, and neither needs to be initialized and reconfigured when video playback resumes, which improves the response speed of resuming playback.
  • the processor of the electronic device 100 may turn off the display of the display screen.
• the display can be an LCD display or an OLED display. After the display is turned off, the display screen is transparent, and the user 200 can view the physical object 300 through the transparent display screen, which can reduce the influence of the displayed image on the user viewing the physical object 300. Turning off the entire display can further reduce power consumption.
• the processor may use a part 1102 of the display area on the display screen 1100 for display, and the area of the display screen other than the image display area 1102 is turned off and is in a transparent state.
  • the processor of the electronic device 100 may switch the area where the image is displayed from the area 1102 to the area 1103.
  • area 1103, where the image is displayed, can be kept out of the region where the eyeball's focus angle projects onto the display screen, so that the eyeball can view the solid object 300 through the screen area outside the image area.
  • the processor may switch the area where the image is displayed to a position offset from the center of the display screen, and reserve a transparent area for the user's eyeball to view the solid object 300.
  • the processor of the electronic device 100 may switch the area where the image is displayed back to the center of the display screen, or use the entire display area to display the image.
  • Moving the area where the image is displayed on the display screen to an area other than the area projected by the eye-focusing view angle on the display screen can reduce the influence of the displayed image on the user viewing the solid object 300 and increase the response speed of the display screen when the display is resumed.
  • the processor of the electronic device 100 may switch the area where the image is displayed from area 1104 to area 1105. After the switch, area 1105 contains fewer pixels than area 1104, reserving a larger transparent portion of the display screen for the user's eyeballs. As shown in FIG. 4d, the image area can be kept out of the region where the eyeball's focus angle projects onto the display screen, so that the eyeball can view the solid object 300 through the screen area outside area 1105.
  • when the processor of the electronic device 100 detects that the focus of the user 200's eyeballs returns to the fixed focus of the display screen 1100, it can switch the area where the image is displayed back to the center of the display screen and enlarge it, or use the entire display area to display the image.
  • the direction in which the area of the image displayed on the display screen moves or shrinks can be determined according to the angle of view obtained by eye tracking. After the area on the display screen where the image is displayed is moved or reduced, the eyeball can view the physical object 300 through the display screen area other than the area where the image is displayed on the display screen.
  • Moving and shrinking the image area on the display screen to a region outside the area where the eye-focus angle projects onto the display screen can reduce the impact of the displayed image on the user viewing the physical object 300. Since the display screen is not turned off, this also improves the response speed when the display screen resumes display.
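The relocation-and-shrink behavior described above (FIGS. 4c-4d) can be sketched as follows. This is an illustrative sketch only, not part of the claimed embodiment: the `Rect` type, the corner-candidate search, and the 0.5 shrink factor are assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned rectangle in display-pixel coordinates."""
    x: int
    y: int
    w: int
    h: int

    def intersects(self, other: "Rect") -> bool:
        # Separating-axis test for axis-aligned rectangles.
        return not (self.x + self.w <= other.x or other.x + other.w <= self.x
                    or self.y + self.h <= other.y or other.y + other.h <= self.y)

def relocate_image_area(screen: Rect, image: Rect, gaze: Rect, shrink: float = 0.5) -> Rect:
    """Move (and shrink) the image area so it no longer overlaps the region
    where the eye-focus angle projects onto the display screen."""
    if not image.intersects(gaze):
        return image  # already clear of the gaze projection: nothing to do
    w, h = int(image.w * shrink), int(image.h * shrink)
    # Try the four screen corners; use the first candidate clear of the gaze region.
    for cx, cy in ((0, 0), (screen.w - w, 0), (0, screen.h - h), (screen.w - w, screen.h - h)):
        candidate = Rect(cx, cy, w, h)
        if not candidate.intersects(gaze):
            return candidate
    return Rect(0, 0, w, h)  # fallback: top-left corner
```

The key property is that the returned area does not overlap the gaze-projection region, so the eyeball can view the physical object through the remaining transparent portion of the screen.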
  • the electronic device 100 may restore the display screen display image in response to the user's operation.
  • User operations can be key operations, gesture operations, voice signals, or other operations. Please refer to FIG. 4e and FIG. 4f, which are schematic diagrams of an embodiment of human-computer interaction, respectively.
  • the electronic device 100 can detect a pressing operation of the second key and, in response, restore the display of the first area 1101 or of the entire display screen.
  • the processor in the electronic device 100 may also call the ambient light sensor to detect the light reflected when a finger approaches, and in response restore the image displayed in the first area 1101 or on the display screen.
  • the processor in the electronic device 100 may also call the camera to detect a specific gesture, that is, the second gesture, and in response restore the image displayed in the first area 1101 or on the display screen.
  • the processor may call an image recognition algorithm to recognize a specific gesture (second gesture) collected by the camera.
  • the second gesture may be, for example, one or more of the following gestures: scissors hand, fist, finger snapping, OK gesture, etc.
  • the processor in the electronic device 100 may restore the image displayed in the first area 1101 or on the display screen when it detects one or more of the following: a. a user operation is detected; b. the focus of the user's eyeball is detected at the fixed focus of the display screen 1100.
  • when the first area 1101 of the display screen 1100 has its display turned off or its brightness reduced, or the whole display screen 1100 is turned off, and the electronic device 100 detects the second voice signal, the image is restored in the first area 1101 or on the display screen.
  • when the processor in the electronic device 100 acquires the second voice signal through the sound detector, it may restore the image displayed in the first area 1101 or on the display screen.
  • the electronic device 100 may store the mapping relationship between the second voice signal and the instruction to execute the display restoration.
  • the electronic device can also detect other signals, such as touch screen operation of the display screen and brain wave signals, to restore the display of the display screen, not limited to the operation of keys, gestures and voice.
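The restore triggers listed above (key press, gesture, voice, touch-screen operation, brain-wave signal, or the eye focus returning to the screen) amount to a simple OR over detected signals. The sketch below is illustrative; the signal names are assumptions, not identifiers from the embodiment.

```python
# Signals that may restore the display, per the text: a key press (second key),
# a specific gesture (second gesture), a voice command (second voice signal),
# a touch on the screen, or a brain-wave signal. Names are illustrative.
RESTORE_SIGNALS = {"second_key", "second_gesture", "second_voice", "touch", "brain_wave"}

def should_restore_display(detected_signals, focus_on_fixed_focus):
    """True when any restore signal is present, or when the eyeball focus
    has returned to the display screen's fixed focal distance."""
    return focus_on_fixed_focus or bool(RESTORE_SIGNALS & set(detected_signals))
```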
  • the head-mounted electronic device can restore the display at low brightness, for example at 30% of the brightness before the display was turned off.
  • the low-brightness image gives the user's eyes a target to refocus on.
  • the brightness of the display screen can be adjusted to normal brightness, for example, to the brightness before the display is turned off.
  • if, after the display screen has been shown at low brightness for a certain time, for example 20 seconds, the focus of the user's eyeball is still not at the fixed focus of the display screen, the display screen is turned off. Likewise, if after a certain period at low brightness, for example 20 seconds, the user's eyeball focus is still changing frequently, the display screen is turned off.
  • the head-mounted electronic device may reopen the display screen at intervals, for example every 2 minutes, so that the user's eyeball can refocus. If no focus on the display is detected within 5 seconds of reopening it, the display is turned off again.
  • when reopened, the display screen may be shown at low brightness, for example at 30% of the brightness before it was turned off. Once the focus of the user's eyeball is detected to have returned to the fixed focus of the display screen, the brightness is adjusted back to normal, for example to the brightness before the display was turned off.
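The low-brightness recovery behaviour described above (restore at reduced brightness, give the eyes about 20 seconds to refocus, retry every 2 minutes while off) can be modeled as a small state machine. This Python sketch uses the example timings from the text; for brevity it folds the 5-second probe window after a periodic reopen into the same low-brightness timeout, which is a simplification.

```python
import enum

class ScreenState(enum.Enum):
    OFF = "off"
    LOW = "low"       # e.g. 30% of the pre-shutdown brightness
    NORMAL = "normal"

# Example timings from the text.
LOW_TIMEOUT_S = 20       # give up after 20 s unfocused at low brightness
REOPEN_INTERVAL_S = 120  # while off, retry at low brightness every 2 minutes

def step(state, seconds_in_state, focused):
    """One evaluation of the low-brightness refocus policy."""
    if state is ScreenState.LOW:
        if focused:
            return ScreenState.NORMAL   # eye refocused: restore full brightness
        if seconds_in_state >= LOW_TIMEOUT_S:
            return ScreenState.OFF      # still unfocused: turn the screen off
    elif state is ScreenState.OFF:
        if seconds_in_state >= REOPEN_INTERVAL_S:
            return ScreenState.LOW      # periodic retry at low brightness
    elif state is ScreenState.NORMAL and not focused:
        return ScreenState.LOW
    return state
```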
  • the electronic device 100 may turn off the display screen display or move or reduce the area on the display screen in response to the user's operation.
  • User operations can also be key operations, gesture operations, voice signals, or other operations. Please refer to FIG. 4g and FIG. 4h, which are schematic diagrams of an embodiment of human-computer interaction, respectively.
  • the electronic device 100 detects a pressing operation of the first key and can perform any one of the following: turn off the display of the display screen 1100; turn off the display of the first area; move the area where the image is displayed on the display screen; move and shrink the area where the image is displayed on the display screen.
  • the processor in the electronic device 100 calls the ambient light sensor to detect the light reflected when a finger approaches, and may then perform any one of the following: turn off the display of the display screen 1100; turn off the display of the first area on the display screen; move the area where the image is displayed; move and shrink the area where the image is displayed.
  • the processor in the electronic device 100 can also call the camera to detect a specific gesture, for example the first gesture, and may then perform any one of the following: turn off the display of the display screen 1100; turn off the display of the first area on the display screen; move the area where the image is displayed; move and shrink the area where the image is displayed.
  • the first gesture may be, for example, one or more of the following gestures: scissors hand, fist, finger snapping, OK gesture, etc.
  • the processor in the electronic device 100 may turn off the display screen when one or more of the following are detected: a. a user operation is detected; b. the focus of the user's eyeball is detected to be away from the fixed focus of the display screen 1100.
  • when an image is displayed on the display screen 1100 and the electronic device 100 detects the first voice signal, it may perform any one of the following: turn off the display of the display screen 1100; turn off the display of the first area on the display screen; move the area where the image is displayed; move and shrink the area where the image is displayed.
  • the processor in the electronic device 100 may detect the first voice signal through a sound detector.
  • the first voice signal may be a specific voice signal preset by the system.
  • the electronic device 100 may store the mapping relationship between the first voice signal and the instruction to turn off the display screen.
  • Using a user operation to restore the display screen image, to turn off the display, or to move or shrink the image area on the display screen reduces erroneous operations and improves the accuracy of controlling what the display screen shows.
  • the electronic device 100 may store a mapping relationship between user operations and instructions.
  • the instruction may include: restoring the display screen display image; turning off the display screen display; turning off the display of the first area on the display screen; moving the area of the display screen image; moving and reducing the area of the display screen image.
  • the mapping relationship may be preset by the system, or may be customized in response to user operation. An example of a user-defined mapping relationship between user operations and instructions is given below.
  • FIG. 4i is a schematic diagram of an embodiment of human-computer interaction.
  • the user can open the option of "first key associated display” on the setting interface of the head-mounted display device, then the head-mounted display device can establish the first key press and display in response to the opening operation The mapping relationship of switches.
  • the electronic device may respond to the pressing operation of the first key, and close the illuminated display screen 1100 or open the closed display screen 1100.
  • the user can enable the "OK gesture associated display screen" option on the settings interface of the head-mounted display device; in response to the enabling operation, the head-mounted display device establishes a mapping between the OK gesture and the display switch.
  • the electronic device can respond to the OK gesture detected by the camera, turn off the lighted display screen 1100 or turn on the closed display screen 1100.
  • the electronic device may detect that the user enables the "first key associated display screen" option when the processor uses the camera to detect that a finger is placed on that option's on switch.
  • The user-defined mapping between user operations and instructions shown in FIG. 4i is only used to explain the embodiments of the present application and should not constitute a limitation.
  • the setting interface may also include other options for customizing the correspondence between user operations and instructions, which are not limited in the embodiments of the present application.
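A user-defined mapping between operations and instructions (as in FIG. 4i) can be kept as a simple table. This is an illustrative sketch: the operation and instruction names are assumptions, and the "toggle" behaviour mirrors the text's "close the illuminated display screen or open the closed display screen".

```python
def associate(mapping, operation, instruction):
    """Record one user-enabled association, e.g. enabling the
    'first key associated display' option on the settings interface."""
    mapping[operation] = instruction

def dispatch(mapping, operation, screen_lit):
    """Resolve a detected operation to a display instruction. A 'toggle_display'
    entry closes a lit screen and opens a closed one."""
    instruction = mapping.get(operation)
    if instruction == "toggle_display":
        return "turn_off_display" if screen_lit else "turn_on_display"
    return instruction
```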
  • the mark 1106 may still be displayed on the display screen 1100.
  • the mark 1106 may be a circular bright spot, for example.
  • the mark 1106 may be displayed on the edge of the display screen 1100, such as the upper left corner or the upper right corner.
  • the display screen 1100 may display a prompt message "Look here to resume display".
  • the head mounted electronic device may reopen the display screen 1100.
  • FIG. 4k is a schematic diagram of an embodiment of a human-computer interaction provided by an embodiment of the present application.
  • the mark 1106 may still be displayed on the display 1100.
  • the mark 1106 may be a triangular bright spot, for example.
  • the mark 1106 may be displayed on the edge of the display screen 1100, for example, the upper left corner or the upper right corner.
  • the display screen 1100 may display a prompt message "Look here to resume display".
  • the head mounted electronic device may open the prompt page.
  • the prompt page can prompt the user whether to open the display screen.
  • when the head-mounted electronic device detects through the camera that a finger is placed at the "yes" position, the display on the display screen is restored.
  • the head-mounted electronic device can also detect, through the camera, that the eyeball is in a specific state, for example rolling up or down or blinking quickly, and then turn the display off or on accordingly.
  • in different scenarios, the user's requirements on how quickly the display screen recovers or shuts down are different.
  • in scenarios such as viewing video, users have higher requirements for response speed.
  • in a navigation scenario, the user does not need to pay attention to the navigation information displayed on the display screen at all times.
  • in such scenarios the user's requirement on the response speed of recovering or shutting down the display screen is low, and the display screen backlight, display panel, and driver IC can all be turned off.
  • the electronic device 100 may set different solutions for turning off the display screen according to the requirements for the response speed to the recovery or shutdown of the display screen in different scenarios. In scenes where users have high requirements for response speed (such as viewing video scenes), only the backlight of the display screen or the display panel can be turned off.
  • the backlight of the display screen, the display panel, and the driver IC can be turned off to further save power consumption.
  • For a specific description of turning off the backlight of the display screen, the display panel, and the driver IC, reference may be made to the examples shown in FIGS. 9a and 9b; details are not repeated here.
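The scene-dependent shutdown depth described above can be summarized as follows. The scene names and the component sets are illustrative assumptions for the sketch, not fixed by the embodiment.

```python
# Scenes with a high resume-speed requirement keep the driver IC configured,
# so no re-initialization is needed when the display resumes.
FAST_RESUME_SCENES = {"video_playback"}

def components_to_power_down(scene):
    """Choose which display components to power down for a given scene."""
    if scene in FAST_RESUME_SCENES:
        return {"backlight"}  # or the display panel only
    # Low resume-speed requirement (e.g. navigation): deepest power saving.
    return {"backlight", "display_panel", "driver_ic"}
```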
  • FIG. 5 is a schematic diagram of some human-computer interactions in a navigation scenario provided by an embodiment of the present application.
  • navigation information can be displayed on the display screen 1100.
  • the user can clearly see the navigation information displayed on the display screen.
  • the focal-length-detecting optical sensor in the head-mounted display device can detect that the focal length of the eyeball is outside the fixed focus of the display screen, for example focused 10 meters ahead.
  • the processor in the head-mounted display device can turn off the display 1100.
  • the backlight of the display screen 1100, the display panel and the driving IC can be turned off.
  • the closed display screen 1100 is transparent, and the user 200 can view the physical object through the display screen 1100.
  • the electronic device can restore the display screen display image in response to the user's gesture.
  • FIG. 4e and FIG. 4g which will not be repeated here.
  • the electronic device can restart the display screen to display the navigation information.
  • the processor in the electronic device 100 may resume displaying the image on the display screen when one or more of the following are detected: a. a user gesture is detected; b. the focus of the user's eyeball is detected at the fixed focus of the display screen 1100.
  • the head-mounted electronic device may include a speed sensor to detect the moving speed of the device.
  • when the head-mounted electronic device detects through the speed sensor that the moving speed exceeds a speed threshold, for example 50 km/h, it can turn off the display screen, move the navigation information displayed on the display screen to the side, as shown in FIG. 4c, or move and shrink the area where the image is displayed, as shown in FIG. 4d.
  • Turning off the display screen reduces the risk that, when the user moves too fast in a navigation scene, the image displayed on the display screen endangers the user's safety, providing convenience for the user.
  • the head-mounted electronic device may turn off the display.
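The speed-based rule above reduces to a threshold comparison; this sketch uses the 50 km/h example from the text, and the returned action strings are illustrative.

```python
SPEED_THRESHOLD_KMH = 50  # example threshold from the text

def navigation_display_action(speed_kmh):
    """Above the threshold, stop presenting the image in front of the eyes:
    turn the screen off, or move/shrink the navigation info to the side."""
    if speed_kmh > SPEED_THRESHOLD_KMH:
        return "turn_off_or_move_aside"
    return "display_navigation"
```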
  • FIG. 6 is a schematic diagram of other human-computer interactions in a navigation scenario provided by an embodiment of the present application.
  • when the head-mounted electronic device detects that the next intersection in the current movement direction is 5 kilometers away, it determines that there is no intersection within 1 kilometer along the current walking route, as shown in FIG. 6
  • the head-mounted electronic device can turn off the display screen 1100 for display.
  • the navigation and positioning system still works.
  • the head-mounted electronic device determines that an intersection occurs within 1 kilometer according to the device positioning position, as shown by 6c in FIG. 6, the head-mounted electronic device may reopen the display screen 1100 to display navigation information.
  • When no intersection requiring the user's attention is near, turning off the display screen saves power consumption of the head-mounted electronic device and reduces the obstruction of the user's line of sight by the displayed image, providing convenience for the user.
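The intersection-based rule above (keep the screen off between intersections; reopen it when the positioning system reports an intersection within 1 km) reduces to a distance comparison. The 1 km radius is the example value from the text.

```python
INTERSECTION_RADIUS_KM = 1.0  # reopen the display within this distance

def navigation_screen_on(distance_to_next_intersection_km):
    """Keep the display off between intersections; reopen it when the
    positioning system reports an intersection within 1 km."""
    return distance_to_next_intersection_km <= INTERSECTION_RADIUS_KM
```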
  • the display screen of the electronic device may display an image related to the physical object in the perspective of the electronic device.
  • the physical objects in the perspective of the electronic device can be determined from images collected by the camera. For example, if the camera collects an image of a piece of clothing, the processor in the electronic device uses image recognition to identify the image content as "clothes", determines the brand, price, shopping link, and other information about the clothes, and displays the information on the display screen.
  • the electronic device can collect statistics on whether the user's eyeballs continuously watch the image on the display within a certain period of time, and determine whether to turn off the display screen according to the statistical results.
  • the electronic device turns off the display screen display. After that, when the physical object 2 appears in the perspective of the electronic device, the display screen needs to be turned on again to display the image of the physical object 2 on the display screen. If it is detected that the image of the physical object 2 on the display screen is not focused by the eye, the display screen display is turned off again.
  • the electronic device can turn off the display panel and the driver IC when detecting that images of consecutive N (an integer greater than 1) solid objects on the display screen are not focused by the eyeball.
  • the display screen and the driver IC are turned on again to display the image of the solid object within the current viewing angle of the electronic device when any of the following occurs: a. the focus of the eyeball falls on the fixed focus of the display screen; b. a user operation is detected; c. M (an integer greater than 1) solid objects are detected to have entered the viewing angle of the electronic device.
  • the electronic device may relax the conditions for turning off the display of the display when it detects that images of consecutive S (integer greater than 1) solid objects on the display are focused by the eyeball. Specifically, the value of the first threshold in the condition may be reduced, and the value of the second threshold and the first duration may be increased. Therefore, the number of times the display screen is repeatedly closed and opened can be reduced, the life of the display screen is prolonged, and the stimulation of the eyeball when the display screen is repeatedly closed and opened is reduced.
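The consecutive-object logic above can be sketched as a small counter class. The concrete values of N and S and the returned action names are illustrative assumptions; the embodiment only requires that both be integers greater than 1.

```python
class FocusStats:
    """Track runs of focused/unfocused object images and adapt the
    shutdown behaviour: N consecutive unfocused images trigger the deepest
    shutdown; S consecutive focused images relax the off conditions
    (smaller first threshold, larger second threshold and first duration)."""

    def __init__(self, n=3, s=3):
        self.n, self.s = n, s
        self.unfocused_streak = 0
        self.focused_streak = 0

    def record(self, focused):
        """Record one object image; return an action string or None."""
        if focused:
            self.focused_streak += 1
            self.unfocused_streak = 0
            if self.focused_streak >= self.s:
                return "relax_off_conditions"
        else:
            self.unfocused_streak += 1
            self.focused_streak = 0
            if self.unfocused_streak >= self.n:
                return "turn_off_panel_and_driver_ic"
        return None
```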
  • the AR glasses use image recognition to determine the image content as "animal 1".
  • the AR glasses can determine that the content displayed on the display screen is an image associated with animal 1 according to "animal 1."
  • the display screen of the AR glasses displays the image associated with the animal 1, for example, it may include the web page link of the animal 1, the distribution location, the protection level, and the related animation video. After detecting that the image associated with Animal 1 on the display screen is not focused by the eye, the display screen is turned off.
  • the AR glasses open the display screen and display the image associated with the animal 2, and after detecting that the image associated with the animal 2 on the display screen has not been focused by the eye, close the display screen. Similarly, after the image associated with Animal 3 is still out of focus, turn off the display. Then, the AR glasses can continue to close the display screen until it detects that the focus of the eyeball is focused on the fixed focus of the display screen, and then open the display screen to display the image. Or until the user operation is detected, the display screen is turned on to display the image. Or, it does not turn on the display screen to display the image until it is detected that four physical objects have entered the perspective of the electronic device.
  • the display screen displays navigation information.
  • the AR glasses detect that the focus of the eyeball is not on the fixed focus of the display, they can turn off the backlight of the LCD display.
  • if the AR glasses detect that the focus of the eyeball has not returned to the fixed focus of the display screen for five minutes, the display panel of the LCD screen can be turned off.
  • the driver IC of the LCD display can also be turned off.
  • the head-mounted electronic device can also use machine learning to obtain the number of times or probability that the user has historically focused on images corresponding to different physical objects.
  • the head-mounted electronic device can then determine the trigger condition for turning off the display screen according to the historical focusing count or probability for each type of solid object. The greater the historical focusing count or probability, the looser the trigger condition for closing the display screen can be set for that type of object: for example, a smaller first threshold, and larger values for the second threshold and the first duration.
  • Different physical objects may include, for example, people, cars, animals, buildings, etc.
  • a head-mounted electronic device undergoes machine learning to obtain the following result: when a person appears in the camera's view angle, and the image associated with the person is displayed on the image recognition display screen, the probability that the image on the display screen is focused by the eyeball is 85%.
  • for a second type of physical object, the probability that the image on the display screen is focused by the eyeball is 50%, and for a third type it is 10%.
  • accordingly, for the 85%-probability object type, the electronic device determines that the image on the display screen is not focused by the eyeball if the fixed focus of the display screen is focused for less than 1 second; for the 50% type the threshold is 2 seconds; for the 10% type, 3 seconds.
  • In this way, machine learning determines which display content the user prefers, and the condition for turning off the display screen is set accordingly. This saves power consumption while more accurately determining whether the user needs the display turned off.
  • similarly, the head-mounted electronic device can use machine learning to obtain the number of times or frequency with which the user has historically focused on the corresponding images in different scenes.
  • the head-mounted electronic device may determine the trigger condition for turning off the display screen according to the historical focusing count or frequency in each scene. The greater the count or frequency, the looser the trigger condition for turning off the display in that scene can be set: for example, a smaller first threshold, and larger values for the second threshold and the first duration.
  • the different scenes may include, for example, virtual dressing scenes, scenic spots and historic sites scenes, navigation scenes, and the like.
  • the electronic device obtains the following results through machine learning: in the virtual fitting scene, the probability that the image on the display screen is focused is 95%; in the navigation scene, 48%; in the scenic-spot scene, 5%. Accordingly, when the processor detects a virtual fitting scene within the camera's viewing angle, the electronic device determines that the image on the display screen is not focused if the fixed focus of the display screen is focused for less than 1 second; in the navigation scene, the threshold is 2 seconds; in the scenic-spot scene, 3 seconds.
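The learned focus probabilities map to "not focused" dwell-time thresholds. This sketch uses the example figures from the text (high probability → 1 s, medium → 2 s, low → 3 s); the band edges at 0.8 and 0.4 are chosen for illustration only.

```python
def unfocused_timeout_s(focus_probability):
    """Dwell time below which an image counts as 'not focused', derived from
    the learned probability that this category of image gets focused.
    Higher probability -> shorter timeout."""
    if focus_probability >= 0.8:
        return 1.0
    if focus_probability >= 0.4:
        return 2.0
    return 3.0

def image_not_focused(dwell_s, focus_probability):
    """True when the eyeball dwelt on the screen's fixed focus for less
    than the category's timeout."""
    return dwell_s < unfocused_timeout_s(focus_probability)
```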
  • the head-mounted electronic device can also correct for the user's refractive error (degree of nearsightedness or farsightedness).
  • a slot for an additional lens may be provided between the display screen and the eyeball, and the slot may fix a nearsighted lens or a farsighted lens for correcting vision.
  • the head-mounted electronic device may include a projection system, which can replace the display screen and directly generate a clear image in the user's field of view.
  • the projection system may be, for example, a holographic waveguide display device, which projects a visible holographic image to the eyeball using holographic technology.
  • turning off the display screen display may refer to turning off the projection system.
  • the user's eyeball cannot see the image projected by the projection system, and can see real-world physical objects.
  • Turning off the first area of the display screen may mean that the projection system does not project an image at a position corresponding to the first area in the user's eyeball.
  • Reducing the brightness of the display screen may mean that the projection system reduces the brightness of the projected image in the eyes of the user.
  • the electronic device may include two modes: in the first mode, the electronic device may close or open the display screen according to the focus of the eyeball, performing the interactions shown in FIG. 3, FIGS. 4a-4k, FIG. 5, and FIG. 6.
  • in the second mode, the display screen of the electronic device is not controlled by the focus of the eyeball.
  • the electronic device may determine a mode in response to the user's operation.
  • FIG. 7, is a schematic diagram of human-computer interaction provided by an embodiment of the present application.
  • the user can enable the "eye tracking bright screen off" option on the settings interface of the head-mounted display device; in response to the enabling operation, the head-mounted display device sets the electronic device to the first mode.
  • in the first mode, the electronic device can turn the display screen off or on according to the focus of the eyeball, specifically performing the display on/off behavior shown in FIG. 3, FIGS. 4a-4k, FIG. 5, and FIG. 6.
  • FIG. 8 is a schematic diagram of human-computer interaction provided by an embodiment of the present application.
  • the head-mounted electronic device 100 can establish a connection with other electronic devices 400, such as a mobile phone, a tablet, or the like.
  • the connection may be a Bluetooth connection, a WiFi connection, or other wireless connection.
  • the connection may also be a wired connection, which is not limited in this embodiment of the present application.
  • the head-mounted electronic device 100 can exchange data with the electronic device 400 through the established connection.
  • the electronic device 400 may also receive user operations, such as touch operations, and generate setting instructions to send to the head mounted electronic device 100.
  • the electronic device 400 can receive the user's touch operation and send, to the head-mounted electronic device 100, an instruction to enable the function of controlling the display screen according to the eye focus.
  • in response, the head-mounted electronic device 100 enables the function of controlling the display screen according to the focus of the eyeball.
  • the interaction example between the head-mounted electronic device 100 and the electronic device 400 shown in FIG. 8 is only used to explain the embodiments of the present application, and should not constitute a limitation.
  • the head-mounted electronic device 100 can also use the electronic device 400 to set other parameters or interact with other data, such as the setting interface in FIG. 4i.
  • the display screen 1100 may be an LCD display screen or an OLED display screen.
  • FIGS. 9a and 9b are schematic structural diagrams of a display screen provided by embodiments of the present application.
  • FIG. 9a is a schematic structural view of an LCD display screen.
  • FIG. 9b is a schematic structural view of an OLED display screen.
  • the LCD display screen includes a backlight, a display panel (panel), a backlight driving circuit (IC), and a panel driving IC.
  • the display panel controls the rotation direction of the liquid crystal molecules, determining for each pixel whether the polarized light is emitted, thereby producing the displayed image.
  • the display panel includes a liquid crystal cell, a polarizing plate, and the like.
  • the liquid crystal cell contains liquid crystal molecules, and different rotations of the liquid crystal molecules have different polarizations to the light, thereby realizing the light and dark state of the pixel.
  • the liquid crystal cell corresponding to each pixel of the display panel realizes different liquid crystal turning under the control of the screen driving IC, thereby allowing light to pass through or not through the display panel.
  • the polarizing plate is used to provide a certain polarization direction, and only the light with the polarization direction in that specific direction passes through.
  • the upper and lower sides of the liquid crystal cell can be provided with polarizing plates, respectively, so that the light with the polarization direction in a specific direction passes through the display panel.
  • the upper and lower sides of the liquid crystal cell may include a side close to the backlight and a side close to the eyeball.
  • the backlight can be realized by a light emitting diode (LED).
  • the backlight can display red, green and blue (RGB).
  • each pixel may contain three corresponding LEDs.
  • the red, green and blue colors of the pixels can also be achieved through color filters and white LEDs.
  • the backlight driver IC and the backlight form a loop, and the backlight driver IC is used to control the backlight to provide a light source for the display screen.
  • the backlight driver IC can be interfaced with the processor and controlled by the processor to adjust the current of the backlight to achieve different backlight brightness.
  • the amount of current that the processor, through the backlight driver IC, inputs to the backlight may be determined using content-adaptive brightness control (CABC) technology or ambient-light-based light-adaptive brightness control (LABC) technology.
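The two backlight-current strategies just mentioned can be illustrated with a minimal sketch. The function names and the mappings (peak-luminance tracking for CABC, a linear ambient-light mapping for LABC) are simplified assumptions for illustration, not the actual algorithms of any particular backlight driver IC.

```python
def cabc_backlight_level(pixel_luma, max_level=255):
    """Content-adaptive brightness control (CABC) sketch: dim the
    backlight toward the brightest content actually on screen."""
    peak = max(pixel_luma) if pixel_luma else 0
    # Drive the backlight no higher than the frame's peak luminance;
    # a real CABC would also compensate pixel values upward.
    return min(max_level, peak)

def labc_backlight_level(ambient_lux, max_level=255):
    """Light-adaptive brightness control (LABC) sketch: scale the
    backlight with ambient light, clamped to the panel's range."""
    # Illustrative mapping: 0 lux -> dim floor, 1000+ lux -> full level.
    level = int(max_level * min(ambient_lux, 1000) / 1000)
    return max(level, 10)  # keep a minimal readable floor
```

In practice the processor would feed the computed level to the backlight driver IC over its control interface each frame (CABC) or on each ambient-light-sensor update (LABC).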
  • the screen driver IC can receive the display data transmitted by the processor interface to drive the display panel and the backlight. These display data include brightness data and color data of each pixel of the display panel.
  • the screen driving IC uses the brightness data to drive the display panel through the display data transmission channel to provide the brightness corresponding to each pixel.
  • the screen driver IC can also be connected to the backlight driver IC through the RGB backlight control interface.
  • the screen driver IC uses the color data to drive the backlight driver IC and the backlight through the RGB backlight control interface to provide colors corresponding to each pixel.
  • the LCD display screen may also include a load switch 1.
  • the load switch 1 can perform one or more of the following operations under the control of the processor: control the backlight driving IC to turn off or restore the backlight power supply, control to turn off or restore the display panel power supply, control to turn off or restore the backlight driving IC power supply, and control to turn off or restore the power supply of the screen driver IC.
  • the OLED display screen includes an OLED display panel (panel) and a screen driver IC.
  • the OLED display panel includes a plurality of display units arranged in an array. Each display unit can emit light under the drive of the screen driver IC.
  • the processor can control the size of the current delivered to the OLED display panel through the interface, so as to achieve different backlight brightness.
  • the amount of current input by the processor to the OLED display panel can be CABC or LABC.
  • the screen driver IC can receive the display data transmitted by the processor interface to drive the OLED display panel. These display data include brightness data and color data of each pixel of the OLED display panel. The screen driver IC uses the brightness data and color data to drive the OLED display panel to provide the brightness and color corresponding to each pixel.
  • the OLED display screen may further include a load switch 2.
  • the load switch 2 can perform one or more of the following operations under the control of the processor: control to turn off or restore power to the OLED display panel, control to turn off or restore power to the screen drive IC.
  • the processor may be the processor 110 in the electronic device 100 described in FIG. 1.
  • the above structural examples of the LCD display screen and the OLED display screen are only used to explain the embodiments of the present application, and should not constitute a limitation. Both the LCD display and the OLED display can contain more or less parts, and can also contain different parts arrangements.
  • for the LCD display screen, turning off the display may include any one of the following situations: (1) turn off the backlight power supply of the display screen; (2) turn off the backlight power supply and the display panel power supply; (3) turn off the backlight power supply, the display panel, the screen driver IC, and the backlight driver IC.
  • when the backlight driver IC is controlled to turn off the backlight power supply, the processor still sends display data to the display panel through the screen driver IC and the backlight driver IC, but the display cannot show images because the backlight is off. Since display data continues to be sent to the display panel, restoring the backlight power supply is fast.
  • after the display panel power supply is turned off, the display panel cannot receive the display data sent by the screen driver IC, and the initial configuration data of the display panel is lost.
  • restoring the display is therefore slower, but the power consumption of the display panel is saved.
  • when the required response speed for restoring the display is not high, the display panel can be turned off to further save power.
  • after the backlight driver IC power supply is turned off, the backlight driver IC cannot receive the backlight data sent by the processor, nor the color data sent by the screen driver IC. When its power supply is restored, the backlight driver IC needs to be initialized and configured.
  • likewise, after the screen driver IC power supply is turned off, the screen driver IC cannot receive the display data sent by the processor, nor can it send color data to the backlight driver IC. When its power supply is restored, the screen driver IC needs to be initialized and configured. Therefore, restoring the display is slow.
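The three LCD shutdown depths above trade power savings against resume latency; a small model, with names and the selection rule as illustrative assumptions rather than the patent's definitive scheme:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ShutdownLevel:
    name: str
    units_off: tuple    # power rails that are cut at this level
    needs_reinit: bool  # panel / driver ICs must be re-initialized on resume

# The three LCD shutdown levels described above, lightest to deepest.
LCD_LEVELS = (
    ShutdownLevel("backlight_only", ("backlight",), False),
    ShutdownLevel("backlight_and_panel", ("backlight", "display_panel"), True),
    ShutdownLevel("full", ("backlight", "display_panel",
                           "screen_driver_ic", "backlight_driver_ic"), True),
)

def pick_level(fast_resume_needed: bool) -> ShutdownLevel:
    """Trade power savings against resume latency: if the display must
    come back quickly, cut only the backlight; otherwise cut everything."""
    return LCD_LEVELS[0] if fast_resume_needed else LCD_LEVELS[2]
```

A video-pause scenario would favor `backlight_only` (instant resume), while a long idle period would favor `full` (maximum savings at the cost of IC re-initialization).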
  • for the OLED display screen, turning off the display may include any one of the following situations: (1) turn off the power supply of the OLED display panel; (2) turn off the power supply of the OLED display panel and the power supply of the screen driver IC.
  • after the processor turns off the power supply of the OLED display panel, it still sends display data to the screen driver IC, but because the panel is unpowered, the OLED display screen cannot display images.
  • when the power supply of the OLED display panel is restored, each pixel needs to be initialized (for example, assigned an initial potential). Since display data has been sent to the OLED display panel all along, restoring the panel power supply is fast.
  • after the screen driver IC is turned off, it cannot receive the display data sent by the processor, nor can it send display data to the OLED display panel. Similarly, when the power supply of the screen driver IC is restored, the screen driver IC needs to be initialized and configured, so restoring it is slow.
  • the processor can turn off the power supply of the pixels in some areas of the OLED display panel; those areas then cannot display an image. In this way, the display can be turned off in only part of the screen.
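Because each OLED pixel is self-emissive, partial-area shutdown can be sketched as driving a pixel region to zero; `turn_off_region` and the row-major frame representation are illustrative assumptions, not an actual panel interface.

```python
def turn_off_region(frame, rows, cols):
    """Sketch of partial-area shutdown on a self-emissive (OLED) panel:
    pixels driven to zero emit no light and draw essentially no panel
    current. `frame` is a row-major list of lists of pixel values;
    `rows` and `cols` are half-open (start, end) ranges."""
    r0, r1 = rows
    c0, c1 = cols
    for r in range(r0, r1):
        for c in range(c0, c1):
            frame[r][c] = 0  # unpowered / black pixel
    return frame
```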
  • the image displayed on the display screen can be a virtual image in the eyes of the user.
  • the focal point corresponding to the virtual image can be set within a certain distance from the front of the user's eye through the optical design of the display screen, such as 2 meters or 4 meters.
  • the distance can also be a distance interval, for example 2-4 meters. Then, the image displayed on the display screen appears to the user's eyeball to be imaged on the fixed focus in front of the user's eyeball.
  • FIG. 10 is a schematic flowchart of a method for controlling a display screen according to the eye focus, provided by an embodiment of the present invention. As shown in FIG. 10, the method is applied to a head-mounted electronic device that includes a display screen, and the display screen is transparent when its display is turned off.
  • the head-mounted electronic device may be the head-mounted electronic device shown in FIG. 1.
  • the method includes:
  • an image is displayed on the display screen when the focus of the user's eyeball is within the first distance range.
  • for the first distance range, reference may be made to the relevant description of the fixed focus shown in 3a in FIG. 3; it may be a distance range in front of the eyeball of the user 200, for example, 2 to 4 meters in front of the eyeball.
  • the display screen may also be replaced by a projection system in the head-mounted electronic device to form an image in the eyeball.
  • for the first duration, reference may be made to the description of the focus moving beyond the fixed focus shown in 3b in FIG. 3; for example, it may be 1 minute.
  • in step S102, when it is detected that the duration for which the focus of the user's eyeball continuously falls within the first distance range is greater than or equal to the second duration, the display screen is turned on.
  • the second duration may refer to the description related to returning to the fixed focus as shown by 3c in FIG. 3, for example, it may be 1 second.
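The on/off logic above (turn off after the focus has been outside the fixed-focus range for the first duration, turn back on after it has returned for the second duration) can be sketched as a small state machine. The class name, the 2–4 m range, and the 60 s / 1 s delays are illustrative values taken from the examples in the text, not a definitive implementation.

```python
class FocusDisplayController:
    """Turn the display off after the eye focus has been continuously
    outside [near, far] for `off_delay` seconds, and back on after it
    has been continuously inside for `on_delay` seconds."""

    def __init__(self, near=2.0, far=4.0, off_delay=60.0, on_delay=1.0):
        self.near, self.far = near, far
        self.off_delay, self.on_delay = off_delay, on_delay
        self.display_on = True
        self._since = None  # time when focus last crossed the boundary

    def update(self, focus_distance, now):
        inside = self.near <= focus_distance <= self.far
        # Does the current focus argue for the opposite display state?
        wants_change = (self.display_on and not inside) or \
                       (not self.display_on and inside)
        if not wants_change:
            self._since = None       # focus agrees with state: reset timer
            return self.display_on
        if self._since is None:
            self._since = now        # start timing the continuous interval
        elapsed = now - self._since
        if self.display_on and elapsed >= self.off_delay:
            self.display_on = False  # first duration elapsed: turn off
            self._since = None
        elif not self.display_on and elapsed >= self.on_delay:
            self.display_on = True   # second duration elapsed: turn on
            self._since = None
        return self.display_on
```

The asymmetric delays act as hysteresis: a long first duration avoids blanking the screen on brief glances at the real world, while a short second duration makes the display reappear almost immediately when the user looks back.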
  • the display screen is a liquid crystal display screen
  • turning off the display screen when playing video on the display screen includes: turning off the backlight of the display screen.
  • the display screen is a liquid crystal display screen
  • turning off the display screen when displaying navigation information on the display screen includes: turning off the backlight of the display screen and one or more of the following: the display panel, the driving circuit of the backlight, and the driving circuit of the display screen.
  • the display screen is an organic light-emitting diode display screen
  • turning off the display screen when playing a video on the display screen includes: turning off the display panel of the display screen.
  • the display screen is an organic light-emitting diode display screen.
  • turning off the display screen includes: turning off the drive circuit of the display screen and the display panel of the display screen.
  • when any one or more of the following are detected, the display screen is turned off: it is detected that the duration for which the focus of the user's eyeball is not within the first distance range is greater than or equal to the first duration.
  • the press operation of the first key is detected.
  • the first gesture is detected.
  • the first voice signal is detected.
  • for the first button, reference may be made to the related description of the first button in FIG. 4g.
  • for the first gesture, reference may be made to the related description of the first gesture in FIG. 4g.
  • the first gesture may be, for example, one or more of the following gestures: a scissors hand, a fist, a finger snap, and an OK gesture.
  • for the first voice signal, reference may be made to the related description of the first voice signal in FIG. 4h.
  • the correspondence between the first key and the instruction to close the display screen may be preset by the system, or may be set in response to the user's operation.
  • the setting interface shown in FIG. 4i may also be displayed on an electronic device such as a mobile phone connected to the head-mounted electronic device.
  • when any one or more of the following are detected, the display screen is turned on: it is detected that the duration for which the focus of the user's eyeball falls within the first distance range is greater than or equal to the second duration.
  • the press operation of the second key is detected.
  • a second gesture is detected.
  • a second voice signal is detected.
  • the first button and the second button may be the same or different.
  • for the second gesture, reference may be made to the related description of the second gesture in FIG. 4e.
  • the first gesture and the second gesture may be the same or different.
  • the correspondence between the second key and the instruction to open the display screen may be preset by the system, or may be set in response to the user's operation.
  • the setting interface shown in FIG. 4i may also be displayed on an electronic device such as a mobile phone connected to the head-mounted electronic device.
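The key / gesture / voice triggers listed above amount to a user-configurable mapping from detected events to display actions. A minimal sketch follows; the event names and default bindings are illustrative assumptions, not names from the patent.

```python
# Default bindings: "first" operations turn the display off,
# "second" operations turn it on. Users may re-bind these, e.g. from a
# settings interface on a connected phone.
DEFAULT_BINDINGS = {
    ("key", "first_key"): "display_off",
    ("gesture", "first_gesture"): "display_off",
    ("voice", "first_voice"): "display_off",
    ("key", "second_key"): "display_on",
    ("gesture", "second_gesture"): "display_on",
    ("voice", "second_voice"): "display_on",
}

def handle_event(event, bindings=DEFAULT_BINDINGS, display_on=True):
    """Return the new display state after a detected user operation.
    Unbound events leave the current state unchanged."""
    action = bindings.get(event)
    if action == "display_off":
        return False
    if action == "display_on":
        return True
    return display_on
```

Because the same binding table serves keys, gestures, and voice signals, the "first key and second key may be the same" case reduces to two events mapping to complementary actions.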
  • when the focus of the user's eyeball is not within the fixed-focus range of the display screen, the display screen is turned off. The user can then view physical objects through the transparent display screen, which reduces the influence of the displayed image on the user's view of the real world and also reduces the power consumption of the display screen. When the focus of the user's eyeball returns to within the fixed-focus range of the display screen, the display screen is turned on. This improves the operating convenience of the head-mounted electronic device.
  • when the user needs to look at a physical object while watching a video, pausing the video playback and turning off the display in the first area of the display screen reduces power consumption and improves the user's convenience of watching the video.
  • turning off only the display panel does not require turning off the screen driver IC, so the screen driver IC does not need to be re-initialized and configured when video playback resumes, which improves the response speed of resuming playback.
  • the user operation may also be a touch operation on the display screen or a brain wave signal.
  • the head-mounted electronic device can also use the camera to detect the eyeball in a specific state, for example rolling up, rolling down, or blinking quickly, and turn the display off or on accordingly.
  • the head-mounted electronic device can also use the speed sensor to detect that its movement speed exceeds a speed threshold, and then turn off the display screen or move the navigation information displayed on the display screen to the side of the screen.
  • turning off the display screen in this case reduces the risk that, when the user moves too fast in a navigation scene, the image displayed on the display screen compromises the user's safety, providing convenience for the user.
  • the head-mounted electronic device can also determine, according to its positioning, that there is no intersection within a distance threshold along the movement direction of the current walking route, for example no intersection within 1 km, and then turn off the display.
  • when no intersection requiring the user's attention is being approached, turning off the display saves power consumption of the head-mounted electronic device and reduces the obstruction of the user's line of sight by the displayed image, providing convenience for the user.
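The two navigation-scene rules above (blank the screen when moving too fast, and when no intersection needs attention) can be sketched as a single predicate. The 3 m/s speed limit and the 1000 m turn window are illustrative thresholds; only the 1 km example comes from the text.

```python
def navigation_display_should_be_off(speed_mps, next_turn_m,
                                     speed_limit=3.0, turn_window=1000.0):
    """Sketch of the navigation-scene rules: suggest blanking (or moving
    aside) the display when the wearer moves faster than `speed_limit`,
    or when no intersection needing attention lies within `turn_window`
    meters along the route (`next_turn_m` is None if no turn is known)."""
    moving_too_fast = speed_mps > speed_limit
    no_upcoming_turn = next_turn_m is None or next_turn_m > turn_window
    return moving_too_fast or no_upcoming_turn
```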
  • the electronic device may turn off the display panel and the driver IC when it detects that the images of N consecutive physical objects shown on the display screen are not focused on by the eyeball.
  • when any one of the following is detected, the display screen and the driver IC are turned on to display the image of the physical object within the current viewing angle of the electronic device: (a) the focus of the eyeball falls on the fixed focus of the display screen; (b) a user operation is received; (c) M (an integer greater than 1) physical objects are detected to have entered the viewing angle of the electronic device.
  • the backlight of the display, the display panel and the driver IC can be turned off to further save power consumption.
  • the head-mounted electronic device can also obtain, through machine learning, the number of times or the probability that the user has historically focused on the images corresponding to different types of physical objects.
  • the head-mounted electronic device can determine the trigger condition for turning off the display screen according to the historical focusing count or probability of each type of physical object: the greater the historical focusing count or probability, the looser the turn-off trigger condition can be set for that type of object, for example a smaller first threshold together with a larger second threshold and first duration.
  • in this way, machine learning determines which display content the user prefers, and the condition for turning off the display screen is set according to that preferred content. This saves power while more accurately determining whether the user needs the display turned off.
  • similarly, the head-mounted electronic device can obtain, through machine learning, the number of times or the frequency with which the user has historically focused on the corresponding images in different scenes.
  • the head-mounted electronic device may determine the trigger condition for turning off the display screen according to the historical focusing count or frequency of physical objects in each scene: the greater the historical focusing count or frequency, the more relaxed the turn-off trigger condition can be set for that scene, for example a smaller first threshold together with a larger second threshold and first duration.
  • in this way, machine learning determines which display scenes the user prefers, and the conditions for turning off the display screen are set according to those preferred scenes. This saves power while more accurately determining whether the user needs the display turned off.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a dedicated computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired means (such as coaxial cable, optical fiber, or digital subscriber line) or wireless means (such as infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device including a server, a data center, and the like integrated with one or more available media.
  • the available media may be magnetic media (eg, floppy disk, hard disk, magnetic tape), optical media (eg, DVD), or semiconductor media (eg, Solid State Disk).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a method for controlling a display screen according to the eye focus, and to a head-mounted electronic device (100). The method is applied to the head-mounted electronic device (100), which contains a display screen (1100); the display screen (1100) is transparent when its display is turned off. The method comprises: displaying an image (101) on the display screen (1100) when the focus of the user's eyeball is within a first distance range; and, when it is detected that the duration for which the focus of the user's eyeball is not within the first distance range is greater than or equal to a first duration, turning off the display screen (1100) (102). The method can reduce power consumption and reduce the impact of the displayed image on viewing the real world.
PCT/CN2019/118623 2018-11-23 2019-11-15 Procédé de commande d'écran d'affichage conformément au point de focalisation du globe oculaire et équipement électronique monté sur la tête WO2020103763A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP19886694.9A EP3862845B1 (fr) 2018-11-23 2019-11-15 Procédé de commande d'écran d'affichage conformément au point de focalisation du globe oculaire et équipement électronique monté sur la tête
FIEP19886694.9T FI3862845T3 (fi) 2018-11-23 2019-11-15 Menetelmä näytön ohjaamiseksi silmämunan fokuksen mukaisesti ja päähän kiinnitettävä elektroninen laite
US17/295,699 US20220019282A1 (en) 2018-11-23 2019-11-15 Method for controlling display screen according to eye focus and head-mounted electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811407510.5 2018-11-23
CN201811407510.5A CN109582141B (zh) 2018-11-23 2018-11-23 根据眼球焦点控制显示屏的方法和头戴电子设备

Publications (1)

Publication Number Publication Date
WO2020103763A1 true WO2020103763A1 (fr) 2020-05-28

Family

ID=65924258

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/118623 WO2020103763A1 (fr) 2018-11-23 2019-11-15 Procédé de commande d'écran d'affichage conformément au point de focalisation du globe oculaire et équipement électronique monté sur la tête

Country Status (5)

Country Link
US (1) US20220019282A1 (fr)
EP (1) EP3862845B1 (fr)
CN (2) CN109582141B (fr)
FI (1) FI3862845T3 (fr)
WO (1) WO2020103763A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022212072A1 (fr) * 2021-03-31 2022-10-06 Snap Inc. Commande de luminosité de projecteur de lunettes
CN116880702A (zh) * 2023-09-08 2023-10-13 深圳市江元科技(集团)有限公司 一种感应式车载媒体的控制方法、系统和存储介质

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109582141B (zh) * 2018-11-23 2022-05-10 华为技术有限公司 根据眼球焦点控制显示屏的方法和头戴电子设备
CN111077989B (zh) * 2019-05-27 2023-11-24 广东小天才科技有限公司 一种基于电子设备的屏幕控制方法及电子设备
CN110332944A (zh) * 2019-07-18 2019-10-15 百度国际科技(深圳)有限公司 导航设备控制方法、装置、设备和存储介质
CN111240414B (zh) * 2020-01-23 2021-03-09 福州贝园网络科技有限公司 一种眼镜腰带式计算机装置
CN111741511B (zh) * 2020-05-29 2022-05-10 华为技术有限公司 快速匹配方法及头戴电子设备
CN111710284B (zh) * 2020-07-17 2023-03-31 Oppo广东移动通信有限公司 智能眼镜控制方法、智能眼镜控制装置及智能眼镜
KR20240009975A (ko) * 2021-05-17 2024-01-23 스냅 인코포레이티드 아이웨어 디바이스 동적 전력 구성
CN113359270B (zh) * 2021-05-25 2023-06-09 歌尔股份有限公司 头戴设备的屈光度调节方法及屈光度调节系统
CN113672085A (zh) * 2021-08-04 2021-11-19 Oppo广东移动通信有限公司 参数控制方法、装置、头戴式显示设备以及存储介质
US20230050526A1 (en) * 2021-08-10 2023-02-16 International Business Machines Corporation Internet of things configuration using eye-based controls
CN113703572B (zh) * 2021-08-25 2024-02-09 京东方科技集团股份有限公司 电子设备、控制方法、控制装置和存储介质
CN113504833B (zh) * 2021-09-10 2021-12-24 世纳微电子科技(成都)有限公司 数字光学色温传感器、眼球追踪装置及人机交互系统
CN113900253A (zh) * 2021-10-18 2022-01-07 杨敬尧 一种迷你功能型镜头模组
CN114615488B (zh) * 2022-03-14 2022-12-27 北京行者无疆科技有限公司 一种ar眼镜显示模式的控制方法
CN114779916A (zh) * 2022-03-29 2022-07-22 杭州海康威视数字技术股份有限公司 一种电子设备屏幕唤醒方法、门禁管理方法及装置
CN117056749B (zh) * 2023-10-12 2024-02-06 深圳市信润富联数字科技有限公司 点云数据处理方法、装置、电子设备及可读存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202602831U (zh) * 2012-05-11 2012-12-12 山东沃飞电子科技有限公司 显示控制装置
CN105204651A (zh) * 2015-11-12 2015-12-30 上海卓易科技股份有限公司 一种控制方法及装置
US20180077409A1 (en) * 2016-09-09 2018-03-15 Samsung Electronics Co., Ltd. Method, storage medium, and electronic device for displaying images
CN108595009A (zh) * 2012-02-29 2018-09-28 联想(北京)有限公司 一种人机交互控制方法及电子终端
CN109582141A (zh) * 2018-11-23 2019-04-05 华为技术有限公司 根据眼球焦点控制显示屏的方法和头戴电子设备

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9081416B2 (en) * 2011-03-24 2015-07-14 Seiko Epson Corporation Device, head mounted display, control method of device and control method of head mounted display
WO2012154938A1 (fr) * 2011-05-10 2012-11-15 Kopin Corporation Ordinateur de casque d'écoute qui utilise des instructions de mouvement et des instructions vocales pour commander un affichage d'informations et des dispositifs à distance
AU2011204946C1 (en) * 2011-07-22 2012-07-26 Microsoft Technology Licensing, Llc Automatic text scrolling on a head-mounted display
US9310611B2 (en) * 2012-09-18 2016-04-12 Qualcomm Incorporated Methods and systems for making the use of head-mounted displays less obvious to non-users
WO2015047032A1 (fr) * 2013-09-30 2015-04-02 삼성전자 주식회사 Procédé de traitement de contenus sur la base d'un signal biologique et dispositif associé
US9836122B2 (en) * 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US20150288788A1 (en) * 2014-04-08 2015-10-08 Stephen Y Liu Mobile Device Management
US20160025971A1 (en) * 2014-07-25 2016-01-28 William M. Crow Eyelid movement as user input
CN104850317A (zh) * 2014-12-31 2015-08-19 华为终端(东莞)有限公司 可穿戴设备的屏幕的显示方法及可穿戴设备
US9652047B2 (en) * 2015-02-25 2017-05-16 Daqri, Llc Visual gestures for a head mounted device
US10345988B2 (en) * 2016-03-16 2019-07-09 International Business Machines Corporation Cursor and cursor-hover based on user state or sentiment analysis
US10372205B2 (en) * 2016-03-31 2019-08-06 Sony Interactive Entertainment Inc. Reducing rendering computation and power consumption by detecting saccades and blinks
US10095307B2 (en) * 2016-05-13 2018-10-09 Google Llc Eye tracking systems and methods for virtual reality environments
CN107424584A (zh) * 2016-05-24 2017-12-01 富泰华工业(深圳)有限公司 护眼系统及方法
US10552183B2 (en) * 2016-05-27 2020-02-04 Microsoft Technology Licensing, Llc Tailoring user interface presentations based on user state
CN109791295A (zh) * 2016-07-25 2019-05-21 奇跃公司 使用增强和虚拟现实眼镜的成像修改、显示和可视化
TWI610059B (zh) * 2016-08-04 2018-01-01 緯創資通股份有限公司 三維量測方法及應用其之三維量測裝置
WO2018080431A1 (fr) * 2016-10-24 2018-05-03 Hewlett-Packard Development Company, L.P. Réveil de dispositifs électroniques dans des modes de fonctionnement sélectionnés
US20180314066A1 (en) * 2017-04-28 2018-11-01 Microsoft Technology Licensing, Llc Generating dimming masks to enhance contrast between computer-generated images and a real-world view
US10810773B2 (en) * 2017-06-14 2020-10-20 Dell Products, L.P. Headset display control based upon a user's pupil state
CN207651151U (zh) * 2017-12-27 2018-07-24 北京枭龙防务科技有限公司 一种自动调整显示内容的透视型近眼显示装置
US10922862B2 (en) * 2018-04-05 2021-02-16 Lenovo (Singapore) Pte. Ltd. Presentation of content on headset display based on one or more condition(s)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108595009A (zh) * 2012-02-29 2018-09-28 联想(北京)有限公司 一种人机交互控制方法及电子终端
CN202602831U (zh) * 2012-05-11 2012-12-12 山东沃飞电子科技有限公司 显示控制装置
CN105204651A (zh) * 2015-11-12 2015-12-30 上海卓易科技股份有限公司 一种控制方法及装置
US20180077409A1 (en) * 2016-09-09 2018-03-15 Samsung Electronics Co., Ltd. Method, storage medium, and electronic device for displaying images
CN109582141A (zh) * 2018-11-23 2019-04-05 华为技术有限公司 根据眼球焦点控制显示屏的方法和头戴电子设备

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3862845A4 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022212072A1 (fr) * 2021-03-31 2022-10-06 Snap Inc. Commande de luminosité de projecteur de lunettes
US11663941B2 (en) 2021-03-31 2023-05-30 Snap Inc. Eyewear projector brightness control
CN116880702A (zh) * 2023-09-08 2023-10-13 深圳市江元科技(集团)有限公司 一种感应式车载媒体的控制方法、系统和存储介质
CN116880702B (zh) * 2023-09-08 2024-01-05 深圳市江元科技(集团)有限公司 一种感应式车载媒体的控制方法、系统和存储介质

Also Published As

Publication number Publication date
CN109582141B (zh) 2022-05-10
US20220019282A1 (en) 2022-01-20
FI3862845T3 (fi) 2024-04-17
CN109582141A (zh) 2019-04-05
EP3862845A4 (fr) 2021-12-29
EP3862845A1 (fr) 2021-08-11
CN114879840A (zh) 2022-08-09
EP3862845B1 (fr) 2024-01-10

Similar Documents

Publication Publication Date Title
WO2020103763A1 (fr) Procédé de commande d'écran d'affichage conformément au point de focalisation du globe oculaire et équipement électronique monté sur la tête
CN110221432B (zh) 头戴式显示器的图像显示方法及设备
WO2020238741A1 (fr) Procédé de traitement d'image, dispositif associé et support de stockage informatique
WO2020207380A1 (fr) Visiocasque électronique et son procédé de commande
US20160063767A1 (en) Method for providing visual reality service and apparatus for the same
US20180176536A1 (en) Electronic device and method for controlling the same
CN111526407B (zh) 屏幕内容的显示方法及装置
US11798234B2 (en) Interaction method in virtual reality scenario and apparatus
CN112312366A (zh) 一种通过nfc标签实现功能的方法、电子设备及系统
EP4044000A1 (fr) Procédé d'affichage, dispositif électronique et système
CN114257920B (zh) 一种音频播放方法、系统和电子设备
CN109636715B (zh) 图像数据的传输方法、装置及存储介质
CN111458876B (zh) 一种头戴式显示设备的控制方法及头戴式显示设备
WO2022213937A1 (fr) Procédé de commande de dispositif pouvant être porté et dispositif électronique
WO2022089625A1 (fr) Procédé de commande de fonction de réalité augmentée et dispositif électronique
CN115016629B (zh) 防误触的方法和装置
CN112565735B (zh) 一种虚拟现实的测量和显示方法、装置、以及系统
WO2021057420A1 (fr) Procédé d'affichage d'interface de commande et visiocasque
WO2023185698A1 (fr) Procédé de détection de port, et appareil associé
WO2023197913A1 (fr) Procédé de traitement d'image et dispositif associé
WO2023142959A1 (fr) Procédé de photographie d'un système de photographie à caméras multiples, et dispositif, support de stockage et produit de programme
WO2022267467A1 (fr) Procédé et appareil de duplication d'écran, dispositif, et support d'enregistrement
CN110049252B (zh) 一种追焦拍摄方法、设备及计算机可读存储介质
CN114375027A (zh) 降低功耗的方法和装置
CN116546281A (zh) 一种投屏方法、系统、投屏源设备和屏幕设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19886694

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019886694

Country of ref document: EP

Effective date: 20210505

NENP Non-entry into the national phase

Ref country code: DE