WO2023011103A1 - Parameter control method and apparatus, head-mounted display device and storage medium - Google Patents

Parameter control method and apparatus, head-mounted display device and storage medium

Info

Publication number
WO2023011103A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display screen
similarity
head
display device
Prior art date
Application number
PCT/CN2022/104430
Other languages
English (en)
Chinese (zh)
Inventor
李炳耀
Original Assignee
Oppo广东移动通信有限公司
Priority date
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2023011103A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 Head-up displays
    • G02B 27/017 Head mounted
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G 3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G 3/34 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G 3/38 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source using electrochromic devices

Definitions

  • the present application relates to the technical field of electronic equipment, and more specifically, to a parameter control method and device, a head-mounted display device, and a storage medium.
  • Head-mounted display devices are used more and more widely and offer more and more functions, and they have become one of the must-have items in people's daily lives.
  • the head-mounted display device provides the user with the function of manually adjusting the parameters of its display screen.
  • the present application proposes a parameter control method, device, head-mounted display device and storage medium to solve the above problems.
  • the embodiment of the present application provides a parameter control method, which is applied to a head-mounted display device, and the head-mounted display device includes a camera and a display screen.
  • The method includes: when the head-mounted display device is in the wearing state, collecting, by the camera, an image including the eyes of the wearer as a first image; based on the first image, obtaining the image that the wearer is gazing at as a second image; obtaining the image displayed on the display screen as a third image; and controlling the parameters of the display screen based on the second image and the third image.
  • the embodiment of the present application provides a parameter control device, which is applied to a head-mounted display device
  • the head-mounted display device includes a camera and a display screen
  • The device includes: a first image acquisition module, configured to collect, by the camera, an image including the eyes of the wearer as a first image when the head-mounted display device is in the wearing state; a second image acquisition module, configured to obtain, based on the first image, the image that the wearer is gazing at as a second image; a third image acquisition module, configured to obtain the image displayed on the display screen as a third image; and a parameter control module, configured to control the parameters of the display screen based on the second image and the third image.
  • The embodiment of the present application provides a head-mounted display device, including a camera, a display screen, a memory and a processor, the memory being coupled to the processor and storing instructions which, when executed by the processor, cause the processor to perform the above method.
  • the embodiment of the present application provides a computer-readable storage medium, where program code is stored in the computer-readable storage medium, and the program code can be invoked by a processor to execute the above method.
  • FIG. 1 shows a schematic structural diagram of a head-mounted display device that can be used in an embodiment of the present application.
  • FIG. 2 shows a schematic diagram of the use of a head-mounted display device that can be used in an embodiment of the present application.
  • FIG. 3 shows a schematic flowchart of a parameter control method provided by an embodiment of the present application.
  • FIG. 4 shows a schematic flowchart of a parameter control method provided by another embodiment of the present application.
  • FIG. 5 shows a schematic flowchart of a parameter control method provided by another embodiment of the present application.
  • FIG. 6 shows a schematic flowchart of a parameter control method provided by another embodiment of the present application.
  • FIG. 7 shows a schematic flowchart of a parameter control method provided by still another embodiment of the present application.
  • FIG. 8 shows a schematic flowchart of a parameter control method provided by yet another embodiment of the present application.
  • FIG. 9 shows a schematic flowchart of step S620 of the parameter control method shown in FIG. 8 of the present application.
  • FIG. 10 shows a block diagram of a parameter control device provided by an embodiment of the present application.
  • FIG. 11 shows a block diagram of an electronic device for executing a parameter control method according to an embodiment of the present application.
  • FIG. 12 shows a storage unit, according to an embodiment of the present application, for storing or carrying program code for implementing the parameter control method of the embodiments of the present application.
  • In order to improve the user's sensory experience when wearing a head-mounted display device, that is, in order to enable the user to see a clearer virtual image, the head-mounted display device generally uses dark transparent lenses; and in order to improve the user's interactive experience when wearing the head-mounted display device, the head-mounted display device generally provides the user with a function of manually adjusting the parameters of its display screen, so as to satisfy the user's use experience and interaction experience.
  • As one approach, a camera can be installed on the head-mounted display device to capture the user's eyeballs, and image processing technology can be used to determine the relative position of the user's eyeballs and the main optical axis of the display screen of the head-mounted display device, and the parameters are then adjusted according to this relative position.
  • However, this method cannot determine the user's observation area, and involuntary eye movements will frequently trigger parameter adjustments, resulting in a poor user experience.
  • As another approach, a high-speed infrared camera can be set on the head-mounted display device to capture infrared characteristic images of the human eye; image processing is then performed on the infrared characteristic image, the gaze point is solved through a pre-established mathematical model of the human eye, and finally the parameters of the display screen are adjusted, or partial rendering of the image is performed, according to the position of the gaze point of the human eye.
  • However, this method requires higher computing power from the head-mounted display device; moreover, the infrared characteristics of different human eyes are different, and different human eye mathematical models need to be established for different races.
  • real-time gaze point detection will also increase the power consumption of the head-mounted display device.
  • Moreover, the real world observed by the user suffers from problems of insufficient brightness and color distortion.
  • After long-term research, the inventor proposed the parameter control method and device, head-mounted display device and storage medium provided by the embodiments of the present application, in which the relationship between the image the wearer is gazing at and the image displayed on the display screen is used to control the parameters of the display screen, thereby improving the accuracy of parameter control and improving the user experience.
  • the specific parameter control method will be described in detail in the subsequent embodiments.
  • FIG. 1 shows a schematic structural diagram of a head-mounted display device that can be used in an embodiment of the present application.
  • the head-mounted display device 100 includes a camera 110 and a display screen 120 , and the camera 110 and the display screen 120 are arranged on a device body of the head-mounted display device 100 .
  • the number of cameras 110 may be two, and when the head-mounted display device 100 is in the wearing state, the two cameras 110 may respectively correspond to the positions of the wearer's two eyes, and shoot toward the wearer's eyes.
  • The number of display screens 120 can be two, and when the head-mounted display device 100 is in the wearing state, the two display screens 120 can correspond to the positions of the wearer's two eyes respectively, so that the wearer can view the displayed content through both eyes.
  • the head-mounted display device 100 may be an integrated head-mounted display device, or the head-mounted display device may be an intelligent terminal such as a mobile phone connected to an external head-mounted display device.
  • The head-mounted display device 100 can be used as a processing and storage device of the head-mounted display, plugged into or connected to an external head-mounted display device, and the resource to be displayed (a virtual object) can be displayed in the head-mounted display device 100.
  • The head-mounted display device 100 may be a head-mounted augmented reality (augmented reality, AR) device; when the head-mounted display device 100 is a head-mounted augmented reality device, its display screen 120 can be made of electrochromic materials.
  • The head-mounted display device 100 may also be a head-mounted virtual reality (virtual reality, VR) device; when the head-mounted display device 100 is a head-mounted virtual reality device, its display screen 120 can be made with dark lenses.
  • the scene (marker) that the user wants to render may be stored in the head-mounted display device 100 .
  • the above-mentioned markers may include at least one sub-marker having one or more feature points.
  • the head-mounted display device 100 may use the aforementioned marker within the field of view as a target marker, and collect an image including the target marker.
  • The captured image of the target marker can be identified to obtain identification results such as spatial position information (for example, the position and orientation of the target marker relative to the head-mounted display device 100) and identity information of the target marker.
  • The head-mounted display device 100 may display corresponding virtual objects based on information such as the spatial position of the target marker relative to the head-mounted display device 100. It can be understood that the specific marker is not limited in the embodiments of the present application; it only needs to be identifiable and trackable by the head-mounted display device 100.
  • FIG. 2 shows a schematic view of a head-mounted display device that can be used in an embodiment of the present application.
  • the head-mounted display device 100 can be worn by a wearer, and the wearer can view the virtual world or the real world through the head-mounted display device 100 , which is not limited here.
  • FIG. 3 shows a schematic flowchart of a parameter control method provided by an embodiment of the present application.
  • The method controls the parameters of the display screen by comparing the image the wearer is gazing at with the image displayed by the head-mounted display device, thereby improving the accuracy of parameter control and improving the user experience.
  • the parameter control method is applied to the parameter control device 200 shown in FIG. 10 and the head-mounted display device 100 ( FIG. 11 ) configured with the parameter control device 200 .
  • the following will take a head-mounted display device as an example to illustrate the specific process of this embodiment.
  • The head-mounted display device used in this embodiment can be a head-mounted augmented reality device, a head-mounted virtual reality device, or the like, which is not limited here.
  • the head-mounted display device includes a camera and a display screen.
  • the parameter control method may specifically include the following steps:
  • Step S110 When the head-mounted display device is in a wearing state, collect an image including eyes of the wearer as a first image through the camera.
  • the camera of the head-mounted display device may collect an image including eyes of the wearer as the first image.
  • In some embodiments, it may be detected whether the display screen of the head-mounted display device is in a bright-screen state, and when it is detected that the display screen of the head-mounted display device is in the bright-screen state, an image including the eyes of the wearer may be collected by the camera as the first image.
  • In some embodiments, when the head-mounted display device is in the wearing state and the display screen of the head-mounted display device is in the bright-screen state, it can be detected whether the display screen displays a target image, and when it is detected that the display screen displays a target image, the camera may collect an image including the eyes of the wearer as the first image.
  • For example, the head-mounted display device may include a pressure sensor, and the head-mounted display device may detect whether it is in the wearing state through the pressure sensor; for another example, the head-mounted display device may include a contact sensor, and whether the head-mounted display device is worn can be detected by the contact sensor.
  • In some embodiments, the camera of the head-mounted display device can be kept on when the head-mounted display device is on, and when the head-mounted display device is switched from the on state to the off state, the camera can be switched synchronously from the on state to the off state.
  • In some embodiments, the camera of the head-mounted display device may remain in the off state when the head-mounted display device is in the on state, switch from the off state to the on state when receiving an instruction from the user to turn on the camera, and switch from the on state to the off state when receiving an instruction from the user to turn off the camera, wherein the instruction can be generated based on the user's touch operation on the head-mounted display device, or based on the user's voice command, etc., which is not limited here.
  • In some embodiments, the camera of the head-mounted display device can be kept in the off state when the head-mounted display device is in the on state, switch from the off state to the on state when it is detected that the head-mounted display device is in the wearing state, and switch from the on state to the off state when it is detected that the head-mounted display device is switched from the wearing state to the non-wearing state.
  • In some embodiments, when it is detected that the head-mounted display device has been switched from the wearing state to the non-wearing state for a preset duration, the head-mounted display device can be switched from the on state to the off state.
  • In this embodiment, when the head-mounted display device shoots through the camera (two cameras, one on the left and one on the right), at least an image of the wearer's eyes can be captured, and the image of the wearer's eyes can be taken as the first image. That is to say, the image captured by the camera may include only the wearer's eyes, or may include the wearer's eyes and the area around the eyes, etc., which is not limited here.
  • Step S120 Based on the first image, acquire the image that the wearer is gazing at as a second image.
  • the electronic device may obtain the image that the wearer is gazing at, and use the image that the wearer is gazing at as the second image.
  • the electronic device may process the first image through a gaze area determination algorithm to obtain the image that the wearer is gazing at as the second image.
  • As one manner, after the electronic device obtains the first image, it can input the first image into the gaze area judgment algorithm; the gaze area judgment algorithm can locate the lens within the wearer's pupil from the first image, and determine the mirror image reflected on the crystalline lens within the wearer's pupil as the image the wearer is gazing at.
  • As another manner, after the electronic device obtains the first image, it can input the first image into the gaze area judgment algorithm; the gaze area judgment algorithm can detect the position and size of the wearer's iris in the first image, extract the iris image from the first image based on the position and size of the iris, obtain from the iris image the mirror image reflected on the lens within the wearer's pupil, and determine this mirror image as the image the wearer is gazing at.
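  • The following is a minimal illustrative sketch of the second manner above, assuming OpenCV is available: detect the iris as a circle, crop the central pupil region, and mirror the crop so it can later be compared with the displayed image. The function name, parameter values, and the use of Hough circle detection are assumptions for illustration only, not the gaze area judgment algorithm of the embodiments.

```python
# Illustrative sketch only (not the patent's gaze area judgment algorithm).
import cv2
import numpy as np

def extract_gazed_image(first_image: np.ndarray):
    """Return the mirror image reflected on the lens within the pupil, or None."""
    gray = cv2.cvtColor(first_image, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)
    # Detect the iris as the most prominent circle in the eye image.
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
                               param1=100, param2=30, minRadius=20, maxRadius=120)
    if circles is None:
        return None
    x, y, r = np.round(circles[0, 0]).astype(int)  # iris center and radius
    # Treat the central part of the iris as the pupil and crop the reflection there.
    pupil_r = max(r // 2, 1)
    crop = first_image[max(y - pupil_r, 0):y + pupil_r,
                       max(x - pupil_r, 0):x + pupil_r]
    # The reflection on the eye is mirror-reversed, so flip it horizontally.
    return cv2.flip(crop, 1)  # candidate second image
```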
  • Step S130 Acquire the image displayed on the display screen as a third image.
  • the image displayed on the display screen of the head-mounted display device may be acquired, and the image displayed on the display screen of the head-mounted display device may be used as the third image.
  • an image capture program may be run on the head-mounted display device, and the image displayed on the display screen may be captured by the image capture program, and the captured image may be used as the third image.
  • image acquisition may be performed on the display screen of the head-mounted display device, and a mirror image of the captured image may be acquired, and the mirror image of the captured image may be used as a third image.
  • a screenshot may be taken on the display screen of the head-mounted display device to obtain a screenshot image, and the screenshot image may be used as the third image.
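  • As a minimal sketch of the mirror-image manner above: capture the displayed frame and flip it horizontally so that it matches the orientation of the reflection captured from the wearer's eye. The frame-grabbing function is a placeholder assumption; the actual capture depends on the device's rendering pipeline.

```python
# Illustrative sketch only: obtaining the third image as a mirror of the displayed frame.
import cv2
import numpy as np

def acquire_third_image(grab_display_frame) -> np.ndarray:
    """Obtain the third image as a horizontal mirror of the displayed frame."""
    frame = grab_display_frame()      # placeholder: screenshot of the display screen
    return cv2.flip(frame, 1)         # mirror image of the captured image
```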
  • Step S140 Control the parameters of the display screen based on the second image and the third image.
  • In this embodiment, controlling the parameters of the display screen of the head-mounted display device may include adjusting the parameters of the display screen and not adjusting the parameters of the display screen, wherein adjusting the parameters of the display screen may include increasing the parameters of the display screen and reducing the parameters of the display screen, and not adjusting the parameters of the display screen includes maintaining the parameters of the display screen.
  • As one manner, the second image and the third image can be compared to obtain the similarity between the second image and the third image, and the parameters of the display screen are controlled according to the similarity between the second image and the third image.
  • As another manner, the second image and the third image can be compared to obtain the degree of difference between the second image and the third image, and the parameters of the display screen are controlled according to the degree of difference between the second image and the third image.
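  • As an illustration of how such a similarity could be computed, the sketch below uses a simple average-hash comparison with OpenCV. This is a minimal example under assumed parameters; it is not the specific comparison algorithm of the embodiments (a hash-based manner is described in the later embodiments, and other measures could equally be used).

```python
# Illustrative average-hash similarity; parameter values are assumptions.
import cv2
import numpy as np

def average_hash(image_bgr: np.ndarray, hash_size: int = 8) -> np.ndarray:
    """Boolean average hash: each bit marks whether a pixel is brighter than the mean."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    small = cv2.resize(gray, (hash_size, hash_size), interpolation=cv2.INTER_AREA)
    return small > small.mean()

def image_similarity(second_image: np.ndarray, third_image: np.ndarray) -> float:
    """Similarity in [0, 1]: fraction of matching hash bits between the two images."""
    return float(np.mean(average_hash(second_image) == average_hash(third_image)))
```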
  • In some embodiments, the parameters of the display screen may include brightness. Therefore, after the second image and the third image are obtained, the brightness of the display screen may be controlled based on the second image and the third image. For example, the brightness of the display screen is controlled according to the similarity between the second image and the third image, or the brightness of the display screen is controlled according to the difference between the second image and the third image.
  • the parameters of the display screen may include color, therefore, after the second image and the third image are obtained, the color of the display screen may be controlled based on the second image and the third image.
  • For example, the color of the display screen is controlled according to the similarity between the second image and the third image, or the color of the display screen is controlled according to the difference between the second image and the third image.
  • In some embodiments, the parameters of the display screen may include resolution. Therefore, after the second image and the third image are obtained, the resolution of the display screen may be controlled based on the second image and the third image. For example, the resolution of the display screen is controlled according to the similarity between the second image and the third image, or according to the difference between the second image and the third image.
  • In some embodiments, the parameters of the display screen may include sharpness. Therefore, after the second image and the third image are obtained, the sharpness of the display screen may be controlled based on the second image and the third image. For example, the sharpness of the display screen is controlled according to the similarity between the second image and the third image, or according to the difference between the second image and the third image.
  • In some embodiments, the display screen can also include other parameters; therefore, after the second image and the third image are obtained, these other parameters of the display screen can also be controlled based on the second image and the third image, which will not be repeated here.
  • In the parameter control method provided by this embodiment of the present application, an image including the eyes of the wearer is collected by the camera as the first image; based on the first image, the image that the wearer is gazing at is obtained as the second image; the image displayed on the display screen is obtained as the third image; and the parameters of the display screen are controlled based on the second image and the third image. In this way, by comparing the image that the wearer is gazing at with the image displayed by the head-mounted display device, the parameters of the display screen are controlled, thereby improving the accuracy of parameter control and improving the user experience.
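  • Putting the above steps together, the following is a high-level sketch of the flow of steps S110 to S140. The device object and its methods are placeholder assumptions; extract_gazed_image, acquire_third_image and image_similarity refer to the illustrative sketches above.

```python
def parameter_control_loop(device):
    """Sketch of the flow of steps S110 to S140 while the device is worn (interfaces assumed)."""
    while device.is_worn():                                  # e.g. via pressure/contact sensor
        first_image = device.eye_camera.capture()            # step S110
        second_image = extract_gazed_image(first_image)      # step S120
        if second_image is None:
            continue                                         # gaze image not found this frame
        third_image = acquire_third_image(device.grab_display_frame)  # step S130
        similarity = image_similarity(second_image, third_image)
        device.control_display_parameters(similarity)        # step S140 (brightness, color, ...)
```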
  • FIG. 4 shows a schematic flowchart of a parameter control method provided in another embodiment of the present application.
  • the method is applied to the above-mentioned head-mounted display device, which includes a camera and a display screen.
  • In this embodiment, the head-mounted display device includes a head-mounted augmented reality device.
  • The process shown in FIG. 4 will be described in detail below, and the parameter control method may specifically include the following steps:
  • Step S210 When the head-mounted augmented reality device is in a wearing state, collect an image including eyes of the wearer as a first image through the camera.
  • Step S220 Based on the first image, acquire the image that the wearer is gazing at as a second image.
  • Step S230 Acquire the image displayed on the display screen as a third image.
  • For the specific description of step S210 to step S230, please refer to step S110 to step S130, which will not be repeated here.
  • Step S240 Obtain the similarity between the second image and the third image as a first similarity.
  • the head-mounted display device is a head-mounted augmented reality device. That is, when the wearer uses the head-mounted augmented reality device, he can obtain visual information from the virtual world and the real world, and through the superimposition of the images of the virtual world and the real world, it can give the wearer a novel gaming experience.
  • the second image and the third image may be compared to obtain the similarity between the second image and the third image as the first similarity.
  • the second image and the third image can be compared by a hash algorithm to obtain the similarity between the second image and the third image, as the first similarity.
  • As another manner, feature extraction can be performed on the second image to obtain a first feature parameter, and feature extraction can be performed on the third image to obtain a second feature parameter.
  • After the first feature parameter and the second feature parameter are obtained, they can be compared to obtain the similarity between the first feature parameter and the second feature parameter, and the similarity between the first feature parameter and the second feature parameter is determined as the similarity between the second image and the third image, that is, as the first similarity.
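  • As a minimal illustration of this feature-extraction manner, the sketch below uses a normalized hue/saturation histogram as the feature parameter of each image and takes the histogram correlation as the first similarity. The choice of histogram features, bin counts, and the correlation metric are assumptions for illustration; the embodiments do not fix a particular feature.

```python
# Illustrative feature-parameter comparison; the feature choice is an assumption.
import cv2
import numpy as np

def feature_parameter(image_bgr: np.ndarray) -> np.ndarray:
    """Feature parameter of an image: a normalized 2D hue/saturation histogram."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [32, 32], [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def first_similarity(second_image: np.ndarray, third_image: np.ndarray) -> float:
    """Compare the two feature parameters; correlation in [-1, 1], higher means more similar."""
    p1 = feature_parameter(second_image)   # first feature parameter
    p2 = feature_parameter(third_image)    # second feature parameter
    return float(cv2.compareHist(p1, p2, cv2.HISTCMP_CORREL))
```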
  • Step S250 When the first similarity is greater than a first similarity threshold, control the brightness of the display screen to be greater than the first brightness threshold.
  • In this embodiment, the head-mounted augmented reality device may preset and store a similarity threshold as the first similarity threshold, where the first similarity threshold is used as a basis for judging the similarity between the second image and the third image. Therefore, in this embodiment, after the first similarity between the second image and the third image is obtained, the first similarity can be compared with the first similarity threshold to determine whether the first similarity is greater than the first similarity threshold.
  • If the judgment result indicates that the first similarity is greater than the first similarity threshold, it indicates that the similarity between the second image (the image the wearer is gazing at) and the third image (the image displayed on the display screen) is greater than the similarity threshold; that is, the image the wearer is gazing at is basically the same as the image displayed on the display screen. Therefore, it can be considered that the image the user is gazing at is the image displayed on the display screen.
  • In this case, the area the user is gazing at is the area where the virtual image is displayed (the display area), so the brightness of the display screen can be controlled to be greater than the first brightness threshold, so as to improve the display effect of the image displayed on the display screen and enhance the user's visual experience of the virtual image.
  • the head-mounted augmented reality device may preset and store a first brightness threshold, which is used as a basis for controlling the brightness of the display screen. Therefore, in this embodiment, when it is determined that the first similarity is greater than the first similarity threshold, the current brightness of the display screen can be detected, and if the current brightness is greater than the first brightness threshold, the brightness of the display screen remains unchanged. If the current brightness is less than the first brightness threshold, the brightness of the display screen is increased to be greater than the first brightness threshold.
  • Step S260 When the first similarity is not greater than the first similarity threshold, control the brightness of the display screen to be smaller than the first brightness threshold.
  • If the judgment result indicates that the first similarity is not greater than the first similarity threshold, it indicates that the similarity between the second image (the image the wearer is gazing at) and the third image (the image displayed on the display screen) is not greater than the similarity threshold; that is, the image the wearer is gazing at is basically different from the image displayed on the display screen. Therefore, it can be considered that the image the user is gazing at is not the image displayed on the display screen.
  • In this case, the brightness of the display screen can be controlled to be less than the first brightness threshold, so as to reduce the power consumption of the head-mounted augmented reality device by reducing the brightness of the display screen.
  • In some embodiments, the head-mounted augmented reality device may preset and store a first brightness threshold, which is used as a basis for controlling the brightness of the display screen. Therefore, in this embodiment, when it is determined that the first similarity is not greater than the first similarity threshold, the current brightness of the display screen can be detected; if the current brightness is greater than the first brightness threshold, the brightness of the display screen is reduced to less than the first brightness threshold, and if the current brightness is less than the first brightness threshold, the brightness of the display screen is kept unchanged.
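  • The decision logic of steps S250 and S260 can be sketched as follows. The threshold values and the set_display_brightness() interface are assumptions for illustration only.

```python
# Illustrative brightness control for steps S250/S260; thresholds and setter are assumed.
FIRST_SIMILARITY_THRESHOLD = 0.8   # assumed value
FIRST_BRIGHTNESS_THRESHOLD = 0.6   # assumed value, normalized brightness in [0, 1]

def control_brightness(first_similarity: float, current_brightness: float,
                       set_display_brightness) -> None:
    if first_similarity > FIRST_SIMILARITY_THRESHOLD:
        # Wearer is gazing at the displayed virtual image: keep the screen bright (step S250).
        if current_brightness < FIRST_BRIGHTNESS_THRESHOLD:
            set_display_brightness(min(FIRST_BRIGHTNESS_THRESHOLD + 0.1, 1.0))
    else:
        # Wearer is not gazing at the displayed image: dim the screen to save power (step S260).
        if current_brightness > FIRST_BRIGHTNESS_THRESHOLD:
            set_display_brightness(max(FIRST_BRIGHTNESS_THRESHOLD - 0.1, 0.0))
```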
  • In the parameter control method provided by this embodiment of the present application, an image including the eyes of the wearer is collected by the camera as the first image; based on the first image, the image that the wearer is gazing at is obtained as the second image; the image displayed on the display screen is obtained as the third image; the similarity between the second image and the third image is obtained as the first similarity; when the first similarity is greater than the first similarity threshold, the brightness of the display screen is controlled to be greater than the first brightness threshold; and when the first similarity is not greater than the first similarity threshold, the brightness of the display screen is controlled to be less than the first brightness threshold.
  • In addition, this embodiment controls the brightness of the display screen by comparing the similarity between the image the wearer is gazing at and the image displayed by the head-mounted augmented reality device, thereby improving the accuracy of brightness control, so as to improve the user experience and reduce the power consumption of the head-mounted augmented reality device.
  • FIG. 5 shows a schematic flowchart of a parameter control method provided in another embodiment of the present application.
  • the method is applied to the above-mentioned head-mounted display device, which includes a camera and a display screen.
  • the head-mounted display device includes a head-mounted augmented reality device, and the display screen is made of electrochromic materials.
  • the process shown in Figure 5 will be described in detail below, and the parameter control method may specifically include the following steps:
  • Step S310 When the head-mounted augmented reality device is in a wearing state, collect an image including eyes of the wearer as a first image through the camera.
  • Step S320 Based on the first image, acquire the image that the wearer is gazing at as a second image.
  • Step S330 Obtain the image displayed on the display screen as a third image.
  • For the specific description of step S310 to step S330, please refer to step S110 to step S130, which will not be repeated here.
  • Step S340 Obtain the similarity between the second image and the third image as a second similarity.
  • In this embodiment, the head-mounted display device is a head-mounted augmented reality device, and the display screen is made of electrochromic materials.
  • Step S350 When the second similarity is greater than the second similarity threshold, control the color of the display screen to be a first target color, wherein the light transmittance of the display screen under the first target color is less than a light transmittance threshold.
  • In this embodiment, the head-mounted augmented reality device may preset and store a similarity threshold as the second similarity threshold, where the second similarity threshold is used as a basis for judging the similarity between the second image and the third image. Therefore, in this embodiment, after the second similarity between the second image and the third image is obtained, the second similarity can be compared with the second similarity threshold to determine whether the second similarity is greater than the second similarity threshold.
  • If the judgment result indicates that the second similarity is greater than the second similarity threshold, it indicates that the similarity between the second image (the image the wearer is gazing at) and the third image (the image displayed on the display screen) is greater than the similarity threshold; that is, the image the wearer is gazing at is basically the same as the image displayed on the display screen. Therefore, it can be considered that the image the user is gazing at is the image displayed on the display screen.
  • In this case, the color of the display screen can be controlled to be the first target color, wherein the light transmittance of the display screen under the first target color is less than the light transmittance threshold, so that the user can see a clearer image of the virtual world, improving the user's visual experience.
  • In some embodiments, the head-mounted augmented reality device may preset and store a first target color, and the first target color is used as a basis for controlling the color of the display screen. Therefore, in this embodiment, when it is determined that the second similarity is greater than the second similarity threshold, the current color of the display screen can be detected; if the current color is not lighter than the first target color, the color of the display screen remains unchanged, and if the current color is lighter than the first target color, the color of the display screen is darkened to the first target color.
  • In some embodiments, the head-mounted augmented reality device may preset and store a light transmittance threshold, and the light transmittance threshold is used as a basis for controlling the light transmittance of the display screen. Therefore, in this embodiment, when it is determined that the second similarity is greater than the second similarity threshold, the current light transmittance of the display screen can be detected; if the current light transmittance is less than the light transmittance threshold, the color of the display screen remains unchanged, and if the current light transmittance is not less than the light transmittance threshold, the color of the display screen is darkened to the first target color.
  • Step S360 When the second similarity is not greater than the second similarity threshold, control the color of the display screen to be a second target color, wherein the light transmittance of the display screen under the second target color is greater than the light transmittance threshold.
  • If the judgment result indicates that the second similarity is not greater than the second similarity threshold, it indicates that the similarity between the second image (the image the wearer is gazing at) and the third image (the image displayed on the display screen) is not greater than the similarity threshold; that is, the image the wearer is gazing at is basically different from the image displayed on the display screen. Therefore, it can be considered that the image the user is gazing at is not the image displayed on the display screen.
  • In this case, the area the user is gazing at is the real world (a non-display area), and the wearer is not paying attention to the image displayed on the display screen, so the color of the display screen can be controlled to be the second target color, wherein the light transmittance of the display screen under the second target color is greater than the light transmittance threshold, so that the user can see a clearer and distortion-free real world, improving the user's visual experience.
  • In some embodiments, the head-mounted augmented reality device may preset and store a second target color, and the second target color is used as a basis for controlling the color of the display screen. Therefore, in this embodiment, when it is determined that the second similarity is not greater than the second similarity threshold, the current color of the display screen can be detected; if the current color is darker than the second target color, the color of the display screen is lightened to the second target color, and if the current color is not darker than the second target color, the color of the display screen remains unchanged.
  • In some embodiments, the head-mounted augmented reality device may preset and store a light transmittance threshold, and the light transmittance threshold is used as a basis for controlling the light transmittance of the display screen. Therefore, in this embodiment, when it is determined that the second similarity is not greater than the second similarity threshold, the current light transmittance of the display screen can be detected; if the current light transmittance is not greater than the light transmittance threshold, the color of the display screen is lightened to the second target color, and if the current light transmittance is greater than the light transmittance threshold, the color of the display screen remains unchanged.
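  • The decision of steps S350 and S360 can be sketched as follows for an electrochromic display screen. The target colors, the similarity threshold, and the set_electrochromic_state() interface are assumptions for illustration; a real device would drive the electrochromic layer through its own controller.

```python
# Illustrative color control for steps S350/S360; states and threshold are assumed.
SECOND_SIMILARITY_THRESHOLD = 0.8   # assumed value
FIRST_TARGET_COLOR = "dark"         # assumed state: light transmittance below the threshold
SECOND_TARGET_COLOR = "clear"       # assumed state: light transmittance above the threshold

def control_color(second_similarity: float, set_electrochromic_state) -> None:
    if second_similarity > SECOND_SIMILARITY_THRESHOLD:
        # Wearer is gazing at the virtual image: darken the electrochromic screen so the
        # displayed content stands out (step S350, low transmittance).
        set_electrochromic_state(FIRST_TARGET_COLOR)
    else:
        # Wearer is looking at the real world: make the screen transparent so the real
        # scene is seen clearly and without distortion (step S360, high transmittance).
        set_electrochromic_state(SECOND_TARGET_COLOR)
```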
  • In the parameter control method provided by this embodiment of the present application, an image including the eyes of the wearer is collected by the camera as the first image; based on the first image, the image that the wearer is gazing at is obtained as the second image; the image displayed on the display screen is obtained as the third image; the similarity between the second image and the third image is obtained as the second similarity; when the second similarity is greater than the second similarity threshold, the color of the display screen is controlled to be the first target color, wherein the light transmittance of the display screen under the first target color is less than the light transmittance threshold; and when the second similarity is not greater than the second similarity threshold, the color of the display screen is controlled to be the second target color, wherein the light transmittance of the display screen under the second target color is greater than the light transmittance threshold.
  • In addition, this embodiment controls the color of the display screen by comparing the similarity between the image the wearer is gazing at and the image displayed by the head-mounted augmented reality device, thereby improving the accuracy of color control, so that the user can seamlessly switch between the real world and the virtual world.
  • FIG. 6 shows a schematic flowchart of a parameter control method provided by another embodiment of the present application.
  • the method is applied to the above-mentioned head-mounted display device, which includes a camera and a display screen.
  • the head-mounted display device includes a head-mounted augmented reality device, and the display screen is made of electrochromic materials.
  • the process shown in Figure 6 will be described in detail below, and the parameter control method may specifically include the following steps:
  • Step S410 When the head-mounted augmented reality device is in a wearing state, collect an image including eyes of the wearer as a first image through the camera.
  • Step S420 Based on the first image, acquire the image that the wearer is gazing at as a second image.
  • Step S430 Obtain the image displayed on the display screen as a third image.
  • For the specific description of step S410 to step S430, please refer to step S110 to step S130, which will not be repeated here.
  • Step S440 Obtain the similarity between the second image and the third image as a third similarity.
  • In this embodiment, the head-mounted display device is a head-mounted augmented reality device, and the display screen is made of electrochromic materials.
  • the second image and the third image may be compared to obtain the similarity between the second image and the third image as the third similarity.
  • It should be noted that the first similarity, the second similarity and the third similarity are only distinguished by name, and there is no substantial difference between them.
  • For example, the first similarity may be the second similarity or the third similarity.
  • As one manner, after the second image and the third image are obtained, the second image can be compared with the third image through a hash algorithm to obtain the similarity between the second image and the third image as the third similarity.
  • As another manner, feature extraction can be performed on the second image to obtain a first feature parameter, and feature extraction can be performed on the third image to obtain a second feature parameter.
  • After the first feature parameter and the second feature parameter are obtained, they can be compared to obtain the similarity between the first feature parameter and the second feature parameter, and the similarity between the first feature parameter and the second feature parameter is determined as the similarity between the second image and the third image, that is, as the third similarity.
  • Step S450 When the third similarity is greater than the third similarity threshold, control the brightness of the display screen to be greater than the first brightness threshold, and control the color of the display screen to be the first target color, wherein the display The light transmittance of the screen under the first target color is less than a light transmittance threshold.
  • In this embodiment, the head-mounted augmented reality device may preset and store a similarity threshold as the third similarity threshold, where the third similarity threshold is used as a basis for judging the similarity between the second image and the third image. Therefore, in this embodiment, after the third similarity between the second image and the third image is obtained, the third similarity can be compared with the third similarity threshold to determine whether the third similarity is greater than the third similarity threshold.
  • If the judgment result indicates that the third similarity is greater than the third similarity threshold, it indicates that the similarity between the second image (the image the wearer is gazing at) and the third image (the image displayed on the display screen) is greater than the similarity threshold; that is, the image the wearer is gazing at is basically the same as the image displayed on the display screen. Therefore, it can be considered that the image the user is gazing at is the image displayed on the display screen.
  • In this case, the area the user is gazing at is the area where the virtual image is displayed (the display area). Accordingly, the brightness of the display screen can be controlled to be greater than the first brightness threshold, so as to improve the display effect of the image displayed on the display screen and enhance the user's visual experience of the virtual image; and the color of the display screen can be controlled to be the first target color, wherein the light transmittance of the display screen under the first target color is less than the light transmittance threshold, so that the user can see a clearer image of the virtual world, improving the user's visual experience.
  • Step S460 When the third similarity is not greater than the third similarity threshold, control the brightness of the display screen to be smaller than the first brightness threshold, and control the color of the display screen to be the second target color, wherein the light transmittance of the display screen under the second target color is greater than the light transmittance threshold.
  • If the judgment result indicates that the third similarity is not greater than the third similarity threshold, it indicates that the similarity between the second image (the image the wearer is gazing at) and the third image (the image displayed on the display screen) is not greater than the similarity threshold; that is, the image the wearer is gazing at is basically different from the image displayed on the display screen. Therefore, it can be considered that the image the user is gazing at is not the image displayed on the display screen.
  • In this case, the brightness of the display screen can be controlled to be less than the first brightness threshold, so as to reduce the power consumption of the head-mounted augmented reality device by reducing the brightness of the display screen; and the color of the display screen can be controlled to be the second target color, wherein the light transmittance of the display screen under the second target color is greater than the light transmittance threshold, so that the user can see a clearer and distortion-free real world, improving the user's visual experience.
  • Step S470 When the third similarity changes from not greater than the third similarity threshold to greater than the third similarity threshold, control the brightness of the display screen to gradually increase from less than the first brightness threshold to greater than the first brightness threshold, and control the color of the display screen to gradually change from the first target color to the second target color.
  • In this embodiment, the second image (the image the wearer is gazing at) and the third image (the image displayed on the display screen) can be acquired continuously, and the similarity between the second image and the third image is obtained as the third similarity.
  • When the third similarity changes from not greater than the third similarity threshold to greater than the third similarity threshold, it indicates that the similarity between the second image (the image the wearer is gazing at) and the third image (the image displayed on the display screen) changes from not greater than the similarity threshold to greater than the similarity threshold; that is, the image the wearer is gazing at and the image displayed on the display screen change from being basically different to being basically the same. Therefore, it can be considered that the image the user is gazing at changes from not being the image displayed on the display screen to being the image displayed on the display screen.
  • In this case, the brightness of the display screen can be controlled to gradually increase from less than the first brightness threshold to greater than the first brightness threshold, and the color of the display screen can be controlled to gradually change from the first target color to the second target color, so as to improve the display effect of the image displayed on the display screen and enhance the user's visual experience of the virtual image.
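  • As a minimal sketch of the gradual transition in step S470: instead of switching abruptly when the third similarity crosses the threshold, the brightness (and, analogously, the electrochromic color state) can be ramped over several small steps. The step count, delay, and setter interface are assumptions for illustration.

```python
# Illustrative gradual transition for step S470; parameters and setter are assumed.
import time

def ramp_brightness(set_display_brightness, start: float, end: float,
                    steps: int = 10, delay_s: float = 0.05) -> None:
    """Gradually move the display brightness from `start` to `end` in linear steps."""
    for i in range(1, steps + 1):
        set_display_brightness(start + (end - start) * i / steps)
        time.sleep(delay_s)

# Example: the third similarity has just risen above the third similarity threshold.
# if previous_similarity <= THIRD_SIMILARITY_THRESHOLD < current_similarity:
#     ramp_brightness(set_display_brightness, low_brightness, high_brightness)
#     # The electrochromic color state could be ramped between target colors analogously.
```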
  • In the parameter control method provided by this embodiment of the present application, an image including the eyes of the wearer is collected by the camera as the first image; based on the first image, the image that the wearer is gazing at is obtained as the second image; the image displayed on the display screen is obtained as the third image; the similarity between the second image and the third image is obtained as the third similarity; when the third similarity is greater than the third similarity threshold, the brightness of the display screen is controlled to be greater than the first brightness threshold and the color of the display screen is controlled to be the first target color, wherein the light transmittance of the display screen under the first target color is less than the light transmittance threshold; when the third similarity is not greater than the third similarity threshold, the brightness of the display screen is controlled to be less than the first brightness threshold and the color of the display screen is controlled to be the second target color, wherein the light transmittance of the display screen under the second target color is greater than the light transmittance threshold; and when the third similarity changes from not greater than the third similarity threshold to greater than the third similarity threshold, the brightness of the display screen is controlled to gradually increase from less than the first brightness threshold to greater than the first brightness threshold, and the color of the display screen is controlled to gradually change from the first target color to the second target color.
  • In addition, this embodiment controls the brightness and color of the display screen by comparing the similarity between the image the wearer is gazing at and the image displayed by the head-mounted augmented reality device, thereby improving the accuracy of brightness and color control, enabling the user to seamlessly switch between the real world and the virtual world, and reducing the power consumption of the head-mounted augmented reality device.
  • FIG. 7 shows a schematic flowchart of a parameter control method provided in still another embodiment of the present application.
  • the method is applied to the above-mentioned head-mounted display device, which includes a camera and a display screen.
  • In this embodiment, the head-mounted display device includes a head-mounted virtual reality device.
  • The process shown in FIG. 7 will be described in detail below, and the parameter control method may specifically include the following steps:
  • Step S510 When the head-mounted virtual reality device is in a wearing state, collect an image including eyes of the wearer as a first image through the camera.
  • Step S520 Based on the first image, acquire the image that the wearer is gazing at as a second image.
  • Step S530 Obtain the image displayed on the display screen as a third image.
  • For the specific description of step S510 to step S530, please refer to step S110 to step S130, which will not be repeated here.
  • Step S540 Obtain the similarity between each of the plurality of sub-images and the second image, and obtain a plurality of fourth similarities.
  • In this embodiment, the head-mounted display device is a head-mounted virtual reality device; that is, when the wearer uses the head-mounted virtual reality device, the wearer obtains visual information from the virtual world.
  • the third image includes a plurality of sub-images.
  • In some embodiments, the third image may include multiple objects, and each object may correspond to a sub-image in the third image, wherein the objects may include, but are not limited to, people, animals, and items; alternatively, the third image may include multiple regions, and each region may correspond to a sub-image in the third image.
  • a plurality of sub-images included in the third image may be obtained, and the second image is compared with the plurality of sub-images to obtain the degree of similarity between each of the plurality of sub-images and the second image , so as to obtain multiple similarities as multiple fourth similarities.
  • As one manner, after the second image and the multiple sub-images in the third image are obtained, the second image can be compared with the multiple sub-images respectively through a hash algorithm, so as to obtain the similarity between each of the multiple sub-images and the second image, that is, to obtain the plurality of fourth similarities.
  • As another manner, feature extraction can be performed on the second image to obtain a first feature parameter, and feature extraction can be performed on the multiple sub-images respectively to obtain multiple second feature parameters. After the first feature parameter and the multiple second feature parameters are obtained, the multiple second feature parameters can be compared with the first feature parameter respectively to obtain the similarity between each of the multiple second feature parameters and the first feature parameter, and these similarities are determined as the similarities between the multiple sub-images and the second image, so as to obtain the plurality of fourth similarities.
  • Step S550 Obtain a fourth similarity with a similarity greater than a fourth similarity threshold from the plurality of fourth similarities as a target similarity.
  • In this embodiment, the head-mounted virtual reality device may preset and store a fourth similarity threshold, where the fourth similarity threshold is used as a basis for judging the similarity between each sub-image and the second image. Therefore, in this embodiment, after the fourth similarity between each of the multiple sub-images and the second image is obtained, the multiple fourth similarities can be compared with the fourth similarity threshold to determine whether each fourth similarity is greater than the fourth similarity threshold, and according to the judgment result, the fourth similarity greater than the fourth similarity threshold is obtained from the plurality of fourth similarities as the target similarity.
  • For example, when one fourth similarity among the multiple fourth similarities is greater than the fourth similarity threshold, this fourth similarity is used as the target similarity; when three fourth similarities among the multiple fourth similarities are greater than the fourth similarity threshold, the three fourth similarities are used as the target similarities; and so on, which is not limited here.
  • Step S560 Obtain the sub-image corresponding to the target similarity as the target sub-image.
  • a sub-image corresponding to the target similarity may be acquired from multiple sub-images as the target sub-image.
  • When the number of target similarities is one, the corresponding number of target sub-images is one; when the number of target similarities is multiple, the number of corresponding target sub-images is multiple.
  • Step S570 Control the brightness of the area of the display screen that displays the target sub-image to be greater than the second brightness threshold, and control the brightness of the areas of the display screen other than the area displaying the target sub-image to be not greater than the second brightness threshold.
  • Since the fourth similarity corresponding to the target sub-image is greater than the fourth similarity threshold, the similarity between the target sub-image and the second image is greater than the fourth similarity threshold; that is, the image the wearer is gazing at is basically the same as the target sub-image. Therefore, the image that the user is gazing at can be considered to be the target sub-image.
  • In this case, the area the user is gazing at is the area where the target sub-image is displayed, so the brightness of the area of the display screen that displays the target sub-image can be controlled to be greater than the second brightness threshold, so as to improve the display effect of the target sub-image.
  • the first brightness threshold and the second brightness threshold are only distinguished as names, and there is no substantial difference.
  • the first brightness threshold may be the second brightness threshold.
  • Since the fourth similarities corresponding to the sub-images of the third image other than the target sub-image are not greater than the fourth similarity threshold, the similarities between those sub-images and the second image are not greater than the fourth similarity threshold; that is, the image that the wearer is gazing at is substantially different from the sub-images of the third image other than the target sub-image. Therefore, it can be considered that the image the user is gazing at is not any sub-image other than the target sub-image.
  • Accordingly, the brightness of the areas of the display screen that display the sub-images of the third image other than the target sub-image can be controlled to be not greater than the second brightness threshold, so as to reduce the display power consumption of the display screen and thereby reduce the power consumption of the head-mounted virtual reality device.
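  • A minimal sketch of the regional brightness control of step S570, treating the display brightness as an 8-bit per-region level; the concrete levels and the second brightness threshold value below are illustrative assumptions:

```python
import numpy as np

def region_brightness_map(rows, cols, target_idx,
                          second_brightness_threshold=160,
                          bright_level=220, dim_level=80):
    """bright_level is above the threshold for gazed regions; dim_level is not above it."""
    brightness = np.full((rows, cols), dim_level, dtype=np.uint8)
    for i in target_idx:
        brightness[i // cols, i % cols] = bright_level
    return brightness

# Example: a 4x4 grid in which only region 5 shows the target sub-image.
print(region_brightness_map(4, 4, [5]))
```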
  • Yet another embodiment of the present application provides a parameter control method.
  • In this method, when the head-mounted display device is in a wearing state, an image including the eyes of the wearer is collected by the camera as the first image; based on the first image, the image that the wearer is gazing at is obtained as the second image; the image displayed on the display screen is obtained as the third image; the similarity between each of the multiple sub-images of the third image and the second image is obtained to obtain multiple fourth similarities; the fourth similarities greater than the fourth similarity threshold are obtained from the multiple fourth similarities as the target similarities; the sub-images corresponding to the target similarities are obtained as the target sub-images; the brightness of the areas of the display screen that display the target sub-images is controlled to be greater than the second brightness threshold; and the brightness of the areas of the display screen other than those displaying the target sub-images is controlled to be not greater than the second brightness threshold.
  • On this basis, this embodiment also performs regional control of the brightness of the display screen by comparing the similarity between the image the wearer is gazing at and the image displayed by the head-mounted virtual reality device, thereby improving the accuracy of brightness control, improving the user experience, and reducing the power consumption of the head-mounted virtual reality device.
  • FIG. 8 shows a schematic flowchart of a parameter control method provided in yet another embodiment of the present application. The method is applied to the above-mentioned head-mounted display device.
  • the head-mounted display device includes a camera and a display screen. The process shown in FIG. 8 will be described in detail below.
  • the parameter control method may specifically include the following steps:
  • Step S610 When the head-mounted display device is in a wearing state, collect an image including eyes of the wearer as a first image through the camera.
  • For the specific description of step S610, please refer to step S110, which will not be repeated here.
  • Step S620 From the first image, acquire an image on the lens in the pupil of the wearer as a fourth image.
  • In this embodiment, an image on the lens in the wearer's pupil can be obtained from the first image, and the image on the lens in the pupil can be used as the fourth image.
  • FIG. 9 shows a schematic flowchart of step S620 of the parameter control method shown in FIG. 8 of the present application.
  • the process shown in Figure 9 will be described in detail below, and the method may specifically include the following steps:
  • Step S621 Obtain the position and size of the wearer's iris from the first image.
  • the position and size of the wearer's iris can be obtained from the first image.
  • the first image can be input into the gaze area judgment algorithm, and the position and size of the iris in the first image can be detected by the gaze area judgment algorithm.
  • Step S622 Based on the position and size of the iris, extract an iris image from the first image.
  • an iris image may be extracted from the first image based on the position and size of the iris.
  • the iris region may be separated from the first image based on the position and size of the iris to obtain an iris image.
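  • A minimal sketch of steps S621 and S622, using a Hough circle transform as a stand-in for the gaze area judgment algorithm mentioned above (the algorithm itself and the detector parameters below are assumptions for illustration only):

```python
import cv2
import numpy as np

def extract_iris_image(first_image):
    """Detect the iris position and size in the first image and crop the iris region."""
    gray = cv2.medianBlur(cv2.cvtColor(first_image, cv2.COLOR_BGR2GRAY), 5)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
                               param1=100, param2=30, minRadius=20, maxRadius=120)
    if circles is None:
        return None                                   # no iris found
    x, y, r = np.round(circles[0, 0]).astype(int)     # iris centre and radius
    h, w = gray.shape
    return first_image[max(0, y - r):min(h, y + r),
                       max(0, x - r):min(w, x + r)]   # iris image
```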
  • Step S623 From the iris image, acquire an image on the lens in the pupil of the wearer as the fourth image.
  • In this embodiment, the image on the lens in the wearer's pupil can be obtained from the iris image, and the image on the lens in the pupil can be used as the fourth image.
  • the data processing amount of the head-mounted display device can be effectively reduced by extracting the iris image from the first image for subsequent processing.
  • Step S630 Obtain a mirror image of the fourth image as the second image.
  • The lens in the pupil reflects a portion of the incident light, so a mirror image of the object that the human eye is gazing at can be observed on the lens. Therefore, the fourth image obtained in the preceding steps can be regarded as a mirror image of the object the wearer is gazing at. Accordingly, in this embodiment, after the image on the lens in the pupil is obtained as the fourth image, the mirror image of the fourth image can be obtained and determined as the image that the wearer is gazing at, that is, the second image.
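  • A minimal sketch of step S630: since the reflection on the lens is mirrored, a horizontal flip of the fourth image is one simple way to approximate the second image (corneal-curvature correction, which a practical system may also need, is omitted here):

```python
import cv2

def recover_second_image(fourth_image):
    # 1 = flip around the vertical axis, undoing the left-right mirroring of the reflection.
    return cv2.flip(fourth_image, 1)
```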
  • Step S640 Obtain the image displayed on the display screen as a third image.
  • Step S650 Based on the second image and the third image, control the parameters of the display screen.
  • For the specific description of step S640 to step S650, please refer to step S130 to step S140, which will not be repeated here.
  • In this method, the position and size of the wearer's iris are obtained from the first image; based on the position and size of the iris, an iris image is extracted from the first image; from the iris image, the image on the lens in the wearer's pupil is obtained as the fourth image; the mirror image of the fourth image is obtained as the second image; the image displayed on the display screen is obtained as the third image; and the parameters of the display screen are controlled based on the second image and the third image.
  • On this basis, this embodiment also acquires, from the image including the wearer's eyes, the mirror image of the image on the lens in the wearer's pupil as the image the wearer is gazing at, thereby improving the accuracy of the determined image that the wearer is gazing at.
  • FIG. 10 shows a block diagram of a parameter control device provided by an embodiment of the present application.
  • the parameter control device 200 is applied to the above-mentioned head-mounted display device.
  • the head-mounted display device includes a camera and a display screen. The block diagram shown in FIG. 10 will be described below.
  • The parameter control device 200 includes: a first image acquisition module 210, a second image acquisition module 220, a third image acquisition module 230, and a parameter control module 240, wherein:
  • the first image acquisition module 210 is configured to, when the head-mounted display device is in the wearing state, acquire an image including eyes of the wearer as a first image through the camera.
  • the second image acquisition module 220 is configured to acquire an image that the wearer is gazing at as a second image based on the first image.
  • the second image acquisition module 220 includes: a fourth image acquisition submodule and a second image acquisition submodule, wherein:
  • the fourth image acquisition sub-module is configured to acquire an image on the lens in the wearer's pupil from the first image as a fourth image.
  • the fourth image acquisition submodule includes: an iris parameter acquisition unit, an iris image extraction unit, and a fourth image acquisition unit, wherein:
  • the iris parameter acquisition unit is configured to acquire the position and size of the wearer's iris from the first image.
  • An iris image extraction unit configured to extract an iris image from the first image based on the position and size of the iris.
  • the fourth image acquisition unit is configured to acquire an image on the lens in the pupil of the wearer from the iris image as the fourth image.
  • the second image acquisition sub-module is configured to acquire a mirror image of the fourth image as the second image.
  • the third image acquiring module 230 is configured to acquire the image displayed on the display screen as a third image.
  • the parameter control module 240 is configured to control the parameters of the display screen based on the second image and the third image.
  • the head-mounted display device includes a head-mounted augmented reality device
  • the parameter control module 240 includes: a first similarity acquisition submodule, a first brightness control submodule, and a second brightness control submodule, wherein:
  • the first similarity acquiring submodule is configured to acquire the similarity between the second image and the third image as the first similarity.
  • the first brightness control sub-module is configured to control the brightness of the display screen to be greater than the first brightness threshold when the first similarity is greater than the first similarity threshold.
  • the second brightness control submodule is configured to control the brightness of the display screen to be smaller than the first brightness threshold when the first similarity is not greater than the first similarity threshold.
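  • A minimal sketch of the first and second brightness control submodules, where set_brightness is a hypothetical display-driver callback and the threshold values are illustrative assumptions:

```python
def control_brightness(first_similarity, set_brightness,
                       first_similarity_threshold=0.8,
                       first_brightness_threshold=150):
    if first_similarity > first_similarity_threshold:
        set_brightness(first_brightness_threshold + 50)   # greater than the first brightness threshold
    else:
        set_brightness(first_brightness_threshold - 50)   # smaller than the first brightness threshold
```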
  • the head-mounted display device includes a head-mounted augmented reality device
  • the display screen is made of electrochromic materials
  • the parameter control module 240 includes: a second similarity acquisition submodule, a first color control submodule, and a second color control submodule, wherein:
  • the second similarity acquiring submodule is configured to acquire the similarity between the second image and the third image as the second similarity.
  • The first color control submodule is used to control the color of the display screen to be the first target color when the second similarity is greater than the second similarity threshold, wherein the light transmittance of the display screen under the first target color is less than the light transmittance threshold.
  • The second color control submodule is used to control the color of the display screen to be the second target color when the second similarity is not greater than the second similarity threshold, wherein the light transmittance of the display screen under the second target color is greater than the light transmittance threshold.
  • the head-mounted display device includes a head-mounted augmented reality device
  • the display screen is made of electrochromic materials
  • the parameter control module 240 includes: a third similarity acquisition submodule, a first parameter control submodule, and a second parameter control submodule, wherein:
  • the third similarity acquiring submodule is configured to acquire the similarity between the second image and the third image as a third similarity.
  • the first parameter control submodule is used to control the brightness of the display screen to be greater than the first brightness threshold and control the color of the display screen to be the first target color when the third similarity is greater than the third similarity threshold , wherein the light transmittance of the display screen under the first target color is less than a light transmittance threshold.
  • The second parameter control submodule is used to, when the third similarity is not greater than the third similarity threshold, control the brightness of the display screen to be smaller than the first brightness threshold and control the color of the display screen to be the second target color, wherein the light transmittance of the display screen under the second target color is greater than the light transmittance threshold.
  • the head-mounted display device includes a head-mounted augmented reality device
  • the display screen is made of electrochromic materials
  • the parameter control module 240 includes: a third parameter control submodule, wherein:
  • The third parameter control submodule is used to, when the third similarity changes from not greater than the third similarity threshold to greater than the third similarity threshold, control the brightness of the display screen to gradually increase to greater than the first brightness threshold, and control the color of the display screen to gradually change from the second target color to the first target color.
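  • A minimal sketch of the gradual transition performed by the third parameter control submodule; set_brightness and set_color are hypothetical driver callbacks, and the levels, colours and step count are illustrative assumptions:

```python
def gradual_transition(set_brightness, set_color,
                       low_brightness=60, high_brightness=220,
                       second_target_color=(255, 255, 255),   # high transmittance
                       first_target_color=(30, 30, 30),       # low transmittance
                       steps=20):
    """Ramp brightness up past the first brightness threshold and fade the colour over `steps` updates."""
    for i in range(1, steps + 1):
        t = i / steps
        set_brightness(int(low_brightness + t * (high_brightness - low_brightness)))
        set_color(tuple(int(s + t * (e - s))
                        for s, e in zip(second_target_color, first_target_color)))
```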
  • the head-mounted display device includes a head-mounted virtual reality device
  • the parameter control module 240 includes: a fourth similarity acquisition submodule, a target similarity acquisition submodule, a target sub-image acquisition submodule, and a fourth parameter control submodule, where:
  • the fourth similarity obtaining submodule is configured to obtain the similarity between each of the plurality of sub-images and the second image to obtain a plurality of fourth similarities.
  • the target similarity acquisition submodule is configured to acquire a fourth similarity with a similarity greater than a fourth similarity threshold from the plurality of fourth similarities as the target similarity.
  • the target sub-image acquisition sub-module is configured to acquire the sub-image corresponding to the target similarity as the target sub-image.
  • The fourth parameter control submodule is used to control the brightness of the area of the display screen that displays the target sub-image to be greater than the second brightness threshold, and to control the brightness of the areas of the display screen other than the area displaying the target sub-image to be not greater than the second brightness threshold.
  • the coupling between the modules may be electrical, mechanical or other forms of coupling.
  • each functional module in each embodiment of the present application may be integrated into one processing module, each module may exist separately physically, or two or more modules may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules.
  • FIG. 11 shows a structural block diagram of a head-mounted display device 100 provided by an embodiment of the present application.
  • The head-mounted display device in this application may include one or more of the following components: a camera 110, a display screen 120, a processor 130, a memory 140, and one or more application programs, wherein the one or more application programs may be stored in the memory 140 and configured to be executed by the one or more processors 130, and the one or more programs are configured to execute the methods described in the foregoing method embodiments.
  • the processor 130 may include one or more processing cores.
  • The processor 130 uses various interfaces and lines to connect the various parts of the head-mounted display device, and performs the various functions of the head-mounted display device and processes data by running or executing instructions, programs, code sets or instruction sets stored in the memory 140 and calling data stored in the memory 140.
  • The processor 130 may be implemented in hardware in the form of at least one of a Digital Signal Processor (Digital Signal Processing, DSP), a Field-Programmable Gate Array (Field-Programmable Gate Array, FPGA), and a Programmable Logic Array (Programmable Logic Array, PLA).
  • the processor 130 may integrate one or a combination of a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU), a modem, and the like.
  • the CPU mainly handles the operating system, user interface and application programs, etc.
  • the GPU is used to render and draw the content to be displayed
  • The modem is used to handle wireless communication. It can be understood that the above modem may also not be integrated into the processor 130 but implemented by a separate communication chip.
  • The memory 140 may include random access memory (Random Access Memory, RAM), and may also include read-only memory (Read-Only Memory, ROM).
  • the memory 140 may be used to store instructions, programs, codes, sets of codes or sets of instructions.
  • The memory 140 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playback function, an image playback function, etc.), instructions for implementing the foregoing method embodiments, and the like.
  • The data storage area can also store data created during use of the head-mounted display device 100 (such as a phonebook, audio and video data, and chat record data) and the like.
  • FIG. 12 shows a structural block diagram of a computer-readable storage medium provided by an embodiment of the present application.
  • Program codes are stored in the computer-readable medium 300, and the program codes can be invoked by a processor to execute the methods described in the foregoing method embodiments.
  • the computer readable storage medium 300 may be an electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read Only Memory), EPROM, hard disk, or ROM.
  • the computer-readable storage medium 300 includes a non-transitory computer-readable storage medium (non-transitory computer-readable storage medium).
  • the computer-readable storage medium 300 has a storage space for program code 310 for executing any method steps in the above methods. These program codes can be read from or written into one or more computer program products.
  • Program code 310 may, for example, be compressed in a suitable form.
  • In summary, when the head-mounted display device is in a wearing state, the image including the eyes of the wearer is collected by the camera as the first image; based on the first image, the image that the wearer is gazing at is acquired as the second image; the image displayed on the display screen is acquired as the third image; and the parameters of the display screen are controlled based on the second image and the third image. In this way, the parameters of the display screen are controlled by comparing the image the wearer is gazing at with the image displayed by the head-mounted display device, thereby improving the accuracy of parameter control and improving the user experience.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present application discloses a parameter control method and apparatus, a head-mounted display device and a storage medium, which relate to the technical field of electronic devices. The method is applied to the head-mounted display device, which comprises a camera and a display screen. The method comprises: when the head-mounted display device is in a wearing state, acquiring, by means of the camera, an image comprising an eye of a wearer as a first image; acquiring, on the basis of the first image, an image that the wearer is gazing at as a second image; acquiring an image displayed by the display screen as a third image; and controlling a parameter of the display screen on the basis of the second image and the third image. The present application controls the parameter of the display screen by comparing the image the wearer is gazing at with the image displayed by the head-mounted display device, thereby improving the accuracy of parameter control and improving the user experience.
PCT/CN2022/104430 2021-08-04 2022-07-07 Procédé et appareil de commande de paramètre, dispositif visiocasque et support de stockage WO2023011103A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110893069.1 2021-08-04
CN202110893069.1A CN113672085A (zh) 2021-08-04 2021-08-04 参数控制方法、装置、头戴式显示设备以及存储介质

Publications (1)

Publication Number Publication Date
WO2023011103A1 true WO2023011103A1 (fr) 2023-02-09

Family

ID=78541396

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/104430 WO2023011103A1 (fr) 2021-08-04 2022-07-07 Procédé et appareil de commande de paramètre, dispositif visiocasque et support de stockage

Country Status (2)

Country Link
CN (1) CN113672085A (fr)
WO (1) WO2023011103A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117615248A (zh) * 2023-11-24 2024-02-27 北京东舟技术股份有限公司 一种vr显示屏内容的拍摄方法、装置、设备及存储介质

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113672085A (zh) * 2021-08-04 2021-11-19 Oppo广东移动通信有限公司 参数控制方法、装置、头戴式显示设备以及存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014048775A (ja) * 2012-08-30 2014-03-17 Kotaro Unno 注視位置特定装置、および注視位置特定プログラム
CN109582141A (zh) * 2018-11-23 2019-04-05 华为技术有限公司 根据眼球焦点控制显示屏的方法和头戴电子设备
CN112596247A (zh) * 2020-12-31 2021-04-02 Oppo广东移动通信有限公司 图像显示方法、装置以及头戴显示设备
WO2021134710A1 (fr) * 2019-12-31 2021-07-08 深圳市大疆创新科技有限公司 Procédé de commande et dispositif associé
CN113204281A (zh) * 2021-03-22 2021-08-03 闻泰通讯股份有限公司 终端屏幕亮度动态调整方法、装置、电子设备及存储介质
CN113672085A (zh) * 2021-08-04 2021-11-19 Oppo广东移动通信有限公司 参数控制方法、装置、头戴式显示设备以及存储介质

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014048775A (ja) * 2012-08-30 2014-03-17 Kotaro Unno 注視位置特定装置、および注視位置特定プログラム
CN109582141A (zh) * 2018-11-23 2019-04-05 华为技术有限公司 根据眼球焦点控制显示屏的方法和头戴电子设备
WO2021134710A1 (fr) * 2019-12-31 2021-07-08 深圳市大疆创新科技有限公司 Procédé de commande et dispositif associé
CN112596247A (zh) * 2020-12-31 2021-04-02 Oppo广东移动通信有限公司 图像显示方法、装置以及头戴显示设备
CN113204281A (zh) * 2021-03-22 2021-08-03 闻泰通讯股份有限公司 终端屏幕亮度动态调整方法、装置、电子设备及存储介质
CN113672085A (zh) * 2021-08-04 2021-11-19 Oppo广东移动通信有限公司 参数控制方法、装置、头戴式显示设备以及存储介质

Also Published As

Publication number Publication date
CN113672085A (zh) 2021-11-19

Similar Documents

Publication Publication Date Title
WO2023011103A1 (fr) Procédé et appareil de commande de paramètre, dispositif visiocasque et support de stockage
WO2020015468A1 (fr) Procédé et appareil de transmission d'image, dispositif terminal et support de stockage
US9143693B1 (en) Systems and methods for push-button slow motion
TWI545947B (zh) 具有影像擷取及分析模組之顯示裝置
EP2634727B1 (fr) Terminal portable et procédé permettant de corriger la direction du regard d'un utilisateur dans une image
CN111580652B (zh) 视频播放的控制方法、装置、增强现实设备及存储介质
CN108076290B (zh) 一种图像处理方法及移动终端
US11320655B2 (en) Graphic interface for real-time vision enhancement
JP4384240B2 (ja) 画像処理装置、画像処理方法、画像処理プログラム
US9911214B2 (en) Display control method and display control apparatus
WO2023065849A1 (fr) Procédé et appareil de réglage de luminosité d'écran pour dispositif électronique, et dispositif électronique
CN108712603B (zh) 一种图像处理方法及移动终端
US11487354B2 (en) Information processing apparatus, information processing method, and program
CA2773865A1 (fr) Dispositif d'affichage avec saisie d'images et module d'analyse
TWI729983B (zh) 電子裝置、螢幕調節系統及方法
CN112866576B (zh) 图像预览方法、存储介质及显示设备
TWI617994B (zh) 使用於互動裝置並用於辨識使用者操作該互動裝置的行為的方法,以及相關互動裝置與電腦可讀媒體
US20230229234A1 (en) Rendering enhancement based in part on eye tracking
CN110211211B (zh) 图像处理方法、装置、电子设备及存储介质
CN111880711B (zh) 显示控制方法、装置、电子设备及存储介质
JP4945617B2 (ja) 画像処理装置、画像処理方法、画像処理プログラム
WO2020044916A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
US11328187B2 (en) Information processing apparatus and information processing method
CN110706164A (zh) 基于增强现实的管状视野图像变形显示方法及眼镜
US11270409B1 (en) Variable-granularity based image warping

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22851818

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE