CN110322818A - Display device and operating method - Google Patents

Display device and operating method

Info

Publication number
CN110322818A
CN110322818A (Application No. CN201910239748.XA)
Authority
CN
China
Prior art keywords
image data
area
resolution image
display
rate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910239748.XA
Other languages
Chinese (zh)
Other versions
CN110322818B (en)
Inventor
Anson Chen (安森·陈)
Lequn Liu (刘乐群)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Omnivision Technologies Inc
Original Assignee
Omnivision Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Omnivision Technologies Inc filed Critical Omnivision Technologies Inc
Publication of CN110322818A publication Critical patent/CN110322818A/en
Application granted granted Critical
Publication of CN110322818B publication Critical patent/CN110322818B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0176Head mounted characterised by mechanical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0147Head-up displays characterised by optical features comprising a device modifying the resolution of the displayed image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present application relates to a display system and an operating method. The display system includes a display positioned to show an image to a user, and a sensor positioned to monitor a gaze position of the user's eyes. A controller is coupled to the display and the sensor, and the controller includes logic that causes the display system to perform operations. For example, the controller may receive gaze-position information from the sensor and determine the gaze position of the eyes. First-resolution image data is output to the display for a first area of the image. Second-resolution image data is output to the display for a second area of the image. And third-resolution image data is output to the display for a third area of the image.

Description

Display device and operating method
Technical field
The present invention relates generally to displays and, in particular but not exclusively, to eye tracking.
Background
Virtual reality (VR) is a computer-simulated experience that reproduces a lifelike sense of immersion. Current VR experiences typically project an environment in front of the user. In some cases, the VR experience also includes immersive audio, for example through headphones. A user may be able to look around or move within the simulated environment using a user interface. Vibrating user interfaces, or interfaces that provide resistance to controls, can sometimes be used to interact with the environment.
In general, the performance requirements of a VR head-mounted system are stricter than those of the display systems in cellular phones, tablet computers, and televisions. This is because the user's eyes are extremely close to the display screen during operation, and, to some extent, because of the rate at which the human eye can process images.
Summary of the invention
In one aspect, the present application provides a display system that includes: a display positioned to show an image to a user; a sensor positioned to monitor a gaze position of the user's eyes and to output gaze-position information; and a controller coupled to the display and the sensor. The controller includes logic that, when executed by the controller, causes the display system to perform operations including: receiving, by the controller, the gaze-position information from the sensor; determining, by the controller, the gaze position of the eyes; outputting first-resolution image data to the display for a first area of the image, where the first area includes the gaze position of the eyes on the display; outputting second-resolution image data to the display for a second area of the image, where the first-resolution image data has a higher resolution than the second-resolution image data; and outputting third-resolution image data to the display for a third area of the image, where the second area is disposed between the first area and the third area, and the second-resolution image data has a higher resolution than the third-resolution image data.
In another aspect, the present application provides a head-mounted device that includes: a housing shaped to be mounted on the head of a user; a display positioned in the housing to show an image to the user when the housing is mounted on the user's head; a sensor positioned in the housing to monitor a gaze position of the user's eyes when the housing is mounted on the user's head, and to output gaze-position information; and a controller coupled to the display and the sensor. The controller includes logic that, when executed by the controller, causes the head-mounted device to perform operations including: receiving, by the controller, the gaze-position information from the sensor; determining, by the controller, the gaze position of the eyes; outputting first-resolution image data to the display for a first area of the image, where the first area includes the gaze position of the eyes on the display; outputting second-resolution image data to the display for a second area of the image, where the first-resolution image data has a higher resolution than the second-resolution image data; and outputting third-resolution image data to the display for a third area of the image, where the second area is disposed between the first area and the third area, and the second-resolution image data has a higher resolution than the third-resolution image data.
In another aspect, the present application provides a method that includes: receiving, by a controller, gaze-position information from a sensor to capture a gaze position of a user's eyes; determining, by the controller, the gaze position of the eyes; and outputting an image from the controller to a display, including: outputting first-resolution image data for a first area of the image, where the first area includes the gaze position of the eyes on the display; outputting second-resolution image data to the display for a second area of the image, where the first-resolution image data has a higher resolution than the second-resolution image data; and outputting third-resolution image data to the display for a third area of the image, where the second area is disposed between the first area and the third area, and the second-resolution image data has a higher resolution than the third-resolution image data.
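The controller operations recited in the aspects above (receive gaze information, determine the gaze position, and output image data at descending resolutions for the different areas) can be sketched as a simple loop. This is a minimal illustration under assumed interfaces: the `StubSensor` and `StubDisplay` classes, the `render` helper, and the scale values are all hypothetical and do not appear in the patent.

```python
class StubSensor:
    """Stand-in for the gaze sensor: returns a gaze position in pixels."""
    def read(self):
        return (960, 540)

class StubDisplay:
    """Stand-in for the display: records what each area receives."""
    def __init__(self):
        self.frames = {}
    def write(self, area, data):
        self.frames[area] = data

def render(gx, gy, scale):
    # Placeholder for real rendering: record the gaze and resolution scale.
    return {"gaze": (gx, gy), "scale": scale}

def run_once(sensor, display):
    gaze_info = sensor.read()               # receive gaze-position info
    gx, gy = gaze_info                      # determine gaze position of eyes
    display.write(1, render(gx, gy, 1.0))   # first area: highest resolution
    display.write(2, render(gx, gy, 0.5))   # second area: lower resolution
    display.write(3, render(gx, gy, 0.25))  # third area: lowest resolution

d = StubDisplay()
run_once(StubSensor(), d)
print(d.frames[1]["scale"] > d.frames[2]["scale"] > d.frames[3]["scale"])  # True
```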
Brief description of the drawings
Non-limiting and non-exclusive examples of the invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
FIG. 1A depicts an exemplary head-mounted device in accordance with the teachings of the present invention.
FIG. 1B depicts a cross-sectional view of the exemplary head-mounted device of FIG. 1A in accordance with the teachings of the present invention.
FIGs. 2A and 2B illustrate examples of providing image data to a display in a manner that reduces the required bandwidth, in accordance with the teachings of the present invention.
FIG. 3 shows an exemplary method of operating a head-mounted device in accordance with the teachings of the present invention.
FIG. 4 shows an exemplary method of operating a head-mounted device in accordance with the teachings of the present invention.
FIG. 5 shows an exemplary method of operating a head-mounted device in accordance with the teachings of the present invention.
FIG. 6 shows an exemplary method of operating a head-mounted device in accordance with the teachings of the present invention.
Corresponding reference characters indicate corresponding components throughout the several views of the drawings. Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some elements in the figures may be exaggerated relative to other elements to help improve understanding of various embodiments of the present invention. Also, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments.
Detailed description
Examples of an apparatus, system, and method relating to a display device are described herein. In the following description, numerous specific details are set forth to provide a thorough understanding of the examples. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
Reference throughout this specification to "an example" or "one embodiment" means that a particular feature, structure, or characteristic described in connection with the example is included in at least one example of the present invention. Thus, the appearances of the phrases "in one example" or "in one embodiment" in various places throughout this specification are not necessarily all referring to the same example. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more examples.
The performance requirements of virtual reality (VR) or augmented reality (AR) head-mounted systems are stricter than those of the display systems in cellular phones, tablet computers, and televisions. One key performance requirement is high resolution. A foveal pixel density of 60 pixels/degree is commonly regarded as the eye's limiting resolution. For VR, each high-resolution stereoscopic planar image is displayed twice, once for each eye, and must occupy most of the user's peripheral vision (e.g., ~180 degrees of vertical vision, and ~135 degrees of horizontal vision). To present a high-resolution image, a large image data set may need to be provided from the processor/controller of the VR system to the VR display.
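As a rough illustration of why the data set is large, the figures quoted above imply a very high pixel count if the full field of view were rendered at foveal density. The sketch below simply multiplies the quoted numbers; the field-of-view and pixel-density values are approximations taken from the text, not a specification.

```python
# Rough estimate of the pixel count needed to match the eye's limiting
# resolution over a wide field of view. Values are illustrative
# approximations quoted in the text above, not a product spec.

PIXELS_PER_DEGREE = 60   # commonly cited foveal limiting resolution
FOV_H_DEG = 135          # horizontal field of view (~135 degrees)
FOV_V_DEG = 180          # vertical field of view (~180 degrees)

def required_pixels(fov_h_deg, fov_v_deg, ppd=PIXELS_PER_DEGREE):
    """Pixels needed if the full FOV were rendered at foveal density."""
    return (fov_h_deg * ppd) * (fov_v_deg * ppd)

per_eye = required_pixels(FOV_H_DEG, FOV_V_DEG)  # 8100 x 10800 pixels
both_eyes = 2 * per_eye                          # image shown once per eye
print(per_eye)      # 87480000 pixels per eye
print(both_eyes)    # 174960000
```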
Another key performance parameter is low latency. High latency can cause the user to suffer from virtual reality sickness. In some VR embodiments, the ideal latency is 7 to 15 milliseconds. A major component of this latency is the refresh rate of the display, which is being driven to as high as 120 Hz or even 240 Hz. Graphics processing units (GPUs) also need to become more powerful to render frames more frequently. In some VR examples, a frame rate of at least 90 fps is needed for a sense of seamless continuity.
Consequently, because large data sets are required, it is quite challenging for current graphics cards and displays to simultaneously achieve all three of: at least 90 fps (frames per second), a refresh rate of 120 Hz or greater (for stereoscopic 3D above 1080p resolution), and a wide field of view. The present invention describes a head-mounted device/system (and operating method) that reduces the required bandwidth and achieves better latency without a perceptible degradation of image quality to the user.
Examples related to the figures identified above, and other examples, are discussed below.
FIG. 1A depicts an exemplary head-mounted device 100, which includes display 101, housing 121, strap 123, data/power connection 125, controller 131, and network 141. Controller 131 includes memory 132, power supply 133, data input/output 135, processor 137, and network connection 139. It is appreciated that all of the depicted electronic components are coupled via a bus or the like. It is appreciated that head-mounted device 100 is only one embodiment of a device contemplated by the present invention. One of ordinary skill in the art will appreciate that the teachings disclosed herein are equally applicable to heads-up displays in vehicles (e.g., windshields) or aircraft, or even built into personal computing devices (e.g., smartphones, etc.).
As shown, housing 121 is shaped to be removably mounted on the user's head by using strap 123 (which may be an elastic material, hook-and-loop fastener (Velcro), plastic, or the like, wrapped around the user's head). Housing 121 may be formed of metal, plastic, glass, or the like. Display 101 is disposed in housing 121 and is positioned to show images to the user when housing 121 is mounted on the user's head. It is appreciated that display 101 may be built into housing 121, or may be removably attached to housing 121. For example, display 101 may be part of a smartphone that plugs into housing 121. In other, or the same, examples, display 101 may include a light-emitting diode (LED) display, an organic LED display, a liquid crystal display, a holographic display, or the like. In some examples, display 101 may be partially transparent (or may not obscure all of the user's vision) to provide an augmented reality (AR) environment. It is appreciated that display 101 may be configured so that it is positioned in front of only one of the user's eyes.
In the depicted example, controller 131 is coupled to display 101 and a sensor (e.g., see sensor 151 of FIG. 1B). Controller 131 includes logic that, when executed by controller 131, causes head-mounted device 100 to perform operations (including controlling the images shown on display 101). It is appreciated that controller 131 may be a computer separate from head-mounted device 100, or may be partially disposed in head-mounted device 100 (e.g., if display 101 includes a smartphone, the processor in the smartphone may handle some or all of the processing). Moreover, controller 131 may include a distributed system; for example, controller 131 may receive instructions via the internet or from a remote server. In the depicted example, controller 131 is coupled to receive instructions from network 141 through network connection 139 (e.g., a wireless receiver, an Ethernet port, etc.). Controller 131 also includes processor 137, which may include a graphics processing unit (e.g., one or more graphics cards, a general-purpose processor, etc.). Processor 137 may be coupled to memory 132, such as RAM, ROM, a hard disk, remote storage, or the like. Data input/output 135 may output instructions from controller 131 to head-mounted device 100 through data connection 125, which may include a cable or the like. In some examples, connection 125 may be wireless (e.g., Bluetooth, etc.). Power supply 133 is also included in controller 131 and may include a power supply that plugs into a wall socket (e.g., an AC-to-DC converter), a battery, an inductive charging source, or the like.
FIG. 1B depicts a cross-sectional view of the exemplary head-mounted device 100 of FIG. 1A. As shown, head-mounted device 100 also includes lens optics 155, sensor 151, non-visible light illuminator 153, and padding 157 (so that head-mounted device 100 rests comfortably on the user's forehead). In the depicted example, lens optics 155 (which may include one or more Fresnel lenses, convex lenses, concave lenses, etc.) are positioned in housing 121 between display 101 and the user's eyes to focus the light of the image on display 101 into the user's eyes. Non-visible light illuminator 153 (e.g., an LED) is positioned in housing 121 to illuminate the eyes with non-visible light (e.g., infrared light, etc.), and sensor 151 (e.g., a CMOS image sensor, etc.) is structured (e.g., with an IR-pass filter, or a narrow-bandgap semiconductor material such as Ge/SiGe) to absorb the non-visible light and monitor the gaze position of the eyes. Thus, the user's eyes are fully illuminated for sensor 151, but the user cannot perceive any light other than the light from display 101.
In some examples, there may be only one sensor 151, or there may be multiple sensors 151 disposed at various places around lens optics 155 to monitor the user's eyes. It is appreciated that sensor 151 may be positioned to image the eyes through lens optics 155, or may image the eyes without intervening optics. It is also appreciated that the system may calibrate the eye position to associate it with the position on display 101 at which the user is looking. Calibration may occur at the factory, or after the user purchases the device.
FIGs. 2A and 2B illustrate examples of providing image data to display 201 (e.g., display 101 of FIGs. 1A and 1B) in a manner that reduces the required bandwidth. For example, FIG. 2A shows first-resolution image data for first area 261 of the image (here, a color image) being output to display 201. It is appreciated that first area 261 includes the gaze position of the eyes on the display. In other words, first area 261 is where the eyes are looking on display 201. Depending on the gaze position of the eyes, first area 261 may change position, and the image data transferred to the display changes correspondingly (e.g., different resolution, frame rate, refresh rate, etc.). It is appreciated that since area 261 is where the eyes see best, area 261 may be supplied with the highest-resolution image data. Also shown is second-resolution image data for second area 263 of the image being output to display 201. Second area 263 is in the eyes' peripheral vision; accordingly, the first-resolution image data supplied to first area 261 has a higher resolution than the second-resolution image data supplied to second area 263. As a result, less data needs to be transferred to display 201, without degrading the user experience of the head-mounted device. It is appreciated that, in some examples, for areas other than first area 261, only 1 pixel in X may receive image data from the controller, so that display 201 functionally operates at 1/X resolution in that area. In other words, only 1/X of the pixels may be updated with new information each refresh cycle.
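A minimal sketch of the bandwidth saving from the 1-in-X scheme described above, assuming illustrative pixel counts and an assumed X = 4 (the patent does not fix these values):

```python
# Hypothetical sketch of the "1 pixel in X" idea: outside the gaze area,
# only every X-th pixel receives new data each refresh cycle, so the
# peripheral region functionally runs at 1/X resolution.

def pixels_updated(total_pixels, x):
    """Number of pixels that get new data when only 1 in x is refreshed."""
    return total_pixels // x

def bandwidth_fraction(fovea_px, periph_px, x):
    """Fraction of full-resolution bandwidth actually transmitted."""
    full = fovea_px + periph_px
    sent = fovea_px + pixels_updated(periph_px, x)
    return sent / full

# e.g., a 2M-pixel frame with 200k foveal pixels and X = 4 peripherally
print(bandwidth_fraction(200_000, 1_800_000, 4))  # 0.325
```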
FIG. 2B is similar to FIG. 2A, but includes further regions: third area 265 and fourth area 269. Thus, FIG. 2B includes multiple areas. In the depicted example, third-resolution image data for third area 265 of the image is output to display 201. Second area 263 is disposed between first area 261 and third area 265, and the second-resolution image data has a higher resolution than the third-resolution image data. Thus, the farther from the center of the user's gaze, the lower the resolution of the image. Similarly, fourth area 269 includes fourth-resolution image data, which has a lower resolution than the third-resolution image data.
It is appreciated that second area 263 is concentric with first area 261, and that the resolution of the second-resolution image data decreases gradually from first area 261 to third area 265. Similarly, the resolution of third area 265 may decrease gradually toward fourth area 269. The resolution of the second-resolution and third-resolution image data may decrease from the first area to the fourth area at a linear or a nonlinear rate.
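One way the gradual falloff could be modeled is a scale factor that decreases with normalized distance from the gaze center, either linearly or nonlinearly. The radii and falloff constants below are assumptions for illustration only; the patent does not specify a falloff function.

```python
import math

def resolution_scale(r, r_fovea=0.1, r_edge=1.0, profile="linear"):
    """Relative resolution (1.0 = full) at normalized distance r from gaze."""
    if r <= r_fovea:
        return 1.0                    # first area: full resolution
    t = min((r - r_fovea) / (r_edge - r_fovea), 1.0)
    if profile == "linear":
        return 1.0 - 0.9 * t          # decreases at a linear rate
    return math.exp(-3.0 * t)         # decreases at a nonlinear rate

# Resolution drops monotonically away from the gaze center:
print(resolution_scale(0.05), resolution_scale(0.5), resolution_scale(1.0))
```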
In the same or different examples, the first-resolution image data has a first frame rate, the second-resolution image data has a second frame rate, the third-resolution image data has a third frame rate, and the fourth-resolution image data has a fourth frame rate. The first frame rate is greater than the second frame rate, the second frame rate is greater than the third frame rate, and the third frame rate is greater than the fourth frame rate. Reducing the frame rate of the peripheral regions of the user's vision can further save bandwidth, because less data needs to be transmitted to display 201. It is appreciated that, like the resolution, the second frame rate may decrease gradually from first area 261 to third area 265, and the third frame rate may decrease gradually from second area 263 to fourth area 269.
In another example, or the same example, the first-resolution image data may have a first refresh rate, the second-resolution image data a second refresh rate, the third-resolution image data a third refresh rate, and the fourth-resolution image data a fourth refresh rate. The first refresh rate is greater than the second refresh rate, the second refresh rate is greater than the third refresh rate, and the third refresh rate is greater than the fourth refresh rate. It is appreciated that the second refresh rate may decrease gradually from first area 261 to third area 265, and the third refresh rate may decrease gradually from second area 263 to fourth area 269. As with reducing the frame rate and resolution, reducing the refresh rate likewise reduces the amount of data needed to operate display 201.
FIG. 3 shows an exemplary method 300 of operating a head-mounted device. One of ordinary skill in the art will appreciate that blocks 301-309 of method 300 may occur in any order, and even in parallel. Moreover, blocks may be added to, or removed from, method 300 in accordance with the teachings of the present invention.
Block 301 shows receiving, with a controller (e.g., controller 131 of FIG. 1A), gaze-position information from a sensor (e.g., sensor 151 of FIG. 1B) positioned in the head-mounted device to capture the gaze position of the user's eyes. In some examples, capturing the gaze position of the eyes includes capturing the position on the display at which the eyes are looking. This may be a particular quadrant of the screen, or a respective group of pixels on the screen.
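A minimal sketch of quantizing a captured gaze position into a screen quadrant or pixel group, as block 301 describes. The screen dimensions and tile size are illustrative assumptions, not values from the patent.

```python
def gaze_quadrant(x, y, width=1920, height=1080):
    """Return which quadrant of the screen (0-3) the gaze falls in."""
    col = 0 if x < width / 2 else 1
    row = 0 if y < height / 2 else 1
    return row * 2 + col

def gaze_tile(x, y, tile=64):
    """Return the (tile_x, tile_y) pixel group containing the gaze."""
    return (x // tile, y // tile)

print(gaze_quadrant(300, 200))  # 0 (top-left quadrant)
print(gaze_tile(300, 200))      # (4, 3)
```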
Block 303 depicts determining, with the controller, the gaze position of the eyes. In some examples, this may include associating the position of the user's iris or pupil with the position on the screen at which the user is looking. A calibration system may be used at the factory, or the user may calibrate the head-mounted display before use, to achieve this association. Moreover, the head-mounted display may use machine learning algorithms (e.g., neural networks) or the like to iteratively learn where the user is looking.
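One possible realization of the pupil-to-screen association is a simple per-axis least-squares calibration, sketched below. The mapping model and the calibration values are assumptions for illustration; the patent does not specify how the association is computed.

```python
def fit_axis(pupil_vals, screen_vals):
    """Least-squares fit of screen = a * pupil + b for one axis."""
    n = len(pupil_vals)
    mp = sum(pupil_vals) / n
    ms = sum(screen_vals) / n
    num = sum((p - mp) * (s - ms) for p, s in zip(pupil_vals, screen_vals))
    den = sum((p - mp) ** 2 for p in pupil_vals)
    a = num / den
    return a, ms - a * mp

# Calibration: user fixates known screen targets while pupil x is recorded.
pupil_x = [10.0, 20.0, 30.0]
screen_x = [100.0, 960.0, 1820.0]
a, b = fit_axis(pupil_x, screen_x)
print(a * 20.0 + b)  # ~960.0: the gaze maps back to the center target
```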
Block 305 illustrates outputting an image (e.g., video, video game graphics, etc.) from the controller (which may be disposed in a PC or a game system) to the display, including first-resolution image data for a first area of the image. It is appreciated that the first area includes the gaze position of the eyes on the display (e.g., where the eyes are looking on the display).
Block 307 shows outputting second-resolution image data for a second area of the image to the display. The first-resolution image data has a higher resolution (e.g., 1080p) than the second-resolution image data (e.g., 720p or less). In some examples, the second area is concentric with the first area. In other examples, the areas may not share the same center, and may have a predetermined offset between them.
Block 309 depicts outputting third-resolution image data for a third area of the image to the display. In the depicted example, the second area is disposed between the first area and the third area, and the second-resolution image data has a higher resolution than the third-resolution image data. The resolution of the second-resolution image data may decrease gradually from the first area to the third area (e.g., linearly, exponentially, at a decreasing rate, at an increasing rate, etc.).
In some examples, it is appreciated that the areas of the image may have different frame rates. In one example, the first-resolution image data has a first frame rate, the second-resolution image data has a second frame rate, and the third-resolution image data has a third frame rate, where the first frame rate is greater than the second frame rate, and the second frame rate is greater than the third frame rate. It is appreciated that, like the resolution, the frame rates may decrease gradually from the first area to the third area (e.g., linearly, exponentially, at a decreasing rate, at an increasing rate, etc.). It is appreciated that, in some examples, the frame rates of all pixels in all areas are aligned. In other words, although pixels in different areas have different frame rates, they simultaneously receive newly transmitted image data from the controller. For example, pixels in the first area may receive image data from the controller at 120 Hz, while pixels in the second area receive image data from the controller at 60 Hz; both pixels are updated whenever the second (slower) pixel updates. Thus, the first frame rate is an integer multiple of the second frame rate. In other embodiments, the second frame rate may be an integer multiple of the third frame rate.
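The aligned-update behavior described above (a faster area's rate being an integer multiple of a slower area's, so that every slow update coincides with a fast update) can be checked with a small sketch. The rates are the example values from the text; using integer-millisecond periods is an idealization (120 Hz is treated as an 8 ms period).

```python
def update_ticks(rate_hz, duration_s=1):
    """Set of update times (ms, idealized integer periods) at rate_hz."""
    period_ms = 1000 // rate_hz
    return {t for t in range(0, duration_s * 1000, period_ms)}

fast = update_ticks(120)  # first area: 120 Hz (8 ms period)
slow = update_ticks(60)   # second area: 60 Hz (16 ms period)

# Every slow-area update lands on a fast-area update, so both areas
# can receive newly transmitted data simultaneously:
print(slow <= fast)  # True
```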
In some examples, it is appreciated that the areas of the image may have different refresh rates. In the depicted example, the first-resolution image data has a first refresh rate, the second-resolution image data has a second refresh rate, and the third-resolution image data has a third refresh rate, where the first refresh rate is greater than the second refresh rate, and the second refresh rate is greater than the third refresh rate. In some examples, the second refresh rate decreases gradually from the first area to the third area (e.g., linearly, exponentially, at a decreasing rate, at an increasing rate, etc.). It is appreciated that, in some examples, the refresh cycles of all pixels in all areas are aligned. For example, pixels in the first area may refresh at a rate of 240 Hz, while pixels in the second area refresh at 120 Hz, so that pixels in the two different areas refresh simultaneously but with different periods. Thus, the first refresh rate is an integer multiple of the second refresh rate. In other embodiments, the second refresh rate may be an integer multiple of the third refresh rate.
In an example, across whole display (for example, both in eye focus area or except eye focus area) with Full resolution passes through first frame starting display.So, watching user experience before position calculates attentively in execution not may be degraded. In addition, those skilled in the art will understand that " frame rate " refers to the frequency of image data, and " refresh rate " refers to display The refresh rate of pixel in device, and these rates can be different.
FIG. 4 shows an example method 400 of operating a head-mounted device. It should be appreciated that FIG. 4 may depict a more specific example of the method illustrated in FIG. 3. Those skilled in the art will appreciate that blocks 401 to 413 in method 400 may occur in any order and even in parallel. Additionally, blocks may be added to, or removed from, method 400 in accordance with the teachings of the present invention.
Block 401 shows tracking eye movement with a sensor (which may include tracking the eye-focus direction, the position on the display, the gaze angle, etc.). This information may then be sent to an eye-tracking module (e.g., a component in the controller, which may be implemented in hardware, software, or a combination of both) to track the gaze position of the eyes.
Block 403 describes calculating the gaze position (e.g., based on the eye-focus angle and the distance between the eyes and the display) and defining the address of each pixel bounding the eye-focus area (e.g., the gaze position) on the display. These addresses are then sent to the controller. It should be appreciated that, in accordance with the teachings of the present invention, processors or control circuitry disposed in the head-mounted device may be considered part of the "controller".
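The calculation in block 403 can be sketched as a projection of the focus angle onto the display plane (hypothetical Python; the function name and the angle, distance, and pixel-pitch values are assumptions for illustration, not taken from the specification):

```python
import math

def gaze_position_px(focus_angle_deg, eye_to_display_mm, px_per_mm, center_px):
    """Project the eye-focus angle onto the display: the gaze point is offset
    from the display center by (eye-to-display distance) * tan(focus angle)."""
    offset_mm = eye_to_display_mm * math.tan(math.radians(focus_angle_deg))
    return center_px + offset_mm * px_per_mm

# Looking straight ahead lands on the display center; a 5 degree focus angle
# at 50 mm from a 10 px/mm display shifts the gaze point by roughly 44 pixels.
straight = gaze_position_px(0.0, 50.0, 10.0, 960)
shifted = gaze_position_px(5.0, 50.0, 10.0, 960)
```

The pixel addresses bounding the eye-focus area would then be derived from this gaze point (e.g., a fixed radius around it).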
Block 405 illustrates comparing, with the controller, the addresses of the image pixel data with the received eye-focus boundary addresses. As shown, the controller determines whether an image pixel is within the eye-focus area.
Block 407 shows that if the image pixel is within the eye-focus area, the image data is sent, for each pixel address, to an interface module (e.g., another component in the controller, which may be implemented in hardware, software, or a combination of hardware and software) for high-resolution imaging.
Block 409 describes that if the image pixel is not within the eye-focus area, the system continues comparing adjacent pixels until it reaches an Nth pixel (e.g., the 10th pixel), and then the system sends only the image data of the Nth (e.g., 10th) pixel to the interface module. The data set may thus be greatly reduced. In some examples, N may be greater or less than 10. Those skilled in the art will appreciate that other methods may also be used to reduce the data set for the partial low-resolution imaging.
Block 411 illustrates the interface module sending frames to the VR display via a wireless or wired connection. Each frame includes a full-resolution data set where the pixel addresses are within the eye-focus area and a 1/N (e.g., 1/10) full-resolution data set where the pixel addresses are outside the eye-focus area. This effectively reduces the bandwidth required to deliver the image data from the controller (e.g., controller 131 of FIG. 1A) to the VR head-mounted display.
Block 413 shows displaying the image (e.g., on display 101) at full resolution in the eye-focus area and at 1/N full resolution outside the eye-focus area.
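Blocks 405 to 413 amount to a simple per-pixel subsampling rule; a minimal sketch follows (hypothetical Python; the flat pixel list, the area test, and N = 10 are assumptions, not the disclosed implementation):

```python
def select_pixels(pixels, in_focus_area, n=10):
    """Keep every pixel whose address is in the eye-focus area; outside it,
    keep only every Nth pixel, shrinking the transmitted data set toward 1/N."""
    kept, outside_count = [], 0
    for addr, data in pixels:
        if in_focus_area(addr):
            kept.append((addr, data))
            outside_count = 0
        else:
            outside_count += 1
            if outside_count == n:   # the Nth pixel outside the focus area
                kept.append((addr, data))
                outside_count = 0
    return kept

# 100 pixels with the focus area covering addresses 40-59: all 20 focus-area
# pixels are kept, plus roughly 1/10 of the other 80.
frame = [(i, 0) for i in range(100)]
sent = select_pixels(frame, lambda a: 40 <= a < 60)
```
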
FIG. 5 shows an example method 500 of operating a head-mounted device. It should be appreciated that FIG. 5 may depict a method different from, but similar to, the method depicted in FIG. 4. Those skilled in the art will appreciate that blocks 501 to 517 in method 500 may occur in any order and even in parallel. Additionally, blocks may be added to, or removed from, method 500 in accordance with the teachings of the present invention.
Blocks 501 to 505 describe actions similar to blocks 401 to 405 in method 400 of FIG. 4.
Block 507 shows the system determining whether an image pixel is within the transition area and whether the pixel is within the eye-focus area.
Block 509 shows that if it is determined that the image pixel is not within the transition area, the system continues comparing adjacent pixels until the system reaches an Nth pixel (e.g., the 10th pixel), and then the system sends the image data of the Nth pixel to the interface module.
Block 511 shows that if it is determined that the image pixel is within the transition area, the system continues comparing adjacent pixels until the system reaches an (N/2)th pixel (e.g., the 5th pixel), and then the system sends the image data of the (N/2)th pixel to the interface module.
Block 513 shows that if the image pixel is within the eye-focus area (see block 505), the image data is sent, for each pixel address, to the interface module for high-resolution imaging.
Block 515 illustrates using the interface module to send a frame having three subframes to the VR display (via a wireless or wired connection). The first subframe may include a 1/N (e.g., 1/10) full-resolution data set where the pixel addresses are outside the transition area. The second subframe may include a 2/N (e.g., 1/5) full-resolution data set where the pixel addresses are within the transition area. The third subframe may include the full-resolution data set where the pixel addresses are within the eye-focus area. Accordingly, the bandwidth required to deliver the image data from the controller to the VR head-mounted display is greatly reduced.
Block 517 depicts displaying a frame of the image at high resolution in the eye-focus area, with the resolution degrading more smoothly in the areas away from the gaze position, without noticeable loss of image quality.
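The three-zone selection of blocks 507 to 515 can be sketched by adding a transition area kept at twice the outer density (hypothetical Python; the zone geometry and N = 10 are assumptions, not the disclosed implementation):

```python
def build_subframes(pixels, in_focus, in_transition, n=10):
    """Split one frame into the three subframes of block 515: full resolution
    in the focus area, every (N/2)th pixel in the transition area, and every
    Nth pixel elsewhere."""
    subframes = {1: [], n // 2: [], n: []}   # keyed by subsampling step
    counters = {n // 2: 0, n: 0}
    for addr, data in pixels:
        if in_focus(addr):
            step = 1
        elif in_transition(addr):
            step = n // 2
        else:
            step = n
        if step == 1:
            subframes[1].append((addr, data))
        else:
            counters[step] += 1
            if counters[step] == step:
                subframes[step].append((addr, data))
                counters[step] = 0
    return subframes

# Focus area at addresses 45-54, transition area 35-64 (outside the focus
# area), N = 10; the rest of the 100-pixel frame is kept at 1/10 density.
frame = [(i, 0) for i in range(100)]
sf = build_subframes(frame, lambda a: 45 <= a < 55, lambda a: 35 <= a < 65)
```
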
FIG. 6 shows an example method 600 of operating a head-mounted device. It should be appreciated that FIG. 6 may depict a method different from, but similar to, the method depicted in FIG. 5. Those skilled in the art will appreciate that blocks 601 to 621 in method 600 may occur in any order and even in parallel. Additionally, blocks may be added to, or removed from, method 600 in accordance with the teachings of the present invention.
Block 601 shows the system monitoring eye movement with a sensor (e.g., sensor 155) and sending the eye-focus angle to the eye-tracking module.
Block 603 illustrates the eye-tracking module in the system calculating the gaze position of the eyes (based on the eye-focus angle and the distance between the eyes and the display) and defining the address of each pixel bounding the eye-focus area and the transition area on the display. These addresses may then be sent to the VR controller.
Block 605 describes using the controller to compare the addresses of the image pixel data with the received eye-focus boundary addresses.
Block 607 shows the system determining whether an image pixel is within the transition area and whether the image pixel is within the eye-focus area.
Block 609 illustrates that if the image pixel is not within the transition area, the system continues comparing adjacent pixels until it reaches an Nth pixel (e.g., the 10th pixel), and then the system sends the image data of the Nth pixel to the interface module.
Block 611 describes that if the image pixel is within the transition area, the system continues comparing adjacent pixels until it reaches an (N/2)th pixel (e.g., the 5th pixel), and then the system sends the image data of the (N/2)th pixel to the interface module.
Block 613 shows that if the image pixel is within the eye-focus area, the system sends the image data, for each pixel address, to the interface module for high-resolution imaging.
Block 615 illustrates the interface module sending subframes having a high frame rate and a high refresh rate to the VR display via a wireless or wired connection. Where the pixel addresses are within the eye-focus area, each subframe includes a high-resolution data set.
Block 617 describes the interface module sending subframes having a medium frame rate and a medium refresh rate to the VR display via a wireless or wired connection. Where the pixel addresses are within the transition area, each subframe includes a medium-resolution data set.
Block 619 shows the interface module sending subframes having a low frame rate and a low refresh rate to the VR display via a wireless or wired connection. Where the pixel addresses are outside the transition area, each subframe includes a low-resolution data set.
Block 621 illustrates displaying the image at high resolution, a fast frame rate, and a fast refresh rate in the eye-focus area without noticeable loss of image quality.
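Method 600 couples each zone's subsampling step with its own frame and refresh rates; the mapping can be sketched as follows (hypothetical Python; every value below is an illustrative assumption, chosen so that the faster rates are integer multiples of the slower ones as described earlier):

```python
# zone: (subsampling step, frame rate Hz, refresh rate Hz)
ZONE_PARAMS = {
    "focus":      (1, 120, 240),   # full resolution, fast frame/refresh rates
    "transition": (5, 60, 120),    # every (N/2)th pixel, medium rates
    "outside":    (10, 30, 60),    # every Nth pixel, low rates
}

def classify(addr, in_focus, in_transition):
    """Zone lookup mirroring blocks 607-613."""
    if in_focus(addr):
        return "focus"
    if in_transition(addr):
        return "transition"
    return "outside"

def subframe_params(addr, in_focus, in_transition):
    """Step and rates the interface module would use for this pixel's subframe."""
    return ZONE_PARAMS[classify(addr, in_focus, in_transition)]
```
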
The above description of illustrated examples of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific examples of the invention are described herein for illustrative purposes, those skilled in the art will recognize that various modifications are possible within the scope of the invention.
These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific examples disclosed in the specification. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim construction.

Claims (22)

1. A display system, comprising:
a display positioned to display an image to a user;
a sensor positioned to monitor a gaze position of eyes of the user and to output gaze position information; and
a controller coupled to the display and the sensor, wherein the controller includes logic that, when executed by the controller, causes the display system to perform operations including:
receiving, with the controller, the gaze position information from the sensor;
determining, with the controller, the gaze position of the eyes;
outputting first resolution image data to the display for a first area of the image, wherein the first area includes the gaze position of the eyes on the display;
outputting second resolution image data to the display for a second area of the image, wherein the first resolution image data has a higher resolution than the second resolution image data; and
outputting third resolution image data to the display for a third area of the image, wherein the second area is disposed between the first area and the third area, and wherein the second resolution image data has a higher resolution than the third resolution image data.
2. The display system of claim 1, further comprising:
a housing shaped to be removably mounted on a head of the user, wherein the display is structured to be disposed in the housing when the housing is mounted on the head of the user, and wherein the sensor is positioned in the housing to monitor the gaze position of the eyes when the housing is mounted on the head of the user.
3. The display system of claim 1, wherein the second area is concentric with the first area, and wherein a resolution of the second resolution image data is gradually reduced from the first area to the third area.
4. The display system of claim 3, wherein the resolution of the second resolution image data is reduced from the first area to the third area at a linear rate or a nonlinear rate.
5. The display system of claim 1, wherein the first resolution image data has a first frame rate, the second resolution image data has a second frame rate, and the third resolution image data has a third frame rate, and wherein the first frame rate is greater than the second frame rate, and the second frame rate is greater than the third frame rate.
6. The display system of claim 5, wherein the second frame rate is gradually reduced from the first area to the third area.
7. The display system of claim 1, wherein the first resolution image data has a first refresh rate, the second resolution image data has a second refresh rate, and the third resolution image data has a third refresh rate, and wherein the first refresh rate is greater than the second refresh rate, and the second refresh rate is greater than the third refresh rate.
8. The display system of claim 7, wherein the second refresh rate is gradually reduced from the first area to the third area.
9. A head-mounted device, comprising:
a housing shaped to be mounted on a head of a user;
a display positioned to display an image to the user when the housing is mounted on the head of the user;
a sensor positioned in the housing to monitor a gaze position of eyes of the user when the housing is mounted on the head of the user, and to output gaze position information; and
a controller coupled to the display and the sensor, wherein the controller includes logic that, when executed by the controller, causes the head-mounted device to perform operations including:
receiving, with the controller, the gaze position information from the sensor;
determining, with the controller, the gaze position of the eyes;
outputting first resolution image data to the display for a first area of the image, wherein the first area includes the gaze position of the eyes on the display;
outputting second resolution image data to the display for a second area of the image, wherein the first resolution image data has a higher resolution than the second resolution image data; and
outputting third resolution image data to the display for a third area of the image, wherein the second area is disposed between the first area and the third area, and wherein the second resolution image data has a higher resolution than the third resolution image data.
10. The head-mounted device of claim 9, further comprising:
lens optics positioned in the housing between the display and the eyes to focus light from the image on the display into the eyes; and
a non-visible light illuminator positioned in the housing to illuminate the eyes with non-visible light, wherein the sensor is structured to absorb the non-visible light to monitor the gaze position of the eyes.
11. The head-mounted device of claim 9, wherein the first resolution image data has a first frame rate, the second resolution image data has a second frame rate, and the third resolution image data has a third frame rate, and wherein the first frame rate is greater than the second frame rate, and the second frame rate is greater than the third frame rate.
12. The head-mounted device of claim 9, wherein the first resolution image data has a first refresh rate, the second resolution image data has a second refresh rate, and the third resolution image data has a third refresh rate, and wherein the first refresh rate is greater than the second refresh rate, and the second refresh rate is greater than the third refresh rate.
13. A method, comprising:
receiving, with a controller, gaze position information from a sensor to capture a gaze position of eyes of a user;
determining, with the controller, the gaze position of the eyes;
outputting an image from the controller to a display, including outputting first resolution image data for a first area of the image, wherein the first area includes the gaze position of the eyes on the display;
outputting second resolution image data to the display for a second area of the image, wherein the first resolution image data has a higher resolution than the second resolution image data; and
outputting third resolution image data to the display for a third area of the image, wherein the second area is disposed between the first area and the third area, and wherein the second resolution image data has a higher resolution than the third resolution image data.
14. The method of claim 13, wherein the second area is concentric with the first area, and wherein a resolution of the second resolution image data is gradually reduced from the first area to the third area.
15. The method of claim 14, wherein the second resolution image data is reduced from the first area to the third area at one of a linear rate, a decreasing rate, or an increasing rate.
16. The method of claim 13, wherein the first resolution image data has a first frame rate, the second resolution image data has a second frame rate, and the third resolution image data has a third frame rate, and wherein the first frame rate is greater than the second frame rate, and the second frame rate is greater than the third frame rate.
17. The method of claim 16, wherein the second frame rate is gradually reduced from the first area to the third area.
18. The method of claim 16, wherein the first frame rate is an integer multiple of the second frame rate, and the second frame rate is an integer multiple of the third frame rate.
19. The method of claim 13, wherein the first resolution image data has a first refresh rate, the second resolution image data has a second refresh rate, and the third resolution image data has a third refresh rate, and wherein the first refresh rate is greater than the second refresh rate, and the second refresh rate is greater than the third refresh rate.
20. The method of claim 19, wherein the second refresh rate is gradually reduced from the first area to the third area.
21. The method of claim 19, wherein the first refresh rate is an integer multiple of the second refresh rate, and the second refresh rate is an integer multiple of the third refresh rate.
22. The method of claim 13, wherein capturing the gaze position of the eyes includes capturing a position on the display at which the eyes are gazing, and wherein the display is mounted in a head-mounted device.
CN201910239748.XA 2018-03-29 2019-03-27 Display device and operation method Active CN110322818B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15/940,784 US20190302881A1 (en) 2018-03-29 2018-03-29 Display device and methods of operation
US15/940,784 2018-03-29

Publications (2)

Publication Number Publication Date
CN110322818A true CN110322818A (en) 2019-10-11
CN110322818B CN110322818B (en) 2023-03-28

Family

ID=68056093

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910239748.XA Active CN110322818B (en) 2018-03-29 2019-03-27 Display device and operation method

Country Status (3)

Country Link
US (1) US20190302881A1 (en)
CN (1) CN110322818B (en)
TW (1) TWI711855B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111553972A (en) * 2020-04-27 2020-08-18 北京百度网讯科技有限公司 Method, apparatus, device and storage medium for rendering augmented reality data
CN112887646A (en) * 2021-01-22 2021-06-01 京东方科技集团股份有限公司 Image processing method and device, augmented reality system, computer device and medium
CN114660802A (en) * 2020-12-23 2022-06-24 托比股份公司 Head mounted display and optimization method
WO2023125217A1 (en) * 2021-12-28 2023-07-06 维沃移动通信有限公司 Image processing circuit and method, and electronic device

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200125169A1 (en) * 2018-10-18 2020-04-23 Eyetech Digital Systems, Inc. Systems and Methods for Correcting Lens Distortion in Head Mounted Displays
US20200166752A1 (en) * 2018-11-26 2020-05-28 Varjo Technologies Oy Display for use in display apparatus
US10971161B1 (en) 2018-12-12 2021-04-06 Amazon Technologies, Inc. Techniques for loss mitigation of audio streams
US11336954B1 (en) * 2018-12-12 2022-05-17 Amazon Technologies, Inc. Method to determine the FPS on a client without instrumenting rendering layer
US11368400B2 (en) 2018-12-13 2022-06-21 Amazon Technologies, Inc. Continuously calibrated network system
US11252097B2 (en) 2018-12-13 2022-02-15 Amazon Technologies, Inc. Continuous calibration of network metrics
US11356326B2 (en) 2018-12-13 2022-06-07 Amazon Technologies, Inc. Continuously calibrated network system
US11016792B1 (en) 2019-03-07 2021-05-25 Amazon Technologies, Inc. Remote seamless windows
US11461168B1 (en) 2019-03-29 2022-10-04 Amazon Technologies, Inc. Data loss protection with continuity
US11245772B1 (en) 2019-03-29 2022-02-08 Amazon Technologies, Inc. Dynamic representation of remote computing environment
BR112022001434A2 (en) * 2019-07-28 2022-06-07 Google Llc Methods, systems and media for rendering immersive video content with optimized meshes
US10788893B1 (en) 2019-08-06 2020-09-29 Eyetech Digital Systems, Inc. Computer tablet augmented with internally integrated eye-tracking camera assembly
TWI704378B (en) * 2019-11-21 2020-09-11 宏碁股份有限公司 Head-mounted display device
CN113534949A (en) * 2020-04-22 2021-10-22 宏达国际电子股份有限公司 Head-mounted display device and control method thereof
US20230115678A1 (en) * 2021-09-24 2023-04-13 Arm Limited Apparatus and Method of Focusing Light

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9118513D0 (en) * 1991-08-29 1991-10-16 British Aerospace An eye-slaved panoramic display apparatus
US20130070109A1 (en) * 2011-09-21 2013-03-21 Robert Gove Imaging system with foveated imaging capabilites
US20160021351A1 (en) * 2013-03-14 2016-01-21 Nittoh Kogaku K.K. Optical system and device having optical system
US20160133055A1 (en) * 2014-11-07 2016-05-12 Eye Labs, LLC High resolution perception of content in a wide field of view of a head-mounted display
US20160189747A1 (en) * 2014-12-25 2016-06-30 Canon Kabushiki Kaisha Imaging apparatus and control method thereof
US20170011492A1 (en) * 2013-03-04 2017-01-12 Tobii Ab Gaze and saccade based graphical manipulation
WO2017036429A2 (en) * 2016-12-01 2017-03-09 Viewtrix Technology Co., Ltd. Zone-based display data processing and transmission
US20170178408A1 (en) * 2015-12-22 2017-06-22 Google Inc. Adjusting video rendering rate of virtual reality content and processing of a stereoscopic image
CN107155103A (en) * 2016-03-04 2017-09-12 罗克韦尔柯林斯公司 For transmitting image to the system and method for head mounted display systems
US20170363873A1 (en) * 2016-06-15 2017-12-21 Vrvaorigin Vision Technology Corp. Ltd. Head-mounted personal multimedia systems and visual assistance devices thereof
CN108463765A (en) * 2016-04-08 2018-08-28 谷歌有限责任公司 Based on pose information at head-mounted display apparatus coded image data

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10296086B2 (en) * 2015-03-20 2019-05-21 Sony Interactive Entertainment Inc. Dynamic gloves to convey sense of touch and movement for virtual objects in HMD rendered environments
EP3724858A4 (en) * 2017-12-14 2021-01-13 Samsung Electronics Co., Ltd. Method and apparatus for managing immersive data


Also Published As

Publication number Publication date
TW201942646A (en) 2019-11-01
US20190302881A1 (en) 2019-10-03
CN110322818B (en) 2023-03-28
TWI711855B (en) 2020-12-01

Similar Documents

Publication Publication Date Title
CN110322818A (en) Display device and operating method
US11393435B2 (en) Eye mounted displays and eye tracking systems
CN109863533B (en) Virtual, augmented and mixed reality systems and methods
RU2693329C2 (en) Method and device for displaying with optimization of pixel redistribution
CN105359540B (en) Head-mounted display apparatus and method for controlling such equipment
US10908421B2 (en) Systems and methods for personal viewing devices
EP3029550A1 (en) Virtual reality system
CN102918853B (en) For the method and apparatus of the adaptive stabilizing image sequential in stereoscopic three-dimensional system
US20140204003A1 (en) Systems Using Eye Mounted Displays
CN205862013U (en) A kind of light-duty VR eyesight-protecting glasses system
WO2020140758A1 (en) Image display method, image processing method, and related devices
WO2009094643A2 (en) Systems using eye mounted displays
CN102540467A (en) Head-mounted display
WO2019143793A1 (en) Position tracking system for head-mounted displays that includes sensor integrated circuits
EP2583131B1 (en) Personal viewing devices
WO2021031822A1 (en) Image display method, apparatus and system for head-mounted display device
US11126001B2 (en) Image generating apparatus, head-mounted display, content processing system and image displaying method
KR20170029144A (en) Virtual reality system using smart phone
Gilson et al. High fidelity immersive virtual reality
CN108989784A (en) Image display method, device, equipment and the storage medium of virtual reality device
WO2022166712A1 (en) Image display method, apparatus, readable medium, and electronic device
GB2548151A (en) Head-mountable display
US20190079284A1 (en) Variable DPI Across A Display And Control Thereof
WO2022240707A1 (en) Adaptive backlight activation for low-persistence liquid crystal displays
WO2013078740A1 (en) Liquid crystal stereo display system and drive method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant