EP3395074A1 - A method and apparatus for facilitating video rendering in a device - Google Patents

A method and apparatus for facilitating video rendering in a device

Info

Publication number
EP3395074A1
Authority
EP
European Patent Office
Prior art keywords
motion
user
video
screen
video frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP16826156.8A
Other languages
German (de)
English (en)
French (fr)
Inventor
Yu Chen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alcatel Lucent SAS
Original Assignee
Alcatel Lucent SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alcatel Lucent SAS filed Critical Alcatel Lucent SAS
Publication of EP3395074A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • H04N21/44218Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44012Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1626Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory with means for controlling the display position
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • H04N19/55Motion estimation with spatial constraints, e.g. at image or region borders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/214Specialised server platform, e.g. server located in an airplane, hotel, hospital
    • H04N21/2146Specialised server platform, e.g. server located in an airplane, hotel, hospital located in mass transportation means, e.g. aircraft, train or bus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/4223Cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/10Special adaptations of display systems for operation with variable images
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0464Positioning
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/02Networking aspects
    • G09G2370/022Centralised management of display operation, e.g. in a server instead of locally

Definitions

  • The present invention generally relates to video rendering and, more specifically, to a method and apparatus for facilitating video rendering in a device.
  • A very common video-watching scenario is users watching videos on their mobile phones or tablets on the metro or bus; some people even watch videos only on public transportation vehicles. This is usual in a large city like Shanghai, where the commute to or from work often takes more than one hour. However, many people complain that it is inconvenient to watch video on the metro or bus because of the continuous vibration, and doctors say that watching on a bus for a long time is harmful to the eyes.
  • The problem to be solved by the present invention is to improve the user experience in a vibration environment.
  • The present description involves a method and apparatus for facilitating video rendering in a device.
  • a method for facilitating video rendering in a device, comprising:
  • obtaining the eye gaze position of the user on the screen of the device;
  • determining whether the eye gaze is deep inside the screen or at the edge of the video frame based on the eye gaze position;
  • obtaining the motion of the device on condition of determining that the eye gaze is deep inside the screen; and
  • compensating the motion of the device by video frame shifting on the screen of the device, based on the motion of the device, so that the video frame retains the same or nearly the same position relative to the user as before the motion.
  • an apparatus for facilitating video rendering in a device, comprising:
  • a first obtaining means configured to obtain the eye gaze position of the user on the screen of the device;
  • a processing means configured to determine whether the eye gaze is deep inside the screen or at the edge of the video frame based on the eye gaze position;
  • a second obtaining means configured to obtain the motion of the device on condition of the processing means determining that the eye gaze is deep inside the screen; and
  • a compensating means configured to compensate the motion of the device by video frame shifting on the screen of the device, based on the motion of the device, so that the video frame retains the same or nearly the same position relative to the user as before the motion.
  • The provided method and apparatus may compensate the motion of the device by video frame shifting on the screen of the device, based on the motion of the device and using the eye gaze position of the user. In this way, the video frame virtually retains the same position relative to the user as before the motion; the impact caused by vibration is thus relieved, and the user experience in a vibration environment is improved.
  • Fig. 1 is a flowchart illustrating a method for facilitating video rendering in a device in accordance with embodiments of the present invention;
  • Fig. 2 shows an example of controlling the picture re-positioning to relieve the effect of device vibration by using motion sensors;
  • Fig. 3 shows an example of controlling the picture re-positioning to relieve the effect of both device and body vibration by using motion sensors;
  • Fig. 4 shows the time relationship of the human visual acuity of a point on the retina;
  • Fig. 5 illustrates a block diagram of an apparatus for facilitating video rendering in a device in accordance with embodiments of the present invention.
  • The basic idea of the present invention is to use the sensors in a phone or tablet to reduce the influence of vibration and to optimize video transmission resources.
  • The solution includes three parts: reducing the influence of device vibration, reducing the influence of human body vibration, and optimizing transmission resources.
  • Transmission resources can be reduced while the user is watching real-time high definition (HD) video: because of the vibration, the user does not need very high-resolution video (for example, the user cannot find any difference from standard-definition video). So the environment information can be sent to the video server to facilitate the video encoding and save transmission resources.
  • The present invention provides a method for facilitating video rendering in a device. As shown in Fig. 1, the method may comprise: at step S101, obtaining the eye gaze position of the user on the screen of the device; at step S102, determining whether the eye gaze is deep inside the screen or at the edge of the video frame based on the eye gaze position; at step S103, obtaining the motion of the device on condition of determining that the eye gaze is deep inside the screen; and at step S104, compensating the motion of the device by video frame shifting on the screen of the device, based on the motion of the device, so that the video frame virtually retains the same or nearly the same position relative to the user as before the motion.
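  • A minimal sketch of this per-frame flow, assuming illustrative names, a hypothetical "deep inside" margin, and sensor/tracker readings supplied as closures; it is not an implementation from the patent:

```swift
import CoreGraphics

/// Per-refresh decision following steps S101–S104. All names, the closures
/// and the edge margin are illustrative stand-ins, not the patent's API.
struct RenderLoop {
    var readGaze: () -> CGPoint                 // S101: eye tracker sample
    var readDeviceDisplacement: () -> CGVector  // S103: displacement integrated from the sensors
    var screenBounds: CGRect
    var edgeMargin: CGFloat = 40                // assumed "deep inside" threshold, in points

    /// Returns the shift to apply to the video frame on this refresh,
    /// or .zero when the gaze is at the edge of the video frame.
    func frameShift() -> CGVector {
        let gaze = readGaze()                                        // S101
        let inner = screenBounds.insetBy(dx: edgeMargin, dy: edgeMargin)
        guard inner.contains(gaze) else { return .zero }             // S102
        let motion = readDeviceDisplacement()                        // S103
        return CGVector(dx: -motion.dx, dy: -motion.dy)              // S104: reverse shift
    }
}
```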
  • The eye gaze position information can be used to facilitate the positioning.
  • The eye gaze position is obtained by an eye tracker, which is a type of sensor. If the eye gaze is deep inside the screen, a small part of the video frame going out of the screen is not a problem, because the eye will not notice it. If the eye is gazing just at the edge of the video frame, it is better not to shift the picture. So with the eye gaze information, there is no need to shrink the video frame to leave a protection border.
  • The motion of the device can be detected by one or more sensors, such as an accelerometer and a gyroscope.
  • The gyroscope ADXRS453 can accurately track vibration at a frequency of 50 Hz and still performs very well at 100 Hz (see "Analyzing Frequency Response of Inertial MEMS in Stabilization Systems", http://www.analog.com/library/an…). 100 Hz corresponds to an engine rotation speed of 6000 rpm (6000 / 60 = 100 Hz), so it can track the engine vibration accurately.
  • Most advanced mobile phones are now equipped with such sensors.
  • On iPhone, there is an official method to get the motion information through UIKit (UIDevice), or even the low-level Core Motion API; a sketch follows below. The proposed idea is to compensate the device vibration by video frame shifting on the screen, to let the image virtually retain the same position.
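  • As an illustration of the Core Motion route mentioned above, a minimal sketch that samples device motion at an assumed 100 Hz; the handler body is a placeholder:

```swift
import CoreMotion

// Reading the device motion on iOS through Core Motion. The 100 Hz update
// interval and the handler body are illustrative choices, not from the patent.
let motionManager = CMMotionManager()

func startMotionTracking() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 100.0  // 100 Hz sampling
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        // userAcceleration excludes gravity; rotationRate is the gyroscope output.
        let acceleration = motion.userAcceleration
        let rotation = motion.rotationRate
        // Integrate these into a screen-plane displacement estimate and feed
        // it to the frame-shifting compensation sketched further below.
        _ = (acceleration, rotation)
    }
}
```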
  • Fig. 2 shows an example of controlling the picture re-positioning to relieve the effect of device vibration by using motion sensors. The motion sensors output the instantaneous motion of the tablet; with this information, the video frame is shifted in the reverse direction, so the position of the video frame relative to the user is not changed.
  • The example ADXRS453 gyroscope is able to track 50 Hz vibration, and the screen refresh frequency is usually more than 50 Hz, so the video frame can be displayed at a new position at 50 Hz in this example. Note that a higher-frequency motion sensor can be used to track finer motion.
  • Step S104 may comprise: shifting the video frame on the screen of the device in the direction opposite to the obtained motion, by a distance equal to the motion of the device.
  • The video frame is shifted by a distance smaller than or equal to a predetermined distance, i.e., a configured maximum shifting value.
  • When the device and the viewer experience heavy rocking, the video frame shifting can use the maximum shifting value, or shifting can be skipped altogether (see the sketch below).
  • The video frame is shown in a size smaller than normal (conventional).
  • Otherwise, the video frame may move out of the tablet screen. This can be solved by a small protective black border around the video frame, with the video frame smaller than in normal video rendering. This is not an issue for a large-screen device, because the screen is large enough.
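  • A minimal sketch of step S104 with the configured maximum shifting value, assuming the sensor motion has already been integrated into a screen-plane displacement in points (names are illustrative):

```swift
import CoreGraphics

/// Shift opposite to the device displacement (step S104), clamped so heavy
/// rocking cannot push the frame too far. Names and the screen-point unit
/// convention are assumptions, not from the patent.
func compensationOffset(deviceDisplacement: CGVector, maxShift: CGFloat) -> CGVector {
    // Reverse direction, same distance as the device motion...
    var dx = -deviceDisplacement.dx
    var dy = -deviceDisplacement.dy
    // ...but never more than the predetermined maximum shifting value.
    dx = max(-maxShift, min(maxShift, dx))
    dy = max(-maxShift, min(maxShift, dy))
    return CGVector(dx: dx, dy: dy)
}
```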
  • The method for facilitating video rendering in a device may further comprise: obtaining the motion of the user's head; and calculating the relative movement between the motion of the user's head and the motion of the device. Step S104 may then comprise: compensating the motion of the device by video frame shifting on the screen of the device, based on the relative movement between the motion of the user's head and the motion of the device.
  • The method may further comprise: determining whether the change of the eye gaze position is due to the motion of the user's head or to eye saccadic movement; the step of detecting the motion of the user's head is performed on condition that the change of the eye gaze position is determined to be due to the motion of the user's head.
  • Fig. 3 shows an example of controlling the picture re-positioning to relieve the effect of device and body vibration by using motion sensors.
  • The user's body may also experience vibration, which may increase the relative movement between the eye and the device.
  • To detect the motion of the user's head, the camera of the tablet can be used.
  • The eye performs saccadic movement and fixation. During saccadic movement the eyeball turns, which does not require movement of the head; however, head vibration will also cause the eye gaze position to shake. So by detecting the head movement with the tablet camera, in combination with the eye tracker, it is easy to know whether an eye gaze position change is due to the head movement or to eye saccadic movement.
  • Once the head movement is known, its movement relative to the tablet can be calculated, and this information is used to shift the video frame on the screen in the reverse direction, as sketched below.
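  • A minimal sketch of this combination, assuming both the head and device motions are available as screen-plane displacements (an assumed convention; names are illustrative):

```swift
import CoreGraphics

/// Head-plus-device case: what must be compensated is the movement of the
/// device relative to the user's head. `headDisplacement` would come from
/// tracking the head with the tablet camera; both vectors are screen-plane
/// displacements in points.
func relativeCompensation(headDisplacement: CGVector,
                          deviceDisplacement: CGVector) -> CGVector {
    let relDx = deviceDisplacement.dx - headDisplacement.dx
    let relDy = deviceDisplacement.dy - headDisplacement.dy
    // Shift the video frame in the direction opposite to the relative movement.
    return CGVector(dx: -relDx, dy: -relDy)
}
```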
  • The method for facilitating video rendering in a device may further comprise: calculating the visual acuity of the user based on the eye gaze position; and transmitting the visual acuity to the video server that provides the video, so that the server can encode the video based on the visual acuity to reduce the quality of the video.
  • With this compensation, the influence of vibration can be reduced effectively. But a certain level of vibration still cannot be eliminated, because of the vibration tracking delay and error.
  • The influence of the vibration can be evaluated from the frequent gaze shaking within a small area, which differs from saccadic movement, which is ballistic and moves directly from one point to the new position.
  • The eye gaze position can be filtered to extract the high-frequency movement, which reflects the vibration (a sketch of such a filter follows below).
  • The eye acuity under such a condition can then be obtained.
  • The visual acuity is considered to be normalized as 1.
  • Considering the time a point stays within the preset-degree fovea (for example, the 2-degree fovea; cf. Fig. 4), the visual acuity can be obtained, which is smaller than 1 if normalized.
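  • One way the high-frequency gaze movement mentioned above could be extracted is a simple high-pass filter; the filter form and the smoothing constant are assumptions, not the patent's method:

```swift
import CoreGraphics

/// Splits the gaze signal into a slow component (deliberate gaze shifts) and
/// a high-frequency residual that reflects the vibration, using a one-pole
/// exponential moving average.
struct GazeHighPass {
    var lowPass: CGPoint? = nil
    let alpha: CGFloat = 0.1  // smoothing constant in (0, 1); illustrative value

    /// Feed one gaze sample; returns the high-frequency jitter component.
    mutating func jitter(for sample: CGPoint) -> CGVector {
        guard let previous = lowPass else {
            lowPass = sample
            return .zero
        }
        // The exponential moving average tracks the slow gaze movement.
        let smoothed = CGPoint(x: previous.x + alpha * (sample.x - previous.x),
                               y: previous.y + alpha * (sample.y - previous.y))
        lowPass = smoothed
        // High-frequency residual = sample minus its low-pass estimate.
        return CGVector(dx: sample.x - smoothed.x, dy: sample.y - smoothed.y)
    }
}
```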
  • This information is passed to the video server, and the server can encode the video based on this visual acuity information, to reduce the quality without visible influence on the viewer.
  • In this way, the method for facilitating video rendering in a device according to the present invention can save network resources for video transmission.
  • The dynamic range of the visual acuity can be obtained, denoted by [a, b], where a is the lowest acuity and b is the highest acuity.
  • The video to be transmitted is encoded with a given maximum bit rate m and an average bit rate n.
  • b maps to the maximum bit rate m
  • (a+b)/2 maps to the average bit rate n.
  • Since the maximum bit rate m and the average bit rate n refer to a nominal acuity (e.g., obtained by averaging the measurements of many people), the mapping can be done as follows.
  • The nominal acuity range is denoted by [a0, b0].
  • The video bit rate of a specific user with acuity x can then be calculated; a sketch of one plausible mapping follows.
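  • A minimal sketch of one plausible calculation, assuming the bit rate is linear in acuity and fixed by the two stated anchor points ((a0+b0)/2 maps to n, and b0 maps to m); this is an illustrative reconstruction, not the patent's exact steps:

```swift
/// Map a user's measured visual acuity x to a video bit rate, interpolating
/// linearly through the two anchors given in the text for the nominal acuity
/// range [a0, b0]: acuity (a0 + b0) / 2 -> average bit rate n, acuity b0 ->
/// maximum bit rate m. Names and the linearity assumption are illustrative.
func videoBitRate(forAcuity x: Double,
                  nominalRange: (a0: Double, b0: Double),
                  averageBitRate n: Double,
                  maxBitRate m: Double) -> Double {
    let mid = (nominalRange.a0 + nominalRange.b0) / 2
    let slope = (m - n) / (nominalRange.b0 - mid)  // bit rate gained per unit acuity
    // Interpolate linearly and never exceed the maximum bit rate.
    return min(m, n + slope * (x - mid))
}

// Example with assumed numbers: nominal acuity in [0.4, 1.0], average
// 2 Mbit/s, maximum 4 Mbit/s. A vibrating viewer measured at x = 0.7
// (the nominal midpoint) gets the average rate, 2 Mbit/s.
let rate = videoBitRate(forAcuity: 0.7, nominalRange: (a0: 0.4, b0: 1.0),
                        averageBitRate: 2_000_000, maxBitRate: 4_000_000)
```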
  • The present invention further provides an apparatus for facilitating video rendering in a device. As shown in Fig. 5, the apparatus may comprise: a first obtaining means 510, configured to obtain the eye gaze position of the user on the screen of the device; a processing means 520, configured to determine whether the eye gaze is deep inside the screen or at the edge of the video frame based on the eye gaze position; a second obtaining means 530, configured to obtain the motion of the device on condition of the processing means 520 determining that the eye gaze is deep inside the screen; and a compensating means 540, configured to compensate the motion of the device by video frame shifting on the screen of the device, based on the motion of the device, so that the video frame virtually retains the same or nearly the same position relative to the user as before the motion.
  • The compensating means 540 is further configured to shift the video frame on the screen of the device in the direction opposite to the obtained motion, by a distance equal to the motion of the device.
  • The video frame is shifted by a distance smaller than or equal to a predetermined distance.
  • The video frame is shown in a size smaller than normal.
  • The first obtaining means 510 is further configured to obtain the motion of the user's head; the compensating means 540 is further configured to calculate the relative movement between the motion of the user's head and the motion of the device; and the compensating means 540 is further configured to compensate the motion of the device by video frame shifting on the screen of the device, based on the relative movement between the motion of the user's head and the motion of the device.
  • The compensating means 540 is further configured to determine whether the change of the eye gaze position is due to the motion of the user's head or to eye saccadic movement; and
  • the first obtaining means 510 is configured to obtain the motion of the user's head on condition of the compensating means 540 determining that the change of the eye gaze position is due to the motion of the user's head.
  • The apparatus may further comprise:
  • a calculating means 550, configured to calculate the visual acuity of the user based on the eye gaze position; and
  • a transmitting means 560, configured to transmit the visual acuity to the video server that provides the video, so that the server can encode the video based on the visual acuity to reduce the quality of the video.
  • The first obtaining means 510 comprises an eye tracker, and
  • the second obtaining means 530 comprises an accelerometer and gyroscope sensor.
  • At least one of the first obtaining means 510, the processing means 520, the second obtaining means 530, the compensating means 540, the calculating means 550, and the transmitting means 560 is assumed to comprise program instructions that, when executed, enable the apparatus to operate in accordance with the exemplary embodiments, as discussed above.
  • Any of the first obtaining means 510, the processing means 520, the second obtaining means 530, the compensating means 540, the calculating means 550, and the transmitting means 560 as discussed above may be integrated together or implemented by separate components, and may be of any type suitable to the local technical environment; they may comprise one or more of general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs) and processors based on multi-core processor architectures, as non-limiting examples.
  • The ROM mentioned above may be of any type suitable to the local technical environment and may be implemented using any suitable data storage technology, such as semiconductor-based memory devices, flash memory, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory.
  • The various exemplary embodiments may be implemented in hardware or special purpose circuits, software, logic or any combination thereof.
  • Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device, although the invention is not limited thereto.
  • While various aspects of the exemplary embodiments of this invention may be illustrated and described as block diagrams, flowcharts, or using some other pictorial representation, it is well understood that these blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
  • Program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types when executed by a processor in a computer or other device.
  • The computer executable instructions may be stored on a computer readable medium such as a hard disk, optical disk, removable storage media, solid state memory, random access memory (RAM), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Social Psychology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Ophthalmology & Optometry (AREA)
  • Databases & Information Systems (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
EP16826156.8A 2015-12-24 2016-11-24 A method and apparatus for facilitating video rendering in a device Withdrawn EP3395074A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510990448.7A 2015-12-24 2015-12-24 A method and apparatus for facilitating video rendering in a device (in Chinese)
PCT/IB2016/001826 WO2017109567A1 (en) 2015-12-24 2016-11-24 A method and apparatus for facilitating video rendering in a device

Publications (1)

Publication Number Publication Date
EP3395074A1 (en) 2018-10-31

Family

ID=57796755

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16826156.8A Withdrawn EP3395074A1 (en) 2015-12-24 2016-11-24 A method and apparatus for facilitating video rendering in a device

Country Status (3)

Country Link
EP (1) EP3395074A1 (zh)
CN (1) CN106921890A (zh)
WO (1) WO2017109567A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109917923B (zh) * 2019-03-22 2022-04-12 Beijing 7Invensun Information Technology Co., Ltd. Method for adjusting a gaze area based on free movement, and terminal device
JP2024040035A (ja) * 2022-09-12 2024-03-25 FUJIFILM Corporation Processor, image processing device, glasses-type information display device, image processing method, and image processing program

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7903166B2 (en) * 2007-02-21 2011-03-08 Sharp Laboratories Of America, Inc. Methods and systems for display viewer motion compensation based on user image data
JP2012114544A (ja) * 2010-11-22 2012-06-14 Jvc Kenwood Corp Video encoding device
EP2505223A1 (en) * 2011-03-31 2012-10-03 Alcatel Lucent Method and device for displaying images
JP4966421B1 (ja) * 2011-03-31 2012-07-04 Toshiba Corp Information processing device and information processing method
TWI427618B (zh) * 2011-06-16 2014-02-21 Hon Hai Prec Ind Co Ltd Electronic device and screen information adjustment method thereof
US20130234929A1 (en) * 2012-03-07 2013-09-12 Evernote Corporation Adapting mobile user interface to unfavorable usage conditions
US9417666B2 (en) * 2012-10-19 2016-08-16 Microsoft Technology Licensing, LLC User and device movement based display compensation
US9727991B2 (en) * 2013-03-01 2017-08-08 Microsoft Technology Licensing, Llc Foveated image rendering
GB2523345B (en) * 2014-02-20 2018-01-17 Samsung Electronics Co Ltd Detecting user viewing difficulty from facial parameters
DE102014103621A1 (de) * 2014-03-17 2015-09-17 Christian Nasca Image stabilization method

Also Published As

Publication number Publication date
WO2017109567A1 (en) 2017-06-29
CN106921890A (zh) 2017-07-04

Similar Documents

Publication Publication Date Title
KR102088472B1 (ko) Adjustment of the video rendering speed of virtual reality content and processing of stereoscopic images
US11789686B2 (en) Facilitation of concurrent consumption of media content by multiple users using superimposed animation
US11816820B2 (en) Gaze direction-based adaptive pre-filtering of video data
US9626741B2 (en) Systems and methods for configuring the display magnification of an electronic device based on distance and user presbyopia
US10200725B2 (en) Adaptive data streaming based on virtual screen size
US20190026864A1 (en) Super-resolution based foveated rendering
US20140118240A1 (en) Systems and Methods for Configuring the Display Resolution of an Electronic Device Based on Distance
JP2014526157A (ja) Classification of the total field of view of a head-mounted display
JP2017502338A (ja) Dynamic resolution control of GPU and video using a retinal perception model
US10764499B2 (en) Motion blur detection
JP6750697B2 (ja) Information processing device, information processing method, and program
EP3395074A1 (en) A method and apparatus for facilitating video rendering in a device
US20230410699A1 (en) Structured display shutdown for video pass-through electronic devices
US11979657B1 (en) Power efficient object tracking
JP2014174312A (ja) Data reproduction device, integrated circuit, mobile device, and data reproduction method
KR20180101950A (ko) Method and apparatus for processing an image

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20180724

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20200309

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20200721