WO2020259402A1 - Image processing method and device, terminal device, medium, and wearable system - Google Patents

Image processing method and device, terminal device, medium, and wearable system

Info

Publication number
WO2020259402A1
Authority
WO
WIPO (PCT)
Prior art keywords
resolution
range
angular velocity
image
linear velocity
Prior art date
Application number
PCT/CN2020/097056
Other languages
English (en)
Chinese (zh)
Inventor
李文宇
张浩
陈丽莉
苗京花
孙玉坤
王雪丰
鄢名扬
李治富
Original Assignee
京东方科技集团股份有限公司
北京京东方光电科技有限公司
Priority date
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司, 北京京东方光电科技有限公司
Publication of WO2020259402A1

Links

Images

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/122 Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/816 Monomedia components thereof involving special video data, e.g. 3D video
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images

Definitions

  • the present disclosure relates to the field of display technology, and in particular to an image processing method, device, terminal device, medium, and wearable system.
  • VR (Virtual Reality).
  • AR (Augmented Reality).
  • Take a VR device as an example.
  • A VR device needs to be paired with a computer, and the image rendering of VR applications (such as VR games) is performed by the computer's graphics card.
  • the embodiments of the present disclosure provide an image processing method, device, terminal device, medium, and wearable system.
  • an embodiment of the present disclosure provides an image processing method, and the image processing method includes:
  • the rendered image is output to the wearable device.
  • the determining the resolution of the image to be rendered according to the motion speed includes:
  • a first motion speed range in which the motion speed is located is determined, and the resolution corresponding to the first motion speed range is determined.
  • the movement speed includes at least one of an angular velocity and a linear velocity.
  • the determining the resolution corresponding to the first movement speed range based on the corresponding relationship between the movement speed range and the resolution includes:
  • the first angular velocity range is one of at least two preset angular velocity ranges
  • the first linear velocity range is one of at least two preset linear velocity ranges.
  • there are two preset angular velocity ranges and two preset linear velocity ranges;
  • the smaller of the preset angular velocity ranges and the smaller of the preset linear velocity ranges correspond to the same resolution; and
  • the larger of the preset angular velocity ranges and the larger of the preset linear velocity ranges correspond to the same resolution.
  • the method further includes:
  • the corresponding relationship between the motion speed range and the resolution is obtained.
  • an image processing device which includes:
  • the obtaining module is configured to obtain the movement speed of the wearable device
  • the first determining module is configured to determine the resolution of the image to be rendered according to the motion speed, where the resolution is negatively related to the motion speed;
  • a rendering module configured to perform image rendering using the resolution
  • the output module is configured to output the rendered image to the wearable device.
  • the first determining module includes:
  • a first determining sub-module configured to determine a first motion speed range in which the motion speed is located, the first motion speed range being one of at least two preset motion speed ranges;
  • the second determining submodule is configured to determine the resolution corresponding to the first motion speed range based on the corresponding relationship between the motion speed range and the resolution.
  • the movement speed includes at least one of an angular velocity and a linear velocity.
  • the determining the resolution corresponding to the first movement speed range based on the corresponding relationship between the movement speed range and the resolution includes:
  • the first angular velocity range is one of at least two preset angular velocity ranges
  • the first linear velocity range is one of at least two preset linear velocity ranges
  • there are two preset angular velocity ranges and two preset linear velocity ranges;
  • the smaller of the preset angular velocity ranges and the smaller of the preset linear velocity ranges correspond to the same resolution; and
  • the larger of the preset angular velocity ranges and the larger of the preset linear velocity ranges correspond to the same resolution.
  • the device further includes:
  • the second determining module is configured to perform image rendering at a plurality of different resolutions; determine, for each of the plurality of different resolutions, the frame rate range of the picture when that resolution is used for image rendering; and, based on the frame rate ranges corresponding to the plurality of different resolutions and the frame rate ranges corresponding to the different motion speed ranges, obtain the correspondence between the motion speed ranges and the resolutions.
  • Embodiments of the present disclosure also provide a terminal device, which includes: a processor; and a memory configured to store executable instructions of the processor; wherein the processor is configured to execute the image processing method described in any of the preceding items.
  • the embodiments of the present disclosure also provide a computer-readable storage medium.
  • When the instructions in the computer-readable storage medium are executed by the processor of the terminal device, the terminal device is enabled to execute the image processing method described in any of the preceding items.
  • the embodiments of the present disclosure also provide a wearable system, including: the terminal device as described above; and the wearable device configured to display an image output by the terminal device.
  • Figure 1 is a schematic structural diagram of a wearable system
  • FIG. 2 is a flowchart of an image processing method provided by an embodiment of the present disclosure
  • FIG. 3 is a flowchart of an image processing method provided by an embodiment of the present disclosure.
  • FIG. 4 is a schematic structural diagram of an image processing device provided by an embodiment of the present disclosure.
  • Fig. 5 is a structural block diagram of a terminal device provided by an embodiment of the present disclosure.
  • FIG. 1 is a schematic structural diagram of a wearable system.
  • the wearable system includes a terminal device 10 and a VR device 20, and the terminal device 10 and the VR device 20 are connected.
  • the terminal device 10 is the host of the VR device 20, and its role is to run a VR application, render an image generated by the VR application, and then output to the VR device 20 for display.
  • the terminal device 10 may be a device such as a computer.
  • the terminal device 10 usually includes a processor, a memory, and a graphics card.
  • a VR application is stored in the memory, and the processor outputs an image by running the VR application.
  • the graphics card needs to be called to complete the image rendering work.
  • Image rendering refers to the process of converting a three-dimensional scene into a two-dimensional image: using the three-dimensional geometric model information, three-dimensional animation definition information, and material information provided by the VR application, the image is generated through steps such as geometric transformation, projection transformation, perspective transformation, and window clipping.
  • the role of the VR device 20 is to display the image output by the terminal device 10.
  • the VR device 20 includes an attitude sensor, such as a gyroscope.
  • The attitude sensor is used to detect the attitude information of the VR device and transmit it to the terminal device 10; the VR application in the terminal device 10 can adjust the picture displayed by the VR device according to the attitude information, bringing users an immersive viewing experience.
  • the VR device 20 may be a head-mounted VR device, that is, a VR head display.
  • When images are rendered at a predetermined resolution, the frame rate of the resulting picture varies with the complexity of the image content.
  • When the image content is complex, rendering takes a long time and the frame rate is low; when the content is simple, rendering is fast and the frame rate is high, so the frame rate fluctuates within a range. When rendering at different resolutions, the higher the resolution, the lower the frame rate range, and the lower the resolution, the higher the frame rate range.
  • the default resolution can be used, or the user can select a fixed resolution from the optional resolutions provided by the VR application for image rendering, for example, 2k or 4k resolution.
  • the VR device in the aforementioned wearable system may also be an AR device or other wearable devices.
  • Fig. 2 is a flowchart of an image processing method provided by an embodiment of the present disclosure. This method is executed by the terminal device in FIG. 1. Referring to FIG. 2, the image processing method includes:
  • Step 101 Obtain the movement speed of the wearable device.
  • Wearable devices are usually worn on the user's head, and the user can change the picture viewed through the wearable device by moving the head.
  • The movement speed of the wearable device is the speed at which the device is driven by the user's head movement while it is being worn.
  • the movement speed may include angular speed, linear speed and so on.
  • Step 102 Determine the resolution of the image to be rendered according to the motion speed, and the resolution is negatively related to the motion speed.
  • the resolution of the image to be rendered is also the resolution of the rendered image.
  • The negative correlation between resolution and motion speed means that the faster the wearable device moves, the lower the resolution used for rendering, and the slower it moves, the higher the resolution used for rendering.
  • For example, when the movement speed (angular velocity) of the wearable device is 180 degrees per second, the rendering resolution is 2K; when it is 90 degrees per second, the rendering resolution is 4K.
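The two-level mapping in this example can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the 120 deg/s boundary and the function name are assumptions, since the disclosure only requires that resolution be negatively related to motion speed:

```python
def resolution_for_angular_velocity(deg_per_sec: float) -> str:
    """Map a head angular velocity to a rendering resolution label.

    The 120 deg/s boundary is an assumed illustrative threshold.
    """
    ANGULAR_VELOCITY_THRESHOLD = 120.0  # deg/s (assumption, not from the patent)
    # Faster motion -> lower resolution (negative correlation).
    return "2K" if deg_per_sec > ANGULAR_VELOCITY_THRESHOLD else "4K"

print(resolution_for_angular_velocity(180.0))  # fast head turn
print(resolution_for_angular_velocity(90.0))   # slow head turn
```

Any monotonically non-increasing mapping from speed to resolution would satisfy the negative-correlation requirement; a two-way branch is simply the smallest such mapping.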
  • Step 103 Perform image rendering using the determined resolution.
  • the terminal device can call the graphics card to perform image rendering, and when using the graphics card to perform image rendering, the resolution determined in step 102 is used.
  • Step 104 Output the rendered image to the wearable device.
  • After the image is rendered, it is output to the wearable device for display for the user to watch.
  • When the wearable device is at different motion speeds, different rendering resolutions are used for image rendering, and the rendered image is then output to the wearable device for display.
  • The resolution is negatively related to the motion speed; that is, the faster the wearable device moves, the lower the rendering resolution, and the slower it moves, the higher the rendering resolution.
  • Since graphics card performance is fixed, the rendering resolution is negatively related to the frame rate of the final picture. Therefore, when the wearable device moves fast, the picture it displays has a low resolution but a high frame rate.
  • Fig. 3 is a flowchart of an image processing method provided by an embodiment of the present disclosure. This method is executed by the terminal device in FIG. 1. Referring to FIG. 3, the image processing method includes:
  • Step 201 Obtain posture data of the wearable device.
  • the posture sensor in the wearable device can detect the posture data of the wearable device, so this step is: the terminal device obtains the posture data generated by the posture sensor in the wearable device.
  • the posture data can be a quaternion or other forms of posture data.
  • the function of the posture data generated by the posture sensor is to adjust the output picture, thereby changing the perspective of the scene seen by the user.
  • The terminal device can determine the direction of the user's rotation or the movement of the user's position according to the posture data, and output a corresponding picture to the user based on that rotation or movement. Therefore, the picture update speed is related to the acquisition frequency of the posture data, and only high-frequency posture data collection can support a high update speed for the image output to the user.
  • The terminal device obtains the posture data at a frequency of more than 100 Hz, for example, 1000 Hz. Under such high-frequency posture data acquisition conditions, the terminal device can refresh the display at high speed; in practice, the screen refresh frequency is lower than the acquisition frequency of the posture data.
  • Step 202 Determine the movement speed of the wearable device based on the posture data.
  • After the terminal device obtains the posture data, it uses a posture algorithm to calculate the movement speed of the wearable device.
  • the movement speed may include at least one of an angular speed and a linear speed.
  • the movement speed may include angular velocity and linear velocity.
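As a hedged sketch of how a posture algorithm might derive angular speed from quaternion posture data sampled at a fixed rate (the function name and formula below are illustrative assumptions, not the algorithm disclosed in the patent):

```python
import math

def angular_speed_deg_per_sec(q_prev, q_curr, dt):
    """Estimate the magnitude of angular velocity (deg/s) from two unit
    orientation quaternions (w, x, y, z) sampled dt seconds apart.

    For unit quaternions, the relative rotation angle theta satisfies
    cos(theta / 2) = |<q_prev, q_curr>|.
    """
    dot = abs(sum(a * b for a, b in zip(q_prev, q_curr)))
    dot = min(1.0, dot)               # guard against rounding slightly above 1.0
    theta = 2.0 * math.acos(dot)      # relative rotation angle, in radians
    return math.degrees(theta) / dt

# At the 1000 Hz posture acquisition rate mentioned above, dt would be 0.001 s.
```

A linear velocity estimate would follow the same finite-difference pattern applied to successive position samples.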
  • Step 203 Determine the first movement speed range in which the movement speed is located.
  • The movement speed is divided into at least two ranges in advance, and the first movement speed range is one of the at least two preset movement speed ranges.
  • step 203 is to determine the first angular velocity range in which the angular velocity is located, and the first linear velocity range in which the linear velocity is located.
  • the first angular velocity range is one of at least two preset angular velocity ranges.
  • the linear velocity range is one of at least two preset linear velocity ranges.
  • The preset angular velocity ranges and linear velocity ranges may both number two. This division meets the requirements of fluency and clarity for the wearable device's display while reducing the resources required for processing.
  • Step 204 Determine the resolution corresponding to the first movement speed range based on the corresponding relationship between the movement speed range and the resolution.
  • the corresponding relationship between the motion speed range and the resolution is stored in the terminal device.
  • the corresponding relationship between the angular velocity range and the resolution, and the corresponding relationship between the linear velocity range and the resolution are stored in the terminal device.
  • When the terminal device uses two resolutions to control image rendering, it satisfies the user's requirements for the smoothness and clarity of the wearable device's display while keeping the processing simpler.
  • the resolutions corresponding to the two angular velocity ranges and the resolutions corresponding to the two linear velocity ranges may also be different.
  • Step 204 may include: determining, based on the correspondence between angular velocity ranges and resolutions, the first candidate resolution corresponding to the first angular velocity range in which the angular velocity falls; determining, based on the correspondence between linear velocity ranges and resolutions, the second candidate resolution corresponding to the first linear velocity range in which the linear velocity falls; and adopting the smaller of the first candidate resolution and the second candidate resolution as the resolution of the image to be rendered. Adopting the smaller resolution increases the frame rate of the display and ensures smooth viewing.
  • If the first candidate resolution and the second candidate resolution are equal, either one is selected as the resolution of the image to be rendered.
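Putting Step 203 and Step 204 together, a minimal sketch might look like this (the thresholds and the concrete (width, height) pairs standing in for "2K" and "4K" are assumptions for illustration):

```python
def resolution_to_render(angular_velocity, linear_velocity,
                         ang_threshold=120.0, lin_threshold=1.0):
    """Return the resolution to render at, given the device's angular
    velocity (deg/s) and linear velocity (m/s).

    Thresholds and resolution values are illustrative assumptions.
    """
    LOW_RES = (2048, 1080)   # "2K" stand-in
    HIGH_RES = (3840, 2160)  # "4K" stand-in
    # One candidate from the angular velocity range, one from the linear one.
    first_candidate = LOW_RES if angular_velocity > ang_threshold else HIGH_RES
    second_candidate = LOW_RES if linear_velocity > lin_threshold else HIGH_RES
    # Adopt the smaller candidate: if either rotation or translation is
    # fast, render at the lower resolution to keep the frame rate high.
    return min(first_candidate, second_candidate, key=lambda r: r[0] * r[1])
```

Comparing by pixel count makes "smaller" well defined even when the two candidate resolutions have different aspect ratios.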
  • the motion speed includes angular velocity and linear velocity, and both angular velocity and linear velocity are divided into 2 ranges.
  • the larger angular velocity range and linear velocity range correspond to the first resolution
  • the smaller angular velocity range and linear velocity range correspond to the second resolution.
  • Take the case where the first resolution is smaller than the second resolution as an example:
  • the angular velocity is greater than the angular velocity threshold, that is, the first angular velocity range in which the angular velocity is located is a larger angular velocity range; the linear velocity is greater than the linear velocity threshold, that is, the first linear velocity range in which the linear velocity is located is a larger linear velocity range.
  • The angular velocity threshold and the linear velocity threshold can be determined in advance through experiments. For example, multiple angular velocity values and multiple linear velocity values are selected at a fixed step interval, the steps of the embodiments of the present disclosure are executed with each angular velocity value and linear velocity value in turn, and the most suitable values are chosen as the aforementioned thresholds based on the user's perception of viewing fluency and clarity.
  • the above-mentioned method can also be used to determine the critical values of different ranges.
  • the method further includes: determining the resolution corresponding to each motion speed range, that is, determining the correspondence between the motion speed range and the resolution.
  • Different terminal devices may be configured with different graphics cards, and different graphics cards have different processing capabilities. For example, when a low-end graphics card and a high-end graphics card render at the same resolution, the images rendered by the low-end card will stutter severely. Therefore, for each graphics card, resolutions matching its processing capability can be assigned to the motion speed ranges, so that the card's performance is fully utilized.
  • the terminal device may store the corresponding relationship between the motion speed range and the resolution associated with the graphics card of different models.
  • the terminal device first obtains the graphics card information, determines the graphics card model based on the graphics card information, and determines the correspondence between the motion speed range and the resolution based on the graphics card model, so that the used correspondence matches the performance of the graphics card.
  • determining the resolution corresponding to each motion speed range includes:
  • image rendering is performed at a plurality of different resolutions; the frame rate range of the picture under each resolution is determined; and, based on these frame rate ranges and the frame rate ranges corresponding to the different motion speed ranges, the resolution corresponding to each motion speed range is selected, yielding the correspondence between the motion speed ranges and the resolutions.
  • The frame rate ranges corresponding to the different motion speed ranges can be obtained by dividing a set frame rate band such that the higher the motion speed, the higher the corresponding frame rate range. For example, the band between 25 and 120 is divided into multiple equal frame rate ranges, and these frame rate ranges correspond one-to-one to the motion speed ranges.
  • When the frame rate is below 25, the picture feels stuck even if the user is static, so such frame rates are not used; when the frame rate is above 120, graphics card resource consumption is large while the user perceives little improvement. Therefore, the frame rate range of 25-120 is used as the basis for determining the resolution.
  • the frame rate of 25 means that the screen is refreshed 25 times per second (corresponding to 25 images)
  • the frame rate of 120 means that the screen is refreshed 120 times per second (corresponding to 120 images).
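The equal division of the 25-120 band described above can be sketched as follows (a minimal illustration; the patent does not prescribe this exact helper):

```python
def frame_rate_ranges(n_ranges, lo=25.0, hi=120.0):
    """Divide the usable frame rate band [lo, hi] into n equal sub-ranges.

    Each sub-range is then paired one-to-one with a motion speed range,
    with faster motion mapped to the higher frame rate range.
    """
    step = (hi - lo) / n_ranges
    return [(lo + i * step, lo + (i + 1) * step) for i in range(n_ranges)]
```

With two motion speed ranges, this yields (25, 72.5) for slow motion and (72.5, 120) for fast motion.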
  • the following is an example of determining the first resolution and the second resolution to explain how to determine the resolution corresponding to each motion speed range.
  • the determining of the first resolution and the second resolution may include:
  • A plurality of different resolutions are used for image rendering respectively; the different resolutions here can be the resolutions preset by the application (for example, a VR application).
  • Based on the frame rate range of the picture under each resolution, the first resolution and the second resolution are selected: when the first resolution is used for image rendering, the frame rate range of the picture is the first frame rate range; when the second resolution is used, it is the second frame rate range; and first frame rate < second frame rate range < second frame rate < first frame rate range (that is, every value in the second frame rate range exceeds the first frame rate, and every value in the first frame rate range exceeds the second frame rate).
  • The selected first frame rate range and second frame rate range are not contiguous, so that the fluency differs noticeably between pictures at the first resolution and at the second resolution: when the lower resolution is displayed, the picture is smooth, and when the lower frame rate picture is displayed, the picture definition is high.
  • The resolutions are selected according to the frame rate range of the picture displayed by the wearable device, to ensure that the selected first resolution and second resolution keep the picture smooth when the wearable device moves at high speed, and keep the picture clear when it moves at low speed or is still.
  • If multiple candidate first resolutions (or second resolutions) meet the conditions, the highest one is selected, so as to make full use of the graphics card performance and improve graphics card utilization.
  • the first frame rate is greater than 25, and the second frame rate is greater than 60.
  • When the first resolution is adopted, the frame rate range of the picture displayed by the wearable device is higher than 60, which ensures the smoothness of the picture when the device is moving at high speed; when the second resolution is adopted, the frame rate range is higher than 25 yet below the frame rates reached with the first resolution, which ensures that the picture does not stutter when the device is moving at low speed or is stationary, while trading frame rate for high-resolution rendering and a clear picture.
  • Using the above frame rates to determine the resolutions not only ensures the utilization of high-performance graphics cards but also ensures the normal operation of low-performance graphics cards.
  • The first frame rate may be 30, and the second frame rate may be 75, because if the head is kept still, the human eye notices no stutter when the frame rate of the picture is above 30; if the head moves quickly, the rendering frame rate must reach 75 or higher for stutter to be imperceptible to the human eye.
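The selection rule described above (pick the highest resolution whose worst-case frame rate stays above the relevant threshold) can be sketched as follows; the calibration numbers are invented for illustration:

```python
def pick_resolution(measured_min_fps, floor_fps):
    """From {resolution: lowest observed frame rate}, return the highest
    resolution whose worst-case frame rate exceeds floor_fps, or None."""
    ok = [res for res, fps in measured_min_fps.items() if fps > floor_fps]
    return max(ok, key=lambda r: r[0] * r[1]) if ok else None

# Hypothetical per-graphics-card calibration data:
measured = {(1920, 1080): 90, (2560, 1440): 70, (3840, 2160): 40}
first_resolution = pick_resolution(measured, 75)   # fast motion: stay above 75 fps
second_resolution = pick_resolution(measured, 30)  # slow motion: stay above 30 fps
```

Run per graphics card, this naturally yields a correspondence matched to the card's capability, as the surrounding text suggests.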
  • Step 205 Perform image rendering using the determined resolution.
  • the terminal device may automatically adjust the resolution based on the motion speed, that is, the solution provided in the embodiment of the present disclosure is adopted to realize the resolution adjustment, and then the image rendering is performed according to the adjusted resolution.
  • the resolution corresponding to each motion speed range may be the resolution preset by the VR application, so as to ensure the normal output of the VR application.
  • The maximum resolution among those corresponding to the motion speed ranges can be the one, among the multiple resolutions preset by the VR application, that matches the screen of the wearable device, such as 4K, so that the wearable device can display a full-resolution picture.
  • When the wearable device is moving at high speed, the resolution used for image rendering is low, which reduces the rendering load on the terminal device, so the resulting picture has a higher frame rate and is smooth. Owing to the physiological characteristic of "saccade suppression", when the eyes move quickly with the head they cannot focus accurately, the perception of picture clarity is reduced, and the lower image resolution goes unnoticed.
  • When the wearable device is moving at low speed or is still, the resolution used for image rendering is higher. Although the frame rate of the resulting picture is lower, the image definition is high. Since there is little change between adjacent frames, the user does not perceive stutter even at a low frame rate; the user's attention is mainly on the clarity of the image, and the high clarity improves the viewing experience.
  • the user can manually set the output image resolution, for example, select the output image resolution to be 2K or 4K. Therefore, in the embodiments of the present disclosure, in addition to the automatic resolution mode, that is, the automatic setting of the terminal device, the VR application may also have a manual resolution mode, that is, the manner set by the user.
  • An automatic/manual switch button can be set in VR games or other VR applications to switch between automatic setting and manual setting.
  • The terminal device determines which mode to use according to the user's selection instruction, and performs the resolution determination operation according to that mode.
  • Step 206 Output the rendered image to the wearable device.
  • the screen of the wearable device can be of various types, such as organic light emitting diode (OLED), liquid crystal display (LCD), and so on.
  • the terminal device may also perform a dynamic dimming (Local Dimming) process after the image rendering is completed, and transmit the result of the dynamic dimming to the wearable device.
  • Dynamic dimming processing refers to a scheme that modulates the screen backlight according to the picture to be displayed in order to enhance picture contrast; that is, the terminal device determines the brightness of each backlight unit of the wearable device's screen backlight according to the grayscale of each pixel in the rendered image.
  • the result of the aforementioned dynamic dimming may be the brightness information of each backlight unit of the backlight source of the screen of the wearable device.
  • the backlight of the screen of the wearable device is composed of multiple backlight units, and the brightness of each backlight unit can be individually controlled.
  • the result of the dynamic dimming can be transmitted to the wearable device through the image transmission channel together with the image obtained by the foregoing rendering.
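One simple way such per-unit brightness values could be computed is sketched below (the max-of-block rule is a common local dimming heuristic, not necessarily the scheme used in the patent; the function name and grid layout are assumptions):

```python
def backlight_brightness(gray, units_x, units_y):
    """Compute one brightness value per backlight unit from an image's
    grayscale values (rows of equal length, 0-255).

    Each unit is driven by the peak grayscale of the image block in
    front of it; image dimensions are assumed divisible by the grid.
    """
    h, w = len(gray), len(gray[0])
    bh, bw = h // units_y, w // units_x
    return [
        [max(gray[y][x]
             for y in range(j * bh, (j + 1) * bh)
             for x in range(i * bw, (i + 1) * bw))
         for i in range(units_x)]
        for j in range(units_y)
    ]
```

The resulting units_y-by-units_x grid of brightness values is exactly the kind of per-unit result that could travel to the wearable device alongside the rendered image.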
  • Fig. 4 is a schematic structural diagram of an image processing device provided by an embodiment of the present disclosure.
  • the image processing apparatus includes: an acquisition module 301, a first determination module 302, a rendering module 303, and an output module 304.
  • the obtaining module 301 is configured to obtain the movement speed of the wearable device
  • the first determining module 302 is configured to determine the resolution of the image to be rendered according to the motion speed, and the resolution is negatively related to the motion speed;
  • the rendering module 303 is configured to perform image rendering using the determined resolution
  • the output module 304 is configured to output the rendered image to the wearable device.
  • the acquisition module 301 may include a sensor, such as a posture sensor.
  • the posture sensor can detect the posture data of the wearable device, and the movement speed of the wearable device can be obtained based on the posture data.
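One way to obtain a movement speed from posture data, sketched under stated assumptions: the sensor delivers attitude samples as (yaw, pitch, roll) Euler angles in radians at known intervals, and the angular speed is estimated by finite differences. The sample format, field names, and `dt` are illustrative assumptions, not the sensor's actual API.

```python
import math

# Hypothetical sketch: estimate the headset's angular speed from two
# consecutive attitude samples delivered by the posture sensor.

def angular_speed(prev, curr, dt):
    """prev/curr: (yaw, pitch, roll) in radians; dt: seconds between samples."""
    def wrap(a):
        # Wrap an angle difference into [-pi, pi] so a crossing at +/-pi
        # does not look like a huge rotation.
        return (a + math.pi) % (2 * math.pi) - math.pi

    d = [wrap(c - p) for p, c in zip(prev, curr)]
    return math.sqrt(sum(x * x for x in d)) / dt  # rad/s
```

A linear velocity could be estimated the same way from consecutive position samples, when the tracking system provides them.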
  • the first determining module 302 includes:
  • the first determining submodule 321 is configured to determine a first motion speed range in which the motion speed is located, and the first motion speed range is one of at least two preset motion speed ranges;
  • the second determining sub-module 322 is configured to determine the resolution corresponding to the first motion speed range based on the corresponding relationship between the motion speed range and the resolution.
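The two sub-modules above amount to an interval lookup: find the preset motion-speed range the measured speed falls in, then return that range's resolution. A minimal sketch, in which the range boundaries (deg/s) and the resolutions are made-up example values rather than anything specified by the patent:

```python
# Illustrative correspondence between motion-speed ranges and resolutions.
SPEED_RANGES = [
    ((0, 30), (3840, 2160)),             # slow motion -> full resolution
    ((30, float("inf")), (1920, 1080)),  # fast motion -> reduced resolution
]

def resolution_for_speed(speed):
    """Return the resolution of the range containing `speed` (lower bound inclusive)."""
    for (lo, hi), res in SPEED_RANGES:
        if lo <= speed < hi:
            return res
    raise ValueError("speed outside all preset ranges")
```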
  • the movement speed includes at least one of an angular velocity and a linear velocity.
  • the second determining sub-module 322 is configured to: determine, based on the correspondence between angular velocity ranges and resolutions, a first candidate resolution corresponding to the first angular velocity range in which the angular velocity falls; determine, based on the correspondence between linear velocity ranges and resolutions, a second candidate resolution corresponding to the first linear velocity range in which the linear velocity falls; and adopt the smaller of the first candidate resolution and the second candidate resolution as the resolution of the image to be rendered.
  • the first angular velocity range is one of at least two preset angular velocity ranges, and the first linear velocity range is one of at least two preset linear velocity ranges.
  • there are two preset angular velocity ranges and two preset linear velocity ranges;
  • the smaller angular velocity range in the preset angular velocity range corresponds to the same resolution as the smaller linear velocity range in the preset linear velocity range;
  • the larger angular velocity range in the preset angular velocity range and the larger linear velocity range in the preset linear velocity range correspond to the same resolution.
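Combining the two channels as described, with the two-by-two correspondence, can be sketched as follows. All thresholds (deg/s, m/s) and resolutions are illustrative assumptions; "smaller" is taken to mean fewer total pixels.

```python
# Illustrative two-range correspondence shared by both velocity channels.
HIGH_RES, LOW_RES = (3840, 2160), (1920, 1080)
ANGULAR_RANGES = [((0, 30), HIGH_RES), ((30, float("inf")), LOW_RES)]   # deg/s
LINEAR_RANGES = [((0, 1.0), HIGH_RES), ((1.0, float("inf")), LOW_RES)]  # m/s

def lookup(ranges, value):
    for (lo, hi), res in ranges:
        if lo <= value < hi:
            return res

def pick_resolution(angular, linear):
    """Candidate from each channel; adopt the one with fewer pixels."""
    a = lookup(ANGULAR_RANGES, angular)
    b = lookup(LINEAR_RANGES, linear)
    return min(a, b, key=lambda r: r[0] * r[1])
```

Taking the minimum means that fast motion on either channel is enough to drop the resolution, which matches the negative correlation between speed and resolution stated earlier.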
  • the device may further include:
  • the second determining module 305 is configured to determine the resolution corresponding to each movement speed range.
  • the second determining module 305 is configured to: perform image rendering at a plurality of different resolutions; determine, for each of the plurality of different resolutions, the frame rate range of the picture when that resolution is used for image rendering; and, according to the frame rate ranges of the picture and the frame rate ranges corresponding to the different motion speed ranges, select the resolution corresponding to each motion speed range, to obtain the correspondence between motion speed ranges and resolutions.
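One plausible reading of this selection step, sketched under stated assumptions: each candidate resolution has a measured minimum achieved frame rate, each motion-speed range has a minimum frame rate it requires (faster motion needs smoother display), and each range is assigned the sharpest resolution that still meets its requirement. The data shapes and numbers are made up for illustration.

```python
# Hypothetical sketch of building the speed-range -> resolution correspondence
# from measured rendering performance.

def build_correspondence(measured_fps, required_fps_per_range):
    """measured_fps: {resolution: minimum achieved fps at that resolution};
    required_fps_per_range: {speed_range_name: minimum fps that range needs}."""
    mapping = {}
    for speed_range, need in required_fps_per_range.items():
        # Among the resolutions fast enough for this range, keep the sharpest.
        ok = [r for r, fps in measured_fps.items() if fps >= need]
        mapping[speed_range] = max(ok, key=lambda r: r[0] * r[1])
    return mapping
```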
  • Fig. 5 is a structural block diagram of a terminal device provided by an embodiment of the present disclosure.
  • the terminal device 400 includes a central processing unit (CPU) 401, a system memory 404 including a random access memory (RAM) 402 and a read only memory (ROM) 403, and a system bus 405 connecting the system memory 404 and the central processing unit 401.
  • the terminal device 400 also includes a basic input/output system (I/O system) 406, which helps transfer information between the various devices in the computer, and a mass storage device 407 for storing the operating system 413, application programs 414, and other program modules 415.
  • the basic input/output system 406 includes a display 408 for displaying information and an input device 409 such as a mouse and a keyboard for the user to input information.
  • the display 408 and the input device 409 are both connected to the central processing unit 401 through the input and output controller 410 connected to the system bus 405.
  • the basic input/output system 406 may also include an input and output controller 410 for receiving and processing input from multiple other devices such as a keyboard, a mouse, or an electronic stylus.
  • the input and output controller 410 also provides output to a display screen, a printer, or other types of output devices.
  • the mass storage device 407 is connected to the central processing unit 401 through a mass storage controller (not shown) connected to the system bus 405.
  • the mass storage device 407 and its associated computer readable medium provide non-volatile storage for the terminal device 400. That is, the mass storage device 407 may include a computer-readable medium (not shown) such as a hard disk or a CD-ROM drive.
  • Computer-readable media may include computer storage media and communication media.
  • Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storing information such as computer readable instructions, data structures, program modules or other data.
  • Computer storage media include RAM, ROM, EPROM, EEPROM, flash memory or other solid-state storage technologies, CD-ROM, DVD or other optical storage, tape cartridges, magnetic tape, disk storage or other magnetic storage devices.
  • the terminal device 400 may also operate through a remote computer connected via a network such as the Internet. That is, the terminal device 400 can be connected to the network 412 through the network interface unit 411 connected to the system bus 405, and can also use the network interface unit 411 to connect to other types of networks or remote computer systems (not shown).
  • the memory also includes one or more programs, one or more programs are stored in the memory, and the central processing unit 401 executes the one or more programs to implement the image processing method shown in FIG. 2 or FIG. 3.
  • a non-transitory computer-readable storage medium including instructions is also provided, such as a memory including instructions, which can be executed by a processor of a terminal device to complete the image processing method shown in each embodiment of the present disclosure.
  • for example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, etc.
  • the embodiment of the present disclosure also provides a wearable system, which includes a terminal device and a wearable device;
  • the terminal device is configured to perform the image processing method described above, and the wearable device is configured to display the image output by the terminal device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present disclosure relates to the technical field of displays. An image processing method and device, a terminal device, a medium, and a wearable system are disclosed. The image processing method comprises the following steps: acquiring the movement speed of a wearable device; determining, on the basis of the movement speed, the resolution of an image to be rendered, the resolution being negatively correlated with the movement speed; using the resolution to render the image; and outputting the rendered image to the wearable device. By using different resolutions for image rendering in different scenarios, without changing graphics card performance, the user is allowed to view a smooth and clear picture, thereby reducing the graphics card performance requirements and lowering the barrier to the popularization of VR.
PCT/CN2020/097056 2019-06-24 2020-06-19 Image processing method and device, terminal device, medium, and wearable system WO2020259402A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910551530.8A CN110166758B (zh) 2019-06-24 2019-06-24 Image processing method and apparatus, terminal device and storage medium
CN201910551530.8 2019-06-24

Publications (1)

Publication Number Publication Date
WO2020259402A1 true WO2020259402A1 (fr) 2020-12-30

Family

ID=67626975

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/097056 WO2020259402A1 (fr) 2019-06-24 2020-06-19 Image processing method and device, terminal device, medium, and wearable system

Country Status (2)

Country Link
CN (1) CN110166758B (fr)
WO (1) WO2020259402A1 (fr)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110166758B (zh) * 2019-06-24 2021-08-13 京东方科技集团股份有限公司 Image processing method and apparatus, terminal device and storage medium
CN110930307B (zh) * 2019-10-31 2022-07-08 江苏视博云信息技术有限公司 Image processing method and device
CN110860084B (zh) * 2019-11-14 2024-02-23 珠海金山数字网络科技有限公司 Virtual picture processing method and device
CN110910509A (zh) * 2019-11-21 2020-03-24 Oppo广东移动通信有限公司 Image processing method, electronic device and storage medium
CN111679739B (zh) * 2020-06-04 2024-04-09 京东方科技集团股份有限公司 Readable storage medium, virtual reality device, and control method and control device therefor
CN111629198B (zh) * 2020-06-08 2022-08-09 京东方科技集团股份有限公司 Imaging system and control method, control device and storage medium therefor
CN111836116A (zh) * 2020-08-06 2020-10-27 武汉大势智慧科技有限公司 Video display method and system with network-adaptive rendering
CN113515193B (zh) * 2021-05-17 2023-10-27 聚好看科技股份有限公司 Model data transmission method and device
CN114167992A (zh) * 2021-12-17 2022-03-11 深圳创维数字技术有限公司 Display picture rendering method, electronic device and readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108596834A (zh) * 2018-04-27 2018-09-28 腾讯科技(深圳)有限公司 Image resolution processing method, image processing apparatus and system, and storage medium
US20180307305A1 (en) * 2017-04-24 2018-10-25 Intel Corporation Compensating for High Head Movement in Head-Mounted Displays
CN109712224A (zh) * 2018-12-29 2019-05-03 青岛海信电器股份有限公司 Virtual scene rendering method and device, and smart device
CN109803138A (zh) * 2017-11-16 2019-05-24 宏达国际电子股份有限公司 Adaptive interleaved image warping method, system and recording medium
CN110166758A (zh) * 2019-06-24 2019-08-23 京东方科技集团股份有限公司 Image processing method and apparatus, terminal device and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9897805B2 (en) * 2013-06-07 2018-02-20 Sony Interactive Entertainment Inc. Image rendering responsive to user actions in head mounted display
US9489044B2 (en) * 2014-11-07 2016-11-08 Eye Labs, LLC Visual stabilization system for head-mounted displays
US10460704B2 (en) * 2016-04-01 2019-10-29 Movidius Limited Systems and methods for head-mounted display adapted to human visual mechanism

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180307305A1 (en) * 2017-04-24 2018-10-25 Intel Corporation Compensating for High Head Movement in Head-Mounted Displays
CN109803138A (zh) * 2017-11-16 2019-05-24 宏达国际电子股份有限公司 Adaptive interleaved image warping method, system and recording medium
CN108596834A (zh) * 2018-04-27 2018-09-28 腾讯科技(深圳)有限公司 Image resolution processing method, image processing apparatus and system, and storage medium
CN109712224A (zh) * 2018-12-29 2019-05-03 青岛海信电器股份有限公司 Virtual scene rendering method and device, and smart device
CN110166758A (zh) * 2019-06-24 2019-08-23 京东方科技集团股份有限公司 Image processing method and apparatus, terminal device and storage medium

Also Published As

Publication number Publication date
CN110166758B (zh) 2021-08-13
CN110166758A (zh) 2019-08-23

Similar Documents

Publication Publication Date Title
WO2020259402A1 (fr) Image processing method and device, terminal device, medium, and wearable system
US11321906B2 (en) Asynchronous time and space warp with determination of region of interest
EP3491489B1 (fr) Systèmes et procédés pour réduire la latence entre mouvement et photon et la largeur de bande de mémoire dans un système de réalité virtuelle
CN112639577B (zh) 基于应用程序渲染性能的预测和限流调整
US11442540B2 (en) Eye tracking using low resolution images
US9652893B2 (en) Stabilization plane determination based on gaze location
US9424767B2 (en) Local rendering of text in image
US10712817B1 (en) Image re-projection for foveated rendering
JP2023507085A (ja) ヘッドマウントディスプレイとホストコンピュータとの間の分割レンダリング
EP3485350B1 (fr) Restitution fovéale
WO2020140758A1 (fr) Procédé d'affichage d'image, procédé de traitement d'image et dispositifs associés
WO2020003860A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
US10957063B2 (en) Dynamically modifying virtual and augmented reality content to reduce depth conflict between user interface elements and video content
CN109242944B (zh) 一种显示方法和装置
CN112805755B (zh) 信息处理装置、信息处理方法和记录介质
US20160252730A1 (en) Image generating system, image generating method, and information storage medium
CN111066081B (zh) 用于补偿虚拟现实的图像显示中的可变显示设备等待时间的技术
KR20200063614A (ko) Ar/vr/mr 시스템용 디스플레이 유닛
US11694379B1 (en) Animation modification for optical see-through displays
WO2024040613A1 (fr) Procédé et appareil de traitement d'image
US20240112303A1 (en) Context-Based Selection of Perspective Correction Operations
US11706383B1 (en) Presenting video streams on a head-mountable device
WO2020237421A1 (fr) Procédé et dispositif pour commander un dispositif d'affichage de réalité virtuelle
WO2019061632A1 (fr) Procédé de commande, dispositif et système destiné à un dispositif de visiocasque
JP2013003593A (ja) 表示装置、表示方法、プログラム、及び集積回路

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20833633

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20833633

Country of ref document: EP

Kind code of ref document: A1


32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 04/08/2022)
