CN107545595B - VR scene processing method and VR equipment


Info

Publication number
CN107545595B
Authority
CN
China
Prior art keywords
scene
resources
color difference
scene resources
edge
Prior art date
2017-08-16
Legal status
Active
Application number
CN201710703073.0A
Other languages
Chinese (zh)
Other versions
CN107545595A (en)
Inventor
Qiu Tao (邱涛)
Current Assignee
Goertek Technology Co Ltd
Original Assignee
Goertek Optical Technology Co Ltd
Priority date
2017-08-16
Filing date
2017-08-16
Publication date
2021-05-28
Application filed by Goertek Optical Technology Co Ltd
Priority to CN201710703073.0A
Publication of CN107545595A
Application granted
Publication of CN107545595B


Abstract

The invention provides a VR scene processing method and a VR device. The method comprises the following steps: acquiring an edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent; if the edge color difference value is greater than a set color difference threshold, increasing the relative distance between the at least two scene resources to obtain a new scene layout; and rendering the at least two scene resources into the VR scene according to the new scene layout. The method can overcome the technical defect that the edges of scene resources flicker when the VR scene is refreshed.

Description

VR scene processing method and VR equipment
Technical Field
The invention relates to the technical field of virtual reality, in particular to a VR scene processing method and VR equipment.
Background
With the development of Virtual Reality (VR) technology, scene resources such as pictures and text can be rendered on the display screen of a VR device, and the display screen is continuously refreshed for the user to watch.
In the prior art, when scene resources are rendered on the display screen of a VR device, the edges of the scene resources may exhibit jagged (sawtooth) edges and other lossy artifacts, limited by the display technology of the screen. The jagged edges are most pronounced between scene resources whose edge colors differ greatly. Consequently, when the display screen is continuously refreshed, jagged edges with large color differences overlap and interfere with each other, which can produce a color-jumping flicker that seriously degrades picture quality.
Disclosure of Invention
Aspects of the present invention provide a VR scene processing method and a VR device to overcome the technical defect that the edges of scene resources flicker when a VR scene is refreshed.
The invention provides a VR scene processing method, which comprises the following steps:
acquiring an edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent;
if the edge color difference value is larger than a set color difference threshold value, increasing the relative distance between the at least two scene resources to obtain a new scene layout;
rendering the at least two scene resources into the VR scene according to the new scene layout.
Optionally, the obtaining of an edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent includes:
acquiring coordinates and geometric information of the at least two scene resources according to the labels of the at least two scene resources and the corresponding relation between the labels and the resource information;
determining edge coordinates of corresponding positions of the at least two scene resources according to the coordinates and the geometric information of the at least two scene resources;
and subtracting the color values of the grids where the edge coordinates are located to obtain the edge color difference values of the corresponding positions of the at least two scene resources adjacent to the display position.
Optionally, before obtaining the coordinates and the geometric information of the at least two scene resources according to the tags of the at least two scene resources and the correspondence between the tags and the resource information, the method further includes:
respectively marking labels on the at least two scene resources;
and establishing a corresponding relation between the label and the coordinate and geometric information of the scene resource marked by the label.
Optionally, the increasing the relative distance between the at least two scene resources comprises:
moving one scene resource of the at least two scene resources away from the other scene resource; or
moving two scene resources of the at least two scene resources in opposite directions.
Optionally, the moving one of the at least two scene resources away from another scene resource includes:
moving one scene resource of the at least two scene resources N unit lengths in a direction away from the other scene resource;
the moving two scene resources of the at least two scene resources in opposite directions includes:
moving two scene resources of the at least two scene resources in opposite directions by M1 and M2 unit lengths, respectively;
wherein N, M1 and M2 are natural numbers.
Optionally, the rendering the at least two scene resources into the VR scene according to the new scene layout includes:
filling a background color in a blank area between the at least two scene resources with the increased relative distance;
rendering the at least two scene resources and the background color into the VR scene according to the new scene layout.
The present invention also provides a VR device comprising: a processor, and a memory and an output component respectively connected to the processor;
the memory to store one or more computer instructions;
the processor is configured to: executing the one or more computer instructions to:
acquiring an edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent;
if the edge color difference value is larger than a set color difference threshold value, increasing the relative distance between the at least two scene resources to obtain a new scene layout;
the output component is to:
rendering the at least two scene resources into the VR scene according to the new scene layout.
Optionally, when obtaining the edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent, the processor is specifically configured to:
acquiring coordinates and geometric information of the at least two scene resources according to the labels of the at least two scene resources and the corresponding relation between the labels and the resource information;
determining edge coordinates of corresponding positions of the at least two scene resources according to the coordinates and the geometric information of the at least two scene resources;
and subtracting the color values of the grids where the edge coordinates are located to obtain the edge color difference values of the corresponding positions of the at least two scene resources adjacent to the display position.
Optionally, the processor is further configured to:
respectively marking labels on the at least two scene resources;
and establishing a corresponding relation between the label and the coordinate and geometric information of the scene resource marked by the label.
Optionally, when increasing the relative distance between the at least two scene resources, the processor is specifically configured to:
moving one scene resource of the at least two scene resources away from the other scene resource; or
moving two scene resources of the at least two scene resources in opposite directions.
In the invention, when the edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent is greater than the set color difference threshold, the scene resources with the large edge color difference are moved apart by increasing the relative distance between them. In this way, when the display screen is refreshed, the probability that the jagged edges of scene resources with large edge color differences overlap and interfere with each other is reduced, the probability of the color-jumping flicker is correspondingly reduced, and the scene resources are displayed to the user as smoothly and faithfully as possible.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and not to limit the invention. In the drawings:
fig. 1 is a schematic flowchart of a VR scene processing method according to an embodiment of the present invention;
fig. 2 is a schematic edge diagram of a scene resource according to an embodiment of the present invention;
fig. 3a and fig. 3b are schematic diagrams illustrating movement of two scene resources according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a VR device according to an embodiment of the present invention;
fig. 5 is a schematic view of an internal configuration structure of a head-mounted display device according to another embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the specific embodiments of the present invention and the accompanying drawings. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The technical solutions provided by the embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of a VR scene processing method according to an embodiment of the present invention. As shown in fig. 1, the method comprises the steps of:
s101: and acquiring the edge color difference value of the corresponding positions of at least two scene resources adjacent to the display position.
S102: and if the edge color difference value is larger than the set color difference threshold value, increasing the relative distance between at least two scene resources to obtain a new scene layout.
S103: and rendering at least two scene resources into the VR scene according to the new scene layout.
The at least two scene resources may refer to scene resources in a target texture image, the target texture image being a two-dimensional image prior to rendering into the VR scene. In this embodiment, before the target texture image is rendered into the VR scene, the positions of the scene resources in the target texture image are adjusted in advance, so as to solve the technical problem that the edges of the scene resources flicker when the target texture image is rendered into the VR scene and continuously refreshed. For convenience of description, the layout that the scene resources initially have in the target texture image is referred to as the initial scene layout.
First, the edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent is obtained. The scene resources may include, but are not limited to, pictures, text, combinations of pictures and text, and the like.
Optionally, the display position of a scene resource may be the display position of its center in the target texture image, or the display position of its center on the display screen. To find scene resources with adjacent display positions, a distance threshold may be set: if the distance between the display positions of two scene resources is smaller than the set distance threshold, the two can be regarded as scene resources whose display positions are adjacent. In general, a scene resource may have 1, 2, or more scene resources adjacent to its display position. Likewise, there may be 1, 2, or more groups of scene resources with adjacent display positions, each group containing two scene resources whose display positions are adjacent. The method of obtaining the edge color difference value and the method of increasing the relative distance are the same for every group.
Optionally, the edge color difference values may be obtained at the corresponding positions of all scene resources whose display positions are adjacent, or only of some of those scene resources, where "some scene resources" means two or more of them.
After the at least two scene resources whose display positions are adjacent are determined, the edge color values at their corresponding positions may be further obtained.
The edge at the corresponding position of a scene resource refers to the portion of its edge that is close or adjacent to the other scene resource; it may be part or all of the whole edge surrounding the scene resource.
The edges of the scene resources may or may not be color-filled. For color-filled edges, the edge color value at the corresponding position of the scene resource can be obtained. Here, the color values may be the red (R), green (G), and blue (B) tristimulus values.
Alternatively, the color values of the ring of pixels around the edge of a scene resource may be used directly as the edge color values at its corresponding position. A plurality of pixel points surrounds the edge of the scene resource in a ring, and their color values may be the same or different. If the color values of these pixels are the same, that common color value can be used as the edge color value of the scene resource. If they differ, the average or the median of the pixels' color values can be computed and used as the edge color value at the corresponding position of the scene resource.
Optionally, the coordinates of the corresponding positions of the at least two scene resources whose display positions are adjacent may also be obtained first, and the color values at those coordinates then used as the edge color values at the corresponding positions of the scene resources. Optionally, color values may also be sampled at random points on the edges of the at least two scene resources whose display positions are adjacent and used as the edge color values at their corresponding positions.
After the edge color values at the corresponding positions of the at least two scene resources whose display positions are adjacent are obtained, they may be subtracted to obtain the edge color difference value at the corresponding positions of the scene resources.
Optionally, the R, G, and B components of the edge color values may be subtracted separately to obtain three color difference values, one each for R, G, and B; the average of these three color difference values is then used as the edge color difference value at the corresponding positions of the at least two scene resources whose display positions are adjacent. Of course, the three R, G, B color difference values can also be used directly as the edge color difference value at the corresponding positions of the at least two scene resources whose display positions are adjacent.
In an example, assume the edge color value at the corresponding position of character A is RGB: 100, 100, 50, and the edge color value at the corresponding position of character B is RGB: 200, 100, 50. Subtracting the two sets of RGB values component-wise gives the three color difference values between characters A and B: RGB: 100, 0, 0. Then, optionally, the average of the three color difference values, 33.3, may be used as the edge color difference value at the corresponding positions of characters A and B.
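As a hedged one-function sketch, the per-channel subtraction and averaging from this example look like:

```python
def edge_color_diff(rgb_a, rgb_b):
    """Per-channel absolute differences and their mean (the example's 33.3)."""
    diffs = [abs(a - b) for a, b in zip(rgb_a, rgb_b)]
    return diffs, sum(diffs) / 3.0

diffs, avg = edge_color_diff((100, 100, 50), (200, 100, 50))
print(diffs, round(avg, 1))  # [100, 0, 0] 33.3
```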
The edge color difference value represents the degree of contrast between the edge colors: the larger the edge color difference value, the more obvious the edge color contrast. For example, the edge color value of picture A is RGB: 0, 0, 0 (black) and the edge color value of picture B is RGB: 255, 255, 255 (white). The average of the three RGB color difference values is large, indicating that the edge color contrast between picture A and picture B is obvious.
When the edge color contrast at the corresponding positions of at least two scene resources whose display positions are adjacent is obvious, the color-jumping flicker is likely to occur. Based on this, a color difference threshold can be set: if the edge color difference value is greater than the set color difference threshold, meaning the edge color difference is large and the edge color contrast obvious, the relative distance between the scene resources whose display positions are adjacent is increased to obtain a new scene layout.
Alternatively, if the edge color difference value consists of the three RGB color difference values, 3 color difference thresholds may be set, one for each of R, G, and B. If at least one of the three color difference values is greater than its corresponding color difference threshold, the relative distance between the scene resources whose display positions are adjacent is increased. If the edge color difference value is the average of the three color difference values, a single color difference threshold may be set; if the average of the three color differences is greater than that color difference threshold, the relative distance between the at least two scene resources is increased.
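A small sketch of the two threshold variants described above; the threshold values themselves are illustrative assumptions:

```python
def exceeds_per_channel(diffs, thresholds=(30, 30, 30)):
    """Variant 1: true if any RGB difference exceeds its channel's threshold."""
    return any(d > t for d, t in zip(diffs, thresholds))

def exceeds_average(diffs, threshold=30.0):
    """Variant 2: true if the mean of the three differences exceeds one threshold."""
    return sum(diffs) / 3.0 > threshold

print(exceeds_per_channel((100, 0, 0)))  # True: the R difference exceeds 30
print(exceeds_average((100, 0, 0)))      # True: the mean 33.3 exceeds 30.0
```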
Increasing the relative distance between the at least two scene resources may mean increasing the relative distance between their centers. Optionally, the relative distance between the at least two scene resources may be increased to the set distance threshold, so that the display positions of the moved scene resources are no longer adjacent. In this way, in the new scene layout, the edge color difference values at the corresponding positions of any scene resources whose display positions are adjacent are all smaller than the set color difference threshold, and scene resources whose edge color difference value is greater than or equal to the set color difference threshold no longer have adjacent display positions. The at least two scene resources may then be rendered into the VR scene according to the new scene layout.
The at least two scene resources may include only the scene resources whose relative distance was increased, or both those and scene resources whose relative distance was not increased. The latter are the scene resources whose display positions are not adjacent, and the scene resources whose display positions are adjacent but whose edge color difference value is smaller than or equal to the set color difference threshold.
In this embodiment, when the edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent is greater than the set color difference threshold, the scene resources with the large edge color difference are moved apart by increasing the relative distance between them. In this way, when the display screen is refreshed, the probability that the jagged edges of scene resources with large edge color differences overlap and interfere with each other is reduced, the probability of the color-jumping flicker is correspondingly reduced, and the scene resources are displayed to the user as smoothly and faithfully as possible.
In the foregoing or the following embodiments, the edge coordinates at the corresponding positions of at least two scene resources whose display positions are adjacent may be obtained first, and the difference between the color values at those edge coordinates then taken as the edge color difference value. Based on this, obtaining the edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent includes: acquiring the coordinates and geometric information of the at least two scene resources according to their labels and the correspondence between labels and resource information; determining the edge coordinates at the corresponding positions of the at least two scene resources according to their coordinates and geometric information; and subtracting the color values of the grids where the edge coordinates are located to obtain the edge color difference value at the corresponding positions of the at least two scene resources whose display positions are adjacent.
Optionally, before the coordinates and geometric information of the at least two scene resources are obtained according to their labels and the correspondence between labels and resource information, labels may be marked on the at least two scene resources in advance, and a correspondence established between each label and the coordinates and geometric information of the scene resource it marks.
The label of a scene resource uniquely identifies it; information such as the number, name, usage, or path of the scene resource may be used as its label.
The label and the resource information have a corresponding relationship, and the resource information corresponding to the label is the resource information of the scene resource marked by the label. The resource information of the scene resource includes, but is not limited to, geometric information such as length, width, and shape of the scene resource, and information such as coordinates of the scene resource. Based on this, the coordinates and the geometric information of the at least two scene resources corresponding to the labels can be obtained according to the labels of the at least two scene resources and the corresponding relationship between the labels and the resource information.
Optionally, the coordinates of the scene resources refer to coordinates of the center of the scene resources in the target texture image coordinate system.
Then, the edge coordinates at the corresponding position of a scene resource are obtained using geometric relations from the center coordinates and the geometric information of the scene resource. In one example, as shown in fig. 2, the scene resources whose display positions are adjacent are picture A and picture B. The center coordinates of picture A are (0,0); picture A is a square with side length 2. The center coordinates of picture B are (3,0); picture B is a square with side length 2. The edge coordinates at the corresponding positions of picture A and picture B are the coordinates on the thick solid lines, for example, (1,1), (1,0), and (1,-1) on picture A and (2,1), (2,0), and (2,-1) on picture B.
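A hypothetical sketch of this geometric step, assuming axis-aligned squares as in the fig. 2 example:

```python
def facing_edge_coords(center, side, neighbor_center, offsets):
    """Edge points (x, y) on the side of a square resource facing its neighbor."""
    half = side / 2.0
    # the facing edge: right edge if the neighbor lies to the right, else left
    x = center[0] + half if neighbor_center[0] > center[0] else center[0] - half
    return [(x, center[1] + dy) for dy in offsets]

# fig. 2 example: picture A centered at (0,0), picture B at (3,0), side length 2
print(facing_edge_coords((0, 0), 2, (3, 0), [1, 0, -1]))  # [(1.0, 1), (1.0, 0), (1.0, -1)]
print(facing_edge_coords((3, 0), 2, (0, 0), [1, 0, -1]))  # [(2.0, 1), (2.0, 0), (2.0, -1)]
```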
Since the acquired edge coordinates are calculated in a mathematical coordinate system, each of them is essentially a point. The target texture image, by contrast, is composed of grids (cells), and each grid has a unique color value. Based on this, the color value of the grid containing the edge coordinates at the corresponding positions of the scene resources whose display positions are adjacent can be obtained and used as the edge color value at the corresponding positions of the two scene resources.
Alternatively, the coordinate range of each grid in the target texture image coordinate system may be calculated first. The grid whose coordinate range contains the edge coordinates at the corresponding position of a scene resource whose display position is adjacent is then determined, and the color value of that grid is used as the edge color value.
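A minimal sketch of mapping a point-valued edge coordinate to its grid and reading that grid's color; the image layout and grid size are assumptions:

```python
def grid_color(image, point, grid_size=1.0):
    """Return the color of the grid (cell) whose coordinate range contains `point`.

    `image` is assumed to be a 2D list of (R, G, B) values indexed as
    image[row][col]; `grid_size` is the side length of one grid in the
    texture coordinate system.
    """
    col = int(point[0] // grid_size)
    row = int(point[1] // grid_size)
    return image[row][col]

# two adjacent edge points, one from each resource; their grids' colors
# are subtracted to get the edge color difference value
image = [[(0, 0, 0), (255, 255, 255)],
         [(0, 0, 0), (255, 255, 255)]]
ca, cb = grid_color(image, (0.5, 0.5)), grid_color(image, (1.5, 0.5))
print([abs(a - b) for a, b in zip(ca, cb)])  # [255, 255, 255]
```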
In the above embodiment, if the edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent is greater than the set color difference threshold, the relative distance between the at least two scene resources is increased to obtain a new scene layout. Optionally, when the relative distance is increased, the at least two scene resources may be grouped in pairs, and the relative distance increased within each pair; the processing is the same for every pair. Based on this, the present embodiment can increase the relative distance between at least two scene resources through the following two embodiments.
The first embodiment: moving one scene resource of the at least two scene resources in a direction away from the other scene resource.
Optionally, all the pixel points in the scene resource are moved as a whole in the direction away from the other scene resource, thereby moving the one scene resource as a whole.
Optionally, one scene resource of the at least two scene resources is moved N unit lengths in the direction away from the other scene resource, where N is a natural number and one unit length is the length of one pixel. Moving a scene resource by N unit lengths thus effectively moves it by the length of N pixels, or N grids.
In an example, as shown in fig. 3a, picture C and picture D are scene resources whose display positions are adjacent, and the edge color difference value at their corresponding positions is greater than the set color difference threshold, so the relative distance between them can be increased. In this example, all the pixels of picture C may be moved N pixel lengths, or N unit lengths, in the direction away from picture D, where N can be a natural number such as 1, 2, or 3.
In a specific implementation, all the pixel points of picture C may be shifted N pixel lengths, or N unit lengths, in the direction in which the center of picture D points toward the center of picture C. Equally, all the pixels of picture D may be shifted N pixel lengths, or N unit lengths, in the direction in which the center of picture C points toward the center of picture D.
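A minimal sketch of this move along the line of centers; the helper name move_away and the coordinates are illustrative assumptions:

```python
def move_away(mover_center, other_center, n):
    """Shift `mover_center` by n unit lengths along the direction from
    the other resource's center toward the mover's center (fig. 3a)."""
    dx = mover_center[0] - other_center[0]
    dy = mover_center[1] - other_center[1]
    norm = (dx * dx + dy * dy) ** 0.5 or 1.0  # avoid division by zero
    return (mover_center[0] + n * dx / norm,
            mover_center[1] + n * dy / norm)

# picture C at (0, 0), picture D at (3, 0): move C two unit lengths away from D
print(move_away((0, 0), (3, 0), 2))  # (-2.0, 0.0)
```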
The second embodiment: moving two scene resources of the at least two scene resources in opposite directions.
Optionally, all the pixel points in the two scene resources are moved as wholes in opposite directions, thereby moving both scene resources as wholes.
Optionally, two scene resources of the at least two scene resources are moved in opposite directions by M1 and M2 unit lengths, respectively.
In an example, as shown in fig. 3b, picture C and picture D are scene resources whose display positions are adjacent, and the edge color difference value at their corresponding positions is greater than the set color difference threshold, so the relative distance between them can be increased. In this example, all the pixel points of picture C may be moved M1 unit lengths in the direction away from picture D, and all the pixel points of picture D moved M2 unit lengths in the direction away from picture C, where M1 and M2 may be natural numbers such as 1, 2, or 3.
In a specific implementation, all the pixel points of picture C may be shifted M1 unit lengths, or M1 pixel lengths, in the direction in which the center of picture D points toward the center of picture C, and all the pixels of picture D shifted M2 unit lengths, or M2 pixel lengths, in the direction in which the center of picture C points toward the center of picture D.
Alternatively, M1 and M2 may be the same or different. For example, picture C is moved 1 unit length away from picture D, and picture D is moved 2 unit lengths away from picture C. Alternatively, both picture C and picture D are moved in opposite directions by 2 unit lengths.
It should be noted that moving two scene resources of the at least two scene resources in opposite directions by M1 and M2 unit lengths, respectively, increases the relative distance between them by M1 + M2 unit lengths. Thus, optionally, M1 + M2 may equal N; that is, both the first and the second embodiment may increase the relative distance between two scene resources by N unit lengths.
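Reusing the hypothetical move_away helper from the earlier sketch, the second embodiment moves both resources, with M1 + M2 playing the role of N:

```python
# Second embodiment: move picture C by M1 and picture D by M2 in opposite
# directions (move_away as defined in the earlier sketch; values illustrative).
c_center, d_center = (0, 0), (3, 0)
M1, M2 = 1, 2
new_c = move_away(c_center, d_center, M1)  # (-1.0, 0.0)
new_d = move_away(d_center, c_center, M2)  # (5.0, 0.0)
# relative distance grew from 3 to 6, i.e. by M1 + M2 = 3 unit lengths
```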
In an alternative embodiment, rendering the at least two scene resources into the VR scene according to the new scene layout includes: filling a background color in the blank area between the at least two scene resources whose relative distance was increased; and rendering the at least two scene resources and the background color into the VR scene according to the new scene layout.
Optionally, the blank area between the at least two scene resources whose relative distance was increased can be filled with a visually soft color, such as the color of the background sky sphere, to make the VR scene display more realistic.
In a specific implementation, the coordinate range of the blank area between the at least two scene resources after the relative distance is increased may be determined from the coordinates, the geometric information, and the moving distances of the scene resources. Then, according to the coordinate range of each grid, the grids falling inside the coordinate range of the blank area are determined, and the background color is filled into those grids.
In an optional embodiment, the average of the edge color values at the corresponding positions of the at least two scene resources whose display positions are adjacent may also be calculated, and the blank area filled with that average color, so as to transition smoothly between two scene resources with a large edge color difference.
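A minimal sketch of both fill options; the grid-indexed image and the helper names are assumptions:

```python
def fill_gap(image, gap_rows, gap_cols, fill_color):
    """Fill every grid inside the blank area's coordinate range with one color."""
    for row in gap_rows:
        for col in gap_cols:
            image[row][col] = fill_color

def average_color(rgb_a, rgb_b):
    """Mean of the two edge colors, for a smooth transition fill."""
    return tuple((a + b) // 2 for a, b in zip(rgb_a, rgb_b))

# fill a 1-grid-wide gap with the average of the two resources' edge colors
image = [[(0, 0, 0), (10, 10, 10), (255, 255, 255)]]
fill_gap(image, range(1), range(1, 2), average_color((0, 0, 0), (255, 255, 255)))
print(image[0][1])  # (127, 127, 127)
```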
Embodiments of the present invention also provide a VR device. As shown in fig. 4, a VR device 200 includes a processor 201, and a memory 202 and an output component 203 respectively connected to the processor 201.
The memory 202 is used to store one or more computer instructions.
The processor 201 is operable to execute one or more computer instructions stored in the memory 202 for: acquiring an edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent; and, if the edge color difference value is greater than the set color difference threshold, increasing the relative distance between the at least two scene resources to obtain a new scene layout.
The output component 203 is configured to render the at least two scene resources into the VR scene according to the new scene layout.
In this embodiment, when the edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent is greater than the set color difference threshold, the scene resources with the large edge color difference are moved apart by increasing the relative distance between them. In this way, when the display screen is refreshed, the probability that the jagged edges of scene resources with large edge color differences overlap and interfere with each other is reduced, the probability of the color-jumping flicker is correspondingly reduced, and the scene resources are displayed to the user as smoothly and faithfully as possible.
Optionally, when obtaining the edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent, the processor 201 is specifically configured to: acquire the coordinates and geometric information of the at least two scene resources according to their labels and the correspondence between labels and resource information; determine the edge coordinates at the corresponding positions of the at least two scene resources according to their coordinates and geometric information; and subtract the color values of the grids where the edge coordinates at the corresponding positions of the at least two scene resources are located, to obtain the edge color difference value at the corresponding positions of the at least two scene resources whose display positions are adjacent.
Optionally, before obtaining the coordinates and geometric information of the at least two scene resources according to their labels and the correspondence between labels and resource information, the processor 201 is further configured to: mark labels on the at least two scene resources respectively; and establish a correspondence between each label and the coordinates and geometric information of the scene resource it marks.
Optionally, when increasing the relative distance between the at least two scene resources, the processor 201 is specifically configured to move one scene resource of the at least two scene resources away from the other scene resource, or to move two scene resources of the at least two scene resources in opposite directions.
Optionally, when moving one scene resource of the at least two scene resources away from the other scene resource, the processor 201 is specifically configured to move that scene resource N unit lengths in the direction away from the other scene resource. Optionally, when moving two scene resources of the at least two scene resources in opposite directions, the processor 201 is specifically configured to move them in opposite directions by M1 and M2 unit lengths, respectively, where N, M1, and M2 are natural numbers.
Optionally, when rendering the at least two scene resources into the VR scene according to the new scene layout, the output component 203 is specifically configured to: fill a background color into the blank area between the at least two scene resources whose relative distance was increased; and render the at least two scene resources and the background color into the VR scene according to the new scene layout.
Embodiments of the present invention further provide a computer storage medium storing one or more computer instructions which, when executed by a computer, implement: acquiring an edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent; if the edge color difference value is greater than the set color difference threshold, increasing the relative distance between the at least two scene resources to obtain a new scene layout; and rendering the at least two scene resources into the VR scene according to the new scene layout.
Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
Still another embodiment of the present invention provides a VR device that can be an external head-mounted display device or an integrated head-mounted display device, where the external head-mounted display device needs to be used with an external processing system (e.g., a computer processing system).
Fig. 5 is a schematic diagram of an internal configuration structure of a head-mounted display device 300 according to another embodiment of the present invention.
The display unit 301 may include a display panel disposed on the side of the head-mounted display device 300 facing the user's face; it may be a single panel or separate left and right panels corresponding to the user's left and right eyes. The display panel may be an electroluminescence (EL) element, a liquid crystal display or a micro display of similar structure, or a laser-scanning display that projects directly onto the retina, or the like.
The virtual image optical unit 302 magnifies the image displayed by the display unit 301 and allows the user to observe the displayed image as an enlarged virtual image. The image output to the display unit 301 may be an image of a virtual scene provided by a content reproduction apparatus (a Blu-ray disc or DVD player) or a streaming server, or an image of a real scene captured by the external camera 310. In some embodiments, the virtual image optical unit 302 may include a lens unit, such as a spherical lens, an aspherical lens, a Fresnel lens, or the like.
The input operation unit 303 includes at least one operation component used to perform input operations, such as a key, a button, a switch, or another component with a similar function; it receives user instructions through the operation component and outputs them to the control unit 307.
The status information acquisition unit 304 is used to acquire status information of the user wearing the head-mounted display device 300. It may include various sensors for detecting status information itself, and may also acquire status information from external equipment (e.g., a smartphone, a wristwatch, or another multi-function terminal worn by the user) through the communication unit 305. The status information acquisition unit 304 may acquire position information and/or posture information of the user's head, and may include one or more of a gyro sensor, an acceleration sensor, a Global Positioning System (GPS) sensor, a geomagnetic sensor, a Doppler effect sensor, an infrared sensor, and a radio-frequency field intensity sensor. Further, the status information acquisition unit 304 acquires the status of the user wearing the head-mounted display device 300, for example the user's operation status (whether the user is wearing the device), action status (a moving state such as standing still, walking, or running; the posture of a hand or fingertip; the open or closed state of the eyes; gaze direction; pupil size), mental status (whether the user is immersed in viewing the displayed image, and the like), and even physiological status.
The communication unit 305 performs communication processing with external devices, modulation and demodulation processing, and encoding and decoding of communication signals. In addition, the control unit 307 can send data to external devices through the communication unit 305. The communication may be wired or wireless, for example Mobile High-Definition Link (MHL), Universal Serial Bus (USB), High-Definition Multimedia Interface (HDMI), wireless fidelity (Wi-Fi), Bluetooth or Bluetooth Low Energy communication, a mesh network per the IEEE 802.11s standard, and so on. Additionally, the communication unit 305 may be a cellular radio transceiver operating in accordance with Wideband Code Division Multiple Access (W-CDMA), Long Term Evolution (LTE), and similar standards.
In some embodiments, the head-mounted display device 300 may further include a storage unit 306, a mass storage device configured with a solid-state drive (SSD) or the like. In some embodiments, the storage unit 306 may store applications or various types of data. For example, content viewed by a user using the head-mounted display device 300 may be stored in the storage unit 306.
In some embodiments, the head-mounted display device 300 may further include a control unit 307, which may include a computer processing unit (CPU) or another device with similar functions. In some embodiments, the control unit 307 may be used to execute applications stored in the storage unit 306, or the control unit 307 may also include circuitry that performs the methods, functions, and operations disclosed in some embodiments of the invention.
The image processing unit 308 performs signal processing, such as image quality correction, on the image signal output from the control unit 307 and converts its resolution to match the screen of the display unit 301. The display driving unit 309 then selects and scans each row of pixels of the display unit 301 row by row, providing pixel signals based on the processed image signals.
In some embodiments, the head-mounted display device 300 may also include one or more external cameras 310, which may be disposed on the front surface of the device body. The external camera 310 can acquire three-dimensional information and may also function as a distance sensor. Additionally, a position-sensitive detector (PSD) or another type of distance sensor that detects reflected signals from objects may be used together with the external camera 310. The external camera 310 and the distance sensor may be used to detect the body position, posture, and shape of the user wearing the head-mounted display device 300. In addition, under certain conditions the user may directly view or preview the real scene through the external camera 310.
In some embodiments, the head-mounted display device 300 may further include a sound processing unit 311, which may perform sound quality correction or amplification of the sound signal output from the control unit 307, signal processing of input sound signals, and the like. The sound input/output unit 312 then outputs the processed sound to the outside and takes in sound from the microphone.
It should be noted that the structure or components shown in the dashed box in fig. 5 may be independent from the head-mounted display device 300, and may be disposed in an external processing system (e.g., a computer system) for use with the head-mounted display device 300; alternatively, the structures or components shown in dashed line boxes may be disposed within or on the surface of the head mounted display device 300.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above description is only an example of the present invention, and is not intended to limit the present invention. Various modifications and alterations to this invention will become apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the scope of the claims of the present invention.

Claims (10)

1. A VR scene processing method is characterized by comprising the following steps:
acquiring an edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent;
if the edge color difference value is larger than a set color difference threshold value, increasing the relative distance between the at least two scene resources to obtain a new scene layout;
and rendering the at least two scene resources and the color for filling the blank area between the at least two scene resources into the VR scene according to the new scene layout.
2. The method of claim 1, wherein obtaining the edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent comprises:
acquiring coordinates and geometric information of the at least two scene resources according to the labels of the at least two scene resources and the corresponding relation between the labels and the resource information;
determining edge coordinates of corresponding positions of the at least two scene resources according to the coordinates and the geometric information of the at least two scene resources;
and subtracting the color values of the grids where the edge coordinates are located to obtain the edge color difference values of the corresponding positions of the at least two scene resources adjacent to the display position.
3. The method of claim 2, wherein before obtaining the coordinates and the geometric information of the at least two scene resources according to the tags of the at least two scene resources and the corresponding relationship between the tags and the resource information, the method further comprises:
respectively marking labels on the at least two scene resources;
and establishing a corresponding relation between the label and the coordinate and geometric information of the scene resource marked by the label.
4. The method of claim 1, wherein said increasing the relative distance between the at least two scene assets comprises:
moving one scene resource of the at least two scene resources away from the other scene resource; or
moving two scene resources of the at least two scene resources in opposite directions.
5. The method of claim 4, wherein moving one of the at least two scene assets away from another scene asset comprises:
moving one scene resource of the at least two scene resources N unit lengths in a direction away from the other scene resource;
the moving two scene resources of the at least two scene resources in opposite directions includes:
moving two scene resources of the at least two scene resources in opposite directions by M1 and M2 unit lengths, respectively;
wherein N, M1 and M2 are natural numbers.
6. The method of any of claims 1-5, wherein rendering the at least two scene resources and the color for filling the blank area between the at least two scene resources into the VR scene according to the new scene layout comprises:
filling a background color in a blank area between the at least two scene resources with the increased relative distance;
rendering the at least two scene resources and the background color into the VR scene according to the new scene layout.
7. A VR device comprising a processor, and a memory and an output component respectively coupled to the processor;
the memory to store one or more computer instructions;
the processor to execute the one or more computer instructions to:
acquiring an edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent;
if the edge color difference value is larger than a set color difference threshold value, increasing the relative distance between the at least two scene resources to obtain a new scene layout;
the output component is to:
and rendering the at least two scene resources and the color for filling the blank area between the at least two scene resources into the VR scene according to the new scene layout.
8. The VR device of claim 7, wherein, when obtaining the edge color difference value at the corresponding positions of at least two scene resources whose display positions are adjacent, the processor is specifically configured to:
acquiring coordinates and geometric information of the at least two scene resources according to the labels of the at least two scene resources and the corresponding relation between the labels and the resource information;
determining edge coordinates of corresponding positions of the at least two scene resources according to the coordinates and the geometric information of the at least two scene resources;
and subtracting the color values of the grids where the edge coordinates are located to obtain the edge color difference values of the corresponding positions of the at least two scene resources adjacent to the display position.
9. The VR device of claim 8, wherein the processor is further configured to:
respectively marking labels on the at least two scene resources;
and establishing a corresponding relation between the label and the coordinate and geometric information of the scene resource marked by the label.
10. The VR device of claim 7, wherein the processor, when increasing the relative distance between the at least two scene resources, is specifically configured to:
moving one scene resource of the at least two scene resources away from the other scene resource; or
moving two scene resources of the at least two scene resources in opposite directions.
CN201710703073.0A 2017-08-16 2017-08-16 VR scene processing method and VR equipment Active CN107545595B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201710703073.0A | 2017-08-16 | 2017-08-16 | VR scene processing method and VR equipment

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201710703073.0A | 2017-08-16 | 2017-08-16 | VR scene processing method and VR equipment

Publications (2)

Publication Number | Publication Date
CN107545595A (en) | 2018-01-05
CN107545595B | 2021-05-28

Family

ID=60971353

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201710703073.0A (Active) | VR scene processing method and VR equipment | 2017-08-16 | 2017-08-16

Country Status (1)

Country Link
CN (1) CN107545595B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108765581B (en) * 2018-05-30 2020-12-25 贝壳技术有限公司 Method and device for displaying label in virtual three-dimensional space
CN113093903B (en) * 2021-03-18 2023-02-07 聚好看科技股份有限公司 Image display method and display equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103929606A (en) * 2014-04-01 2014-07-16 北京智谷睿拓技术服务有限公司 Image presenting control method and image presenting control device
CN106791741A (en) * 2016-12-07 2017-05-31 重庆杰夫与友文化创意有限公司 Multi-screen marching method and device
US20170186188A1 (en) * 2015-12-23 2017-06-29 Framy Inc. Method and apparatus for processing border of computer figure to be merged into background image

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6439214B2 (en) * 2013-02-18 2018-12-19 株式会社P2P Bank Image processing apparatus, image processing method, computer program for image processing, and information recording medium storing image processing computer program
CN103442159A (en) * 2013-09-02 2013-12-11 安徽理工大学 Edge self-adapting demosaicing method based on RS-SVM integration
US9508121B2 (en) * 2015-01-14 2016-11-29 Lucidlogix Technologies Ltd. Method and apparatus for controlling spatial resolution in a computer system by rendering virtual pixel into physical pixel
CN106251287B (en) * 2015-06-14 2020-04-10 奥多比公司 Controlling smoothness of transitions between images
CN105096370B (en) * 2015-07-15 2017-08-01 西安邮电大学 The equivalent partition reverse sawtooth method of ray tracing


Also Published As

Publication number Publication date
CN107545595A (en) 2018-01-05


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201027

Address after: 261061 north of Yuqing East Street, east of Dongming Road, Weifang High tech Zone, Weifang City, Shandong Province (Room 502, Geer electronic office building)

Applicant after: GoerTek Optical Technology Co.,Ltd.

Address before: 266104 Laoshan Qingdao District North House Street investment service center room, Room 308, Shandong

Applicant before: GOERTEK TECHNOLOGY Co.,Ltd.

GR01 Patent grant
CP02 Change in the address of a patent holder

Address after: 261061 east of Dongming Road, north of Yuqing East Street, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronics office building)

Patentee after: GoerTek Optical Technology Co.,Ltd.

Address before: 261061 East of Dongming Road, Weifang High-tech Zone, Weifang City, Shandong Province, North of Yuqing East Street (Room 502, Goertek Office Building)

Patentee before: GoerTek Optical Technology Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20221129

Address after: 266104 No. 500, Songling Road, Laoshan District, Qingdao, Shandong

Patentee after: GOERTEK TECHNOLOGY Co.,Ltd.

Address before: 261061 east of Dongming Road, north of Yuqing East Street, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronics office building)

Patentee before: GoerTek Optical Technology Co.,Ltd.