CN102693065A - Method for processing visual effect of stereo image - Google Patents
Abstract
The invention discloses a method for processing the visual effect of a stereoscopic image. The method comprises the following steps: providing a stereoscopic image composed of a plurality of objects, each object having an object coordinate value; providing a cursor having a cursor coordinate value; judging whether the cursor coordinate value coincides with the object coordinate value of one of the objects; if so, changing a depth coordinate parameter of the object coordinate value of the corresponding object; and redrawing the image of the object that matches the cursor coordinate value. The stereoscopic image of the object corresponding to the cursor is thus made to stand out, which enhances the visual effect and the interactivity.
Description
Technical field
The present invention relates to an image processing method, and more particularly to a method for processing the visual effect of a stereoscopic image.
Background art
Over the past two decades, computer graphics has become one of the most important ways of presenting data in human-computer interfaces and is widely used in many applications, for example three-dimensional (3-D) computer graphics. Multimedia and virtual reality products are increasingly common; they are not only an important breakthrough in the human-computer interface but also play an important role in entertainment applications, and most of these applications are built on low-cost real-time 3-D rendering techniques. In general, 2-D computer graphics is the common way of presenting data and content, particularly in interactive applications. 3-D computer graphics is a growing branch of computer graphics that uses 3-D models and various image-processing techniques to produce images with a three-dimensional sense of realism.
The construction of 3D computer graphics can be divided into three main stages, carried out in order:
1. Modeling: the modeling stage can be described as the process of determining the shape of the objects to be used in the scene. Many modeling techniques are available, such as constructive solid geometry, NURBS modeling, polygon modeling and subdivision surfaces. The modeling process may also include editing surface or material properties and adding textures, bump mapping and other features.
2. Scene layout and animation: scene layout involves arranging the position and size of virtual objects, lights, cameras and other entities in the scene, and can be used to produce a still picture or an animation. Animation can use techniques such as key frames to set up complex motion relationships in the scene.
3. Rendering: rendering is the final stage, in which the actual two-dimensional image or animation is created from the prepared scene; it can be compared to photographing or filming a scene in the real world after the set-up is complete.
In the prior art, in interactive media such as games or other applications, a drawn solid object usually cannot change in real time to highlight its visual effect as the user moves the cursor with a mouse, trackpad or touch panel, so the user is not given a sufficient sense of interaction with the scene.
In addition, existing techniques can convert a 2D image into a 3D image, usually by selecting a main object in the 2D image, setting that main object as the foreground and the remaining objects as the background, and assigning a different depth of field to each object to form the 3D image. However, the mouse cursor operated by the user usually lies at the same depth of field as the display screen, and the cursor position is also where the eye rests; if the depth information of the cursor differs from the depth of field of the object at the cursor position, spatial confusion in the visual perception tends to result.
Summary of the invention
The main purpose of the present invention is to provide a method for processing the visual effect of a stereoscopic image which highlights the stereoscopic image of the object corresponding to the cursor position, so as to enhance human-computer interaction.
To achieve the above purpose, the stereoscopic image visual effect processing method of the present invention comprises the following steps: first, a stereoscopic image is provided, the stereoscopic image being composed of a plurality of objects, each object having an object coordinate value; next, a cursor is provided, the cursor having a cursor coordinate value; then, it is judged whether the cursor coordinate value coincides with the object coordinate value of one of the objects; if it does, a depth coordinate parameter of the object coordinate value of the corresponding object is changed; finally, the image of the object that matches the cursor coordinate value is redrawn.
Wherein, if the cursor coordinate value changes, it is judged again whether the cursor coordinate value coincides with the object coordinate value of one of the objects.
Wherein, the object coordinate values are coordinate values in local coordinates, world coordinates, view coordinates or projection coordinates.
Wherein, the cursor coordinate value is produced by a mouse, a trackpad or a touch panel.
Wherein, the stereoscopic image is produced by carrying out, in order, the computer-graphics steps of modeling, scene layout and animation, and rendering.
Wherein, the depth coordinate parameter of the object coordinate value of the objects is determined by a method such as Z-buffering, the painter's depth-sorting algorithm, the plane normal criterion, the surface normal criterion or the min-max method.
Description of drawings
Figure 1A is a flow chart of the steps of the preferred embodiment of the stereoscopic image visual effect processing method of the present invention;
Figure 1B shows a stereoscopic image formed by the stereoscopic image visual effect processing method of the preferred embodiment of the present invention;
Fig. 2 is a 3D drawing flow chart of the preferred embodiment of the stereoscopic image visual effect processing method of the present invention;
Fig. 3A is a schematic diagram of modeling with the union logical operator in the stereoscopic image visual effect processing method of the present invention;
Fig. 3B is a schematic diagram of modeling with the intersection logical operator in the stereoscopic image visual effect processing method of the present invention;
Fig. 3C is a schematic diagram of modeling with the complement operator in the stereoscopic image visual effect processing method of the present invention;
Fig. 4A is a schematic diagram of modeling with a NURBS curve in the stereoscopic image visual effect processing method of the present invention;
Fig. 4B is a schematic diagram of modeling with a NURBS surface in the stereoscopic image visual effect processing method of the present invention;
Fig. 5 is a schematic diagram of modeling with a polygon mesh in the stereoscopic image visual effect processing method of the present invention;
Fig. 6A is a first schematic diagram of subdivision-surface modeling in the stereoscopic image visual effect processing method of the present invention;
Fig. 6B is a second schematic diagram of subdivision-surface modeling in the stereoscopic image visual effect processing method of the present invention;
Fig. 6C is a third schematic diagram of subdivision-surface modeling in the stereoscopic image visual effect processing method of the present invention;
Fig. 6D is a fourth schematic diagram of subdivision-surface modeling in the stereoscopic image visual effect processing method of the present invention;
Fig. 6E is a fifth schematic diagram of subdivision-surface modeling in the stereoscopic image visual effect processing method of the present invention;
Fig. 7 is a schematic diagram of the standard 3D rendering pipeline used by the stereoscopic image visual effect processing method of the present invention;
Fig. 8 is a first schematic diagram of the image display of the preferred embodiment of the stereoscopic image visual effect processing method of the present invention;
Fig. 9 is a second schematic diagram of the image display of the preferred embodiment of the stereoscopic image visual effect processing method of the present invention;
Figure 10 is a third schematic diagram of the image display of the preferred embodiment of the stereoscopic image visual effect processing method of the present invention;
Figure 11A is a fourth schematic diagram of the image display of the preferred embodiment of the stereoscopic image visual effect processing method of the present invention;
Figure 11B is a fifth schematic diagram of the image display of the preferred embodiment of the stereoscopic image visual effect processing method of the present invention;
Figure 12A is a first schematic diagram of rendering an object with Z-buffering in the stereoscopic image visual effect processing method of the present invention;
Figure 12B is a second schematic diagram of rendering an object with Z-buffering in the stereoscopic image visual effect processing method of the present invention;
Figure 13A is a first schematic diagram of rendering objects with the painter's depth-sorting algorithm in the stereoscopic image visual effect processing method of the present invention;
Figure 13B is a second schematic diagram of rendering objects with the painter's depth-sorting algorithm in the stereoscopic image visual effect processing method of the present invention;
Figure 13C is a third schematic diagram of rendering objects with the painter's depth-sorting algorithm in the stereoscopic image visual effect processing method of the present invention;
Figure 14 is a schematic diagram of rendering an object with the plane normal criterion in the stereoscopic image visual effect processing method of the present invention;
Figure 15 is a schematic diagram of rendering an object with the min-max method in the stereoscopic image visual effect processing method of the present invention.
Description of reference numerals: 11 - stereoscopic image; 12 - object; 21 - application program; 22 - operating system; 23 - application programming interface; 24 - geometric transformation subsystem; 25 - raster subsystem; 31 - geometric transformation subsystem; 32 - raster subsystem; 41 - local coordinate space; 42 - world coordinate space; 43 - view coordinate space; 44 - 3D screen coordinate space; 45 - display space; 51 - define objects; 52 - define scene, viewpoint and light sources; 53 - select and clip to the 3D viewing volume; 54 - hidden-surface removal, shading and shadow processing; 61 - modeling transformation; 62 - view transformation; 700 - union geometric figure; 701 - intersection geometric figure; 702 - complement geometric figure; 703 - NURBS curve; 704 - NURBS surface; 705 - polygon-modeled object; 706 - cube; 707 - first approximating spheroid; 708 - second approximating spheroid; 709 - third approximating spheroid; 710 - sphere; 711 - Z-buffer stereoscopic image; 712 - Z-buffer illustration image; 713 - first painter's depth-sorted image; 714 - second painter's depth-sorted image; 715 - third painter's depth-sorted image; 716 - visible face; 717 - hidden face; 718 - 3D depth image; S11-S17 - step flow.
Embodiment
So that the examiner may clearly understand the content of the present invention, the following description is given in conjunction with the drawings; please refer to them.
Please refer to Figure 1A, Figure 1B and Fig. 2, which are, respectively, a flow chart of the steps of the preferred embodiment of the stereoscopic image visual effect processing method of the present invention, a stereoscopic image formed by the method, and a 3D drawing flow chart. The stereoscopic image 11 is composed of a plurality of objects 12 and is produced, in order, by an application program 21, an operating system 22, an application programming interface (API) 23, a geometric transformation subsystem 24 and a raster subsystem 25. The stereoscopic image visual effect processing method comprises the following steps:
S11: a stereoscopic image is provided, the stereoscopic image being composed of a plurality of objects, each object having an object coordinate value.
S12: a cursor is provided, the cursor having a cursor coordinate value.
S13: it is judged whether the cursor coordinate value coincides with the object coordinate value of one of the objects.
S14: if the cursor coordinate value coincides with the object coordinate value of one of the objects, a depth coordinate parameter of the object coordinate value of the corresponding object is changed.
S15: the image of the object that matches the cursor coordinate value is redrawn.
S16: if the cursor coordinate value changes, it is judged again whether the cursor coordinate value coincides with the object coordinate value of one of the objects.
In addition, if the cursor coordinate value does not coincide with any object coordinate value, it is judged again after each predetermined period whether the cursor coordinate value coincides with the object coordinate value of one of the objects, as shown in step S17.
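The step flow S11 to S17 can be illustrated with the following non-limiting Python sketch; the SceneObject record, the box-shaped hit test and the pop_amount value are illustrative assumptions, and redraw() merely stands in for the rendering pipeline described below.

```python
from dataclasses import dataclass, field

@dataclass
class SceneObject:                 # hypothetical object record for step S11
    name: str
    x: float
    y: float
    depth: float                   # the depth coordinate parameter (Z)
    half_size: float = 0.5         # extent used by the 2-D hit test
    base_depth: float = field(init=False)

    def __post_init__(self):
        self.base_depth = self.depth   # remember the original depth for restoring

def hit_test(obj, cx, cy):
    """S13: judge whether the cursor coordinate value coincides with the object."""
    return abs(cx - obj.x) <= obj.half_size and abs(cy - obj.y) <= obj.half_size

def redraw(objects):
    """S15: stand-in for repainting through the rendering pipeline (far to near)."""
    for o in sorted(objects, key=lambda o: o.depth, reverse=True):
        print(f"draw {o.name} at depth {o.depth:+.2f}")

def process_cursor(objects, cx, cy, pop_amount=1.0):
    """S13/S14/S15: change the depth parameter of the hit object, then repaint."""
    for o in objects:
        o.depth = o.base_depth                     # restore every object first
    hit = next((o for o in objects if hit_test(o, cx, cy)), None)
    if hit is not None:                            # S14: cursor and object coincide
        hit.depth = hit.base_depth - pop_amount    # bring the hit object forward
        redraw(objects)                            # S15: repaint the frame
    return hit

# S12/S16: call process_cursor again whenever the cursor coordinate value changes.
scene = [SceneObject("cube", 0.0, 0.0, 2.0), SceneObject("sphere", 2.0, 0.0, 3.0)]
process_cursor(scene, 0.2, -0.1)                   # the cursor lies on the cube
```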
The cursor coordinate value can be produced by a mouse, a trackpad, a touch panel or any human-computer interaction interface that allows the user to interact with the electronic device.
The stereoscopic image 11 is drawn by means of 3D computer graphics. It can be produced by carrying out, in order, the computer-graphics steps of modeling, scene layout and animation, and rendering.
The modeling stage can be broadly divided into the following types:
1. Constructive solid geometry (CSG): logical operators are used to combine different primitives (such as cubes, cylinders, prisms, pyramids, spheres and cones) into complex surfaces by means of union, intersection and complement, forming a union geometric figure 700, an intersection geometric figure 701 and a complement geometric figure 702, from which complex models or surfaces can be constructed, as shown in Fig. 3A, Fig. 3B and Fig. 3C.
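The boolean combination idea of CSG can be illustrated with a non-limiting Python sketch in which primitives are simple point-membership tests (the sphere and box helpers are illustrative assumptions):

```python
import math

# Each primitive is a membership test: point (x, y, z) -> inside?
def sphere(cx, cy, cz, r):
    return lambda p: math.dist(p, (cx, cy, cz)) <= r

def box(cx, cy, cz, half):
    return lambda p: all(abs(p[i] - (cx, cy, cz)[i]) <= half for i in range(3))

# CSG combinations via logical operators.
def union(a, b):        return lambda p: a(p) or b(p)
def intersection(a, b): return lambda p: a(p) and b(p)
def difference(a, b):   return lambda p: a(p) and not b(p)   # complement of b in a

solid = difference(box(0, 0, 0, 1.0), sphere(0.7, 0.7, 0.7, 0.6))
print(solid((0.0, 0.0, 0.0)))   # True: inside the box, outside the sphere
print(solid((0.7, 0.7, 0.7)))   # False: carved away by the sphere
```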
2. Non-uniform rational B-splines (NURBS): NURBS can be used to generate and represent curves and surfaces. A NURBS curve 703 is determined by its order, by a set of weighted control points and by a knot vector. NURBS is a generalization of both B-splines and Bézier curves and surfaces. By evaluating a NURBS surface 704 at its s and t parameters, the surface can be represented in space coordinates, as shown in Fig. 4A and Fig. 4B.
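A non-limiting Python sketch of evaluating a NURBS curve from its degree, weighted control points and knot vector (the quarter-circle example data are illustrative):

```python
def bspline_basis(i, k, t, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,k} (k = degree)."""
    if k == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    d1 = knots[i + k] - knots[i]
    if d1 > 0.0:
        left = (t - knots[i]) / d1 * bspline_basis(i, k - 1, t, knots)
    d2 = knots[i + k + 1] - knots[i + 1]
    if d2 > 0.0:
        right = (knots[i + k + 1] - t) / d2 * bspline_basis(i + 1, k - 1, t, knots)
    return left + right

def nurbs_point(t, degree, ctrl, weights, knots):
    """Rational combination: weighted basis functions normalized by their sum."""
    n = [bspline_basis(i, degree, t, knots) * weights[i] for i in range(len(ctrl))]
    denom = sum(n)
    x = sum(ni * p[0] for ni, p in zip(n, ctrl)) / denom
    y = sum(ni * p[1] for ni, p in zip(n, ctrl)) / denom
    return (x, y)

# A quadratic NURBS arc: with these weights it traces a quarter circle exactly.
ctrl    = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
weights = [1.0, 2 ** 0.5 / 2, 1.0]
knots   = [0, 0, 0, 1, 1, 1]
print(nurbs_point(0.5, 2, ctrl, weights, knots))   # approximately (0.707, 0.707)
```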
3. Polygon modeling: polygon modeling is an object-modeling method that represents, or approximates, an object's surface with a polygon mesh. A mesh is usually composed of triangles, quadrilaterals or other simple convex polygons, forming a polygon-modeled object 705, as shown in Fig. 5.
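A non-limiting Python sketch of a minimal indexed triangle mesh of the kind used in polygon modeling (the TriangleMesh class and the square example are illustrative):

```python
# A minimal indexed triangle mesh: shared vertices plus triangles that index them.
class TriangleMesh:
    def __init__(self, vertices, faces):
        self.vertices = list(vertices)   # [(x, y, z), ...]
        self.faces = list(faces)         # [(i0, i1, i2), ...] indices into vertices

    def face_normal(self, f):
        """Unnormalized normal of one triangle via the cross product of two edges."""
        a, b, c = (self.vertices[i] for i in self.faces[f])
        u = tuple(b[k] - a[k] for k in range(3))
        v = tuple(c[k] - a[k] for k in range(3))
        return (u[1] * v[2] - u[2] * v[1],
                u[2] * v[0] - u[0] * v[2],
                u[0] * v[1] - u[1] * v[0])

# A unit square in the z = 0 plane approximated by two triangles.
quad = TriangleMesh(
    vertices=[(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)],
    faces=[(0, 1, 2), (0, 2, 3)])
print(quad.face_normal(0))   # (0, 0, 1): the triangle faces +Z
```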
4. Subdivision surfaces: these are used to build smooth surfaces from an arbitrary mesh. By repeatedly refining the initial polygon mesh, a series of meshes is produced that approaches the limit subdivision surface; each subdivision step produces more polygon elements and a smoother mesh. In this way a cube 706 can be made to approach, in turn, a first approximating spheroid 707, a second approximating spheroid 708, a third approximating spheroid 709 and a sphere 710, as shown in Fig. 6A, 6B, 6C, 6D and 6E.
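The refine-and-approach idea can be illustrated with a non-limiting Python sketch that starts from an octahedron instead of the cube 706 and, at each level, splits every triangle at its edge midpoints and projects the new vertices onto the unit sphere (an illustrative simplification of true subdivision-surface schemes such as Catmull-Clark):

```python
import math

def normalize(p):
    n = math.sqrt(sum(c * c for c in p))
    return tuple(c / n for c in p)

def midpoint_on_sphere(a, b):
    return normalize(tuple((a[i] + b[i]) / 2.0 for i in range(3)))

def subdivide(triangles):
    """Split every triangle into four, projecting new vertices onto the unit sphere."""
    out = []
    for a, b, c in triangles:
        ab, bc, ca = (midpoint_on_sphere(a, b),
                      midpoint_on_sphere(b, c),
                      midpoint_on_sphere(c, a))
        out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return out

# Start from an octahedron inscribed in the unit sphere.
X, Y, Z = (1, 0, 0), (0, 1, 0), (0, 0, 1)
neg = lambda p: tuple(-c for c in p)
octa = [(X, Y, Z), (Y, neg(X), Z), (neg(X), neg(Y), Z), (neg(Y), X, Z),
        (Y, X, neg(Z)), (neg(X), Y, neg(Z)), (neg(Y), neg(X), neg(Z)), (X, neg(Y), neg(Z))]

mesh = octa
for _ in range(3):
    mesh = subdivide(mesh)
print(len(mesh), "triangles after 3 subdivision levels")   # 8 * 4**3 = 512
```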
In the modeling step, surface or material properties may also be edited as required, adding textures, bump mapping or other features.
Scene layout and animation are used to arrange the virtual objects, lights, cameras or other entities in the scene, and to produce still pictures or animation. The scene layout defines the positions, sizes and spatial relationships of the objects in the scene. Animation then describes an object over time, for example as it moves or deforms, and can be achieved with key framing, inverse kinematics and motion capture.
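A non-limiting Python sketch of the key-frame idea, using simple linear interpolation between keyed values (the sample keys are illustrative):

```python
import bisect

def keyframe_value(keys, t):
    """Linearly interpolate a keyframed scalar (e.g. an object's X position).

    keys is a sorted list of (time, value) pairs; outside the keyed range the
    first or last value is held.
    """
    times = [k[0] for k in keys]
    if t <= times[0]:
        return keys[0][1]
    if t >= times[-1]:
        return keys[-1][1]
    i = bisect.bisect_right(times, t)
    (t0, v0), (t1, v1) = keys[i - 1], keys[i]
    u = (t - t0) / (t1 - t0)
    return v0 + u * (v1 - v0)

# Keyframes at t = 0, 1, 2 seconds; query between them.
keys = [(0.0, 0.0), (1.0, 5.0), (2.0, 5.0)]
print(keyframe_value(keys, 0.5))   # 2.5
print(keyframe_value(keys, 1.5))   # 5.0
```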
Rendering is the final stage, in which the actual two-dimensional picture or animation is created from the prepared scene; it can be carried out in a non-real-time mode or in a real-time mode.
The non-real-time mode simulates light transport to obtain photo-realistic results, and is commonly achieved with ray tracing or radiosity algorithms.
The real-time mode uses non-photo-realistic rendering methods to obtain real-time rendering speed, and can draw with a variety of techniques such as flat shading, Phong shading, Gouraud shading, bitmap textures, bump mapping, shading, motion blur and depth of field. Image rendering for games, simulations and other interactive media must be computed and displayed in time, at roughly 20 to 120 frames per second.
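A non-limiting Python sketch of per-point Phong-style shading of the kind used in real-time rendering (the material constants are illustrative):

```python
import math

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _norm(v):
    n = math.sqrt(_dot(v, v))
    return tuple(x / n for x in v)

def phong_shade(normal, to_light, to_eye, kd=0.8, ks=0.4, shininess=32, ambient=0.1):
    """Classic Phong model: ambient + diffuse + specular, returned as a scalar."""
    n, l, v = _norm(normal), _norm(to_light), _norm(to_eye)
    diffuse = max(_dot(n, l), 0.0)
    # Reflect the light direction about the normal, then compare with the view direction.
    r = tuple(2.0 * _dot(n, l) * n[i] - l[i] for i in range(3))
    specular = max(_dot(r, v), 0.0) ** shininess if diffuse > 0.0 else 0.0
    return ambient + kd * diffuse + ks * specular

# A surface facing +Z, lit from above-front and viewed head-on.
print(round(phong_shade((0, 0, 1), (0, 1, 1), (0, 0, 1)), 3))
```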
For a clearer understanding of the 3D drawing process, please refer also to Fig. 7, which is a schematic diagram of a standard 3D rendering pipeline. The pipeline is divided into several parts according to the different coordinate systems and roughly comprises a geometric transformation subsystem 31 and a raster subsystem 32. The object definitions 51 are descriptions of three-dimensional models; the coordinate system used, referenced to the object's own reference point, is called the local coordinate space 41. When a 3D image is synthesized, each object is read from the database and converted into a unified world coordinate space 42, in which the scene, the viewpoint and the light sources 52 are defined; the conversion from the local coordinate space 41 to the world coordinate space 42 is called the modeling transformation 61. Next, the position of the viewpoint must be defined. Because of the limited resolution of the drawing hardware, the continuous coordinates must be converted into a 3D screen space containing X and Y coordinates and a depth coordinate (also called the Z coordinate), which is used for hidden-surface removal and for drawing the objects as pixels. The world coordinate space 42 is therefore converted into the view coordinate space 43, in which the scene is selected and clipped to the 3D viewing volume 53; this process is also called the view transformation 62. The view coordinate space 43 is then converted into the 3D screen coordinate space 44, where hidden-surface removal, shading and shadow processing 54 are carried out. Finally, the frame buffer outputs the resulting image to the screen, and the 3D screen coordinate space is converted into the display space 45. In the present embodiment, the steps of the geometric transformation subsystem and the raster subsystem can be carried out by a microprocessor, or together with a hardware accelerator such as a graphics processing unit (GPU) or a 3D graphics accelerator card.
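A non-limiting Python sketch of the coordinate chain from local space through world and view space to projected screen coordinates with a depth value (the matrices and the example point are illustrative; a unit aspect ratio is assumed):

```python
import math

def mat_vec(m, v):
    """Multiply a 4x4 matrix (row-major nested lists) by a homogeneous vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def translate(tx, ty, tz):
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

def perspective(fov_y_deg, near, far):
    """A simple perspective projection that keeps a depth value for the Z-buffer stage."""
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
    a = (far + near) / (near - far)
    b = 2.0 * far * near / (near - far)
    return [[f, 0, 0, 0], [0, f, 0, 0], [0, 0, a, b], [0, 0, -1, 0]]

# Local -> world (modeling transform), world -> view, view -> clip, then divide by w.
local_point = [0.0, 0.0, 0.0, 1.0]
model = translate(1.0, 0.0, -5.0)          # place the object in the world
view  = translate(0.0, 0.0, 0.0)           # camera at the origin looking down -Z
proj  = perspective(90.0, 0.1, 100.0)

world = mat_vec(model, local_point)
eye   = mat_vec(view, world)
clip  = mat_vec(proj, eye)
ndc   = [c / clip[3] for c in clip[:3]]    # X, Y on screen plus the depth (Z) value
print([round(c, 3) for c in ndc])
```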
Please refer to Fig. 8, Fig. 9, Figure 10, Figure 11A and Figure 11B, which are the first to fifth schematic diagrams of the image display of the preferred embodiment of the stereoscopic image visual effect processing method. When the user moves the cursor by operating a mouse, trackpad, touch panel or any human-computer interface and the cursor coordinate value changes, it is judged again whether the cursor coordinate value coincides with the object coordinate value of one of the objects 12. If it does not, the stereoscopic image 11 of the original frame is kept and is not redrawn. If the cursor value coincides with the object coordinate value of one of the objects 12, the depth coordinate parameter of the object coordinate value of the corresponding object is changed, and the stereoscopic image 11 is redrawn through the 3D rendering pipeline steps described above. If the cursor coordinate value changes and now matches another object 12, the previously pointed-to object 12 is restored to its original depth coordinate parameter, the depth coordinate parameter of the newly pointed-to object 12 is changed, and the whole stereoscopic image 11 is redrawn so that the stereoscopic visual effect of the pointed-to object 12 is highlighted. In this way the user can operate a human-computer interface tool such as a mouse and obtain a definite interaction effect with the stereoscopic image. In addition, when an object 12 matches the cursor coordinate position and its depth coordinate position is changed, the coordinate parameters of the other objects 12 may also change along with the cursor coordinate position, which further enhances the visual experience and the interaction effect.
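The highlight-and-restore behaviour can be illustrated with a non-limiting Python sketch that tracks which object is currently pointed to, restores the previous object's depth, pops the new one forward and redraws only when the highlighted object changes (object names, positions and depths are illustrative):

```python
# Simulated hover handling: depths are restored for the object that loses the
# cursor and lowered (brought toward the viewer) for the object that gains it.
base_depth = {"cube": 2.0, "sphere": 3.0, "cone": 2.5}   # illustrative objects
depth = dict(base_depth)
highlighted = None

def object_under_cursor(x, y):
    """Stand-in hit test; a real renderer would compare cursor and object coordinates."""
    regions = {"cube": (0, 0), "sphere": (2, 0), "cone": (4, 0)}
    return next((name for name, (ox, oy) in regions.items()
                 if abs(x - ox) <= 0.5 and abs(y - oy) <= 0.5), None)

for cursor in [(0.1, 0.0), (0.2, 0.1), (2.1, 0.0), (9.0, 9.0)]:
    hit = object_under_cursor(*cursor)
    if hit == highlighted:
        continue                                       # nothing changed: keep the old frame
    if highlighted is not None:
        depth[highlighted] = base_depth[highlighted]   # restore the previous object
    if hit is not None:
        depth[hit] = base_depth[hit] - 1.0             # pop the newly hit object forward
    highlighted = hit
    print(f"cursor {cursor}: redraw, depths = {depth}")
```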
The depth coordinate parameter of the object coordinate value of an object can be determined in the following ways:
1. Z-buffering, also called the depth-buffer method: when an object is rendered, the depth (that is, the Z coordinate) of each generated pixel is stored in a buffer, called the Z-buffer or depth buffer, which is an x-y two-dimensional array storing a depth for each screen pixel. If another object in the scene also produces a rendering result at the same pixel, the two depth values are compared and the object nearer to the observer is kept, its depth being stored in the depth buffer. In the end the depth buffer yields the correct depth perception: nearer objects occlude farther ones. This process is also called Z-culling. A Z-buffer stereoscopic image 711 and a Z-buffer illustration image 712 are shown in Fig. 12A and Fig. 12B.
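A non-limiting Python sketch of the Z-buffer test (the buffer size, depths and one-row example are illustrative):

```python
# A tiny Z-buffer: per pixel, keep only the fragment closest to the observer.
WIDTH, HEIGHT, FAR = 4, 3, float("inf")
depth_buffer = [[FAR] * WIDTH for _ in range(HEIGHT)]
color_buffer = [["."] * WIDTH for _ in range(HEIGHT)]

def plot(x, y, z, color):
    """Z-culling: draw the fragment only if it is nearer than what is already stored."""
    if z < depth_buffer[y][x]:
        depth_buffer[y][x] = z
        color_buffer[y][x] = color

# Two overlapping "objects": A at depth 5.0, B at depth 2.0 (nearer).
for x in range(0, 3):
    plot(x, 1, 5.0, "A")
for x in range(1, 4):
    plot(x, 1, 2.0, "B")

print("".join(color_buffer[1]))   # ABBB: B hides A where they overlap
```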
2. Painter's algorithm (depth sorting): distant objects are drawn first, then nearer objects are drawn over them to cover the parts of the distant objects that they hide. The objects are first sorted by depth and then drawn in that order, forming, in turn, a first painter's depth-sorted image 713, a second painter's depth-sorted image 714 and a third painter's depth-sorted image 715, as shown in Fig. 13A, Fig. 13B and Fig. 13C.
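A non-limiting Python sketch of the painter's algorithm: sort by depth, then draw from far to near (the object names and depths are illustrative):

```python
# Painter's algorithm: draw far objects first so nearer ones paint over them.
objects = [
    {"name": "background", "depth": 100.0},
    {"name": "house",      "depth": 10.0},
    {"name": "tree",       "depth": 5.0},
]

for obj in sorted(objects, key=lambda o: o["depth"], reverse=True):
    print("draw", obj["name"])   # background, then house, then tree on top
```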
3. Plane normal criterion: this applies to convex polyhedra without concavities, for example regular polyhedra or crystal balls. The principle is to compute the normal vector of each face: if the Z component of the normal vector is greater than 0 (that is, the face points toward the observer), the face is a visible face 716; if the Z component of the normal vector is less than 0, the face is judged to be a hidden face 717 and need not be drawn, as shown in Fig. 14.
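A non-limiting Python sketch of the plane normal criterion: the sign of the Z component of a face normal, obtained from the cross product of two edges, decides whether the face is drawn (the cube faces are illustrative and assume counter-clockwise winding as seen by the observer):

```python
def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def face_visible(a, b, c):
    """Plane normal criterion: visible if the Z component of the face normal is > 0."""
    u = tuple(b[i] - a[i] for i in range(3))
    v = tuple(c[i] - a[i] for i in range(3))
    return cross(u, v)[2] > 0.0

# Two faces of a unit cube, seen by an observer looking toward -Z.
front = [(0, 0, 1), (1, 0, 1), (1, 1, 1)]   # normal points toward the observer
back  = [(0, 0, 0), (0, 1, 0), (1, 1, 0)]   # normal points away from the observer
print(face_visible(*front), face_visible(*back))   # True False
```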
4. Surface normal criterion: the surface equation is used as the basis for judgment. For example, when determining how an object is lit, the coordinate value of each point is substituted into the equation to obtain the normal vector, whose inner product with the light vector gives the amount of received light; drawing then starts from the farthest point, so that nearer points cover farther points when drawn and the depth problem is handled.
5. Min-max method: drawing starts from the largest Z coordinate, and the minimum and maximum values of the Y coordinate decide which points are drawn, forming a 3D depth image 718, as shown in Fig. 15.
With the stereoscopic image visual effect processing method of the present invention, moving the cursor causes the corresponding object to change its depth coordinate position and thereby highlights its visual effect. In addition, the relative coordinate positions of the other objects may also change correspondingly, to further emphasize the variation of the visual image.
The above description is intended to be illustrative of the present invention and not restrictive. Those of ordinary skill in the art will understand that many modifications, changes or equivalents can be made without departing from the spirit and scope defined by the following claims, and all of them fall within the protection scope of the present invention.
Claims (6)
1. A stereoscopic image visual effect processing method, characterized in that it comprises the following steps:
providing a stereoscopic image, the stereoscopic image being composed of a plurality of objects, each of the plurality of objects having an object coordinate value;
providing a cursor, the cursor having a cursor coordinate value;
judging whether the cursor coordinate value coincides with the object coordinate value of one of the plurality of objects;
if the cursor coordinate value coincides with the object coordinate value of one of the plurality of objects, changing a depth coordinate parameter of the object coordinate value of the corresponding object; and
redrawing the image of the object that matches the cursor coordinate value.
2. The stereoscopic image visual effect processing method according to claim 1, characterized in that, if the cursor coordinate value changes, it is judged again whether the cursor coordinate value coincides with the object coordinate value of one of the plurality of objects.
3. The stereoscopic image visual effect processing method according to claim 1, characterized in that the object coordinate values are coordinate values in local coordinates, world coordinates, view coordinates or projection coordinates.
4. The stereoscopic image visual effect processing method according to claim 1, characterized in that the cursor coordinate value is produced by a mouse, a trackpad or a touch panel.
5. The stereoscopic image visual effect processing method according to claim 1, characterized in that the stereoscopic image is produced by carrying out, in order, the computer-graphics steps of modeling, scene layout and animation, and rendering.
6. The stereoscopic image visual effect processing method according to claim 1, characterized in that the depth coordinate parameter of the object coordinate value of the plurality of objects is determined by one of Z-buffering, the painter's depth-sorting algorithm, the plane normal criterion, the surface normal criterion and the min-max method.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2011100719675A CN102693065A (en) | 2011-03-24 | 2011-03-24 | Method for processing visual effect of stereo image |
Publications (1)
Publication Number | Publication Date |
---|---|
CN102693065A true CN102693065A (en) | 2012-09-26 |
Family
ID=46858571
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2011100719675A Pending CN102693065A (en) | 2011-03-24 | 2011-03-24 | Method for processing visual effect of stereo image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102693065A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106162142A (en) * | 2016-06-15 | 2016-11-23 | 南京快脚兽软件科技有限公司 | A kind of efficient VR scene drawing method |
CN106406508A (en) * | 2015-07-31 | 2017-02-15 | 联想(北京)有限公司 | Information processing method and relay equipment |
CN104268922B (en) * | 2014-09-03 | 2017-06-06 | 广州博冠信息科技有限公司 | A kind of image rendering method and image rendering device |
CN108463837A (en) * | 2016-01-12 | 2018-08-28 | 高通股份有限公司 | System and method for rendering multiple detail grades |
CN110091614A (en) * | 2018-01-30 | 2019-08-06 | 东莞市图创智能制造有限公司 | Stereo-picture Method of printing, device, equipment and storage medium |
WO2021110038A1 (en) * | 2019-12-05 | 2021-06-10 | 北京芯海视界三维科技有限公司 | 3d display apparatus and 3d image display method |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6295062B1 (en) * | 1997-11-14 | 2001-09-25 | Matsushita Electric Industrial Co., Ltd. | Icon display apparatus and method used therein |
US20040230918A1 (en) * | 2000-12-08 | 2004-11-18 | Fujitsu Limited | Window display controlling method, window display controlling apparatus, and computer readable record medium containing a program |
CN101587386A (en) * | 2008-05-21 | 2009-11-25 | 深圳华为通信技术有限公司 | Method for processing cursor, Apparatus and system |
Legal Events
Code | Title
---|---
C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
C02 | Deemed withdrawal of patent application after publication (patent law 2001)
WD01 | Invention patent application deemed withdrawn after publication

Application publication date: 2012-09-26