CN107103637B - Method for enhancing texture force - Google Patents
Method for enhancing texture force
- Publication number
- CN107103637B CN107103637B CN201710218574.XA CN201710218574A CN107103637B CN 107103637 B CN107103637 B CN 107103637B CN 201710218574 A CN201710218574 A CN 201710218574A CN 107103637 B CN107103637 B CN 107103637B
- Authority
- CN
- China
- Prior art keywords
- force
- texture
- image
- point
- gradient
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/80—Shading
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
Abstract
The invention discloses a method for enhancing texture force, characterized by comprising two parts: reconstruction of the three-dimensional microscopic shape of a two-dimensional texture image's surface, and texture force haptic rendering. The first part uses an evolution method from the SFS (shape from shading) algorithm to reconstruct the surface three-dimensional microscopic shape from a two-dimensional texture image; the second part adds a gradient feedback force to the traditional texture force haptic rendering method to achieve a more realistic haptic effect. Texture force haptic rendering is built on the surface three-dimensional microscopic shape: the SFS technique uses the shading variation of the object surface in a single image to recover the relative height of each surface point, laying the foundation for further building the object's surface three-dimensional microscopic shape. The gradient feedback force exists only where the surface gradient of the object satisfies a threshold condition, and its direction is opposite to the velocity of the virtual point, so that an operator touching the object surface perceives its concave-convex variation more clearly.
Description
Technical Field
The invention belongs to the field of texture force rendering in virtual reality, and particularly relates to a method for enhancing texture force.
Background
With the continuous development of human-computer interaction technology in virtual reality, more and more interaction devices seek to give users an immersive feeling: although the user is in a virtual environment, it feels as real as reality. An important problem in human-computer interaction is texture force haptic rendering. By feeding the texture of an object's surface back to the user as force, the user can perceive that surface texture. At present, traditional texture force haptic rendering mainly feeds back the resultant of the normal force perpendicular to the object surface and the friction force parallel to it; this resultant is output to the user through a texture force haptic device so that the user perceives the object's surface texture.
Disclosure of Invention
The method improves the traditional texture force haptic rendering method to achieve a more realistic rendering effect. The technical scheme comprises two parts: three-dimensional microscopic shape reconstruction of the two-dimensional texture image surface, and texture force haptic rendering. The first part reconstructs the surface three-dimensional microscopic shape from a two-dimensional texture image using an evolution method in the SFS algorithm; the second part adds a gradient feedback force to the traditional texture force haptic rendering method to achieve a more realistic rendering effect.
1. Import the two-dimensional texture image and reconstruct the surface three-dimensional microscopic shape from it using an evolution method in the SFS algorithm. The evolution method gradually evolves the shape of the whole curved surface from a group of reference points in the image whose heights are known. The key of the algorithm is to find one or more points in the image that uniquely determine the shape, and then obtain the solution for the whole surface iteratively from those points.
1-1) The Lambertian surface reflection model used in the SFS algorithm is:

E(x, y) = R(p, q) = (1 + p·p_m + q·q_m) / (√(1 + p² + q²) · √(1 + p_m² + q_m²))

where E(x, y) is the normalized image brightness, (x, y) is the position of an image pixel, R(p, q) is the reflectance map, (p, q) is the surface gradient, the light source direction vector is l = (−p_m, −q_m, 1), and p_m and q_m are constants;
1-2) First rotate the image by an angle so that the image X axis coincides with the projection of the light source direction on the image plane, and compute the heights; then rotate the image back to its original position, obtaining the height of the surface point corresponding to each image point at the original position. The evolution method is as follows:
Let the small increment of a surface point along the direction with deflection angle t be (dx, dy, dz), where dx = cos t·ds, dy = sin t·ds, and s is a path parameter. The surface normal vector n = (n_1, n_2, n_3) at that point satisfies:

n_1·dx + n_2·dy + n_3·dz = 0
Let the surface slope along direction t be k = dz/ds. For the adjacent point in each direction t, search for the steepest ascending slope k on the isophote (equal-brightness line) determined by the brightness of that adjacent point;
From the relationship between (p, q) and n, namely p = −n_1/n_3 and q = −n_2/n_3, the surface gradient values satisfying the conditions are obtained. In general two sets of surface gradient values with opposite signs are obtained; select the set satisfying (cos t, sin t, k) · l > 0, i.e. the surface gradient pointing away from the light source; here (p(D), q(D)) are the surface gradients on the same isophote, and D denotes the isophote;
The height value z^(h+1)(x, y) of the current point is then obtained from the height values of the neighbouring points in all directions:

z^(h+1)(x, y) = (1/N) · Σ_t z̃_t(x, y)

where h is the iteration number, N is the number of directions, and the estimated height z̃_t(x, y) propagated from the point (x + cos t·ds, y + sin t·ds) to the point (x, y) is

z̃_t(x, y) = z^(h)(x + cos t·ds, y + sin t·ds) − k·ds
2) Calculate the gradient value of every pixel and compute the final texture force according to the magnitude of the gradient. Let A be the normalized image; the horizontal gradient value T_1 of each pixel is the difference of A along the horizontal direction, and the vertical gradient value T_2 of each pixel is the difference of A along the vertical direction;
2-1) When the gradient value T < |λ| (λ is a constant that varies with the material of the object in the two-dimensional texture image), the object surface is flat and the final texture force equals the traditional texture force F_c; the final texture force F_c′ is:
F_c′ = F_c = F_n + F_f

F_n = S × z^(h+1)(x, y)

F_f = μ × F_n × sign(v)
where F_n is the normal force perpendicular to the object surface, F_f is the friction force parallel to the object surface, S is the stiffness coefficient of the object surface (related to the object's material), μ is the friction coefficient, sign(v) is the sign function, and v is the velocity of the virtual point;
2-2) When the gradient value T > |λ|, the object surface rises or falls steeply at that point, and the gradient feedback force F′ felt by the operator is significant; the gradient feedback force is then the dominant force and the traditional texture force F_c the secondary force. The final texture force F_c′ is:
F_c′ = α × F_c + β × F′   (α < β)

F′ = S × |v| × sign(v)
where α and β are the weights of the traditional texture force and the gradient feedback force, respectively, and S is the stiffness coefficient of the object surface.
Advantageous effects
In addition to the traditional texture force computation, a gradient feedback force opposite to the velocity direction of the virtual point is added, achieving a more realistic force haptic feedback effect.
Texture force haptic rendering is built on the surface three-dimensional microscopic shape, which is established with the shape-from-shading (SFS) technique: the shading variation of the object surface in a single image is used to recover the relative height of each surface point, laying the foundation for further building the object's surface three-dimensional microscopic shape. Compared with a two-dimensional picture, the three-dimensional microscopic shape reflects more characteristics of the real object and allows the operator to perceive it actively.
Drawings
FIG. 1 is a flow chart of the present embodiment;
FIG. 2 is a diagram of the texture force haptic model when the gradient value T < |λ|;
FIG. 3 is a diagram of the texture force haptic model when the gradient value T > |λ|.
Detailed Description
The technical scheme of the invention is explained in detail below with reference to the accompanying drawings.
FIG. 1 is the flow chart of the present scheme. The method for enhancing texture force proceeds as follows:
1. First import the image to be processed and recover its surface three-dimensional microscopic shape using an evolution method in the SFS algorithm. The SFS method here adopts a Lambertian surface reflection model, an infinitely distant point light source, and orthographic projection as the imaging geometry; in the Lambertian reflection model, the image brightness of a surface point is determined only by the cosine of the incidence angle of the point light source;
1) The Lambertian surface reflection model used in the SFS algorithm is:

E(x, y) = R(p, q) = (1 + p·p_m + q·q_m) / (√(1 + p² + q²) · √(1 + p_m² + q_m²))

where E(x, y) is the normalized image brightness, (x, y) is the position of an image pixel, R(p, q) is the reflectance map, (p, q) is the surface gradient, the light source direction vector is l = (−p_m, −q_m, 1), and p_m and q_m are constants;
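The reflectance map above can be sketched numerically. The following Python snippet is illustrative only: the function name and the clipping of self-shadowed (negative) values to zero are assumptions, not part of the patent.

```python
import numpy as np

def lambertian_reflectance(p, q, pm, qm):
    """Reflectance map R(p, q) of a Lambertian surface lit by a distant
    point source with direction vector l = (-pm, -qm, 1).

    Brightness is the cosine of the incidence angle, i.e. the normalized
    dot product of the surface normal n = (-p, -q, 1) with l; negative
    values (self-shadowed points) are clipped to 0 (an assumption).
    """
    num = 1.0 + p * pm + q * qm
    den = np.sqrt(1.0 + p ** 2 + q ** 2) * np.sqrt(1.0 + pm ** 2 + qm ** 2)
    return float(np.clip(num / den, 0.0, 1.0))
```

For a flat patch facing the viewer under frontal lighting (p = q = p_m = q_m = 0), the brightness is 1; tilting the patch reduces it by the cosine factor.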
2) First rotate the image by an angle so that the image X axis coincides with the projection of the light source direction on the image plane, and compute the heights; then rotate the image back to its original position, obtaining the height of the surface point corresponding to each image point at the original position. The evolution method is as follows:
Let the small increment of a surface point along the direction with deflection angle t be (dx, dy, dz), where dx = cos t·ds, dy = sin t·ds, and s is a path parameter. The surface normal vector n = (n_1, n_2, n_3) at that point satisfies:

n_1·dx + n_2·dy + n_3·dz = 0
Let the surface slope along direction t be k = dz/ds. For the adjacent point in each direction t, search for the steepest ascending slope k on the isophote (equal-brightness line) determined by the brightness of that adjacent point;
From the relationship between (p, q) and n, namely p = −n_1/n_3 and q = −n_2/n_3, the surface gradient values satisfying the conditions are obtained. In general two sets of surface gradient values with opposite signs are obtained; select the set satisfying (cos t, sin t, k) · l > 0, i.e. the surface gradient pointing away from the light source; here (p(D), q(D)) are the surface gradients on the same isophote, and D denotes the isophote;
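The selection between the two candidate gradient sets can be sketched as follows. This is a minimal illustration: it assumes k = p·cos t + q·sin t (which follows from the normal-vector condition n_1·dx + n_2·dy + n_3·dz = 0 with n proportional to (−p, −q, 1)), and the function name and candidate-list interface are hypothetical.

```python
import numpy as np

def select_gradient(candidates, t, pm, qm):
    """Pick, among candidate surface gradients (p, q) on an isophote,
    the one facing away from the light source l = (-pm, -qm, 1).

    For each candidate, the slope along direction t is
    k = p*cos(t) + q*sin(t); the candidate is accepted when the tangent
    direction (cos t, sin t, k) has a positive dot product with l.
    """
    l = np.array([-pm, -qm, 1.0])
    for p, q in candidates:
        k = p * np.cos(t) + q * np.sin(t)          # slope along direction t
        tangent = np.array([np.cos(t), np.sin(t), k])
        if np.dot(tangent, l) > 0:                 # (cos t, sin t, k) . l > 0
            return (p, q)
    return None  # no candidate satisfies the condition
```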
The height value z^(h+1)(x, y) of the current point is then obtained from the height values of the neighbouring points in all directions:

z^(h+1)(x, y) = (1/N) · Σ_t z̃_t(x, y)

where h is the iteration number, N is the number of directions, and the estimated height z̃_t(x, y) propagated from the point (x + cos t·ds, y + sin t·ds) to the point (x, y) is

z̃_t(x, y) = z^(h)(x + cos t·ds, y + sin t·ds) − k·ds
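The height-evolution step for a single point can be sketched as below. The averaging over directions is an assumption inferred from the description ("obtained from the height values of the neighbouring points in all directions"), and the function name and interface are hypothetical.

```python
import numpy as np

def height_update(z_neighbors, slopes, ds=1.0):
    """One evolution iteration for a single surface point (x, y).

    z_neighbors[i] is the current height z^(h) at the neighbour
    (x + cos(t_i)*ds, y + sin(t_i)*ds) and slopes[i] is the steepest
    ascending slope k found on that neighbour's isophote.  Each
    neighbour contributes the estimate z^(h) - k*ds, and z^(h+1)(x, y)
    is taken here as the mean of these contributions (an assumption).
    """
    z_neighbors = np.asarray(z_neighbors, dtype=float)
    slopes = np.asarray(slopes, dtype=float)
    return float(np.mean(z_neighbors - slopes * ds))
```

Iterating this update over all pixels until the heights stabilize yields the surface three-dimensional microscopic shape.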
2. The second part calculates the gradient value of each pixel and computes the final texture force according to the gradient value. Let A be the normalized image; the horizontal gradient value T_1 of each pixel is the difference of A along the horizontal direction, and the vertical gradient value T_2 of each pixel is the difference of A along the vertical direction;
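A minimal sketch of the per-pixel gradient computation. Since the exact difference formula is not reproduced here, this sketch assumes simple forward differences with the last row/column replicated; the function name is hypothetical.

```python
import numpy as np

def texture_gradients(A):
    """Horizontal (T1) and vertical (T2) gradient values of a
    normalized image A, using forward differences (an assumption)
    with the last row/column replicated so the output keeps the
    image shape.
    """
    A = np.asarray(A, dtype=float)
    T1 = np.diff(A, axis=1, append=A[:, -1:])  # horizontal gradient
    T2 = np.diff(A, axis=0, append=A[-1:, :])  # vertical gradient
    return T1, T2
```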
1) When the gradient value T < |λ| (λ is a constant that varies with the material of the object in the two-dimensional texture image), as shown in FIG. 2, the object surface is flat and the final texture force equals the traditional texture force F_c; the final texture force F_c′ is:
F_c′ = F_c = F_n + F_f

F_n = S × z^(h+1)(x, y)

F_f = μ × F_n × sign(v)
where F_n is the normal force perpendicular to the object surface, F_f is the friction force parallel to the object surface, S is the stiffness coefficient of the object surface (related to the object's material), μ is the friction coefficient, sign(v) is the sign function, and v is the velocity of the virtual point P;
2) When the gradient value T > |λ|, as shown in FIG. 3, the object surface rises or falls steeply at that point, and the gradient feedback force F′ felt by the operator is significant; the gradient feedback force is then the dominant force and the traditional texture force F_c the secondary force. The final texture force F_c′ is:
F_c′ = α × F_c + β × F′   (α < β)

F′ = S × |v| × sign(v)
where α and β are the weights of the traditional texture force and the gradient feedback force, respectively, and S is the stiffness coefficient of the object surface.
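The two-case force computation above can be sketched as follows. The numeric parameter values are illustrative only (the description merely requires α < β), and the function name is hypothetical.

```python
import numpy as np

def texture_force(T, lam, z, v, S=1.0, mu=0.5, alpha=0.3, beta=0.7):
    """Final texture force Fc' at one contact point.

    T   : gradient value at the contact pixel
    lam : material-dependent threshold lambda
    z   : reconstructed height z^(h+1)(x, y)
    v   : signed velocity of the virtual point along the surface
    """
    Fn = S * z                         # normal force, perpendicular to surface
    Ff = mu * Fn * np.sign(v)          # friction force, parallel to surface
    Fc = Fn + Ff                       # traditional texture force
    if abs(T) <= abs(lam):             # flat region: Fc' = Fc
        return float(Fc)
    F_grad = S * abs(v) * np.sign(v)   # gradient feedback force F'
    return float(alpha * Fc + beta * F_grad)  # Fc' = alpha*Fc + beta*F'
```

With the default weights, a steep-gradient contact is dominated by the gradient feedback term, matching the requirement that α < β.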
Claims (1)
1. A method for enhancing texture force, characterized by comprising two parts: three-dimensional microscopic shape reconstruction of the two-dimensional texture image surface, and texture force haptic rendering;
1) The first part imports a texture image and reconstructs the surface three-dimensional microscopic shape from the two-dimensional texture image using an evolution method in the SFS algorithm:
1-1) The Lambertian surface reflection model used in the SFS algorithm is:

E(x, y) = R(p, q) = (1 + p·p_m + q·q_m) / (√(1 + p² + q²) · √(1 + p_m² + q_m²))

where E(x, y) is the normalized image brightness, (x, y) is the position of an image pixel, R(p, q) is the reflectance map, (p, q) is the surface gradient, the light source direction vector is l = (−p_m, −q_m, 1), and p_m and q_m are constants;
1-2) First rotate the image by an angle so that the image X axis coincides with the projection of the light source direction on the image plane, and compute the heights; then rotate the image back to its original position, obtaining the height of the surface point corresponding to each image point at the original position. The evolution method is as follows:
Let the small increment of a surface point along the direction with deflection angle t be (dx, dy, dz), where dx = cos t·ds, dy = sin t·ds, and s is a path parameter. The surface normal vector n = (n_1, n_2, n_3) at that point satisfies:

n_1·dx + n_2·dy + n_3·dz = 0
Let the surface slope along direction t be k = dz/ds. For the adjacent point in each direction t, search for the steepest ascending slope k on the isophote (equal-brightness line) determined by the brightness of that adjacent point;
From the relationship between (p, q) and n, namely p = −n_1/n_3 and q = −n_2/n_3, the surface gradient values satisfying the conditions are obtained; two sets of surface gradient values with opposite signs satisfy the conditions, and the set satisfying (cos t, sin t, k) · l > 0 is selected, i.e. the surface gradient pointing away from the light source; here (p(D), q(D)) are the surface gradients on the same isophote, and D denotes the isophote;
Then the height value z^(h+1)(x, y) of the current point is obtained from the height values of the neighbouring points in all directions:

z^(h+1)(x, y) = (1/N) · Σ_t z̃_t(x, y)

where h is the iteration number, N is the number of directions, and the estimated height z̃_t(x, y) propagated from the point (x + cos t·ds, y + sin t·ds) to the point (x, y) is

z̃_t(x, y) = z^(h)(x + cos t·ds, y + sin t·ds) − k·ds
2) The second part calculates the gradient value of each pixel and computes the final texture force according to the gradient value. Let A be the normalized image; the horizontal gradient value T_1 of each pixel is the difference of A along the horizontal direction, and the vertical gradient value T_2 of each pixel is the difference of A along the vertical direction;
2-1) When the gradient value T < |λ| (λ is a constant that varies with the material of the object in the two-dimensional texture image), the object surface is flat and the final texture force equals the traditional texture force F_c; the final texture force F_c′ is:
F_c′ = F_c = F_n + F_f

F_n = S × z^(h+1)(x, y)

F_f = μ × F_n × sign(v)
where F_n is the normal force perpendicular to the object surface, F_f is the friction force parallel to the object surface, S is the stiffness coefficient of the object surface (related to the object's material), μ is the friction coefficient, sign(v) is the sign function, and v is the velocity of the virtual point;
2-2) When the gradient value T > |λ|, the object surface rises or falls steeply at that point, and the gradient feedback force F′ felt by the operator is significant; the gradient feedback force is then the dominant force and the traditional texture force F_c the secondary force. The final texture force F_c′ is:
F_c′ = α × F_c + β × F′   (α < β)

F′ = S × |v| × sign(v)
where α and β are the weights of the traditional texture force and the gradient feedback force, respectively, and S is the stiffness coefficient of the object surface.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710218574.XA CN107103637B (en) | 2017-04-05 | 2017-04-05 | Method for enhancing texture force |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107103637A CN107103637A (en) | 2017-08-29 |
CN107103637B true CN107103637B (en) | 2020-08-11 |
Family
ID=59674858
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710218574.XA Active CN107103637B (en) | 2017-04-05 | 2017-04-05 | Method for enhancing texture force |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107103637B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109472853A (en) * | 2018-11-16 | 2019-03-15 | 厦门大学 | A kind of lambert's body microcosmic surface reconstructing method based on image irradiation intensity |
CN111258432B (en) * | 2020-02-07 | 2021-04-23 | 吉林大学 | Force touch reproduction method of high-definition image texture based on electrostatic force touch feedback device |
CN112835448B (en) * | 2021-01-27 | 2024-02-06 | 南京工程学院 | Object three-dimensional shape reconstruction method based on monocular image light and shade restoration technology and interactive data information fusion |
CN116664782B (en) * | 2023-07-31 | 2023-10-13 | 南京信息工程大学 | Neural radiation field three-dimensional reconstruction method based on fusion voxels |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102006013318A1 (en) * | 2005-03-21 | 2006-09-28 | Daimlerchrysler Ag | Static scene reconstruction method for testing work pieces, involves reconstructing scene of a two-dimensional image data by method of shape from motion when scene exhibits region with intensity gradients |
CN101615072B (en) * | 2009-06-18 | 2010-12-29 | 东南大学 | Method for reproducing texture force touch based on shape-from-shading technology |
CN101819462B (en) * | 2010-03-12 | 2011-07-20 | 东南大学 | Image texture haptic representation system based on force/haptic interaction equipment |
US9519999B1 (en) * | 2013-12-10 | 2016-12-13 | Google Inc. | Methods and systems for providing a preloader animation for image viewers |
CN103869984B (en) * | 2014-03-26 | 2016-08-17 | 东南大学 | A kind of haptic feedback method based on texture picture |
CN104050683A (en) * | 2014-07-09 | 2014-09-17 | 东南大学 | Texture force touch sensing method based on single image fractional order processing |
Non-Patent Citations (1)
Title |
---|
Research on image force haptic rendering method based on PDE method; Tian Lei; Song Aiguo; Wang Wei; Chinese Journal of Scientific Instrument; 2013-10-31; Vol. 34, No. 10; pp. 2316-2321 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||