Disclosure of Invention
In view of the foregoing, embodiments of the present application provide a method and apparatus for setting a model, a computing device and a storage medium, so as to solve the technical drawbacks in the prior art.
The embodiment of the application discloses a method for setting a model, which is used in a virtual scene, wherein the virtual scene is provided with a virtual camera, and the method comprises the following steps:
determining a selection point according to an input instruction, and determining a target ray according to the selection point and a virtual camera;
determining a first intersection point of the target ray and the model, and taking the first intersection point as a center point of a drawing control;
and determining a model attribute value in the drawing control range according to the center point of the drawing control and the attribute value corresponding to the drawing control.
Optionally, determining the selection point according to the input instruction includes: and determining a selection point according to an input instruction generated by the mouse.
Optionally, determining a first intersection point of the target ray with the model includes:
dividing the model into a plurality of grids, and forming a multi-stage space cube according to the grids, wherein the i-th stage space cube comprises at least two i+1th stage space cubes, and i is a positive integer greater than or equal to 1;
intersecting the target ray with the space cube step by step, and determining the first space cube with the minimum level intersecting the target ray as a target space cube;
each grid in the target space cube is intersected with the target ray to obtain a first grid intersected with the target ray;
an intersection of the target ray with the grid is determined.
Optionally, the target ray and the space cube are intersected step by step, and the first space cube with the minimum level intersected with the target ray is determined to be the target space cube, which comprises the following steps:
S1: intersecting the target ray with the i-th level space cubes, and determining the first i-th level space cube intersecting the target ray, wherein i is more than or equal to 1 and less than or equal to n;
S2: intersecting the target ray with the (i+1)-th level space cubes corresponding to the first i-th level space cube, and determining the first (i+1)-th level space cube intersecting the target ray;
S3: judging whether i is smaller than n; if yes, executing step S4, and if not, executing step S5;
S4: incrementing i by 1, and then executing step S2;
S5: determining the first n-th level space cube intersecting the target ray as the target space cube.
Optionally, determining an intersection point of the target ray and the grid includes:
and obtaining the intersection point coordinates of the target ray and the grid through a predefined function.
Optionally, determining the model attribute value in the drawing control range according to the center point of the drawing control and the attribute value corresponding to the drawing control includes:
determining a drawing control range according to the center point of the drawing control;
traversing all grid vertexes of the model, and determining grid vertexes positioned in a drawing control range;
determining attribute values of grid vertices positioned in the drawing control range according to the attribute values corresponding to the drawing control;
and determining the attribute value of the grid according to the attribute value of the vertex of the grid.
Optionally, determining the attribute value of the grid vertex within the drawing control according to the attribute value corresponding to the drawing control includes:
and determining the attribute value of the grid vertex in the drawing control range according to the product of the attribute value corresponding to the drawing control, the weight coefficient and the distance between the grid vertex and the center point of the drawing control.
Optionally, the attribute value includes: one or more of color value, weight value, bone weight value, hardness value.
The embodiment of the application further discloses a device for setting a model, which is used in a virtual scene, wherein the virtual scene is provided with a virtual camera, and the device comprises:
the ray determination module is configured to determine a selection point according to an input instruction, and determine a target ray according to the selection point and the virtual camera;
the center point determining module is configured to determine a first intersection point of the target ray and the model, and the first intersection point is used as a center point of a drawing control;
and the model attribute value determining module is configured to determine the model attribute value in the drawing control range according to the center point of the drawing control and the attribute value corresponding to the drawing control.
Optionally, the ray determination module is specifically configured to: and determining a selection point according to an input instruction generated by the mouse.
Optionally, the center point determination module is specifically configured to:
dividing the model into a plurality of grids, and forming a multi-stage space cube according to the grids, wherein the i-th stage space cube comprises at least two i+1th stage space cubes, and i is a positive integer greater than or equal to 1;
intersecting the target ray with the space cube step by step, and determining the first space cube with the minimum level intersecting the target ray as a target space cube;
each grid in the target space cube is intersected with the target ray to obtain a first grid intersected with the target ray;
an intersection of the target ray with the grid is determined.
Optionally, the center point determination module specifically includes:
a first intersection module configured to intersect the target ray with an i-th level spatial cube, determining a first i-th level spatial cube that intersects the target ray; wherein i is more than or equal to 1 and less than or equal to n;
a second intersection module configured to intersect the target ray with an i+1st level spatial cube corresponding to the first i level spatial cube, and determine a first i+1st level spatial cube intersected by the target ray;
the judging module is configured to judge whether i is smaller than n, and to trigger the self-increasing module if yes, or the determining module if not;
the self-increasing module is configured to increment i by 1 and then trigger the second intersection module;
a determination module configured to determine a first nth level spatial cube intersecting the target ray as a target spatial cube.
Optionally, the center point determination module is specifically configured to: and obtaining the intersection point coordinates of the target ray and the grid through a predefined function.
Optionally, the center point determination module is specifically configured to:
determining a drawing control range according to the center point of the drawing control;
traversing all grid vertexes of the model, and determining grid vertexes positioned in a drawing control range;
determining attribute values of grid vertices positioned in the drawing control range according to the attribute values corresponding to the drawing control;
and determining the attribute value of the grid according to the attribute value of the vertex of the grid.
Optionally, the center point determination module is specifically configured to: and determining the attribute value of the grid vertex in the drawing control range according to the product of the attribute value corresponding to the drawing control, the weight coefficient and the distance between the grid vertex and the center point of the drawing control.
Optionally, the attribute value includes: one or more of color value, weight value, bone weight value, hardness value.
The embodiment of the application discloses a computing device, which comprises a memory, a processor, and computer instructions stored on the memory and executable on the processor, wherein the processor, when executing the instructions, implements the steps of the method for setting a model as described above.
The embodiment of the application discloses a computer-readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the method for setting a model as described above.
According to the method and device for setting the model, the target ray is determined according to the selection point and the virtual camera, the first intersection point of the target ray and the model is used as the center point of the drawing control, and the model attribute value in the range of the drawing control is determined according to the center point of the drawing control and the attribute value corresponding to the drawing control, so that the model attribute value can be set at any position in the model selected by the drawing control.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, this application can be implemented in many ways other than those described herein, and those skilled in the art can make similar generalizations without departing from the spirit of the application; the application is therefore not limited to the specific embodiments disclosed below.
The terminology used in the one or more embodiments of the specification is for the purpose of describing particular embodiments only and is not intended to be limiting of the one or more embodiments of the specification. As used in this specification, one or more embodiments and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present specification refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that, although the terms first, second, etc. may be used in one or more embodiments of this specification to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, a first may also be referred to as a second, and similarly, a second may also be referred to as a first, without departing from the scope of one or more embodiments of the present description. The word "if" as used herein may be interpreted as "when" or "upon" or "in response to determining", depending on the context.
In the present application, a method and apparatus for setting a model, a computing device, and a storage medium are provided, and detailed descriptions are given in the following embodiments.
Fig. 1 is a block diagram illustrating a configuration of a computing device 100 according to an embodiment of the present description. The components of the computing device 100 include, but are not limited to, a memory 110 and a processor 120. The processor 120 is coupled to the memory 110 via a bus 130, and a database 150 is used to store data.
Computing device 100 also includes access device 140, access device 140 enabling computing device 100 to communicate via one or more networks 160. Examples of such networks include the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), or a combination of communication networks such as the Internet. The access device 140 may include one or more of any type of network interface, wired or wireless (e.g., a Network Interface Card (NIC)), such as an IEEE 802.11 Wireless Local Area Network (WLAN) wireless interface, a Worldwide Interoperability for Microwave Access (WiMAX) interface, an Ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a Bluetooth interface, a Near Field Communication (NFC) interface, and so forth.
In one embodiment of the present description, the above-described components of computing device 100, as well as other components not shown in FIG. 1, may also be connected to each other, such as by a bus. It should be understood that the block diagram of the computing device shown in FIG. 1 is for exemplary purposes only and is not intended to limit the scope of the present description. Those skilled in the art may add or replace other components as desired.
Computing device 100 may be any type of stationary or mobile computing device including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), mobile phone (e.g., smart phone), wearable computing device (e.g., smart watch, smart glasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or PC. Computing device 100 may also be a mobile or stationary server.
Wherein the processor 120 may perform the steps of the method shown in fig. 2. Fig. 2 is a schematic flow chart illustrating a method of model setup according to an embodiment of the present application, including steps 201 to 203.
201. And determining a selection point according to the input instruction, and determining a target ray according to the selection point and the virtual camera.
It should be explained that the virtual camera is set in a virtual scene, for example a 3D scene, and the corresponding 3D window shows the content shot by the virtual camera. The position of the camera is defined by the system and serves as the starting point of the target ray; the target ray is the ray emitted from this origin toward the selection point. Therefore, each selection point corresponds to exactly one ray.
The input command may be various, for example, a command input by a mouse, a keyboard, or the like.
Taking mouse input as an example, in this step, determining a selection point according to an input instruction includes: and determining a selection point according to an input instruction generated by the mouse.
Specifically, the input instruction generated by the mouse may be a movement or click instruction of the mouse. For example, when the mouse is moved to a point and then clicked, that point is determined to be the selection point.
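Step 201 can be sketched as follows, assuming the clicked screen point has already been unprojected into a world-space selection point (that unprojection depends on the engine's camera matrices and is not shown); the function name is illustrative:

```python
import math

def make_target_ray(camera_origin, selection_point):
    """Step 201 sketch: the camera position is the ray origin, and the ray
    points from that origin toward the selection point, so each selection
    point yields exactly one ray."""
    direction = [s - c for s, c in zip(selection_point, camera_origin)]
    length = math.sqrt(sum(d * d for d in direction))
    if length == 0.0:
        raise ValueError("selection point coincides with the camera origin")
    return camera_origin, [d / length for d in direction]
```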
202. And determining a first intersection point of the target ray and the model, and taking the first intersection point as a center point of the drawing control.
The drawing control may be various, such as a brush.
Specifically, referring to fig. 3, determining the first intersection point of the target ray and the model in this step includes:
301. dividing the model into a plurality of grids, and forming a multi-stage space cube according to the grids, wherein the i-th stage space cube comprises at least two i+1th stage space cubes, and i is a positive integer greater than or equal to 1.
Optionally, the grids may be of various types, for example, triangular grids.
Wherein, forming the multi-stage space cubes according to the grids specifically includes: arranging the plurality of grids to form a multi-stage grid, and then forming the multi-stage space cubes according to the multi-stage grid.
For example, a model is divided into a plurality of meshes and then the plurality of meshes are formed into a 3-level mesh, wherein the model includes at least two 1st-level meshes, each 1st-level mesh includes at least two 2nd-level meshes, and each 2nd-level mesh includes at least two 3rd-level meshes.
A level 3 spatial cube is formed from the level 3 mesh, wherein each level 1 spatial cube comprises at least two level 2 spatial cubes and each level 2 spatial cube comprises at least two level 3 spatial cubes.
Specifically, there are various methods for arranging a plurality of grids to form a multi-stage grid, such as a binary tree space division method, a quadtree space division method, an octree space division method, and the like.
The number of stages i may be set according to practical requirements, for example, setting the number of stages i to 10, 20 or 30.
302. And intersecting the target ray with the space cube step by step, and determining the first space cube with the minimum level intersecting the target ray as a target space cube.
For example, the model is divided into level 3 spatial cubes, wherein each level 1 spatial cube comprises at least two level 2 spatial cubes, and each level 2 spatial cube comprises at least two level 3 spatial cubes. Then, in step 302, it is necessary to determine the first level 3 spatial cube that intersects the target ray as the target spatial cube.
303. And intersecting each grid in the target space cube with the target ray to obtain a first grid intersected with the target ray.
Specifically, in this step, all the grids in the target space cube need to be traversed: each grid is intersected with the target ray to obtain the grids intersecting the target ray, and finally the first grid intersecting the target ray is determined.
304. An intersection of the target ray with the grid is determined.
Specifically, the coordinates of the intersection points of the target ray and the grid are obtained through a predefined function, such as a function directly provided by Direct3D of the bottom layer of the system.
In addition, the calculation may be performed by the following method: and firstly calculating the intersection point of the target ray and the plane where the grid is located, then judging whether the intersection point is inside the grid, and if so, determining the intersection point as the intersection point of the target ray and the grid.
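One common way to implement such a test, when no predefined function is available, is the Moller-Trumbore algorithm, which folds the "intersect the plane, then check whether the point lies inside the grid" steps into a single computation via barycentric coordinates. A minimal sketch (an assumption for illustration, not the system-provided Direct3D function):

```python
def ray_triangle_intersection(origin, direction, triangle, eps=1e-9):
    """Moller-Trumbore test: returns the intersection point of the ray with
    the triangle, or None if the ray misses it. The barycentric coordinates
    (u, v) perform the 'is the plane hit inside the triangle' check."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    v0, v1, v2 = triangle
    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, e2)
    det = dot(e1, h)
    if abs(det) < eps:
        return None            # ray parallel to the triangle's plane
    f = 1.0 / det
    s = sub(origin, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:
        return None            # plane hit lies outside the triangle
    q = cross(s, e1)
    v = f * dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(e2, q)
    if t <= eps:
        return None            # intersection behind the ray origin
    return tuple(o + t * d for o, d in zip(origin, direction))
```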
The method of binary tree space partitioning is explained below with reference to fig. 4, as a method of forming a multi-level space cube from a plurality of grids.
Each model consists of thousands of grids. Referring to fig. 4, all the grids corresponding to the model may be divided in two repeatedly: first, all the grids of the model are classified into two classes, forming two 1st-level space cubes a41; then the grids corresponding to each 1st-level space cube are again classified into two classes, generating two corresponding 2nd-level space cubes a42; the grids corresponding to each 2nd-level space cube are again classified into two classes, generating two corresponding 3rd-level space cubes a43; and so on, until the final-level space cubes are generated.
Eventually, all the grids of this 3D model form a binary tree, and each grid has its own node on the binary tree. When finding the intersection point of a ray and the model, the calculation is performed along the binary tree. The resulting binary tree is shown in fig. 4, wherein the numbers in the circles in fig. 4 indicate the indices of the grids.
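The binary-tree space division described above can be sketched roughly as follows; the dictionary-based node layout, the naive median split, and the leaf_size threshold are all illustrative assumptions rather than the application's actual data structures:

```python
def bounding_box(grids):
    """Axis-aligned bounds over all vertices; each grid is a tuple of
    (x, y, z) vertices, e.g. the three corners of a triangular grid."""
    verts = [v for grid in grids for v in grid]
    mins = tuple(min(v[i] for v in verts) for i in range(3))
    maxs = tuple(max(v[i] for v in verts) for i in range(3))
    return mins, maxs

def build_binary_tree(grids, leaf_size=2):
    """Classify the grids into two halves at every level, attaching the
    bounding cube of each half, until the final-level cubes are reached."""
    node = {"bounds": bounding_box(grids), "grids": grids, "children": []}
    if len(grids) > leaf_size:
        mid = len(grids) // 2  # naive median split; a real system splits spatially
        node["children"] = [build_binary_tree(grids[:mid], leaf_size),
                            build_binary_tree(grids[mid:], leaf_size)]
    return node
```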
Specifically, referring to fig. 5, step 302 includes:
501. and intersecting the target ray with an ith space cube, and determining a first ith space cube intersected with the target ray, wherein i is more than or equal to 1 and less than or equal to n.
Where n is the number of layers, e.g., 20, 30, etc.
502. And intersecting the target ray with an i+1st level space cube corresponding to the first i level space cube, and determining the first i+1st level space cube intersected with the target ray.
503. Whether i is smaller than n is determined, if yes, step 504 is executed, and if not, step 505 is executed.
504. i is incremented by 1, and then step 502 is performed.
505. And determining the first nth-order space cube intersected with the target ray as a target space cube.
Through steps 501-505, in the process of intersecting the target ray with the spatial cubes step by step, not all the spatial cubes are traversed, but the first i+1th spatial cube intersected with the target ray is determined by intersecting the target ray with the i+1th spatial cube corresponding to the first i-th spatial cube, and finally the first n-th spatial cube is obtained as the target spatial cube. By the method, the intersection range of the target ray and the n-level space cube is converged step by step, so that the calculation cost is saved, and the calculation speed is improved.
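Steps 501-505, the converging level-by-level descent, can be sketched as follows; the node layout (a dict with "bounds" and "children") and the slab-test helper are assumptions for illustration:

```python
def ray_hits_box(bounds, origin, direction, eps=1e-9):
    """Slab test: the ray hits an axis-aligned box iff the per-axis
    entry/exit intervals overlap."""
    mins, maxs = bounds
    t_near, t_far = 0.0, float("inf")
    for i in range(3):
        if abs(direction[i]) < eps:                      # ray parallel to this axis
            if origin[i] < mins[i] or origin[i] > maxs[i]:
                return False
        else:
            t1 = (mins[i] - origin[i]) / direction[i]
            t2 = (maxs[i] - origin[i]) / direction[i]
            t_near = max(t_near, min(t1, t2))
            t_far = min(t_far, max(t1, t2))
    return t_near <= t_far

def find_target_cube(node, origin, direction):
    """Steps 501-505: descend level by level, always into the first child
    cube the ray hits, so subtrees the ray misses are never visited."""
    if not ray_hits_box(node["bounds"], origin, direction):
        return None
    while node["children"]:
        for child in node["children"]:
            if ray_hits_box(child["bounds"], origin, direction):
                node = child
                break
        else:
            break  # the ray grazes this cube but misses every child
    return node
```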
203. And determining a model attribute value in the drawing control range according to the center point of the drawing control and the attribute value corresponding to the drawing control.
Specifically, referring to fig. 6, step 203 includes:
601. and determining the range of the drawing control according to the center point of the drawing control.
Specifically, the drawing control range may be determined according to the center point of the drawing control and the outer contour of the drawing control.
In this case, the brush shape may be customized, for example, a spherical brush, a square brush, a conical brush, or the like.
Taking a spherical brush as an example, the center point of the brush is determined, and then the range of the brush can be determined according to the spherical radius of the brush.
Taking a square brush as an example, the center point of the brush is determined, and then the range of the brush can be determined according to the side length of the square.
602. And traversing all grid vertexes of the model, and determining the grid vertexes positioned in the drawing control range.
Taking a triangular mesh as an example, each mesh has three mesh vertices. By traversing all of the meshes of the model, mesh vertices of the model can be obtained. And then according to the drawing control range, the grid vertexes in the drawing control range can be determined.
Taking a spherical brush as an example, if the distance between a grid vertex and the sphere center is greater than the radius of the spherical brush, the vertex is ignored; if the distance is less than the radius, the vertex is determined to be within the brush range.
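The spherical-brush test of step 602 can be sketched as follows (comparing squared distances avoids the square root; the function name is illustrative):

```python
def vertices_in_spherical_brush(vertices, center, radius):
    """Step 602 for a spherical brush: keep only the grid vertices whose
    distance to the brush center does not exceed the brush radius."""
    inside = []
    for v in vertices:
        d2 = sum((a - b) ** 2 for a, b in zip(v, center))
        if d2 <= radius * radius:  # squared distances avoid the sqrt
            inside.append(v)
    return inside
```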
603. And determining the attribute value of the grid vertex positioned in the drawing control range according to the attribute value corresponding to the drawing control.
Specifically, step 603 includes: and determining the attribute value of the grid vertex in the drawing control range according to the product of the attribute value corresponding to the drawing control, the weight coefficient and the distance between the grid vertex and the center point of the drawing control.
Through this step, the closer to the center point of the drawing control, the smaller the attribute value of the mesh vertex.
Taking color attribute as an example, the smaller the attribute value, the darker the color can be set; the larger the attribute value, the lighter the color. Thus, the closer the grid vertex is to the center point of the drawing control, the darker the color; the farther apart the color is lighter.
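Taking the product rule of step 603 literally gives the following sketch; the argument names are illustrative, and the choice of weight coefficient is left to the implementation:

```python
import math

def vertex_attribute(brush_value, weight_coefficient, vertex, center):
    """Product rule: attribute = brush value x weight coefficient x distance
    to the brush center, so vertices nearer the center get smaller attribute
    values (e.g. darker colors under the convention above)."""
    distance = math.dist(vertex, center)  # Euclidean distance, Python 3.8+
    return brush_value * weight_coefficient * distance
```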
Of course, the attribute values of the grid vertices within the drawing control range may also be set in other manners, for example, the attribute values of the grid vertices within the drawing control range are uniformly set as one attribute value without distinguishing them.
604. And determining the attribute value of the grid according to the attribute value of the vertex of the grid.
Taking a triangle mesh as an example, the attribute values of the mesh need to be determined according to the attribute values of three vertices.
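A minimal sketch of step 604; the mean-of-vertices rule is only one possible policy, assumed here for illustration since the application does not fix how the three vertex values are combined:

```python
def grid_attribute(vertex_values):
    """Combine the three vertex attribute values into one grid value;
    the mean is an assumed policy, chosen only for illustration."""
    return sum(vertex_values) / len(vertex_values)
```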
When the drawing control moves to the next position, steps 601-604 are re-executed, so as to set the model attribute values within the drawing control range. Therefore, the user only needs to first define the attribute value corresponding to the drawing control, and then operate the mouse to control the drawing control; the model attribute values can be flexibly set at any position in the model selected by the drawing control, so the degree of operational freedom is high, which facilitates quick processing.
The attribute values may be various, for example, the attribute values may include: one or more of color value, weight value, bone weight value, hardness value.
Taking color values as an example, the method of the embodiment can be used for editing model colors;
taking a weight value as an example, in a 3D material distribution system, the drawing control can be used to set a property of the material, such as how heavy the material is;
taking the skeletal weight values as an example, in a 3D animation system, skeletal weights of the animation, such as the magnitude weights of the grid vertex offsets in the motion, may be set with rendering controls.
In addition, the attribute values can be indicated by the color values drawn on the grid vertices. Taking a character model as an example, blue is used to represent silk, and the bluer the color, the softer the silk; red is used to represent metal armor, and the redder the color, the harder the armor.
These properties can thus be represented by the color values that the drawing control writes onto the grid vertices; in actual use, the user can customize what special meaning these color values carry. For example, the hardness of the armor may be defined as ranging from 0 to 100, with red=0 indicating a hardness of 0 and red=255 indicating a hardness of 100.
According to the method for setting the model, the target ray is determined according to the selected point and the virtual camera, the first intersection point of the target ray and the model is used as the center point of the drawing control, and the model attribute value in the range of the drawing control is determined according to the center point of the drawing control and the attribute value corresponding to the drawing control, so that the model attribute value can be set at any position in the selected model through the drawing control.
The method of setting the model of the present embodiment will be schematically described below taking the setting of the color attribute of the model as an example. Referring to fig. 7, the embodiment of the application further discloses a method for setting a model color, which includes:
701. and determining a selected point according to an input instruction generated by the mouse, and determining a target ray according to the selected point and the virtual camera.
702. Dividing the model into a plurality of grids, and forming a multi-stage space cube according to the grids, wherein the i-th stage space cube comprises at least two i+1th stage space cubes, and i is a positive integer greater than or equal to 1;
703. The target ray is intersected with an i-th level space cube, and the first i-th level space cube intersecting the target ray is determined, wherein i is more than or equal to 1 and less than or equal to n.
704. And intersecting the target ray with an i+1st level space cube corresponding to the first i level space cube, and determining the first i+1st level space cube intersected with the target ray.
705. Whether i is smaller than n is determined, if yes, step 706 is executed, and if not, step 707 is executed.
706. i is incremented by 1, and then step 704 is performed.
707. The first nth level spatial cube intersecting the target ray is determined to be the target spatial cube.
708. And intersecting each grid in the target space cube with the target ray to obtain a first grid intersected with the target ray.
709. And determining an intersection point of the target ray and the grid, and taking the intersection point as a center point of the brush.
710. And determining the range of the brush according to the center point of the brush.
711. All grid vertices of the model are traversed to determine grid vertices that lie within the brush range.
712. And determining the color value of the grid vertex positioned in the range of the brush according to the color value corresponding to the brush.
713. And determining the color value of the grid according to the color value of the vertex of the grid.
It should be noted that, each mesh vertex has an initial value. For a mesh, taking a triangle mesh as an example, if only one mesh vertex is within the brush range and the color value is changed, then for the mesh, the color value of the mesh needs to be determined from the color values of the three mesh vertices.
The color value of the grid can be one, and the corresponding grid presents a single color, such as red, blue, etc.;
the color values of the grid can be multiple, and the corresponding grid presents a gradient color. The gradient color is automatically rendered by the system according to the color values of the grid vertices.
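The gradient rendering mentioned above corresponds to the barycentric interpolation a rasterizer performs across a triangle whose vertices carry different colors; a sketch under that assumption (names illustrative):

```python
def interpolate_color(vertex_colors, u, v):
    """Barycentric blend of three vertex colors, as a rasterizer does when
    filling a triangle with per-vertex colors (gradient fill). (u, v) locate
    the point inside the triangle; w = 1 - u - v weights the first vertex."""
    w = 1.0 - u - v
    c0, c1, c2 = vertex_colors
    return tuple(w * a + u * b + v * c for a, b, c in zip(c0, c1, c2))
```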
According to the method for setting the model color, the target ray is determined according to the selected point and the virtual camera, the first intersection point of the target ray and the model is used as the center point of the brush, and the model color in the range of the brush is determined according to the center point of the brush and the color value corresponding to the brush, so that the model color can be set at any position in the selected model through the brush.
The embodiment of the application also discloses a device for setting a model, see fig. 8, which is used in a virtual scene, wherein the virtual scene is provided with a virtual camera, and the device comprises:
a ray determination module 801 configured to determine a selected point according to an input instruction, and determine a target ray according to the selected point and a virtual camera;
a center point determining module 802 configured to determine a first intersection point of the target ray and the model, and take the first intersection point as a center point of a drawing control;
the model attribute value determining module 803 is configured to determine a model attribute value within the scope of the drawing control according to the center point of the drawing control and the attribute value corresponding to the drawing control.
Optionally, the ray determination module 801 is specifically configured to: and determining a selection point according to an input instruction generated by the mouse.
Optionally, the center point determination module 802 is specifically configured to:
dividing the model into a plurality of grids, and forming a multi-stage space cube according to the grids, wherein the i-th stage space cube comprises at least two i+1th stage space cubes, and i is a positive integer greater than or equal to 1;
intersecting the target ray with the space cube step by step, and determining the first space cube with the minimum level intersecting the target ray as a target space cube;
each grid in the target space cube is intersected with the target ray to obtain a first grid intersected with the target ray;
an intersection of the target ray with the grid is determined.
Optionally, the center point determination module 802 specifically includes:
a first intersection module configured to intersect the target ray with an i-th level spatial cube, determining a first i-th level spatial cube that intersects the target ray; wherein i is more than or equal to 1 and less than or equal to n;
a second intersection module configured to intersect the target ray with an i+1st level spatial cube corresponding to the first i level spatial cube, and determine a first i+1st level spatial cube intersected by the target ray;
the judging module is configured to judge whether i is smaller than n, and to trigger the self-increasing module if yes, or the determining module if not;
the self-increasing module is configured to increment i by 1 and then trigger the second intersection module;
a determination module configured to determine a first nth level spatial cube intersecting the target ray as a target spatial cube.
Optionally, the center point determination module 802 is specifically configured to: obtain the coordinates of the intersection point of the target ray with the grid through a predefined function.
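The embodiment does not specify which predefined function is used. Assuming the grids are triangles, one common choice for computing the intersection point of a ray with a triangle is the Möller-Trumbore test, sketched below (all names and the tuple-based representation are illustrative assumptions):

```python
def ray_triangle_intersection(orig, direction, v0, v1, v2, eps=1e-9):
    """Möller-Trumbore ray/triangle test: return the intersection point
    coordinates, or None if the ray misses the triangle."""
    def sub(a, b):
        return tuple(x - y for x, y in zip(a, b))

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    e1, e2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(direction, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:              # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, pvec) * inv_det   # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(direction, qvec) * inv_det  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv_det
    if t < eps:                     # intersection behind the ray origin
        return None
    return tuple(o + t * d for o, d in zip(orig, direction))
```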
Optionally, the model attribute value determining module 803 is specifically configured to:
determining a drawing control range according to the center point of the drawing control;
traversing all grid vertices of the model, and determining the grid vertices located within the drawing control range;
determining the attribute values of the grid vertices located within the drawing control range according to the attribute value corresponding to the drawing control;
and determining the attribute value of the grid according to the attribute values of the grid vertices.
Optionally, the model attribute value determining module 803 is specifically configured to: determine the attribute value of a grid vertex within the drawing control range according to the product of the attribute value corresponding to the drawing control, a weight coefficient, and the distance between the grid vertex and the center point of the drawing control.
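The optional scheme above can be illustrated with a minimal sketch that assigns each in-range vertex the product of the brush attribute value, a weight coefficient, and a distance-dependent factor. The linear falloff and all names are assumptions, since the embodiment does not fix the exact weighting formula:

```python
import math

def apply_brush(vertices, attrs, center, radius, brush_value, weight=1.0):
    """Assign new attribute values to mesh vertices inside the brush radius.
    The applied value is the product of the brush attribute value, a weight
    coefficient, and a distance-dependent factor (here a linear falloff)."""
    new_attrs = list(attrs)
    for idx, vertex in enumerate(vertices):
        d = math.dist(vertex, center)        # distance to the brush center point
        if d <= radius:                      # vertex lies within the brush range
            falloff = 1.0 - d / radius       # 1 at the center, 0 at the edge
            new_attrs[idx] = brush_value * weight * falloff
    return new_attrs
```

Vertices outside the radius keep their original attribute values, matching the requirement that only attributes within the drawing control range are modified.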
Wherein the attribute values include one or more of: a color value, a weight value, a bone weight value, and a hardness value.
According to the device for setting a model provided by the embodiments of the present application, the target ray is determined according to the selection point and the virtual camera; the first intersection point of the target ray with the model is taken as the center point of the drawing control; and the model attribute value within the drawing control range is determined according to the center point of the drawing control and the attribute value corresponding to the drawing control. In this way, model attribute values can be set at any position of the selected model through the drawing control.
The above is an illustrative scheme of the device for setting a model of the present embodiment. It should be noted that the technical solution of the device and the technical solution of the method for setting a model described above belong to the same concept; for details of the technical solution of the device that are not described in detail, reference may be made to the description of the technical solution of the method for setting a model.
An embodiment of the present application also provides a computer-readable storage medium storing computer instructions that, when executed by a processor, implement the steps of the method for setting a model as described above.
The above is an illustrative scheme of the computer-readable storage medium of the present embodiment. It should be noted that the technical solution of the storage medium and the technical solution of the method for setting a model belong to the same concept; for details of the technical solution of the storage medium that are not described in detail, reference may be made to the description of the technical solution of the method for setting a model.
The computer instructions include computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer-readable medium may be added or removed as appropriate according to the requirements of legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunication signals.
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations, but those skilled in the art should understand that the present application is not limited by the order of the actions described, since some steps may be performed in other orders or simultaneously according to the present application. Furthermore, those skilled in the art should also understand that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present application.
Each of the foregoing embodiments has its own emphasis in description; for parts of an embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
The preferred embodiments of the present application disclosed above are provided only to aid in the elucidation of the present application. The alternative embodiments are not intended to be exhaustive or to limit the application to the precise forms disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the application and its practical application, thereby enabling others skilled in the art to best understand and utilize the application. The application is to be limited only by the claims and their full scope and equivalents.