CN114443489A - Method and device for testing Scene gizmo geometry - Google Patents


Publication number
CN114443489A
Authority
CN
China
Prior art keywords
scene
coordinate
screen
geometric body
target
Prior art date
Legal status
Pending
Application number
CN202210111250.7A
Other languages
Chinese (zh)
Inventor
张鑫
林顺
Current Assignee
Xiamen Yaji Software Co Ltd
Original Assignee
Xiamen Yaji Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Xiamen Yaji Software Co Ltd filed Critical Xiamen Yaji Software Co Ltd
Priority to CN202210111250.7A
Publication of CN114443489A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04812: Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845: Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser


Abstract

Embodiments of the present application provide a method and an apparatus for testing Scene gizmo geometry, an electronic device, and a computer-readable storage medium, relating to the field of game development. The method comprises the following steps: first, a second scene coordinate of a target geometry is offset to obtain a first scene coordinate, and the first scene coordinate is then processed according to a preset calculation to obtain a first screen coordinate; next, the target geometry is moved to the first screen coordinate on the display screen through a simulated mouse event, and the Scene gizmo is called to update the position of the host node to a third scene coordinate according to the simulated mouse event. After a fourth scene coordinate of the host node is determined from the third scene coordinate, whether the target geometry functions normally is determined according to the fourth scene coordinate. Preprocessing the host node eliminates related interference, and the simulated mouse event supplies the drag operation in place of manual mouse operation, thereby achieving automated testing.

Description

Method and device for testing Scene gizmo geometry
Technical Field
The application relates to the technical field of game development, in particular to a method and device for testing Scene gizmo geometry, electronic equipment and a computer-readable storage medium.
Background
At present, the mainstream mode of game development is data-driven, so the efficiency of data editing directly affects the development efficiency of a project, and the Scene gizmo has emerged to meet this need. The Scene gizmo is becoming a standard companion tool of every engine editor, providing a set of data visualization tools that assist scene editing and debugging for game scenes.
Given the influence of the Scene gizmo on game development efficiency, how to test the functions of the Scene gizmo has also become an important research direction.
However, testing the functions of the Scene gizmo currently relies mainly on manual effort, and no more efficient way of testing them exists.
Disclosure of Invention
An object of the embodiments of the present application is to solve the above-mentioned problems.
According to an aspect of an embodiment of the present application, there is provided a method of testing Scene gizmo geometry, the method comprising:
preprocessing a selected host node;
determining a target geometry of the Scene gizmo and a second scene coordinate of the target geometry, and offsetting the second scene coordinate to obtain a first scene coordinate;
at a preset camera angle, dragging the target geometry to a first screen coordinate on a display screen through a simulated mouse event, and calling the Scene gizmo to update the position of the target geometry to a third scene coordinate according to the simulated mouse event, wherein the first screen coordinate is obtained by processing the first scene coordinate according to a preset calculation;
and determining a fourth scene coordinate of the host node from the third scene coordinate, and determining the functional state of the target geometry according to the fourth scene coordinate.
In a possible implementation, dragging the target geometry to the first screen coordinate on the display screen through a simulated mouse event may specifically include:
processing the second scene coordinate according to the preset calculation to obtain a second screen coordinate on the display screen;
and moving the target geometry from the second screen coordinate to the first screen coordinate through a simulated mouse event.
In a possible implementation, the display screen includes a current window of an editor, and processing the first scene coordinate according to the preset calculation to obtain the first screen coordinate may specifically include:
acquiring a third screen coordinate of the current window on the display screen;
acquiring the window coordinate and the height of the editing panel of the current scene within the current window;
and calculating the first screen coordinate from the window coordinate, the first scene coordinate, the third screen coordinate, and the height.
In a possible implementation, calling the Scene gizmo to update the position of the target geometry to the third scene coordinate according to the simulated mouse event may specifically include:
calling the Scene gizmo to capture the simulated mouse event, and acquiring the latest screen coordinate of the target geometry according to the simulated mouse event;
and calling the Scene gizmo to update the position of the target geometry to the third scene coordinate according to the latest screen coordinate.
In a possible implementation, the Scene gizmo includes an X axis, a Y axis, a Z axis, an xy patch, an xz patch, and a yz patch, and adjusting the preset camera angle includes:
if the target geometry is any one of the X axis, the Y axis, and the Z axis, adjusting the camera so that the target geometry is parallel to the display screen;
and if the target geometry is any one of the xy patch, the xz patch, and the yz patch, adjusting the camera so that the target geometry is perpendicular to the display screen.
In a possible implementation, offsetting the second scene coordinate to obtain the first scene coordinate may specifically include:
setting an offset coordinate according to the direction corresponding to the target geometry;
and offsetting the second scene coordinate by the offset coordinate to obtain the first scene coordinate.
In a possible implementation, the preprocessing operation includes placing the host node at the origin of the current scene and setting the rotation angle of the host node to an initial angle.
In a possible implementation, determining the fourth scene coordinate of the host node from the third scene coordinate and determining the functional state of the target geometry according to the fourth scene coordinate may specifically include:
calling the Scene gizmo to determine the fourth scene coordinate of the host node according to the third scene coordinate;
if the values of the fourth scene coordinate in all directions other than the direction of the target geometry are zero, determining that the drag direction of the target geometry is correct;
and if the difference between the fourth scene coordinate and the offset coordinate in the direction of the target geometry is within a preset range, determining that the functional state of the target geometry is normal.
According to another aspect of the embodiments of the present application, there is provided an apparatus for testing Scene gizmo geometry, the apparatus including:
a first processing module, configured to preprocess a selected host node;
a first determining module, configured to determine a target geometry of the Scene gizmo and a second scene coordinate of the target geometry, and offset the second scene coordinate to obtain a first scene coordinate;
a second processing module, configured to drag the target geometry to a first screen coordinate on a display screen through a simulated mouse event at a preset camera angle, and call the Scene gizmo to update the position of the target geometry to a third scene coordinate according to the simulated mouse event, wherein the first screen coordinate is obtained by processing the first scene coordinate according to a preset calculation;
a second determining module, configured to determine a fourth scene coordinate of the host node from the third scene coordinate;
and a third determining module, configured to determine whether the target geometry functions normally according to the fourth scene coordinate.
According to another aspect of the embodiments of the present application, there is provided an electronic device, which includes a memory, a processor and a computer program stored on the memory, wherein the processor executes the computer program to implement the steps of the method shown in the above aspect of the present application.
According to yet another aspect of embodiments of the present application, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of one of the above aspects of the present application.
The technical scheme provided by the embodiment of the application has the following beneficial effects:
the application provides a method for testing Scene gizmo geometric bodies, which comprises the steps of firstly carrying out offset processing on second Scene coordinates of a target geometric body to obtain first Scene coordinates, and then processing the first Scene coordinates according to a preset calculation mode to obtain first screen coordinates; and then moving the target geometric body to a first screen coordinate of the display screen through the simulated mouse event, and calling Scene gizmo to update the position of the host node to a third Scene coordinate according to the simulated mouse event. After the fourth scene coordinate of the host node is determined through the third scene coordinate, whether the function of the target geometric body is normal is determined according to the fourth scene coordinate. Relevant interference is eliminated through preprocessing operation on the host node, and then dragging operation is provided through simulating a mouse event, so that manual operation of a mouse is replaced, and automatic testing is achieved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments of the present application will be briefly described below.
FIG. 1 is a schematic structural diagram of a Scene gizmo in the prior art;
FIG. 2a is a schematic diagram of a test scenario of an automated test script written based on Spectron according to an embodiment of the present application;
FIG. 2b is a schematic diagram of a test scenario of another automated test script written based on Spectron according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of a method for testing Scene gizmo geometry according to an embodiment of the present disclosure;
FIG. 4 is a schematic structural diagram of an apparatus for testing Scene gizmo geometry according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below in conjunction with the drawings in the present application. It should be understood that the embodiments set forth below in connection with the drawings are exemplary descriptions for explaining technical solutions of the embodiments of the present application, and do not limit the technical solutions of the embodiments of the present application.
As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should be further understood that the terms "comprises" and/or "comprising", when used in this specification in connection with embodiments of the present application, specify the presence of stated features, information, data, steps, operations, elements, and/or components, but do not preclude the presence or addition of other features, information, data, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. The term "and/or" as used herein indicates at least one of the items it joins; for example, "A and/or B" may be implemented as "A", as "B", or as "A and B".
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The terms referred to in this application will first be introduced and explained:
electron is an open source framework on GitHub, and development of a desktop GUI application program across platforms is completed by using rendering engines of node.
The Cocos Creator editor: an application developed based on the Electron framework. The editor shown in the embodiments of the present application is configured with the Scene gizmo and includes a plurality of HTML panels, for example: a scene editor panel, a hierarchy manager panel, and an attribute checker panel. Scene editor panel: renders the scene based on the game engine. Hierarchy manager panel: the node hierarchy in the scene can be viewed, and a node can be selected by left-clicking it. Attribute checker panel: displays the attributes of the current node, such as position (coordinates) and rotation (rotation angle).
Game engine: the core component of a computer game system or of an interactive real-time graphics application. It provides game designers with the various tools needed to write games, with the goal of allowing designers to make game programs easily and quickly without starting from scratch.
Node: a basic building block for creating a game, with editable attributes.
Scene: a game scene is composed of a set of hierarchically organized (tree-like) nodes. A camera exists in the scene, and by changing the position of the camera, the rendering result of the scene can be viewed at different angles and orientations.
Scene gizmo: a set of data visualization tools for assisting scene editing and debugging, composed of various types of gizmo modules, such as a position gizmo for adjusting position. The Scene gizmo includes various geometries, such as the X axis, the Y axis, and the Z axis. The intersection point of the three axes is the node that can currently be operated interactively, and dragging any axis changes the position of that node. Because the node sits at the intersection of the X, Y, and Z axes, the interactively operable node is also referred to as the host node.
Referring to FIG. 1, a schematic structural diagram of a Scene gizmo in the prior art: in addition to the X axis, the Y axis, and the Z axis, the Scene gizmo may include an xy patch, an xz patch, and a yz patch. Dragging the Z axis moves the host node along the Z-axis direction; dragging the X axis moves the host node along the X-axis direction; dragging the Y axis moves the host node along the Y-axis direction; dragging the xy patch moves the host node within the plane of the X and Y axes; dragging the xz patch moves the host node within the plane of the X and Z axes; and dragging the yz patch moves the host node within the plane of the Y and Z axes.
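The mapping above can be sketched as a small constraint table. This is an illustrative sketch only; the names `MOVEMENT_MASKS` and `applyDrag` are assumptions for illustration and are not part of the patent or of any engine API:

```javascript
// Hypothetical sketch: constrain a raw drag delta to the axis or plane of the
// geometry being dragged, as FIG. 1 describes. Axis and patch names follow the patent.
const MOVEMENT_MASKS = {
  x:  [1, 0, 0], y:  [0, 1, 0], z:  [0, 0, 1],   // axes: one free direction
  xy: [1, 1, 0], xz: [1, 0, 1], yz: [0, 1, 1],   // patches: one free plane
};

// Apply a drag delta to the host node's position, zeroing the locked components.
function applyDrag(position, delta, geometry) {
  const mask = MOVEMENT_MASKS[geometry];
  return position.map((v, i) => v + delta[i] * mask[i]);
}
```

Dragging the Z axis, for example, keeps the x and y components of the host node unchanged regardless of where the mouse actually travels on screen.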
As stated in the Background, the Scene gizmo has a great effect on game development efficiency, and testing its functions has become an important research direction; at present, however, the functions of the Scene gizmo are tested mainly by hand, which is inefficient. The X axis, the Y axis, the Z axis, the xy patch, the xz patch, and the yz patch are common geometry functions, and how to test these functions is also the focus of the present application.
The present application provides a method, an apparatus, an electronic device, and a computer-readable storage medium for testing Scene gizmo geometry, aiming to solve the problem of how to test the Scene gizmo geometry.
The technical solutions of the embodiments of the present application, and the technical effects they produce, are described below through several exemplary embodiments. It should be noted that the following embodiments may refer to or be combined with each other, and identical terms, similar features, and similar implementation steps in different embodiments are not described repeatedly.
Referring to FIG. 2a, an embodiment of the present application provides a test scenario diagram of an automated test script written based on Spectron; the script is used to test whether a geometry functions normally. Spectron is the testing framework officially recommended for Electron, and Scene gizmo functions can be tested through it. The script also uses RobotJS, a GUI automation tool based on Node.js that can control the mouse and the keyboard, read the screen, and so on, in order to provide the mouse events and keyboard events that would be generated by manual operation of the mouse and keyboard.
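As a hedged illustration of how such a script can supply a drag, the sketch below emits a press, interpolated moves, and a release through an injected mouse backend; `simulateDrag` and its parameters are assumptions for illustration, not the patent's implementation. In a real script the backend could be the RobotJS `robot` module, which exposes `moveMouse(x, y)` and `mouseToggle(state)`:

```javascript
// Illustrative sketch: emit a press-move-release drag sequence through an
// injected mouse backend. Any object with moveMouse/mouseToggle methods works,
// so the same sequence can be driven by RobotJS or by a stub in a unit test.
function simulateDrag(mouse, from, to, steps = 10) {
  mouse.moveMouse(from.x, from.y);
  mouse.mouseToggle("down");
  for (let i = 1; i <= steps; i++) {
    // Interpolate so the editor receives intermediate mousemove events.
    mouse.moveMouse(
      from.x + ((to.x - from.x) * i) / steps,
      from.y + ((to.y - from.y) * i) / steps
    );
  }
  mouse.mouseToggle("up");
}
```

With RobotJS, the backend would simply be the `robot` module itself, since it provides `moveMouse` and `mouseToggle` with these shapes.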
The terminal environment involved in the script may include: the display screen of the terminal, the window of the editor displayed on the display screen (hereinafter referred to as the current window), and the scene editor panel in the current window (i.e., the editing panel in which the current scene is located). The display screen is configured with a screen coordinate system whose origin is the upper left corner of the display screen, namely the coordinate system identified by the X3 and Y3 axes; the current window is configured with a window coordinate system whose origin is the upper left corner of the current window, namely the coordinate system identified by the X2 and Y2 axes; the editing panel is configured with a screen space coordinate system whose origin is the lower left corner of the editing panel, namely the coordinate system identified by the X1 and Y1 axes; and the current scene is configured with a scene coordinate system, which is three-dimensional. Coordinates in the scene coordinate system can be converted into coordinates in the screen space coordinate system through the game engine of the editor, and the coordinates on the screen coordinate system are finally obtained from the converted coordinates through multi-step conversion. It should be noted that in FIGS. 1, 2a, and 2b, the origin of each coordinate system is represented by a black circle, and node A is represented by a smiley-face icon.
The window of the editor may specifically include: a scene editor panel (i.e., the editing panel in the embodiments of the present application), a hierarchy manager panel, and an attribute checker panel.
The original position, in the scene coordinate system, of the geometry to be moved (e.g., the Z axis of the Scene gizmo) is point a, and its target position is point b. Dragging the geometry is embodied as dragging it from the original position a to the position of b.
The implementation idea of the script is shown in flows S210 to S260:
and S210, selecting the host node.
Specifically, a node A is newly established in the current scene, and the node A is selected by clicking a left key in the hierarchical manager through RobotJS, and the node is the host node A. Further, the position (x, y, z) of the node a is set by the attribute checker as: (0, 0, 0), other information of the node a may also be set, for example, the rotation angle is set to (0, 0, 0). The node may be a rigid node configured with a mesh renderer, such as a capsule, a sphere, and the like, and the shape of the node is not limited, and the node may also be a null node without a mesh renderer in the current scene.
Specifically, to select node A through RobotJS, the location of node A in the hierarchy manager needs to be acquired first, and then the mouse is moved to that location and left-clicked through RobotJS, so that node A is selected. The position of node A can be obtained through the following process:
First, the width, height, and coordinates of the current window are acquired through an interface in the Spectron framework. Second, the width, height, and position of the editing panel are queried through an element selector (the editing panel is an HTML element, whose position and size information can be queried by an element selector). Next, the coordinates of node A in the scene coordinate system are acquired through the game engine and converted into coordinates in the screen space coordinate system. Finally, the screen coordinates of node A on the display screen are calculated from the screen space coordinates of node A, the position information and height of the editing panel, and the position information of the current window.
After the screen coordinates are calculated, that position is clicked through the simulated mouse, and node A is selected. After selection, position information may be entered through a simulated keyboard to adjust the coordinates and rotation angle of node A in the attribute checker. It should be noted that other information of node A, such as scale and layer, may also be adjusted in the attribute checker, and those skilled in the art may determine the attributes to be set according to their needs.
S220, adjust the camera of the current scene to a suitable angle.
Specifically, in the test script based on the Spectron framework, an interactive operation with the editor is added, so that the editor calls the camera interface in the current scene to change the position of the camera and obtain a suitable angle. A suitable angle means that the target geometry to be dragged is completely parallel or completely perpendicular to the display screen. Specifically: if the target geometry is the X axis, the Y axis, or the Z axis, the target geometry is parallel to the display screen; and if the target geometry is the xy, xz, or yz patch, the target geometry is perpendicular to the display screen.
In the present embodiment, the Z axis may be selected as the target geometry.
S230, acquire the screen coordinate of the original position, point a, of the target geometry on the display screen according to the configuration file.
Specifically, the screen space coordinate of point a is queried as p1(x1, y1); the window coordinate of the editing panel relative to the current window is queried as p2(x2, y2); and the screen coordinate of the current window relative to the display screen is queried as p3(x3, y3), where the height of the scene editor panel is h. The actual coordinate of point a on the display screen is then obtained from p1, p2, p3, and h as: (x1 + x2 + x3, y2 + y3 + h - y1).
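The calculations in S230 (and S240 below) can be sketched as one pure function; the name `panelToScreen` is an assumption for illustration. The y term is flipped using the panel height h because the screen space origin is at the lower left corner of the editing panel while the screen origin is at the upper left:

```javascript
// Sketch of the S230/S240 conversion. p1: screen space coordinate inside the
// editing panel; p2: panel coordinate within the current window; p3: window
// coordinate on the display screen; h: height of the editing panel.
function panelToScreen(p1, p2, p3, h) {
  return {
    x: p1.x + p2.x + p3.x,        // all x axes grow rightward, so they add
    y: p2.y + p3.y + h - p1.y,    // flip y: panel y grows upward from the bottom
  };
}
```

The same function serves both point a (with p1, p2, p3) and point b (with p4, p5, p6), since the two calculations have identical form.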
S240, acquire the screen coordinate of the target position, point b, of the target geometry on the display screen according to the configuration file.
Specifically, an offset coordinate is set in the configuration file, and the scene coordinate of point a is processed according to the offset coordinate to obtain the scene coordinate of the target position, point b. Then, the scene coordinate of point b is converted into the screen space coordinate p4(x4, y4) through the engine of the editor; the window coordinate of the editing panel in the current window is queried as p5(x5, y5); and the screen coordinate of the current window on the display screen is queried as p6(x6, y6). The coordinate of the target position, point b, on the display screen is obtained from p4, p5, p6, and h as: (x4 + x5 + x6, y5 + y6 + h - y4).
If the current window has not changed relative to the display screen, p6 is the same as p3; and if the editing panel has not changed relative to the current window, p5 is the same as p2.
Illustratively, the scene coordinate of point a is p7(0, 0, 10), and the screen coordinate of point a on the display screen is p7'(200, 200). If the target geometry is the Z axis, the offset coordinate is set to (0, 0, 1); the scene coordinate of point a is processed according to the offset coordinate to obtain the scene coordinate of point b as p8(0, 0, 11), and the screen coordinate of point b on the display screen is obtained through the above calculation as p8'(190, 200).
S250, drag the target geometry from point a to point b through the simulated mouse.
Specifically, the original coordinate of the mouse, p7', and the target coordinate of the mouse, p8', are obtained. A mouse drag event from point a to point b is provided, and the Z axis is dragged from point a to point b. In an ideal drag, the coordinates change as follows: the screen coordinate changes from p7' to p8', and the scene coordinate changes from p7 to p8.
S260, compare the attributes of node A displayed in the attribute checker, and determine whether the position of the dragged node has changed toward the expected position.
Specifically, after the target geometry is dragged, the Scene gizmo determines the real coordinate p9 of the target geometry in the scene coordinate system from its real coordinate p9' on the display screen, and determines the moved coordinate of node A from p9.
Referring to FIG. 2b, for example, the coordinate of node A shown in the attribute checker panel is (0, 0, 0.965). Relative to the offset coordinate (0, 0, 1), the Z-axis direction adjustment of the Scene gizmo is accurate, but there is a small error in the drag distance.
Because the calculation is difficult to avoid, errors exist, the test result is that the obtained scene coordinate of the node A changes towards the direction of the target geometric body, namely the function of the Z axis is accurate in direction. Although the offset value set in the Z-axis direction is 1, and the actual offset value of the Z-axis is 0.965, the error is 0.035. If 0.035 is within the allowable error range, it can be determined that the functional status of the Z-axis is normal. The reasons for error include, but are not limited to, discrepancies in the Spectron return position, discrepancies in the RobotJS framework operating the mouse, and so on.
Therefore, when the direction of displacement is as expected, a deviation range may be set to determine whether the functional state of the Z axis is normal.
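The deviation-range judgement described above reduces to a one-line check. A minimal sketch, assuming an illustrative tolerance of 0.05; the function name is not from the patent:

```javascript
// Compare the expected offset along the dragged axis with the offset actually
// observed on the host node, and accept the result if the error is within range.
function withinTolerance(expectedOffset, actualOffset, tolerance) {
  return Math.abs(expectedOffset - actualOffset) <= tolerance;
}

// Z axis: offset set to 1, node actually moved 0.965 -> error 0.035
console.log(withinTolerance(1, 0.965, 0.05)); // true: functional state normal
console.log(withinTolerance(1, 0.8, 0.05));   // false: outside the allowed error
```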
In the above process of testing the Z axis of Scene gizmo, when the other geometries are tested, the camera angle is adjusted according to the above steps and corresponding offset coordinates are set. When the three patches of Scene gizmo are tested, the camera angle needs to be adjusted so that each patch is perpendicular to the screen direction, and offset coordinates are set.
It should be noted that, in the above embodiments, when describing the position of the Z axis, the coordinates of the arrow on the Z axis are used to correspond to the position of the Z axis. When describing the positions of the Y-axis and the X-axis, they describe the positions in the same manner. When describing the position of a patch, since the patch is square, the center point of the square can be used to correspond to the position of the patch.
Referring to fig. 3, a flowchart of a method for testing Scene gizmo geometry is provided in the embodiment of the present application, and is applied to a terminal, where a display screen, an editor, and an editing panel on the editor are configured as shown in fig. 2a, and a current Scene is displayed on the editing panel. The method comprises the following steps:
and S310, preprocessing the selected host node.
In a possible implementation manner, the preprocessing operation includes setting the origin of the host node in the current scene and the rotation angle of the host node as an initial angle, and the setting operation can make the testing process more intuitive and concise.
Specifically, the coordinates (X, Y, Z) of the host node are set to (0, 0, 0), and the rotation angles of the host node about the X axis, the Y axis, and the Z axis are respectively 0, 0, and 0. In the attribute checker panel, x is 0, y is 0, and z is 0.
For example, when the coordinates of the host node are set as the origin (0, 0, 0), if the Z-axis of Scenegizmo is dragged, the expected result is: the z value in the coordinates of the host node is updated, and the x value and the y value are both kept to be 0. If this is not the case, for example, the x and/or y values are greater than or less than 0, it is immediately apparent that the Z axis is not functioning properly. Because the testing process is intuitive, a user can see the final coordinates of the host node through the attribute checker panel on the editor, and can quickly determine whether the Z-axis functional state is normal.
Specifically, after the editor is started, a new node may be created in the current scene; node A may then be selected by invoking RobotJS to click it in the hierarchy manager of the current window. After selection, node A serves as the host node.
S320, determining a target geometric body of Scene gizmo and a second Scene coordinate of the target geometric body, and performing offset processing on the second Scene coordinate to obtain a first Scene coordinate.
The Scene gizmo comprises a plurality of geometries, such as the X axis, Y axis, Z axis, xy patch, xz patch, and yz patch. Dragging different geometries adjusts the position information of the host node from different directions.
A three-dimensional coordinate system is configured in the current scene, and can be called as a scene coordinate system; in the embodiment of the application, the first scene coordinate, the second scene coordinate and the third scene coordinate are all coordinates in a scene coordinate system. The current scene is located in an editing panel, and the editing panel is also provided with a two-dimensional coordinate system which can be called as a screen space coordinate system; in the embodiment of the application, the screen space coordinate is a coordinate on a screen space coordinate system. The editing panel is in the current window of the editor, and the current window is also configured with a two-dimensional coordinate system, which may be referred to as a window coordinate system in the embodiment of the present application. It should be noted that the coordinates in the scene coordinate system can be converted into the coordinates in the screen space coordinate system through an engine in the editor, and the specific conversion manner may refer to the prior art, and is not described herein again for simplicity and convenience of description.
In a possible implementation manner, the shifting the second scene coordinates to obtain the first scene coordinates includes:
setting an offset coordinate according to the direction corresponding to the target geometry; and carrying out offset processing on the second scene coordinate according to the offset coordinate to obtain the first scene coordinate. Specifically, the direction corresponding to the target geometry may be: if the target geometry is an axis, the direction of that axis; if the target geometry is a patch, the direction perpendicular to the patch.
The offset coordinates may exemplarily refer to the offset coordinates (0, 0, 1) in the above-described embodiment; the first scene coordinates and the second scene coordinates may refer to p7(0, 0, 10) and p8(0, 0, 10+1), respectively.
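S320 can be sketched as follows. The direction table and the unit step of 1 are assumptions for illustration only; per the description above, an axis is offset along itself and a patch along the direction perpendicular to it:

```javascript
// Offset coordinate chosen per target geometry (assumed unit step of 1):
// axes offset along themselves, patches along their perpendicular direction.
const OFFSET_BY_GEOMETRY = {
  x:  [1, 0, 0],  y:  [0, 1, 0],  z:  [0, 0, 1],
  xy: [0, 0, 1],  xz: [0, 1, 0],  yz: [1, 0, 0]
};

// Second scene coordinate + offset coordinate -> first scene coordinate.
function applyOffset(secondSceneCoord, offsetCoord) {
  return secondSceneCoord.map((v, i) => v + offsetCoord[i]);
}

// The p7 -> p8 example: (0, 0, 10) offset by the Z-axis coordinate (0, 0, 1).
console.log(applyOffset([0, 0, 10], OFFSET_BY_GEOMETRY.z)); // [ 0, 0, 11 ]
```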
S330, under the preset camera angle, dragging the target geometric body to a first screen coordinate of a display screen through a simulated mouse event, and calling Scene gizmo to update the position of the target geometric body to a third scene coordinate according to the simulated mouse event, wherein the first screen coordinate is obtained by processing the first scene coordinate according to a preset calculation mode.
In one possible implementation, Scene gizmo includes an X-axis, a Y-axis, a Z-axis, an xy patch, an xz patch, and a yz patch, and the adjusting operation of the preset camera angle may include:
if the target geometric body is any axis of the X axis, the Y axis and the Z axis, adjusting the camera to enable the target geometric body to be parallel to the display screen; and if the target geometric body is any patch of an xy patch, an xz patch and a yz patch, adjusting the camera to enable the target geometric body to be vertical to the display screen.
In particular, testing the function of a target geometry of Scene gizmo involves coordinate conversion. If the tested target geometry (e.g., the X axis, Y axis, or Z axis) is not parallel to the display screen, or the target geometry (an xy, xz, or yz patch) is not perpendicular to the display screen, an error may occur in the coordinate conversion, which in turn affects the third scene coordinate and ultimately the test result. Therefore, the functional state of any axis should be tested with that axis parallel to the display screen, and that of any patch with the patch perpendicular to the display screen.
Exemplarily referring to S240, S250, S260 in the above embodiments, the first screen coordinate may refer to the screen coordinate p 8', and the third scene coordinate may refer to the real coordinate p9 of the target geometry in the scene coordinate system.
S340, determining a fourth scene coordinate of the host node through the third scene coordinate, and determining whether the function of the target geometry body is normal according to the fourth scene coordinate.
And the fourth scene coordinate is the scene coordinate of the host node in the current scene. In addition, the fourth scene coordinate is displayed on the attribute checker panel of the editor and can be viewed by the user. And dragging the Scene gizmo target geometry by the simulated mouse, obtaining a third Scene coordinate by analyzing the dragged mouse event by the Scene gizmo target geometry, and transmitting the third Scene coordinate to the host node to update the coordinate data of the host node to obtain a fourth Scene coordinate of the host node. It should be noted that the principle of simulating the mouse to operate the target geometry so as to update the position of the host node is as follows: the relative positions of the target geometry and the host node are always unchanged, so that if the position of the target geometry is updated, in order to ensure that the relative position between the target geometry and the host node is unchanged, the coordinates of the host node need to be modified according to the same offset.
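The invariant stated here — the target geometry and the host node keep their relative position, so the node receives the same offset as the gizmo — can be sketched as follows (function and parameter names are assumed):

```javascript
// Apply to the host node the same displacement the target geometry underwent,
// keeping the relative position between gizmo and node unchanged.
function updateHostNode(nodeCoord, gizmoOldCoord, gizmoNewCoord) {
  return nodeCoord.map((v, i) => v + (gizmoNewCoord[i] - gizmoOldCoord[i]));
}

// Gizmo moved from (0, 0, 10) to (0, 0, 11) -> node at the origin moves by the same delta.
console.log(updateHostNode([0, 0, 0], [0, 0, 10], [0, 0, 11])); // [ 0, 0, 1 ]
```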
The application provides a method for testing Scene gizmo geometric bodies, which comprises the steps of firstly carrying out offset processing on second Scene coordinates of a target geometric body to obtain first Scene coordinates, and then processing the first Scene coordinates according to a preset calculation mode to obtain first screen coordinates; and then moving the target geometric body to a first screen coordinate of the display screen through the simulated mouse event, and calling Scene gizmo to update the position of the host node to a third Scene coordinate according to the simulated mouse event. After the fourth scene coordinate of the host node is determined through the third scene coordinate, whether the function of the target geometric body is normal is determined according to the fourth scene coordinate. Relevant interference is eliminated through preprocessing operation on the host node, and then dragging operation is provided through simulating a mouse event, so that manual operation of a mouse is replaced, and automatic testing is achieved.
The embodiment of the present application further provides a possible implementation manner, where dragging the target geometry to the first screen coordinate of the display screen by simulating a mouse event may specifically include:
processing the second scene coordinates according to a preset calculation mode to obtain second screen coordinates in the display screen; the target geometry is moved from the second screen coordinate to the first screen coordinate by simulating a mouse event.
Specifically, an engine of the editor is called to convert the second scene coordinate into a second screen space coordinate on the screen space coordinate system; the window coordinate of the editing panel on the window coordinate system and the height of the editing panel are acquired; and a third screen coordinate of the current window on the screen coordinate system is acquired. Finally, the second screen coordinate is calculated from the second screen space coordinate, the window coordinate, the third screen coordinate, and the height of the editing panel. Regarding the preset calculation mode, reference may also be made to S230 and S240 in the above embodiments; for example, the second screen space coordinate, the window coordinate, and the third screen coordinate correspond in sequence to p1, p2, and p3, and the height of the editing panel to h.
In a possible implementation manner, the display screen includes a current window of the editor, and processing the first scene coordinates according to a preset calculation mode to obtain the first screen coordinates may include:
acquiring a third screen coordinate of the current window in the display screen; acquiring window coordinates and height of an editing panel of a current scene in a current window; and calculating according to the window coordinate, the first scene coordinate, the third screen coordinate and the height to obtain a first screen coordinate.
Specifically, an engine of the editor is called to convert the three-dimensional first scene coordinate into a two-dimensional first screen space coordinate on the screen space coordinate system; the window coordinate of the editing panel on the window coordinate system and the height of the editing panel are acquired; and a third screen coordinate of the current window on the screen coordinate system is acquired. Finally, the first screen coordinate is calculated from the first screen space coordinate, the window coordinate, the third screen coordinate, and the height of the editing panel. Regarding the preset calculation mode, reference may also be made to S230 and S240 in the above embodiments; for example, the first screen space coordinate, the window coordinate, and the third screen coordinate correspond in sequence to p4, p5, and p6, and the height of the editing panel to h.
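The exact formulas of S230/S240 are referenced but not reproduced in this passage, so the composition below is only one plausible sketch. It assumes the engine's screen-space y axis grows upward from the bottom of the editing panel (hence the h - y flip) while window and screen coordinates grow downward; all names are illustrative.

```javascript
// Compose a scene point already converted to screen space (p4) with the editing
// panel's window coordinate (p5), the current window's screen coordinate (p6),
// and the panel height h, to obtain the coordinate on the display screen.
function toScreenCoord(spaceCoord, panelWindowCoord, windowScreenCoord, panelHeight) {
  return {
    x: windowScreenCoord.x + panelWindowCoord.x + spaceCoord.x,
    y: windowScreenCoord.y + panelWindowCoord.y + (panelHeight - spaceCoord.y)
  };
}

const p = toScreenCoord({ x: 50, y: 30 }, { x: 10, y: 20 }, { x: 100, y: 40 }, 400);
console.log(p); // { x: 160, y: 430 }
```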
Specifically, the simulated mouse events may include: simulating a mouse click event to realize the selection of the target geometry; simulating a mouse dragging event to realize dragging of the selected target geometry; a mouse deselect event is simulated to effect deselection after the selected target geometry reaches the first screen coordinate.
In a possible implementation manner, the calling Scene gizmo to update the position of the target geometry to the third Scene coordinate according to the simulated mouse event may specifically include:
calling Scene gizmo to capture a simulated mouse event, and acquiring the latest position data of the target geometric body according to the simulated mouse event; and calling Scene gizmo to update the position of the target geometric body to the third Scene coordinate according to the latest position data.
Specifically, there is a module in Scene gizmo for capturing simulated mouse events, the most critical of which is the position data of the mouse. Therefore, after the position data is acquired, the Scenegizmo is called to update the position of the target geometric body according to the position data, and finally, the position data of the host node is updated according to the updated position data of the target geometric body.
The embodiment of the present application further provides a possible implementation manner, where the obtaining of the fourth scene coordinate of the host node through the third scene coordinate specifically includes:
and calling the Scenegizmo to determine the fourth scene coordinate of the host node according to the third scene coordinate.
In a possible implementation manner, determining whether the function of the target geometry is normal according to the fourth scene coordinate may specifically include:
if the values in the directions except the direction of the target geometric body in the fourth scene coordinate are zero, determining that the dragging direction of the target geometric body is correct; and if the difference value of the fourth scene coordinate and the offset coordinate in the direction of the target geometric body is within a preset range, determining that the function of the target geometric body is normal.
Specifically, the embodiment of the present application specifies a desired direction and drags the target geometry in that direction; therefore, the key to determining that the target geometry functions normally is whether it can be dragged in the desired direction. If the dragging direction matches the desired direction, it is further determined whether the error between the dragged distance and the desired distance is within a preset range; if so, the dragging function of the target geometry is determined to be normal. The desired direction may be: if the target geometry is an axis, the direction of that axis; if the target geometry is a patch, the direction perpendicular to the patch.
Since the initial position of the host node is set as the origin, after the target geometry is dragged in its own direction, the values of the first scene coordinate in the directions other than that of the target geometry are zero. For example, when the target geometry is the Z axis, the drag direction is the Z-axis direction: the Z value in the coordinates of the host node will not be zero, while the values in the X-axis and Y-axis directions remain zero.
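Combining the two conditions — the other directions stay zero, and the drag distance is within the preset error range — gives the final judgement of S340. A sketch with assumed names, where axisIndex 2 denotes the Z axis:

```javascript
// Functional-state judgement for an axis test: every component of the fourth
// scene coordinate except the dragged axis must be zero, and the distance
// actually moved must differ from the set offset by no more than maxError.
function geometryFunctionNormal(fourthSceneCoord, offsetCoord, axisIndex, maxError) {
  const directionCorrect = fourthSceneCoord.every((v, i) => i === axisIndex || v === 0);
  const distanceOk =
    Math.abs(fourthSceneCoord[axisIndex] - offsetCoord[axisIndex]) <= maxError;
  return directionCorrect && distanceOk;
}

// Node A ended at (0, 0, 0.965) after a (0, 0, 1) Z-axis offset: error 0.035.
console.log(geometryFunctionNormal([0, 0, 0.965], [0, 0, 1], 2, 0.05));   // true
console.log(geometryFunctionNormal([0.1, 0, 0.965], [0, 0, 1], 2, 0.05)); // false: drifted on X
```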
In addition, determining whether the target geometry functions normally according to the fourth scene coordinates may also refer to step S260 in the above-described embodiment.
Referring to fig. 4, an embodiment of the present application provides an apparatus for testing Scene gizmo geometry, where the apparatus 400 may specifically include:
a first processing module 410, configured to pre-process the selected host node; the first determining module 420 is configured to determine a target geometric body of Scene gizmo and a second Scene coordinate of the target geometric body, and perform offset processing on the second Scene coordinate to obtain a first Scene coordinate; the second processing module 430 is configured to, at a preset camera angle, drag the target geometry to a first screen coordinate of the display screen through a simulated mouse event, and call Scene gizmo to update the position of the target geometry to a third Scene coordinate according to the simulated mouse event, where the first screen coordinate is obtained by processing the first Scene coordinate in a preset calculation manner; a second determining module 440, configured to determine a fourth scene coordinate of the host node according to the third scene coordinate; a third determining module 450, configured to determine a functional state of the target geometry according to the fourth scene coordinates.
In one possible implementation, the second processing module 430 is specifically configured to, in dragging the target geometry to the first screen coordinate of the display screen by simulating a mouse event:
processing the second scene coordinate according to a preset calculation mode to obtain a second screen coordinate in the display screen; the target geometry is moved from the second screen coordinate to the first screen coordinate by simulating a mouse event.
In a possible implementation manner, the display screen includes a current window of the editor, and the apparatus 400 further includes a coordinate scaling module 460 (not shown in the figure), which is specifically configured to, in processing the first scene coordinates according to a preset calculation mode to obtain first screen coordinates:
acquiring a third screen coordinate of the current window in the display screen;
acquiring window coordinates and height of an editing panel of a current scene in a current window;
and calculating according to the window coordinate, the first scene coordinate, the third screen coordinate and the height to obtain a first screen coordinate.
In one possible implementation, the second processing module 430, in invoking Scene gizmo to update the position of the target geometry to the third Scene coordinate according to the simulated mouse event, is specifically configured to:
calling Scene gizmo to capture a simulated mouse event, and acquiring the latest screen coordinate of the target geometric body according to the simulated mouse event; and calling Scene gizmo to update the position of the target geometric body to the third Scene coordinate according to the latest screen coordinate.
In a possible implementation manner, the Scene gizmo includes an X axis, a Y axis, a Z axis, an xy patch, an xz patch, and a yz patch, and under the preset camera angle, the first processing module 410 is further configured to, in an adjustment operation of the preset camera angle:
if the target geometric body is any axis of the X axis, the Y axis and the Z axis, adjusting the camera to enable the target geometric body to be parallel to the display screen; and if the target geometric body is any patch of an xy patch, an xz patch and a yz patch, adjusting the camera to enable the target geometric body to be vertical to the display screen.
In a possible implementation manner, the first determining module 420, in obtaining the first scene coordinate by performing offset processing on the second scene coordinate, is specifically configured to:
setting an offset coordinate according to the direction corresponding to the target geometric body; and carrying out offset processing on the second scene coordinates according to the offset coordinates to obtain first scene coordinates.
In a possible implementation manner, the first processing module 410 is specifically configured to, in the preprocessing operation:
and setting the origin of the host node in the current scene and the rotation angle of the host node as an initial angle.
In a possible implementation manner, the second determining module 440, in determining the fourth scene coordinates of the host node through the third scene coordinates, is specifically configured to:
and calling the Scenegizmo to determine the fourth scene coordinate of the host node according to the third scene coordinate.
In a possible implementation, the third determining module 450 is specifically configured to, in determining the functional state of the target geometry according to the fourth scene coordinates:
if the values in the directions except the direction of the target geometric body in the fourth scene coordinate are zero, determining that the dragging direction of the target geometric body is correct; and if the difference value of the fourth scene coordinate and the offset coordinate in the direction of the target geometric body is within a preset range, determining that the functional state of the target geometric body is normal.
An embodiment of the present application provides an electronic device, which includes a memory, a processor, and a computer program stored on the memory, where the processor executes the computer program to implement the steps of the method in the foregoing embodiment.
Referring to fig. 5, an embodiment of the present application provides a schematic structural diagram of an electronic device. The electronic device 5000 shown in fig. 5 includes: a processor 5001 and a memory 5003. The processor 5001 and the memory 5003 are coupled, such as via a bus 5002. Optionally, the electronic device 5000 may further include a transceiver 5004, and the transceiver 5004 may be used for data interaction between the electronic device and other electronic devices, such as transmission of data and/or reception of data. It should be noted that the transceiver 5004 is not limited to one in practical application, and the structure of the electronic device 5000 is not limited to the embodiment of the present application.
The Processor 5001 may be a CPU (Central Processing Unit), a general-purpose Processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or other Programmable logic device, a transistor logic device, a hardware component, or any combination thereof. Which may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the disclosure. The processor 5001 may also be a combination of processors implementing computing functionality, e.g., a combination comprising one or more microprocessors, a combination of DSPs and microprocessors, or the like.
Bus 5002 can include a path that conveys information between the aforementioned components. The bus 5002 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus 5002 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 5, but this is not intended to represent only one bus or type of bus.
The Memory 5003 may be a ROM (Read Only Memory) or other types of static storage devices that can store static information and instructions, a RAM (Random Access Memory) or other types of dynamic storage devices that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read Only Memory), a CD-ROM (Compact Disc Read Only Memory) or other optical Disc storage, optical Disc storage (including Compact Disc, laser Disc, optical Disc, digital versatile Disc, blu-ray Disc, etc.), a magnetic Disc storage medium, other magnetic storage devices, or any other medium that can be used to carry or store a computer program and that can be Read by a computer, without limitation.
The memory 5003 is used for storing computer programs for executing the embodiments of the present application, and is controlled by the processor 5001 for execution. The processor 5001 is configured to execute computer programs stored in the memory 5003 to implement the steps shown in the foregoing method embodiments.
Among them, electronic devices include but are not limited to: computer devices, server devices, etc.
Embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, and when being executed by a processor, the computer program may implement the steps and corresponding contents of the foregoing method embodiments.
The terms "first," "second," "third," "fourth," "1," "2," and the like in the description and in the claims of the present application and in the above-described drawings (if any) are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used are interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in other sequences than illustrated or otherwise described herein.
It should be understood that, although each operation step is indicated by an arrow in the flowchart of the embodiment of the present application, the implementation order of the steps is not limited to the order indicated by the arrow. In some implementation scenarios of the embodiments of the present application, the implementation steps in the flowcharts may be performed in other sequences as desired, unless explicitly stated otherwise herein. In addition, some or all of the steps in each flowchart may include multiple sub-steps or multiple stages based on an actual implementation scenario. Some or all of these sub-steps or stages may be performed at the same time, or each of these sub-steps or stages may be performed at different times, respectively. In a scenario where execution times are different, an execution sequence of the sub-steps or the phases may be flexibly configured according to requirements, which is not limited in the embodiment of the present application.
The foregoing is only an optional implementation manner of a part of implementation scenarios in this application, and it should be noted that, for those skilled in the art, other similar implementation means based on the technical idea of this application are also within the protection scope of the embodiments of this application without departing from the technical idea of this application.

Claims (11)

1. A method of testing Scene gizmo geometry, the method comprising:
preprocessing the selected host node;
determining a target geometric body of Scene gizmo and a second Scene coordinate of the target geometric body, and performing offset processing on the second Scene coordinate to obtain a first Scene coordinate;
dragging the target geometric body to a first screen coordinate of a display screen through a simulated mouse event under a preset camera angle, and calling the Scene gizmo to update the position of the target geometric body to a third scene coordinate according to the simulated mouse event, wherein the first screen coordinate is obtained by processing the first scene coordinate according to a preset calculation mode;
and determining a fourth scene coordinate of the host node through the third scene coordinate, and determining the functional state of the target geometric body according to the fourth scene coordinate.
2. The method of claim 1, wherein dragging the target geometry to a first screen coordinate of a display screen by simulating a mouse event comprises:
processing the second scene coordinate according to the preset calculation mode to obtain a second screen coordinate in the display screen;
moving the target geometry from the second screen coordinate to the first screen coordinate via the simulated mouse event.
3. The method of claim 1, wherein the display screen includes a current window of an editor, and wherein processing the first scene coordinates according to a preset calculation mode to obtain first screen coordinates comprises:
acquiring a third screen coordinate of the current window in the display screen;
acquiring window coordinates and height of an editing panel of the current scene in the current window;
and calculating according to the window coordinate, the first scene coordinate, the third screen coordinate and the height to obtain the first screen coordinate.
4. The method of claim 1, wherein the invoking Scene gizmo to update the position of the target geometry to a third Scene coordinate according to the simulated mouse event comprises:
calling the Scene gizmo to capture the simulated mouse event, and acquiring the latest screen coordinate of the target geometric body according to the simulated mouse event;
calling the Scene gizmo to update the position of the target geometry to the third Scene coordinate according to the latest screen coordinate.
5. The method according to any one of claims 1-4, wherein the Scene gizmo comprises an X-axis, a Y-axis, a Z-axis, an xy patch, an xz patch, and a yz patch, and the adjusting operation of the preset camera angle comprises:
if the target geometric body is any axis of an X axis, a Y axis and a Z axis, adjusting the camera to enable the target geometric body to be parallel to the display screen;
and if the target geometry is any patch of an xy patch, an xz patch and a yz patch, adjusting the camera to enable the target geometry to be vertical to the display screen.
6. The method of claim 5, wherein the shifting the second scene coordinates to obtain first scene coordinates comprises:
setting an offset coordinate according to the direction corresponding to the target geometric body;
and carrying out offset processing on the second scene coordinate according to the offset coordinate to obtain the first scene coordinate.
7. The method of claim 1, wherein the preprocessing operation comprises setting the host node at an origin of the current scene and a rotation angle of the host node to an initial angle.
8. The method of claim 1, wherein the obtaining fourth scene coordinates of the host node from the third scene coordinates and determining the functional state of the target geometry according to the fourth scene coordinates comprises:
calling the Scenegizmo to determine a fourth scene coordinate of the host node according to the third scene coordinate;
if the values in the directions except the direction of the target geometric body in the fourth scene coordinate are zero, determining that the dragging direction of the target geometric body is correct;
and if the difference value of the fourth scene coordinate and the offset coordinate in the direction of the target geometric body is within a preset range, determining that the functional state of the target geometric body is normal.
9. An apparatus for testing Scene gizmo geometry, the apparatus comprising:
the first processing module is used for preprocessing the selected host node;
the first determining module is used for determining a target geometric body of the Scene gizmo and a second scene coordinate of the target geometric body, and performing offset processing on the second scene coordinate to obtain a first scene coordinate;
the second processing module is used for dragging the target geometric body to a first screen coordinate of a display screen through a simulated mouse event under a preset camera angle, and calling the Scene gizmo to update the position of the target geometric body to a third scene coordinate according to the simulated mouse event, wherein the first screen coordinate is obtained by processing the first scene coordinate according to a preset calculation mode;
the second determining module is used for determining a fourth scene coordinate of the host node from the third scene coordinate;
and the third determining module is used for determining whether the function of the target geometric body is normal or not according to the fourth scene coordinate.
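The second processing module above needs two pieces: a scene-to-screen conversion (the patent leaves its "preset calculation mode" unspecified) and a replayed mouse-event sequence. The sketch below is a hypothetical stand-in: a simple orthographic mapping substitutes for a real camera's view-projection transform, and `dispatch` stands in for whatever event-injection API the editor exposes.

```typescript
// Hypothetical sketch of the second processing module (all names assumed).
type Vec2 = { x: number; y: number };
type Vec3 = { x: number; y: number; z: number };

// Orthographic top-view mapping from scene space to screen pixels; a real
// editor would use the camera's view-projection matrix instead.
function sceneToScreen(
  p: Vec3,
  screenW: number,
  screenH: number,
  unitsPerPixel: number
): Vec2 {
  return {
    x: screenW / 2 + p.x / unitsPerPixel,
    y: screenH / 2 - p.y / unitsPerPixel, // screen y grows downward
  };
}

// Replay the minimal drag gesture: press at the gizmo handle, move to the
// target screen coordinate, release.
function simulateDrag(
  from: Vec2,
  to: Vec2,
  dispatch: (type: string, pos: Vec2) => void
): void {
  dispatch("mousedown", from);
  dispatch("mousemove", to);
  dispatch("mouseup", to);
}
```

Driving the gizmo through synthetic events rather than calling its move API directly is what lets the test cover the full pick-drag-update path that a user would exercise.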
10. An electronic device comprising a memory, a processor and a computer program stored on the memory, characterized in that the processor executes the computer program to implement the steps of the method according to any one of claims 1 to 8.
11. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 8.
CN202210111250.7A 2022-01-29 2022-01-29 Method and device for testing Scene gizmo geometry Pending CN114443489A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210111250.7A CN114443489A (en) 2022-01-29 2022-01-29 Method and device for testing Scene gizmo geometry


Publications (1)

Publication Number Publication Date
CN114443489A true CN114443489A (en) 2022-05-06

Family

ID=81370912

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210111250.7A Pending CN114443489A (en) 2022-01-29 2022-01-29 Method and device for testing Scene gizmo geometry

Country Status (1)

Country Link
CN (1) CN114443489A (en)

Similar Documents

Publication Publication Date Title
JP6551184B2 (en) Simulation apparatus, simulation method, and simulation program
CN111300416B (en) Modularized reconfigurable robot planning simulation method and system based on augmented reality
US8676723B2 (en) Automated test system based on three-dimensional application software framework and a method thereof
US5940296A (en) Method and system for interactively developing a graphical control-flow structure and associated application software for use in a machine vision system
US7561164B2 (en) Texture map editing
CN103106077A (en) Machine vision system
WO2000067120A1 (en) Method and system for interactively developing graphical control-flow structure and associated software for machine vision system
WO2020003888A1 (en) External-appearance inspection system, method for displaying external-appearance inspection result, and program for displaying external-appearance inspection result
EP3904829B1 (en) Method and apparatus for generating information, device, medium and computer program product
CN116664776A (en) Three-dimensional visual editing system based on digital twin
CN111985014B (en) Modeling method and system based on standard atlas
CN111429587A (en) Display method, terminal and storage medium of three-dimensional design model
CN114443489A (en) Method and device for testing Scene gizmo geometry
WO2020003887A1 (en) External-appearance inspection system, method for displaying external-appearance inspection result, and program for displaying external-appearance inspection result
WO2014055597A2 (en) Providing a three-dimensional view that includes a plurality of systems engineering models
CN114564268A (en) Equipment management method and device, electronic equipment and storage medium
JP2010147322A (en) Method of creating 3d mounting data of component mounting machine
KR100949875B1 (en) Autotest system based framework of 3 dimensional application software and method thereof
EP4369242A1 (en) Method and device for modifying parameter of kinematic pair, and production line system
Blut et al. X-Reality for intuitive BIM-based as-built documentation
CN112017297B (en) Augmented reality positioning method, device, equipment and medium
CN117725767A (en) Automatic generation method, plug-in, system, terminal and medium for parameterized component model
CN115761198A (en) Data model lightweight method, device, equipment and storage medium
CN115577417A (en) Nuclear power outdoor engineering pipeline spacing checking method and system based on BIM
WO2022018233A1 (en) Computer-implemented human-machine interaction method and user interface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination