
Abnormality detection method and related device

Info

Publication number
CN116999845A
Authority
CN
China
Prior art keywords
collision
vertex
virtual
distance
information
Prior art date
Legal status
Pending
Application number
CN202211418947.5A
Other languages
Chinese (zh)
Inventor
黄超
Current Assignee
Tencent Technology Chengdu Co Ltd
Original Assignee
Tencent Technology Chengdu Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Chengdu Co Ltd


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/577 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/55 Details of game data or player data management
    • A63F2300/5526 Game data structure

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the application discloses an abnormality detection method and a related device, which relate at least to scenarios such as the map field and aim to detect abnormal situations of virtual objects and improve the detection effect. The method comprises the following steps: acquiring object plane information of a virtual object in a virtual scene and collision plane information of a virtual collision body corresponding to the virtual object, wherein the object plane information comprises coordinate information of object vertices of a plurality of object planes, and the collision plane information comprises coordinate information of collision vertices of a plurality of collision planes; calculating the distance from each collision vertex to each object surface based on the coordinate information of each collision vertex, and calculating the distance from each object vertex to each collision surface based on the coordinate information of each object vertex; and performing collision detection on the virtual object based on the distance from each collision vertex to each object surface and the distance from each object vertex to each collision surface.

Description

Abnormality detection method and related device
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a method for detecting abnormality and a related device.
Background
In the conventional scheme for detecting resources with abnormal collisions in a virtual game scene, a large number of game scenes are labeled manually, and whether the virtual character encounters an abnormal object such as an obstacle is judged based on game images. After the game is run, the position of the virtual character in the virtual game is determined by image recognition, the virtual character is controlled to move, and game images are captured during the movement. The captured game images are then recognized by a deep network model to identify whether the virtual character has encountered an abnormal object. The described abnormal object can be understood as a virtual object with an abnormal collision in the virtual game, for example one exhibiting a through-model (clipping) abnormality or an air-wall (invisible wall) abnormality.
In other words, the conventional abnormality detection scheme requires a large number of manually labeled training samples, which is costly; moreover, the deep network model has limited precision and is prone to false detections and missed detections, so the detection effect is poor.
Disclosure of Invention
The embodiment of the application provides an abnormality detection method and a related device, which can automatically detect whether an abnormality occurs in a virtual object, avoid missed detection and improve the detection effect.
In a first aspect, an embodiment of the present application provides an abnormality detection method. The method comprises the following steps: acquiring object plane information of a virtual object in a virtual scene and collision plane information of a virtual collision body corresponding to the virtual object, wherein the object plane information comprises coordinate information of object vertices of a plurality of object planes of the virtual object, and the collision plane information comprises coordinate information of collision vertices of a plurality of collision planes of the virtual collision body; calculating a distance from each collision vertex to each object surface based on the coordinate information of each collision vertex, and calculating a distance from each object vertex to each collision surface based on the coordinate information of each object vertex; and performing collision detection on the virtual object based on the distance from each collision vertex to each object surface and the distance from each object vertex to each collision surface.
In a second aspect, an embodiment of the present application provides an abnormality detection device. The abnormality detection device includes an acquisition unit and a processing unit. The acquisition unit is used for acquiring object surface information of a virtual object in a virtual scene and collision surface information of a virtual collision body corresponding to the virtual object, wherein the object surface information comprises coordinate information of object vertices of a plurality of object surfaces of the virtual object, and the collision surface information comprises coordinate information of collision vertices of a plurality of collision surfaces of the virtual collision body. The processing unit is used for calculating a distance from each collision vertex to each object surface based on the coordinate information of each collision vertex, and calculating a distance from each object vertex to each collision surface based on the coordinate information of each object vertex; the processing unit is further used for carrying out collision detection on the virtual object based on the distance from each collision vertex to each object surface and the distance from each object vertex to each collision surface.
In some alternative embodiments, the processing unit is configured to: determining an abnormal collision vertex based on a distance from each of the collision vertices to each of the object faces, and determining an abnormal object vertex based on a distance from each of the object vertices to each of the collision faces; and determining that the virtual object collides abnormally based on the abnormal collision vertex and the abnormal object vertex.
In other alternative embodiments, the processing unit is configured to: calculating first normal line information according to coordinate information of each object vertex in the first object plane, wherein the first normal line information is used for indicating a normal line direction of the first object plane, and the first object plane is any one of a plurality of object planes; the distance of each collision vertex to the first object surface is determined based on the coordinate information of each collision vertex and the first normal line information.
In other alternative embodiments, the processing unit is configured to: determining a first projection coordinate based on the coordinate information of the first collision vertex and the first normal line information, wherein the first projection coordinate is the coordinate of a projection point of the first collision vertex on the first object surface, and the first collision vertex is any one of the collision vertices; when the first projection coordinates meet the coordinate range of the first object plane, calculating the distance from the first collision vertex to the projection point based on the coordinate information of the first collision vertex and the first projection coordinates so as to determine the distance from the first collision vertex to the first object plane; or when the first projection coordinates do not meet the coordinate range of the first object plane, respectively calculating the distance from the first collision vertex to each object vertex in the first object plane based on the coordinate information of the first collision vertex and the coordinate information of each object vertex in the first object plane; a target distance is selected from the distances of the first collision vertex to each object vertex to determine the distance of the first collision vertex to the first object surface.
In other alternative embodiments, the processing unit is configured to: selecting a minimum distance from the distances from the first collision vertex to each object surface respectively; and when the minimum distance is greater than or equal to a preset threshold value, determining the first collision vertex as an abnormal collision vertex.
In other alternative embodiments, the processing unit is configured to: calculating second normal line information based on coordinate information of each collision vertex in the first collision surface, the second normal line information being used for indicating a normal line direction of the first collision surface, the first collision surface being any one of a plurality of collision surfaces; the distance of each object vertex to the first collision surface is determined based on the coordinate information of each object vertex and the second normal line information.
In other alternative embodiments, the processing unit is configured to: determining second projection coordinates based on the coordinate information of the first object vertexes and the second normal line information, wherein the second projection coordinates are coordinates of projection points of the first object vertexes on the first collision surface, and the first object vertexes are any one of the object vertexes; when the second projection coordinates meet the coordinate range of the first collision surface, calculating the distance from the first object vertex to the projection point based on the coordinate information of the first object vertex and the second projection coordinates so as to determine the distance from the first object vertex to the first collision surface; or when the second projection coordinates do not meet the coordinate range of the first collision surface, respectively calculating the distance from the first object vertex to each collision vertex in the first collision surface based on the coordinate information of the first object vertex and the coordinate information of each collision vertex in the first collision surface; a target distance is selected from the distances from the first object vertex to each collision vertex to determine the distance of the first object vertex to the first collision surface.
In other alternative embodiments, the processing unit is configured to: selecting a minimum distance from the distances from the vertex of the first object to each collision surface respectively; and when the minimum distance is greater than or equal to a preset threshold value, determining the first object vertex as an abnormal object vertex.
In other alternative embodiments, the processing unit is configured to: when the abnormal collision vertex is outside the virtual object or the abnormal object vertex is inside the virtual collision body, determining that the virtual object has an air-wall abnormality; or when the abnormal collision vertex is inside the virtual object or the abnormal object vertex is outside the virtual collision body, determining that the virtual object has a through-model (clipping) abnormality.
In other alternative embodiments, the processing unit is further configured to: after determining that the virtual object has an abnormal condition based on the abnormal collision vertex and the abnormal object vertex, calculating the distance between the object vertex of each object plane in the virtual object and the virtual ground, wherein the virtual ground is the ground in the game map of the virtual scene; selecting a target value from the distances between the object vertexes of each object surface and the virtual ground, wherein the target value is used for indicating the floating distance of the virtual object; when the target value is smaller than a preset floating value, calculating the distance between the virtual character and the virtual object; and when the distance between the virtual character and the virtual object is smaller than a preset threshold value, recording the position of the virtual object.
A third aspect of an embodiment of the present application provides an abnormality detection apparatus, including: a processor, an input/output (I/O) interface, and a memory. The memory is used for storing program instructions. The processor is configured to execute the program instructions in the memory to perform the abnormality detection method corresponding to the implementation manner of the first aspect.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium having instructions stored therein which, when run on a computer, cause the computer to execute the method corresponding to the implementation manner of the first aspect described above.
A fifth aspect of the embodiments of the present application provides a computer program product comprising instructions which, when run on a computer or a processor, cause the computer or the processor to execute the method corresponding to the implementation manner of the first aspect described above.
From the above technical solutions, the embodiment of the present application has the following advantages:
in the embodiment of the application, the object surface information includes coordinate information of object vertices of a plurality of object surfaces of a virtual object in a virtual scene, and the collision surface information includes coordinate information of collision vertices of a plurality of collision surfaces of a virtual collision body corresponding to the virtual object. Therefore, after the object surface information of the virtual object and the collision surface information of the virtual collision body are acquired, the distance from each collision vertex to each object surface can be calculated according to the coordinate information of each collision vertex, and the distance from each object vertex to each collision surface can be calculated according to the coordinate information of each object vertex. Further, according to the distance from each collision vertex to each object surface and the distance from each object vertex to each collision surface, whether the virtual object has an abnormal collision can be detected. In this way, no large number of workers are needed to label samples, which saves labor cost; moreover, recognition does not need to be performed by a deep network model of limited precision, and whether the virtual object is abnormal is detected automatically from the perspective of the object surface information of the virtual object and the collision surface information of the collision body, which improves the detection effect and avoids missed detection.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of an application scenario provided by an embodiment of the present application;
FIG. 2 is a flowchart of an anomaly detection method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a virtual object according to an embodiment of the present application;
FIG. 4 is a flowchart of calculating the distance from a collision vertex to an object surface according to an embodiment of the present application;
FIG. 5 is a flowchart of calculating the distance from an object vertex to a collision surface according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a virtual object with an abnormal situation according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a virtual collision body corresponding to an abnormal virtual object according to an embodiment of the present application;
FIG. 8 is another flowchart of an anomaly detection method provided by an embodiment of the present application;
FIG. 9 is a schematic structural diagram of an abnormality detection apparatus according to an embodiment of the present application;
FIG. 10 is a schematic diagram of the hardware structure of an abnormality detection apparatus according to an embodiment of the present application.
Detailed Description
The embodiment of the application provides an abnormality detection method and a related device, which can automatically detect whether an abnormality occurs in a virtual object, avoid missed detection and improve the detection effect.
It will be appreciated that in the specific embodiments of the present application, related data such as user information, personal data of a user, etc. are involved, and when the above embodiments of the present application are applied to specific products or technologies, user permission or consent is required, and the collection, use and processing of related data is required to comply with relevant laws and regulations and standards of relevant countries and regions.
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments described herein may be implemented in other sequences than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The anomaly detection method provided by the embodiment of the application is realized based on artificial intelligence (artificial intelligence, AI). Artificial intelligence is the theory, method, technique and application system that uses a digital computer or a digital computer-controlled machine to simulate, extend and expand human intelligence, sense the environment, acquire knowledge and use the knowledge to obtain optimal results. In other words, artificial intelligence is an integrated technology of computer science that attempts to understand the essence of intelligence and to produce a new intelligent machine that can react in a similar way to human intelligence. Artificial intelligence, i.e. research on design principles and implementation methods of various intelligent machines, enables the machines to have functions of sensing, reasoning and decision.
In the conventional scheme, whether a virtual character in a virtual game encounters an abnormal object is determined by means of manual annotation and recognition of game images through a deep network model. However, this conventional scheme incurs a large cost and is prone to erroneous detection, resulting in a poor detection effect.
Therefore, in order to solve the above technical problems, the embodiment of the present application provides an anomaly detection method that can be applied to the scenario shown in fig. 1. As shown in fig. 1, a scene of a virtual game is presented. Abnormality detection of virtual objects is completed within the virtual game scene, without a large amount of manual labeling of samples, which saves labor cost; moreover, recognition does not need to be performed by a deep network model of limited precision, and whether the virtual object is abnormal is detected automatically from the perspective of the object surface information of the virtual object and the collision surface information of the collision body, which improves the detection effect and avoids missed detection. In addition, the method can be applied to various scenarios such as the map field and the cloud technology field, which the embodiment of the present application does not limit.
In addition, the abnormality detection method provided by the application can be applied to an abnormality detection device with data processing capability, such as a terminal device or a server. The terminal device may include, but is not limited to, a smart phone, a desktop computer, a notebook computer, a tablet computer, a smart speaker, a vehicle-mounted device, a smart watch, a smart home appliance, and the like. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (content delivery network, CDN), and basic cloud computing services such as big data and artificial intelligence platforms; the application is not particularly limited. In addition, the terminal device and the server may be directly or indirectly connected by wired or wireless communication, and the present application is not particularly limited.
The abnormality detection device mentioned above may also be provided with processing capability to implement cloud technology. The described cloud technology refers to a hosting technology that unifies hardware, software, network and other resources in a wide area network or a local area network to realize computation, storage, processing and sharing of data. Cloud technology is a general term for the network technology, information technology, integration technology, management platform technology, application technology and the like applied on the basis of the cloud computing business model; it can form a resource pool that is flexible, convenient and used on demand. Cloud computing technology will become an important support. Background services of technical network systems require a large amount of computing and storage resources, for example video websites, picture websites and more portal websites. Along with the rapid development and application of the internet industry, each article may have its own identification mark in the future, which needs to be transmitted to a background system for logic processing; data of different levels will be processed separately, and industry data of all kinds requires strong system backing, which can only be realized through cloud computing.
The mentioned cloud computing is a computing mode that distributes computing tasks over a resource pool formed by a large number of computers, enabling various application systems to acquire computing power, storage space and information services as needed. The network that provides the resources is referred to as the "cloud". To users, resources in the cloud appear infinitely expandable and can be acquired at any time, used on demand, expanded at any time and paid for according to use. As a basic capability provider of cloud computing, a cloud computing resource pool (cloud platform for short, generally called an infrastructure as a service (IaaS) platform) is established, and various types of virtual resources are deployed in the resource pool for external clients to select and use. The cloud computing resource pool mainly comprises computing devices (virtualized machines, including operating systems), storage devices and network devices.
In order to facilitate understanding of the technical scheme of the present application, a method for detecting an abnormality provided by the embodiment of the present application is described below with reference to the accompanying drawings.
Fig. 2 shows a flowchart of a method for anomaly detection according to an embodiment of the present application. As shown in fig. 2, the abnormality detection method may include the steps of:
201. object plane information of a virtual object in a virtual scene is acquired, and collision plane information of a virtual collision body corresponding to the virtual object is acquired, wherein the object plane information comprises coordinate information of object vertexes of a plurality of object planes of the virtual object, and the collision plane information comprises coordinate information of collision vertexes of a plurality of collision planes of the virtual collision body.
In this example, a virtual scene may be understood as a scene in a virtual game, such as a scene of the virtual game's start stage or a scene of its running stage; the present application is not particularly limited. The described virtual game may be a world game, a racing game, a shooting game, or the like, and the present application may be applied to a single-player game, a competitive (adversarial) virtual game, or the like; the embodiment of the present application is not limited thereto.
In the virtual scene, at least one virtual object is contained. The virtual object may include a building, a tree, a vehicle, or other game props on the game map of the virtual scene, which is not limited in the embodiment of the present application. For each virtual object, the corresponding object plane information may be used for representation. The object plane information may be understood as shape data of a corresponding virtual object, and may include coordinate information of object vertices of a plurality of object planes of the virtual object. The object plane information may further include an index number of each object plane, which can be understood as orientation information of the corresponding object plane.
In addition, in the process of collision detection for abnormal objects, a virtual collision body corresponding to the virtual object can be selected for the collision test. Each virtual collision body corresponding to a virtual object may be represented by its collision surface information. The collision surface information may be understood as shape data of the corresponding virtual collision body, and may include coordinate information of collision vertices of a plurality of collision surfaces of the virtual collision body. The collision surface information may also include an index number of each collision surface, which can be understood as orientation information of the corresponding collision surface. It is noted that the described virtual collision bodies may include, but are not limited to, basic collision bodies, simplex collision bodies, and the like. The described basic collision bodies may include, but are not limited to, box collision bodies, sphere collision bodies, cylindrical collision bodies, conical collision bodies, capsule collision bodies, and the like; the embodiments of the present application are not limited thereto. A simplex collision body can be understood as a collision body capable of providing point, line, triangular-surface and tetrahedral collision functions, among others.
Illustratively, the abnormality detection device may traverse all virtual objects in the virtual scene through the game interface. After traversing all the virtual objects, the abnormality detection device may obtain the object plane information of each virtual object from a database or a cloud network according to the virtual object and a first mapping relationship. Similarly, the abnormality detection device may acquire the collision surface information of the virtual collision body corresponding to each virtual object from the database or the cloud network based on the virtual object and a second mapping relationship. It should be noted that the first mapping relationship is understood as the association between a virtual object and its corresponding object plane information, and the second mapping relationship is understood as the association between a virtual object and the collision surfaces of its corresponding virtual collision body.
In addition, the above-mentioned object surfaces are understood to be triangular surfaces, tetrahedrons, etc., and the description of the subsequent embodiments will be given by taking triangular surfaces as examples only. Likewise, the collision surfaces mentioned are also understood to be triangular surfaces, tetrahedrons, etc., which are described in the following embodiments only by way of example.
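To make the data layout concrete, the following is a minimal sketch in Python (used here only for illustration; the class and field names are hypothetical and not part of the disclosure) of the per-face information described above: each face carries its index numbers and the coordinate information of its vertices, and the object surface information and the collision surface information share the same shape.

```python
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Face:
    index: Tuple[int, int, int]  # index numbers of the face; their order encodes the face's orientation
    vertices: List[Vec3]         # coordinate information of the face's vertices

@dataclass
class SurfaceInfo:
    faces: List[Face]            # object surface info and collision surface info share this layout
```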
For example, fig. 3 shows a schematic shape of a virtual object according to an embodiment of the present application. As shown in fig. 3, the virtual object can be approximately regarded as a triangular body that includes at least 5 triangular faces. Taking the triangular face on the right as an example, it may be defined by the three vertices O, P1 and P2. Different index numbers for this triangular face indicate different orientations of the face. For example, if the face whose index numbers run in the counterclockwise direction is taken as the front face, then when the index number is (O, P1, P2), the normal direction of the triangular face points out of the plane, as known from the left-hand rule. Similarly, when the index number is (O, P2, P1), the normal direction of the triangular face points into the plane.
It should be noted that the triangular shape of the virtual object shown in fig. 3 is only a schematic illustration; other shapes may be used in practical applications, and the present application is not limited in this respect.
202. The distance of the corresponding collision vertex to each object surface is calculated based on the coordinate information of each collision vertex, and the distance of the corresponding object vertex to each collision surface is calculated based on the coordinate information of each object vertex.
In this example, after the coordinate information of each collision vertex is obtained, the distance from the collision vertex to each of all the object surfaces may be determined for each collision vertex. For example, it can be calculated by means of the normal direction of the object surface. Based on this, the process of calculating the distance of each collision vertex to each object surface from the coordinate information of each collision vertex can be understood with reference to the flowchart shown in fig. 4. As shown in fig. 4, the process may include the steps of:
S401, calculating first normal line information according to coordinate information of each object vertex in the first object plane, wherein the first normal line information is used for indicating a normal line direction of the first object plane, and the first object plane is any one of a plurality of object planes.
In this example, after the above step 201 is performed, coordinate information of each object vertex in each object plane may already be acquired. Therefore, to calculate the normal line information of each of the plurality of object planes, it is possible to calculate by means of the coordinate information of all the object vertices in the respective object planes. For example, for any one of the object surfaces, i.e., the first object surface, the first normal information may be calculated according to coordinate information of all object vertices in the first object surface, i.e., the normal direction of the first object surface.
For example, take the case where the first object plane is the triangular face OP1P2 shown in fig. 3. Assume that the coordinate information of the object vertex O corresponding to the first index of the triangular face is (x0, y0, z0), the coordinate information of the object vertex P1 corresponding to the second index is (x1, y1, z1), and the coordinate information of the object vertex P2 corresponding to the third index is (x2, y2, z2). The first normal line information of the first object plane can then be calculated from the coordinate information of the three object vertices O, P1 and P2. Illustratively, the first normal line information may be calculated as the vector cross product of the two edge vectors (P1 - O) and (P2 - O): substituting the coordinate information of the three object vertices into the cross product formula gives the first normal line information n = (a, b, c), where the three components of n are:
a = (y1 - y0)(z2 - z0) - (y2 - y0)(z1 - z0)
b = (x2 - x0)(z1 - z0) - (x1 - x0)(z2 - z0)
c = (x1 - x0)(y2 - y0) - (x2 - x0)(y1 - y0)
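As a concrete illustration, the cross product above can be computed with a short sketch (Python for illustration only; the function name is hypothetical):

```python
def face_normal(o, p1, p2):
    """Un-normalized normal of a triangular face with vertices o, p1, p2,
    computed as the cross product (P1 - O) x (P2 - O); the components
    match a, b and c above."""
    x0, y0, z0 = o
    x1, y1, z1 = p1
    x2, y2, z2 = p2
    a = (y1 - y0) * (z2 - z0) - (y2 - y0) * (z1 - z0)
    b = (x2 - x0) * (z1 - z0) - (x1 - x0) * (z2 - z0)
    c = (x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0)
    return (a, b, c)
```

Swapping p1 and p2 flips the sign of the returned normal, which corresponds to the two index orders (O, P1, P2) and (O, P2, P1) discussed with reference to fig. 3.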
S402, determining the distance between each collision vertex and the first object plane based on the coordinate information of each collision vertex and the first normal line information.
In this example, after the above-described step 201 is performed, coordinate information of each collision vertex in each collision surface may already be acquired. Therefore, after determining the normal direction of the first object surface (i.e., the first normal information), the distance from the corresponding collision vertex to the first object surface can be determined according to the coordinate information of each collision vertex and the first normal information. It should be noted that the first object plane and the first normal line information of the first object plane may be understood with reference to the description of step S401, which is not described herein.
Illustratively, the calculation is substantially the same for the distance from each collision vertex to the first object plane. Thus, taking any one of the plurality of collision vertices (e.g., a first collision vertex) as an example, calculating the distance from the first collision vertex to the first object plane can be understood with reference to the following:
first, first projection coordinates are determined according to coordinate information of a first collision vertex and first normal line information. The first projection coordinate may be understood as a coordinate of a projection point of the first collision vertex on the first object plane. For example, it is assumed that the coordinate information of the first collision vertex is (x, y, z), and the coordinate information of one object vertex O on the first object plane is (x 0 ,y 0 ,z 0 ) The first normal information of the first object surface isIf the position of the projection point at which the first collision vertex is projected onto the first object surface is set to (x ', y ', z '), then the coordinate information (x, y, z) and the first normal line information ∈of the first collision vertex are based on>A specific value of the first projection coordinates (x ', y ', z ') may be determined. For example, the calculation formula is as follows: />Wherein (1)>
Then, after the first projection coordinates are obtained by calculation, it can be judged whether the first projection coordinates are within the coordinate range of the first object plane, so that the distance from the first collision vertex to the first object plane can be calculated in different ways according to whether the first projection coordinates are within that range. For example, after the maximum value and the minimum value of the first object plane on the x coordinate axis, the y coordinate axis and the z coordinate axis are obtained respectively, the value x' of the first projection coordinates (x', y', z') on the x coordinate axis is compared with the minimum value x_min and the maximum value x_max of the first object plane on the x coordinate axis; similarly, the value y' of the first projection coordinates on the y coordinate axis is compared with the minimum value y_min and the maximum value y_max of the first object plane on the y coordinate axis, and the value z' of the first projection coordinates on the z coordinate axis is compared with the minimum value z_min and the maximum value z_max of the first object plane on the z coordinate axis.
If it is determined that the first projection coordinates (x', y', z') satisfy the coordinate range x_min ≤ x' ≤ x_max, y_min ≤ y' ≤ y_max and z_min ≤ z' ≤ z_max, it can be determined that the first projection coordinates lie within the first object plane. In this case, the distance from the first collision vertex to the projection point corresponding to the first projection coordinates can be calculated from the coordinate information of the first collision vertex and the first projection coordinates, and this distance is then taken as the distance from the first collision vertex to the first object plane.
If it is determined that the first projection coordinates (x', y', z') do not satisfy the above coordinate range, it is determined that the first projection coordinates do not lie within the first object plane. In this case, the distance from the first collision vertex to each object vertex in the first object plane needs to be calculated from the coordinate information of the first collision vertex and the coordinate information of each object vertex in the first object plane. For example, for the first object plane OP1P2, the distances between the first collision vertex M and the object vertices O, P1 and P2 are calculated respectively. Then, a target distance, for example the minimum distance, is selected from the distances from the first collision vertex to each object vertex in the first object plane, and this target distance is taken as the distance from the first collision vertex to the first object plane.
The above description is made on how to calculate the distance from the first collision vertex to the first object plane, taking only any one of the plurality of object planes (i.e., the first object plane) as an example. In practical applications, the process of the distance from the first collision vertex to the other object surface and the distance from the other collision vertex to the first object surface shown in the above steps S401 to S402 may be understood, and will not be described herein.
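The projection and range check of step S402 can be sketched as follows (an illustrative Python sketch, not the disclosure's implementation; it assumes an un-normalized face normal as computed above and takes the face's coordinate range as the per-axis minima and maxima of its vertices, as described in the text):

```python
import math

def distance_vertex_to_face(m, face_vertices, normal):
    """Distance from vertex m to a triangular face, following S402: project m
    onto the face plane along the normal; if the projection lies inside the
    face's per-axis coordinate range, return the point-to-plane distance,
    otherwise fall back to the minimum distance to the face's vertices."""
    a, b, c = normal
    x0, y0, z0 = face_vertices[0]          # any vertex of the face, e.g. O
    x, y, z = m
    t = ((x - x0) * a + (y - y0) * b + (z - z0) * c) / (a * a + b * b + c * c)
    proj = (x - t * a, y - t * b, z - t * c)

    xs, ys, zs = zip(*face_vertices)
    inside = (min(xs) <= proj[0] <= max(xs) and
              min(ys) <= proj[1] <= max(ys) and
              min(zs) <= proj[2] <= max(zs))
    if inside:
        return math.dist(m, proj)
    return min(math.dist(m, v) for v in face_vertices)
```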
Similarly, after the coordinate information of each object vertex is obtained, the distance from the object vertex to each of all the collision surfaces can be determined for each object vertex. For example, it can be calculated by means of the normal direction of the collision surface. Based on this, the process of calculating the distance of the corresponding object vertex to each collision surface from the coordinate information of each object vertex can be understood with reference to the flowchart shown in fig. 5. As shown in fig. 5, the process may include the steps of:
S501, calculating second normal line information according to coordinate information of each collision vertex in the first collision surface, where the second normal line information is used to indicate a normal line direction of the first collision surface, and the first collision surface is any one of multiple collision surfaces.
In this example, after the above-described step 201 is performed, coordinate information of each collision vertex in each collision surface may already be acquired. Therefore, to calculate the normal line information of each of the plurality of collision surfaces, it is possible to calculate by means of the coordinate information of all the collision vertices in the respective collision surfaces. For example, for any one of the collision surfaces, i.e., the first collision surface, the second normal information may be calculated according to the coordinate information of all the collision vertices in the first collision surface, i.e., the normal direction of the first collision surface.
It should be noted that, how to calculate the second normal information according to the coordinate information of all the collision vertices in the first collision surface may be understood by referring to the foregoing calculation process of the first normal information shown in step 401 in fig. 4, which is not described herein.
S502, determining the distance between each object vertex and the first collision surface based on the coordinate information of each object vertex and the second normal line information.
In this example, after the above step 201 is performed, coordinate information of each object vertex in each object plane may already be acquired. Therefore, after determining the normal direction (i.e., the second normal information) of the first collision surface, the distance from the corresponding object vertex to the first collision surface can be determined based on the coordinate information of each object vertex and the second normal information. It should be noted that the first collision surface and the second normal information described above may be understood with reference to the description of step S501, which is not described herein.
Illustratively, the calculation is substantially the same for the distance from each object vertex to the first collision surface. Thus, taking any one of the plurality of object vertices (e.g., a first object vertex) as an example, calculating the distance from the first object vertex to the first collision surface can be understood with reference to the following:
first, according to the coordinate information of the vertex of the first object and the second normal line information, a second projection coordinate is determined. The second projection coordinates may be understood as coordinates of a projection point of the first object vertex on the first collision surface. For the calculation process of the second projection coordinate, the calculation process of the first projection coordinate in the aforementioned step S402 may be also referred to for understanding, and will not be described herein.
Then, after the second projection coordinates are obtained by calculation, it can be judged whether the second projection coordinates are within the coordinate range of the first collision surface, so that the distance from the first object vertex to the first collision surface can be calculated in different ways according to whether the second projection coordinates are within that range. For example, after the maximum value and the minimum value of the first collision surface on the x coordinate axis, the y coordinate axis and the z coordinate axis are obtained respectively, the value x* of the second projection coordinates (x*, y*, z*) on the x coordinate axis is compared with the minimum value x_min and the maximum value x_max of the first collision surface on the x coordinate axis; similarly, the value y* of the second projection coordinates on the y coordinate axis is compared with the minimum value y_min and the maximum value y_max of the first collision surface on the y coordinate axis, and the value z* of the second projection coordinates on the z coordinate axis is compared with the minimum value z_min and the maximum value z_max of the first collision surface on the z coordinate axis.
If it is determined that the second projection coordinates (x*, y*, z*) satisfy the coordinate range x_min ≤ x* ≤ x_max, y_min ≤ y* ≤ y_max and z_min ≤ z* ≤ z_max, it can be determined that the second projection coordinates lie within the first collision surface. In this case, the distance from the first object vertex to the projection point corresponding to the second projection coordinates can be calculated from the coordinate information of the first object vertex and the second projection coordinates, and this distance is then taken as the distance from the first object vertex to the first collision surface.
If it is determined that the second projection coordinates (x*, y*, z*) do not satisfy the above coordinate range, it is determined that the second projection coordinates do not lie within the first collision surface. In this case, the distance from the first object vertex to each collision vertex in the first collision surface needs to be calculated from the coordinate information of the first object vertex and the coordinate information of each collision vertex in the first collision surface. Then, a target distance, for example the minimum distance, is selected from the distances from the first object vertex to each collision vertex in the first collision surface, and this target distance is taken as the distance from the first object vertex to the first collision surface.
The above description is made on how to calculate the distance from the first object vertex to the first collision surface, taking only any one of the plurality of collision surfaces (i.e., the first collision surface) as an example. In practical applications, the process of the distance from the first object vertex to the other collision surface and the distance from the other object vertex to the first collision surface shown in the above steps S501 to S502 may be understood, and will not be described herein.
203. The virtual object is collision detected based on the distance from each collision vertex to each object surface, and the distance from each object vertex to each collision surface.
In this example, after the distance from each collision vertex to each object surface and the distance from each object vertex to each collision surface are calculated, the virtual object can be collision-detected based on the distance from each collision vertex to each object surface and the distance from each object vertex to each collision surface.
For example, collision detection of a virtual object may be achieved by determining whether a collision vertex is an abnormal collision vertex and whether an object vertex is an abnormal object vertex. For example, the abnormal collision vertex is determined based on the distance from each collision vertex to each object plane. Likewise, the abnormal object vertices are determined from the distance of each object vertex to each collision surface.
In this example, after the distances from each collision vertex to each object surface are calculated, it may be determined whether the corresponding collision vertex is an abnormal collision vertex according to the distances from each collision vertex to each object surface. Illustratively, taking the first collision vertex as an example, a minimum distance may be selected from the distances from the first collision vertex to each object surface respectively; and then judging whether the minimum distance exceeds a preset threshold value, and further determining that the first collision vertex is an abnormal collision vertex when the minimum distance is larger than or equal to the preset threshold value. The described abnormal collision vertices are understood to be points in the virtual collision volume that may obstruct the movement of the virtual character. It should be noted that, regarding the judging process of whether other collision vertices are abnormal collision vertices, the judging process of the first collision vertex may be referred to for understanding, and the present application will not be described in detail.
Similarly, after the distance from each object vertex to each collision surface is calculated, it may be determined whether the corresponding object vertex is an abnormal object vertex according to the distance from each object vertex to each collision surface. Illustratively, taking the first object vertex described above as an example, a minimum distance may first be selected from the distances from the first object vertex to each collision surface; it is then judged whether the minimum distance exceeds a preset threshold value, and the first object vertex is determined to be an abnormal object vertex when the minimum distance is greater than or equal to the preset threshold value. The described abnormal object vertices may be understood as points in the virtual object that may obstruct the movement of the virtual character. It should be noted that the judging process for whether other object vertices are abnormal object vertices can be understood with reference to the judging process of the first object vertex, and the present application will not describe it in detail.
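Reusing the distance sketch given earlier, the minimum-distance and threshold test just described could look like the following (illustrative only; the threshold parameter stands for the preset threshold value in the text):

```python
def abnormal_vertices(vertices, faces, normals, threshold):
    """Flag a vertex as abnormal when its minimum distance to all faces of the
    other geometry is greater than or equal to the preset threshold.  Applied
    twice: collision vertices against object faces, and object vertices
    against collision faces."""
    flagged = []
    for v in vertices:
        min_d = min(distance_vertex_to_face(v, face, n)
                    for face, n in zip(faces, normals))
        if min_d >= threshold:
            flagged.append(v)
    return flagged
```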
Thus, after the abnormal collision vertex and the abnormal object vertex are determined, it can be determined that the virtual object has an abnormal collision according to the abnormal collision vertex and the abnormal object vertex. The abnormal collision may include a through-model (clipping) abnormality or an air-wall abnormality.
For example, after the abnormal collision vertex and the abnormal object vertex are determined, it can be judged whether the abnormal collision vertex is outside the virtual object or whether the abnormal object vertex is inside the virtual collision body. In this way, when it is determined that the abnormal collision vertex is outside the virtual object or the abnormal object vertex is inside the virtual collision body, it can be determined that the virtual object has an air-wall abnormality. Conversely, if it is determined that the abnormal collision vertex is inside the virtual object or the abnormal object vertex is outside the virtual collision body, it can be determined that the virtual object has a through-model abnormality. Fig. 6 shows a schematic diagram of a virtual object with an abnormal situation according to an embodiment of the present application. Fig. 7 shows a schematic diagram of the virtual collision body corresponding to the abnormal virtual object according to an embodiment of the present application. As can be seen from fig. 6 and fig. 7, the virtual object with the abnormal situation is an air conditioner that includes a duct portion, and the duct portion of the air conditioner is missing from the virtual collision body, so the through-model (clipping) problem easily occurs when the virtual character moves to the vicinity of the virtual object.
The determination of whether the abnormal collision vertex is inside or outside the virtual object may be achieved as follows. Taking the case where the first collision vertex is an abnormal collision vertex as an example, after the first collision vertex is determined, a target object plane can be determined based on the distances between the first collision vertex and each object plane, where the distance between the target object plane and the first collision vertex is the minimum distance among the distances from the first collision vertex to each object plane. Then, a target vector is constructed by taking any vertex in the target object plane as the vector starting point and the first collision vertex as the vector end point, and the dot product of the target vector and the normal line information of the target object plane is calculated to obtain a first result. It is then judged whether the first result is greater than zero: if the first result is greater than zero, it is determined that the first collision vertex is outside the target object plane, which further indicates that the first collision vertex is outside the virtual object; otherwise, if the first result is less than or equal to zero, it is determined that the first collision vertex is inside the target object plane, which further indicates that the first collision vertex is inside the virtual object.
It should be noted that, regarding the normal information of the target object surface, the foregoing solving process of the first normal information shown in fig. 4 may be specifically referred to for understanding, and will not be described herein.
Likewise, determining whether the abnormal object vertex is inside or outside the virtual collision body may be achieved as follows. Taking the case where the first object vertex is an abnormal object vertex as an example, after the first object vertex is determined, a target collision surface can be determined based on the distances between the first object vertex and each collision surface, where the distance between the target collision surface and the first object vertex is the minimum distance among the distances from the first object vertex to each collision surface. Then, a target vector is constructed by taking any vertex in the target collision surface as the vector starting point and the first object vertex as the vector end point, and the dot product of the target vector and the normal line information of the target collision surface is calculated to obtain a second result. It is then judged whether the second result is greater than zero: if the second result is greater than zero, it is determined that the first object vertex is outside the target collision surface, which further indicates that the first object vertex is outside the virtual collision body; otherwise, if the second result is less than or equal to zero, it is determined that the first object vertex is inside the target collision surface, which further indicates that the first object vertex is inside the virtual collision body.
It should be noted that, for the normal information of the target collision surface, reference may be made to the foregoing solving process of the second normal information shown in fig. 5, which is not repeated here.
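For illustration only, the sign test described above can be sketched as follows in Python. The helper names and the assumption that each face is a triangle with a precomputed outward unit normal are illustrative choices, not part of the claimed method.

# Minimal sketch of the inside/outside sign test, assuming triangular
# faces with precomputed outward unit normals (illustrative only).

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def is_outside(face_vertices, face_normal, point):
    """True if `point` lies on the outer side of the face.

    The target vector runs from any face vertex to the point; a positive
    dot product with the outward normal means the point is outside.
    """
    target_vector = sub(point, face_vertices[0])
    return dot(target_vector, face_normal) > 0.0

Applied to the nearest object plane of an abnormal collision vertex, a positive result corresponds to the air-wall case; applied to the nearest collision surface of an abnormal object vertex, a positive result corresponds to the clipping case.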
In the embodiment of the present application, the object plane information includes coordinate information of the object vertices of a plurality of object planes of a virtual object in a virtual scene, and the collision plane information includes coordinate information of the collision vertices of a plurality of collision planes of a virtual collision body corresponding to the virtual object. Therefore, after the object plane information of the virtual object and the collision plane information of the virtual collision body are acquired, the distance from each collision vertex to each object plane can be calculated based on the coordinate information of each collision vertex, and the distance from each object vertex to each collision surface can be calculated based on the coordinate information of each object vertex. Further, collision detection is performed on the virtual object based on the distance from each collision vertex to each object plane and the distance from each object vertex to each collision surface. In this way, a large number of workers are not needed to label samples, which saves labor cost; moreover, recognition does not need to be performed by a deep network model with limited precision, and whether the virtual object is abnormal is detected automatically from the object plane information of the virtual object and the collision plane information of the collision body, which improves the detection effect and avoids missed detections.
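As a minimal illustration of this flow, the following Python sketch flags vertices whose minimum distance to the faces of the other mesh reaches a preset threshold. For brevity the distance helper uses the plain point-to-plane distance; the projection-and-fallback variant of the embodiments is sketched separately further below, and all names, data layouts, and threshold values are illustrative assumptions.

import math

# Illustrative sketch: flag vertices whose minimum distance to the faces
# of the other mesh is greater than or equal to a preset threshold.
# Faces are triangles given as three 3D points; names are illustrative.

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def _sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def _dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def plane_distance(point, face):
    # Unsigned distance from the point to the plane carrying the face.
    v0, v1, v2 = face
    n = _cross(_sub(v1, v0), _sub(v2, v0))
    length = math.sqrt(_dot(n, n))
    return abs(_dot(_sub(point, v0), n)) / length if length else 0.0

def abnormal_vertices(vertices, faces, threshold):
    # A vertex is flagged when even its nearest face is at least `threshold` away.
    return [p for p in vertices
            if min(plane_distance(p, f) for f in faces) >= threshold]

# Usage with hypothetical data:
# bad_collision_vertices = abnormal_vertices(collision_vertices, object_faces, 0.1)
# bad_object_vertices = abnormal_vertices(object_vertices, collision_faces, 0.1)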
In other optional examples, after collision detection is performed on the virtual object and it is determined that the virtual object does have an abnormal collision, it may further be determined whether the virtual objects having abnormal collisions are within the range that the virtual character can reach. Illustratively, fig. 8 shows another flow diagram of an abnormality detection method. As shown in fig. 8, the abnormality detection method may include the following steps:
801. object plane information of a virtual object in a virtual scene is acquired, and collision plane information of a virtual collision body corresponding to the virtual object is acquired, wherein the object plane information comprises coordinate information of object vertexes of a plurality of object planes of the virtual object, and the collision plane information comprises coordinate information of collision vertexes of a plurality of collision planes of the virtual collision body.
802. The distance of the corresponding collision vertex to each object surface is calculated based on the coordinate information of each collision vertex, and the distance of the corresponding object vertex to each collision surface is calculated based on the coordinate information of each object vertex.
803. And performing collision detection on the virtual object based on the distance from each collision vertex to each object surface and the distance from each object vertex to each collision surface.
It should be noted that, the descriptions of step 801 to step 803 in the embodiment of the present application may be specifically understood with reference to the descriptions of step 201 to step 203 in fig. 2, which are not described herein.
804. After it is detected that the virtual object has an abnormal collision, the distance between each object vertex of each object plane in the virtual object and the virtual ground is calculated, where the virtual ground is the ground in the game map of the virtual scene.
In this example, in the virtual scene, if the virtual object floats in the air and is far from the virtual ground, the virtual object is located in an area of the game map of the virtual scene that the virtual character cannot reach. Therefore, for all object planes of the virtual object, detection rays are emitted toward the virtual ground with each object vertex of each object plane as a starting point, so that the distance between each object vertex and the virtual ground is calculated.
805. And selecting a target value from the distances between the object vertexes of each object surface and the virtual ground, wherein the target value is used for indicating the floating distance of the virtual object.
In this example, after determining the distance between each object vertex in each object plane and the virtual ground, a target value, such as a minimum distance, may be selected from among them. Further, the target value is determined as the float distance of the virtual object. It should be noted that the floating distance may reflect a floating height of the virtual object from the virtual ground.
806. And when the target value is smaller than the preset floating value, calculating the distance between the virtual character and the virtual object.
In this example, the preset floating value may be understood as the maximum height that the virtual character can reach. Therefore, after the target value is determined, the target value may be compared with the preset floating value. If the target value is smaller than the preset floating value, the virtual character is able to contact the virtual object while moving, and abnormal objects that the virtual character cannot reach are thus screened out by the comparison result. In this case, the virtual character is controlled to move toward the virtual object, for example by calling a navigation interface. Further, during the movement, the distance from the virtual character to the virtual object needs to be further calculated.
807. And when the distance between the virtual character and the virtual object is smaller than a preset threshold value, recording the position of the virtual object.
In this example, after the distance between the virtual character and the virtual object is calculated, the distance is further compared with a preset threshold. When the distance is smaller than the preset threshold, it indicates that the virtual object is within the reachable range of the virtual character, and the position of the virtual object may be recorded at this time, which facilitates subsequent parameter adjustment and the like for the abnormal virtual object. Alternatively, by recording the position of the virtual object, the virtual object at that position can be automatically bypassed when the virtual character moves near the abnormally colliding virtual object in a subsequent deployment stage, so that a collision between the virtual character and the virtual object at that position is avoided.
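A minimal sketch of steps 804 to 807 follows. The ray-to-ground measurement and the navigation step are stand-ins for engine-specific calls; the assumption of a y-up coordinate system with the ground at y = 0, and all names and threshold values, are illustrative only.

# Sketch of steps 804-807: screen out abnormal objects that the virtual
# character cannot reach, and record the positions of reachable ones.

def distance_to_ground(vertex):
    # Placeholder for casting a detection ray from the vertex down to the
    # virtual ground; here we assume y-up with the ground at y == 0.
    return vertex[1]

def screen_abnormal_object(object_vertices, character_pos, object_pos,
                           preset_float_value, preset_threshold):
    # Steps 804/805: floating distance = minimum vertex-to-ground distance.
    float_distance = min(distance_to_ground(v) for v in object_vertices)
    if float_distance >= preset_float_value:
        return None  # floats too high; the character cannot reach it

    # Step 806: after moving toward the object (navigation omitted here),
    # measure the straight-line distance between character and object.
    dist = sum((a - b) ** 2 for a, b in zip(character_pos, object_pos)) ** 0.5

    # Step 807: record the position only if the object is within reach.
    return object_pos if dist < preset_threshold else None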
In this way, abnormal virtual objects that the virtual character can reach are screened, according to the floating distance and the reachable range, from the many virtual objects having abnormal conditions, so that abnormal objects in the virtual game can be found quickly and accurately without manual participation, which greatly improves the detection efficiency for abnormal virtual objects.
Fig. 2 to fig. 8 above mainly describe, by way of example, application scenarios in which the abnormality detection method of the present application is applied to a test stage. In practical application, the abnormality detection method may also be applied to a deployment stage after the test is completed. For example, in the virtual map of the virtual scene shown in fig. 1, various virtual objects are encountered when a virtual character drives a virtual vehicle to automatically find a path in the virtual map. In the process of the virtual character moving near each virtual object, the abnormality detection device automatically applies the abnormality detection method shown in fig. 2 to fig. 8 to detect whether each virtual object is an object with an abnormal collision. If a virtual object is detected to be an abnormally colliding object, the abnormality detection device controls the virtual character, for example by means of instructions, to bypass that virtual object, so that the virtual character automatically avoids colliding with the abnormally colliding virtual object, which can improve the game experience.
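A deployment-stage check of this kind might look as follows; the flagged positions would come from the recording step above, and the navigation instruction is a placeholder rather than a real engine API.

# Sketch of the deployment-stage detour: skip around any object whose
# position was previously recorded as abnormal (names are illustrative).

def should_bypass(object_pos, flagged_positions, radius=2.0):
    # True if the object sits at (or very near) a recorded abnormal position.
    for p in flagged_positions:
        if sum((a - b) ** 2 for a, b in zip(object_pos, p)) ** 0.5 < radius:
            return True
    return False

# for obj_pos in nearby_object_positions:
#     if should_bypass(obj_pos, flagged_positions):
#         pass  # instruct the character to detour (engine-specific call)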
The foregoing description of the solution provided by the embodiments of the present application has been mainly presented in terms of a method. It should be understood that, in order to implement the above-described functions, hardware structures and/or software modules corresponding to the respective functions are included. Those of skill in the art will readily appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application can divide the functional modules of the device according to the method example, for example, each functional module can be divided corresponding to each function, and two or more functions can be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
The following describes the abnormality detection device in the embodiment of the present application in detail, and fig. 9 is a schematic diagram of an embodiment of the abnormality detection device provided in the embodiment of the present application. As shown in fig. 9, the abnormality detection apparatus may include an acquisition unit 901 and a processing unit 902.
The acquiring unit 901 is configured to acquire object plane information of a virtual object in a virtual scene and collision plane information of a virtual collision body corresponding to the virtual object, where the object plane information includes coordinate information of object vertices of a plurality of object planes of the virtual object, and the collision plane information includes coordinate information of collision vertices of a plurality of collision planes of the virtual collision body. The processing unit 902 is configured to calculate a distance from each collision vertex to each object plane based on the coordinate information of each collision vertex, and calculate a distance from each object vertex to each collision surface based on the coordinate information of each object vertex. The processing unit 902 is further configured to perform collision detection on the virtual object based on the distance from each collision vertex to each object plane and the distance from each object vertex to each collision surface.
In some alternative embodiments, processing unit 902 is configured to: determining an abnormal collision vertex based on a distance from each of the collision vertices to each of the object faces, and determining an abnormal object vertex based on a distance from each of the object vertices to each of the collision faces; and determining that the virtual object collides abnormally based on the abnormal collision vertex and the abnormal object vertex.
In some alternative embodiments, processing unit 902 is configured to: calculating first normal line information according to coordinate information of each object vertex in the first object plane, wherein the first normal line information is used for indicating a normal line direction of the first object plane, and the first object plane is any one of a plurality of object planes; the distance of each collision vertex to the first object surface is determined based on the coordinate information of each collision vertex and the first normal line information.
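For illustration, the first normal information of a triangular object plane can be obtained from the cross product of two edge vectors; the sketch below assumes triangular faces and returns a unit normal (the function name is an assumption).

import math

# Sketch of computing a face normal (the "first normal information") from
# the coordinates of three face vertices, assuming triangular faces.

def face_normal(v0, v1, v2):
    # Two edge vectors spanning the face.
    e1 = (v1[0] - v0[0], v1[1] - v0[1], v1[2] - v0[2])
    e2 = (v2[0] - v0[0], v2[1] - v0[1], v2[2] - v0[2])
    # Cross product gives the normal direction; vertex order sets its sign.
    n = (e1[1] * e2[2] - e1[2] * e2[1],
         e1[2] * e2[0] - e1[0] * e2[2],
         e1[0] * e2[1] - e1[1] * e2[0])
    length = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
    return (n[0] / length, n[1] / length, n[2] / length)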
In other alternative embodiments, processing unit 902 is configured to: determining a first projection coordinate based on the coordinate information of the first collision vertex and the first normal line information, wherein the first projection coordinate is the coordinate of a projection point of the first collision vertex on the first object surface, and the first collision vertex is any one of the collision vertices; when the first projection coordinates meet the coordinate range of the first object plane, calculating the distance from the first collision vertex to the projection point based on the coordinate information of the first collision vertex and the first projection coordinates so as to determine the distance from the first collision vertex to the first object plane; or when the first projection coordinates do not meet the coordinate range of the first object plane, respectively calculating the distance from the first collision vertex to each object vertex in the first object plane based on the coordinate information of the first collision vertex and the coordinate information of each object vertex in the first object plane; a target distance is selected from the distances of the first collision vertex to each object vertex to determine the distance of the first collision vertex to the first object surface.
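The projection-and-fallback distance just described might be sketched as follows. The check that the projection "meets the coordinate range" of the face is realised here as a point-in-triangle test, which is an assumption about how that range is defined; triangular faces and a unit normal are also assumed.

import math

# Sketch of the projection-based distance: project the vertex onto the
# plane of the face; if the projection falls inside the face, use the
# point-to-plane distance, otherwise fall back to the minimum distance
# to the face's vertices.

def _dot3(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def _sub3(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def point_to_face_distance(p, face, unit_normal):
    v0, v1, v2 = face
    # Signed distance along the unit normal, and the projected point.
    signed = _dot3(_sub3(p, v0), unit_normal)
    proj = (p[0] - signed * unit_normal[0],
            p[1] - signed * unit_normal[1],
            p[2] - signed * unit_normal[2])

    # Barycentric point-in-triangle test for the projected point.
    u, v, w = _sub3(v1, v0), _sub3(v2, v0), _sub3(proj, v0)
    uu, vv, uv = _dot3(u, u), _dot3(v, v), _dot3(u, v)
    wu, wv = _dot3(w, u), _dot3(w, v)
    denom = uu * vv - uv * uv
    if denom != 0.0:
        s = (vv * wu - uv * wv) / denom
        t = (uu * wv - uv * wu) / denom
        if s >= 0.0 and t >= 0.0 and s + t <= 1.0:
            return abs(signed)  # projection lies inside the face

    # Fallback: minimum distance from p to the face's vertices.
    return min(math.dist(p, v) for v in face)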
In other alternative embodiments, processing unit 902 is configured to: selecting a minimum distance from the distances from the first collision vertex to each object surface respectively; and when the minimum distance is greater than or equal to a preset threshold value, determining the first collision vertex as an abnormal collision vertex.
In other alternative embodiments, processing unit 902 is configured to: calculating second normal line information based on coordinate information of each collision vertex in the first collision surface, the second normal line information being used for indicating a normal line direction of the first collision surface, the first collision surface being any one of a plurality of collision surfaces; the distance of each object vertex to the first collision surface is determined based on the coordinate information of each object vertex and the second normal line information.
In other alternative embodiments, processing unit 902 is configured to: determining second projection coordinates based on the coordinate information of the first object vertex and the second normal line information, wherein the second projection coordinates are coordinates of a projection point of the first object vertex on the first collision surface, and the first object vertex is any one of the object vertices; when the second projection coordinates meet the coordinate range of the first collision surface, calculating the distance from the first object vertex to the projection point based on the coordinate information of the first object vertex and the second projection coordinates so as to determine the distance from the first object vertex to the first collision surface; or when the second projection coordinates do not meet the coordinate range of the first collision surface, respectively calculating the distance from the first object vertex to each collision vertex in the first collision surface based on the coordinate information of the first object vertex and the coordinate information of each collision vertex in the first collision surface; a target distance is selected from the distances from the first object vertex to each collision vertex to determine the distance of the first object vertex to the first collision surface.
In other alternative embodiments, processing unit 902 is configured to: selecting a minimum distance from the distances from the first object vertex to each collision surface respectively; and when the minimum distance is greater than or equal to a preset threshold value, determining the first object vertex as an abnormal object vertex.
In other alternative embodiments, processing unit 902 is configured to: when the abnormal collision vertex is outside the virtual object or the abnormal object vertex is inside the virtual collision body, determining that the virtual object has an air wall abnormal condition; or when the abnormal collision vertex is inside the virtual object or the abnormal object vertex is outside the virtual collision body, determining that the virtual object has a through-model abnormal condition.
In other alternative embodiments, the processing unit 902 is further configured to: after determining that the virtual object has an abnormal condition based on the abnormal collision vertex and the abnormal object vertex, calculating the distance between the object vertex of each object plane in the virtual object and the virtual ground, wherein the virtual ground is the ground in the game map of the virtual scene; selecting a target value from the distances between the object vertexes of each object surface and the virtual ground, wherein the target value is used for indicating the floating distance of the virtual object; when the target value is smaller than a preset floating value, calculating the distance between the virtual character and the virtual object; and when the distance between the virtual character and the virtual object is smaller than a preset threshold value, recording the position of the virtual object.
The abnormality detection device in the embodiment of the present application is described above from the viewpoint of the modularized functional entity, and the abnormality detection device in the embodiment of the present application is described below from the viewpoint of hardware processing. Fig. 10 is a schematic structural diagram of an abnormality detection apparatus according to an embodiment of the present application. The abnormality detection apparatus may vary considerably depending on its configuration or performance. The abnormality detection apparatus may include at least one processor 1001, a communication line 1007, a memory 1003, and at least one communication interface 1004.
The processor 1001 may be a general purpose central processing unit (central processing unit, CPU), a microprocessor, an application-specific integrated circuit (application specific integrated circuit, ASIC), or one or more integrated circuits for controlling the execution of the program of the present application.
Communication line 1007 may include a pathway to transfer information between the components.
The communication interface 1004 is any transceiver-type device used to communicate with other devices or communication networks, such as an Ethernet, a radio access network (radio access network, RAN), or a wireless local area network (wireless local area networks, WLAN).
The memory 1003 may be a read-only memory (ROM) or other type of static storage device that may store static information and instructions, a random access memory (random access memory, RAM) or other type of dynamic storage device that may store information and instructions, and the memory may be stand alone and coupled to the processor via a communication line 1007. The memory may also be integrated with the processor.
The memory 1003 is used for storing computer-executable instructions for executing the present application, and is controlled to be executed by the processor 1001. The processor 1001 is configured to execute computer-executable instructions stored in the memory 1003, thereby implementing the method provided by the above-described embodiment of the present application.
Alternatively, the computer-executable instructions in the embodiments of the present application may be referred to as application program codes, which are not particularly limited in the embodiments of the present application.
In a specific implementation, as an embodiment, the abnormality detection device may include a plurality of processors, such as the processor 1001 and the processor 1002 in fig. 10. Each of these processors may be a single-core (single-CPU) processor or may be a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
In a specific implementation, as an embodiment, the abnormality detection apparatus may further include an output device 1005 and an input device 1006. The output device 1005 communicates with the processor 1001 and may display information in a variety of ways. The input device 1006 is in communication with the processor 1001 and may receive input of a target object in a variety of ways. For example, the input device 1006 may be a mouse, a touch screen device, a sensing device, or the like.
The abnormality detection device may be a general-purpose device or a special-purpose device. In a specific implementation, the abnormality detection device may be a server, a terminal device, or the like, or a device having a structure similar to that shown in fig. 10. The embodiment of the application does not limit the type of the abnormality detection device.
It should be noted that the processor 1001 in fig. 10 may cause the abnormality detection device to execute the method in the method embodiment as shown in fig. 2, 4 to 5, and 8 by calling the computer-executable instructions stored in the memory 1003.
In particular, the functions/implementations of the processing unit 902 in fig. 9 may be implemented by the processor 1001 in fig. 10 invoking computer executable instructions stored in the memory 1003. The function/implementation procedure of the acquisition unit 901 in fig. 9 can be implemented by the communication interface 1004 in fig. 10.
The embodiment of the present application also provides a computer storage medium storing a computer program for electronic data exchange, where the computer program causes a computer to execute part or all of the steps of any one of the abnormality detection methods described in the above method embodiments.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any one of the anomaly detection methods described in the method embodiments above.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of elements is merely a logical functional division, and there may be additional divisions of actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, an optical disk, or other various media capable of storing program codes.
The above-described embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof, and when implemented in software, may be implemented in whole or in part in the form of a computer program product.
The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions in accordance with embodiments of the present application are fully or partially produced. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave, etc.) manner. The computer readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available media may be magnetic media (e.g., a floppy disk, a hard disk, a magnetic tape), optical media (e.g., a DVD), or semiconductor media (e.g., a solid state disk (SSD)), or the like.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (14)

1. A method of anomaly detection, comprising:
acquiring object plane information of a virtual object in a virtual scene and collision plane information of a virtual collision body corresponding to the virtual object, wherein the object plane information comprises coordinate information of object vertexes of a plurality of object planes of the virtual object, and the collision plane information comprises coordinate information of collision vertexes of a plurality of collision planes of the virtual collision body;
calculating a distance from each collision vertex to each object surface based on the coordinate information of each collision vertex, and calculating a distance from each object vertex to each collision surface based on the coordinate information of each object vertex;
And performing collision detection on the virtual object based on the distance from each collision vertex to each object surface and the distance from each object vertex to each collision surface.
2. The method of claim 1, wherein said collision detection of said virtual object based on the distance of each said collision vertex to each said object surface and the distance of each said object vertex to each said collision surface comprises:
determining an abnormal collision vertex based on a distance from each of the collision vertices to each of the object faces, and determining an abnormal object vertex based on a distance from each of the object vertices to each of the collision faces;
and determining that the virtual object collides abnormally based on the abnormal collision vertex and the abnormal object vertex.
3. The method of claim 2, wherein calculating the distance of each collision vertex from each object surface based on the coordinate information of each collision vertex comprises:
calculating first normal line information according to coordinate information of each object vertex in a first object plane, wherein the first normal line information is used for indicating a normal line direction of the first object plane, and the first object plane is any one of the plurality of object planes;
And determining the distance from each collision vertex to the first object surface based on the coordinate information of each collision vertex and the first normal line information.
4. A method according to claim 3, wherein said determining the distance of each collision vertex to the first object plane based on the coordinate information of each collision vertex and the first normal information comprises:
determining first projection coordinates based on coordinate information of a first collision vertex and the first normal line information, wherein the first projection coordinates are coordinates of projection points of the first collision vertex on the first object plane, and the first collision vertex is any one of the collision vertices;
calculating the distance from the first collision vertex to the projection point based on the coordinate information of the first collision vertex and the first projection coordinate when the first projection coordinate meets the coordinate range of the first object plane, so as to determine the distance from the first collision vertex to the first object plane; or,
when the first projection coordinates do not meet the coordinate range of the first object plane, respectively calculating the distance from the first collision vertex to each object vertex in the first object plane based on the coordinate information of the first collision vertex and the coordinate information of each object vertex in the first object plane;
And selecting a target distance from the distances from the first collision vertex to each object vertex to determine the distance from the first collision vertex to the first object plane.
5. The method of claim 4, wherein said determining an abnormal collision vertex based on a distance of each of said collision vertices to each of said object faces comprises:
selecting a minimum distance from the distances from the first collision vertex to each object surface respectively;
and when the minimum distance is greater than or equal to a preset threshold value, determining that the first collision vertex is an abnormal collision vertex.
6. The method of claim 2, wherein calculating the distance of the corresponding object vertex to each collision surface based on the coordinate information of each object vertex comprises:
calculating second normal information indicating a normal direction of a first collision surface, which is any one of the plurality of collision surfaces, based on coordinate information of each of the collision vertices in the first collision surface;
and determining the distance from each object vertex to the first collision surface based on the coordinate information of each object vertex and the second normal line information.
7. The method of claim 6, wherein the determining a distance of each object vertex to the first collision surface based on the coordinate information of each object vertex and the second normal information comprises:
determining second projection coordinates based on coordinate information of a first object vertex and the second normal line information, wherein the second projection coordinates are coordinates of projection points of the first object vertex on the first collision surface, and the first object vertex is any one of the object vertices;
calculating the distance from the first object vertex to the projection point based on the coordinate information of the first object vertex and the second projection coordinate when the second projection coordinate meets the coordinate range of the first collision surface, so as to determine the distance from the first object vertex to the first collision surface; or,
when the second projection coordinates do not meet the coordinate range of the first collision surface, respectively calculating the distance from the first object vertex to each collision vertex in the first collision surface based on the coordinate information of the first object vertex and the coordinate information of each collision vertex in the first collision surface;
And selecting a target distance from the distances from the first object vertex to each collision vertex to determine the distance from the first object vertex to the first collision surface.
8. The method of claim 7, wherein said determining abnormal object vertices based on the distance of each of said object vertices to each of said collision surfaces comprises:
selecting a minimum distance from the distances from the first object vertex to each collision surface respectively;
and when the minimum distance is greater than or equal to a preset threshold value, determining that the first object vertex is an abnormal object vertex.
9. The method of any one of claims 1 to 8, wherein the determining that the virtual object collides abnormally based on the abnormal collision vertex and the abnormal object vertex comprises:
when the abnormal collision vertex is outside the virtual object or the abnormal object vertex is inside the virtual collision body, determining that the virtual object has an air wall abnormal condition;
or,
and determining that the virtual object has a through-model abnormal condition when the abnormal collision vertex is inside the virtual object or the abnormal object vertex is outside the virtual collision body.
10. The method according to any one of claims 2 to 8, wherein after the determination that the virtual object has an abnormal collision based on the abnormal collision vertex and the abnormal object vertex, the method further comprises:
calculating the distance between the object vertex of each object surface in the virtual object and the virtual ground, wherein the virtual ground is the ground in the game map of the virtual scene;
selecting a target value from the distances between the object vertexes of each object surface and the virtual ground, wherein the target value is used for indicating the floating distance of the virtual object;
when the target value is smaller than a preset floating value, calculating the distance between the virtual character and the virtual object;
and recording the position of the virtual object when the distance between the virtual character and the virtual object is smaller than a preset threshold value.
11. An abnormality detection apparatus, comprising:
an acquisition unit configured to acquire object plane information of a virtual object in a virtual scene, the object plane information including coordinate information of object vertices of a plurality of object planes of the virtual object, and collision plane information of a virtual collision body corresponding to the virtual object, the collision plane information including coordinate information of collision vertices of a plurality of collision planes of the virtual collision body;
A processing unit, configured to calculate a distance from each collision vertex to each object plane based on the coordinate information of each collision vertex, and calculate a distance from each object vertex to each collision plane based on the coordinate information of each object vertex;
the processing unit is used for carrying out collision detection on the virtual object based on the distance from each collision vertex to each object surface and the distance from each object vertex to each collision surface.
12. An abnormality detection apparatus, characterized by comprising: an input/output (I/O) interface, a processor, and a memory, the memory having program instructions stored therein;
the processor is configured to execute program instructions stored in a memory and to perform the method of any one of claims 1 to 10.
13. A computer readable storage medium comprising instructions which, when run on a computer device, cause the computer device to perform the method of any of claims 1 to 10.
14. A computer program product, characterized in that the computer program product comprises instructions which, when run on a computer device or a processor, cause the computer device or the processor to perform the method of any of claims 1 to 10.