Disclosure of Invention
Aiming at the defects in the prior art, a VR game system and a VR game method based on a clustering algorithm are provided. When a user wears a VR head-mounted display to play a VR game, a camera of the VR head-mounted display facing the external real environment acquires a current environment image of the real environment in real time. The current environment image is subjected to region segmentation by an FCM clustering algorithm, and a target region containing an obstacle is identified so that an appropriate prompt can be given to the user, avoiding unnecessary injury between the user and the obstacle during the game.
In a first aspect, the application discloses a VR game method based on a clustering algorithm, comprising:
acquiring a current environment image through a camera;
performing region segmentation on the filtered current environment image by using an FCM clustering algorithm;
identifying each segmented current environment image area, and judging whether an obstacle exists in each current environment image area;
determining the current environment image area with obstacles as a target area;
displaying a virtual alert image within the target area.
According to the VR game system and method based on the clustering algorithm and the terminal device provided herein, when a user wears a VR head-mounted display to play a VR game, a camera of the VR head-mounted display facing the external real environment acquires a current environment image of the real environment in real time. The current environment image is subjected to region segmentation by an FCM clustering algorithm, and a target region containing an obstacle is identified so that an appropriate prompt can be given to the user, avoiding unnecessary injury between the user and the obstacle during the game and guaranteeing the safety of the user during VR play.
As an optional implementation manner, after determining the current environment image area where the obstacle exists as the target area, the method further includes: pausing the current animation and audio playback in the VR game.
It will be appreciated that upon identifying a target area in which an obstacle is present, the user should be prompted appropriately to avoid unnecessary injury between the user and the obstacle during the game. Various prompting modes are possible, including pausing the current animation and audio playback in the VR game, so that the user realizes that a potential safety hazard exists in the current real environment and can stop the game or move to another position to continue playing.
As an optional implementation manner, performing region segmentation on the filtered current environment image by using an FCM clustering algorithm includes:
receiving clustering algorithm parameters, wherein the clustering algorithm parameters comprise a fuzzy index m, an iteration stop threshold epsilon and a maximum iteration number L;
receiving a cluster category number set c, where c = {c_h, c_h = 1, 2, …, 20}, and c_h is a number of clustering categories;
performing, through the FCM clustering algorithm, region segmentation on the filtered current environment image based on the clustering algorithm parameters and each cluster category number c_h in the cluster category number set c, to obtain an image segmentation result set P;
wherein P = {C_t, t = 1, 2, …, 20} and C_t = {C_p, p = 1, 2, …, c_h};
wherein C_t represents the category set obtained by performing region segmentation on the filtered current environment image through the FCM clustering algorithm based on the clustering algorithm parameters and the cluster category number c_h; and C_p represents the p-th category in the current environment image, i.e., the set of pixels whose maximum degree of membership is to category C_p in the current environment image;
calculating, by using a clustering validity index function J(C_t), each category set C_t in the image segmentation result set P, to obtain a clustering validity index value set J, wherein J = {J_z, z = 1, 2, …, 20};
selecting a minimum value J_min from the clustering validity index value set J, and determining the cluster category number corresponding to the minimum value J_min as the target cluster category number c_x;
performing, through the FCM clustering algorithm, region segmentation on the filtered current environment image based on the clustering algorithm parameters and the target cluster category number c_x, and determining the obtained category set C_x as the final target region segmentation result.
It can be understood that the FCM clustering algorithm performs region segmentation on the current environment image based on different cluster category numbers to obtain different region segmentation results. In the image segmentation process, the edge pixels of each image region are relatively prone to misclassification; for example, the edge pixels of a neighborhood category of a certain category may have a high degree of membership to that category, and likewise, the edge pixels of the category may have a high degree of membership to the neighborhood category. The application can therefore adopt a clustering validity index function J(C_t) to determine the optimal cluster category number of the FCM clustering algorithm, i.e., the target cluster category number c_x: when the clustering validity index value J_z reaches its minimum, the corresponding category number is the optimal cluster number. Region segmentation is then performed on the filtered current environment image through the FCM clustering algorithm based on the optimal cluster number, and the obtained category set is the optimal region segmentation result, i.e., the final target region segmentation result. Because this clustering validity index detects the edge pixels that are easily misclassified between categories, it can measure the degree of separation and the degree of overlap between categories in the image segmentation process more accurately.
As an alternative embodiment, calculating, by using a clustering validity index function J(C_t), each category set C_t in the image segmentation result set P includes:
calculating, for each category set C_t, a first compactness measure coefficient ρ(C_p) and a second compactness measure coefficient ρ(C'_p(C_{p,q})), wherein the first compactness measure coefficient ρ(C_p) represents the compactness of the pixels in category C_p, and the second compactness measure coefficient ρ(C'_p(C_{p,q})) represents the compactness of the pixels in the set C'_p(C_{p,q}), wherein C'_p(C_{p,q}) represents the set formed by the pixels of category C_p together with the pixel set C_{p,2}(C_{p,q}), wherein the set C_{p,2}(C_{p,q}) represents the set of pixels in category C_{p,q} whose second-largest degree of membership, to category C_p, is greater than a preset membership threshold μ0, and wherein C_{p,q} represents the q-th neighborhood category of category C_p;
calculating, for each category set C_t, an integrated inter-class separation measure coefficient D(C_p);
calculating the clustering validity index function J(C_t) by the following formula:
It is understood that the preset membership threshold μ0 may be factory-set by the manufacturer; for example, μ0 may be set to 0.3. The clustering validity index function includes the first compactness measure coefficient ρ(C_p) and the integrated inter-class separation measure coefficient D(C_p); that is, the quality of the clustering effect of the FCM clustering algorithm is determined by both the first compactness measure coefficient ρ(C_p) and the integrated inter-class separation measure coefficient D(C_p).
As an optional implementation, calculating, for each category set C_t, the first compactness measure coefficient ρ(C_p) and the second compactness measure coefficient ρ(C'_p(C_{p,q})) includes:
calculating the first compactness measure coefficient ρ(C_p) by the following formula:
calculating the second compactness measure coefficient ρ(C'_p(C_{p,q})) by the following formula:
wherein I'(a_r, b_r) represents the pixel at coordinate (a_r, b_r) in category C_p; f'(a_r, b_r) represents the gray value of pixel I'(a_r, b_r); V_p represents the cluster center pixel of category C_p; f'(V_p) represents the gray value of the cluster center pixel V_p; N(C_p) represents the number of pixels in category C_p; I'(a_z, b_z) represents the pixel at coordinate (a_z, b_z) in category C_p; f'(a_z, b_z) represents the gray value of pixel I'(a_z, b_z); I'(A_R, B_R) represents the pixel at coordinate (A_R, B_R) in the set C'_p(C_{p,q}); f'(A_R, B_R) represents the gray value of pixel I'(A_R, B_R); I'(A_Z, B_Z) represents the pixel at coordinate (A_Z, B_Z) in the set C'_p(C_{p,q}); f'(A_Z, B_Z) represents the gray value of pixel I'(A_Z, B_Z); and N(C'_p(C_{p,q})) represents the number of pixels in the set C'_p(C_{p,q}).
As an optional implementation, calculating, for each category set C_t, the integrated inter-class separation measure coefficient D(C_p) includes:
calculating the integrated inter-class separation measure coefficient D(C_p) by the following formula:
wherein D1(C_p, C_{p,q}) represents a first inter-class separation measure coefficient between category C_p and neighborhood category C_{p,q}; D2(C_p, C_{p,q}) represents a second inter-class separation measure coefficient between category C_p and neighborhood category C_{p,q}; and y(C_p) represents the number of neighborhood categories of category C_p present in the set C_t.
It can be understood that the integrated inter-class separation measure coefficient D(C_p) includes the first inter-class separation measure coefficient D1(C_p, C_{p,q}) and the second inter-class separation measure coefficient D2(C_p, C_{p,q}); that is, the quality of the clustering effect of the FCM clustering algorithm is also determined by the first inter-class separation measure coefficient D1(C_p, C_{p,q}) and the second inter-class separation measure coefficient D2(C_p, C_{p,q}).
As an optional implementation, calculating, for each category set C_t, the first inter-class separation measure coefficient D1(C_p, C_{p,q}) and the second inter-class separation measure coefficient D2(C_p, C_{p,q}) includes:
calculating the first inter-class separation measure coefficient D1(C_p, C_{p,q}) by the following formula:
calculating the second inter-class separation measure coefficient D2(C_p, C_{p,q}) by the following formula:
wherein f̄(C_p) represents the mean of the gray values of the pixels in category C_p; f̄(C_{p,q}) represents the mean of the gray values of the pixels in neighborhood category C_{p,q}; I'(α_g, β_g) represents the pixel at coordinate (α_g, β_g) in the set C_{p,q,2}(C_p); f'(α_g, β_g) represents the gray value of pixel I'(α_g, β_g); I'(d_w, e_w) represents the pixel at coordinate (d_w, e_w) in the set C_{p,2}(C_{p,q}); f'(d_w, e_w) represents the gray value of pixel I'(d_w, e_w); N(C_{p,q,2}(C_p)) represents the number of pixels in the set C_{p,q,2}(C_p); and N(C_{p,2}(C_{p,q})) represents the number of pixels in the set C_{p,2}(C_{p,q}).
It will be appreciated that, when the first inter-class separation measure coefficient D1(C_p, C_{p,q}) of the category to be measured is calculated, the selected possible edge pixels of the category are compared with the selected pixels within the category. The smaller the difference between them, the higher the probability that the selected possible edge pixels belong to the category to be measured; in this case the value of the sine part of the first inter-class separation measure coefficient of the category is smaller, i.e., the first inter-class separation measure coefficient of the category decreases. Conversely, the larger the difference, the lower the probability that the selected possible edge pixels belong to the category to be measured, and the first inter-class separation measure coefficient of the category increases;
when the second inter-class separation measure coefficient D2(C_p, C_{p,q}) of the category to be measured is calculated, the selected possible edge pixels of the category are compared with the selected possible edge pixels of the neighborhood category. The smaller the difference, the larger the degree of overlap between the category to be measured and the neighborhood category, and the smaller the value of the second inter-class separation measure coefficient; the larger the difference, the smaller the degree of overlap between the two categories, i.e., the edge pixels of the region are effectively divided, and the larger the value of the second inter-class separation measure coefficient. That is, compared with the conventional way of measuring the inter-class separation degree in a clustering validity index, this approach measures the degree of separation and the degree of overlap between categories more accurately.
in a second aspect, the present application discloses a VR game security assurance terminal device, where the terminal device is configured to execute any one of the above VR game systems and methods based on a clustering algorithm, and the VR game security assurance terminal device includes:
the device comprises a camera, a clustering calculation module, an identification module and a processing module;
the camera is used for acquiring a current environment image;
the cluster calculation module is used for performing region segmentation on the filtered current environment image by using an FCM (Fuzzy C-means) clustering algorithm;
the identification module is used for identifying each segmented current environment image area and judging whether an obstacle exists in each current environment image area; determining the current environment image area with obstacles as a target area;
the processing module is used for displaying a virtual warning image in the target area.
In a third aspect, the present application also discloses a terminal device, including a processor, an input device, an output device, and a memory, where the processor, the input device, the output device, and the memory are connected to each other, where the memory is used to store a computer program, and the computer program includes program instructions, and the processor is configured to call the program instructions to execute the method according to the first aspect.
In a fourth aspect, the present application also discloses a computer-readable storage medium storing a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method of the first aspect described above.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be described in further detail with reference to the accompanying drawings, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the examples of the present invention and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise, and "a plurality" typically includes at least two.
It should be understood that the term "and/or" as used herein is merely one type of association that describes an associated object, meaning that three relationships may exist, e.g., a and/or B may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
It should be understood that although the terms first, second, third, etc. may be used to describe … … in embodiments of the present invention, these … … should not be limited to these terms. These terms are used only to distinguish … …. For example, the first … … can also be referred to as the second … … and similarly the second … … can also be referred to as the first … … without departing from the scope of embodiments of the present invention.
The words "if", as used herein, may be interpreted as "at … …" or "at … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It is also noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that an article of manufacture or terminal equipment that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such article of manufacture or terminal equipment. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of additional like elements in the article of commerce or the terminal device in which the element is included.
With the continuous development of Virtual Reality (VR) technology and the continuous reduction of equipment cost, virtual reality has gradually entered people's study, work and entertainment. Therefore, many game companies develop games based on virtual reality technology; such games can construct more vivid game situations and thus bring players a more immersive game experience.
However, users in VR games tend to be immersed in the virtual scene displayed by the VR glasses and cannot observe the surrounding real environment. Therefore, a person or object suddenly appearing in the surrounding environment may be injured by a user immersed in the game, and the immersed user may also trip over obstacles, which poses a certain safety risk. For example, when a user is waving a handle to simulate fighting a "monster" in a VR game, a person or pet that suddenly intrudes into the game area may be injured by the user; obstacles such as tables and cabinets within the gaming area may also trip the user while the user walks in the virtual environment of the VR game.
In a first aspect, as shown in fig. 1, the present application discloses a VR game method based on a clustering algorithm, including:
101. Acquiring a current environment image through a camera.
In the embodiment of the application, the VR game safety guarantee method is mainly applied to VR head-mounted display equipment worn by a user, and a camera is arranged on one side of the VR head-mounted display equipment facing a real environment to shoot a current environment image where the user is located in real time.
102. Performing region segmentation on the filtered current environment image by using an FCM clustering algorithm.
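The filtering applied before segmentation is not specified in this text. By way of illustration only, the sketch below assumes a 3x3 median filter, a common pre-segmentation denoising choice; the filter type and window size are assumptions of this sketch, not part of the claimed method:

```python
import numpy as np

def median_filter(image, k=3):
    """3x3 median filter with reflection padding at the borders.
    The specific filter is assumed here; the source text only says
    the image is filtered before segmentation."""
    pad = k // 2
    padded = np.pad(image, pad, mode="reflect")
    out = np.empty_like(image)
    h, w = image.shape
    for i in range(h):
        for j in range(w):
            # Median of the k x k neighborhood centered on (i, j).
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

noisy = np.array([[10, 10, 10],
                  [10, 255, 10],
                  [10, 10, 10]], dtype=np.uint8)
print(median_filter(noisy))  # the impulse at the center is removed
```

The median filter suppresses impulse noise without blurring region edges as strongly as a mean filter, which matters when the segmentation that follows relies on gray-level boundaries.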
In the embodiment of the present application, Fuzzy C-means (FCM), a typical unsupervised clustering algorithm, is an improvement of the conventional C-means algorithm. It introduces the concept of fuzzy sets into cluster analysis, so that an object is not restricted to the two extreme states of "belonging" or "not belonging" to a class, but can express the degree to which it belongs to each class through a membership function between 0 and 1. The FCM algorithm generalizes the K-means algorithm on the basis of fuzzy mathematics and clusters by optimizing a fuzzy objective function: unlike K-means clustering, which considers each point to belong to exactly one class, FCM assigns each point a degree of membership to every class, which better describes the ambiguous character of edge pixels and suits the inherent uncertainty of things. Using the unsupervised fuzzy clustering property of the FCM algorithm for image segmentation reduces human intervention and fits the uncertain and fuzzy characteristics of images.
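A minimal sketch of the FCM iteration described above, on one-dimensional gray values (illustrative only; the parameters m, eps and max_iter mirror the fuzzy index m, iteration stop threshold epsilon and maximum iteration number L received in the embodiments):

```python
import numpy as np

def fcm(data, c, m=2.0, eps=1e-5, max_iter=100, seed=0):
    """Minimal fuzzy C-means on 1-D gray values (illustrative sketch).
    Returns the membership matrix u (c x N) and the cluster centers."""
    rng = np.random.default_rng(seed)
    x = np.asarray(data, dtype=float).reshape(-1, 1)   # N x 1 samples
    u = rng.random((c, len(x)))
    u /= u.sum(axis=0)                                 # memberships sum to 1
    p = 2.0 / (m - 1.0)
    for _ in range(max_iter):
        um = u ** m
        v = (um @ x) / um.sum(axis=1, keepdims=True)   # weighted cluster centers
        d = np.abs(x.T - v) + 1e-12                    # c x N distances
        # Standard FCM membership update: u_ik ∝ 1 / d_ik^(2/(m-1)).
        u_new = (1.0 / d ** p) / (1.0 / d ** p).sum(axis=0)
        if np.abs(u_new - u).max() < eps:              # stop threshold epsilon
            u = u_new
            break
        u = u_new
    return u, v.ravel()

gray = [10, 12, 11, 200, 205, 198]     # two obvious gray-level groups
u, centers = fcm(gray, c=2)
labels = u.argmax(axis=0)              # hard assignment by maximum membership
print(sorted(round(float(v)) for v in centers))
```

For image segmentation the samples would be the gray values of all pixels, and each pixel's category is the one to which its membership is largest, matching the definition of the category C_p above.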
103. Identifying each segmented current environment image area, and judging whether an obstacle exists in each current environment image area.
In the embodiment of the present application, "identifying each segmented current environment image region" may include: extracting image features of a current environment image area by using a convolutional neural network image processing technology to obtain an image feature map of the current environment image area, wherein the image feature map comprises barrier elements; and performing content identification on the obstacle elements in the image feature map to obtain an obstacle identification result.
In the embodiment of the present application, the obstacle refers to a person or an object appearing in the user's VR game area. For example, when a user is waving a handle to simulate fighting a "monster" in a VR game, a person or pet that suddenly intrudes into the game area may be injured by the user; obstacles such as tables and cabinets within the gaming area may also trip the user while the user walks in the virtual environment of the VR game.
104. Determining the current environment image area with the obstacle as the target area.
105. Displaying a virtual alert image within the target area.
In the embodiment of the application, after the target area with the obstacle is confirmed, the user should be reminded in various ways to avoid the obstacle or stop the game. An effective reminding mode is to display a virtual warning image, such as an alarm icon, at the position corresponding to the target area in the game picture through the VR head-mounted display.
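Steps 101 to 105 can be sketched end to end as follows (illustrative only; every helper here is a hypothetical placeholder for the corresponding step, not the claimed implementation):

```python
import numpy as np

def acquire_environment_image():
    """Placeholder for the head-mounted camera capture (step 101)."""
    rng = np.random.default_rng(0)
    return rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

def segment_regions(image, num_regions=3):
    """Placeholder for FCM region segmentation (step 102): a simple
    gray-level quantization stands in for the clustering here."""
    return image.astype(np.int32) * num_regions // 256

def region_contains_obstacle(image, labels, region_id):
    """Placeholder for per-region obstacle recognition (step 103);
    this toy rule flags region 0 only."""
    return bool(np.any(labels == region_id)) and int(region_id) == 0

def vr_safety_pipeline():
    image = acquire_environment_image()
    labels = segment_regions(image)
    targets = [int(r) for r in np.unique(labels)
               if region_contains_obstacle(image, labels, r)]  # step 104
    for r in targets:
        print(f"display virtual alert image in region {r}")    # step 105
    return targets

print(vr_safety_pipeline())
```

The value of laying the steps out this way is that each stage (capture, segmentation, recognition, alerting) can be swapped independently, which is exactly how the modules of the second aspect divide the work.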
According to the VR game system and method based on the clustering algorithm and the terminal device provided herein, when a user wears a VR head-mounted display to play a VR game, a camera of the VR head-mounted display facing the external real environment acquires a current environment image of the real environment in real time. The current environment image is subjected to region segmentation by an FCM clustering algorithm, and a target region containing an obstacle is identified so that an appropriate prompt can be given to the user, avoiding unnecessary injury between the user and the obstacle during the game and guaranteeing the safety of the user during VR play.
As an optional implementation, after determining the current environment image area where the obstacle exists as the target area, the method further includes: pausing the current animation and audio playback in the VR game.
It will be appreciated that upon identifying a target area in which an obstacle is present, the user should be prompted appropriately to avoid unnecessary injury between the user and the obstacle during the game. Various prompting modes are possible, including pausing the current animation and audio playback in the VR game, so that the user realizes that a potential safety hazard exists in the current real environment and can stop the game or move to another position to continue playing.
As an optional implementation manner, performing region segmentation on the filtered current environment image by using an FCM clustering algorithm includes:
receiving clustering algorithm parameters, wherein the clustering algorithm parameters comprise a fuzzy index m, an iteration stop threshold epsilon and a maximum iteration number L;
receiving a cluster category number set c, where c = {c_h, c_h = 1, 2, …, 20}, and c_h is a number of clustering categories;
performing, through the FCM clustering algorithm, region segmentation on the filtered current environment image based on the clustering algorithm parameters and each cluster category number c_h in the cluster category number set c, to obtain an image segmentation result set P;
wherein P = {C_t, t = 1, 2, …, 20} and C_t = {C_p, p = 1, 2, …, c_h};
wherein C_t represents the category set obtained by performing region segmentation on the filtered current environment image through the FCM clustering algorithm based on the clustering algorithm parameters and the cluster category number c_h; and C_p represents the p-th category in the current environment image, i.e., the set of pixels whose maximum degree of membership is to category C_p in the current environment image;
calculating, by using a clustering validity index function J(C_t), each category set C_t in the image segmentation result set P, to obtain a clustering validity index value set J, wherein J = {J_z, z = 1, 2, …, 20};
selecting a minimum value J_min from the clustering validity index value set J, and determining the cluster category number corresponding to the minimum value J_min as the target cluster category number c_x;
performing, through the FCM clustering algorithm, region segmentation on the filtered current environment image based on the clustering algorithm parameters and the target cluster category number c_x, and determining the obtained category set C_x as the final target region segmentation result.
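The sweep over candidate cluster numbers and the selection of c_x at the minimum index value can be sketched as follows. This is illustrative only: `fcm_segment` and `validity_index` are stand-ins, and the toy index with a minimum at c_h = 4 is an assumption of this sketch, since the exact formula for J(C_t) is not reproduced in this text:

```python
def fcm_segment(image, c_h):
    """Stand-in for FCM region segmentation with c_h clusters; the real
    method would return the category set C_t."""
    return {"num_clusters": c_h}

def validity_index(segmentation):
    """Stand-in for the clustering validity index J(C_t): a toy convex
    score whose minimum sits at c_h = 4 (an assumption of this sketch)."""
    c = segmentation["num_clusters"]
    return (c - 4) ** 2 + 1.0

def select_target_cluster_number(image, candidates=range(1, 21)):
    results = {c: fcm_segment(image, c) for c in candidates}         # set P
    scores = {c: validity_index(seg) for c, seg in results.items()}  # set J
    c_x = min(scores, key=scores.get)   # cluster number at J_min
    return c_x, results[c_x]

c_x, best_segmentation = select_target_cluster_number(None)
print(c_x)
```

The structure matters more than the toy numbers: one segmentation per candidate c_h, one index value per segmentation, and the argmin of the index picks both c_x and the final segmentation C_x in a single pass.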
In the embodiment of the present application, let C_p' represent the p'-th category in the current environment image. When an edge pixel in category C_p' is adjacent to an edge pixel of category C_p, category C_p' is classified as a neighborhood category of category C_p. Let Y(C_p) represent the set of neighborhood categories of category C_p in the current environment image, with Y(C_p) = {C_{p,q}, q = 1, …, y(C_p)}, wherein C_{p,q} represents the q-th neighborhood category of category C_p and y(C_p) represents the number of neighborhood categories of category C_p present in the category set C_t.
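The neighborhood-category relation just defined can be computed over a segmentation label map as follows (illustrative sketch; 4-connectivity between pixels is an assumption of this sketch):

```python
import numpy as np

def neighborhood_classes(labels):
    """Return, for each class p, the set Y(C_p) of classes whose pixels
    are 4-adjacent to pixels of p (4-connectivity is assumed)."""
    neighbors = {int(p): set() for p in np.unique(labels)}
    h, w = labels.shape
    for i in range(h):
        for j in range(w):
            # Check the right and bottom neighbor of each pixel once.
            for di, dj in ((0, 1), (1, 0)):
                ni, nj = i + di, j + dj
                if ni < h and nj < w and labels[i, j] != labels[ni, nj]:
                    a, b = int(labels[i, j]), int(labels[ni, nj])
                    neighbors[a].add(b)
                    neighbors[b].add(a)
    return neighbors

label_map = np.array([[0, 0, 1],
                      [0, 2, 1],
                      [2, 2, 1]])
print(neighborhood_classes(label_map))
```

For the 3x3 label map above, every class borders the other two, so each Y(C_p) contains both remaining classes; y(C_p) is simply the size of each set.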
It can be understood that the FCM clustering algorithm performs region segmentation on the current environment image based on different cluster category numbers to obtain different region segmentation results. In the image segmentation process, the edge pixels of each image region are relatively prone to misclassification; for example, the edge pixels of a neighborhood category of a certain category may have a high degree of membership to that category, and likewise, the edge pixels of the category may have a high degree of membership to the neighborhood category. The application can therefore adopt a clustering validity index function J(C_t) to determine the optimal cluster category number of the FCM clustering algorithm, i.e., the target cluster category number c_x: when the clustering validity index value J_z reaches its minimum, the corresponding category number is the optimal cluster number. Region segmentation is then performed on the filtered current environment image through the FCM clustering algorithm based on the optimal cluster number, and the obtained category set is the optimal region segmentation result, i.e., the final target region segmentation result. Because this clustering validity index detects the edge pixels that are easily misclassified between categories, it can measure the degree of separation and the degree of overlap between categories in the image segmentation process more accurately.
As an alternative embodiment, calculating, by using a clustering validity index function J(C_t), each category set C_t in the image segmentation result set P includes:
calculating, for each category set C_t, a first compactness measure coefficient ρ(C_p) and a second compactness measure coefficient ρ(C'_p(C_{p,q})), wherein the first compactness measure coefficient ρ(C_p) represents the compactness of the pixels in category C_p, and the second compactness measure coefficient ρ(C'_p(C_{p,q})) represents the compactness of the pixels in the set C'_p(C_{p,q}), wherein C'_p(C_{p,q}) represents the set formed by the pixels of category C_p together with the pixel set C_{p,2}(C_{p,q}), wherein the set C_{p,2}(C_{p,q}) represents the set of pixels in category C_{p,q} whose second-largest degree of membership, to category C_p, is greater than a preset membership threshold μ0, and wherein C_{p,q} represents the q-th neighborhood category of category C_p;
calculating, for each category set C_t, an integrated inter-class separation measure coefficient D(C_p);
calculating the clustering validity index function J(C_t) by the following formula:
It is understood that the preset membership threshold μ0 may be factory-set by the manufacturer; for example, μ0 may be set to 0.3. The clustering validity index function includes the first compactness measure coefficient ρ(C_p) and the integrated inter-class separation measure coefficient D(C_p); that is, the quality of the clustering effect of the FCM clustering algorithm is determined by both the first compactness measure coefficient ρ(C_p) and the integrated inter-class separation measure coefficient D(C_p).
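The exact formula for J(C_t) is not reproduced in this text. Purely as an illustration of the compactness-versus-separation trade-off it encodes, the sketch below uses an assumed ratio form (total compactness divided by minimum inter-class separation, smaller being better); every formula in it is an assumption of this sketch, not the patent's definition:

```python
import numpy as np

def compactness(pixels, center):
    """Illustrative compactness: mean absolute gray-value deviation from
    the class center (an assumed form, not the patent's formula)."""
    return float(np.mean(np.abs(np.asarray(pixels, float) - center)))

def separation(center_a, center_b):
    """Illustrative separation: gray-value distance between class means."""
    return abs(center_a - center_b)

def validity(classes):
    """Illustrative index: total compactness over minimum separation;
    smaller values indicate tighter, better-separated classes."""
    centers = [float(np.mean(c)) for c in classes]
    rho = sum(compactness(c, v) for c, v in zip(classes, centers))
    d = min(separation(a, b) for i, a in enumerate(centers)
            for b in centers[i + 1:])
    return rho / d

good = [[10, 11, 12], [200, 201, 202]]   # tight, well-separated classes
bad = [[10, 90, 12], [110, 201, 202]]    # loose, overlapping classes
print(validity(good) < validity(bad))
```

The sketch reproduces only the qualitative behavior stated in the text: an index built from a compactness term and a separation term rewards segmentations whose classes are internally tight and mutually far apart.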
As an alternative embodiment, calculating, for each category set C_t, the first compactness measure coefficient ρ(C_p) and the second compactness measure coefficient ρ(C'_p(C_{p,q})) includes:
calculating the first compactness measure coefficient ρ(C_p) by the following formula:
calculating the second compactness measure coefficient ρ(C'_p(C_{p,q})) by the following formula:
wherein I'(a_r, b_r) represents the pixel at coordinate (a_r, b_r) in category C_p; f'(a_r, b_r) represents the gray value of pixel I'(a_r, b_r); V_p represents the cluster center pixel of category C_p; f'(V_p) represents the gray value of the cluster center pixel V_p; N(C_p) represents the number of pixels in category C_p; I'(a_z, b_z) represents the pixel at coordinate (a_z, b_z) in category C_p; f'(a_z, b_z) represents the gray value of pixel I'(a_z, b_z); I'(A_R, B_R) represents the pixel at coordinate (A_R, B_R) in the set C'_p(C_{p,q}); f'(A_R, B_R) represents the gray value of pixel I'(A_R, B_R); I'(A_Z, B_Z) represents the pixel at coordinate (A_Z, B_Z) in the set C'_p(C_{p,q}); f'(A_Z, B_Z) represents the gray value of pixel I'(A_Z, B_Z); and N(C'_p(C_{p,q})) represents the number of pixels in the set C'_p(C_{p,q}).
As an alternative embodiment, calculating, for each category set C_t, the integrated inter-class separation measure coefficient D(C_p) includes:
calculating the integrated inter-class separation measure coefficient D(C_p) by the following formula:
wherein D1(C_p, C_{p,q}) represents a first inter-class separation measure coefficient between category C_p and neighborhood category C_{p,q}; D2(C_p, C_{p,q}) represents a second inter-class separation measure coefficient between category C_p and neighborhood category C_{p,q}; and y(C_p) represents the number of neighborhood categories of category C_p present in the set C_t.
It can be understood that the integrated inter-class separation measure coefficient D(C_p) includes the first inter-class separation measure coefficient D1(C_p, C_{p,q}) and the second inter-class separation measure coefficient D2(C_p, C_{p,q}); that is, the quality of the clustering effect of the FCM clustering algorithm is also determined by the first inter-class separation measure coefficient D1(C_p, C_{p,q}) and the second inter-class separation measure coefficient D2(C_p, C_{p,q}).
As an alternative embodiment, calculating, for each category set C_t, the first inter-class separation measure coefficient D1(C_p, C_{p,q}) and the second inter-class separation measure coefficient D2(C_p, C_{p,q}) includes:
calculating the first inter-class separation measure coefficient D1(C_p, C_{p,q}) by the following formula:
calculating the second inter-class separation measure coefficient D2(C_p, C_{p,q}) by the following formula:
wherein f̄'(C_p) represents the mean of the gray values of the pixels in class C_p, f̄'(C_{p,q}) represents the mean of the gray values of the pixels in neighborhood class C_{p,q}, I'(α_g, β_g) represents the pixel at coordinate (α_g, β_g) in the set C_{p,q,2}(C_p), f'(α_g, β_g) represents the gray value of pixel I'(α_g, β_g), I'(d_w, e_w) represents the pixel at coordinate (d_w, e_w) in the set C_{p,2}(C_{p,q}), f'(d_w, e_w) represents the gray value of pixel I'(d_w, e_w), N(C_{p,q,2}(C_p)) represents the number of pixels in the set C_{p,q,2}(C_p), and N(C_{p,2}(C_{p,q})) represents the number of pixels in the set C_{p,2}(C_{p,q}).
It can be understood that, when calculating the first inter-class separation metric D_1(C_p, C_{p,q}) of the class to be measured, the selected possible edge pixels of the class to be measured are compared with the selected pixels inside the class to be measured: the smaller the difference, the higher the probability that the selected possible edge pixels belong to the class to be measured; at this time, the value of the sine part of the first inter-class separation metric of the class to be measured is smaller, that is, the first inter-class separation metric of the class to be measured decreases. Conversely, the larger the difference, the lower the probability that the selected possible edge pixels belong to the class to be measured, and at this time the first inter-class separation metric of the class to be measured increases;
when calculating the second inter-class separation metric D_2(C_p, C_{p,q}) of the class to be measured, the selected possible edge pixels of the class to be measured are compared with the selected possible edge pixels of the neighborhood class: the smaller the difference, the greater the degree of overlap between the class to be measured and the neighborhood class, and at this time the value of the second inter-class separation metric of the class to be measured is smaller; the larger the difference, the smaller the degree of overlap between the class to be measured and the neighborhood class, that is, the edge pixels of the region are effectively divided, and at this time the value of the second inter-class separation metric of the class to be measured is larger. Compared with the conventional way of measuring the degree of inter-class separation by means of a cluster validity index, these two metrics thus reflect more directly how well edge pixels are partitioned between adjacent regions.
in a second aspect, as shown in fig. 2, the present application discloses a VR game terminal device, which is configured to execute any one of the above VR game system and method based on the clustering algorithm, and includes:
the system comprises a camera 201, a cluster calculation module 202, an identification module 203 and a processing module 204;
the camera 201 is used for acquiring a current environment image;
the cluster calculation module 202 is configured to perform region segmentation on the filtered current environment image by using an FCM clustering algorithm;
the identification module 203 is configured to identify each segmented current environment image area, and determine whether an obstacle exists in each current environment image area; determining a current environment image area with an obstacle as a target area;
and the processing module 204 is configured to display the virtual warning image in the target area.
It should be noted that the functions of each functional device of the VR game security assurance terminal device shown in fig. 2 may be specifically implemented according to the method in the method embodiment shown in fig. 1, and the specific implementation process may refer to the related description of the method embodiment of fig. 1, which is not described herein again.
In a third aspect, the application also discloses another terminal device. As shown in fig. 3, the terminal device in this embodiment may include: one or more processors 301, one or more input devices 302, one or more output devices 303, and a memory 304. The processor 301, the input device 302, the output device 303, and the memory 304 are connected by a bus 305. The memory 304 is used to store a computer program comprising program instructions, and the processor 301 is used to execute the program instructions stored in the memory 304. The processor 301 is configured to call the program instructions to perform the following operations:
acquiring a current environment image through a camera;
carrying out region segmentation on the filtered current environment image by using an FCM clustering algorithm;
identifying each segmented current environment image area, and judging whether an obstacle exists in each current environment image area;
determining a current environment image area with an obstacle as a target area;
displaying a virtual alert image within the target area.
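The region-segmentation step above can be illustrated with a minimal fuzzy C-means (FCM) clustering over pixel gray values. This is a generic textbook FCM sketch, not the specification's full pipeline, which additionally performs filtering, neighborhood-set construction, and the compactness/separation metrics described earlier; the function name and parameter choices are assumptions.

```python
import numpy as np

def fcm_segment(gray_image, n_classes=3, m=2.0, n_iter=50, seed=0):
    """Segment a grayscale image by fuzzy C-means on gray values.

    Standard FCM with fuzzifier m: alternate between updating the
    cluster centers from the membership matrix and updating the
    memberships from the distances to the centers.
    """
    rng = np.random.default_rng(seed)
    x = gray_image.reshape(-1).astype(float)      # one feature per pixel: gray value
    u = rng.random((n_classes, x.size))
    u /= u.sum(axis=0)                            # memberships of each pixel sum to 1
    for _ in range(n_iter):
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)       # membership-weighted cluster centers
        dist = np.abs(x[None, :] - centers[:, None]) + 1e-9
        u = dist ** (-2.0 / (m - 1.0))            # closer center -> larger membership
        u /= u.sum(axis=0)
    labels = u.argmax(axis=0).reshape(gray_image.shape)
    return labels, centers
```

Each resulting label map region would then be passed to the identification step, which decides whether the region contains an obstacle and, if so, marks it as the target area for the virtual alert image.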
It should be understood that, in the embodiment of the present invention, the Processor 301 may be a Central Processing Unit (CPU), and the Processor may also be other general-purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, and the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The input device 302 may include a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of the fingerprint), a microphone, etc., and the output device 303 may include a display (LCD, etc.), a speaker, etc.
The memory 304 may include a read-only memory and a random access memory, and provides instructions and data to the processor 301. A portion of the memory 304 may also include non-volatile random access memory. For example, the memory 304 may also store device type information.
In a specific implementation, the processor 301, the input device 302, and the output device 303 described in this embodiment of the present invention may execute the implementation manners described in the foregoing embodiments of the VR game method based on the clustering algorithm provided in the embodiments of the present invention, and may also execute the implementation manner of the terminal device described in this embodiment of the present invention, which is not described herein again.
In a fourth aspect, in another embodiment of the invention, there is provided a computer-readable storage medium storing a computer program comprising program instructions that, when executed by a processor, implement:
acquiring a current environment image through a camera;
carrying out region segmentation on the filtered current environment image by using an FCM clustering algorithm;
identifying each segmented current environment image area, and judging whether an obstacle exists in each current environment image area;
determining a current environment image area with an obstacle as a target area;
displaying a virtual alert image within the target area.
The computer readable storage medium may be an internal storage unit of the terminal device in any of the foregoing embodiments, for example, a hard disk or a memory of the terminal device. The computer-readable storage medium may also be an external storage device of the terminal device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided in the terminal device. Further, the computer-readable storage medium may include both an internal storage unit and an external storage device of the terminal device. The computer-readable storage medium stores the computer program and other programs and data required by the terminal device. The above-described computer-readable storage medium may also be used to temporarily store data that has been output or is to be output.
Those of ordinary skill in the art will appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be embodied in electronic hardware, computer software, or combinations of both, and that the components and steps of the examples have been described in a functional general in the foregoing description for the purpose of illustrating clearly the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the terminal device and the unit described above may refer to corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed terminal device and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the above-described division of units is only one type of division of logical functions, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may also be an electric, mechanical or other form of connection.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable storage medium if it is implemented in the form of a software functional unit and sold or used as a separate product. Based on such understanding, the technical solution of the present invention essentially contributes to the prior art, or all or part of the technical solution can be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method in the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting them; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; such modifications and substitutions do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of the present application, and shall all be covered by the claims and the specification of the present application.