CN112348827A - VR game system and method based on clustering algorithm - Google Patents

VR game system and method based on clustering algorithm

Publication number: CN112348827A (application CN202011158295.7A); granted as CN112348827B
Authority: CN (China)
Legal status: Active (granted)
Original language: Chinese (zh)
Inventor: 罗子尧
Original assignee: Individual; current assignee: Shenzhen Ruiyun Technology Co ltd

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/428 Processing input control signals of video game devices by mapping the input signals into game commands, involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/23 Clustering techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection

Abstract

When a user wears a VR head-mounted display to play a VR game, a camera on the display facing the external real environment captures an image of the current environment in real time. The image is segmented into regions with the FCM clustering algorithm, and any region containing an obstacle is identified so that the user can be prompted appropriately. This avoids unnecessary injury from collisions between the user and obstacles during the game and safeguards the user throughout the VR session.

Description

VR game system and method based on clustering algorithm
Technical Field
The application relates to the technical field of virtual reality, in particular to a VR game system and method based on a clustering algorithm.
Background
With the continuous development of virtual reality (VR) technology and the steady fall in equipment cost, virtual reality has gradually entered people's study, work and entertainment. Many game companies therefore develop games based on VR technology; such games can construct more lifelike in-game scenes and so give players a more immersive experience.
However, a user in a VR game is typically immersed in the virtual scene shown by the VR glasses and cannot observe the surrounding real environment. A person or object that suddenly appears nearby may therefore be struck by the immersed user, or may trip the user, posing a certain safety risk.
Disclosure of Invention
Aiming at the defects in the prior art, a VR game system and method based on a clustering algorithm are provided. When a user wears a VR head-mounted display to play a VR game, a camera on the display facing the external real environment captures an image of the current environment in real time; the image is segmented into regions with the FCM clustering algorithm, and any region containing an obstacle is identified so that the user can be given an appropriate prompt, avoiding unnecessary injury from collisions between the user and obstacles during the game.
In a first aspect, the application discloses a VR game method based on a clustering algorithm, comprising:
acquiring a current environment image through a camera;
performing region segmentation on the filtered current environment image using a fuzzy C-means (FCM) clustering algorithm;
identifying each segmented current environment image area, and judging whether an obstacle exists in each current environment image area;
determining the current environment image area with obstacles as a target area;
displaying a virtual alert image within the target area.
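The five claimed steps can be sketched as a single per-frame pass. All function names below are illustrative placeholders, not from the patent:

```python
def safeguard_frame(frame, segment, has_obstacle, show_alert):
    """One pass of the claimed method over a single camera frame.

    segment, has_obstacle and show_alert are hypothetical callables standing
    in for the FCM region segmentation, the per-region obstacle check and
    the virtual-warning display described in the claims.
    """
    regions = segment(frame)                            # region segmentation
    targets = [r for r in regions if has_obstacle(r)]   # identify target areas
    for region in targets:
        show_alert(region)                              # display virtual alert image
    return targets
```

In a real headset this loop would run continuously on the live camera feed.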
With the VR game system and method based on the clustering algorithm and the terminal device provided here, when a user wears a VR head-mounted display to play a VR game, a camera on the display facing the external real environment captures an image of the current environment in real time; the image is segmented into regions with the FCM clustering algorithm, and any region containing an obstacle is identified so that the user can be prompted appropriately, avoiding unnecessary injury from collisions between the user and obstacles during the game and safeguarding the user throughout the VR session.
As an optional implementation manner, after the determining the current environment image area where the obstacle exists as the target area, the method further includes: pausing the current animation and audio playback in the VR game.
It will be appreciated that once a target area containing an obstacle has been identified, the user should be prompted appropriately to avoid unnecessary injury from collisions with the obstacle during the game. The prompt can take various forms, including pausing the current animation and audio playback in the VR game, so that the in-game user realizes that the current real environment poses a safety hazard and can stop the game or move to another position before continuing.
As an optional implementation manner, performing region segmentation on the filtered current environment image with the FCM clustering algorithm includes:
receiving clustering algorithm parameters, namely a fuzzy index m, an iteration-stop threshold ε and a maximum iteration count L;
receiving a set of cluster counts c, where c = {c_h, c_h = 1, 2, …, 20} and each c_h is a candidate number of clustering categories;
performing region segmentation on the filtered current environment image with the FCM clustering algorithm, under the clustering algorithm parameters, once for every cluster count c_h in c, to obtain an image segmentation result set P;
where P = {C_t, t = 1, 2, …, 20} and C_t = {C_p, p = 1, 2, …, c_h};
where C_t is the category set obtained by segmenting the filtered current environment image with the FCM clustering algorithm under the clustering algorithm parameters and cluster count c_h, and C_p is the p-th category in the current environment image, i.e. the set of pixels whose largest membership degree is to category C_p;
evaluating a clustering validity index function J(C_t) on each category set C_t in P to obtain a clustering validity index value set J, where J = {J_z, z = 1, 2, …, 20};
selecting the minimum value J_min from J and taking the cluster count corresponding to J_min as the target cluster count c_x;
segmenting the filtered current environment image with the FCM clustering algorithm under the clustering algorithm parameters and the target cluster count c_x; the resulting category set C_x is the final target region segmentation result.
It can be understood that the FCM clustering algorithm produces a different region segmentation of the current environment image for each cluster count. During image segmentation, the edge pixels of each image region are the most likely to be misclassified; for example, an edge pixel of a neighborhood class of a given class may have a high membership degree to that class, and likewise an edge pixel of the class may have a high membership degree to the neighborhood class. The application therefore uses the clustering validity index function J(C_t) to determine the optimal cluster count of the FCM clustering algorithm, i.e. the target cluster count c_x: when the clustering validity index value J_z reaches its minimum, the corresponding cluster count is optimal. Segmenting the filtered current environment image with the FCM clustering algorithm at this optimal cluster count yields the category set that is the optimal region segmentation result, i.e. the final target region segmentation result. Because the clustering validity index detects the edge pixels that are prone to misclassification between classes, it measures the separation and overlap between classes during image segmentation more accurately.
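The selection of the target cluster count can be sketched as a small search loop. The two callables are placeholders for the patent's FCM run and its validity index J(C_t):

```python
def best_cluster_count(image, fcm_segment, validity_index, counts=range(1, 21)):
    """Pick the cluster count c_x whose validity index value J is smallest.

    fcm_segment(image, c) -> category set; validity_index(category_set) -> float.
    Both are hypothetical stand-ins; the candidate set mirrors the patent's
    c_h = 1, ..., 20.
    """
    segmentations = {c: fcm_segment(image, c) for c in counts}
    scores = {c: validity_index(s) for c, s in segmentations.items()}
    c_x = min(scores, key=scores.get)      # cluster count attaining J_min
    return c_x, segmentations[c_x]         # final target segmentation C_x
```

Running FCM twenty times per frame is costly, so a real system might search a coarser grid or reuse the previous frame's c_x as a starting point.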
As an alternative embodiment, evaluating the clustering validity index function J(C_t) on each category set C_t in the image segmentation result set P includes:
computing, for each category set C_t, a first compactness measure ρ(C_p) and a second compactness measure ρ(C′_p(C_p,q)), where ρ(C_p) is the compactness of the pixels in class C_p and ρ(C′_p(C_p,q)) is the compactness of the pixels in the set C′_p(C_p,q); here C′_p(C_p,q) is the union of the pixels of class C_p and the set C_p,2(C_p,q), where C_p,2(C_p,q) is the set of pixels in class C_p,q whose second-largest membership degree, namely that to class C_p, is greater than a preset membership threshold μ0, and C_p,q is the q-th neighborhood class of class C_p;
computing, for each category set C_t, a comprehensive inter-class separation measure D(C_p);
calculating the clustering validity index function J(C_t) from ρ(C_p), ρ(C′_p(C_p,q)) and D(C_p) by a formula that appears only as an image in the original.
It is understood that the preset membership threshold μ0 may be set at the factory by the manufacturer; for example, μ0 may be set to 0.3. The clustering validity index function includes the first compactness measure ρ(C_p) and the comprehensive inter-class separation measure D(C_p); that is, the quality of the FCM clustering result is decided jointly by the first compactness measure ρ(C_p) and the comprehensive inter-class separation measure D(C_p).
As an optional implementation, computing the first compactness measure ρ(C_p) and the second compactness measure ρ(C′_p(C_p,q)) for each category set C_t includes:
calculating the first compactness measure ρ(C_p) by a first formula and the second compactness measure ρ(C′_p(C_p,q)) by a second formula (both formulas appear only as images in the original);
where I′(a_r, b_r) is the pixel at coordinates (a_r, b_r) in class C_p and f′(a_r, b_r) is its gray value; V_p is the cluster-centre pixel of class C_p and f′(V_p) is its gray value; N(C_p) is the number of pixels in class C_p; I′(a_z, b_z) is the pixel at coordinates (a_z, b_z) in class C_p and f′(a_z, b_z) is its gray value; I′(A_R, B_R) and I′(A_Z, B_Z) are the pixels at coordinates (A_R, B_R) and (A_Z, B_Z) in the set C′_p(C_p,q), with gray values f′(A_R, B_R) and f′(A_Z, B_Z); and N(C′_p(C_p,q)) is the number of pixels in C′_p(C_p,q).
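The compactness formulas survive only as images in the source. One common compactness measure built from exactly the quantities named above (pixel gray values, the cluster-centre gray value f′(V_p) and the pixel count) is the mean absolute gray-level deviation from the centre; the sketch below uses that reading and may differ from the patent's exact expression:

```python
def compactness(gray_values, center_gray):
    """Mean absolute deviation of a pixel set's gray values from the
    cluster-centre gray value -- one plausible reading of rho(C_p).
    The same function applies unchanged to the enlarged set C'_p(C_p,q).
    """
    return sum(abs(g - center_gray) for g in gray_values) / len(gray_values)
```

A tight, uniform class yields a value near zero; a class containing misassigned edge pixels yields a larger one.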
As an optional implementation, computing the comprehensive inter-class separation measure D(C_p) for each category set C_t includes:
calculating the comprehensive inter-class separation measure D(C_p) by a formula that appears only as an image in the original;
where D_1(C_p, C_p,q) is the first inter-class separation measure between class C_p and neighborhood class C_p,q, D_2(C_p, C_p,q) is the second inter-class separation measure between them, and y(C_p) is the number of neighborhood classes of class C_p in the set C_t.
It can be understood that the comprehensive inter-class separation measure D(C_p) includes the first inter-class separation measure D_1(C_p, C_p,q) and the second inter-class separation measure D_2(C_p, C_p,q); that is, the quality of the FCM clustering result is also decided by these two inter-class separation measures.
As an optional implementation, computing the first inter-class separation measure D_1(C_p, C_p,q) and the second inter-class separation measure D_2(C_p, C_p,q) for each category set C_t includes:
calculating the first inter-class separation measure D_1(C_p, C_p,q) by a first formula and the second inter-class separation measure D_2(C_p, C_p,q) by a second formula (both formulas, together with the expressions they use for the mean gray value of the pixels in class C_p and the mean gray value of the pixels in neighborhood class C_p,q, appear only as images in the original);
where I′(α_g, β_g) is the pixel at coordinates (α_g, β_g) in the set C_p,q,2(C_p) and f′(α_g, β_g) is its gray value; I′(d_w, e_w) is the pixel at coordinates (d_w, e_w) in the set C_p,2(C_p,q) and f′(d_w, e_w) is its gray value; and N(C_p,q,2(C_p)) and N(C_p,2(C_p,q)) are the numbers of pixels in the sets C_p,q,2(C_p) and C_p,2(C_p,q) respectively.
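The D_1/D_2 formulas likewise survive only as images. The qualitative behavior described below (a small gray-level difference between the two sets of likely edge pixels means heavy overlap and a small separation value) can be illustrated with a simple mean-difference stand-in; this is not the patent's exact formula:

```python
def inter_class_separation(edge_gray_p, edge_gray_q):
    """Illustrative separation between the likely edge pixels of a class
    (the set C_p,q,2(C_p)) and those of its neighborhood class (the set
    C_p,2(C_p,q)): the absolute difference of their mean gray values.
    """
    mean_p = sum(edge_gray_p) / len(edge_gray_p)
    mean_q = sum(edge_gray_q) / len(edge_gray_q)
    return abs(mean_p - mean_q)
```

Identical edge-pixel sets (total overlap) give zero; well-separated sets give a large value, matching the monotonic behavior the text ascribes to D_2.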
It will be appreciated that the first inter-class separation measure D_1(C_p, C_p,q) compares the selected possible edge pixels of the class under test with the class itself: the smaller the difference between them, the higher the probability that those pixels genuinely belong to the class, and the smaller the value of the sine part of D_1, reducing the first inter-class separation measure; conversely, the larger the difference, the lower that probability and the larger the first inter-class separation measure becomes.
The second inter-class separation measure D_2(C_p, C_p,q) compares the selected possible edge pixels of the class under test with those of its neighborhood class: the smaller the difference between them, the greater the overlap between the class and its neighborhood class and the smaller the value of D_2; the larger the difference, the smaller the overlap, i.e. the region's edge pixels have been divided effectively, and the larger the value of D_2. Compared with conventional ways of measuring inter-class separation with a cluster validity index, this detects the edge pixels that are prone to misclassification between classes and so characterizes the separation and overlap between classes more accurately.
in a second aspect, the present application discloses a VR game security assurance terminal device, where the terminal device is configured to execute any one of the above VR game systems and methods based on a clustering algorithm, and the VR game security assurance terminal device includes:
the device comprises a camera, a clustering calculation module, an identification module and a processing module;
the camera is used for acquiring a current environment image;
the cluster calculation module is used for carrying out region segmentation on the current environment image after filtering processing by utilizing an FCM (fuzzy C-means) clustering algorithm;
the identification module is used for identifying each segmented current environment image area and judging whether an obstacle exists in each current environment image area; determining the current environment image area with obstacles as a target area;
the processing module is used for displaying a virtual warning image in the target area.
In a third aspect, the present application also discloses a terminal device, including a processor, an input device, an output device, and a memory, where the processor, the input device, the output device, and the memory are connected to each other, where the memory is used to store a computer program, and the computer program includes program instructions, and the processor is configured to call the program instructions to execute the method according to the first aspect.
In a fourth aspect, the present application also discloses a computer-readable storage medium storing a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method of the first aspect described above.
Drawings
In order to illustrate the embodiments of the present application and the technical solutions in the prior art more clearly, the drawings needed in their description are briefly introduced below. Throughout the drawings, like elements or portions are generally identified by like reference numerals. In the drawings, elements or portions are not necessarily drawn to scale.
Fig. 1 is a schematic flowchart of a VR game system and method based on a clustering algorithm according to an embodiment of the present application;
fig. 2 is a schematic connection diagram of a terminal device of a VR game system according to an embodiment of the present disclosure;
fig. 3 is a schematic connection diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be described in further detail with reference to the accompanying drawings, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the examples of the present invention and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise, and "a plurality" typically includes at least two.
It should be understood that the term "and/or" as used herein is merely one type of association that describes an associated object, meaning that three relationships may exist, e.g., a and/or B may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
It should be understood that although the terms first, second, third, etc. may be used to describe … … in embodiments of the present invention, these … … should not be limited to these terms. These terms are used only to distinguish … …. For example, the first … … can also be referred to as the second … … and similarly the second … … can also be referred to as the first … … without departing from the scope of embodiments of the present invention.
The words "if", as used herein, may be interpreted as "at … …" or "at … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It is also noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that an article of manufacture or terminal equipment that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such article of manufacture or terminal equipment. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of additional like elements in the article of commerce or the terminal device in which the element is included.
With the continuous development of virtual reality (VR) technology and the steady fall in equipment cost, virtual reality has gradually entered people's study, work and entertainment. Many game companies therefore develop games based on VR technology; such games can construct more lifelike in-game scenes and so give players a more immersive experience.
However, a user in a VR game is typically immersed in the virtual scene shown by the VR glasses and cannot observe the surrounding real environment. A person or object that suddenly appears nearby may therefore be struck by the immersed user, or may trip the user, posing a certain safety risk. For example, when a user is waving a handle to fight a simulated "monster" in a VR game, a person or pet that suddenly intrudes into the game area may be injured by the in-game user; obstacles such as tables and cabinets within the game area may also trip the user while the user walks in the virtual environment of the VR game.
In a first aspect, as shown in fig. 1, the present application discloses a VR game method based on a clustering algorithm, including:
101. and acquiring a current environment image through a camera.
In the embodiment of the application, the VR game safety guarantee method is mainly applied to VR head-mounted display equipment worn by a user, and a camera is arranged on one side of the VR head-mounted display equipment facing a real environment to shoot a current environment image where the user is located in real time.
102. And carrying out region segmentation on the filtered current environment image by using an FCM clustering algorithm.
In the embodiment of the present application, fuzzy C-means (FCM), a typical unsupervised clustering algorithm, is an improvement on the conventional C-means algorithm. It introduces the concept of fuzzy sets into cluster analysis, so that an object is no longer restricted to the two extreme states of "belonging" or "not belonging" to a class; instead, a membership function taking values between 0 and 1 expresses the degree to which it belongs to each class. FCM generalizes K-means on the basis of fuzzy mathematics by optimizing a fuzzy objective function: unlike K-means clustering, which assigns each point to exactly one class, FCM gives each point a membership degree to every class. This better describes pixels with ambiguous class attribution, such as edge pixels, and suits the inherent uncertainty of the data. Segmenting the image with FCM's unsupervised fuzzy clustering reduces human intervention and fits the uncertainty and fuzziness characteristic of images.
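A minimal FCM implementation on one-dimensional features (e.g. pixel gray values) shows the alternating membership/centre updates; the parameter names follow the fuzzy index m, iteration-stop threshold ε and maximum iteration count L from the text, and the update rules are the standard FCM ones rather than anything patent-specific:

```python
import numpy as np

def fcm(data, c, m=2.0, eps=1e-5, max_iter=100, seed=0):
    """Fuzzy C-means on a 1-D feature vector.

    Returns (centers, u), where u[i, k] is the membership degree of sample k
    in class i; every column of u sums to 1.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(data, dtype=float)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                                  # normalize memberships
    for _ in range(max_iter):
        um = u ** m                                     # fuzzified memberships
        centers = um @ x / um.sum(axis=1)               # weighted class centres
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        new_u = d ** (-2.0 / (m - 1.0))                 # standard FCM update
        new_u /= new_u.sum(axis=0)
        done = np.abs(new_u - u).max() < eps            # iteration-stop threshold
        u = new_u
        if done:
            break
    return centers, u
```

Hard region labels then follow from `u.argmax(axis=0)`, while the full membership matrix keeps the graded edge-pixel information the text emphasizes.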
103. And identifying each segmented current environment image area, and judging whether an obstacle exists in each current environment image area.
In the embodiment of the present application, "identifying each segmented current environment image region" may include: extracting image features of a current environment image area by using a convolutional neural network image processing technology to obtain an image feature map of the current environment image area, wherein the image feature map comprises barrier elements; and performing content identification on the obstacle elements in the image feature map to obtain an obstacle identification result.
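The patent names a convolutional neural network for this step without giving its architecture. A single hand-rolled 2-D convolution illustrates the feature-map extraction it relies on; a real system would apply many trained filters plus a classifier head:

```python
import numpy as np

def feature_map(region, kernel):
    """Valid-mode 2-D convolution (cross-correlation) of an image region
    with one filter kernel, yielding one channel of a feature map."""
    rh, rw = region.shape
    kh, kw = kernel.shape
    out = np.empty((rh - kh + 1, rw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(region[i:i + kh, j:j + kw] * kernel)
    return out
```

With an edge-detecting kernel, strong responses in the feature map mark object boundaries, which is the kind of obstacle element the recognition step then classifies.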
In the embodiment of the present application, the obstacle refers to a person or an object appearing in the user VR game area. For example, when a user is waving a handle to simulate a war "monster" in a VR game, a person or pet suddenly intruding into the game area may be injured by the user in the game; obstacles such as tables, cabinets, etc. within the gaming area may also trip the user while the user is walking in the virtual environment of the VR game.
104. And determining the current environment image area with the obstacle as the target area.
105. Displaying a virtual alert image within the target area.
In the embodiment of the application, after a target area containing an obstacle has been confirmed, the user should be reminded in some way to avoid the obstacle or stop the game. An effective reminder is to display a virtual warning image, such as an alarm icon, at the position in the game picture corresponding to the target area, through the VR head-mounted display.
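One simple realization of the virtual warning image (an illustrative overlay, not the patent's rendering pipeline) is to composite a colored border around the target area in the frame shown to the user:

```python
import numpy as np

def draw_alert_border(frame, top, left, bottom, right, color=(255, 0, 0)):
    """Return a copy of an RGB frame with a colored rectangle outlining the
    target area given by inclusive pixel bounds."""
    out = frame.copy()
    out[top, left:right + 1] = color          # top edge
    out[bottom, left:right + 1] = color       # bottom edge
    out[top:bottom + 1, left] = color         # left edge
    out[top:bottom + 1, right] = color        # right edge
    return out
```

A headset would instead render the icon into the virtual scene at the corresponding position, but the compositing idea is the same.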
With the VR game system and method based on the clustering algorithm and the terminal device provided here, when a user wears a VR head-mounted display to play a VR game, a camera on the display facing the external real environment captures an image of the current environment in real time; the image is segmented into regions with the FCM clustering algorithm, and any region containing an obstacle is identified so that the user can be prompted appropriately, avoiding unnecessary injury from collisions between the user and obstacles during the game and safeguarding the user throughout the VR session.
As an optional implementation, after determining the current environment image area where the obstacle exists as the target area, the method further includes: the current animation and audio play in the VR game is paused.
It will be appreciated that once a target area containing an obstacle has been identified, the user should be prompted appropriately to avoid unnecessary injury from collisions with the obstacle during the game. The prompt can take various forms, including pausing the current animation and audio playback in the VR game, so that the in-game user realizes that the current real environment poses a safety hazard and can stop the game or move to another position before continuing.
As an optional implementation manner, performing region segmentation on the filtered current environment image by using an FCM clustering algorithm, including:
receiving clustering algorithm parameters, wherein the clustering algorithm parameters comprise a fuzzy index m, an iteration stop threshold epsilon and a maximum iteration number L;
receiving a set of cluster class numbers c, where c ═ { c ═ ch,ch=1,2,…,20},chThe number of clustering categories is obtained;
clustering algorithm parameters and clustering class number c in clustering class number set c through FCM clustering algorithmhPerforming region segmentation on the current environment image after filtering processing to obtain an image segmentation result set P;
wherein P ═ { Ct,t=1,2,…,20},Ct={Cp,p=1,2,…,ch};
Wherein, CtRepresenting the number c of clustering algorithm parameters and clustering categories based on the FCM clustering algorithmhCarrying out region segmentation on the current environment image after filtering processing to obtain a class setCombining; wherein, CpRepresenting the p-th class in the current ambient image, i.e. for class C in the current ambient imagepA set of pixels having a maximum degree of membership;
calculating each class set C_t in the image segmentation result set P by using a clustering validity index function J(C_t) to obtain a clustering validity index value set J', where J' = {J_z | z = 1, 2, …, 20};
selecting the minimum value J_min from the clustering validity index value set J', and determining the clustering category number corresponding to the minimum value J_min as the target clustering category number c_x;
determining the category set C_x, obtained by performing region segmentation on the filtered current environment image via the FCM clustering algorithm based on the clustering algorithm parameters and the target clustering category number c_x, as the final target region segmentation result.
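The selection loop in the steps above can be sketched as follows. This is an illustrative sketch, not part of the patent text: the FCM implementation clusters pixel gray values only, `pick_cluster_count` stands in for the loop over candidate category numbers (narrowed from the patent's 1..20 to keep the example small), and the within-cluster squared error used as the score is an assumed stand-in for the validity index J(C_t), whose actual formula appears only as an image in the original.

```python
import numpy as np

def fcm(gray, c, m=2.0, eps=1e-4, max_iter=100):
    """Fuzzy C-means on pixel gray values; returns (centers, membership matrix U)."""
    x = np.asarray(gray, dtype=float).ravel()
    centers = np.linspace(x.min(), x.max(), c)    # spread initial centers over the gray range
    for _ in range(max_iter):
        dist = np.abs(x[None, :] - centers[:, None]) + 1e-9
        U = dist ** (-2.0 / (m - 1.0))            # standard FCM membership update
        U /= U.sum(axis=0)                        # memberships sum to 1 per pixel
        Um = U ** m
        new_centers = (Um @ x) / Um.sum(axis=1)
        if np.abs(new_centers - centers).max() < eps:  # iteration stop threshold
            centers = new_centers
            break
        centers = new_centers
    return centers, U

def pick_cluster_count(gray, counts=range(2, 6)):
    """Segment for each candidate count and keep the count with the lowest score."""
    best = None
    for c in counts:
        centers, U = fcm(gray, c)
        labels = U.argmax(axis=0)                 # max-membership assignment per pixel
        # stand-in validity score: mean squared distance to the assigned center
        # (the patent scores each segmentation with its index J(C_t) instead)
        score = float(np.mean((np.asarray(gray, dtype=float).ravel()
                               - centers[labels]) ** 2))
        if best is None or score < best[0]:
            best = (score, c, labels)
    return best[1], best[2]
```

The fuzzy index m, stop threshold eps, and max_iter correspond to the parameters m, ε, and L received in the first step.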
In the embodiment of the present application, let C_p' represent the p'-th category in the current environment image. When an edge pixel of category C_p' is adjacent to an edge pixel of category C_p, category C_p' is classified as a neighborhood category of category C_p. Let Y(C_p) denote the set of neighborhood categories of C_p in the current environment image, with Y(C_p) = {C_{p,q} | q = 1, …, y(C_p)}, where C_{p,q} represents the q-th neighborhood category of category C_p and y(C_p) represents the number of neighborhood categories of C_p present in the category set.
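The neighborhood-category rule above can be sketched as follows. This is illustrative, not patent text: the patent only says "adjacent", so 4-adjacency on the max-membership label map is an assumption.

```python
import numpy as np

def neighborhood_classes(labels, p):
    """Return Y(C_p): indices of the classes whose pixels touch class p."""
    h, w = labels.shape
    neigh = set()
    ys, xs = np.nonzero(labels == p)
    for y, x in zip(ys, xs):
        # check the 4-adjacent neighbors of each class-p pixel
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and labels[ny, nx] != p:
                neigh.add(int(labels[ny, nx]))
    return neigh
```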
It can be understood that the FCM clustering algorithm produces a different region segmentation result for each clustering category number. During image segmentation, the edge pixels of each image region are the most likely to be misclassified: an edge pixel of a neighborhood class of a given class may have a high degree of membership to that class, and likewise an edge pixel of the class may have a high degree of membership to the neighborhood class. The present application uses the clustering validity index function J(C_t) to determine the optimal number of clusters for the FCM clustering algorithm, i.e. the target clustering category number c_x: when the clustering validity index value J_z reaches its minimum, the corresponding category number is the optimal cluster number. The category set obtained by segmenting the filtered current environment image with the FCM clustering algorithm based on this optimal cluster number is the optimal region segmentation result, i.e. the final target region segmentation result. Because the clustering validity index detects the edge pixels that are prone to misclassification between classes, it measures the separation and overlap between classes during image segmentation more accurately.
As an alternative embodiment, calculating each class set C_t in the image segmentation result set P by using the clustering validity index function J(C_t) includes:
calculating, for each class set C_t, a first compactness measure coefficient ρ(C_p) and a second compactness measure coefficient ρ(C'_p(C_{p,q})), where the first compactness measure coefficient ρ(C_p) represents the compactness of the pixels in class C_p and the second compactness measure coefficient ρ(C'_p(C_{p,q})) represents the compactness of the pixels in the set C'_p(C_{p,q}); here C'_p(C_{p,q}) is the union of the pixels of class C_p and the pixel set C_{p,2}(C_{p,q}), where C_{p,2}(C_{p,q}) is the set of pixels in class C_{p,q} whose second-largest degree of membership, to class C_p, is greater than a preset membership threshold μ0, and C_{p,q} is the q-th neighborhood category of class C_p;
calculating, for each class set C_t, an integrated inter-class separation measure coefficient D(C_p);
calculating the clustering validity index function J(C_t) by the following formula:
[The formula appears only as an image in the original publication and is not reproduced here.]
It is understood that the preset membership threshold μ0 may be set at the factory by the manufacturer; for example, μ0 may be set to 0.3. The clustering validity index function comprises the first compactness measure coefficient ρ(C_p) and the integrated inter-class separation measure coefficient D(C_p); that is, the quality of the clustering effect of the FCM clustering algorithm is determined jointly by ρ(C_p) and D(C_p).
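The edge-pixel selection behind the threshold μ0 can be sketched as follows. This is illustrative, not patent text: U is assumed to be a class-by-pixel membership matrix (one row per class, one column per pixel), and membership ties are ignored.

```python
import numpy as np

def second_membership_set(U, p, q, mu0=0.3):
    """Return C_{p,2}(C_{p,q}): indices of pixels assigned to class q whose
    second-largest membership belongs to class p and exceeds mu0."""
    first = U.argmax(axis=0)            # class with the largest membership
    order = np.argsort(U, axis=0)       # per-pixel classes in ascending membership
    second = order[-2, :]               # class with the second-largest membership
    mask = (first == q) & (second == p) & (U[p, :] > mu0)
    return np.nonzero(mask)[0]
```

These are the pixels most likely to be misclassified on the boundary between classes p and q, which is exactly what the validity index is built to detect.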
As an alternative embodiment, calculating the first compactness measure coefficient ρ(C_p) and the second compactness measure coefficient ρ(C'_p(C_{p,q})) for each class set C_t includes:
calculating the first compactness measure coefficient ρ(C_p) by the following formula:
[The formula appears only as an image in the original publication and is not reproduced here.]
calculating the second compactness measure coefficient ρ(C'_p(C_{p,q})) by the following formula:
[The formula appears only as an image in the original publication and is not reproduced here.]
wherein I'(a_r, b_r) represents the pixel at coordinates (a_r, b_r) in class C_p, and f'(a_r, b_r) represents its gray value; V_p represents the cluster-center pixel of class C_p, and f'(V_p) represents its gray value; N(C_p) represents the number of pixels in class C_p; I'(a_z, b_z) represents the pixel at coordinates (a_z, b_z) in class C_p, and f'(a_z, b_z) represents its gray value; I'(A_R, B_R) represents the pixel at coordinates (A_R, B_R) in the set C'_p(C_{p,q}), and f'(A_R, B_R) represents its gray value; I'(A_Z, B_Z) represents the pixel at coordinates (A_Z, B_Z) in the set C'_p(C_{p,q}), and f'(A_Z, B_Z) represents its gray value; and N(C'_p(C_{p,q})) represents the number of pixels in the set C'_p(C_{p,q}).
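The compactness formulas above survive only as images, so the following is a hedged stand-in, not the patent's formula: a mean squared gray-value deviation from the cluster-center gray value, which is at least consistent with the terms the text lists (pixel gray values f', the center gray value f'(V_p), and the pixel count N(C_p)).

```python
import numpy as np

def compactness(gray_values, center_gray):
    """Stand-in rho: mean squared gray-value deviation from the cluster center."""
    g = np.asarray(gray_values, dtype=float)
    return float(np.mean((g - center_gray) ** 2))

def augmented_compactness(class_gray, edge_gray, center_gray):
    """Stand-in rho(C'_p(C_p,q)): compactness of the class pixels together with
    the candidate edge pixels borrowed from a neighborhood class."""
    merged = np.concatenate([np.asarray(class_gray, dtype=float),
                             np.asarray(edge_gray, dtype=float)])
    return compactness(merged, center_gray)
```

The intent matches the text: if adding the candidate edge pixels barely worsens the compactness, those pixels plausibly belong to the class.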
As an alternative embodiment, calculating the integrated inter-class separation measure coefficient D(C_p) for each class set C_t includes:
calculating the integrated inter-class separation measure coefficient D(C_p) by the following formula:
[The formula appears only as an image in the original publication and is not reproduced here.]
wherein D_1(C_p, C_{p,q}) represents the first inter-class separation measure coefficient between class C_p and neighborhood class C_{p,q}, D_2(C_p, C_{p,q}) represents the second inter-class separation measure coefficient between class C_p and neighborhood class C_{p,q}, and y(C_p) represents the number of neighborhood categories of class C_p present in the set C_t.
It can be understood that the integrated inter-class separation measure coefficient D(C_p) comprises the first inter-class separation measure coefficient D_1(C_p, C_{p,q}) and the second inter-class separation measure coefficient D_2(C_p, C_{p,q}); that is, the quality of the clustering effect of the FCM clustering algorithm is also determined jointly by D_1(C_p, C_{p,q}) and D_2(C_p, C_{p,q}).
As an alternative embodiment, calculating the first inter-class separation measure coefficient D_1(C_p, C_{p,q}) and the second inter-class separation measure coefficient D_2(C_p, C_{p,q}) for each class set C_t further includes:
calculating the first inter-class separation measure coefficient D_1(C_p, C_{p,q}) by the following formula:
[The formula appears only as an image in the original publication and is not reproduced here.]
calculating the second inter-class separation measure coefficient D_2(C_p, C_{p,q}) by the following formula:
[The formula appears only as an image in the original publication and is not reproduced here.]
wherein
[formula image in the original, not reproduced here] represents the mean gray value of the pixels in class C_p, and
[formula image in the original, not reproduced here] represents the mean gray value of the pixels in neighborhood class C_{p,q}; I'(α_g, β_g) represents the pixel at coordinates (α_g, β_g) in the set C_{p,q,2}(C_p), and f'(α_g, β_g) represents its gray value; I'(d_w, e_w) represents the pixel at coordinates (d_w, e_w) in the set C_{p,2}(C_{p,q}), and f'(d_w, e_w) represents its gray value; N(C_{p,q,2}(C_p)) represents the number of pixels in the set C_{p,q,2}(C_p); and N(C_{p,2}(C_{p,q})) represents the number of pixels in the set C_{p,2}(C_{p,q})).
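The separation formulas are likewise image-only, but the text does define the quantities they are built from: the mean gray values of class C_p and of its neighborhood class C_{p,q}. The helper below computes those means from a label map, plus a naive squared-difference separation as an explicitly assumed stand-in, not the patent's D_1 or D_2.

```python
import numpy as np

def class_mean_gray(gray, labels, p):
    """Mean gray value of the pixels assigned to class p."""
    return float(gray[labels == p].mean())

def mean_separation(gray, labels, p, q):
    """Stand-in separation: squared difference of the two class mean gray values."""
    return (class_mean_gray(gray, labels, p) - class_mean_gray(gray, labels, q)) ** 2
```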
It will be appreciated that, when the first inter-class separation measure coefficient D_1(C_p, C_{p,q}) of a category is calculated, the selected possible edge pixels of its neighborhood category are compared with selected pixels inside the category. The greater the similarity between them, the higher the probability that those possible edge pixels actually belong to the category being measured; in that case the sine term of the first inter-class separation measure coefficient takes a smaller value, so D_1(C_p, C_{p,q}) decreases. Conversely, the greater the difference between them, the lower that probability, and D_1(C_p, C_{p,q}) increases;
when the second inter-class separation measure coefficient D_2(C_p, C_{p,q}) of the category is calculated, the selected possible edge pixels of the category being measured are compared with the selected possible edge pixels of the neighborhood category. The smaller the difference between them, the greater the overlap between the two categories, and the smaller the value of D_2(C_p, C_{p,q}); the larger the difference, the smaller the overlap, which means the edge pixels of the region have been divided effectively, and the larger the value of D_2(C_p, C_{p,q}). Compared with the conventional way of measuring inter-class separation in a clustering validity index, this measure therefore explicitly accounts for edge pixels that are prone to misclassification.
In a second aspect, as shown in fig. 2, the present application discloses a VR game terminal device, which is configured to execute the VR game method based on the clustering algorithm described above, and which includes:
the system comprises a camera 201, a cluster calculation module 202, an identification module 203 and a processing module 204;
the camera 201 is configured to acquire a current environment image;
the cluster calculation module 202 is configured to perform region segmentation on the filtered current environment image by using an FCM clustering algorithm;
the identification module 203 is configured to identify each segmented current environment image area, and determine whether an obstacle exists in each current environment image area; determining a current environment image area with an obstacle as a target area;
and the processing module 204 is configured to display the virtual warning image in the target area.
It should be noted that the functions of each functional device of the VR game terminal device shown in fig. 2 may be implemented according to the method in the method embodiment shown in fig. 1; for the specific implementation process, refer to the related description of the method embodiment of fig. 1, which is not repeated here.
In a third aspect, the application also discloses another terminal device. The terminal device in the present embodiment, shown in fig. 3, may include: one or more processors 301; one or more input devices 302, one or more output devices 303, and a memory 304. The processor 301, the input device 302, the output device 303, and the memory 304 are connected by a bus 305. The memory 304 is used to store a computer program comprising program instructions, and the processor 301 is used to execute the program instructions stored by the memory 304. The processor 301 is configured to call the program instructions to perform the following operations:
acquiring a current environment image through a camera;
carrying out region segmentation on the filtered current environment image by using an FCM clustering algorithm;
identifying each segmented current environment image area, and judging whether an obstacle exists in each current environment image area;
determining a current environment image area with an obstacle as a target area;
displaying a virtual alert image within the target area.
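The five operations above can be tied together in a hedged end-to-end sketch. This is illustrative, not patent text: the segmentation function, obstacle detector, and warning renderer are injected stubs, since the patent does not specify their internals, and `process_frame` is an assumed name.

```python
import numpy as np

def process_frame(frame, segment, has_obstacle, show_warning):
    """Run one camera frame through segmentation, obstacle judgment, and warning."""
    labels = segment(frame)               # region segmentation (e.g. FCM-based)
    targets = []
    for region in np.unique(labels):
        mask = labels == region
        if has_obstacle(frame, mask):     # judge whether this region holds an obstacle
            targets.append(int(region))   # the region becomes a target area
            show_warning(mask)            # display a virtual warning image there
    return targets
```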
It should be understood that, in the embodiment of the present invention, the Processor 301 may be a Central Processing Unit (CPU), and the Processor may also be other general-purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, and the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The input device 302 may include a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of the fingerprint), a microphone, etc., and the output device 303 may include a display (LCD, etc.), a speaker, etc.
The memory 304 may include a read-only memory and a random access memory, and provides instructions and data to the processor 301. A portion of the memory 304 may also include non-volatile random access memory. For example, the memory 304 may also store device type information.
In a specific implementation, the processor 301, the input device 302, and the output device 303 described in this embodiment of the present invention may execute the implementations described in the first and second embodiments of the VR game method based on the clustering algorithm provided in the embodiments of the present invention, and may also execute the implementation of the terminal device described in the embodiments of the present invention, which is not repeated here.
In a fourth aspect, in another embodiment of the invention, there is provided a computer-readable storage medium storing a computer program comprising program instructions that, when executed by a processor, implement:
acquiring a current environment image through a camera;
carrying out region segmentation on the filtered current environment image by using an FCM clustering algorithm;
identifying each segmented current environment image area, and judging whether an obstacle exists in each current environment image area;
determining a current environment image area with an obstacle as a target area;
displaying a virtual alert image within the target area.
The computer readable storage medium may be an internal storage unit of the terminal device in any of the foregoing embodiments, for example, a hard disk or a memory of the terminal device. The computer-readable storage medium may also be an external storage device of the terminal device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided in the terminal device. Further, the computer-readable storage medium may include both an internal storage unit and an external storage device of the terminal device. The computer-readable storage medium stores the computer program and other programs and data required by the terminal device. The above-described computer-readable storage medium may also be used to temporarily store data that has been output or is to be output.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of both; to illustrate the interchangeability of hardware and software clearly, the components and steps of the examples have been described above in general functional terms. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the terminal device and the unit described above may refer to corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed terminal device and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the above-described division of units is only one type of division of logical functions, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may also be an electric, mechanical or other form of connection.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable storage medium if it is implemented in the form of a software functional unit and sold or used as a separate product. Based on such understanding, the technical solution of the present invention essentially contributes to the prior art, or all or part of the technical solution can be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method in the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the present disclosure, and the present disclosure should be construed as being covered by the claims and the specification.

Claims (10)

1. A VR game method based on a clustering algorithm is characterized by comprising the following steps:
acquiring a current environment image through a camera;
carrying out region segmentation on the current environment image after filtering processing by using an FCM clustering algorithm;
identifying each segmented current environment image area, and judging whether an obstacle exists in each current environment image area;
determining the current environment image area with obstacles as a target area;
displaying a virtual alert image within the target area.
2. The VR gaming method of claim 1,
after the determining the current environment image area where the obstacle exists as the target area, the method further includes:
pausing the current animation and audio playback in the VR game.
3. The VR gaming method of claim 1,
the region segmentation of the filtered current environment image by using the FCM clustering algorithm comprises the following steps:
receiving clustering algorithm parameters, wherein the clustering algorithm parameters comprise a fuzzy index m, an iteration stop threshold epsilon and a maximum iteration number L;
receiving a set of clustering category numbers c, where c = {c_h | c_h = 1, 2, …, 20}, said c_h being a number of clustering categories;
performing, via the FCM clustering algorithm based on the clustering algorithm parameters and each clustering category number c_h in the clustering category number set c, region segmentation on the filtered current environment image to obtain an image segmentation result set P;
wherein P = {C_t | t = 1, 2, …, 20} and C_t = {C_p | p = 1, 2, …, c_h};
wherein C_t represents the category set obtained by performing, via the FCM clustering algorithm based on the clustering algorithm parameters and the clustering category number c_h, region segmentation on the filtered current environment image; and C_p represents the p-th category in the current environment image, i.e. the set of pixels in the current environment image whose maximum degree of membership is to category C_p;
calculating each class set C_t in the image segmentation result set P by using a clustering validity index function J(C_t) to obtain a clustering validity index value set J', wherein J' = {J_z | z = 1, 2, …, 20};
selecting the minimum value J_min from the clustering validity index value set J', and determining the clustering category number corresponding to the minimum value J_min as the target clustering category number c_x;
determining the category set C_x, obtained by performing region segmentation on the filtered current environment image via the FCM clustering algorithm based on the clustering algorithm parameters and the target clustering category number c_x, as the final target region segmentation result.
4. The VR gaming method of claim 3,
the calculating of each class set C_t in the image segmentation result set P by using the clustering validity index function J(C_t) comprises:
calculating, for each of the class sets C_t, a first compactness measure coefficient ρ(C_p) and a second compactness measure coefficient ρ(C'_p(C_{p,q})), wherein the first compactness measure coefficient ρ(C_p) represents the compactness of the pixels in class C_p and the second compactness measure coefficient ρ(C'_p(C_{p,q})) represents the compactness of the pixels in the set C'_p(C_{p,q}); wherein C'_p(C_{p,q}) represents the union of the pixels of class C_p and the pixel set C_{p,2}(C_{p,q}), wherein the set C_{p,2}(C_{p,q}) represents the set of pixels in class C_{p,q} whose second-largest degree of membership, to class C_p, is greater than a preset membership threshold μ0, and wherein C_{p,q} represents the q-th neighborhood category of class C_p;
calculating, for each of the class sets C_t, an integrated inter-class separation measure coefficient D(C_p);
calculating the clustering validity index function J(C_t) by the following formula:
[The formula appears only as an image in the original publication and is not reproduced here.]
5. The VR gaming method of claim 4,
the calculating of the first compactness measure coefficient ρ(C_p) and the second compactness measure coefficient ρ(C'_p(C_{p,q})) for each of the class sets C_t comprises:
calculating the first compactness measure coefficient ρ(C_p) by the following formula:
[The formula appears only as an image in the original publication and is not reproduced here.]
calculating the second compactness measure coefficient ρ(C'_p(C_{p,q})) by the following formula:
[The formula appears only as an image in the original publication and is not reproduced here.]
wherein I'(a_r, b_r) represents the pixel at coordinates (a_r, b_r) in class C_p, and f'(a_r, b_r) represents its gray value; V_p represents the cluster-center pixel of class C_p, and f'(V_p) represents its gray value; N(C_p) represents the number of pixels in class C_p; I'(a_z, b_z) represents the pixel at coordinates (a_z, b_z) in class C_p, and f'(a_z, b_z) represents its gray value; I'(A_R, B_R) represents the pixel at coordinates (A_R, B_R) in the set C'_p(C_{p,q}), and f'(A_R, B_R) represents its gray value; I'(A_Z, B_Z) represents the pixel at coordinates (A_Z, B_Z) in the set C'_p(C_{p,q}), and f'(A_Z, B_Z) represents its gray value; and N(C'_p(C_{p,q})) represents the number of pixels in the set C'_p(C_{p,q}).
6. The VR gaming method of claim 5,
the calculating of the integrated inter-class separation measure coefficient D(C_p) for each of the class sets C_t comprises:
calculating the integrated inter-class separation measure coefficient D(C_p) by the following formula:
[The formula appears only as an image in the original publication and is not reproduced here.]
wherein D_1(C_p, C_{p,q}) represents the first inter-class separation measure coefficient between class C_p and neighborhood class C_{p,q}, D_2(C_p, C_{p,q}) represents the second inter-class separation measure coefficient between class C_p and neighborhood class C_{p,q}, and y(C_p) represents the number of neighborhood categories of class C_p present in the set C_t.
7. The VR gaming method of claim 6,
the calculating, for each of the class sets C_t, of the first inter-class separation measure coefficient D_1(C_p, C_{p,q}) and the second inter-class separation measure coefficient D_2(C_p, C_{p,q}) further comprises:
calculating the first inter-class separation measure coefficient D_1(C_p, C_{p,q}) by the following formula:
[The formula appears only as an image in the original publication and is not reproduced here.]
calculating the second inter-class separation measure coefficient D_2(C_p, C_{p,q}) by the following formula:
[The formula appears only as an image in the original publication and is not reproduced here.]
wherein
[formula image in the original, not reproduced here] represents the mean gray value of the pixels in class C_p, and
[formula image in the original, not reproduced here] represents the mean gray value of the pixels in neighborhood class C_{p,q}; I'(α_g, β_g) represents the pixel at coordinates (α_g, β_g) in the set C_{p,q,2}(C_p), and f'(α_g, β_g) represents its gray value; I'(d_w, e_w) represents the pixel at coordinates (d_w, e_w) in the set C_{p,2}(C_{p,q}), and f'(d_w, e_w) represents its gray value; N(C_{p,q,2}(C_p)) represents the number of pixels in the set C_{p,q,2}(C_p); and N(C_{p,2}(C_{p,q})) represents the number of pixels in the set C_{p,2}(C_{p,q})).
8. A VR game system based on a clustering algorithm, comprising:
the device comprises a camera, a clustering calculation module, an identification module and a processing module;
the camera is used for acquiring a current environment image through the camera;
the cluster calculation module is used for carrying out region segmentation on the current environment image after filtering processing by utilizing an FCM (fuzzy C-means) clustering algorithm;
the identification module is used for identifying each segmented current environment image area and judging whether an obstacle exists in each current environment image area; determining the current environment image area with obstacles as a target area;
the processing module is used for displaying a virtual warning image in the target area;
the gaming system performs the method of any of claims 1 to 7.
9. A terminal device comprising a processor, an input device, an output device and a memory, the processor, the input device, the output device and the memory being interconnected, wherein the memory is configured to store a computer program comprising program instructions, the processor being configured to invoke the program instructions to perform the method of any of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer storage medium stores a computer program comprising program instructions that, when executed by a processor, cause the processor to carry out the method according to any one of claims 1 to 7.
CN202011158295.7A 2020-10-26 2020-10-26 VR game system and method based on clustering algorithm Active CN112348827B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011158295.7A CN112348827B (en) 2020-10-26 2020-10-26 VR game system and method based on clustering algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011158295.7A CN112348827B (en) 2020-10-26 2020-10-26 VR game system and method based on clustering algorithm

Publications (2)

Publication Number Publication Date
CN112348827A true CN112348827A (en) 2021-02-09
CN112348827B CN112348827B (en) 2021-07-13

Family

ID=74358536

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011158295.7A Active CN112348827B (en) 2020-10-26 2020-10-26 VR game system and method based on clustering algorithm

Country Status (1)

Country Link
CN (1) CN112348827B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101881615A (en) * 2010-05-28 2010-11-10 清华大学 Method for detecting visual barrier for driving safety
CN102774325A (en) * 2012-07-31 2012-11-14 西安交通大学 Rearview reversing auxiliary system and method for forming rearview obstacle images
CN103366367A (en) * 2013-06-19 2013-10-23 西安电子科技大学 Pixel number clustering-based fuzzy C-average value gray level image splitting method
CN103473786A (en) * 2013-10-13 2013-12-25 西安电子科技大学 Gray level image segmentation method based on multi-objective fuzzy clustering
CN103559716A (en) * 2013-11-12 2014-02-05 广州太普信息技术有限公司 Method for automatic segmentation of defective image
CN103870845A (en) * 2014-04-08 2014-06-18 重庆理工大学 Novel K value optimization method in point cloud clustering denoising process
US20150026808A1 (en) * 2010-01-19 2015-01-22 Damballa, Inc. Method and system for network-based detecting of malware from behavioral clustering
CN104574368A (en) * 2014-12-22 2015-04-29 河海大学 Self-adaptive kernel cluster image partitioning method
CN105430664A (en) * 2015-10-30 2016-03-23 上海华为技术有限公司 Method and device of predicting propagation path loss based on classification fitting
CN105652873A (en) * 2016-03-04 2016-06-08 中山大学 Mobile robot obstacle avoidance method based on Kinect
CN107220977A (en) * 2017-06-06 2017-09-29 合肥工业大学 The image partition method of Validity Index based on fuzzy clustering
CN108284793A (en) * 2018-01-10 2018-07-17 深圳市鑫汇达机械设计有限公司 A kind of vehicle sub-controlling unit
CN111507145A (en) * 2019-01-31 2020-08-07 上海欧菲智能车联科技有限公司 Method, system and device for detecting barrier at storage position of embedded vehicle-mounted all-round looking system


Also Published As

Publication number Publication date
CN112348827B (en) 2021-07-13

Similar Documents

Publication number Publication date Title
CN107423690B (en) Face recognition method and device
US10699103B2 (en) Living body detecting method and apparatus, device and storage medium
CN109948497B (en) Object detection method and device and electronic equipment
WO2018028546A1 (en) Key point positioning method, terminal, and computer storage medium
CN110826370B (en) Method and device for identifying identity of person in vehicle, vehicle and storage medium
CN110390229B (en) Face picture screening method and device, electronic equipment and storage medium
CN109299658B (en) Face detection method, face image rendering device and storage medium
KR20190098858A (en) Method and apparatus for pose-invariant face recognition based on deep learning
CN111696080B (en) Face fraud detection method, system and storage medium based on static texture
CN116051115A (en) Face-brushing payment prompting method, device and equipment
US9704024B2 (en) Object discriminating apparatus and method
CN106778731B (en) License plate locating method and terminal
CN108389053B (en) Payment method, payment device, electronic equipment and readable storage medium
CN111680546A (en) Attention detection method, attention detection device, electronic equipment and storage medium
JP2006065447A (en) Discriminator setting device, degree-of-attention measuring device, discriminator setting method, degree-of-attention measuring method, and program
CN105740752B (en) Sensitive picture filtering method and system
CN115471824A (en) Eye state detection method and device, electronic equipment and storage medium
CN114419378A (en) Image classification method and device, electronic equipment and medium
CN112348827B (en) VR game system and method based on clustering algorithm
US20200019776A1 (en) Currency verification and transaction validation system
CN112861743A (en) Palm vein image anti-counterfeiting method, device and equipment
CN111931617B (en) Human eye image recognition method and device based on image processing and self-service terminal
CN112364846A (en) Face living body identification method and device, terminal equipment and storage medium
WO2020217812A1 (en) Image processing device that recognizes state of subject and method for same
CN112800847A (en) Face acquisition source detection method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
Effective date of registration: 20210623
Address after: 518000 17th floor, block B, Sunshine Technology Innovation Center, No.2 Shanghua Road, Nanshan street, Nanshan District, Shenzhen City, Guangdong Province
Applicant after: SHENZHEN RAYVISION TECHNOLOGY Co.,Ltd.
Address before: 343000 No.106, Beimen Road, Fanrong street, Hechuan Town, Yongxin County, Ji'an City, Jiangxi Province
Applicant before: Luo Ziyao
GR01 Patent grant
CP01 Change in the name or title of a patent holder
Address after: 518000 17th floor, block B, Sunshine Technology Innovation Center, No.2 Shanghua Road, Nanshan street, Nanshan District, Shenzhen City, Guangdong Province
Patentee after: Shenzhen Ruiyun Technology Co.,Ltd.
Address before: 518000 17th floor, block B, Sunshine Technology Innovation Center, No.2 Shanghua Road, Nanshan street, Nanshan District, Shenzhen City, Guangdong Province
Patentee before: SHENZHEN RAYVISION TECHNOLOGY CO.,LTD.
PE01 Entry into force of the registration of the contract for pledge of patent right
Denomination of invention: A VR game system and method based on clustering algorithm
Effective date of registration: 20230619
Granted publication date: 20210713
Pledgee: Shenzhen hi tech investment small loan Co.,Ltd.
Pledgor: Shenzhen Ruiyun Technology Co.,Ltd.
Registration number: Y2023980044570