CN112800885A - Data processing system and method based on big data - Google Patents

Data processing system and method based on big data

Info

Publication number
CN112800885A
CN112800885A
Authority
CN
China
Prior art keywords
person
designated
risk
designated person
face
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110058251.5A
Other languages
Chinese (zh)
Other versions
CN112800885B (en)
Inventor
夏凤霞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Zhongxin Yunchuang Software Technology Co ltd
Original Assignee
Nanjing Zhongxin Yunchuang Software Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Zhongxin Yunchuang Software Technology Co ltd
Priority to CN202110058251.5A
Publication of CN112800885A
Application granted
Publication of CN112800885B
Active legal status (current)
Anticipated expiration legal status

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/166Detection; Localisation; Normalisation using acquisition arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0635Risk analysis of enterprise or organisation activities
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/23Recognition of whole body movements, e.g. for sport training
    • G06V40/25Recognition of walking or running movements, e.g. gait recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20224Image subtraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Tourism & Hospitality (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Educational Administration (AREA)
  • Marketing (AREA)
  • Entrepreneurship & Innovation (AREA)
  • General Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • Development Economics (AREA)
  • Artificial Intelligence (AREA)
  • Primary Health Care (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Social Psychology (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Geometry (AREA)
  • Psychiatry (AREA)
  • Game Theory and Decision Science (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention discloses a big-data-based data processing system and method. The system comprises a video analysis end, an identity verification and confirmation system, a database end and a public security system scheduling module. The video analysis end verifies whether a face image appears in a high-risk area and retrieves the face images in the related videos for comparison and verification, so that virus propagation into non-risk areas can be avoided. When only part of the designated person's face can be detected, the identity verification and confirmation module uses the video analysis end to further analyze the person's walking form, so that the identity state of the designated person can be determined as soon as possible. The database end verifies face images with a high similarity to the designated person and, together with the person's walking posture, identifies the designated person. When the designated person is detected to have contacted a suspected case or passed through a high-risk area, the public security system scheduling module assigns community isolation corresponding to the risk-value level of the designated person.

Description

Data processing system and method based on big data
Technical Field
The invention relates to the technical field of big data processing, in particular to a big data-based data processing system and method.
Background
Before entering a larger venue, each person must show a personal health code. The health code is declared through a website, a personal two-dimensional code is generated immediately after background review, and two-dimensional codes of different colors are issued according to each person's movement track. During the application, however, whether the applicant has contacted a confirmed case, or has been in close contact with someone who contacted a confirmed case, must be judged and filled in by the applicant alone; the true situation is not known and depends entirely on the applicant's subjective honesty, so the person's actual situation needs to be determined by stricter judgment rules;
Because of the novel coronavirus, people inevitably wear masks when going out. Before a person enters or leaves, the public security department needs to review video of the areas where that person has stayed and judge whether the person has stayed in a high-risk area, which requires checking the person's facial features in the retrieved video. During detection, however, the facial features cannot always be displayed completely: people may wear hats, sunglasses and the like that occlude part of the face, and partial faces of different people can appear similar. Besides detecting facial features, therefore, the walking posture of each person needs to be judged, and the states of standing still, walking and running need to be checked to determine the identity of the user, after which isolation is carried out in communities of different levels according to the person's risk value;
therefore, a data processing system and method based on big data are needed to solve the above problems.
Disclosure of Invention
The present invention is directed to a data processing system and method based on big data, so as to solve the problems mentioned in the background art.
In order to solve the above technical problems, the invention provides the following technical scheme: a big-data-based data processing system and method, the system comprising a video analysis end, an identity verification and confirmation system, a database end and a public security system scheduling module. The video analysis end verifies whether a corresponding face image appears in a high-risk area and retrieves the face images of the related videos for comparison, so that virus propagation into non-risk areas can be avoided. When only a partial face area of the designated person is detected, the identity verification and confirmation module uses the video analysis end to further analyze the person's walking form, so that the identity state of the designated person can be determined as soon as possible. The database end verifies face images with a high similarity to the designated person and, together with the walking posture, identifies the designated person. When the designated person is detected to have contacted a suspected case or passed through a high-risk area, the public security system scheduling module assigns community isolation corresponding to the risk-value level of the designated person.
Further, the video analysis end comprises an identity information checking unit, a position calling unit, a photo capturing unit and an image analysis unit. The identity information checking unit registers the designated person's name, current position and usual address and sends the information to the public security system. The position calling unit retrieves the person's traces in the cities visited within a preset number of days and checks whether the person has stayed in a high-risk area, so as to prevent the high-risk area from spreading the virus to the local area. When the designated person is detected to have stayed in a high-risk area, the photo capturing unit retrieves the monitoring video images and judges whether the person has been in close contact with the high-risk area. During video analysis, the image analysis unit judges whether a complete face image can be detected in the video; if not, the walking form of the designated person is analyzed to further determine the person's identity.
Further, the identity verification and confirmation system comprises a face similarity detection module and a human body posture determination module. When a complete face image is not detected, the face similarity detection module compares the partial facial features with the facial features of other people; when the detected similarity is high, the walking posture features of the body are analyzed further. The human body posture determination module collects the walking postures of the people whose faces show a high similarity, confirms whether the collected information matches the real person, and sends the collected result to the database end for storage. The face similarity detection module is connected with the human body posture determination module.
The human body posture determination module analyzes the identity of the designated person from the person's speed in the monitoring video while standing still, walking and running, the proportion of the contact area of the forefoot and heel with the ground during walking and running, and the friction angle of the forefoot and heel against the ground.
Further, the public security system scheduling module comprises a personnel contact determination unit, a risk level determination unit, a personnel arrangement scheduling management unit and a message transmission unit. The personnel contact determination unit detects from the video whether the people the designated person contacted in the high-risk area carry the virus, so that other areas can maintain corresponding safety. The risk level determination unit judges the risk level of the designated person according to whether the person's travel route involves direct or indirect contact with people from the high-risk area, so that the person can be isolated in a community of the matching risk level. The personnel arrangement scheduling management unit reasonably arranges the isolation communities according to the current numbers of people at each risk level, so as to safeguard both this area and other areas. The message transmission unit transmits the isolation information of the designated person. The output end of the personnel contact determination unit is connected with the input ends of the risk level determination unit, the personnel arrangement scheduling management unit and the message transmission unit.
A big data-based data processing method comprises the following steps:
step Z01: calling position information of a designated person within preset time through a video analysis end, capturing face picture information through the video end, verifying contact conditions of the designated person and other persons from the video end for analysis, and executing a step Z02 when face features are not completely captured or are not clear;
step Z02: when the face features of the appointed person are not detected completely, analyzing the walking state of the appointed person, and determining the identity of the current appointed person;
step Z03: analyzing the face feature image detected in the video and the corresponding walking state and judging whether the identity is correct; when the identity of the designated person is verified to be correct, the contact between the designated person and the high-risk area continues to be verified; otherwise, steps Z01 and Z02 are repeated;
step Z04: and judging the risk level of the designated personnel according to the verification result of the step Z03, and distributing corresponding communities according to the corresponding risk levels for isolation.
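For illustration only, the overall flow of steps Z01 to Z04 can be sketched as the following minimal Python control loop. The object and method names (video_end, call_location_history, analyze_gait and so on) are assumptions introduced here for readability; they are not part of the disclosed system:

    from enum import Enum

    class RiskLevel(Enum):
        LOW = "isolate in the local community"
        HIGH = "isolate at a dedicated isolation point"

    def process_designated_person(video_end, person_id, max_attempts=3):
        for _ in range(max_attempts):
            # Step Z01: retrieve the recent location trace and captured face frames.
            trace = video_end.call_location_history(person_id, days=14)
            face = video_end.capture_face(person_id)

            # Step Z02: if the face is occluded or unclear, fall back to gait analysis.
            if face.is_complete():
                identity_ok = face.matches_registered_identity(person_id)
            else:
                gait = video_end.analyze_gait(person_id)
                identity_ok = gait.matches_registered_identity(person_id)

            # Step Z03: only a confirmed identity proceeds to contact verification;
            # otherwise steps Z01 and Z02 are repeated.
            if identity_ok:
                contacted = video_end.check_high_risk_contact(trace)
                # Step Z04: grade the risk and assign the matching community.
                return RiskLevel.HIGH if contacted else RiskLevel.LOW
        return None  # identity could not be confirmed within the allowed attempts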
In step Z01, judgment is made according to the face pictures appearing in the video image and the facial features appearing in pictures taken at different angles, and the feature points of the eyes and the mouth are calculated, specifically comprising the following steps:
Z001: when only eye feature points appear in the picture, calculating the vertical distance L1 between the middle of the upper eyelid and the middle of the lower eyelid when the eyes are normally open, the straight-line distance L2 between a canthus of the left eye and the corresponding canthus of the right eye, and the included angle r between the upper and lower eyelids;
Z002: when only nose feature points appear in the picture, calculating the included angle d between the nose bridge and the philtrum region; when a person with high similarity to the designated person's features is detected in steps Z001 and Z002, verification continues with step Z003;
Z003: calculating the distance o between the perpendicular line q drawn between the centers of the two eyes and the apex of the nose bridge, the included angles a and b between the nose-bridge point and the two eyes, and the included angle between the nose-bridge point and the outer corners (tails) of the two eyes.
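As a rough illustration, the quantities named in steps Z001 to Z003 (L1, L2, r, d, o, a, b) reduce to planar distances and angles between detected landmarks. The sketch below assumes the landmark coordinates are already available as (x, y) tuples; the function and parameter names, and the exact choice of reference points for the angles, are interpretations made here for illustration and are not taken from the patent:

    import math

    def included_angle(vertex, p1, p2):
        # Angle at `vertex` formed by the rays towards p1 and p2, in degrees.
        v1 = (p1[0] - vertex[0], p1[1] - vertex[1])
        v2 = (p2[0] - vertex[0], p2[1] - vertex[1])
        cosang = (v1[0] * v2[0] + v1[1] * v2[1]) / (math.hypot(*v1) * math.hypot(*v2))
        return math.degrees(math.acos(max(-1.0, min(1.0, cosang))))

    def eye_metrics(upper_lid_mid, lower_lid_mid, left_canthus, right_canthus):
        # Z001: eyelid opening L1, canthus-to-canthus distance L2, and r approximated
        # as the angle at the left canthus between the two eyelid midpoints.
        l1 = math.dist(upper_lid_mid, lower_lid_mid)
        l2 = math.dist(left_canthus, right_canthus)
        r = included_angle(left_canthus, upper_lid_mid, lower_lid_mid)
        return l1, l2, r

    def nose_metrics(left_eye_center, right_eye_center, nose_bridge_apex):
        # Z003: distance o from the midpoint of the inter-ocular line to the nose-bridge
        # apex, plus angles a and b at the nose-bridge apex towards each eye center.
        mid = ((left_eye_center[0] + right_eye_center[0]) / 2,
               (left_eye_center[1] + right_eye_center[1]) / 2)
        o = math.dist(mid, nose_bridge_apex)
        a = included_angle(nose_bridge_apex, left_eye_center, mid)
        b = included_angle(nose_bridge_apex, right_eye_center, mid)
        return o, a, b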
Further, in step Z02, when the designated person walks on a level road, the proportion of the forefoot pressed onto the plane area s before the heel touches the ground is determined. The coordinate positions of the left and right forefoot vertices e and f are set to (x1, y1) and (x2, y2), the corresponding coordinate of the forefoot apex is g(x3, y3), and the straight-line distance between e and f is L = √((x2 − x1)² + (y2 − y1)²).
The area the forefoot forms on the ground is S3 = (L·y3) − S1 − S2, where S1 and S2 refer to the partial areas formed between e, f and the forefoot apex.
The two vertices formed after the heel rubs against the ground are c(x5, y5) and d(x6, y6). When the designated person walks forward, the coordinate of the forefoot apex during walking is g(x3, y3) and the coordinate of the heel vertex is h(x4, y4); the angle of the heel offset from the original walking direction on the plane is given in the specification as a formula image.
From the area S3 formed by the forefoot and the ground and the heel offset angle, the identity of the designated person is judged.
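The same computation can be written out explicitly. In the sketch below, S3 follows the plain-text formula S3 = (L·y3) − S1 − S2, and the heel offset angle is computed as the angle between the original walking direction (h → g) and the heel direction after friction (c → d); since the patent's own angle formula survives only as an image, that vector-angle definition is an assumption made here for illustration:

    import math

    def forefoot_ground_area(e, f, g, s1, s2):
        # e, f: left/right forefoot vertices (x1, y1), (x2, y2); g: forefoot apex (x3, y3).
        # s1, s2: the two corner areas subtracted from the bounding area L * y3.
        l = math.dist(e, f)              # straight-line distance between the forefoot vertices
        return l * g[1] - s1 - s2        # S3 = (L * y3) - S1 - S2

    def heel_offset_angle(g, h, c, d):
        # g, h: forefoot apex and heel vertex while walking; c, d: heel vertices after friction.
        walk = (g[0] - h[0], g[1] - h[1])    # original walking direction
        heel = (d[0] - c[0], d[1] - c[1])    # heel direction after ground friction
        cosang = ((walk[0] * heel[0] + walk[1] * heel[1]) /
                  (math.hypot(*walk) * math.hypot(*heel)))
        return math.degrees(math.acos(max(-1.0, min(1.0, cosang))))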
In step Z04, once the facial features and the changes in the walking features of the designated person have confirmed the person's identity, the public security organ retrieves the designated person's travel track from point B to point E. The current positions of the high-risk areas are set to A and C, and the vehicle taken by the designated person passes through high-risk area C, where the set of passengers boarding in area C is J = {j1, j2, j3, ..., jm-1, jm} and the nearest distance to the designated person is M. Taking ji as the center of a circle with movable radius r, whether the designated person is affected is judged: when M − r > k, the designated person has not contacted that passenger and only needs to be isolated in the local community; when M − r < k, the designated person has contacted that passenger, the risk degree is high, and an isolation point needs to be separately arranged for isolation, where k is the set safety distance.
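The threshold rule of step Z04 is simple enough to state directly in code. The sketch below classifies the designated person against every high-risk passenger using the M − r comparison with the safety distance k; the function name and input format are assumptions for illustration:

    def classify_risk(nearest_distances, movable_radius, safety_distance):
        # nearest_distances: distance M to each high-risk passenger j_i;
        # movable_radius: r; safety_distance: k.
        for m in nearest_distances:
            if m - movable_radius < safety_distance:
                # Contact within the safety distance: high risk,
                # a separate isolation point is arranged.
                return "high"
        # No contact closer than k: isolation in the local community suffices.
        return "low"

    # Example: k = 1.5 m, r = 0.8 m, nearest distances 3.0 m and 2.5 m -> "low"
    print(classify_risk([3.0, 2.5], 0.8, 1.5))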
In step Z02, the stride speed v of the designated person while running is analyzed and combined with the area S3 formed by the forefoot and the ground and the angle by which the heel deviates from the ground (given in the specification as a formula image); these factors together are used to judge the running state of the designated person and to detect whether there are people with a high similarity to the designated person.
Compared with the prior art, the invention has the following beneficial effects:
1. With the video analysis end, the face image is detected before a person enters the community: the public security organ retrieves images of the areas where the person previously stayed, detects whether the face image contains only partial facial features, judges whether images with a high similarity to the person's face can be found, rapidly identifies whether it is the same person, judges whether the person has stayed in a high-risk area, and constantly monitors whether the person has been in close contact with the high-risk area;
2. With the identity verification and confirmation system, when the detected similarity to the person's face is high, the person's identity is further analyzed from the walking speed while standing still, walking and running, the proportion of the contact area of the forefoot and heel with the ground, and the friction angle between the heel and the ground, and whether the person has been in close contact with people from the high-risk area is analyzed further, so that the safety of local residents is ensured and the risk of virus propagation is avoided;
3. With the public security system scheduling module, whether a person has directly or indirectly contacted a high-risk area is detected in the video, and isolation is carried out in communities of different risk levels according to the person's risk level, so that everyone's safety is guaranteed.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is a block diagram of a big data based data processing system and method according to the present invention;
FIG. 2 is a schematic diagram of a big data based data processing system and method according to the present invention;
FIG. 3 is a schematic diagram of walking of a sole of a designated person in a big data-based data processing system and method according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1-3, the present invention provides the following technical solutions:
A big-data-based data processing system and method, the system comprising a video analysis end, an identity verification and confirmation system, a database end and a public security system scheduling module. The video analysis end verifies whether a corresponding face image appears in a high-risk area and retrieves the face images of the related videos for comparison, so that virus propagation into non-risk areas can be avoided. When only a partial face area of the designated person is detected, the identity verification and confirmation module uses the video analysis end to further analyze the person's walking form, so that the identity state of the designated person can be determined as soon as possible. The database end verifies face images with a high similarity to the designated person and, together with the walking posture, identifies the designated person. When the designated person is detected to have contacted a suspected case or passed through a high-risk area, the public security system scheduling module assigns community isolation corresponding to the risk-value level of the designated person.
Further, the video analysis end comprises an identity information checking unit, a position calling unit, a photo capturing unit and an image analysis unit. The identity information checking unit registers the designated person's name, current position and usual address and sends the information to the public security system. The position calling unit retrieves the person's traces in the cities visited within a preset number of days and checks whether the person has stayed in a high-risk area, so as to prevent the high-risk area from spreading the virus to the local area. When the designated person is detected to have stayed in a high-risk area, the photo capturing unit retrieves the monitoring video images and judges whether the person has been in close contact with the high-risk area. During video analysis, the image analysis unit judges whether a complete face image can be detected in the video; if not, the walking form of the designated person is analyzed to further determine the person's identity;
When a designated person enters a community area, it must be determined whether the person has been in close contact with people from a high-risk area, so the contact between the designated person and other people must be checked through the face images in the video; but the face is often occluded, so the facial feature change values need to be checked carefully.
Further, the identity verification and confirmation system comprises a face similarity detection module and a human body posture determination module. When a complete face image is not detected, the face similarity detection module compares the partial facial features with the facial features of other people; when the detected similarity is high, the walking posture features of the body are analyzed further. The human body posture determination module collects the walking postures of the people whose faces show a high similarity, confirms whether the collected information matches the real person, and sends the collected result to the database end for storage. The face similarity detection module is connected with the human body posture determination module.
The human body posture determination module analyzes the identity of the designated person from the person's speed in the monitoring video while standing still, walking and running, the proportion of the contact area of the forefoot and heel with the ground during walking and running, and the friction angle of the forefoot and heel against the ground.
Further, the public security system scheduling module comprises a personnel contact determination unit, a risk level determination unit, a personnel arrangement scheduling management unit and a message transmission unit. The personnel contact determination unit detects from the video whether the people the designated person contacted in the high-risk area carry the virus, so that other areas can maintain corresponding safety. The risk level determination unit judges the risk level of the designated person according to whether the person's travel route involves direct or indirect contact with people from the high-risk area, so that the person can be isolated in a community of the matching risk level. The personnel arrangement scheduling management unit reasonably arranges the isolation communities according to the current numbers of people at each risk level, so as to safeguard both this area and other areas. The message transmission unit transmits the isolation information of the designated person. The output end of the personnel contact determination unit is connected with the input ends of the risk level determination unit, the personnel arrangement scheduling management unit and the message transmission unit.
A big data-based data processing method comprises the following steps:
step Z01: calling position information of a designated person within preset time through a video analysis end, capturing face picture information through the video end, verifying contact conditions of the designated person and other persons from the video end for analysis, and executing a step Z02 when face features are not completely captured or are not clear;
step Z02: when the face features of the appointed person are not detected completely, analyzing the walking state of the appointed person, and determining the identity of the current appointed person;
step Z03: analyzing the face feature image detected in the video and the corresponding walking state and judging whether the identity is correct; when the identity of the designated person is verified to be correct, the contact between the designated person and the high-risk area continues to be verified; otherwise, steps Z01 and Z02 are repeated;
step Z04: and judging the risk level of the designated personnel according to the verification result of the step Z03, and distributing corresponding communities according to the corresponding risk levels for isolation.
In step Z01, judgment is made according to the face pictures appearing in the video image and the facial features appearing in pictures taken at different angles, and the feature points of the eyes and the mouth are calculated, specifically comprising the following steps:
Z001: when only eye feature points appear in the picture, calculating the vertical distance L1 between the middle of the upper eyelid and the middle of the lower eyelid when the eyes are normally open, the straight-line distance L2 between a canthus of the left eye and the corresponding canthus of the right eye, and the included angle r between the upper and lower eyelids;
Z002: when only nose feature points appear in the picture, calculating the included angle d between the nose bridge and the philtrum region; when a person with high similarity to the designated person's features is detected in steps Z001 and Z002, verification continues with step Z003;
Z003: calculating the distance o between the perpendicular line q drawn between the centers of the two eyes and the apex of the nose bridge, the included angles a and b between the nose-bridge point and the two eyes, and the included angle between the nose-bridge point and the outer corners (tails) of the two eyes;
Meanwhile, the feature similarity rate between that person and the designated person is judged, and the similarity values of the eyes and mouth are checked at the same time; when the changes are found to be the same as the designated person's feature changes, the person's walking behavior is judged further, and when the changes of the feature values are found to differ, the person's identity is judged to be different from that of the designated person.
Further, in step Z02, when the designated person walks on a level road, the proportion of the forefoot pressed onto the plane area s before the heel touches the ground is determined. The coordinate positions of the left and right forefoot vertices e and f are set to (x1, y1) and (x2, y2), and the distance between the two forefoot vertices is L = √((x2 − x1)² + (y2 − y1)²).
The area the forefoot forms on the ground is S3 = (L·y3) − S1 − S2, where S1 and S2 are the partial areas formed between the forefoot apex and e, f. The two vertices formed by the friction of the heel against the ground are c(x3, y3) and d(x5, y5).
When the designated person walks forward, the coordinate of the forefoot apex during walking is g(x3, y3) and the coordinate of the heel vertex is h(x4, y4); the angle of the heel offset from the original walking direction on the plane is given in the specification as a formula image. From the area S3 formed by the forefoot and the ground and the heel offset angle, the identity of the designated person is judged.
Because the area of the forefoot is irregular, the final forefoot area is obtained by taking the maximum area bounded by the vertical and horizontal lines drawn through the left and right forefoot points and subtracting the two triangular areas of unequal size formed on the left and right sides; S1 and S2 are therefore the redundant areas formed between the walking foot and the ground in the video, as shown specifically in fig. 3. Only the two redundant areas S1 and S2 formed with the ground are described here, but more such redundant areas than S1 and S2 can be formed with the ground when a person walks; they are not detailed further, as the corresponding calculation can readily be conceived in the art.
Because the angle the heel forms on the ground differs from person to person, and because the heel rubs against the ground and forms an offset angle during normal walking or running, whether the identity of the designated person is the same is judged from the angle the heel produces on the ground. When the currently detected person slips, or walks or runs differently from usual because of an emergency, the area formed by the heel with the ground and the area formed by the forefoot with the ground differ greatly, so such situations need to be excluded. In fig. 3, when the heel rubs against the ground with the foot placed normally, the forefoot apex and the heel vertex point in the same direction; when the heel is deflected, a corresponding change value is produced, and the change angle can be calculated from the trigonometric function, from which the real identity of the designated person is judged.
In step Z04, once the facial features and the changes in the walking features of the designated person have confirmed the person's identity, the public security organ retrieves the designated person's travel track from point B to point E. The current positions of the high-risk areas are set to A and C, and the vehicle taken by the designated person passes through high-risk area C, where the set of passengers boarding in area C is J = {j1, j2, j3, ..., jm-1, jm} and the nearest distance to the designated person is M. Taking ji as the center of a circle with movable radius r, whether the designated person is affected is judged: when M − r > k, the designated person has not contacted that passenger and only needs to be isolated in the local community; when M − r < k, the designated person has contacted that passenger, the risk degree is high, and an isolation point needs to be separately arranged for isolation, where k is the set safety distance;
After passengers board the vehicle, the positions where they sit and move within the carriage need to be checked, so the distance between the designated person and the nearest person from the high-risk area must be examined. The set radius r refers to the moving range of that person or, when the person does not move within the carriage, to the radius over which the virus may spread; M − r is therefore used to judge whether the designated person falls within the set radius, that is, whether the designated person has been contacted directly or indirectly by the person from the high-risk area. k refers to the safety distance to be kept between the designated person and people from the high-risk area, so the safety of the designated person can be judged and, at the same time, a community of the corresponding risk level can be assigned promptly for isolation, ensuring the safety of other people.
In step Z02, the stride speed v of the designated person while running is analyzed and combined with the area S3 formed by the forefoot and the ground and the angle by which the heel deviates from the ground (given in the specification as a formula image); these factors together are used to judge the running state of the designated person and to detect whether there are people with a high similarity to the designated person.
Example 1: the coordinate positions of e and f at the left and right forefoot vertices of a designated person are set to (x1, y1) = (10, 40) and (x2, y2) = (25, 45); when the person moves forward, the coordinate of the forefoot apex during normal walking is g(x3, y3) = (18, 53) and the coordinate of the heel vertex is h(x4, y4) = (16, 20), from which the angle of the heel offset from the original walking direction on the plane is obtained (formula given as an image in the original). The two vertices formed after the heel rubs against the ground are c(x3, y3) = (18, 12) and d(x5, y5) = (33, 16). From the area S3 formed by the forefoot and the ground and the heel offset angle, the identity of the designated person is judged; after verification, the area formed on the ground by the forefoot of the person whose face has the higher similarity to the designated person is 150, and the offset angle of that person's heel on the ground is 43.6;
Thus the calculation yields: the distance between the two forefoot vertices is L = √((25 − 10)² + (45 − 40)²) ≈ 15.81, and the areas formed on the left and right sides are S1 = 477 and S2 = 186, wherein the area the forefoot forms on the ground is S3 = (L·y3) − S1 − S2 = (15.81 × 53) − 477 − 186 ≈ 175;
After verification of the heel offset angle (formula given as an image in the original), the detected characteristic values of the person are found to match those of the designated person, and the person can be considered to be the same person.
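The arithmetic of example 1 can be re-checked directly. In the short sketch below, the corner areas S1 = 477 and S2 = 186 are taken from the example as given (their own derivation appears only as formula images in the original):

    import math

    e, f, g = (10, 40), (25, 45), (18, 53)   # forefoot vertices e, f and apex g from example 1
    s1, s2 = 477, 186                        # corner areas as stated in the example

    l = math.dist(e, f)                      # sqrt(15**2 + 5**2) = 15.81...
    s3 = l * g[1] - s1 - s2                  # (15.81 * 53) - 477 - 186
    print(round(l, 2), round(s3))            # 15.81 175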
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Finally, it should be noted that: although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that changes may be made in the embodiments and/or equivalents thereof without departing from the spirit and scope of the invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A big-data based data processing system, characterized by: the system comprises a video analysis end, an identity verification and identification system, a database end and a public security system scheduling module, wherein the video analysis end is used for verifying whether a corresponding face image appears in a high-risk area or not and calling the face image of a related video for comparison, the identity verification and confirmation module is used for further analyzing the walking form of a designated person by using the video analysis end when only a partial face area of the designated person is detected, the database end is used for checking the face image with higher similarity with the designated person and commonly identifying the identity of the designated person according to the walking posture of the designated person, and the public security system scheduling module is used for allocating corresponding community isolation according to the risk value grade of the designated person when the designated person is detected to contact suspected person or pass through the high-risk area.
2. A big-data based data processing system according to claim 1, wherein: the video analysis end comprises an identity information checking unit, a position calling unit, a photo capturing unit and an image analysis unit, wherein the identity information checking unit is used for registering the designated person's name, current position and usual address and sending the information to the public security system, the position calling unit is used for retrieving the traces of the cities where the designated person has stayed within a preset number of days and checking whether the designated person has stayed in a high-risk area, the photo capturing unit is used for retrieving the monitoring video images when the designated person is detected to have stayed in a high-risk area and judging whether the designated person has been in close contact with the high-risk area, and the image analysis unit is used for judging, during video analysis, whether a complete face image can be detected in the video and, if not, analyzing the walking form of the designated person.
3. A big-data based data processing system according to claim 1, wherein: the identity verification confirming system comprises a face similarity detecting module and a human body posture determining module, wherein the face similarity detecting module is used for judging and comparing with the face characteristics of other people according to partial face characteristics when a complete face image is not detected, further analyzing the walking posture characteristics of a human body when the detected similarity is high, the human body posture determining module is used for collecting the walking posture with high similarity to the face characteristics, confirming whether the collected information is the same as a real person or not, sending the collected result to a database end for storage, and the face similarity detecting module is connected with the human body posture determining module.
4. A big-data based data processing system according to claim 3, wherein: the human body posture determining module is used for analyzing the identity condition of a designated person for the walking speed of the person in the video monitoring during static standing, walking and running, the contact area proportion of the sole and the heel to the ground during walking and running and the friction angle of the heel to the ground.
5. A big-data based data processing system according to claim 1, wherein: the public security system scheduling module comprises a personnel contact determination unit, a risk level determination unit, a personnel arrangement scheduling management unit and a message transmission unit, wherein the personnel contact determination unit is used for detecting from the video whether the people the designated person contacted in the high-risk area carry the virus, the risk level determination unit is used for judging the risk level of the designated person according to whether the designated person's travel route involves direct or indirect contact with people from the high-risk area and isolating the designated person in a community of the corresponding risk level, the personnel arrangement scheduling management unit is used for reasonably arranging the isolation communities according to the current numbers of people at each risk level, the message transmission unit is used for transmitting the isolation information of the designated person and reminding community personnel to pay attention to their own safety, and the output end of the personnel contact determination unit is connected with the input ends of the risk level determination unit, the personnel arrangement scheduling management unit and the message transmission unit.
6. A data processing method based on big data is characterized in that: the method comprises the following steps:
step Z01: calling position information of a designated person within preset time through a video analysis end, capturing face picture information through the video end, verifying contact conditions of the designated person and other persons from the video end for analysis, and executing a step Z02 when face features are not completely captured or are not clear;
step Z02: when the face features of the appointed person are not detected completely, analyzing the walking state of the appointed person, and determining the identity of the current appointed person;
step Z03: analyzing the face feature image detected in the video and the corresponding walking state and judging whether the identity is correct; when the identity of the designated person is verified to be correct, the contact between the designated person and the high-risk area continues to be verified; otherwise, steps Z01 and Z02 are repeated;
step Z04: and judging the risk level of the designated personnel according to the verification result of the step Z03, and distributing corresponding communities according to the corresponding risk levels for isolation.
7. The big-data-based data processing method according to claim 6, wherein: in step Z01, judgment is made according to the face pictures appearing in the video image and the facial features appearing in pictures taken at different angles, and the feature points of the eyes and the mouth are calculated, specifically comprising the following steps:
Z001: when only eye feature points appear in the picture, calculating the vertical distance L1 between the middle of the upper eyelid and the middle of the lower eyelid when the eyes are normally open, the straight-line distance L2 between a canthus of the left eye and the corresponding canthus of the right eye, and the included angle r between the upper and lower eyelids;
Z002: when only nose feature points appear in the picture, calculating the included angle d between the nose bridge and the philtrum region; when a person with high similarity to the designated person's features is detected in steps Z001 and Z002, verification continues with step Z003;
Z003: calculating the distance o between the perpendicular line q drawn between the centers of the two eyes and the apex of the nose bridge, the included angles a and b between the nose-bridge point and the two eyes, and the included angle between the nose-bridge point and the outer corners (tails) of the two eyes.
8. The big-data-based data processing method according to claim 6, wherein: in step Z02, when the designated person walks on a level road, the proportion of the forefoot pressed onto the plane area s before the heel touches the ground is determined; the coordinate positions of the left and right forefoot vertices e and f are set to (x1, y1) and (x2, y2), the corresponding coordinate of the forefoot apex is g(x3, y3), and the distance between e and f is L = √((x2 − x1)² + (y2 − y1)²); the area the forefoot forms on the ground is S3 = (L·y3) − S1 − S2, wherein S1 and S2 refer to the partial areas formed between e, f and the forefoot apex; the two vertices formed after the heel rubs against the ground are c(x3, y3) and d(x5, y5); when the designated person walks forward, the coordinate of the heel vertex is h(x4, y4), and the angle of the heel offset from the original walking direction on the plane is given in the specification as a formula image; the friction deviation angle of the heel on the ground of a person whose facial features show a high similarity rate is compared with the actual deviation angle, and the area formed between that person's forefoot and the ground is compared with the actual S3, so as to judge whether the identity is that of the designated person.
9. The big-data-based data processing method according to claim 6, wherein: in step Z04, once the facial features and the changes in the walking features of the designated person have confirmed the person's identity, the public security organ retrieves the designated person's travel track from point B to point E; the current positions of the high-risk areas are set to A and C, and the vehicle taken by the designated person passes through high-risk area C, where the set of passengers boarding in area C is J = {j1, j2, j3, ..., jm-1, jm} and the nearest distance to the designated person is M; taking ji as the center of a circle with movable radius r, whether the designated person is affected is judged: when M − r > k, the designated person has not contacted that passenger and only needs to be isolated in the local community; when M − r < k, the designated person has contacted that passenger, the risk degree is high, and an isolation point needs to be separately arranged for isolation, where k is the set safety distance.
10. The big-data-based data processing method according to claim 8, wherein: in step Z02, the stride speed v of the designated person while running is analyzed and combined with the area S3 formed by the forefoot and the ground and the angle by which the heel deviates from the ground (given in the specification as a formula image); these factors together are used to judge the running state of the designated person and to detect whether there are people with a high similarity to the designated person.
CN202110058251.5A 2021-01-16 2021-01-16 Data processing system and method based on big data Active CN112800885B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110058251.5A CN112800885B (en) 2021-01-16 2021-01-16 Data processing system and method based on big data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110058251.5A CN112800885B (en) 2021-01-16 2021-01-16 Data processing system and method based on big data

Publications (2)

Publication Number Publication Date
CN112800885A true CN112800885A (en) 2021-05-14
CN112800885B CN112800885B (en) 2023-09-26

Family

ID=75809955

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110058251.5A Active CN112800885B (en) 2021-01-16 2021-01-16 Data processing system and method based on big data

Country Status (1)

Country Link
CN (1) CN112800885B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115169988A (en) * 2022-09-01 2022-10-11 广东广宇科技发展有限公司 Big data based information flow control method and device, electronic equipment and medium thereof

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106504104A (en) * 2016-10-27 2017-03-15 江西瓷肌电子商务有限公司 A kind of method of social activity of being made friends based on face recognition
CN106599873A (en) * 2016-12-23 2017-04-26 安徽工程大学机电学院 Figure identity identification method based on three-dimensional attitude information
CN107066983A (en) * 2017-04-20 2017-08-18 腾讯科技(上海)有限公司 A kind of auth method and device
CN108053587A (en) * 2018-01-08 2018-05-18 京东方科技集团股份有限公司 Personal identification method and its device, carpet
CN110516623A (en) * 2019-08-29 2019-11-29 中新智擎科技有限公司 A kind of face identification method, device and electronic equipment
US20200064444A1 (en) * 2015-07-17 2020-02-27 Origin Wireless, Inc. Method, apparatus, and system for human identification based on human radio biometric information
CN111523380A (en) * 2020-03-11 2020-08-11 浙江工业大学 Mask wearing condition monitoring method based on face and posture recognition
CN111862413A (en) * 2020-07-28 2020-10-30 公安部第三研究所 Method and system for realizing epidemic situation resistant non-contact multidimensional identity rapid identification
CN112163448A (en) * 2020-08-20 2021-01-01 深圳英飞拓智能技术有限公司 Forehead temperature detection method and system based on risk grade classification and storage medium
CN112185582A (en) * 2020-09-14 2021-01-05 清华大学 Infectious disease prevention and control method and system based on active reporting data
CN112232110A (en) * 2020-08-31 2021-01-15 中天天河(天津)大数据科技有限公司 Intelligent face temperature control recognition device and epidemic prevention system

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200064444A1 (en) * 2015-07-17 2020-02-27 Origin Wireless, Inc. Method, apparatus, and system for human identification based on human radio biometric information
CN106504104A (en) * 2016-10-27 2017-03-15 江西瓷肌电子商务有限公司 A kind of method of social activity of being made friends based on face recognition
CN106599873A (en) * 2016-12-23 2017-04-26 安徽工程大学机电学院 Figure identity identification method based on three-dimensional attitude information
CN107066983A (en) * 2017-04-20 2017-08-18 腾讯科技(上海)有限公司 A kind of auth method and device
CN108053587A (en) * 2018-01-08 2018-05-18 京东方科技集团股份有限公司 Personal identification method and its device, carpet
CN110516623A (en) * 2019-08-29 2019-11-29 中新智擎科技有限公司 A kind of face identification method, device and electronic equipment
CN111523380A (en) * 2020-03-11 2020-08-11 浙江工业大学 Mask wearing condition monitoring method based on face and posture recognition
CN111862413A (en) * 2020-07-28 2020-10-30 公安部第三研究所 Method and system for realizing epidemic situation resistant non-contact multidimensional identity rapid identification
CN112163448A (en) * 2020-08-20 2021-01-01 深圳英飞拓智能技术有限公司 Forehead temperature detection method and system based on risk grade classification and storage medium
CN112232110A (en) * 2020-08-31 2021-01-15 中天天河(天津)大数据科技有限公司 Intelligent face temperature control recognition device and epidemic prevention system
CN112185582A (en) * 2020-09-14 2021-01-05 清华大学 Infectious disease prevention and control method and system based on active reporting data

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115169988A (en) * 2022-09-01 2022-10-11 广东广宇科技发展有限公司 Big data based information flow control method and device, electronic equipment and medium thereof

Also Published As

Publication number Publication date
CN112800885B (en) 2023-09-26

Similar Documents

Publication Publication Date Title
KR102444277B1 (en) Health abnormality detection method using face monitoring and thermal image monitoring, health abnormality detection device and computer program for the same
CN111523480B (en) Method and device for detecting face obstruction, electronic equipment and storage medium
US10776627B2 (en) Human flow analysis method, human flow analysis apparatus, and human flow analysis system
Kawato et al. Detection and tracking of eyes for gaze-camera control
CN101390128B (en) Detecting method and detecting system for positions of face parts
JP6517325B2 (en) System and method for obtaining demographic information
US11657650B2 (en) Techniques for automatically identifying secondary objects in a stereo-optical counting system
CN110191234B (en) Intelligent terminal unlocking method based on fixation point analysis
CN112818901A (en) Wearing mask face recognition method based on eye attention mechanism
CN114894337B (en) Temperature measurement method and device for outdoor face recognition
CN111598021B (en) Wearing detection method and device for face shield, electronic equipment and storage medium
WO2019220589A1 (en) Video analysis device, video analysis method, and program
Rezaee et al. Real-time intelligent alarm system of driver fatigue based on video sequences
CN112800885A (en) Data processing system and method based on big data
US20220036056A1 (en) Image processing apparatus and method for recognizing state of subject
An et al. VFP290k: A large-scale benchmark dataset for vision-based fallen person detection
Huang et al. Skeleton-based automatic assessment and prediction of intrusion risk in construction hazardous areas
CN111811667A (en) Temperature detection method and device, electronic equipment and readable storage medium
JP6605104B2 (en) Suspicious person detection device and program
WO2021241293A1 (en) Action-subject specifying system
US20230103555A1 (en) Information processing apparatus, information processing method, and program
Ibrahim et al. Mouth covered detection for yawn
KR102537188B1 (en) System and method for contact tracking and computer program for the same
WO2024079777A1 (en) Information processing system, information processing device, information processing method, and recording medium
CN113038381B (en) Evacuation information pushing method and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant