CN114494995A - User behavior analysis method and device, electronic equipment and storage medium

User behavior analysis method and device, electronic equipment and storage medium

Info

Publication number
CN114494995A
CN114494995A
Authority
CN
China
Prior art keywords
user
personnel
attribute
practice
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111663355.5A
Other languages
Chinese (zh)
Inventor
张勇俊 (Zhang Yongjun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Intellifusion Technologies Co Ltd
Original Assignee
Shenzhen Intellifusion Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Intellifusion Technologies Co Ltd filed Critical Shenzhen Intellifusion Technologies Co Ltd
Priority to CN202111663355.5A priority Critical patent/CN114494995A/en
Publication of CN114494995A publication Critical patent/CN114494995A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 — Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/70 — Information retrieval of video data
    • G06F 16/78 — Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/783 — Retrieval using metadata automatically derived from the content
    • G06F 16/7837 — Retrieval using objects detected or recognised in the video content
    • G06F 16/784 — Retrieval where the detected or recognised objects are people
    • G06F 16/7867 — Retrieval using information manually generated, e.g. tags, keywords, comments, title and artist information, manually generated time, location and usage information, user ratings
    • G06F 16/787 — Retrieval using geographical or spatial information, e.g. location
    • G06F 18/00 — Pattern recognition
    • G06F 18/20 — Analysing
    • G06F 18/22 — Matching criteria, e.g. proximity measures
    • G06N — COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 — Computing arrangements based on biological models
    • G06N 3/02 — Neural networks
    • G06N 3/04 — Architecture, e.g. interconnection topology
    • G06N 3/045 — Combinations of networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Library & Information Science (AREA)
  • Multimedia (AREA)
  • Databases & Information Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Molecular Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a user behavior analysis method and apparatus, an electronic device, and a storage medium. The method comprises: acquiring image information captured at a practice place to be checked; determining, according to the image information, user attribute features of a user contained in the image information; comparing the user attribute features of the user with the person attribute features of each person in a preset person library, and determining a target person whose person attribute features match the user attribute features; and determining whether the user exhibits abnormal practice behavior based on a comparison result between the practice place corresponding to the target person and the practice place to be checked. The manpower consumed by investigations can thereby be reduced and investigation efficiency improved.

Description

User behavior analysis method and device, electronic equipment and storage medium
Technical Field
The invention relates to the technical field of medical monitoring, and in particular to a user behavior analysis method and apparatus, an electronic device, and a storage medium.
Background
In recent years, the relevant national authorities have actively carried out special campaigns and supervisory inspections, strengthened medical supervision and law enforcement, and strictly investigated illegal practice in accordance with the law; these efforts have achieved notable results, and order in the medical service market has continued to improve. However, some medical institutions and medical staff lack awareness of lawful practice, and illegal practice behaviors still exist. At present, investigating medical staff who practice across regions requires law enforcement officers to conduct on-site inspections, so the investigation process consumes excessive manpower and investigation efficiency is low.
Disclosure of Invention
In a first aspect, a primary object of the present invention is to provide a user behavior analysis method, including:
acquiring image information captured at a practice place to be checked;
determining, according to the image information, user attribute features of a user contained in the image information;
comparing the user attribute features of the user with the person attribute features of each person in a preset person library, and determining a target person whose person attribute features match the user attribute features;
and determining whether the user exhibits abnormal practice behavior based on a comparison result between the practice place corresponding to the target person and the practice place to be checked.
Optionally, the user attribute features include face attribute features and human body attribute features, and the person attribute features include face attribute features; the step of comparing the user attribute features of the user with the person attribute features of each person in a preset person library and determining a target person whose person attribute features match the user attribute features comprises the following steps:
comparing the user attribute features of the user with the person attribute features of each person in the preset person library;
and when a person whose person attribute features match the user attribute features exists in the preset person library and the human body attribute features are detected to match preset human body attribute features, determining that person as the target person, wherein the preset human body attribute features include at least one of a clothing feature and a body shape feature.
Optionally, the user behavior analysis method further includes: if the user attribute features determined from a plurality of pieces of image information containing the same user all match the person attribute features of the target person,
acquiring the capture times of the image information captured at the practice place to be checked;
in this case, the step of determining whether the user exhibits abnormal practice behavior based on the comparison result between the practice place corresponding to the target person and the practice place to be checked comprises:
judging whether the practice place corresponding to the target person is consistent with the practice place to be checked;
when the practice place corresponding to the target person is inconsistent with the practice place to be checked, judging, according to the capture times, whether the plurality of pieces of image information can be accumulated as a sub-period;
and determining whether the user exhibits abnormal practice behavior according to a plurality of sub-periods accumulated within a predetermined period.
Optionally, the judging, according to the capture times, whether the plurality of pieces of image information can be accumulated as a sub-period includes:
counting the number of appearances of the user within each sub-period according to the capture times of the plurality of pieces of image information, wherein pieces of image information sharing the same capture time are counted as one appearance;
judging whether the number of appearances of the user within each sub-period exceeds a preset count threshold;
and when the preset count threshold is exceeded, accumulating that sub-period as one accumulated sub-period, thereby obtaining a plurality of sub-periods.
Optionally, the determining whether the user exhibits abnormal practice behavior according to a plurality of sub-periods accumulated by the user within a predetermined period includes:
judging whether the sub-periods accumulated by the user within the predetermined period exceed a preset duration threshold;
and when the preset duration threshold is exceeded, determining that the user exhibits abnormal practice behavior, and outputting the result to generate a cross-practice warning table.
In a second aspect, an embodiment of the present invention provides a user behavior analysis apparatus, including:
a first acquisition module, configured to acquire image information captured at a practice place to be checked;
a first determining module, configured to determine, according to the image information, the user attribute features of a user contained in the image information;
a comparison module, configured to compare the user attribute features of the user with the person attribute features of each person in a preset person library and determine a target person whose person attribute features match the user attribute features;
and a second determining module, configured to determine whether the user exhibits abnormal practice behavior based on a comparison result between the practice place corresponding to the target person and the practice place to be checked.
Optionally, the user attribute features include face attribute features and human body attribute features, and the person attribute features include face attribute features; the comparison module comprises:
a comparison unit, configured to compare the user attribute features of the user with the person attribute features of each person in a preset person library;
and a first determining unit, configured to determine, as the target person, a person whose person attribute features match the user attribute features when such a person exists in the preset person library and the human body attribute features are detected to match preset human body attribute features, wherein the preset human body attribute features include at least one of a clothing feature and a body shape feature.
Optionally, the apparatus further comprises: a second acquisition module, configured to acquire the capture times of the image information captured at the practice place to be checked;
and the second determining module includes:
a judging unit, configured to judge whether the practice place corresponding to the target person is consistent with the practice place to be checked;
a second determining unit, configured to determine, according to the capture times, a plurality of sub-periods accumulated by the user within a predetermined period when the practice place corresponding to the target person is inconsistent with the practice place to be checked;
and a third determining unit, configured to determine whether the user exhibits abnormal practice behavior according to the plurality of sub-periods accumulated within the predetermined period.
In a third aspect, an embodiment of the present invention provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the user behavior analysis method described above.
In a fourth aspect, the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the user behavior analysis method described above.
The scheme of the invention provides at least the following beneficial effects:
the user behavior analysis method provided by the invention first acquires image information captured at a practice place to be checked; determines, according to the image information, the user attribute features of a user contained in the image information; compares the user attribute features of the user with the person attribute features of each person in a preset person library to determine a target person whose person attribute features match the user attribute features; and determines whether the user exhibits abnormal practice behavior based on a comparison result between the practice place corresponding to the target person and the practice place to be checked. The manpower consumed by investigations can thereby be reduced and investigation efficiency improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and that those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a network architecture diagram of a user behavior analysis method according to an embodiment of the present invention;
fig. 2 is an overall flowchart schematic diagram of a user behavior analysis method according to an embodiment of the present invention;
fig. 3 is a schematic flowchart of step S30 according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of step S40 according to an embodiment of the present invention;
fig. 5 is a schematic flowchart of step S43 according to an embodiment of the present invention;
fig. 6 is another schematic flow chart of a user behavior analysis method according to an embodiment of the present invention;
fig. 7 is a schematic flowchart of step 31 according to an embodiment of the present invention;
fig. 8 is a block diagram of a user behavior analysis apparatus according to an embodiment of the present invention;
fig. 9 is a block diagram of a comparison module according to an embodiment of the present invention;
fig. 10 is a block diagram of a second determining module according to an embodiment of the present invention;
fig. 11 is a block diagram of an electronic device according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings in the embodiments of the present invention. It is obvious that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
The terms "first," "second," and "third," etc. in the description and claims of the present invention and the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "comprises" and any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements but may alternatively include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
First, a network architecture to which the scheme of the embodiments of the present application may be applied is described by way of example with reference to the accompanying drawings.
As shown in fig. 1, the network architecture involves a server, image capture devices deployed in medical institutions, and a medical management platform. The server stores, for each monitored medical worker, information such as face features, human body features, and the registered practice place. The image capture devices, which may be cameras, snapshot machines, and the like, collect surveillance image data in and around a hospital. The medical management platform receives the data uploaded by the server and displays it to supervisory staff, so that behavior analysis can subsequently be performed on the medical workers' data to determine whether a medical worker is practicing across regions.
As shown in fig. 2, a specific embodiment of the present invention provides a user behavior analysis method, including:
and S10, acquiring the image information shot at the position to be detected for the practice.
In this embodiment, the place to be detected for medical practice may be a non-registered medical practice place or a registered medical practice place, the user may be a medical care professional, the image information of the medical care professional may be acquired by an image acquisition device inside the medical institution in real time or at regular time, for example, a photo taken by a camera every hour or a multi-frame picture in a video stream taken by the camera in real time, and the image information may be in a jpg format, a PNG format, a TIF format, a BMP format, or the like; it can be understood that the image information may also be a grayscale image, or may also be an RGB image, a YUV image, or an HSV image, and the image information may include a scene image and a person image, for example, which are shot by medical staff in a medical hall, an emergency center, a medical office, or other places; the server can determine corresponding position information and acquisition time in the image information according to the acquired image information of the medical personnel, so that the medical personnel can determine the operation place where the medical personnel is located through the image information, and further judge whether the medical personnel cross the region to operate.
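Purely as an illustrative aid, and not part of the disclosed embodiments, the record that an image capture device uploads to the server might be sketched as follows; every field name here is a hypothetical assumption:

```python
# Minimal sketch, assuming hypothetical field names: the record an image
# capture device might upload to the server for later behavior analysis.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CapturedImage:
    device_id: str         # which camera or snapshot machine took the frame
    place_id: str          # identifier of the practice place being checked
    captured_at: datetime  # capture time, used later for sub-period accumulation
    path: str              # stored frame, e.g. in JPG, PNG, TIF, or BMP format
```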
S20, determining, according to the image information, the user attribute features of the user contained in the image information.
In this embodiment, the server may perform feature extraction on the image information through a feature extraction network model, thereby determining the user attribute features corresponding to the user. The feature extraction network model may be a neural network model such as VGG19, ResNet, or MobileNet. Feature extraction means extracting useful data or information from an image to obtain a non-image representation or description of it, such as numerical values, vectors, or symbols. It can be understood that by performing feature extraction on the image information through the feature extraction network, the human body attribute features, face attribute features, and the like corresponding to the user can be determined.
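As a hedged illustration of what such a feature extraction step could look like — not the invention's actual implementation — the following sketch uses a pretrained MobileNet backbone from torchvision; the choice of backbone, the preprocessing, and the 224×224 input size are all assumptions:

```python
# Minimal sketch: extracting a feature vector (user attribute features)
# from one piece of image information with a pretrained CNN backbone.
# Assumption: torchvision's MobileNetV3; VGG19 or ResNet would work similarly.
import torch
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

backbone = models.mobilenet_v3_small(weights="DEFAULT")
backbone.classifier = torch.nn.Identity()  # keep the pooled features, drop the classifier head
backbone.eval()

def extract_features(image_path: str) -> torch.Tensor:
    """Return a non-image representation (a vector) of one captured image."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # add a batch dimension
    with torch.no_grad():
        return backbone(batch).squeeze(0)   # e.g. a 576-dimensional embedding
```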
For example, suppose image information acquired by a camera in a hospital contains a medical worker A and a patient B. To determine the user attribute features of medical worker A, the user attribute features of both A and B can be extracted through the feature extraction network and then compared with the medical staff data stored in the server's preset person library, whereby A is determined to be a medical worker and B is determined to be a non-medical worker.
S30, comparing the user attribute features of the user with the person attribute features of each person in a preset person library, and determining a target person whose person attribute features match the user attribute features.
In this embodiment, the user attribute features include face attribute features and human body attribute features, and the person attribute features include face attribute features. The person attribute features may be stored in the server's preset person library and may be face attribute features of a plurality of users collected over historical time, for example collected from medical staff during working hours; the face attribute features include features such as whether the user wears glasses, and the nose, mouth, hairstyle, eyebrows, ears, and so on. Therefore, when comparing the extracted user attribute features with the person attribute features, the user's face attribute features can be compared with the face attribute features in the preset person library while the human body attribute features are compared with the preset human body attribute features; alternatively, the human body attribute features may be compared with the preset human body attribute features after the face attribute features have been compared. This ensures higher accuracy when recognizing the user in the image information.
As shown in fig. 3, the comparing the user attribute features of the user with the person attribute features of each person in the preset person library and determining a target person whose person attribute features match the user attribute features includes:
S31, comparing the user attribute features of the user with the person attribute features of each person in the preset person library;
S32, when a person whose person attribute features match the user attribute features exists in the preset person library and the human body attribute features are detected to match preset human body attribute features, determining that person as the target person, wherein the preset human body attribute features include at least one of a clothing feature and a body shape feature.
In this embodiment, when the user's face attribute features are compared with the preset face attribute features, a successful comparison indicates that face attribute features corresponding to the user exist in the server's preset person library, while a failed comparison indicates that no corresponding face attribute features exist there; in the latter case, the user's image information can be classified and stored, or deleted, and the image information of other users can be processed in the meantime. It can be understood that the target person is a medical worker and may belong to a first category or a second category: the first category is medical workers in a working state, and the second category is medical workers in a non-working state (for example, medical workers who are off duty). The preset human body attribute features may be specific clothing features, for example medical work clothing such as a white coat, a pink nurse uniform, a blue surgical gown, or a protective suit, and may of course also include body shape features, for example differences in height or build. After the face attribute features are successfully matched, the user's human body attribute features can be compared with the preset human body attribute features; by doing so, it can be determined whether the user is an on-duty medical worker, making the recognition more accurate.
For example, suppose a piece of image information acquired by a camera in hospital A contains a medical worker B wearing a white coat, a patient C, and a medical worker D not wearing a white coat. The faces of B, C, and D can be compared simultaneously with the medical staff data stored in the server's preset person library. For B, the face comparison succeeds, after which B's clothing features are compared with the clothing features stored in the preset person library; since this comparison also succeeds, B is determined to be a medical worker in a working state. For C, the face comparison with the stored medical staff data fails, so C is determined to be a non-medical worker. For D, the face comparison succeeds, after which D's clothing features are compared with the preset clothing features; since this comparison fails, D is determined to be a medical worker in a non-working state.
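A minimal sketch of this two-stage matching, assuming a cosine-similarity face comparison and set-valued clothing labels — the threshold value and the library layout are assumptions, not taken from the disclosure:

```python
# Minimal sketch: face attribute features are compared first; only after a
# successful face match are the human body (clothing) attributes checked.
from dataclasses import dataclass
import numpy as np

FACE_MATCH_THRESHOLD = 0.8  # assumed similarity threshold

@dataclass
class LibraryPerson:
    person_id: str
    face_feature: np.ndarray   # stored face attribute feature
    work_clothing: set         # e.g. {"white_coat", "blue_surgical_gown"}
    registered_place: str      # pre-registered practice place

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_target_person(face_feature, clothing_labels, library):
    """Return the matched on-duty person from the preset person library, or None."""
    best = max(library, key=lambda p: cosine(face_feature, p.face_feature))
    if cosine(face_feature, best.face_feature) < FACE_MATCH_THRESHOLD:
        return None   # no matching person attribute features: e.g. a patient
    if not (set(clothing_labels) & best.work_clothing):
        return None   # face matched, but not in work clothing: off-duty worker
    return best       # target person in a working state
```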
S40, determining whether the user exhibits abnormal practice behavior based on the comparison result between the practice place corresponding to the target person and the practice place to be checked.
In this embodiment, normal practice behavior means that the user works at the registered practice place, and abnormal practice behavior means that the user works at an unregistered practice place. The server can compare the practice place to be checked with the practice place corresponding to the target person to determine the relationship between the two, and can thereby determine whether the user exhibits abnormal practice behavior.
The user behavior analysis method provided by the invention further includes: if the user attribute features determined from a plurality of pieces of image information containing the same user all match the person attribute features of the target person, acquiring the capture times of the image information captured at the practice place to be checked.
as shown in fig. 4, the determining whether the user has an abnormal practicing behavior based on the comparison result between the practicing location corresponding to the target person and the place to be checked includes:
s41, judging whether the corresponding place for practicing of the target person is consistent with the place for practicing to be detected;
s42, when the corresponding place of the target person is inconsistent with the place of the practice to be detected, judging whether the information of a plurality of images can be accumulated as a sub-time period according to the shooting time;
and S43, determining whether the user has abnormal practice behavior according to a plurality of sub-time periods accumulated by the user within the preset time period.
The predetermined period comprises a plurality of sub-periods; a sub-period may be, for example, 1 day, and the predetermined period may be 30 days, so the accumulated sub-periods are expressed as a number of days, such as 2 days or 5 days. When the practice place where the user currently appears is inconsistent with the registered practice place, the user is exhibiting cross-region practice behavior, and whether this cross-region practice occurs too often is judged in order to determine whether it constitutes abnormal practice behavior. It can be understood that the practice place corresponding to the target person may be pre-stored in the server and may be the practice place pre-registered by the user.
As shown in fig. 5, the judging, according to the capture times, whether the plurality of pieces of image information can be accumulated as a sub-period includes:
S421, counting the number of appearances of the user within each sub-period according to the capture times of the plurality of pieces of image information, wherein pieces of image information sharing the same capture time are counted as one appearance;
S422, judging whether the number of appearances of the user within each sub-period exceeds a preset count threshold;
S423, when the preset count threshold is exceeded, accumulating that sub-period as one accumulated sub-period, thereby obtaining a plurality of sub-periods.
In this embodiment, the preset count threshold may be 3, 4, or 5 appearances, and the preset duration threshold may be 5 days, 6 days, and so on. Whether the number of times the user appears at the practice place to be checked within a sub-period exceeds the preset count threshold is judged, and when it does, that sub-period is accumulated; that is, a sub-period whose appearance count exceeds the threshold is counted as one day. Multiple images captured at the same moment can be treated as a single appearance; for example, 5 photos taken within 2 seconds are counted as one appearance. The plurality of sub-periods is thus determined by counting the appearances per day and accumulating the qualifying days within the predetermined period.
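As a non-authoritative sketch of this counting step, assuming a sub-period of one day and a count threshold of 3 (both example values from the text):

```python
# Minimal sketch: count the user's appearances per sub-period (one day),
# where frames sharing the same capture time collapse to a single appearance,
# and accumulate each day whose count exceeds the preset count threshold.
from collections import defaultdict

COUNT_THRESHOLD = 3  # preset count threshold (example value)

def accumulated_sub_periods(capture_times: list) -> int:
    """Return the number of days that can be accumulated as sub-periods."""
    per_day = defaultdict(set)
    for t in capture_times:       # t is a datetime.datetime
        per_day[t.date()].add(t)  # identical capture times count once
    return sum(1 for times in per_day.values() if len(times) > COUNT_THRESHOLD)
```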
Wherein the determining whether the user exhibits abnormal practice behavior according to a plurality of sub-periods accumulated by the user within the predetermined period comprises:
S431, judging whether the sub-periods accumulated by the user within the predetermined period exceed a preset duration threshold;
S432, when the preset duration threshold is exceeded, determining that the user exhibits abnormal practice behavior, and outputting the result to generate a cross-practice warning table.
It can be understood that whether the accumulated sub-periods exceed the preset duration threshold is what determines whether the user's practice behavior is normal or abnormal: when the accumulated sub-periods exceed the preset duration threshold, the user exhibits abnormal practice behavior, and when they do not, the user's practice behavior is normal.
For example, suppose the registered practice place of medical worker A is place a, and A appears 10 times in one day at the current practice place b; the working time of A at place b can then be accumulated as 1 day. If the time A accumulates at place b within 30 days exceeds 5 days, A exhibits abnormal practice behavior; if it does not exceed 5 days, A does not exhibit abnormal practice behavior (this may be, for example, a short-term secondment). The practice behavior of medical staff can thus be judged accurately.
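The final decision might be sketched as follows, using the example values from this paragraph (a 30-day predetermined period and a 5-day duration threshold); the warning-record fields are hypothetical:

```python
# Minimal sketch: decide whether the accumulated sub-periods within the
# predetermined period indicate abnormal practice behavior, and if so
# produce an entry for the cross-practice warning table.
WINDOW_DAYS = 30              # predetermined period (example value)
DURATION_THRESHOLD_DAYS = 5   # preset duration threshold (example value)

def check_abnormal_practice(person_id, registered_place, observed_place,
                            accumulated_days):
    if observed_place == registered_place:
        return None   # normal practice at the registered practice place
    if accumulated_days <= DURATION_THRESHOLD_DAYS:
        return None   # e.g. a short-term secondment; not flagged
    return {          # hypothetical warning-table entry
        "person_id": person_id,
        "registered_place": registered_place,
        "observed_place": observed_place,
        "accumulated_days": accumulated_days,
        "window_days": WINDOW_DAYS,
        "status": "abnormal_practice",
    }
```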
The user behavior analysis method provided by the invention first acquires image information captured at a practice place to be checked; determines, according to the image information, the user attribute features of a user contained in the image information; compares the user attribute features of the user with the person attribute features of each person in a preset person library to determine a target person whose person attribute features match the user attribute features; and determines whether the user exhibits abnormal practice behavior based on a comparison result between the practice place corresponding to the target person and the practice place to be checked. The manpower consumed by investigations can thereby be reduced and investigation efficiency improved.
As shown in fig. 6, in an alternative embodiment, the image information is obtained from a non-registered practice place or a preset practice place; the user behavior analysis method provided by the invention further includes:
31. processing the image information acquired from the non-registered practice place and the image information acquired from the preset practice place to obtain a processing result;
32. determining, according to the processing result, the position information of the user as a non-registered practice place or the preset practice place.
The image information of the non-registered practice places and the preset practice place can be analyzed simultaneously, and there may be multiple non-registered practice places; when the user appears at the preset practice place, the user does not appear at any non-registered practice place. For example, suppose hospital A, hospital B, and hospital C manage their medical staff information through the same medical management platform, and the user's registered practice place is hospital A. The image information of hospital A can be analyzed first: when the user appears in it, the user's current location is the preset practice place; when the user does not appear in it, the image information of hospitals B and C can be queried to determine the user's current location.
As shown in fig. 7, the processing of the image information acquired from the non-registered practice place and the image information acquired from the preset practice place to obtain a processing result includes:
311. acquiring first image information of the preset practice place and second image information of a plurality of non-registered practice places;
312. judging whether the user appears in the first image information;
313. when the user appears in the first image information, determining the position information of the user as the preset practice place;
314. when the user does not appear in the first image information, querying the image information of the plurality of non-registered practice places to determine the position information corresponding to the user.
In this embodiment, the server may obtain the first image information of the preset practice place and the second image information of a plurality of non-registered practice places, and then judge whether the user appears in the first image information. If the user appears in the first image information, the user's behavior can be determined to be normal practice. When the user does not appear in the first image information, whether the user appears in the second image information can be queried, and if the user does appear there, the subsequent steps such as analyzing the user attribute features and the position information can be performed for that user. Recognition is thereby made more accurate, and medical staff can be investigated more efficiently.
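A minimal sketch of this query order, assuming a hypothetical helper appears_in that wraps the face/body matching described above:

```python
# Minimal sketch: check the registered (preset) practice place's images
# first, and only fall back to the non-registered places when needed.
from typing import Callable, List, Optional

def locate_user(user_id: str, registered_place: str, other_places: List[str],
                appears_in: Callable[[str, str], bool]) -> Optional[str]:
    """Return the place whose captured images contain the user, if any."""
    if appears_in(user_id, registered_place):
        return registered_place            # normal practice; no further queries
    for place in other_places:             # query non-registered places in turn
        if appears_in(user_id, place):
            return place
    return None                            # user not observed in this round
```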
As shown in fig. 8, an embodiment of the present invention provides a user behavior analysis apparatus 10, including:
the first acquisition module 11 is configured to acquire image information captured at a practice place to be checked;
the first determining module 12 is configured to determine, according to the image information, the user attribute features of a user contained in the image information;
the comparison module 13 is configured to compare the user attribute features of the user with the person attribute features of each person in a preset person library and determine a target person whose person attribute features match the user attribute features;
and the second determining module 14 is configured to determine whether the user exhibits abnormal practice behavior based on a comparison result between the practice place corresponding to the target person and the practice place to be checked.
As shown in fig. 9, the user attribute features include face attribute features and human body attribute features, and the person attribute features include face attribute features; the comparison module 13 includes:
the comparison unit 131, configured to compare the user attribute features of the user with the person attribute features of each person in the preset person library;
and the first determining unit 132, configured to determine, as the target person, a person whose person attribute features match the user attribute features when such a person exists in the preset person library and the human body attribute features are detected to match preset human body attribute features, wherein the preset human body attribute features include at least one of a clothing feature and a body shape feature.
The user behavior analysis apparatus 10 further includes:
the second acquisition module 15, configured to acquire the capture times of the image information captured at the practice place to be checked;
as shown in fig. 10, the second determination module 14 includes:
the judging unit 141, configured to judge whether the practice place corresponding to the target person is consistent with the practice place to be checked;
the second determining unit 142, configured to determine, according to the capture times, a plurality of sub-periods accumulated by the user within a predetermined period when the practice place corresponding to the target person is inconsistent with the practice place to be checked;
and the third determining unit 143, configured to determine whether the user exhibits abnormal practice behavior according to the plurality of sub-periods accumulated within the predetermined period.
The user behavior analysis apparatus provided by the invention first acquires image information captured at a practice place to be checked; determines, according to the image information, the user attribute features of a user contained in the image information; compares the user attribute features of the user with the person attribute features of each person in a preset person library to determine a target person whose person attribute features match the user attribute features; and determines whether the user exhibits abnormal practice behavior based on a comparison result between the practice place corresponding to the target person and the practice place to be checked. The manpower consumed by investigations can thereby be reduced and investigation efficiency improved.
It should be noted that the user behavior analysis apparatus 10 provided in this specific embodiment is the apparatus corresponding to the user behavior analysis method: all embodiments of the user behavior analysis method apply to the apparatus 10, the corresponding modules in the embodiments of the apparatus 10 correspond to the steps of the method, and the same or similar beneficial effects can be achieved. To avoid excessive repetition, the modules of the user behavior analysis apparatus 10 are not described here in detail again.
As shown in fig. 11, an embodiment of the present invention further provides an electronic device 20, which includes a memory 202, a processor 201, and a computer program stored in the memory 202 and executable on the processor 201, wherein the processor 201, when executing the computer program, implements the steps of the user behavior analysis method.
Specifically, the processor 201 is configured to call the computer program stored in the memory 202, and execute the following steps:
acquiring image information captured at a practice place to be checked;
determining, according to the image information, the user attribute features of a user contained in the image information;
comparing the user attribute features of the user with the person attribute features of each person in a preset person library, and determining a target person whose person attribute features match the user attribute features;
and determining whether the user exhibits abnormal practice behavior based on a comparison result between the practice place corresponding to the target person and the practice place to be checked.
Optionally, the user attribute features include face attribute features and human body attribute features, and the person attribute features include face attribute features;
the comparing, performed by the processor 201, of the user attribute features of the user with the person attribute features of each person in the preset person library to determine a target person whose person attribute features match the user attribute features includes:
comparing the user attribute features of the user with the person attribute features of each person in the preset person library;
and when a person whose person attribute features match the user attribute features exists in the preset person library and the human body attribute features are detected to match preset human body attribute features, determining that person as the target person, wherein the preset human body attribute features include at least one of a clothing feature and a body shape feature.
Optionally, the user behavior analysis method executed by the processor 201 further includes: acquiring the capture times of the image information captured at the practice place to be checked; and the determining whether the user exhibits abnormal practice behavior based on the comparison result between the practice place corresponding to the target person and the practice place to be checked includes:
judging whether the practice place corresponding to the target person is consistent with the practice place to be checked;
when the practice place corresponding to the target person is inconsistent with the practice place to be checked, determining, according to the capture times, a plurality of sub-periods accumulated by the user within a predetermined period;
and determining whether the user exhibits abnormal practice behavior according to the plurality of sub-periods accumulated within the predetermined period.
The determining, by the processor 201, of a plurality of sub-periods accumulated by the user within a predetermined period according to the capture times includes:
counting the number of appearances of the user within each sub-period according to the capture times, wherein multiple images captured at the same time are counted as one appearance;
judging whether the number of appearances of the user within each sub-period exceeds a preset count threshold;
and when the preset count threshold is exceeded, accumulating that sub-period as one accumulated sub-period, thereby obtaining a plurality of sub-periods.
The determining, by the processor 201, whether the user exhibits abnormal practice behavior according to a plurality of sub-periods accumulated by the user within a predetermined period includes:
judging whether the sub-periods accumulated by the user within the predetermined period exceed a preset duration threshold;
and when the preset duration threshold is exceeded, determining that the user exhibits abnormal practice behavior, and outputting the result to generate a cross-practice warning table.
That is, in the embodiment of the present invention, when the processor 201 of the electronic device 20 executes the computer program, the steps of the user behavior analysis method are implemented, so the manpower consumed by investigations can be reduced and investigation efficiency improved.
It should be noted that, since the steps of the user behavior analysis method are implemented when the processor 201 of the electronic device 20 executes the computer program, all embodiments of the user behavior analysis method apply to the electronic device 20 and can achieve the same or similar beneficial effects.
The computer-readable storage medium provided in the embodiments of the present invention stores a computer program which, when executed by a processor, implements each process of the user behavior analysis method or of the application-side user behavior analysis method provided in the embodiments of the present invention, and can achieve the same technical effects; to avoid repetition, the details are not described here again.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program instructing related hardware; the program may be stored in a computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above description is only a preferred embodiment of the present invention and is not intended to limit the scope of the present invention; any equivalent structural or flow transformation made using the contents of the specification and the accompanying drawings, applied directly or indirectly in other related technical fields, is likewise included within the protection scope of the present invention.

Claims (10)

1. A user behavior analysis method, characterized by comprising:
acquiring image information captured at a practice place to be checked;
determining, according to the image information, user attribute features of a user contained in the image information;
comparing the user attribute features of the user with the person attribute features of each person in a preset person library, and determining a target person whose person attribute features match the user attribute features;
and determining whether the user exhibits abnormal practice behavior based on a comparison result between the practice place corresponding to the target person and the practice place to be checked.
2. The user behavior analysis method according to claim 1, wherein the user attribute features include face attribute features and human body attribute features, and the person attribute features include face attribute features; the step of comparing the user attribute features of the user with the person attribute features of each person in a preset person library and determining a target person whose person attribute features match the user attribute features comprises the following steps:
comparing the user attribute features of the user with the person attribute features of each person in the preset person library;
and when a person whose person attribute features match the user attribute features exists in the preset person library and the human body attribute features are detected to match preset human body attribute features, determining that person as the target person, wherein the preset human body attribute features include at least one of a clothing feature and a body shape feature.
3. The user behavior analysis method according to claim 1, further comprising: if the user attribute features determined from a plurality of pieces of image information containing the same user all match the person attribute features of the target person,
acquiring the capture times of the image information captured at the practice place to be checked;
wherein the step of determining whether the user exhibits abnormal practice behavior based on the comparison result between the practice place corresponding to the target person and the practice place to be checked comprises:
judging whether the practice place corresponding to the target person is consistent with the practice place to be checked;
when the practice place corresponding to the target person is inconsistent with the practice place to be checked, judging, according to the capture times, whether the plurality of pieces of image information can be accumulated as a sub-period;
and determining whether the user exhibits abnormal practice behavior according to a plurality of sub-periods accumulated within a predetermined period.
4. The user behavior analysis method according to claim 3, wherein the judging, according to the capture times, whether the plurality of pieces of image information can be accumulated as a sub-period comprises:
counting the number of appearances of the user within each sub-period according to the capture times of the plurality of pieces of image information, wherein pieces of image information sharing the same capture time are counted as one appearance;
judging whether the number of appearances of the user within each sub-period exceeds a preset count threshold;
and when the preset count threshold is exceeded, accumulating that sub-period as one accumulated sub-period, thereby obtaining a plurality of sub-periods.
5. The user behavior analysis method according to claim 3, wherein the determining whether the user exhibits abnormal practice behavior according to a plurality of sub-periods accumulated within a predetermined period comprises:
judging whether the sub-periods accumulated by the user within the predetermined period exceed a preset duration threshold;
and when the preset duration threshold is exceeded, determining that the user exhibits abnormal practice behavior, and outputting the result to generate a cross-practice warning table.
6. A user behavior analysis apparatus, characterized by comprising:
a first acquisition module, configured to acquire image information captured at a practice place to be checked;
a first determining module, configured to determine, according to the image information, the user attribute features of a user contained in the image information;
a comparison module, configured to compare the user attribute features of the user with the person attribute features of each person in a preset person library and determine a target person whose person attribute features match the user attribute features;
and a second determining module, configured to determine whether the user exhibits abnormal practice behavior based on a comparison result between the practice place corresponding to the target person and the practice place to be checked.
7. The apparatus according to claim 6, wherein the user attribute features include face attribute features and human body attribute features, and the person attribute features include face attribute features; the comparison module comprises:
a comparison unit, configured to compare the user attribute features of the user with the person attribute features of each person in the preset person library;
and a first determining unit, configured to determine, as the target person, a person whose person attribute features match the user attribute features when such a person exists in the preset person library and the human body attribute features are detected to match preset human body attribute features, wherein the preset human body attribute features include at least one of a clothing feature and a body shape feature.
8. The apparatus according to claim 6, further comprising: a second acquisition module, configured to acquire the capture times of the image information captured at the practice place to be checked;
wherein the second determining module comprises:
a judging unit, configured to judge whether the practice place corresponding to the target person is consistent with the practice place to be checked;
a second determining unit, configured to determine, according to the capture times, a plurality of sub-periods accumulated by the user within a predetermined period when the practice place corresponding to the target person is inconsistent with the practice place to be checked;
and a third determining unit, configured to determine whether the user exhibits abnormal practice behavior according to the plurality of sub-periods accumulated within the predetermined period.
9. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the user behavior analysis method according to any one of claims 1 to 5.
10. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the user behavior analysis method according to any one of claims 1 to 5.
CN202111663355.5A 2021-12-31 2021-12-31 User behavior analysis method and device, electronic equipment and storage medium Pending CN114494995A (en)

Priority Applications (1)

Application Number: CN202111663355.5A · Publication: CN114494995A (en) · Priority Date: 2021-12-31 · Filing Date: 2021-12-31 · Title: User behavior analysis method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number: CN202111663355.5A · Publication: CN114494995A (en) · Priority Date: 2021-12-31 · Filing Date: 2021-12-31 · Title: User behavior analysis method and device, electronic equipment and storage medium

Publications (1)

Publication Number: CN114494995A · Publication Date: 2022-05-13

Family

ID=81508210

Family Applications (1)

Application Number: CN202111663355.5A · Status: Pending · Publication: CN114494995A (en) · Title: User behavior analysis method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114494995A (en)

Similar Documents

Publication Publication Date Title
CN110738135B (en) Method and system for judging and guiding worker operation step standard visual recognition
CN112766050B (en) Dressing and operation checking method, computer device and storage medium
CN110648352B (en) Abnormal event detection method and device and electronic equipment
JP2020078058A (en) Image processing apparatus, monitoring system, image processing method, and program
CN110532988B (en) Behavior monitoring method and device, computer equipment and readable storage medium
CN111753643B (en) Character gesture recognition method, character gesture recognition device, computer device and storage medium
CN110321852B (en) Action type identification method and device, storage medium and computer equipment
CN114937232B (en) Wearing detection method, system and equipment for medical waste treatment personnel protective appliance
RU2724785C1 (en) System and method of identifying personal protective equipment on a person
JP2007102482A (en) Automatic counting apparatus, program, and method
CN114821483B (en) Monitoring method and system capable of measuring temperature and applied to monitoring video
CN113408464A (en) Behavior detection method and device, electronic equipment and storage medium
CN113822164A (en) Dynamic emotion recognition method and device, computer equipment and storage medium
CN113392765A (en) Tumble detection method and system based on machine vision
CN110909684A (en) Working state checking system and method based on human body detection
US20220130148A1 (en) System and Method for Identifying Outfit on a Person
CN111582183A (en) Mask identification method and system in public place
CN110443187B (en) Recording method and device of characteristic information
CN114494995A (en) User behavior analysis method and device, electronic equipment and storage medium
CN110751125A (en) Wearing detection method and device
CN109801394B (en) Staff attendance checking method and device, electronic equipment and readable storage medium
CN116229502A (en) Image-based tumbling behavior identification method and equipment
CN109190495A (en) Gender identification method, device and electronic equipment
CN113835950A (en) Interface display stuck identification method and device, storage medium and electronic equipment
CN113344124A (en) Trajectory analysis method and device, storage medium and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination