CN114092045A - Community profiling method and device based on Internet of things and storage medium - Google Patents

Community profiling method and device based on Internet of things and storage medium

Info

Publication number
CN114092045A
CN114092045A (application CN202111326592.2A)
Authority
CN
China
Prior art keywords
behavior
target object
information
determining
behaviors
Prior art date
Legal status
Pending
Application number
CN202111326592.2A
Other languages
Chinese (zh)
Inventor
漆发明
Current Assignee
Shenzhen Wenjun Chuangyi Advertising Culture Media Co ltd
Original Assignee
Shenzhen Wenjun Chuangyi Advertising Culture Media Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Wenjun Chuangyi Advertising Culture Media Co ltd filed Critical Shenzhen Wenjun Chuangyi Advertising Culture Media Co ltd
Priority to CN202111326592.2A
Publication of CN114092045A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/103Workflow collaboration or project management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services

Landscapes

  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Engineering & Computer Science (AREA)
  • Tourism & Hospitality (AREA)
  • Entrepreneurship & Innovation (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Theoretical Computer Science (AREA)
  • Economics (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Quality & Reliability (AREA)
  • Data Mining & Analysis (AREA)
  • Operations Research (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The embodiments of this application disclose a community profiling method, device, and storage medium based on the Internet of Things, in which a server is connected to at least one terminal device and at least one camera. The method includes the following steps: acquiring registration information of a target object through the at least one terminal device; acquiring a first video clip of the target object in a first preset time period through the at least one camera; analyzing the first video clip to obtain first associated information of the target object; and performing a filing operation according to the registration information and the first associated information to obtain first file information of the target object. By adopting the embodiments of this application, the practicability of the file can be improved.

Description

Community profiling method and device based on Internet of things and storage medium
Technical Field
The application relates to the technical field of Internet of things, in particular to a community filing method and device based on the Internet of things and a storage medium.
Background
At present, with social progress, the mobility of people is gradually increasing, which makes personnel management more difficult. Although some departments or systems establish personnel files for the convenience of staff management, the established files often include only a user's basic information, such as the user's name or contact details. Such a personnel file cannot effectively reflect the user's basic situation. Therefore, existing personnel files suffer from poor practicability.
Disclosure of Invention
The embodiment of the application provides a community filing method and device based on the Internet of things and a storage medium, and the practicability of files can be improved.
In a first aspect, an embodiment of the present application provides a community profiling method based on the internet of things, which is applied to a server, where the server is connected to at least one terminal device and at least one camera, and the method includes:
acquiring registration information of a target object through the at least one terminal device;
acquiring a first video clip of the target object in a first preset time period through the at least one camera;
analyzing according to the first video clip to obtain first associated information of the target object;
and performing a filing operation according to the registration information and the first associated information to obtain first file information of the target object.
In a second aspect, an embodiment of the present application provides an Internet-of-Things-based community profiling apparatus applied to a server, where the server is connected to at least one terminal device and at least one camera. The apparatus includes: a first acquisition unit, a second acquisition unit, an analysis unit, and a filing unit, wherein,
the first acquisition unit is used for acquiring the registration information of the target object through the at least one terminal device;
the second acquiring unit is used for acquiring a first video clip of the target object in a first preset time period through the at least one camera;
the analysis unit is used for analyzing according to the first video clip to obtain first associated information of the target object;
and the filing unit is used for performing filing operation according to the registration information and the first associated information to obtain first file information of the target object.
In a third aspect, an embodiment of the present application provides a server, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing the steps in the first aspect of the embodiment of the present application.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program enables a computer to perform some or all of the steps described in the first aspect of the embodiment of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps as described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
The embodiment of the application has the following beneficial effects:
It can be seen that the community profiling method, device, and storage medium based on the Internet of Things described in the embodiments of this application are applied to a server connected to at least one terminal device and at least one camera. The server acquires registration information of a target object through the at least one terminal device, acquires a first video clip of the target object in a first preset time period through the at least one camera, analyzes the first video clip to obtain first associated information of the target object, and performs a filing operation according to the registration information and the first associated information to obtain first file information of the target object. Since the registration information may be only the user's self-description while the first associated information is a true portrayal of the user, the two can be fused together to perform the filing operation, thereby obtaining a file that better matches the current situation, i.e., the first file information; therefore, the practicability of the file can be improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic architecture diagram of a community filing system for implementing an internet-of-things-based community filing method according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a community profiling method based on the internet of things according to an embodiment of the present application;
fig. 3 is a schematic flowchart of another method for profiling a community based on the internet of things according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a server provided in an embodiment of the present application;
fig. 5 is a block diagram illustrating functional units of a community archive building device based on the internet of things according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The following describes embodiments of the present application in detail.
Referring to fig. 1, fig. 1 is a schematic architecture diagram of a community profiling system for implementing the Internet-of-Things-based community profiling method provided by an embodiment of this application. As shown in the figure, the community profiling system includes: a server, at least one camera, and at least one terminal device. Communication connections can be established among these devices, which together may form an Internet of Things; of course, other devices may also be included. The at least one terminal device can be used to acquire a user's registration information, and the at least one camera can be used to acquire the user's video information. The Internet-of-Things-based community profiling system can be used to execute a community profiling method that specifically includes the following steps:
acquiring registration information of a target object through the at least one terminal device;
acquiring a first video clip of the target object in a first preset time period through the at least one camera;
analyzing according to the first video clip to obtain first associated information of the target object;
and performing a filing operation according to the registration information and the first associated information to obtain first file information of the target object.
It can be seen that, according to the embodiments of this application, the registration information of the target object can be acquired through the at least one terminal device, the first video clip of the target object in the first preset time period can be acquired through the at least one camera, the first video clip can be analyzed to obtain the first associated information of the target object, and the filing operation can be performed according to the registration information and the first associated information to obtain the first file information of the target object.
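As a minimal illustration, the four steps above can be sketched in Python. All function names and data values here are hypothetical stand-ins for the server-side operations described in the disclosure, not APIs defined by it:

```python
# Hypothetical end-to-end sketch of the four-step server flow.
# get_registration, capture_clip, analyze_clip, and build_profile
# are illustrative stand-ins, not names from the disclosure.

def get_registration(terminal_id):
    # Step 1: registration information from a terminal device.
    return {"name": "A", "age": 30}

def capture_clip(camera_id, period):
    # Step 2: a video clip from a camera, here reduced to
    # (timestamp, observed-behavior) pairs for illustration.
    return [("t0", "walk"), ("t1", "smile")]

def analyze_clip(clip):
    # Step 3: analysis yielding the first associated information.
    return {"behaviors": [behavior for _, behavior in clip]}

def build_profile(registration, associated):
    # Step 4: fuse the self-described and the observed information.
    profile = dict(registration)
    profile.update(associated)
    return profile

registration = get_registration("terminal-1")
clip = capture_clip("camera-1", "first-preset-period")
profile = build_profile(registration, analyze_clip(clip))
```

In practice, step 2 would stream footage from the connected cameras and step 3 would run a video-analysis model; the dictionaries above merely stand in for their outputs.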
Referring to fig. 2, fig. 2 is a schematic flowchart of a community profiling method based on the Internet of Things according to an embodiment of this application. As shown in the figure, the method is applied to a server in the community profiling system shown in fig. 1, where the server is connected to at least one terminal device and at least one camera. The method includes the following steps:
201. and acquiring the registration information of the target object through the at least one terminal device.
The embodiments of the present application can be applied to profiling within a specific organization, which may include at least one of the following: schools, hospitals, parks, hotels, airports, train stations, tourist attractions, museums, art galleries, supermarkets, residential districts, companies, research institutes, etc., which is not limited herein. The target object may be a person, for example, a person within the organization or a person who needs to access the organization.
In a specific implementation, the registration information may include at least one of the following: age, gender, educational background, occupation, home address, work experience, height, weight, face image, voiceprint information, etc., which is not limited herein.
The target object may enter or be in a target community. The target community may register the target object's registration information directly, obtain it through the target object's reservation information or the registration information of an associated object, or obtain it through a third-party institution, where the third-party institution may be a school, a bank, or the like, which is not limited herein.
202. And acquiring a first video clip of the target object in a first preset time period through the at least one camera.
The first preset time period may be a system default; for example, it may be 1 month or 2 months.
In a specific implementation, when the target object is within the range of the target community, it can be filmed by cameras within that range to obtain video clips, and the video clips within the first preset time period can be integrated to obtain the first video clip.
203. And analyzing according to the first video clip to obtain first associated information of the target object.
Since the first video clip captures some behaviors of the target object, these behaviors can be evaluated and the evaluation results used as the first associated information; alternatively, characteristics such as the target object's social circle and activity track can be analyzed from the first video clip and likewise used as the first associated information.
Optionally, when the first association information includes the behavior evaluation parameter of the target object, the step 203 of analyzing according to the first video segment to obtain the first association information of the target object may include the following steps:
31. analyzing the first video clip to obtain a plurality of video images;
32. performing behavior analysis on the video images to obtain a plurality of behaviors;
33. and determining the behavior evaluation parameters of the target object according to the behaviors.
In a specific implementation, the first video clip may be parsed frame by frame to obtain a plurality of video images, and behavior analysis may then be performed on the plurality of video images to obtain a plurality of behaviors, where each behavior may correspond to a time point representing the specific time at which the behavior occurred. A behavior may include at least one of the following: throwing garbage, smiling, walking, looking around, handing out leaflets, spitting in public, smoking, quarreling, untidy dress, etc., which is not limited herein.
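The frame-by-frame flow above can be sketched as follows. The `classify` callback stands in for whatever behavior-recognition model is used, and the frame rate is an assumed parameter:

```python
# Hypothetical sketch of steps 31-32: turn a sequence of video frames
# into (time point, behavior) pairs. `classify` is an assumed stand-in
# for a behavior-recognition model returning a label or None per frame.

def behaviors_from_frames(frames, classify, fps=25.0):
    """frames: iterable of video images; fps: assumed frame rate.
    Returns a list of (time_seconds, behavior_label) pairs, so each
    detected behavior carries the time at which it occurred."""
    detected = []
    for i, frame in enumerate(frames):
        label = classify(frame)
        if label is not None:
            detected.append((i / fps, label))  # frame index -> time point
    return detected
```

A real deployment would decode frames from the camera stream (e.g. via a video library) and run a trained model; the function above only fixes the shape of the data flowing into step 33.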
Furthermore, since behavior reflects a person's quality, the behavior evaluation parameter of the target object may be determined according to the plurality of behaviors; the behavior evaluation parameter may include one or more specific values reflecting the quality of the target object.
Optionally, in the step 33, determining the behavior evaluation parameter of the target object according to the plurality of behaviors may include the following steps:
331. detecting whether a preset behavior is included in the plurality of behaviors;
332. when the preset behaviors are not included in the behaviors, determining the score of each behavior in the behaviors to obtain a plurality of scores;
333. classifying the behaviors to obtain multiple types of behaviors;
334. determining an important factor of each type of behavior in the plurality of types of behaviors to obtain a plurality of important factors;
335. adjusting the scores according to the important factors to obtain reference scores;
336. constructing a behavior score curve according to the plurality of reference scores;
337. and determining the behavior evaluation parameters according to the behavior scoring curve.
The preset behavior can be preconfigured or set by system default. In a specific implementation, when a special behavior occurs, the behavior evaluation of a person differs; for example, when an illegal act occurs, the person's behavior evaluation parameter can be fixed within a set range for a specified time period, where the specified time period can be a system default, for example, 2 years.
Specifically, it can be detected whether the plurality of behaviors include the preset behavior. When they do not, the score of each of the plurality of behaviors is determined to obtain a plurality of scores, and the plurality of behaviors are classified to obtain multiple classes of behaviors. Different behaviors differ in how strongly they reflect a person's quality; therefore, this application introduces an importance factor to decide how much a user behavior influences the quality evaluation. The value range of the importance factor is greater than 0: if a certain behavior has little bearing on a person's quality evaluation, its importance factor can be set to 1, and of course it can also be set to a value greater than 1.
Furthermore, the importance factor of each class of behavior can be determined to obtain a plurality of importance factors. A class of behaviors represents a behavior group, and it can be analyzed whether a user behavior is occasional or frequent: an occasional occurrence suggests a momentary impulse, while a frequent occurrence indicates a habit and reflects personal quality. The plurality of scores may then be adjusted according to the importance factors to obtain a plurality of reference scores; for example, the product of an importance factor and its corresponding score may be used as the reference score. A behavior score curve may then be constructed from the plurality of reference scores, where the horizontal axis of the curve is time and the vertical axis is the behavior score. Since the behavior score curve can represent a person's behavior habits or behavior performance over a period of time, the behavior evaluation parameter can be determined from it.
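A minimal sketch of the weighting described above, assuming illustrative base scores and importance factors (the disclosure does not specify concrete values):

```python
# Hypothetical sketch of steps 332-336: weight per-behavior scores by
# importance factors and build a time-ordered behavior score curve.
# Behavior names, base scores, and factors are illustrative assumptions.

BEHAVIOR_SCORES = {"smile": 8.0, "walk": 5.0, "litter": 2.0, "smoke": 3.0}
IMPORTANCE_FACTORS = {"smile": 1.2, "walk": 1.0, "litter": 1.5, "smoke": 1.5}

def reference_scores(behaviors):
    """behaviors: list of (time_point, behavior_name) tuples.
    Returns the behavior score curve as (time_point, reference_score)
    pairs, where reference_score = importance_factor * base_score."""
    curve = []
    for t, name in behaviors:
        base = BEHAVIOR_SCORES.get(name, 5.0)       # assumed neutral default
        factor = IMPORTANCE_FACTORS.get(name, 1.0)  # factor 1 = no influence
        curve.append((t, factor * base))
    curve.sort(key=lambda point: point[0])          # horizontal axis is time
    return curve

curve = reference_scores([(2, "walk"), (1, "smile"), (3, "litter")])
```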
Optionally, in the step 334, determining the importance factor of each of the multiple classes of behaviors to obtain multiple importance factors, may include the following steps:
3341. determining the occupation ratio of each type of behavior in the plurality of types of behaviors to obtain a plurality of occupation ratios;
3342. determining a reference important factor corresponding to each type of behavior according to a mapping relation between preset type behaviors and important factors to obtain a plurality of reference important factors;
3343. determining a reference important factor in a preset range in the plurality of reference important factors to obtain at least one reference important factor;
3344. determining an adjusting parameter corresponding to the ratio corresponding to the at least one reference important factor according to a mapping relation between a preset ratio and the adjusting parameter to obtain at least one adjusting parameter;
3345. and adjusting the at least one reference important factor according to the at least one adjusting parameter to obtain at least one adjusted important factor, and combining the at least one adjusted important factor and the reference important factors except the at least one reference important factor in the plurality of reference important factors to obtain the plurality of important factors.
In a specific implementation, the mapping relationship between preset class behaviors and importance factors and the mapping relationship between preset proportions and adjustment parameters may be stored in advance. The preset range can be preconfigured or set by system default. The value range of the adjustment parameter is -1 to 1, for example, -0.1 to 0.1.
In a specific implementation, the proportion of each class of behavior among the multiple classes can be determined to obtain a plurality of proportions. A higher proportion embodies a user habit: the higher the proportion, the greater the certainty; the lower the proportion, the greater the contingency. The reference importance factor corresponding to each class of behavior can then be determined according to the mapping relationship between preset class behaviors and importance factors, obtaining a plurality of reference importance factors. If a reference importance factor is not within the preset range, the corresponding behavior carries little weight in the quality evaluation and need not be adjusted; if it is within the preset range, the behavior is a highlight behavior that can serve as a "bonus item" or "penalty item" emphasizing that behavior. Accordingly, the reference importance factors within the preset range are selected to obtain at least one reference importance factor; the adjustment parameter corresponding to the proportion of each such factor is determined according to the mapping relationship between preset proportions and adjustment parameters, obtaining at least one adjustment parameter; and the at least one reference importance factor is adjusted according to the at least one adjustment parameter to obtain at least one adjusted importance factor, in the following specific manner:
adjusted importance factor = (1 + adjustment parameter) × reference importance factor
Finally, the at least one adjusted importance factor may be merged with the reference importance factors other than the at least one reference importance factor to obtain the plurality of importance factors.
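Steps 3341-3345 can be sketched as below. The preset range, the proportion-to-adjustment-parameter mapping, and its thresholds are all assumed example values, not values from the disclosure:

```python
# Hypothetical sketch of steps 3341-3345: adjust the reference importance
# factors that fall inside a preset range, using an adjustment parameter
# looked up from each class's proportion; factors outside the range pass
# through unchanged and the two groups are merged (step 3345).

def adjust_factors(ref_factors, proportions, preset_range=(1.2, 2.0)):
    """ref_factors, proportions: dicts keyed by behavior class.
    Returns the final importance factors."""
    lo, hi = preset_range
    merged = {}
    for cls, factor in ref_factors.items():
        if lo <= factor <= hi:  # highlight behavior: bonus/penalty item
            p = proportions.get(cls, 0.0)
            # assumed proportion -> adjustment parameter mapping in [-0.1, 0.1]
            adj_param = 0.1 if p >= 0.5 else (-0.1 if p <= 0.1 else 0.0)
            merged[cls] = (1 + adj_param) * factor  # formula from the text
        else:
            merged[cls] = factor                    # kept unchanged
    return merged
```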
Optionally, in the step 337, determining the behavior evaluation parameter according to the behavior score curve may include the following steps:
3371. dividing the behavior scoring curve into a plurality of sections to obtain a plurality of sectional behavior scoring curves;
3372. fitting each segmental behavior scoring curve in the segmental behavior scoring curves to obtain a plurality of fitting straight lines;
3373. determining the slope of each fitting straight line in the fitting straight lines to obtain a plurality of slopes;
3374. determining the slope mean value and the target mean square error of the slopes, and optimizing the slope mean value according to the target mean square error to obtain a predicted slope;
3375. determining a target segmental behavior scoring curve that is closest to the current time among the plurality of segmental behavior scoring curves;
3376. determining a first evaluation parameter corresponding to the target segmental behavior scoring curve;
3377. and determining the behavior evaluation parameter according to the prediction slope and the first evaluation parameter.
In a specific implementation, the behavior scoring curve may be divided into multiple segments of equal duration to obtain multiple segmental behavior scoring curves. Each segmental behavior scoring curve is fitted to obtain a plurality of fitted straight lines, and the slope of each fitted straight line is determined to obtain a plurality of slopes; the slopes reflect the trend of the user's behavior performance. The slope mean and the target mean square error of the plurality of slopes are then determined, and the slope mean is optimized according to the target mean square error to obtain a predicted slope. Different mean square errors reflect different degrees of fluctuation in the user's quality; a mapping relationship between mean square error and fluctuation coefficient may be preset, and the target fluctuation coefficient corresponding to the target mean square error determined from it. The value range of the fluctuation coefficient may be -0.2 to 0.2, and the predicted slope = slope mean × (1 + target fluctuation coefficient).
Further, the target segmental behavior scoring curve closest to the current time among the plurality of segmental behavior scoring curves may be selected, and the corresponding first evaluation parameter determined; for example, the mean value of the target segmental behavior scoring curve may be used as the first evaluation parameter. The behavior evaluation parameter may then be determined according to the predicted slope and the first evaluation parameter: a target prediction coefficient corresponding to the predicted slope may be determined according to a preset mapping relationship between slopes and prediction coefficients, where the prediction coefficient reflects how the user's quality may change over a future period, and the product of the first evaluation parameter and the target prediction coefficient may be used as the final behavior evaluation parameter. Because segmentation takes the fluctuation of the user's quality into account, the behavior score curve can be used to predict the change in the next period based on that fluctuation; the user's quality in the next period is then estimated from the latest quality evaluation and used as the current behavior evaluation, so that the quality score at the current moment is obtained more accurately.
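A sketch of steps 3371-3377 under assumed mappings (the fluctuation-coefficient and prediction-coefficient mappings below are illustrative, since the disclosure leaves them to configuration):

```python
# Hypothetical sketch of steps 3371-3377: split the score curve into equal
# segments, fit a line per segment, combine slope mean and mean square error
# into a predicted slope, then scale the latest segment's mean score.

def fit_slope(points):
    """Least-squares slope for a segment of (time, score) points."""
    n = len(points)
    mt = sum(t for t, _ in points) / n
    ms = sum(s for _, s in points) / n
    num = sum((t - mt) * (s - ms) for t, s in points)
    den = sum((t - mt) ** 2 for t, _ in points) or 1.0  # guard constant time
    return num / den

def behavior_evaluation(curve, n_segments=2):
    seg_len = len(curve) // n_segments          # equal-duration segments;
    segments = [curve[i * seg_len:(i + 1) * seg_len]
                for i in range(n_segments)]     # trailing remainder dropped
    slopes = [fit_slope(seg) for seg in segments]
    mean_slope = sum(slopes) / len(slopes)
    mse = sum((k - mean_slope) ** 2 for k in slopes) / len(slopes)
    # assumed mse -> fluctuation coefficient mapping, clamped to [-0.2, 0.2]
    fluctuation = min(0.2, max(-0.2, -mse))
    predicted_slope = mean_slope * (1 + fluctuation)
    latest = segments[-1]                       # segment closest to now
    first_param = sum(s for _, s in latest) / len(latest)
    # assumed slope -> prediction coefficient mapping
    pred_coeff = 1.1 if predicted_slope > 0 else 0.9
    return first_param * pred_coeff
```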
Optionally, after the step 331, the following steps may be further included:
and when the plurality of behaviors comprise the preset behavior, determining the behavior evaluation parameters corresponding to the preset behavior according to the mapping relation between the preset behavior and the evaluation parameters.
In a specific implementation, a mapping relationship between preset behaviors and evaluation parameters may be stored in advance. When the plurality of behaviors include a preset behavior, the behavior evaluation parameter corresponding to that preset behavior may be determined according to the mapping relationship; that is, as soon as a user exhibits a designated behavior, that behavior alone can determine the quality evaluation.
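The preset-behavior branch can be sketched as a simple lookup; the behavior names and fixed values below are assumptions:

```python
# Hypothetical sketch of the preset-behavior branch after step 331: when a
# preset behavior appears among the detected behaviors, the evaluation
# parameter is taken directly from a stored mapping instead of steps 332-337.

PRESET_BEHAVIOR_PARAMS = {"illegal_act": 0.0}  # assumed example mapping

def evaluate_with_presets(behaviors, fallback):
    """behaviors: list of behavior names; fallback: function implementing
    the normal evaluation (steps 332-337) when no preset behavior occurs."""
    for behavior in behaviors:
        if behavior in PRESET_BEHAVIOR_PARAMS:      # step 331: detection
            return PRESET_BEHAVIOR_PARAMS[behavior]  # fixed individual value
    return fallback(behaviors)
```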
204. And performing a filing operation according to the registration information and the first associated information to obtain first file information of the target object.
In a specific implementation, since the registration information is only the user's self-description while the first associated information is a true portrayal of the user, the two can be fused together to perform the filing operation, thereby obtaining a file that better matches the current situation, i.e., the first file information.
Optionally, after the step 204, the following steps may be further included:
a1, acquiring a second video clip of the target object in a second preset time period, wherein the second preset time period is a time period after the first preset time period;
a2, analyzing the second video clip to obtain second associated information of the target object;
and A3, updating the first file information according to the second associated information to obtain second file information of the target object.
The second preset time period can be preconfigured or set by system default. In a specific implementation, a second video clip of the target object in the second preset time period can be acquired, where the second preset time period is a time period after the first preset time period; the second video clip is analyzed to obtain second associated information of the target object, and the first file information is updated according to the second associated information to obtain second file information of the target object. For example, the user's behavior evaluation parameters can be adjusted, and of course the user's file information can also be enriched, extended, or perfected, so that a dynamic file is obtained and the practicability of the file is further improved.
Optionally, in the step a3, the updating the first file information according to the second association information to obtain the second file information of the target object may include the following steps:
a31, determining updatable part information in the first file information, wherein the updatable part information comprises behavior evaluation parameters;
a32, determining a reference behavior evaluation parameter corresponding to the second correlation information;
a33, determining a target duration between the occurrence time of the second associated information and the occurrence time of the first associated information;
a34, determining a first difference value between the reference behavior evaluation parameter and the behavior evaluation parameter;
a35, determining a target influence factor corresponding to the target duration;
a36, carrying out influence evaluation on the first difference value according to the target influence factor to obtain a second difference value;
and A37, updating the first file information according to the second difference value to obtain second file information of the target object.
The specific implementation manner of the step a32 may refer to the above steps, and is not described herein again.
In a specific implementation, the first file information may include updatable portion information and may further include non-updatable portion information: the updatable portion may be changed, while the non-updatable portion cannot. The updatable portion information in the first file information, which includes the behavior evaluation parameter, may first be determined. A reference behavior evaluation parameter corresponding to the second associated information may then be determined, along with a target duration between the occurrence time of the second associated information and the occurrence time of the first associated information, and a first difference between the reference behavior evaluation parameter and the behavior evaluation parameter.
Further, a target influence factor corresponding to the target duration can be determined: the longer the duration, the smaller the influence factor; conversely, the shorter the duration, the larger the influence factor. The value range of the influence factor is greater than 0. Influence evaluation is then performed on the first difference according to the target influence factor to obtain a second difference, where the second difference may be the product of the first difference and the target influence factor. Finally, the first file information may be updated according to the second difference to obtain the second file information of the target object; that is, the behavior evaluation parameter in the first file information may be updated according to the second difference. For example, the sum of the second difference and the behavior evaluation parameter in the first file information may be used as the behavior evaluation parameter in the second file information. In this way, the user's quality evaluation may be appropriately revised in light of the user's later performance, giving the user a positive opportunity.
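One plausible reading of the update rule above can be sketched as follows. The exponential time decay for the influence factor and the decay rate are illustrative assumptions (the application only requires the factor to shrink with duration and stay above 0):

```python
import math

def update_evaluation(old_param, reference_param, elapsed_days, decay_rate=0.01):
    """Update a behavior evaluation parameter from a later observation.

    influence  : target influence factor, shrinking as the gap between the
                 two observations grows, but always > 0.
    first_diff : difference between the reference parameter (from the second
                 associated information) and the stored parameter.
    second_diff: first difference scaled by the influence factor.
    The updated parameter is the stored parameter plus the second difference.
    """
    influence = math.exp(-decay_rate * elapsed_days)
    first_difference = reference_param - old_param
    second_difference = first_difference * influence
    return old_param + second_difference
```

With zero elapsed time the influence factor is 1 and the parameter moves fully to the new reference value; as the duration grows, the later observation pulls the stored evaluation less strongly.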
It can be seen that the community profiling method based on the internet of things described in the embodiments of the present application is applied to a server, where the server is connected to at least one terminal device and at least one camera. Registration information of a target object is acquired through the at least one terminal device, a first video clip of the target object in a first preset time period is acquired through the at least one camera, analysis is performed according to the first video clip to obtain first associated information of the target object, and a filing operation is performed according to the registration information and the first associated information to obtain first file information of the target object. Since the registration information may be only the user's own description while the first associated information reflects the user's actual behavior, the two can be fused to perform the filing operation, thereby obtaining a file that better matches the current situation, namely the first file information, so that the practicability of the file can be improved.
Referring to fig. 3, fig. 3 is a schematic flowchart of a community profiling method based on the internet of things according to an embodiment of the present application, and is applied to a server, where the server is connected to at least one terminal device and at least one camera, as shown in the figure, the community profiling method based on the internet of things includes:
301. and acquiring the registration information of the target object through the at least one terminal device.
302. And acquiring a first video clip of the target object in a first preset time period through the at least one camera.
303. And analyzing according to the first video clip to obtain first associated information of the target object.
304. And performing a filing operation according to the registration information and the first associated information to obtain first file information of the target object.
305. And acquiring a second video clip of the target object in a second preset time period, wherein the second preset time period is a time period after the first preset time period.
306. And analyzing the second video clip to obtain second associated information of the target object.
307. And updating the first file information according to the second associated information to obtain second file information of the target object.
For the detailed description of the steps 301 to 307, reference may be made to the corresponding steps of the method for building a profile of a community based on the internet of things described in fig. 2, which are not described herein again.
It can be seen that the community profiling method based on the internet of things described in the embodiment of the application is applied to a server, where the server is connected with at least one terminal device and at least one camera. Registration information of a target object is obtained through the at least one terminal device, a first video clip of the target object in a first preset time period is obtained through the at least one camera, analysis is performed according to the first video clip to obtain first associated information of the target object, and a filing operation is performed according to the registration information and the first associated information to obtain first file information of the target object. A second video clip of the target object in a second preset time period is then obtained, where the second preset time period is a time period after the first preset time period; the second video clip is analyzed to obtain second associated information of the target object, and the first file information is updated according to the second associated information to obtain second file information of the target object. Since the registration information may be only the user's self-description while the first associated information reflects the user's actual behavior, the registration information and the first associated information can be fused to perform the filing operation, thereby obtaining a file that better matches the current situation, namely the first file information.
Consistent with the above embodiments, referring to fig. 4, fig. 4 is a schematic structural diagram of a server according to an embodiment of the present application. As shown in the drawing, the server includes a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the server is connected to at least one terminal device and at least one camera. In an embodiment of the present application, the programs include instructions for performing the following steps:
acquiring registration information of a target object through the at least one terminal device;
acquiring a first video clip of the target object in a first preset time period through the at least one camera;
analyzing according to the first video clip to obtain first associated information of the target object;
and performing a filing operation according to the registration information and the first associated information to obtain first file information of the target object.
Optionally, when the first association information includes a behavior evaluation parameter of the target object, in terms of obtaining the first association information of the target object by analyzing according to the first video segment, the program includes instructions for performing the following steps:
analyzing the first video clip to obtain a plurality of video images;
performing behavior analysis on the video images to obtain a plurality of behaviors;
and determining the behavior evaluation parameters of the target object according to the behaviors.
Optionally, in the aspect of determining the behavior evaluation parameter of the target object according to the behaviors, the program includes instructions for performing the following steps:
detecting whether a preset behavior is included in the plurality of behaviors;
when the preset behavior is not included in the plurality of behaviors, determining the score of each behavior in the plurality of behaviors to obtain a plurality of scores;
classifying the behaviors to obtain multiple types of behaviors;
determining an important factor of each type of behavior in the plurality of types of behaviors to obtain a plurality of important factors;
adjusting the scores according to the important factors to obtain reference scores;
constructing a behavior score curve according to the plurality of reference scores;
and determining the behavior evaluation parameters according to the behavior scoring curve.
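The score-adjustment step in the instructions above (weighting each behavior's score by the importance factor of its class) can be sketched as follows; the class names and factor values are illustrative assumptions:

```python
def reference_scores(scores, categories, importance):
    """Adjust raw behavior scores by the importance factor of each
    behavior's class, yielding the reference scores from which the
    behavior score curve is constructed."""
    return [score * importance[category]
            for score, category in zip(scores, categories)]
```

For example, with importance factors `{"hygiene": 1.0, "safety": 1.5}`, a safety-related behavior's score is weighted more heavily than a hygiene-related one before the curve is built.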
Optionally, in the aspect of determining the behavior evaluation parameter according to the behavior score curve, the program includes instructions for performing the following steps:
dividing the behavior scoring curve into a plurality of sections to obtain a plurality of sectional behavior scoring curves;
fitting each segmental behavior scoring curve in the segmental behavior scoring curves to obtain a plurality of fitting straight lines;
determining the slope of each fitting straight line in the fitting straight lines to obtain a plurality of slopes;
determining the slope mean value and the target mean square error of the slopes, and optimizing the slope mean value according to the target mean square error to obtain a predicted slope;
determining, from the plurality of segmental behavior scoring curves, a target segmental behavior scoring curve closest to the current time;
determining a first evaluation parameter corresponding to the target segmental behavior scoring curve;
and determining the behavior evaluation parameter according to the prediction slope and the first evaluation parameter.
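The segmental fitting and slope-prediction steps above can be sketched with ordinary least squares. The mean-square-error "optimization" shown here, which shrinks the mean slope when the per-segment slopes disagree, is one plausible interpretation rather than the application's prescribed formula:

```python
def fit_slope(xs, ys):
    """Ordinary least-squares slope of a straight line fitted to (xs, ys)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def predicted_slope(segments):
    """Fit a line to each segmental scoring curve, then combine the slopes.

    segments: list of (xs, ys) pairs, one per segmental curve.
    The mean slope is damped by the mean square error of the slopes, so a
    high spread among segments reduces confidence in the prediction.
    """
    slopes = [fit_slope(xs, ys) for xs, ys in segments]
    mean = sum(slopes) / len(slopes)
    mse = sum((s - mean) ** 2 for s in slopes) / len(slopes)
    return mean / (1.0 + mse)
```

When every segment has the same slope the mean square error is zero and the predicted slope equals the mean; diverging segments pull the prediction toward zero.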
Optionally, the program further includes instructions for performing the following steps:
and when the plurality of behaviors comprise the preset behavior, determining the behavior evaluation parameters corresponding to the preset behavior according to the mapping relation between the preset behavior and the evaluation parameters.
Optionally, the program further includes instructions for performing the following steps:
acquiring a second video clip of the target object in a second preset time period, wherein the second preset time period is a time period after the first preset time period;
analyzing the second video clip to obtain second associated information of the target object;
and updating the first file information according to the second associated information to obtain second file information of the target object.
Optionally, in the aspect that the first file information is updated according to the second association information to obtain the second file information of the target object, the program includes an instruction for executing the following steps:
determining updatable part information in the first file information, wherein the updatable part information comprises behavior evaluation parameters;
determining a reference behavior evaluation parameter corresponding to the second associated information;
determining a target duration between the occurrence time of the second associated information and the occurrence time of the first associated information;
determining a first difference between the reference behavior evaluation parameter and the behavior evaluation parameter;
determining a target influence factor corresponding to the target duration;
carrying out influence evaluation on the first difference value according to the target influence factor to obtain a second difference value;
and updating the first file information according to the second difference value to obtain second file information of the target object.
It can be seen that, in the server described in this embodiment of the application, the server is connected to at least one terminal device and at least one camera, the registration information of the target object is obtained through the at least one terminal device, the first video clip of the target object in the first preset time period is obtained through the at least one camera, analysis is performed according to the first video clip, the first association information of the target object is obtained, the filing operation is performed according to the registration information and the first association information, and the first file information of the target object is obtained.
Fig. 5 is a block diagram of functional units of an internet-of-things-based community profiling apparatus 500 according to an embodiment of the present application. The community profiling apparatus 500 based on the internet of things is applied to a server, and the apparatus 500 includes: a first acquisition unit 501, a second acquisition unit 502, an analysis unit 503, and a profiling unit 504, wherein,
the first obtaining unit 501 is configured to obtain registration information of a target object through the at least one terminal device;
the second obtaining unit 502 is configured to obtain, through the at least one camera, a first video clip of the target object within a first preset time period;
the analysis unit 503 is configured to perform analysis according to the first video segment to obtain first associated information of the target object;
the filing unit 504 is configured to perform a filing operation according to the registration information and the first association information, so as to obtain first file information of the target object.
Optionally, when the first association information includes the behavior evaluation parameter of the target object, in terms of obtaining the first association information of the target object by analyzing according to the first video clip, the analyzing unit 503 is specifically configured to:
analyzing the first video clip to obtain a plurality of video images;
performing behavior analysis on the video images to obtain a plurality of behaviors;
and determining the behavior evaluation parameters of the target object according to the behaviors.
Optionally, in the aspect of determining the behavior evaluation parameter of the target object according to the plurality of behaviors, the analysis unit 503 is specifically configured to:
detecting whether a preset behavior is included in the plurality of behaviors;
when the preset behavior is not included in the plurality of behaviors, determining the score of each behavior in the plurality of behaviors to obtain a plurality of scores;
classifying the behaviors to obtain multiple types of behaviors;
determining an important factor of each type of behavior in the plurality of types of behaviors to obtain a plurality of important factors;
adjusting the scores according to the important factors to obtain reference scores;
constructing a behavior score curve according to the plurality of reference scores;
and determining the behavior evaluation parameters according to the behavior scoring curve.
Optionally, in the aspect of determining the behavior evaluation parameter according to the behavior score curve, the analysis unit 503 is specifically configured to:
dividing the behavior scoring curve into a plurality of sections to obtain a plurality of sectional behavior scoring curves;
fitting each segmental behavior scoring curve in the segmental behavior scoring curves to obtain a plurality of fitting straight lines;
determining the slope of each fitting straight line in the fitting straight lines to obtain a plurality of slopes;
determining the slope mean value and the target mean square error of the slopes, and optimizing the slope mean value according to the target mean square error to obtain a predicted slope;
determining, from the plurality of segmental behavior scoring curves, a target segmental behavior scoring curve closest to the current time;
determining a first evaluation parameter corresponding to the target segmental behavior scoring curve;
and determining the behavior evaluation parameter according to the prediction slope and the first evaluation parameter.
Optionally, the analysis unit 503 is further specifically configured to:
and when the plurality of behaviors comprise the preset behavior, determining the behavior evaluation parameters corresponding to the preset behavior according to the mapping relation between the preset behavior and the evaluation parameters.
Optionally, the apparatus 500 is further specifically configured to:
acquiring a second video clip of the target object in a second preset time period, wherein the second preset time period is a time period after the first preset time period;
analyzing the second video clip to obtain second associated information of the target object;
and updating the first file information according to the second associated information to obtain second file information of the target object.
Optionally, in the aspect that the first file information is updated according to the second association information to obtain the second file information of the target object, the apparatus 500 is specifically configured to:
determining updatable part information in the first file information, wherein the updatable part information comprises behavior evaluation parameters;
determining a reference behavior evaluation parameter corresponding to the second associated information;
determining a target duration between the occurrence time of the second associated information and the occurrence time of the first associated information;
determining a first difference between the reference behavior evaluation parameter and the behavior evaluation parameter;
determining a target influence factor corresponding to the target duration;
carrying out influence evaluation on the first difference value according to the target influence factor to obtain a second difference value;
and updating the first file information according to the second difference value to obtain second file information of the target object.
It can be seen that the community profiling device based on the internet of things described in the embodiment of the present application is applied to a server, where the server is connected to at least one terminal device and at least one camera. Registration information of a target object is acquired through the at least one terminal device, a first video clip of the target object in a first preset time period is acquired through the at least one camera, analysis is performed according to the first video clip to obtain first associated information of the target object, and a filing operation is performed according to the registration information and the first associated information to obtain first file information of the target object. Since the registration information may be only the user's own description while the first associated information reflects the user's actual behavior, the two can be fused to perform the filing operation, thereby obtaining a file that better matches the current situation, namely the first file information, so that the practicability of the file can be improved.
It can be understood that the functions of each program module of the community profiling device based on the internet of things according to the embodiment of the method may be specifically implemented, and the specific implementation process may refer to the relevant description of the embodiment of the method, which is not described herein again.
Embodiments of the present application also provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any one of the methods as described in the above method embodiments.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the above-described division of the units is only one type of division of logical functions, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable memory if it is implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present application in essence, or the part of it contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash Memory disks, Read-Only memories (ROMs), Random Access Memories (RAMs), magnetic or optical disks, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. The community filing method based on the Internet of things is applied to a server, the server is connected with at least one terminal device and at least one camera, and the method comprises the following steps:
acquiring registration information of a target object through the at least one terminal device;
acquiring a first video clip of the target object in a first preset time period through the at least one camera;
analyzing according to the first video clip to obtain first associated information of the target object;
and performing a filing operation according to the registration information and the first associated information to obtain first file information of the target object.
2. The method according to claim 1, wherein when the first association information includes a behavior evaluation parameter of the target object, the analyzing according to the first video segment to obtain the first association information of the target object includes:
analyzing the first video clip to obtain a plurality of video images;
performing behavior analysis on the video images to obtain a plurality of behaviors;
and determining the behavior evaluation parameters of the target object according to the behaviors.
3. The method of claim 2, wherein determining behavior evaluation parameters of the target object based on the plurality of behaviors comprises:
detecting whether a preset behavior is included in the plurality of behaviors;
when the preset behavior is not included in the plurality of behaviors, determining the score of each behavior in the plurality of behaviors to obtain a plurality of scores;
classifying the behaviors to obtain multiple types of behaviors;
determining an important factor of each type of behavior in the plurality of types of behaviors to obtain a plurality of important factors;
adjusting the scores according to the important factors to obtain reference scores;
constructing a behavior score curve according to the plurality of reference scores;
and determining the behavior evaluation parameters according to the behavior scoring curve.
4. The method of claim 3, wherein determining the behavior assessment parameter from the behavior scoring curve comprises:
dividing the behavior scoring curve into a plurality of sections to obtain a plurality of sectional behavior scoring curves;
fitting each segmental behavior scoring curve in the segmental behavior scoring curves to obtain a plurality of fitting straight lines;
determining the slope of each fitting straight line in the fitting straight lines to obtain a plurality of slopes;
determining the slope mean value and the target mean square error of the slopes, and optimizing the slope mean value according to the target mean square error to obtain a predicted slope;
determining, from the plurality of segmental behavior scoring curves, a target segmental behavior scoring curve closest to the current time;
determining a first evaluation parameter corresponding to the target segmental behavior scoring curve;
and determining the behavior evaluation parameter according to the prediction slope and the first evaluation parameter.
5. The method of claim 4, further comprising:
and when the plurality of behaviors comprise the preset behavior, determining the behavior evaluation parameters corresponding to the preset behavior according to the mapping relation between the preset behavior and the evaluation parameters.
6. The method according to any one of claims 2-5, further comprising:
acquiring a second video clip of the target object in a second preset time period, wherein the second preset time period is a time period after the first preset time period;
analyzing the second video clip to obtain second associated information of the target object;
and updating the first file information according to the second associated information to obtain second file information of the target object.
7. The method according to claim 6, wherein the updating the first file information according to the second association information to obtain second file information of the target object includes:
determining updatable part information in the first file information, wherein the updatable part information comprises behavior evaluation parameters;
determining a reference behavior evaluation parameter corresponding to the second associated information;
determining a target duration between the occurrence time of the second associated information and the occurrence time of the first associated information;
determining a first difference between the reference behavior evaluation parameter and the behavior evaluation parameter;
determining a target influence factor corresponding to the target duration;
carrying out influence evaluation on the first difference value according to the target influence factor to obtain a second difference value;
and updating the first file information according to the second difference value to obtain second file information of the target object.
8. A community profiling apparatus based on the internet of things, characterized in that the apparatus is applied to a server, the server is connected to at least one terminal device and at least one camera, and the apparatus comprises: a first acquisition unit, a second acquisition unit, an analysis unit and a profiling unit, wherein,
the first acquisition unit is used for acquiring the registration information of the target object through the at least one terminal device;
the second acquiring unit is used for acquiring a first video clip of the target object in a first preset time period through the at least one camera;
the analysis unit is used for analyzing according to the first video clip to obtain first associated information of the target object;
and the filing unit is used for performing filing operation according to the registration information and the first associated information to obtain first file information of the target object.
9. A server, comprising a processor and a memory, wherein the memory stores one or more programs configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of any one of claims 1-7.
10. A computer-readable storage medium, characterized in that it stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 1-7.
CN202111326592.2A 2021-11-10 2021-11-10 Community profiling method and device based on Internet of things and storage medium Pending CN114092045A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111326592.2A CN114092045A (en) 2021-11-10 2021-11-10 Community profiling method and device based on Internet of things and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111326592.2A CN114092045A (en) 2021-11-10 2021-11-10 Community profiling method and device based on Internet of things and storage medium

Publications (1)

Publication Number Publication Date
CN114092045A true CN114092045A (en) 2022-02-25

Family

ID=80299512

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111326592.2A Pending CN114092045A (en) 2021-11-10 2021-11-10 Community profiling method and device based on Internet of things and storage medium

Country Status (1)

Country Link
CN (1) CN114092045A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110363428A (en) * 2019-07-16 2019-10-22 上海秒针网络科技有限公司 A kind of profile associated processing method and processing device
CN110781771A (en) * 2019-10-08 2020-02-11 北京邮电大学 Abnormal behavior real-time monitoring method based on deep learning
CN111814653A (en) * 2020-07-02 2020-10-23 苏州交驰人工智能研究院有限公司 Method, device, equipment and storage medium for detecting abnormal behaviors in video
US20210110138A1 (en) * 2019-04-30 2021-04-15 Beijing Sensetime Technology Development Co., Ltd. Target detection method and apparatus, device, and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Editorial Committee of China Financial Publishing House et al.: "Credit Knowledge Goes with Me" (《征信知识伴我行》), China Machine Press (机械工业出版社) *

Similar Documents

Publication Publication Date Title
Martí et al. Social media data: Challenges, opportunities and limitations in urban studies
Cunliffe et al. Usability evaluation for museum web sites
CN111310019B (en) Information recommendation method, information processing method, system and equipment
US9817477B1 (en) Eye event detection for electronic documents
Suk et al. Investigation of Evalglare software, daylight glare probability and high dynamic range imaging for daylight glare analysis
JP6569313B2 (en) Method for updating facility characteristics, method for profiling a facility, and computer system
Gerich et al. Effects of social networks on the quality of life in an elder and middle-aged deaf community sample
CN101908057B (en) Information processing apparatus and information processing method
Harrits et al. Class categories and the subjective dimension of class: the case of D enmark
Zhang The rural-urban divide, intergroup relations, and social identity formation of rural migrant children in a Chinese urban school
Cooper et al. Personality assessment through the situational and behavioral features of Instagram photos
Spyratos et al. Evaluating the services and facilities of European cities using crowdsourced place data
JP5993664B2 (en) Employment support system
CN112669095A (en) Client portrait construction method and device, electronic equipment and computer storage medium
WO2014100517A2 (en) Method for prompting photographs of events
Bristow et al. Defining and evaluating context for wearable computing
KR102365897B1 (en) Device and method for recommending cultural life content
Jiang et al. Exploring the geo virtual linguistic landscape of Dublin urban areas: Before and during the COVID-19 outbreak
CN106850777B (en) Method and device for pushing information
CN114092045A (en) Community profiling method and device based on Internet of things and storage medium
WO2004029874A2 (en) Method and system for active knowledge management
SONG et al. Social Inequalities in neighborhood-level streetscape perceptions in Shanghai: the coherence and divergence between the objective and subjective measurements
Castagnos et al. Inferring art preferences from gaze exploration in a museum
KR102157370B1 (en) Method for caculating business density index and system for supporting the establishment using the same
Choi et al. A longitudinal comparison of public libraries’ posting activities on Twitter in April of 3 years, pre-, during, and post-COVID-19

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination