CN116682071B - Commodity interest information analysis method, equipment and storage medium - Google Patents

Commodity interest information analysis method, equipment and storage medium

Info

Publication number
CN116682071B
Authority
CN
China
Prior art keywords
commodity
target
user
area
information
Prior art date
Legal status
Active
Application number
CN202310975269.0A
Other languages
Chinese (zh)
Other versions
CN116682071A (en)
Inventor
李中振
叶丹
温淳
潘华东
Current Assignee
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd
Priority to CN202310975269.0A
Publication of CN116682071A
Application granted
Publication of CN116682071B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0201 Market modelling; Market analysis; Collecting market data
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V 10/40 Extraction of image or video features
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 Computing systems specially adapted for manufacturing

Abstract

The application discloses a commodity interest information analysis method, equipment and a storage medium. The commodity interest information analysis method comprises the following steps: determining a target user corresponding to a target commodity, and extracting user characteristic information of the target user; dividing an attention area of the target commodity by using the user characteristic information to obtain an attention area of the target commodity for the target user; and detecting behavior characteristic information of the target user in the attention area, and determining attention information of the target commodity based on the behavior characteristic information. According to the method, the equipment and the storage medium, the attention area for the target user is flexibly divided according to the user characteristic information of the user, which improves the accuracy of behavior characteristic extraction and makes the analysis of commodity attention information more accurate.

Description

Commodity interest information analysis method, equipment and storage medium
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a method, an apparatus, and a storage medium for analyzing information of interest of a commodity.
Background
Competition in the commodity sales industry is increasingly fierce, and statistical analysis of commodity attention is gradually becoming a reference for merchants to discover potential customers or adjust marketing strategies.
For example, in vehicle sales, the traditional automobile sales mode mainly relies on a salesperson to receive customers and then record their attention information about the vehicles. However, information of further dimensions cannot be fed back effectively, and the recorded attention information is highly subjective, which makes it unreliable and affects the statistics of the degree of attention.
Therefore, how to improve the accuracy of analyzing a user's attention to a commodity is a problem to be solved by those skilled in the art.
Disclosure of Invention
The present application provides at least a commodity interest information analysis method, equipment and a storage medium.
A first aspect of the present application provides a commodity interest information analysis method, including: determining a target user corresponding to a target commodity, and extracting user characteristic information of the target user; dividing an attention area of the target commodity by using the user characteristic information to obtain an attention area of the target commodity for the target user; and detecting behavior characteristic information of the target user in the attention area, and determining attention information of the target commodity based on the behavior characteristic information.
In an embodiment, the method for dividing the attention area of the target commodity by using the user characteristic information to obtain the attention area of the target commodity for the target user includes: acquiring a commodity area of a target commodity; calculating expansion parameters corresponding to the commodity area by utilizing the user characteristic information; and expanding the commodity area based on the expansion parameters to obtain the attention area of the target commodity for the target user.
In an embodiment, calculating an expansion parameter corresponding to a commodity area by using user feature information includes: extracting commodity characteristics of the target commodity; carrying out regional splitting on the commodity region based on commodity characteristics to obtain a plurality of local commodity regions; and calculating expansion parameters corresponding to each local commodity area respectively by using the user characteristic information.
In an embodiment, expanding the commodity area based on the expansion parameter to obtain a target commodity interest area for the target user includes: based on the expansion parameters corresponding to each local commodity area, respectively expanding each local commodity area to obtain local expansion areas corresponding to each local commodity area; and splicing each local expansion area to obtain the attention area of the target commodity aiming at the target user.
In an embodiment, the user characteristic information includes physical characteristics of the target user; calculating expansion parameters corresponding to the commodity area by using the user characteristic information, wherein the expansion parameters comprise: obtaining an expansion base based on physical characteristics of a target user, and obtaining expansion multiples corresponding to target commodities; and calculating the expansion parameters corresponding to the commodity area by using the expansion base and the expansion multiples.
In an embodiment, determining a target user corresponding to a target commodity includes: acquiring a scene image acquired by image acquisition equipment; if the target commodity exists in the scene image, acquiring a user detection area of the target commodity; and taking the user detected in the user detection area as a target user.
In an embodiment, the region of interest is composed of a plurality of local expansion regions, different local expansion regions corresponding to different local commodity regions in the target commodity; detecting behavior feature information of a target user in a region of interest, determining information of interest of a target commodity based on the behavior feature information, including: acquiring behavior characteristic information of a target user in each local expansion area; counting behavior characteristic information corresponding to each local expansion area respectively to obtain local attention of each local expansion area; and combining the local attention degree of each local expansion area to obtain attention information of the target commodity.
In one embodiment, detecting behavior feature information of a target user in a region of interest, determining information of interest of a target commodity based on the behavior feature information includes: acquiring behavior characteristic information of a plurality of target users acquired in a preset time period; user identity identification is carried out on each target user respectively, and the user identity of each target user is obtained; aggregating behavior feature information belonging to the same user identity to obtain target features; and determining the attention information of the target commodity based on the target characteristics.
A second aspect of the present application provides a commodity interest information analysis apparatus, comprising: the user characteristic extraction module is used for determining a target user corresponding to the target commodity and extracting user characteristic information of the target user; the attention area determining module is used for dividing attention areas of target commodities by using the user characteristic information to obtain attention areas of the target commodities for target users; and the analysis module is used for detecting the behavior characteristic information of the target user in the attention area and determining the attention information of the target commodity based on the behavior characteristic information.
A third aspect of the present application provides an electronic device including a memory and a processor for executing program instructions stored in the memory to implement the above-described merchandise interest information analysis method.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon program instructions which, when executed by a processor, implement the above-described merchandise interest information analysis method.
According to the above scheme, the target user corresponding to the target commodity is determined, and the user characteristic information of the target user is extracted; the attention area of the target commodity is divided by using the user characteristic information to obtain the attention area of the target commodity for the target user; behavior characteristic information of the target user in the attention area is detected, and attention information of the target commodity is determined based on the behavior characteristic information. Compared with the related-art approach of extracting behavior features within a preset fixed attention area, the present application flexibly divides the attention area for the target user according to the user characteristic information, which improves the accuracy of behavior feature extraction and makes the analysis of commodity attention information more accurate.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic diagram of an implementation environment according to an exemplary embodiment of the present application;
FIG. 2 is a flowchart of a commodity interest information analysis method according to an exemplary embodiment of the present application;
FIG. 3 is a schematic diagram illustrating the division of a region of interest according to an exemplary embodiment of the present application;
FIG. 4 is a flowchart illustrating the partitioning of a region of interest according to an exemplary embodiment of the present application;
FIG. 5 is a schematic diagram illustrating the splitting of a commodity area when the target commodity is a vehicle according to an exemplary embodiment of the present application;
FIG. 6 is a flowchart illustrating the analysis of attention information according to an exemplary embodiment of the present application;
FIG. 7 is a block diagram of a commodity interest information analysis apparatus according to an exemplary embodiment of the present application;
FIG. 8 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application;
FIG. 9 is a schematic structural diagram of a computer-readable storage medium according to an exemplary embodiment of the present application.
Detailed Description
The following describes embodiments of the present application in detail with reference to the drawings.
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, interfaces, techniques, etc., in order to provide a thorough understanding of the present application.
The term "and/or" is herein merely an association information describing an associated object, meaning that three relationships may exist, e.g., a and/or B may represent: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship. Further, "a plurality" herein means two or more than two. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, for example, including at least one of A, B, C, and may mean including any one or more elements selected from the group consisting of A, B and C.
Referring to fig. 1, fig. 1 is a schematic diagram of an implementation environment related to a method for analyzing information of interest of a commodity according to the present application. As shown in fig. 1, the implementation environment includes a scene detection device 110 and a server 120, and the scene detection device 110 and the server 120 may be directly or indirectly connected through wired or wireless communication.
The scene detection device 110 is used for detecting a scene containing the target commodity, and may be, but is not limited to, an image acquisition device, a laser radar, or the like. The scene detection device 110 may broadly refer to one of a plurality of scene detection devices; this embodiment is illustrated with the scene detection device 110 only. Those skilled in the art will appreciate that the number of scene detection devices may be greater or smaller; the embodiments of the present application do not limit the number or type of scene detection devices, which may be one, or dozens, hundreds, or more.
The server 120 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a content delivery network (CDN), big data, and artificial intelligence platforms.
Alternatively, the server 120 may undertake a primary merchandise interest information analysis job, and the scene detection device 110 undertakes a secondary merchandise interest information analysis job; alternatively, the server 120 performs a secondary commodity interest information analysis job, and the scene detection device 110 performs a primary commodity interest information analysis job; alternatively, the server 120 or the scene detection device 110 may each independently undertake merchandise interest information analysis work.
Illustratively, the scene detection device 110 acquires a scene image containing the target commodity in real time and sends the scene image to the server 120. The server 120 determines the target user corresponding to the target commodity according to the scene image, extracts user characteristic information of the target user, and divides an attention area of the target commodity for the target user according to the user characteristic information, so as to record behavior characteristic information of the target user in the attention area and identify the target user's attention information on the target commodity by using the recorded behavior characteristic information.
It will be appreciated that the specific embodiments of the present application involve related data such as user information and images containing users. When the above embodiments of the present application are applied to specific products or technologies, user permission or consent needs to be obtained, and the collection, use and processing of related data must comply with the relevant laws, regulations and standards of the relevant countries and regions.
Referring to fig. 2, fig. 2 is a flowchart illustrating a method for analyzing information of interest of a commodity according to an exemplary embodiment of the present application. The commodity interest information analysis method can be applied to the implementation environment shown in fig. 1 and is specifically executed by a server in the implementation environment. It should be understood that the method may be adapted to other exemplary implementation environments and be specifically executed by devices in other implementation environments, and the implementation environments to which the method is adapted are not limited by the present embodiment.
The method for analyzing information of interest of commodity according to the embodiment of the present application is described in detail below with a server as a specific execution body.
As shown in fig. 2, the method for analyzing the information of interest of the commodity at least includes steps S210 to S230, and is described in detail as follows:
step S210: and determining a target user corresponding to the target commodity, and extracting user characteristic information of the target user.
The target commodity refers to a commodity which needs to be analyzed at present, and the commodity comprises, but is not limited to, a vehicle, a mobile phone, a computer, furniture, clothes and the like.
For example, a user detection area corresponding to the target commodity may be acquired, and a user entering the user detection area may be taken as the target user.
The user detection area is used for performing user detection to judge whether an identifiable user exists in the current scene.
The user detection area can be divided in advance for the target commodity, or the user detection area can be flexibly divided according to the environment information of the target commodity, so that the user detection area obtained by division is more accurate.
For example, the user detection area may be obtained by dividing the user movable area in the scene where the target commodity is located, for example, all the user movable areas may be divided into user detection areas, or the user movable area in the preset range of the target commodity may be divided into user detection areas, which is not limited by the present application.
For example, the number of target commodities may be plural, and the user detection areas of the target commodities may be divided according to the distance between the target commodities, the priority between the target commodities, or the like.
For another example, volume information, length and width information, commodity type, etc. of the target commodity may also be acquired, and the user detection area of the target commodity may be partitioned according to one or more of the volume information, length and width information, commodity type.
An image acquisition device is arranged opposite to a target commodity, and acquires an image of a user detection area corresponding to the target commodity in real time, and whether a user exists in the user detection area or not is detected through the image of the user detection area. The image mentioned in the application can be a picture or a video.
For example, an image of the user detection area is input to the target detection network, a user included in the image of the user detection area output from the target detection network is obtained, and the detected user is set as the target user. The object detection network may be implemented based on a YOLO (You Only Look Once) network, among other things.
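As an illustrative sketch only (not part of the disclosed embodiment), the step of taking the detections inside the user detection area as target users can be expressed as follows; the bounding-box format and the rule of using the box centre are assumptions:

from dataclasses import dataclass

@dataclass
class Box:
    # Axis-aligned rectangle: (x1, y1) top-left corner, (x2, y2) bottom-right corner.
    x1: float
    y1: float
    x2: float
    y2: float

    def centre(self):
        return ((self.x1 + self.x2) / 2, (self.y1 + self.y2) / 2)

def users_in_detection_area(person_boxes, detection_area):
    # Keep only the detected persons whose box centre lies inside the detection area.
    targets = []
    for box in person_boxes:
        cx, cy = box.centre()
        if detection_area.x1 <= cx <= detection_area.x2 and detection_area.y1 <= cy <= detection_area.y2:
            targets.append(box)
    return targets

# Example: two detections, only the first falls inside the detection area.
area = Box(100, 100, 500, 400)
detections = [Box(120, 150, 180, 350), Box(600, 120, 660, 330)]
print(len(users_in_detection_area(detections, area)))  # -> 1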
Then, user characteristic information of the target user is extracted. Wherein the user characteristic information includes, but is not limited to, one or more of physical characteristics, movement characteristics, identity information, etc. of the target user, the application is not limited to the type of user characteristic information.
The number of target users detected in the user detection area may be one or more. If a plurality of target users are detected in the user detection area, the user characteristic information of each target user is extracted respectively.
The image acquisition device acquires an image of a target user in a user detection area, and performs user feature extraction on the image of the target user to obtain user feature information of the target user.
Step S220: and carrying out attention area division on the target commodity by utilizing the user characteristic information to obtain an attention area of the target commodity aiming at the target user.
The region of interest is a region for extracting behavior features of the target user.
The attention area of the target commodity is divided according to the user characteristic information to obtain the attention area of the target commodity for the target user.
For example, the size of the region, such as the diameter of the region, may be calculated according to the user feature information, and the region of interest of the target commodity may be obtained by dividing the region of interest according to the calculated size of the region and the coordinates of the target commodity.
For example, if the user characteristic information is a physical characteristic of the target user, such as the height of the target user, the size of the region may be calculated according to the height of the target user; if the user characteristic information is the movement characteristic of the target user, such as the movement speed of the target user, the size of the area can be calculated according to the movement speed of the target user; if the user characteristic information is the identity information of the target user, the size of the area can be calculated according to the identity information of the target user, for example, the commodity purchasing record corresponding to the identity information is inquired, and the size of the area is calculated according to the commodity purchasing record.
Taking the case where the number of target users is plural as an example, the division of the region of interest is exemplified:
For example, a region of interest may be divided for each target user separately. Referring to fig. 3, fig. 3 is a schematic diagram illustrating the division of a region of interest according to an exemplary embodiment of the present application. As shown in fig. 3, the target users detected in the user detection area include a user a and a user b; a region A is obtained by division according to the user characteristic information corresponding to the user a and is taken as the region of interest corresponding to the user a, and a region B is obtained by division according to the user characteristic information corresponding to the user b and is taken as the region of interest corresponding to the user b. That is, the region of interest associated with user a is region A, and the region of interest associated with user b is region B.
For example, the regions obtained by division for the target users may also be acquired, and the region with the largest area may be selected from these regions as the region of interest of each target user. As shown in fig. 3, if comparing the sizes of region A and region B shows that region A is larger than region B, region A is taken as the region of interest corresponding to both user a and user b. That is, the regions of interest associated with user a and user b are both region A.
For example, the regions obtained by division for the target users may also be fused to obtain the region of interest of each target user. As also shown in fig. 3, region A and region B are fused to obtain a region C, and region C is taken as the region of interest corresponding to both user a and user b. That is, the regions of interest associated with user a and user b are both region C.
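The three strategies above can be illustrated with a minimal sketch; regions are represented here as axis-aligned rectangles (x1, y1, x2, y2), and treating fusion as the bounding box of the individual regions is an assumption made only for illustration:

def area(r):
    return max(0.0, r[2] - r[0]) * max(0.0, r[3] - r[1])

def per_user(regions):
    # Strategy 1: each user keeps the region divided from its own features.
    return dict(regions)

def largest(regions):
    # Strategy 2: every user shares the region with the largest area.
    biggest = max(regions.values(), key=area)
    return {user: biggest for user in regions}

def fused(regions):
    # Strategy 3: every user shares the fusion (here, bounding box) of all regions.
    xs1, ys1, xs2, ys2 = zip(*regions.values())
    union = (min(xs1), min(ys1), max(xs2), max(ys2))
    return {user: union for user in regions}

regions = {"user_a": (0, 0, 4, 3), "user_b": (2, 1, 5, 6)}
print(per_user(regions)["user_a"])  # -> (0, 0, 4, 3)
print(largest(regions)["user_a"])   # -> (2, 1, 5, 6)
print(fused(regions)["user_b"])     # -> (0, 0, 5, 6)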
Step S230: behavior characteristic information of a target user in the attention area is detected, and attention information of a target commodity is determined based on the behavior characteristic information.
The attention information is used for reflecting the attention paid by users to the target commodity. For example, the attention information may include the stay time of each target user in the region of interest of the target commodity, the number of times target users enter the region of interest, the operations performed by target users on the target commodity, the age distribution of the target users entering the region of interest, and the like.
Wherein the behavior characteristic information of the target user includes, but is not limited to, one or more of an action, a gesture, a sound, a stay time in the region of interest, an operation performed on the target commodity, and the like of the target user.
And extracting behavior characteristics of a target user in the attention area, and analyzing commodity attention information according to the extracted behavior characteristic information to obtain attention information of the target commodity.
The image acquisition device acquires an image of the target user in the region of interest to obtain an image to be identified; the image to be identified is input into a pre-trained feature extraction model to obtain the behavior feature information of the target user output by the feature extraction model. The feature extraction model may be implemented based on a convolutional neural network (CNN), a long short-term memory (LSTM) network, or the like, which is not limited in the present application.
And then, counting the behavior characteristics of each target user to obtain the attention information of the target commodity.
The following further describes the steps of the commodity interest information analysis method according to the present application:
in some embodiments, determining the target user corresponding to the target commodity in step S210 includes: acquiring a scene image acquired by image acquisition equipment; if the target commodity exists in the scene image, acquiring a user detection area of the target commodity; and taking the user detected in the user detection area as a target user.
An image acquisition device is arranged in the commodity sales scene, the image acquisition device acquires images of the commodity sales scene to obtain scene images, and then image features of all sales commodities contained in the scene images are extracted. And acquiring a commodity image of the target commodity, extracting image features of the target commodity according to the commodity image, comparing the extracted image features of each sales commodity with the image features of the target commodity to obtain the similarity between each sales commodity and the target commodity, and taking the sales commodity with the similarity greater than a preset similarity threshold as the target commodity.
Further, a user detection area of the target commodity is acquired. For example, position information of the target commodity is acquired, and a user detection area of the target commodity is determined based on the position information.
For example, the size of the detection area of the target commodity is calculated according to one or more of the volume information, the length and width information and the commodity type of the target commodity, and then the position coordinates of the target commodity are taken as the center of the detection area, and the user detection area of the target commodity is obtained by dividing according to the size of the detection area.
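A minimal sketch of this division step is given below; the per-type margins and the metric units are assumptions for illustration, not values from the disclosure:

# Margins per commodity type, in metres; these values are invented for illustration.
TYPE_MARGIN = {"vehicle": 2.0, "furniture": 1.0, "phone": 0.5}

def user_detection_area(centre_xy, length, width, commodity_type):
    # Rectangle centred on the commodity, grown outwards by a type-dependent margin.
    margin = TYPE_MARGIN.get(commodity_type, 1.0)
    cx, cy = centre_xy
    half_l = length / 2 + margin
    half_w = width / 2 + margin
    return (cx - half_l, cy - half_w, cx + half_l, cy + half_w)

print(user_detection_area((10.0, 5.0), length=4.6, width=1.9, commodity_type="vehicle"))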
Then the user detected in the user detection area is taken as the target user, the user characteristic information of the target user is extracted, and the attention area of the target commodity is divided by using the user characteristic information to obtain the attention area of the target commodity for the target user.
In some embodiments, in step S220, the method for dividing the attention area of the target commodity by using the user feature information to obtain the attention area of the target commodity for the target user includes:
step S221: and acquiring a commodity area of the target commodity.
The commodity area of the target commodity refers to an area covered by the target commodity.
Step S222: and calculating the expansion parameters corresponding to the commodity area by utilizing the user characteristic information.
The expansion parameter is used to define the extent to which the commodity area expands.
And calculating the expansion parameters corresponding to the commodity area according to the user characteristic information.
For example, the user characteristic information contains a plurality of user information items, expansion sub-parameters corresponding to each user information item are calculated respectively, and then weighting calculation is carried out on each expansion sub-parameter to obtain expansion parameters corresponding to the commodity area.
For example, the user characteristic information includes a height and a moving speed of the user, a first extension sub-parameter is obtained by calculating according to the height, a second extension sub-parameter is obtained by calculating according to the moving speed, the first extension sub-parameter and the second extension sub-parameter are weighted according to the weight of the first extension sub-parameter and the weight of the second extension sub-parameter, and the weighted calculation result is used as the extension parameter corresponding to the commodity area.
The weights of the first extension sub-parameter and the weights of the second extension sub-parameter may be preset or flexibly calculated, for example, according to the commodity characteristics of the target commodity, such as the volume information, the length and width information, the commodity type and other information of the target commodity, the weights of the first extension sub-parameter and the weights of the second extension sub-parameter are calculated, so that the accuracy of the calculated extension parameters is improved.
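A minimal sketch of the weighted combination described above; the mappings from height and moving speed to expansion sub-parameters, and the weights themselves, are assumptions for illustration:

def expansion_parameter(height_m, speed_mps, w_height=0.7, w_speed=0.3):
    # Assumed rule: taller or faster-moving users get a wider attention area.
    sub_height = 0.5 * height_m   # first expansion sub-parameter, from height
    sub_speed = 0.8 * speed_mps   # second expansion sub-parameter, from moving speed
    return w_height * sub_height + w_speed * sub_speed

print(expansion_parameter(height_m=1.75, speed_mps=1.2))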
Step S223: and expanding the commodity area based on the expansion parameters to obtain the attention area of the target commodity for the target user.
The commodity area is expanded according to the expansion parameters, and the expanded area is taken as the attention area of the target commodity for the target user.
In some embodiments, taking an example that the user characteristic information includes physical characteristics of the target user, the calculation process of the expansion parameter is illustrated: obtaining an expansion base based on physical characteristics of a target user, and obtaining expansion multiples corresponding to target commodities; and calculating the expansion parameters corresponding to the commodity area by using the expansion base and the expansion multiples.
The physical characteristics of the target user are used to describe the physical information of the target user, such as the height of the target user, the head height of the target user, the shoulder width of the target user, etc.
The expansion base is obtained according to the physical characteristics of the target user.
For example, the physical characteristics of the target user include the head height H and the shoulder width W of the target user, from which the expansion base is derived; for instance, the sum of W and H may be taken as the expansion base.
Then, the expansion times corresponding to the target commodity are obtained, and the expansion times can be preset or flexibly calculated, for example, the expansion times corresponding to the target commodity are calculated according to the type of the target commodity, the commodity purchase record of the target user and the like.
And calculating the expansion parameters corresponding to the commodity area according to the expansion base and the expansion multiple. For example, multiplying the expansion base according to the expansion multiple to obtain the expansion parameter corresponding to the commodity area.
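A minimal sketch of this calculation, assuming metric units, a rectangular commodity area, and an expansion multiple of 1.5 (none of which are specified in the disclosure):

def expansion_base(head_height, shoulder_width):
    # Expansion base is the sum of head height and shoulder width (as in the example above).
    return head_height + shoulder_width

def expand_rect(rect, expansion):
    # Grow the commodity rectangle outwards by the expansion parameter on every side.
    x1, y1, x2, y2 = rect
    return (x1 - expansion, y1 - expansion, x2 + expansion, y2 + expansion)

base = expansion_base(head_height=0.25, shoulder_width=0.45)   # metres, assumed values
expansion = base * 1.5                                         # expansion multiple is assumed
print(expand_rect((0.0, 0.0, 4.6, 1.9), expansion))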
In some implementations, referring to fig. 4, fig. 4 is a flowchart illustrating a method for partitioning a region of interest according to an exemplary embodiment of the present application, where, as shown in fig. 4, the steps for partitioning the region of interest include:
step S410: and extracting commodity characteristics of the target commodity.
Step S420: and carrying out regional splitting on the commodity region of the target commodity based on commodity characteristics to obtain a plurality of local commodity regions.
Taking a target commodity as an example of a vehicle, referring to fig. 5, fig. 5 is a schematic diagram illustrating splitting a commodity area when the target commodity is a vehicle according to an exemplary embodiment of the present application, and as shown in fig. 5, the commodity area of the vehicle may be split into a head area, a tail area, and an area corresponding to each door according to commodity characteristics.
Step S430: and calculating expansion parameters corresponding to each local commodity area respectively by using the user characteristic information.
And respectively carrying out expansion parameter calculation on each local commodity area, and refining the granularity of the attention information analysis of the target commodity.
For example, an expansion base is calculated according to the user characteristic information, then expansion multiples corresponding to each local commodity area are obtained, and multiplication operation is carried out on the expansion base according to the expansion multiples, so that expansion parameters corresponding to each local commodity area are obtained.
Wherein the expansion factor may be different between different local commodity areas. For example, based on preset priority information of each local commodity area, determining an expansion multiple corresponding to each local commodity area, wherein if the priority is higher, the expansion multiple is larger; for another example, the user history interaction records of the local commodity areas are obtained, and the expansion multiple corresponding to each local commodity area is determined according to the user history interaction records, for example, the expansion multiple is larger as the user stay time in the user history interaction records is longer.
Step S440: and respectively expanding each local commodity area based on the expansion parameters respectively corresponding to each local commodity area to obtain the local expansion area respectively corresponding to each local commodity area.
As shown in fig. 5, the local commodity areas include a head area, a tail area and an area corresponding to each vehicle door. In combination with the above embodiment, if an expansion base M is calculated from the user characteristic information and the expansion multiple of the head area is N1, the local expansion area corresponding to the head area is a region R1; the expansion multiple of the tail area is N2, and the local expansion area corresponding to the tail area is a region R2; the expansion multiple of the driver-door area is N3, and the local expansion area corresponding to the driver-door area is a region R3; the expansion multiple of the front-passenger-door area is N4, and the local expansion area corresponding to the front-passenger-door area is a region R4; and the expansion multiples of the two rear-door areas are N5, so the local expansion areas corresponding to the two rear-door areas are a region R5 and a region R6 respectively.
Step S450: and splicing each local expansion area to obtain the attention area of the target commodity for the target user.
And performing splicing processing on each local expansion region, and taking the spliced region as a region of interest of the target commodity for the target user.
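A minimal sketch of steps S440 and S450 for the vehicle example; the local region geometry, the expansion base and the per-region multiples below are invented values, and the stitched region of interest is kept as a set of local expansion regions rather than merged into a single shape:

def expand_rect(rect, amount):
    # Grow a rectangle outwards by the given amount on every side.
    x1, y1, x2, y2 = rect
    return (x1 - amount, y1 - amount, x2 + amount, y2 + amount)

local_regions = {                     # assumed local commodity regions of the vehicle
    "head": (0.0, 0.0, 1.0, 1.9),
    "tail": (3.6, 0.0, 4.6, 1.9),
    "driver_door": (1.0, 0.0, 2.3, 0.2),
}
multiples = {"head": 1.5, "tail": 1.2, "driver_door": 2.0}   # assumed expansion multiples

base = 0.7                            # expansion base derived from user features (assumed)
region_of_interest = {                # step S440: expand each local region by its own parameter
    name: expand_rect(rect, base * multiples[name])
    for name, rect in local_regions.items()
}
# Step S450: the stitched region of interest is the collection of local expansion regions.
print(region_of_interest["driver_door"])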
Based on the above embodiment, the region of interest is composed of a plurality of local expansion regions, different local expansion regions corresponding to different local commodity regions in the target commodity; in step S230, behavior feature information of the target user in the region of interest is detected, and information of interest of the target user to the target commodity is identified based on the behavior feature information, including:
step S2311: and acquiring behavior characteristic information of the target user in each local expansion area.
And extracting the behavior characteristic information of the target user in the local expansion area by taking the local expansion area as a unit.
Step S2312: and counting the behavior characteristic information corresponding to each local expansion area respectively to obtain the local attention of each local expansion area.
And recording the behavior characteristic information of each target user, respectively counting the behavior characteristic information detected by each local expansion area in a preset time period, and obtaining the local attention of each local expansion area according to the counting result.
Step S2313: and combining the local attention degree of each local expansion area to obtain attention information of the target commodity.
For example, the target attention of the target commodity may be calculated according to the local attention of each local expansion area. For example, the local attention of each local expansion region is weighted, and the weighted calculation result is used as the target attention of the target commodity.
Then, the local attention of each local expansion area and the target attention of the target commodity can be combined to generate attention information of the target commodity.
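A minimal sketch of combining local attention into attention information of the target commodity; measuring local attention as total dwell seconds and the weights used are assumptions:

def target_attention(local_attention, weights):
    # Weighted combination of the local attention of each local expansion region.
    return sum(weights[name] * value for name, value in local_attention.items())

local_attention = {"head": 120.0, "tail": 30.0, "driver_door": 600.0}  # seconds of dwell
weights = {"head": 0.2, "tail": 0.1, "driver_door": 0.7}

attention_info = {
    "local": local_attention,
    "target": target_attention(local_attention, weights),
}
print(attention_info["target"])  # -> 447.0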
In some embodiments, detecting behavior feature information of the target user within the region of interest in step S230, identifying information of interest of the target user to the target commodity based on the behavior feature information includes:
step S2321: and acquiring behavior characteristic information of a plurality of target users acquired in a preset time period.
The target user is tracked in the attention area, and behavior characteristic information of the target user is extracted during tracking until the target user leaves the attention area, thereby ensuring the completeness of the behavior characteristic information obtained by tracking.
Behavior characteristic information of a plurality of target users acquired within a preset time period is acquired. Since the same target user may return to the attention area after leaving it, two pieces of behavior characteristic information may be generated for the same target user. Therefore, in order to ensure the accuracy of the behavior characteristic information corresponding to each target user, the records belonging to the same target user need to be identified.
Step S2322: and respectively carrying out user identity identification on each target user to obtain the user identity of each target user.
Extracting at least one of the face features or the human body features of the target users, and respectively carrying out user identity recognition on each target user according to the extracted face features or human body features to obtain the user identity of each target user.
Step S2323: and aggregating the behavior characteristic information belonging to the same user identity to obtain the target characteristic.
That is, one user identity may have a plurality of pieces of behavior feature information within the preset time period, and these pieces of behavior feature information are aggregated.
For example, there are two pieces of behavior feature information for the same target user within the preset time period. One piece indicates that the stay time of the target user in the driver-door area of the vehicle from a first moment to a second moment is 10 minutes; the other piece indicates that the stay time of the target user in the driver-door area of the vehicle from a third moment to a fourth moment is 15 minutes. The target feature obtained after aggregating the behavior feature information of this target user indicates that the stay time of the target user in the driver-door area of the vehicle within the preset time period is 25 minutes.
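A minimal sketch of the aggregation by user identity; the record format (identity, local region, dwell minutes) is an assumption:

from collections import defaultdict

records = [  # (user_identity, local_region, dwell_minutes)
    ("id_001", "driver_door", 10),
    ("id_002", "tail", 5),
    ("id_001", "driver_door", 15),
]

# Sum the dwell time per local region for records sharing the same user identity.
aggregated = defaultdict(lambda: defaultdict(int))
for identity, region, minutes in records:
    aggregated[identity][region] += minutes

print(dict(aggregated["id_001"]))  # -> {'driver_door': 25}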
Step S2324: and determining the attention information of the target commodity based on the target characteristics.
And determining the attention information of the target commodity according to the target characteristics obtained after the polymerization treatment.
For example, all the target features are counted, and the attention information of the target commodity is obtained according to the counted result.
Taking the case where the target commodity is a vehicle as an example, the commodity interest information analysis method of the present application is further described. Referring to fig. 6, fig. 6 is a flowchart illustrating the analysis of attention information according to an exemplary embodiment of the present application. As shown in fig. 6, a target vehicle in a commodity sales scene is detected, and the vehicle area of the target vehicle is divided to obtain a plurality of local vehicle areas.
And then, in response to the detection of the target user in the user detection area corresponding to the target vehicle, expanding each local vehicle area according to the user characteristic information of the target user to obtain the attention area corresponding to the target vehicle.
Behavior characteristic information of the target user in the attention area is extracted, and a characteristic information ID corresponding to the behavior characteristic information is generated, so that the behavior characteristic information can be uniquely identified by the characteristic information ID. The behavior characteristic information may include, for example, the positional relationship between the target user and the target vehicle, whether the target user enters the driver's seat, interactions of the target user with the target vehicle (e.g., starting the vehicle engine or opening the trunk), and the like.
And detecting whether the target user leaves the attention area, and storing the behavior characteristic information and the characteristic information ID of the target user in an associated manner after detecting that the target user leaves the attention area.
Wherein the stored behavior characteristic information is aggregated.
For example, when newly added behavior feature information exists, the newly added behavior feature information is compared with the stored behavior feature information. If the similarity between a piece of stored behavior feature information and the newly added behavior feature information is greater than a preset similarity threshold, the two pieces of behavior feature information are regarded as behavior features of the same target user, so the feature information IDs corresponding to the two pieces of behavior feature information are corrected, for example, unified into one feature information ID.
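A minimal sketch of this correction, assuming the behavior features are comparable as vectors and using cosine similarity with an arbitrary threshold (the disclosure does not specify the similarity measure):

import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def assign_feature_id(new_feature, stored, threshold=0.9, next_id=0):
    # stored: {feature_id: feature_vector}. Returns the ID to use for new_feature.
    for fid, feat in stored.items():
        if cosine(new_feature, feat) > threshold:
            return fid                      # same target user: unify the IDs
    return f"feat_{next_id}"                # otherwise allocate a new ID

stored = {"feat_0": [0.9, 0.1, 0.2]}
print(assign_feature_id([0.88, 0.12, 0.21], stored))          # -> feat_0
print(assign_feature_id([0.0, 1.0, 0.0], stored, next_id=1))  # -> feat_1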
For another example, the target user is tracked in the image acquisition range of the image acquisition device, the moving track of the target user is recorded, then whether the stored behavior feature information exists in the target user corresponding to the newly added behavior feature information is judged based on the moving track, and if the stored behavior feature information exists, the feature information ID correction is performed.
And aggregating behavior characteristic information of the same characteristic information ID to count the aggregated behavior characteristic information, so as to obtain vehicle attention information with different dimensions.
For example, the stay time of each user in each local expansion area of the vehicle, the number of users entering each local expansion area, the frequency with which the same user enters each local expansion area, the number of times and the time at which the vehicle engine is started, the number of times and the time at which the vehicle trunk is opened, and the like are counted.
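A minimal sketch of this statistics step; the event record fields and action labels are assumptions for illustration:

from collections import defaultdict

events = [  # (feature_id, local_region, dwell_minutes, action)
    ("feat_0", "driver_door", 25, "start_engine"),
    ("feat_1", "tail", 5, None),
    ("feat_1", "driver_door", 8, None),
]

# Per local region: how many distinct users entered, total dwell time, engine starts.
stats = defaultdict(lambda: {"users": set(), "dwell": 0, "engine_starts": 0})
for fid, region, dwell, action in events:
    stats[region]["users"].add(fid)
    stats[region]["dwell"] += dwell
    if action == "start_engine":
        stats[region]["engine_starts"] += 1

for region, s in stats.items():
    print(region, len(s["users"]), s["dwell"], s["engine_starts"])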
According to the commodity interest information analysis method provided by the application, the target user corresponding to the target commodity is determined, and the user characteristic information of the target user is extracted; the attention area of the target commodity is divided by using the user characteristic information to obtain the attention area of the target commodity for the target user; behavior characteristic information of the target user in the attention area is detected, and attention information of the target commodity is determined based on the behavior characteristic information. Compared with the related-art approach of extracting behavior features within a preset fixed attention area, the present application flexibly divides the attention area for the target user according to the user characteristic information, which improves the accuracy of behavior feature extraction and makes the analysis of commodity attention information more accurate.
Fig. 7 is a block diagram of a commodity interest information analysis apparatus according to an exemplary embodiment of the present application. As shown in fig. 7, the exemplary commodity interest information analysis apparatus 700 includes: a user feature extraction module 710, a region of interest determination module 720, and an analysis module 730. Specifically:
the user feature extraction module 710 is configured to determine a target user corresponding to the target commodity, and extract user feature information of the target user;
the attention area determining module 720 is configured to divide attention areas of the target commodity by using the user feature information, so as to obtain an attention area of the target commodity for the target user;
the analysis module 730 is configured to detect behavior feature information of the target user in the region of interest, and determine the information of interest of the target commodity based on the behavior feature information.
In the above-mentioned exemplary commodity interest information analysis device, the interest area of the target user is flexibly divided according to the user feature information of the user, the behavior feature information of the target user is detected in the interest area, and the interest information of the target commodity is determined based on the behavior feature information, so that the accuracy of behavior feature extraction is improved, and the interest information analysis of the commodity is more accurate.
The functions of each module may be described in the embodiments of the method for analyzing information of interest of the commodity, which is not described herein.
Referring to fig. 8, fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the application. The electronic device 800 comprises a memory 801 and a processor 802, and the processor 802 is configured to execute program instructions stored in the memory 801 to implement the steps of any of the above-described commodity interest information analysis method embodiments. In a specific implementation scenario, the electronic device 800 may include, but is not limited to, mobile devices such as a notebook computer and a tablet computer, which is not limited herein.
Specifically, the processor 802 is configured to control itself and the memory 801 to implement the steps in any of the commodity interest information analysis method embodiments described above. The processor 802 may also be referred to as a central processing unit (CPU). The processor 802 may be an integrated circuit chip with signal processing capabilities. The processor 802 may also be a general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. In addition, the processor 802 may be implemented jointly by a plurality of integrated circuit chips.
Referring to fig. 9, fig. 9 is a schematic structural diagram of an embodiment of a computer readable storage medium according to the present application. The computer readable storage medium 900 stores program instructions 910 executable by the processor, the program instructions 910 for implementing the steps in any of the above-described embodiments of the merchandise interest information analysis method.
In some embodiments, functions or modules included in an apparatus provided by the embodiments of the present disclosure may be used to perform a method described in the foregoing method embodiments, and specific implementations thereof may refer to descriptions of the foregoing method embodiments, which are not repeated herein for brevity.
The foregoing description of the embodiments focuses on the differences between the embodiments; for the same or similar parts, reference may be made to the other embodiments, and details are not repeated herein for brevity.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical functional division, and there may be additional divisions of actual implementation, e.g., units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical, or other forms.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units. The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor (processor) to execute all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.

Claims (10)

1. A commodity interest information analysis method, the method comprising:
determining a target user corresponding to a target commodity, and extracting user characteristic information of the target user;
carrying out attention area division on the target commodity by utilizing the user characteristic information to obtain an attention area of the target commodity for the target user; the attention area contains an area obtained by expanding the commodity area of the target commodity;
and detecting behavior characteristic information of the target user in the attention area, and determining attention information of the target commodity based on the behavior characteristic information.
2. The method according to claim 1, wherein the dividing the target commodity into the regions of interest by using the user characteristic information, to obtain the regions of interest of the target commodity for the target user, includes:
acquiring a commodity area of the target commodity;
calculating expansion parameters corresponding to the commodity area by utilizing the user characteristic information;
and expanding the commodity area based on the expansion parameters to obtain the attention area of the target commodity for the target user.
3. The method according to claim 2, wherein calculating the expansion parameter corresponding to the commodity region by using the user characteristic information comprises:
extracting commodity features of the target commodity;
splitting the commodity region based on the commodity features to obtain a plurality of local commodity regions; and
calculating an expansion parameter corresponding to each local commodity region by using the user characteristic information.
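Claim 3 first splits the commodity region into local commodity regions. The sketch below assumes the split is simply a stack of equal horizontal slices (for example, shelf levels); a real implementation would drive the split from the extracted commodity features.

```python
def split_commodity_region(box, n_parts):
    """Split a commodity box (x1, y1, x2, y2) into n_parts horizontal slices,
    stacked top to bottom; a stand-in for a feature-driven split."""
    x1, y1, x2, y2 = box
    edges = [y1 + (y2 - y1) * i // n_parts for i in range(n_parts + 1)]
    return [(x1, edges[i], x2, edges[i + 1]) for i in range(n_parts)]

# Example: three local commodity regions for a shelf-high commodity box.
local_regions = split_commodity_region((500, 300, 600, 500), 3)
```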
4. The method according to claim 3, wherein expanding the commodity region based on the expansion parameter to obtain the region of interest of the target commodity for the target user comprises:
expanding each local commodity region based on the expansion parameter corresponding to that local commodity region to obtain a local expansion region corresponding to each local commodity region; and
stitching the local expansion regions to obtain the region of interest of the target commodity for the target user.
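For claim 4, the sketch below expands each local commodity region by its own expansion parameter and then "stitches" the results; representing the stitched region as the list of local expansion regions plus their enclosing box is an assumption, since the claim does not fix a representation.

```python
def stitch_region_of_interest(local_boxes, local_expansions, frame_w, frame_h):
    """Expand each local commodity box (x1, y1, x2, y2) by its own expansion
    parameter, then stitch the local expansion regions together."""
    expanded = []
    for (x1, y1, x2, y2), e in zip(local_boxes, local_expansions):
        expanded.append((max(0, x1 - e), max(0, y1 - e),
                         min(frame_w, x2 + e), min(frame_h, y2 + e)))
    # One simple stitching choice: keep the local expansion regions and also
    # report their enclosing bounding box as the overall region of interest.
    hull = (min(b[0] for b in expanded), min(b[1] for b in expanded),
            max(b[2] for b in expanded), max(b[3] for b in expanded))
    return expanded, hull
```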
5. The method according to claim 2, wherein the user characteristic information comprises physical characteristics of the target user, and calculating the expansion parameter corresponding to the commodity region by using the user characteristic information comprises:
obtaining an expansion base based on the physical characteristics of the target user, and obtaining an expansion multiple corresponding to the target commodity; and
calculating the expansion parameter corresponding to the commodity region by using the expansion base and the expansion multiple.
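Claim 5 derives the expansion parameter from an expansion base tied to the user's physical characteristics and an expansion multiple tied to the commodity. The numbers below (arm reach approximated as 0.44 times the user's height in the image, parameter = base × multiple) are illustrative assumptions, not values from the patent.

```python
def expansion_parameter(user_height_px, commodity_multiple):
    """Toy expansion parameter: an expansion base estimated from the user's
    body size, scaled by a per-commodity expansion multiple."""
    expansion_base = 0.44 * user_height_px  # assumed proxy for arm reach
    return expansion_base * commodity_multiple

# Example: a user 400 px tall and a commodity multiple of 0.5 give an 88 px margin.
margin = expansion_parameter(400, 0.5)
```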
6. The method according to any one of claims 1 to 5, wherein determining the target user corresponding to the target commodity comprises:
acquiring a scene image captured by an image acquisition device;
if the target commodity is present in the scene image, acquiring a user detection region of the target commodity; and
taking a user detected within the user detection region as the target user.
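A minimal sketch of the user selection in claim 6, assuming the user detection region is the commodity box padded by a fixed margin and that a detected person counts as being inside it when their box center falls within that region; the margin value and the center-point test are assumptions.

```python
def find_target_users(commodity_box, user_boxes, margin=150):
    """Return detected users whose box centers lie inside the commodity's
    user detection region (the commodity box padded by `margin` pixels)."""
    cx1, cy1, cx2, cy2 = commodity_box
    dx1, dy1, dx2, dy2 = cx1 - margin, cy1 - margin, cx2 + margin, cy2 + margin
    targets = []
    for ux1, uy1, ux2, uy2 in user_boxes:
        ux, uy = (ux1 + ux2) / 2, (uy1 + uy2) / 2
        if dx1 <= ux <= dx2 and dy1 <= uy <= dy2:
            targets.append((ux1, uy1, ux2, uy2))
    return targets
```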
7. The method according to any one of claims 1 to 5, wherein the region of interest consists of a plurality of local expansion regions, different local expansion regions corresponding to different local commodity regions of the target commodity, and detecting the behavior characteristic information of the target user within the region of interest and determining the interest information of the target commodity based on the behavior characteristic information comprises:
acquiring behavior characteristic information of the target user in each local expansion region;
counting the behavior characteristic information corresponding to each local expansion region to obtain a local attention degree of each local expansion region; and
combining the local attention degrees of the local expansion regions to obtain the interest information of the target commodity.
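For claim 7, the sketch below counts behavior events per local expansion region to get a local attention degree and then combines the per-region degrees into commodity-level interest information. The event types, their weights, and the use of a plain sum for the combination are all assumptions.

```python
from collections import Counter

EVENT_WEIGHTS = {"gaze": 1.0, "touch": 3.0, "pick_up": 5.0}  # assumed weights

def local_attention(events_by_region):
    """events_by_region maps a local-expansion-region id to the list of
    behavior events observed inside that region."""
    scores = {}
    for region_id, events in events_by_region.items():
        counts = Counter(events)
        scores[region_id] = sum(EVENT_WEIGHTS.get(event, 0.0) * n
                                for event, n in counts.items())
    # Combine the local attention degrees into interest information
    # for the whole commodity (here simply their sum).
    return scores, sum(scores.values())

per_region, total = local_attention({"top_shelf": ["gaze", "gaze", "touch"],
                                     "bottom_shelf": ["gaze"]})
```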
8. The method according to any one of claims 1 to 5, wherein detecting the behavior characteristic information of the target user within the region of interest and determining the interest information of the target commodity based on the behavior characteristic information comprises:
acquiring behavior characteristic information of a plurality of target users collected within a preset time period;
performing user identity recognition on each target user to obtain a user identity of each target user;
aggregating behavior characteristic information belonging to the same user identity to obtain a target feature; and
determining the interest information of the target commodity based on the target feature.
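Claim 8 merges observations by user identity before scoring, so a shopper who lingers repeatedly within the preset time period is counted once rather than as several users. The sketch below assumes each observation is already tagged with an identity string and a numeric behavior score; deriving interest information as the count of distinct users and their mean score is an assumption.

```python
from collections import defaultdict

def aggregate_by_identity(observations):
    """observations: iterable of (user_identity, behavior_score) pairs collected
    within the preset time period; scores belonging to the same identity are
    merged into one target feature per user."""
    per_user = defaultdict(float)
    for identity, score in observations:
        per_user[identity] += score
    n_users = len(per_user)
    mean_score = sum(per_user.values()) / n_users if n_users else 0.0
    # Commodity-level interest information derived from the per-user target features.
    return dict(per_user), {"distinct_users": n_users, "mean_score": mean_score}
```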
9. An electronic device, comprising a memory and a processor, wherein the processor is configured to execute program instructions stored in the memory to implement the method of any one of claims 1 to 8.
10. A computer-readable storage medium having program instructions stored thereon, wherein the program instructions, when executed by a processor, implement the method of any one of claims 1 to 8.
CN202310975269.0A 2023-08-04 2023-08-04 Commodity interest information analysis method, equipment and storage medium Active CN116682071B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310975269.0A CN116682071B (en) 2023-08-04 2023-08-04 Commodity interest information analysis method, equipment and storage medium


Publications (2)

Publication Number Publication Date
CN116682071A (en) 2023-09-01
CN116682071B (en) 2023-11-10

Family

ID=87784117

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310975269.0A Active CN116682071B (en) 2023-08-04 2023-08-04 Commodity interest information analysis method, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116682071B (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10095937B2 (en) * 2016-06-21 2018-10-09 GM Global Technology Operations LLC Apparatus and method for predicting targets of visual attention

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003242297A (en) * 2001-12-11 2003-08-29 Japan Tobacco Inc Dispenser system and sales analysis method using the system
JP2009116510A (en) * 2007-11-05 2009-05-28 Fujitsu Ltd Attention degree calculation device, attention degree calculation method, attention degree calculation program, information providing system and information providing device
JP2011227886A (en) * 2010-03-30 2011-11-10 Rakuten Inc Commodity information providing system, commodity information providing method, and program
TW202032461A (en) * 2019-02-18 2020-09-01 宏碁股份有限公司 Customer behavior analyzing method and customer behavior analyzing system
CN111681018A (en) * 2019-03-11 2020-09-18 宏碁股份有限公司 Customer behavior analysis method and customer behavior analysis system
CN110264219A (en) * 2019-05-06 2019-09-20 浙江华坤道威数据科技有限公司 A kind of client's monitoring analysis system based on big data
WO2021082636A1 (en) * 2019-10-29 2021-05-06 深圳云天励飞技术股份有限公司 Region of interest detection method and apparatus, readable storage medium and terminal device
CN111126288A (en) * 2019-12-25 2020-05-08 南京甄视智能科技有限公司 Target object attention calculation method, target object attention calculation device, storage medium and server
CN113674037A (en) * 2021-10-21 2021-11-19 西安超嗨网络科技有限公司 Data acquisition and recommendation method based on shopping behaviors
CN114943586A (en) * 2022-05-31 2022-08-26 中国银行股份有限公司 Commodity recommendation method, device and equipment based on position detection

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Empirical Study on Sales Conversion Rate of Product Attention and Buying Intention Based on Purchasing Behavior Model; J Jing; Shanghai Management Science; full text *
Overlapping Community Detection of Microblog Users Fusing Average Label Partition Distance and Structural Relations; Ma Huifang; Chen Haibo; Zhao Weizhong; Bing Rui; Huang Lele; Acta Electronica Sinica, Issue 11; full text *

Also Published As

Publication number Publication date
CN116682071A (en) 2023-09-01

Similar Documents

Publication Publication Date Title
US11004129B2 (en) Image processing
US11494594B2 (en) Method for training model and information recommendation system
CN106557480B (en) Method and device for realizing query rewriting
CN105938622A (en) Method and apparatus for detecting object in moving image
US10346861B2 (en) Adaptive sampling scheme for imbalanced large scale data
CN108229999B (en) Method and device for evaluating competitive products
CN109658194A (en) A kind of lead referral method and system based on video frequency tracking
CN106991425B (en) Method and device for detecting commodity transaction quality
Gothai et al. Design features of grocery product recognition using deep learning
Wan et al. Efficient virtual data search for annotation‐free vehicle reidentification
CN116682071B (en) Commodity interest information analysis method, equipment and storage medium
CN110880133A (en) Commodity information pushing method, system, storage medium and electronic equipment
WO2019046329A1 (en) Search method and apparatus
US11030769B2 (en) Methods and apparatus to perform image analyses in a computing environment
CN113763057A (en) User identity portrait data processing method and device
Maliatski et al. Hardware-driven adaptive k-means clustering for real-time video imaging
CN113378071A (en) Advertisement recommendation method and device, electronic equipment and storage medium
CN112200711A (en) Training method and system of watermark classification model
CN112085553A (en) Specific commodity detection method and device
CN112949752B (en) Training method and device of business prediction system
CN111027326A (en) Commodity classification method, storage medium and electronic device
CN116071569A (en) Image selection method, computer equipment and storage device
US20220222297A1 (en) Generating search results based on an augmented reality session
CN117934088A (en) Commodity recommendation method and device, electronic equipment and storage medium
CN112818258A (en) Social network user searching method based on keywords, computer device and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant