JP6226240B2 - Activity map analyzer, activity map analysis system, and activity map analysis method


Publication number: JP6226240B2
Authority: JP (Japan)
Prior art keywords: activity, map, unit, moving object, target area
Legal status: Active
Application number: JP2014161046A
Other languages: Japanese (ja)
Other versions: JP2015127940A
Inventors: 岩井 和彦, 吉田 和麻
Original assignee: パナソニックIpマネジメント株式会社
Application filed by パナソニックIpマネジメント株式会社
Priority to JP2014161046A
Publication of JP2015127940A
Application granted; publication of JP6226240B2

Description

  The present invention relates to an activity map analysis device, an activity map analysis system, and an activity map analysis method for analyzing an activity status of a moving object in a monitoring area and outputting an activity map visualizing the activity status of the moving object.

  In stores such as convenience stores, monitoring systems that install cameras to capture the inside of the store and monitor the persons in the store through the camera images are in widespread use. If these camera images are also used to grasp the activity status of customers in the store, measures for improving store operation, such as the way products are displayed, can be devised based on that activity status.

  Conventionally, a technique for generating an activity map that visualizes the activity status of persons in a monitoring area has been known (see Patent Document 1). In this technique, the activity map is displayed as contour lines color-coded according to the activity level of persons. Also known is a technique in which the monitoring area is divided into a plurality of blocks and the degree of staying of persons is detected for each block (see Patent Document 2); in this technique, a value (score) indicating the degree of staying is output for each block.

Patent Document 1: JP 2009-134688 A
Patent Document 2: JP 2011-248836 A

  Now, a user such as a store manager may wish to check how well a specific product (a new product, a seasonal product, boxed lunches, beverages, and so on) attracts customers, and therefore wants to grasp the activity status of persons limited to the area where that product is placed. The technique disclosed in Patent Document 1 makes it easy to grasp the overall activity status of the monitoring area, but since the activity map is displayed as a complicated shape, there is the problem that the activity status of persons in the specific area within the monitoring area to which the user pays particular attention cannot be grasped immediately.

  On the other hand, the technique disclosed in Patent Document 2 divides the monitoring area into blocks and outputs the activity level (staying level) of persons for each block, so the staying places where people gather are easy to grasp. What can be grasped in this way, however, is only the activity status of persons in a specific period, and there is the problem that how the activity status of persons has changed with the passage of time cannot be grasped immediately.

  For example, when focusing on an area with a high activity level, it cannot be immediately determined how the activity progressed in that area; specifically, whether a large number of customers gathered all at once or a certain number of customers kept stopping by. Likewise, even when attention is paid to an area with a low activity level, it cannot be immediately determined whether there was a time zone in which customers temporarily gathered.

  The present invention has been devised to solve such problems of the prior art, and its main object is to provide an activity map analysis device, an activity map analysis system, and an activity map analysis method configured so that the activity status of a moving object in the area to which the user pays attention within the monitoring area can be grasped immediately, and further so that how the activity status of the moving object has changed with the passage of time can be grasped immediately.

An activity map analysis device according to the present invention is an activity map analysis device that analyzes the activity status of a moving object in a monitoring area and outputs an activity map visualizing that activity status, and has a configuration comprising: a moving object detection unit that detects a moving object from a captured image of the monitoring area; an activity value acquisition unit that acquires, based on the detection result of the moving object detection unit, a moving object activity value representing the degree of activity of the moving object in the monitoring area; a target area setting unit that sets a target area within the monitoring area in accordance with a user input operation; a map generation unit that generates the activity map relating to the target area set by the target area setting unit, based on the moving object activity value of the monitoring area acquired by the activity value acquisition unit; and an output information generation unit that generates output information having a map image in which the activity map generated by the map generation unit is displayed superimposed on the captured image of the monitoring area.

An activity map analysis system according to the present invention is an activity map analysis system that analyzes the activity status of a moving object in a monitoring area and outputs an activity map visualizing that activity status, and has a configuration comprising a camera that images the monitoring area and a plurality of information processing devices, wherein any one of the plurality of information processing devices comprises: a moving object detection unit that detects a moving object from a captured image of the monitoring area; an activity value acquisition unit that acquires, based on the detection result of the moving object detection unit, a moving object activity value representing the degree of activity of the moving object in the monitoring area for each predetermined unit period; a target area setting unit that sets a target area within the monitoring area in accordance with a user input operation; a map generation unit that generates the activity map relating to the target area set by the target area setting unit, based on the moving object activity value of the monitoring area acquired by the activity value acquisition unit; and an output information generation unit that generates output information having a map image in which the activity map generated by the map generation unit is displayed superimposed on the captured image of the monitoring area.

An activity map analysis method according to the present invention is an activity map analysis method that causes an information processing device to perform processing for analyzing the activity status of a moving object in a monitoring area and outputting an activity map visualizing that activity status, and has a configuration comprising the steps of: detecting a moving object from a captured image of the monitoring area; acquiring, based on the detection result of the detecting step, a moving object activity value representing the degree of activity of the moving object in the monitoring area for each predetermined unit period; setting a target area within the monitoring area in accordance with a user input operation; generating the activity map relating to the target area based on the moving object activity value of the monitoring area acquired in the acquiring step; and generating output information having a map image in which the activity map is displayed superimposed on the captured image of the monitoring area.

  According to the present invention, since an activity map based on the moving object activity value of the target area set in the monitoring area in accordance with the user's input operation is output, the activity status of the moving object in the area of interest within the monitoring area can be grasped immediately.

FIG. 1: Overall configuration diagram of an activity map analysis system according to a first embodiment
FIG. 2: Plan view of the store explaining the store layout and the installation status of the cameras 1
FIG. 3: Functional block diagram showing the schematic configuration of the PC 3
FIG. 4: Explanatory diagram explaining the outline of the activity map analysis process performed by the PC 3
FIGS. 5 and 6: Explanatory diagrams showing the analysis condition input screen displayed on the monitor 4
FIG. 7: Explanatory diagram explaining the outline of the activity value acquisition process performed by the activity value acquisition unit 34
FIG. 8: Explanatory diagram showing the color table used in the activity map generation process performed by the map generation unit 35
FIG. 9: Explanatory diagram explaining the outline of the graph generation process performed by the graph generation unit 36
FIGS. 10 to 16: Explanatory diagrams showing the analysis result output screen displayed on the monitor 4
FIG. 17: Explanatory diagram showing another example of the analysis result output screen displayed on the monitor 4
FIGS. 18 to 21: Explanatory diagrams showing the analysis result output document generated by the document generation unit 39
FIG. 22: Explanatory diagram showing the analysis condition input screen in an activity map analysis system according to a second embodiment

A first invention made to solve the above problems is an activity map analysis device that analyzes the activity status of a moving object in a monitoring area and outputs an activity map visualizing that activity status, and has a configuration comprising: a moving object detection unit that detects a moving object from a captured image of the monitoring area; an activity value acquisition unit that acquires, based on the detection result of the moving object detection unit, a moving object activity value representing the degree of activity of the moving object in the monitoring area for each predetermined unit period; a target area setting unit that sets a target area within the monitoring area in accordance with a user's input operation; a map generation unit that generates the activity map relating to the target area set by the target area setting unit, based on the moving object activity value of the monitoring area acquired by the activity value acquisition unit; and an output information generation unit that generates output information having a map image in which the activity map generated by the map generation unit is displayed superimposed on the captured image of the monitoring area.

  According to this, since an activity map based on the moving object activity value of the target area set in the monitoring area in accordance with the user's input operation is output, the activity status of the moving object in the area to which the user pays attention within the monitoring area can be grasped immediately.

  According to a second invention, the moving object detection unit acquires a moving object frame representing the region where a moving object detected from the captured image of the monitoring area exists, and the activity value acquisition unit counts, for each detection element, the number of times the detection element is located within a set range based on the moving object frame acquired by the moving object detection unit, thereby obtaining the moving object activity value for each detection element.

  According to this, the moving object activity value for each detection element can be acquired by simple processing.

  A third invention further includes a transition information generation unit that generates transition information relating to the transition status of the moving object activity value in the target area based on the moving object activity value acquired by the activity value acquisition unit, and the output information generation unit generates the output information including the map image and the transition information.

  According to this, since the transition information regarding the transition status of the moving object activity value in the target area is output, how the activity status of the moving object has changed with the passage of time can be grasped immediately. Moreover, since the activity map is also output, the activity status of the moving object in a specific period can be grasped immediately. By considering both the activity map and the transition information, the activity status of the moving object in the monitoring area can thus be grasped from various angles.

  According to a fourth invention, the target area setting unit sets the target area in units of a plurality of grids obtained by dividing the captured image into a grid pattern, in accordance with a user input operation.

  According to this, since the target area is set in units of grids, the target area can be specified easily. In this case, the captured image may be divided into grids in accordance with a user input operation specifying the numbers of vertical and horizontal divisions.

  According to a fifth invention, the target area setting unit sets the target area to an arbitrary shape designated by the user, in accordance with the user's input operation.

  According to this, since the target area can be set to an arbitrary shape designated by the user, the target area can be set to an optimal shape according to the actual state of the monitoring area.

A sixth invention is an activity map analysis system that analyzes the activity status of a moving object in a monitoring area and outputs an activity map visualizing that activity status, and has a configuration comprising a camera that photographs the monitoring area and a plurality of information processing devices, wherein any one of the plurality of information processing devices comprises: a moving object detection unit that detects a moving object from a captured image of the monitoring area; an activity value acquisition unit that acquires, based on the detection result of the moving object detection unit, a moving object activity value representing the degree of activity of the moving object in the monitoring area for each predetermined unit period; a target area setting unit that sets a target area within the monitoring area in accordance with a user input operation; a map generation unit that generates the activity map relating to the target area set by the target area setting unit, based on the moving object activity value of the monitoring area acquired by the activity value acquisition unit; and an output information generation unit that generates output information having a map image in which the activity map generated by the map generation unit is displayed superimposed on the captured image of the monitoring area.

  According to this, as with the first invention, the activity status of the moving object in the area to which the user particularly pays attention within the monitoring area can be grasped immediately.

A seventh invention is an activity map analysis method that causes an information processing device to perform processing for analyzing the activity status of a moving object in a monitoring area and outputting an activity map visualizing that activity status, and has a configuration comprising the steps of: detecting a moving object from a captured image of the monitoring area; acquiring, based on the detection result of the detecting step, a moving object activity value representing the degree of activity of the moving object in the monitoring area; setting a target area within the monitoring area in accordance with a user's input operation; generating the activity map relating to the target area based on the moving object activity value of the monitoring area acquired in the acquiring step; and generating output information having a map image in which the activity map is displayed superimposed on the captured image of the monitoring area.

  According to this, as with the first invention, the activity status of the moving object in the area to which the user particularly pays attention within the monitoring area can be grasped immediately.

  Hereinafter, embodiments of the present invention will be described with reference to the drawings.

(First embodiment)
FIG. 1 is an overall configuration diagram of an activity map analysis system according to the first embodiment. This activity map analysis system is built for a retail chain such as convenience stores, and includes a camera (imaging device) 1, a recorder (image recording device) 2, a PC (activity map analysis device) 3, and a monitor (display device) 4 provided for each of a plurality of stores, as well as a PC 11 and a monitor 12 provided at the headquarters that supervises the plurality of stores.

  The camera 1 is installed at an appropriate place in the store, the inside of the store is imaged by the camera 1, and the image information obtained thereby is recorded in the recorder 2. With the PC 3 provided in the store and the PC 11 provided at the headquarters, the in-store images captured by the camera 1 can be viewed in real time, and past in-store images recorded in the recorder 2 can also be viewed, so that the situation in the store can be checked at the store or at the headquarters.

  The PC 3 installed in the store is configured as an activity map analysis device that analyzes the activity status of customers in the store. The analysis result information generated by the PC 3 can be browsed on the PC 3 itself, and is further transmitted to the PC 11 installed at the headquarters so that it can also be browsed there; the PC 11 is thus configured as a browsing device for viewing the analysis result information.

  Next, a store layout and the installation status of the camera 1 will be described using a convenience store as an example. FIG. 2 is a plan view of the store explaining the store layout and the installation status of the camera 1.

  The store is provided with an entrance, display shelves (display areas), and a checkout counter. The display shelves are divided by product type, such as boxed lunches, plastic bottled drinks, and rice balls. Next to the checkout counter is a fast food display shelf. A customer enters the store through the entrance, moves through the passages between the display shelves, finds the desired products, brings them to the checkout counter, pays there, and then exits through the entrance.

  In the store, a plurality of cameras 1 that image the inside of the store (the monitoring area) are installed. In the example shown in FIG. 2A, a camera having a predetermined angle of view, a so-called box camera, is employed as the camera 1 and is installed on the ceiling at the end of a passage between the display shelves so as to image that passage. In the example shown in FIG. 2B, an omnidirectional camera having a 360-degree shooting range using a fisheye lens is adopted as the camera 1 and is installed on the ceiling directly above a display shelf, so that the passages between the display shelves can be imaged.

  Next, an outline of the activity map analysis process performed by the PC 3 shown in FIG. 1 will be described. FIG. 3 is a functional block diagram showing a schematic configuration of the PC 3. FIG. 4 is an explanatory diagram for explaining the outline of the activity map analysis process performed by the PC 3.

  As shown in FIG. 3, the PC 3 includes a monitoring unit 31 and an activity map analysis unit 32. The monitoring unit 31 causes the PC 3 to function as a monitoring system for monitoring the inside of the store: the operation of the camera 1 and the recorder 2 is controlled by the monitoring unit 31, the in-store images captured by the camera 1 can be viewed in real time, and the in-store images recorded by the recorder 2 can be viewed as well.

  The activity map analysis unit 32 analyzes the activity status of persons (moving objects) in the monitoring area and outputs an activity map visualizing that activity status, and includes a moving object detection unit 33, an activity value acquisition unit 34, a map generation unit 35, a graph generation unit (transition information generation unit) 36, an analysis condition setting unit 37, a GUI control unit 38, and a document generation unit 39.

  The analysis condition setting unit (target area setting unit) 37 has the user perform an input operation designating an area of particular interest within the monitoring area as the target area, and performs processing for setting the target area as an analysis condition in accordance with that input operation.

  In particular, in the present embodiment, as shown in FIG. 4, the target area can be set in units of a plurality of grids obtained by dividing the captured image into a grid pattern. The number of grid divisions used when dividing the captured image is set as an analysis condition in accordance with a user input operation, and the captured image is divided equally according to that number of divisions.
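  As a purely illustrative sketch of this equal division (the patent gives no implementation; the function name, image size, and return format below are assumptions), the grid cell boundaries might be computed as follows:

```python
def grid_bounds(image_w, image_h, div_x, div_y):
    """Divide a captured image of size image_w x image_h into
    div_x x div_y equal grid cells, returning a dict that maps
    (col, row) indices to (x0, y0, x1, y1) pixel bounds."""
    return {
        (col, row): (col * image_w // div_x,
                     row * image_h // div_y,
                     (col + 1) * image_w // div_x,
                     (row + 1) * image_h // div_y)
        for row in range(div_y) for col in range(div_x)
    }

# The 5 x 4 division used in the embodiment gives 20 grid cells.
cells = grid_bounds(640, 480, div_x=5, div_y=4)
assert len(cells) == 20
```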

  Further, the analysis condition setting unit 37 sets the display unit (pixel or grid) of the activity map and the counter upper limit value as analysis conditions in accordance with user input operations. Furthermore, the analysis condition setting unit 37 sets the aggregation mode in accordance with the input operation of the user designating one of the aggregation modes: day, week, month, or day of the week.

  The moving object detection unit 33 performs processing for detecting persons from the image of the monitoring area captured by the camera 1. In particular, in the present embodiment, processing for acquiring a person frame (moving object frame) representing the region where a person exists is performed; a known person detection technique or the like may be used for this moving object detection processing. Further, based on the person frame, one of the set ranges, namely the person frame itself, the upper body portion of the person frame, or the floor area near the person frame, is specified, and the moving object activity value acquisition processing described later is performed on it.

  The activity value acquisition unit 34 acquires a moving object activity value representing the activity level of persons in the monitoring area based on the detection result of the moving object detection unit 33, and includes a first activity value acquisition unit 41 and a second activity value acquisition unit 42.

  The first activity value acquisition unit 41 performs processing for acquiring a moving object activity value representing the activity level of persons for each pixel (detection element) based on the detection result of the moving object detection unit 33. That is, for each pixel of the camera image, the number of times the pixel is located within the set range, whether the person frame acquired by the moving object detection unit 33, the upper body portion of the person frame, or the floor area near the person frame, is counted, and the moving object activity value (counter value) for each pixel is acquired.

  The second activity value acquisition unit 42 aggregates the per-pixel moving object activity values acquired by the first activity value acquisition unit 41 in units of the grids set as the target area, and performs processing for obtaining the moving object activity value in grid units. In particular, in the present embodiment, the moving object activity values of the plurality of pixels located within a grid are averaged to obtain the grid-unit moving object activity value.

  The map generation unit 35 performs processing for generating an activity map that visualizes the activity status of persons in the monitoring area based on the moving object activity values acquired by the activity value acquisition unit 34. In particular, in the present embodiment, a pixel-unit activity map is generated based on the per-pixel moving object activity values acquired by the first activity value acquisition unit 41, and a grid-unit activity map is generated based on the grid-unit moving object activity values acquired by the second activity value acquisition unit 42. In the present embodiment, an activity map is generated only for the grids set as the target area; grids not set as the target area are excluded.

  The graph generation unit 36 performs processing for generating a graph (transition information) representing the transition status (temporal change) of the moving object activity value in the grids set as the target area, based on the grid-unit moving object activity values acquired by the second activity value acquisition unit 42.

  The GUI control unit 38 shown in FIG. 3 acquires input information from user input operations and outputs the analysis result of the activity map analysis through a GUI (Graphical User Interface) using the monitor 4 and an input device (input unit) 6 such as a mouse, and includes an input information acquisition unit 43 and a screen information generation unit (output information generation unit) 44.

  The screen information generation unit 44 performs processing for generating display information relating to an analysis condition input screen for having the user input analysis conditions and an analysis result output screen for displaying the analysis result of the activity map analysis processing; these screens are displayed on the monitor 4. The input information acquisition unit 43 performs processing for acquiring input information according to input operations performed by the user with the input device 6 on the analysis condition input screen or the analysis result output screen displayed on the monitor 4. Based on the input information acquired by the input information acquisition unit 43, the analysis condition setting unit 37 performs the processing for setting the analysis conditions.

  The document generation unit (output information generation unit) 39 performs processing for generating an analysis result output document that presents the analysis result of the activity map analysis processing in a predetermined file format. The analysis result output document generated by the document generation unit 39 is transmitted to a device different from the PC 3 that performs the activity map analysis processing, for example the PC 11 provided at the headquarters, which displays it on the monitor 12; the document can also be browsed by printing it with the printer 45.

  The monitoring unit 31 and the activity map analysis unit 32 are realized by causing the CPU of the PC 3 to execute a monitoring program and an activity map analysis program. These programs may be pre-installed on the PC 3 as an information processing device so that the PC 3 is configured as a dedicated device, or may be provided as application programs running on a general-purpose OS, recorded on an appropriate program recording medium or delivered via a network.

  Next, the analysis condition input screen for inputting the analysis conditions of the activity map analysis processing performed by the activity map analysis unit 32 shown in FIG. 3 will be described. FIGS. 5 and 6 are explanatory diagrams showing the analysis condition input screen displayed on the monitor 4; the example shown in FIG. 5 is for a box camera, and the example shown in FIG. 6 is for an omnidirectional camera.

  The analysis condition input screens shown in FIGS. 5 and 6 are for having the user input the analysis conditions of the activity map analysis processing performed by the activity map analysis unit 32. The analysis condition input screen is provided with a start button 51, a setting button 52, an analysis condition input unit 53, a division status display unit 54, and a target area registration unit 55.

  The start button 51 is for starting the activity map analysis processing in the activity map analysis unit 32. The setting button 52 is for displaying the analysis condition input unit 53, the division status display unit 54, and the target area registration unit 55, and for executing the processing for setting the analysis conditions of the activity map analysis processing in accordance with user operations.

  The analysis condition input unit 53 is provided with a display unit selection unit 56, a counter upper limit value input unit 57, and a grid division number input unit 58.

  The display unit selection unit 56 allows the user to select the display unit of the activity map generated by the map generation unit 35. In the present embodiment, either pixel or grid can be selected as the display unit: when pixel is selected, an activity map is generated for each pixel of the captured image, and when grid is selected, an activity map is generated for each grid.

  The counter upper limit value input unit 57 is for inputting the counter upper limit value as a numerical value, which may be done using the input device 6 such as a keyboard. In the present embodiment, the counter upper limit value can be designated as an arbitrary value from 10 to 500. The counter upper limit value restricts the display colors of the activity map and will be described in detail later; the per-pixel counting of the person frame performed by the activity value acquisition unit 34 shown in FIG. 3 is carried out regardless of this upper limit.

  The grid division number input unit 58 is for inputting the number of grid divisions used when dividing the captured image into a grid. This grid division number input unit 58 is provided with an X-direction input field and a Y-direction input field, in which the number of divisions in the X direction (horizontal direction) and the number of divisions in the Y direction (vertical direction) can be input. In particular, in this embodiment, each number of divisions can be input as an arbitrary value from 2 to 5.

  The division status display unit 54 displays the grid division status according to the numbers of divisions input in the grid division number input unit 58, superimposed on the image captured by the camera 1; that is, grid dividing lines are displayed on the captured image. In the example shown in FIGS. 5 and 6, the number of divisions in the X direction is 5 and the number of divisions in the Y direction is 4, so the captured image is divided into a total of 20 grids. Each grid is assigned a numeral (1, 2, 3, ...) in the X direction and a letter (A, B, C, ...) in the Y direction, so that a grid can be specified by the combination of these two codes.
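  The grid code scheme just described lends itself to a small helper. A hedged sketch (the function name is an assumption, not from the patent) that converts a code such as "2-D" into zero-based column and row indices might look like this:

```python
import string

def grid_code_to_indices(code):
    """Convert a grid code such as '2-D' (column numeral, row letter)
    to zero-based (col, row) indices, matching the labeling scheme
    described above (1, 2, 3, ... across; A, B, C, ... down)."""
    col_part, row_part = code.split("-")
    col = int(col_part) - 1
    row = string.ascii_uppercase.index(row_part.upper())
    return col, row

assert grid_code_to_indices("1-B") == (0, 1)
assert grid_code_to_indices("2-D") == (1, 3)
```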

  The target area registration unit 55 is for registering grids displayed on the division status display unit 54 as target areas. The target area registration unit 55 is provided with an X-direction input field, a Y-direction input field, and a name input field: the numeric code indicating the position of the grid in the X direction is input in the X-direction input field, the alphabetic code indicating the position of the grid in the Y direction is input in the Y-direction input field, and the name of each target area is input in the name input field. These inputs may be performed using the input device 6 such as a keyboard.

  In the example shown in FIG. 5, the 1-B grid, corresponding to the area of the boxed lunch display shelf, and the 2-D grid, corresponding to the clerk side of the checkout counter, are registered as target areas. In the example shown in FIG. 6, the 3-C grid, corresponding to the customer side of the checkout counter, and the 3-D grid, corresponding to the area of the boxed lunch display shelf, are registered as target areas.

  Next, the activity value acquisition processing performed by the activity value acquisition unit 34 shown in FIG. 3 will be described. FIG. 7 is an explanatory diagram explaining the outline of this activity value acquisition processing. Here, an example of counting the number of times a pixel is located within the person frame will be described; as described above, however, a range corresponding to the upper body within the person frame or to the floor area near the person frame may be specified and counted instead.

  The first activity value acquisition unit 41 of the activity value acquisition unit 34 performs processing for acquiring a moving object activity value for each pixel of the captured image. In particular, in the present embodiment, the moving object detection unit 33 detects persons from the captured image (frame) of the monitoring area and acquires a person frame (moving object frame) representing the region where each person exists, and the first activity value acquisition unit 41 counts, for each pixel (detection element), the number of times the pixel is located within a person frame acquired by the moving object detection unit 33, thereby acquiring the moving object activity value (counter value) for each pixel.

  This counting of the person frame is performed for all the pixels constituting the captured image, and, as shown in FIG. 7, the counter value of a pixel is incremented by 1 each time the pixel falls within a person frame. The counting is carried out continuously over a predetermined detection unit period, and the moving object activity value for each pixel is output successively for each detection unit period. In consideration of erroneous detection of person frames, the moving object activity value (counter value) may be incremented by 1 only when a pixel has fallen within a person frame a predetermined number of times in succession (for example, three times).
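  A minimal sketch of this per-pixel counting follows, including one possible reading of the consecutive-detection countermeasure (the class and parameter names are assumptions, and the debounce interpretation is ours, not spelled out in the patent):

```python
import numpy as np

class PixelActivityCounter:
    """Per-pixel moving object activity counter (illustrative sketch).

    Each video frame, every pixel inside a detected person frame has
    its streak extended; a pixel's counter is incremented only while
    its streak has reached `debounce` consecutive frames. With
    debounce=1 this reduces to the plain per-frame count."""

    def __init__(self, width, height, debounce=3):
        self.counts = np.zeros((height, width), dtype=np.int32)
        self.streak = np.zeros((height, width), dtype=np.int32)
        self.debounce = debounce

    def update(self, person_frames):
        """person_frames: list of (x0, y0, x1, y1) rectangles."""
        inside = np.zeros(self.counts.shape, dtype=bool)
        for (x0, y0, x1, y1) in person_frames:
            inside[y0:y1, x0:x1] = True
        # Extend the streak where the pixel is inside a person frame,
        # reset it to zero where it is not.
        self.streak = np.where(inside, self.streak + 1, 0)
        # Count only pixels whose streak has reached the threshold.
        self.counts += (self.streak >= self.debounce)
```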

  The second activity value acquisition unit 42 performs processing for acquiring the grid-unit moving object activity value by aggregating, in grid units, the per-pixel moving object activity values acquired by the first activity value acquisition unit 41. In particular, in this embodiment, the average of the moving object activity values of the pixels in a grid, that is, the total of the moving object activity values of all the pixels in the grid divided by the number of pixels in the grid, is used as the moving object activity value of that grid.

  In the example shown in FIG. 7, each pixel located within the person frame has a counter value of 1 (there are 20 such pixels), and the moving object activity values of the three pixels at the lower left of the drawing within the target area are 20, 60, and 60; pixels for which no numerical value is shown have a moving object activity value of zero. Therefore, the total of the moving object activity values of all the pixels in the grid is (1 × 20) + 20 + 60 + 60 = 160, and the moving object activity value of the grid, that is, the average of the moving object activity values of the pixels in the grid, is 160 / (8 × 10) = 2.
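  The grid averaging, using the worked example above as a check, might be sketched as follows (the array layout and function name are assumptions):

```python
import numpy as np

def grid_activity_value(pixel_counts, x0, y0, x1, y1):
    """Average the per-pixel activity values over one grid cell, as
    done by the second activity value acquisition unit."""
    cell = pixel_counts[y0:y1, x0:x1]
    return cell.sum() / cell.size

# Reproducing the worked example: an 8 x 10 pixel grid cell in which
# 20 pixels inside the person frame have counter value 1, and three
# other pixels have values 60, 60 and 20 (all remaining pixels are 0).
cell = np.zeros((8, 10), dtype=np.int32)
cell.flat[:20] = 1              # 20 pixels with counter value 1
cell.flat[20:23] = (60, 60, 20)
assert grid_activity_value(cell, 0, 0, 10, 8) == 2.0  # 160 / 80
```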

  Next, an activity map generation process performed by the map generation unit 35 shown in FIG. 3 will be described. FIG. 8 is an explanatory diagram showing a color table used in an activity map generation process performed by the map generation unit 35.

  The map generation unit 35 performs processing for generating an activity map that visualizes the activity status of persons based on the moving object activity values acquired by the activity value acquisition unit 34. In particular, in the present embodiment, an activity map is generated in which each display unit (pixel or grid) is colored with a display color corresponding to the magnitude of the moving object activity value.

  At this time, the display color for each display unit is determined with reference to the color table shown in FIG. 8. In this color table, R, G, and B color values are registered for each color management number (0 to 499), and color management numbers are assigned according to the counter values (moving object activity values): a large moving object activity value is given a warm display color, and a small one a cold display color.

  In particular, in this embodiment, the counter upper limit value is input by the user in the counter upper limit value input unit 57 of the analysis condition input screen shown in FIGS. 5 and 6, and color management numbers are assigned according to this counter upper limit value. For example, when the counter upper limit value is 500, all 500 colors registered in the color table are used. Specifically, when the counter value is 500 or more, which is the upper limit value, the display color of color management number 499 is used; when the counter value is 130, that of color management number 129; when the counter value is 1, that of color management number 000; and when the counter value is 0, the display is colorless.

  If the counter upper limit value is 10, then 10 of the 500 colors registered in the color table are used, for example the display colors of color management numbers 000, 049, 099, and so on. Specifically, when the counter value is 10 or more, which is the upper limit value, the display color of color management number 499 is used; when the counter value is 1, that of color management number 000; and when the counter value is 0, the display is colorless.
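  One way the counter-value-to-color-management-number assignment could be realized is sketched below; the exact rounding for intermediate values is not specified in the patent, so the linear spread here is an assumption (it does reproduce the stated examples for an upper limit of 500):

```python
def color_management_number(counter_value, counter_limit, table_size=500):
    """Map a counter value to a color management number (000-499):
    values at or above the upper limit map to the last entry, a
    counter value of 1 maps to entry 000, and 0 is drawn colorless
    (returned here as None)."""
    if counter_value <= 0:
        return None                    # counter value 0: colorless
    if counter_value >= counter_limit:
        return table_size - 1          # at or above the limit: 499
    # Linear spread of values 1..limit over entries 0..499 (assumed).
    return (counter_value - 1) * (table_size - 1) // (counter_limit - 1)

assert color_management_number(130, 500) == 129
assert color_management_number(1, 10) == 0
assert color_management_number(10, 10) == 499
```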

  Next, a graph generation process performed by the graph generation unit 36 shown in FIG. 3 will be described. FIG. 9 is an explanatory diagram for explaining the outline of the graph generation processing performed by the graph generation unit 36.

  In the present embodiment, the second activity value acquisition unit 42 aggregates (averages) in grid units the per-pixel moving object activity values acquired by the first activity value acquisition unit 41, thereby acquiring the grid-unit moving object activity values. The graph generation unit 36 then performs processing for generating a graph (transition information) regarding the transition status of the moving object activity value in the grids set as the target area, based on the grid-unit moving object activity values acquired by the second activity value acquisition unit 42.

  In particular, in the present embodiment, as shown in FIG. 9, a graph representing the transition status of the moving object activity value of each grid set as the target area is generated. In the example shown in FIG. 9, the business hours of one day are divided into time zones (of one hour), and a graph (line graph) is generated in which the moving object activity values of the time zones are connected in time series.

  In the present embodiment, the activity value acquisition unit 34 acquires the moving object activity value for each predetermined detection unit period, and the graph generation unit 36 averages the moving object activity values of the detection unit periods over each aggregation unit period to acquire the moving object activity value for each aggregation unit period, and generates a graph (transition information) representing the transition status of the moving object activity value for each aggregation unit period.

  Here, when the detection unit period and the aggregation unit period are the same, the averaging processing is unnecessary. In the example shown in FIG. 9, the aggregation unit period is one hour; if the activity value acquisition unit 34 acquires the moving object activity value with a detection unit period of one hour, that is, counts the person frame over one hour, the two periods coincide and no averaging is needed. On the other hand, when the detection unit period and the aggregation unit period differ, averaging is performed. For example, when graphing the transition status of the moving object activity value over one week, the aggregation unit period is one day; if the detection unit period is one hour, the hourly moving object activity values are averaged over each day to obtain daily moving object activity values.
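  A hedged sketch of this averaging from detection unit periods into aggregation unit periods (the function and parameter names are assumptions):

```python
def aggregate_activity(values_per_detection_period, periods_per_bucket):
    """Average moving object activity values acquired per detection
    unit period (e.g. hourly) into values per aggregation unit period
    (e.g. daily: periods_per_bucket=24). With periods_per_bucket=1 the
    two periods coincide and the values pass through unchanged."""
    out = []
    for i in range(0, len(values_per_detection_period), periods_per_bucket):
        bucket = values_per_detection_period[i:i + periods_per_bucket]
        out.append(sum(bucket) / len(bucket))
    return out

# Example: one week of hourly values averaged into daily values.
hourly = [1.0] * (24 * 7)
daily = aggregate_activity(hourly, periods_per_bucket=24)
assert len(daily) == 7
```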

  Next, the analysis result output screen displayed on the monitor 4 shown in FIG. 3 will be described. FIGS. 10 to 16 are explanatory diagrams showing the analysis result output screen displayed on the monitor 4; among these, FIGS. 10 to 13 show examples with different activity maps, and FIGS. 14 to 16 show examples with different graphs.

  The analysis result output screens shown in FIGS. 10 to 16 output the analysis results of the activity map analysis unit 32. This analysis result output screen is provided with a display button 61, a download button 62, an aggregation mode selection unit 63, a date display unit 64, and an analysis result output unit 65. The analysis result output unit 65 is provided with a map display unit 66, a graph display unit 67, and a slider (operation unit) 68.

  The display button 61 is for causing the analysis result output unit 65 to display the analysis result. The download button 62 is for obtaining the analysis result data. In particular, in the present embodiment, the analysis result output document generated by the document generation unit 39 of the PC 3 can be acquired by operating the download button 62. Thereby, a device different from the PC 3 that performs the activity map analysis processing, for example the PC 11 provided at the headquarters, can display the analysis result output document on the monitor 12 or print it with a printer for browsing. This analysis result output document will be described in detail later.

  The aggregation mode selection unit 63 is for selecting the aggregation mode of the graph to be displayed on the graph display unit 67. In particular, in the present embodiment, the aggregation modes day, week, month, and day of the week are provided. When the day, week, or month aggregation mode is selected, a graph showing the transition status of the moving object activity values over that day, week, or month is displayed; when the day-of-week aggregation mode is selected, a graph showing the transition status of the moving object activity values on the same day of each week is displayed. Further, when an aggregation mode is selected in the aggregation mode selection unit 63, a calendar screen (not shown) pops up, on which the day, week, or month for displaying the analysis result can be selected.
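  As an illustration of the day-of-week aggregation mode, a sketch that picks the same weekday out of a series of daily values, most recent first, might look like this (all names here are assumptions):

```python
from datetime import date, timedelta

def same_weekday_series(daily_values, weekday):
    """From a dict of date -> daily moving object activity value,
    pick out the values for one weekday (0 = Monday), most recent
    occurrence first, as in the day-of-week aggregation mode."""
    picked = [(d, v) for d, v in daily_values.items() if d.weekday() == weekday]
    return sorted(picked, reverse=True)

# Example: 28 consecutive days contain exactly four of each weekday.
days = {date(2014, 7, 1) + timedelta(n): float(n) for n in range(28)}
assert len(same_weekday_series(days, weekday=0)) == 4
```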

  The date display unit 64 displays the date of the analysis result shown on the analysis result output unit 65. Note that a date may also be input directly into the date display unit 64 to specify the date for which the analysis result output unit 65 displays the analysis result.

  The map display unit 66 displays the activity map generated by the map generation unit 35 in a state of being superimposed on the captured image of the monitoring area. The graph generated by the graph generation unit 36 is displayed on the graph display unit 67. The map display unit 66 and the graph display unit 67 will be described in detail later.

  The slider 68 is for adjusting the date and time of the activity map and captured image displayed on the map display unit 66; by operating the slider 68, the activity map and the captured image can be switched to those of a desired date and time. Specifically, the slider 68 is provided so as to be movable in the direction along the horizontal axis (time axis) of the graph displayed on the graph display unit 67. When the slider 68 is shifted using the input device 6 such as a mouse, the line 69 displayed on the graph display unit 67 moves, and the activity map and captured image of the date and time indicated by the line 69 are displayed on the map display unit 66.

  Next, the map display unit 66 of the analysis result output screens shown in FIGS. 10 to 16 will be described. On the map display unit 66, an activity map visualizing the activity level of persons is displayed superimposed on the captured image of the monitoring area. This activity map is colored with display colors corresponding to the magnitude of the moving object activity value based on the color table shown in FIG. 8. Note that only the grids set as the target area are displayed in display colors corresponding to the moving object activity value; grids not set as the target area are left blank without a display color.

  The examples shown in FIGS. 10 and 11 are for a box camera, and a rectangular captured image from the box camera is displayed on the map display unit 66. The examples shown in FIGS. 12 and 13 are for an omnidirectional camera, and a circular captured image from the omnidirectional camera is displayed on the map display unit 66.

  In the present embodiment, the display unit (pixel or grid) can be selected on the analysis condition input screen shown in FIGS. 5 and 6, and an activity map in pixel units or in grid units is displayed on the captured image in the map display unit 66 according to this selection. The examples shown in FIGS. 10 and 12 are cases where pixel is selected as the display unit (for convenience of explanation, one pixel is drawn at a larger size), and a pixel-unit activity map is displayed on the map display unit 66. The examples shown in FIGS. 11 and 13 are cases where grid is selected as the display unit, and a grid-unit activity map is displayed on the map display unit 66.

  Next, the graph display unit 67 of the analysis result output screens shown in FIGS. 10 to 16 will be described. On the graph display unit 67, the temporal transition status of the moving object activity value is displayed as a graph (line graph) for each grid set as the target area. In particular, in this embodiment, the aggregation mode of the graph displayed on the graph display unit 67 can be selected in the aggregation mode selection unit 63, and the horizontal axis of the graph differs depending on the aggregation mode; the vertical axis of the graph represents the moving object activity value (counter value) for each grid.

  The examples shown in FIGS. 10 to 13 are cases where the day aggregation mode is selected. In this case, when the user designates a specific day, a graph (line graph) representing the transition status of the moving object activity value on the designated day is displayed on the graph display unit 67. Specifically, the horizontal axis of the graph is the time zone: the business hours of the day are divided into time zones (of one hour), and the moving object activity values of the time zones are connected in time series. Thereby, the change in the moving object activity value in each time zone of the specific day can be compared among the grids set as the target area.

  The example shown in FIG. 14 is a case where the week aggregation mode is selected. In this case, when the user designates a specific week, a graph (line graph) representing the transition status of the moving object activity value in the designated week is displayed on the graph display unit 67. Specifically, the horizontal axis of the graph is the day (day of the week): the week is divided by day, and the daily moving object activity values are connected in time series. Thereby, the change in the moving object activity value on each day of the specific week can be compared among the grids set as the target area.

  The example shown in FIG. 15 is a case where the month aggregation mode is selected. In this case, when the user designates a specific month, a graph (line graph) representing the transition status of the moving object activity value in the designated month is displayed on the graph display unit 67. Specifically, the horizontal axis of the graph is the day: the month is divided by day, and the daily moving object activity values are connected in time series. Thereby, the change in the moving object activity value on each day of the specific month can be compared among the grids set as the target area.

  The example shown in FIG. 16 is a case where the day-of-week aggregation mode is selected. In this case, when the user designates a specific day of the week, a graph (line graph) representing the transition status of the moving object activity value on the designated day of each week is displayed on the graph display unit 67. Specifically, the horizontal axis of the graph marks each occurrence of that day of the week, and the moving object activity values of the designated day of the week are connected in time series. Thereby, the change in the moving object activity value on the specific day of the week can be compared among the grids set as the target area. In particular, the values are displayed here beginning with the most recent occurrence of the designated day of the week, so the change in the activity value on that day of the week can be traced back into the past.

  Note that the graph displayed on the graph display unit 67 is always displayed for each grid set as the target area, whereas the activity map displayed on the map display unit 66 can be displayed in either pixel units or grid units according to the selection of the display unit.

  As described above, in the present embodiment, a person is detected from the captured image of the monitoring area; based on the detection result, the moving object activity value is acquired for each predetermined detection element (pixel) obtained by dividing the captured image; the per-detection-element moving object activity values are aggregated over the target area set in the monitoring area in accordance with the user's input operation to acquire the moving object activity value of the target area; and the activity map relating to the target area is generated based on that value. Therefore, the activity status of persons in the area to which the user pays attention within the monitoring area can be grasped immediately.

  In the present embodiment, a person frame representing the region where a person detected from the captured image of the monitoring area exists is acquired, and the number of times each detection element (pixel) is located within the person frame is counted to acquire the moving object activity value for that detection element, so the moving object activity value for each detection element can be acquired by simple processing.

  In the present embodiment, since a graph (transition information) representing the transition status of the moving object activity value in the target area is displayed, how the activity status of persons has changed with the passage of time can be grasped immediately. In addition, since the activity map is displayed, the activity status of persons in the monitoring area during a specific period can be grasped immediately. By considering both the activity map and the graph, the activity status of persons in the monitoring area can be grasped from various angles.

  Further, in the present embodiment, the target area is set in units of a plurality of grids obtained by dividing the captured image in a grid shape according to the user's input operation, so that the target area can be easily specified.

  In the present embodiment, since an activity map visualizing the activity level of persons is generated in units of the grids set as the target area, the activity status of persons in the area to which the user particularly pays attention within the monitoring area can be grasped immediately.

  Moreover, in this embodiment, since an activity map visualizing the activity level of persons is also generated for each pixel of the captured image, the activity status of persons in the monitoring area can be grasped in detail.

  Further, in the present embodiment, the moving object activity values of the detection unit periods are averaged over each aggregation unit period to acquire the moving object activity value for each aggregation unit period, and transition information relating to the transition status of the moving object activity value for each aggregation unit period is generated. The moving object activity value for an arbitrary aggregation unit period (for example, a day, a week, or a month) can thus be obtained easily from the moving object activity values of a fixed detection unit period (for example, one hour). Since various transition information can be acquired by varying the aggregation unit period, the change in the activity level of persons can be grasped from many sides.

  In the present embodiment, the analysis result output screen is provided with the slider (operation unit) 68 for adjusting, by the user's operation, the date and time of the activity map displayed on the map display unit 66. Since the date and time of the displayed map image can be adjusted, the activity map and captured image at a required date and time can be viewed easily. Further, by operating the slider 68 continuously, the activity map and the captured image change with the passage of time, so how the activity status of persons and the actual state of the monitoring area have changed can be grasped immediately.

  In particular, in the present embodiment, the slider 68 is provided so as to be movable in the direction along the horizontal axis (time axis) of the graph displayed on the graph display unit 67, and the pointing unit (line) 69 indicating the date and time on the graph moves according to the movement operation of the slider 68. Therefore, the activity status of persons and the actual state of the monitoring area at a date and time of interest on the graph displayed on the graph display unit 67 can easily be confirmed with the activity map and captured image on the map display unit 66.

  In this embodiment, since the activity map is displayed superimposed on the captured image of the monitoring area, the activity status of persons can be checked while being compared with the actual state of the monitoring area shown in the captured image. In addition, since the transition status of the moving object activity value is displayed as a graph, the temporal change of the moving object activity value becomes even easier to grasp. And since the activity map and the graph are arranged side by side, comparing them is easy, and the activity status of persons in the monitoring area can be grasped from various perspectives.

  Next, another example of the graph displayed on the graph display unit 67 of the analysis result output screen displayed on the monitor 4 will be described. FIG. 17 is an explanatory diagram illustrating another example of the graph displayed on the graph display unit 67.

  In the present embodiment, the graph generation unit 36 performs processing for generating a graph (transition information) relating to the transition status of the cumulative value obtained by accumulating the moving object activity values of the plurality of grids set as the target area. Thereby, as shown in FIG. 17, an area graph can be displayed on the graph display unit 67 of the analysis result output screen.

  In this area graph, graphs (line graphs) in which the moving object activity values of each grid set in the target area are connected in time series are displayed in an accumulated state. In the example shown in FIG. 17, as in the examples shown in FIGS. 10 to 13, the day aggregation mode is selected, the horizontal axis of the graph is the time zone, and a line graph connecting the moving object activity values for each time zone on the specified day in time series is displayed.
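
  A minimal sketch of such an area graph follows, assuming the per-grid, per-time-zone moving object activity values have already been aggregated; matplotlib's stackplot draws the accumulated (stacked) line graph, and the sample values are invented for illustration.

```python
import matplotlib.pyplot as plt

time_zones = list(range(9, 21))  # hourly time zones of the specified day
grid_1 = [5, 8, 12, 15, 11, 9, 14, 18, 16, 10, 7, 4]  # illustrative values
grid_2 = [2, 4, 6, 9, 12, 10, 8, 7, 9, 6, 5, 3]

fig, ax = plt.subplots()
ax.stackplot(time_zones, grid_1, grid_2, labels=["Grid 1", "Grid 2"])
ax.set_xlabel("Time zone")
ax.set_ylabel("Accumulated moving object activity value")
ax.legend(loc="upper left")
plt.show()
```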

  In FIG. 17, the area graph is composed of line graphs, but it may instead be displayed as a stacked bar graph.

  As described above, in the present embodiment, since an area graph relating to the transition state of the cumulative value obtained by accumulating the moving object activity values of the grids specified in the target area is displayed, how the activity status of persons across the combined target areas has changed over time can be grasped immediately.

  In this embodiment, the PC 3 is provided with a Web server function, and the analysis condition input screens shown in FIGS. 5 and 6 and the analysis result output screens shown in FIGS. 10 to 17 are displayed on a Web browser. In this way, the analysis condition input and analysis result output functions can be realized by a general-purpose Web browser installed in an apparatus different from the PC 3 that performs the activity map analysis process, for example, the PC 11 provided at the headquarters.
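
  One way such a Web server function could expose analysis results to a browser on another device is sketched below with Flask; the route, payload, and port are purely hypothetical and are not described in the patent.

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/analysis/result")
def analysis_result():
    # A stub payload standing in for the analysis result output screen;
    # a browser on another device (e.g., the headquarters PC) fetches it.
    return jsonify({"target_areas": ["area 1", "area 2"],
                    "activity_values": [0.42, 0.77]})

if __name__ == "__main__":
    app.run(port=8080)
```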

  Next, the analysis result output document generated by the document generation unit 39 shown in FIG. 3 will be described. FIGS. 18 to 21 are explanatory diagrams illustrating the analysis result output document generated by the document generation unit 39.

  In the present embodiment, the document generation unit 39 shown in FIG. 3 performs processing for generating an analysis result output document, and this analysis result output document can be viewed on the PC 3 that performs the activity map analysis process. In particular, in this embodiment, when the analysis result output screens shown in FIGS. 10 to 17 are browsed on a device different from the device that performs the activity map analysis process, for example, the PC 11 at the headquarters, the analysis result output document can be acquired by operating the download button 62 provided on the analysis result output screen, and can then be viewed on a monitor or printed.

  As shown in FIGS. 18 to 21, the analysis result output document is provided with a graph display unit 71 and a map display unit 72. The graph generated by the graph generation unit 36 is displayed on the graph display unit 71, and the map display unit 72 displays the activity map generated by the map generation unit 35 superimposed on the captured image of the monitoring area.

  In particular, the example shown in FIG. 18 is a case where the day aggregation mode is selected, as in the analysis result output screens shown in FIGS. 10 to 13. The graph display unit 71 displays, for each grid specified as the target area, a graph (line graph) in which the moving object activity values for each time zone are connected in time series. The map display unit 72 displays an activity map for each time zone. In FIG. 18, only the activity maps for the first two time zones are shown; using a horizontal scroll button (not shown), the activity maps for the other time zones are displayed in the same manner following these.

  In addition, the example shown in FIG. 19 is a case where the week aggregation mode is selected, as in the analysis result output screen shown in FIG. 14. The graph display unit 71 displays, for each grid specified in the target area, a graph (line graph) in which the moving object activity values for each day of the week are connected in time series. An activity map for each day is displayed on the map display unit 72. FIG. 19 shows only the activity maps for the first two days of the week; using a horizontal scroll button (not shown), the activity maps for the other days of the week are displayed in the same manner following these.

  Further, the example shown in FIG. 20 is a case where the month aggregation mode is selected, as in the analysis result output screen shown in FIG. 15. The graph display unit 71 displays, for each grid designated as the target area, a graph (line graph) in which the moving object activity values for each date are connected in time series. An activity map for each day is displayed on the map display unit 72. In FIG. 20, only the activity maps for the first two days are shown, but the activity maps for the other days are displayed in the same manner following these.

  Further, the example shown in FIG. 21 is a case where the day aggregation mode is selected, as in the analysis result output screen shown in FIG. 16. The graph display unit 71 displays, for each grid specified in the target area, a graph (line graph) in which the moving object activity values of the same day of the week are connected in time series week by week. In particular, in the example illustrated in FIG. 21, unlike the analysis result output screen illustrated in FIG. 16, the month and the day of the week are specified, and the graph for the specified day of the week in the specified month is displayed. An activity map for each day is displayed on the map display unit 72. In FIG. 21, only the activity maps for the first two days are shown; using a horizontal scroll button (not shown), the activity maps for the other days are displayed in the same manner following these.

(Second Embodiment)
Next, an activity map analysis system according to the second embodiment will be described. FIG. 22 is an explanatory diagram showing an analysis condition input screen in the activity map analysis system according to the second embodiment. The points not particularly mentioned here are the same as in the first embodiment.

  In the first embodiment, the user designates the target area in units of a plurality of grids obtained by dividing the captured image into a grid shape, whereas in the second embodiment, the user designates the target area in an arbitrary shape. The analysis condition setting unit 37 performs processing for setting the target area in the arbitrary shape designated by the user in accordance with the user's input operation.

  In the second embodiment, as shown in FIG. 22, an arbitrary area setting mode selection unit 81 is provided in the analysis condition input unit 53 of the analysis condition input screen. Further, the analysis condition input screen is provided with a target area input unit 82 and a target area registration unit 83. The rest is the same as in the examples shown in FIGS. 5 and 6.

  The arbitrary area setting mode selection unit 81 selects the arbitrary area setting mode, in which the target area is set in an arbitrary shape. When the arbitrary area setting mode is selected, the grid division number input unit 58 naturally becomes unable to accept input of the number of grid divisions.

  The target area input unit 82 displays an image captured by the camera 1 and allows the user to input the target area on the captured image. For example, when the target area is set as a polygon, the vertices of the target area may be input using the input device 6 such as a mouse. When the target area is set as a circle, a center and a diameter may be input. Alternatively, a line representing the outer periphery of the target area may be drawn with the mouse, and the target area may be set from the mouse trajectory. When the input of the target area is completed, the serial number of the target area is displayed on the image representing the target area.
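
  Converting such user input into a pixel mask might look like the following sketch, where cv2.fillPoly and cv2.circle are stand-ins for the target area setting processing; the vertex and circle values are invented for illustration.

```python
import cv2
import numpy as np

def polygon_mask(shape, vertices):
    """Boolean mask of a polygonal target area from its clicked vertices."""
    mask = np.zeros(shape[:2], dtype=np.uint8)
    cv2.fillPoly(mask, [np.asarray(vertices, dtype=np.int32)], 255)
    return mask > 0

def circle_mask(shape, center, diameter):
    """Boolean mask of a circular target area from its center and diameter."""
    mask = np.zeros(shape[:2], dtype=np.uint8)
    cv2.circle(mask, center, diameter // 2, 255, thickness=-1)
    return mask > 0

frame_shape = (480, 640, 3)
area_1 = polygon_mask(frame_shape, [(100, 100), (300, 120), (260, 300)])
area_2 = circle_mask(frame_shape, center=(450, 240), diameter=120)
```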

  The target area registration unit 83 is provided with a serial number display field and a name input field. The serial number (No.) of each target area is displayed in the serial number display field. In the name input field, the user inputs the name of each target area using the input device 6 such as a keyboard.

  When the target area is set in this way, the activity value acquisition unit 34, the map generation unit 35, and the graph generation unit 36 perform the same processing as in the first embodiment based on this target area. An analysis result output screen similar to those shown in FIGS. 10 to 17 is displayed on the monitor, and an analysis result output document similar to those shown in FIGS. 18 to 21 is generated.

  As described above, in the second embodiment, the target area is set in an arbitrary shape designated by the user in accordance with the user's input operation. Therefore, the target area can be set to an optimal shape matching the actual state of the monitoring area.

  As described above, the present invention has been explained based on specific embodiments, but these embodiments are merely examples, and the present invention is not limited by them. In addition, not all the components of the activity map analysis device, the activity map analysis system, and the activity map analysis method according to the present invention shown in the above embodiments are necessarily essential, and they can be used selectively, at least as long as the selection does not depart from the scope of the present invention.

  For example, in the present embodiment, an example of a store such as a convenience store has been described, but the present invention is not limited to such stores and can be widely applied to any place where it is useful to grasp the activity status of persons present in the monitoring area.

  Further, in the present embodiment, an example in which the target moving object is a person has been described, but the present invention can also be applied to moving objects other than persons, for example, vehicles such as automobiles and bicycles, for the purpose of grasping their activity status.

  In the present embodiment, the first activity value acquisition unit 41 acquires the moving object activity value for each pixel, the pixel serving as the predetermined detection element into which the captured image is divided. However, the detection element is not limited to a pixel, which is the minimum unit of a captured image; a larger unit may also be used.

  Further, in the present embodiment, the first activity value acquisition unit 41 counts, for each detection element (pixel), the number of times the element is positioned within the moving object frame (person frame), and thereby acquires the moving object activity value for each detection element. However, the moving object activity value acquisition process is not limited to this method. For example, the positions of a moving object detected from the captured images (frames), such as the centers of the moving object frames, may be connected in time series to generate a flow line, and the number of times this flow line passes may be counted for each detection element to acquire the moving object activity value for each detection element.
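
  The flow-line variant can be sketched as follows, assuming the moving object frame centers of one track are given; drawing each segment onto a counter image with cv2.line is one simple way to count, per detection element, how many times the flow line passes, and is not the patent's specified implementation.

```python
import cv2
import numpy as np

def accumulate_flow_line(counts, centers):
    """Increment every detection element (pixel) the flow line crosses."""
    line_mask = np.zeros(counts.shape, dtype=np.uint8)
    for p, q in zip(centers, centers[1:]):
        cv2.line(line_mask, p, q, color=1, thickness=1)
    counts += line_mask.astype(counts.dtype)  # at most one count per track
    return counts

counts = np.zeros((480, 640), dtype=np.int32)
track = [(50, 400), (120, 350), (200, 310), (320, 300)]  # invented centers
counts = accumulate_flow_line(counts, track)
```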

  In the present embodiment, the magnitude of the moving object activity value is expressed by color shades in the activity map, but it may instead be expressed by the shading of a single color, or by figures, characters, symbols, and the like.
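
  The single-color alternative can be sketched as follows, assuming normalized activity values; only the intensity of one hue varies with the moving object activity value, rather than the hue itself, and the color choice is illustrative.

```python
import numpy as np

def single_color_shading(activity, color_bgr=(0, 0, 255)):
    """Render activity values in [0, 1] as shades of a single color."""
    shade = activity[..., None] * np.asarray(color_bgr, dtype=np.float32)
    return shade.astype(np.uint8)

layer = single_color_shading(np.random.rand(480, 640).astype(np.float32))
```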

  In this embodiment, as shown in FIGS. 9 to 21, a graph, in particular a line graph, is displayed as the transition information relating to the transition state of the moving object activity value in the target area, but a table may be displayed instead, or a graph in another format, such as a bar graph, may be displayed.

  Further, in the present embodiment, as shown in FIG. 4 and elsewhere, the target area is set from rectangular divided areas (grids) obtained by dividing the captured image into a grid shape, but the shape of the divided areas is not limited to a rectangle; for example, a hexagon (honeycomb) is also possible.

  In this embodiment, as shown in FIGS. 10 to 17, the map image and the graph image are arranged side by side on the analysis result output screen, but the map image and the graph image may instead be displayed on separate screens that can be switched between.

  In the present embodiment, the activity map relating only to the target area set in the monitoring area is displayed, but the activity map of the entire monitoring area can also be displayed.

  Further, in this embodiment, the moving object detection unit 33 is provided in the PC 3 that performs the activity map analysis, but part or all of the moving object detection function may be integrated with the camera 1 to configure an imaging device with a moving object detection function.

  Further, in this embodiment, the processing necessary for the activity map analysis is performed by the PC 3 provided in the store, but this necessary processing may instead be performed by the PC 11 provided at the headquarters, or by the cloud computer 21 constituting a cloud computing system. The necessary processing may also be shared among a plurality of information processing apparatuses, with information transferred between them via a communication medium such as an IP network or a LAN. In this case, the activity map analysis system is configured by the plurality of information processing apparatuses that share the necessary processing.

  In such a configuration, it is preferable that at least the moving object detection process be performed by an apparatus provided in the store. With this configuration, the amount of information required for the remaining processing can be reduced, so even if the remaining processing is performed by an information processing apparatus installed at a location different from the store, for example, the PC 11 installed at the headquarters, the communication load can be kept low, which makes it easy to operate the system in a wide-area network configuration.

  In addition, the cloud computer 21 may be caused to perform at least the processes with a large calculation amount among the processes necessary for the activity map analysis, for example, the moving object detection process and the activity value acquisition process. With this configuration, the remaining processing requires only a small amount of computation, so a high-speed information processing device is unnecessary on the user side such as the store; the information processing device constituting the sales information management device installed in the store can also serve this purpose as an extended function, reducing the cost borne by the user.

  In addition, the cloud computer 21 may perform all of the necessary processing, or at least the output information generation process among the necessary processes may be assigned to the cloud computer 21. With this configuration, in addition to the PCs 3 and 11 provided in the store and at the headquarters, a portable terminal such as a smartphone can display the analysis result, so the analysis result can be confirmed at any place, such as the store or the headquarters, or while away from them.

  In the present embodiment, the analysis result is output to the monitor 4 of the PC 3 installed in the store, but an analysis result output device can be provided separately from the PC 3. For example, as described above, in addition to using the PC 11 installed at the headquarters or the smartphone 22 as an analysis result browsing apparatus, a function as an analysis result browsing apparatus can be added to the sales information management apparatus installed in the store. The analysis result can also be output by a printer.

  In this embodiment, the analysis condition input screen and the analysis result output screen are displayed on the monitor 4 of the PC 3 installed in the store, but an information processing apparatus that performs the necessary input and output, in particular a portable information processing apparatus such as a tablet terminal, may be provided separately from the PC 3 that performs the activity map analysis process.

  The activity map analysis device, the activity map analysis system, and the activity map analysis method according to the present invention have the effect of making it possible to immediately grasp the activity status of moving objects in the area of the monitoring area that the user pays attention to, and to immediately grasp how that activity status has changed over time. They are therefore useful as an activity map analysis device, an activity map analysis system, and an activity map analysis method that analyze the activity status of moving objects in a monitoring area and output an activity map visualizing that activity status.

1 Camera
2 Recorder
3 PC (activity map analysis device)
4 Monitor (display device)
6 Input device
11 PC
12 Monitor
21 Cloud computer
22 Smartphone
31 Monitoring unit
32 Activity map analysis unit
33 Moving object detection unit
34 Activity value acquisition unit
35 Map generation unit
36 Graph generation unit (transition information generation unit)
37 Analysis condition setting unit (target area setting unit)
38 GUI control unit
39 Document generation unit (output information generation unit)
41 First activity value acquisition unit
42 Second activity value acquisition unit
43 Input information acquisition unit
44 Screen information generation unit (output information generation unit)
45 Printer
66 Map display unit
67 Graph display unit
68 Slider (operation unit)

Claims (7)

  1. An activity map analysis device that analyzes an activity status of a moving object in a monitoring area and outputs an activity map visualizing the activity status of the moving object, the device comprising:
    a moving object detection unit that detects a moving object from a captured image of the monitoring area;
    an activity value acquisition unit that acquires, based on the detection result of the moving object detection unit, a moving object activity value representing the degree of activity of the moving object in the monitoring area for each predetermined unit period;
    a target area setting unit that sets a target area in the monitoring area in accordance with a user input operation;
    a map generation unit that generates the activity map relating to the target area set by the target area setting unit, based on the moving object activity value of the monitoring area acquired by the activity value acquisition unit; and
    an output information generation unit that generates output information having a map image in which the activity map generated by the map generation unit is displayed superimposed on the captured image of the monitoring area.
  2. The activity map analysis device according to claim 1, wherein the moving object detection unit acquires a moving object frame representing an area where a moving object detected from the captured image of the monitoring area exists, and
    the activity value acquisition unit counts, for each detection element, the number of times the detection element is positioned within a set range based on the moving object frame acquired by the moving object detection unit, thereby acquiring the moving object activity value for each detection element.
  3. The activity map analysis device according to claim 1, further comprising a transition information generation unit that generates, based on the moving object activity value acquired by the activity value acquisition unit, transition information relating to the transition status of the moving object activity value in the target area,
    wherein the output information generation unit generates output information including the map image and the transition information.
  4. The activity map analysis device according to any one of claims 1 to 3, wherein the target area setting unit sets the target area in units of a plurality of grids obtained by dividing the captured image into a grid shape, in accordance with a user input operation.
  5. The activity map analysis device according to any one of claims 1 to 3, wherein the target area setting unit sets the target area in an arbitrary shape designated by the user in accordance with a user input operation.
  6. An activity map analysis system that analyzes an activity status of a moving object in a monitoring area and outputs an activity map visualizing the activity status of the moving object, the system comprising:
    a camera that photographs the monitoring area; and
    a plurality of information processing devices,
    wherein any of the plurality of information processing devices includes:
    a moving object detection unit that detects a moving object from a captured image of the monitoring area;
    an activity value acquisition unit that acquires, based on the detection result of the moving object detection unit, a moving object activity value representing the degree of activity of the moving object in the monitoring area for each predetermined unit period;
    a target area setting unit that sets a target area in the monitoring area in accordance with a user input operation;
    a map generation unit that generates the activity map relating to the target area set by the target area setting unit, based on the moving object activity value of the monitoring area acquired by the activity value acquisition unit; and
    an output information generation unit that generates output information having a map image in which the activity map generated by the map generation unit is displayed superimposed on the captured image of the monitoring area.
  7. An activity map analysis method for analyzing an activity status of a moving object in a monitoring area and causing an information processing device to output an activity map visualizing the activity status of the moving object, the method comprising:
    detecting a moving object from a captured image of the monitoring area;
    acquiring, based on the detection result of the detecting step, a moving object activity value representing the degree of activity of the moving object in the monitoring area for each predetermined unit period;
    setting a target area in the monitoring area in accordance with a user input operation;
    generating the activity map relating to the target area based on the moving object activity value of the monitoring area acquired in the acquiring step; and
    generating output information having a map image in which the activity map is displayed superimposed on the captured image of the monitoring area.
JP2014161046A 2014-08-07 2014-08-07 Activity map analyzer, activity map analysis system, and activity map analysis method Active JP6226240B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2014161046A JP6226240B2 (en) 2014-08-07 2014-08-07 Activity map analyzer, activity map analysis system, and activity map analysis method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2014161046A JP6226240B2 (en) 2014-08-07 2014-08-07 Activity map analyzer, activity map analysis system, and activity map analysis method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
JP2013270927 Division 2013-12-27

Publications (2)

Publication Number Publication Date
JP2015127940A JP2015127940A (en) 2015-07-09
JP6226240B2 true JP6226240B2 (en) 2017-11-08

Family

ID=53837899

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2014161046A Active JP6226240B2 (en) 2014-08-07 2014-08-07 Activity map analyzer, activity map analysis system, and activity map analysis method

Country Status (1)

Country Link
JP (1) JP6226240B2 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000324445A (en) * 1999-05-10 2000-11-24 Nri & Ncc Co Ltd Device and method for recording customer behavior image data
JP2002133075A (en) * 2000-10-23 2002-05-10 Shimizu Corp System for evaluating level of interest in commodity
JP2003324726A (en) * 2002-05-07 2003-11-14 Itochu Corp Object detecting apparatus using supervisory camera
US8098888B1 (en) * 2008-01-28 2012-01-17 Videomining Corporation Method and system for automatic analysis of the trip of people in a retail space using multiple cameras
JP2009265830A (en) * 2008-04-23 2009-11-12 Toshiba Corp In-store behavior analyzing device and method
JP5715863B2 (en) * 2011-03-25 2015-05-13 セコム株式会社 Image processing device
JP5356615B1 (en) * 2013-02-01 2013-12-04 パナソニック株式会社 Customer behavior analysis device, customer behavior analysis system, and customer behavior analysis method

Also Published As

Publication number Publication date
JP2015127940A (en) 2015-07-09

Similar Documents

Publication Publication Date Title
JP4299138B2 (en) Techniques to facilitate the use of optotype tracking data
EP2230629A2 (en) A system and method for capturing, storing, analyzing and displaying data relating to the movements of objects
AU2004233453B2 (en) Recording a sequence of images
JP5740318B2 (en) Image processing system, image processing method, and program
US9070227B2 (en) Particle based visualizations of abstract information
JP2016001505A5 (en) POS system, registration device, and program
JP4876687B2 (en) Attention level measuring device and attention level measuring system
JP4932161B2 (en) Viewer information measuring device
US9191633B2 (en) Tracking assistance device, tracking assistance system and tracking assistance method
JP2011029737A (en) Surveillance image retrieval apparatus and surveillance system
Andrienko et al. Visual analytics for understanding spatial situations from episodic movement data
JP4865811B2 (en) Viewing tendency management apparatus, system and program
JP2005309951A (en) Sales promotion support system
US9870684B2 (en) Information processing apparatus, information processing method, program, and information processing system for achieving a surveillance camera system
JP6360885B2 (en) Viewing angle image manipulation based on device rotation
RU2012155513A (en) Method and system for searching images in interactive purchase mode
JP2011248836A (en) Residence detection system and program
KR101140533B1 (en) Method and system for recommending a product based upon skin color estimated from an image
US10049283B2 (en) Stay condition analyzing apparatus, stay condition analyzing system, and stay condition analyzing method
JP2008112401A (en) Advertisement effect measurement apparatus
JP2007267294A (en) Moving object monitoring apparatus using a plurality of cameras
ElSayed et al. Situated analytics
CN103604371A (en) Mobile terminal and object measurement method thereof
JP4717934B2 (en) Relational analysis method, relational analysis program, and relational analysis apparatus
US20120209715A1 (en) Interaction with networked screen content via motion sensing device in retail setting

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20160811

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20170706

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20170711

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20170906

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20170919

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20170928

R151 Written notification of patent or utility model registration

Ref document number: 6226240

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R151