US20190378279A1 - Video image processing device, video image analysis system, method, and program - Google Patents

Video image processing device, video image analysis system, method, and program

Info

Publication number
US20190378279A1
Authority
US
United States
Prior art keywords
image
video image
display
trajectory
designated
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/489,374
Other languages
English (en)
Inventor
Yasufumi Hirakawa
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Assigned to NEC CORPORATION. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HIRAKAWA, YASUFUMI
Publication of US20190378279A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19678 User interface
    • G08B13/19682 Graphic User Interface [GUI] presenting system data to the user, e.g. information on a screen helping a user interacting with an alarm system
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19608 Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and/or velocity to predict its new position
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272 Means for inserting a foreground image in a background image, i.e. inlay, outlay
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30232 Surveillance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30241 Trajectory

Definitions

  • the present invention relates to a video image processing device, a video image processing method, and a video image processing program for processing video images.
  • the present invention also relates to a video image analysis system for analyzing video images.
  • there is a video image analysis technique for analyzing, with a computer, a video image obtained from a camera device, and issuing an alert.
  • an example is a video image analysis technique for detecting a position or a motion of an object from a video image, and generating an alert when the detection result satisfies a predetermined condition.
  • Such a video image analysis technique is utilized at a control center where an operator is present, for example, to check a video image based on which an alert has been issued, and take appropriate measures in accordance with the issued alert.
  • the operation to check the video image based on which the alert has been issued is normally performed while the current video image is being monitored at the control center.
  • an operator checks the past video images to determine whether the alert is a false alert. If the alert is not a false alert, the operator acquires necessary information, and takes measures such as sending the information as an appropriate alert to a predetermined address, for example. In doing so, the operator selects, from the past video images, information from which a check can be made to determine whether the alert is a false alert, and, if it is not, acquires features of the target of the alert, such as an intruder or a dangerous moving object (any of these will be hereinafter referred to as an object).
  • Patent Literatures 1 to 3 disclose example techniques.
  • Patent Literature 1 discloses that, on the display screen of a display device that displays a moving image, the trajectory of movement of an object is superimposed and displayed on images sequentially obtained from the imaging device that is the source of the moving image. Patent Literature 1 also discloses that designation of the movement trajectory being displayed is received from a user, the movement trajectory is displayed in a different display mode from that for the other movement trajectories, and an object detection region is set so that the movement trajectory does not intersect with any other movement trajectory.
  • Patent Literature 2 discloses an example in which different persons and a trajectory of one vehicle are shown in a video image.
  • In Patent Literature 3, displaying a composite image obtained by combining frame images is disclosed as a method of determining a behavior from displacement of a part.
  • Patent Literature 3 also discloses an example of combining images by superimposing consecutive frames on one another, and an example of indicating a moving image and a movement trajectory of a target part with dots and arrows in a composite image.
  • Patent Literature 1 Japanese Patent Application Laid-Open No. 2015-018340
  • Patent Literature 2 Japanese Patent Application Laid-Open No. 2009-015827
  • Patent Literature 3 Japanese Patent Application Laid-Open No. 2012-133666
  • The problems with the method disclosed in Patent Literature 2 are basically the same as those with Patent Literature 1. That is, by the method disclosed in Patent Literature 2, a movement trajectory is always simply superimposed on the current image. While it is possible to observe the movement paths of objects, it is not possible to know the recent and current behaviors of an object shown only in past video images, the situation in the surrounding area, changes in features, and the like.
  • By the method disclosed in Patent Literature 3, consecutive frames are superimposed on one another, so that the past states of the object or a part of the object can be checked. However, even if this method is simply applied to a surveillance video image, the image currently being displayed becomes complicated, and it is difficult to obtain the necessary information.
  • the present invention has been made in view of the above problems, and aims to provide a video image processing device, a video image analysis system, a video image processing method, and a video image processing program that enable a user to quickly grasp the situations of an object at two or more points of time in a video image.
  • a video image processing device characteristically includes a display control means that causes a display unit to display a trajectory indicating a change in a position of an object in a video image, wherein the display control means acquires a first request designating a point in the trajectory being displayed, and displays a designated image including the object as it was located at the designated point by superimposing the designated image on an arbitrary background image being displayed on the display unit.
  • a video image analysis system characteristically includes: a tracking means that analyzes a video image, and continuously acquires a position of a tracking target object from the video image; a storage means that stores position information indicating the position acquired by the tracking means in association with identification information about the image from which the position was acquired in the video image; and a display control means that causes a display unit to display a trajectory indicating a change in the position of the object in the video image, based on the information stored in the storage means, wherein the display control means acquires a first request designating a point in the trajectory being displayed, and displays a designated image including the object as it was located at the designated point by superimposing the designated image on an arbitrary background image being displayed on the display unit.
  • a video image processing method characteristically includes: causing a display unit to display a trajectory indicating a change in a position of an object in a video image; acquiring a first request designating a point in the trajectory being displayed; and displaying a designated image including the object as it was located at the designated point by superimposing the designated image on an arbitrary background image being displayed on the display unit.
  • a video image processing program characteristically causes a computer to: perform a process of causing a display unit to display a trajectory indicating a change in a position of an object in a video image; in the process, acquire a first request designating a point in the trajectory being displayed; and display a designated image including the object as it was located at the designated point by superimposing the designated image on an arbitrary background image being displayed on the display unit.
  • a user can quickly grasp the situations of an object at two or more points of time in a video image.
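  • As a minimal sketch of this operation in Python, assuming per-point analysis records that carry a position, an image identifier, and an object region (`x`, `y`, `frame_id`, `bbox`), plus a hypothetical `video_store.frame()` accessor (none of these names come from the application itself), the first request could be handled as follows:

```python
import math

def on_trajectory_click(click_xy, records, video_store, background, hit_radius=10):
    """Handle the 'first request': superimpose the designated image of the
    object, as it was located at the designated point, on the background."""
    # Find the trajectory point nearest to the designated (clicked) position.
    nearest = min(records,
                  key=lambda r: math.hypot(r.x - click_xy[0], r.y - click_xy[1]))
    if math.hypot(nearest.x - click_xy[0], nearest.y - click_xy[1]) > hit_radius:
        return background               # the designation missed the trajectory
    # Cut the object region out of the image identified by the record ...
    frame = video_store.frame(nearest.frame_id)   # hypothetical accessor
    x0, y0, x1, y1 = nearest.bbox       # object region stored at analysis time
    cutout = frame[y0:y1, x0:x1]
    # ... and superimpose it on the background at its position at that time.
    out = background.copy()
    out[y0:y1, x0:x1] = cutout
    return out
```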
  • FIG. 1 is a diagram schematically showing the configuration of a video image analysis system of a first embodiment.
  • FIG. 2 is a block diagram showing an example configuration of the video image analysis system of the first embodiment.
  • FIG. 3 is an explanatory diagram showing an example of detection in a tracking unit 103 .
  • FIG. 4 is an explanatory diagram showing an example expression of a trajectory.
  • FIG. 5 is an explanatory diagram showing an example of correspondence between a tracking line and an object.
  • FIG. 6 is an explanatory diagram showing an example of attaching information to a tracking line.
  • FIG. 7 is an explanatory diagram showing an example of attaching information to a tracking line.
  • FIG. 8 is a flowchart showing an example operation of a video image analysis device 2 of the first embodiment.
  • FIG. 9 is a flowchart showing an example operation of a video image processing device 4 of the first embodiment.
  • FIG. 10 is a flowchart showing an example of display control in a display control unit 105 .
  • FIG. 11 is an explanatory view showing an example of a display image and an example of a superimposed image at the time of the display image.
  • FIG. 12 is an explanatory view showing an example of a display image and an example of a superimposed image at the time of the display image.
  • FIG. 13 is an explanatory view showing an example of a display image and an example of a superimposed image at the time of the display image.
  • FIG. 14 is an explanatory view showing an example of a display image and an example of a superimposed image at the time of the display image.
  • FIG. 15 is an explanatory view showing an example of a display image and an example of a superimposed image at the time of the display image.
  • FIG. 16 is an explanatory view showing an example of a display image and an example of a superimposed image at the time of the display image.
  • FIG. 17 is an explanatory view showing an example of a display image and an example of a superimposed image at the time of the display image.
  • FIG. 18 is a block diagram showing an example configuration of a video image analysis system of a second embodiment.
  • FIG. 19 is a flowchart showing an example of display control in a display control unit 205 .
  • FIG. 20 is a flowchart showing an example of event processing.
  • FIG. 21 is a flowchart showing an example of event processing.
  • FIG. 22 is a flowchart showing an example of event processing.
  • FIG. 23 is a flowchart showing an example of event processing.
  • FIG. 24 is a flowchart showing an example of event processing.
  • FIG. 25 is a flowchart showing an example of event processing.
  • FIG. 26 is an explanatory diagram showing an example of a composite pattern of display images.
  • FIG. 27 is an explanatory view showing an example of a display image and an example of a superimposed image at the time of the display image.
  • FIG. 28 is an explanatory view showing an example of a display image and an example of a superimposed image at the time of the display image.
  • FIG. 29 is an explanatory view showing an example of a display image and an example of a superimposed image at the time of the display image.
  • FIG. 30 is a schematic block diagram showing an example configuration of a computer according to an embodiment of the present invention.
  • FIG. 31 is a block diagram showing the outline of a video image processing device of the present invention.
  • FIG. 1 is a diagram schematically showing the configuration of a video image analysis system of a first embodiment.
  • the video image analysis system 100 includes a video image input device 1 , a video image analysis device 2 , a storage device 3 , a video image processing device 4 , and a display device 5 .
  • the video image input device 1 inputs an image to be analyzed.
  • the video image input device 1 is formed with an imaging device capable of capturing a moving image, for example. Although only one video image input device 1 is shown in FIG. 1 , more than one video image input device 1 may be used.
  • a “video image” means a moving image formed with frame images corresponding to respective consecutive frames.
  • a “video image” may not be a so-called moving image, but may be a still image group including two or more still images accompanied by information about the imaging times and the imaging regions, a moving image or a composite image group formed with still images included in the above described still image group, or the like.
  • an image in a video image may not be a so-called frame image in a moving image, but may be an image included in the above mentioned “video image”.
  • an image at a certain time point in a video image may not be the frame image at the time point in a so-called moving image, but may be the image of the time corresponding to the time point on a predetermined time axis included in the above mentioned “video image”.
  • the video image analysis device 2 analyzes a video image that has been input thereto (this video image will be hereinafter referred to as the “input video image”), stores the analysis result into the storage device 3 , and outputs a warning or any other message as necessary.
  • the video image analysis device 2 analyzes the input video image, detects and tracks a predetermined object such as a moving object appearing in the input video image, and continuously acquires the position of the object from the input video image. For example, the video image analysis device 2 may acquire the position of the object at each time point in the input video image. The video image analysis device 2 also generates analysis information in which position information indicating the acquired position of the object is associated with identification information about the image of the time when the position was acquired.
  • the object to be tracked will be also referred to as the “tracking target object” in some cases.
  • the video image analysis device 2 also analyzes the input video image, and may further detect a correspondence time which is the time point of the image in the input video image from which the position of the tracking target object has been acquired, the features of the tracking target object in the image in the input video image at the time point, the presence or absence of another object related to the tracking target object, and the features thereof if there is such an object.
  • Examples of the features of an object are the states of the object, such as the direction, the size, and the motion of the object or a predetermined part thereof, changes in the features of the object in the input video image, such as the states, the clothes, and the possessions, other matters related to the object and another related object (such as the presence/absence of another object, and the classification thereof), and the like.
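  • As one way to hold such analysis information, each record might pair the acquired position with the identification information of the image and the detected features. The field names below are assumptions made for illustration, not terms from this description:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AnalysisRecord:
    """One analysis result for a tracking target object in one image."""
    object_id: int          # identity of the tracking target object
    frame_id: int           # identification information of the image
    timestamp: float        # corresponding time in the input video image
    x: float                # acquired position (image coordinates)
    y: float
    features: dict = field(default_factory=dict)  # e.g. direction, size, motion
    related_object_id: Optional[int] = None       # another related object, if any
```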
  • the video image analysis device 2 may also output a predetermined message, in accordance with analysis results including those items detected from the input video image.
  • the video image analysis device 2 may determine whether the position of the object and the other items detected from the input video image satisfy a predetermined condition, for example. If the video image analysis device 2 determines that the predetermined condition is satisfied, the video image analysis device 2 may output the predetermined message.
  • the video image analysis device 2 may output the predetermined message to a predetermined display device, a user terminal, or a predetermined terminal in the control center, for example.
  • the storage device 3 stores the input video image and information indicating the results of the video image analysis conducted by the video image analysis device 2 .
  • the storage device 3 may store not only the input video image but also, for each tracking target object, analysis information that associates information indicating analysis results including the position acquired from the input video image and other detected items, with information (such as identification information) indicating the image from which the position has been acquired in the input video image, for example.
  • the storage device 3 does not necessarily store all of the input video image.
  • the storage device 3 may store only a predetermined amount of the video image from the latest part of the input video image.
  • a certain number of images, a time length, or a data capacity may be determined as the predetermined amount.
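  • For example, a bounded buffer keeps only the latest frames automatically; the frame count below is an arbitrary stand-in for the "predetermined amount":

```python
from collections import deque

MAX_FRAMES = 30 * 60 * 5   # e.g. five minutes of video at 30 fps (assumption)
frame_buffer = deque(maxlen=MAX_FRAMES)

def store_frame(frame_id, image):
    # Appending beyond maxlen silently drops the oldest entry, so the buffer
    # always holds only the latest part of the input video image.
    frame_buffer.append((frame_id, image))
```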
  • the video image processing device 4 displays, on the display device 5 , an image (which may be a moving image or a still image) included at any time point in the input video image, an image generated in accordance with the input video image, or the like.
  • the image generated in accordance with the input video image is not limited to any particular kind.
  • such an image may be a sketch of a predetermined region, or may be a composite image formed with images included at two or more points of time in the input video image.
  • the video image processing device 4 displays an image designated by a user on the display device 5 , for example.
  • the video image processing device 4 has a function of displaying a trajectory indicating changes in the position of the object in the input video image when displaying such an image on the display device 5 .
  • the trajectory display method will be described later.
  • the video image processing device 4 may output image information about the image to be displayed on the display device 5 , and cause the display device 5 to display the desired image.
  • the display device 5 is an image display device that displays an image in accordance with the image information output from the video image processing device 4 .
  • the display device 5 is formed with a display or the like. Although only one display device 5 is shown in FIG. 1 , more than one display device 5 may be used. In a case where more than one display device 5 is used, the video image processing device 4 superimposes the trajectory on the image being displayed on at least one display device 5 designated to display the trajectory by the user.
  • FIG. 2 is a block diagram showing an example configuration of the video image analysis system of this embodiment. It should be noted that FIG. 2 shows an example configuration of the video image analysis system on the functional aspect. As shown in FIG. 2 , the video image analysis system 100 may include a video image input unit 101 , a video image holding unit 102 , a tracking unit 103 , an analysis information storage unit 104 , a display control unit 105 , and a display unit 106 .
  • the video image input unit 101 corresponds to the video image input device 1 .
  • the video image holding unit 102 and the analysis information storage unit 104 correspond to the storage device 3 .
  • the tracking unit 103 corresponds to the video image analysis device 2 .
  • the display control unit 105 corresponds to the video image processing device 4 .
  • the display unit 106 corresponds to the display device 5 .
  • the tracking unit 103 is formed with an information processing device such as a CPU included in the video image analysis device 2 , for example.
  • the display control unit 105 is formed with an information processing device such as a CPU included in the video image processing device 4 .
  • Although the video image analysis device 2 and the video image processing device 4 are shown as separate devices in FIG. 1 , these devices may be formed as one device.
  • the video image input unit 101 inputs a video image.
  • the video image holding unit 102 stores the input video image. It should be noted that the video image holding unit 102 may store only a predetermined amount of the video image from the latest image in the input video image.
  • the tracking unit 103 analyzes the input video image, and continuously acquires the position of the tracking target object from the input video image.
  • the method of tracking the tracking target object with the tracking unit 103 is not limited to any particular method.
  • When acquiring the position, the tracking unit 103 further detects the corresponding time, which is the time when the position was acquired, and the features of the tracking target object or the features of another object related to the tracking target object in the image at that time point in the input video image.
  • the tracking unit 103 then stores analysis information into the analysis information storage unit 104 .
  • In the analysis information, information indicating analysis results including the position acquired from the input video image and the items detected together with the position is associated with identification information about the image from which the position was acquired in the input video image.
  • the identification information is not limited to any particular information, as long as it can identify the image in the input video image.
  • the identification information may be information indicating a time point of the image in the input video image, or an identifier attached to the image in the video image.
  • the tracking unit 103 may have a function of outputting a predetermined message in accordance with the analysis results or other information.
  • the tracking unit 103 may output a message indicating an alert, in accordance with sensor information that is input from a predetermined sensor such as an infrared sensor, a pressure sensor, or a vibration sensor.
  • When detecting a predetermined object, such as a human being or a specific person Y, from the input video image, the tracking unit 103 may output a message to that effect.
  • FIGS. 3( a ) to 3( d ) are explanatory diagrams showing examples of detection in the tracking unit 103 .
  • the example shown in FIG. 3( a ) is an example in which an object (a moving object) is detected from an image region a 01 of an input video image.
  • sign T represents the object regarded as the tracking target object.
  • the example shown in FIG. 3( b ) is an example in which a predetermined object is detected from the image region a 01 of the input video image.
  • sign a 02 represents an object region that is the region of the object detected from the image in the input video image.
  • FIG. 3( c ) is an example in which an object crossing over an intrusion detection line a 03 in the image region a 01 of the input video image is detected.
  • FIG. 3( d ) is an example in which an object being left behind in a predetermined monitoring region a 04 of the image region a 01 of the input video image is detected.
  • the object being left behind is detected as a related object ro that is another object related to the tracking target object T.
  • the tracking unit 103 may store the information indicating the position acquired as a result of tracking separately from the other information. For each tracking target object, the tracking unit 103 may store, into the analysis information storage unit 104 , information that associates the information indicating the position with the identification information about the image in the input video image, separately from information that associates the information indicating the detected items other than the position with the identification information about the image in the input video image. In this embodiment, a combination of these kinds of information even in such a case is referred to as “analysis information”.
  • the display control unit 105 causes the display unit 106 to display a trajectory indicating changes in the position of the object in the video image. For example, when displaying an image at some time point included in the input video image or a predetermined image generated in accordance with the input video image, the display control unit 105 sets the image as the background image, and superimposes the trajectory of the predetermined object in the input video image on the background image.
  • the background image is not limited to any particular kind of image, as long as it includes a region corresponding to at least a part of the moving path of the object in the image region.
  • the display control unit 105 of this embodiment adds a notification function of notifying the user of the results of the analysis of the input video image and the time elapsed since a predetermined time, to the trajectory in the input video image of the object.
  • the display control unit 105 may make the display mode of a part of the trajectory differ from another part, or add information indicating the analysis results or the elapsed time in the vicinity of a part of the trajectory, in accordance with the features of the object or the features of another object related to the object shown in the analysis result obtained by analyzing the input video image.
  • the display control unit 105 may make the display mode of a part of the trajectory differ from another part, or add information indicating the elapsed time in the vicinity of a part of the trajectory, in accordance with the elapsed time from the predetermined time, for example.
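  • One such display rule might fade the trajectory color with the time elapsed since each position was acquired. The following OpenCV sketch (the helper name and color scheme are assumptions) draws each segment paler as it ages:

```python
import cv2

def draw_tracking_line(background, points, timestamps, now, fade_seconds=60.0):
    """Superimpose a trajectory whose display mode (color) varies with the
    time elapsed since each position in the trajectory was acquired."""
    out = background.copy()
    for (p0, p1), t in zip(zip(points, points[1:]), timestamps[1:]):
        age = max(0.0, min((now - t) / fade_seconds, 1.0))  # 0 = newest, 1 = oldest
        color = (0, int(255 * age), int(255 * (1 - age)))   # red fades to green (BGR)
        cv2.line(out, tuple(map(int, p0)), tuple(map(int, p1)), color, thickness=2)
    return out
```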
  • the analysis results are not limited to any particular results, as long as they are items obtained as a result of analyzing the input video image.
  • the analysis results may be the object-related items obtained as a result of tracking the object shown in the input video image (the items may be the features of the object or the features of another object related to the object), for example. Examples of such items include the state of the object, changes in the features of the object, the presence or absence of another object related to the object, and information amount such a related object, as described above.
  • the state of the object may be the direction, the size, movement (motions or behaviors), and the like of the entire object or a predetermined part of the object.
  • An example of another object related to the object is an object that draws a trajectory having a predetermined relationship with the trajectory of the object in the input video image. It should be noted that the above described “being left behind” and “being take away”, and the later described “interaction” are examples of the predetermined relationship.
  • the corresponding position in the trajectory accompanied by information may be the position in the trajectory corresponding to the time point when an item to be displayed was detected, for example.
  • in the case of the elapsed time, the corresponding position in the trajectory is the position in the trajectory corresponding to the time when the elapsed time had passed since the predetermined time.
  • the display control unit 105 may set the “vicinity” that is a predetermined pixel range from the corresponding position in the trajectory, for example.
  • the predetermined time may be the latest time in the input video image or the time when the predetermined message is sent from the predetermined system that analyzes the input video image.
  • the display control unit 105 may display the trajectory in such a form that the point or the section can be recognized, when superimposing the trajectory.
  • the display control unit 105 may make the display mode of a part (such as the point or the section) of the trajectory differ from another part, or add information indicating the analysis results or the elapsed time in the vicinity of a part of the trajectory (such as the point or the section).
  • Hereinafter, such a point that satisfies a condition is referred to as a "condition point", and such a section satisfying a condition is referred to as a "condition section".
  • The condition point or the condition section may be a point or a section where the direction, the size, or movement of the object or a predetermined part thereof satisfies a predetermined condition, or a point or a section where changes in the features of the object satisfy a predetermined condition, for example.
  • the display control unit 105 may shorten the subject section, and then display the section in a different display mode from the other sections.
  • the roaming section is also an example of a condition section.
  • the display control unit 105 may determine whether a section in the trajectory is a roaming section, in accordance with the ratio of a partial trajectory, which is the trajectory of the section of each time unit in the trajectory of the object (or of one or more sections following the time-unit sections) drawn with a predetermined thickness, to the rectangle surrounding that partial trajectory.
  • the method of determining a roaming section is not limited to the above described method.
  • a roaming section may be determined depending on whether the current position is located in a region based on the past position within the set time or whether the change in the position at each set time is equal to or smaller than a predetermined distance.
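  • One plausible reading of the rectangle-based test is to compare the length of the path walked within a time-unit section against the size of the rectangle bounding that section: a long path inside a small rectangle suggests roaming rather than passing through. A sketch under that assumption, with an arbitrary threshold:

```python
import math

def is_roaming(window_points, ratio_threshold=4.0):
    """Heuristic roaming test for one time-unit section of a trajectory."""
    if len(window_points) < 2:
        return False
    xs = [p[0] for p in window_points]
    ys = [p[1] for p in window_points]
    # Diagonal of the rectangle bounding this section of the trajectory.
    diag = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
    # Total path length walked within the section.
    path = sum(math.hypot(x1 - x0, y1 - y0)
               for (x0, y0), (x1, y1) in zip(window_points, window_points[1:]))
    if diag < 1e-6:                 # the object never left a single spot
        return path > 0.0
    return path / diag > ratio_threshold
```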
  • the display control unit 105 acquires a predetermined request designating the shortened and displayed section.
  • the display control unit 105 may then restore the section to the original state and display the original section, or display an alternative display including the points corresponding to the points included in the original section in the designated section.
  • the display control unit 105 may display a line or a slide bar as the above mentioned alternative display.
  • the line or the slide bar corresponds to the section prior to the shortening in the display region of the display unit 106 as the above alternative display. At this stage, it is preferable to make the line segment the same display target as the subject section or attach the same information to the line segment.
  • the display control unit 105 may set one of an image at some time point in the input video image and a predetermined image generated based on the input video image as a background image, and display a trajectory superimposed on the background image.
  • the user can constantly recognize the latest elapsed time and the state of the object.
  • the display control unit 105 can superimpose, on the background image, a trajectory in the display mode corresponding to the results of the latest analysis of the object detected from the input video image, or a trajectory in the display mode corresponding to the latest time elapsed since the predetermined time.
  • the user can recognize the latest situation by checking a trajectory including the latest position of the object displayed together with the image, the latest features and the like of the object, and a trajectory that has a display mode or accompanying information varying with the elapsed time.
  • a trajectory that has a display mode or accompanying information varying with the above described analysis results or the time elapsed since the predetermined time is also called a tracking line.
  • the display control unit 105 can display the following information (features), using the tracking line.
  • the traveling direction of the object (indicated by the tracking line itself)
  • the staying time at the corresponding point (using display of the elapsed time and display of the above point or the section)
  • a motion of the object (such as crouching, standing up, walking, running (speed), or jumping)
  • An interaction with another object such as crossing, joining, or branching
  • Examples of interactions with another object include an interaction between persons and an interaction with another object (such as a vehicle).
  • the presence or absence of an interaction may be determined depending on whether the trajectory of the object includes a point or section located in a range close to the trajectory of another object in terms of both time and distance.
  • In a case where two trajectories cross each other, the interaction may be displayed as "crossing".
  • In a case where two trajectories merge into one, the interaction may be displayed as "joining".
  • In a case where two or more trajectories are derived from one trajectory by an interaction, like persons getting off a vehicle, the interaction may be displayed as "branching".
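  • Following the time-and-distance rule above, "crossing" and "joining" can be told apart by whether the two trajectories stay close until the end of their common observation; "branching" would additionally require detecting a new track being born near an existing one, which is omitted from this sketch. The thresholds are arbitrary assumptions:

```python
import math

def interaction_type(traj_a, traj_b, dist_px=40, min_frames=5):
    """Classify an interaction between two trajectories given as
    dicts mapping frame_id -> (x, y)."""
    common = sorted(set(traj_a) & set(traj_b))
    close = [f for f in common if math.dist(traj_a[f], traj_b[f]) <= dist_px]
    if len(close) < min_frames:
        return None                # no interaction detected
    if close[-1] == common[-1]:
        return "joining"           # still together at the last common frame
    return "crossing"              # came close, then separated again
```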
  • the display control unit 105 may narrow down the object(s) whose trajectories are to be displayed, the item(s) to be displayed on the display unit 106 using the trajectory, and the range of the trajectory to be displayed, based on the user's designation, a reference time or a reference timing for the elapsed time, the time elapsed from the reference, the direction of the object, the staying time of the object, the color of an outfit of the object, the pattern of an outfit of the object, or the traveling direction of the object, for example.
  • the display control unit 105 can also display the trajectories of two or more objects in one background image. For example, the display control unit 105 may display an interaction with another object related to the object by displaying the trajectory of the object.
  • the display mode in a trajectory can be varied by changing colors, line types (including the shapes of dotted lines and the like), spacing (including spacing between the dots and spacing between lines in dotted lines), line thicknesses, the shapes of elements such as the parts corresponding to the lines in dotted lines or markers, or the directions of the markers or directional signs, for example.
  • the markers may be marks indicating the spacing between the spots or the sections included in a line segment, or marks, figures, or some other signs that represent regular items that always exist in the trajectory. Symbols are any appropriate characters, marks, figures, and other signs that are provided only when a specific condition is satisfied.
  • the display control unit 105 may vary display modes in a trajectory by changing the display method (narrowed display) when narrowing down predetermined sections in the trajectory.
  • FIGS. 4( a ) to 4( j ) are explanatory diagrams each showing an example expression of a trajectory (tracking line).
  • FIG. 4( a ) shows an example in which color density is varied in a trajectory.
  • FIG. 4( b ) shows an example in which color is varied in a trajectory. In FIG. 4( b ) , differences in color are represented by shades of the line.
  • FIG. 4( c ) shows an example in which marker intervals are varied in a trajectory.
  • FIG. 4( d ) shows an example in which the shapes of markers are varied in a trajectory.
  • FIG. 4( e ) shows an example in which the directions of directional signs attached in a trajectory are varied. The directional signs are associated with directions of the object, for example.
  • FIG. 4( f ) shows an example in which symbols indicating specific information are attached to the corresponding points.
  • FIG. 4( g ) shows an example in which display modes are varied as specific markers are made to flicker.
  • FIG. 4( h ) shows an example of narrowed display of sections, and an example in which the range other than a specific range is grayed out.
  • FIG. 4( i ) shows another example of narrowed display of sections, and an example in which the range other than a specific range (such as a section within a specific time period) is erased.
  • FIG. 4( j ) shows another example of narrowed display of sections, and an example in which the thickness of the tracking line is varied between the current section and the other sections.
  • FIG. 5 is an explanatory view showing an example of correspondence between a tracking line and an object.
  • the display control unit 105 may associate a tracking line with an object by assigning the same number or the like to the tracking line and the object.
  • sign TL represents the tracking line.
  • the tracking line TL in the drawing is colored in accordance with the state of the object. However, in FIG. 5 , the different colors are indicated by shades of the line.
  • the display control unit 105 may indicate the identity of the object by assigning the same number or the like to the vicinities of object regions of the object, or surrounding the object regions of the object with frames in the same color or lines of the same type.
  • FIGS. 6 and 7 are explanatory diagrams showing examples in which information is attached to a tracking line.
  • FIG. 6 shows an example of accompanying information to be attached to a trajectory in a case where images of a certain object at different points of time in the trajectory are simultaneously displayed. It should be noted that the images of the object at points of time other than that of the background image are also included in the accompanying information.
  • FIG. 6 shows an example in which the line type of the frame surrounding the object region a 02 of the object is varied, and symbols are added to the frames on display.
  • the frames of object regions a 02 - 1 to a 02 - 3 are indicated by solid lines, and the frame of an object region a 02 - 4 is indicated by a dot-and-dash line, in accordance with a change in the features of the object (the presence or absence of a coat).
  • FIG. 6 also shows an example in which an image obtained by cutting out the object region of the object included in the image at the time point corresponding to the time when an alert was issued (an alerting time point) in the input video image is further superimposed, and a symbol (the circled A in the drawing) to that effect is attached to the frame line of the object region a 02 - 2 .
  • the circled R in the drawing is an example of a symbol attached at the time point corresponding to the current time, and the circled P in the drawing is an example of a symbol attached at the time point corresponding to a past time.
  • the “vicinity” of a certain time point in the trajectory also includes the vicinity of the object region a 02 of the object displayed in accordance with the point.
  • the information to be attached also includes a clipped image (hereinafter referred to as an object image) of the object region of the object included in the image generated when the object is located at the point within the trajectory.
  • the exclamation mark in the drawing is an example of a symbol indicating that a feature has changed.
  • the “ro” mark in the drawing is an example of a symbol indicating that there is an interaction with another object. In a case where the object is not accompanied with any object image, similar symbols may be attached near the corresponding points on the tracking line.
  • the display control unit 105 may attach the information about the object at the time point corresponding to a certain point such as a point designated by the user in the trajectory, or the information about the elapsed time at the time point corresponding to the certain point. Other than that, the display control unit 105 may express the corresponding time with color or accompanying information, by varying the colors of the frame lines at the alerting time point, some other past time point, and the current time point, for example.
  • the display control unit 105 may allow the user to designate the target to display the trajectory, the display items, and the range of the trajectory, from the elapsed time and the analysis results.
  • a graphical user interface capable of designating the items listed below may be prepared so that the user can narrow down the object to display the trajectory, the display items, the range of the trajectory, and the like.
  • the time or the timing set as the reference time for the elapsed time, and the time elapsed since the reference time (Within minutes from the time of alert issuance, for example)
  • the traveling direction of the object
  • the GUI may be a general menu expression such as a combo box, a list, a check box, a radio button, a text input, or time selection.
  • FIG. 8 is a flowchart showing an example operation of the video image analysis device 2 (the tracking unit 103 ) of this embodiment.
  • a video image to be analyzed is first input from the video image input device 1 (step S 11 ).
  • the video image analysis device 2 then tracks the tracking target object in the input video image, continuously acquires the position of the tracking target object, and detects a predetermined item about the tracking target object in the image of the corresponding time or of the time when the position was acquired (step S 12 : a video image analysis process).
  • the video image analysis device 2 then outputs the video image accompanied by the analysis result (step S 13 : a video image output with an analysis result).
  • the video image analysis device 2 may associate the input video image with information indicating the analysis result, and store the input video image and the information into the storage device 3 .
  • the video image analysis device 2 repeats the above processing in steps S 11 to S 13 until the video image input is ended (step S 14 ).
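  • Steps S11 to S14 amount to a per-frame loop. In Python with OpenCV it might look as follows, where `tracker.update()` and `store.save()` are hypothetical stand-ins for the tracking unit 103 and the analysis information storage unit 104:

```python
import cv2

def analysis_loop(source, tracker, store):
    """Steps S11-S14: input frames, analyze them, and output the results."""
    cap = cv2.VideoCapture(source)
    frame_id = 0
    while True:
        ok, frame = cap.read()                 # step S11: video image input
        if not ok:                             # step S14: input has ended
            break
        detections = tracker.update(frame)     # step S12: track and analyze
        for obj_id, (x, y), feats in detections:
            store.save(obj_id, frame_id, (x, y), feats)  # step S13: output result
        frame_id += 1
    cap.release()
```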
  • FIG. 9 is a flowchart showing an example operation of the video image processing device 4 (the display control unit 105 ) of this embodiment.
  • a video image accompanied by an analysis result is first input from the video image analysis device 2 or the like (step S 21 ).
  • the video image processing device 4 may read the input video image and the information indicating the analysis result from a predetermined storage unit, instead of receiving an input of a video image accompanied by an analysis result.
  • the video image processing device 4 displays, on the display device 5 , the image at a certain time point in the input video image or an image created in accordance with the input video image (step S 22 : a display control process).
  • the display control process will be described later in detail.
  • the video image processing device 4 repeats the above processing in steps S 21 and S 22 until an end of display is detected (step S 23 ).
  • FIG. 10 is a flowchart showing an example of the display control (the above display control process in step S 22 ) in the display control unit 105 .
  • the background image and the object to be displayed are designated by the user or are determined in advance. It is also assumed that an image that can be a background image (such as a predetermined amount of image in the input video image or an image generated from the input video image) is stored in the video image holding unit 102 together with its identifier. Further, it is assumed that the analysis information storage unit 104 stores analysis information in which information indicating the analysis results including the position acquired from the input video image by the tracking unit 103 and the other detected items is associated with identification information about the image from which the position was acquired in the input video image.
  • the display control unit 105 first acquires a background image from the video image holding unit 102 (step S 101 ).
  • the display control unit 105 then acquires the analysis information from the analysis information storage unit 104 (step S 102 ).
  • the display control unit 105 then generates the object's trajectory suitable for the background image, in accordance with region information about the background image (step S 103 ).
  • the region information is information that associates the coordinates of the background image with the coordinates of the (real) imaging region.
  • a trajectory image in which only the trajectory (tracking line) to be displayed is drawn in the image region corresponding to the background image should be generated.
  • the technique of calculating and drawing the path of a trajectory line suitable for a background image in which the positional relationship between the region in the image and the imaging region is known, in accordance with the position information indicating the consecutive positions of the object, is a known technique, and therefore, a detailed explanation thereof is omitted herein.
  • In step S 103 , in accordance with the analysis information, the display control unit 105 generates a trajectory whose display mode varies with the results of the analysis of the input video image or with the time elapsed since the predetermined time, or a trajectory accompanied by information, attached in the vicinity of the corresponding position in the trajectory, indicating the results of the analysis of the input video image or the elapsed time.
  • the display control unit 105 then superimposes the generated trajectory on the background image, to generate a display image that is an image for display (step S 104 ).
  • the display control unit 105 outputs the image data of the generated display image to the display unit 106 , and causes the display unit 106 to display the display image (step S 105 ).
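  • Putting steps S101 to S105 together, one pass of the display control process might look like the sketch below; `video_store`, `analysis_store`, and `display` are hypothetical stand-ins for the video image holding unit 102 , the analysis information storage unit 104 , and the display unit 106 , and `draw_tracking_line` is the helper sketched earlier:

```python
def display_control(video_store, analysis_store, object_id, display):
    """One pass of the display control process (steps S101-S105)."""
    background = video_store.latest_frame()          # S101: acquire background image
    records = analysis_store.for_object(object_id)   # S102: acquire analysis info
    if not records:
        display.show(background)
        return
    # S103: generate the trajectory suited to the background image. Positions
    # are assumed to already be in image coordinates; otherwise the region
    # information (e.g. a homography) would be applied to each position first.
    points = [(r.x, r.y) for r in records]
    times = [r.timestamp for r in records]
    display_image = draw_tracking_line(background, points, times,
                                       now=times[-1])  # S104: superimpose
    display.show(display_image)                        # S105: output to display unit
```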
  • FIG. 11 to FIG. 17 are explanatory views showing examples of display images.
  • In each of FIG. 11 to FIG. 17, (a) is an explanatory view showing an image obtained by converting the display image, which is a color image, into a simplified image, and
  • (b) is an explanatory view showing a simplified superimposed image, which is the portion of the display image other than the background image.
  • the example shown in FIG. 11 is an example of a display image in a case where the latest image is set as the background image, and the background image including the latest object is updated every moment.
  • the image of the object corresponding to the latest time is part of the background image.
  • the display mode (specifically, the color) of the trajectory is varied with the time elapsed since the latest time, while a symbol (such as circled A, circled P, circled R, or “ro” mark) corresponding to the state of the object at the corresponding time is provided.
  • the display control unit 105 may superimpose and display information other than the tracking line, such as the intrusion detection line a 03 .
  • the example shown in FIG. 12 is an example of a display image in a case where the image at the alerting time point is set as the background image, and only the tracking line is updated every moment.
  • the display control unit 105 may superimpose a trajectory based on the latest position of the object while displaying the latest image as well as a past image. As a result, it becomes possible to observe another tracking target object (another object related to an intruder, for example).
  • the lines surrounding the object regions of the object and another object are different, being a solid line and a dashed line, and the colors of the lines are varied for each object, for example.
  • In FIG. 12( b ) , instead of colors, numbers for identifying the objects are provided.
  • It should be noted that the latest object image is not included in this example.
  • the display control unit 105 may further perform predetermined processing, such as transparency boosting, on the latest object image (an image cut out from an object region of the object) or the entire latest image, and then superimpose the processed image on the background image.
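  • The "transparency boosting" mentioned here corresponds to alpha blending. A minimal sketch with OpenCV (`cv2.addWeighted` computes the weighted sum), assuming the cut-out object image and its paste position are already known:

```python
import cv2

def paste_with_transparency(background, object_image, top_left, alpha=0.5):
    """Superimpose a cut-out object image on the background image after
    raising its transparency, so it does not fully hide the scene behind it."""
    x, y = top_left
    h, w = object_image.shape[:2]
    roi = background[y:y + h, x:x + w]        # region the cut-out will cover
    background[y:y + h, x:x + w] = cv2.addWeighted(object_image, alpha,
                                                   roi, 1.0 - alpha, 0)
    return background
```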
  • the example shown in FIG. 13 is an example of a display image in a case where a tracking target object other than a tracking target object for an alert, such as an intruder, is designated as the object.
  • the object whose trajectory is to be displayed is not limited to any particular object, and may be any tracking target object designated by the user, for example.
  • the example shown in FIG. 14 is an example of a display image in a case where there is another object related to the object.
  • the display control unit 105 may determine that these objects are related to each other, and superimpose and display the trajectory of the other object, as well as the trajectory of the object, on the background image.
  • the latest image is set as the background image, and the background image including the latest object is updated every moment.
  • the background image at the time when the trajectory of a related object is displayed, and the method of updating the background image are not limited to any particular image and any particular method.
  • the display control unit 105 may display the trajectory of the other object having the interaction with the object, together with the trajectory of the object, even though the interaction is not the object getting off a vehicle. In such a case, a similar notification function may be given to the trajectory of the other object.
  • FIG. 15 is an example of a display image on which a trajectory including a roaming section is superimposed.
  • (a-1) and (b-1) are an example of a display image and an example of a simplified superimposed image on which a roaming section is superimposed without shortening, respectively.
  • Reference numeral a 11 in (b-1) indicates the region in which the roaming of the object has been detected.
  • (a-2) and (b-2) are an example of a display image and an example of a simplified superimposed image on which the roaming section is superimposed after being shortened.
  • (a-3) and (b-3) are an example of a display image and an example of a simplified superimposed image on which the roaming section is superimposed after being shortened and expanded.
  • the display control unit 105 may set a roaming section, which is the section in the trajectory corresponding to the roaming part, and turn the analysis results within the section into a group, to display only representative information.
  • When performing the grouping, the display control unit 105 preferably displays the information in a mode indicating that the section is a grouped and shortened section (by attaching a symbol or providing a narrowed display, for example).
  • (a-1) and (b-1) correspond to the display image prior to the grouping (in a normal state)
  • (a-2) and (b-2) correspond to the display image after the grouping.
  • the display control unit 105 may expand and display the section, as shown in (a-3) and (b-3).
  • an instruction input about a point within the expanded section can be received through the mouse wheel, a direct click on the expanded tracking line, an operation of a slide bar or the like alternatively displayed in a predetermined region on the display screen, or the like.
  • the example shown in FIG. 16 is an example in which a section in a specific range designated by the user is displayed as an annotation section in such a mode that the annotation section can be recognized.
  • the display control unit 105 may add time information and image information corresponding to the annotation section to the analysis information so that the supervisor who is not present can be notified of the annotation section.
  • the information to be added in accordance with such an annotation section will also be referred to as annotation information.
  • the display control unit 105 may cut out the video image (image group) corresponding to the annotation section, and output the video image to a predetermined device. As a result, the checking costs of the supervisor can be lowered.
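  • cutting out the video image corresponding to an annotation section could be realized as in the following minimal sketch, assuming OpenCV, a plain video file, and an annotation section given as a start and end time in seconds; the function name and codec choice are illustrative, not part of the patent.

```python
import cv2

def export_annotation_section(src_path, dst_path, start_sec, end_sec):
    """Cut out the frames belonging to the annotation section and write them
    to a separate file that can be handed to the supervisor."""
    cap = cv2.VideoCapture(src_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    size = (int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
            int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
    # "mp4v" is one commonly available codec; the choice is an assumption.
    out = cv2.VideoWriter(dst_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, size)
    cap.set(cv2.CAP_PROP_POS_FRAMES, int(start_sec * fps))  # seek to the section
    for _ in range(int((end_sec - start_sec) * fps)):
        ok, frame = cap.read()
        if not ok:
            break
        out.write(frame)  # copy only the frames inside the annotation section
    cap.release()
    out.release()
```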
  • the example shown in FIG. 17 is an example in which the results of classification of changes in the direction and the posture of the object are expressed with line types. As shown in FIG. 17 , where the display modes are varied within a trajectory in accordance with changes in the state and the features of the object, the operator can determine the time point at which an image should be checked to examine the appearance of the object.
  • the trajectory of the object is displayed on the screen currently being checked, while the display mode of a part of the trajectory is made to differ from another part in accordance with the time elapsed since the predetermined time and the results of the analysis of the input video image, or information associated with a point in the trajectory is attached.
  • the user can intuitively know the elapsed time and the analysis results in association not only with the traveling direction of the object but also with the point in the trajectory of the object.
  • the user can know the position of the object (or where the object was located) at the time when a certain period of time had elapsed since the predetermined time, and how the object spent time before and after that period elapsed. From the trajectory of the object, the user can also recognize the situation at the time of intrusion (whether the intrusion really occurred), the state of the object (classification results such as the direction, the posture, the motion, and the color of the outfit), and the presence or absence of another object related to the object.
  • the display mode of a part of a trajectory is changed in accordance with the time elapsed since the predetermined time or the analysis results (the direction and movement of the object, for example), or information is attached to the vicinity of a part of the trajectory, so that it becomes easy to select at which point in the trajectory the image should be checked to observe the details of the object.
  • unless the display mode of a part of the trajectory is changed or information is attached to the vicinity of a part of the trajectory, for example, it is not possible to determine at which point in the trajectory the object should be checked to observe the details of the object. As a result, it takes a long time to check the details of the object.
  • for example, suppose the part of the trajectory corresponding to the time point when the object was facing toward the camera is displayed thicker than the other parts. In that case, the thick part of the trajectory should be selected, and the object at the time point corresponding to that point, or the object located at that point, should be checked.
  • the details of the object can be easily checked. It should be noted that the above effect can also be achieved by narrowing down the objects whose trajectories are to be displayed, the display items, or the range of trajectories, in accordance with designation from the user or a predetermined condition.
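  • as one possible rendering of such a trajectory, the sketch below (OpenCV; facing_flags is an assumed per-segment analysis result) draws the segments where the object faced the camera with a thicker line; the names and widths are illustrative assumptions.

```python
import cv2

def draw_trajectory(background, points, facing_flags,
                    color=(0, 0, 255), base_width=1, thick_width=4):
    """points: list of integer (x, y) pixel coordinates along the trajectory.
    facing_flags[i]: True if the object faced the camera between points[i]
    and points[i + 1] (an assumed per-segment analysis result)."""
    image = background.copy()
    for i in range(len(points) - 1):
        width = thick_width if facing_flags[i] else base_width
        cv2.line(image, points[i], points[i + 1], color, width)
    return image
```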
  • the user can promptly recognize the situations of the object at two or more points of time in a video image, in accordance with the information presented by the trajectory currently being displayed. For example, while checking the current image, the user can quickly determine from what point in the past an image should be selected to obtain desired information. The user can also recognize the current position of the object while checking a past image, for example.
  • the image of the object at a specific time point (such as an alert issuance time) is superimposed and displayed in the vicinity of the corresponding point in the trajectory, so that the user can know the situations of the object at two or more points of time without switching screens.
  • the user may be enabled to designate from which time point the image to be superimposed should be selected, using a trajectory.
  • the trajectory display can aid the user in determining from which time point the object should be selected and checked.
  • the user can promptly grasp the situations of the object at two or more points of time in a video image.
  • in this embodiment, a GUI function is further added to the trajectory that the display control unit 105 superimposes and displays on a background image. More specifically, the video image processing device 4 further has a GUI function of acquiring a request (predetermined instruction input) that designates a point in the trajectory being displayed, the request being associated with the trajectory superimposed and displayed on the background image, and performing screen control in accordance with the request.
  • the user is enabled to simultaneously display the object from two or more points of time (such as the time point of the background image and the time point corresponding to a designated point, or the time point corresponding to a first designated point and the time point corresponding to a second designated point), and to switch background images.
  • the system configuration of a video image analysis system of this embodiment is basically the same as the configuration of the first embodiment shown in FIG. 1 .
  • in the first embodiment, the video image analysis device 2 (the tracking unit 103 ) detects the position of a tracking target object, and also detects information other than the position (such as the corresponding time, the features of the tracking target object, and another object related to the tracking target object). In this embodiment, however, it is not necessary to detect the information other than the position. That is, the video image analysis device 2 (the tracking unit 103 ) of this embodiment only needs to analyze an input video image, and generate analysis information in which position information indicating the position of the tracking target object in the input video image is associated with identification information about the image from which the position was acquired.
  • FIG. 18 is a block diagram showing an example configuration of the video image analysis system of this embodiment.
  • the video image analysis system 200 shown in FIG. 18 includes a display control unit 205 , instead of the display control unit 105 of the video image analysis system 100 of the first embodiment shown in FIG. 2 .
  • the display control unit 205 displays the trajectory of a predetermined object in an input video image on the display unit 106 .
  • the display control unit 205 sets the image as the background image, and superimposes the trajectory of the predetermined object in the input video image on the background image.
  • the background image is not limited to any particular image, as long as it is an image including the region corresponding to at least a part of the moving path of the object in the image region.
  • the display control unit 205 of this embodiment further adds a GUI function to the trajectory of the object in the input video image.
  • the GUI function is for acquiring a predetermined request that designates a point in the trajectory currently being displayed, and performing display control in accordance with the request.
  • the display control unit 205 acquires a first request designating a point in the trajectory currently being displayed, and displays a designated image superimposed on an appropriate background image being displayed on the display unit 106 .
  • the designated image includes the object of the time when the object was located at the designated point.
  • the first request will also be referred to as a “designated image addition request”.
  • the point designated by a predetermined request including the first request will also be referred to as the “designated point”.
  • similarly, a section designated by a predetermined request will also be referred to as the “designated section”.
  • the display control unit 205 may superimpose a designated image including the object of the time when the object was located at the designated point, every time the designated point changes.
  • in this manner, the user can check, within the same image, the object shown in the image corresponding to the time point of the pointed spot, simply by moving the position of the pointed spot along the trajectory.
  • the display control unit 205 may superimpose a designated image obtained from the image that was generated when the object was located at the designated point, among the images included in the video image (this image will be hereinafter referred to as the corresponding image).
  • in a case where a first request is a request that is input together with the position information about a pointed spot when the spot pointed by a pointing device is moved along a trajectory, the display control unit 205 may superimpose, every time a first request is acquired, a designated image including the object of the time when the object was located at the point indicated by the position information, with that point as the designated point. In this manner, the user can check the object of the time when the object was located at the designated point in the same image, simply by tracing the trajectory (see the sketch below).
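  • resolving such a first request to a designated point can be as simple as a nearest-point hit test against the trajectory, as in the following sketch; the trajectory representation, the names, and the hit radius are assumptions for illustration.

```python
import math

def hit_test(trajectory, pointer_xy, max_dist=10.0):
    """trajectory: list of (x, y, frame_id) tuples, one per trajectory point.
    Returns the frame_id of the trajectory point nearest to the pointed spot,
    or None when the pointer is farther than `max_dist` from every point."""
    best_id, best_d = None, max_dist
    px, py = pointer_xy
    for x, y, frame_id in trajectory:
        d = math.hypot(px - x, py - y)
        if d <= best_d:
            best_id, best_d = frame_id, d
    return best_id
```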
  • the display control unit 205 superimposes the designated image in the position corresponding to the position of the object in the corresponding image in the background image.
  • the display control unit 205 may superimpose the designated image on the background image after determining the position and the size in which the designated image is superimposed on the background image in accordance with the position and the size of the object in the corresponding image, for example.
  • the display control unit 205 may superimpose a second designated image, together with the designated image, on the background image.
  • the second designated image is obtained by cutting out another object related to the object from the corresponding image.
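  • a minimal sketch of this superimposition, assuming the images are NumPy arrays taken from the same camera view and the object region in the corresponding image is given as an (x, y, w, h) bounding box (the helper name is illustrative):

```python
def superimpose_designated_image(background, corresponding, bbox):
    """Paste the object region cut out of the corresponding image onto the
    background image; the same coordinates can be reused because both images
    share the camera view, so the position and size are preserved."""
    x, y, w, h = bbox
    out = background.copy()
    out[y:y + h, x:x + w] = corresponding[y:y + h, x:x + w]
    return out
```

  • the second designated image can then be superimposed in the same way, by calling the helper again with the bounding box of the related object in the corresponding image.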
  • the display control unit 205 may also acquire a second request designating a point in the trajectory currently being displayed, and switch the background image to the corresponding image. At the same time as the switching, the display control unit 205 may superimpose and display the trajectory of the object on the switched background image.
  • the second request will also be referred to as the “background switching request”.
  • the display control unit 205 may also acquire a third request designating a point or a section in the trajectory currently being displayed. The display control unit 205 may then add information indicating the designation to the image (corresponding image) generated when the object was located at the designated point or section among the images included in the video image, or may extract the image and output the image to the outside. Further, the display control unit 205 may set the section at this stage as the annotation section. The display control unit 205 may then make the display mode of the section differ from the other sections in the image currently being displayed, and add information indicating the variation to the section.
  • a third request will also be referred to as an “annotation addition request”.
  • the display control unit 205 may also superimpose a designated image, together with the trajectory, on the background image.
  • the designated image is obtained from the image of a time when a predetermined condition was satisfied in the input video image. This is equivalent to the display of a trajectory accompanied by the object image of a specific time point in the first embodiment.
  • the display control unit 205 may further superimpose identification information about the object or information about the time point corresponding to a designated point, on the designated image, for example. This is equivalent to the associating by assigning the same numbers or the like to indicate the identity of the object, and the adding of information (such as circled P, circled A, and circled R) in accordance with the elapsed time or the corresponding time in the first embodiment.
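  • attaching such information could look like the following OpenCV sketch, which draws an assumed text label (an object ID or the corresponding time, for example) next to the region of the designated image; the colors, font, and layout are arbitrary illustrative choices.

```python
import cv2

def label_designated_image(image, bbox, text):
    """Draw a frame around the designated image's region and put `text`
    (e.g. an object ID or a time stamp) just above it."""
    x, y, w, h = bbox
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 255), 1)
    cv2.putText(image, text, (x, max(12, y - 5)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 255), 1)
    return image
```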
  • the display control unit 205 may also acquire a fourth request designating a point or a section in the trajectory currently being displayed, for example.
  • the display control unit 205 may then display a designated image on the background image until an instruction to cancel the fourth request is acquired.
  • the designated image includes the object of the time when the object was located at the designated point.
  • the fourth request will also be referred to as the “designated image pinning request”.
  • the display control unit 205 can constantly display the image of the object of an alerting time point or the image of a related object at the alerting time point, for example, after adding a symbol or the like indicating the alerting time point, as in the first embodiment.
  • the display control unit 205 may add a GUI function not only to a trajectory but also to the object region of a tracking target object included in a display image. That is, in a case where tracking target objects other than the current object are displayed in a display image (the tracking target objects are shown in or superimposed on the background image), the display control unit 205 can cause the user to select the object whose trajectory is to be newly displayed from among the tracking target objects. For example, in a situation where tracking target objects other than the current object are displayed on the screen as shown in FIG. 12 , when an object switching request that designates a tracking target object other than the current object is acquired from the user, the display control unit 205 may set the designated tracking target object as the new object, and display the trajectory of the new object on the current background image (a simple region hit test for this selection is sketched below). At this stage, whether the trajectory of the previous object is left displayed or erased may be determined in advance or selected by the user.
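  • receiving such an object switching request can reuse a simple region hit test, as in the sketch below, where the tracking targets shown in the display image are assumed to be available as (object_id, bounding box) pairs; the names are illustrative.

```python
def pick_new_object(click_xy, tracked_objects):
    """tracked_objects: list of (object_id, (x, y, w, h)) pairs for the
    tracking target objects shown in or superimposed on the background image.
    Returns the id of the clicked target, or None for a miss."""
    cx, cy = click_xy
    for object_id, (x, y, w, h) in tracked_objects:
        if x <= cx < x + w and y <= cy < y + h:
            return object_id  # this target becomes the new object
    return None
```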
  • the display control unit 205 can also perform control to display time information and the detected item that triggered an alert, receive a user operation or the like in response to the display, and display the image of the time corresponding to those items.
  • the display method may be a method of further superimposing the image on the display image currently being displayed, or a method of displaying the image by switching background images.
  • in a case where the image is further superimposed, the entire image of the corresponding time may be superimposed, or a part (the corresponding part) of the image may be cut out and then superimposed.
  • the display control unit 205 is also formed with an information processing device such as a CPU included in the video image processing device 4 .
  • FIG. 19 is a flowchart showing an example of display control in the display control unit 205 .
  • the background image and the object to be displayed are designated by the user or are determined in advance. It is also assumed that an image that can be a background image (such as a predetermined amount of image in an input video image or an image generated from the input video image) is stored in the video image holding unit 102 together with its identifier. Further, it is assumed that the analysis information storage unit 104 stores analysis information in which information indicating the analysis results including the position acquired from the input video image by the tracking unit 103 and the other detected items is associated with identification information about the image from which the position was acquired in the input video image.
  • in this example, the image information about the display image to be displayed on the display unit 106 is divided into layers, and the layers are stored and managed. The layers are then superimposed on one another and output.
  • the method of generating a display image is not limited to this example.
  • the display control unit 205 first acquires a background image from the video image holding unit 102 , and sets the background image in a background layer (step S 201). Here, setting an image in a layer means storing the image information about the image to be displayed in that layer into a buffer provided for the layer.
  • the display control unit 205 then acquires analysis information from the analysis information storage unit 104 (step S 202 ).
  • the display control unit 205 then generates a trajectory of the object suitable for the background image in accordance with the region information about the background image, and sets the generated trajectory in a trajectory layer (steps S 203 and S 204). At this stage, the display control unit 205 generates a trajectory (a trajectory image) in which each point in the trajectory is associated with the corresponding image in the input video image, the corresponding time point, or time information about the time point.
  • the display control unit 205 then superimposes the image information in the background layer and the image information in the trajectory layer on each other, and stores the superimposed image information into the display buffer that stores the image information to be output to the display unit 106 (step S 205 ).
  • the display control unit 205 may convert the object image as a designated image into an image in the position and the size corresponding to those of the corresponding image set as the background image. The display control unit 205 may then set the object image, together with a pinning flag, in a superimposed image layer in which the designated image is to be set. The number of superimposed image layers is equal to the number of images to be superimposed on one another.
  • the display control unit 205 determines whether an image is set in a superimposed image layer (step S 206 ), to superimpose the designated images set so far on the background image. If an image is set in a superimposed image layer (Yes in step S 206 ), the display control unit 205 further superimposes and stores the set image into the display buffer (step S 207 ).
  • if images are set in two or more superimposed image layers, the display control unit 205 superimposes and stores all the set images into the display buffer in step S 207. The display control unit 205 then moves on to step S 208.
  • if no image is set in the superimposed image layers (No in step S 206), on the other hand, the display control unit 205 moves on directly to step S 208.
  • in step S 208, the display control unit 205 outputs the image information stored in the display buffer to the display unit 106.
  • as a result, a display image in which the background image, the trajectory, and, if any, the image(s) set in the superimposed image layers are superimposed on one another is displayed on the display unit 106.
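  • the layer handling in steps S 201 to S 208 can be pictured with the following minimal sketch, under the assumption that the background layer is an RGB array while the trajectory layer and the superimposed image layers are RGBA buffers whose empty pixels are fully transparent; the function and variable names are illustrative, not from the patent.

```python
import numpy as np

def compose_display_buffer(background, trajectory_layer, overlay_layers):
    """background: HxWx3 uint8 image (background layer, step S 201).
    trajectory_layer and each entry of overlay_layers: HxWx4 RGBA buffer,
    or None when nothing is set in the layer."""
    display = background.copy()                        # base of step S 205
    for layer in [trajectory_layer] + list(overlay_layers):
        if layer is None:                              # "No" branch of step S 206
            continue
        alpha = layer[..., 3:4].astype(float) / 255.0  # per-pixel opacity
        display = (alpha * layer[..., :3]
                   + (1.0 - alpha) * display).astype(np.uint8)
    return display  # contents of the display buffer, output in step S 208
```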
  • the display control unit 205 acquires a predetermined request including designation of a point in the trajectory currently being displayed. In this example, the display control unit 205 receives an event corresponding to the request. The display control unit 205 then performs the processing corresponding to the event (step S 209 : event processing). Examples of the event processing are shown in FIGS. 20 to 25 .
  • the display control unit 205 may return to step S 202 after a predetermined time has passed since the end of the event processing, for example, and acquire a request while repeating the operation in steps S 202 to S 208 , to update the trajectory.
  • FIG. 20 is a flowchart showing an example of the event processing in accordance with a first request (designated image addition request).
  • the display control unit 205 performs the processing in steps E 11 to E 14 in a case where the accepted event is a first request event indicating that a first request has been acquired (Yes in step E 11 ).
  • the display control unit 205 first clears the superimposed image layer(s) to which the pinning flag is not attached (step E 12 ).
  • the pinning flag is the flag indicating that the image in the corresponding superimposed image layer is to be constantly displayed.
  • in this manner, the designated image(s) that were displayed before the first request and are set in the superimposed image layer(s) to which the pinning flag is not attached can be cleared at the next display image update timing.
  • the display control unit 205 then acquires the designated image from the corresponding image corresponding to the point designated by the first request related to the event, adjusts the position and the size as necessary, and sets the designated image in a new superimposed image layer (steps E 13 and E 14 ). To reflect the setting contents in the display image, the display control unit 205 returns to step S 206 .
  • FIG. 21 is a flowchart showing an example of event processing corresponding to a first request cancellation event indicating that a first request is invalidated.
  • the first request cancellation event is assumed to occur when the spot designated by a first request moves to another position, or when a first request designating a new point is received.
  • the display control unit 205 performs the processing in step E 16 in a case where the accepted event is a first request cancellation event (Yes in step E 15 ).
  • the display control unit 205 clears the superimposed image layer(s) to which the pinning flag is not attached (step E 16 ).
  • the display control unit 205 may immediately return to step S 205 and update the display screen.
  • FIG. 22 is a flowchart showing an example of event processing corresponding to a second request (a background switching request).
  • the display control unit 205 performs the processing in steps E 22 and E 23 in a case where the accepted event is a second request event indicating that a second request has been acquired (Yes in step E 21 ).
  • the display control unit 205 first clears all of the trajectory layer and the superimposed image layers (step E 22 ).
  • the display control unit 205 then sets a background image that is the corresponding image corresponding to the designated point (step E 23 ). To reflect the setting contents in the display image, the display control unit 205 returns to step S 201 .
  • the corresponding image is set as the background image, and a display image in which a trajectory is superimposed on the background image is displayed on the display unit 106 .
  • the display control unit 205 may convert the object image as a designated image into an image in the position and the size corresponding to those of the corresponding image set as the background image.
  • the display control unit 205 may then set the object image, together with a pinning flag, in a superimposed layer.
  • FIG. 23 is a flowchart showing an example of event processing corresponding to a third request (an annotation addition request).
  • the display control unit 205 performs the processing in step E 32 in a case where the accepted event is a third request event indicating that a third request has been acquired (Yes in step E 31 ).
  • the display control unit 205 adds annotation information (information indicating that the user has issued an instruction) to the image corresponding to the designated point or the designated section and the trajectory (step E 32 ).
  • the display control unit 205 may further cut out an image corresponding to the designated point or the designated section from an input video image, and output the image to the outside.
  • FIG. 24 is a flowchart showing an example of event processing corresponding to a fourth request (a designated image pinning request).
  • the display control unit 205 performs the processing in step E 42 in a case where the accepted event is a fourth request event indicating that a fourth request has been acquired (Yes in step E 41 ).
  • the display control unit 205 sets a pinning flag in the superimposed image layer in which the designated image corresponding to the designated point is set (step E 42 ).
  • FIG. 25 is a flowchart showing an example of event processing corresponding to a fifth request (a request to cancel a fourth request).
  • the display control unit 205 performs the processing in step E 52 in a case where the accepted event is a fifth request event indicating that a fifth request has been acquired (Yes in step E 51 ).
  • the display control unit 205 cancels the pinning flag in the superimposed image layer in which the designated image corresponding to the designated point is set (step E 52 ).
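  • taken together, the event processing of FIGS. 20 to 25 amounts to bookkeeping on the superimposed image layers and their pinning flags. The dispatcher below is a simplified sketch in which the layers are a list of (image, pinned) pairs, the payload carries the designated image or corresponding image for the event, and the layer addressed by a fourth or fifth request is assumed to be the most recently added one; none of these names come from the patent.

```python
class DisplayController:
    """Simplified event handling for requests 1 to 5 (names are illustrative)."""

    def __init__(self, background):
        self.background = background
        self.layers = []       # superimposed image layers: (image, pinning_flag)
        self.annotations = []  # annotation information added by third requests

    def on_event(self, event, payload=None):
        if event == "first_request":              # FIG. 20, steps E11-E14
            self.layers = [l for l in self.layers if l[1]]  # E12: keep pinned
            self.layers.append((payload, False))            # E13/E14: new layer
        elif event == "first_request_cancel":     # FIG. 21, step E16
            self.layers = [l for l in self.layers if l[1]]
        elif event == "second_request":           # FIG. 22, steps E22/E23
            self.layers = []                      # E22: clear overlays
            self.background = payload             # E23: corresponding image
        elif event == "third_request":            # FIG. 23, step E32
            self.annotations.append(payload)      # record annotation info
        elif event == "fourth_request" and self.layers:   # FIG. 24, step E42
            image, _ = self.layers[-1]
            self.layers[-1] = (image, True)       # set the pinning flag
        elif event == "fifth_request" and self.layers:    # FIG. 25, step E52
            image, _ = self.layers[-1]
            self.layers[-1] = (image, False)      # cancel the pinning flag
```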
  • FIG. 26 is an explanatory diagram showing an example of generation patterns for composite images (display images) to be generated in this embodiment.
  • in FIG. 26 , examples of generation patterns for display images are shown with the trajectories omitted.
  • the display control unit 205 may generate a display image by superimposing an image cut out from a designated point image on a past image, for example (generation pattern 1 ).
  • the designated point image corresponds to the above-mentioned corresponding image.
  • the image cut out from the designated point image corresponds to the above-mentioned designated image.
  • the display control unit 205 may generate a display image by superimposing an image cut out from the latest image and an image cut out from a designated point image on a past image, for example (generation pattern 2 ).
  • the image cut out from the latest image may be the object image of the object included in the latest image.
  • the display control unit 205 may generate a display image by superimposing an image cut out from a designated point image on the latest image, for example (generation pattern 3 ).
  • the display control unit 205 may generate a display image by superimposing an image cut out from a past image and an image cut out from a designated point image on the latest image, for example (generation pattern 4 ).
  • the image cut out from a past image may be an object image of an object or a related object included in any of the past images.
  • object images include an object image of an object or a related object included in the past image at a specific time point, such as an alerting time point, a time point when a feature change was detected, or a time point when there was an interaction with another related object.
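  • the four generation patterns differ only in which image serves as the base and which cut-outs are pasted onto it, as the sketch below shows; it assumes the images are NumPy arrays sharing one camera view and that the object regions are given as (x, y, w, h) bounding boxes in a dict (paste, generate, and boxes are illustrative names).

```python
def paste(base, source, bbox):
    """Copy the bbox region of `source` onto a copy of `base`."""
    x, y, w, h = bbox
    out = base.copy()
    out[y:y + h, x:x + w] = source[y:y + h, x:x + w]
    return out

def generate(pattern, past, latest, designated, boxes):
    if pattern == 1:   # past image + cut-out from the designated point image
        return paste(past, designated, boxes["designated"])
    if pattern == 2:   # past image + cut-outs from latest and designated images
        return paste(paste(past, latest, boxes["latest"]),
                     designated, boxes["designated"])
    if pattern == 3:   # latest image + cut-out from the designated point image
        return paste(latest, designated, boxes["designated"])
    if pattern == 4:   # latest image + cut-outs from past and designated images
        return paste(paste(latest, past, boxes["past"]),
                     designated, boxes["designated"])
    raise ValueError("unknown generation pattern")
```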
  • FIGS. 27 to 29 are explanatory views showing examples of display images according to this embodiment. In each drawing, (a) is an explanatory view showing an image obtained by converting a display image as a color image into a grayscale image, and (b) is an explanatory view showing a simplified superimposed image that is the image other than the background image in the display image, as in the first embodiment.
  • the example shown in FIG. 27 is an example of the display image displayed after the user issues a first request designating a point corresponding to a certain past time point in the trajectory, in a situation where the latest image is set as the background image and the background image including the latest object is updated every moment.
  • the point designated by the first request is indicated by a white arrow.
  • the display control unit 205 may further superimpose and display a designated image (the object image at that time) on the display image.
  • the designated image is formed by cutting out an object region a 02 - 3 of the object from the corresponding image corresponding to the designated point.
  • a trajectory and a designated image obtained by cutting out an object region a 02 - 2 of the object from the corresponding image at the time of issuance of an alert are superimposed on the latest image.
  • the example shown in FIG. 28 is an example of a display image that is displayed while the designated image to be superimposed on the display image is changed with movement of such a first request.
  • in FIG. 28 , the object region a 02 - 3 , an object region a 02 - 4 , and an object region a 02 - 5 are displayed at the same time for convenience of explanation. In practice, however, these regions are switched on display in accordance with movement of the pointed spot (the white arrow in the drawing).
  • the example shown in FIG. 29 is an example of a display image in a case where the user designates a certain time point in the past in a situation where the latest image is set as the background image, and the background image including the latest object is updated every moment.
  • in that case, the display control unit 205 may superimpose and display the object image (the object region a 02 - 3 in the drawing) of the object of the time when the object was located at the designated point, together with an object image of a related object (see the object region a 02 - 4 in the drawing). In this manner, the user can also check the characteristics and the like of the related object.
  • the display control unit 205 may set the image of the other object of the time of the interaction as an image of a related object, and superimpose and display the image, together with the image of the object of the same time point.
  • simply by designating a trajectory currently being displayed, an object region of the object accompanying the trajectory, or an object region of a related object, and making a predetermined input, the user can display a cut-out image corresponding to the designated point or switch background images. In this manner, the states of the object at two or more points of time in a video image and the states of the surroundings can be promptly grasped.
  • the display control unit 205 of this embodiment also adds a notification function to a trajectory as described in the first embodiment.
  • alternatively, the display control unit 205 can add only a GUI function to a trajectory, without giving such a notification function to the trajectory. That is, it is possible to provide the GUI function of this embodiment even for simple trajectory display.
  • FIG. 30 is a schematic block diagram showing an example configuration of a computer according to an embodiment of the present invention.
  • a computer 1000 includes a CPU 1001 , a main storage device 1002 , an auxiliary storage device 1003 , an interface 1004 , a display device 1005 , and an input device 1006 .
  • the video image analysis device and the video image processing device described above may be mounted on the computer 1000 , for example. In that case, operations of the respective devices may be stored as a program in the auxiliary storage device 1003 .
  • the CPU 1001 reads the program from the auxiliary storage device 1003 , loads the program into the main storage device 1002 , and performs predetermined processing according to the above embodiments, in accordance with the program.
  • the auxiliary storage device 1003 is an example of a non-transitory physical medium.
  • Other examples of non-transitory physical media include magnetic disks, magneto-optical disks, CD-ROMs, DVD-ROMs, semiconductor memories, and the like to be connected to the computer 1000 via the interface 1004 .
  • in a case where the program is delivered to the computer 1000 via a communication line, the computer 1000 may load the program into the main storage device 1002 after receiving the delivery, and perform predetermined processing according to the above embodiments.
  • the program may be for performing part of the predetermined processing in each embodiment.
  • the program may be a differential program for performing predetermined processing according to the above embodiments in combination with another program already stored in the auxiliary storage device 1003 .
  • the interface 1004 transmits and receives information to and from other devices.
  • the display device 1005 presents information to users.
  • the input device 1006 receives inputs of information from users.
  • some of the components of the computer 1000 can be omitted.
  • the display device 1005 can be omitted.
  • each component of each device is implemented by general-purpose or special circuitry, processors or the like, or combinations thereof. These may be formed with a single chip or may be formed with chips connected via a bus. Alternatively, part or all of each component of each device may be formed with a combination of the above mentioned circuitry or the like and a program.
  • in a case where each component of each device is formed with information processing devices, circuitry, and the like, the information processing devices, the circuitry, and the like may be arranged in a centralized manner or in a distributed manner.
  • for example, the information processing devices, the circuitry, and the like may be formed as a client-server system, a cloud computing system, or the like, connected to one another via a communication network.
  • FIG. 31 is a block diagram showing the outline of a video image processing device of the present invention.
  • the video image processing device 50 of the present invention includes a display control means 501.
  • the display control means 501 acquires a first request designating a point in the trajectory being displayed, and displays a designated image including the object of the time when the object was located at the designated point, by superimposing the designated image on an arbitrary background image being displayed on the display unit.
  • the display control means 501 may make the display mode of a part of the trajectory differ from another part, or attach information indicating the analysis result or the elapsed time to the vicinity of a part of the trajectory.
  • the user can promptly grasp the situations of the object at two or more points of time in a video image.
  • a video image processing device comprising a display control means that causes a display unit to display a trajectory indicating a change in a position of an object in a video image, wherein the display control means acquires a first request designating a point in the trajectory being displayed, and displays a designated image including the object of the time when the object was located at the designated point by superimposing the designated image on an arbitrary background image being displayed on the display unit.
  • the designated image is an image obtained by cutting out the object from a corresponding image being an image generated when the object was located at the designated point among images included in the video image and, when superimposing the designated image on the background image, the display control means superimposes a second designated image together with the designated image on the background image, the second designated image being an image obtained by cutting out another object related to the object from the corresponding image.
  • the video image processing device according to any of Supplementary notes 1 to 4, wherein the display control means acquires a second request designating a point in the trajectory being displayed, and switches the background image to a corresponding image being an image generated when the object was located at the designated point among images included in the video image.
  • the video image processing device according to any of Supplementary notes 1 to 5, wherein the display control means acquires a third request designating one of a point and a section in the trajectory being displayed, and attaches information indicating the designation to an image generated when the object was located at the designated one of the point and the section among images included in the video image, or extracts the image from the video image and outputs the image to outside.
  • the video image processing device according to any of Supplementary notes 1 to 6, wherein the display control means attaches identification information about the object or information about time corresponding to the designated point, to the designated image, and superimposes the designated image on the background image.
  • the video image processing device according to any of Supplementary notes 1 to 7, wherein the display control means, in accordance with one of a feature of the object shown in an analysis result obtained by analyzing the video image, a feature of another object related to the object shown in the analysis result, and an elapsed time from a predetermined time, makes a display mode of a part of the trajectory differ from another part or attach information indicating one of the analysis result and the elapsed time to a vicinity of a part of the trajectory.
  • a video image analysis system comprising:
  • a tracking means that analyzes a video image, and continuously acquires a position of a tracking target object from the video image; a storage means that stores position information indicating the position acquired by the tracking means in association with identification information about an image from which the position was acquired in the video image; and a display control means that causes a display unit to display a trajectory indicating a change in the position of the object in the video image, based on the information stored in the storage means, wherein the display control means acquires a first request designating a point in the trajectory being displayed, and displays a designated image including the object of the time when the object was located at the designated point by superimposing the designated image on an arbitrary background image being displayed on the display unit.
  • a video image processing method comprising:
  • causing a display unit to display a trajectory indicating a change in a position of an object in a video image; acquiring a first request designating a point in the trajectory being displayed; and displaying a designated image including the object of the time when the object was located at the designated point by superimposing the designated image on an arbitrary background image being displayed on the display unit.
  • a video image processing program for causing a computer to:
  • perform a process of causing a display unit to display a trajectory indicating a change in a position of an object in a video image; in the process, acquire a first request designating a point in the trajectory being displayed; and display a designated image including the object of the time when the object was located at the designated point by superimposing the designated image on an arbitrary background image being displayed on the display unit.
  • the present invention can be suitably used not only in surveillance, but also in checking a video image accompanied by analysis information.
  • the present invention can be suitably used in marketing to recognize the situations of customers from a video image taken in a store or in the vicinity of a specific item.

US16/489,374 2017-03-31 2018-02-21 Video image processing device, video image analysis system, method, and program Abandoned US20190378279A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2017-070677 2017-03-31
JP2017070677 2017-03-31
PCT/JP2018/006229 WO2018180039A1 (ja) 2017-03-31 2018-02-21 映像処理装置、映像解析システム、方法およびプログラム

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/006229 A-371-Of-International WO2018180039A1 (ja) 2017-03-31 2018-02-21 映像処理装置、映像解析システム、方法およびプログラム

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/228,816 Continuation US11748892B2 (en) 2017-03-31 2021-04-13 Video image processing device, video image analysis system, method, and program

Publications (1)

Publication Number Publication Date
US20190378279A1 true US20190378279A1 (en) 2019-12-12

Family

ID=63677955

Family Applications (3)

Application Number Title Priority Date Filing Date
US16/489,374 Abandoned US20190378279A1 (en) 2017-03-31 2018-02-21 Video image processing device, video image analysis system, method, and program
US17/228,816 Active US11748892B2 (en) 2017-03-31 2021-04-13 Video image processing device, video image analysis system, method, and program
US18/218,917 Pending US20230351612A1 (en) 2017-03-31 2023-07-06 Video image processing device, video image analysis system, method, and program

Family Applications After (2)

Application Number Title Priority Date Filing Date
US17/228,816 Active US11748892B2 (en) 2017-03-31 2021-04-13 Video image processing device, video image analysis system, method, and program
US18/218,917 Pending US20230351612A1 (en) 2017-03-31 2023-07-06 Video image processing device, video image analysis system, method, and program

Country Status (7)

Country Link
US (3) US20190378279A1 (ja)
EP (1) EP3606055A4 (ja)
JP (4) JP6725061B2 (ja)
CN (1) CN110476421A (ja)
AR (1) AR111195A1 (ja)
SG (1) SG11201907834UA (ja)
WO (1) WO2018180039A1 (ja)


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111447403B (zh) * 2019-01-16 2021-07-30 杭州海康威视数字技术股份有限公司 一种视频显示方法、装置及系统
WO2020147792A1 (zh) * 2019-01-16 2020-07-23 杭州海康威视数字技术股份有限公司 一种视频显示方法、装置、系统及摄像机
CN110602525B (zh) * 2019-08-23 2021-09-17 江西憶源多媒体科技有限公司 一种视频分析结果与图像帧绑定传输的方法
GB2592035A (en) * 2020-02-13 2021-08-18 Univ Cranfield Object location status monitoring apparatus and method
CN111238323A (zh) * 2020-03-09 2020-06-05 深圳市宏源建设工程有限公司 一种控制爆破远程监控系统
CN111832539A (zh) 2020-07-28 2020-10-27 北京小米松果电子有限公司 视频处理方法及装置、存储介质

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0594536A (ja) 1991-10-01 1993-04-16 Nippon Telegr & Teleph Corp <Ntt> 移動物体抽出処理方法
JP2000099851A (ja) 1998-09-21 2000-04-07 Oki Electric Ind Co Ltd 監視システム装置
JP2006093955A (ja) * 2004-09-22 2006-04-06 Matsushita Electric Ind Co Ltd 映像処理装置
JP2006350761A (ja) 2005-06-17 2006-12-28 Nippon Telegr & Teleph Corp <Ntt> Rfidタグ発行装置,rfidタグ発行方法およびrfidタグ発行プログラム
JP2006350751A (ja) * 2005-06-17 2006-12-28 Hitachi Ltd 店舗内の販売分析装置及びその方法
JP4541316B2 (ja) * 2006-04-06 2010-09-08 三菱電機株式会社 映像監視検索システム
JP4933354B2 (ja) * 2007-06-08 2012-05-16 キヤノン株式会社 情報処理装置、及び情報処理方法
US20090002489A1 (en) 2007-06-29 2009-01-01 Fuji Xerox Co., Ltd. Efficient tracking multiple objects through occlusion
JP5159381B2 (ja) 2008-03-19 2013-03-06 セコム株式会社 画像配信システム
CN201315654Y (zh) * 2008-08-25 2009-09-23 云南正卓信息技术有限公司 监狱专用SkyEyesTM智能监控系统
JP5634266B2 (ja) * 2008-10-17 2014-12-03 パナソニック株式会社 動線作成システム、動線作成装置及び動線作成方法
JP2010123069A (ja) * 2008-11-21 2010-06-03 Panasonic Corp センシングデータ検索装置及び検索画像作成方法
JP5603663B2 (ja) * 2010-06-02 2014-10-08 Toa株式会社 移動体軌跡表示装置および移動体軌跡表示プログラム
DE102010031429A1 (de) * 2010-07-16 2012-01-19 Robert Bosch Gmbh Verfahren zum Bereitstellen eines Kombinations-Videos
US10645344B2 (en) * 2010-09-10 2020-05-05 Avigilion Analytics Corporation Video system with intelligent visual display
JP5718632B2 (ja) 2010-12-22 2015-05-13 綜合警備保障株式会社 部位認識装置、部位認識方法、及び部位認識プログラム
JP6031735B2 (ja) * 2011-06-13 2016-11-24 ソニー株式会社 情報処理装置、情報処理方法およびコンピュータプログラム
JP6171374B2 (ja) * 2013-02-06 2017-08-02 ソニー株式会社 情報処理装置、情報処理方法、プログラム、及び情報処理システム
JP6159179B2 (ja) * 2013-07-09 2017-07-05 キヤノン株式会社 画像処理装置、画像処理方法
US9453904B2 (en) * 2013-07-18 2016-09-27 Golba Llc Hybrid multi-camera based positioning
JP6364743B2 (ja) 2013-10-31 2018-08-01 株式会社Jvcケンウッド 情報処理装置、制御方法、プログラム、及び情報システム
US10432877B2 (en) 2014-06-30 2019-10-01 Nec Corporation Image processing system, image processing method and program storage medium for protecting privacy
JP6532234B2 (ja) * 2015-01-09 2019-06-19 キヤノン株式会社 情報処理システム、情報処理方法及びプログラム
JP5909709B1 (ja) * 2015-05-29 2016-04-27 パナソニックIpマネジメント株式会社 動線分析システム、カメラ装置及び動線分析方法
JP2017070667A (ja) 2015-10-09 2017-04-13 京楽産業.株式会社 遊技機
CN105759720B (zh) * 2016-04-29 2018-06-29 中南大学 基于计算机视觉的机械手跟踪定位在线识别与纠偏方法

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180012410A1 (en) * 2016-07-06 2018-01-11 Fujitsu Limited Display control method and device
US20220172447A1 (en) * 2019-04-18 2022-06-02 Sony Group Corporation Image processing device, image processing method, and program
US11995784B2 (en) * 2019-04-18 2024-05-28 Sony Group Corporation Image processing device and image processing method
US20220272303A1 (en) * 2021-02-24 2022-08-25 Amazon Technologies, Inc. Techniques for displaying motion information with videos

Also Published As

Publication number Publication date
CN110476421A (zh) 2019-11-19
JP2021185698A (ja) 2021-12-09
WO2018180039A1 (ja) 2018-10-04
AR111195A1 (es) 2019-06-12
US20210233254A1 (en) 2021-07-29
EP3606055A1 (en) 2020-02-05
JP6725061B2 (ja) 2020-07-15
SG11201907834UA (en) 2019-09-27
US20230351612A1 (en) 2023-11-02
US11748892B2 (en) 2023-09-05
JP7279747B2 (ja) 2023-05-23
JP2020167720A (ja) 2020-10-08
JPWO2018180039A1 (ja) 2019-12-26
JP2023085397A (ja) 2023-06-20
EP3606055A4 (en) 2020-02-26

Similar Documents

Publication Publication Date Title
US11748892B2 (en) Video image processing device, video image analysis system, method, and program
US10846865B2 (en) Video image processing device, video image analysis system, method, and program
US11132887B2 (en) Eyeglasses-type wearable terminal, control method thereof, and control program
CN104521230B (zh) 用于实时重建3d轨迹的方法和系统
US20120019659A1 (en) Video surveillance system and method for configuring a video surveillance system
JP6127659B2 (ja) 運転支援装置及び運転支援方法
JP6331785B2 (ja) 物体追跡装置、物体追跡方法および物体追跡プログラム
US20200336704A1 (en) Monitoring system, monitoring method, and monitoring program
JP7459916B2 (ja) 物体追跡方法、物体追跡装置、及びプログラム
US9547905B2 (en) Monitoring system with a position-dependent protected area, method for monitoring a monitoring area and computer program
KR20120038322A (ko) 마커 또는 마커리스를 융합하는 증강현실 장치 및 방법
JP2021132267A (ja) 映像監視システムおよび映像監視方法
KR101840042B1 (ko) 복합 가상 팬스 라인 설정 방법 및 이를 이용한 침입 감지 시스템
KR20160060068A (ko) 모바일 터미널 보안 시스템
JP2019221115A (ja) 走行状況提示装置
Kim et al. Real-Time Struck-By Hazards Detection System for Small-and Medium-Sized Construction Sites Based on Computer Vision Using Far-Field Surveillance Videos
JP2012169826A (ja) 画像処理装置、画像表示システム及び画像処理方法
KR102621875B1 (ko) 객체 추적 시스템
JP2012084078A (ja) 動体情報解析装置、動体情報解析システム、動体情報解析装置の制御方法、および動体情報解析装置の制御プログラム
US20230215113A1 (en) Visualization device of a 3d augmented object for displaying a pickup target in a manufacturing process assembly operation and the method thereof
US20190244364A1 (en) System and Method for Detecting the Object Panic Trajectories
KR20230032976A (ko) 증강현실을 이용한 영상 감시 시스템 및 방법

Legal Events

Date Code Title Description
AS Assignment

Owner name: NEC CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HIRAKAWA, YASUFUMI;REEL/FRAME:050194/0117

Effective date: 20190805

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION